An Introduction To High Content Screening (eBook)
John Wiley & Sons (publisher)
978-1-118-85941-4 (ISBN)
Drawing on a collaborative and interdisciplinary group of authors with experience in the pharmaceutical industry and academia, this book is a practical resource for high content (HC) techniques.
• Instructs readers on the fundamentals of high content screening (HCS) techniques
• Focuses on practical and widely used techniques such as image processing and multiparametric assays
• Breaks down HCS into individual modules for training and connects them at the end
• Includes a tutorial chapter that works through sample HCS assays, a glossary, and detailed appendices
Steven Haney is a Senior Research Advisor and Group Leader at Eli Lilly and Company. He edited the book High Content Screening: Science, Techniques, and Applications (Wiley, 2008).
Douglas Bowman is an Associate Scientific Fellow at Takeda Pharmaceuticals.
Arijit Chakravarty is the Director of Modeling and Simulation (DMPK) at Takeda Pharmaceuticals.
Anthony Davies is Center Director, Translational Cell Imaging, Queensland University of Technology, Queensland, Australia.
Caroline Shamu is the Director of the ICCB-Longwood Screening Facility at Harvard Medical School.
1
INTRODUCTION
STEVEN A. HANEY
1.1 THE BEGINNING OF HIGH CONTENT SCREENING
Microscopy has historically been a descriptive endeavor; indeed, it is frequently described as an art as well as a science. It has also become increasingly recognized that image-based scoring needs to be standardized for numerous medical applications. In medical diagnosis, for example, quantitative interpretation of microscope images has been used since the 1950s for applications such as identifying cervical dysplasias and karyotyping [1]. Cameras used on microscopes during this era could capture an image, reduce the image data to a grid that was printed on a dot-matrix printer, and integrate regional intensities to interpret shapes and features. In essence, these principles have not changed in 50 years, but the sophistication and throughput with which they are applied have increased with advances in microscope and camera design and computational power. In the early 1990s, these advances were realized as automated acquisition and analysis of biological assays became more common.
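As a rough illustration of this grid-and-integrate principle, the following Python sketch (not taken from the book; the function name, block size, and synthetic image are purely illustrative) sums pixel intensities over non-overlapping regions of an image so that bright features, such as a nucleus, stand out as high-intensity grid cells.

```python
import numpy as np

def integrate_regional_intensities(image: np.ndarray, block: int = 8) -> np.ndarray:
    """Sum pixel intensities inside non-overlapping block x block regions."""
    h, w = image.shape
    h_trim, w_trim = h - h % block, w - w % block   # drop ragged edges
    trimmed = image[:h_trim, :w_trim]
    grid = trimmed.reshape(h_trim // block, block, w_trim // block, block)
    return grid.sum(axis=(1, 3))                    # one integrated value per region

# Example: a synthetic 64 x 64 image with a bright square (a stand-in "nucleus") in the center
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
regional = integrate_regional_intensities(img, block=8)
print((regional > regional.mean()).astype(int))     # crude map of where the feature lies
```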
Advances in automated microscopy, namely the automated movement of slides on the stage, focusing, changing of fluorophore filters, and setting of proper image exposure times, were also essential to standardizing and improving biomedical imaging. Automation was necessary to reduce the amount of time laboratory personnel needed to produce these images, which was a bottleneck for such studies, especially medical diagnoses. A team of scientists from Boston and Cambridge, Massachusetts, described an automated microscope in 1976 that directly anticipated the use of such instruments in subcellular microscopy and image analysis [2]. The microscope, and a processed image of a promyelocyte captured using the instrument, are shown in Figure 1.1.
Figure 1.1 An early automated microscope used in biomedical research. (a) An example of an automated fluorescence microscope. Letters inside the figure are from the original source. The system is outfitted with controlled stage and filter movements (S and F), a push-button console for manual movements (B), a television camera and monitor (T and m), and a video terminal for digitizing video images (v). (b) A video image of a promyelocyte and (c) image analysis of (b), showing an outline of the nucleus and cell borders, which can be used in automated cell type recognition. Reproduced with permission from [2]. Copyright 1974 John Wiley & Sons.
Until the mid-1990s, automated microscopy was applied in basic research to address areas of high technical difficulty, where rigorous measurements were needed of subtle cellular events (such as textural changes), of events that took place over long time periods, or of rare events (for which it was challenging to acquire sufficient numbers of images). In medicine, automated imaging was used to standardize the interpretation of assay results, such as the diagnosis of disease from histological samples (where it was notoriously difficult to achieve concordance among clinical pathologists). Adapting quantitative imaging assays to a screening context was first described by Lansing Taylor and colleagues [3], who commercialized an automated microscope capable of screening samples in multiwell plates (a format that had emerged as an industry standard during this period). The term "high content" was coined to contrast the information-rich but lower throughput readouts of these imaging assays with the increasing scale of high throughput primary drug discovery screens. Many groups have since demonstrated the usefulness of automated microscopy in drug discovery [4, 5] and basic research [6, 7]. During this phase (the early 2000s), data acquisition, image analysis, and data management still imposed limits on image-based screening, but it found an important place in the pharmaceutical industry, where expensive, labor-intensive assays critical for late-stage drug development were a bottleneck. One example is the micronucleus assay, which measures the genotoxicity of novel therapeutics by counting micronuclei (small nonnuclear chromosomal fragments that result from dysregulation of mitosis). An increase in the number of cells that contain micronuclei is indicative of genotoxicity, so this assay is frequently part of a screening program used to make a go/no-go decision on clinical development [8]. The assay requires finding binucleate cells and checking for a nearby micronucleus. For each compound assayed, a single technician might spend many hours in front of a microscope searching for and counting nuclei. Automation of image capture and analysis not only reduced the work burden on researchers but also made the analysis itself more robust [9]. Similar applications were found in the field of cell biology, where automated microscopy was used to collect and analyze large data sets [10, 11].
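To make the image-analysis step concrete, the sketch below shows one way a micronucleus readout might be scored automatically: DNA-stained objects are segmented, split into nuclei and micronuclei by size, and micronuclei lying near a nucleus are counted. It is a simplified illustration, not the published assay protocol; it omits the binucleate-cell gating described above, assumes the scikit-image library is available, and uses arbitrary size and distance cutoffs.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def score_micronuclei(dna_image: np.ndarray,
                      min_nucleus_area: int = 500,
                      max_micronucleus_area: int = 150,
                      max_distance: float = 40.0) -> int:
    """Count micronuclei that lie near a full-sized nucleus in a DNA-stained image.

    The area and distance cutoffs are illustrative values, not validated ones.
    """
    mask = dna_image > threshold_otsu(dna_image)          # crude DNA segmentation
    objects = regionprops(label(mask))
    nuclei = [o.centroid for o in objects if o.area >= min_nucleus_area]
    micronuclei = [o.centroid for o in objects if o.area <= max_micronucleus_area]
    count = 0
    for mn in micronuclei:
        distances = [np.hypot(mn[0] - n[0], mn[1] - n[1]) for n in nuclei]
        if distances and min(distances) <= max_distance:  # micronucleus close to a nucleus
            count += 1
    return count
```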
Following these early implementations, high content screening (HCS) has been widely adopted across many fields as the technology has improved and more instruments have become available commercially. The speed at which images can be analyzed is limited by computing power; as more advanced computer technology has been developed, the scale at which samples can be analyzed has improved. Faster computers also mean that more measurements can be made per cell: the shapes of cells and subcellular structures can be analyzed as well as probe intensities within regions of interest. This has led to the quantification of subtle morphological changes as assay endpoints. A widely used application of this approach has been receptor internalization assays, such as the Transfluor™ assay, which measures the activation of GPCRs through changes in the pattern of receptor staining, from even staining over the surface of the cell to dense puncta following internalization of the activated receptors through vesicle formation [12]. Concomitant with the increase in the sophistication of the assays themselves, improvements in the mechanical process of screening samples have also fed the growth of HCS. Gross-level changes, such as integrating plate-handling robotics, and fine-level changes, such as improvements in sample detection and autofocusing, have improved the scale of HCS to the point where image-based readouts are possible for true high throughput screens (screens of greater than 100,000 compounds) [5].
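A hedged sketch of the kind of readout used in such internalization assays follows: diffuse surface staining yields few detected spots, whereas receptors concentrated in vesicles after internalization yield many bright puncta. The function and its parameters are illustrative assumptions (this is not the Transfluor algorithm), and the sketch assumes a single-channel receptor-staining image and the scikit-image library.

```python
import numpy as np
from skimage.feature import blob_log

def count_puncta(receptor_image: np.ndarray, threshold: float = 0.05) -> int:
    """Count bright, spot-like puncta with a Laplacian-of-Gaussian blob detector."""
    blobs = blob_log(receptor_image, min_sigma=1, max_sigma=5, threshold=threshold)
    return len(blobs)

# In a screen, an increase in puncta per cell after agonist treatment would be
# read out as receptor internalization (and hence GPCR activation).
```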
HCS has a strong presence in basic biological studies as well. The most widely recognized applications are similar to screening for drug candidates, including siRNA screening to identify genes that control a biological process, and chemical genetics, the identification of small molecules that perturb a specific cellular protein or process. While operationally similar to drug screening, these approaches seek to answer biological questions rather than to identify therapeutics directly. Additional uses of HCS in basic science include the study of model organisms. Finally, the use of multiparametric single-cell measurements has extended our understanding of pathway signaling in novel ways [11].
1.2 SIX SKILL SETS ESSENTIAL FOR RUNNING HCS EXPERIMENTS
At this point we want to touch on the fundamental skill sets required to successfully set up and use an HCS system to address a biological problem, and how responsibilities might be divided up in different settings. The six major skill sets required to develop and run an HCS project are shown in Figure 1.2. Each area is distinct enough to be a full-fledged area of expertise (hence introducing these areas as "skill sets"), but typically a person is competent in more than one area. It is rare that all roles can be successfully filled by one person; therefore, the ability to develop a collaborative team is essential to HCS. It is also very important to understand that these roles vary between groups, which can cause problems when people move between groups or as groups change in size. The skill sets are as follows.
Figure 1.2 The basic skill sets essential for establishing and running HCS experiments. Skills noted in the figure are discussed in detail in the text.
1.2.1 Biology
The biologist develops the question that needs to be answered experimentally. In academia, the biologist is typically a cell biologist and is often also capable of collecting images by HCS. In industrial settings (pharma and biotech), a therapeutic team may be led by a biochemist or in vivo pharmacologist, who may have little training in fluorescence microscopy. The key area of expertise here is an appreciation of the biological problem and an ability to formulate strategies (experimental systems and assays) to address it. This role also requires a significant understanding of how cellular models in the laboratory relate to the biology in vivo: in addition to understanding the fundamental biological question, it is important to know how to establish a cellular model that incorporates the relevant aspects of the biological environment.
1.2.2 Microscopy
Although many HCS systems are sold as turnkey "black boxes," it is important to have a good understanding of the fundamental microscopy components (staining techniques, reagents, and optics), as each has a significant impact on the quality of data generated by the instruments. For example, the choice of illumination system and filter sets determines which fluorophores (fluorescence wavelengths) can be used to stain specific cellular compartments. Other microscope objective characteristics (numerical...
| Publication date (per publisher) | 22.12.2014 |
|---|---|
| Contributors | Associate editors: Anthony Davies, Caroline Shamu |
| Language | English |
| Subject area | Natural Sciences ► Biology ► Biochemistry |
| | Natural Sciences ► Chemistry |
| | Technology |
| Keywords | Analytical Chemistry • assays development • Biomolecules (DNA, RNA, Peptides, etc.) • Cell & Molecular Biology • Cellular imaging • Chemistry • Data Analysis • drug discovery • high content screening • Life Sciences |
| ISBN-10 | 1-118-85941-3 / 1118859413 |
| ISBN-13 | 978-1-118-85941-4 / 9781118859414 |
Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to protect the eBook against misuse. The eBook is authorized to your personal Adobe ID when it is downloaded and can then be read only on devices registered to that Adobe ID.
File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks and is particularly well suited to fiction and non-fiction. The text reflows dynamically to fit the display and font size, which also makes EPUB a good choice for mobile reading devices.