Cryptosporidium
Thursday, 11 February 2010 00:00
Zia Bukhari
Andreas Nocker
Mark LeChevallier

  • Cryptosporidium spp. are important pathogenic protozoan parasites of humans and various animal species. Their transmissive stages (oocysts) occur ubiquitously in surface waters.
  • Oocysts possess thick outer walls that protect the parasites from harsh environmental conditions:
    • Depending on the water temperature, oocysts can survive in water for weeks to months.
    • Oocysts are highly resistant to chemical disinfectants, including chlorine at levels used in drinking water treatment processes.
  • The physiological characteristics of oocysts, their small size (4-6 µm) and their low infectious dose facilitate waterborne disease transmission.
  • In immunocompetent individuals the disease (cryptosporidiosis) is self-limiting; however, it can be life-threatening in the young, the elderly and the immunocompromised.
Protozoan parasites of the genus Cryptosporidium belong to the phylum Apicomplexa, class Sporozoasida and family Cryptosporidiidae. They are often referred to as coccidia. While some coccidia undergo extra-intestinal development and form tissue cysts (e.g. Sarcocystis, Toxoplasma), others develop in the gastrointestinal or respiratory tract without forming tissue cysts (e.g. Eimeria, Isospora and Cryptosporidium). Like other coccidia, Cryptosporidium was originally thought to be highly host specific, and almost 20 species were named according to the host species from which they were isolated (Tyzzer EE 1912; Levine ND 1984; Current et al. 1986; Fayer and Ungar 1986). Later, cross-transmission studies with mammalian isolates of Cryptosporidium indicated low host specificity, which first prompted Tzipori et al. 1980 to consider Cryptosporidium a single-species genus and then led Levine ND 1984 to suggest that only four species might be valid. The number of valid species was later increased to six, with C. parvum causing respiratory and intestinal infections and C. muris causing stomach infections. Respiratory and intestinal cryptosporidiosis in birds was attributed to C. baileyi and C. meleagridis, whereas C. serpentis infected reptiles and C. nasorum infected fish.

Over the last 15 years, increased adoption of molecular characterization procedures has led to considerable reorganization of Cryptosporidium taxonomy, and two species (Cryptosporidium hominis and Cryptosporidium parvum) have emerged as the most significant from a public health perspective and, as such, of the greatest interest to the water industry. Of these species, C. parvum infects both humans and animals, whereas C. hominis infects only humans (Xiao et al. 2004). Additional Cryptosporidium species capable of infecting humans and causing disease (mainly in children and immunocompromised persons) include C. felis, C. muris, C. canis, C. suis and C. meleagridis (Xiao et al. 2001; Muthusamy et al. 2006; Pieniazek et al. 1999). Fayer R 2009 reviewed the taxonomic status of Cryptosporidium and highlighted that 12 valid species infect mammals (C. hominis, C. parvum, C. muris, C. wrairi, C. felis, C. andersoni, C. canis, C. suis, C. bovis, C. fayeri, C. ryanae, C. macropodum), three species infect birds (C. meleagridis, C. baileyi and C. galli) and three species infect amphibians and reptiles (C. serpentis, C. varanii and C. fragile). The taxonomy of species infecting fish is presently somewhat unclear, and a number of other Cryptosporidium genotypes exist that may become recognized as valid species as further evidence is accrued.

In humans Cryptosporidium infections are acquired by ingestion or inhalation (i.e. the fecal-oral, foodborne or waterborne routes) of the parasite’s transmissive stage, known as the oocyst. The time between acquisition of infection and manifestation of symptoms (pre-patent period) depends on various factors (e.g. host susceptibility, virulence of the infecting strain, oocyst infectivity) but may range from 5 to 28 days (mean 7.2 days) (Anderson et al. 1982; Current et al. 1983; Højlyng et al. 1987). Cryptosporidium infections are most pathogenic in neonatal hosts; however, human infections have been reported in patients ranging from a 3-day-old infant, born to a mother with cryptosporidiosis, to a 95-year-old individual. Disease often manifests itself as profuse, watery diarrhea, abdominal cramping, nausea, vomiting and low-grade fever. In well-nourished, immunocompetent individuals, cryptosporidiosis may last 2-12 days but is usually self-limiting. Occasionally, infections may continue for two weeks or more and may require fluid replacement therapy. Presently no effective chemotherapeutic treatment is available, and the duration of disease depends on the patient’s immune status. In patients with congenital or acquired immune deficiencies or in malnourished individuals, infection can be considerably prolonged, resulting in malabsorption, severe dehydration and even death.

During the course of infection, Cryptosporidium undergoes distinct asexual and sexual phases of its life cycle in enterocytes of the infected host. This culminates in the production of oocysts that are shed in the feces in large numbers (i.e. 10⁵-10⁷ oocysts per gram of feces). The voided oocysts each contain four mononucleated sporozoites and can disperse through the environment by various mechanisms (e.g. surface run-off and leaching through the soil profile) to contaminate surface waters. Figure 2 shows Cryptosporidium oocysts labeled with fluorescein isothiocyanate-conjugated monoclonal antibody (green outer wall) and an oocyst labeled with 4′,6-diamidino-2-phenylindole (DAPI), which stains the four nuclei in intact oocysts a sky blue color. When oocyst-contaminated surface water is used for drinking purposes, outbreaks of human disease (cryptosporidiosis) can occur. The risk of waterborne disease transmission is exacerbated by the fact that oocysts have a low infectious dose (see later), are able to survive in the environment for long periods of time and are not readily inactivated by chemical disinfection, particularly chlorine at the levels normally employed in water treatment processes (see later).

To better understand the occurrence of Cryptosporidium, various surveys have been conducted worldwide (LeChevallier et al. 1991; Rouse et al. 2001; Hashimoto et al. 2001; Drinking Water Inspectorate 2003; LeChevallier et al. 2003). One study reported that infectious oocysts were detected in 22 of 82 (26.8%) surface water treatment plants within the US, potentially posing an annual risk of 9-119 waterborne infections per 10,000 people (Aboytes et al. 2004). Waterborne outbreaks have been reported in the United States, United Kingdom and Japan, with the most notable occurring in Milwaukee in 1993, where an estimated 403,000 individuals were impacted, including 100 deaths (Mac Kenzie et al. 1994). The estimates of infection were based on retrospective surveys of self-reported diarrhea. Such estimates can be prone to recall bias, and some have suggested that the actual number of infected individuals in the Milwaukee outbreak was over-estimated by 10- to 100-fold (Hunter and Syed 2001). Irrespective of the actual numbers of individuals impacted, the magnitude of this outbreak is underscored by its economic burden. Improvements to the water treatment plant alone incurred estimated costs of $90 million. Additionally, costs associated with medical expenses and lost productivity were approximated at $96 million using the original estimates of infected people, with approximately two thirds due to productivity losses (Corso et al. 2003). Where adequate monitoring or screening is implemented, it is not uncommon to find evidence of waterborne human cryptosporidiosis. For example, data from 1992-2003 indicated there were 89 waterborne outbreaks registered in England and Wales, of which 61 were attributed to Cryptosporidium and affected more than 4,300 people (Smith et al. 2006).

Cryptosporidium oocysts contaminating surface water, and subsequently responsible for waterborne human cryptosporidiosis, can originate from many contributing sources (e.g. wild or domestic animals and untreated human sewage). Cattle and sewage discharges can contain large numbers of oocysts, and surface waters receiving such discharges have been reported to carry 10- to 100-fold higher oocyst concentrations than pristine waters (Bagley et al. 1998; Rose JB 2002). In some cases surface waters impacted by sewage and animal feces have been reported to contain 100 Cryptosporidium oocysts per liter of water (Lisle and Rose 1995), and others have reported sewage discharges with peak concentrations ranging from 10⁴ to 3.9×10⁵ oocysts per liter (Medema and Schijven 2001). Atherholt et al. 1998 found a positive correlation between precipitation rates and oocyst levels in river waters, which may point to heavy rainfall facilitating oocyst transport from contaminated land into receiving waters. Once in a watershed, oocysts show considerable resistance to adverse environmental conditions and may survive for weeks or months (Rose JB 2002).

Following the Milwaukee outbreak, there was heightened awareness of Cryptosporidium in the United States, and an 18-month monitoring program under the Information Collection Rule (ICR) was implemented between July 1997 and December 1998. Monthly samples were collected and analyzed from the raw water at the plant intakes of 296 water systems (representing 500 treatment plants). Of 5,838 samples analyzed under the ICR, only 7% were Cryptosporidium positive, with a mean oocyst concentration of 0.067 oocysts per L (Messner and Wolpert 2002). Calculations based on total oocyst count divided by total volume analyzed, or on median concentrations, both yielded similar oocyst concentrations (i.e. 0.02 oocysts per liter). Poor and highly variable overall method recovery efficiencies (12% ± 11%, range 1%-30%, n=140; Scheller et al. 2002) were probable reasons for the high number of non-detects and for the high location-to-location variability where oocysts were detected. The ICR Supplemental Surveys (ICR-SS), using USEPA Method 1622, helped increase the median volume analyzed per sample and may have contributed to lowering location-based variability. Monitoring of medium-sized systems indicated more frequent occurrence of higher Cryptosporidium concentrations than monitoring at larger systems. Despite this, the median Cryptosporidium concentrations in the ICR-SS were similar to those obtained from ICR data (i.e. 0.02 oocysts per liter). Together with other environmental monitoring, the 18-month ICR study and the ICR-SS supported the notion that Cryptosporidium concentrations vary widely but that oocysts are likely to be ubiquitous in surface waters. The actual levels detected are likely to be influenced by inconsistencies in method performance and may also reflect oocyst burdens from the contributing sources (i.e. sewage and/or animal manure). Rose et al. 1997 cited seven raw water monitoring studies in North America with the percentage of oocyst-positive samples averaging from 9.1% to 100% (oocyst concentration range 0-240 oocysts per L). Lower occurrence (3.8% to 33.3%) and oocyst concentrations (0.001 to 0.48 oocysts per L) were noted in drinking water samples (n=158).
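The effect of poor method recovery on such occurrence estimates can be illustrated with a simple back-calculation. The sketch below is illustrative only (the function name and rounding are ours, not from any cited method): it divides an observed concentration by the method's fractional recovery efficiency to estimate the concentration before losses.

```python
def recovery_corrected(observed_per_l: float, recovery: float) -> float:
    """Estimate the true oocyst concentration from an observed count,
    given the method's fractional recovery efficiency (0-1]."""
    if not 0.0 < recovery <= 1.0:
        raise ValueError("recovery must be a fraction in (0, 1]")
    return observed_per_l / recovery

# With the ~12% mean ICR recovery cited above, an observed 0.02 oocysts/L
# corresponds to roughly 0.17 oocysts/L before method losses.
print(round(recovery_corrected(0.02, 0.12), 2))
```

This is why surveys with low recovery efficiencies tend to under-report both occurrence frequency and concentration.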

Following substantial method refinements and multi-laboratory validations, US Environmental Protection Agency (USEPA) Methods 1622/1623 emerged in the new millennium with the promise of more reliable monitoring data for Cryptosporidium and Giardia. Two studies used Method 1623 extensively (in conjunction with cell culture-polymerase chain reaction, CC-PCR) to establish Cryptosporidium occurrence and infectivity in both surface water (LeChevallier et al. 2003) and finished drinking water (Aboytes et al. 2004). Data from these studies indicated seasonality in oocyst occurrence, with peak levels occurring most notably in the spring but also in the fall months. The spring peaks in oocyst levels may be associated with rainfall (Atherholt et al. 1998; Payment et al. 2000). During spring there are increased numbers of newborn calves and lambs, and Cryptosporidium infections in such neonatal animals can be common. It has been noted that on some farms >99% of the neonatal calves with diarrhea were shedding large numbers of oocysts (Bukhari Z 1995). To understand the significance of oocysts for disease transmission, it is important to establish whether they are alive or dead. Viability patterns of oocysts shed over the course of infection in experimentally infected calves, lambs and mice have shown that large proportions (i.e. 30-50%) of the oocysts were dead (Bukhari and Smith 1997). Non-viable oocysts play no role in further disease transmission to other susceptible hosts. As Cryptosporidium is an obligate parasite, it cannot multiply outside of a susceptible host, and oocyst viability degrades further with time and following exposure to harsh environmental conditions (Bagley et al. 1998). Where viable (and infectious) oocysts contaminate surface water, consumption of untreated or inadequately treated (improper treatment or process failures) water can lead to waterborne human cryptosporidiosis.

To help reduce the risk of waterborne cryptosporidiosis, many large water systems in the US participate in a voluntary program known as the “Partnership for Safe Drinking Water”. Through review and optimization of treatment processes, the goal of this program is to consistently produce filtered drinking water with a turbidity level below 0.1 NTU, which is more stringent than existing USEPA regulations.

To better understand the risk of waterborne human cryptosporidiosis, it is necessary to understand the frequency with which Cryptosporidium oocysts can be detected in various untreated water sources, the infectivity of the recovered oocysts, and how the various drinking water treatment processes affect physical removal of oocysts or their infectivity. The USEPA has published and implemented the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR or LT2 rule) to supplement existing regulations. The LT2ESWTR requires surface water systems to monitor raw water for Cryptosporidium and, based on a rolling 12-month average, systems are categorized into different concentration-dependent bins. Systems exceeding 0.075 oocysts per L require additional treatment with approved removal/inactivation technologies effective for Cryptosporidium (USEPA 2000). Detailed information can be found at http://www.epa.gov/OGWDW/disinfection/lt2/. The first round of the LT2 surveys began with the largest systems (serving at least 100,000 people) in October 2006, and LT2 monitoring was staggered such that the smallest systems (serving < 10,000 people) were scheduled for monitoring in October 2008. Following the LT2 surveys, systems need to comply with any additional treatment requirements within three years. A second round of monitoring is scheduled six years after completion of the initial round to determine whether source water conditions have changed significantly.
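The binning logic described above can be sketched as a small function. The 0.075 oocysts per L threshold comes from the text; the upper bin boundaries and the log-credit comments reflect the LT2 rule for filtered systems as generally published, but the rule text itself should be consulted for the authoritative requirements.

```python
def lt2_bin(rolling_mean_oocysts_per_l: float) -> int:
    """Assign a filtered system to an LT2 bin from its rolling 12-month
    mean source-water Cryptosporidium concentration (oocysts/L)."""
    if rolling_mean_oocysts_per_l < 0.075:
        return 1  # no additional Cryptosporidium treatment required
    if rolling_mean_oocysts_per_l < 1.0:
        return 2  # additional treatment credit required (nominally 1-log)
    if rolling_mean_oocysts_per_l < 3.0:
        return 3  # nominally 2-log additional credit
    return 4      # nominally 2.5-log additional credit

print(lt2_bin(0.02), lt2_bin(0.08), lt2_bin(4.2))
```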

Selected occurrence studies:

Numerous occurrence studies have been conducted worldwide, and an exhaustive review is outside the scope of this section. The following examples are intended to highlight the variation in oocyst occurrence and provide perspective on the various factors that may influence oocyst concentrations in water sources drawn upon for treatment for human consumption.

  • Cryptosporidium monitoring was performed over 30 months on the French Seine and Marne Rivers (which supply drinking water for Paris and the surrounding area), with analysis performed using the French standard NF T90-455 procedure (adapted from Method 1623). Of 162 samples, 46% (n=74) were Cryptosporidium positive, with concentrations ranging between 0.03 and 0.32 oocysts per L at seven of the eight sampling sites. One site presented unusually high levels of oocysts (35.9 oocysts per 10 L). Positive samples were found in all seasons; however, oocyst occurrence was highest during fall. No association could be established between rainfall or fecal indicator counts and Cryptosporidium concentrations in this study (Mons et al. 2009).
  • For 16 drinking water treatment plants in Galicia (northwest Spain), both influent and final effluent samples were analyzed for Cryptosporidium oocysts over a one-year period. Across 128 samples analyzed using Method 1623, mean concentrations ranged between 0 and 10.5 oocysts per L for influent and between 0 and 3 oocysts per L for effluent samples. Significantly higher oocyst numbers were detected in the spring and summer (Castro-Hermida et al. 2008). In another study in northern Spain, a total of 284 samples were tested over a 30-month period (Carmena et al. 2007). A significant correlation between oocyst presence and turbidity levels was noted. Oocyst-positive samples occurred in all seasons; however, detection was most frequent during fall. Moderate correlations between oocyst levels and rainfall were also found (Carmena et al. 2007). The results are presented in the following table.
  • SOURCE: Adapted from Carmena et al. 2007.

    CWTF: Conventional water treatment facilities with coagulation, flocculation, sedimentation, filtration, and disinfection
    SWTF: Small water treatment facilities including rapid filtration and disinfection only
    Tap water: with chlorination treatment only

  • Using Method 1623 for C. parvum quantification and cell culture-PCR (CC-PCR) for infectivity measurements, weekly samples from six watersheds were analyzed for one year. Additionally, utilities were instructed to conduct event-based sampling (i.e. rainfall related) when appropriate. Cryptosporidium oocysts were detected in 60 of 593 samples (10.1%) by Method 1623 and in 22 of 560 samples (3.9%) by the CC-PCR technique. Approximately 37% of the Cryptosporidium oocysts detected by the immunofluorescence method were deemed viable and infectious. DNA sequencing of PCR amplicons indicated the presence of both the bovine and human genotypes, with >90% of the C. parvum isolates having the bovine or bovine-like genotype. The authors calculated that most surface water systems would require an estimated 3-log reduction in source water Cryptosporidium levels to meet potable water goals (LeChevallier et al. 2003). A follow-up study by the same group examined monthly 100 L finished water samples from 82 conventional surface water treatment plants in 14 states. A total of 1,690 samples were analyzed, with 1.4% of the samples positive for infectious Cryptosporidium. Consistent with the surface water monitoring data, genotyping showed that 23 isolates were C. parvum and one isolate was C. hominis. No association was apparent between treatment plant characteristics or water quality and oocyst occurrence. Based on these occurrence data, the authors calculated an annual risk of 52 infections per 10,000, with a peak in annualized risk (200 infections per 10,000) in April (Aboytes et al. 2004).
  • Cryptosporidium monitoring near the intakes of 45 drinking water treatment plants along the Saint Lawrence River in Canada indicated concentrations between 0.04 and 2.74 oocysts per L (Payment et al. 2000). Accounting for method performance, it was postulated that actual oocyst concentrations may be as much as 10-fold higher.
  • Storm water samples collected from a stream in the state of New York were analyzed by PCR followed by restriction fragment length polymorphism analysis of the PCR products (PCR-RFLP). A total of 12 distinct genotypes were identified among the 27 of 29 samples that were positive (Xiao et al. 2000).
  • A literature survey reported that three studies, analyzing a total of 158 drinking water samples, had observed between 3.8% and 33.3% of samples positive for Cryptosporidium. Oocyst concentrations in positive samples ranged from 0.001 to 0.48 oocysts per L (Rose et al. 1997).
  • Monitoring of 72 municipalities across Canada indicated oocysts in 4.5% of raw water samples, 3.6% of treated water samples and 6.1% of raw sewage samples. Most of the participating municipalities were not situated downstream of agricultural areas or sewage effluents ( Wallis et al. 1996).
  • Two irrigation districts in adjacent watersheds in British Columbia, Canada, serving rural agricultural communities, were monitored for Cryptosporidium using large sample volumes (range 150 to 7,000 L). In watershed A, which comprised a more complex system of surface water sources than watershed B and where cattle had direct access to creeks, significantly higher oocyst levels were found downstream than upstream of a cattle ranch. Peak concentrations were found in May. In watershed B, cattle had no access to creeks, and the highest oocyst levels in this watershed were found in April. Results are summarized in the following table (summarized from Ong et al. 1996):

  • Monitoring of surface water samples (rivers and lakes; sample volumes of approximately 400 L) across 17 states in the US indicated that 51% (n=181) of the samples tested positive, with a geometric mean of 0.43 oocysts per L. Additionally, 17% of 36 drinking water samples (approximate volumes of 400-1,000 L) tested positive, with Cryptosporidium concentrations ranging from 0.005 to 0.17 oocysts per L. No correlation with indicator bacteria was noted (Rose et al. 1991b).
  • Where 66 surface water treatment plants in 14 US states and 1 Canadian province were examined, a wide dispersion of Cryptosporidium concentrations was found. Oocysts were detected in 87% of the raw water samples, with levels ranging from 0.07 to 484 oocysts per L (geometric mean: 2.7). Only about 32% of 242 Cryptosporidium oocysts observed in raw water contained sporozoites, suggesting that the majority of oocysts were non-viable. Parasite concentrations correlated well with water quality parameters (turbidity, total coliforms, fecal coliforms) (LeChevallier MW 1991). A follow-up study reported 60% oocyst occurrence in 347 surface water samples collected over 5 years (LeChevallier and Norton 1995). The authors compared these values with a study in the western United States that found Cryptosporidium oocysts in 77% of raw water samples (n=107), with geometric means ranging between 0.91 and 28 oocysts per L of raw water (Rose JB 1988). Boutros reported presumptive Cryptosporidium spp. in 70% (n=50) of surface water supplies in Pennsylvania, with concentrations ranging between 0.002 and 4.49 oocysts per L (Boutros SN 1989).
  • Four rivers in Washington State and two rivers in California were tested for Cryptosporidium using an indirect immunofluorescence assay. All 11 river water samples tested positive with oocyst concentrations ranging from 2 to 112 oocysts per L ( Ongerth and Stibbs 1987).
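Annual risk figures like those cited in the studies above (e.g. infections per 10,000 people per year) are typically derived by combining measured concentrations with a dose-response model. Below is a minimal sketch using the standard exponential dose-response form; every numeric input (concentration, daily consumption, and the per-oocyst infectivity parameter r) is hypothetical and chosen only for illustration, not taken from the cited studies.

```python
import math

def daily_infection_risk(oocysts_per_l: float, liters_per_day: float, r: float) -> float:
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    dose = oocysts_per_l * liters_per_day
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk: float, days: int = 365) -> float:
    """Probability of at least one infection over `days` independent exposures."""
    return 1.0 - (1.0 - daily_risk) ** days

# All inputs hypothetical: 0.002 infectious oocysts/L in finished water,
# 1 L/day of unboiled tap water consumed, r = 0.004 per ingested oocyst.
p_day = daily_infection_risk(0.002, 1.0, 0.004)
print(f"~{annual_risk(p_day) * 10000:.0f} infections per 10,000 per year")
```

Because the per-oocyst infectivity r and the infectious fraction of detected oocysts are both uncertain, published risk ranges (such as 9-119 per 10,000) span an order of magnitude.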
In the United States chlorine is used as a primary disinfectant in 63 percent of surface water treatment plants, and as a post-disinfectant in more than 93 percent of all surface water treatment plants (US EPA 1997). The occurrence of numerous waterborne outbreaks of human cryptosporidiosis has demonstrated that normal chlorination of water (i.e. free chlorine levels between 0.05 and 0.4 mg/L) is inadequate for inactivation of Cryptosporidium oocysts. Experimental evidence supports this notion. For calf-derived oocysts, a free chlorine concentration of 80 mg/L for 120 min reduced oocyst viability to zero, and a 2-log reduction in infectivity was noted after 90 min of exposure (Sterling et al. 1989). Such high chlorine concentrations are impractical for various reasons (i.e. unacceptable taste, odor and toxicity, costs, pipe corrosion and disinfection by-products).

Use of ozone as an alternative disinfectant was evaluated by Peeters et al. 1989, who reported that concentrations between 1.11 and 2.25 mg/L, at contact times between 6 and 8 min, were required to achieve 90-98% inactivation of oocysts. Where inactivation of C. parvum and Giardia lamblia was investigated with ozone and chlorine dioxide under the same experimental conditions, C. parvum oocysts were 30 times more resistant to ozone and 14 times more resistant to chlorine dioxide (Korich et al. 1990). In addition, several other studies have evaluated the efficacy of ozone. Finch et al. 1993 reported that 2.0 mg/L ozone at 7°C and 22°C resulted in 99.9% and 99.997% inactivation of oocysts, respectively; however, use of elevated ozone concentrations can lead to the production of bromate, a potential carcinogen.
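Percent-inactivation figures such as those of Finch et al. can be converted to the log-reduction scale used elsewhere in this section; the conversion is a one-liner (the helper name below is ours).

```python
import math

def log_inactivation(percent_inactivated: float) -> float:
    """Convert a percent-inactivation figure to a log10 reduction."""
    surviving = 1.0 - percent_inactivated / 100.0
    if surviving <= 0.0:
        raise ValueError("100% inactivation is unbounded on the log scale")
    return -math.log10(surviving)

print(round(log_inactivation(99.9), 2))    # the Finch et al. figure at 7 degC
print(round(log_inactivation(99.997), 2))  # the Finch et al. figure at 22 degC
```

On this scale, 99.9% inactivation corresponds to 3 logs and 99.997% to roughly 4.5 logs, which makes the temperature dependence of ozone disinfection easier to compare across studies.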

Lorenzo-Lorenzo et al. 1993 examined the impact of UV light on Cryptosporidium oocysts; however, owing to the inadequate description of their disinfection experiments, the effectiveness of UV light for inactivating Cryptosporidium oocysts went unrecognized. Two years later, a unit known as the Cryptosporidium Inactivation Device (CID), which delivered a total UV dose of 8,748 mJ/cm² from low-pressure lamps, was examined, and 2- to 3-log inactivation of oocysts was reported using the fluorogenic vital dye assay (4′,6-diamidino-2-phenylindole and propidium iodide) and in vitro excystation (Campbell et al. 1995). Subsequently, Clancy et al. 1998 confirmed the findings of Campbell et al. 1995; however, the apparent requirement for high UV doses to achieve significant oocyst inactivation continued to present an obstacle to regulatory and water industry support for implementing this technology.

Between 1998 and 1999, comparative bench-scale and demonstration-scale (215 gpm) Cryptosporidium oocyst inactivation studies were performed using medium-pressure UV lamps, with oocyst inactivation examined by in vitro viability assays (DAPI/PI and in vitro excystation) as well as mouse infectivity assays (Bukhari et al. 1999). These studies conclusively demonstrated the effectiveness of low UV doses for Cryptosporidium inactivation in finished water. These data rapidly gained attention from both regulatory agencies and the water industry and spurred the adoption of UV disinfection in the United States water industry as a means to effectively inactivate Cryptosporidium oocysts. In addition to demonstrating the effectiveness of low UV doses for oocyst inactivation, Bukhari et al. 1999 also identified that in vitro viability assays under-estimated oocyst inactivation following exposure to either UV light or ozone (Bukhari et al. 2000) and that mouse infectivity assays, albeit cumbersome procedures, needed to be the methods of choice for future studies of this nature. Following these initial studies, a number of other studies have confirmed the effectiveness of UV light for inactivating Cryptosporidium oocysts (Bukhari and LeChevallier 2004) as well as Giardia cysts (Craik et al. 2001).

UV disinfection has also been accepted by the USEPA as one of the treatment options in the microbial toolbox, which is intended to help select appropriate treatment options for systems exceeding 0.075 oocysts per L following the LT2 surveys. The LT2 rule has several requirements related to the use of UV disinfection, including ascertaining UV doses for different levels of inactivation credit, performance validation testing of UV reactors, monitoring, reporting, and off-specification operation. The USEPA has developed a UV Disinfection Guidance Manual (2006), which is intended to help utilities navigate the process of implementing UV disinfection to meet future compliance goals.

Selected disinfection studies:

  • For free chlorine, a CT value (chlorine concentration in mg/L multiplied by time in minutes) of 9,600 is recommended by the US Centers for Disease Control and Prevention to achieve a 3-log reduction in oocyst viability during remediation of recreational water after fecal diarrhea incidents. When treating fresh oocysts (<1 month old) at pH 7, even higher CT values (10,400 and 15,300) were determined for two different isolates from Iowa and Maine. Oocyst viability was assessed in this study by cell culture (Shields et al. 2008).
  • Sequential treatment of C. parvum oocysts with ozone and monochloramine in natural waters was observed to yield statistically significant synergistic effects. Oocyst inactivation was measured in mice and increased with initial pH and ozone pre-treatment levels. Inactivation also depended on water quality and was lower in waters with higher turbidity, color, and organic carbon levels ( Biswas et al. 2005).
  • Purified C. parvum oocysts (10⁷) were suspended in 10 mL of distilled water in transparent polystyrene microtiter containers and exposed to simulated global solar irradiation of 830 W/m² at 40°C (i.e. batch solar disinfection, SODIS). This irradiation value is slightly lower than the 1,000 W/m² that can be expected at equatorial latitudes on a cloud-free day. Oocyst excystation levels declined from an initial 93% to 27% after 6 hours and to 6% after 12 hours of exposure. Infectivity, as determined with neonatal mice, dropped more sharply, to 7.5% after 6 hours, and was eliminated after 12 hours (Méndez-Hermida et al. 2005). A later study reported that similar treatment conditions resulted in complete loss of infectivity after 10 hours (McGuigan et al. 2006). King et al. 2008 confirmed the efficacy of solar radiation in reducing C. parvum infectivity: up to 90% inactivation occurred in tap water within 1 hour. In environmental waters, the presence of dissolved organics decreased solar inactivation. Using filters to compare the effects of different components of the UV spectrum, UV-B was identified as the most germicidal (King et al. 2008).
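The CT concept used in the free chlorine bullet above is simply the disinfectant residual multiplied by the contact time, which makes remediation times straightforward to estimate. A short sketch; the 20 mg/L hyperchlorination residual below is a hypothetical example, not a recommendation.

```python
def ct_value(residual_mg_per_l: float, minutes: float) -> float:
    """CT = disinfectant residual (mg/L) x contact time (min)."""
    return residual_mg_per_l * minutes

def minutes_needed(target_ct: float, residual_mg_per_l: float) -> float:
    """Contact time (min) required to reach a target CT at a given residual."""
    return target_ct / residual_mg_per_l

# The CT of 9,600 cited above for a 3-log oocyst reduction, at a
# hypothetical hyperchlorination residual of 20 mg/L:
print(minutes_needed(9600.0, 20.0) / 60.0, "hours of contact time")
```

The same arithmetic shows why normal distribution-system residuals (0.05-0.4 mg/L) cannot plausibly reach such CT values within any realistic contact time.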

Viability and infectivity assays

Establishing whether oocysts are dead or alive is critical to understanding their ability to propagate disease in susceptible hosts. This may be achieved with viability or infectivity assays. Ideally, environmental monitoring should extend quantitative measurements to include a viability/infectivity component so that accurate risk assessments can be associated with the recovered oocysts. Traditionally, oocyst viability was measured by in vitro excystation, a laboratory-based procedure that mimics some of the anticipated biological triggers (e.g. elevated temperature, carbon dioxide, stomach acid, bile salts) and has been used extensively to determine whether oocysts are dead or alive (Reducker and Speer 1985; Woodmansee 1987; Robertson et al. 1993).

The rationale is that dead oocysts will not respond to these biological triggers and, as a result, will not release their sporozoites. Calculating the ratio of oocysts that have released their sporozoites to those that have not responded (i.e. intact oocysts) provides an excystation ratio for a given population of oocysts. In the environment, especially as oocysts age, the sporozoites inside the oocyst can degrade under various environmental pressures. To get an accurate representation of the oocysts that may have the potential to cause disease, it is important to differentiate these empty oocyst shells from oocysts that have actively undergone in vitro excystation. Typically this necessitates examining a sub-sample of a given oocyst population before in vitro excystation to quantify empty shells. Excystation is then performed, and excysted shells and intact oocysts are enumerated. The empty shells are subtracted from the excysted shells, and the ratio of this adjusted number to intact oocysts is used to calculate percentage excystation, or oocyst viability. From the mid-1980s until 1998-1999, in vitro excystation was deemed a ‘silver’ standard for determining whether oocysts were dead or alive. Neonatal mouse infectivity was deemed the ‘gold’ standard, and a number of studies reported that excystation and mouse infectivity correlated well (Korich et al. 1990; Korich 1993; Finch et al. 1993).
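The empty-shell correction described above can be written out explicitly. This sketch assumes one common convention, expressing viability as adjusted excysted shells over adjusted excysted shells plus intact oocysts, and treats the before/after counts as directly comparable; the counts and names are illustrative.

```python
def percent_excystation(empty_before: int, shells_after: int, intact_after: int) -> float:
    """Percentage excystation (viability) corrected for pre-existing empty shells.

    empty_before: empty shells counted in a sub-sample before excystation
    shells_after: empty/excysted shells counted after excystation
    intact_after: intact oocysts remaining after excystation
    """
    adjusted = shells_after - empty_before  # shells produced by excystation itself
    if adjusted < 0:
        raise ValueError("fewer shells counted after excystation than before")
    total = adjusted + intact_after
    return 100.0 * adjusted / total if total else 0.0

# e.g. 10 pre-existing empty shells, 70 shells and 40 intact oocysts afterwards:
print(percent_excystation(10, 70, 40))  # 60 / (60 + 40) -> 60.0% excystation
```

Without the correction, the same counts would report 70/110, about 64%, overstating viability; the bias grows as environmentally aged populations accumulate empty shells.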

Both excystation and infectivity assays required large numbers of oocysts to establish oocyst viability or infectivity, which precluded their routine use for the low numbers of environmentally derived oocysts and limited application of these assays to experimental studies. Viability assays utilizing the fluorogenic vital dyes DAPI and PI (Campbell et al. 1992), SYTO-9 and SYTO-59 (Neumann et al. 2000) and fluorescence in situ hybridization (Smith et al. 2004) were developed to measure whether individual oocysts are dead or alive, and were utilized for examining the viability of environmentally derived oocysts (Robertson et al. 1992) as well as the efficiency of various disinfectants for oocyst inactivation (Bukhari et al. 1999; Bukhari et al. 2000). Although these in vitro viability assays offered several advantages over traditional animal infectivity models, in that they required significantly less time to produce results, were easy to use and were relatively inexpensive, it was later demonstrated, particularly following UV exposure, that use of some in vitro viability assays could lead to grossly erroneous results (Bukhari et al. 1999). As a result it became clear that demonstration of intracellular development was critical in determining whether oocysts are dead or alive (Bukhari et al. 1999). This prompted studies to evaluate the suitability of cell-culture procedures as a less expensive and user-friendly alternative to neonatal mouse infectivity assays.

Since Current and Haynes 1984 first described in vitro cultivation of Cryptosporidium oocysts in human fetal lung cells, various investigators have utilized different cell lines and immunological or molecular detection procedures to characterize in vitro infectivity of C. parvum oocysts; however, no single protocol has been widely accepted or independently validated. Based on a number of studies (DiGiovanni et al. 1999; Rochelle et al. 2004; Rochelle et al. 2002; Slifko et al. 1997; Slifko et al. 1999; Upton et al. 1994), some consensus was reached on the human ileocecal adenocarcinoma cell line (HCT-8) for hosting C. parvum oocyst infections (Figure 3); however, variation in oocyst pre-treatment, oocyst inoculation, incubation times for infectivity and variations in the detection procedures all presented obstacles to directly comparing oocyst survival or inactivation data generated with these various in vitro infectivity protocols. Several studies have demonstrated that oocyst isolation procedures (i.e. USEPA Methods 1622/1623) can successfully be combined with cell culture for determining infectious oocyst occurrence in raw water (LeChevallier et al. 2003), finished water (Aboytes et al. 2004) and wastewater effluents (Bukhari Z 2010).

To utilize Cryptosporidium occurrence data (from environmental monitoring programs) for microbial risk assessments, it is imperative to use standardized or well-characterized protocols for oocyst enumeration and infectivity. With the various method permutations, comparison of oocyst infectivity data from the various in vitro infectivity protocols is not readily feasible. Recognizing that independent validations, which assess both the reproducibility and predictive capacity of a method, are key to developing oocyst infectivity standards, a study by an international consortium (consisting of the American Water Works Association Research Foundation; the Drinking Water Inspectorate, UK; KIWA, Netherlands; United Kingdom Water Industry Research Ltd; the United States Environmental Protection Agency; and the Water Services Association, Australia) examined the sensitivity and reproducibility of in vitro cell culture infectivity assays using varying oocyst inocula of unknown ('blind') infectivity (Bukhari and LeChevallier 2003; Bukhari et al. 2007). The optimized cell culture based procedure subjected to these evaluations incorporated oocyst pre-acidification and exposure to bile salts immediately preceding inoculation onto HCT-8 monolayers, followed by incubation at 37°C for 72 h and detection using immunofluorescence (IFA) to reveal clusters of developmental or endogenous stages (Figure 4). An advantage of this cell culture-IFA procedure over other existing procedures is its use of various oocyst pre-treatment triggers (i.e. acid treatment and exposure to bile). These modifications are intended to closely simulate the conditions oocysts encounter when ingested by a susceptible host. Additionally, the assay allows identification of non-infectious oocysts (Bukhari and LeChevallier 2003), which undergo excystation and invasion of HCT-8 cells but fail to multiply intracellularly, generating only 'pin-points' of invasion (Figure 5).
Where qualitative or quantitative polymerase chain reaction (PCR) methods are used for detection of infection in the host cells, the assays cannot differentiate invasive stages from those undergoing active multiplication, which can lead to false positives and an over-estimation of oocyst infectivity (Bukhari and LeChevallier 2003). In the Bukhari et al. 2007 'blind' trials using the cell culture-IFA procedure, this infectivity assay was highly effective at predicting the infectivity of oocysts of unknown infectivity, with a high degree of correlation (r² = 0.89) between the estimated and actual numbers of infectious oocysts.

Cryptosporidium is an obligate parasite and cannot multiply outside a susceptible host. This means oocysts in the environment are likely to degrade with time and lose their infectivity unless they are ingested by another susceptible host and complete their life cycle. In addition to ageing, various environmental pressures contribute to a reduction in oocyst viability and infectivity. Robertson et al. 1992 reported that oocyst exposure to -22°C for 6 days resulted in 10% of C. parvum oocysts surviving, which was reduced to 2% after 30 days. Fayer R 1994 reported that oocysts exposed to temperatures up to 67.5°C (for 1 min) were capable of infecting neonatal BALB/c mice. Oocyst survival in the environment has also been reported to depend on the ambient temperature. Oocysts incubated in reagent-grade water or in reservoir water at 4°C and 15°C remained infective for 12 weeks, whereas at 20°C and 25°C a 4-log10 reduction was observed after 12- and 8-week incubations, respectively (King et al. 2005). It has also been suggested that higher ambient temperatures lead to elevated metabolic activity inside oocysts and depletion of amylopectin energy reserves; amylopectin below a critical level may prevent successful infection (Fayer et al. 1998). Other environmental parameters influencing oocyst inactivation and survival were recently reviewed (King and Monis 2007).
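Assuming log-linear die-off (an idealization; real inactivation kinetics can deviate from first order), the King et al. 2005 observations can be converted into first-order rate constants for comparison:

```python
import math

# Convert an observed log10 reduction over a holding time into a
# first-order inactivation rate constant k (per day), assuming
# log-linear decay: N(t) = N0 * exp(-k * t).

def rate_per_day(log10_reduction, days):
    return log10_reduction * math.log(10) / days

k20 = rate_per_day(4, 12 * 7)  # 4-log10 over 12 weeks at 20 C
k25 = rate_per_day(4, 8 * 7)   # 4-log10 over 8 weeks at 25 C
print(f"k(20C) = {k20:.3f}/day, k(25C) = {k25:.3f}/day")
```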

Selected survival studies:

  • C. parvum oocysts were seeded (at 10⁷ oocysts per mL) into samples of surface and groundwater from two representative reservoirs and two aquifers in Florida. The initial infectious concentrations ranged from 10³ to 10⁴ oocysts per mL (as determined by cell culture). In general, higher oocyst inactivation was noted at higher temperatures. Inactivation rates are summarized in the following table (modified from Ives et al. 2007).


  • Oocysts (10⁶ per mL) were exposed to different temperatures (4°C, 18°C, and 25°C), pH levels (pH 7 and pH 10), and ammonium concentrations (0 mg per L, 5 mg per L, and 50 mg per L, to simulate conditions encountered in an algae-based wastewater treatment system) for different amounts of time (up to 6 days). Viability was assessed using inclusion/exclusion of two fluorogenic vital dyes (DAPI/PI). At the beginning of the experiments, between 80 and 82% of oocysts were intact and did not stain with PI. Results showed that increasing alkalinity, temperature, and exposure time increased the percentage of PI-stained oocysts. The highest occurrence of PI-stained oocysts (45%) was found after 6 days of exposure at 25°C and pH 10. After 4 days of exposure to either 5 mg per L or 50 mg per L of ammonium, oocyst viability was reduced from an initial 81% to approximately 41% and 15%, respectively (Reinoso et al. 2007).
  • Purified oocysts and cysts (approximately 5 × 10⁴ per mL) were submerged in filter chambers at a depth of approximately 1 m in a river in Norway (Robertson and Gjerde 2006). The temperature at this depth varied between approximately 1°C and 7°C during the study period; freezing of the oocyst/cyst suspension was not observed. Viability was monitored during winter based on morphology and exclusion/inclusion of fluorogenic vital dyes (DAPI and PI). No viable Cryptosporidium oocysts could be detected after approximately 20 weeks, either when submerged in the river or when stored in distilled water at 4°C as a control. Based on this study, it was concluded that no or very low numbers of the Cryptosporidium oocysts excreted in summer/fall would survive the winter period and be capable of causing disease in the spring.
  • The effect of storing C. parvum oocysts (20 and 300 oocysts per mL) in chlorinated tap water (residual 0.10 ppm to 0.12 ppm, pH 7.7) at 4°C and 10°C for 2, 4, 6, and 8 weeks was studied (Li et al. 2004). Oocyst infectivity was measured using in vitro excystation and an in vivo neonatal mouse infectivity assay. Results after 8 weeks of storage are summarized in the following table.

  • Oocyst viability (determined by in vitro excystation) was similar after 8 weeks of storage at 4°C or 10°C. In vivo infectivity yielded similar results to in vitro excystation following storage at 10°C; however, it indicated considerably higher oocyst infectivity following storage at 4°C. These findings further support that in vitro viability data should be interpreted with caution, especially where these data are to be used in risk assessment based decision frameworks.

Typically with coccidian parasites, the oocysts voided in the feces of the infected host require a maturation period before they can propagate infection in other susceptible hosts. In contrast, Cryptosporidium oocysts are immediately infectious and capable of causing disease in other susceptible hosts. Over the years, considerable evidence has been gathered suggesting that the minimum infectious dose for Cryptosporidium is likely to be small, a feature shared with other human intestinal protozoa such as Giardia (Rendtorff RC 1979). Among the early animal infectivity studies, an inoculum size of 100 oocysts was reported to induce 22% infection in mice (Ernest et al. 1987), a dose of 10 oocysts produced an infection in 2 of 2 non-human primate infants (Miller et al. 1990), and 5 oocysts produced clinical signs in gnotobiotic lambs (Blewett et al. 1993).

Using the Iowa strain of C. parvum oocysts (of calf origin), infectivity trials in healthy adult human volunteers (individuals with no serological evidence of previous exposure to Cryptosporidium infections) indicated 100% infection with >1,000 oocysts and 20% (1 of 5) infection with 30 oocysts (DuPont et al. 1995; Chappell et al. 1996). Based on these studies, the ID50 (the dose necessary to cause infection in 50% of dose recipients) for humans was estimated to be around 132 oocysts. Because the ID50 may be influenced by a number of factors (i.e. oocyst infectivity, virulence, strain variation and host-related factors), it is not surprising that there is considerable variability associated with this number. Others have reported the 50% infective dose in healthy human volunteers to be between approximately 10 and 1,000 oocysts, depending on the C. parvum isolate (Okhuysen et al. 1999; Ochiai et al. 2005).
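The widely used exponential dose-response model makes these numbers concrete; here the per-oocyst parameter is simply anchored to the ID50 of 132 oocysts cited above (a sketch, not a fitted model from the cited studies):

```python
import math

# Exponential dose-response model: P(infection) = 1 - exp(-r * dose),
# with r chosen so that the ID50 equals 132 oocysts.

ID50 = 132.0
r = math.log(2) / ID50  # ~0.0053 infections per ingested oocyst

def p_infection(dose):
    return 1.0 - math.exp(-r * dose)

for dose in (30, 132, 1000):
    print(dose, round(p_infection(dose), 3))
```

With these assumptions, a dose of 30 oocysts gives a predicted infection probability of roughly 15%, broadly consistent with the 20% (1 of 5) observed by DuPont et al. 1995, and a dose of 1,000 oocysts gives a probability approaching 100%.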

Considering the laboratory based infectivity data in conjunction with raw and finished water monitoring data, especially during outbreak situations, it has become apparent that very low doses of infectious Cryptosporidium (perhaps even a single oocyst) may be capable of causing disease. Several risk assessments of Cryptosporidium in drinking water have been performed (Perz et al. 1998; Fewtrell et al. 2001; Messner et al. 2001); however, the complexity of risk analysis is increasing as the taxonomic understanding of the species and genotypes infectious to humans becomes more detailed and as oocyst viability and/or infectivity data become more reliable and more routinely gathered during monitoring programs. Except for the finished water studies conducted at American Water (Aboytes et al. 2004), most other risk assessment analyses have used source water monitoring data and adjusted the data with estimates of water treatment process removal/inactivation efficacy for oocysts. These assumptions about treatment plant performance contribute further uncertainties to microbial risk assessment analyses and need to be eliminated in future studies.

Human dose-response data from three C. parvum bovine genotype isolates have been subjected to meta-analysis (Messner et al. 2001) to calculate the infectious dose for an unknown isolate. This is summarized in the table below:

Source: Messner et al. 2001

Daily risk of Cryptosporidium infection is calculated by multiplying the volume of drinking water consumed daily (1.2 liters/day) by the concentration of oocysts and the risk of infection from a single oocyst:

Daily risk = oocyst concentration (oocysts per liter) × 1.2 liters/day × probability of infection per oocyst

The annual risk is determined by multiplying the daily risk by a factor of 350 (the remaining 15 days of the year account for days when water is consumed from other sources). The USEPA established an annual acceptable risk of microbial infection in drinking water of 10⁻⁴ infections per person (Regli et al. 1999). To achieve the <10⁻⁴ infection per person per year level, infectious Cryptosporidium oocysts would have to be absent in 290,000 liters of drinking water (assuming a 40% recovery efficiency with USEPA approved methods).
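The arithmetic behind the 290,000-liter figure can be checked as follows; note that the per-oocyst infection probability r below is an assumed illustrative value (it is not stated in this passage), chosen to be in the range discussed for the Messner et al. 2001 meta-analysis:

```python
# Back-calculate the annual risk implied by one detectable oocyst in
# 290,000 L of drinking water. The value of r is an assumption for
# illustration, not a figure quoted in the text.

VOLUME_PER_DAY = 1.2      # liters consumed per day
DAYS_PER_YEAR = 350       # days on which the supply is consumed
RECOVERY = 0.40           # assumed method recovery efficiency
r = 0.028                 # assumed probability of infection per oocyst

# At 40% recovery, 1 detectable oocyst per 290,000 L corresponds to a
# true concentration of 1 / (0.4 * 290,000) oocysts per liter.
concentration = 1.0 / (RECOVERY * 290_000)

daily_risk = concentration * VOLUME_PER_DAY * r
annual_risk = daily_risk * DAYS_PER_YEAR
print(f"annual risk = {annual_risk:.1e}")  # on the order of the 1e-4 benchmark
```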

To understand the risk of waterborne human cryptosporidiosis and to implement appropriate control measures at drinking water treatment plants, it is necessary to understand the occurrence of Cryptosporidium oocysts at the raw water intake and to characterize removal efficiencies at various stages in the treatment train. Monitoring oocyst levels in treated drinking water is also desirable from a microbial risk perspective. Microbial risk assessment models are only as reliable as their data inputs. To circumvent the old adage 'garbage in, garbage out', it is necessary to have well-characterized methods that can enhance the capabilities for oocyst enumeration, viability assessment and molecular characterization. Even though USEPA Methods 1622/23 are currently the sole standards in the US for regulatory monitoring of Cryptosporidium under the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR), considerable progress has been made in detection procedures that incorporate specific DNA based targets to enhance the specificity of the detection procedures and provide opportunities for oocyst speciation. The following sections provide an abbreviated overview of some sampling and detection procedures.

Selected methods:

  • USEPA Method 1622 for detection of C. parvum and Method 1623 for detection of G. lamblia and C. parvum are based on filtration (or centrifugation) of 10 L – 50 L raw water samples followed by elution, concentration, immunomagnetic separation (IMS) and immunofluorescence antibody based detection with microscopic enumeration (US EPA 1998 and US EPA 1999). The Performance Based Measurement Systems (PBMS) framework has been used to characterize these methods in multi-laboratory round-robin validations, and recovery rates can depend on the turbidity of the water samples (Bukhari et al. 1998; Di Giovanni et al. 1999). To utilize PBMS, several performance parameters must be characterized (i.e. method precision, bias, performance range, interferences, and matrix applicability). Provided a modification to the original method demonstrates performance equivalency (under the PBMS) to the original Method 1622/23 performance specifications, it is acceptable to replace the original step with the alternative, following approved validation protocols. A number of manufacturers have taken advantage of the PBMS framework to demonstrate equivalency for their procedures. Some advances to the original Method 1622 and Method 1623 have included the use of alternative filtration devices for collection of larger sample volumes (i.e. increased sample volume from 10 L to 50 L) or the use of continuous flow centrifugation as an alternative to sample filtration. While Method 1622 and Method 1623 are a substantial improvement on the methods employed during the ICR program, they do not provide information regarding oocyst species, viability, or infectivity.
  • The Portable Continuous Flow Centrifuge (PCFC), manufactured by Haemonetics, Inc. (Braintree, MA), has been extensively modified and optimized by Zuckerman et al. 1999 for collection of pathogens from various environmental matrices (raw water, finished water, wastewater effluents). The PCFC is a USEPA approved alternative sampling device for Method 1623. The unit is a rugged, self-contained system that can fit into a suitcase and can achieve a maximum centrifugal force of 4,300 × g. The basic operational principles have been described earlier (Zuckerman et al. 1999; Zuckerman and Tzipori 2006). Briefly, disposable inlet tubing is used to pump samples into a disposable centrifuge bowl, with outlet tubing from the bowl carrying the supernatant to drain. After sampling, flow from the inlet port is terminated while the PCFC continues centrifugation (approx. 10 sec) of the sample to reduce the volume of the residue inside the bowl to <250 mL. The bowl is then removed, both the inlet/outlet ports are capped, and bowls containing sample concentrates can be shipped for analysis. As new bowls and tubing are used for each sample collection, contamination is avoided. Unlike filtration, which is prone to clogging, particularly with turbid samples, a major advantage of the PCFC is the ability to concentrate large volumes (up to 1,000 L) of turbid samples.
  • Ultrafiltration membranes are capable of retaining molecules in the 10-100 kDa range and may be used for simultaneous concentration of viruses, bacteria and parasites. In one study, high C. parvum doses (i.e. approx. 6 × 10⁵ oocysts) were spiked into 100 L tap water samples containing 0.01% wt/vol sodium polyphosphate. This was followed by ultrafiltration using pre-treated (500 mL of 5% calf serum recirculated for 5 min) single-use Fresenius F200NR polysulfone hollow-fiber ultrafilters with a molecular mass cut-off of 30 kDa. For eight geographically dispersed sites, C. parvum recoveries for the ultrafiltration step alone ranged from 81-98% (Hill et al. 2009). Inclusion of an IMS step to further concentrate oocysts and separate them from contaminating debris also yielded high recoveries.
  • Flow cytometry has been used as an alternative to microscopy for detection/enumeration of antibody-labeled Cryptosporidium oocysts since the early 1990s. While flow cytometers are moderately sensitive for oocyst enumeration, their strength is their ability to yield highly reproducible oocyst counts with mid-level (i.e. around 100 organisms) spike doses. In the United States, flow cytometry sorted oocyst preparations have become a standard for preparation of known inocula for environmental spiking studies to demonstrate method performance. For environmental samples containing contaminating debris, flow cytometers can be prone to clogging; however, this may be reduced with the use of IMS technology to specifically capture and purify oocysts prior to flow cytometry. Where fluorescence-conjugated antibody labeled oocyst and cyst populations have been sorted with flow cytometry, it has been shown that organisms could be successfully distinguished from background debris in the scatter plots (Hsu et al. 2005).
  • IMS-PCR: IMS-captured oocysts, when subjected to PCR with primers targeting the 18S rRNA genes, allowed detection of spike doses as low as 5 oocysts in natural waters with low turbidities. To ensure the PCR amplifications are specific, it is prudent to perform confirmatory steps on the PCR product. These can include amplicon analysis using restriction fragment length polymorphism (RFLP) or DNA sequencing of the PCR product (Sturbaum et al. 2002). An immunomagnetic separation procedure followed by a nested PCR targeting the cpgp40/15 gene, which encodes the sporozoite membrane glycoproteins, can achieve sensitivity levels of approximately 1 oocyst per reaction. RFLP analysis of the PCR amplicons allows detection of both C. parvum and C. hominis. Investigators seeded 1.2 liters of river water and found that the detection limit was as low as 100 oocysts at low turbidity (around 200 NTU) and 200-600 oocysts at high turbidity (around 1,000 NTU) (Ochiai et al. 2005). Another nested PCR method targeting the 18S rRNA gene of Cryptosporidium spp. was shown to be more sensitive than epifluorescence microscopy and direct PCR. The assay consistently detected concentrations of <5 oocysts per sample (Nichols et al. 2003).
  • RT-PCR can be used to target the messenger RNA (mRNA) produced when oocysts exposed to elevated temperatures (i.e. 45°C, 20 min) are induced to generate heat shock proteins. Oocyst lysis releases mRNA from heat-shocked oocysts, and the recovered mRNA is specifically isolated using oligo(dT)25-coated magnetic beads. RT-PCR amplification of the heat shock protein 70 (hsp70) gene transcript, followed by amplicon detection using polyacrylamide gel electrophoresis (PAGE) and silver staining, can yield a sensitive oocyst detection method. For spiked river and reservoir water, sensitivity may be impacted by matrix effects; however, this interference may be overcome by using Southern hybridization. No signal was obtained from killed oocysts fixed in formalin, suggesting this method would not detect empty shells or oocysts that cannot produce mRNA following heat shocking (i.e. non-viable organisms). An RNA internal positive control was used to avoid false-negative results (Stinear et al. 1996). Another RT-PCR assay allowed simultaneous detection of C. parvum oocysts and Giardia cysts. An RNA internal control helped to assess the efficacy of mRNA isolation and inhibition of amplification. The assay targeted the Cryptosporidium hsp70 mRNA and the Giardia giardin mRNA. RNA (after heat-shocking the sample to maximize hsp70 transcript levels) was isolated using oligo(dT)25 magnetic beads. Water volumes of up to 1,500 liters were concentrated using fiberglass cartridge filters for detection of viable organisms. Amplification products were visualized using PAGE and silver staining. The assay could detect a single viable cyst (or oocyst) spiked into 100 µL of creek or river water concentrate (Kaucner and Stinear 1998).
  • Real-time or quantitative (q)PCR is a user-friendly approach for DNA amplification and simultaneous quantification in real time. The gel-based detection of DNA amplicons performed following conventional PCR can be avoided in qPCR, as fluorogenic dyes that intercalate with double-stranded DNA can be used for amplicon detection. Primers targeting the Cryptosporidium Outer Wall Protein (COWP) gene have been used to detect DNA concentrations over 7 orders of magnitude, with a low-end detection limit equivalent to DNA from one oocyst per PCR reaction. The detection threshold in sewage samples was around 100 oocysts (using 1/100 of the volume of the extracted DNA as template). This assay has been successfully used in a multiplex format, also detecting G. lamblia, without affecting sensitivity (Guy et al. 2003). This qPCR method was combined with a low-cost, IMS-free DNA extraction method for detection of C. parvum and G. lamblia (Anceno et al. 2007). An earlier qPCR assay based on detection of 18S rRNA genes in stool samples had a detection limit of 5 oocysts per reaction using two probes. Melting curve analysis allowed differentiation of 5 common pathogenic Cryptosporidium species (Limor JR 2002). A TaqMan assay was reported to detect a C. parvum-specific DNA sequence over six orders of magnitude with a sensitivity threshold of 6 oocysts per reaction (Fontaine and Guillot 2002).
  • Cell Culture (CC)-PCR and CC-qPCR: Samples containing oocysts are inoculated onto the surface of the chosen cell monolayers and incubated (48-72 h) at 37°C. During this time infectious oocysts excyst to release sporozoites, which can invade susceptible monolayers where they begin the asexual phases of their life cycle. Excess sample debris, oocyst shells and intact oocysts that did not undergo excystation are all removed by repeated washing of the monolayers, and DNA is extracted from the cell cultures and tested for the presence of C. parvum target genes (e.g., 18S rRNA or hsp70 genes) (LeChevallier et al. 2003; Keegan et al. 2003; Aboytes et al. 2004). Because this method incorporates two amplification stages (i.e. cell culture and PCR), it is not surprising that the detection limits are generally low (i.e. 1-5 oocysts). A flow diagram of a typical CC-PCR procedure for environmentally derived oocysts is shown in Figure 7. Where qPCR was used, a detection limit of approximately 10 infectious C. parvum oocysts was reported, whereas heat-inactivated oocysts yielded no signal (Keegan et al. 2003). In much the same way, others have used infectious oocyst amplification in cell culture followed by enzyme-linked immunosorbent assay (ELISA) based detection (Woods et al. 1995) or reverse transcriptase (RT)-PCR to amplify mRNA generated from the hsp70 gene (Rochelle et al. 1997).
  • CryptoPMA-PCR is based on treatment of water samples with propidium monoazide (PMA) prior to DNA extraction and PCR analysis. PMA selectively enters oocysts with compromised outer walls (i.e. dead oocysts) and binds to the DNA within each sporozoite, thus preventing amplification by procedures such as PCR. In contrast, oocysts that have structurally intact outer walls exclude PMA, and their DNA can readily be amplified by PCR. This method is useful for differentiating intact (potentially viable) oocysts from those that have lost their structural integrity (Brescia et al. 2009).
  • IMS-Nucleic Acid Sequence Based Amplification (NASBA) is an alternative to conventional PCR based DNA amplification procedures. NASBA is a homogeneous, isothermal amplification process that uses three enzymes (reverse transcriptase, RNase H and T7 RNA polymerase) and two target-specific primers, one of which carries the T7 promoter sequence. NASBA can achieve a 10⁸-fold amplification of the target DNA or RNA very quickly (i.e. within 90 min) and, unlike PCR, does not require equipment such as a thermal cycler. Using the NASBA approach, the CryptoDetect™ commercial test was developed by Innovative Biotechnologies International, Inc. The method entails IMS separation and heat-shock of the captured oocysts, followed by mRNA isolation and NASBA amplification of the heat-shock mRNA. The assay can potentially yield information on the viability of the recovered oocysts, based on the assumption that only viable oocysts respond to heat shock by producing mRNA for heat-shock proteins. The procedure reportedly has a theoretical detection limit of 1-5 viable oocysts and can be completed in 4-6 hours. While NASBA is a user-friendly, field-adaptable technology, independent testing of CryptoDetect™ remains to be conducted.
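As background to the qPCR approaches above, quantification over several orders of magnitude rests on a standard curve of cycle threshold (Ct) versus log10 template amount; the following is a generic sketch with hypothetical dilution-series values, not data from any cited assay:

```python
# Generic qPCR standard-curve arithmetic: fit Ct = slope*log10(copies)
# + intercept to a dilution series, then derive amplification efficiency.

def fit_line(xs, ys):
    """Ordinary least-squares fit returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical 10-fold dilution series: log10(oocyst equivalents) vs Ct
log_copies = [1, 2, 3, 4, 5]
ct_values = [35.2, 31.9, 28.6, 25.3, 22.0]  # ~ -3.3 Ct per 10-fold dilution

slope, intercept = fit_line(log_copies, ct_values)
efficiency = 10 ** (-1.0 / slope) - 1.0  # 1.0 would mean perfect doubling
print(f"slope {slope:.2f}, efficiency {efficiency:.0%}")
```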


Figure 1:


Cryptosporidium oocysts using Differential Interference Contrast (DIC) microscopy.


Figures 2A and B


Cryptosporidium oocysts stained with fluorochrome-conjugated antibody (left) and 4′,6-diamidino-2-phenylindole (DAPI) (right).


Figures 3A and B:


Concentration of Cryptosporidium oocysts using immunomagnetic separation (IMS).


Figure 4:


This illustration depicts the life cycle of different species of Cryptosporidium, the causal agents of cryptosporidiosis.

This diagram is part of the Public Health Image Library (PHIL) of the Centers for Disease Control and Prevention (CDC). It is in the public domain and thus free of copyright restrictions. We would like to express our appreciation to the CDC for providing these images.

Source: http://phil.cdc.gov/phil/home.asp
Photo ID: 3386
Content provider(s): Centers for Disease Control and Prevention/Alexander J. da Silva and Melanie Moser

Last Updated on Tuesday, 03 April 2012 02:48



