|Thursday, 11 February 2010 00:00|
Protozoan parasites of the genus Cryptosporidium belong to the phylum Apicomplexa, class Sporozoasida and family Cryptosporidiidae. They are often referred to as coccidia. While some coccidia can undergo extra-intestinal development and form tissue cysts (e.g. Sarcocystis, Toxoplasma), others develop in the gastrointestinal or respiratory tract without forming tissue cysts (e.g. Eimeria, Isospora and Cryptosporidium). Like other coccidia, Cryptosporidium was originally thought to be highly host specific, and almost 20 species were named according to the host species from which they were isolated (Tyzzer EE 1912; Levine ND 1984; Current et al. 1986; Fayer and Ungar 1986). Later, cross-transmission studies with mammalian isolates of Cryptosporidium indicated low host specificity, which first prompted Tzipori et al. 1980 to consider Cryptosporidium a single-species genus and then led Levine ND 1984 to suggest that only four species may be valid. The number of valid species was subsequently increased to six, with C. parvum causing respiratory and intestinal infections and C. muris causing stomach infections. Respiratory and intestinal cryptosporidiosis in birds was attributed to C. baileyi and C. meleagridis, whereas C. serpentis infected reptiles and C. nasorum infected fish.
Over the last 15 years, increased adoption of molecular characterization procedures has led to considerable reorganization of Cryptosporidium taxonomy, and two species (Cryptosporidium hominis and Cryptosporidium parvum) have emerged as the most significant from a public health perspective and, as such, of greatest interest to the water industry. Of these species, C. parvum infects both humans and animals, whereas C. hominis infects only humans (Xiao et al. 2004). Additional Cryptosporidium species capable of infecting humans and causing disease (mainly in children and immunocompromised persons) include C. felis, C. muris, C. canis, C. suis and C. meleagridis (Xiao et al. 2001; Muthusamy et al. 2006; Pieniazek et al. 1999). Fayer R 2009 reviewed the taxonomic status of Cryptosporidium and highlighted that 12 valid species infect mammals (C. hominis, C. parvum, C. muris, C. wrairi, C. felis, C. andersoni, C. canis, C. suis, C. bovis, C. fayeri, C. ryanae, C. macropodum), three species infect birds (C. meleagridis, C. baileyi and C. galli) and three species infect amphibians and reptiles (C. serpentis, C. varanii and C. fragile). The taxonomy of species infecting fish is currently somewhat unclear, and a number of other Cryptosporidium genotypes exist that may be recognized as valid species as further evidence accrues.
In humans, Cryptosporidium infections are acquired by ingestion or inhalation (i.e. fecal-oral route, foodborne, waterborne, etc.) of the parasite’s transmissive stage, which is known as the oocyst. The time between acquisition of infection and manifestation of symptoms (the pre-patent period) depends on various factors (i.e. host susceptibility, virulence of the infecting strain of oocysts, oocyst infectivity, etc.) but may range from 5 to 28 days (mean 7.2 days) (Anderson et al. 1982; Current et al. 1983; Højlyng et al. 1987). Cryptosporidium infections are most pathogenic in neonatal hosts; in humans, however, infections have been reported in individuals ranging from a 3-day-old infant, born to a mother with cryptosporidiosis, to a 95-year-old. Disease often manifests itself with profuse, watery diarrhea, abdominal cramping, nausea, vomiting and low-grade fever. In well-nourished, immunocompetent individuals, cryptosporidiosis may last 2-12 days but is usually self-limiting. Occasionally, infections may continue for two weeks or more and may require fluid replacement therapy. Presently, no effective chemotherapeutic treatment is available, and the duration of disease depends on the patient’s immune status. In patients with congenital or acquired immune deficiencies or in malnourished individuals, infection can be considerably prolonged, resulting in malabsorption, severe dehydration and even death.
During the course of infection, Cryptosporidium undergoes distinct asexual and sexual phases of its life cycle in the enterocytes of infected hosts. This culminates in the production of oocysts that are shed in the feces of infected hosts in large numbers (i.e. 10⁵-10⁷ oocysts per gram of feces). The voided oocysts each contain four mononucleated sporozoites and can disperse through the environment by various mechanisms (i.e. surface run-off and leaching through the soil profile) to contaminate surface waters. Figure 2 shows Cryptosporidium oocysts labeled with fluorescein isothiocyanate-conjugated monoclonal antibody (green outer wall) and an oocyst labeled with 4′,6′-diamidino-2-phenylindole (DAPI), which stains the four nuclei in intact oocysts sky blue. When oocyst-contaminated surface water is used for drinking, outbreaks of human disease (cryptosporidiosis) can occur. The risk of waterborne human disease transmission is exacerbated by the fact that oocysts have a low infectious dose (see later), are able to survive in the environment for long periods of time, and are not readily inactivated by chemical disinfection, particularly chlorine at the levels normally employed in water treatment processes (see later).
To better understand the occurrence of Cryptosporidium, various surveys have been conducted worldwide (LeChevallier et al. 1991; Rouse et al. 2001; Hashimoto et al. 2001; Drinking Water Inspectorate 2003; LeChevallier et al. 2003). In one study, infectious oocysts were detected in 22 of 82 (26.8%) surface water treatment plants within the US, potentially posing an annual risk of 9-119 infections per 10,000 people via the waterborne route (Aboytes et al. 2004). Waterborne outbreaks have been reported in the United States, the United Kingdom and Japan, with the most notable occurring in Milwaukee in 1993, where an estimated 405,000 individuals were impacted, including 100 deaths (Mac Kenzie et al. 1994). The estimates of infection were based on retrospective surveys of self-reported diarrhea. Such estimates can be prone to recall bias, and some have suggested that the actual number of infected individuals in the Milwaukee outbreak was over-estimated by 10- to 100-fold (Hunter and Syed 2001). Irrespective of the actual numbers of individuals impacted, the magnitude of this outbreak is underscored by its economic burden. Improvements to the water treatment plant alone incurred estimated costs of $90 million. Additionally, costs associated with medical expenses and loss of productivity were approximated at $96 million using the original estimates of infected people, with approximately two thirds attributable to productivity losses (Corso et al. 2003). Where adequate monitoring or screening is implemented, it is not uncommon to find evidence of waterborne human cryptosporidiosis. For example, data from 1992-2003 indicated 89 registered waterborne outbreaks in England and Wales, of which 61 were attributed to Cryptosporidium and affected more than 4,300 people (Smith et al. 2006).
Cryptosporidium oocysts contaminating surface water, and subsequently responsible for waterborne human cryptosporidiosis, can originate from many contributing sources (i.e. wild or domestic animals and untreated human sewage). Cattle and sewage discharges can contain large numbers of oocysts, and surface waters receiving such discharges have been reported to have 10- to 100-fold higher oocyst concentrations than pristine waters (Bagley et al. 1998; Rose JB 2002). In some cases, surface waters impacted by sewage and animal feces have been reported to contain 100 Cryptosporidium oocysts per liter of water (Lisle and Rose 1995), and others have reported sewage discharges to contain peak concentrations ranging from 10⁴ to 3.9×10⁵ oocysts per liter (Medema and Schijven 2001). Atherholt et al. 1998 found a positive correlation between precipitation rates and oocyst levels in river waters, which may point to heavy rainfall facilitating oocyst transport from contaminated land into receiving waters. Once in a watershed, oocysts show considerable resistance to adverse environmental conditions and may survive for weeks or months (Rose JB 2002).
Following the Milwaukee outbreak, there was heightened awareness of Cryptosporidium in the United States, and an 18-month monitoring program under the Information Collection Rule (ICR) was implemented between July 1997 and December 1998. Monthly samples were collected and analyzed from the raw water at the plant intake of 296 water systems (representing 500 treatment plants). Of 5,838 samples analyzed under the ICR, only 7% were Cryptosporidium positive, with a mean oocyst concentration of 0.067 oocysts per L (Messner and Wolpert 2002). Calculations based on total oocyst count divided by total volume analyzed, or on median concentrations, both rendered similar oocyst concentrations (i.e. 0.02 oocysts per liter). Poor and highly variable overall method recovery efficiencies (i.e. 12% ± 11%, range 1%-30%; n=140; Scheller et al. 2002) were probable reasons for the high number of non-detects and for the high location-to-location variability where oocysts were detected. The ICR-Supplemental Surveys (SS), using USEPA Method 1622, helped increase the median volume analyzed per sample and may have contributed to lowering location-based variability. Monitoring of medium-sized systems indicated more frequent occurrence of higher Cryptosporidium concentrations than monitoring at larger systems. Despite this, the median Cryptosporidium concentrations in the ICR-SS were similar to those obtained from ICR data (i.e. 0.02 oocysts per liter). In addition to the 18-month monitoring study, the ICR-SS and other environmental monitoring supported the notion that Cryptosporidium concentrations can vary widely but that oocysts are likely to be ubiquitous in surface waters. The actual levels detected are likely to be influenced by inconsistencies in method performance and may also be attributed to oocyst burdens from the contributing sources (i.e. sewage and/or animal manure). Rose et al. 1997 cited seven raw water monitoring studies in North America with the average percentage of oocyst-positive samples ranging from 9.1% to 100% (oocyst concentration range 0-240 oocysts per L). Lower occurrence (3.8% to 33.3%) and oocyst concentrations (0.001 to 0.48 oocysts per L) were noted in drinking water samples (n=158).
Following substantial method refinements and multi-laboratory validations, US Environmental Protection Agency (USEPA) Methods 1622/23 emerged in the new millennium with the promise of more reliable monitoring data for Cryptosporidium and Giardia. Two studies extensively used Method 1623 (in conjunction with cell culture-polymerase chain reaction, CC-PCR) to establish Cryptosporidium occurrence and infectivity in both surface water (LeChevallier et al. 2003) and finished drinking water (Aboytes et al. 2004). Data from these studies indicated seasonality in oocyst occurrence, with peak levels occurring most notably in the spring, but also in the fall months. The spring peaks in oocyst levels may be associated with rainfall (Atherholt et al. 1998; Payment et al. 2000). During spring there are increased numbers of newborn calves and lambs, and Cryptosporidium infections in such neonatal animals can be common. It has been noted that on some farms >99% of the neonatal calves with diarrhea were shedding large numbers of oocysts (Bukhari Z 1995). To understand the significance of oocysts for disease transmission, it is important to establish whether they are alive or dead. Viability patterns of oocysts shed over the course of infection in experimentally infected calves, lambs and mice have shown that large proportions (i.e. 30-50%) of oocysts were dead (Bukhari and Smith 1997). Non-viable oocysts have no role in further disease transmission to other susceptible hosts. As Cryptosporidium is an obligate parasite, it cannot multiply outside of a susceptible host, and oocyst viability degrades further with time and following exposure to harsh environmental conditions (Bagley et al. 1998). Where viable (and infectious) oocysts contaminate surface water, consumption of untreated or inadequately treated (improper treatment or process failures) water can lead to waterborne human cryptosporidiosis.
To help reduce the risk of waterborne cryptosporidiosis, many large water systems in the US participate in a voluntary program known as the “Partnership for Safe Drinking Water”. Through review and optimization of treatment processes, the goal of this program is to consistently produce filtered drinking water with a turbidity level of less than 0.1 NTU, which is more stringent than existing USEPA regulations.
To better understand the risk of waterborne human cryptosporidiosis, it is necessary to understand the frequency with which Cryptosporidium oocysts can be detected in various untreated water sources, the infectivity of the recovered oocysts, and how the various drinking water treatment processes can affect physical removal of oocysts or their inactivation. The USEPA has published and implemented the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR, or LT2 rule) to supplement existing regulations. The LT2ESWTR requires surface water systems to monitor raw water for Cryptosporidium and, based on a rolling 12-month average, systems are categorized into different concentration-dependent bins. Systems exceeding 0.075 oocysts per L require additional treatment with approved removal/inactivation technologies effective against Cryptosporidium (USEPA 2000). Detailed information can be found at http://www.epa.gov/OGWDW/disinfection/lt2/. The first round of the LT2 surveys began with the largest systems (serving at least 100,000 people) in October 2006, and LT2 monitoring was staggered such that the smallest systems (serving <10,000 people) were scheduled for monitoring in October 2008. Following the LT2 surveys, systems need to comply with any additional treatment requirements within three years. A second round of monitoring is scheduled six years after completion of the initial round to determine whether source water conditions have changed significantly.
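The binning logic described above can be sketched as follows. This is an illustrative implementation only: the function name and input format are ours, and it assumes the LT2 bin boundaries for filtered systems (Bin 1 &lt; 0.075, Bin 2 &lt; 1.0, Bin 3 &lt; 3.0, Bin 4 ≥ 3.0 oocysts/L) applied to the maximum running 12-month average.

```python
def lt2_bin(monthly_concentrations):
    """Classify a filtered system into an LT2 bin from monthly source-water
    Cryptosporidium concentrations (oocysts/L).

    Uses the maximum rolling 12-month average of the monitoring results.
    Returns (bin_number, max_annual_average). Illustrative sketch only.
    """
    if len(monthly_concentrations) < 12:
        raise ValueError("need at least 12 monthly samples")
    # All rolling 12-month averages across the monitoring period
    averages = [
        sum(monthly_concentrations[i:i + 12]) / 12
        for i in range(len(monthly_concentrations) - 11)
    ]
    maa = max(averages)
    if maa < 0.075:
        return 1, maa   # Bin 1: no additional treatment required
    elif maa < 1.0:
        return 2, maa
    elif maa < 3.0:
        return 3, maa
    else:
        return 4, maa
```

For example, a system averaging 0.05 oocysts/L over a year falls in Bin 1, while one averaging 0.2 oocysts/L falls in Bin 2 and would need additional Cryptosporidium treatment credit.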
Selected occurrence studies:
Numerous occurrence studies have been conducted worldwide, and an exhaustive review of them is outside the scope of this section. The following examples are intended to highlight the variation in oocyst occurrence and to provide perspective on the various factors that may influence oocyst concentrations in water sources drawn on for treatment for human consumption.
SOURCE: Adapted from Carmena et al. 2007.
CWTF: Conventional water treatment facilities with coagulation, flocculation, sedimentation, filtration, and disinfection
In the United States, chlorine is used as a primary disinfectant in 63 percent of surface water treatment plants, and as a post-disinfectant in more than 93 percent of all surface water treatment plants (US EPA 1997). The occurrence of numerous waterborne outbreaks of human cryptosporidiosis has demonstrated that normal chlorination of water (i.e. free chlorine levels between 0.05 and 0.4 mg/L) is inadequate for inactivation of Cryptosporidium oocysts. Experimental evidence supports this notion. For calf-derived oocysts, free chlorine concentrations of 80 mg/L for 120 min reduced oocyst viability to zero, and a 2-log reduction in infectivity was noted after 90 min of exposure (Sterling et al. 1989). Such high chlorine concentrations are impractical for various reasons (i.e. unacceptable taste, odor and toxicity, cost, pipe corrosion and disinfection by-products).
Use of ozone as an alternative disinfectant was evaluated by Peeters et al. 1989, who reported that concentrations between 1.11 and 2.25 mg/L, at contact times between 6 and 8 min, were required to achieve 90-98% inactivation of oocysts. Where inactivation of C. parvum and Giardia lamblia was investigated with ozone and chlorine dioxide under the same experimental conditions, C. parvum oocysts were 30 times more resistant to ozone and 14 times more resistant to chlorine dioxide (Korich et al. 1990). In addition, several other studies have evaluated the efficacy of ozone. Finch et al. 1993 reported that 2.0 mg/L ozone at 7°C and 22°C resulted in 99.9% and 99.997% inactivation of oocysts, respectively; however, use of elevated ozone concentrations can lead to the production of bromate, which is a potential carcinogen.
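Disinfection results are reported interchangeably as percent inactivation (as above) or as log reductions (as in later sections). A minimal helper for converting between the two, included here for clarity:

```python
import math

def log_reduction(percent_inactivation):
    """Convert percent inactivation to a log10 reduction.

    log reduction = -log10(surviving fraction), so 99.9% inactivation
    corresponds to a 3-log reduction and 99.997% to roughly 4.5-log.
    """
    surviving_fraction = 1.0 - percent_inactivation / 100.0
    if surviving_fraction <= 0:
        raise ValueError("100% inactivation has no finite log reduction")
    return -math.log10(surviving_fraction)
```

Applied to the Finch et al. 1993 figures, 99.9% corresponds to a 3-log reduction and 99.997% to approximately a 4.5-log reduction.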
Lorenzo-Lorenzo et al. 1993 examined the impact of UV light on Cryptosporidium oocysts; however, due to inadequate description of their disinfection experiments, the effectiveness of UV light for inactivating Cryptosporidium oocysts went unrecognized. Two years later, a unit known as the Cryptosporidium Inactivation Device (CID), which delivered a total UV dose of 8,748 mJ/cm² from low-pressure lamps, was examined, and 2- to 3-log inactivation of oocysts was reported using fluorogenic vital dye assays (4′,6′-diamidino-2-phenylindole and propidium iodide) and in vitro excystation (Campbell et al. 1995). Following this, Clancy et al. 1998 confirmed the findings of Campbell et al. 1995; however, the apparent requirement for high UV doses to achieve significant oocyst inactivation continued to present an obstacle to regulatory and water industry support for implementation of this technology.
Between 1998 and 1999, comparative bench-scale and demonstration-scale (215 gpm) Cryptosporidium oocyst inactivation studies were performed that utilized medium-pressure UV lamps and examined oocyst inactivation using in vitro viability assays (DAPI/PI and in vitro excystation) as well as mouse infectivity assays (Bukhari et al. 1999). These studies conclusively demonstrated the effectiveness of low UV doses for Cryptosporidium inactivation in finished water. The data rapidly gained attention from both regulatory agencies and the water industry and helped instigate the adoption of UV disinfection in the United States water industry as a means to effectively inactivate Cryptosporidium oocysts. In addition to demonstrating the effectiveness of low levels of UV light for oocyst inactivation, Bukhari et al. 1999 also identified that in vitro viability assays under-estimated oocyst inactivation following exposure to either UV light or ozone (Bukhari et al. 2000) and that mouse infectivity assays, albeit cumbersome procedures, needed to be the method of choice for future studies of this nature. Following these initial studies, a number of other studies have confirmed the effectiveness of UV light for inactivating Cryptosporidium oocysts (Bukhari and LeChevallier 2004) as well as Giardia cysts (Craik et al. 2001).
UV disinfection has also been accepted by the USEPA as one of the treatment options in the microbial toolbox intended for selecting appropriate treatment for systems exceeding 0.075 oocysts per L following the LT2 surveys. The LT2 rule has several requirements related to the use of UV disinfection, including ascertaining UV doses for different levels of inactivation credit, performance validation testing of UV reactors, monitoring, reporting, and off-specification operation. The USEPA has developed a UV Disinfection Guidance Manual (2006), which is intended to help utilities navigate the process of implementing UV disinfection to meet future compliance goals.
Selected disinfection studies:
Establishing whether oocysts are dead or alive is critical to understanding their ability to propagate disease in susceptible hosts. This may be achieved with viability or infectivity assays. Ideally, environmental monitoring should extend quantitative measurements to include a viability/infectivity component in the analysis, to provide accurate risk assessments for the recovered oocysts. Traditionally, oocyst viability was measured by in vitro excystation, a laboratory-based procedure that mimics some of the anticipated biological triggers (i.e. elevated temperature, carbon dioxide, stomach acid, bile salts, etc.) and has been used extensively to determine whether oocysts are dead or alive (Reducker and Speer 1985; Woodmansee 1987; Robertson et al. 1993).
The rationale is that dead oocysts will not respond to biological triggers and, as a result, will not release their sporozoites. Calculating the ratio of oocysts that have released their sporozoites to those that have not responded (i.e. intact oocysts) provides an excystation ratio for a given population of oocysts. In the environment, especially as oocysts age, the sporozoites inside the oocyst can degrade under various environmental pressures. To get an accurate representation of oocysts that may have the potential to cause disease, it is important to differentiate these empty oocyst shells from oocysts that have actively undergone in vitro excystation. Typically this necessitates examining a sub-sample of a given oocyst population before in vitro excystation to quantify empty shells. Excystation is then performed to enumerate excysted shells and intact oocysts. The pre-existing empty shells are subtracted from the excysted shells, and the ratio of this adjusted number to intact oocysts is used to calculate percentage excystation, or oocyst viability. From the mid-1980s until 1998-1999, in vitro excystation was deemed a ‘silver’ standard for determining whether oocysts were dead or alive. Neonatal mouse infectivity was deemed the ‘gold’ standard, and a number of studies reported excystation and mouse infectivity to correlate well (Korich et al. 1990; Korich 1993; Finch et al. 1993).
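The counting arithmetic described above can be sketched as follows. The function name is ours, and the sketch assumes counts are taken from equal-sized sub-samples of the same oocyst population; it is an illustration of the correction, not a published protocol.

```python
def percent_excystation(empty_before, empty_after, intact_after):
    """Percent excystation corrected for pre-existing empty shells.

    empty_before: empty shells counted in a sub-sample BEFORE excystation
    empty_after:  empty/excysted shells counted AFTER excystation
    intact_after: intact oocysts remaining AFTER excystation
    Counts are assumed to come from equal-sized sub-samples (our assumption).
    """
    # Shells actually produced by excystation, after subtracting
    # the shells that were already empty before the assay.
    excysted = max(empty_after - empty_before, 0)
    responders_and_intact = excysted + intact_after
    if responders_and_intact == 0:
        return 0.0
    return 100.0 * excysted / responders_and_intact
```

For example, with 10 empty shells counted beforehand, 60 shells after excystation and 50 intact oocysts remaining, the corrected excystation is 50 / (50 + 50) = 50%, rather than the 60/110 ≈ 55% an uncorrected count would give.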
Both excystation and infectivity assays required large numbers of oocysts to establish oocyst viability or infectivity, which precluded their routine use for the low numbers of environmentally derived oocysts and limited these assays to experimental studies. Viability assays utilizing the fluorogenic vital dyes DAPI and PI (Campbell et al. 1992), SYTO-9 and SYTO-59 (Neumann et al. 2000) and fluorescence in situ hybridization (Smith et al. 2004) were developed to measure whether individual oocysts are dead or alive and were utilized for examining the viability of environmentally derived oocysts (Robertson et al. 1992) as well as the efficiency of various disinfectants for oocyst inactivation (Bukhari et al. 1999; Bukhari et al. 2000). Although these in vitro viability assays offered several advantages over traditional animal infectivity models, in that they required significantly less time to produce results, were easy to use and were relatively inexpensive, it was later demonstrated, particularly following UV exposure, that use of some in vitro viability assays could lead to grossly erroneous results (Bukhari et al. 1999). As a result, it became clear that demonstration of intracellular development was critical in determining whether oocysts are dead or alive (Bukhari et al. 1999). This prompted studies to evaluate the suitability of cell-culture procedures as a less expensive and more user-friendly alternative to neonatal mouse infectivity assays.
Since Current and Haynes 1984 first described in vitro cultivation of Cryptosporidium oocysts in human fetal lung cells, various investigators have utilized different cell lines and immunological or molecular detection procedures to characterize in vitro infectivity of C. parvum oocysts; however, no single protocol has been widely accepted or independently validated. Based on a number of studies (DiGiovanni et al. 1999; Rochelle et al. 2004; Rochelle et al. 2002; Slifko et al. 1997; Slifko et al. 1999; Upton et al. 1994), some consensus was reached on the human ileocecal adenocarcinoma cell line (HCT-8) for hosting C. parvum oocyst infections (Figure 3); however, variations in oocyst pre-treatment, oocyst inoculation, incubation times for infectivity and detection procedures all presented obstacles to directly comparing oocyst survival or inactivation data generated with these various in vitro infectivity protocols. Several studies have demonstrated that oocyst isolation procedures (i.e. USEPA Methods 1622/1623) can successfully be combined with cell culture for determining infectious oocyst occurrence in raw water (LeChevallier et al. 2003), finished water (Aboytes et al. 2004) and wastewater effluents (Bukhari Z 2010).
To utilize Cryptosporidium occurrence data (from environmental monitoring programs) for microbial risk assessments, it is imperative to use standardized or well-characterized protocols for oocyst enumeration and infectivity. With the various method permutations, comparison of oocyst infectivity data from different in vitro infectivity protocols is not readily feasible. Recognizing that independent validations, which assess both the reproducibility and the predictive capacity of a method, are key to developing oocyst infectivity standards, a study by an international consortium (consisting of the American Water Works Association Research Foundation; the Drinking Water Inspectorate, UK; KIWA, Netherlands; United Kingdom Water Industry Research Ltd; the United States Environmental Protection Agency; and the Water Services Association, Australia) examined the sensitivity and reproducibility of in vitro cell culture infectivity assays using varying oocyst inocula of unknown (‘blind’) infectivity (Bukhari and LeChevallier 2003; Bukhari et al. 2007). The optimized cell culture-based procedure subjected to these evaluations incorporated oocyst pre-acidification and exposure to bile salts immediately preceding inoculation onto HCT-8 monolayers, followed by incubation at 37°C for 72 h and detection using immunofluorescence (IFA) to reveal clusters of developmental or endogenous stages (Figure 4). An advantage of this cell culture-IFA procedure, compared to other existing procedures, is that it takes advantage of various oocyst pre-treatment triggers (i.e. acid treatment and exposure to bile). These modifications are intended to closely simulate the conditions oocysts encounter when ingested by a susceptible host. Additionally, the assay allows identification of non-infectious oocysts (Bukhari and LeChevallier 2003), which undergo excystation and invasion of HCT-8 cells but fail to multiply intracellularly, generating ‘pin-points’ of invasion (Figure 5).
Where qualitative or quantitative polymerase chain reaction (PCR) methods are used for detection of infection in the host cells, PCR-based assays cannot differentiate invasive stages from those undergoing active multiplication, which can lead to false positives and an over-estimation of oocyst infectivity (Bukhari and LeChevallier 2003). In the Bukhari et al. 2007 study, which performed ‘blind’ trials using cell culture-IFA, it was determined that this infectivity assay was highly effective for predicting the infectivity of oocyst inocula of unknown infectivity, with a high degree of correlation (r² = 0.89) between the estimated and actual numbers of infectious oocysts.
Cryptosporidium is an obligate parasite and cannot multiply outside a susceptible host. This means oocysts in the environment are likely to degrade with time and lose their infectivity unless they are ingested by another susceptible host and complete their life cycle. In addition to ageing, various environmental pressures contribute to a reduction in oocyst viability and infectivity. Robertson et al. 1992 reported that after exposure to -22°C for 6 days, 10% of C. parvum oocysts survived, a figure reduced to 2% after 30 days. Fayer R 1994 reported that oocysts exposed to temperatures up to 67.5°C (for 1 min) remained capable of infecting neonatal BALB/c mice. Oocyst survival in the environment has also been reported to depend on the ambient temperature. Oocysts incubated in reagent-grade water or in reservoir water at 4°C and 15°C remained infective for 12 weeks, whereas at 20°C and 25°C a 4-log10 reduction was observed over 12- and 8-week incubations, respectively (King et al. 2005). It has also been suggested that higher ambient temperatures lead to elevated metabolic activity inside oocysts and depletion of amylopectin energy reserves; amylopectin below a critical level may prevent successful infection (Fayer et al. 1998). Other environmental parameters influencing oocyst inactivation and survival were recently reviewed (King and Monis 2007).
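Observations such as those of King et al. 2005 can be converted into inactivation rate constants if one assumes simple first-order (exponential) decay. That kinetic assumption is ours, not the authors'; the sketch below simply back-calculates the implied rate from an observed log reduction over time.

```python
import math

def decay_rate_per_week(log10_reduction, weeks):
    """First-order inactivation rate constant (per week) implied by an
    observed log10 reduction over a given time.

    Assumes N(t) = N0 * exp(-k * t), so k = log10_reduction * ln(10) / t.
    """
    return log10_reduction * math.log(10) / weeks

# King et al. 2005 observations: ~4-log10 reduction in 12 weeks at 20 °C
# and in 8 weeks at 25 °C (first-order kinetics is our assumption).
k20 = decay_rate_per_week(4, 12)   # about 0.77 per week at 20 °C
k25 = decay_rate_per_week(4, 8)    # about 1.15 per week at 25 °C
```

The roughly 50% higher implied rate at 25°C versus 20°C is consistent with the temperature dependence of oocyst survival discussed above.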
Selected survival studies:
* Standard deviation.
Oocyst viability (determined by in vitro excystation) was similar after 8 weeks of storage at 4°C or 10°C. In vivo infectivity yielded similar results to in vitro excystation following storage at 10°C; however, it indicated considerably higher oocyst infectivity following storage at 4°C. These findings further support the view that in vitro viability data should be interpreted with caution, especially where these data are to be used in risk-assessment-based decision frameworks.
Typically with coccidian parasites, the oocysts voided in the feces of the infected host require a maturation period before they can propagate infection in other susceptible hosts. In contrast, Cryptosporidium oocysts are immediately infectious and capable of causing disease in other susceptible hosts. Over the years, considerable evidence has accumulated suggesting that the minimum infectious dose for Cryptosporidium is likely to be small, a feature shared with other human intestinal protozoa such as Giardia (Rendtorff RC 1979). Among some of the early animal infectivity studies, an inoculum of 100 oocysts was reported to induce 22% infection in mice (Ernest et al. 1987), a dose of 10 oocysts produced an infection in 2 of 2 non-human primate infants (Miller et al. 1990), and 5 oocysts produced clinical signs in gnotobiotic lambs (Blewett et al. 1993).
Using the Iowa strain of C. parvum oocysts (of calf origin), infectivity trials in healthy adult human volunteers (individuals with no serological evidence of previous exposure to Cryptosporidium) indicated 100% infection with >1,000 oocysts and 20% (1 of 5) infection with 30 oocysts (DuPont et al. 1995; Chappell et al. 1996). Based on these studies, the ID50 (the dose necessary to cause infection in 50% of dose recipients) for humans was estimated to be around 132 oocysts. Because the ID50 may be influenced by a number of factors (i.e. oocyst infectivity, virulence, strain variation and host-related factors), it is not surprising that there is considerable variability associated with this number. Others have reported the 50% infective dose in healthy human volunteers to be between approximately 10 and 1,000 oocysts, depending on the C. parvum isolate (Okhuysen et al. 1999; Ochiai et al. 2005).
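Dose-response data of this kind are commonly summarized with the exponential model, in which each ingested oocyst independently causes infection with probability r. The sketch below back-calculates r from the ~132-oocyst ID50 reported above; this value is illustrative, not a fitted parameter from the cited studies.

```python
import math

def infection_probability(dose, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose),
    where r is the per-oocyst probability of initiating infection."""
    return 1.0 - math.exp(-r * dose)

def id50(r):
    """Dose giving 50% infection probability under the exponential model:
    solving 1 - exp(-r * d) = 0.5 gives d = ln(2) / r."""
    return math.log(2) / r

# Illustrative only: r back-calculated from the ~132-oocyst ID50
# reported for the Iowa isolate.
r_iowa = math.log(2) / 132
```

With this r, a 132-oocyst dose gives exactly a 50% infection probability, and a 30-oocyst dose gives roughly 15%, broadly consistent with the 1-of-5 (20%) infection observed at that dose.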
Considering the laboratory-based infectivity data in conjunction with raw and finished water monitoring data, especially during outbreak situations, it has become apparent that very low doses of infectious Cryptosporidium (perhaps even a single oocyst) may be capable of causing disease. Several risk assessments of Cryptosporidium in drinking water have been performed (Perz et al. 1998; Fewtrell et al. 2001; Messner et al. 2001); however, the complexity of risk analysis is increasing as the taxonomic understanding of species and genotypes infectious to humans becomes more detailed and as oocyst viability and/or infectivity data become reliable and routinely gathered during monitoring programs. Except for the finished water studies conducted at American Water (Aboytes et al. 2004), most other risk assessment analyses have used source water monitoring data and adjusted the data with estimates of water treatment process removal/inactivation efficacy for oocysts. These assumptions about treatment plant performance contribute further uncertainties to microbial risk assessment analyses and need to be eliminated in future studies.
Human dose-response data from three C. parvum bovine genotype isolates have been subjected to meta-analysis (Messner et al. 2001) to estimate the infectious dose for an unknown isolate. This is summarized in the table below:
Source: Messner et al. 2001
The daily risk of Cryptosporidium infection is calculated by multiplying the volume of drinking water consumed daily (1.2 liters/day) by the concentration of oocysts and the risk of infection from a single oocyst:

Daily risk = 1.2 L/day × oocyst concentration (oocysts/L) × risk of infection per oocyst
The annual risk is determined by multiplying the daily risk by a factor of 350 (the remaining 15 days of the year account for days when water is consumed from other sources). The USEPA established an acceptable annual risk of microbial infection from drinking water at 10⁻⁴ infections per person (Regli et al. 1999). To achieve the <10⁻⁴ infection per person per year level, infectious Cryptosporidium oocysts would have to be absent in 290,000 liters of drinking water (assuming a 40% recovery efficiency with USEPA-approved methods).
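The daily and annual risk calculation described above can be sketched as follows. The oocyst concentration and per-oocyst risk used in the example are hypothetical placeholders, not measured or regulatory values.

```python
def daily_risk(concentration, r, liters_per_day=1.2):
    """Daily infection risk = daily water intake (L) x oocyst
    concentration (oocysts/L) x per-oocyst infection probability r,
    as described in the text."""
    return liters_per_day * concentration * r

def annual_risk(concentration, r, exposure_days=350):
    """Annual risk approximated as daily risk x 350 exposure days,
    following the simple multiplicative form used in the text."""
    return exposure_days * daily_risk(concentration, r)

# Hypothetical inputs: 1 oocyst per 100,000 L and r = 0.005 per oocyst.
risk = annual_risk(1e-5, 0.005)   # 350 * 1.2 * 1e-5 * 0.005 = 2.1e-5
```

In this hypothetical case the annual risk (2.1 × 10⁻⁵) falls below the 10⁻⁴ USEPA benchmark; a strictly more conservative variant would compound the daily risk as 1 − (1 − daily)³⁵⁰, which converges to the same value at these small risks.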
To understand the risk of waterborne human cryptosporidiosis and to implement appropriate control measures at drinking water treatment plants, it is necessary to understand the occurrence of Cryptosporidium oocysts at the raw water intake and to characterize removal efficiencies at various stages in the treatment train. Monitoring oocyst levels in treated drinking water is also desirable from a microbial risk perspective. Microbial risk assessment models are only as reliable as their data inputs. To circumvent the old adage ‘garbage in, garbage out’, it is necessary to have well-characterized methods that enhance the capabilities for oocyst enumeration, viability assessment and molecular characterization. Even though USEPA Methods 1622/23 are currently the sole standards in the US for regulatory monitoring of Cryptosporidium under the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR), considerable progress has been made in detection procedures that incorporate specific DNA-based targets to enhance the specificity of detection and provide opportunities for oocyst speciation. The following sections provide an abbreviated overview of some sampling and detection procedures.
Cryptosporidium oocysts using Differential Interference Contrast (DIC) microscopy.
Figures 2A and B
Cryptosporidium oocysts stained with fluorochrome-conjugated antibody (left) and 4′,6′-diamidino-2-phenylindole (right).
Figures 3A and B:
Concentration of Cryptosporidium oocysts using immunomagnetic separation (IMS).
This illustration depicts the life cycle of different species of Cryptosporidium, the causal agents of Cryptosporidiosis.
This diagram is part of the Public Health Image Library (PHIL) from the Centers for Disease Control and Prevention (CDC). It is in the public domain and thus free of copyright restrictions. We would like to express our appreciation to the CDC for providing these images.
Links to useful external websites are provided below.
|Last Updated on Tuesday, 03 April 2012 02:48|