Cryptosporidium and Giardia are important parasites due to their zoonotic potential and impact on human health, often causing waterborne outbreaks of disease.

Description of the transmission modes of Cryptosporidium. Following ingestion (and possibly inhalation) by a suitable host (e.g., a human host), excystation occurs (infective stage, (1)). The released sporozoites invade epithelial cells of the gastrointestinal tract or other tissues and complete their cycle, producing oocysts that exit the host (diagnostic stage, (2)) and are released into the environment (3). Transmission of Cryptosporidium occurs mainly by ingestion of contaminated water (e.g., surface, drinking, or recreational water) or food (e.g., chicken salad, fruits, vegetables), or by person-to-person contact (community and hospital infections) (4). Zoonotic transmission of C. parvum occurs through exposure to infected animals (person-to-animal contact) or exposure to water (reservoir) contaminated by feces of infected animals (4). Putignani and Menichella, 2010.

Geography of worldwide occurrence of human cryptosporidiosis outbreaks and sporadic cases. A color-coded distribution of the main cases of cryptosporidiosis reported in the literature during the last decade (1998–2008) for the entire population (adults and children) is shown. Waterborne and foodborne disease are shown in red and grey, respectively. Spread of infection due to HIV-related immunological impairment is shown in green, and travel-related disease in pink. Where neither waterborne nor foodborne disease applied, the term community disease has been used for person-to-person contact and is shown in pale blue. For countries with two or three coexisting transmission modes, a double color fill plus thick border lines have been used, consistent with the code above. Putignani and Menichella, 2010.

Cryptosporidium spp. are coccidian, oocyst-forming apicomplexan protozoa that complete their life cycle in both humans and animals, through zoonotic and anthroponotic transmission, causing cryptosporidiosis. The global burden of this disease is still underascertained, due to a conundrum transmission modality, only partially unveiled, and on a p…

Emerg Infect Dis. 2014 Apr;20(4):581-9. doi: 10.3201/eid2004.121415.

Large outbreak of Cryptosporidium hominis infection transmitted through the public water supply, Sweden

Micael Widerström, Caroline Schönning, Mikael Lilja, Marianne Lebbad, Thomas Ljung, Görel Allestam, Martin Ferm, Britta Björkholm, Anette Hansen, Jari Hiltula, Jonas Långmark, Margareta Löfdahl, Maria Omberg, Christina Reuterwall, Eva Samuelsson, Katarina Widgren, Anders Wallensten, Johan Lindh


Abstract

In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.

Keywords: Cryptosporidium hominis infection; cryptosporidiosis/epidemiology; cryptosporidiosis/prevention and control; cryptosporidiosis/transmission; diarrhea; disease outbreaks; drinking water; molecular typing; questionnaires; risk factors; waste management; parasites; water microbiology; water supply; waterborne infections. https://pubmed.ncbi.nlm.nih.gov/24655474/

——-

Cryptosporidium and Giardia in surface water and drinking water: Animal sources and towards the use of a machine-learning approach as a tool for predicting contamination

Panagiota Ligda (a, b), Edwin Claerebout (a), Despoina Kostopoulou (b), Antonios Zdragas (b), Stijn Casaert (a), Lucy J. Robertson (c), Smaragda Sotiraki (b)

a Laboratory of Parasitology, Faculty of Veterinary Medicine, Ghent University, Salisburylaan 133, B-9820, Merelbeke, Belgium
b Laboratory of Infectious and Parasitic Diseases, Veterinary Research Institute, Hellenic Agricultural Organization – DEMETER, 57001, Thermi, Thessaloniki, Greece
c Parasitology, Department of Paraclinical Science, Faculty of Veterinary Medicine, Norwegian University of Life Sciences, PO Box 369 Sentrum, 0102, Oslo, Norway

Received 29 January 2020, Revised 16 April 2020, Accepted 6 May 2020, Available online 11 May 2020.

https://doi.org/10.1016/j.envpol.2020.114766

Highlights

• Cryptosporidium spp. and Giardia duodenalis were commonly found in surface waters.
• Hot spots and seasonal pattern of contamination identified.
• Parasite assemblages/species identified in water were the same as those in animals.
• Zoonotic species/assemblages of both parasites were identified in all matrices.
• Machine learning approaches revealed interactions with biotic/abiotic factors.

Abstract

Cryptosporidium and Giardia are important parasites due to their zoonotic potential and impact on human health, often causing waterborne outbreaks of disease. Detection of (oo)cysts in water matrices is challenging and few countries have legislated water monitoring for their presence. The aim of this study was to investigate the presence and origin of these parasites in different water sources in Northern Greece and identify interactions between biotic/abiotic factors in order to develop risk-assessment models. During a 2-year period, using a longitudinal, repeated sampling approach, 12 locations in 4 rivers, irrigation canals, and a water production company, were monitored for Cryptosporidium and Giardia, using standard methods. Furthermore, 254 faecal samples from animals were collected from 15 cattle and 12 sheep farms located near the water sampling points and screened for both parasites, in order to estimate their potential contribution to water contamination. River water samples were frequently contaminated with Cryptosporidium (47.1%) and Giardia (66.2%), with higher contamination rates during winter and spring. During a 5-month period, (oo)cysts were detected in drinking-water (<1/litre). Animals on all farms were infected by both parasites, with 16.7% of calves and 17.2% of lambs excreting Cryptosporidium oocysts and 41.3% of calves and 43.1% of lambs excreting Giardia cysts. The most prevalent species identified in both water and animal samples were C. parvum and G. duodenalis assemblage AII. The presence of G. duodenalis assemblage AII in drinking water and C. parvum IIaA15G2R1 in surface water highlights the potential risk of waterborne infection. No correlation was found between (oo)cyst counts and faecal-indicator bacteria. Machine-learning models that can predict contamination intensity with Cryptosporidium (75% accuracy) and Giardia (69% accuracy), combining biological, physicochemical and meteorological factors, were developed. Although these prediction accuracies may be insufficient for public health purposes, they could be useful for augmenting and informing risk-based sampling plans.
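The abstract above describes prediction models that combine biological, physicochemical, and meteorological monitoring data. As a minimal sketch of that kind of workflow, and not the authors' actual pipeline, the example below trains a random-forest classifier on hypothetical predictors; every feature name, value, and target label here is a placeholder assumption.

```python
# Minimal sketch of the kind of machine-learning workflow described above,
# NOT the authors' actual model: feature names and data are hypothetical
# placeholders for biological, physicochemical, and meteorological predictors
# of (oo)cyst contamination intensity.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical monitoring dataset: one row per water sample.
df = pd.DataFrame({
    "turbidity_ntu":    [1.2, 8.5, 0.9, 15.3, 4.1, 22.0, 2.3, 11.7],
    "water_temp_c":     [6.0, 9.5, 21.0, 7.2, 18.4, 5.1, 24.0, 8.8],
    "rainfall_48h_mm":  [0.0, 35.0, 2.0, 60.0, 5.0, 80.0, 0.0, 40.0],
    "e_coli_cfu_100ml": [30, 900, 15, 1500, 120, 2400, 10, 700],
    "season_winter":    [1, 1, 0, 1, 0, 1, 0, 1],
    # Target: 1 = high Cryptosporidium (oo)cyst count, 0 = low/absent.
    "high_crypto":      [0, 1, 0, 1, 0, 1, 0, 1],
})

X = df.drop(columns="high_crypto")
y = df["high_crypto"]

# A random forest is one plausible choice for mixed biotic/abiotic predictors.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=4)  # tiny toy data; illustrative only
print("cross-validated accuracy:", scores.mean())
```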

https://www.sciencedirect.com/science/article/abs/pii/S026974912030676X

——-

Global Issues in Water, Sanitation, and Health: Workshop Summary.

2. Lessons from Waterborne Disease Outbreaks


OVERVIEW

This chapter is comprised of three case studies of waterborne disease outbreaks that occurred in the Americas. Each contribution features an outbreak chronology, an analysis of contributing factors, and a consideration of lessons learned. Together, they illustrate how an intricate web of factors—including climate and weather, human demographics, land use, and infrastructure—contribute to outbreaks of waterborne infectious disease.

The chapter begins with an account, by Carlos Seas and workshop presenter and Forum member Eduardo Gotuzzo of Universidad Peruana Cayetano Heredia and Hospital Nacional Cayetano Heredia in Lima, Peru, of the massive cholera epidemic that began in urban areas of Peru in 1991 and swept across South America. The authors describe current understanding of the role of Vibrio cholerae in marine ecosystems, and consider how climatic and environmental factors, as well as international trade, may have influenced the reintroduction of this pathogen to the continent after nearly a century’s absence. The epidemic persisted for five years, then reappeared, with diminished intensity, in 1998. While attempts to control the epidemic through educational campaigns aimed at improving sanitation were unsuccessful in the short term, Seas and Gotuzzo report that, following a significant investment in sanitation in the wake of this public health disaster, transmission rates of other waterborne infectious diseases, including typhoid fever, declined in Peru. They note that, by understanding the ecology of V. cholerae, researchers may be able to predict relative risk for pathogen transmission from marine environments and thereby aid efforts at preventing epidemics.

In 1993, two years after cholera struck Peru, an epidemic of cryptosporidiosis in Milwaukee, Wisconsin, sickened hundreds of thousands of people and caused at least 50 deaths, demonstrating that even “modern” water treatment and distribution facilities are vulnerable to contamination by infectious pathogens. In their contribution to this chapter, workshop presenter Jeffrey Davis and coauthors recount their investigation of this outbreak, which resulted from the confluence of multiple and diverse environmental and human factors. Based on lessons learned from their discoveries, the authors made—and authorities undertook—recommendations to prevent further outbreaks in the Milwaukee water system, resulting in significant improvements in water quality. Their findings have proven applicable to other water treatment facilities that share Lake Michigan and have received attention from water authorities worldwide.

The final paper in this chapter, by workshop presenter Steve Hrudey and Elizabeth Hrudey of the University of Alberta, Canada, discusses an episode of bacterial contamination of the water in Walkerton, Ontario, in 2000. The outbreak sickened nearly half of the town’s 5,000 residents and caused 7 deaths, as well as 27 cases of hemolytic uremic syndrome, a severe kidney disease. Several incidents of human error and duplicity figure prominently among the causes of this entirely preventable outbreak, the authors explain. “Because outbreaks of disease caused by drinking water remain comparatively rare in North America,” they conclude, “complacency about the dangers of waterborne pathogens can easily occur.” Based on their findings, they present a framework for water system oversight intended to save other communities from Walkerton’s fate.

THE CHOLERA EPIDEMIC IN PERU AND LATIN AMERICA IN 1991: THE ROLE OF WATER IN THE ORIGIN AND SPREAD OF THE EPIDEMIC

Carlos Seas, M.D., Universidad Peruana Cayetano Heredia; Eduardo Gotuzzo, M.D., FACP, Universidad Peruana Cayetano Heredia

At Athens a man was seized with cholera. He vomited, and was purged and was in pain, and neither the vomiting nor the purging could be stopped; and his voice failed him, and he could not be moved from his bed, and his eyes were dark and hollow, and spasms from the stomach held him, and hiccup from the bowels. He was forced to drink, and the two (vomiting and purging) were stopped, but he became cold.

Hippocrates

After an absence of almost one century, cholera reappeared in South America, in Peru, during the summer of 1991. This event was totally unexpected by the scientific community, which had anticipated the spread of cholera to the continent from Africa and had hypothesized its introduction through Brazil, following well-recognized routes of dissemination of the disease that involve trade and commerce. The further spread of the epidemic was very rapid: all Peruvian departments had reported cholera cases in less than six months, and almost all Latin American countries, with the exception of Uruguay, had reported cases within one year of the beginning of the epidemic. The chain of events that triggered and disseminated the epidemic across the continent has not been fully elucidated, but evidence is being gathered on the possible role of marine ecosystems, climate and environmental factors, and the pivotal role of water. We discuss here the evidence in support of water’s role in cholera dynamics.

The Environmental Life Cycle of Vibrio cholerae

The natural reservoirs of V. cholerae are aquatic environments, where O1 and non-O1 serogroups coexist. V. cholerae survives by attaching to and forming symbiotic associations with algae or crustacean shells (Figure 2-1). In these environments, V. cholerae multiplies and can persist for years in a free-living cycle without human intervention, as has been elegantly described by Dr. Colwell and her associates at the International Centre for Diarrheal Diseases Research in Dhaka, Bangladesh (Colwell et al., 1990).

FIGURE 2-1

Vibrio cholerae O1 attached to a copepod. SOURCE: Courtesy of Rita Colwell, Ph.D., University of Maryland.

A number of environmental factors modulate the abundance of Vibrio, including, but not limited to, temperature, pH, salinity, and nutrient availability. Under adverse conditions, V. cholerae survives in a dormant state with its metabolic pathways shut down, and it can be reactivated when suitable conditions return. Additionally, V. cholerae can produce biofilms—surface-associated communities of bacteria with enhanced survival under unfavorable conditions—which can revert to active bacteria and induce epidemics.

The ability of V. cholerae to regulate its metabolism based on the environmental conditions of its natural reservoir may explain the endemicity of cholera in many parts of the world. During the cholera epidemic in Peru, V. cholerae was isolated from many aquatic environments, including not only marine ecosystems but also riverine and lake environments. Even one of the highest commercially navigable freshwater lakes in the world, Lake Titicaca, located 3,827 meters above sea level on the border of Peru and Bolivia, was affected by the cholera epidemic of 1991.

Humans are only temporary reservoirs of V. cholerae. Interestingly, lytic phages modulate the abundance of V. cholerae in the human intestine; on the other hand, V. cholerae is able to up-regulate certain genes in the human intestine, resulting in a short-lived hyperinfectious state. As illustrated in Figure 2-2, V. cholerae is introduced to humans from its aquatic environment through contamination of food and water sources.

FIGURE 2-2

A hierarchical model for cholera transmission. SOURCE: Reprinted from Lipp et al. (2002) with permission from the American Society for Microbiology.

The Origins of the Latin American Epidemic

The Latin American cholera epidemic was officially declared in Peru during the third week of January 1991, almost simultaneously in three cities along the north coastal area of the country. By the end of that year, almost 320,000 cases had been officially reported to the Pan American Health Organization by the Peruvian Ministry of Health. Nearly 45,000 cases occurred every week, in what was considered the worst cholera epidemic of the century in Peru (Gotuzzo et al., 1994). There were several distinctive features of this epidemic:

  • Very high attack rates were reported soon after the epidemic started.
  • Cholera accounted for almost 80 percent of all acute diarrhea cases in the country irrespective of the degree of dehydration and age group.
  • The epidemic was initially concentrated in urban areas, where it spread very rapidly, suggesting a common source of dissemination.
  • Transmission was halted in very few areas, where treatment and chlorination of municipal water was possible, suggesting a critical role of water in the transmission of the disease.
  • Very low case-fatality rates were reported from urban areas where patients had access to treatment by well-trained health personnel, but higher figures were reported from isolated communities where patients did not have access to health centers, a situation similar to that reported from refugee settings in Africa during periods of political instability.

Although the epidemic spread to neighboring countries, it never reached the magnitude seen in Peru, which that year suffered from serious economic constraints and reported the lowest level of sanitary coverage and sanitary investment in the region. During 1991, approximately 50 percent of the population in urban areas of Peru received treated municipal water; intermittent supply and clandestine connections were common in many cities of the country (Figure 2-3). Additionally, less than 10 percent of sewage was properly treated. These conditions prevailed before the beginning of the epidemic and were responsible for its very rapid spread. The epidemic lasted for five years, until 1995, only to reappear in 1998 with much less intensity, as shown in Figure 2-4 (WHO, 2008). The message conveyed to the population at the beginning of the epidemic to curtail transmission focused on avoiding raw fish and shellfish and on boiling water for drinking.

FIGURE 2-3

A shantytown in Peru during 1991. SOURCE: Instituto de Medicina Tropical Alexander von Humboldt, Lima, Peru.

FIGURE 2-4

Cholera in the Americas, 1991–2006. SOURCE: Based upon data compiled from and reported in the WHO’s Weekly Epidemiological Record.

Massive investment in sanitation followed the epidemic, which was responsible for a reduction in transmission not only of cholera but also of other enteric infections, such as numerous parasitic infections and typhoid fever. The case of typhoid fever deserves special mention. Many experienced doctors in Lima saw a marked reduction in the incidence of typhoid fever in their practices as a consequence of improvements in sanitation and hygiene, a situation that was also seen at our Institute (Figure 2-5).

FIGURE 2-5

Typhoid fever cases seen at the Alexander von Humboldt Tropical Medicine Institute in Lima, Peru, 1987–1993.

Before 1990, typhoid fever was responsible for the majority of episodes of undifferentiated fever lasting at least five days in Lima. Approximately 70 to 100 patients with complicated typhoid fever were hospitalized yearly in our institution in the decade before the cholera epidemic; these figures were reduced tenfold after 1991. The reduction in typhoid fever incidence was so dramatic that the disease is almost unknown to the generation of physicians trained after 1991, with consequent delays in diagnosis and the development of complications, a situation that would have been unthinkable in the decade before 1990.

Still, a question remains unanswered: Where did this huge cholera epidemic originate? Although both nontoxigenic O1 and non-O1 V. cholerae strains had been isolated from environmental sources and from patients in Peru and other countries in the region, the hypothesis that these Vibrio became resident in aquatic environments of coastal Peru and later acquired the virulence genes mediating toxin expression through phage infection seems unlikely. Additionally, genetic comparison of the Vibrio responsible for the epidemic (V. cholerae O1, serotype Inaba, biotype El Tor) with endemic agents in Asia disclosed very similar patterns, suggesting common ancestors or spread from one place to another; the latter option seems more reasonable. Another hypothesis suggests that V. cholerae was seeded into the marine ecosystems of northern Peru a few months before the epidemic started, which seems more likely in light of what was discussed earlier: that Vibrio was imported from Asia by ship, or emptied from vessels discharging bilge water contaminated with the bacterium.

From its aquatic environment, Vibrio was first amplified along the north coast and then introduced almost simultaneously into several cities of the country (Figure 2-6). This hypothesis was proposed after analyzing data from a retrospective study that reviewed charts of patients who had attended several hospitals along the Peruvian north coast in 1989 and 1991 and disclosed that seven patients had fulfilled the clinical definition of cholera proposed by the World Health Organization three months before the epidemic started (Seas et al., 2000). These adult patients presented with severe dehydration and watery diarrhea, a clinical presentation that had not been seen at these health centers the year before the epidemic. Although no convincing evidence proves definitively that these cases were due to cholera (laboratory cultures had not been obtained for these cases), the clinical presentation is similar to that described for cholera in other epidemic areas, and also similar to that which many Peruvian doctors subsequently saw (Figure 2-7).

FIGURE 2-6

The seventh cholera pandemic. SOURCE: Carlos Seas. Cólera. Medicina Tropical. CD-ROM. Version 2002. Instituto de Medicina Tropical, Príncipe Leopoldo. Amberes, Bélgica. Instituto (more…)

FIGURE 2-7

Location of patients in Peru with presumed cholera, identified before the epidemic of 1991. SOURCE: Reprinted from Seas et al. (2000) with permission from the American Journal of (more…)

Which Forces Drove the Spread of Vibrio into the Pacific Coastal Areas of Peru?

Dr. Rita Colwell’s theory on the environmental niche for V. cholerae in aquatic ecosystems is crucial for understanding cholera dynamics. Factors that modify the survival of Vibrio in the environment may dramatically influence cholera transmission. Climate change and climate variability are among these critical forces. While the association between climate, environmental factors, and cholera transmission had been proposed a long time ago, the role of climate in cholera dynamics has been better elucidated in recent years.

Time series analysis has demonstrated a relationship between the appearance of cholera cases in Bangladesh and occurrence of El Niño-Southern Oscillation (ENSO). Observations have linked the interannual variability of ENSO with the proportion of cholera cases in Dhaka, Bangladesh. Additionally, climate variability due to ENSO and temporary immunity explained the interannual cycles of cholera in rural Matlab, Bangladesh, for a period of almost 30 years (Pascual et al., 2000). The net effect of ENSO—rise in both sea water temperature and planktonic mass—modifies the abundance of V. cholerae in the environment by affecting the concentration of plankton to which V. cholerae is attached and affects the concentration of nutrients and salinity.
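As a toy illustration of the lagged-correlation idea behind such time-series analyses, and not a reanalysis of the Bangladesh data studied by Pascual et al. (2000), the sketch below correlates a synthetic monthly ENSO-like index with synthetic case counts over a range of lags.

```python
# Illustrative sketch of a lagged correlation analysis of the kind referenced
# above (ENSO index vs. cholera case counts); both series are synthetic
# placeholders, not real surveillance or climate data.
import numpy as np

rng = np.random.default_rng(0)
months = 120                                  # ten years of monthly data
enso_index = np.sin(np.arange(months) * 2 * np.pi / 48) + rng.normal(0, 0.3, months)
# Hypothetical case counts that echo the index roughly 3 months later.
cases = 50 + 30 * np.roll(enso_index, 3) + rng.normal(0, 5, months)

def lagged_corr(x, y, lag):
    """Pearson correlation between x and y shifted forward by `lag` months."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

for lag in range(0, 7):
    print(f"lag {lag} months: r = {lagged_corr(enso_index, cases, lag):+.2f}")
```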

Water temperature affects cholera transmission, as has been observed in the Bay of Bengal, Bangladesh. All these data support the role of ENSO in the interannual variability of endemic cholera. An unproven hypothesis suggests that El Niño triggered the epidemic of cholera in Peru in 1991 by amplifying the planktonic mass and dispersing existing Vibrio along the north coast of the country. Then Vibrio was introduced into the continent by contaminated food and subsequently by contamination of the water supply system (Seas et al., 2000).

Several studies conducted in Peru after 1991 have shown an association between warmer air temperatures and cholera cases in children and adults. Additionally, toxigenic V. cholerae O1 has been isolated from aquatic environments in the coastal waters of Peru, suggesting that it has been successful in adapting to these environments, as has been described in Bangladesh and India. These findings support the theory of an environmental niche for V. cholerae O1 in Latin America and temporal associations between ENSO and cholera outbreaks from 1991 onward (Salazar-Lindo et al., 2008).

The Spread of the Cholera Epidemic in Peru as a Model to Understand Transmission in the Region

As illustrated earlier in Figure 2-2, the cholera transmission cycle involves infection of humans by the consumption of contaminated food and water and further shedding of the bacteria into the environment via contaminated stools. Incredibly high attack rates accompany human infection under favorable conditions, especially in previously nonexposed populations. Very high household transmission rates also occur.

Transmission via contaminated water and food has long been recognized. During the Latin American epidemic, acquisition of the disease by drinking contaminated water from rivers, ponds, lakes, and even tube wells was documented. Contamination of municipal water was the main route of cholera transmission in Trujillo, Peru, during the epidemic in 1991. Drinking unboiled water, introducing contaminated hands into containers used to store drinking water, drinking beverages from street vendors, drinking beverages to which contaminated ice had been added, and drinking water outside the home are recognized exposure risk factors for cholera. In addition to the crucial role of water in the transmission of cholera, poor hygienic conditions also contribute to its spread by exposing susceptible persons to the pathogen. Educational campaigns were implemented throughout the country with little effect in the short term.

Certain host factors may have played a role in the transmission of cholera. Infection by Helicobacter pylori, the effect of the O blood group, and the protective effect of breast milk deserve to be mentioned. Studies from Bangladesh and Peru show that people infected by H. pylori are at higher risk of acquiring cholera than people not infected by H. pylori (León-Barúa et al., 2006). Additionally, the risk of acquiring severe cholera among people coinfected with H. pylori is higher in patients without previous contact with V. cholerae, as measured by the absence of vibriocidal antibodies in the serum (Clemens et al., 1995).

H. pylori is highly endemic in developing countries, particularly among low-income status individuals. Infection causes a chronic gastritis that induces hypochlorhydria, which in turn reduces the ability of the stomach to limit the Vibrio invasion. Patients carrying the O blood group, which is widespread in Latin America, have a higher risk of developing severe cholera. Higher affinity of the cholera toxin to the ganglioside receptor in patients with O blood group and lower affinity in patients of A, B, and AB blood groups may explain this association. Finally, the protective effect of breast milk, possibly mediated by a high concentration of secretory IgA anti-cholera toxin, has been proposed.

Preventing Future Epidemics

The small number of autochthonous cases reported from developed countries such as the United States and Australia, where Vibrio cholerae O1 is resident in aquatic environments, provides additional support for the well-known concept that hygiene and sanitation can control cholera transmission. These relatively simple measures are very difficult to implement in the developing world (Zuckerman et al., 2007).

Alternative ways to prevent cholera transmission have been explored, including but not limited to the boiling and/or chlorination of water, exposing water to sunlight, filtering water using Sari cloth, and educating the population at risk on appropriate hygienic practices (Colwell et al., 2003). Using new information generated from the studies that delineated the ecological niche of Vibrio may help in predicting the onset of an epidemic, which may have a tremendous impact on prevention. Searching for V. cholerae O1 in municipal sewage and environmental samples in endemic areas could be used as a warning signal of future epidemics (Franco et al., 1997), and monitoring the movement and abundance of plankton by satellite seems attractive, but more studies are needed to support the implementation of these methods.

Conclusions

The cholera epidemic in Latin America was characterized by an explosive beginning with rapid spread in urban areas of Peru and other poor neighboring countries. The available information suggests that environmental factors amplified the existing Vibrio population and induced an epidemic, which was further amplified by contamination of municipal water and food. Water played a key role not only in maintaining Vibrio in its natural reservoir but also in disseminating the epidemic.

Acknowledgments

We would like to express our most sincere gratitude to Dr. Rita Colwell and Dr. Bradley Sack for sharing with us valuable information and images that were reproduced in this manuscript.

LESSONS FROM THE MASSIVE WATERBORNE OUTBREAK OF CRYPTOSPORIDIUM INFECTIONS, MILWAUKEE, 1993

Jeffrey P. Davis, M.D., Wisconsin Division of Public Health; William R. Mac Kenzie, M.D., Centers for Disease Control and Prevention; David G. Addiss, M.D., M.P.H., Fetzer Institute

The Investigation Begins

On Monday, April 5, 1993, the City of Milwaukee Health Department (MHD) received reports of increased school and workplace absenteeism due to diarrheal illness in Milwaukee County, Wisconsin. This appeared to be quite widespread, particularly on the south side of the city. In one particular hospital, during the previous weekend, more than 200 individuals had been cultured for bacterial enteric pathogens. The hospital ran out of bacterial culture media, yet none of the patients tested positive for bacterial enteric pathogens. Routine tests for ova and parasites done on many stools did not reveal pathogens. Pharmacies were experiencing widespread shortages of antidiarrheal medications. Because of the clinical profile of illness and the apparent magnitude of the outbreak, we considered this outbreak to be due to a product with wide local distribution with drinking water being the most likely vehicle. The Wisconsin Division of Health (now the Wisconsin Division of Public Health [DPH]) offered onsite assistance to investigate and control this outbreak. The offer was accepted; lead staff arrived on April 6 and additional team members arrived on April 7.

The Director of the Bureau of Laboratories, MHD, requested and received water quality and treatment data from the Milwaukee Water Works (MWW). While preliminarily reviewing these data, he noted striking spikes in turbidity of water treated in one of the two MWW treatment plants (the southern plant), and these spikes in finished water turbidity occurred on multiple days in late March and early April. This was reminiscent of a large waterborne outbreak of Cryptosporidium infections in Carrollton, Georgia, that occurred among customers of a municipal water supply (Hayes et al., 1989). On April 6, following discussion with DPH staff, the laboratory director selected some representative stool specimens among those that had tested negative for enteric pathogens and tested them for protozoan parasites, including Cryptosporidium, which initially had not been done in the clinical laboratories. Early on April 7, DPH, MHD, and Wisconsin Department of Natural Resources staff met with MWW officials. By late afternoon on April 7, positive results for Cryptosporidium were reported in stool specimens from three adults tested by the MHD laboratory and in stools from five adults tested at other Milwaukee laboratories. These adults resided at widespread locations in southern Milwaukee and one neighboring municipality within Milwaukee County. Following a meeting with city and state public health and water treatment officials, Milwaukee’s mayor, John Norquist, issued a boil water advisory on the evening of April 7. The outbreak received considerable media attention for more than two weeks. Inordinate numbers of people were inconvenienced. Pharmacies continued to sell large quantities of antidiarrheal medications. Many industries with processes dependent on treated water were challenged.

The city of Milwaukee occupies most of Milwaukee County. Three rivers—the Milwaukee, the Menomonee, and the Kinnickinnic—flow through the county and converge in the city, where they empty into Lake Michigan within a breakfront; the ambient flow of the river water entering the lake is southerly (Figure 2-8). Figure 2-8 also shows the location of the two MWW water treatment plants: the northern plant received water by gravity flow through an intake 1.2 miles offshore, and the southern plant received water by gravity flow through an intake 1.4 miles offshore.

FIGURE 2-8

Location of the three rivers that flow through Milwaukee County, Wisconsin, the breakfront protecting the city of Milwaukee harbor, and the northern and southern Milwaukee Water Works water treatment (more…)

Figure 2-9, which depicts daily turbidity values for treated water at both plants during March and April 1993, demonstrates several spikes in treated water turbidity at the southern plant (Mac Kenzie et al., 1994b). The first peak, which occurred on or about March 23, was the largest recorded at the southern plant in more than 10 years. It was followed by a considerably larger, sustained peak with maximum turbidity measurements on March 28 and 30, and another peak on April 5. The southern plant was closed on April 7, but the water there was sampled on April 8 (Mac Kenzie et al., 1994b).

FIGURE 2-9

Maximal turbidity of treated water in the northern and southern water treatment plants of the Milwaukee Water Works from March 1 through April 28, 1993. NTU denotes nephelometric turbidity units. (more…)

At the time of the outbreak, not much was known about Cryptosporidium in water. The pathogen had first been documented in humans in 1976 (Meisel et al., 1976; Nime et al., 1976), and by the early 1980s, was recognized as an AIDS-defining illness (Current et al., 1983). There had been several water-associated outbreaks in the United States and in the United Kingdom prior to the Milwaukee event, although most were associated with surface water contamination (D’Antonio et al., 1985; Gallagher et al., 1989; Joseph et al., 1991; Leland et al., 1993; Richardson et al., 1991). The 1987 outbreak in Carrollton, Georgia, affected an estimated 13,000 customers of a municipal water supply, but it was not associated with high turbidity of treated water (Hayes et al., 1989).

To evaluate other possible microbiologic etiologies of the Milwaukee outbreak, we reviewed the results of laboratory examinations of stool samples conducted in 14 different local laboratories between March 1 and April 16, 1993 (Mac Kenzie et al., 1994b). No increase in bacterial enteric pathogens was found. Prior to recognition of the outbreak, between March 1 and April 6, only 42 Cryptosporidium tests (nearly all of them on samples from patients with HIV/AIDS) were conducted, but after the outbreak was recognized more than 1,000 Cryptosporidium tests were conducted in a seven-day period. During both intervals, nearly one-third of these samples tested positive for Cryptosporidium (Mac Kenzie et al., 1994b). The percentage of positive Cryptosporidium tests, although not as high as one might expect in an outbreak, was similar to the rate of positive tests noted during the Carrollton event (39 percent). We believe that these results reflect the limits of standard microbiologic testing for Cryptosporidium available at that time.

Thus, early in our investigation we established that Cryptosporidium was the most likely cause of the outbreak and hypothesized that treated water from the southern water treatment plant was the vehicle for the majority of human infections associated with this outbreak. In addition to the primary task of testing this hypothesis, there were many tasks and questions we sought to address, which included determining

  • the magnitude and timing of cases associated with the outbreak,
  • the spectrum of clinical symptoms experienced in a large population of persons infected with Cryptosporidium,
  • the incubation period of cryptosporidiosis following exposure,
  • the timing of contamination of Milwaukee water,
  • the secondary attack rate of cryptosporidiosis among family members not exposed to Milwaukee water,
  • the frequency of recurrence of the symptoms of cryptosporidiosis after initial recovery,
  • the presence of Cryptosporidium oocysts in Milwaukee water in water archived during the time of putative exposure of Milwaukee residents,
  • factors at the MWW southern water treatment plant that allowed Cryptosporidium oocysts to pass through in treated water to infect the public,
  • mortality associated with the outbreak,
  • the frequency of asymptomatic infection among exposed Milwaukee residents, and
  • the ultimate source of these Cryptosporidium oocysts: animals or humans.

To develop an epidemiologic case definition of cryptosporidiosis, we compared people with laboratory-confirmed infections with those who had clinical diagnoses (Mac Kenzie et al., 1994b). The age and gender profiles of these two patient classes were similar, although laboratory-confirmed cases were skewed, as one might expect, toward more serious illness. There was a uniform occurrence of diarrhea/watery diarrhea in all cases. Cramps, fatigue, muscle aches, vomiting, and fever occurred more frequently in the laboratory-tested individuals. Temporal distribution of the two patient classes was virtually identical (Mac Kenzie et al., 1994b).

Rapid Hypothesis Testing—Nursing Home Study

To rapidly test the hypothesis that the southern water treatment plant was the likely source of the outbreak, we examined rates of diarrhea among geographically fixed populations—residents of nursing homes—in different parts of Milwaukee (Mac Kenzie et al., 1994b). Due to their relative geographic location, nine nursing homes received drinking water primarily from the north plant; seven received water primarily from the south plant. Information on diarrhea was collected routinely at these nursing homes, so we were able to review their logs to establish the rate of diarrhea (defined as three or more loose stools per 24-hour period).

We found a spike in diarrheal illness peaking between April 1 and 6 among nursing home residents served by the south water plant. High rates of diarrhea continued into the following week and returned to baseline by April 19 (Mac Kenzie et al., 1994b). By contrast, diarrhea rates at nursing homes served by the north water plant remained at baseline throughout March and April. Importantly, the one nursing home in the south that obtained its water from a well had no increase in diarrhea rates. We tested stools from 69 nursing home residents with diarrhea from the south, and 12 from the north, for Cryptosporidium. Thirty-five (51 percent) of the southern samples were positive, but every northern sample was negative (Mac Kenzie et al., 1994b).
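For illustration, the south-versus-north comparison implied by these stool-test results can be set up as a two-by-two table; the counts come from the paragraph above, while the use of Fisher's exact test here is an assumption made for this sketch, not necessarily the statistic the investigators reported.

```python
# Sketch of the nursing-home comparison described above. The stool-test counts
# (35/69 positive in the south, 0/12 in the north) are taken from the text;
# Fisher's exact test is an illustrative choice of statistic.
from scipy.stats import fisher_exact

south_pos, south_tested = 35, 69
north_pos, north_tested = 0, 12

table = [[south_pos, south_tested - south_pos],
         [north_pos, north_tested - north_pos]]

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"south positivity: {south_pos / south_tested:.0%}")
print(f"north positivity: {north_pos / north_tested:.0%}")
print(f"Fisher exact p-value (one-sided): {p_value:.4f}")
```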

Magnitude and Impact of the Milwaukee Outbreak— Random Telephone Survey

To assess the magnitude of this outbreak, we conducted a random telephone survey of 840 households in Milwaukee and in the four surrounding counties, asking about the number of cases of watery diarrhea experienced between March 1 and April 28 (Mac Kenzie et al., 1994b). The response rate was 73 percent, and included 1,663 household members whose demographic features closely tracked 1990 census data. Among this sample, 436 (26 percent) were reported to have had watery diarrhea during the survey period, with the peak number of cases occurring during April 3 through April 5. As may be seen in Figure 2-10, the attack rate among MWW customers whose homes were served principally by the northern plant was 26 percent, compared with 52 percent of those from homes served principally by the southern plant. Residents receiving a mixture of water from both plants had an intermediate attack rate of 35 percent (Mac Kenzie et al., 1994b).

FIGURE 2-10

Rate of watery diarrhea from March 1 through April 28, 1993, among respondents in a random-digit telephone survey of households in the five county Greater Milwaukee area, by Milwaukee Water Works (more…)

In our survey of Milwaukee and the four surrounding counties there was an overall attack rate of watery diarrhea of 26 percent. Applying this to the population of the five-county area, and subtracting a background rate of 0.5 percent, we estimated that 403,000 residents had watery diarrhea associated with this outbreak (certainly other people outside the survey area had also become ill, but we could not estimate their numbers; Mac Kenzie et al., 1994b). In our survey, 11 percent of people with watery diarrhea visited health-care providers (44,000 estimated visits) and 1.1 percent were hospitalized (4,400 estimated hospitalizations).

Approximately 1.8 days of productivity (school or work) were lost per case patient, which projects to 725,000 person-days lost, including about 479,000 person-days among those in the workforce (ages 18 to 64 years; Mac Kenzie et al., 1994b). Based mainly on review of death certificate data, we attributed 69 deaths to this outbreak; most of these were premature deaths among people with AIDS/HIV infection (Hoxie et al., 1997).
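To make the arithmetic behind these estimates explicit, the sketch below reproduces the calculations described in the two paragraphs above; the five-county population figure is an assumed placeholder, since the text does not state it, and all other inputs are taken from the quoted survey results.

```python
# Back-of-envelope reconstruction of the estimates described above. The
# five-county population value is an ASSUMED placeholder; the attack rate,
# background rate, health-care-use fractions, and days lost per case come
# from the survey results quoted in the text.
population_5_county = 1_580_000        # ASSUMED; not stated in the text
attack_rate = 0.26                     # watery diarrhea among survey respondents
background_rate = 0.005                # background rate subtracted in the text

estimated_cases = (attack_rate - background_rate) * population_5_county
provider_visits = 0.11 * estimated_cases        # 11% visited health-care providers
hospitalizations = 0.011 * estimated_cases      # 1.1% were hospitalized
person_days_lost = 1.8 * estimated_cases        # ~1.8 productivity days per case

print(f"estimated cases:      {estimated_cases:,.0f}")   # text reports ~403,000
print(f"estimated visits:     {provider_visits:,.0f}")   # text reports ~44,000
print(f"estimated admissions: {hospitalizations:,.0f}")  # text reports ~4,400
print(f"person-days lost:     {person_days_lost:,.0f}")  # text reports ~725,000
```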

Study of Short-Term Visitors—Determining the Timing of Exposure, Incubation Period, and Frequency of Secondary Transmission

We studied short-term visitors to the Milwaukee area to answer the following questions:

  • When was Cryptosporidium present in the water system?
  • How long was the incubation period?
  • How frequent was secondary (person-to-person) transmission?

Specifically, we studied people who visited the five-county Milwaukee area one time between March 15 and April 15, unaccompanied by other members of their households (Mac Kenzie et al., 1995b). We identified 94 such individuals who had stayed in the area less than 48 hours and had either laboratory-confirmed cryptosporidiosis (n = 54) or clinical cryptosporidiosis (n = 40) following their visit. Two-thirds of these visitors had stayed for less than 24 hours, and all had drunk beverages that contained unboiled tap water (the median amount consumed was 16 ounces while in Milwaukee; 32 percent of these ill visitors drank less than eight ounces) (Mac Kenzie et al., 1995b).

We examined the dates of arrival in Milwaukee and the dates of illness onset among the 94 brief-interval visitors, as shown in Figure 2-11. Using dates of arrival as dates of initial exposure, oocysts were presumed present in the treated water during 13 consecutive days (March 24 through April 5). By subtracting the date of arrival from the date of illness onset for each ill visitor, incubation periods could be calculated; the median incubation period was 7 days (range: 1–14 days). Diarrhea abated only to recur in 39 percent of visitors with laboratory-confirmed infection, as it had in Milwaukee residents, suggesting that recurrence of diarrhea was not due to reinfection (Mac Kenzie et al., 1995b).
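The incubation-period calculation described here is simple date arithmetic; the sketch below shows it with hypothetical visitor records (the dates are invented examples, not study data).

```python
# Sketch of the incubation-period calculation described above: subtract each
# visitor's date of arrival (assumed date of first exposure) from the date of
# illness onset. The visitor records below are hypothetical examples.
from datetime import date
from statistics import median

visitors = [
    {"arrival": date(1993, 3, 27), "onset": date(1993, 4, 3)},
    {"arrival": date(1993, 3, 30), "onset": date(1993, 4, 5)},
    {"arrival": date(1993, 4, 2),  "onset": date(1993, 4, 11)},
]

incubation_days = [(v["onset"] - v["arrival"]).days for v in visitors]
print("incubation periods (days):", incubation_days)
print("median incubation period:", median(incubation_days), "days")
```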

FIGURE 2-11

(A) Dates of arrival and (B) dates of onset of illness for 54 persons with laboratory-confirmed Cryptosporidium infection (black bars) and 40 persons with clinically defined (more…)

To determine the rate of secondary household transmission, we looked at nonvisiting members of the 94 visitors’ households. We surveyed 74 people who fit this description, of whom 5 percent experienced watery diarrhea; thus, we concluded that the rate of secondary household transmission was quite low (Mac Kenzie et al., 1995b).

To evaluate for the presence of Cryptosporidium oocysts in the public water supply earlier in the outbreak, we needed to identify a large quantity of archived water. We obtained large blocks of sculpture ice made by a southern Milwaukee ice manufacturer. Because of visible impurities, the ice blocks frozen on specific days could not be used as intended, but fortunately they had been saved. We sampled melted water from these ice blocks made with water coming from the southern treatment plant. These blocks were available for two different days of manufacture around the time of the outbreak (March 25 and April 9; Addiss et al., 1995; Mac Kenzie et al., 1994b). To gauge Cryptosporidium oocyst levels in the water, we melted the blocks from each production day and separated each of the respective samples into aliquots, which were filtered to recover oocysts using a peristaltic pump and two different kinds of filters: a 0.45-micron (absolute pore size) membrane filter (one aliquot for each production day), and a 1.0-micron (nominal pore size) spun polypropylene cartridge (the other aliquot for each production day). At the time, polypropylene cartridges were the standard filtration technique to determine the number of oocysts per liter in raw or finished water; membrane filters were “cutting edge,” but these proved to be a much more sensitive means of detecting oocysts. Using the membrane filter, we detected 13.2 oocysts per 100 liters of melted ice from March 25 (before the peak in turbidity), and 6.7 per 100 liters on April 9 (after the boil-water advisory was invoked; Mac Kenzie et al., 1994b). While quite elevated, these figures likely underestimate the concentrations originally in the water, because freezing disrupts Cryptosporidium oocysts. The median infectious dose of C. parvum among healthy adult volunteers with no serologic evidence of past infection is 132 oocysts (Du Pont et al., 1995). More recently, the 50 percent infectious dose (ID50) of C. hominis among healthy adult volunteers with no serologic evidence of past infection was estimated to be 10 oocysts using a clinical definition of infection and 83 oocysts using a microbiologic definition (Chappell et al., 2006).
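As a small worked example of how such filtration results are expressed, the sketch below converts an oocyst count and a filtered volume into oocysts per 100 liters; the count and volume are hypothetical, since the text reports only the final concentrations.

```python
# Sketch of how an oocyst count from a filtered aliquot scales to the
# "oocysts per 100 liters" figures quoted above. The count and volume below
# are hypothetical; the text gives only the resulting concentrations
# (13.2/100 L on March 25 and 6.7/100 L on April 9 by membrane filtration).
def oocysts_per_100_l(oocysts_recovered: float, liters_filtered: float) -> float:
    """Concentration normalized to 100 liters of sampled water."""
    return oocysts_recovered / liters_filtered * 100.0

# Hypothetical example: 25 oocysts recovered from a 190-liter aliquot.
conc = oocysts_per_100_l(25, 190)
print(f"{conc:.1f} oocysts per 100 liters")

# For comparison, median infectious doses quoted in the text:
id50_c_parvum = 132   # healthy adults, Du Pont et al., 1995
id50_c_hominis = 10   # clinical definition, Chappell et al., 2006
```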

A Confluence of Events and Contributing Factors Leading to This Outbreak and the Investigation of the Milwaukee Water Treatment Plants

We collected considerable data regarding the operation of the two MWW plants and operating conditions during March and April 1993. Figure 2-12 depicts the water treatment process used by the MWW in 1993 in both treatment plants.

FIGURE 2-12

Depiction of the water treatment process used in the northern and southern Milwaukee Water Works water treatment plants in early 1993.

Raw water was introduced by gravity and rapidly mixed with chlorine for disinfection and a coagulant for mechanical flocculation. Following sedimentation, the water was rapidly filtered through sand-filled filtration beds (16 in the north plant and 8 in the south plant) and then stored in a large clear well prior to entry into water distribution pipes (Addiss et al., 1995; Mac Kenzie et al., 1994b). The capacity for producing treated water in each plant was substantial; if one plant was shut down, the full catchment area would still be fully served by the other plant remaining in operation.

In September 1992, both plants changed the type of coagulant used, from the venerable alum to polyaluminum chloride. This was done in response to concern that lead and copper might leach from the aging water distribution infrastructure if the pH was too low, which was more likely to occur if the alum coagulant was used (Addiss et al., 1995).

From January 1983 to January 1993, the turbidity of treated water at the southern plant did not exceed 0.4 nephelometric turbidity unit (NTU). From February to April 1993, the turbidity of treated water at the southern plant did not exceed 0.25 NTU until March 18, when it increased to 0.35 NTU. From March 23 to April 1, the maximal daily turbidity of treated water was consistently 0.45 NTU or higher, with peaks of 1.7 NTU on March 28 and 30, despite an adjustment of the dose of polyaluminum chloride (Figure 2-13; Mac Kenzie et al., 1994b). Although marked improvement in the turbidity of treated water had been achieved by April 1 with the use of polyaluminum chloride, on April 2 the southern plant resumed use of alum instead of polyaluminum chloride as a coagulant. On April 5, the turbidity of treated water increased to 1.5 NTU. During February through April 1993, the turbidity of treated water at the northern plant did not exceed 0.45 NTU (Mac Kenzie et al., 1994b). There was no correlation between the turbidity of treated water and the turbidity or temperature of untreated water. From February through April 1993, samples of treated water from both plants were negative for coliforms and were within the limits established by the Wisconsin Department of Natural Resources for water quality (Mac Kenzie et al., 1994b).

FIGURE 2-13

Maximum daily raw and treated water turbidity at the southern Milwaukee Water Works treatment plant, March-April 1993. SOURCE: Wisconsin Division of Public Health (unpublished).

A federal Environmental Protection Agency (EPA) water engineer inspected both Milwaukee water treatment plants and found them to meet existing state and federal water quality standards at the time of the outbreak. However, at the southern plant the water quality data showed a marked increase in turbidity, which reflected poor filtration. The turbidity was measured every eight hours—the minimum frequency required by authorities for routine monitoring. Prior to the outbreak, water turbidity was not generally viewed as a potential indicator of protozoan contamination.

The EPA inspector also found that, to decrease the costs of chemicals used in water treatment, the Milwaukee plants recycled water used to backflush and clean their sand filters. This backflushed water (backwash) containing whatever was caught by the sand filter was added to source water coming into the plant rather than being discharged into a sewer. Over time, such recycling of backwash effectively increases the concentration of any contaminant in the water being treated by the plant and increases the risk that the sand filters may not effectively remove the contaminant (Mac Kenzie et al., 1994b).

Weather conditions prior to the outbreak were also very unusual (Addiss et al., 1995; Mac Kenzie et al., 1994b). An extremely high winter snowpack had melted rapidly while the frostline remained high, resulting in high runoff containing greater-than-usual levels of organic material. There was also extraordinarily heavy rainfall during March and April that exceeded the previous record for the period between March 21 and April 20 (set in 1929) by 30 percent.

At the time of the outbreak, during periods of heavy rain, Milwaukee’s storm sewers frequently overflowed. During these periods, sewage was chemically disinfected but otherwise bypassed full sewage treatment (Figure 2-14). Thus, during periods of high flow, the storm sewer and sanitary sewer water that bypassed treatment then emptied into an area within a breakfront on Lake Michigan, just north of the intake for the south water plant as may be seen in Figure 2-15 and further depicted in Figure 2-16.

FIGURE 2-14

Milwaukee skyline demonstrating confluence of rivers merging just west of the Milwaukee harbor. The Milwaukee Metropolitan Sewage District plant is located on the land just south of the convergence (more…)

FIGURE 2-15

Milwaukee River emptying into the Lake Michigan harbor following a period of high flow and attendant creation of a plume. Note the breakfront and the southerly movement of the plume. SOURCE: Image (more…)

FIGURE 2-16

Location of the three rivers that flow through Milwaukee County, Wisconsin, the breakfront protecting the city of Milwaukee harbor and the northern and southern Milwaukee Water Works water treatment (more…)

At the same time, high and frequent northeasterly winds (an unusual wind direction in Milwaukee) probably accentuated the southerly flow of water out of the breakfront and toward the intake for the southern water plant (Figure 2-16). The winds also forced the water within the breakfront closer to the lakeshore, accentuating plumes of storm water and treated sewage that flowed through gaps in the breakfront toward the nearby south plant intake grid (Addiss et al., 1995).

At the southern water plant, personnel lacked experience with dosing the new coagulant in response to spikes in finished water turbidity. By the time the decision was made, on April 2, to resume the use of alum as the coagulant, treated water was already significantly contaminated with Cryptosporidium oocysts.

We also became aware of an additional factor that was of potential importance. In early 1993, a university in central Milwaukee was constructing new soccer fields. The drainage from these fields was directed into a small storm sewer that had to be connected to a larger main sewer. When construction workers cut into the main sewer to make this connection, they discovered a large impaction of bovine entrails and other waste from a large meatpacking plant located nearby. Ensuing investigation and inspection by city officials revealed a cross-connection between a sewer from the abattoir kill floor and the storm sewer. This cross-connection had existed for years, and these wastes had accumulated over a prolonged time. Following correction of the cross-connection, the impacted wastes were removed and hauled away in early March. Some of these disrupted wastes could potentially have been discharged through the storm sewer directly into the Menomonee River or, following correction of the cross-connection, could have reached the sewage treatment facility directly. While it is not clear whether the existence and correction of the cross-connection and the clean-up of the sewer influenced this outbreak, the issue was addressed during the investigation.

The Ultimate Source of These Cryptosporidium Oocysts: Animals or Humans

As previously noted, as few as 10 Cryptosporidium hominis oocysts constituted an ID50 in adult volunteers (Chappell et al., 2006). An infected person persistently excretes billions of oocysts over an extended period. The mounting numbers of people ill with watery diarrhea, each of whom were likely excreting billions of oocysts into sanitary sewers during the course of their illness, placed rapidly increasing demand on the MWW water treatment system and perpetuated an explosive cycle of Cryptosporidium-related oocyst ingestion, illness, and oocyst amplification. Additionally, oocysts can remain infective in moist environments for two to six months (Fayer, 2004). The opportunity for infection in this outbreak was, therefore, both inordinately high and sustained.

From the random digit dialing survey, we determined that the highest attack rates of watery diarrhea occurred in people aged 30 to 39 years (Mac Kenzie et al., 1994b). These tended to be working adults, many of whom were commuting from lower risk to higher risk places. Using age-related results of the random digit dialing survey we estimated an attack rate of 18 to 20 percent among children under the age of 10 years, and less than 15 percent among independent-living individuals over 70 years of age (Mac Kenzie et al., 1994b).

Subsequently, the frequency of asymptomatic infection among exposed Milwaukee residents was demonstrated to be very high. Although there was no reliable serologic test for Cryptosporidium at the time of the outbreak, stool and serum samples were collected from volunteers during this period. Sera from children who had blood tests for lead levels around the time of the outbreak were preserved; later, when a serologic test for Cryptosporidium became available, these sera were tested (McDonald et al., 2001). The serologic study revealed that the prevalence of anti-Cryptosporidium antibodies in children from southern Milwaukee increased from 7 percent prior to the outbreak to approximately 80 percent after the outbreak, indicating that most children were infected and that many infections were asymptomatic. These data also supported prior studies implicating the southern plant. Interestingly, the prevalence of anti-Cryptosporidium antibody in Milwaukee children tested in 1998 was 7 percent, indicating that transmission had returned to baseline. The results demonstrated that the Milwaukee outbreak affected considerably more people than we had previously estimated (McDonald et al., 2001).

Efforts were made to obtain large-volume stool specimens from volunteers with acute onset of diarrheal illness during the outbreak and to store the specimens in potassium dichromate, in the hope of maintaining viable oocysts for isolation and subsequent analysis. Only five specimens were obtained, including three from patients with AIDS—one of which was obtained in 1996 from a patient with chronic infection who initially was infected during the 1993 outbreak (Peng et al., 1997). CDC investigators purified oocysts from isolates obtained from stools of four of the volunteers. Approximately one million oocysts from each specimen were orally administered to two-day-old calves or to four- to six-day-old BALB/c or severe combined immunodeficient (SCID) mice. Further, the oocysts were ruptured, parasitic DNA was harvested, specific fragments were amplified, and the DNA was sequenced and analyzed. None of the isolates established infections in calves or mice, suggesting these were not bovine strains. Isolates in the overall study could be divided into two genotypes of what was then classified as Cryptosporidium parvum, on the basis of a genetic polymorphism at one locus; the four Wisconsin isolates were similar to isolates observed only in humans and were noninfective in calves and mice (genotype 1). The other genotype (genotype 2) was infective in calves or mice (Peng et al., 1997). Thus, as noted by the authors of the study, the genotypic and experimental infection data from the four isolates examined suggest a human rather than bovine source. However, the results come from the analysis of only four samples from a massive outbreak, and the degree to which these samples are representative of the entire outbreak remains uncertain.

In later studies, C. parvum genotype 1 became known as C. hominis (Morgan-Ryan et al., 2005). Infection with a strain that was human-adapted rather than bovine-adapted is consistent with the massive numbers of human illnesses and asymptomatic human infections noted in this outbreak.

Peng et al. noted that possible sources of Lake Michigan’s contamination with Cryptosporidium oocysts included cattle along two rivers that flow into the Milwaukee Harbor, slaughterhouses, and human feces (Addiss et al., 1995; Mac Kenzie et al., 1994b; Peng et al., 1997). At that time, cattle had been the most commonly implicated source of water contamination in Cryptosporidium outbreaks outside the United States, but not conclusively within the United States (Peng et al., 1997). Measures for preventing water contamination have in some cases included the removal of cattle from watershed areas in or around municipalities. If, however, sewer overflows and inadequate sewage treatment are the primary sources of water contamination in urban settings where anthroponotic cycles are maintained, focusing only on cattle could fail to eliminate a very important source of infection (Peng et al., 1997). This point is particularly important for the Milwaukee outbreak, given the combined sewer overflows (a prolonged high-flow interval in March and April 1993), the attendant inadequate sewage treatment, and the anthroponotic nature of the outbreak.

Lessons Learned

Among the many lessons learned from the 1993 Milwaukee Cryptosporidium outbreak, the following lessons and needs stand out:

  1. Consistent application of stringent water quality standards. At the time of the outbreak, drinking water was regulated either by the EPA or by individual states, as was the case in Wisconsin, and the MWW water treatment and quality testing results were in compliance with all state and federal standards. Existing state and federal standards for treated water were insufficient to prevent this outbreak (Mac Kenzie et al., 1994b). Moreover, it is important to use measures of turbidity in treated water as an indicator of potential contamination rather than viewing turbidity as an aesthetic measure of clarity. Consistent application of stringent water quality standards was needed. More stringent federal water quality standards, which had been under development for several years, were implemented shortly after this massive waterborne outbreak (Addiss et al., 1995). The vastly improved attention to monitoring and to the quality of water filtration is a powerful impact of this investigation.
  2. Application of technical advances to monitor water safety and minimize the delivery of inadequately filtered water to the public. The post-filtration turbidity (and particle counts, if possible) of treated water should be monitored continuously for each filter to detect changes in filtration status. Alarm systems for each filter, together with particle-counting devices, can detect spikes in particles in the size range of Cryptosporidium oocysts and facilitate rapid filter shutdown and diversion of potentially contaminated water when thresholds are reached (Mac Kenzie et al., 1994b); a minimal sketch of such an alarm rule follows this list.
  3. Testing of source and finished water for Cryptosporidium. This was needed to detect risk for an outbreak and to determine when the water was safe to drink afterward. At the time of the outbreak, the sampling process for such testing was difficult and lengthy, and it was not standardized. Improved means of sampling and testing source and finished water for Cryptosporidium were needed.
  4. Environmental studies. A coordinated plan was needed to investigate the environment following a waterborne outbreak of Cryptosporidium infection, but few such events had occurred. Thus, we had to overcome considerable challenges, particularly in designing, funding, and mobilizing appropriate studies relevant to human health. Equipment and trained personnel were needed for rapid deployment of specimen collection followed by prompt testing.
  5. Surveillance. Cryptosporidium infection was not a reportable public health condition at the time of the outbreak. Watery diarrhea proved to be a good case definition for Cryptosporidium infection in an outbreak setting; a more refined clinical case definition was necessary to detect sporadic cases. The random-digit-dialing surveys were very valuable in assessing the scope and progress of this large community outbreak, and nursing home surveillance, as described, was very effective (Mac Kenzie et al., 1994b; Proctor et al., 1998). A surveillance system to analyze consumer complaints to the water authority would also have been useful; the spike in complaints to the MWW was very striking and might have focused attention on the unusual turbidity of treated water that preceded the outbreak (Proctor et al., 1998).
  6. Testing of human stool and serum. Because of the time and expense involved, generally only patients with HIV infection, particularly those with AIDS, were routinely tested for Cryptosporidium at the time the outbreak occurred. The infrequent use of these tests likely contributed both to the delay in recognizing the outbreak and to the delay in determining its cause. Improved assays were clearly needed. The striking data from the serologic testing of children (McDonald et al., 2001) demonstrated the value of serologic assays for assessing the background occurrence and the magnitude and impacts of outbreaks of Cryptosporidium infection.
  7. Routine assays for Cryptosporidium. Physicians, other clinicians, and public health officials clearly needed to broaden and sustain the index of suspicion for Cryptosporidium infection (Mac Kenzie et al., 1994b). This was challenging because of added costs of testing and the limited assays available at that time.
  8. Communication. To our advantage, we had good interagency communication and worked closely with communities of individuals at greatest risk. For example, the AIDS service organization in Milwaukee had access to over 700 case patients, and we could monitor morbidity and mortality in that population (Frisby et al., 1997). As a result of the outbreak, we developed targeted public health messages and shared them with other health departments. However, due to insufficient understanding of the pathogen and its public health effects, we lacked guidelines for governmental response to findings of oocysts, increased turbidity of finished water, and elevated particle counts in finished water. During our investigation, we worked to establish interagency coordination to remedy this situation.
  9. The media. Electronic and print media were essential to communicating risk and delivering other important public health messages during the outbreak, and in particular facilitated public inquiry by setting up phone banks. The Milwaukee Journal and the Milwaukee Sentinel jointly produced an issue in Spanish to inform a large segment of non-English speakers about the outbreak, and they maintained a help line in Spanish as well. The media published treated water turbidity data on a regular basis, which was especially helpful to individuals with HIV infection and AIDS. Daily news conferences and televised updates occurred through the lifting of the boil water advisory on April 14 and related articles appeared daily in the news for weeks.
  10. Other lessons. The many outbreak-related studies conducted during the time of the outbreak yielded other lessons on the clinical spectrum of cryptosporidiosis, epidemiologic features of cryptosporidiosis, effectiveness of control measures in specific subpopulations, effectiveness of preventive measures, and the economic impact of such a massive outbreak. Recurrence of diarrhea after a period of apparent recovery was documented frequently, occurring in 39 percent of persons with laboratory-confirmed Cryptosporidium infections (Mac Kenzie et al., 1995b), a finding that has implications for patient counseling. One study found that the severity of illness, but not the attack rate, was significantly greater in persons with HIV infection (Frisby et al., 1997). Among young children attending daycare centers, asymptomatic or minimally symptomatic Cryptosporidium infection was frequent (Cordell et al., 1997), a fact that may have contributed to several outbreaks associated with recreational water in other parts of the state several months after the Milwaukee outbreak (CDC, 1993; Mac Kenzie et al., 1995a). Surveillance studies revealed the potential usefulness of monitoring sales of over-the-counter antidiarrheal drugs as an early indicator of community-wide outbreaks (Proctor et al., 1998); the effectiveness of control measures and the absence of drinking water as a risk factor for the relatively low level of transmission during the post-outbreak period (Osewe et al., 1996); the importance of testing more than one stool specimen (Cicirello et al., 1997); the usefulness of death certificate review for estimating outbreak-related mortality (Hoxie et al., 1997); and the effectiveness of point-of-use water filters with pore diameters of less than 1 micron (Addiss et al., 1996). Additionally, a detailed cost analysis revealed the enormous economic impact of the outbreak (Corso et al., 2003).
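
Returning to lesson 2 above, the alarm logic being described amounts to a per-filter threshold rule on post-filtration turbidity and oocyst-sized particle counts. The sketch below is a minimal illustration of such a rule; the thresholds, size range, and function names are hypothetical and are not drawn from MWW’s actual control system.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; real plants set limits based on
# regulatory requirements and site-specific baseline performance.
TURBIDITY_ALARM_NTU = 0.3          # post-filtration turbidity trigger
PARTICLE_ALARM_PER_ML = 50         # particles in the ~3-7 micron (oocyst-sized) range

@dataclass
class FilterReading:
    filter_id: str
    turbidity_ntu: float           # continuous post-filtration turbidity
    particles_per_ml: float        # oocyst-sized particle count, if a counter is fitted

def evaluate(reading: FilterReading) -> str:
    """Return the action implied by one filter's current readings."""
    if (reading.turbidity_ntu >= TURBIDITY_ALARM_NTU
            or reading.particles_per_ml >= PARTICLE_ALARM_PER_ML):
        # Shut the affected filter down and divert its water rather than sending
        # potentially contaminated water to the distribution system.
        return f"ALARM: shut down and divert filter {reading.filter_id}"
    return f"filter {reading.filter_id}: within limits"

if __name__ == "__main__":
    for r in (FilterReading("F1", 0.08, 12), FilterReading("F2", 0.45, 140)):
        print(evaluate(r))
```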

Conclusions and Outcomes

The 1993 Milwaukee cryptosporidiosis outbreak was the largest documented waterborne disease outbreak in the United States. Cryptosporidium oocysts in untreated water from Lake Michigan that entered the southern water treatment plant were inadequately removed by the plant’s coagulation and filtration process. Water quality standards were inadequate to prevent this outbreak, and the lack of laboratory testing for Cryptosporidium delayed recognition of the microbial etiology of the outbreak. Although the environmental source of the oocysts is not known with certainty, limited genotyping data from the small number of human stool specimens obtained during the outbreak support the hypothesis that the source was human. Regardless of the specific source, a confluence of important factors contributed to the occurrence of this massive outbreak (Mac Kenzie et al., 1994b). How oocysts ultimately made their way from sewers into water feeding the intake of the southern water treatment plant (e.g., by sewage overflows related to heavy rains, cross-connections, or inadequate treatment at the sewage treatment plant located at the mouth of the Milwaukee River, facilitated by unusual wind conditions) is unknown.

Based on these conclusions and the opinions of EPA staff and other consultants, the MWW instituted continuous turbidity monitoring on all of its filters. They also put alarms on the filters to enable automatic shutdown if turbidity reached a threshold level, and they set and achieved the goal of maintaining turbidity at a very low level. The MWW modified and improved water treatment procedures and adopted very stringent water quality standards. The filter beds in both treatment plants were substantially enhanced, and new filter media were installed. The intake grid for raw water entering the southern plant was moved considerably farther eastward. These efforts resulted in continuous production by the MWW of high-quality treated water with mean turbidities of 0.01 NTU (Mac Kenzie et al., 1994a). Furthermore, recognizing that Cryptosporidium oocysts are highly resistant to chlorine, the City of Milwaukee constructed ozonation plants at each treatment facility; ozone disrupts the oocyst wall and inactivates oocysts that withstand chlorine disinfection.

When we studied turbidity events among Wisconsin surface water treatment plants over a 10-year period, we discovered other sites with similar challenges. For example, during the months of February through April, turbidity events occur frequently on Lake Michigan; these events affect all treatment plants that use water from this lake (Wisconsin Division of Public Health, unpublished data).

We advocated increasing Cryptosporidium testing of stools from persons with watery diarrhea and made Cryptosporidium infection a reportable condition (Hayes et al., 1989). Annually, the DPH receives about 450 reports of Cryptosporidium infection: some are recreational, some are connected with agriculture, but rarely do they originate in Milwaukee (Wisconsin Division of Public Health, unpublished data). We also advocated including Cryptosporidium testing in federal rules that stipulated the collection of information on both raw water sources and finished water (Juranek et al., 1995).

The massive Milwaukee waterborne Cryptosporidium outbreak, and the resulting modification of the city’s water treatment facilities, has received attention from water authorities worldwide. The attendant events and actions have been very instructive.

PREVENTION IS PAINFULLY EASY IN HINDSIGHT: FATAL E. COLI O157:H7 AND CAMPYLOBACTER OUTBREAK IN WALKERTON, CANADA, 2000

Steve E. Hrudey, Ph.D., University of Alberta; Elizabeth J. Hrudey, University of Alberta

Summary

In May 2000, a comfortable rural community of about 4,800 people in Ontario, Canada’s most populous province, experienced an outbreak of waterborne disease that killed seven people and caused serious illness in many others. The contamination was ultimately traced to a source that had been identified 22 years earlier as a threat to the drinking water system, but no remedial action had been taken to manage the public health risk. The operators of the system were oblivious to this danger, and the regulators responsible for safe drinking water largely overlooked the problems that existed. Even as the outbreak unfolded, the regulatory response was slow and unfocused, suggesting that a serious loss of capacity to regulate drinking water safety had occurred in Ontario.

Introduction

Walkerton, located about 175 km northwest of Toronto, Ontario, Canada, experienced serious drinking water contamination in May 2000. The facts of this account are drawn from the Walkerton Inquiry (the Inquiry), a $9 million public inquiry into this disaster called by the Ontario Attorney General (O’Connor, 2002a). This disaster has resulted in a complete overhaul of the drinking water regulatory system in Ontario.

What Happened in Walkerton

Walkerton was served by three wells in May of 2000, identified as Wells 5, 6, and 7. Well 5 was located on the southwest edge of the town, bordering adjacent farmland. It was drilled in 1978 to a depth of 15 m with 2.5 m of overburden and protective casing pipe to 5 m depth (O’Connor, 2002a). The well was completed in fractured limestone with the water-producing zones ranging from 5.5 to 7.4 m depth and it provided a capacity of 1.8 ML/d that was able to deliver ~56 percent of the community water demand. Well 5 water was to be chlorinated with hypochlorite solution to achieve a chlorine residual of 0.5 mg/L for 15 minutes contact time.

Well 6 was located 3 km west of Walkerton in rural countryside and was drilled in 1982 to a depth of 72 m with 6.1 m of overburden and protective casing to 12.5 m depth (O’Connor, 2002a). An assessment after the outbreak determined that Well 6 operated from seven producing zones with approximately half the water coming from a depth of 19.2 m. This supply was judged to be hydraulically connected to surface water in an adjacent wetland and a nearby private pond. Well 6 was disinfected by a gas chlorinator and provided a nominal capacity of 1.5 ML/d that was able to deliver 42 to 52 percent of the community water demand (O’Connor, 2002a).

Well 7, located approximately 300 m northwest of Well 6, was drilled in 1987 to a depth of 76.2 m with 6.1 m of overburden and protective casing to 13.7 m depth (O’Connor, 2002a). An assessment following the outbreak determined that Well 7 operated from three producing zones at depths greater than 45 m with half the water produced from below 72 m. A hydraulic connection discovered between Well 6 and Well 7 reduced the security of an otherwise good-quality groundwater supply. Well 7 was also disinfected by a gas chlorinator and provided a nominal capacity of 4.4 ML/d that was able to deliver 125 to 140 percent of the community water demand (O’Connor, 2002a).
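
The capacity figures quoted for the three wells can be cross-checked with simple arithmetic: dividing each well’s capacity by the fraction of demand it could meet back-calculates the implied community demand. The snippet below performs only that arithmetic, using the figures from the Inquiry report as quoted above.

```python
# Back-calculate the community demand implied by each well's stated capacity
# and the fraction of demand it could meet (figures quoted from the Inquiry report).
wells = {
    "Well 5": (1.8, (0.56, 0.56)),     # capacity in ML/d, (min fraction, max fraction)
    "Well 6": (1.5, (0.42, 0.52)),
    "Well 7": (4.4, (1.25, 1.40)),
}

for name, (capacity_mld, (f_min, f_max)) in wells.items():
    lo, hi = capacity_mld / f_max, capacity_mld / f_min
    print(f"{name}: implied community demand {lo:.1f}-{hi:.1f} ML/d")
# All three imply a demand of roughly 3 to 3.5 ML/d, so Well 7 alone could supply
# the town while Wells 5 and 6 individually could not.
```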

From May 8 to May 12, Walkerton experienced ~134 mm of rainfall, with 70 mm falling on May 12. This was unusually heavy, but not record, precipitation for this location. Such rainfall over a 5-day period was estimated by Environment Canada to happen approximately once in 60 years (on average) for this region in May (Auld et al., 2004). The rainfall of May 12, which was estimated by hydraulic modeling to have occurred mainly between 6 PM and midnight, produced flooding in the Walkerton area.

Stan Koebel, the general manager of the Walkerton Public Utilities Commission (PUC), was responsible for managing the overall operation of the drinking water supply and the electrical power utility. From May 5 to May 14, he was away from Walkerton, in part to attend an Ontario Water Works Association meeting. He had left instructions with his brother Frank, the foreman for the Walkerton PUC, to replace a nonfunctioning chlorinator on Well 7. From May 3 to May 9, Well 7 was providing the town with unchlorinated water in contravention of the applicable provincial water treatment requirements.

From May 9 to 15, the water supply was switched to Wells 5 and 6. Well 5 was the primary source during this period, with Well 6 cycling on and off, except for a period from 10:45 PM on May 12 until 2:15 PM on May 13 when Well 5 was shut down. Testimony at the Inquiry offered no direct explanation about this temporary shutdown of Well 5. No one admitted to turning Well 5 off and the supervisory control and data acquisition (SCADA) system was set to keep Well 5 pumping. Flooding was observed near Well 5 on the evening of May 12 because of the heavy rainfall that night, but why or how Well 5 was shut down for this period remains unknown.

On May 13 at 2:15 PM, Well 5 resumed pumping. That afternoon, according to the daily operating sheets, foreman Frank Koebel performed the routine daily checks on pumping flow rates and chlorine usage, and measured the chlorine residual on the water entering the distribution system. He recorded a daily chlorine residual measurement of 0.75 mg/L for Well 5 treated water on May 13 and again for May 14 and 15. Testimony at the Inquiry indicated that these chlorine residual measurements were never made and that all the operating sheet entries for chlorine residual were fictitious. The monitoring data were typically entered as either 0.5 mg/L or 0.75 mg/L for every day of the month.

On Monday, May 15, Stan Koebel returned and early in the morning turned on Well 7, presumably believing that his instruction to install the new chlorinator had been followed. When he learned a few hours later that it had not, he continued to allow Well 7 to pump into the Walkerton system, without chlorination, until Saturday, May 20. Well 5 was shut off at 1:15 PM on May 15, making the unchlorinated Well 7 supply the only source of water for Walkerton during the week of May 15. Because the Well 5 supply was ultimately determined to be the source of pathogen contamination of the Walkerton drinking water system, the addition of unchlorinated water into the system from Well 7 would have failed to reduce the pathogen load by any means other than dilution.

PUC employees routinely collected water samples for bacteriological testing on Mondays. Samples of raw and treated water were to be collected from Well 7 that day along with two samples from the distribution system. Although samples labeled Well 7 raw and Well 7 treated were submitted for bacteriological analyses, the Inquiry concluded that these samples were not taken at Well 7 and were more likely to be representative of Well 5. Stan Koebel testified that PUC employees sometimes collected their samples at the PUC shop, located nearby and immediately downstream from Well 5, rather than traveling to the more distant wells (~3 km away) or to the distribution system sample locations.

During this period, a new water main was being installed (the Highway 9 project). The contractor and consultant for this project asked Stan Koebel if they could submit their water samples from this project to the laboratory being used by the Walkerton PUC for bacteriological testing. Stan Koebel agreed and included three samples from two hydrants for the Highway 9 project. On May 1, the PUC began using a new laboratory for bacteriological testing, a lab the PUC had previously used only for chemical analyses.

The first set of samples submitted to the new laboratory was taken on May 1, but was improperly submitted with inadequate sample volumes for the analyses requested and discrepancies between the written documentation and numbers of samples sent. No samples were submitted by the PUC for May 8. The May 15 samples taken by the PUC repeated the problems with inadequate sample volumes and discrepancies in the paperwork.

Early on the morning of Wednesday, May 17, the lab phoned Stan Koebel to advise him that all of the water main construction project samples taken May 15 were positive for E. coli and total coliforms, and that the distribution system samples were also contaminated. Because these tests indicated only the presence or absence of indicator bacteria, it was not possible to estimate the numbers of indicator bacteria in each sample. Only the sample labeled Well 7 treated was analyzed by the membrane filtration method. The latter procedure would normally allow a bacterial count to be determined, but in this case the sample was so contaminated that it produced an overgrown plate with bacterial colonies too numerous to count (both total coliforms and E. coli > 200/100 mL). The Inquiry concluded that this sample was most likely mislabeled and was more likely representative of the water from Well 5 entering the distribution system.

The new laboratory was not familiar with the “expectations” (not regulatory requirements at that time) to report adverse microbial results to either the Ministry of Environment (MOE) or the responsible Medical Officer of Health (MOH). Accordingly, this lab reported these adverse sample results only to the PUC General Manager, Stan Koebel. In turn, he advised the consultant for the Highway 9 project contractor that their samples had failed so they would need to rechlorinate, flush and re-sample to complete the project.

On Thursday, May 18, the first signs of illness were becoming evident in the health-care system. Two children, a seven-year-old and a nine-year-old, were admitted to the hospital in Owen Sound, 65 km from Walkerton. The first child had bloody diarrhea and the second developed bloody diarrhea that evening. The attending pediatrician, Dr. Kristen Hallett, noted that both children were from Walkerton and attended the Mother Theresa School in Walkerton. Bloody diarrhea is a notable symptom for serious gastrointestinal infection, particularly infection with Escherichia coli O157:H7. Accordingly, Dr. Hallett submitted stool samples from these children to evaluate that diagnosis. By May 18, at least 20 students were absent from the Mother Theresa School.

By Friday, May 19, the outbreak was evident at many levels. Twenty-five children were now absent from the Mother Theresa School and 8 children from the Walkerton public school were sent home suffering from stomach pain, diarrhea, and nausea. Three residents of the Maple Court Villa retirement home and several residents of the BruceLea Haven long-term care facility developed diarrhea, two with bloody diarrhea. A Walkerton physician had examined 12 or 13 patients suffering from diarrhea to that time. Dr. Hallett first notified the Bruce-Grey-Owen Sound Health Unit, the responsible public health agency for Walkerton with its main office in Owen Sound, of the emerging problems on May 19. She expressed concerns to Health Unit staff that Walkerton residents were telling her something was “going on” in Walkerton, and the receptionist from the Mother Theresa School advised that the parent of one student stated that something was wrong with the town’s water supply.

An administrator at the Mother Theresa School called James Schmidt, the public health inspector at the Walkerton office of the Health Unit, to report the 25 children absent. She noted that some were from Walkerton, others from adjacent rural areas, and that the ill students were from different grades and classrooms. She suspected the town’s water supply. In contrast, the Health Unit officials suspected a foodborne basis for the outbreak, by far the most common cause of such diseases. Nonetheless, James Schmidt placed a call to Stan Koebel in the early afternoon of May 19. By the time he called, the chlorinator had been installed on Well 7 so that it was now supplying chlorinated water to Walkerton’s distribution system. According to James Schmidt, Stan Koebel was asked whether anything was wrong with Walkerton’s water and he advised Schmidt that “everything’s okay” (J. Schmidt, Inquiry Transcript of Evidence, December 15, 2000, p. 172). By the time of that conversation, Stan Koebel had been faxed the adverse microbial results from the Highway 9 project, the distribution system, and the sample labeled Well 7 treated two days earlier.

Later that afternoon, David Patterson, an administrator of the Health Unit based in Owen Sound, called Stan Koebel to advise him of public concerns about the water. Patterson asked whether anything unusual had happened in the water system. Stan Koebel mentioned that there was water main construction under way near the Mother Theresa School, but made no mention of the adverse bacteriological results or of operating Well 7 from May 3 to 9 and from May 15 to 19 without a chlorinator.

The Inquiry concluded that Stan Koebel’s lack of candor seriously hampered the Health Unit’s early investigation of and response to the outbreak. Because patients had bloody diarrhea, health officials suspected the outbreak was caused by E. coli O157:H7. The failure of PUC personnel to mention any problems with the Walkerton water system allowed health officials to continue their misdirected search for a foodborne cause of the outbreak. At that time, Health Unit personnel were not aware that any outbreaks of this disease had occurred in a chlorinated drinking water system; earlier waterborne outbreaks of E. coli O157:H7 (in Cabool, Missouri; Alpine, Wyoming; and Washington County, New York) had involved unchlorinated drinking water (Hrudey and Hrudey, 2004). Stan Koebel’s reassurances about the water’s safety kept the Health Unit staff pursuing a foodborne cause. However, the emerging outbreak, with cases distributed across a wide geographic region and across the population from the very young to the very old, made any foodborne explanation increasingly improbable.

Suspicions about the safety of the water were spreading in the community. The BruceLea Haven nursing home, where a number of patients had become ill, began to boil water on the initiative of the nursing staff despite the absence of any public health warnings. Some citizens, including Robert MacKay, an employee of the Walkerton PUC, also began to boil their water on Friday, May 19. After his conversations with health officials that afternoon, in which he reassured them about the water, Stan Koebel increased the chlorination level at Well 7. He also began to flush the distribution system through a hydrant near the Mother Theresa School and subsequently at other hydrants throughout the system until May 22.

By Saturday, May 20, on a long holiday weekend, the outbreak was straining the Walkerton hospital with more than 120 calls from concerned residents, more than half of whom complained of bloody diarrhea. After the Owen Sound hospital determined that a stool sample from one of the children admitted on May 18 was presumptive positive for E. coli O157:H7, the health unit notified other hospitals in the region because this pathogen may cause hemolytic uremic syndrome (HUS). This was important because antidiarrheal medication or antibiotics, which might be prescribed for diarrhea, can worsen the condition of patients infected with this pathogen, so emergency staff had to avoid dispensing such medication.

David Patterson asked James Schmidt to contact Stan Koebel again to determine the current chlorine residual levels in the water and to receive reassurance that the water system would be monitored over the holiday weekend. Koebel assured Schmidt that there were measurable levels of chlorine residual in the distribution system, leading health officials to believe that the water system was secure.

Early on Saturday afternoon, David Patterson contacted Dr. Murray McQuigge, the local Medical Officer of Health who was out of town during the onset of the outbreak, to advise him of the emerging outbreak. By that time, several people in Walkerton were reporting bloody diarrhea and ten stool samples had been submitted for pathogen confirmation. Dr. McQuigge advised that any further cases diagnosed with E. coli O157:H7 should be interviewed for more details, and he returned that evening to Owen Sound.

David Patterson called Stan Koebel to advise him that a local radio station was reporting that Walkerton water should not be consumed. Patterson wanted Koebel to call the radio station to correct this report and reassure the public about the safety of the Walkerton water supply, but Koebel was apparently reluctant to comply with this request. Patterson asked again whether anything unusual had occurred in the water system and Koebel again failed to report the adverse microbiological results from the May 15 samples or that Well 7 had been operating with no chlorination.

Robert MacKay, a PUC operator who had been on sick leave from the PUC, began to suspect something was wrong with Walkerton’s water. He had learned from Frank Koebel that the samples from the Highway 9 project had failed testing. MacKay phoned the Spills Action Centre (SAC) of the MOE anonymously to report his concerns and provide a contact number at the PUC for the MOE to call about the Walkerton water system. In the early afternoon of Saturday, May 20, Christopher Johnston, the MOE employee who received MacKay’s anonymous call, phoned Stan Koebel to find out if there were problems with the system. Johnston understood from this conversation with Stan Koebel that any problems with bacteriological results had been limited to the Highway 9 mains replacement project some weeks earlier, but that chlorine residual levels were satisfactory as of May 19. MacKay, now experiencing diarrhea himself, placed another call to the MOE number that evening to find out what was being done. MacKay was advised that Stan Koebel had been contacted, but that MacKay’s concern about drinking water safety was really a matter for the Ministry of Health. This feedback from the MOE was wrong: the MOE was designated as the lead agency for drinking water regulation in Ontario. MacKay was provided with a phone number for the wrong regional health office, eventually leading him to call back to the SAC. This time, the MOE SAC staff person agreed to contact the nearest MOE office, in Owen Sound, with a request to investigate the matter.

The outbreak continued to expand. By Sunday, May 21, there were more than 140 calls to the Walkerton hospital and two more patients admitted to the Owen Sound hospital. A local radio station interviewed Dr. McQuigge on Sunday morning and subsequently reported on the noon news that Dr. McQuigge believed that drinking water contamination was an unlikely source of this outbreak. At about that time, the Health Unit was advised that the first presumptive E. coli O157:H7 case had been confirmed and that another patient sample, presumptive for E. coli O157:H7, was being tested for confirmation. David Patterson and Dr. McQuigge conferred with their staff about these results and decided to issue a boil water advisory at 1:30 PM on Sunday, May 21. The notice, hastily drafted by David Patterson, stated:

The Bruce-Grey-Owen Sound Health Unit is advising residents in the Town of Walkerton to boil their drinking water or use bottled water until further notice. The water should be boiled for five minutes prior to consumption. This recommendation is being made due to a significant increase in cases of diarrhea in this community over the past several days.

Although the Walkerton PUC is not aware of any problems with their water system, this advisory is being issued by the Bruce-Grey-Owen Sound Health Unit as a precaution until more information is known about the illness and the status of the water supply.

Anybody with bloody diarrhea should contact his or her doctor or the local hospital.

This notice was provided only to the local AM and FM radio stations; additional publicity by the television station or by direct door-to-door notification was not pursued. According to the report subsequently prepared on the outbreak with the assistance of Health Canada (BGOSHU, 2000), a community survey showed that only 44 percent of respondents were aware that the Health Unit had issued a boil water advisory on May 21 and only 34 percent heard the announcement on the radio. In retrospect, Health Unit personnel acknowledged that the community could have been more effectively notified. However, given Stan Koebel’s consistent reassurance about the safety of the Walkerton water system, the Health Unit’s caution in attributing the outbreak to the local drinking water at this emerging stage of the outbreak is understandable.

After issuing the boil water advisory, Dr. McQuigge notified the MOE SAC that there was an E. coli outbreak in Walkerton. In turn, the SAC advised Dr. McQuigge about the anonymous calls regarding adverse results for the Walkerton water system. The Health Unit updated the MOE SAC that there were now 2 confirmed cases of E. coli O157:H7 and 50 cases of bloody diarrhea. The MOE called Stan Koebel to discuss the situation; Koebel once again failed to report the adverse samples from May 15 (reported to him on May 17). During his Inquiry testimony, Stan Koebel responded to a question about whether he had deliberately avoided disclosing these results during his conversation with MOE personnel by answering: “I guess that’s basically the truth and I was waiting on the Ministry of the Environment to call from the Owen Sound office with further confirmation” (S. Koebel, Inquiry Transcript of Evidence, December 20, 2000, p. 108).

The Health Unit established a strategic outbreak team to deal with the emergency. Local public institutions were to be notified about the boil water advisory, but two high-risk institutions, the BruceLea Haven nursing home and the Maple Court Villa retirement home, were inadvertently missed. The Walkerton hospital had been reassured about the safety of the water until that afternoon and had not taken any measures to address water safety. In fact, hospital staff had been advising those caring for patients with diarrhea to provide ample fluids to maintain patient hydration, advice that caused many ill patients to be additionally exposed to contaminated Walkerton tapwater.

Once notified of the problems, the hospital was forced to find an alternative safe water and ice supply, shut off its public fountains, and discard any food prepared or washed with Walkerton tap water. By that evening, the Health Unit had notified provincial health officials of the outbreak and requested the assistance of major hospitals in London (over 150 km distant) and Toronto (over 200 km distant) in treating Walkerton residents and the assistance of Health Canada in conducting an epidemiological investigation.

By Monday, May 22, the Health Unit had received reports of 90 to 100 cases of E. coli infection. Phillip Bye, the regional MOE official in Owen Sound, who had been notified the previous evening about the outbreak, did not initiate a MOE investigation, even after being advised about the large number of cases of E. coli infection and that the Health Unit suspected the Walkerton water system. Only after being contacted later that day by Dr. McQuigge, who stressed the urgency of the situation, did the regional MOE initiate an investigation by sending environmental officer James Earl to Walkerton to meet first with the Health Unit before meeting Stan Koebel. The Health Unit advised Earl about the “alarming” number of illnesses and said that Health Unit investigations failed to reveal any plausible foodborne cause, making the water system highly suspect. David Patterson asked Earl to obtain any microbiological test results from the PUC for the previous two weeks. Earl was also informed of the anonymous call. He surmised, without any supporting evidence, that intentional contamination might be possible. When Earl interviewed Stan Koebel and asked about any unusual events of the previous two weeks, Koebel did not tell him about the adverse bacteriological results for May 15 or the operation of Well 7 without a chlorinator. However, Koebel provided Earl with a number of documents, including the May 17 report (results for May 15).

James Earl returned to Owen Sound and reviewed these documents that evening. Although Earl noted the result showing high E. coli numbers for the water system, he did not report this alarming evidence to his supervisor, Phillip Bye, or the Health Unit at that time. James Earl apparently believed that the boil water advisory eliminated any urgency concerning the revelation about adverse microbial results for Walkerton’s drinking water supply.

In the meantime, the Health Unit began to plot an outbreak curve that revealed an apparent peak of disease onset for May 17, suggesting a most likely date of contamination between May 12 and 14. They also plotted the residence locations for those who were infected. This plot revealed that cases were distributed all across the area served by the Walkerton water distribution system. By that evening, the Health Unit was convinced this was a waterborne outbreak, even though they had not yet been provided with the adverse results for May 15.
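
The Health Unit’s inference is essentially a back-calculation: subtracting typical incubation periods from the May 17 onset peak points to an exposure window around May 12 to 14. The sketch below reproduces that reasoning using commonly cited incubation ranges for the two principal pathogens; these ranges are textbook values assumed for illustration, not figures taken from the Inquiry.

```python
from datetime import date, timedelta

peak_onset = date(2000, 5, 17)            # apparent peak of illness onset

# Commonly cited incubation ranges in days (textbook values, not from the Inquiry).
incubation_days = {
    "E. coli O157:H7":      (3, 4),
    "Campylobacter jejuni": (2, 5),
}

for pathogen, (shortest, longest) in incubation_days.items():
    earliest_exposure = peak_onset - timedelta(days=longest)
    latest_exposure = peak_onset - timedelta(days=shortest)
    print(f"{pathogen}: likely exposure {earliest_exposure} to {latest_exposure}")
# Both windows fall on or around May 12-15, consistent with contamination being
# washed into the shallow Well 5 by the heavy rainfall of May 12.
```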

On Monday, May 22, the first victim of the outbreak, a 66-year-old woman, died. A 2-year-old child who had visited Walkerton on Mother’s Day (May 14) and consumed only one glass of water died on Tuesday, May 23. Ultimately, 5 more deaths (7 in total), 27 cases of HUS, a life-threatening kidney condition that may subsequently require kidney transplantation (median age of the HUS cases, 4 years), and 2,300 cases of gastrointestinal illness were attributed to consuming Walkerton water. Stool cultures from victims confirmed exposure to E. coli O157:H7, Campylobacter jejuni, and other enteric pathogens (BGOSHU, 2000). Longer-term health consequences have been both documented (longer-term gastrointestinal symptoms, irritable bowel syndrome, arthritis symptoms, and albuminuria; Garg et al., 2006a, 2008a,b; Marshall et al., 2006) and found absent (renal sequelae among non-HUS cases; Garg et al., 2006b).

Well 5 was providing drinking water to Walkerton during the period of most likely contamination (May 12 to 14) according to the SCADA system. Well 5 was located close to two farms posing a water contamination risk from manure (Figure 2-17). The original hydrogeology report for Well 5, written in 1978, recognized the risk of contamination from nearby farms, having found that fecal coliforms appeared after 24 hours during pump testing (O’Connor, 2002a).

FIGURE 2-17

Location of Walkerton Well 5 near farms to the south and west. SOURCE: Adapted from an original photo taken for the Walkerton Inquiry by Constable Marc Bolduc, Royal Canadian Mounted Police; used with permission.

As illness emerged in the community, the Koebel brothers remained convinced that water was not to blame and they continued to drink the water. In the past, they had often consumed Well 5 water before chlorination because they did not recognize the danger of pathogen contamination.

Direct Causes of the Walkerton Outbreak

The immediate direct cause of the failure was that the chlorine demand exerted by organic loading from manure contamination arising from a nearby farm barnyard overwhelmed the fixed chlorine dose used by the Walkerton PUC, leaving no disinfection capacity to inactivate the pathogens entering the distribution system. If the chlorine residual had been monitored as it should have been by the PUC operators, this problem would have been immediately evident, but no valid chlorine residuals were measured during the critical period (around May 12) after contamination was washed into the shallow Well 5.
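
The mechanism is a simple mass balance: the free residual available for disinfection is the applied dose minus the chlorine demand exerted by the incoming water, and the CT credit (residual concentration multiplied by contact time) collapses to zero once demand reaches the dose. The sketch below illustrates this using the 0.5 mg/L residual and 15-minute contact time cited earlier for Well 5; the assumed dose and demand values are hypothetical.

```python
# Illustrative chlorine mass balance; the dose and demand values are hypothetical.
CHLORINE_DOSE_MG_L = 1.5   # assumed dose intended to leave 0.5 mg/L after ~1 mg/L normal demand
CONTACT_TIME_MIN = 15      # target contact time cited for Well 5

def residual_and_ct(demand_mg_l: float):
    """Free chlorine residual remaining after demand is satisfied, and the CT credit."""
    residual = max(CHLORINE_DOSE_MG_L - demand_mg_l, 0.0)
    return residual, residual * CONTACT_TIME_MIN          # CT in mg·min/L

for demand in (1.0, 1.5, 3.0):   # e.g., normal raw water vs. manure-laden runoff
    residual, ct = residual_and_ct(demand)
    print(f"demand {demand:.1f} mg/L -> residual {residual:.2f} mg/L, CT {ct:.1f} mg·min/L")
# Once the demand exerted by contaminated water reaches the fixed dose, both the
# residual and the CT available to inactivate pathogens fall to zero.
```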

Despite very clear and unambiguous findings in the first report by Justice O’Connor concerning the roles and responsibilities of the operators in failing to prevent this outbreak, the public record on the operators’ roles and responsibilities became confused in December 2004 with the criminal conviction and sentencing of the Koebel brothers. They were charged with breach of trust, uttering forged documents, and common nuisance for their roles in the Walkerton outbreak. The prosecution agreed to a plea bargain, dropping the more serious charges in return for guilty pleas to common nuisance. Stan Koebel was sentenced to one year in jail and Frank Koebel was sentenced to 9 months of house arrest.

The trial and sentences were controversial. On one side, the admissions by the Koebel brothers of failing to perform monitoring and treatment, withholding adverse monitoring results, and falsifying operating records clearly pointed blame their way. However, the severe and systemic deficiencies in the operator training and regulatory systems of the Ontario government led the Walkerton Inquiry to find that the Koebel brothers were not solely responsible for the failures, nor were they the only ones who could have prevented the disaster.

The criminal trial in Ontario Superior Court increased confusion by means of a statement of “facts” agreed to by the prosecution to secure the guilty pleas. This agreed statement of “facts” contained inaccurate elements that were in direct contradiction of the well-documented findings of the Walkerton Inquiry. In this case the prosecution was obliged to gather evidence without reliance on evidence collected by the Inquiry, but their investigation was directed to the same set of facts, and it should have been able to reach the same conclusions as the Inquiry was able to document. The statement of “facts” attested to by the prosecution and the defense cited an epidemiologist as its sole authority to find that, even had the Koebel brothers increased the chlorine level in Walkerton’s water system dramatically, that action “would not have prevented this tragedy.” The epidemiologist acknowledged not being a specialist in disinfection during testimony at the Walkerton Inquiry. The prosecution concluded from the epidemiologist’s opinion: “It therefore cannot be said that the criminal conduct of Stan Koebel and Frank Koebel . . . their failure to properly monitor, sample and test the well water . . . was, in law, a significant contributing cause of the deaths and injuries.”

The operators were to ensure the chlorine residual was measured daily. Yet the Inquiry found “virtually all of the entries on the 1999 daily operating sheets are false. Fictitious entries in the daily operating sheets continued until the outbreak in May 2000.” Inquiry Commissioner O’Connor observed: “One of the purposes of measuring chlorine residual is to determine whether contamination is overwhelming the disinfectant capacity of the chlorine.” Accordingly, he found: “The scope of the outbreak would very likely have been substantially reduced if the Walkerton PUC operators had measured chlorine residuals at Well 5 daily, as they should have, during the critical period when contamination was entering the system.” At least eight days without valid chlorine residual monitoring passed between the contamination influx and the boil water advisory issued by the health unit, after illness was already widespread (Figure 2-18).

FIGURE 2-18

Outbreak curve for the Walkerton epidemic of gastroenteritis. SOURCE: Adapted from BGOSHU (2000) and Hrudey and Hrudey (2004).

The Walkerton PUC operators, if properly trained and acting competently, should have been alerted by the heavy rains and the obvious flooding that occurred on May 12. They should have commenced more frequent checking of their chlorine dosage and residuals. Even if they had stuck with the limited monitoring schedule required by the regulator, they should have recognized that the low to non-existent chlorine residual they would have measured on May 13 was cause for alarm. The chlorine dose should have been increased immediately to try to achieve a satisfactory chlorine residual. In this particular case, because Well 7 was able to provide the entire supply for Walkerton, the operators should have shut down Well 5 immediately, knowing its vulnerability. The Well 7 option was compromised by Frank Koebel’s failure to follow Stan Koebel’s instructions to replace a defective chlorinator on Well 7, an action that was not taken until May 19.

Once the operators became aware that water with no disinfection had entered the distribution system, the water storage should have been dosed with chlorine solution and the mains flushed. The regulator and local health authorities should have been notified of the problem and of the actions being taken. If an adequate chlorine residual could not be restored, at a minimum a boil water advisory should have been issued immediately. All of these actions could have, and should have, been completed within 24 hours of the morning of May 13, when a properly taken chlorine residual measurement would have identified the problem.

Even if these steps had not eliminated the consumption of contaminated water entirely, they would have substantially reduced the exposure of Walkerton consumers and the resulting health impacts. As it was, contaminated water was distributed for a full week longer than necessary. If the policies adopted in 1994 in the Ontario Chlorination Bulletin had been applied to Walkerton as they should have been, this vulnerable shallow well would have been equipped with a continuous chlorine residual analyzer. The continuous monitor should have been configured in a fail-safe mode that would automatically shut off Well 5 whenever the chlorine residual fell below the set point for minimum effective disinfection.
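
The fail-safe behavior described here reduces to a single interlock rule: run the well pump only when a valid chlorine residual reading at or above the disinfection set point is available, and treat a failed or missing reading the same as a low one. A minimal sketch of that rule follows; the set point simply reuses the 0.5 mg/L residual target quoted earlier for Well 5, and the function names are assumptions for illustration.

```python
from typing import Optional

MIN_RESIDUAL_MG_L = 0.5   # set point for minimum effective disinfection (illustrative)

def pump_permitted(residual_mg_l: Optional[float]) -> bool:
    """Fail-safe interlock: run the well pump only when a valid residual reading
    at or above the set point is available. A missing or failed reading (None)
    is treated the same as a low residual."""
    return residual_mg_l is not None and residual_mg_l >= MIN_RESIDUAL_MG_L

# Example readings: normal operation, a contamination event, and analyzer failure.
for reading in (0.75, 0.05, None):
    action = "RUN" if pump_permitted(reading) else "SHUT DOWN well pump"
    print(f"residual reading {reading}: {action}")
```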

The Walkerton operators should have realized their system was vulnerable to catastrophic failure. It was totally reliant on a single chlorination barrier that was not fail safe. The requirement for at least a second barrier (source protection) was identified more than 20 years earlier, but was never implemented. Water system operators must be able to recognize the threats to their system contrasted with the capability of their system to cope. They have a personal responsibility to ensure deficiencies are identified, made known to management, and effectively remediated. Pending necessary improvements, increased vigilance is required by operators together with contingency plans to cope with periods of stress.

The Koebel brothers lacked the training and expertise to identify the vulnerability of Well 5 and the need for additional safety barriers. They had been certified by a “grand-parenting” process that failed to provide them with the training needed to do their jobs properly. Their experience allowed them to handle the mechanical requirements of their jobs, but they lacked any understanding of water quality or associated public health risks. The Koebel brothers did not intend to harm their fellow citizens through their flawed practices. In fact, they continued to drink the water even as the outbreak was unfolding.

There were many potential direct causes for this outbreak, including new water main construction, fire events, main breaks and repairs, contamination of treated water storage, cross connections, flooding, and human sewage or sewage sludge contamination of the wells. Despite the diversity of possible causes, the Inquiry found consistent and convincing evidence that this outbreak was caused by contamination from cattle manure being washed from an adjacent farm into the shallow Well 5 on or about May 12 because of the heavy rainfall that day. Consequently, the following explanations will focus on the evidence for and understanding of that specific cause. Other plausible causes will be mentioned only briefly. However, under different circumstances, each of these could have caused or contributed to an outbreak.

Well 5 (Figure 2-17) was identified as being vulnerable to surface contamination from the time it was first commissioned. The hydrogeologist who conducted the original assessment of this well wrote in his commissioning report:

The results of the bacteriological examination indicate pollution from human or animal sources, however, this was not confirmed by the chemical analyses. The supply should definitely be chlorinated and the bacteria content of the raw and treated water supply should be monitored. The nitrate content should also be observed on a regular basis. . . .

The Town of Walkerton should consider establishing a water-protection area by acquiring additional property to the west and south in the vicinity of Well four [now 5]. Shallow aquifers are prone to pollution in farming and human activities should be kept away from the site of the new well as far as possible (Wilson [1978] as reported in November 8, 2000, W. I. Transcript; evidence of J. Budziakowski).

Pump testing on this well in 1978 demonstrated that bacteriological contamination occurred within 12 to 24 hours of initiating pumping, reaching a peak of 12 fecal coliforms per 100 mL after 48 hours. During the well’s first two years, the MOE conducted a number of inspections that revealed continuing concerns for surface contamination. These concerns included the nearby agricultural activities; the shallowness of the well with its relatively thin overburden (the layer of soil between the surface and the aquifer); observed fluctuations in turbidity; bacteriological monitoring indicating fecal contamination; and elevated pumping levels in concert with spring thaw and early rainfall suggesting rapid surface recharge of the shallow aquifer. In 1980, the bacteriological samples of the raw water at Well 5 reached as high as 260 total coliforms per 100 mL and 230 fecal coliforms per 100 mL, with 4 out of 42 samples that year showing bacterial contamination. Because none of the chlorine disinfected samples from Well 5 showed bacterial contamination, the poorer quality raw water seems to have been accepted despite the obvious signs of surface contamination. Turbidity measurements were found to be occasionally elevated (up to 3.5 NTU) and to fluctuate well beyond what would be expected from a secure groundwater source.

Unfortunately, the concerns about surface contamination influencing the raw water at Well 5 appeared to have been forgotten in the MOE files during the 1980s when no inspections were performed. However, the investigation by Golder Associates, Ltd. (Golder, 2000) after the outbreak confirmed that Well 5 was definitely under the influence of surface contamination, as the early water quality monitoring indicators had so clearly revealed. In a dramatic demonstration, a shallow pond (~10 cm deep) adjacent to Well 5 went dry within 30 minutes after the pump test commenced, and a deeper nearby pond dropped 27 cm over 36 hours of pumping. Furthermore, subsequent tracer tests conclusively demonstrated the hydraulic connection between the surface pond and the producing zone of Well 5. In fact, there were multiple entry points to the shallow aquifer feeding Well 5, possibly including point source breaches of the overburden by fencepost holes, improperly abandoned wells (none were located) and sand or gravel lenses. The investigations after the outbreak did not confirm the exact route of contamination entry into Well 5, but the relevant experts at the Inquiry agreed that the overall evidence for contamination of Well 5 was entirely consistent and the most plausible explanation for the outbreak.

The epidemiologic evidence and the timing of illness in the community strongly suggested that contamination occurred on or about May 12. Well 5 was the major source of water to Walkerton between May 10 and 15, with intermittent contributions from Well 6. The heavy rainfall experienced by Walkerton on May 12 peaked between 6 pm and midnight. Bacteriological sampling data were limited and were confounded by the inaccurate labeling practiced by PUC personnel. Given the location of the PUC shop in the distribution system downstream of Well 5, combined with the documented poor practices of the PUC operators, it was likely that the May 15 sample labeled Well 7 treated was actually taken at the PUC shop and represented Well 5 water entering the distribution system. This sample, the one that Stan Koebel concealed from health authorities, was heavily contaminated with greater than 200 E. coli per 100 mL.

A number of samples were collected by the local Health Unit, the MOE and the PUC between May 21 and 23. All Well 5 samples were positive for both total coliforms and E. coli while neither Well 6 nor Well 7 samples were positive for either bacterial indicator. A June 6 sample taken from the spring adjacent to Well 5 had a count of 80 E. coli per 100 mL. Pump tests were done at two monitoring wells near Well 5, including one on the adjacent farm, in late August 2000. After the 32-hour pump test, E. coli counts climbed to 12,000 per 100 mL on the monitoring well 225 m west of Well 5 and to 900 per 100 mL on the monitoring well 105 m west-northwest of Well 5. Dr. R. Gillham, the hydrogeology expert called by the Inquiry, concluded that a large area of the shallow aquifer supplying Well 5 had been heavily contaminated.

In addition to the reasonably consistent circumstantial evidence implicating Well 5 as the primary, if not sole, source of microbial contamination of Walkerton’s water supply, there was reasonably compelling evidence linking the bacterial pathogens that caused the human illness with cattle and manure samples from the farm near Well 5. Dr. A. Simor, the Inquiry’s expert on medical microbiology, described how pathogens were characterized in the laboratory (A. Simor, W. I. Transcript, February 26, 2000, pp. 142–146). Three methods were used to gain more evidence about the specific strains of pathogens identified: phage typing, serotyping and pulsed-field gel electrophoresis (PFGE).

The first method exploits the ability of certain viruses to infect bacteria. These viruses are named bacteriophages—phages for short. Different bacteria are susceptible to infection by different phages, so exposing a strain of bacteria to a range of different phages can be used to type that strain for its susceptibility to phage attack. That pattern of phage susceptibility can be used to distinguish one strain of bacteria from another strain of the same species.

The second method, serotyping, relies on detecting specific antigens on the exterior of a bacterial cell. These include O antigens that characterize components of the bacterial cell walls and H antigens that characterize the flagella (the whip-like tails that bacteria use for motion). For example, the name E. coli O157:H7 refers to the strain of E. coli with the 157 antigen in the cell wall and the 7 antigen in the flagellum. Individual strains of Campylobacter species, such as C. jejuni, can also be characterized by serotyping.

The third method, PFGE, looks at the molecular properties of the DNA found in a bacterial strain. Because the DNA provides the genetic material that causes specific strains of a bacterial species to be distinct, evaluating and comparing the DNA of individual strains provides a relatively direct method for identifying specific strains. In this procedure, DNA is extracted from the bacterial cell and is cut at chosen locations using specific enzymes to yield DNA fragments of varying size. These fragments are separated on a gel plate by electrophoresis to yield a pattern of bands distributed according to the relative size of the fragments. The resulting pattern can be interpreted in terms of the original DNA structure to compare with DNA from different strains. Identical strains will have identical DNA fragment patterns, while the patterns of closely related strains may differ in only a few fragments. Dr. Simor’s expert opinion at the Inquiry (A. Simor, W. I. Transcript, p. 160, February 26) was that strains differing by six or fewer DNA fragment bands are considered genetically related in the context of a common source for an outbreak. These advanced methods were used to compare pathogens recovered from cattle manure with those from infected humans.
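
Dr. Simor’s six-band criterion can be expressed as a simple comparison of two band patterns: count the bands present in one pattern but not the other and compare the count with the threshold. The sketch below does exactly that with hypothetical band sizes; real PFGE interpretation involves more careful visual and software-assisted comparison of banding patterns, so this is illustrative only.

```python
# Hypothetical PFGE band sizes (in kb) for two isolates; purely illustrative.
human_isolate = {48, 97, 145, 194, 242, 310, 365, 420}
cattle_isolate = {48, 97, 145, 194, 242, 310, 380, 420}

def band_differences(a: set, b: set) -> int:
    """Count bands present in one pattern but not the other (symmetric difference)."""
    return len(a ^ b)

diff = band_differences(human_isolate, cattle_isolate)
related = diff <= 6   # relatedness threshold cited in Dr. Simor's testimony
print(f"{diff} differing bands -> {'genetically related' if related else 'unrelated'}")
# Here the 365 kb vs. 380 kb mismatch gives 2 differing bands, well under the
# 6-band threshold, so the isolates would be considered genetically related.
```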

By August 31, 2000, in the follow-up investigation, the outbreak team working for the Health Unit had identified 1,730 cases as suspected cases (BGOSHU, 2000). Following contact attempts by phone or mail, 80 percent of contacts were judged to have an illness related to exposure to Walkerton municipal water, and 1,346 cases met the definition adopted for the investigation. “A case was defined as a person with diarrhea, or bloody diarrhea; or stool specimens presumptive positive for E. coli O157 or Campylobacter spp. or HUS between April 15 to June 30. For the purposes of attributing cases to the water system, a primary case was defined as a person who had exposure to Walkerton water. A secondary case was defined as a person who did not have any exposure to Walkerton water but had exposure to a primary case as defined above. A person was classified as unknown if their exposure status was not indicated” (BGOSHU, 2000).

Of these cases, 675 had submitted stool samples for culture, yielding 163 positive for E. coli O157:H7, 97 positive for C. jejuni, 7 positive for C. coli and 12 positive for both E. coli O157:H7 and Campylobacter. The outbreak curve is plotted in Figure 2-18.

The second peak in the epidemic curve (Figure 2-18) has been discussed as possibly representing the second of the two types of infection that occurred, with C. jejuni and with E. coli. Another possibility that was not discussed is that the second peak occurred on May 23, the date that Dr. McQuigge gave his first press conference on the outbreak. The resulting high-profile media coverage that day might have anchored May 23 in the memories of some victims when they responded to the survey performed later to determine the date of onset of illness for each case.

Various cultures were also done on environmental samples, allowing some comparison with the pathogens causing illness. The Health Unit collected samples from 21 sites in the Walkerton distribution system on May 21 and collected raw and treated water from Well 5 on May 23. Concurrent samples taken at Wells 6 and 7 showed neither total coliforms nor E. coli. Two of the distribution system sites remained positive for total coliforms and E. coli over several days. One of the distribution system sites, along with cultures from the May 23 raw and treated water samples from Well 5, was analyzed by PCR, another molecular diagnostic technique. This technique amplifies DNA from a sample to allow extremely sensitive detection of specific genes that may be present. Using PCR, these samples, representing a contaminated location in the Walkerton distribution system and Well 5, all showed the same genes for O157, H7 and the specific verotoxin, VT2.

Working with Health Canada and the Ontario Ministry of Agriculture and Food, the Health Unit undertook livestock sampling on farms within a 4 km radius of each of Wells 5, 6, and 7 between May 30 and June 13 (BGOSHU, 2000). They obtained livestock fecal samples from 13 farms and found human pathogens (mainly Campylobacter) in samples from 11. On two farms, both C. jejuni and E. coli O157:H7 were found. These farms were selected for further sampling on June 13. The results are summarized in Table 2-1. Farm 1 was located in the vicinity of Wells 6 and 7 and Farm 2 was located within sight of Well 5 (Figure 2–17).

TABLE 2-1

Culture Results from Two Farms Resampled on June 13.

The most telling features of these typing efforts are revealed in Table 2-2, which compares the strain characteristics from the cattle fecal samples at the two farms with the cultures from human cases infected with E. coli O157:H7 or Campylobacter spp. Details of the extensive strain typing work that was done have now been published by Clark et al. (2003).

TABLE 2-2

Pathogen Strain Typing Comparison Between Human Cases and Cattle Fecal Samples at Farms 1 and 2.

These results do not provide absolute confirmation that manure from Farm 2 was responsible for contamination of the Walkerton water supply, for several reasons. The cattle samples were taken in mid-June, about a month after the suspected date of contamination, and it is not possible to be certain that cattle on Farm 2 were infected on May 12. Likewise, the DNA typing by PFGE must be recognized as much less certain than the DNA typing used in human forensic analysis. Because bacteria reproduce by binary fission, each progeny cell is a clone of its parent (i.e., each progeny cell has identical DNA to the parent cell, so individual cells are not genetically unique). However, the DNA makeup of bacteria changes quickly compared with that of humans, because their rapid rate of reproduction allows genetic mutations to accumulate through a number of mechanisms. Overall, the level of evidence for this outbreak is far more compelling than the quality and level of evidence that has historically been available for outbreak investigations. The main features suggesting that Farm 2, located near Well 5, was the primary source of the pathogens that caused the outbreak are the matches of phage type 14 and PFGE pattern A for E. coli O157:H7 and of phage type 33 for the Campylobacter spp.

Finding that raw water in Well 5 was contaminated by pathogens detected in cattle manure from a nearby farm does not explain how this contamination was allowed to cause the disastrous disease outbreak in the community. The water produced by all the wells serving Walkerton was supposed to be chlorinated continuously to achieve a chlorine residual of 0.5 mg/L for 15 minutes of contact time (ODWO). This level of disinfection would have provided a concentration-contact time (CT) value of 7.5 mg-min/L. That CT value is more than 150 times greater than literature CT values of 0.03–0.05 mg-min/L for 99 percent inactivation of E. coli O157:H7 and more than 80 times greater than a CT value of 0.067 to 0.090 mg-min/L for 99.99 percent inactivation of E. coli (Hoff and Akin, 1986; Kaneko, 1998). Clearly, the specified level of chlorine residual and contact time was not being achieved for Well 5 in May 2000. If it had been, inactivation of the E. coli pathogen greater than 99.99 percent would have been achieved, a level of protection that is certainly not consistent with the magnitude of the outbreak that occurred in Walkerton.
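
As a worked check of the arithmetic in the preceding paragraph, the short sketch below recomputes the required CT value from the specified residual and contact time and compares it with the cited literature values; all numbers are taken from the text above.

```python
# CT (concentration x contact time) arithmetic from the paragraph above.
chlorine_residual_mg_L = 0.5        # specified free chlorine residual (mg/L)
contact_time_min = 15               # specified contact time (minutes)

ct_required = chlorine_residual_mg_L * contact_time_min   # 7.5 mg-min/L

# Literature CT values cited in the text (mg-min/L).
ct_99_ecoli_o157 = (0.03, 0.05)     # 99% inactivation of E. coli O157:H7
ct_9999_ecoli = (0.067, 0.090)      # 99.99% inactivation of E. coli

print(ct_required)                            # 7.5
print(ct_required / ct_99_ecoli_o157[1])      # 150.0 times the upper literature value
print(round(ct_required / ct_9999_ecoli[1]))  # ~83 times the upper literature value
```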

Indirect Causes of the Walkerton Outbreak

Although the PUC operators were obviously culpable for their misdeeds of omission and commission, they were clearly not the sole cause of this disaster. The overwhelming impression that follows from reviewing what happened in Walkerton is that complacency was pervasive at many levels. The majority of the discussion of the causes of this disaster in the Part 1 Walkerton Inquiry report (O’Connor, 2002a) was devoted to evaluating the contributions of many other parties. Those interested in pursuing the institutional failures are referred to the Inquiry report. A brief summary of those institutional failures is provided below.

The Ontario Ministry of Environment

  • The original certificate of approval was issued in 1978 for Well 5 without including any formal operating conditions to deal with the hazards that were apparent from the pump testing at the commissioning of this well.
  • In 1994 the MOE adopted a policy requiring continuous chlorine residual monitoring for vulnerable shallow wells (those under the influence of surface contamination), but the MOE failed to implement this policy for existing vulnerable wells like Well 5.
  • There was no follow-up on deficiencies identified during rare inspections, including a 1998 inspection that identified some of the deficiencies in the monitoring practices of the PUC operators.
  • The false records and clear deficiencies in performance of the Walkerton operators were not recognized by the MOE.
  • The MOE showed surprisingly little institutional knowledge about drinking water safety.

The Walkerton Public Utilities Commission

  • The PUC placed total confidence in its General Manager and took no apparent interest in, or responsibility for, oversight of PUC operations.
  • The PUC ignored an adverse MOE inspection report in 1998 and demonstrated no interest in ensuring that the deficiencies identified were being addressed.
  • The PUC maintained a substantial financial surplus while not making investments in improving the water system.
  • The PUC took an adversarial stance with the Health Unit when the outbreak occurred.

The Government of Ontario

  • The Ontario government slashed the MOE budget drastically (by more than 40 percent) over the preceding five years, with little evidence of concern for the consequences to public health.
  • Responsibility for drinking water safety was largely offloaded to individual communities without providing effective assistance for these communities to handle that responsibility.
  • Water testing was removed from the provincial laboratory without adding any regulatory requirement for mandatory reporting of adverse results by the private labs to the MOE or the local health units.

Preventing Drinking Water Outbreaks: Turning Hindsight into Foresight

The challenge in preventing drinking water disasters like Walkerton is to learn from the experience of previous outbreaks. There have been a surprising number of drinking water outbreaks in affluent countries, and many have features in common with Walkerton (Hrudey and Hrudey, 2004). The task is essentially one of turning hindsight into foresight. This requires a risk management approach that demands a committed focus on prevention, as evidenced by the following:

  • Informed vigilance is actively promoted and rewarded.
  • Understanding of the entire water system, from source to consumer’s tap, its challenges and limitations, is promoted and actively maintained.
  • Effective, real-time treatment process control is the basic operating approach.
  • Fail-safe multi-barriers are actively identified and maintained at a level appropriate to the challenges facing the system.
  • Close calls are documented and used to train staff about how the system responded under stress and to identify what measures are needed in future.
  • Operators, supervisors, lab personnel and management all understand that they are entrusted with protecting the public’s health and are committed to honoring that responsibility above all else.
  • Operational personnel are afforded the status, training, and remuneration commensurate with their responsibilities as guardians of the public’s health.
  • Response capability and communication are enhanced.
  • An overall continuous improvement, total quality management mentality pervades the organization (Hrudey and Hrudey, 2004).

Justice O’Connor concluded in his second Inquiry report (O’Connor, 2002b) that “Ultimately, the safety of drinking water is protected by effective management systems and operating practices, run by skilled and well-trained staff.” Ontario has committed to implementing substantial improvements in the scope and quality of operator training. Operators can prevent a disaster like Walkerton if they assure the following key elements are established and followed (Hrudey and Walker, 2005):

  • Operators must understand their system, including the major contamination hazards it faces in relation to the safety barriers and their capabilities for assuring safe water.
  • Operators must develop guidance limits for monitoring parameters that are able to detect abnormal conditions, based on knowing their system.
  • Operators must watch for and recognize signals of abnormal conditions (e.g., an increase in turbidity or chlorine demand, or a drop in chlorine residual); a simple illustrative check of this kind is sketched after this list.
  • Operators must work with management to anticipate plausible abnormal conditions and plan effective responses well before a serious incident occurs, including appropriate notification of regulatory authorities. Preparedness should support but does not replace the need for thoughtful analysis and problem-solving as events unfold.
  • Operators must recognize when they are facing a problem that is beyond their understanding or training and call for assistance.
  • Operators’ understanding of their system should include recognition of any inherent vulnerability that needs improvement to reduce contamination risks.
  • Operators need to be prepared to take ownership of problems and lead efforts to ensure that their managers fully understand the existence of such problems that must be rectified.
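
To make the monitoring points above concrete, here is a minimal sketch of an alarm check against operator-defined guidance limits. The parameter names, limit values, and readings are illustrative assumptions only; they are not values from the Inquiry or from any regulation.

```python
# Illustrative alarm check against operator-defined guidance limits.
# All limits and readings below are hypothetical.

GUIDANCE_LIMITS = {
    "turbidity_NTU_max": 1.0,           # alarm if treated-water turbidity exceeds this
    "chlorine_residual_mg_L_min": 0.5,  # alarm if free chlorine residual drops below this
}

def check_reading(turbidity_ntu, chlorine_residual_mg_l):
    """Return alarm messages for one set of treated-water readings."""
    alarms = []
    if turbidity_ntu > GUIDANCE_LIMITS["turbidity_NTU_max"]:
        alarms.append(f"Turbidity {turbidity_ntu} NTU exceeds guidance limit")
    if chlorine_residual_mg_l < GUIDANCE_LIMITS["chlorine_residual_mg_L_min"]:
        alarms.append(f"Chlorine residual {chlorine_residual_mg_l} mg/L below guidance limit")
    return alarms

# Rising turbidity combined with a falling residual should trigger both alarms
# and prompt escalation, as described in the list above.
print(check_reading(turbidity_ntu=1.8, chlorine_residual_mg_l=0.2))
```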

Concluding Comments

The Walkerton outbreak provides a strong argument for the multiple barrier approach to assuring safe drinking water. This approach needs to be based on preventive risk management, such as that described in the Australian Drinking Water Guidelines (NHMRC, 2004) or the WHO water safety plan approach (WHO, 2004), firmly grounded in quality management and a thorough understanding of the risks facing any particular system (Hrudey, 2004).

Because outbreaks of disease caused by drinking water remain comparatively rare in North America, particularly in contrast with the developing world, complacency about the dangers of waterborne pathogens can easily occur. Yet the source of waterborne disease in the form of microbial pathogens is an ever-present risk, because these pathogens are found in human fecal waste and in fecal wastes from livestock, pets or wildlife, leaving any drinking water source at risk of contamination before or after treatment (Hrudey, 2006b). https://www.ncbi.nlm.nih.gov/books/NBK28459/

——-

A Massive Outbreak in Milwaukee of Cryptosporidium Infection Transmitted through the Public Water Supply

William R. Mac Kenzie, Neil J. Hoxie, Mary E. Proctor, M. Stephen Gradus, Kathleen A. Blair, Dan E. Peterson, James J. Kazmierczak, David G. Addiss, Kim R. Fox, Joan B. Rose, and Jeffrey P. Davis

Abstract

BACKGROUND

Early in the spring of 1993 there was a widespread outbreak of acute watery diarrhea among the residents of Milwaukee.

METHODS

We investigated the two Milwaukee water-treatment plants, gathered data from clinical laboratories on the results of tests for enteric pathogens, and examined ice made during the time of the outbreak for cryptosporidium oocysts. We surveyed residents with confirmed cryptosporidium infection and a sample of those with acute watery diarrhea consistent with cryptosporidium infection. To estimate the magnitude of the outbreak, we also conducted a survey using randomly selected telephone numbers in Milwaukee and four surrounding counties.

RESULTS

There were marked increases in the turbidity of treated water at the city’s southern water-treatment plant from March 23 until April 9, when the plant was shut down. Cryptosporidium oocysts were identified in water from ice made in southern Milwaukee during these weeks. The rates of isolation of other enteric pathogens remained stable, but there was more than a 100-fold increase in the rate of isolation of cryptosporidium. The median duration of illness was 9 days (range, 1 to 55). The median maximal number of stools per day was 12 (range, 1 to 90). Among 285 people surveyed who had laboratory-confirmed cryptosporidiosis, the clinical manifestations included watery diarrhea (in 93 percent), abdominal cramps (in 84 percent), fever (in 57 percent), and vomiting (in 48 percent). We estimate that 403,000 people had watery diarrhea attributable to this outbreak.

CONCLUSIONS

This massive outbreak of watery diarrhea was caused by cryptosporidium oocysts that passed through the filtration system of one of the city’s water-treatment plants. Water-quality standards and the testing of patients for cryptosporidium were not adequate to detect this outbreak. https://www.nejm.org/doi/full/10.1056/NEJM199407213310304

——-

MILWAUKEE, 1993: THE LARGEST DOCUMENTED WATERBORNE DISEASE OUTBREAK IN US HISTORY

By Stephen Gradus, PhD
January 10, 2014

An Interview with Dr. Stephen Gradus, Ph.D., MT(ASCP), D(ABMM), City of Milwaukee Health Department.

Waterborne disease outbreaks are relatively rare events in our time, but just over two decades ago, Milwaukee experienced the largest documented drinking water outbreak in US history. Caused by the chlorine-resistant parasite Cryptosporidium parvum, the outbreak affected over 400,000 people—25 percent of Milwaukee’s population in 1993—and resulted in over $96 million in combined healthcare costs and productivity losses, according to a study by the US Centers for Disease Control and Prevention. We contacted Dr. Stephen Gradus, Director of the City of Milwaukee Health Department Public Health Laboratories Division since 1990, for a look back at the event, including lessons learned and improvements implemented.

Microphotograph of Cryptosporidium parvum oocysts. Image courtesy of the US Centers for Disease Control and Prevention.

What was the first sign of trouble in Milwaukee in 1993, and how did the health department respond?

On Monday morning, April 5, 1993, the laboratory’s Chief Virologist and the Commissioner of Health received calls inquiring about the nature of apparent gastrointestinal (GI) illness reports in the City. The Director of Nursing had anecdotal information that some pharmacies were selling out of anti-diarrheal medications. Unknown to the health department at that time, the Milwaukee Water Works had received some complaints regarding the aesthetic quality of tap water. After the Health Commissioner inquired about any information our Public Health Laboratory (MHDL) might have regarding GI illness in the community, we proceeded to call local hospital microbiology laboratories and ultimately emergency rooms and determined there was extreme GI illness throughout the city, based on the much higher ER patient numbers and increased workload for enteric disease tests. Other staff in the department were also seeing similar indicators throughout the city. I then contacted both the North and South water treatment plants (WTP) to obtain daily water quality data from the previous 30 days, which indicated the only changing trend was increasing turbidity readings from the south water treatment plant, although these readings were within federal limits. The following day the lab also started a rapid fax-back survey of city clinical microbiology labs, which confirmed a dramatic increase in testing for GI pathogens citywide, yet no agents were identified that would account for such widespread illness.

On day three, a local infectious disease physician and colleague called the laboratory with a single case of Cryptosporidium that fit the profile of ill patients. Simultaneously MHDL was also identifying the first few Cryptosporidium cases collected internally. Based on these findings, MHDL then requested all local clinical labs to test for Cryptosporidium in stool specimens, which was not being done routinely. By 8 p.m. that evening, April 7, 60 hours after the realization of widespread community illness, the Mayor issued a boil water advisory for 880,000 citizens based on the epidemiological evidence and the finding of eight Cryptosporidium cases detected that day. The advisory would last for 10 days and the southern WTP would be re-opened by June. Subsequently our state and local health departments, with the assistance of the Centers for Disease Control and Prevention (CDC), fielded 11 investigational teams to study and characterize what would be the largest documented waterborne outbreak in U.S. history.

How many people became sick and how many died as a result of the waterborne disease outbreak?

The initial study estimated that 403,000 residents of the five-county area around Milwaukee had watery diarrhea attributed to the outbreak. Subsequent studies suggested this was an underestimation.

The official outbreak-related attributable mortality was 69 deaths, of which 93 percent occurred in persons with AIDS.

How did your training prepare you for the challenge of responding to such a serious waterborne disease outbreak?

My training as a medical technologist and clinical microbiologist, combined with two years as a postdoctoral fellow at CDC and eight years of experience in the department, had provided preparation for addressing outbreak situations. Coincidentally, I had given talks and written an article on Cryptosporidium as a waterborne pathogen prior to the outbreak. Our departmental team of dedicated public health professionals, along with a community of motivated clinical microbiologists, was key to a timely public health and laboratory response.

How did the public react to the boil water advisory and what were your greatest public communications challenges during the outbreak?

The public had many questions regarding the outbreak, and the news media proved to be a critical source of information, providing a steady flow of updates daily for many weeks. Initially, a phone bank was set up at a local TV station to field questions from the public as well. The boil water advisory affected food production and recalls, certain industry operations, medical and pet care, and the food establishment industry, to name a few of the impacts on daily life for Milwaukeeans and surrounding communities.

Regularly scheduled press conferences by the Health Department and other officials worked well in providing information to the public.

Even though all water quality indicators were within federal guidelines, there was quite a bit of anger in the community regarding the outbreak, as well as a loss of trust in the potable water supply.

As tragic as the Milwaukee episode was, were there any significant positive outcomes from the 1993 outbreak?

Since 1993, Milwaukee Water Works (MWW), with the endorsement of the Mayor and Common Council, has committed $417 million in ongoing investment in its infrastructure to ensure high-quality water (as reported to the Public Service Commission of Wisconsin). The capital budget is based on long-term planning to replace or upgrade existing infrastructure, and to install new infrastructure as needed. The Capital Improvements Program prioritizes projects based on results of water-related research, new technology and condition assessments of existing systems.

The immediate response was a renovation of facilities from 1993-1998 to strengthen the barriers related to source water protection, disinfection and filtration. The detailed improvements that Milwaukee Water Works has put forth are available at www.city.milwaukee.gov/water. These improvements have led Milwaukee to be a leader in water quality and water testing.

A key effort that also came out of the experience has been a collaboration of MWW with the Milwaukee Health Department that we call our Interagency Clean Water Advisory Council (IACWAC). IACWAC tracks and can respond to public health issues that may be related to water. Groundbreaking in the 1990s, the ongoing partnership is now recognized nationally for its effectiveness in protecting public health. The utility relays critical information about emerging contaminants, water treatment and water quality monitoring via communications with news media and customer service representatives, and at www.milwaukee.gov/water.

At the national level, to address public health and regulatory issues prompted by concern over waterborne cryptosporidiosis, and in particular the Milwaukee Cryptosporidium outbreak, the CDC and EPA called a meeting at CDC in September 1994. Represented at this meeting were experts from the water industry and utilities, the US Department of Agriculture, the Food and Drug Administration, local (including several of us from Milwaukee), state and federal public health agencies, laboratorians, and advocacy groups, totaling more than 300 individuals from 40 states. From the two-day meeting, the Working Group on Waterborne Cryptosporidiosis was created, which included 17 task forces to address specific topics related to waterborne cryptosporidiosis. CDC maintains a variety of online resources on Cryptosporidium at https://www.cdc.gov/parasites/crypto/index.html.

What is Milwaukee doing today to help guard against another waterborne disease outbreak?

We are very proud in Milwaukee that our water is some of the highest quality in the nation. We have an effective, multiple-barrier process of source water protection, ozone disinfection, chlorine disinfection, biologically active filtration, and continuous water quality monitoring. Milwaukee’s drinking water quality meets or exceeds all Wisconsin Department of Natural Resources (DNR) and EPA standards. The water utility’s water quality monitoring program tests for many more illness-causing pathogens and contaminants than are required by the EPA. In fact, the utility now tests source and treated water for more than 500 contaminants. Our inter-agency collaboration also continues, and will continue, to protect public health.

Have the lessons learned in Milwaukee been shared with other water systems?

The outbreak has been well-documented and written about by both news media and academic researchers, and we at the City of Milwaukee Health Department continue to get calls from media and academics for the historical perspective.
The events led to improvements worldwide in water quality treatment processes, water quality monitoring and regulations to protect public health. In particular, the ongoing partnership between MWW and the Milwaukee Health Department for water quality monitoring and public health surveillance, groundbreaking at the time, is now recognized nationally for its effectiveness in protecting public health.

The Milwaukee Health Department Public Health Laboratory (MHDL), http://city.milwaukee.gov/healthlab, was one of the original labs to participate in the validation study of the EPA Method 1622 for the detection and identification of Cryptosporidium and Giardia in water and has been testing for these parasites as well as culturable viruses since 1994. Visitors locally and worldwide have visited MHDL to observe and learn the EPA methods. More recently, as we continuously update our testing methodologies, MHDL is implementing the new EPA Method 1615 for virus detection by culture and quantitative molecular assays for enterovirus and norovirus genogroups GI and GII. The qPCR analysis may be completed within 24-48 hours. This allows MHDL to provide valuable information to our WTPs in a more timely manner while offering excess capacity to other utilities. Method 1615 is part of the third Unregulated Contaminant Monitoring Regulation which will occur during 2013-2015 to monitor 30 contaminants (28 chemicals and two viruses), and will provide a basis for future regulatory actions to protect public health.

Milwaukee is now recognized as a national leader in water quality, and we are proud to be a part of it. https://waterandhealth.org/safe-drinking-water/drinking-water/milwaukee-1993-largest-documented-waterborne-disease-outbreak-history/

——-

Costs of Illness in the 1993 Waterborne Cryptosporidium Outbreak, Milwaukee, Wisconsin

Phaedra S. Corso*, Michael H. Kramer*, Kathleen A. Blair†, David G. Addiss*, Jeffrey P. Davis‡, and Anne C. Haddix§. Author affiliations: *Centers for Disease Control and Prevention, Atlanta, Georgia, USA; †City of Milwaukee Health Department, Milwaukee, Wisconsin, USA; ‡Wisconsin Division of Public Health, Madison, Wisconsin, USA; §Emory University, Atlanta, Georgia, USA


Abstract

To assess the total medical costs and productivity losses associated with the 1993 waterborne outbreak of cryptosporidiosis in Milwaukee, Wisconsin, including the average cost per person with mild, moderate, and severe illness, we conducted a retrospective cost-of-illness analysis using data from 11 hospitals in the greater Milwaukee area and epidemiologic data collected during the outbreak. The total cost of outbreak-associated illness was $96.2 million: $31.7 million in medical costs and $64.6 million in productivity losses. The average total costs for persons with mild, moderate, and severe illness were $116, $475, and $7,808, respectively. The potentially high cost of waterborne disease outbreaks should be considered in economic decisions regarding the safety of public drinking water supplies.

Cryptosporidium parvum, a protozoan parasite that causes gastrointestinal illness, is transmitted by ingestion of oocysts excreted in human or animal feces. Typical modes of transmission include person to person, animal to person, by exposure to contaminated surfaces, and by ingestion of impure food or water (1). From 1990 to 2000, at least 10 cryptosporidiosis outbreaks associated with contaminated drinking water were reported in the United States (2–5). Although the health impact of an outbreak of cryptosporidiosis originating from a contaminated public water source has been carefully documented (6), little effort has been made to estimate the economic impact of such an outbreak. This study estimates the cost of illness associated with perhaps the largest outbreak associated with a contaminated public water source ever reported in the United States. In 1993, an estimated 403,000 residents of the greater Milwaukee, Wisconsin, area (population, approximately 1.61 million) became ill when an ineffective filtration process led to the inadequate removal of Cryptosporidium oocysts in one of two municipal water treatment plants (6). We assessed direct medical costs and productivity losses from diarrheal illness during the Milwaukee outbreak to estimate the total cost of illness and the average cost per person with mild, moderate, and severe illness.

This cost-of-illness analysis was based on epidemiologic data collected during and after the 1993 cryptosporidiosis outbreak in Milwaukee, Wisconsin. Primary data on utilization and cost of inpatient admissions were obtained from a review of medical and financial records from hospitals in the greater Milwaukee area.

Methods

Epidemiologic Burden of Illness

A telephone survey of 613 households provided estimates on the total number of persons in Milwaukee experiencing mild, moderate, or severe illness as a result of the cryptosporidiosis outbreak (6). Cases were defined as residents of Milwaukee County or the surrounding four counties (Washington, Ozaukee, Racine, and Waukesha) with onset of watery diarrhea from March 1 to April 28, 1993 (the outbreak period). When disease case estimates were adjusted for normal background diarrheal disease rates, investigators estimated that 403,000 residents of the five-county area experienced illness caused by the cryptosporidiosis outbreak (6). Of this group, an estimated 354,600 persons (~88%) did not seek medical attention; 44,000 persons (~11%) were seen as outpatients; and 4,400 persons (~1%) were hospitalized.

Cost of Illness

Following the design of the epidemiologic studies of the same outbreak, we categorized illness as mild, moderate, or severe by type of medical care sought during the outbreak period and the following 2 months (4-month study period) when persons were still likely to seek medical care (6–8). Persons with mild illness did not seek physician or emergency department care for their illness. Persons with moderate illness had at least one physician or emergency department visit but were not hospitalized. Persons with severe illness were hospitalized at least once during this period.

Previous studies and evidence collected during the outbreak suggest that underlying medical conditions such as AIDS can increase the severity of illness in persons infected with Cryptosporidium (9,10). To capture the effect of underlying condition on cost of illness, we further classified patients with moderate and severe illness as having no underlying condition, an underlying condition likely treated with immunosuppressive drugs, or AIDS.

Data on utilization and average cost of inpatient services, emergency department visits, ambulance transports, and medication for persons with moderate and severe illness were obtained from a review of medical and financial records from 11 of the 14 hospitals in the greater Milwaukee area. The three nonparticipating hospitals did not differ in the number of confirmed cases of persons infected with Cryptosporidium, nor did they serve specialty populations that would result in higher medical costs per case. Total cost of illness was estimated from average cost of illness multiplied by the burden of illness. All clinical and financial data were recorded on standardized forms and entered into a computerized database. We did not collect information that identified patients by name or billing account number. Additional data on use of services and costs for persons with mild illness and data on productivity losses were obtained from the City of Milwaukee Health Department and published epidemiologic studies on the outbreak (6–8).

Cost-of-illness estimates for mild, moderate, and severe illness included both direct medical costs and indirect costs associated with lost productivity. Medical costs included costs for inpatient and outpatient health services, ambulance transport, and medication. Productivity losses included time lost by infected persons due to illness and the time lost by caregivers or family members to tend to ill persons. All costs are presented in 1993 U.S. dollars. We did not include litigation costs, the cost of preventive measures (e.g., switching to bottled water), intangible costs associated with pain and suffering, or the cost to the local, state, and federal government to investigate and control the outbreak.

Medical Costs

We used several parameters to estimate the direct medical costs associated with diarrheal illness during the Milwaukee cryptosporidiosis outbreak (Table 1).

Inpatient and Emergency Department Health Care Costs

To assess the usage and average cost of inpatient admissions (e.g., hospitalizations) and outpatient services associated with emergency department visits, we reviewed all hospital medical charts for persons with laboratory-confirmed cryptosporidiosis as identified by the hospitals’ laboratory records. Because the sensitivity of diagnostic testing is relatively low, and many persons were not tested during the outbreak, we also reviewed a sample of charts for persons admitted to the emergency department or hospital with diarrhea for at least 2 days, as identified by the following diagnostic codes from the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) listed in one of the first four diagnosis categories on the hospital discharge record: 007.20, 008.80, 009 and subcategories, 079.90, 234.10, 276 and subcategories, 558.90, and 999 and subcategories. During these admissions, either no laboratory testing was performed or tests were negative for Cryptosporidium and other intestinal pathogens. For the two samples, we included all costs for the inpatient admission or emergency department visit, regardless of whether the cost was directly attributable to cryptosporidiosis. Charts were excluded when the hospital admission was primarily for another condition (i.e., ICD-9-CM codes were not listed in one of the first four diagnostic categories) and when the onset of gastrointestinal illness occurred after hospital admission.

From the medical records, we collected data on resource use during hospitalization or emergency department visit, the use of ambulance transportation, self-reported use of medication before an emergency department visit or inpatient admission, and physician-prescribed medication following an emergency department visit or inpatient admission. Charges for hospitalization included diagnostic, laboratory, hospital room, and technical services (e.g., physical therapy, occupational therapy, and respiratory services); attending physician and nursing staff; medication; emergency department services; and other supplies or services not identified in the previous categories (e.g., medical-surgical supplies, clinic services).

Charges for emergency department visits and hospitalizations were converted to costs by using an average cost-to-charge ratio of 0.67, based on ratios obtained from 6 of the 11 hospitals sampled in the greater Milwaukee area; this figure is comparable to Wisconsin’s average operating cost-to-charge ratio (0.70) reported for urban hospitals in 1993 (14). Charges for specialty consultations not included in the hospital bill were excluded from this analysis because insufficient data were available on the number, duration, or charges for these services.

Outpatient and Ambulance Costs

We assumed that 95% of persons with moderate illness sought the care of a physician (one visit only) and that the remaining 5% required an emergency department visit. (Data collected from the epidemiologic investigation provided information on whether ill persons sought healthcare for their illness and whether they were hospitalized. No information was collected on whether a nonhospitalized healthcare visit was to seek physician or emergency department services. Therefore, in the absence of reliable data, we assumed that 5% of persons with moderate cryptosporidiosis went to the emergency department.) For the latter group, we assumed that no additional physician visits were needed before or after the emergency department visit. The proportion of persons with severe illness who had a physician visit before hospitalization was obtained from chart review. We assumed that one physician visit was needed before (and none after) the hospitalization. The cost of a physician visit ($45) was obtained from data collected by the City of Milwaukee Health Department. (Since costs, and not charges, for physician visits were provided, obtaining cost-to-charge ratios was not necessary.) This figure is in the range found by other studies, which have estimated the cost of a physician visit as ranging from $40 (1992 U.S. dollars) to $53 (1994 U.S. dollars) during this period.

Use of ambulance transport was indicated on the medical charts for emergency department visits and hospitalizations. Ambulance transport was used by 4.9% of those with moderate illness involving an emergency department visit compared with 16.3% of those with severe illness. We used the 1993 rate set by the City of Milwaukee ($185.50 for conveyance, $12 for minor services, and $6 per mile) for the cost of an ambulance transport, and we assumed that the average distance per transport was 5 miles.
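
Using the 1993 rate schedule and the 5-mile distance assumption stated above, the cost per transport works out as follows (a simple check of the figures in the text).

```python
# Ambulance transport cost from the 1993 City of Milwaukee rates quoted above.
conveyance_fee = 185.50
minor_services_fee = 12.00
rate_per_mile = 6.00
assumed_miles = 5              # average transport distance assumed in the text

cost_per_transport = conveyance_fee + minor_services_fee + rate_per_mile * assumed_miles
print(cost_per_transport)      # 227.5 USD per transport
```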

Medication Costs

For mild illness, data regarding the duration of illness were collected by the City of Milwaukee Health Department by using a random digit dial survey. Methods for this data collection were published (6). For moderate and severe illness, data regarding the duration of illness before an emergency department visit or hospitalization, the percentage of persons self-medicating during this period, and costs for medication were obtained from the medical records. We assumed the percentage of persons who self-medicated, as obtained from emergency department records for a person with moderate illness, also applied to persons with mild illness and to persons who did not use the emergency department but sought other medical care.

We estimated that all persons with mild illness who self-medicated used four 2-mg tablets of loperamide antidiarrheal medication per day or two 32-oz packs of oral rehydration solution per week, at a cost of $2.44/d. In the absence of reliable data on the duration of self-medication for a person with mild illness, we assumed that persons took medication for 50% of the duration of illness.

From the medical records, we collected detailed drug information (i.e., type, quantity, and duration) for medications prescribed upon discharge for persons with moderate and severe illness. We assumed that medications prescribed for persons with moderate illness seeking an emergency department visit also applied to persons with moderate illness seeking physician care. Retail drug prices in 1993 were used to calculate all costs (12).

Data about recurrent illness for mild, moderate, and severe illness were obtained from two investigations conducted during the outbreak (7,8). On the basis of these data, we estimated that 21% of ill persons experienced a recurrent episode of diarrhea for 2 days. As we did for persons with mild illness, we assumed that persons with recurrent illness took medication for 50% of the duration of illness at a cost of $2.44/d.

Productivity Losses

Productivity losses for ill persons and their caregivers were estimated from data on days lost because of illness collected by the random digit dial survey conducted by the City of Milwaukee Health Department (6). In the absence of reliable data on the days lost by caregivers of persons with severe illness, we assumed that a caregiver was needed for 50% of the number of days hospitalized. The value of missed work time by a caregiver or person with diarrheal illness was estimated by using the average annual wages for residents of Wisconsin in 1993 (13), increased by 25% to include fringe benefits. Because the type of day lost (i.e., work or leisure) was not specified in the secondary data available, we used an average daily value of $81 (annual wage plus fringe benefits, divided by 365 days) (17). We valued the time of all persons based on the productivity of the average worker, regardless of the work force status of any person.
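
The $81 daily value can be reconstructed from the description above: average annual wages plus a 25% fringe-benefit loading, divided by 365 days. In the sketch below, the annual wage is an assumed figure chosen to be consistent with the reported $81/day; it is not quoted from the article.

```python
# Reconstruction of the $81/day value of lost time described above.
assumed_annual_wage_1993 = 23_650   # assumed average annual wage, Wisconsin, 1993 (USD)
fringe_benefit_loading = 0.25       # 25% added for fringe benefits

daily_value = assumed_annual_wage_1993 * (1 + fringe_benefit_loading) / 365
print(round(daily_value))           # ~81 USD per day lost
```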


Results

We reviewed approximately 2,000 medical records from October 30 through November 11, 1995, and identified 378 persons who met our case definition for a moderate or severe case of cryptosporidiosis. We collected data on 155 persons who met our case definition for a moderate illness (i.e., emergency department visit only) and 223 persons who met our case definition for severe illness (i.e., a hospitalization). Seventeen percent of persons with moderate illness and 63% of persons with severe illness in our sample had laboratory-confirmed cryptosporidiosis.

Average costs of illness were $116 for persons with mild illness, $475 for persons with moderate illness, and $7,808 for persons with severe illness (Table 2). Direct medical costs represented 2% of the average cost for persons with mild illness, 13% of the average cost for persons with moderate illness, and 82% of the average cost for persons with severe illness. The average cost of illness for all persons who experienced diarrheal illness, weighted by the proportion in each illness category, was $239 per person: $79 in medical costs and $160 in productivity losses.

The total cost of illness associated with the cryptosporidiosis outbreak in Milwaukee was approximately $96.2 million: $31.7 million in direct medical costs and $64.6 million in productivity losses (Table 3). Medical costs accounted for 33% of the total cost of illness, including $790,760 for mild illness, $2.7 million for moderate illness, and $28.2 million for severe illness. Productivity losses accounted for 67% of the total cost of illness, including $40.2 million for mild illness, $18.2 million for moderate illness, and $6.2 million for severe illness. Nearly 43% of all costs were attributable to persons with mild illness, 22% to persons with moderate illness, and 36% to persons with severe illness.
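
The per-severity averages and case counts reported above can be combined to check the headline figures. The sketch below reproduces the roughly $239 weighted average cost per ill person and a total close to $96 million; the small discrepancy against the published $96.2 million reflects rounding in the reported averages.

```python
# Cross-check of the headline cost figures from the per-severity averages
# and case counts reported in the text above.
cases = {"mild": 354_600, "moderate": 44_000, "severe": 4_400}
avg_cost = {"mild": 116, "moderate": 475, "severe": 7_808}

total_cost = sum(cases[s] * avg_cost[s] for s in cases)
total_cases = sum(cases.values())

print(round(total_cost / 1e6, 1))       # ~96.4 (published total: 96.2 million USD)
print(round(total_cost / total_cases))  # ~239 USD weighted average cost per ill person
```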

Costs for Emergency Department Visits and Hospitalizations by Underlying Condition

The average cost for an emergency department visit for persons with moderate illness was $224 (Table 4). For persons with no underlying condition (84% of emergency department visits), the average cost for an emergency department visit was $213. For persons with underlying conditions (only one patient had AIDS), the average cost for an emergency department visit was $265. The average cost for a hospitalization for persons with severe illness was $6,312, with an average length of stay of 8 days. For persons with no underlying condition (34% of hospitalizations sampled), the average cost for a hospitalization was $3,131, with an average length of stay of 5 days. For persons with an underlying condition other than AIDS (52% of hospitalizations that met the case definition), the average cost for a hospitalization was $5,520, with an average length of stay of 7 days. Persons with AIDS (14% of hospitalizations) incurred the greatest average cost of hospitalization, $17,388, with an average length of stay of 16 days.


Discussion

The massive waterborne outbreak of cryptosporidiosis in 1993 in Milwaukee caused illness in approximately 403,000 persons and generated substantial healthcare costs and productivity losses. We estimate that on average, ill persons incurred approximately $79 in medical costs and $160 in productivity losses, resulting in $31.7 million in total medical costs and $64.6 million in total lost productivity. Since epidemiologic estimates of incidence contribute substantially to total cost estimates for any outbreak, information on average cost of illness by severity can be applied to any range of epidemiologic estimates to assess the sensitivity of total costs. For example, in the Milwaukee outbreak, the 95% confidence interval for burden (incidence or prevalence) of illness ranged from 370,000 to 435,000 persons (2,400 to 6,400 for severe cases, and 38,000 to 50,000 for moderate cases) (6). Applying these epidemiologic burden of illness estimates to the average cost per case by severity, total medical costs and productivity losses for the Milwaukee outbreak ranged from $75 to $118 million.
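
Extending the same cross-check to the confidence limits quoted above approximately reproduces the $75–$118 million sensitivity range. In this sketch, the mild-illness counts are obtained by subtracting the moderate and severe bounds from the total; that pairing of bounds is an assumption about how the published range was constructed.

```python
# Approximate reconstruction of the $75-$118 million sensitivity range.
avg_cost = {"mild": 116, "moderate": 475, "severe": 7_808}

def outbreak_cost(total_ill, moderate, severe):
    """Total cost when mild cases are the remainder of the total."""
    mild = total_ill - moderate - severe
    return (mild * avg_cost["mild"]
            + moderate * avg_cost["moderate"]
            + severe * avg_cost["severe"])

low = outbreak_cost(370_000, 38_000, 2_400)   # ~75 million USD
high = outbreak_cost(435_000, 50_000, 6_400)  # ~118 million USD
print(round(low / 1e6), round(high / 1e6))    # 75 118
```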

Although only 1% of persons who experienced diarrheal illness associated with the outbreak were hospitalized, their medical costs accounted for 89% of the total outbreak-related medical costs. Persons with suppressed immune systems were the most severely affected, accounting for 66% of hospitalizations and 74% of the total outbreak-related direct medical costs. Persons with AIDS incurred hospital costs five times greater than persons with no underlying condition. Persons with underlying conditions other than AIDS incurred almost twice the cost of hospitalization compared with persons with no underlying condition.

During the 4-month period during and after the outbreak, the productivity of Milwaukee residents and visitors who experienced diarrheal illness and their caregivers was severely affected. Although mild illness did not represent a great strain on the use of medical care resources, productivity losses were substantial given the number of persons who experienced mild illness that debilitated them in some capacity. Productivity losses accounted for 98% of total costs for persons with mild illness and 87% of total costs for persons with moderate illness.

The cost-of-illness estimates in this study are conservative for several reasons. First, primary data collection from medical and financial records limited our ability to assess all costs associated with the outbreak. For example, medical and financial records lacked details about physician visits, ambulance transports, or self-medication before admission, and cost information for professional services provided during hospitalization that were billed separately. Further, no estimate was made of the magnitude of illness among visitors to the greater Milwaukee area. Conservative estimates were used for any assumptions made when reliable data were not available. Second, we excluded productivity losses associated with chronic illness that might have extended beyond our 4-month study period, and we also excluded productivity losses associated with premature death. An estimated 69 deaths occurring principally among persons with AIDS were attributed to Milwaukee’s cryptosporidiosis outbreak (18). Excluding the productivity losses associated with premature mortality potentially underestimates our results for total productivity losses associated with the outbreak.

While this study focused on the direct medical costs and productivity losses for illness associated with the outbreak, a broader perspective for the analysis would have included other nonmedical costs for infected persons, costs to businesses, and the cost to government agencies of controlling the outbreak and improving the public water system. Costs to government agencies alone, including the Centers for Disease Control and Prevention (CDC), the Environmental Protection Agency, the Wisconsin Division of Health (currently the Wisconsin Division of Public Health), the City of Milwaukee Health Department, the Milwaukee Water Works, and 17 local health departments, were estimated at >$2 million immediately following the outbreak (CDC, unpublished data). A class action suit filed by the residents of Milwaukee against the city continued to generate costs for the local government well beyond the immediate outbreak period. Businesses similarly experienced financial hardship during the outbreak because of employee illness, the necessity of using bottled water during the city’s boil-water advisory, and a decrease in beverage and food sales overall. Unaccounted costs for the infected person include costs incurred for self-protection (i.e., the purchase of bottled water) and the pain and suffering associated with illness.

Although the $96.2 million in illness costs attributed to the Milwaukee outbreak is substantial, estimated monetized annual costs of waterborne disease in the United States have been estimated at $21.9 billion (1991 dollars) (19). This figure is based on estimates of 7.1 million cases of mild to moderate waterborne disease and 560,000 cases of severe disease (20), and an average cost per case of $2,860, including medical costs and productivity losses (21). (Average cost per case, $2,860, is based on a study of a giardiasis outbreak in Pennsylvania in 1983–1984 [21]. Although the case-fatality ratio was lower than in Milwaukee, the cost per case was higher than our estimates because of a longer duration of illness. The authors [19] note that $2,860 likely overestimates the cost of a mild case and underestimates the cost of a severe case.)
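
The $21.9 billion national figure cited above follows directly from the case estimates and the average cost per case, as the short check below shows.

```python
# Check of the national waterborne-disease cost estimate cited above (1991 dollars).
mild_to_moderate_cases = 7_100_000
severe_cases = 560_000
average_cost_per_case = 2_860   # from the giardiasis study cited in the text

national_total = (mild_to_moderate_cases + severe_cases) * average_cost_per_case
print(round(national_total / 1e9, 1))   # 21.9 billion USD
```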

In an era of limited health resources, decision makers must choose how to allocate resources to improve the public’s health. Measures taken to reduce the risk of waterborne cryptosporidiosis will also prevent other waterborne diseases. Water authorities often face the predicament of dealing with decreasing raw water quality, the high costs of new technologies, water filtration systems that do not completely remove all potentially pathogenic organisms, and increased public demand for safe water. The cost of this outbreak, which can be balanced against the cost of measures for preventing future outbreaks, is a reminder that failure to maintain safe drinking water supplies has substantial impact on the health and economy of a community.


Dr. Corso is a health economist in the National Center for Injury Prevention and Control at the Centers for Disease Control and Prevention (CDC). During the time that this study was conducted, she was an economic analyst in the Prevention Effectiveness Branch of CDC’s Epidemiology Program Office. Her research interests include assessing attitudes toward uncertain longevity under expected utility and cumulative prospect theories and applying willingness-to-pay techniques to estimate societal values for preventing illness and death.


Acknowledgment

We thank the following institutions for their tremendous contributions to this investigation: the Wisconsin Division of Public Health (formerly the Wisconsin Division of Health); the City of Milwaukee Health Department; the Medical College of Wisconsin; and the staff of the following hospitals that allowed us to conduct chart review: Children’s Hospital, St. Luke’s Hospital, Sinai Samaritan Hospital, St. Francis Hospital, Columbia Hospital, Froedtert Hospital, Doyne County Hospital, St. Joseph’s Hospital, Waukesha Memorial Hospital, West Allis Memorial, and Zablocki Veterans Affairs Medical Center. https://wwwnc.cdc.gov/eid/article/9/4/02-0417_article

——-

Cryptosporidium


In the spring of 1993, approximately 400,000 people fell victim to what Milwaukeeans have since referred to as “Crypto.” At least sixty-nine people—mostly people suffering from AIDS—died in this Cryptosporidium outbreak, which would become the country’s largest waterborne disease epidemic on record. These numbers do not include those who visited Milwaukee and drank the water before leaving, such as the college hockey teams in town for the NCAA tournament who left the city with an obscure pathogen in their bodies.

When thousands of people in Milwaukee and its surrounding suburbs began suffering from flu-like symptoms that spring, no one knew why. Hundreds of patients overwhelmed emergency rooms in early April with complaints of vomiting, abdominal cramps, and dehydration, while officials remained in the dark as to what might be causing such widespread illness. Pharmacies scrambled to restock anti-nausea medications as customers cleared the shelves. The situation at area schools reflected the extent of the illness’s spread, as attendance dwindled and staff shortages prompted at least one closure. As of April 6, the Milwaukee Health Department had not identified the source of the outbreak, but officials denied that the epidemic was connected to the city’s water supply. Lab staffs at area hospitals had been testing stool samples for bacterial pathogens, but the scope of the outbreak led scientists to focus on protozoan infections, especially Cryptosporidium. The following day, samples examined in County labs revealed the presence of this microscopic waterborne parasite. That night, Mayor John O. Norquist announced the labs’ findings and told residents to begin boiling their water.

While people throughout the city fell ill, most cases were clustered on the South Side. Those who consumed water from the southern water treatment plant got sick at twice the rate of those who received their water mainly from the northern plant. This information led officials to shut down the filtration plant serving the South Side. Unfortunately, the move came too late; the water containing the tiny parasites had already flowed freely through Milwaukee’s water supply system. The boil advisory remained in place for an entire week, forcing the school system to throw away 68,000 servings of potentially dangerous Jell-O, while customers at Miss Katie’s diner declined water with their meals, despite waitresses’ promises that it had been properly boiled. Several weeks after the ordeal, officials revealed that a flawed water quality control process had allowed the pathogen to breach the plant’s filtration system.

In 2012, years after the outbreak and the reform to the water system that followed, residents of Milwaukee received a municipal services bill which included a special report from the Milwaukee Water Works identifying the city’s water as “of the highest quality in the United States.” Cryptosporidium cases have continued to appear in the area, but Milwaukee’s drinking water has not been identified as the source in any cases since 2000. https://emke.uwm.edu/entry/cryptosporidium/

——-


When the water makes you sick…
