A short paper published in 1989 in the British Medical Journal proposed that modern hygienic practices might account for the post-industrialization increase in allergy. David P. Strachan, MD, studied the epidemiology of hay fever in more than 17,000 British children who were born during the same week and followed up to the age of 23 years. He found a significant inverse relationship between the number of children in a household and the prevalence of both adulthood hay fever and infantile eczema. He therefore suggested that siblings transmit infections to one another through their grubby hands and sneezy noses, and that the resultant childhood illnesses confer protection against hay fever.1
That concept has since become known as the "hygiene hypothesis" and is widely used to explain the allergy epidemic. If our cleanliness is, indeed, the bane of our immune systems, then might it be possible to redirect the immune response through exposure to bacteria, allergens, and similarly unpleasant things?
Allergies—specifically, type I hypersensitivity reactions—are the result of Th2-favoring immune systems. The immune systems of newborns typically are shifted in favor of the Th2 response.
The Th2 (humoral) immune response is favored by the developing fetus, so as to counter the Th1 (cellular) response of the mother to the fetus and thereby prevent premature delivery.2 After birth, however, the Th1 response is important for maintaining a normal, balanced immunologic response to pathogens and allergens. The hygiene hypothesis postulates that early exposure to common pathogens and allergens spurs the development of the Th1 response, resulting in a balanced and matured immune system. Modern hygienic practices, such as the use of antibiotics and vaccines, as well as smaller family sizes, have eliminated many of the opportunities for exposure to Th1-priming allergens, bacteria, and viruses.
Allergy is undoubtedly more common in children who live in the city than in their farm-dwelling counterparts.3 Early exposure to pollen, as well as animal-associated dander and feces, is believed to encourage the maturation of the immune system. This "allergy immunity," however, does not always survive if the individual relocates to the city,4 possibly because pollution intensifies the potency of pollens.5
Allergy also appears to be less common in individuals infected with Schistosoma mansoni, a species of helminth parasite that secretes anti-inflammatory cytokines.6 Th2 responses, specifically eosinophil recruitment,7 likely evolved as protective responses to parasitic worms. Evolution is a constant battle for the winning spot—and the helminths' counter to the Th2 reaction is the secretion of anti-inflammatory cytokines. Because most people aren't willing to trade their allergies for intestinal parasites, some researchers have been attempting to purify the helminth-produced anti-inflammatory chemokine.6
Understandably, a more palatable alternative is desirable. Microbial colonization of the intestine begins shortly after birth, and our symbiosis with the hundreds of species of gut-residing bacteria is vital to the normal functioning of our digestive and immune systems.8 Through experiments with germ-free mice, researchers determined that the establishment of intestinal flora as a newborn is crucial to the development of a Th1-Th2 system that can be regulated correctly.9 When antibiotics are administered to infant mice, the gut flora is altered, and the Th1 immune responses are impaired.10