Prior to agriculture and the domestication of animals, humans likely had little chance to interact on a large scale with a wide range of pathogens, and disease likely played a minor role in evolutionary pressure compared with the more pressing concerns of starvation, predation, and hypothermia.
Around 10,000 years ago, humans began domesticating plants and animals on a wide scale, alleviating nutritional concerns, while living in larger groups lessened the risk of predation. However, constant close proximity to other humans and animals, often in habitations with poor sanitation, generated favorable conditions for epidemics, producing a strong selection pressure towards a robust immune system in humans.
Many of humanity's most challenging diseases originated in domesticated animals: while the process of crossing the species barrier is difficult, once it is achieved the new host mounts an insufficient immune response due to its lack of prior exposure.
Early germ theory
Observers throughout history have noted that devastating plagues often seemed less potent upon re-exposure, suggesting that people had acquired resistance. During the 16th and 17th centuries, natural resistance to disease was recognized to differ among human populations. This influenced, for example, the proportion of African slaves exploited in tropical regions compared to captured Native Americans or Europeans, as the former possessed innately greater resistance to yellow fever and malaria.
In 1546, Italian scholar Girolamo Fracastoro wrote a treatise on germ theory, wherein he surmised that invisible "seeds of disease" existed and were transmitted between people or by other means. It was not until 1677 that Antonie van Leeuwenhoek was able to observe these microorganisms using his newly developed microscope, though the significance of their role in disease was not fully recognized for many more decades.
Several scientists around the world developed germ theories based on microscopic organisms during this period, though the microscopes of the time could do little more than confirm the presence of microorganisms, leaving the theories difficult to test.
For example, in the mid-1840s, Ignaz Philipp Semmelweis was a doctor at a hospital in Vienna when he noted that the spike in women dying during childbirth from a condition known as puerperal fever correlated with days when medical students were performing autopsies. He also observed that the putrid odor produced by those suffering from puerperal fever was similar to that produced by corpses at the medical facility.
A close friend of Semmelweis died after receiving a small cut during an autopsy, leading him to introduce a handwashing policy that he fought for the rest of his life to see adopted, though his efforts were impeded by the political events of the time.
Another strong early proponent of germ theory and handwashing was London physician John Snow, who challenged the prevailing miasma theory of the time. He demonstrated that a local cholera outbreak originated from a public water pump located on the corner of Broad and Cambridge streets.
The pump was contaminated with sewer effluent, though what made the investigation so remarkable was the dedicated series of interviews Snow performed, which allowed him to piece the case together. Through these, he traced cholera cases that were not local to the pump, learning that the individuals affected had traveled to the area or received water from there.
Around this time, English botanist Reverend Miles J. Berkeley noted the presence of fungus on potatoes suffering from the blight, proposing that the disease was caused by a fungus rather than a "damp miasma".
At the time he was mocked, and the blight would go on to kill many hundreds of thousands by starvation over the following years. It was not until 1861 that German mycologist Anton de Bary conclusively identified the fungus as the cause through rigorous scientific methods.
Pasteur and Koch
In 1857, Louis Pasteur conclusively demonstrated that microorganisms were responsible for the fermentation and spoiling of food, debunking the idea of spontaneous generation by showing that sterilization by heat, chemicals, or filtration, followed by sealing, prevented microbial growth.
Pasteur became a relative celebrity scientist, developing a team of influential researchers that would go on to make many significant contributions to the field, including Charles Chamberland, inventor of the autoclave and a vaccine researcher; Ilya Metchnikoff, who discovered the process of phagocytosis and initially described innate immunity; and Albert Calmette, who developed the first tuberculosis vaccine.
In 1879, Pasteur observed that serial passage of chicken cholera caused the bacteria to become less deadly, and that reusing chickens, which were in short supply in the lab, revealed that prior exposure to the weakened strain conferred resistance to more virulent strains.
This artificial attenuation avoided the need to find naturally occurring similar microorganisms, as Jenner had done with cowpox and smallpox, and formed the basis for the first human-produced vaccines. Pasteur himself developed vaccines using this technique for anthrax and rabies in 1881 and 1885, respectively.
Robert Koch became a major force in the field of microbiology in the 1870s, pioneering a number of techniques for bacterial identification and isolation and receiving substantial funding from the Prussian government, founding the Institute for Infectious Diseases in 1891.
He eventually became a great rival of Pasteur’s, largely due to political and cultural disagreements. Pasteur reportedly favored the vaccination approach to disease control, while Koch supported a public education-based approach.
Other scientific disagreements between the two are apparent: Koch felt that the focus of research should be the elimination of microorganisms, which he regarded as immutable species that could not be changed, while Pasteur was perhaps more interested in controlling and directing them.