Want to discuss these ideas through living systems frameworks? Join our workshop on this topic, Wednesday, October 16th, 2024.
This is the third installment from The Ecology of Care: Medicine, Agriculture, Money, and the Quiet Power of Human and Microbial Communities.
You can find the book’s introduction here and the first half of Chapter One here. I decided to offer the book (published almost 10 years ago) as a series to my Substack readers, with a chance to discuss the topics in weekly workshops, as the book seems more relevant than ever. I’m also recording audio—stumbles and all.
The Lobotomy of Care: Part Two
Dirty Cows and Whiskey Sludge: The Germ Theory of Disease
“I beseech you to take interest in these sacred domains so expressively called laboratories. Ask that there be more and that they be adorned, for these are the temples of the future, wealth, and well-being. It is here that humanity will grow, strengthen, and improve.”
—Louis Pasteur, microbiologist and chemist, in 1878
Cows, like Ted’s, did not adapt well to the Industrial Revolution. As industrialized cities became larger, more polluted, and more crowded in the 1800s, food needed to be transported longer distances, and spoilage became an issue. Fresh milk was particularly difficult to procure, because it spoiled so easily in warm weather, and—as the “commons,” or common grazing areas, in cities were turned over to development—there was little or no space for cows to graze within city limits. In the early 1800s, entrepreneurs in New York City came up with a brilliant and profitable solution: keep the cows indoors next to the whiskey distilleries and feed them the sludge left over from whiskey production.1
Suddenly fresh milk from these cows was available to city residents, and it sold like crazy. In 1852, three-quarters of the money spent on milk by New York City’s 700,000 residents was spent on “slop milk” from distilleries.2 Unfortunately, cows can’t be kept indoors and fed whiskey sludge without developing serious illnesses; they typically survived only a year under these conditions.3 Bacteria from these sick cows ended up in the milk supply, and without refrigeration they multiplied easily, along with other bacteria, such as E. coli and Salmonella, related to the filthy conditions of the dairies. “Slop dairy” from distilleries became a serious vector for disease in cities in the United States and Europe. Infant mortality in cities rose sharply after the distillery dairies began to sell the infected milk—climbing from 32 percent to 50 percent in just twenty-five years in New York City.4
(Feedlots for beef cattle were developed during this same period, starting on the south side of Chicago—made possible by the developments of the Industrial Revolution: trains to transport cattle, and tractors which created a surplus of corn and other grains for feed.5)
This stubborn insistence of cows—to live in healthy surroundings, or else—was one of several factors that would ultimately drive science, agriculture, and medicine into the modern era: an era that valued a new, sterile paradigm, replacing the fertile paradigm that had defined humans’ relationship to life up to that point.
During the 1850s, while distillery dairies were still thriving, Louis Pasteur, a French microbiologist and chemist, lost three of his five children (at ages two, nine, and twelve) to typhoid fever: a food-borne illness caused by Salmonella bacteria. Pasteur’s research—already focused on the role of microorganisms in the process of fermentation—soon proved that heating foods such as milk to high temperatures and then sealing them (a process now called pasteurization) inhibited the growth of bacteria.
(Note: We have since learned that pasteurized milk—and many other industrially produced food products, such as deli meats, spinach, and eggs—can still be responsible for serious bacterial illnesses. Between 1980 and 2005, forty-one outbreaks, accounting for 19,531 illnesses attributed to the consumption of pasteurized milk and milk products, were reported to the CDC. The largest Salmonella outbreak in US history was traced to pasteurized milk from an industrial farm.6)
By showing that microorganisms were the cause of food spoilage and food-borne illnesses, Pasteur solidified the germ theory of disease. This ultimately created a sea change in medicine—as well as in human life. People, animals, and food started to be seen as potentially dirty and unsafe, because they could carry germs.
Laboratories and scientists, on the other hand, were clean.
Before the germ theory of disease took hold, doctors performing surgery were almost absurdly unconcerned with cleanliness. Surgeons’ hands, rarely washed, were placed directly into the patient’s wounds. Onlookers were encouraged to “take a feel” for educational purposes. Surgeons wore clothing covered in blood, as proof of their “popularity.”7 (In fact, surgeons garnered little respect in those days, and were seen only as a last resort for desperate patients.) Surgical instruments were crudely wiped, placed back into their velvet carriers, and reused, some having been sharpened on the sole of the surgeon’s boot. The floors of surgical wards were covered with human feces, urine, blood, and pus, and the hospital walls displayed a collage of phlegm and other body fluids. Consequently, infection was a major cause of death in the early 1800s, with 80 percent of operations plagued by “hospital gangrene” and a nearly 50 percent mortality rate.8
When Ignaz Semmelweis suggested in 1847—based on his observations of midwives—that handwashing with a chlorine solution might be a useful addition to surgical practice, he was ruthlessly mocked by his peers; he ended up dying in an insane asylum.
Pasteur and his contemporaries changed all that. Joseph Lister started using carbolic acid to sterilize tools and clean wounds in 1867. Lister wrote about his method in the Lancet, and included suggestions to stop using porous materials in the handles of surgical tools, thus bringing surgery to a new level of safety. Handwashing finally became standard surgical procedure. As the death and infection rates from surgeries plummeted, surgeons were able to confidently explore and repair the human body, and surgery developed into a highly specialized and respected medical profession.
The creation of a sterile field, which was so essential for surgery, became the hallmark of all of modern medicine. Soon all doctors and scientists, not just surgeons, donned white coats as a symbol of the new era: a signal to the public of a new form of medicine, one that would not allow microorganisms to interfere with human life.
Magic Bullets: Modern Medicine as a Gateway to Reliance on Technology
The horrendous conditions of World Wars I and II and the spread of disease through the newly industrialized world greatly increased the pressure to develop an antibiotic drug to treat the wave of illnesses that were cropping up. The widespread use and production of antibiotics began in 1939, when microbiologist René Dubos isolated the compounds gramicidin and tyrocidine from the soil bacteria Bacillus brevis. These became the first commercially produced antibiotics. Dubos’s discovery also revived scientists’ interest in researching penicillin—which had been used for centuries by herbalists. (Dubos, interestingly, was also an environmentalist. He coined the phrase “Think Globally, Act Locally” and warned about the danger of bacteria developing resistance to antibiotics as early as 1942.9)
Before World War II ended, a powerful and wide-acting form of penicillin was finally manufactured for commercial use. Once available, penicillin easily cured many infections that had previously been untreatable, such as syphilis, pneumonia, and wound infections.
The dirty industrial era shifted toward the clean technological era, with medicine at the forefront, and the race was on. If we could eliminate all germs, stop pain with anesthesia, and surgically repair any mechanical problems, we could essentially put an end to disease.
Penicillin became the gateway drug for generations of other drugs, creating the hope that scientists working in laboratories would eventually find a “magic bullet” for each and every illness—one that could cure, like penicillin did, in a matter of days or weeks. All we needed to do was aim the right bullet at each invader, and health would be restored.
Medicine, agriculture, and politics all used similar metaphors. War was increasingly seen as a kind of “medicine” to rid society of unwanted elements, and medicine and agriculture were increasingly seen as a kind of “war”—against bacteria, viruses, and insects. It seemed for a while that anything in society that was unwanted could easily be eradicated: garden pests, tooth decay, blood clots, “defective” populations, even “mental illness.”
We have left some of these ideas firmly in the past, yet the underlying mindset—that technology and science will solve all our problems as we enter the future—persists. As we speak, scientists just down the street from where I grew up are perfecting deep-brain stimulation devices for people with depression, and genetically engineered tomatoes in which to grow vaccines.
In the same way that the successes of antibiotics created huge expectations for new drug development, modern medicine as a whole became the “gateway drug” for our ever-increasing reliance on technology. The early successes of modern medicine satisfied such a deep survival need in humans that many people broadly assumed that anything innovative, “scientific,” or technological was somehow not only an improvement, but also necessary for our evolution, if not our very survival.
Pasteur was looking through a small frame. His model, developed in the unhealthy depths of the Industrial Revolution, had a targeted goal to solve a specific problem. The medical and agricultural researchers who followed his lead aimed to eliminate “problems” wholesale, rather than understanding the complex role that microorganisms, “weeds,” and insects play in the larger, fertile world of biological diversity: the world of meadows, sunshine, and clear, cool rivers that the sickly, whiskey-sludge cows were yearning for.
From hand sanitizers to Lysol, the illusion of sterility still permeates our daily lives, dangling an unfulfilled promise of health. While there have been some definite short-term gains for humans, there have been some enormous long-term losses for everything else.
I respect medical innovation and all that it has added to our lives; however, in my own lifetime I see that the shift toward a more “sterile” industrialized model of care—isolating parts, killing off whatever is not wanted, and relying purely on high-tech solutions to life’s problems—can sometimes be a dangerous ride. Like my own family’s history, it’s a freewheeling, fast-moving experiment with unknowable consequences.
The “Sterile” Model of Care
“People are fed by the food industry, which pays no attention to health, and are treated by the health industry, which pays no attention to food.” —Wendell Berry
At large industrial farms, animals are fed a diet of corn and soybeans, genetically engineered to withstand heavy doses of agricultural chemicals—typically sold by the same company that makes the genetically altered seeds. The application of insecticides, herbicides, de-wormers, and fungicides to the land and animals, along with the tilling of the soil, creates a “clean slate.” These chemicals work hard to kill off almost everything other than the crop itself, including the deep-rooted native plants, fungi, pollinators, earthworms, and other microorganisms that create the living structure of the soil.10 With less biodiversity above ground, there are fewer microorganisms below ground to break down the naturally occurring minerals in the soil and make them available to plants, so expensive chemical fertilizers must be added. The soil, in turn, becomes a relatively inert substance used to prop up plants, rather than a living ecosystem.11
Once its living structure collapses, the soil loses its natural carbon-rich, sponge-like, moisture-holding capacity, so it also needs irrigation. The irrigation is really no better than the rain: it runs off the dirt easily, carrying silt and chemicals downstream and creating “dead zones” in rivers, lakes, and oceans. The practice of chemical agriculture also leaves farmers hugely dependent on large corporations that set the prices for genetically engineered seeds and the ever-increasing doses of chemicals that go with them.
Without the mycorrhizal fungi as intelligent living gatherers and sorters of nutrients, plants lose the ability to modulate their uptake of—and differentiate between—toxins and nutrients. Plants growing in heavily treated soils take up lead, arsenic, and other excessive minerals, and cannot solubilize the ones they need, like selenium and iron. They can become both toxic and nutrient-deficient as foods, resulting in unhealthy plants, animals, and people.12
The fat tissues in animals absorb the chemicals in their feed—cancer-causing dioxins, in particular—which are then passed along to those who eat factory-farmed animal products.13 Confined animals eating industrial foods are more likely to become sick, so their feed is supplemented with a constant stream of antibiotics (intentionally, and incidentally via antimicrobial glyphosate residues14) to prevent infection and make them grow faster. The scientific community agrees that antibiotic use in beef, chicken, and pork farming contributes directly to the number of antibiotic-resistant infections that now plague hospitals, like the one where I gave birth to my two sons.1516
You’d never guess there was a problem with antibiotic-resistant infections at this hospital, because the large modern hallways are gleaming, the ceilings are high and airy, and light pours in through huge sparkling-clean windows. Someone plays a grand piano as you walk into the spacious entrance hall. Everything about the place speaks of professionalism, privacy, sterility, and wealth. It’s a bit like entering a very exclusive hotel.
Patients are asked to stand back from the desk while waiting in line for the receptionist so that no one overhears anyone else’s information. If a patient is admitted to the hospital and her best friend is in the room next door, nurses are not allowed to tell either of them that the other one is there.
It’s not just the patients that are sequestered from the complexity of their natural communities. The medicines that doctors prescribe at this hospital are also far removed from the complex herbs, ecosystems, and cultural traditions that inspired their development and use. Instead, they are simple chemical compounds: often petroleum-based replicas of a single plant alkaloid, without variation, packaged in plastic with computer-generated instructions, handed out by pharmacists in white jackets, and billed to an insurance company.
Because the drugs in mainstream medicine are so potent, patients have a small but real chance of dying from the medications the hospital prescribes for them, and an even higher chance of developing serious complications.1718 The occasional medicines that patients truly need are not the problem. It is the nonessential ones that have flooded the system—pushed by marketers from pharmaceutical companies at huge financial cost to the consumer—which offer minuscule benefits and carry substantial risks. In fact, according to one study, iatrogenic issues—things like medical and pharmaceutical errors and hospital-acquired infections—are the number one cause of death in the United States.19
Although the cancer-causing dioxins from medical incinerators are mostly filtered out these days, this sparkling clean hospital—like all hospitals—has a large stream of waste. This includes radioactive waste from CT scans and other diagnostic techniques, drug waste, and tons of disposable plastic tubes, bags, syringes, and instruments—about one ton for every 100 patients. According to the US Department of Energy website, hospitals use 836 trillion BTUs of energy annually, have more than 2.5 times the energy usage and CO2 emissions of commercial office buildings, and produce more than 30 pounds of CO2 emissions per square foot.20
There is nothing surprising about any of this; in fact, this hospital is considered ahead of the curve, both in standards of medical care and environmentally sound practices. It’s just that the kind of high-tech care they provide uses a lot of nonrenewable resources and therefore takes a toll on the environment, by its very nature.
The Creation of a Medical Monoculture
There is a reason that the US health-care system insists on using high-tech, high-profit solutions to human illnesses, rather than focusing on prevention, community health, and other more ecological, systems-based approaches (even though these have been shown to be more efficient and lower in cost). Our current medical system was purposefully developed in a socially sterile “test tube” of an environment, as if it were genetically engineered to highlight certain traits and eliminate others.
In the late 1800s, there was still as much diversity in the medical field as there is in the meadow behind my clinic. Medical providers included many types of doctors: eclectics, homeopaths, osteopaths, chiropractors, naturopaths, surgeons, and “regular” doctors, just to name a few, all on relatively equal footing. In addition, there were local herbalists, midwives, and parish nurses. Most of these practitioners viewed health and the human body in a holistic, systems-based way.2122 Some hospitals and medical schools employed a variety of practitioners all practicing under the same roof, while specialized schools and hospitals nourished particular niches of knowledge. (At the turn of the century, for example, there were more than twenty homeopathic medical schools in the US, and more than 100 homeopathic hospitals.23)
Only a portion of doctors were practicing the kind of medicine that looked at things through dissection and microscopes, rather than taking a larger, ecological view. But there was a parallel shift going on in the politics and economics of medicine that turned the small but important idea of surgical sterility into a worldwide campaign to eliminate anything that threatened the white coats of MDs.
Two streams—the germ theory of disease, and a struggle for economic power and control—converged into a tidal wave that crashed over the medical landscape and swept all other views of health out to sea. It came to a head when a group of well-meaning but competitive doctors performed a radical surgery—with a hatchet, not a scalpel—on the US health-care system in its youthful years. The operation removed a portion of our collective memory and altered the future of medicine in the United States.
The homeopaths, who had been very popular with the general public because of the safety and low cost of their medicines, had just started a national association from which they launched attacks on the “regular” doctors and their “heroic” methods—things like bloodletting, dangerous surgeries, and large doses of mercury and laxatives to treat nearly everything. In 1847, the “regular” doctors, in response, started the American Medical Association (AMA), creating a board to “analyze quack remedies and enlighten the public in regard to the nature and danger of such remedies.”24 Homeopaths, naturopaths, botanical practitioners, and other ecologically minded “quacks” were not allowed into the AMA. Regular doctors could lose their license even for consulting with them.25
The movement to consolidate professionals in order to have more power reflected a larger trend that was sweeping the nation: the corporation. Business models of efficiency, run by “experts,” were replacing personal values and personal choices of care.26
As laboratory science and the germ theory of disease took hold of the public imagination in the early 1900s—and mercury and bloodletting were on their way out—the public started to look toward regular doctors and surgeons with more respect. The American Medical Association took this opportunity to establish a Council on Medical Education in order to “upgrade medical education.” The homeopaths at first supported this measure, because surveys in the American Medical Association’s own journal (JAMA) showed that graduates of the conventional medical schools failed the national exams at nearly twice the rate of graduates of homeopathic colleges.27 The council’s goal was to close half the medical schools, thus halving the number of graduates and creating a more respected, higher-paid profession for physicians.28 (Doctors, in those days, were often paid less than mechanics.29)
It was a laudable goal, except that it still had a hint of hostility in it, as well as more than a whiff of elitism. AMA members had been complaining bitterly that their profession was overcrowded, and that they weren’t getting enough respect, or enough money, because too many in their profession were of “coarse and common fiber.”30 In 1903, the president of the AMA gave a speech disparaging the night schools that allowed clerks, janitors, street-car conductors, and others employed during the day to earn a medical degree.31
“It is to be hoped that with higher standards universally applied, their number will soon be adequately reduced, and that only the fittest will survive,” declared the editors of the Journal of the American Medical Association.32 This sort of social Darwinism was typical of the era, and typical of the economic mind-set that says people perform better when they are forced to compete for limited resources.
In 1908, with funding from the Carnegie Foundation, the AMA’s Council on Medical Education sent Abraham Flexner to do a survey of all US medical schools, and to provide a report on the quality of their teaching and other resources. (Two years earlier, the CME had already classified US medical schools into three groups: A = acceptable, B = doubtful, and C = unacceptable, and had shared this information with Flexner.33) When Flexner’s report was published, it quickly became the basis of the AMA’s campaign to standardize medical education and close certain schools. Flexner recommended closing all of the nation’s 151 medical schools, except for thirty-one that he had chosen.34
In 1912, a group of state licensing boards banded together and agreed to base their accreditation policies on standards determined by the AMA’s council, and medical schools began rapidly closing down.35 The next year, the AMA established a Propaganda Department whose task was to “gather and disseminate information concerning health fraud and quackery,” and more schools were closed. By 1928, the Propaganda Department had been renamed the “Bureau of Investigation,”36 a sharp signal to the remaining “nonregular” doctors that the AMA had clearly established itself—without ever having gotten permission from the government—as a national regulatory agency for the practice of medicine.
Several changes ensued, dramatically altering how medicine was defined and taught.
In 1900, the United States had a flourishing women’s medical profession, with 7,000 female physicians (compared to 258 in England and 95 in France) who ran hospitals, taught in medical schools, and were deeply involved in important public health reforms.37 After the AMA’s council formed, six of the seven medical schools for women were shut down, resulting in a 33 percent reduction in female graduates.38
Fourteen medical schools for Black Americans had been established after slavery was abolished. Only seven remained by the time of Flexner’s report, and on his recommendation, five of those were closed, leaving only two.39 Flexner’s report had also suggested a limited role for Black physicians, portraying them as useful only “to treat their own race.” (Echoes of that thinking persist today, though in subtler language.) His view of their role was not primarily as physicians but as teachers of hygiene and sanitation to prevent diseases from spreading to White populations, as he portrayed Black people as a source of “infection and contagion.”40
Most of the Southern and Midwestern rural medical schools that were not connected to universities with large endowments were also closed, as were the affordable night schools that were open to poor and working-class medical students.4142
Holistic and preventive forms of medicine were no longer taught in medical schools, and states began passing laws that prevented naturopathic and other “nonregular” physicians from practicing medicine. The mechanistic, rather than the ecological, view of the body, and reductionist laboratory science, became the required basis of medical school curriculum and medical licensing exams.43
Under intense pressure from the new “medical profession,” state after state passed laws outlawing midwifery and restricting the practice of obstetrics to doctors, even though studies were showing that doctors had far less understanding of the birth process than traditional midwives. This made obstetric and gynecological care too expensive for most poor and working-class women.44
Finally, the Rockefeller Institute created a system by which medical research grants had to be vetted by a board of experts, primarily from the elite East Coast medical schools. (The institute’s first director was Simon Flexner, Abraham Flexner’s brother.) This led to more narrowing of the focus of medicine, and to further closings, isolation, and impoverishment of Southern and Midwestern medical institutions that depended on state funding rather than on wealthy benefactors.45
The combined result of all these changes made the next wave of doctors—who went on to become the next wave of medical school professors—primarily white, wealthy, competitive men who excelled at linear thinking. Our current system, with all its marvelous progress and its quirky flaws, was designed, taught, and developed by them. So what went missing? What would our US medical system look like if it had not been streamlined in the quest for higher profits? What would it look like if it had also been designed, taught, and developed by women, by Black and Brown people, by people interested in a variety of holistic systems-based approaches, and by people who had worked as janitors and street-car conductors, and grown up in poor and working-class households? We only know one part of the answer: Not like this.
Ron Schmid, The Untold Story of Milk: Green Pastures, Contented Cows and Raw Dairy Foods (Washington, DC: New Trends Publishing, 2003).
Ralph Selitzer, The Dairy Industry in America (New York: Books for Industry, 1976), 123.
Robert M. Hartley, An Historical, Scientific and Practical Essay on Milk as an Article of Human Sustenance (New York: J. Leavitt, 1842), 67–68.
Ron Schmid, The Untold Story of Milk: Green Pastures, Contented Cows and Raw Dairy Foods (Washington, DC: New Trends Publishing, 2003).
Philip D. Hubbs, “The Origins and Consequences of the American Feedlot” (Thesis, Baylor University, History Department, 2010).
C. A. Ryan, et al., “Massive Outbreak of Antimicrobial-Resistant Salmonellosis Traced to Pasteurized Milk,” JAMA 258, no. 22 (1987):3269–74.
Jason T. Miller, et al., “The History of Infection Control and its Contributions to the Success and Development of Brain Tumor Operations,” Neurosurgical Focus 18, no. 4 (2005):1–5.
Ibid.
See Andrew C. Revkin, “A ‘Despairing Optimist’ Considered Anew,” New York Times, June 6, 2011.
“Glyphosate Interactions with Physiology, Nutrition, and Diseases of Plants: Threat to Agricultural Sustainability?” European Journal of Agronomy 31 (2009):111–13.
Elaine Ingham, Lecture, Northeast Organic Farming Association Summer Conference, 2014.
Walter Jehne, lecture and private interviews following the “Restoring Water Cycles to Reverse Global Warming” conference, Tufts University, October 2015.
http://www.who.int/mediacentre/factsheets/fs225/en/.
Monsanto patented Glyphosate (Roundup®) as a broad-spectrum antimicrobial, as well as an herbicide.
John Conley, “Antimicrobial Resistance: Revisiting the ‘Tragedy of the Commons,’” Bulletin of the World Health Organization 88, no. 11 (2010):797–876, http://www.who.int/bulletin/volumes/88/11/10-031110/en/index.html.
“III. The Environmental Impact of Imprudent Antimicrobial Use in Animals,” Antimicrobial Resistance Learning Site, http://amrls.cvm.msu.edu/veterinary-public-health-module/iii.-the-environmental-impact-of-imprudent-antimicrobial-use-in-animals/iii.-the-environmental-impact-of-imprudent-antimicrobial-use-in-animals.
J. Lazarou, et al., “Incidence of Adverse Drug Reactions in Hospitalized Patients: A Meta-Analysis of Prospective Studies,” JAMA 279, no. 15 (1998):1200–1205.
“To Err is Human: Building a Safer Health System,” Institute of Medicine, Report, 1999.
Gary Null, et al., “Death by Medicine,” Life Extension Magazine, March 2004.
“Department of Energy Announces the Launch of the Hospital Energy Alliance to Increase Energy Efficiency in the Healthcare Sector,” US Department of Energy.
Paul Starr, The Social Transformation of American Medicine (New York: Basic, 1982).
Barbara Griggs, Green Pharmacy: The History and Evolution of Western Herbal Medicine (Rochester, Vermont: Healing Arts Press, 1997).
Dana Ullman, Homeopathy: Medicine for the 21st Century (Berkeley: North Atlantic Books, 1987).
“AMA History Timeline,” American Medical Association, accessed May 6, 2015, www.ama-assn.org//ama/pub/about-ama/our-history/ama-history-timeline.page.
Harris L. Coulter, Divided Legacy, Volume III: The Conflict Between Homeopathy and the American Medical Association (Berkeley: North Atlantic Books, 1973).
Gerald E. Markowitz and David Karl Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly 25, no. 1 (1973):83–107.
Dana Ullman, The Homeopathic Revolution (Berkeley: North Atlantic Books, 2007).
Gerald E. Markowitz and David Karl Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly 25, no. 1 (1973):83–107.
G. F. Shears, “Making a Choice,” Cosmopolitan, April 1903, 654.
Inez C. Philbrick, “Medical Colleges and Professional Standards,” JAMA 36, no. 24 (1901).
“Medical Education,” Science 17:763.
Andrew H. Beck, “Flexner Report and the Standardization of American Medical Education,” JAMA 291, no. 17 (2004).
“AMA History Timeline,” American Medical Association, accessed May 6, 2015, www.ama-assn.org//ama/pub/about-ama/our-history/ama-history-timeline.page.
Barbara M. Barzansky and Norman Gevitz, Beyond Flexner: Medical Education in the Twentieth Century (Westport: Greenwood Press, 1992).
Gerald E. Markowitz and David Karl Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly 25, no. 1 (1973):83–107.
“AMA History Timeline,” American Medical Association, accessed May 6, 2015, www.ama-assn.org//ama/pub/about-ama/our-history/ama-history-timeline.page.
Shari L. Barkin, et al., “Unintended Consequences of the Flexner Report: Women in Pediatrics,” Pediatrics 126, no. 6 (2010):1055–57.
Paul Starr, The Social Transformation of American Medicine (New York: Basic, 1982), 124.
Earl H. Harley, “The Forgotten History of Defunct Black Medical Schools in the 19th and 20th Centuries and the Impact of the Flexner Report,” Journal of the National Medical Association, 98, no. 9 (2006):1425–29.
Louis W. Sullivan, MD, and Ilana Suez Mittman, PhD, “The State of Diversity in the Health Professions a Century After Flexner,” Academic Medicine 85, no. 2 (2010):246–53.
Andrew H. Beck, “Flexner Report and the Standardization of American Medical Education,” JAMA 291, no. 17 (2004).
Gerald E. Markowitz and David Karl Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly 25, no. 1 (1973):83–107.
Ibid.
Barbara Ehrenreich, Witches, Midwives, and Nurses: A History of Women Healers (New York: The Feminist Press at CUNY, 1973).
Gerald E. Markowitz and David Karl Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly 25, no. 1 (1973):83–107.