Chances are that every American has been prescribed medicine by a doctor, filled a prescription at a pharmacy, or taken an over-the-counter medicine. The label of every drug carries instructions for use, dosage, the strength of the drug, potential side effects and warnings, and other information needed to take the medicine safely. This information is the product of decades of American drug regulation. That regulation ultimately allows people to responsibly and safely take medicine and treat whatever ailments they might have, something that many in the modern day might take for granted.
Until the mid-nineteenth century, the American drug industry was mostly unregulated. In 1820, eleven doctors met in Washington to establish the U.S. Pharmacopeia, the first compendium of standard drugs in America, creating a foundation for later regulation. The standards within the U.S. Pharmacopeia were used in 1848 to bar the importation of adulterated drugs, i.e., any drugs that did not meet these standards. This law, the Drug Importation Act, was the first major legislation regulating medicine, although it only dealt with U.S. Customs Service inspections of imported drugs rather than domestic medicine. While Congress authorized the U.S. Department of Agriculture (USDA) to investigate domestically adulterated medicine, the funds appropriated were too small to support these investigations until the late nineteenth century.[1]
The predecessor to the FDA dates back to the Civil War era. President Lincoln formed the USDA Bureau of Chemistry in 1862, originally employing only one chemist. The bureau was transformed by the 1883 appointment of Dr. Harvey W. Wiley as Chief Chemist. Wiley devoted his career to ending the practice of adulteration in food, and by the mid-1890s he began publicly advocating for the enforcement of a national antiadulteration law. In 1902, Congress authorized the USDA to establish food standards, resulting in the creation of the “Poison Squad.” This morbid-sounding name was coined by the newspapers for the group of twelve men who undertook the Hygienic Table Trials, overseen by Wiley, to begin formulating these standards. The men were weighed before every meal; half of the group was fed unadulterated food while the other half was fed food laced with common additives. These substances included borax, salicylic acid, saccharin, sodium benzoate, copper salts, and formaldehyde. Every two weeks, the control group and the group fed adulterated food switched. Less than a month after the trials began, the Washington Post reported on December 26, 1902, that the subjects eating the chemically laced foods had lost weight over the holiday and that morale was low amongst the men of the Poison Squad. These trials concluded in 1906.[2]
Harvey Wiley shown later in his career at the U.S. Department of Agriculture, ca. 1910. (U.S. Food and Drug Administration)
Wiley capitalized on the publicity generated by reports of the Hygienic Table Trials and “muckraking” media exposure of the American food industry to pressure Congress into passing the Pure Food and Drug Act of 1906 (also known as the Wiley Act in his honor). This law prohibited misbranded and adulterated food, drinks, and drugs in interstate commerce and was enforced by Wiley’s Bureau of Chemistry. To view examples of regulation under the Food and Drugs Act throughout the early twentieth century, the National Library of Medicine has a fantastic digital archive of published federal notices of judgment against manufacturers and products prosecuted for violating the law. However, this law focused on the regulation of product labeling rather than pre-market approval of medicine. Unlike new drugs today, which must be approved before marketing, any drug could be marketed so long as its label was accurate. Drugs, held to the standards of the U.S. Pharmacopeia and National Formulary (the two national compendia of medicine), could not be sold in a form that departed from these standards unless the specific changes were clearly outlined on the label. Labels could not be false or misleading, and if a food or drug contained alcohol, morphine, opium, cocaine, heroin, alpha or beta eucaine (synthetic anesthetics designed as analogs of cocaine), chloroform, cannabis indica, chloral hydrate (a common sedative at the time), or acetanilide (a painkiller), the label had to disclose its presence and amount. For example, under this act, drugs such as Mrs. Winslow’s Soothing Syrup for teething infants had to note that the medicine contained morphine and alcohol. Finally, if a manufacturer listed the weight or measure of a food, this had to be done accurately and could not be false or misleading.[3]
Mrs. Winslow's Soothing Syrup Medical Trade Card
(courtesy of Artstor, https://library.artstor.org/public/28305419)
In the first few years following the passage of the Wiley Act, its namesake emphasized regulating foods rather than drugs. The law prohibited the addition of any ingredient that substituted for the food, concealed damage, posed a health hazard, or constituted a filthy or decomposed substance; however, interpretations of the law led to several significant court cases challenging aspects of the legislation. Significantly, in 1911, the Supreme Court ruled 6-3 in U.S. v. Johnson that the Food and Drugs Act did not prohibit false therapeutic claims, only false or misleading statements about the ingredients of a drug. Per this ruling, drug manufacturers could still claim that their medicine was a cure-all so long as they accurately described the ingredients of their alleged “miracle drug” on the label (in the case of U.S. v. Johnson, the drugs in question promised to cure cancer). In response to U.S. v. Johnson, Congress enacted the Sherley Amendment to the Food and Drugs Act, which prohibited the intentional labeling of medicine with false therapeutic claims.[4]
During the Progressive Era, Congress also passed the first narcotics control law. Prior to the 1914 passage of the Harrison Narcotic Act, there was virtually no effective substance regulation in America. Several states had laws regarding narcotics sales and some municipalities banned opium smoking, but enforcement of these laws was infrequent. By the turn of the twentieth century, there were about 300,000 opiate addicts in America. Many of these were Civil War veterans who had become addicted to morphine and opium during and immediately after the war. Opiates were widely used as painkillers for wounded soldiers, and opium was a major remedy for diarrhea and other diseases spreading through army camps. Thousands of men became addicted to these drugs and struggled to overcome their opiate addictions for decades after the war ended, often stigmatized as “weak men” or isolated because of their addiction.[5]
Societal beliefs largely explain the stigmatization of addiction. For many Americans of this era, drug-addicted veterans were immoral, weak, and emasculated. Rather than treating their addiction, many in society believed these veterans deserved to be punished, with the result that many became isolated and, often, died of accidental drug overdoses. Furthermore, Victorian-era Americans were far more concerned about regulating alcohol than opiates. This was both because of the belief that alcoholics created a public nuisance and harmed women and children (compared to opiate addicts, who were largely withdrawn from society), and because of prejudicial societal beliefs. While opium smokers were largely white criminals and Chinese immigrant laborers, they were a minority of addicts; more common were middle- and upper-class women who became addicted to medicinal opiates like laudanum or morphine sulfate. Given Victorian-era sexual attitudes towards women and the belief in the “cult of domesticity,” it is evident why society cared more about keeping men away from “the bottle.”[6]
Recognizing the problem of drug addiction in the early twentieth century, Progressive Era reformers began advocating for a solution. As American involvement in Asia, well known for its opium trade, increased after the Spanish-American War, American diplomats attempted to reduce or end the international spread of opium. After America participated in the 1909 international opium commission in Shanghai, Congress felt pressure to pass legislation regulating the drug. It passed a weak law banning the importation of opium for “other than medicinal purposes,” which ultimately did little to curb narcotics addiction in practice. Reformers, particularly Dr. Hamilton Wright, the American delegate to the Shanghai Commission and the 1911-1912 Hague Opium Conference, pressured Congress to act on the opiate crisis. Many of the calls to regulate opiates were based on class- and race-based stereotypes. Wright claimed, without much factual basis, that 45.48% of all American criminals, 25% of the Chinese-American population, and 21.6% of all prostitutes were opiate addicts, compared to about 0.18% of the “general adult population.”[7] Despite the unfounded nature of Wright’s claims, the media, legislators, and large portions of American society embraced these statistics and used them to create domestic pressure for laws regulating opiate substances.
The result was the Harrison Narcotic Act. This law required anyone who sold or distributed narcotics (importers, manufacturers, wholesale and retail druggists, and doctors) to register with the government and pay a small tax (the tax was intended to justify the constitutionality of the bill). These registered groups had to keep detailed records of whom they sold or prescribed narcotics to, records which the government was allowed to audit. Anyone found with narcotics who was unregistered or lacked a legitimate prescription from a registered doctor, dentist, or veterinary surgeon was presumed guilty of violating the law and, if convicted, was liable for fines of up to $2,000 and up to five years in prison. Significantly, the Harrison Act defined “narcotics” as both opium- and coca-based drugs (despite cocaine being a stimulant rather than a depressant like opioids), effectively turning “narcotic” into a term for any addictive substance. For example, by the 1930s, marijuana was described as a narcotic.[8]
The Harrison Act had one significant flaw: it failed to address whether an addict could be indefinitely prescribed narcotics. Woodrow Wilson’s Treasury Department, responsible for enforcing the law (which was legally a tax act), took a strict approach and attempted to prosecute addicts, doctors, and pharmacists for “conspiracy to violate” the Harrison Act. The Supreme Court ultimately ruled in March 1919 that the Harrison Act was constitutional and that a doctor was not legally permitted to write prescriptions for an addict “to keep him comfortable by maintaining his customary use.” Despite this ruling, however, some doctors continued to write prescriptions for addicts, and a small “gray market,” as historian David Courtwright called it, remained as an alternative to the adulterated heroin black market.[9]
By the end of the 1920s, the Bureau of Narcotics was led by Harry Jacob Anslinger. When the Bureau of Narcotics became its own organization in 1930 (mostly to ensure it was distinct from its former parent organization, the Prohibition Unit, which was consumed with controversy over alcohol prohibition), Anslinger became the bureau’s first commissioner, a position he held until 1962. As commissioner, Anslinger attempted to regulate the international narcotics business in addition to his strict domestic enforcement. On the diplomatic front, he was involved with the League of Nations’ attempts to limit the manufacture of narcotics, although these attempts, like most of the League of Nations’ actions, failed. Like many of his contemporaries, Anslinger viewed the world in black and white, and often through a racist lens. “Civilized” countries were good and attempted to stop the drug trade, while bad, “uncivilized” countries like Imperial Japan and, later, Communist China and Castro’s Cuba used drugs as a method of conquest. Of course, it just so happened that the “uncivilized” countries were hostile to the United States, showing the larger political and diplomatic context of American drug regulations.[10]
America’s attempts to limit the spread of narcotics past the 1930s have their own extensive and fascinating history. That history, however, diverges from the story this article aims to tell, which focuses on the regulatory role of the FDA in ensuring the safety of medicines prescribed by doctors or purchased over the counter at pharmacies and drugstores. To learn more about America’s attempts to regulate narcotics, an effort that remains important in today’s political landscape, David Courtwright’s article, “A Century of American Narcotic Policy,” is an excellent reference.
In addition to the harder line on narcotics in the 1930s, a number of changes were made to food and drug regulations, beginning with a change in the name of the agency that enforced them. In 1930, under an agricultural appropriations act, the Food, Drug, and Insecticide Administration was renamed the Food and Drug Administration (FDA). In 1933, the FDA recommended a complete revision of the 1906 Food and Drugs Act, launching a political battle that raged through Congress for five years. To raise public awareness of the 1906 law’s shortcomings, the FDA created a traveling exhibit, dubbed the “American Chamber of Horrors” by a reporter, displaying the shocking and often gruesome effects of products that the agency was legally unable to regulate. For example, “Lash Lure,” an eyebrow and lash dye that the FDA could not regulate as a cosmetic, was displayed because it contained a poison that caused ulceration of the corneas and degradation of the eyeballs, leading to blindness and even death. Especially in the Great Depression era, when poor Americans often purchased cheaper (and thus potentially lower-quality) foods, drugs, and cosmetics, these sorts of unregulated substances posed a great danger. The Chamber of Horrors was featured everywhere from the Chicago World’s Fair to Capitol Hill to state fairgrounds, garnering popular momentum for the revised legislation.[11]
Despite the momentum sparked by the Chamber of Horrors, it took a 1937 tragedy to finally push Congress into action. Between September and October 1937, Elixir Sulfanilamide killed 107 people. Sulfanilamide, which in early 1937 was being used to treat streptococcal infections as a pill or tablet, was then sold in a new liquid form. While this drug was tested for flavor, appearance, and fragrance, it had not been tested for safety, since meeting safety standards was not a legal requirement under the 1906 law. Because no safety tests were conducted, the chemist who formulated the drug overlooked one fatal detail: the substance in which the sulfanilamide was dissolved was diethylene glycol, a chemical normally used as antifreeze. Needless to say, it was a deadly poison. The first shipments were sent out in early September, and by October 11, the American Medical Association had received reports from doctors in Tulsa about an initial wave of deaths. Upon inspection, the AMA recognized the presence of diethylene glycol and issued warnings across both newspaper and radio of the medicine’s toxicity. By mid-October, the FDA had sent out its entire field staff of 239 inspectors and chemists to recover every vial of the drug produced, although this process was in itself extraordinarily difficult.[12]
Eventually, the government was able to recover 234 gallons and one pint of the 240 gallons manufactured; the remainder had been consumed. Victims of the poison, many of whom were children treated for sore throats, fell ill for anywhere from one to three weeks, suffering intense pain, kidney failure, vomiting, and convulsions before succumbing to the antifreeze. At the time there was no antidote or treatment for diethylene glycol poisoning. In addition to those who died from taking Elixir Sulfanilamide, the chemist who formulated the deadly medicine, Harold Watkins, committed suicide, unable to cope with the guilt of creating the lethal drug.[13]
While the government seized the medicine, it did so on the grounds of misbranding rather than the drug’s deadly effects. If the product had been called a solution instead of an elixir (a term which implied the presence of alcohol), the FDA would have had no legal authority to recover the medicine. As such, FDA Commissioner Walter Campbell, who had spent years lobbying Congress for increased regulatory authority, finally managed to convince Congress and President Roosevelt to enact a new law, the Food, Drug, and Cosmetic Act of 1938.[14]
The Food, Drug, and Cosmetic (FD&C) Act serves as the basis for modern FDA regulations. Among many other things, it extended regulatory control to cosmetics. No longer would products like Lash Lure be allowed on the market; instead, color additives had to be listed and approved. Foods now had to meet standards of identity, quality, and fill of container, and were considered illegal if they did not. While the initial standards extended only to canned tomatoes, tomato puree, and tomato paste, jams and jellies followed soon after and, by 1957 (the year before the implementation of the next significant food regulation), standards covered many household foods like chocolate, flour, cereals, milk, cheese, juices, and eggs. To ensure that the Elixir Sulfanilamide disaster never happened again, manufacturers had to prove that a drug was safe and submit an application to the FDA before marketing it, further increasing FDA regulatory oversight.[15]
In 1951, Congress strengthened this regulation by requiring certain drugs, considered more dangerous, to be administered only under the direction of a “qualified expert.”[16] This created the category of prescription-only non-narcotic drugs, which, of course, remains today. Unlike over-the-counter medicines such as Tylenol or Claritin, which are not considered “dangerous,” drugs such as antibiotics must be prescribed by a doctor.