Christopher J. Castañeda
The modern oil industry began in 1859 with Edwin Drake’s discovery of oil at Titusville, Pennsylvania. Since then, this dynamic industry has experienced dramatic episodes of growth, aggressive competition for market share, various forms of corporate organization and cartel-like agreements, and governmental efforts at regulation and control, as well as monopoly, mergers, and consolidation. The history of the oil industry reflects its capital-intensive nature. Immense sums of money are spent on oil discovery, production, and refining projects. Marketing, transportation, and distribution systems likewise require enormous amounts of financing and logistical planning. Although oil is often produced in conjunction with, or in wells pressurized by, natural gas, the oil industry is distinct from the related natural gas industry. Since its origins in the mid-19th century, the oil industry has developed an industrial structure that emphasizes scale and scope to maximize profits. Profits can be huge, which attracts entrepreneurial efforts on individual, corporate, and national scales. By the late 20th and early 21st centuries, the oil industry had begun confronting questions about its long-term viability while also facing an increasingly influential environmental movement that seeks to reduce fossil fuel consumption and to prevent toxic waste and by-products from polluting human, animal, and natural habitats.
From the founding of the American republic through the 19th century, the nation’s environmental policy mostly centered on promoting American settlers’ conquest of the frontier. Early federal interventions, whether railroad and canal subsidies or land grant acts, led to rapid transformations of the natural environment that inspired a conservation movement by the end of the 19th century. Led by activists and policymakers, this movement sought to protect America’s resources, now jeopardized by expansive industrial infrastructure. During the Gilded Age, the federal government established the world’s first national parks, and in the Progressive Era, politicians such as President Theodore Roosevelt called for the federal government to play a central role in ensuring the efficient utilization of the nation’s ecological bounty. By the early 1900s, conservationists had established new government agencies, such as the U.S. Forest Service and the Bureau of Reclamation, to regulate the consumption of trees, water, and other valuable natural assets. Wise use was the watchword of the day, with environmental managers in Washington’s bureaucracy focused mainly on protecting the economic value latent in America’s ecosystems. However, other groups, such as the Wilderness Society, proved successful at redirecting policy prescriptions toward preserving beautiful and wild spaces, not just conserving resources central to capitalist enterprise. In the 1960s and 1970s, suburban and urban environmental activists drew federal regulators’ attention to the contaminated soil and water under their feet. The era of ecology had arrived, and the federal government now had broad powers through the Environmental Protection Agency (EPA) to manage ecosystems that stretched across the continent. But from the 1980s to the 2010s, the federal government’s authority to regulate the environment waxed and waned as economic crises, often exacerbated by oil shortages, brought environmental agencies under fire.
The Rooseveltian logic of the Progressive Era, which said that America’s economic growth depended on federal oversight of the environment, came under assault from neoliberal disciples of Ronald Reagan, who argued that environmental regulations were in fact the root cause of economic stagnation in America, not a powerful prescription against it. What the country needed, according to the reformers of the New Right, was unregulated expansion into new frontiers. By the 2010s, the contours of these new frontiers were clear: deep-water oil drilling, Bakken shale exploration, and tar-sand excavation in Alberta, Canada. In many ways, the frontier conquest doctrine of colonial Americans found new life in deregulatory U.S. environmental policy pitched by conservatives in the wake of the Reagan Revolution. Never wholly dominant, this ethos carried on into the era of Donald Trump’s presidency.
Richard N. L. Andrews
Between 1964 and 2017, the United States adopted the concept of environmental policy as a new focus for a broad range of previously disparate policy issues affecting human interactions with the natural environment. These policies ranged from environmental health, pollution, and toxic exposure to the management of ecosystems, resources, and public lands; the environmental aspects of urbanization, agricultural practices, and energy use; and the negotiation of international agreements to address global environmental problems. In doing so, it nationalized many responsibilities that had previously been considered primarily state or local matters. It changed the United States’ approach to federalism by authorizing new powers for the federal government to set national minimum environmental standards and regulatory frameworks, with the states mandated to participate in their implementation and compliance. Finally, it explicitly formalized administrative procedures for federal environmental decision-making, with stricter requirements for scientific and economic justification rather than merely administrative discretion. In addition, it greatly increased public access to information and opportunities for input, as well as for judicial review, thus granting citizen advocates for environmental protection and appreciative uses legitimacy equal to that of commodity producers in voicing their preferences for the use of public environmental resources.
These policies initially reflected widespread public demand and broad bipartisan support. Over several decades, however, they became flashpoints, first, between business interests and environmental advocacy groups and, subsequently, between increasingly ideological and partisan agendas concerning the role of the federal government. Beginning in the 1980s, the long-standing Progressive ideal of the “public interest” was increasingly supplanted by a narrative of “government overreach,” and the 1990s witnessed campaigns to delegitimize the underlying evidence justifying environmental policies by labeling it “junk science” or a “hoax.”
From the 1980s forward, the stated priorities of environmental policy vacillated repeatedly between presidential administrations and Congresses supporting continuation and expansion of environmental protection and preservation policies versus those seeking to weaken or even reverse protections in favor of private-property rights and more damaging uses of resources. Yet despite these apparent shifts, the basic environmental laws and policies enacted during the 1970s remained largely in place: political gridlock, in effect, maintained the status quo, with the addition of a very few innovations such as “cap and trade” policies. One reason was that environmental policies retained considerable latent public support: in electoral campaigns, they were often overshadowed by economic and other issues, but they still aroused widespread support in their defense when threatened. Another reason was that decisions by the courts also continued to reaffirm many existing policies and to reject attempts to dismantle them.
With the election of Donald Trump in 2016, along with conservative majorities in both houses of Congress, US environmental policy came under the most hostile and wide-ranging attack since its origins. More than almost any other issue, the incoming president targeted environmental policy for rhetorical attacks and budget cuts, and sought to eradicate the executive policies of his predecessor, weaken or rescind protective regulations, and undermine the regulatory and even the scientific capacity of the federal environmental agencies. As of the early 21st century, it remains unclear how much of this agenda will actually be accomplished, or whether, as in past attempts, much of it will ultimately be blocked by Congress, the courts, public backlash, and business and state government interests seeking stable policy expectations rather than disruptive deregulation.
The development of nuclear technology had a profound influence on the global environment following the Second World War, with ramifications for scientific research, the modern environmental movement, and conceptualizations of pollution more broadly. Government sponsorship of studies on nuclear fallout and waste dramatically reconfigured the field of ecology, leading to the widespread adoption of the ecosystem concept and new understandings of food webs as well as biogeochemical cycles. These scientific endeavors of the atomic age came to play a key role in the formation of environmental research to address a variety of pollution problems in industrialized countries. Concern about invisible radiation served as a foundation for new ways of thinking about chemical risks for activists like Rachel Carson and Barry Commoner as well as many scientists, government officials, and the broader public. Their reservations were not unwarranted, as nuclear weapons and waste resulted in radioactive contamination of the environment around nuclear-testing sites and especially fuel-production facilities. Scholars date the start of the “Anthropocene” period, during which human activity began to have substantial effects on the environment, variously from the beginning of human farming roughly 8,000 years ago to the emergence of industrialism in the 19th century. But all agree that the advent of nuclear weapons and power has dramatically changed the potential for environmental alterations. Our ongoing attempts to harness the benefits of the atomic age while lessening its negative impacts will need to confront the substantial environmental and public-health issues that have plagued nuclear technology since its inception.
Timothy James LeCain
Technology and environmental history are both relatively young disciplines among Americanists, and during their early years they developed as distinctly different and even antithetical fields, at least in topical terms. Historians of technology initially focused on human-made and presumably “unnatural” technologies, whereas environmental historians focused on nonhuman and presumably “natural” environments. However, in more recent decades, both disciplines have moved beyond this oppositional framing. Historians of technology increasingly came to view anthropogenic artifacts such as cities, domesticated animals, and machines as extensions of the natural world rather than its antithesis. Even the British and American Industrial Revolutions constituted not a distancing of humans from nature, as some scholars have suggested, but rather a deepening entanglement with the material environment. At the same time, many environmental historians were moving beyond the field’s initial emphasis on the ideal of an American and often Western “wilderness” to embrace a concept of the environment as including humans and productive work. Nonetheless, many environmental historians continued to emphasize the independent agency of the nonhuman environment of organisms and things. This insistence that not everything could be reduced to human culture remained the field’s most distinctive feature.
Since the turn of the millennium, the two fields have increasingly come together in a variety of synthetic approaches, including Actor Network Theory, envirotechnical analysis, and neomaterialist theory. As the influence of the cultural turn has waned, the environmental historians’ emphasis on the independent agency of the nonhuman has come to the fore, gaining wider influence as it is applied to the dynamic “nature” or “wildness” that some scholars argue exists within both the technological and the natural environment. The foundational distinctions between the history of technology and environmental history may now be giving way to more materially rooted attempts to understand how a dynamic hybrid environment helps to create human history in all of its dimensions—cultural, social, and biological.
Adam M. Sowards
For more than a century after the republic’s founding in the 1780s, American law reflected the ideal that the commons—the public domain—should be turned into private property. As Americans became concerned about resource scarcity, waste, and monopolies at the end of the 19th century, reform-minded bureaucrats and scientists convinced Congress to maintain in perpetuity some of the nation’s land as public. This shift offered a measure of protection and an alternative to private property regimes. The federal agencies that primarily manage these lands today—the U.S. Forest Service (USFS), National Park Service (NPS), U.S. Fish and Wildlife Service (USFWS), and Bureau of Land Management (BLM)—have worked since their origins in the early decades of the 20th century to fulfill their diverse, competing, evolving missions. Meanwhile, the public and Congress have continually demanded new and different goals as the land itself has functioned and responded in interdependent ways. In the mid-20th century, the agencies intensified their management, hoping to satisfy the rising—and often conflicting—demands American citizens placed on the public lands. This intensification often degraded public lands’ ecosystems and heightened political conflict, resulting in a series of new laws in the 1960s and 1970s. Those laws strengthened the role of science and the public in influencing agency practices while providing more opportunities for litigation. Predictably, since the late 1970s, these developments have polarized public lands’ politics. The economies, but also the identities, of many Americans remain entwined with the public lands, making political standoffs—over endangered species, oil production, privatizing land, and more—common and increasingly intractable.
Because the public lands are national in scope but used by local people for all manner of economic and recreational activities, they have been and remain microcosms of the federal democratic system and all its conflicted nature.
Humans have put American forests to a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. Post–Civil War forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for ever-larger harvests. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act was the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort.
Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets, but more often Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy.
Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist and Yosemite enthusiast and himself a proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change—and subject to struggles over what that “idea” should be.
In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.