Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and in making the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and the Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
Cindy R. Lobel
Over the course of the 19th century, American cities developed from small seaports and trading posts into large metropolises. Not surprisingly, foodways and other areas of daily life changed accordingly. In 1800, the dietary habits of urban Americans were similar to those of the colonial period. Food provisioning was largely local: farmers, hunters, fishermen, and dairymen from a few miles away brought food by rowboat, ferry, and horse cart to centralized public markets within established cities. Dietary options were seasonal as well as regional. Few public dining options existed outside of taverns, which offered lodging as well as food. Most Americans, even in urban areas, ate their meals at home, in dwellings that in many cases were attached to their workshops, countinghouses, and offices.
These patterns changed significantly over the course of the 19th century, thanks largely to demographic changes and technological developments. By the turn of the 20th century, urban Americans relied on a food-supply system that was highly centralized and in the throes of industrialization. Cities developed complex restaurant sectors, and majority-immigrant populations dramatically shaped and reshaped cosmopolitan food cultures. Furthermore, with growing populations, lax regulation, and corrupt political practices in many cities, issues arose periodically concerning the safety of the food supply. In sum, the roots of today’s urban food systems were laid down over the course of the 19th century.
Gary R. Edgerton
Television is an ever-evolving and multidimensional medium, at once a technology, an industry, an art form, and an institutional force. In the United States, it emerged as an idea whose time had come at the end of World War II. TV eventually grew and matured into the most influential social and cultural catalyst shaping and reflecting American civilization during the second half of the 20th century. Television revolutionized the way citizens and consumers in the United States learned about and communicated with the world; it also recast and re-envisioned the way they experienced themselves and others. More than just escapist entertainment, TV reveals the dynamism and diversity of everyday life in the United States and the evolving nature of the nation’s core values. Television is, moreover, in a continual state of change and renewal. Its history has unfolded across a prehistory (before 1948), a network era (1948–1975), a cable era (1976–1994), and the current digital era (1995–present). Today there are more than 650 networks in the U.S. marketplace, and the typical domestic household receives 189 channels and watches more than eight hours of TV a day on average. TV in the 21st century also travels anywhere at any time, given its synergistic relationship with the Internet and a wide array of digital devices. It is now increasingly personalized, interactive, mobile, and on demand. Television is presently a convergent technology, a global industry, a viable art form, a public catalyst, and a complex and dynamic reflection of American society and culture.
Christopher P. Loss
Until World War II, American universities were widely regarded as good but not great centers of research and learning. This changed completely under the press of wartime, when the federal government pumped billions into military research, anchored by the development of the atomic bomb and radar, and into the education of returning veterans under the GI Bill of 1944. The abandonment of decentralized federal–academic relations marked the single most important development in the history of the modern American university. While it is true that the government had helped to coordinate and fund the university system prior to the war—most notably the country’s network of public land-grant colleges and universities—government involvement after the war became much more hands-on, eventually leading to direct financial support for, and legislative interventions on behalf of, core institutional activities at not only the public land grants but the nation’s mix of private institutions as well. However, the reliance on public subsidies and on legislative and judicial interventions of one kind or another ended up being a double-edged sword: state action made possible the expansion in research and in student access that became the hallmarks of the post-1945 American university, but it also created a rising tide of expectations for continued support that has proven challenging in fiscally stringent times and in the face of ongoing political fights over the government’s proper role in supporting the sector.