Spanning countries across the globe, the antinuclear movement was the combined effort of millions of people to challenge the superpowers’ reliance on nuclear weapons during the Cold War. Encompassing an array of tactics, from radical dissent to public protest to opposition within the government, this movement succeeded in constraining the arms race and helping to make the use of nuclear weapons politically unacceptable. Antinuclear activists were critical to the establishment of arms control treaties, although they failed to achieve the abolition of nuclear weapons, as anticommunists, national security officials, and proponents of nuclear deterrence within the United States and Soviet Union actively opposed the movement. Opposition to nuclear weapons evolved in tandem with the Cold War and the arms race, leading to a rapid decline in antinuclear activism after the Cold War ended.
Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films.
Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.
Gabriella M. Petrick
American food in the twentieth and twenty-first centuries is characterized by abundance. Unlike the hardscrabble existence of many earlier Americans, the “Golden Age of Agriculture” brought the bounty produced in fields across the United States to both consumers and producers. While the “Golden Age” technically ended as World War I began, larger quantities of relatively inexpensive food became the norm for most Americans as more fresh foods, rather than staple crops, made their way to urban centers and rising real wages made it easier to purchase these comestibles.
The application of science and technology to food production, from the field to the kitchen cabinet or, even more crucially, the refrigerator by the mid-1930s, reflects the changing demographics and affluence of American society as much as it does the inventiveness of scientists and entrepreneurs. Perhaps the single most important symbol of overabundance in the United States is the postwar Green Revolution. The vast increase in agricultural production based on improved agronomics provoked both praise and criticism, as exemplified by Time magazine’s critique of Rachel Carson’s Silent Spring in September 1962 or, more recently, the politics of genetically modified foods.
Reflecting what occurred at the turn of the twentieth century, food production, politics, and policy at the turn of the twenty-first century have become a proxy for larger ideological agendas and the fractured nature of class in the United States. Battles over the following issues speak to which Americans have access to affordable, nutritious food: organic versus conventional farming, antibiotic use in meat production, dissemination of food stamps, contraction of farm subsidies, the rapid growth of “dollar stores,” alternative diets (organic, vegetarian, vegan, paleo, etc.), and, perhaps most ubiquitous of all, the “obesity epidemic.” These arguments carry moral and ethical values, as each side deems some foods and diets virtuous and others corrupting. While Americans have long held a variety of food ideologies that meld health, politics, and morality, exemplified by Sylvester Graham and John Harvey Kellogg in the nineteenth and early twentieth centuries, among others, newer constructions of these ideologies reflect concerns over the environment, rural Americans, climate change, self-determination, and the role of government in individual lives. In other words, food can be used as a lens to understand larger issues in American society while at the same time allowing historians to explore the intimate details of everyday life.
In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across the modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock and roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.
Laura Isabel Serna
Latinos have constituted part of the United States’ cinematic imagination since the emergence of motion pictures in the late 19th century. Though shifting in their specific contours, representations of Latinos have remained consistently stereotypical; Latinos have primarily appeared on screen as bandits, criminals, nameless maids, or sultry señoritas. These representations have been shaped by broader political and social issues and have influenced the public perception of Latinos in the United States. However, the history of Latinos and film should not be limited to the topic of representation. Latinos have participated in the film industry as actors, creative personnel (including directors and cinematographers), and have responded to representations on screen as members of audiences with a shared sense of identity, whether as mexicanos de afuera in the early 20th century, Hispanics in the 1980s and 1990s, or Latinos in the 21st century. Both participation in production and reception have been shaped by the ideas about race that characterize the film industry and its products. Hollywood’s labor hierarchy has been highly stratified according to race, and Hollywood films that represent Latinos in a stereotypical fashion have been protested by Latino audiences. While some Latino/a filmmakers have opted to work outside the confines of the commercial film industry, others have sought to gain entry and reform the industry from the inside. Throughout the course of this long history, Latino representation on screen and on set has been shaped by debates over international relations, immigration, citizenship, and the continuous circulation of people and films between the United States and Latin America.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. 
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement emerging from concerns about atomic and nuclear power and worldwide nationalist politics that everywhere seemed to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations.
Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions about ideology, priorities, and tactics.
Peace activist histories in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads in peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.
Jimmy Carter’s “Crisis of Confidence Speech” of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried its polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades later, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of the Ronald Reagan era that followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.
Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working-class, and southern America into the pop preference of suburban, white, young, and northern America as well. By the late 1960s, those fans and their British counterparts made their own version, more politicized and experimental and called simply rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, hip-hop, with its youth appeal and rebelliousness, had assumed the mantle of rock and roll style; post-baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial arguments focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted out, with mixed consequences for American music and cultural history.
The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.