In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock ’n’ roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and from traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.
James I. Matray
On June 25, 1950, North Korea’s invasion of South Korea ignited a conventional war with origins dating from at least the end of World War II. In April 1945, President Harry S. Truman abandoned a trusteeship plan for postwar Korea in favor of seeking unilateral U.S. occupation of the peninsula after an atomic attack forced Japan’s prompt surrender. Soviet entry into the Pacific war led to a last-minute agreement dividing Korea at the 38th parallel into zones of occupation. Two Koreas emerged after Soviet-American negotiations failed to produce a plan to end the division. Kim Il Sung in the north and Syngman Rhee in the south both were determined to reunite Korea, instigating major military clashes at the parallel in the summer of 1949. Moscow and Washington opposed their clients’ invasion plans until April 1950, when Kim persuaded Soviet Premier Joseph Stalin that with mass support in South Korea, he would achieve a quick victory.
At first, Truman hoped that South Korea could defend itself with more military equipment and U.S. air support. Commitment of U.S. ground forces came after General Douglas MacArthur, U.S. occupation commander in Japan, visited the front and advised that the South Koreans could not halt the advance. Overconfident U.S. soldiers sustained defeats as well, retreating to the Pusan Perimeter, a rectangular area in the southeast corner of the peninsula. On September 15, MacArthur staged a risky amphibious landing at Inchon behind enemy lines that sent Communist forces fleeing back into North Korea. The People’s Republic of China viewed the U.S. offensive for reunification that followed as a threat to its security and prestige. In late November, Chinese “volunteers” attacked en masse. After a chaotic retreat, U.S. forces counterattacked in February 1951 and moved the line of battle just north of the parallel. After two Chinese offensives failed, negotiations to end the war began in July 1951 but stalemated in May 1952 over the issue of repatriation of prisoners of war. Peace came because of Stalin’s death in March 1953, rather than President Dwight D. Eisenhower’s veiled threat to stage nuclear strikes against China.
Scholars have disagreed about many issues surrounding the Korean War, but the most important debate continues to center on whether the conflict had international or domestic origins. Initially, historians relied mainly on U.S. government publications to write accounts that ignored events prior to North Korea’s attack, endorsing an orthodox interpretation that assigned blame to the Soviet Union and applauded the U.S. response. Declassification of U.S. government documents and presidential papers during the 1970s led to the publication of studies assigning considerable responsibility to the United States for helping to create the conditions for war in Korea before June 1950. Moreover, left revisionist writers labeled the conflict a classic civil war. Release of Chinese and Soviet sources after 1989 established that Stalin and Chinese leader Mao Zedong approved the North Korean invasion, prompting right revisionist scholars to reassert key orthodox arguments. This essay describes how and why recent access to Communist documents has not settled the disagreements among historians about the causes, course, and consequences of the Korean War.
Laura Isabel Serna
Latinos have constituted part of the United States’ cinematic imagination since the emergence of motion pictures in the late 19th century. Though shifting in their specific contours, representations of Latinos have remained consistently stereotypical; Latinos have primarily appeared on screen as bandits, criminals, nameless maids, or sultry señoritas. These representations have been shaped by broader political and social issues and have influenced the public perception of Latinos in the United States. However, the history of Latinos and film should not be limited to the topic of representation. Latinos have participated in the film industry as actors and as creative personnel (including directors and cinematographers), and they have responded to representations on screen as members of audiences with a shared sense of identity, whether as mexicanos de afuera in the early 20th century, Hispanics in the 1980s and 1990s, or Latinos in the 21st century. Both participation in production and reception have been shaped by the ideas about race that characterize the film industry and its products. Hollywood’s labor hierarchy has been highly stratified according to race, and Hollywood films that represent Latinos in stereotypical fashion have repeatedly drawn protests from Latino audiences. While some Latino/a filmmakers have opted to work outside the confines of the commercial film industry, others have sought to gain entry and reform the industry from the inside. Throughout this long history, Latino representation on screen and on set has been shaped by debates over international relations, immigration, citizenship, and the continuous circulation of people and films between the United States and Latin America.
In 1944, President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place from the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state.
Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and his election coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After Kennedy’s assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist, Yosemite enthusiast, and proto-ecologist who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, that idea has always been embedded in cultural and social change and remains subject to struggles over what it should be.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Because of the long wait before national security records are declassified by the National Archives and made available to researchers and the public, the Nixon administration’s engagement with the world has only in recent decades started to become well documented. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing—many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse: there is now at least one work covering most major aspects of Nixon’s foreign policy.
Nuclear arms control has existed as long as the armaments themselves. American plans to limit or eliminate these weapons of mass destruction were put forward, even as the United States and nine other countries—the Soviet Union, the United Kingdom, France, China, Israel, South Africa, India, Pakistan, and North Korea—amassed stockpiles of explosives that harnessed the energies generated by the fission or fusion of atomic nuclei. Since 1945, the United States has sought to reduce its arsenal conjointly with the Soviet Union and (after 1991) Russia. Efforts have been made to inhibit new states from acquiring nuclear weapons, discourage their military use, and perhaps even allow for their eventual abolition.
Scholars disagree as to why the United States has engaged in nuclear arms control since World War II. The history of nuclear weapons encompasses intellectual theories and cultural attitudes as much as material or strategic developments. The overarching debate is one of structure versus agency: whether the weapons’ sheer power, or the attitudes of historical actors toward them, has driven arms control. Among those who stress agency, there are two further disagreements: (a) the influence of domestic culture, protest, and politics; and (b) whether nuclear arms control is an end in itself or merely a means to an end, namely the entrenchment of American power throughout the world.
The intensity of arms control efforts tends to rise and fall with the apparent likelihood of nuclear war. Faith in the country’s nuclear monopoly encouraged Harry Truman to sabotage early efforts at control, while Dwight Eisenhower’s faith in nuclear deterrence led to a similar destination. Mounting fears of a U.S.-Soviet thermonuclear exchange in the late 1950s stirred protest movements and diplomatic efforts in the direction of control. The spread of nuclear weapons to new states impelled presidential administrations from John F. Kennedy to Jimmy Carter to work against the expansion of nuclear arms, culminating in the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon proved the exception to these trends. Not only did he downplay proliferation, but his pursuit of the 1972 Strategic Arms Limitation Treaty (SALT I) was motivated by a cynical goal: improvement of America’s strategic position after the Vietnam War via détente with the Soviet Union. Rising fear of nuclear war under Ronald Reagan produced two more landmark U.S.-Soviet agreements: the 1987 Intermediate-Range Nuclear Forces Treaty (INF) and the 1991 Strategic Arms Reduction Treaty (START). Since the end of the Cold War, the attention of the United States has swung away from bilateral arms control treaties and nuclear disarmament toward the spread of nuclear weapons in the unipolar moment. The mounting prominence of regional conflicts, failed states, and non-state actors has stolen attention away from efforts to put the atomic genie back in the bottle.
Jessica M. Chapman
The origins of the Vietnam War can be traced to France’s colonization of Indochina in the late 1880s. The Viet Minh, led by Ho Chi Minh, emerged as the dominant anti-colonial movement by the end of World War II, though Viet Minh leaders encountered difficulties as they tried to consolidate their power on the eve of the First Indochina War against France. While that war was, initially, a war of decolonization, it became a central battleground of the Cold War by 1950. The lines of future conflict were drawn that year when the People’s Republic of China and the Soviet Union recognized and provided aid to the Democratic Republic of Vietnam in Hanoi, followed almost immediately by Washington’s recognition of the State of Vietnam in Saigon. From that point on, American involvement in Vietnam was most often explained in terms of the Domino Theory, articulated by President Dwight D. Eisenhower on the eve of the Geneva Conference of 1954. The Franco-Viet Minh ceasefire reached at Geneva divided Vietnam in two at the 17th parallel, with countrywide reunification elections slated for the summer of 1956. However, the United States and its client, Ngo Dinh Diem, refused to participate in talks preparatory to those elections, preferring instead to build South Vietnam as a non-communist bastion. While the Vietnamese communist party, known as the Vietnam Workers’ Party in Hanoi, initially hoped to reunify the country by peaceful means, it concluded by 1959 that violent revolution would be necessary to bring down the “American imperialists and their lackeys.” In 1960, the party formed the National Liberation Front for South Vietnam and, following Diem’s assassination in 1963, passed a resolution to wage all-out war in the south in an effort to claim victory before the United States committed combat troops.

After President John F. Kennedy took office in 1961, he responded to deteriorating conditions in South Vietnam by militarizing the American commitment, though he stopped short of introducing dedicated ground troops. After Diem and Kennedy were assassinated in quick succession in November 1963, Lyndon Baines Johnson took office determined to avoid defeat in Vietnam but hoping to prevent the issue from interfering with his domestic political agenda. As the situation in South Vietnam became more dire, LBJ found himself unable to maintain the middle-of-the-road approach that Kennedy had pursued. Forced to choose between escalation and withdrawal, he chose the former in March 1965, launching a sustained campaign of aerial bombardment coupled with the introduction of the first officially designated U.S. combat forces to Vietnam.