Justus D. Doenecke
For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. What had been a fundamental foreign policy tenet became the position of a faction, a group of people branded as “isolationists.” Its high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements.
Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy.
If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements, including those in Vietnam, Iraq, and Afghanistan, but few envision returning to an era when the United States avoids all commitments.
Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences of Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities, and the ways in which Japanese Americans rebuilt their lives and institutions resonate with the resettlement experience elsewhere.
Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First-generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions out of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell off their property at prices far below market value. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin.
Upon their arrival in the Los Angeles area, Japanese Americans continued to face deep-seated prejudice, all the more accentuated by an overall dearth of housing. Without a place to live, they sought refuge in communal hostels set up in prewar institutions that survived the war, such as Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach. Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in reintegrating into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those practices were ruled unconstitutional, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.
In the post-1945 period, jazz moved rapidly from one major avant-garde revolution (the birth of bebop) to another (the emergence of free jazz) while developing a profusion of subgenres (hard bop, progressive, modal, Third Stream, soul jazz) and a new idiomatic persona (cool or hip) that originated as a form of African American resistance but soon became a signature of transgression and authenticity across the modern arts and culture. Jazz’s long-standing affiliation with African American urban life and culture intensified through its central role in the Black Arts Movement of the 1960s. By the 1970s, jazz, now fully eclipsed in popular culture by rock ’n’ roll, turned to electric instruments and fractured into a multitude of hyphenated styles (jazz-funk, jazz-rock, fusion, Latin jazz). The move away from acoustic performance and traditional codes of blues and swing musicianship generated a neoclassical reaction in the 1980s that coincided with a mission to establish an orthodox jazz canon and honor the music’s history in elite cultural institutions. Post-1980s jazz has been characterized by tension between tradition and innovation, earnest preservation and intrepid exploration, Americanism and internationalism.
James I. Matray
On June 25, 1950, North Korea’s invasion of South Korea ignited a conventional war that had origins dating from at least the end of World War II. In April 1945, President Harry S. Truman abandoned a trusteeship plan for postwar Korea in favor of seeking unilateral U.S. occupation of the peninsula after an atomic attack forced Japan’s prompt surrender. Soviet entry into the Pacific war led to a last-minute agreement dividing Korea at the 38th parallel into zones of occupation. Two Koreas emerged after Soviet-American negotiations failed to produce a plan to end the division. Kim Il Sung in the north and Syngman Rhee in the south were both determined to reunite Korea, instigating major military clashes at the parallel in the summer of 1949. Moscow and Washington opposed their clients’ invasion plans until April 1950, when Kim persuaded Soviet Premier Joseph Stalin that with mass support in South Korea, he would achieve a quick victory.
At first, Truman hoped that South Korea could defend itself with more military equipment and U.S. air support. Commitment of U.S. ground forces came after General Douglas MacArthur, U.S. occupation commander in Japan, visited the front and advised that the South Koreans could not halt the advance. Overconfident U.S. soldiers sustained defeat as well, retreating to the Pusan Perimeter, a rectangular area in the southeast corner of the peninsula. On September 15, MacArthur staged a risky amphibious landing at Inchon behind enemy lines that sent Communist forces fleeing back into North Korea. The People’s Republic of China viewed the U.S. offensive for reunification that followed as a threat to its security and prestige. In late November, Chinese “volunteers” attacked en masse. After a chaotic retreat, U.S. forces counterattacked in February 1951 and moved the line of battle just north of the parallel. After two Chinese offensives failed, negotiations to end the war began in July 1951 but stalemated in May 1952 over the issue of repatriation of prisoners of war. Peace came because of Stalin’s death in March 1953, rather than President Dwight D. Eisenhower’s veiled threat to stage nuclear strikes against China.
Scholars have disagreed about many issues surrounding the Korean War, but the most important debate continues to center on whether the conflict had international or domestic origins. Initially, historians relied mainly on U.S. government publications to write accounts that ignored events prior to North Korea’s attack, endorsing an orthodox interpretation assigning blame to the Soviet Union and applauding the U.S. response. Declassification of U.S. government documents and presidential papers during the 1970s led to the publication of studies assigning considerable responsibility to the United States for helping to create the conditions for war in Korea before June 1950. Moreover, left revisionist writers labeled the conflict a classic civil war. Release of Chinese and Soviet sources after 1989 established that Stalin and Chinese leader Mao Zedong approved the North Korean invasion, prompting right revisionist scholars to reassert key orthodox arguments. This essay describes how and why recent access to Communist documents has not settled the disagreements among historians about the causes, course, and consequences of the Korean War.
Laura Isabel Serna
Latinos have constituted part of the United States’ cinematic imagination since the emergence of motion pictures in the late 19th century. Though shifting in their specific contours, representations of Latinos have remained consistently stereotypical; Latinos have primarily appeared on screen as bandits, criminals, nameless maids, or sultry señoritas. These representations have been shaped by broader political and social issues and have influenced the public perception of Latinos in the United States. However, the history of Latinos and film should not be limited to the topic of representation. Latinos have participated in the film industry as actors and as creative personnel (including directors and cinematographers), and they have responded to representations on screen as members of audiences with a shared sense of identity, whether as mexicanos de afuera in the early 20th century, Hispanics in the 1980s and 1990s, or Latinos in the 21st century. Both participation in production and reception have been shaped by the ideas about race that characterize the film industry and its products. Hollywood’s labor hierarchy has been highly stratified according to race, and Hollywood films that represent Latinos in a stereotypical fashion have been protested by Latino audiences. While some Latino/a filmmakers have opted to work outside the confines of the commercial film industry, others have sought to gain entry and reform the industry from the inside. Throughout the course of this long history, Latino representation on screen and on set has been shaped by debates over international relations, immigration, citizenship, and the continuous circulation of people and films between the United States and Latin America.
In 1944 President Franklin D. Roosevelt’s State of the Union address set out what he termed an “economic Bill of Rights” that would act as a manifesto of liberal policies after World War II. Politically, however, the United States was a different place from the country that had faced the ravages of the Great Depression of the 1930s and ushered in Roosevelt’s New Deal to transform the relationship between government and the people. Key legacies of the New Deal, such as Social Security, remained and were gradually expanded, but opponents of governmental regulation of the economy launched a bitter campaign after the war to roll back labor union rights and dismantle the New Deal state.
Liberal heirs to FDR in the 1950s, represented by figures like two-time presidential candidate Adlai Stevenson, struggled to rework liberalism to tackle the realities of a more prosperous age. The long shadow of the U.S. Cold War with the Soviet Union also set up new challenges for liberal politicians trying to juggle domestic and international priorities in an era of superpower rivalry and American global dominance. The election of John F. Kennedy as president in November 1960 seemed to represent a narrow victory for Cold War liberalism, and it coincided with the intensification of the struggle for racial equality in the United States that would do much to shape liberal politics in the 1960s. After Kennedy’s assassination in 1963, President Lyndon Johnson launched his “Great Society,” a commitment to eradicate poverty and to provide greater economic security for Americans through policies such as Medicare. But his administration’s deepening involvement in the Vietnam War and its mixed record on alleviating poverty did much to taint the positive connotations of “liberalism” that had dominated politics during the New Deal era.
Landon R. Y. Storrs
The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.
The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. 
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States one of the highest home ownership rates in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
The national parks of the United States have been one of the country’s most popular federal initiatives, and popular not only within the nation but across the globe. The first park was Yellowstone, established in 1872, and since then almost sixty national parks have been added, along with hundreds of monuments, protected rivers and seashores, and important historical sites as well as natural preserves. In 1916 the parks were put under the National Park Service, which has managed them primarily as scenic treasures for growing numbers of tourists. Ecologically minded scientists, however, have challenged that stewardship and called for restoration of parks to their natural conditions, defined as their ecological integrity before white Europeans intervened. The most influential voice in the history of park philosophy remains John Muir, the California naturalist, Yosemite enthusiast, and proto-ecologist, who saw the parks as sacred places for a modern nation, where reverence for nature and respect for science might coexist and where tourists could be educated in environmental values. As other nations have created their own park systems, similar debates have occurred. While parks may seem like a great modern idea, this idea has always been embedded in cultural and social change, and subject to struggles over what that “idea” should be.
Luke A. Nichter
Assessments of President Richard Nixon’s foreign policy continue to evolve as scholars tap new possibilities for research. Because national security records are declassified by the National Archives and made available to researchers and the public only after long delays, the Nixon administration’s engagement with the world has begun to be well documented only in recent decades. As more records are released by the National Archives (including potentially 700 hours of Nixon’s secret White House tapes that remain closed), scholarly understanding of the Nixon presidency is likely to continue changing. Thus far, historians have pointed to four major legacies of Nixon’s foreign policy: tendencies to use American muscle abroad on a more realistic scale, to reorient the focus of American foreign policy to the Pacific, to reduce the chance that the Cold War could turn hot, and, inadvertently, to contribute to the later rise of Ronald Reagan and the Republican right wing, many of whom had been part of Nixon’s “silent majority.” While earlier works focused primarily on subjects like Vietnam, China, and the Soviet Union, the historiography today is much more diverse; there is now at least one work covering most major aspects of Nixon’s foreign policy.