James Graham Wilson
The Cold War may have ended on the evening of November 9, 1989, when East German border guards opened up checkpoints and allowed their fellow citizens to stream into West Berlin; it certainly was over by January 28, 1992, when U.S. president George H. W. Bush delivered his annual State of the Union Address one month after President Mikhail Gorbachev had announced his resignation and the end of the Soviet Union. After the Berlin Wall came down, Bush and Gorbachev spoke of the Cold War in the past tense in person and on the telephone. The reunification of Germany and the U.S. military campaign in the Persian Gulf confirmed that reality. In January 1991, polls indicated that, for the first time, a majority of Americans believed that the Cold War was over. However, the poll results obscured the substantial foreign and domestic crises, challenges, and opportunities created by the end of the Cold War that occupied President Bush and his national-security team between November 1989 and Bush's defeat in the 1992 presidential election, followed by the inauguration of William Jefferson Clinton as America's first post–Cold War president in January 1993.
R. Joseph Parrott
The United States never sought to build an empire in Africa in the 19th and 20th centuries, as did European nations from Britain to Portugal. However, economic, ideological, and cultural affinities gradually encouraged the development of relations with the southern third of the continent (the modern Anglophone nations of South Africa, Zimbabwe, Zambia, and Namibia; the former Portuguese colonies of Mozambique and Angola; and a number of smaller states). With official ties limited for decades, missionaries and business concerns built a small but influential American presence, mostly in the growing European settler states. This state of affairs made the United States an important trading partner during the 20th century, but it also reinforced the idea of a white Christian civilizing mission as justification for the domination of black peoples. The United States served as a comparison point for the construction of legal systems of racial segregation in southern Africa, even as it became more politically involved in the region as part of its ideological competition with the Soviet Union.
As Europe’s empires dissolved after World War II, official ties to white settler states such as South Africa, Angola, and Rhodesia (modern Zimbabwe) brought the United States into conflict with mounting demands for decolonization, self-determination, and racial equality—both international and domestic. Southern Africa illustrated the gap between a Cold War strategy predicated on Euro-American preponderance and national traditions of liberty and democracy, eliciting protests from civil and human rights groups that culminated in the successful anti-apartheid movement of the 1980s. Though still a region of low priority at the beginning of the 21st century, American involvement in southern Africa evolved to emphasize the pursuit of social and economic improvement through democracy promotion, emergency relief, and health aid—albeit with mixed results. The history of U.S. relations with southern Africa therefore illustrates the transformation of trans-Atlantic racial ideologies and politics over the last 150 years, first in the construction of white supremacist governance and later in the eventual rejection of this model.
Oil played a central role in shaping US policy toward Iraq over the course of the 20th century. The United States first became involved in Iraq in the 1920s as part of an effort to secure a role for American companies in Iraq's emerging oil industry. As a result of State Department efforts, American companies gained a 23.75 percent ownership share of the Iraq Petroleum Company in 1928. In the 1940s, US interest in the country increased as a result of the Cold War with the Soviet Union. To defend against a perceived Soviet threat to Middle East oil, the US supported British efforts to "secure" the region. After nationalist officers overthrew Iraq's British-supported Hashemite monarchy in 1958 and established friendly relations with the Soviet Union, the United States cultivated an alliance with the Iraqi Baath Party as an alternative to the Soviet-backed regime. The effort to cultivate an alliance with the Baath foundered as a result of the Baath's perceived support for Arab claims against Israel. The breakdown of US-Baath relations led the Baath to forge an alliance with the Soviet Union. With Soviet support, the Baath nationalized the Iraq Petroleum Company in 1972. Rather than resulting in a "supply cutoff," Soviet economic and technical assistance allowed for a rapid expansion of the Iraqi oil industry and an increase in Iraqi oil flowing to world markets. As Iraq experienced a dramatic oil boom in the 1970s, the United States looked to the country as a lucrative market for US export goods and adopted a policy of accommodation with regard to the Baath. This policy of accommodation gave rise to close strategic and military cooperation throughout the 1980s as Iraq waged war against Iran. When Iraq invaded Kuwait and seized control of its oil fields in 1990, the United States shifted to a policy of Iraqi containment.
The United States organized an international coalition that quickly ejected Iraqi forces from Kuwait, but chose not to pursue regime change for fear of destabilizing the country and wider region. Throughout the 1990s, the United States adhered to a policy of Iraqi containment but came under increasing pressure to overthrow the Baath and dismantle its control over the Iraqi oil industry. In 2003, the United States seized upon the 9/11 terrorist attacks as an opportunity to implement this policy of regime change and oil reprivatization.
The U.S. relationship with Southeast Asia has always reflected the state of U.S. interactions with the three major powers that surround the region: Japan, China, and, to a lesser extent, India. Initially, Americans looked at Southeast Asia as an avenue to the rich markets that China and India seemed to offer, while also finding trading opportunities in the region itself. Later, American missionaries sought to save Southeast Asian souls, while U.S. officials often viewed Southeast Asia as a region that could tip the overall balance of power in East Asia if its enormous resources fell under the control of a hostile power.
American interest expanded enormously with the annexation of the Philippines in 1899, an outgrowth of the Spanish-American War. That acquisition resulted in nearly a half century of American colonial rule, while American investors increased their involvement in exploiting the region's raw materials, notably tin, rubber, and petroleum, and missionaries expanded into areas previously closed to them.
The American occupation of the Philippines heightened tensions with Japan, which sought the resources of Southeast Asia, particularly in French Indochina, Malaya, and the Dutch East Indies (today's Indonesia). Eventually, clashing ambitions and perceptions brought the United States into World War II, and peeling those territories away from Japan became a key American objective. Americans resisted the Japanese in the Philippines and in Burma, but after Japan quickly subdued Southeast Asia, there was little contact in the region until the reconquest began in 1944. American forces participated in the liberation of Burma and also fought in the Dutch East Indies and the Philippines before the war ended in 1945.
After the war, the United States had to face the independence struggles in several Southeast Asian countries, even as the Grand Alliance fell apart and the Cold War emerged, which for the next several decades overshadowed almost everything. American efforts to prevent communist expansion in the region inhibited American support for decolonization and led to war in Vietnam and Laos and covert interventions elsewhere.
Since the end of the Cold War in 1991, relations with most of Southeast Asia have generally been normal, except with Burma/Myanmar, where a brutal military junta ruled. The opposition, led by the charismatic Aung San Suu Kyi, found support in the United States. More recently, American concerns about China's new assertiveness, particularly in the South China Sea, have resulted in even closer U.S. relations with Southeast Asian countries.
Michael R. Anderson
American strategy in the Asia-Pacific over the past two centuries has been marked by strong and often contradictory impulses. On the one hand, the western Pacific has served as a fertile ground for Christian missionaries, an alluring destination for American commercial enterprises, and eventually a critical launchpad for U.S. global power projection. Yet on the other hand, American policymakers at times have subordinated Asian strategy to European-based interests, or have found themselves embroiled in area conflicts that have hampered efforts to extend U.S. regional hegemony. Furthermore, leading countries in the Asia-Pacific region at times have challenged U.S. economic and military objectives, and the assertion of "Asian values" in recent years has undermined efforts to expand Western political and cultural norms. The United States' professed "pivot to Asia" has opened a new chapter in a centuries-long relationship, one that will determine the geopolitical fault lines of the 21st century.
On February 19, 1942, President Franklin Delano Roosevelt signed Executive Order 9066 authorizing the incarceration of 120,000 Japanese Americans living primarily on the West Coast of the continental United States. On August 10, 1988, President Ronald Reagan signed legislation authorizing formal apologies and checks for $20,000 to those still alive who had been unjustly imprisoned during World War II. In the interim period, nearly a half century, there were enormous shifts in memories of the events, mainstream accounts, and internal ethnic accountabilities. To be sure, there were significant acts of resistance, from the beginning of mass forced removal to the Supreme Court decisions toward the end of the war. But for a quarter of a century, between 1945 and approximately 1970, there was little to threaten a master narrative that posited Japanese Americans, led by the Japanese American Citizens League (JACL), as a once-embattled ethnic/racial minority that had transcended its victimized past to become America's treasured model minority. The fact that the Japanese American community began effective mobilization for government apology and reparations in the 1970s only confirmed its emergence as a bona fide part of the American body politic. But where the earlier narrative extolled the memories of Japanese American war heroes and leaders of the JACL, memory making changed dramatically in the 1990s and 2000s. In the years since Reagan's affirmation that "here we admit a wrong," Japanese Americans have unleashed a torrent of memorials, museums, and monuments honoring those who fought the injustices and who swore they would resist current or future attempts to scapegoat other groups in the name of national security.
The United States was heavily involved in creating the United Nations in 1945 and drafting its charter. The United States continued to exert substantial clout in the organization after its founding, though there have been periods during which U.S. officials have met with significant opposition inside the United Nations, in Congress, and in American electoral politics, all of which produced struggles to gain support for America's international policy goals. U.S. influence in the international organization has thus waxed and waned. The early postwar years witnessed the zenith of American prestige on the global stage. Starting in the mid- to late 1950s, as decolonization and the establishment of newly independent nations quickened, the United States began to lose influence in the United Nations owing to the spreading perception that its alliances with the European colonial powers placed it on the wrong side of history. As U.N. membership skyrocketed, the organization became more responsive to the needs and interests of the decolonizing states. During the 1970s and early 1980s, the American public responded to declining U.S. influence in the United Nations with calls to defund the organization and to pursue a unilateral approach to international challenges. The role of the United States in the United Nations was shaped by the politics of the Cold War competition with the Soviet Union. Throughout the nearly five decades of the Cold War, the United Nations served as a forum for the political and ideological rivalry between the United States and the Soviet Union, which frequently inhibited the organization from fulfilling what most considered to be its primary mission: the maintenance of global security and stability. After the collapse of the Soviet Union and the peaceful end of the Cold War, the United States enjoyed a brief period of unrivaled global hegemony. During this period, U.S. officials pursued a closer relationship with the United Nations and sought to use the organization to build support for America's international policy agenda and military interventionism.
Andrew J. Falk
Americans in and out of government have relied on media and popular culture to construct the national identity, frame debates on military interventions, communicate core values abroad, and motivate citizens around the world to act in prescribed ways. During the late 19th century, as the United States emerged as a world power and expanded overseas, Americans adopted an ethos of worldliness in their everyday lives, even as some expressed worry about the nation’s position on war and peace. During the interwar period of the 1920s and 1930s, though America failed to join the League of Nations and retreated from foreign engagements, the nation also increased cultural interactions with the rest of the world through the export of motion pictures, music, consumer products, food, fashion, and sports. The policies and character of the Second World War were in part shaped by propaganda that evolved from earlier information campaigns. As the United States confronted communism during the Cold War, the government sanitized its cultural weapons to win the hearts and minds of Americans, allies, enemies, and nonaligned nations. But some cultural producers dissented from America’s “containment policy,” refashioned popular media for global audiences, and sparked a change in Washington’s cultural-diplomacy programs. An examination of popular culture also shows how people in the “Third World” deftly used the media to encourage superpower action. In the 21st century, activists and revolutionaries can be considered the inheritors of this tradition because they use social media to promote their political agendas. In short, understanding the roles popular culture played as America engaged the world greatly expands our understanding of modern American foreign relations.
Thomas A. Reinstein
The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action, secret operations intended to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community.
At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their preferred policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.
Justus D. Doenecke
For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. What had been a fundamental foreign policy tenet now became the position of a faction, limited to a group of people branded as "isolationists." Its high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the Neutrality Acts to insulate the country from global entanglements.
Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy.
Although the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a "new isolationism" voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the "new isolationists" increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican Party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements, including those in Vietnam, Iraq, and Afghanistan, but few envision returning to an era when the United States avoids all commitments.