Peace activism in the United States between 1945 and the 2010s focused mostly on opposition to U.S. foreign policy, efforts to strengthen and foster international cooperation, and support for nuclear nonproliferation and arms control. The onset of the Cold War between the United States and the Soviet Union marginalized a reviving postwar American peace movement emerging from concerns about atomic and nuclear power and worldwide nationalist politics that everywhere seemed to foster conflict, not peace. Still, peace activism continued to evolve in dynamic ways and to influence domestic politics and international relations.
Most significantly, peace activists pioneered the use of Gandhian nonviolence in the United States and provided critical assistance to the African American civil rights movement, led the postwar antinuclear campaign, played a major role in the movement against the war in Vietnam, helped to move the liberal establishment (briefly) toward a more dovish foreign policy in the early 1970s, and helped to shape the political culture of American radicalism. Despite these achievements, the peace movement never regained the political legitimacy and prestige it held in the years before World War II, and it struggled with internal divisions about ideology, priorities, and tactics.
Histories of peace activism written in the 20th century tended to emphasize organizational or biographical approaches that sometimes carried hagiographic overtones. More recently, historians have applied the methods of cultural history, examining the role of religion, gender, and race in structuring peace activism. The transnational and global turn in the historical discipline has also begun to make inroads into peace scholarship. These are promising new directions because they situate peace activism within larger historical and cultural developments and relate peace history to broader historiographical debates and trends.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid the input from voters and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that, increasingly, the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.
Jimmy Carter’s “Crisis of Confidence” speech of July 1979 was a critical juncture in post-1945 U.S. politics, but it also marks an exemplary pivot in post-1945 religion. Five dimensions of faith shaped the president’s sermon. The first concerned the shattered consensus of American religion. When Carter encouraged Americans to recapture a spirit of unity, he spoke in a heartfelt but spent language more suitable to Dwight Eisenhower’s presidency than his own. By 1979, the Protestant-Catholic-Jewish consensus of Eisenhower’s time had fractured into a dynamic pluralism, remaking American religion in profound ways. Carter’s speech revealed a second revolution of post-1945 religion when it decried its polarization and politicization. Carter sought to heal ruptures that were dividing the nation between what observers, two decades later, would label “red” (conservative Republican) and “blue” (liberal Democratic) constituencies. Yet his endeavors failed, as would be evidenced in the religious politics of Ronald Reagan’s era, which followed. Carter championed community values as the answer to his society’s problems, aware of yet a third dawning reality: globalization. The virtues of localism that Carter espoused were in fact implicated in (and complicated by) transnational forces of change that saw immigration, missionary enterprises, and state and non-state actors internationalizing the American religious experience. A fourth illuminating dimension of Carter’s speech was its critique of America’s gospel of wealth. Although this “born-again” southerner was a product of the evangelical South’s revitalized free-market capitalism, he lamented how laissez-faire Christianity had become America’s lingua franca. Finally, Carter wrestled with secularization, revealing a fifth feature of post-1945 America. Even though faith commitments were increasingly cordoned off from formal state functions during this time, the nation’s political discourse acquired a pronounced religiosity.
Carter contributed by framing mundane issues (such as energy) in moral contexts that drew no hard-and-fast boundaries between matters of the soul and governance. Drawn from the political and economic crises of his moment, Carter’s speech thus also reveals the all-enveloping tide of religion in America’s post-1945 age.
Rock and roll, a popular music craze of the mid-1950s, turned a loud, fast, and sexy set of sounds rooted in urban, black, working-class, and southern America into the pop preference of suburban, white, young, and northern America as well. By the late 1960s, those fans and their British counterparts had made their own version, more politicized and experimental and called simply rock—the summoning sound of the counterculture. Rock’s aura soon faded: it became as much entertainment staple as dissident form, with subcategories as disparate as singer-songwriter, heavy metal, alternative, and “classic rock.” Where rock and roll was integrated and heterogeneous, rock was largely white and homogeneous, policing its borders. Notoriously, rock fans detonated disco records in 1979. By the 1990s, the rock and roll style was hip-hop, with its youth appeal and rebelliousness; post‒baby boomer bands gave rock some last vanguard status; and suburbanites found classic rock in New Country. This century’s notions of rock and roll have blended thoroughly, from genre “mash-ups” to superstar performers who are almost categories unto themselves and new sounds such as EDM beats. Still, crossover moments evoke rock and roll; assertions of authenticity evoke rock. Because rock and roll, and rock, epitomize cultural ideals and group identities, their definitions have been constantly debated. Initial argument focused on challenging genteel, professional notions of musicianship and behavior. Later discourse took up cultural incorporation and social empowerment, with issues of gender and commercialism as prominent as race and artistry. Rock and roll promised one kind of revolution to the post-1945 United States; rock another. The resulting hope and confusion have never been fully sorted out, with mixed consequences for American music and cultural history.
In 1835, Alexis de Tocqueville argued in Democracy in America that there were “two great nations in the world.” They had started from different historical points but seemed to be heading in the same direction. As expanding empires, they faced the challenges of defeating nature and constructing a civilization for the modern era. Although they adhered to different governmental systems, “each of them,” de Tocqueville declared, “seems marked out by the will of Heaven to sway the destinies of half the globe.”
De Tocqueville’s words were prophetic. In the 19th century, Russian and American intellectuals and diplomats struggled to understand the roles that their countries should play in the new era of globalization and industrialization. Despite their differing understandings of how development should happen, both sides believed in their nation’s vital role in guiding the rest of the world. American adherents of liberal developmentalism often argued that a free flow of enterprise, trade, investment, information, and culture was the key to future growth. They held that the primary obligation of American foreign policy was to defend that freedom by pursuing an “open door” policy and free access to markets. They believed that the American model would work for everyone and that the United States had an obligation to share its system with the old and underdeveloped nations around it.
A similar sense of mission developed in Russia. Russian diplomats had for centuries struggled to establish defensive buffers around the periphery of their empire. They had linked economic development to national security, and they had argued that their geographic expansion represented a “unification” of peoples as opposed to a conquering of them. In the 19th century, after the Napoleonic Wars and the failed Decembrist Revolution, tsarist policymakers fought to defend autocracy, orthodoxy, and nationalism from domestic and international critics. As in the United States, Imperial and later Soviet leaders envisioned themselves as the emissaries of the Enlightenment to the backward East and as protectors of tradition and order for the chaotic and revolutionary West.
These visions of order clashed in the 20th century as the Soviet Union and the United States became superpowers. Conflicts began early, with the American intervention in the 1918–1921 Russian civil war. Tensions that had previously been based on differing geographic and strategic interests then assumed an ideological valence, as the fight between East and West became a struggle between the political economies of communism and capitalism. Foreign relations between the two countries experienced boom and bust cycles that took the world to the brink of nuclear holocaust and yet maintained a strategic balance that precluded the outbreak of global war for fifty years. This article will examine how that relationship evolved and how it shaped the modern world.
Robert O. Self
Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed.
The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much so that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying, and its rupture in the 1960s is the origin point of the decade’s great dramas.
The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latina/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property, bathed the country in blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” Its assault from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the latter, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latino/as, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and lastly, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism.
In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order.
Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.
The decade of the 1980s represented a turning point in American history—a crucial era, marked by political conservatism and an individualistic ethos. The 1980s also witnessed a dramatic series of developments in U.S. foreign relations, first an intensification of the Cold War with the Soviet Union and then a sudden relaxation of tensions and the effective end of the Cold War with an American victory. All of these developments were advanced and symbolized by the presidential administration of Ronald Reagan (1981–1989), a polarizing figure but a highly successful political leader. Reagan dominates our memories of the 1980s as few other American leaders dominate memories of other eras. Reagan and the political movement he led—Reaganism—are central to the history of the 1980s. Both their successes and their failures, which became widely acknowledged in the later years of the decade, should be noted. Reaganite conservatives won political victories by rolling back state power in many realms, most of all in terms of taxation and regulation. They also succeeded in putting America at the unquestioned pinnacle of the world order through a victory over the Soviet Union in the Cold War, although this was unforeseen by America’s Cold Warriors when the 1980s began. The failures of Reaganite conservatism include its handling of rising poverty levels, the HIV/AIDS crisis, and worsening racial tensions, all problems that Reaganites either did little to stem or to which they positively contributed. In foreign affairs, Reaganites pursued a “war on terror” of questionable success, and their approach to Third World arenas of conflict, including Central America, exacted a terrible human toll.
The 1950s have typically been seen as a complacent, conservative time between the end of World War II and the radical 1960s, when anticommunism and the Cold War subverted reform and undermined civil liberties. But the era can also be seen as a very liberal time in which meeting the Communist threat led to Keynesian economic policies, the expansion of New Deal programs, and advances in civil rights. Politically, it was “the Eisenhower Era,” dominated by a moderate Republican president, a high level of bipartisan cooperation, and a foreign policy committed to containing communism. Culturally, it was an era of middle-class conformity, which also gave us abstract expressionism, rock and roll, Beat poetry, and a grassroots challenge to Jim Crow.
Since the social sciences began to emerge as scholarly disciplines in the last quarter of the 19th century, they have frequently offered authoritative intellectual frameworks that have justified, and even shaped, a variety of U.S. foreign policy efforts. They played an important role in U.S. imperial expansion in the late 19th and early 20th centuries. Scholars devised racialized theories of social evolution that legitimated the confinement and assimilation of Native Americans and endorsed civilizing schemes in the Philippines, Cuba, and elsewhere. As attention shifted to Europe during and after World War I, social scientists working at the behest of Woodrow Wilson attempted to engineer a “scientific peace” at Versailles. The desire to render global politics the domain of objective, neutral experts intensified during World War II and the Cold War. After 1945, the social sciences became increasingly central players in foreign affairs, offering intellectual frameworks—like modernization theory—and bureaucratic tools—like systems analysis—that shaped U.S. interventions in developing nations, guided nuclear strategy, and justified the increasing use of the U.S. military around the world.
Throughout these eras, social scientists often reinforced American exceptionalism—the notion that the United States stands at the pinnacle of social and political development, and as such has a duty to spread liberty and democracy around the globe. The scholarly embrace of conventional political values was not the result of state coercion or financial co-optation; by and large social scientists and policymakers shared common American values. But other social scientists used their knowledge and intellectual authority to critique American foreign policy. The history of the relationship between social science and foreign relations offers important insights into the changing politics and ethics of expertise in American public policy.
K. Tsianina Lomawaima
In 1911, a group of American Indian intellectuals organized what would become known as the Society of American Indians, or SAI. SAI members convened in annual meetings between 1911 and 1923, and for much of that period the Society’s executive offices were a hub for political advocacy, lobbying Congress and the Office of Indian Affairs (OIA), publishing a journal, offering legal assistance to Native individuals and tribes, and maintaining an impressively voluminous correspondence across the country with American Indians, “Friends of the Indian” reformers, political allies, and staunch critics. Notable Native activists, clergy, entertainers, professionals, speakers, and writers—as well as Native representatives from on- and off-reservation communities—were active in the Society. They worked tirelessly to meet daunting, unrealistic expectations, principally to deliver a unified voice of Indian “public opinion” and to pursue controversial political goals without appearing too radical, especially obtaining U.S. citizenship for Indian individuals and allowing Indian nations to access the U.S. Court of Claims. They maintained their myriad activities with scant financial resources solely through the unpaid labor of dedicated Native volunteers. By 1923, the challenges exhausted the Society’s substantial human and minuscule financial capital. The Native “soul of unity” demanded by non-Native spectators and hoped for by SAI leaders could no longer hold the center, and the SAI dissolved. Their work was not in vain, but citizenship and the ability to file claims materialized in circumscribed forms. In 1924 Congress passed the Indian Citizenship Act, granting birthright citizenship to American Indians, but citizenship for Indians was deemed compatible with continued wardship status. In 1946 Congress established an Indian Claims Commission, not a court, and successful claims could only result in monetary compensation, not regained lands.