Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed undertaking significant reforms. Underlying all proposed reforms was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat that confronted the nation, the means to resolve problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers. They sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms were seeking a via media, a middle way between relying on older ideas of 19th-century liberal capitalism and the more radical proposals to reform society through either social democracy or socialism. Despite differences among Progressives, the types of Progressivisms put forth, and the successes and failures of Progressivism, this reform era raised into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the implementation of an activist state.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Since they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid the input from voters and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions have become a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that increasingly the leaders of these entities, rather than elected officials, make key decisions about where and how to build public infrastructure and steer economic development in the United States.
Joseph E. Hower
Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be members of unions than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments.
It was not always so. Despite a long, rich history of workplace and ballot box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.
Puerto Rican migrants have resided in the United States since before the Spanish-Cuban-American War of 1898, when the United States took possession of the island of Puerto Rico as part of the Treaty of Paris. After the war, groups of Puerto Ricans began migrating to the United States as contract laborers, first to sugarcane plantations in Hawaii, and then to other destinations on the mainland. After the Jones Act of 1917 extended U.S. citizenship to islanders, Puerto Ricans migrated to the United States in larger numbers, establishing their largest base in New York City. Over the course of the 1920s and 1930s, a vibrant and heterogeneous colonia developed there, and Puerto Ricans participated actively both in local politics and in the increasingly contentious politics of their homeland, whose status was indeterminate until it became a commonwealth in 1952. The Puerto Rican community in New York changed dramatically after World War II, accommodating up to fifty thousand new migrants per year during the peak of the “great migration” from the island. Newcomers faced intense discrimination and marginalization in this era, defined by both a Cold War ethos and liberal social scientists’ interest in the “Puerto Rican problem.”
Puerto Rican migrant communities in the 1950s and 1960s—now rapidly expanding into the Midwest, especially Chicago, and into New Jersey, Connecticut, and Philadelphia—struggled with inadequate housing and discrimination in the job market. In local schools, Puerto Rican children often faced a lack of accommodation of their need for English language instruction. Most catastrophic for Puerto Rican communities, on the East Coast particularly, was the deindustrialization of the labor market over the course of the 1960s. By the late 1960s, in response to these conditions and spurred by the civil rights, Black Power, and other social movements, young Puerto Ricans began organizing and protesting in large numbers. Their activism combined a radical approach to community organizing with Puerto Rican nationalism and international anti-imperialism. The youth were not the only activists in this era. Parents in New York had initiated, together with their African American neighbors, a “community control” movement that spanned the late 1960s and early 1970s; and many other adult activists pushed the politics of the urban social service sector—the primary institutions in many impoverished Puerto Rican communities—further to the left.
By the mid-1970s, urban fiscal crises and the rising conservative backlash in national politics dealt another blow to many Puerto Rican communities in the United States. The Puerto Rican population as a whole was now widely considered part of a national “underclass,” and much of the political energy of Puerto Rican leaders focused on addressing the paucity of both basic material stability and social equality in their communities. Since the 1980s, however, Puerto Ricans have achieved some economic gains, and a growing college-educated middle class has managed to gain more control over the cultural representations of their communities. More recently, the political salience of Puerto Ricans as a group has begun to shift. For the better part of the 20th century, Puerto Ricans in the United States were considered numerically insignificant or politically impotent (or both); but in the last two presidential elections (2008 and 2012), their growing populations in the South, especially in Florida, have drawn attention to their demographic significance and their political sensibilities.
Despite its cultivated reputation as the nation’s “white spot” in the early 20th century, Southern California was in fact home to diverse and numerous communities of color, some composed of relatively new immigrants and some long predating the era of Anglo settlement and conquest. In the years following World War II, the region engaged in suburban home construction on a mass scale and became a global symbol of what Dolores Hayden called the economically democratic but racially exclusive “sitcom suburb,” from the tax-lowering mechanism of its “Lakewood plan” to the car-friendly “Googie” architecture of the San Fernando Valley. Existing suburban communities of color, such as the colonias of agricultural laborers, were engulfed by new settlements, while upwardly mobile African Americans, Latinas/Latinos, and Asian Americans sought access to the expanding suburban dream of homeownership, with varying degrees of success. The political responses to suburban diversity in metropolitan Los Angeles ranged from Anglo resistance and flight to multiracial political coalitions and the incorporation of people of color at multiple levels of local government. The ascent by a number of suburbanites of color to positions of local and regional political power from the 1960s through the 1980s sometimes exposed intra-ethnic discord and sometimes the fragility of cross-race coalition as multiple actors sought to protect property values and to pursue economic security within the competitive constraints of shrinking municipal resources, aging infrastructure, and a receding suburban fringe. As a result, political conflicts over crime, immigration, education, and inequality emerged in many Los Angeles County suburbs by the 1970s and later in the more distant corporate suburbs of Orange, Ventura, Riverside, and San Bernardino Counties. 
The suburbanization of poverty, the role of suburbs as immigrant gateways, and the emergence of “majority-minority” suburbs—all national trends by the late 1990s and the first decade of the 21st century—were evident far earlier in the Los Angeles metropolitan region, where diverse suburbanites negotiated social and economic crises and innovated political responses.
Kyle B. Roberts
From Cahokia to Newport, from Santa Fe to Chicago, cities have long exerted an important influence over the development of American religion; in turn, religion has shaped the life of America’s cities. Early visions of a New Jerusalem quickly gave way to a crowded spiritual marketplace full of faiths competing for the attention of a heterogeneous mass of urban consumers, although the dream of an idealized spiritual city never completely disappeared. Pluralism fostered toleration and freedom of religious choice, but also catalyzed competition and antagonism, sometimes resulting in violence. Struggles over political authority between established and dissenting churches gave way after the American Revolution to a contest over the right to exert moral authority through reform. Secularization, the companion of modernization and urbanization, did not toll the death knell for urban religion, but instead, provided the materials with which the religious engaged the city. Negative discursive constructions of the city proffered by a handful of religious reformers have long cast a shadow over the actual urban experience of most men and women. Historians continue to uncover the rich and innovative ways in which urban religion enabled individuals to understand, navigate, and contribute to the city around them.
Christopher D. Cantwell
Home to more than half the U.S. population by 1920, cities played an important role in the development of American religion throughout the 20th century. At the same time, the beliefs and practices of religious communities also shaped the contours of America’s urban landscape. Much as in the preceding three centuries, the economic development of America’s cities and the social diversity of urban populations animated this interplay. But the explosive, unregulated expansion that defined urban growth after the Civil War was met with an equally dramatic disinvestment from urban spaces throughout the second half of the 20th century. The domestic and European migrations that previously fueled urban growth also changed throughout the century, shifting from Europe and the rural Midwest to the Deep South, Africa, Asia, and Latin America after World War II. These newcomers not only brought new faiths to America’s cities but also contributed to the innovation of several new, distinctly urban religious movements. Urban development and diversity on one level promoted toleration and cooperation as religious leaders forged numerous ecumenical and, eventually, interfaith bonds to combat urban problems. But it also led to tension and conflict as religious communities busied themselves with carving out spaces of their own through tight-knit urban enclaves or new suburban locales. Contemporary American cities are some of the most religiously diverse communities in the world. Historians continue to uncover how religious communities not only have lived in but also have shaped the modern city.
Robert O. Self
Few decades in American history reverberate with as much historical reach or glow as brightly in living mythology as the 1960s. During those years Americans reanimated and reinvented the core political principles of equality and liberty but, in a primal clash that resonates more than half a century later, fiercely contested what those principles meant, and for whom. For years afterward, the decade’s appreciators considered the era to have its own “spirit,” defined by greater freedoms and a deeper, more authentic personhood, and given breath by a youthful generation’s agitation for change in nearly every dimension of national life. To its detractors in subsequent decades, the era was marked by immature radical fantasies and dangerous destabilizations of the social order, behind which lay misguided youthful enthusiasms and an overweening, indulgent federal government. We need not share either conviction to appreciate the long historical shadow cast by the decade’s clashing of left, right, and center and its profound influence over the political debates, cultural logics, and social practices of the many years that followed.
The decade’s political and ideological clashes registered with such force because post–World War II American life was characterized by a society-wide embrace of antiradicalism and a prescribed normalcy. Having emerged from the war as the lone undamaged capitalist industrial power, the United States exerted enormous influence throughout the globe after 1945—so much that some historians have called the postwar years a “pax Americana.” In its own interest and in the interest of its Western allies, the United States engaged in a Cold War standoff with the Soviet Union over the fate of Europe and no less over the fate of developing countries on every continent. Fiercely anticommunist abroad and at home, U.S. elites stoked fears of the damage communism could do, whether in Eastern Europe or in a public school textbook. Americans of all sorts in the postwar years embraced potent ideologies justifying the prevailing order, whether that order was capitalist, patriarchal, racial, or heterosexual. They pursued a postwar “normalcy” defined by nuclear family domesticity and consumer capitalism in the shadow cast by the threat of communism and, after 1949, global thermonuclear war with the Soviet Union. This prevailing order was stultifying and its rupture in the 1960s is the origin point of the decade’s great dramas.
The social movements of that decade drew Americans from the margins of citizenship—African Americans, Latinas/os, Native Americans, women, and gay men and lesbians, among others—into epochal struggles over the withheld promise of equality. For the first time since 1861, an American war deeply split the nation, nearly destroying a major political party and intensifying a generational revolt already under way. Violence, including political assassinations at the highest level, bombings and assassinations of African Americans, bombings by left-wing groups like the Weathermen, and major urban uprisings by African Americans against police and property bathed the country in blood. The New Deal liberalism of Presidents Franklin D. Roosevelt and Harry S. Truman reached its postwar peak in 1965 under President Lyndon Johnson’s Great Society and then retreated amid acrimony and backlash, as a new conservative politics gained traction. All this took place in the context of a “global 1960s,” in which societies in Western and Eastern Europe, Latin America, Africa, and elsewhere experienced similar generational rebellions, quests for meaningful democracy, and disillusionment with American global hegemony. From the first year of the decade to the last, the 1960s were a watershed era that marked the definitive end of a “postwar America” defined by easy Cold War dualities, presumptions of national innocence, and political calcification.
To explain the foregoing, this essay is organized in five sections. First comes a broad overview of the decade, highlighting some of its indelible moments and seminal political events. The next four sections correspond to the four signature historical developments of the 1960s. Discussed first is the collapse of the political consensus that predominated in national life following World War II. We can call this consensus “Vital Center liberalism,” after the title of a 1949 book by Arthur Schlesinger Jr., or “Cold War liberalism.” Its assault from both the New Left and the New Right is one of the defining stories of the 1960s. Second is the resurgence, after a decades-long interregnum dating to Reconstruction, of African American political agency. The black freedom struggle of the 1960s was far more than a social movement for civil rights. To shape the conditions of national life and the content of public debate in ways impossible under Jim Crow, black Americans called for nothing less than a spiritual and political renewal of the country. Third, and following from the latter, is the emergence within the American liberal tradition of a new emphasis on expanding individual rights and ending invidious discrimination. Forged in conjunction with the black freedom movement by women, Latinas/os, Asian Americans, Native Americans, and homophiles (as early gay rights activists were called) and gay liberationists, this new emphasis profoundly changed American law and set the terms of political debate for the next half century. Fourth and lastly, the 1960s witnessed the flourishing of a broad and diverse culture of anti-authoritarianism.
In art, politics, and social behavior, this anti-authoritarianism took many forms, but at its heart lay two distinct historical phenomena: an ecstatic celebration of youth, manifest in the tension between the World War II generation and the baby boom generation, and an intensification of the long-standing conflict in American life between individualism and hierarchical order.
Despite the disruptions, rebellions, and challenges to authority in the decade, the political and economic elite proved remarkably resilient and preserved much of the prevailing order. This is not to discount the foregoing account of challenges to that order or to suggest that social change in the 1960s made little difference in American life. However, in grappling with this fascinating decade we are confronted with the paradox of outsized events and enormous transformations in law, ideology, and politics alongside a continuation, even an entrenchment, of traditional economic and political structures and practices.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states of the United States, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copy-cat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted in other places in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the United States, resulting in waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires understanding both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.