Ramón A. Gutiérrez
Mexican immigration to the United States is a topic of particular interest at this moment for a number of political reasons. First, and probably foremost, Mexicans are currently the single largest group of foreign-born residents in the country. In 2013, the United States counted 41.3 million individuals of foreign birth; 28 percent, or 11.6 million, were Mexican. If census data are aggregated more broadly, adding together the foreign-born and persons of Mexican ancestry who are citizens, the total reaches 31.8 million in 2010, or roughly 10 percent of the country’s total population of 308.7 million. What has nativists and those eager to restrict immigration particularly concerned is that the Mexican-origin population has been growing rapidly, by 54 percent between the 2000 and 2010 censuses, or from 20.6 million to 31.8 million persons. This pace of growth has slowed, but not enough to calm the racial and xenophobic anxieties of a citizenry fearful of foreigners and terrorists.
Mexican immigration to the United States officially began in 1846 and has continued into the present without any significant interruption, a continuity that makes it quite distinct. The immigration histories of national groups that originated in Asia, Africa, and Europe are much more varied in trajectory and timing. They usually began with massive movements, driven by famine, political strife, or burgeoning economic opportunities in the United States, and then slowed, tapered off, or ended abruptly, as Chinese immigration did with the Exclusion Act of 1882. This continuity helps explain why Mexico has been the single largest source of immigrants to the United States for the longest period of time.
The geographic proximity of the two countries, compounded by profound economic disparities, has continuously attracted Mexican immigrants, a movement facilitated by a border that is rather porous and was poorly patrolled for much of the 20th century. The United States and Mexico are divided by a border that begins at the Pacific Ocean, at the twin cities of San Diego, California, and Tijuana, Baja California. The border runs eastward until it reaches the Rio Grande at El Paso, Texas, and Ciudad Juárez, Chihuahua. From there it follows the river’s flow southeastward until the river empties into the Gulf of Mexico, where Brownsville, Texas, and Matamoros, Tamaulipas, sit. This expanse of roughly 1,954 miles is poorly marked. In many places, only old concrete markers, sagging, dry-rotted fence posts strung with rusted barbed wire, and a river that has continually changed its course mark the separation between these two sovereign national spaces.
Since 1924, when the U.S. Border Patrol was created mainly to prohibit the unauthorized entry of Chinese immigrants, not Mexicans, American attempts to regulate entries and exits effectively have been concentrated only along known, highly trafficked routes leading north. The inability of the United States to patrol the entire length of its border with Mexico has meant that Mexicans eager to work or live in the United States have rarely found the border an insurmountable obstacle; when they have found it temporarily so, they have simply hired expensive professional smugglers (known as coyotes) to maximize their chances of safe passage into the United States without border inspection or official authorization. In 2014, there were approximately 11.3 million such unauthorized immigrants in the United States; 49 percent, or 5.6 million, of them were Mexican.
Over the long course of this history, Mexican immigration is best characterized as the movement of unskilled workers toiling in agriculture, railroad construction, and mineral extraction; over the last two decades, these migrants have also worked in construction and the service industries. This labor migration has evolved through five distinct phases, each marked by its own logic, demands, and governance.
Housing in America has long stood as a symbol of the nation’s political values and a measure of its economic health. In the 18th century, a farmhouse represented Thomas Jefferson’s ideal of a nation of independent property owners; in the mid-20th century, the suburban house was seen as an emblem of an expanding middle class. Alongside those well-known symbols were a host of other housing forms—tenements, slave quarters, row houses, French apartments, loft condos, and public housing towers—that revealed much about American social order and the material conditions of life for many people.
Since the 19th century, housing markets have been fundamental forces driving the nation’s economy and a major focus of government policies. Home construction has provided jobs for skilled and unskilled laborers. Land speculation, housing development, and the home mortgage industry have generated billions of dollars in investment capital, while ups and downs in housing markets have been considered signals of major changes in the economy. Since the New Deal of the 1930s, the federal government has buttressed the home construction industry and offered economic incentives for home buyers, giving the United States the highest home ownership rate in the world. The housing market crash of 2008 slashed property values and sparked a rapid increase in home foreclosures, especially in places like Southern California and the suburbs of the Northeast, where housing prices had ballooned over the previous two decades. The real estate crisis led to government efforts to prop up the mortgage banking industry and to assist struggling homeowners. The crisis led, as well, to a drop in rates of home ownership, an increase in rental housing, and a growth in homelessness.
Home ownership remains a goal for many Americans and an ideal long associated with the American dream. The owner-occupied home—whether single-family or multifamily dwelling—is typically the largest investment made by an American family. Through much of the 18th and 19th centuries, housing designs varied from region to region. In the mid-20th century, mass production techniques and national building codes tended to standardize design, especially in new suburban housing. In the 18th century, the family home was a site of waged and unwaged work; it was the center of a farm, plantation, or craftsman’s workshop. Two and a half centuries later, a house was a consumer good: its size, location, and decor marked the family’s status and wealth.
Wendy L. Wall
The New Deal generally refers to a set of domestic policies implemented by the administration of Franklin Delano Roosevelt in response to the crisis of the Great Depression. Propelled by that economic cataclysm, Roosevelt and his New Dealers pushed through legislation that regulated the banking and securities industries, provided relief for the unemployed, aided farmers, electrified rural areas, promoted conservation, built national infrastructure, regulated wages and hours, and bolstered the power of unions. The Tennessee Valley Authority prevented floods and brought electricity and economic progress to seven states in one of the most impoverished parts of the nation. The Works Progress Administration offered jobs to millions of unemployed Americans and launched an unprecedented federal venture into the arena of culture. By providing social insurance to the elderly and unemployed, the Social Security Act laid the foundation for the U.S. welfare state.
The benefits of the New Deal were not equitably distributed. Many New Deal programs—farm subsidies, work relief projects, social insurance, and labor protection programs—discriminated against racial minorities and women, while benefiting white men disproportionately. Nevertheless, women achieved symbolic breakthroughs, and African Americans benefited more from Roosevelt’s policies than they had from those of any administration since Abraham Lincoln’s. The New Deal did not end the Depression—only World War II did that—but it did spur economic recovery. It also helped to make American capitalism less volatile by extending federal regulation into new areas of the economy.
Although the New Deal most often refers to policies and programs put in place between 1933 and 1938, some scholars have used the term more expansively to encompass later domestic legislation or U.S. actions abroad that seemed animated by the same values and impulses—above all, a desire to make individuals more secure and a belief in institutional solutions to long-standing problems. In order to pass his legislative agenda, Roosevelt drew many Catholic and Jewish immigrants, industrial workers, and African Americans into the Democratic Party. Together with white Southerners, these groups formed what became known as the “New Deal coalition.” This unlikely political alliance endured long after Roosevelt’s death, supporting the Democratic Party and a “liberal” agenda for nearly half a century. When the coalition finally cracked in 1980, historians looked back on this extended epoch as reflecting a “New Deal order.”
In the years after the Civil War, Polish immigrants became an important part of the American working class. They participated actively in the labor movement and played key roles in industrial strikes from the 1877 Railroad Strike through the rise of the CIO and into the post-1945 era of prosperity. Over time, the Polish American working class became acculturated and left its largely immigrant past behind while maintaining itself as an ethnic community. It also experienced a good deal of upward mobility, especially across generations. This ethnic community, however, continued to be refreshed with new immigrants throughout the 20th century.
As with the larger American working class, Polish American workers were hard hit by changes in the industrial structure of the United States. Deindustrialization turned many of the centers of Polish American life into part of the Rust Belt. Despite the community’s radical history, many Polish Americans responded by turning toward conservative causes in the late 20th and early 21st centuries.
Maureen A. Flanagan
The decades from the 1890s into the 1920s produced reform movements in the United States that resulted in significant changes to the country’s social, political, cultural, and economic institutions. The impulse for reform emanated from a pervasive sense that the country’s democratic promise was failing. Political corruption seemed endemic at all levels of government. An unregulated capitalist industrial economy exploited workers and threatened to create a serious class divide, especially as the legal system protected the rights of business over labor. Mass urbanization was shifting the country from a rural, agricultural society to an urban, industrial one characterized by poverty, disease, crime, and cultural clash. Rapid technological advancements brought new, and often frightening, changes into daily life that left many people feeling that they had little control over their lives. Movements for socialism, woman suffrage, and rights for African Americans, immigrants, and workers belied the rhetoric of the United States as a just and equal democratic society for all its members.
Responding to the challenges presented by these problems, and fearful that without substantial change the country might experience class upheaval, groups of Americans proposed significant reforms. Underlying all of these proposals was a desire to bring more justice and equality into a society that seemed increasingly to lack these ideals. Yet there was no agreement among these groups about the exact threat confronting the nation, the means of resolving its problems, or how to implement reforms. Despite this lack of agreement, all so-called Progressive reformers were modernizers who sought to make the country’s democratic promise a reality by confronting its flaws and seeking solutions. All Progressivisms sought a via media, a middle way between older ideas of 19th-century liberal capitalism and more radical proposals to reform society through either social democracy or socialism. Whatever the differences among Progressives, the varieties of Progressivism put forth, and the successes and failures of Progressivism, this reform era brought into national discourse debates over the nature and meaning of democracy, how and for whom a democratic society should work, and what it meant to be a forward-looking society. It also led to the creation of an activist state.
Public authorities are agencies created by governments to engage directly in the economy for public purposes. They differ from standard agencies in that they operate outside the administrative framework of democratically accountable government. Because they generate their own operating income by charging users for goods and services and borrow for capital expenses based on projections of future revenues, they can avoid both input from voters and the regulations that control public agencies funded by tax revenues.
Institutions built on the public authority model exist at all levels of government and in every state. A few of these enterprises, such as the Tennessee Valley Authority and the Port Authority of New York and New Jersey, are well known. Thousands more toil in relative obscurity, operating toll roads and bridges, airports, transit systems, cargo ports, entertainment venues, sewer and water systems, and even parking garages. Despite their ubiquity, these agencies are not well understood. Many release little information about their internal operations. It is not even possible to say conclusively how many exist, since experts disagree about how to define them, and states do not systematically track them.
One thing we do know about public authorities is that, over the course of the 20th century, these institutions became a major component of American governance. Immediately following the Second World War, they played a minor role in public finance. But by the early 21st century, borrowing by authorities constituted well over half of all public borrowing at the sub-federal level. This change means that the leaders of these entities, rather than elected officials, increasingly make key decisions about where and how to build public infrastructure and steer economic development in the United States.
Joseph E. Hower
Government employees are an essential part of the early-21st-century labor movement in the United States. Teachers, firefighters, and police officers are among the most heavily unionized occupations in America, but public-sector union members also include street cleaners and nurses, janitors and librarians, zookeepers and engineers. Despite cultural stereotypes that continue to associate unions with steel or auto workers, public employees are five times more likely to be members of unions than workers in private industry. Today, nearly half of all union members work for federal, state, or local governments.
It was not always so. Despite a long, rich history of workplace and ballot box activism, government workers were marginal to the broader labor movement until the second half of the 20th century. Excluded from the legal breakthroughs that reshaped American industry in the 1930s, government workers lacked the basic organizing and bargaining rights extended to their private-sector counterparts. A complicated, and sometimes convoluted, combination of discourse and doctrine held that government employees were, as union leader Jerry Wurf later put it, a “servant to a master” rather than “a worker with a boss.” Inspired by the material success of workers in mass industry and moved by the moral clarity of the Black Freedom struggle, government workers demanded an end to their second-class status through one of the most consequential, and least recognized, social movements of the late 20th century. Yet their success at improving the pay, benefits, and conditions of government work also increased the cost of government services, imposing new obligations at a time of dramatic change in the global economy. In the resulting crunch, unionized public workers came under political pressure, particularly from fiscal conservatives who charged that their bargaining rights and political power were incompatible with a new age of austerity and limits.
Puerto Rican migrants have resided in the United States since before the Spanish-Cuban-American War of 1898, when the United States took possession of the island of Puerto Rico as part of the Treaty of Paris. After the war, groups of Puerto Ricans began migrating to the United States as contract laborers, first to sugarcane plantations in Hawaii, and then to other destinations on the mainland. After the Jones Act of 1917 extended U.S. citizenship to islanders, Puerto Ricans migrated to the United States in larger numbers, establishing their largest base in New York City. Over the course of the 1920s and 1930s, a vibrant and heterogeneous colonia developed there, and Puerto Ricans participated actively both in local politics and in the increasingly contentious politics of their homeland, whose status was indeterminate until it became a commonwealth in 1952. The Puerto Rican community in New York changed dramatically after World War II, accommodating up to fifty thousand new migrants per year during the peak of the “great migration” from the island. Newcomers faced intense discrimination and marginalization in this era, defined by both a Cold War ethos and liberal social scientists’ interest in the “Puerto Rican problem.”
Puerto Rican migrant communities in the 1950s and 1960s—now rapidly expanding into the Midwest, especially Chicago, and into New Jersey, Connecticut, and Philadelphia—struggled with inadequate housing and discrimination in the job market. Local schools often failed to accommodate Puerto Rican children’s need for English-language instruction. Most catastrophic for Puerto Rican communities, particularly on the East Coast, was the deindustrialization of the labor market over the course of the 1960s. By the late 1960s, in response to these conditions and spurred by the civil rights, Black Power, and other social movements, young Puerto Ricans began organizing and protesting in large numbers. Their activism combined a radical approach to community organizing with Puerto Rican nationalism and international anti-imperialism. The youth were not the only activists in this era. Parents in New York had initiated, together with their African American neighbors, a “community control” movement that spanned the late 1960s and early 1970s; and many other adult activists pushed the politics of the urban social service sector—the primary institutions in many impoverished Puerto Rican communities—further to the left.
By the mid-1970s, urban fiscal crises and the rising conservative backlash in national politics dealt another blow to many Puerto Rican communities in the United States. The Puerto Rican population as a whole was now widely considered part of a national “underclass,” and much of the political energy of Puerto Rican leaders focused on addressing the paucity of both basic material stability and social equality in their communities. Since the 1980s, however, Puerto Ricans have achieved some economic gains, and a growing college-educated middle class has managed to gain more control over the cultural representations of their communities. More recently, the political salience of Puerto Ricans as a group has begun to shift. For the better part of the 20th century, Puerto Ricans in the United States were considered numerically insignificant or politically impotent (or both); but in the last two presidential elections (2008 and 2012), their growing populations in the South, especially in Florida, have drawn attention to their demographic significance and their political sensibilities.
Janine Giordano Drake
The term “Social Gospel” was coined by ministers and other well-meaning American Protestants with the intention of encouraging the urban and rural poor to understand that Christ cared about them and saw their struggles. The second half of the 19th century saw a rise of both domestic and international missionary fervor. Church and civic leaders feared a future in which freethinkers, agnostics, atheists, and other skeptics dominated spiritual life and well-educated ministers were marginal to American culture. They grew concerned with the rising number of independent and Pentecostal churches whose leaders lacked extensive theological training or denominational authority. American Protestants especially feared that immigrant religious and cultural traditions, including Roman Catholicism, Judaism, and Eastern Orthodox Christianity, were not quintessentially American. Most of all, they worried that those belief systems could not promote what they saw as the traditional American values and mores central to the nation.
However, at least on the surface, the Social Gospel did not dwell on extinguishing ideas or traditions. Rather, as was typical of the Progressive Era, it advanced a wide-ranging set of visions that emphasized scientific and professional expertise, guided by Christian ethics, as the means to solve social and political problems. It fostered an energetic culture of conferences, magazines, and paperback books dedicated to reforming the nation. Books and articles unpacked social surveys that sorted through possible solutions to urban and rural poverty and reported on productive relationships between churches and municipal governments. Pastoral conferences often focused on planning revivals in urban auditoriums, churches, stadiums, or the open air, where participants were confronted not only with old-fashioned gospel messages but also with lectures on what Christians could do to improve their communities.
The Social Gospel’s theological turn stressed the need both for individual redemption from sinful behavior and for the redemption of whole societies from damaged community relationships. Revivalists not only entreated listeners to reject personal habits like drinking, smoking, chewing tobacco, gambling, theater-going, and extramarital sex; they also encouraged listeners to replace the gathering space of the saloon with churches, schools, and public parks. Leaders usually saw themselves as redeeming the “social sin” that produced impoverished neighborhoods, low-wage jobs, preventable diseases, and chronic unemployment, and as offering alternatives that kept businesses intact. In the Social Creed of the Churches (1908), ministers across the denominations proposed industrial reforms limiting work hours and improving working conditions, as well as government regulations setting a living wage and providing protection for the injured, sick, and elderly. Sometimes, Social Gospel leaders defended collective bargaining and built alliances with labor leaders. At other times, they proposed palliative solutions that would instill Christian “brotherhood” on the shop floor and render unions unnecessary. This wavering on principles produced complicated and sometimes tense relationships among union leaders, workers, and Social Gospel leaders.
Elements of the Social Gospel movement have carried even into the 21st century, leading some historians to challenge the idea that the movement died with the close of the Great War. The American Civil Liberties Union and the Fellowship of Reconciliation, for example, lost no time in keeping alive the Social Gospel’s commitments to protecting the poor and defenseless. However, the rise of “premillennial dispensationalist” theology and the general disillusionment produced by the war’s massive casualties marked a major turning point in, if not an end to, the Social Gospel’s influence as a well-funded, Protestant evangelical force. The brutality of the war undermined American optimism—much of it fueled by Social Gospel thinking—about creating a more just, prosperous, and peaceful world. Meanwhile, Attorney General A. Mitchell Palmer’s campaign against alleged anarchists and Bolsheviks immediately after the war—America’s first “Red Scare”—targeted a large number of labor and religious organizations with the accusation that socialist ideas were undemocratic and un-American. By the 1920s, many Social Gospel leaders had distanced themselves from the organized working classes. They either accepted new arrangements for harmonizing the interests of labor and capital or took their left-leaning political ideals underground.
During the 1890s, the word segregation became the preferred term for the practice of coercing different groups of people, especially those designated by race, to live in separate and unequal urban residential neighborhoods. In the southern states, segregationists imported the word—originally used in the British colonies of Asia—to describe Jim Crow laws, and, in 1910, whites in Baltimore passed a “segregation ordinance” mandating separate black and white urban neighborhoods. Copycat legislation sprang up in cities across the South and the Midwest. But in 1917, a multiracial team of lawyers from the fledgling National Association for the Advancement of Colored People (NAACP) mounted a successful legal challenge to these ordinances in the U.S. Supreme Court—even as urban segregation laws were adopted elsewhere in the world, most notably in South Africa. The collapse of the movement for legislated racial segregation in the United States occurred just as African Americans began migrating in large numbers into cities in all regions of the country, prompting waves of anti-black mob violence. Segregationists were forced to rely on nonstatutory or formally nonracial techniques. In Chicago, an alliance of urban reformers and real estate professionals invented alternatives to explicitly racist segregation laws. The practices they promoted nationwide created one of the most successful forms of urban racial segregation in world history, rivaling and finally outliving South African apartheid. Understanding how this system came into being and how it persists today requires examining both how the Chicago segregationists were connected to counterparts elsewhere in the world and how they adapted practices of city-splitting to suit the peculiarities of racial politics in the United States.