David S. Tanenhaus
Juvenile justice is a technical term that refers to the specific area of law and affiliated institutions, most notably the juvenile court, with jurisdiction over the cases of minors who are accused of breaking the law. Although the idea that the law should treat minors differently from adults predates the American Revolution, juvenile justice itself is a Progressive Era invention. Its institutional legitimacy rests on the power and responsibility of the state to act as a parent (parens patriae) on behalf of those who cannot care for themselves. Since the establishment of the world’s first juvenile court in Chicago in 1899, this American idea of creating separate justice systems for juveniles has spread across the nation and much of the world. For more than a century, American states have used their juvenile justice systems to respond to youth crime and delinquency. Since the 1960s, the US Supreme Court has periodically considered whether juvenile courts must provide the same constitutional due process safeguards as adult criminal courts and whether juveniles prosecuted in the criminal justice system can receive the same sentences as adults, such as the death penalty or life without the possibility of parole.
Erik Gellman and Margaret Rung
From the late 1920s through the 1930s, countries on every inhabited continent suffered through a dramatic and wrenching economic contraction termed the Great Depression, an economic collapse that has come to represent the nadir of modern economic history. With national unemployment reaching well into double digits for over a decade, productivity levels falling by half, prices severely depressed, and millions of Americans without adequate food, shelter, or clothing, the United States experienced some of the Great Depression’s severest consequences. The crisis left deep physical, psychological, political, social, and cultural impressions on the national landscape. It encouraged political reform and reaction, renewed labor activism, spurred migration, unleashed grass-roots movements, inspired cultural experimentation, and challenged family structures and gender roles.
John D. Fairfield
The City Beautiful movement arose in the 1890s in response to the accumulating dirt and disorder in industrial cities, which threatened economic efficiency and social peace. City Beautiful advocates believed that better sanitation, improved circulation of traffic, monumental civic centers, parks, parkways, public spaces, civic art, and the reduction of outdoor advertising would make cities throughout the United States more profitable and harmonious. Engaging architects and planners, businessmen and professionals, and social reformers and journalists, the City Beautiful movement expressed a boosterish desire for landscape beauty and civic grandeur, but also raised aspirations for a more humane and functional city. “Mean streets make mean people,” wrote the movement’s publicist and leading theorist, Charles Mulford Robinson, encapsulating the belief in positive environmentalism that drove the movement. Combining the parks and boulevards of landscape architect Frederick Law Olmsted with the neoclassical architecture of Daniel H. Burnham’s White City at the World’s Columbian Exposition in Chicago in 1893, the City Beautiful movement also encouraged a view of the metropolis as a delicate organism that could be improved by bold, comprehensive planning. Two organizations, the American Park and Outdoor Art Association (founded in 1897) and the American League for Civic Improvements (founded in 1900), provided the movement with a national presence. But the movement also depended on the work of civic-minded women and men in nearly 2,500 municipal improvement associations scattered across the nation. Reaching its zenith in Burnham’s remaking of Washington, D.C., and his coauthored Plan of Chicago (1909), the movement slowly declined in favor of the “City Efficient” and a more technocratic city-planning profession.
Aside from a legacy of still-treasured urban spaces and structures, the City Beautiful movement contributed to a range of urban reforms, from civic education and municipal housekeeping to city planning and regionalism.
Christopher R. Reed
The unanticipated and massive migration of half a million African Americans between 1916 and 1918 from the racially oppressive South to the welcoming North surprised the nation. Directly resulting from the advent of the First World War, the movement of these able-bodied workers provided essential labor to maintain wartime production that sustained the Allied war effort. One-tenth of the people who surged north headed to and remained in Chicago, where their presence challenged the status quo in the areas of employment, external race relations, internal race arrangements, politics, housing, and recreation. Once in the Windy City, this migrant-influenced labor pool expanded with the addition of resident blacks to form the city’s first African American industrial proletariat. Wages for both men and women increased compared to what they had been earning in the South, and local businesses were ready and willing to accommodate these new consumers. A small black business sector became viable and was able to support two banks, and by the mid-1920s, there were multiple stores along Chicago’s State Street forming a virtual “Black Wall Street.” An extant political submachine within Republican Party ranks also increased its power and influence in repeated electoral contests. Importantly, upon scrutiny, the purported social conflict between the Old Settler element and the newcomers was shown to be overblown and inconsequential to black progress.
Revisionist scholarship over the past two decades has served to minimize the first phase of northward movement and has positioned it within the context of a half-century phenomenon under the labels of the “Second Great Migration” and the “Great Black Migration.” No matter what the designation, the voluntary movement of five to six million blacks from what had been their traditional home to the uncertainty of the North and West between the First World War and the Vietnam conflict stands as both a condemnation of regional oppression of the human spirit and aspirations of millions, and a demonstration of group courage in taking on new challenges in new settings. Although Chicago would prove to be “no crystal stair,” it was on many occasions a land of hope and promise for migrants throughout the past century.
Company towns can be defined as communities dominated by a single company, typically focused on one industry. Beyond that very basic definition, company towns varied in their essentials. Some were purpose-built by companies, often in remote areas convenient to needed natural resources. There, workers were often required to live in company-owned housing as a condition of employment. Others began as small towns with privately owned housing, usually expanding alongside a growing hometown corporation. Residences were shoddy in some company towns. In others, company-built housing may have been excellent, with indoor plumbing and central heating, and located close to such amenities as schools, libraries, perhaps even theaters.
Company towns played a key role in US economic and social development. Such places can be found across the globe, but America’s vast expanse of undeveloped land, generous stock of natural resources, tradition of social experimentation, and laissez-faire attitude toward business provided singular opportunities for the emergence of such towns, large and small, in many regions of the United States. Historians have identified as many as 2,500 such places.
A tour of company towns can serve as a survey of the country’s industrial development, from the first large-scale planned industrial community—the textile town of Lowell, Massachusetts—to Appalachian mining villages, Western lumber towns, and steelmaking principalities such as the mammoth development at Gary, Indiana. More recent office-park and high-tech industrial-park complexes probably do not qualify as company towns, although they have some similar attributes. Nor do such planned towns as Disney Corporation’s Celebration, Florida, qualify, despite close ties to a single corporation, because its residents do not necessarily work for Disney.
Company towns have generally tended toward one of two models. First, and perhaps most familiar, are total institutions—communities where one business exerts a Big Brother–ish grip over the population, controlling or even taking the place of government, collecting rent on company-owned housing, dictating buying habits (possibly at the company store), and even directing where people worship and how they may spend their leisure time. A second form consists of model towns—planned, ideal communities backed by companies that promised to share their bounty with workers and families. Several such places were carefully put together by experienced architects and urban planners. Such model company towns were marked by a paternalistic, watchful attitude toward the citizenry on the part of the company overlords.
J. Mark Souther
Prior to the railroad age, American cities generally lacked reputations as tourist travel destinations. As railroads created fast, reliable, and comfortable transportation in the 19th century, urban tourism emerged in many cities. Luxury hotels, tour companies, and guidebooks were facilitating and shaping tourists’ experience of cities by the turn of the 20th century. Many cities hosted regional or international expositions that served as significant tourist attractions from the 1870s to the 1910s. Thereafter, cities competed more keenly to attract conventions. Tourism promotion, once handled chiefly by railroad companies, became increasingly professionalized with the formation of convention and visitor bureaus. The rise of the automobile spurred the emergence of motels and theme parks on the suburban periphery, while renewed interest in historic urban core areas fueled historic preservation activism and adaptive reuse of old structures for dining, shopping, and entertainment. Although a few cities, especially Las Vegas, had relied heavily on tourism almost from their inception, by the last few decades of the 20th century few cities could afford to ignore tourism development. New waterfront parks, aquariums, stadiums, and other tourist and leisure attractions facilitated the symbolic transformation of cities from places of production to sites of consumption. Long aimed at a mass market, especially affluent and middle-class whites, tourism promotion embraced market segmentation in the closing years of the 20th century, and a number of attractions and tours appealed to African Americans or LGBTQ communities. If social commentators often complained that cities were developing “tourist bubbles” that concentrated the advantages of tourism in too-small areas and in too few hands, recent trends point to a greater willingness to disperse tourist activity more widely in cities.
By the 21st century, urban tourism was indispensable to many cities even as it continued to contribute to uneven development.
A. K. Sandoval-Strausz
“Latino urbanism” describes a culturally specific set of spatial forms and practices created by people of Hispanic origin. It includes many different aspects of those forms and practices, including town planning; domestic, religious, and civic architecture; the adaptation of existing residential, commercial, and other structures; and the everyday use of spaces such as yards, sidewalks, storefronts, streets, and parks.
Latino urbanism has developed over both time and space. It is the evolving product of half a millennium of colonization, settlement, international and domestic migration, and globalization. It has spanned a wide geographic range, beginning in the southern half of North America and gradually expanding to much of the hemisphere.
There have been many variations on Latino urbanism, but most include certain key features: shared central places where people show their sense of community, a walking culture that encourages face-to-face interaction with neighbors, and a sense that sociability should take place as much in the public realm as in the privacy of the home. More recently, planners and architects have realized that Latino urbanism offers solutions to problems such as sprawl, social isolation, and environmental unsustainability.
The term “urbanism” connotes city spaces, and Latino urbanism is most concentrated and most apparent at the center of metropolitan areas. At the same time, it has also been manifested in a wide variety of places and at different scales, from small religious altars in private homes; to Spanish-dominant commercial streetscapes in Latino neighborhoods; and ultimately to settlement patterns that reach from the densely packed centers of cities to the diversifying suburbs that surround them, out to the agricultural hinterlands at their far peripheries—and across borders to big cities and small pueblos elsewhere in the Americas.
Many Asian American neighborhoods faced displacement after World War II because of urban renewal or redevelopment under the 1949 Housing Act. In the name of blight removal and slum clearance, this act allowed local elites to procure federal money to seize land designated as blighted, clear it of its structures, and sell this land to private developers—in the process displacing thousands of residents, small businesses, and community institutions. San Francisco’s Fillmore District, a multiracial neighborhood that housed the city’s largest Japanese American and African American communities, experienced this postwar redevelopment. Like many Asian American neighborhoods that shared space with other communities of color, the Fillmore formed at the intersection of class inequality and racism, and it was this intersection of structural factors that led to substandard urban conditions. Rather than recognize the root causes of urban decline, San Francisco urban and regional elites argued that the Fillmore was among the city’s most blighted neighborhoods and advocated for the neighborhood’s destruction in the name of the public good. They also targeted the Fillmore because their postwar plans for remaking the city’s political economy envisioned the Fillmore as (1) a space to house white-collar workers in the postwar economy and (2) an Asian-themed space for tourism that connected the city symbolically and economically to Japan, an important U.S. postwar ally. For over four decades these elite-directed plans for the Fillmore displaced more than 20,000 residents in two phases, severely damaging the community. The Fillmore’s redevelopment, then, provides a window into other cases of redevelopment and aids further investigations of the connection between Asian Americans and urban crisis. It also sheds light on the deeper history of displacement in the Asian American experience and contextualizes contemporary gentrification in Asian American neighborhoods.
Becky Nicolaides and Andrew Wiese
Mass migration to suburban areas was a defining feature of American life after 1945. Before World War II, just 13% of Americans lived in suburbs. By 2010, however, suburbia was home to more than half of the U.S. population. The nation’s economy, politics, and society suburbanized in important ways. Suburbia shaped habits of car dependency and commuting, patterns of spending and saving, and experiences with issues as diverse as race and taxes, energy and nature, privacy and community. The owner-occupied, single-family home, surrounded by a yard, and set in a neighborhood outside the urban core came to define everyday experience for most American households, and in the world of popular culture and the imagination, suburbia was the setting for the American dream. The nation’s suburbs were an equally critical economic landscape, home to vital high-tech industries, retailing, “logistics,” and office employment. In addition, American politics rested on a suburban majority, and over several decades, suburbia incubated political movements across the partisan spectrum, from grass-roots conservatism to centrist meritocratic individualism, environmentalism, feminism, and social justice. In short, suburbia was a key setting for postwar American life.
Even as suburbia grew in magnitude and influence, it also grew more diverse, coming to reflect a much broader cross-section of America itself. This encompassing shift marked two key chronological stages in suburban history since 1945: the expansive, racialized, mass suburbanization of the postwar years (1945–1970) and an era of intensive social diversification and metropolitan complexity (since 1970). In the first period, suburbia witnessed the expansion of segregated white privilege, bolstered by government policies and exclusionary practices, and reinforced by grassroots political movements. By the second period, suburbia came to house a broader cross section of Americans, who brought with them a wide range of outlooks, lifeways, values, and politics. Suburbia became home to large numbers of immigrants, ethnic groups, African Americans, the poor, the elderly, and diverse family types. In the face of stubborn exclusionism by affluent suburbs, inequality persisted across metropolitan areas and manifested anew in proliferating poorer, distressed suburbs. Reform efforts sought to alleviate metro-wide inequality and promote sustainable development, using coordinated regional approaches. In recent years, the twin discourses of suburban crisis and suburban rejuvenation captured the continued complexity of America’s suburbs.