John P. Bowes
Indian removals as a topic primarily encompasses the relocation of Native American tribes from American-claimed states and territories east of the Mississippi River to lands west of the Mississippi River in the first half of the 19th century. The bill passed by Congress in May 1830, referred to as the Indian Removal Act, is the legislative expression of the ideology upon which federal and state governments acted to accomplish the dispossession and relocation of tens of thousands of Native American peoples during that time. Through both treaty negotiations and coercion, federal officials used the authority of removal policies to obtain land cessions and resettle eastern Indians in what is known in the early 21st century as Kansas and Oklahoma. These actions, in conjunction with non-Indian population growth and western migration, made it extremely difficult, if not impossible, for any tribes to remain on their eastern lands. The Cherokee Trail of Tears, which entailed the forced removal of approximately fourteen thousand men, women, and children from Georgia between the summer of 1838 and the spring of 1839, remains the most well-known illustration of this policy and its impact. Yet the comprehensive histories of removals encompass the forced relocations of tens of thousands of indigenous men, women, and children from throughout the Southeast as well as the Old Northwest from the 1810s into the 1850s.
The history of American slavery began long before the first Africans arrived at Jamestown in 1619. Evidence from archaeology and oral tradition indicates that for hundreds, perhaps thousands, of years prior, Native Americans had developed their own forms of bondage. This fact should not be surprising, for most societies throughout history have practiced slavery. In her cross-cultural and historical research on comparative captivity, Catherine Cameron found that bondspeople composed 10 percent to 70 percent of the population of most societies, lending credence to Seymour Drescher’s assertion that “freedom, not slavery, was the peculiar institution.” If slavery is ubiquitous, however, it is also highly variable. Indigenous American slavery, rooted in warfare and diplomacy, was flexible, often offering its victims escape through adoption or intermarriage, and it was divorced from racial ideology, deeming all foreigners—men, women, and children, of whatever color or nation—potential slaves. Thus, Europeans did not introduce slavery to North America. Rather, colonialism brought distinct and evolving notions of bondage into contact with one another. At times, these slaveries clashed, but they also reinforced and influenced one another. Colonists, who had a voracious demand for labor and export commodities, exploited indigenous networks of captive exchange, producing a massive global commerce in Indian slaves. This began with the second voyage of Christopher Columbus in 1495 and extended in some parts of the Americas through the twentieth century. During this period, between 2 and 4 million Indians were enslaved. Elsewhere in the Americas, Indigenous people adapted Euro-American forms of bondage. In the Southeast, an elite class of Indians began to hold African Americans in transgenerational slavery and, by 1800, developed plantations that rivaled those of their white neighbors. The story of Native Americans and slavery is complicated: millions were victims, some were masters, and the nature of slavery changed over time and varied from one place to another. A significant and long overlooked aspect of American history, Indian slavery shaped colonialism, exacerbated Native population losses, figured prominently in warfare and politics, and influenced Native and colonial ideas about race and identity.
Mass transit has been part of the urban scene in the United States since the early 19th century. Regular steam ferry service began in New York City in the early 1810s, and horse-drawn omnibuses plied city streets starting in the late 1820s. Expanding networks of horse railways emerged by the mid-19th century. The electric streetcar became the dominant mass transit vehicle a half century later. During this era, mass transit had a significant impact on American urban development. Mass transit’s importance in the lives of most Americans started to decline with the growth of automobile ownership in the 1920s, except for a temporary rise in transit ridership during World War II. In the 1960s, congressional subsidies began to reinvigorate mass transit, and heavy rail systems opened in several cities, followed by light rail systems in several others over the following decades. Today, concerns about environmental sustainability and urban revitalization have stimulated renewed interest in the benefits of mass transit.
By serving travelers and commerce, roads and streets unite people and foster economic growth. But as they develop, roads and streets also disrupt old patterns, upset balances of power, and isolate some as they serve others. The consequent disagreements leave historical records documenting social struggles that might otherwise be overlooked. For long-distance travel in America before the middle of the 20th century, roads were generally poor alternatives, resorted to when superior means of travel, such as river and coastal vessels, canal boats, or railroads, were unavailable. Most roads were unpaved, unmarked, and vulnerable to the effects of weather. Before the railroads, the rare turnpikes and plank roads could be much better for travelers willing to pay the toll. Even in towns, unpaved streets were common until the late 19th century, and they persisted into the 20th. In the late 19th century, rapid urban growth, rural free delivery of the mails, and finally the proliferation of electric railways and bicycling contributed to growing pressure for better roads and streets. After 1910, the spread of the automobile accelerated the trend, but only with great controversy, especially in cities. Partly in response to the controversy, advocates of the automobile organized to promote state and county motor highways funded substantially by gasoline taxes; such roads were intended primarily for motor vehicles. In the 1950s, massive federal funds accelerated the trend; by then, motor vehicles were the primary transportation mode for both long and short distances. The consequences have been controversial, and alternatives have been attracting growing interest.
Joel A. Tarr
Urban water supply and sewage disposal facilities are critical parts of the urban infrastructure. They have enabled cities and their metropolitan areas to function as centers of commerce, industry, entertainment, and human habitation. The evolution of water supply and sewage disposal systems in American cities from 1800 to 2015 is examined, with a focus on major turning points, especially in regard to technological decisions, public policy, and environmental and public health issues.
Thomas A. Reinstein
The United States has a rich history of intelligence in the conduct of foreign relations. Since the Revolutionary War, intelligence has been most relevant to U.S. foreign policy in two ways. Intelligence analysis helps to inform policy. Intelligence agencies also have carried out covert action—secret operations—to influence political, military, or economic conditions in foreign states. The American intelligence community has developed over a long period, and major changes to that community have often occurred because of contingent events rather than long-range planning. Throughout their history, American intelligence agencies have used intelligence gained from both human and technological sources to great effect. Often, U.S. intelligence agencies have been forced to rely on technological means of intelligence gathering for lack of human sources. Recent advances in cyberwarfare have made technology even more important to the American intelligence community.
At the same time, the relationship between intelligence and national-security–related policymaking has often been dysfunctional. Indeed, though some American policymakers have used intelligence avidly, many others have used it haphazardly or not at all. Bureaucratic fights also have crippled the American intelligence community. Several high-profile intelligence failures tend to dominate the recent history of intelligence and U.S. foreign relations. Some of these failures were due to lack of intelligence or poor analytic tradecraft. Others came because policymakers failed to use the intelligence they had. In some cases, policymakers have also pressured intelligence officers to change their findings to better suit those policymakers’ goals. And presidents have often preferred to use covert action to carry out their policies without paying attention to intelligence analysis. The result has been constant debate about the appropriate role of intelligence in U.S. foreign relations.
Between the 1790s and the 1990s, the Irish American population grew from some 500,000 to nearly 40 million. Part of this growth was due to immigration, especially in the years of the Great Irish Famine, though significant emigration from Ireland both preceded and followed the famine decade of 1846–1855. For much of this 200-year period, Irish-born men and women and their descendants were heavily concentrated in working-class occupations and urban communities. Especially in the years around the opening of the 20th century, Irish Catholic immigrants and their descendants put a distinctive stamp on both the American labor movement and urban working-class culture and politics as a whole. Their outsized influence diminished somewhat over the course of the 20th century, but the American Irish continued to occupy key leadership positions in the U.S. labor movement, the Democratic Party, and the American Catholic Church, even as the working-class members or constituents of these institutions became increasingly ethnically diverse. The experience of Irish American working people thus constitutes an important dimension of a larger story—that of the American working class as a whole.
Justus D. Doenecke
For the United States, isolationism is best defined as avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the unilateral freedom to act politically and commercially, unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate. The United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans saw isolationism as a foreign policy option, not simply something taken for granted. A fundamental foreign policy tenet now became the position of a faction, a group of people branded as “isolationists.” Isolationism’s high point came during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements.
Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy.
If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican Party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed various military involvements—including the ones in Vietnam, Iraq, and Afghanistan—but few envision returning to an era when the United States avoids all commitments.
Racism and xenophobia, but also resilience and community building, characterize the return of thousands of Japanese Americans, or Nikkei, to the West Coast after World War II. Although the specific histories of different regions shaped the resettlement experiences for Japanese Americans, Los Angeles provides an instructive case study. For generations, the City of Angels has been home to one of the nation’s largest and most diverse Nikkei communities, and the ways in which Japanese Americans rebuilt their lives and institutions resonate with the resettlement experience elsewhere.
Before World War II, greater Los Angeles was home to a vibrant Japanese American population. First-generation immigrants, or Issei, and their American-born children, the Nisei, forged dynamic social, economic, cultural, and spiritual institutions in the face of various racial exclusions. World War II uprooted the community as Japanese Americans left behind their farms, businesses, and homes. In the best instances, they were able to entrust their property to neighbors or other sympathetic individuals. More often, the uncertainty of their future led Japanese Americans to sell off their property at prices far below market value. Upon the war’s end, thousands of Japanese Americans returned to Los Angeles, often to financial ruin.
Upon their arrival in the Los Angeles area, Japanese Americans continued to face deep-seated prejudice, all the more accentuated by an overall dearth of housing. Without a place to live, they sought refuge in communal hostels set up in pre-war institutions that had survived the war, such as a variety of Christian and Buddhist churches. Meanwhile, others found housing in temporary trailer camps set up by the War Relocation Authority (WRA), and later administered by the Federal Public Housing Authority (FPHA), in areas such as Burbank, Sun Valley, Hawthorne, Santa Monica, and Long Beach. Although some local religious groups and others welcomed the returnees, white homeowners, who viewed the settlement of Japanese Americans as a threat to their property values, often mobilized to protest the construction of these camps. The last of these camps closed in 1956, demonstrating the hardship some Japanese Americans still faced in reintegrating into society. Even when the returnees were able to leave the camps, they still faced racially restrictive housing covenants and, when those practices were ruled unconstitutional, exclusionary lending. Although new suburban enclaves of Japanese Americans eventually developed in areas such as Gardena, West Los Angeles, and Pacoima by the 1960s, the pathway to those destinations was far from easy. Ultimately, the resettlement of Japanese Americans in Los Angeles after their mass incarceration during World War II took place within the intertwined contexts of lingering anti-Japanese racism, Cold War politics, and the suburbanization of Southern California.
In January 1938, Benny Goodman took command of Carnegie Hall on a blustery New York City evening and for two hours his band tore through the history of jazz in a performance that came to define the entire Swing Era. Goodman played Carnegie Hall at the top of his jazz game, leading his crack band—including Gene Krupa on drums and Harry James on trumpet—through new, original arrangements by Fletcher Henderson. Compounding the historic nature of the highly publicized jazz concert, Goodman welcomed onto the stage members of Duke Ellington’s band to join in on what would be the first major jazz performance by an integrated band. With its spirit of inclusion as well as its emphasis on the historical contours of the first decades of jazz, Goodman’s Carnegie Hall concert represented the apex of jazz music’s acceptance as the most popular form of American musical expression. In addition, Goodman’s concert coincided with the resurgence of the record industry, hit hard by the Great Depression. By the late 1930s, millions of Americans purchased swing records and tuned into jazz radio programs, including Goodman’s own show, which averaged two million listeners during that period.
And yet, only forty years separated this major popular triumph from the very origins of jazz music. Between 1900 and 1945, American musical culture changed dramatically; new sounds via new technologies came to define the national experience. At the same time, there were massive demographic shifts as black southerners moved to the Midwest and North, and urban culture eclipsed rural life as the norm. America in 1900 was mainly a rural and disconnected nation, defined by regional identities where cultural forms were transmitted through live performances. By the end of World War II, however, a definable national musical culture had emerged, as radio came to link Americans across time and space. Regional cultures blurred as a national culture emerged via radio transmissions, motion picture releases, and phonograph records. The turbulent decade of the 1920s sat at the center of this musical and cultural transformation as American life underwent dramatic changes in the first decades of the 20th century.