Humans have put American forests to a wide variety of uses from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many forests. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging. By the Civil War, many areas of the Northeast were logged out. After the Civil War, forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for ever-larger cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests. These gargantuan harvests led to the growth of the environmental movement. Beginning in the 1970s, environmentalists began to use legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act dealt the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.
Carolyn Podruchny and Stacy Nation-Knapper
From the 15th century to the present, the trade in animal fur has been an economic venture with far-reaching consequences for both North Americans and Europeans (including North Americans of European descent). One of the earliest forms of exchange between Europeans and North Americans, the trade in fur was about the garment business, global and local politics, social and cultural interaction, hunting, ecology, colonialism, gendered labor, kinship networks, and religion. European fashion, specifically the desire for hats that marked male status, was the primary driver of the global fur-trade economy until the late 19th century, while European desires for marten, fox, and other luxury furs to make and trim clothing formed a secondary part of the trade. Other animal hides, including deer and bison, provided sturdy leather from which belts for the machines of the early Industrial Era were cut. European cloth, especially cotton and wool, became central to the trade for Indigenous peoples, who sought materials that were lighter and dried faster than skin clothing. The trade involved many participants: the European men and Indigenous men and women who actually conducted the exchanges; Indigenous and European trappers; the European men and women who produced trade goods; Indigenous “middlemen” (men and women) who ran their own fur trades to benefit from European trade companies; laborers who hauled furs and trade goods; all those who built, managed, and sustained trading posts along waterways and trails across North America; and the Europeans who manufactured and purchased the products made of fur and the trade goods desired by Indigenous peoples. As early as the 17th century, European empires used fur-trade monopolies to establish colonies in North America, and later fur-trading companies brought imperial trading systems inland, while Indigenous peoples drew Europeans into their own patterns of trade and power.
By the 19th century, the fur trade covered most of the continent, and its networks of business, alliance, and kinship, along with the founding of new communities, gave rise to new peoples, including the Métis, who were descended from the mixing of European and Indigenous peoples. Trading territories, monopolies, and alliances with Indigenous peoples shaped how European concepts of statehood played out in the making of European-descended nation-states and in the development of treaties with Indigenous peoples. The fur trade flourished in northern climes well into the 20th century, after which economic development, resource exploitation, changes in fashion, and politics in North America and Europe limited its scope and scale. Many Indigenous people continue to hunt and trap animals today and have fought in the courts for Indigenous rights to resources, land, and sovereignty.
The issue of genocide and American Indian history has been contentious. Many writers see the massive depopulation of the indigenous population of the Americas after 1492 as a clear-cut case of genocide. Other writers, however, contend that European and U.S. actions toward Indians were deplorable but were rarely if ever genocidal. To a significant extent, disagreements about the pervasiveness of genocide in the history of the post-Columbian Western Hemisphere, in general, and U.S. history, in particular, pivot on definitions of genocide. Conservative definitions emphasize intentional actions and policies of governments that result in very large population losses, usually from direct killing. More liberal definitions call for less stringent criteria for intent, focusing more on outcomes. They do not necessarily require direct sanction by state authorities; rather, they identify societal forces and actors. They also allow for several intersecting forces of destruction, including dispossession and disease. Because debates about genocide easily devolve into quarrels about definitions, an open-ended approach to the question of genocide that explores several phases and events provides the possibility of moving beyond the present stalemate. However one resolves the question of genocide in American Indian history, it is important to recognize that European and U.S. settler colonial projects unleashed massively destructive forces on Native peoples and communities. These include violence resulting directly from settler expansion, intertribal violence (frequently aggravated by colonial intrusions), enslavement, disease, alcohol, loss of land and resources, forced removals, and assaults on tribal religion, culture, and language. The configuration and impact of these forces varied considerably in different times and places according to the goals of particular colonial projects and the capacities of colonial societies and institutions to pursue them.
The capacity of Native people and communities to directly resist, blunt, or evade colonial invasions proved equally important.
Gentrification is one of the most controversial issues in American cities today. But it also remains one of the least understood. Few agree on how to define it or whether it is a boon or a curse for cities. Gentrification has changed over time and has a history dating back to the early 20th century. Historically, gentrification has had a smaller demographic impact on American cities than suburbanization or immigration. But since the late 1970s, gentrification has dramatically reshaped cities like Seattle, San Francisco, and Boston. Furthermore, districts such as the French Quarter in New Orleans, New York City’s Greenwich Village, and Georgetown in Washington, DC have had an outsized influence on the political, cultural, and architectural history of cities. Gentrification thus must be examined alongside suburbanization as one of the major historical trends shaping the 20th-century American metropolis.
Philippe R. Girard
Haiti (known as Saint-Domingue until it gained its independence from France in 1804) had a noted economic and political impact on the United States during the era of the American Revolution, when it forced U.S. statesmen to confront issues they had generally avoided, most prominently racism and slavery. But the impact of the Haitian Revolution was most tangible in areas like commerce, territorial expansion, and diplomacy. Saint-Domingue served as a staging ground for the French military and navy during the American Revolution and provided troops for the siege of Savannah in 1779. It became the United States’ second-largest commercial partner during the 1780s and 1790s. After Saint-Domingue’s slaves revolted in 1791, many of its inhabitants found refuge in the United States, most notably in Philadelphia, Charleston, and New Orleans. Fears (or hopes) that the slave revolt would spread to the United States were prevalent in public opinion. As Saint-Domingue achieved quasi-autonomous status under the leadership of Toussaint Louverture, it occupied a central place in the diplomacy of John Adams and Thomas Jefferson. The Louisiana Purchase was made possible in part by the failure of a French expedition to Saint-Domingue in 1802–1803. Bilateral trade declined after Saint-Domingue won its independence as Haiti in 1804, but Haiti continued to loom large in the African-American imagination, and there were several attempts to use Haiti as a haven for U.S. freedmen. The U.S. diplomatic recognition of Haiti also served as a reference point for antebellum debates on slavery, the slave trade, and the status of free people of color in the United States.
Sarah B. Snyder
In its formulation of foreign policy, the United States takes account of many priorities and factors, including national security concerns, economic interests, and alliance relationships. An additional factor, whose significance has risen and fallen over time, is human rights, or more specifically the violation of human rights. The extent to which the United States should consider such abuses or seek to moderate them has been and continues to be the subject of considerable debate.
Sean P. Harvey
“Race,” as a concept denoting a fundamental division of humanity and usually encompassing cultural as well as physical traits, was crucial in early America. It provided the foundation for the colonization of Native land, the enslavement of American Indians and Africans, and a common identity among socially unequal and ethnically diverse Europeans. Longstanding ideas and prejudices merged with aims to control land and labor, a dynamic reinforced by ongoing observation and theorization of non-European peoples. Although before colonization, neither American Indians, nor Africans, nor Europeans considered themselves unified “races,” Europeans endowed racial distinctions with legal force and philosophical and scientific legitimacy, while Natives appropriated categories of “red” and “Indian,” and slaves and freed people embraced those of “African” and “colored,” to imagine more expansive identities and mobilize more successful resistance to Euro-American societies. The origin, scope, and significance of “racial” difference were questions of considerable transatlantic debate in the age of Enlightenment and they acquired particular political importance in the newly independent United States.
Since the beginning of European exploration in the 15th century, voyagers called attention to the peoples they encountered, but European, American Indian, and African “races” did not exist before colonization of the so-called New World. Categories of “Christian” and “heathen” were initially most prominent, though observations also encompassed appearance, gender roles, strength, material culture, subsistence, and language. As economic interests deepened and colonies grew more powerful, classifications distinguished Europeans from “Negroes” or “Indians,” but at no point in the history of early America was there a consensus that “race” denoted bodily traits only. Rather, it was a heterogeneous compound of physical, intellectual, and moral characteristics passed on from one generation to another. While Europeans assigned blackness and African descent priority in codifying slavery, skin color was secondary to broad dismissals of the value of “savage” societies, beliefs, and behaviors in providing a legal foundation for dispossession.
“Race” originally denoted a lineage, such as a noble family or a domesticated breed, and concerns over purity of blood persisted as 18th-century Europeans applied the term—which dodged the controversial issue of whether different human groups constituted “varieties” or “species”—to describe a roughly continental distribution of peoples. Drawing upon the frameworks of scripture, natural and moral philosophy, and natural history, scholars endlessly debated whether different races shared a common ancestry, whether traits were fixed or susceptible to environmentally produced change, and whether languages or the body provided the best means to trace descent. Racial theorization boomed in the U.S. early republic, as some citizens found dispossession and slavery incompatible with natural-rights ideals, while others reconciled any potential contradictions through assurances that “race” was rooted in nature.
Post-1945 immigration to the United States differed dramatically from America’s 19th- and early 20th-century immigration patterns, most notably in the sharp rise in numbers of immigrants from Asia. Beginning in the late 19th century, the U.S. government took steps to bar immigration from Asia. The establishment of the national origins quota system in the 1924 Immigration Act narrowed the entryway for eastern and central Europeans, making western Europe the dominant source of immigrants. These policies shaped the racial and ethnic profile of the American population before 1945. Signs of change began to occur during and after World War II. The recruitment of temporary agricultural workers from Mexico led to an influx of Mexicans, and the repeal of Asian exclusion laws opened the door for Asian immigrants. Responding to complex international politics during the Cold War, the United States also formulated a series of refugee policies, admitting refugees from Europe, the Western Hemisphere, and later Southeast Asia. The movement of people to the United States increased drastically after 1965, when immigration reform ended the national origins quota system. The intricate and intriguing history of U.S. immigration after 1945 thus demonstrates how the United States related to a fast-changing world, its less restrictive immigration policies increasing the fluidity of the American population, with a substantial impact on American identity and domestic policy.
John P. Bowes
Indian removals as a topic primarily encompasses the relocation of Native American tribes from American-claimed states and territories east of the Mississippi River to lands west of the Mississippi River in the first half of the 19th century. The bill passed by Congress in May 1830 referred to as the Indian Removal Act is the legislative expression of the ideology upon which federal and state governments acted to accomplish the dispossession and relocation of tens of thousands of Native American peoples during that time. Through both treaty negotiations and coercion, federal officials used the authority of removal policies to obtain land cessions and resettle eastern Indians in present-day Kansas and Oklahoma. These actions, in conjunction with non-Indian population growth and western migration, made it extremely difficult, if not impossible, for any tribes to remain on their eastern lands. The Cherokee Trail of Tears, which entailed the forced removal of approximately fourteen thousand men, women, and children from Georgia between the summer of 1838 and the spring of 1839, remains the best-known illustration of this policy and its impact. Yet the comprehensive histories of removals encompass the forced relocations of tens of thousands of indigenous men, women, and children from throughout the Southeast as well as the Old Northwest from the 1810s into the 1850s.
The history of American slavery began long before the first Africans arrived at Jamestown in 1619. Evidence from archaeology and oral tradition indicates that for hundreds, perhaps thousands, of years prior, Native Americans had developed their own forms of bondage. This fact should not be surprising, for most societies throughout history have practiced slavery. In her cross-cultural and historical research on comparative captivity, Catherine Cameron found that bondspeople composed 10 percent to 70 percent of the population of most societies, lending credence to Seymour Drescher’s assertion that “freedom, not slavery, was the peculiar institution.” If slavery is ubiquitous, however, it is also highly variable. Indigenous American slavery, rooted in warfare and diplomacy, was flexible, often offering its victims escape through adoption or intermarriage, and it was divorced from racial ideology, deeming all foreigners—men, women, and children, of whatever color or nation—potential slaves. Thus, Europeans did not introduce slavery to North America. Rather, colonialism brought distinct and evolving notions of bondage into contact with one another. At times, these slaveries clashed, but they also reinforced and influenced one another. Colonists, who had a voracious demand for labor and export commodities, exploited indigenous networks of captive exchange, producing a massive global commerce in Indian slaves. This began with the second voyage of Christopher Columbus in 1495 and extended in some parts of the Americas through the 20th century. During this period, between 2 and 4 million Indians were enslaved. Elsewhere in the Americas, Indigenous people adapted Euro-American forms of bondage. In the Southeast, an elite class of Indians began to hold African Americans in transgenerational slavery and, by 1800, developed plantations that rivaled those of their white neighbors.
The story of Native Americans and slavery is complicated: millions were victims, some were masters, and the nature of slavery changed over time and varied from one place to another. A significant and long overlooked aspect of American history, Indian slavery shaped colonialism, exacerbated Native population losses, figured prominently in warfare and politics, and influenced Native and colonial ideas about race and identity.