Don H. Doyle
America’s Civil War became part of a much larger international crisis as European powers, happy to see the experiment in self-government fail in America’s “Great Republic,” took advantage of the situation to reclaim former colonies in the Caribbean and establish a European monarchy in Mexico. Overseas, in addition to their formal diplomatic appeals to European governments, both sides also experimented with public diplomacy campaigns to influence public opinion. Confederate foreign policy sought to win recognition and aid from Europe by offering free trade in cotton and aligning its cause with that of the aristocratic, anti-democratic governing classes of Europe. The Union, instead, appealed to liberal, republican sentiment abroad by depicting the war as a trial of democratic government and embracing emancipation of the slaves. The Union victory led to the withdrawal of European empires from the New World: Spain from Santo Domingo, France from Mexico, Russia from Alaska, and Britain from Canada. The destruction of slavery in the United States, meanwhile, hastened its end in Puerto Rico, Cuba, and Brazil.
Blake C. Scott
Tourism is so deep-seated in the history of U.S. foreign relations that we seem to have taken its presence for granted. Millions of American tourists have traveled abroad, yet the scholarly monographs analyzing the relationship between U.S. foreign relations and tourism can be counted on two hands. What explains this lack of historical reflection on one of the most quotidian forms of U.S. influence abroad?
In an influential essay about wilderness and the American frontier, the environmental historian William Cronon argues, “one of the most striking proofs of the cultural invention of wilderness is its thoroughgoing erasure of the history from which it sprang.” Historians and the American public, perhaps in modern fashion, have overlooked tourism’s role in the nation’s international affairs. Only a culture and a people so intimately familiar with tourism’s practices could naturalize them out of history.
The history of international tourism is profoundly entangled with the history of U.S. foreign policy. This entanglement has involved, among other things, science and technology, military intervention, diplomacy, and the promotion of consumer spending abroad. U.S. expansion created the structure (the social stability, medical safety, and transportation infrastructure) for globetrotting travel in the 20th century. As this essay shows, U.S. foreign policy was crucial in transforming foreign travel into a middle-class consumer experience.
David M. Robinson
New England transcendentalism was the first significant literary movement in American history, notable principally for the influential works of Ralph Waldo Emerson, Margaret Fuller, and Henry David Thoreau. The movement emerged in the 1830s as a religious challenge to New England Unitarianism. Building on the writings of the Unitarian leader William Ellery Channing, Emerson and others such as Frederic Henry Hedge, George Ripley, James Freeman Clarke, and Theodore Parker developed a theology based on interior, intuitive experience rather than the historical truth of the Bible. In 1836, transcendentalist books by several important religious thinkers began to appear, including Emerson’s Nature, which employed idealist philosophy and Romantic symbolism to examine human interaction with the natural world. Emerson’s Harvard addresses, “The American Scholar” (1837) and the controversial “Divinity School Address” (1838), gave transcendental ideas a wider prominence, and also generated strong resistance that added an element of experiment and danger to the movement’s reputation. In 1840 the transcendentalists founded a journal for their work, the Dial, and Fuller became its first editor, a position that gave her an important role in the movement and a crucial outlet for her own work in literary criticism and women’s rights.
Though it had begun as a religious movement, by the middle 1840s transcendentalism could be better described as a literary movement with growing political engagements on several fronts. Emerson proclaimed it as an era of reform and aligned the transcendentalists with those who resisted the social and political status quo. In her feminist manifesto Woman in the Nineteenth Century (1845), Fuller called for the removal of both legal and social barriers to women’s full potential. In 1845 Henry David Thoreau went to live in the woods by Walden Pond; his memoir of his experience, Walden (1854), became a founding text of modern environmental thinking. Antislavery also became a key concern for many of the transcendentalists, who condemned the Fugitive Slave Act of 1850 and actively resisted the execution of the law after its passage. The transcendentalists, a nineteenth-century cultural avant-garde, continue to exert cultural influence through the durability of their writings, works that shaped many aspects of American national development.
Although the League of Nations was the first permanent organization established with the purpose of maintaining international peace, it built on the work of a series of 19th-century intergovernmental institutions. The destructiveness of World War I led American and British statesmen to champion a league as a means of maintaining postwar global order. In the United States, Woodrow Wilson followed his predecessors, Theodore Roosevelt and William Howard Taft, in advocating American membership of an international peace league, although Wilson’s vision for reforming global affairs was more radical. In Britain, public opinion had begun to coalesce in favor of a league from the outset of the war, though David Lloyd George and many of his Cabinet colleagues were initially skeptical of its benefits. However, Lloyd George was determined to establish an alliance with the United States and warmed to the league idea when Jan Christian Smuts presented a blueprint for an organization that served that end.
The creation of the League was a predominantly British and American affair. Yet Wilson was unable to convince Americans to commit themselves to membership in the new organization. The Franco-British-dominated League enjoyed some early successes. Its high point was reached when Europe was infused with the “Spirit of Locarno” in the mid-1920s and the United States played an economically crucial, if politically constrained, role in advancing Continental peace. This tenuous basis for international order collapsed as a result of the economic chaos of the early 1930s, as the League proved incapable of containing the ambitions of revisionist powers in Europe and Asia. Despite its ultimate limitations as a peacekeeping body, recent scholarship has emphasized the League’s relative successes in stabilizing new states, safeguarding minorities, managing the evolution of colonies into notionally sovereign states, and policing transnational trafficking; in doing so, it paved the way for the creation of the United Nations.
For almost a century and a half, successive American governments adopted a general policy of neutrality on the world stage, eschewing involvement in European conflicts and, after the Quasi War with France, alliances with European powers. Neutrality, enshrined as a core principle of American foreign relations by the outgoing President George Washington in 1796, remained so for more than a century.
Finally, in the 20th century, the United States emerged as a world power and a belligerent in the two world wars and the Cold War. This article explores the modern conflict between traditional American attitudes toward neutrality and the global agenda embraced by successive U.S. governments, beginning with entry into the First World War. With the United States immersed in these titanic struggles, the traditional U.S. support for neutrality eroded considerably. During the First World War, the United States showed some sympathy for the predicaments of the remaining neutral powers. In the Second World War it applied considerable pressure to those states still trading with Germany. During the Cold War, the United States was sometimes impatient with states’ choices to remain uncommitted in the global struggle, while at other times it showed understanding for neutrality and pursued constructive relations with neutral states. The wide varieties of neutrality in each of these conflicts complicated the choices of U.S. policy makers. Americans remained torn between memory of their own long history of neutrality and a capacity to understand its potential value, on one hand, and a predilection to approach conflicts as moral struggles, on the other.
C. J. Alvarez
The region that today constitutes the United States–Mexico borderland has evolved through various systems of occupation over thousands of years. From time immemorial, the land was used and inhabited by ancient peoples whose cultures we can only understand through the archeological record and the beliefs of their living descendants. Spain, and later Mexico and the United States, attempted to control the borderlands but failed when confronted with indigenous power, at least until the late 19th century, when American capital and police established firm dominance. Since then, borderland residents have often fiercely contested this supremacy at the local level, but the borderland has also, due to the primacy of business, expressed deep harmonies and cooperation between the U.S. and Mexican federal governments. It is a majority-minority zone in the United States, populated largely by Mexican Americans. The border is both a porous membrane across which tremendous wealth passes and a territory of interdiction in which noncitizens and smugglers are subject to unusually concentrated police attention. All of this exists within a particularly harsh ecosystem characterized by extreme heat and scarce water.
Michael R. Anderson
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
Although the term “Asia-Pacific” was not coined until World War II and the geographic parameters are admittedly imprecise, the regional designation nevertheless has gained popularity in recent decades among policymakers, businesspeople, and non-governmental organizations. Asia-Pacific refers to the regions bordering the western Pacific Ocean: East Asia, Southeast Asia, and Oceania. It excludes some countries that are considered part of the larger Pacific Rim: Russia, Canada, Mexico, and the western nations of Central and South America. American interest in the Asia-Pacific over the past two centuries has been marked by strong and often contradictory impulses. On the one hand, the western Pacific has served as a fertile ground for Christian missionaries, an alluring destination for American commercial enterprises, and a critical launch pad for U.S. global power projection. On the other hand, leading countries in the Asia-Pacific region frequently have challenged U.S. economic and military interests, and the assertion of “Asian values” in recent years has undermined efforts to expand Western political and cultural norms. The United States’ professed “pivot to Asia” has set the stage for the latest chapter in a centuries-long relationship, one that, more than any other, will determine the geopolitical fault lines of the 21st century.
Risa L. Goluboff
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History.
The crime of vagrancy has deep historical roots in U.S. law and legal culture. Originating in 16th-century England, vagrancy laws came to the New World with the colonists and soon proliferated throughout the United States. Although they took myriad forms, vaguely worded statutes targeting objectionable, “out-of-place” people, rather than any particular conduct, soon became a ubiquitous tool for maintaining hierarchy and order in American society. The laws and their application changed alongside perceived threats to the social fabric—at different times targeting labor activists, radical orators, cultural and sexual nonconformists, racial and religious minorities, civil rights protesters, and the poor. By the mid-20th century, vagrancy laws served as the basis for hundreds of thousands of arrests every year. But over the course of just two decades, the crime of vagrancy, virtually unquestioned for four hundred years, unraveled. Profound social upheaval in the 1960s produced a concerted effort against the vagrancy regime, and in 1972, the United States Supreme Court invalidated the laws. Local authorities have spent the years since looking for alternatives to the many functions vagrancy laws once served.
Christopher P. Loss
Until World War II, American universities were widely regarded as good but not great centers of research and learning. This changed completely under the press of wartime, when the federal government pumped billions into military research, anchored by the development of the atomic bomb and radar, and into the education of returning veterans under the GI Bill of 1944. The abandonment of decentralized federal–academic relations marked the single most important development in the history of the modern American university. While it is true that the government had helped to coordinate and fund the university system prior to the war—most notably the country’s network of public land-grant colleges and universities—government involvement after the war became much more hands-on, eventually leading to direct financial support to, and legislative interventions on behalf of, core institutional activities, not only at the public land grants but at the nation’s mix of private institutions as well. The reliance on public subsidies and on legislative and judicial interventions proved a double-edged sword, however. State action made possible the expansion in research and in student access that became the hallmarks of the post-1945 American university, but it also created a rising tide of expectations for continued support that has proven challenging in fiscally stringent times and in the face of ongoing political fights over the government’s proper role in supporting the sector.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe themselves, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort.
Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets but more often, Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy.
Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.