Emerson W. Baker
The Salem Witch Trials are one of the best known, most studied, and most important events in early American history. The afflictions started in Salem Village (present-day Danvers), Massachusetts, in January 1692, and by the end of the year the outbreak had spread throughout Essex County and threatened to bring down the newly formed Massachusetts Bay government of Sir William Phips. It may even have helped trigger a witchcraft crisis in Connecticut that same year. The trials are known for their heavy reliance on spectral evidence and for the numerous confessions that helped the accusations grow. A total of 172 people are known to have been formally charged or informally cried out upon for witchcraft in 1692. The victims of witchcraft accusations were usually poor and marginalized members of society, but in 1692 many of the leading members of the colony were accused. George Burroughs, a former minister of Salem Village, was one of the nineteen people convicted and executed. In addition to these victims, one man, Giles Corey, was pressed to death, and five died in prison. The last executions took place in September 1692, but it was not until May 1693 that the last trial was held and the last of the accused was freed from prison.
The trials would have lasting repercussions in Massachusetts and signaled the beginning of the end of the Puritan City upon a Hill, an image of American exceptionalism still regularly invoked. The publication ban issued by Governor Phips to prevent criticism of the government would last three years, but ultimately this effort only ensured that the failure of the government to protect innocent lives would never be forgotten. Pardons and reparations for some of the victims and their families were granted by the government in the early 18th century, and the legislature would regularly take up petitions and discuss further reparations until 1749, more than fifty years after the trials. The last victims were formally pardoned by the governor and legislature of Massachusetts in 2001.
Steven K. Green
Separation of church and state has long been viewed as a cornerstone of American democracy. At the same time, the concept has remained highly controversial in popular culture and law. Much of the debate over the application and meaning of the phrase focuses on its historical antecedents. This article briefly examines the historical origins of the concept and its subsequent evolution in the nineteenth century.
Both sexuality and religion are terms as vexatious to define as they can be alluring to pursue. In the contemporary period, figuring out one’s sexual feelings, identity, and preferences has become a signal aspect of self-formation. Understanding one’s religious feelings, identity, and preferences may seem less pressing, but it is certainly no less complicated. Both terms cause no small amount of confusion. Clearing up some of this confusion requires speaking frankly about delicate matters, and also speaking flatly about enormously complex experiences. Popular media coverage of ecclesiastical sex scandals in America suggests that people enjoy hearing about the profanation of religious duty. Despite the observed, inferred, and alleged sexuality in American religious history, or maybe because of it, eroticism suffuses narrative accounts of American religious history and descriptions of religious actors. In U.S. history, sexuality has often been a key lens through which we have understood the nature of religion, the leaders of religions, and the reasons for religious commitment.
Peter C. Baldwin
Today the term nightlife typically refers to social activities in urban commercial spaces—particularly drinking, dancing, dining, and listening to live musical performances. This was not always so. Cities in the 18th and early 19th centuries knew relatively limited nightlife, most of it occurring in drinking places for men. Theater attracted mixed-gender audiences but was sometimes seen as disreputable in both its content and the character of the audience. Theater owners worked to shed this negative reputation starting in the mid-19th century, while nightlife continued to be tainted by the profusion of saloons, brothels, and gambling halls. Gradual improvements in street lighting and police protection encouraged people to go out at night, as did growing incomes and decreasing hours of labor. Nightlife attracted more women in the decades around 1900 as it expanded and diversified. Dance halls, vaudeville houses, movie theaters, restaurants, and cabarets thrived in the electrified “bright lights” districts of central cities. Commercial entertainment contracted again in the 1950s and 1960s as Americans spent more of their evening leisure hours watching television and began to regard urban public spaces with suspicion. Still, nightlife is viewed as an important component of urban economic life and is actively promoted by many municipal governments.
Gary R. Edgerton
Television is an ever-evolving and multi-dimensional medium, at once a technology, an industry, an art form, and an institutional force. In the United States, it emerged as an idea whose time had come at the end of World War II. TV eventually grew and matured into the most influential social and cultural catalyst shaping and reflecting American civilization during the second half of the 20th century. Television revolutionized the way citizens and consumers in the United States learned about and communicated with the world; it also recast and re-envisioned the way they experienced themselves and others. More than just escapist entertainment, TV reveals the dynamism and diversity of everyday life in the United States and the evolving nature of the nation’s core values. Television is, moreover, in a continual state of change and renewal. Its history has developed through a prehistory (before 1948), a network era (1948–1975), a cable era (1976–1994), and the current digital era (1995–present). Today there are more than 650 networks in the U.S. marketplace; the typical domestic household receives 189 channels and watches more than eight hours of TV a day on average. TV in the 21st century also travels anywhere at any time, given its synergistic relationship with the Internet and a wide array of digital devices. It is now increasingly personalized, interactive, mobile, and on demand. Television is presently a convergent technology, a global industry, a viable art form, a public catalyst, and a complex and dynamic reflection of American society and culture.
H. Paul Thompson Jr.
The temperance and prohibition movement—a social reform movement that pursued many approaches to limit or prohibit the use and/or sale of alcoholic beverages—is arguably the longest-running reform movement in US history, extending from the 1780s through the repeal of national prohibition in 1933. During this 150-year period the movement experienced many ideological, organizational, and methodological changes. Probably the most widely embraced antebellum reform, the movement was explicitly evangelical in many of its earliest assumptions and much of its earliest literature, but over time it assumed an increasingly secular image while retaining strong ties to organized religion. During the movement’s first fifty years, its definition of temperance evolved successively from avoiding drunkenness, to abstaining from all distilled beverages, to abstaining from all intoxicating beverages (i.e., “teetotalism”). During these years, reformers sought merely to persuade others of their views—what was called “moral suasion.” But by the 1840s many reformers began seeking the coercive power of local and state governments to prohibit the “liquor traffic.” These efforts were called “legal suasion,” and in the early 20th century, when local and state laws were deemed insufficient, movement leaders turned to the federal government. Throughout its history, movement leaders produced an extensive and well-preserved serial and monographic literature to chronicle their efforts, which makes the movement relatively easy to study.
No fewer than five national temperance organizations rose and fell across the movement’s history, aided by many other organizations that also promoted the message with great effect. Grassroots reformers organized innumerable state and local temperance societies and fraternal lodges committed to abstinence. Temperance reformers, hailing from nearly every conceivable demographic, networked through a series of national and international temperance conventions, and at any given time were pursuing a diverse and often conflicting array of priorities and methodologies.
Finally, during the Progressive Era, reformers focused their hatred for alcohol almost exclusively on saloons and the liquor traffic. Through groundbreaking lobbying efforts and a fortuitous convergence of social and political forces, reformers witnessed the ratification in January 1919 of the Eighteenth Amendment, which established national prohibition. Despite such a long history of reform, the success seemed sudden and caught many in the movement off guard. The rise of liquor-related violence, a transformation in federal-state relations, increasingly organized and outspoken opposition, the Great Depression, and a realignment of political party coalitions all culminated in the sweeping repudiation of prohibition and its Republican supporters in the 1932 presidential election. On December 5, 1933, the Twenty-first Amendment to the Constitution repealed the Eighteenth Amendment, returning liquor regulation to the states, which have since maintained a wide variety of ever-changing laws controlling the sale of alcoholic beverages. But national prohibition permanently altered the federal government’s role in law enforcement, and its legacy remains.
Ross A. Kennedy
World War I profoundly affected the United States. It led to an expansion of America’s permanent military establishment, a foreign policy focused on reforming world politics, and American preeminence in international finance. In domestic affairs, America’s involvement in the war exacerbated class, racial, and ethnic conflict. It also heightened both the ethos of voluntarism in progressive ideology and the progressive desire to step up state intervention in the economy and society. These dual impulses had a coercive thrust that sometimes advanced progressive goals of a more equal, democratic society and sometimes repressed any perceived threat to a unified war effort. Ultimately the combination of progressive and repressive coercion undermined support for the Democratic Party, shifting the nation’s politics in a conservative direction as it entered the 1920s.
David M. Robinson
New England transcendentalism is the first significant literary movement in American history, notable principally for the influential works of Ralph Waldo Emerson, Margaret Fuller, and Henry David Thoreau. The movement emerged in the 1830s as a religious challenge to New England Unitarianism. Building on the writings of the Unitarian leader William Ellery Channing, Emerson and others such as Frederic Henry Hedge, George Ripley, James Freeman Clarke, and Theodore Parker developed a theology based on interior, intuitive experience rather than the historical truth of the Bible. By 1836 transcendentalist books from several important religious thinkers began to appear, including Emerson’s Nature, which employed idealist philosophy and Romantic symbolism to examine human interaction with the natural world. Emerson’s Harvard addresses, “The American Scholar” (1837) and the controversial “Divinity School Address” (1838), gave transcendental ideas a wider prominence, and also generated strong resistance that added an element of experiment and danger to the movement’s reputation. In 1840 the transcendentalists founded a journal for their work, the Dial, and Fuller became its first editor, a position that gave her an important role in the movement and a crucial outlet for her own work in literary criticism and women’s rights.
Though it had begun as a religious movement, by the middle 1840s transcendentalism could be better described as a literary movement with growing political engagements on several fronts. Emerson proclaimed it as an era of reform and aligned the transcendentalists with those who resisted the social and political status quo. In her feminist manifesto Woman in the Nineteenth Century (1845), Fuller called for the removal of both legal and social barriers to women’s full potential. In 1845 Henry David Thoreau went to live in the woods by Walden Pond; his memoir of his experience, Walden (1854), became a founding text of modern environmental thinking. Antislavery also became a key concern for many of the transcendentalists, who condemned the Fugitive Slave Act of 1850 and actively resisted the execution of the law after its passage. The transcendentalists, a nineteenth-century cultural avant-garde, continue to exert cultural influence through the durability of their writings, works that shaped many aspects of American national development.
Christopher P. Loss
Until World War II, American universities were widely regarded as good but not great centers of research and learning. This changed completely in the press of wartime, when the federal government pumped billions into military research, anchored by the development of the atomic bomb and radar, and into the education of returning veterans under the GI Bill of 1944. The abandonment of decentralized federal–academic relations marked the single most important development in the history of the modern American university. While it is true that the government had helped to coordinate and fund the university system prior to the war—most notably the country’s network of public land-grant colleges and universities—government involvement after the war became much more hands-on, eventually leading to direct financial support to and legislative interventions on behalf of core institutional activities, not only the public land grants but the nation’s mix of private institutions as well. However, the reliance on public subsidies and legislative and judicial interventions of one kind or another ended up being a double-edged sword: state action made possible the expansion in research and in student access that became the hallmarks of the post-1945 American university; but it also created a rising tide of expectations for continued support that has proven challenging in fiscally stringent times and in the face of ongoing political fights over the government’s proper role in supporting the sector.
Megan Kate Nelson
During the American Civil War, Union and Confederate commanders made the capture and destruction of enemy cities a central feature of their military campaigns. They did so for two reasons. First, most mid-19th-century cities had factories, foundries, and warehouses within their borders, churning out and storing war materiel; military officials believed that if they interrupted or incapacitated the enemy’s ability to arm or clothe itself, the war would end. Second, it was believed that the widespread destruction of property—especially in major or capital cities—would also damage civilians’ morale, undermining their political convictions and decreasing their support for the war effort.
Both Union and Confederate armies bombarded and burned cities with these goals in mind. Sometimes they fought battles on city streets but more often, Union troops initiated long-term sieges in order to capture Confederate cities and demoralize their inhabitants. Soldiers on both sides were motivated by vengeance when they set fire to city businesses and homes; these acts were controversial, as was defensive burning—the deliberate destruction of one’s own urban center in order to keep its war materiel out of the hands of the enemy.
Urban destruction, particularly long-term sieges, took a psychological toll on (mostly southern) city residents. Many were wounded, lost property, or were forced to become refugees. Because of this, the destruction of cities during the American Civil War provoked widespread discussions about the nature of “civilized warfare” and the role that civilians played in military strategy. Both soldiers and civilians tried to make sense of the destruction of cities in writing, and also in illustrations and photographs; images in particular shaped both northern and southern memories of the war and its costs.