Rap, the musical practice of hip hop culture in which vocalists, or MCs, recite lyrics over an instrumental beat, emerged out of the political and economic transformations of New York City after the 1960s. Black and Latinx youth, many of them Caribbean immigrants, created this new cultural form in response to racism, poverty, urban renewal, deindustrialization, and inner-city violence. The form eventually spread beyond New York to all regions of the United States as artists from Los Angeles, New Orleans, Miami, and Chicago began releasing rap music with their own distinct sounds. Despite efforts to demonize and censor rap music and hip hop culture, rap has served as a pathway to social mobility for many black and Latinx youth, and many artists have enjoyed crossover success in acting, advertising, and business. Rap music has also sparked new conversations about issues ranging from electoral politics, gender and sexuality, and technology to crime, policing, and mass incarceration.
In the years after the Civil War, Polish immigrants became an important part of the American working class. They actively participated in the labor movement and played key roles in various industrial strikes ranging from the 1877 Railroad Strike through the rise of the CIO and the post-1945 era of prosperity. Over time, the Polish American working class became acculturated and left its largely immigrant past behind while maintaining itself as an ethnic community. It also witnessed a good deal of upward mobility, especially over several generations. This ethnic community, however, continued to be refreshed with immigrants throughout the 20th century.
As with the larger American working class, Polish American workers were hard hit by changes in the industrial structure of the United States, as deindustrialization turned much of the heartland of the Polish American community into the Rust Belt. Despite the community's radical history, many reacted by turning toward conservative causes in the late 20th and early 21st centuries.
Courtney Q. Shah
A concerted movement to promote sex education in America emerged in the early 20th century as part of a larger public health movement that also responded to 19th-century concerns about venereal disease, prostitution, “seduction,” and “white slavery.” Sex education therefore offered a way to protect people (especially privileged women) from sexual activity of all kinds, consensual and coerced. It was not widely introduced into public schools, however, until after World War I. School sex education programs tended to focus on training for heterosexual marriage at a time when high school attendance was spiking in urban and suburban areas, and teachers often segregated male and female students.
Beyond teaching boys about male anatomy and girls about female anatomy, reformers and educators often conveyed different messages and used different materials depending on the race of their students. Erratic desegregation efforts during the civil rights movement touched off a new crisis in sex education programs, as parents and administrators considered sexuality education even more dangerous in a racially integrated classroom. The backlash against sex education in the schools kept pace with the backlash against integration, and each was often used to bolster the other: opponents of integration and sex education, for example, used racial language to scare parents about what their children were learning, and with whom.
In the 1980s and 1990s, the political power of the evangelical movement in the United States attracted support for “abstinence-only” curricula that relied on scare tactics and traditional assumptions about gender and sexuality. The ever-expanding acceptance, both legal and social, of lesbian, gay, bisexual, and transgender identities directly challenged the conservative turn of abstinence-until-marriage sex education programs. The politics of gender, race, class, and sexual orientation have consistently shaped and limited sex education.
Justus D. Doenecke
For the United States, isolationism is best defined as the avoidance of wars outside the Western Hemisphere, particularly in Europe; opposition to binding military alliances; and the preservation of unilateral freedom to act politically and commercially, unrestrained by mandatory commitments to other nations. Until the controversy over American entry into the League of Nations, isolationism was never subject to debate: the United States could expand its territory, protect its commerce, and even fight foreign powers without violating its traditional tenets. Once President Woodrow Wilson sought membership in the League, however, Americans came to see isolationism as a foreign policy option rather than something taken for granted. What had been a fundamental tenet of foreign policy now became the cause of a faction, a group of people branded as “isolationists.” Isolationism reached its high point during the years 1934–1937, when Congress, noting the challenge of the totalitarian nations to the international status quo, passed the neutrality acts to insulate the country from global entanglements.
Once World War II broke out in Europe, President Franklin D. Roosevelt increasingly sought American participation on the side of the Allies. Isolationists unsuccessfully fought FDR’s legislative proposals, beginning with repeal of the arms embargo and ending with the convoying of supplies to Britain. The America First Committee (1940–1941), however, so effectively mobilized anti-interventionist opinion as to make the president more cautious in his diplomacy.
If the Japanese attack on Pearl Harbor permanently ended classic isolationism, by 1945 a “new isolationism” voiced suspicion of the United Nations, the Truman Doctrine, aid to Greece and Turkey, the Marshall Plan, the North Atlantic Treaty Organization, and U.S. participation in the Korean War. Yet, because the “new isolationists” increasingly advocated militant unilateral measures to confront Communist Russia and China, often doing so to advance the fortunes of the Republican party, they exposed themselves to charges of inconsistency and generally faded away in the 1950s. Since the 1950s, many Americans have opposed particular military involvements, including those in Vietnam, Iraq, and Afghanistan, but few envision returning to an era when the United States avoided all commitments.
Blake C. Scott
Tourism is so deep-seated in the history of U.S. foreign relations that its presence seems to have been taken for granted. Millions of American tourists have traveled abroad, yet one can count on two hands the number of scholarly monographs analyzing the relationship between U.S. foreign relations and tourism. What explains this lack of historical reflection about one of the most quotidian forms of U.S. influence abroad?
In an influential essay about wilderness and the American frontier, the environmental historian William Cronon argues, “one of the most striking proofs of the cultural invention of wilderness is its thoroughgoing erasure of the history from which it sprang.” Historians and the American public, perhaps in modern fashion, have overlooked tourism’s role in the nation’s international affairs. Only a culture and a people so intimately familiar with tourism’s practices could naturalize them out of history.
The history of international tourism is profoundly entangled with the history of U.S. foreign policy. This entanglement has involved, among other things, science and technology, military intervention, diplomacy, and the promotion of consumer spending abroad. U.S. expansion created the structure (the social stability, medical safety, and transportation infrastructure) for globetrotting travel in the 20th century. As this essay shows, U.S. foreign policy was crucial in transforming foreign travel into a middle-class consumer experience.
The Chinese “forty-niners” who arrived in the United States a decade before the American Civil War constituted the first large wave of Asian migrants to America, and they transplanted the first Asian cuisine to the country. Chinese food was the first ethnic cuisine to be highly commodified at the national level as a type of food primarily prepared and consumed away from home. At the end of the 19th century, food from China began to attract a fast-growing non-Chinese clientele of diverse ethnic backgrounds in major cities across the nation, and by 1980, aided by a renewal of Chinese immigration, Chinese food had become the most popular ethnic cuisine in the United States. Chinese food has also been a vital economic lifeline for Chinese Americans, serving for decades as one of the two main sources of employment (laundries being the other) for Chinese immigrants and their families. Its development, therefore, is an important chapter in American history and a central part of the Chinese American experience.
The multiple and often divergent trends in the U.S. Chinese-food industry show that it is at a crossroads today. Its future hinges on the extent to which Chinese Americans can significantly alter their position in the social and political arena and on China’s ability to transform the economic equation in its relationship with the United States.
Anne L. Foster
The beginning of the modern war on drugs in the United States is commonly credited to President Richard Nixon, who evoked fears of crime, degenerate youth, and foreign drugs to garner support for an anti-drug effort that was massive by early 1970s standards. Scholars now agree, however, that the essential characteristics of the “war on drugs” stretch back to the early 20th century. The first federal law to prohibit a narcotic in the United States passed in 1909 and banned the import of “smoking opium.” Although opium itself remained legal, opium prepared for smoking, a form believed to be consumed predominantly by ethnic Chinese and imported into the United States, was not. All future anti-narcotics policies drew on these foundational notions: narcotics were of foreign origin and invaded the United States, so interdiction efforts at U.S. borders, and increasingly in producer countries, were an appropriate response. Narcotics consumers were presented as equally threatening, viewed as foreigners or as people at the margins of American society, and U.S. lawmakers therefore criminalized both drug use and drug trafficking. With drugs as well as drug users defined as foreign threats, militarization of the efforts to prohibit drugs followed. In U.S. drug policy, there is no distinction between foreign and domestic policy: the two are intertwined at all levels, from the definition of the problem to the origin of many drugs to the sites of enforcement.
In the decade after 1965, radicals responded to the alienating features of America’s technocratic society by developing alternative cultures that emphasized authenticity, individualism, and community. The counterculture emerged from a handful of 1950s bohemian enclaves, most notably the Beat subcultures in the Bay Area and Greenwich Village. But new influences shaped an eclectic and decentralized counterculture after 1965, first in San Francisco’s Haight-Ashbury district, then in urban areas and college towns, and, by the 1970s, on communes and in myriad counter-institutions. The psychedelic drug cultures around Timothy Leary and Ken Kesey gave rise to a mystical bent in some branches of the counterculture and influenced counterculture style in countless ways: acid rock redefined popular music; tie-dye, long hair, repurposed clothes, and hip argot established a new style; and sexual mores loosened.
Yet the counterculture’s reactionary elements were strong. In many counterculture communities, gender roles mirrored those of mainstream society, and aggressive male sexuality inhibited feminist spins on the sexual revolution. Entrepreneurs and corporate America refashioned the counterculture aesthetic into a marketable commodity, ignoring the counterculture’s incisive critique of capitalism; yet the counterculture became the basis of authentic “right livelihoods” for others.
Meanwhile, the politics of the counterculture defied ready categorization. The popular imagination often conflates hippies with radical peace activists, but New Leftists frequently excoriated the counterculture for rejecting political engagement in favor of hedonistic escapism or libertarian individualism. Both views miss the most important political aspects of the counterculture, which centered on the embodiment of a decentralized anarchist bent, expressed in the formation of counter-institutions such as underground newspapers, urban and rural communes, head shops, and food co-ops.
As the counterculture faded after 1975, its legacies became apparent in the redefinition of the American family, the advent of the personal computer, an increasing ecological and culinary consciousness, and the marijuana legalization movement.
The development of military arms harnessing nuclear energy for mass destruction has inspired continual efforts to control them. Since 1945, the United States, the Soviet Union, the United Kingdom, France, the People’s Republic of China (PRC), Israel, India, Pakistan, North Korea, and South Africa have acquired control over these powerful weapons, though Pretoria dismantled its small cache beginning in 1989 and Russia completed its inheritance of the Soviet arsenal in 1996. Throughout this period, Washington sought to limit its nuclear forces in tandem with those of Moscow, prevent new states from fielding them, discourage their military use, and even permit their eventual abolition.
Scholars disagree about what explains the United States’ distinct approach to nuclear arms control. Histories of U.S. nuclear policy treat intellectual theories and cultural attitudes alongside technical advances and strategic implications. The central debate is one of structure versus agency: whether the weapons’ sheer power, or historical actors’ attitudes toward that power, drove nuclear arms control. Among those who emphasize human agency, there are two further disagreements: (1) over the relative influence of domestic protest, culture, and politics; and (2) over whether U.S. nuclear arms control aimed first at securing the peace by regulating global nuclear forces or at bolstering American influence in the world.
The intensity of nuclear arms control efforts tended to rise or fall with the likelihood of nuclear war. Harry Truman’s faith in the country’s monopoly on nuclear weapons caused him to sabotage early initiatives, while Dwight Eisenhower’s belief in nuclear deterrence led in a similar direction. Fears of a U.S.-Soviet thermonuclear exchange mounted in the late 1950s, stoked by atmospheric nuclear testing and widespread radioactive fallout, which stirred protest movements and diplomatic initiatives. The spread of nuclear weapons to new states motivated U.S. presidents (John Kennedy in the vanguard) to mount a concerted campaign against “proliferation,” climaxing with the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Richard Nixon was exceptional: his reasons for signing the Strategic Arms Limitation Treaty (SALT I) and Anti-Ballistic Missile (ABM) Treaty with Moscow in 1972 were strategic, intended to buttress the country’s geopolitical position as U.S. armed forces withdrew from Southeast Asia. The rise of protest movements and Soviet economic difficulties after Ronald Reagan entered the Oval Office brought about two more landmark U.S.-Soviet accords, the 1987 Intermediate-Range Nuclear Forces (INF) Treaty and the 1991 Strategic Arms Reduction Treaty (START), the first occasions on which the superpowers eliminated nuclear weapons through treaty. The country’s attention swung to proliferation after the Soviet collapse in December 1991, as failed states, regional disputes, and non-state actors grew more prominent. Although controversies over the nuclear programs of Iraq, North Korea, and Iran have since erupted, Washington and Moscow continued to reduce their arsenals and refine their nuclear doctrines even as President Barack Obama proclaimed his support for a nuclear-free world.
Humans have used American forests in a wide variety of ways from the pre-Columbian period to the present. Native Americans heavily shaped forests to serve their needs, helping to create fire ecologies in many of them. English settlers harvested these forests for trade, to clear land, and for domestic purposes. The arrival of the Industrial Revolution in the early 19th century rapidly expanded the rate of logging, and by the Civil War, many areas of the Northeast were logged out. After the Civil War, forests in the Great Lakes states, the South, and then the Pacific Northwest fell with increasing speed to feed the insatiable demands of the American economy, facilitated by rapid technological innovation that allowed for ever-larger cuts. By the late 19th century, growing concerns about the future of American timber supplies spurred the conservation movement, personified by the forester Gifford Pinchot and the creation of the U.S. Forest Service with Pinchot as its head in 1905. After World War II, the Forest Service worked closely with the timber industry to cut wide swaths of the nation’s last virgin forests, and these gargantuan harvests spurred the growth of the environmental movement. Beginning in the 1970s, environmentalists used legal means to halt logging in the ancient forests, and the listing of the northern spotted owl under the Endangered Species Act dealt the final blow to most logging on Forest Service lands in the Northwest. Yet not only does the timber industry remain a major employer in forested parts of the nation today, but alternative forest economies have also developed around more sustainable industries such as tourism.