
13. The Sectional Crisis


This mural, created over eighty years after Brown’s death, captures the violence and religious fervor of the man and his era. John Steuart Curry, Tragic Prelude, 1938-1940, Kansas State Capitol.


I. Introduction

Slavery’s western expansion created problems for the United States from the very start. Battles emerged over the westward expansion of slavery and over the role of the federal government in protecting the interests of slaveholders. Northern workers felt that slavery suppressed wages and stole land that could have been used by poor white Americans to achieve economic independence. Southerners feared that without slavery’s expansion, the abolitionist faction would come to dominate national politics and that an increasingly dense population of slaves would lead to bloody insurrection and race war. Constant resistance from enslaved men and women required a strong proslavery government to maintain order. As the North gradually abolished human bondage, enslaved men and women headed north on an underground railroad of hideaways and safe houses. Northerners and Southerners came to disagree sharply on the role of the federal government in capturing and returning these freedom seekers. While Northerners appealed to their states’ rights to refuse to aid in capturing runaway slaves, Southerners demanded a national commitment to slavery. Enslaved laborers meanwhile remained vitally important to the nation’s economy, fueling not only the southern plantation economy but also providing raw materials for the industrial North. Differences over the fate of slavery remained at the heart of American politics, especially as the United States expanded. After decades of conflict, Americans north and south began to fear that the opposite section of the country had seized control of the government. By November 1860, the nation had elected an opponent of slavery’s expansion, Republican Abraham Lincoln. During the secession crisis that followed, fears, nearly a century in the making, at last devolved into bloody war.

 

II. Sectionalism in the Early Republic

This map, published by the US Coast Survey, shows the percentage of slaves in the population in each county of the slave-holding states in 1860. The highest percentages lie along the Mississippi River, in the “Black Belt” of Alabama, and coastal South Carolina, all of which were centers of agricultural production (cotton and rice) in the United States. E. Hergesheimer (cartographer), Th. Leonhardt (engraver), Map Showing the Distribution of the Slave Population of the Southern States of the United States Compiled from the Census of 1860, c. 1861. Wikimedia, http://commons.wikimedia.org/wiki/File:SlavePopulationUS1860.jpg.


Slavery’s history stretched back to antiquity. Prior to the American Revolution, nearly everyone in the world accepted it as a natural part of life.1 English colonies north and south relied on enslaved workers who grew tobacco, harvested indigo and sugar, and worked in ports. They generated tremendous wealth for the British crown. That wealth and luxury fostered seemingly limitless opportunities and inspired seemingly boundless imaginations. Enslaved workers also helped give rise to revolutionary new ideals, ideals that in time became the ideological foundations of the sectional crisis. English political theorists, in particular, began to rethink natural law justifications for slavery. They rejected the longstanding idea that slavery was a condition that naturally suited some people. A new transatlantic antislavery movement began to argue that freedom was the natural condition of man.2

Revolutionaries seized onto these ideas to stunning effect in the late eighteenth century. In the United States, France, and Haiti, revolutionaries began the work of splintering the old order. Each revolution seemed to radicalize the next. Bolder and more expansive declarations of equality and freedom followed one after the other. Revolutionaries in the United States declared, “All men are created equal,” in the 1770s. French visionaries issued the “Declaration of the Rights of Man and Citizen” in 1789. But the most startling development came in 1804, when a revolution led by rebellious slaves turned Saint-Domingue, France’s most valuable sugar colony, into the independent nation of Haiti, administered by the formerly enslaved.

The Haitian Revolution marked an early origin of the sectional crisis. It helped splinter the Atlantic basin into clear zones of freedom and un-freedom, shattering the longstanding assumption that African-descended slaves could not also be rulers. Despite the clear limitations of the American Revolution in attacking slavery, the era marked a powerful break in slavery’s history. Military service on behalf of both the British and the American armies freed thousands of slaves. Many others simply used the turmoil of war to make their escape. As a result, free black communities emerged—communities that would continually reignite the antislavery struggle. For nearly a century, most white Americans were content to compromise over the issue of slavery, but the constant agitation of black Americans, both enslaved and free, kept the issue alive.3

The national breakdown over slavery occurred over a long timeline and across a broad geography. Debates over slavery in the American West proved especially important. As the United States pressed westward, new questions arose as to whether those lands ought to be slave or free. The framers of the Constitution did a little, but not much, to help resolve these early questions. Article VI of the 1787 Northwest Ordinance banned slavery north and west of the Ohio River.4 Many took it to mean that the founders intended for slavery to die out, for why else would they prohibit its spread across such a huge swath of territory?

Questions over the framers’ intentions often led to confusion and bitter debate, but the actions of the new government left better clues as to what the new nation intended for slavery. Congress authorized the admission of Vermont (1791) and Kentucky (1792), with Vermont coming into the Union as a free state and Kentucky coming in as a slave state. Though Americans at the time made relatively little of the balancing act suggested by the admission of a slave state and a free state, the pattern became increasingly important. By 1820, preserving the balance of free states and slave states would be seen as an issue of national security.

New pressures challenging the delicate balance again arose in the West. The Louisiana Purchase of 1803 more than doubled the size of the United States. Questions immediately arose as to whether these lands would be made slave or free. Complicating matters further was the rapid expansion of plantation slavery, fueled by the invention of the cotton gin in 1793. Yet even with the booming cotton economy, many Americans, including Thomas Jefferson, believed that slavery was a temporary institution that would soon die out. Tensions rose with the Louisiana Purchase, but a truly sectional national debate remained mostly dormant.

That debate, however, came quickly. Sectional differences tied to the expansion of plantation slavery in the West were especially important after 1803. The Ohio River Valley became an early fault line in the coming sectional struggle. Kentucky and Tennessee emerged as slave states, while the free states of Ohio (1803), Indiana (1816), and Illinois (1818) gained admission along the river’s northern banks. Borderland negotiations and accommodations along the Ohio River fostered a distinctive kind of white supremacy, as laws tried to keep blacks out of the West entirely. Ohio’s so-called “Black Laws” of 1803 foreshadowed the exclusionary cultures of Indiana, Illinois, and several subsequent states of the Old Northwest and, later, the Far West.5 These laws often banned African American voting, denied black Americans access to public schools, and made it impossible for nonwhites to serve on juries and in local militias, among a host of other restrictions and obstacles.

The Missouri Territory, by far the largest section of the Louisiana Territory, marked a turning point in the sectional crisis. Saint Louis, a bustling Mississippi River town filled with powerful slave owners, loomed large as an important trade headquarters for networks in the northern Mississippi Valley and the Greater West. In 1817, eager to put to rest questions of whether this territory would be slave or free, Congress opened its debate over Missouri’s admission to the Union. Congressman James Tallmadge of New York proposed an amendment that would gradually abolish slavery in the new state. Southern states responded with unanimous outrage, and the nation shuddered at an undeniable sectional controversy.6

Congress reached a “compromise” on Missouri’s admission, largely through the work of Kentuckian Henry Clay. Maine would be admitted to the Union as a free state. In exchange, Missouri would come into the Union as a slave state. Legislators sought to prevent future conflicts by making Missouri’s southern border at 36° 30′ the new dividing line between slavery and freedom in the Louisiana Purchase lands. South of that line, running east from Missouri to the western edge of the Louisiana Purchase lands (near the present-day Texas panhandle) slavery could expand. North of it, encompassing what in 1820 was still “unorganized territory,” there would be no slavery.7

The Missouri Compromise marked a major turning point in America’s sectional crisis because it exposed to the public just how divisive the slavery issue had grown. The debate filled newspapers, speeches, and Congressional records. Antislavery and pro-slavery positions from that point forward repeatedly returned to points made during the Missouri debates. Legislators battled for weeks over whether the Constitutional framers intended slavery’s expansion or not, and these contests left deep scars. Even seemingly simple and straightforward phrases like “all men are created equal” were hotly contested all over again. Questions over the expansion of slavery remained open, but nearly all Americans concluded that the Constitution protected slavery where it already existed.

Southerners were not yet advancing arguments that slavery was a positive good, but during the Missouri debates they did insist that the framers supported slavery and wanted to see it expand. In Article 1, Section 2, for example, the Constitution counted three-fifths of a state’s enslaved population toward its total for apportioning representation, meaning southern white men would be overrepresented in Congress. The Constitution also stipulated that Congress could not interfere with the slave trade before 1808, and enabled Congress to draft fugitive slave laws.

Antislavery participants in the Missouri debate argued that the framers never intended slavery to survive the Revolution and in fact hoped it would disappear through peaceful means. The framers of the Constitution never used the word “slave.” Slaves were instead referred to as “persons held in service,” perhaps reflecting English common law precedents that questioned the legitimacy of “property in man.” Antislavery activists also pointed out that while Congress could not pass a law limiting the slave trade before 1808, the framers had recognized the flip side of the debate and had thus opened the door to legislating the slave trade’s end once the deadline arrived. Language in the Tenth Amendment, they claimed, also meant slavery could be banned in the territories. Finally, they pointed to the due process clause of the Fifth Amendment, which said that property could be seized through appropriate legislation.8 The bruising Missouri debates ultimately transcended arguments about the Constitution. They became an all-encompassing referendum on the American past, present, and future.

Despite the furor, the Missouri Crisis did not yet inspire hardened defenses of either slave or free labor as a positive good. Those would come in the decades ahead. In the meantime, the uneasy consensus forged by the Missouri debates managed to bring a measure of calm.

The Missouri debate had also deeply troubled the nation’s African Americans and Native Americans. By the time of the Missouri Compromise debates, both groups saw that whites never intended them to be citizens of the United States. In fact, the debates over Missouri’s admission had offered the first sustained debate on the question of black citizenship, as Missouri’s proposed state constitution sought to impose a hard ban on any future black migrants. Legislators ultimately agreed that this hard ban violated the Constitution but reaffirmed Missouri’s ability to deny citizenship to African Americans. Americans by 1820 had endured a broad challenge, not only to their cherished ideals but also, more fundamentally, to their conceptions of self.

 

III. The Crisis Joined

Missouri’s admission to the Union in 1821 exposed deep fault lines in American society. But the Compromise created a new sectional consensus that most white Americans, at least, hoped would ensure a lasting peace. Through sustained debates and arguments, white Americans agreed that the Constitution could do little about slavery where it already existed and that slavery, with the State of Missouri as the key exception, would never expand north of the 36°30′ line.

Once again westward expansion challenged this consensus, and this time the results proved even more damaging. Tellingly, enslaved southerners were among the first to signal their discontent. A planned rebellion organized by Denmark Vesey in 1822 threatened lives and property throughout the Carolinas. The nation’s religious leaders also expressed a rising discontent with the new status quo.9 The Second Great Awakening further sharpened political differences by promoting schisms within the major Protestant churches, schisms that also became increasingly sectional in nature. Between 1820 and 1846, sectionalism drew on new political parties, new religious organizations, and new reform movements.

As politics grew more democratic, leaders attacked old inequalities of wealth and power, but in doing so many sought unity through appeals to white supremacy. Slavery briefly receded from the nation’s attention in the early 1820s, but that would change quickly. By the last half of the decade, slavery was back, and this time it appeared even more threatening.

Amid the social changes of Jacksonian democracy, white men, regardless of status, gained not only land and jobs but also the right to vote, the right to serve on juries, the right to attend public schools, and the right to serve in the militia and armed forces. In this post-Missouri context, leaders arose to push the country’s new expansionist desires in aggressive new directions. As they did so, however, the sectional crisis again deepened.

The Democratic Party initially seemed to offer a compelling answer to the problems of sectionalism by promising benefits to white working men of the North, South, and West, while also uniting rural, small town, and urban residents. Indeed, huge numbers of western, southern, and northern workingmen rallied during the 1828 Presidential election behind Andrew Jackson. The Democratic Party tried to avoid the issue of slavery and instead sought to unite Americans around shared commitments to white supremacy and desires to expand the nation.

Democrats were not without their critics. Northerners seen as especially friendly to the South had become known as “Doughfaces” during the Missouri debates, and as the 1830s wore on, more and more Doughfaced Democrats became vulnerable to the charge that they served the Southern slave oligarchs better than they served their own northern communities. Whites discontented with the direction of the country used the slur and other critiques to help chip away at Democratic Party majorities. The accusation that northern Democrats were lap dogs for southern slaveholders had real power.10

The Whigs offered an organized major party challenge to the Democrats. Whig strongholds often mirrored the patterns of westward migrations out of New England. Whigs drew from an odd coalition of wealthy merchants, middle- and upper-class farmers, planters in the Upland South, and settlers in the Great Lakes region. Because of this motley coalition, the party struggled to bring a cohesive message to voters in the 1830s. Their strongest support came from places like Ohio’s Western Reserve, the rural and Protestant-dominated areas of Michigan, and similar parts of Protestant and small-town Illinois, particularly the fast-growing towns and cities of the state’s northern half.11

Whig leaders stressed Protestant culture and federally sponsored internal improvements, and they courted the support of a variety of reform movements, including temperance, nativism, and even antislavery, though few Whigs believed in racial equality. These positions attracted a wide range of figures, including a young convert to politics named Abraham Lincoln. Lincoln admired Whig leader Henry Clay of Kentucky, and by the early 1830s, Lincoln certainly fit the image of a developing Whig. A veteran of the Black Hawk War, Lincoln had relocated to New Salem, Illinois, where he worked a variety of odd jobs, living a life of thrift, self-discipline, and sobriety as he educated himself in preparation for a professional life in law and politics.

The Whig Party blamed Democrats for defending slavery at the expense of the American people, but antislavery was never a core component of the Whig platform. Several abolitionists grew so disgusted with the Whigs that they formed their own party, a true antislavery party. Activists in Warsaw, New York, organized the antislavery Liberty Party in 1839. Liberty leaders demanded the end of slavery in the District of Columbia, the end of the interstate slave trade, and the prohibition of slavery’s expansion into the West. But the Liberty Party also shunned women’s participation in the movement and distanced itself from visions of true racial egalitarianism. Few Americans voted for the party. The Democrats and Whigs continued to dominate American politics.

Democrats and Whigs fostered a moment of relative calm on the slavery debate, partially aided by gag rules prohibiting discussion of antislavery petitions. Arkansas (1836) and Michigan (1837) became the newest states admitted to the Union, with Arkansas coming in as a slave state and Michigan coming in as a free state. Michigan gained admission through provisions established in the Northwest Ordinance, while Arkansas came in under the Missouri Compromise. Since its lands lay below the line at 36° 30′, the admission of Arkansas did not threaten the Missouri consensus. The balancing act between slavery and freedom continued.

Events in Texas would shatter the balance. Independent Texas gained recognition from a supportive Andrew Jackson administration in 1837. But Jackson’s successor, President Martin Van Buren, also a Democrat, soon had reasons to worry about the Republic of Texas. Texas struggled with ongoing conflicts with Mexico and Indian raids from the powerful Comanche. The 1844 Democratic presidential candidate James K. Polk sought to bridge the sectional divide by promising new lands to whites north and south. Polk cited the annexation of Texas and the Oregon Territory as campaign cornerstones.12 Yet as Polk championed the acquisition of these vast new lands, northern Democrats grew annoyed by their southern colleagues, especially when it came to Texas.

For many observers, the debates over Texas statehood illustrated that the federal government was clearly pro-slavery. Texas President Sam Houston secured a deal with Polk and gained admission to the Union for Texas in 1845. Antislavery northerners also worried about the admission of Florida, which entered the Union as a slave state in 1845. The year 1845 became a pivotal year in the memory of antislavery leaders. As Americans embraced calls to pursue their “Manifest Destiny,” antislavery voices looked at developments in Florida and Texas as signs that the sectional crisis had taken an ominous and perhaps irredeemable turn.

The 1840s opened with a number of disturbing developments for antislavery leaders. The 1842 Supreme Court case Prigg v. Pennsylvania ruled that the federal government’s Fugitive Slave Act trumped Pennsylvania’s personal liberty law.13 Antislavery activists believed that the federal government served only southern slaveholders and was trampling the states’ rights of the North. A number of northern states reacted by passing new personal liberty laws in protest in 1843.

The rising controversy over the status of fugitive slaves swelled partly through the influence of escaped former slaves, including Frederick Douglass. Douglass’s entrance into northern politics marked an important new development in the nation’s coming sectional crisis. Born into slavery in 1818 in Talbot County, Maryland, Douglass grew up, like many enslaved people, barely knowing his own mother or date of birth. And yet because of a range of unique privileges afforded him by the circumstances of his upbringing, as well as his own genius and determination, Douglass managed to learn how to read and write. He used these skills to escape from slavery in 1838, when he was twenty. By 1845, Douglass put the finishing touches on his autobiography, Narrative of the Life of Frederick Douglass.14 The book launched his lifelong career as an advocate for the enslaved and helped further raise the visibility of black politics. Other former slaves, including Sojourner Truth, joined Douglass in rousing support for antislavery, as did free black Americans like Maria Stewart, James McCune Smith, Martin Delany, and numerous others.15 But black activists did more than deliver speeches. They also attacked fugitive slave laws by helping thousands to escape. The incredible career of Harriet Tubman is one of the more dramatic examples. But the forces of slavery had powerful allies at every level of government.

The year 1846 signaled new reversals to the antislavery cause and the beginnings of a dark new era in American politics. President Polk and his Democratic allies were eager to see western lands brought into the Union and were especially anxious to see the borders of the nation extended to the shores of the Pacific Ocean. Critics of the administration blasted these efforts as little more than land-grabs on behalf of slaveholders. Events in early 1846 seemed to justify antislavery complaints. Since Mexico had never recognized independent Texas, it continued to lay claim to its lands, even after the United States admitted Texas to the Union. In January 1846, Polk ordered troops to Texas to enforce claims stemming from its border dispute along the Rio Grande. Polk asked for war on May 11, 1846, and by September 1847, the United States had captured Mexico City. Whigs like Abraham Lincoln found their protests sidelined, but antislavery voices were becoming more vocal and more powerful.

After 1846, the sectional crisis raged throughout North America. Debates swirled over whether the new lands would be slave or free. The South began defending slavery as a positive good. At the same time, Congressman David Wilmot submitted his “Wilmot Proviso” in August 1846, banning the expansion of slavery into the territories won from Mexico. The Proviso gained widespread northern support and even passed the House with bipartisan backing, but it failed in the Senate.

 

IV. Free Soil, Free Labor, Free Men

The conclusion of the Mexican War gave rise to the 1848 Treaty of Guadalupe Hidalgo. The treaty infuriated antislavery leaders in the United States. The spoils gained from the Mexican War were impressive, and it was clear they would help expand slavery. The United States required Mexican officials to cede the California and New Mexico Territories for $15 million. With American soldiers occupying their capital, Mexican leaders had no choice but to sign or continue fighting a war they could not win. The new American territory included lands that would become the future states of California, Utah, and Nevada, most of Arizona, as well as parts of New Mexico, Colorado, and Wyoming.

Questions about the balance of free and slave states in the Union became even more fierce after the US acquired these territories from Mexico in 1848 by the Treaty of Guadalupe Hidalgo. Map of the Mexican Cession. Wikimedia, http://commons.wikimedia.org/wiki/File:Mexican_Cession.png.


The acquisition of so much land made it imperative to antislavery leaders that these lands not be opened to slavery. But knowing that the Liberty Party was not likely to provide a home to many moderate voters, leaders instead hoped to foster a new and more competitive party, which they called the Free Soil Party. Antislavery leaders entered the 1848 election hoping that their vision of a federal government divorced from slavery might be heard. But the Whigs nominated a slaveholding southerner, Zachary Taylor, while the Democrats chose Lewis Cass of Michigan, a northerner friendly to slavery’s expansion. Left unrepresented, antislavery Free Soil leaders swung into action.

Demanding an alternative to the pro-slavery status quo, Free Soil leaders assembled so-called “Conscience Whigs.” The new coalition called for a national convention in August 1848 at Buffalo, New York. A number of ex-Democrats committed to the party right away, including an important group of New Yorkers loyal to Martin Van Buren. The Free Soil Party’s platform bridged the eastern and western leadership and called for an end to slavery in Washington, D.C., and a halt to slavery’s expansion in the territories.16 The Free Soil movement hardly made a dent in the 1848 Presidential election, but it drew more than four times the popular vote won by the Liberty Party four years earlier. It was a promising start. In 1848, Free Soil leaders claimed just 10% of the popular vote, but they won over a dozen House seats and even managed to win one Senate seat in Ohio, which went to Salmon P. Chase.17 In Congress, Free Soil members had enough votes to swing power to either the Whigs or the Democrats.

The admission of Wisconsin as a free state in May 1848 helped cool tensions after the Texas and Florida admissions. But news of a number of failed revolutions in Europe alarmed American reformers. As exiled radicals filtered out of Europe and into the United States, a women’s rights movement also got underway at Seneca Falls, New York. The first such meeting ever held in United States history, it was led by figures like Elizabeth Cady Stanton and Lucretia Mott, women with deep ties to the abolitionist cause. Frederick Douglass also appeared at the convention and took part in the proceedings, where participants debated the Declaration of Sentiments, Grievances and Resolutions.18 By August 1848, it seemed plausible that the Free Soil Movement might tap into these reforms and build a broader coalition. In some ways that is precisely what it did. But come November, the spirit of reform failed to yield much at the polls. Whig candidate Zachary Taylor bested Democrat Lewis Cass of Michigan.

The upheavals signaled by 1848 came to a quick end. Taylor remained in office only a brief time until his unexpected death from a stomach ailment in 1850. During Taylor’s brief time in office, the fruits of the Mexican War began to spoil, and his administration struggled to find a good remedy. Increased clamoring for the admission of California, New Mexico, and Utah pushed the country closer to the edge. Gold had been discovered in California, and as thousands continued to pour onto the West Coast and through the trans-Mississippi West, the admission of new states loomed. In Utah, Mormons were also making claims to an independent state they called Deseret. By 1850, California wanted admission as a free state. With so many competing dynamics underway, and with the President dead and replaced by Whig Millard Fillmore, the 1850s were off to a troubling start.

Congressional leaders like Henry Clay and newer legislators like Stephen A. Douglas of Illinois were asked to broker a compromise, but this time it was clear no compromise could bridge all the diverging interests at play in the country. Clay eventually left Washington disheartened by affairs. It fell to young Stephen Douglas, then, to shepherd the bills through Congress, which he in fact did. Legislators rallied behind the “Compromise of 1850,” an assemblage of bills passed late in 1850, which managed to keep the promises of the Missouri Compromise alive.

Henry Clay (“The Great Compromiser”) addresses the U.S. Senate during the debates over the Compromise of 1850. The print shows a number of incendiary personalities, like John C. Calhoun, whose increasingly sectional beliefs were pacified for a time by the Compromise. P. F. Rothermel (artist), c. 1855. Wikimedia, http://commons.wikimedia.org/wiki/File:Henry_Clay_Senate3.jpg.


The Compromise of 1850 tried to offer something to everyone, but in the end it only worsened the sectional crisis. For southerners, the package offered a tough new fugitive slave law that empowered the federal government to deputize regular citizens in arresting runaways. The New Mexico and Utah territories would be allowed to determine their own fates as slave or free based on popular sovereignty. The Compromise also allowed territories to submit suits directly to the Supreme Court over the status of fugitive slaves within their bounds.

The admission of California as the newest free state in the Union cheered many northerners, but even the admission of a vast new state full of resources and rich agricultural lands was not enough. In addition to California, northerners also gained a ban on the slave trade in Washington, D.C., but not the full emancipation abolitionists had long advocated. Texas, which had already come into the Union as a slave state, was asked to give some of its land to New Mexico in return for the federal government absorbing some of the former republic’s debt. But the Compromise debates soon grew ugly.

After the Compromise of 1850, antislavery critics became increasingly certain that slaveholders had co-opted the federal government and that a southern “Slave Power” secretly held sway in Washington, where it hoped to make slavery a national institution. These northern complaints pointed back to how the three-fifths compromise of the Constitution gave southerners more representatives in Congress. In the 1850s, antislavery leaders increasingly argued that Washington worked on behalf of slaveholders while ignoring the interests of white working men.

None of the individual 1850 Compromise measures proved more troubling to national and international observers than the Fugitive Slave Act. In a clear bid to extend slavery’s influence throughout the country, the act created special federal commissioners to determine the fate of alleged fugitives without benefit of a jury trial or even court testimony. Under its provisions, local authorities in the North could not interfere with the capture of fugitives. Northern citizens, moreover, had to assist in the arrest of fugitive slaves when called upon by federal agents. The Fugitive Slave Act created the foundation for a massive expansion of federal power, including an alarming increase in the nation’s policing powers. Many northerners were also troubled by the way the bill undermined local and state laws. The law itself fostered corruption and the enslavement of free black northerners. The federal commissioners who heard these cases were paid $10 if they determined that the defendant was a slave and only $5 if they determined he or she was free.19 Many black northerners responded to the new law by heading further north to Canada.

The 1852 Presidential election gave the Whigs their most stunning defeat and effectively ended their existence as a national political party. Their candidate, Winfield Scott, captured just 42 electoral votes to Democrat Franklin Pierce’s 254. With the Compromise of 1850 and plenty of new lands, peaceful consensus seemed on the horizon. Antislavery feelings continued to run deep, however, and their depth revealed that a single Democratic Party misstep might allow a coalition united against the Democrats to emerge and bring them to defeat. One measure of the popularity of antislavery ideas came in 1852 when Harriet Beecher Stowe published her bestselling antislavery novel, Uncle Tom’s Cabin. Sales for Uncle Tom’s Cabin were astronomical, eclipsed only by sales of the Bible.20 The book became a sensation and helped move antislavery into everyday conversation for many northerners. Despite the powerful antislavery message, Stowe’s book also reinforced many racist stereotypes. Even abolitionists struggled with the deeply ingrained racism that plagued American society. While the major success of Uncle Tom’s Cabin bolstered the abolitionist cause, the terms outlined by the Compromise of 1850 appeared strong enough to keep the peace.

Uncle Tom’s Cabin intensified an already hot debate over slavery throughout the United States. The book revolves around Eliza (the woman holding the young boy) and Tom (standing with his wife Chloe), each of whom takes a very different path: Eliza escapes slavery using her own two feet, but Tom endures his chains only to die by the whip of a brutish master. The horrific violence that both endured melted the hearts of many northerners and pressed some to join in the fight against slavery. Full-page illustration by Hammatt Billings for Uncle Tom's Cabin, 1852. Wikimedia, http://commons.wikimedia.org/wiki/File:ElizaEngraving.jpg.


Democrats by 1853 were badly splintered along sectional lines over slavery, but they also had reasons to act with confidence. Voters had returned them to office in 1852 following the bitter fights over the Compromise of 1850. Emboldened, Illinois Senator Stephen A. Douglas introduced a set of additional amendments to a bill drafted in late 1853 to help organize the Nebraska Territory, the last of the Louisiana Purchase lands. In 1853, the Nebraska Territory was huge, extending from the northern end of Texas to the Canadian border. Altogether, it encompassed present-day Nebraska, Wyoming, South Dakota, North Dakota, Colorado, and Montana. Douglas’s efforts to amend and introduce the bill in 1854 opened dynamics that would break the Democratic Party in two and, in the process, rip the country apart.

Douglas proposed a bold plan in 1854 to cut off a large southern chunk of Nebraska and create it separately as the Kansas Territory. Douglas had a number of goals in mind. The expansionist Democrat from Illinois wanted to organize the territory to facilitate the completion of a national railroad that would flow through Chicago. But before he had even finished introducing the bill, opposition had already mobilized. Salmon P. Chase drafted a response in northern newspapers that exposed the Kansas-Nebraska Bill as a measure to overturn the Missouri Compromise and open western lands for slavery. Kansas-Nebraska protests emerged in 1854 throughout the North, with key meetings in Wisconsin and Michigan. Kansas would become slave or free depending on the result of local elections, elections that would be greatly influenced by migrants flooding to the state to either protect or stop the spread of slavery.

Ordinary Americans in the North increasingly resisted what they believed to be a pro-slavery federal government on their own terms. The rescues and arrests of fugitive slaves Anthony Burns in Boston and Joshua Glover in Milwaukee, for example, both signaled the rising vehemence of resistance to the nation’s 1850 fugitive slave law. The case of Anthony Burns illustrates how the Fugitive Slave Law radicalized many northerners. On May 24, 1854, the twenty-year-old Burns, a preacher who worked in a Boston clothing shop, was clubbed and dragged to jail. One year earlier, Burns had escaped slavery in Virginia, and a group of slave catchers had come to return him to Richmond. Word of Burns’ capture spread rapidly through Boston, and a mob gathered outside of the courthouse demanding Burns’ release. Two days after the arrest, the crowd stormed the courthouse and shot a deputy U.S. marshal to death. News reached Washington, and the federal government sent soldiers. Boston was placed under martial law. Federal troops lined the streets of Boston as Burns was marched to a ship and sent back to slavery in Virginia. After spending over $40,000, the United States government had successfully reenslaved Anthony Burns.21 A short time later, Burns was redeemed by abolitionists who paid $1,300 to return him to freedom, but the outrage among Bostonians only grew. And Anthony Burns was only one of hundreds of highly publicized episodes of the federal government imposing the Fugitive Slave Law on rebellious northern populations. In the words of Amos Adams Lawrence, “We went to bed one night old-fashioned, conservative, compromise Union Whigs & woke up stark mad Abolitionists.”22

Anthony Burns, the fugitive slave, appears in a portrait at the center of this 1855 print. Burns’ arrest and trial, possible because of the 1850 Fugitive Slave Act, became a rallying cry. As a symbol of the injustice of the slave system, Burns’ treatment spurred riots and protests by abolitionists and citizens of Boston in the spring of 1854. John Andrews (engraver), “Anthony Burns,” c. 1855. Library of Congress, http://www.loc.gov/pictures/item/2003689280/.


As northerners radicalized, organizations like the New England Emigrant Aid Society provided guns and other goods for pioneers willing to go to Kansas and establish the territory as antislavery through popular sovereignty. On all sides of the slavery issue, politics became increasingly militarized.

The year 1855 nearly derailed the northern antislavery coalition. A resurgent anti-immigrant movement briefly took advantage of the Whig collapse and nearly stole the energy of the anti-administration forces by channeling their frustrations into fights against the large number of mostly Catholic German and Irish immigrants in American cities. Calling themselves “Know-Nothings” on account of their tendency to pretend ignorance when asked about their activities, members of the Know-Nothing or American Party made impressive gains in 1854 and 1855, particularly in New England and the Middle Atlantic. But the anti-immigrant movement simply could not capture the nation’s attention in ways the antislavery movement already had.23

The antislavery political movements that started in 1854 coalesced as the coming Presidential election of 1856 accelerated the formation of a new political party. Harkening back to the founding fathers, this new party called itself the Republican Party. Republicans moved into a highly charged summer expecting great things for their cause. Following an explosive speech before the Senate on May 19-20, 1856, Charles Sumner was beaten by Representative Preston Brooks of South Carolina right on the floor of the Senate chamber. Among other accusations, Sumner had accused Senator Andrew Butler of South Carolina of defending slavery so he could have sexual access to black women.24 Brooks, Butler’s cousin, felt that he had to defend his relative’s honor and nearly killed Sumner as a result.

The Caning of Charles Sumner, 1856. Wikimedia, http://commons.wikimedia.org/wiki/File:Southern_Chivalry.jpg.


The violence in Washington paled before the many murders occurring in Kansas. Proslavery raiders attacked Lawrence, Kansas. Radical abolitionist John Brown retaliated, murdering five proslavery settlers along Pottawatomie Creek. As all of this played out, the House failed to expel Brooks. Brooks resigned his seat anyway, only to be re-elected by his constituents later in the year. He received new canes emblazoned with the words “Hit him again!”25

With sectional tensions at a breaking point, both parties readied for the coming Presidential election. In June 1856, the newly named Republican Party held its nominating convention at Philadelphia and selected Californian John Charles Frémont. Frémont’s antislavery credentials may not have pleased many abolitionists, but his dynamic and talented wife, Jessie Benton Frémont, appealed to more radical members of the coalition. The Kansas-Nebraska debate, the organization of the Republican Party, and the 1856 Presidential campaign all energized a new generation of political leaders, including Abraham Lincoln. Beginning with his speech at Peoria, Illinois, in 1854, Lincoln carved out a message that encapsulated the main ideas and visions of the Republican Party better than anyone else.26 Lincoln himself was slow to join the coalition, yet by the summer of 1856, he had fully committed to the Frémont campaign.

Frémont lost, but Republicans celebrated that he had won 11 of the 16 free states. This showing, they argued, was truly impressive for any party making its first run at the Presidency. Yet northern Democrats in crucial swing states remained unmoved by the Republican Party’s appeals. Ulysses S. Grant of Missouri, for example, worried that Frémont and the Republicans signaled trouble for the Union itself. Grant voted for the Democratic candidate, James Buchanan, believing a Republican victory might bring about disunion. In abolitionist and especially black American circles, Frémont’s defeat was more than a disappointment. Believing their fate as permanent non-citizens had been sealed, some African Americans would consider foreign emigration and colonization. Others began to explore the option of more radical and direct action against the Slave Power.

 

V. From Sectional Crisis to National Crisis

White antislavery leaders hailed Frémont’s defeat as a “glorious” one and looked ahead to the party’s future successes. For those still in slavery, or hoping to see loved ones freed, the news was of course much harder to take. The Republican Party had promised the rise of an antislavery coalition, but voters rebuked it. The lessons seemed clear enough.

Kansas loomed large over the 1856 election, darkening the national mood. The story of voter fraud in Kansas had begun years before, in 1854, when nearby Missourians first started crossing the border to tamper with Kansas elections. Noting this, critics at the time attacked the Pierce administration for not living up to the ideals of popular sovereignty by ensuring fair elections. From there, the crisis only deepened. Kansans voted to come into the Union as a free state, but the federal government refused to recognize their votes and instead recognized a sham pro-slavery legislature.

The sectional crisis had at last become a national crisis. “Bleeding Kansas” was the first place to demonstrate that the sectional crisis could easily explode, and in fact already was exploding, into a full-blown national crisis. As the national mood grew increasingly grim, Kansas attracted militants representing the extreme sides of the slavery debate.

In the days after the 1856 Presidential election, Buchanan made his plans for his time in office clear. He talked with Chief Justice Roger Taney on inauguration day about a court decision he hoped to see handed down. Indeed, not long after the inauguration, the Supreme Court delivered a decision that would come to define Buchanan’s Presidency. The Dred Scott decision, Scott v. Sandford, ruled that black Americans could not be citizens of the United States.27 This gave the Buchanan administration and its southern allies a direct repudiation of the Missouri Compromise. The court also ruled that Scott, a Missouri slave, had no right to sue in United States courts. The Dred Scott decision signaled that the federal government was now fully committed to extending slavery as far and as wide as it might want.

Dred Scott’s Supreme Court case made clear that the federal government was no longer able or willing to ignore the issue of slavery. More than that, all black Americans, Justice Taney declared, could never be citizens of the United States. Though seemingly a disastrous decision for abolitionists, this controversial ruling actually increased the ranks of the abolitionist movement. Photograph of Dred Scott, 1857. Wikimedia, http://commons.wikimedia.org/wiki/File:Dred_Scott_photograph_%28circa_1857%29.jpg.


The Dred Scott decision seemed to settle the sectional crisis by making slavery fully national, but in reality it just exacerbated sectional tensions further. In 1857, Buchanan sent U.S. military forces to Utah, hoping to subdue Utah’s Mormon communities. This action, however, led to renewed charges, many of them leveled from within his own party, that the administration was abusing its powers. Far more important than the Utah expedition, though, were the ongoing events in Kansas. It was Kansas that at last proved to many northerners that the sectional crisis would not go away unless slavery also went away.

The Illinois Senate race in 1858 put the scope of the sectional crisis on full display. Republican candidate Abraham Lincoln challenged the hugely influential Democrat Stephen Douglas. Appealing to white supremacy, Douglas hammered the Republican opposition as a “Black Republican” party bent on racial equality.28 The Republicans, including Lincoln, were thrown on the defensive. Democrats hung on as best they could, but the Republicans won the House of Representatives and picked up seats in the Senate. Lincoln actually lost his contest with Stephen Douglas, but in the process he firmly established himself as a leading national Republican. After the 1858 elections, all eyes turned to 1860. Given the Republican Party’s successes since 1854, it was expected that the 1860 Presidential election might produce the nation’s first antislavery president.

In the troubled decades since the Missouri Compromise, the nation had slowly torn itself apart. Congressmen clubbed each other nearly to death on the floor of Congress, and by the middle of the 1850s Americans were already at war on the Kansas and Missouri plains. Across the country, cities and towns were in various stages of revolt against federal authority. Fighting spread even further against Indians in the Far West and against Mormons in Utah. The nation’s militants anticipated a coming breakdown and worked to exploit it. John Brown, fresh from his actions in Kansas, moved east and planned more violence. Assembling a team from across the West, including black radicals from Oberlin, Ohio, and from communities in Canada West, Brown hatched a plan to attack Harper’s Ferry, a federal arsenal in Virginia (now West Virginia). He would use the weapons to lead a slave revolt. Brown approached Frederick Douglass, though Douglass refused to join.

Brown launched his raid on October 16, 1859. By October 18, a command under Robert E. Lee had crushed the revolt. Many of Brown’s men, including his own sons, were killed, but Brown himself lived and was imprisoned. Brown prophesied while in prison that the nation’s crimes would only be purged with blood. He went to the gallows in December 1859. Northerners made a stunning display of sympathy on the day of his execution. Southerners took these displays to mean that the coming 1860 election would be, in many ways, a referendum on secession and disunion.

The execution of John Brown made him a martyr in abolitionist circles and a confirmed traitor in southern crowds. Both of these images continued to pervade public memory after the Civil War, but in the North especially (where so many soldiers had died to help end slavery) his name was admired. Over two decades after Brown’s death, Thomas Hovenden portrayed Brown as a saint. As he is led to his execution for attempting to destroy slavery, Brown poignantly leans over a rail to kiss a black baby. Thomas Hovenden, The Last Moments of John Brown, c. 1882-1884. Wikimedia, http://commons.wikimedia.org/wiki/File:%27The_Last_Moments_of_John_Brown%27,_oil_on_canvas_painting_by_Thomas_Hovenden.jpg.


Republicans wanted little to do with Brown and instead tried to portray themselves as moderates opposed to both abolitionists and proslavery expansionists. In this climate, the parties opened their contest for the 1860 Presidential election. The Democratic Party fared poorly as its southern delegates bolted its national convention at Charleston and ran their own candidate, Vice President John C. Breckinridge of Kentucky. Hoping to field a candidate who might nonetheless manage to bridge the broken party’s factions, the remaining Democrats met again at Baltimore and nominated Stephen A. Douglas of Illinois.

The Republicans, meanwhile, held their boisterous convention in Chicago. The Republican platform made the party’s antislavery commitments clear and made wide promises to its white constituents, particularly westerners: new land, transcontinental railroads, and broad support of public schools.29 Abraham Lincoln, a candidate few outside of Illinois truly expected to win, nonetheless proved far less polarizing than the other names on the ballot. Lincoln won the nomination, and with the Democrats in disarray, Republicans knew their candidate had a good chance of winning.

In this political cartoon, Abraham Lincoln uncomfortably straddles a rail supported by a black man and Horace Greeley (editor of the New York “Tribune”). The wood board is a dual reference to the antislavery plank of the 1860 Republican platform -- which Lincoln seemed to uneasily defend -- and Lincoln’s backwoods origins. Louis Maurer, “The Rail Candidate,” Currier & Ives, c. 1860. Library of Congress, http://www.loc.gov/pictures/item/2001703953/.


Abraham Lincoln won the 1860 contest on November 6, gaining just 40% of the popular vote and not a single southern vote in the Electoral College. Within days, southern states were organizing secession conventions. John J. Crittenden of Kentucky proposed a series of compromises, but a clear pro-southern bias meant they had little chance of gaining Republican acceptance. Crittenden’s plan promised renewed enforcement of the Fugitive Slave Law and protections for slavery in the nation’s capital.30 Republicans by late 1860 knew that the voters who had just placed them in power did not want them to cave on these points, and southern states proceeded with their plans to leave the Union. On December 20, South Carolina voted to secede and issued its “Declaration of the Immediate Causes.”31 The Declaration highlighted the failure of the federal government to enforce the Fugitive Slave Act over competing personal liberty laws in northern states. After the war many southerners claimed that secession was primarily motivated by a concern to preserve states’ rights, but the primary complaint of this very first declaration of secession was the federal government’s failure to exert its authority over the northern states.

The year 1861, then, saw the culmination of the secession crisis. Before he left for Washington, Lincoln told well-wishers gathered in Springfield that he faced a “task greater than Washington’s” in the years to come. Southerners were also learning the challenges of forming a new nation. The seceded states grappled with internal divisions right away, as some slaveholding states declined to join them. In January, for example, Delaware rejected secession. But states in the lower South adopted a different course. Mississippi seceded on January 9. Later in the month, Florida, Alabama, Georgia, and Louisiana also left the Union. By early February, Texas had joined the newly seceded states. In February, southerners drafted a constitution protecting slavery and named a westerner, Jefferson Davis of Mississippi, as their President. When Abraham Lincoln acted upon his constitutional mandate as Commander in Chief following his inauguration on March 4, rebels calling themselves the Confederate States of America opened fire on Fort Sumter in South Carolina. Within days, Lincoln demanded 75,000 volunteers from the North to crush the rebellion, and the American Civil War began.

 

VI. Conclusion

Slavery had long divided the politics of the United States. In time, these divisions became both sectional and irreconcilable. The first and most ominous sign of a coming sectional storm came with the debates over the admission of Missouri in 1821. As westward expansion continued, these fault lines grew even more ominous, particularly as the United States seized still more land from its war with Mexico. The country seemed to teeter ever closer to a full-throated endorsement of slavery. But an antislavery coalition calling itself the Republican Party arose in the middle 1850s. Eager to cordon off slavery and confine it to where it already existed, the Republicans won the presidential election of 1860 and set the nation on the path to war.

Throughout this period, the mainstream of the antislavery movement remained committed to a peaceful resolution of the slavery issue through efforts understood to foster the "ultimate extinction" of slavery in due time. But as the secession crisis revealed, the South could not tolerate a federal government working against the interests of slavery's expansion, and it decided to gamble on war with the United States. Secession, in the end, raised the possibility of emancipation through war, a possibility most Republicans knew had always existed but hoped would never prove necessary. By 1861 all bets were off, and the fate of slavery, and of the nation, depended upon war.

 

VII. Reference Material

This chapter was edited by Jesse Gant, with content contributions by Jeffrey Bain-Conkin, Matthew A. Byron, Christopher Childers, Jesse Gant, Christopher Null, Ryan Poe, Michael Robinson, Nicholas Wood, Michael Woods, and Ben Wright.

Recommended citation: Jeffrey Bain-Conkin et al., “The Sectional Crisis,” Jesse Gant, ed., in The American Yawp, Joseph Locke and Ben Wright, eds., last modified August 1, 2016, http://www.AmericanYawp.com.

 

Recommended Reading

  • Bacon, Margaret Hope. But One Race: The Life of Robert Purvis. Albany: SUNY Press, 2012. 
  • Baker, Jean H. Affairs of Party: The Political Culture of Northern Democrats in the Mid-nineteenth Century. New York: Fordham University Press, 1983.
  • Berlin, Ira. Generations of Captivity: A History of African-American Slaves. Cambridge: Belknap Press of Harvard University Press, 2003.
  • Boydston, Jeanne. Home and Work: Housework, Wages, and the Ideology of Labor in the Early Republic. New York: Oxford University Press, 1990.
  • Bracey, Christopher Alan, Paul Finkelman, and David Thomas Konig, eds. The Dred Scott Case: Historical and Contemporary Perspectives on Race and Law. Athens: Ohio University Press, 2010.
  • Cutter, Barbara. Domestic Devils, Battlefield Angels: The Radicalization of American Womanhood, 1830-1865. DeKalb: Northern Illinois University Press, 2003.
  • Engs, Robert F. and Randall M. Miller, eds. The Birth of the Grand Old Party: The Republicans’ First Generation. Philadelphia: University of Pennsylvania Press, 2002.
  • Etcheson, Nicole. Bleeding Kansas: Contested Liberty in the Civil War Era. Lawrence: University Press of Kansas, 2004.
  • Flexner, Eleanor. Century of Struggle: The Women’s Rights Movement in the United States. Cambridge: Harvard University Press, 1975.
  • Foner, Eric. Free Soil, Free Labor, Free Men: The Ideology of the Republican Party before the Civil War. New York: Oxford University Press, 1970.
  • Grant, Susan-Mary. North Over South: Northern Nationalism and American Identity in the Antebellum Era. Lawrence: University Press of Kansas, 2000.
  • Holt, Michael. The Rise and Fall of the American Whig Party: Jacksonian Politics and the Onset of the Civil War. New York: Oxford University Press, 1999.
  • Howe, Daniel Walker. The Political Culture of the American Whigs. Chicago: University of Chicago Press, 1979.
  • Jeffrey, Julie R. The Great Silent Army of Abolitionism: Ordinary Women in the Antislavery Movement. Chapel Hill: University of North Carolina Press, 1998.
  • Jones, Martha S. All Bound Up Together: The Woman Question in African American Public Culture, 1830-1900. Chapel Hill: University of North Carolina Press, 2007.
  • Kantrowitz, Stephen. More than Freedom: Fighting for Black Citizenship in a White Republic, 1829-1889. New York: Penguin Press, 2012.
  • McDaniel, Caleb. The Problem of Democracy in the Age of Slavery: Garrisonian Abolitionists and Transatlantic Reform. Baton Rouge: Louisiana State University Press, 2013.
  • Oakes, James. The Scorpion’s Sting: Antislavery and the Coming of the Civil War. New York: W.W. Norton, 2014.
  • Potter, David M. The Impending Crisis, 1848-1861. New York: HarperCollins, 1976.
  • Quarles, Benjamin. Allies for Freedom: Blacks and John Brown. New York: Oxford University Press, 1974.
  • Robertson, Stacey. Hearts Beating for Liberty: Women Abolitionists in the Old Northwest. Chapel Hill: University of North Carolina Press, 2010.
  • Sinha, Manisha. The Counterrevolution of Slavery: Politics and Ideology in Antebellum South Carolina. Chapel Hill: University of North Carolina Press, 2000.
  • Smith, Kimberly K. The Dominion of Voice: Riot, Reason and Romance in Antebellum American Political Thought. Lawrence: University Press of Kansas, 1999.
  • Varon, Elizabeth. Disunion!: The Coming of the American Civil War, 1789-1859. Chapel Hill: University of North Carolina Press, 2008.
  • Zaeske, Susan. Signatures of Citizenship: Petitioning, Antislavery, & Women’s Political Identity. Chapel Hill: University of North Carolina Press, 2003.

 

Notes

  1. David Brion Davis, The Problem of Slavery in Western Culture (New York: Oxford University Press, 1966).
  2. David Brion Davis, The Problem of Slavery in the Age of Revolution, 1770-1823 (New York: Oxford University Press, 1999), 164-212.
  3. See “Black Founders: The Free Black Community in the Early Republic,” a digital exhibit from the Library Company of Philadelphia, available online at www.librarycompany.org/blackfounders/#.
  4. Northwest Ordinance; July 13, 1787, Charles C. Tansill, ed., Documents Illustrative of the Formation of the Union of the American States (Washington D.C.: Government Printing Office, 1927), House Document No. 398. Available online at: avalon.law.yale.edu/18th_century/nworder.asp.
  5. Stephen Middleton, The Black Laws: Race and the Legal Process in Early Ohio (Athens: Ohio University Press, 2005).
  6. Lawrence Wilson, ed., The National Register: A weekly paper, containing a series of the important public documents, and the proceedings of Congress…Volume VII (Washington City: 1819), 125.
  7. Conference committee report on the Missouri Compromise, March 1, 1820; Joint Committee of Conference on the Missouri Bill, 03/01/1820-03/06/1820; Record Group 128; Records of Joint Committees of Congress, 1789-1989; National Archives. Available online at: https://www.ourdocuments.gov/doc.php?flash=true&doc=22&page=transcript.
  8. William M. Wiecek, The Sources of Antislavery Constitutionalism in America, 1760-1848 (Ithaca: Cornell University Press, 1977).
  9. Richard Furman, Rev. Dr. Richard Furman’s Exposition of the Views of the Baptists, relative to the colored population of the United States (Charleston, S.C.: A.E. Miller, 1823), p. 1.
  10. Nicholas Wood, “‘A Sacrifice on the Altar of Slavery’: Doughface Politics and Black Disenfranchisement in Pennsylvania, 1837–1838,” Journal of the Early Republic, Vol. 31, No. 1 (Spring 2011), pp. 75-106.
  11. Michael F. Holt, The Rise and Fall of the American Whig Party: Jacksonian Politics and the Onset of the Civil War (Oxford: Oxford University Press, 1999).
  12. James K. Polk, “Inaugural Address,” March 4, 1845. Online by Gerhard Peters and John T. Woolley, The American Presidency Project. http://www.presidency.ucsb.edu/ws/?pid=25814.
  13. Richard Peters, Report of the Case of Edward Prigg against the Commonwealth of Pennsylvania… (Philadelphia: 1842).
  14. Frederick Douglass, Narrative of the Life of Frederick Douglass, an American Slave, Written by Himself (Boston: 1845).
  15. See Sojourner Truth, The Narrative of Sojourner Truth, Olive Gilbert, ed. (Boston: 1850), digital.library.upenn.edu/women/truth/1850/1850.html; Maria Stewart, Maria W. Stewart: America’s First Black Woman Political Writer, Marilyn Richardson, ed. (Bloomington: Indiana University Press, 1987); James McCune Smith, The Works of James McCune Smith: Black Intellectual and Abolitionist, John Stauffer, ed. (New York: Oxford University Press, 2007); Frank A. Rollin, Life and Public Services of Martin R. Delaney (Boston: 1868), esp. pp. 313-367, https://archive.org/details/lifepublicservic00inroll.
  16. Eric Foner, Free Soil, Free Labor, Free Men: The Ideology of the Republican Party before the Civil War (New York: Oxford University Press, 1970).
  17. Joseph Rayback, Free Soil: The Election of 1848 (Lexington: University Press of Kentucky, 2014).
  18. Report of the Woman’s Rights Convention, Held at Seneca Falls, N.Y., July 19th and 20th, 1848 (Rochester, 1848).
  19. Gloria J. Browne-Marshall, Race, Law and American Society, Second Edition (New York: Routledge, 2013), 56.
  20. Michael Winship, “Uncle Tom’s Cabin: History of the Book in the 19th-Century United States” (University of Virginia, 2007). Accessed August 1, 2015: http://utc.iath.virginia.edu/interpret/exhibits/winship/winship.html.
  21. Charles Harold Nichols, Many Thousand Gone: The Ex-slaves’ Account of Their Bondage and Freedom (Leiden, Netherlands: E.J. Brill, 1963), 156.
  22. Amos A. Lawrence to Giles Richards, June 1, 1854, quoted in Jane J. & William H. Pease, eds., The Fugitive Slave Law and Anthony Burns: A Problem in Law Enforcement (Philadelphia: 1975), p. 43.
  23. Tyler Anbinder, Nativism and Slavery: The Northern Know Nothings and the Politics of the 1850s (New York: Oxford University Press, 1992).
  24. Charles Sumner, The Crime Against Kansas, Speech of Hon. Charles Sumner in the Senate of the United States (Boston: 1856). Available online at: https://www.senate.gov/artandhistory/history/resources/pdf/CrimeAgainstKSSpeech.pdf.
  25. Williamjames Hull Hoffer, The Caning of Charles Sumner: Honor, Idealism, and the Origins of the Civil War (Baltimore: Johns Hopkins University Press, 2010), 92.
  26. Abraham Lincoln, “Peoria Speech, October 16, 1854,” in Collected Works of Abraham Lincoln, Roy P. Basler, ed. (New Brunswick, N.J.: Rutgers University Press, 1953), 247-283. Available online at: https://www.nps.gov/liho/learn/historyculture/peoriaspeech.htm.
  27. Judgment in the U.S. Supreme Court Case Dred Scott v. John F.A. Sanford, March 6, 1857; Case Files 1792-1995; Record Group 267; Records of the Supreme Court of the United States; National Archives. Accessed August 1, 2015: http://www.ourdocuments.gov/doc.php?flash=true&doc=29.
  28. Rodney O. Davis and Douglas L. Wilson, eds., The Lincoln-Douglas Debates (Knox College, 2008), 68.
  29. Republican Party Platforms: “Republican Party Platform of 1860,” May 17, 1860. Online by Gerhard Peters and John T. Woolley, The American Presidency Project. http://www.presidency.ucsb.edu/ws/?pid=29620.
  30. Horace Greeley, The American Conflict: A History of the Great Rebellion in the United States of America, 1860-1864, Volume 1 (Hartford: 1864), 366-367.
  31. “Declaration of the Immediate Causes Which Induce and Justify the Secession of South Carolina from the Federal Union,” The Avalon Project at the Yale Law School. Accessed August 1, 2015: http://avalon.law.yale.edu/19th_century/csa_scarsec.asp.

F16 – 11. The Cotton Revolution

Eyre Crowe, Slaves Waiting for Sale, Richmond, Virginia, 1861, via University of Virginia, The Atlantic Slave Trade and Slave Life in the Americas.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

In the decades leading up to the Civil War, the Southern states experienced extraordinary change that would define the region and its role in American history for decades, even centuries, to come. Between the 1830s and the beginning of the Civil War in 1861, the American South expanded its wealth and population and became an integral part of an increasingly global economy. It did not, as previous generations of historians have told it, sit back on its cultural and social traditions and insulate itself from an expanding system of communication, trade, and production that connected Europe and Asia to the Americas. Quite the opposite: the South actively engaged new technologies and trade routes while also seeking to assimilate and upgrade its most “traditional” and culturally engrained practices—such as slavery and agricultural production—within a modernizing world.

Beginning in the 1830s, merchants from the Northeast, Europe, Canada, Mexico, and the Caribbean flocked to Southern cities, setting up trading firms, warehouses, ports, and markets. As a result, these cities—like Richmond, Charleston, St. Louis, Mobile, Savannah, and New Orleans, to name a few—doubled, and even tripled, in size and global importance. Populations became more cosmopolitan, more educated, and wealthier. Systems of class—lower-, middle-, and upper-class communities—developed where they had never clearly existed. Ports that had once focused entirely on the importation of slaves, and shipped only regionally, became homes to daily and weekly shipping lines to New York City, Liverpool, Manchester, Le Havre, and Lisbon. The world was, slowly but surely, coming closer together; and the South was right in the middle.

 

II. The Importance of Cotton

In November of 1785, the Liverpool firm of Peel, Yates, & Co. imported the first seven bales of American cotton ever to arrive in Europe. Prior to this unscheduled, and frankly unwanted, delivery, European merchants saw cotton as a product of the colonial Caribbean islands of Barbados, Saint-Domingue (now Haiti), Martinique, Cuba, and Jamaica. The American South, though relatively wide and expansive, was the go-to source for rice and, most importantly, tobacco.

Few knew that the seven bales sitting in Liverpool that winter of 1785 would change the world. But they did. By the early 1800s, the American South had developed a niche in the European market for “luxurious” long-staple cotton grown exclusively on the Sea Islands off the coast of South Carolina.1 But this was only the beginning of a massive flood to come, and the foundation of the South’s astronomical rise to global prominence. Before long, botanists, merchants, and planters alike set out to develop strains of cotton seed that would grow further west on the Southern mainland, especially in the new lands opened up by the Louisiana Purchase of 1803—an area that stretched from New Orleans in the south to what is today Minnesota, parts of the Dakotas, and Montana.

The discovery of Gossypium barbadense—often called “Petit Gulf” cotton—near Rodney, Mississippi, in 1820 changed the American and global cotton markets forever.2 “Petit Gulf,” it was said, slid through the cotton gin—a machine developed by Eli Whitney in 1794 for deseeding cotton—more easily than any other strain. It also grew tightly, producing more usable cotton than anyone had imagined to that point. Perhaps most importantly, though, it came up at a time when land in the Southwest—southern Georgia, Alabama, Mississippi, and northern Louisiana—became readily available for anyone with a few dollars and big dreams. Throughout the 1820s and 1830s, the federal government implemented several forced migrations of Native Americans, establishing a system of reservations west of the Mississippi River upon which all eastern peoples were required to relocate and settle. This system, enacted through the Indian Removal Act of 1830, allowed the federal government to survey, divide, and auction off millions of acres of land for however much bidders were willing to pay. Suddenly, farmers with dreams of owning a large plantation could purchase dozens, even hundreds, of acres in the fertile Mississippi River Delta for cents on the dollar. Pieces of land that in other, more developed places would cost thousands of dollars sold in the 1830s for several hundred, at prices as low as 40¢ per acre.3

Eli Whitney’s mechanical cotton gin revolutionized cotton production and expanded and strengthened slavery throughout the South. Eli Whitney’s Patent for the Cotton gin, March 14, 1794; Records of the Patent and Trademark Office; Record Group 241. Wikimedia.

A 19th-century cotton gin on display at the Eli Whitney Museum. Wikimedia.

Thousands of people, each one with his or her own dream of massive and immediate success, rushed to the area quickly becoming known as the “Cotton Belt.” Joseph Holt Ingraham, a writer and traveler from Maine, called it “mania.”4 William Henry Sparks, a lawyer living in Natchez, Mississippi, remembered it as “a new El Dorado” in which “fortunes were made in a day, without enterprise or work.” The change was astonishing. “Where yesterday the wilderness darkened over the land with her wild forests,” he recalled, “to-day the cotton plantations whitened the earth.”5 Money flowed from banks, many newly formed, on promises of “other-worldly” profits and overnight returns. Banks in New York City, Baltimore, Philadelphia, and even London offered lines of credit to anyone looking to buy land in the Southwest. Some even sent their own agents to purchase cheap land at auction for the express purpose of selling it, sometimes the very next day, at double and triple the original value—a process known as “speculation.”

The explosion of available land in the fertile cotton belt brought new life to the South. By the end of the 1830s, “Petit Gulf” cotton had been perfected, distributed, and planted throughout the region. Advances in steam power and water travel revolutionized Southern farmers’ and planters’ ability to deseed, bundle, and move their products to ports popping up along the Atlantic seaboard. Indeed, by the end of the 1830s, cotton had become the primary crop not only of the Southwestern states, but of the entire nation.

The numbers were staggering. In 1793, just a few years after the first, albeit unintentional, shipment of American cotton to Europe, the South produced around five million pounds of cotton, again almost exclusively the product of South Carolina’s Sea Islands. Seven years later, in 1800, South Carolina remained the primary cotton producer in the South, sending 6.5 million pounds of the luxurious long-staple blend to markets in Charleston, Liverpool, London, and New York.6 But as the tighter, more abundant and vibrant “Petit Gulf” strain moved west with the dreamers, schemers, and speculators, the American South quickly became the world’s leading cotton producer. By 1835, the five main cotton-growing states—South Carolina, Georgia, Alabama, Mississippi, and Louisiana—produced more than 500 million pounds of “Petit Gulf” for a global market stretching from New Orleans to New York to London, Liverpool, Paris and beyond. That 500 million pounds of cotton made up nearly 55 percent of the entire United States export market, a trend that continued nearly every year until the outbreak of the Civil War. Indeed, the two billion pounds of cotton produced in 1860 alone amounted to more than 60 percent of the United States’ total exports for that year.7

The astronomical rise of American cotton production came at the cost of the South’s first staple crop—tobacco. Perfected in Virginia but grown and sold in nearly every Southern territory and state, tobacco served as the South’s main economic commodity for more than a century. But tobacco was a rough crop. It treated the land poorly, depleting nutrients faster than the soil could replenish them. Tobacco fields did not last forever. In fact, fields rarely survived more than four or five cycles of growth, which left them dried and barren, incapable of growing much more than patches of grass. Of course, tobacco is, and was, an addictive substance; but because of its destructive pattern of growth, farmers had to keep moving, purchasing new lands, developing new methods of production, and even creating new fields through deforestation and westward expansion. Tobacco, then, was expensive to produce—and not only because of the ubiquitous use of slave labor. It required massive, temporary fields, large numbers of slaves and laborers, and constant movement.

Cotton was different, and it arrived at a time best suited for its success. “Petit Gulf” cotton, in particular, grew relatively quickly on cheap, widely available land. With the invention of the cotton gin in 1794, and the emergence of steam power three decades later, cotton became the average man’s commodity, the product with which the United States could expand westward, producing and reproducing Thomas Jefferson’s idyllic yeoman republic—a nation in control of its land, reaping the benefits of honest, free, and self-reliant work, a nation of families and farmers, expansion and settlement. But this all came at a violent cost. With the democratization of land ownership through Indian Removal, federal auctions, readily available credit, and the seemingly universal dream of cotton’s immediate profit, one of the South’s lasting “traditions” became normalized and engrained. And by the 1860s, that very “tradition,” seen as the backbone of Southern society and culture, would split the nation in two. The heyday of American slavery had arrived.

This map, published by the US Coast Guard, shows the percentage of slaves in the population in each county of the slave-holding states in 1860. The highest percentages lie along the Mississippi River, in the “Black Belt” of Alabama, and coastal South Carolina, all of which were centers of agricultural production (cotton and rice) in the United States.
E. Hergesheimer (cartographer), Th. Leonhardt (engraver), Map Showing the Distribution of the Slave Population of the Southern States of the United States Compiled from the Census of 1860, c. 1861. Wikimedia.

 

III. Cotton and Slavery

The rise of cotton, and the resulting upsurge in the United States’ global position, wed the South to slavery. Without slavery there could be no “Cotton Kingdom,” no massive production of raw materials stretching across thousands of acres worth millions of dollars, and employing, at different stages of the process, many hundreds of people. Indeed, cotton grew alongside slavery. The two moved hand-in-hand. The existence of slavery, and the absolute reliance the Southern economy came to have on the practice, became the defining factor in what would be known as the “Slave South.” Although slavery arrived in the Americas long before cotton became a profitable commodity, the use and purchase of slaves, the moralistic and economic justifications for the continuation of slavery, even the urgency to protect the practice from extinction before the Civil War all received new life from the rise of cotton and the economic, social, and cultural growth spurt that accompanied its success.

Slavery had existed in the South since at least 1619, when a group of Dutch traders arrived at Jamestown with 20 Africans. Although these Africans remained under the ambiguous legal status of “unfree,” rather than actual slaves, their arrival set in motion a practice that would stretch across the entire continent over the next two centuries. Slavery was everywhere by the time the American Revolution created the United States, although Northern states began a process of gradually abolishing the practice soon thereafter. In the more rural, agrarian South, slavery became a way of life, especially as farmers expanded their lands, planted more crops, and entered into the international trade market. By 1790, four years after the ratification of the Constitution, 654,121 slaves lived in the South—then just Virginia, North Carolina, South Carolina, Georgia, and the “Southwest Territory” (now Tennessee). Just twenty years later, in 1810, that number had increased to more than 1.1 million individuals in bondage.8

Pair with "Cotton picking house"

Though taken after the end of slavery, these stereographs show various stages of cotton production. The fluffy white staple fiber is first extracted from the boll (a prickly, sharp protective capsule), after which the seed is separated in the ginning and taken to a storehouse. Unknown, Picking cotton in a great plantation in North Carolina, U.S.A., c. 1865-1903. Wikimedia.

The massive change in the South’s enslaved population between 1790 and 1810 makes sense, though. During that time, the South went from a region of four states and one rather small territory to a region of six states (Virginia, North and South Carolina, Georgia, Kentucky, and Tennessee) and three rather large territories (Mississippi, Louisiana, and Orleans). The free population of the South also nearly doubled over that period—from around 1.3 million in 1790 to more than 2.3 million in 1810. It is important to note here that the enslaved population of the South did not increase at any rapid rate over the next two decades, until the cotton boom took hold in the mid-1830s. Indeed, following the constitutional ban on the international slave trade in 1808, the number of slaves in the South increased by just 750,000 in twenty years.

But then cotton came, and grew, and changed everything. Over the course of the 1830s, 40s, and 50s, slavery became so endemic to the “Cotton Belt” that travelers, writers, and statisticians began referring to the area as the “Black Belt,” not only to describe the color of the rich land, but also to describe the skin color of those forced to work its fields, line its docks, and move the products of others’ lands.

Perhaps the most important aspect of Southern slavery during this so-called “Cotton Revolution” was the value placed upon both the work and the body of the slaves themselves. Once the fever of the initial land rush subsided, land values grew more slowly, and credit flowed less freely. For Mississippi land that in 1835 cost no more than $600, a farmer or investor would have to shell out more than $3,000 in 1850. By 1860, that same land, depending on its record of production and location, could cost as much as $100,000.9 In many cases, cotton growers, especially planters with large lots and enslaved workforces, put up slaves as collateral for funds dedicated to buying more land. If that land, for one reason or another, be it weevils, a late freeze, or a simple lack of nutrients, did not produce a viable crop within a year, the planter would lose not only the new land, but also the slaves he or she put up as a guarantee of payment.

The slave markets of the South varied in size and style, but the St. Louis Exchange in New Orleans was so frequently described it became a kind of representation for all southern slave markets. Indeed, the St. Louis Hotel rotunda was cemented in the literary imagination of nineteenth-century Americans after Harriet Beecher Stowe chose it as the site for the sale of Uncle Tom in her 1852 novel, Uncle Tom’s Cabin. After the ruin of the St. Clare plantation, Tom and his fellow slaves were suddenly property that had to be liquidated. Brought to New Orleans to be sold to the highest bidder, Tom found himself “[b]eneath a splendid dome” where “men of all nations” scurried about. J. M. Starling (engraver), “Sale of estates, pictures and slaves in the rotunda, New Orleans,” 1842. Wikimedia.

So much went into the production of cotton, the expansion of land, and the maintenance of enslaved workforces that by the 1850s, nearly every ounce of credit offered by Southern, and even Northern, banks dealt directly with some aspect of the cotton market. And millions of dollars changed hands. Slaves, the literal and figurative backbones of the Southern cotton economy, served as the highest and most important expense for any successful cotton grower. Prices for slaves varied drastically, depending on skin color, sex, age, and location, both of purchase and birth. In Virginia in the 1820s, for example, a single female slave of childbearing age sold for an average of $300; an unskilled man above the age of 18 sold for around $450; and boys and girls below 13 years sold for between $100 and $150.10

By the 1840s, and into the 1850s, prices had nearly doubled—a result of both standard inflation and the increasing importance of enslaved laborers in the cotton market. In 1845, “plow boys” under the age of 18 sold for more than $600 in some areas, measured at “five or six dollars per pound.”11 “Prime field hands,” as they were called by merchants and traders, averaged $1,600 at market by 1850, a figure that fell in line with the rising prices of the cotton they picked. For example, when cotton sat at 7¢ per pound in 1838, the average “field hand” cost around $700. As the price of cotton increased to 9¢, 10¢, then 11¢ per pound over the next ten years, the average cost of an enslaved male laborer likewise rose to $775, $900, and then more than $1,600.12

The key is that cotton and slaves helped define each other, at least in the cotton South. By the 1850s, slavery and cotton had become so intertwined that the very idea of change—be it crop diversity, anti-slavery ideologies, economic diversification, or the increasingly staggering cost of purchasing and maintaining slaves—became anathema to the Southern economic and cultural identity. Cotton had become the foundation of the Southern economy. Indeed, it was the only major product, besides perhaps sugar cane in Louisiana, that the South could effectively market internationally. As a result, Southern planters, politicians, merchants, and traders became more and more dedicated—some would say “obsessed”—to the means of its production: slaves and slavery. In 1834, Joseph Ingraham wrote that “to sell cotton in order to buy negroes—to make more cotton to buy more negroes, ‘ad infinitum,’ is the aim and direct tendency of all the operations of the thorough going cotton planter; his whole soul is wrapped up in the pursuit.”13 Twenty-three years later, such pursuit had taken on a seemingly religious character, as James Stirling, an Englishman traveling through the South, observed: “[slaves] and cotton—cotton and [slaves]; these are the law and the prophets to the men of the South.”14

The Cotton Revolution was a time of capitalism, panic, stress, and competition. Planters expanded their lands, purchased slaves, extended lines of credit, and went into massive amounts of debt because they were constantly working against the next guy, the newcomer, the social mover, the speculator, the trader. A single bad crop could ruin even the wealthiest, most landed planter, and upend the lives of his or her slaves and their families. Although the cotton market was large and profitable, it was also fickle, risky, and cost intensive. The more wealth one gained, the more land one needed to procure, which led to more slaves, more credit, and more mouths to feed. The decades before the Civil War in the South, then, were not times of slow, simple tradition. They were times of high competition, high risk, and high reward, no matter where one stood in the social hierarchy. But the risk was not always economic in nature.

In southern cities like Norfolk, VA, markets sold not only vegetables, fruits, meats, and sundries, but also slaves. Enslaved men and women, like the two walking in the direct center, lived and labored next to free people, black and white. S. Weeks, “Market Square, Norfolk,” from Henry Howe’s Historical Collections of Virginia, 1845. Wikimedia.

The most tragic, indeed horrifying, aspect of slavery was its inhumanity. All slaves had memories, emotions, experiences, and thoughts. They saw their experiences in full color, felt the pain of the lash, the heat of the sun, and the heartbreak of loss, whether through death, betrayal, or sale. Communities developed upon a shared sense of suffering, common work, and even family ties. Slaves communicated in the slave markets of the urban South, and worked together to help their families, ease their loads, or simply frustrate their owners. Simple actions of resistance, such as breaking a hoe, running a wagon off the road, causing a delay in production due to injury, running away, or even pregnancy, provided a language shared by nearly all slaves in the agricultural workforce, a sense of unity that remained unsaid, but was acted out daily.

Beyond the basic and confounding horror of it all, the problem of slavery in the cotton South was twofold. First, and most immediate, was the fear and risk of rebellion. With nearly four million individual slaves residing in the South in 1860, and nearly 2.5 million living in the “Cotton Belt” alone, the system of communication, resistance, and potential violence amongst slaves did not escape the minds of slaveholders across the region and nation as a whole. As early as 1787, Thomas Jefferson wrote in his Notes on the State of Virginia that blacks and whites were “two warring nations” held at bay by the existence of slavery. If white slaveowners did not remain vigilant, Jefferson wrote, the presence of Africans in the Americas would “produce convulsions, which will probably never end but in the extermination of the one or the other race.”15

Southern writers, planters, farmers, merchants, and politicians expressed the same fears more than a half century later. “The South cannot recede,” declared an anonymous writer in an 1852 issue of the New Orleans-based De Bow’s Review. “She must fight for her slaves or against them. Even cowardice would not save her.”16 To many slaveholders in the South, slavery was the saving grace not only of their own economic stability, but also the maintenance of peace and security in everyday life. Much of pro-slavery ideology rested upon the notion that slavery provided a sense of order, duty, and legitimacy to the lives of individual slaves, feelings that Africans and African Americans, it was said, could not otherwise experience. Without slavery, many thought, “blacks” (the word most often used for “slaves” in regular conversation) would become violent, aimless, and uncontrollable.

Some commentators recognized the problem in the 1850s, as the internal slave trade—the legal trade of slaves between states, along rivers, and along the Atlantic coastline—picked up in the decade before the Civil War. The problem was rather simple: the more slaves one owned, the more money it cost to (a) maintain them and (b) extract product from their work. As planters and cotton growers expanded their lands and purchased more slaves, their expectations increased.

And productivity, in large part, did increase. But it came on the backs of slaves with heavier workloads, longer hours, and more intense punishments. “The great limitation to production is labor,” wrote one commentator in the American Cotton Planter in 1853. Many planters recognized this limitation and worked night and day, sometimes literally, to find the furthest extent of that limit.17 According to some contemporary accounts, by the mid-1850s the expected production of an individual slave in Mississippi’s Cotton Belt had increased from between four and five bales (weighing about 500 pounds each) per day to between eight and ten bales per day, on average.18 Other, perhaps more reliable sources, such as the account book of Buena Vista Plantation in Tensas Parish, Louisiana, list average daily production at between 300 and 500 pounds “per hand,” with weekly averages ranging from 1,700 to 2,100 pounds “per hand.” Cotton production “per hand” increased by 600 percent in Mississippi between 1820 and 1860.19 Each slave, then, was working longer, harder hours to keep up with his or her master’s expected yield.

Here was capitalism with its most colonial, violent, and exploitative face. Humanity became a commodity used and worked to produce profit for a select group of investors, regardless of its shortfalls, dangers, and immoralities. But slavery, profit, and cotton did not exist only in the rural South. The Cotton Revolution sparked the growth of an urban South, cities that served as Southern hubs of a global market, conduits through which the work of slaves and the profits of planters met and funded a wider world.

The slave trade sold bondspeople — men, women, and children — like mere pieces of property, as seen in the advertisements produced during the era. 1840 poster advertising slaves for sale in New Orleans. Wikimedia.

 

IV. The South and the City

Although much of the story of slavery and cotton lies in the rural areas where cotton actually grew, where slaves worked the fields, and where planters and farmers held sway over their plantations and farms, the 1830s, 40s, and 50s saw an extraordinary spike in urban growth across the South. For nearly a half century after the Revolution, the South existed as a series of plantations, county seats, and small towns, some connected by roads, others connected only by rivers, streams, and lakes. Cities certainly existed, but they served more as local ports than as regional or national commercial hubs. For example, New Orleans, then the capital of Louisiana, which entered the union in 1812, was home to just over 27,000 people in 1820; and even with such a seemingly small population, it was the second largest city in the South—Baltimore had more than 62,000 people in 1820.20 Given the standard 19th-century measurement of an urban space (2,500 or more people), the South had just ten such spaces in that year, one of which—Mobile, Alabama—contained only 2,672 individuals, nearly half of whom were enslaved.21

As late as the 1820s, Southern life was predicated upon a rural lifestyle—farming, laboring, acquiring land and slaves, and producing whatever that land and those slaves could produce. The market, often located in the nearest town or city, rarely stretched beyond state lines. Even in places like New Orleans, Charleston, and Norfolk, Va., which had active ports as early as the 1790s, shipments rarely, with some notable exceptions, left American waters or traveled further than the closest port down the coast. In the first decades of the nineteenth century, American involvement in international trade was largely confined to ports in New York, Boston, Philadelphia, and, sometimes, Baltimore—which loosely falls under the demographic category of the South. Imports dwarfed exports. In 1807, U.S. imports exceeded exports by nearly $100 million; and even as the Napoleonic Wars raged in Europe, causing a drastic decrease in European production and trade, the United States still took in almost $50 million more than it sent out.22

Cotton changed much of this, at least with respect to the South. Before cotton, the South had few major ports, almost none of which actively maintained international trade routes or even domestic supply routes. Internal travel and supply was difficult, especially on the waters of the Mississippi River, the main artery of the North American continent and the eventual goldmine of the South. With a strong current, deadly undertow, and constant sharp turns, sandbars, and subsystems, navigating the Mississippi was difficult and dangerous. It promised a revolution in trade, transportation, and commerce only if the technology existed to handle its impossible bends and fight against its southbound current. By the 1820s, and into the 1830s, small ships could successfully navigate their way to New Orleans from as far north as Memphis, and even St. Louis, if they so dared. But the problem was getting back. Most often, traders and sailors broke up their boats upon landing in New Orleans, selling the wood for a quick profit or a journey home on a wagon or caravan.

The rise of cotton went hand in hand with a change in transportation technology that helped turn Southern cotton into one of the world’s leading commodities. In January 1812, a 371-ton ship called the New Orleans arrived at its namesake city from the distant internal port of Pittsburgh, Pennsylvania. This was the first steamboat to navigate the internal waterways of the North American continent from one end to the other and remain capable of returning home. The technology was far from perfect—the New Orleans sank two years later after hitting a submerged sandbar covered in driftwood—but its successful trial promised a bright new future for river-based travel.

And that future was, indeed, bright. Just five years after the New Orleans arrived in its namesake city, seventeen steamboats ran regular upriver lines. By the mid-1840s, more than 700 steamboats did the same. In 1860, the port of New Orleans received and unloaded 3,500 steamboats, all focused entirely upon internal trade. These boats carried around 160,000 tons of raw product that merchants, traders, and agents converted into nearly $220 million in trade, all in a single year.23 More than 80 percent of the yield was from cotton alone, the product of the same fields tilled, expanded, and sold over the preceding three decades. Only now, in the 1840s and 1850s, could those fields, plantations, and farms simply load their products onto a boat and wait for the profit, credit, or supplies to return from “downriver.”

Gordon, pictured here, endured terrible brutality from his master before escaping to Union Army lines in 1863. He would become a soldier and help fight to end the violent system that produced the horrendous scars on his back. Matthew Brady, Gordon, 1863. Wikimedia.

The explosion of steam power changed the face of the South, and indeed the nation as a whole. Everything that could be steam-powered was steam-powered, sometimes with very mixed results. Cotton gins, wagons, grinders, looms, and baths, among countless others, all fell under the net of this new technology. Most importantly, the South’s rivers, lakes, and bays were no longer barriers and hindrances to commerce. Quite the opposite: they had become the means by which commerce flowed, the roads of a modernizing society and region. The ability to use internal waterways connected the rural interior to increasingly urban ports, the sources of raw materials—cotton, tobacco, wheat, and the like—to an eager global market.

Coastal ports like New Orleans, Charleston, Norfolk, and even Richmond became destinations for steamboats and coastal carriers. Merchants, traders, skilled laborers, and foreign speculators and agents flooded the towns. In fact, the South experienced a stronger trend in urbanization between 1820 and 1860 than the seemingly more industrial, urban-based North. But urbanization of the South simply looked different from that seen in the North and in Europe. Where most Northern and some European cities (most notably London, Liverpool, Manchester, and Paris) developed along the lines of industry, creating public spaces to boost the morale of wage laborers in factories, on the docks, and in storehouses, Southern cities developed within the cyclical logic of sustaining the trade in cotton that justified and paid for the maintenance of an enslaved labor force. The growth of Southern cities, then, allowed slavery to flourish and brought the South into a more modern world.

Between 1820 and 1860, quite a few Southern towns experienced dramatic population growth, which paralleled the increase in cotton production and international trade to and from the South. The 27,176 people New Orleans claimed in 1820 expanded to more than 168,000 by 1860. In fact, in New Orleans, the population nearly quadrupled from 1830 to 1840 as the Cotton Revolution hit full stride. At the same time, Charleston’s population doubled, from 24,780 to 40,522; Richmond expanded threefold, growing from a town of 12,067 to a capital city of 37,910; and St. Louis experienced the largest increase of any city in the nation, expanding from a frontier town of 10,049 to a booming Mississippi River metropolis of 160,773.24

The city and the field, the urban center and the rural space, were inextricably linked in the decades before the Civil War. And that relationship connected the region to a global market and community. As Southern cities grew, they became more cosmopolitan, attracting types of people either unsuited for, or uninterested in, rural life. These people—merchants, skilled laborers, traders, sellers of all kinds and colors—brought rural goods to a market desperate for raw materials. Everyone, it seemed, had a place in the cotton trade. Agents, many of them transients from the North, and in some cases Europe, represented the interests of planters and cotton farmers in the cities, making connections with traders who in turn made deals with manufactories in the Northeast, Liverpool, and Paris.

Among the more important aspects of Southern urbanization was the development of a middle class in the urban centers, something that never fully developed in the more rural areas. In a very general sense, the rural South fell under a two-class system in which a landowning elite controlled the politics and most of the capital, and a working poor survived on subsistence farming or basic, unskilled labor funded by the elite. The development of large urban centers founded upon trade, and flush with transient populations of sailors, merchants, and travelers, gave rise to a large, highly developed middle class in the South. Predicated upon the idea of separation from those above and below them, middle-class men and women in the South thrived in the active, feverish rush of port city life.

Filled from the ranks of skilled craftsmen, merchants, traders, speculators, and storeowners, the Southern middle class took on a communal identity that embraced the urban lifestyle. Southern fashion paid less attention to practical function—such as broad-brimmed hats to protect one from the sun, knee-high boots for horse riding, and linen shirts and trousers to fight the heat of an unrelenting sun. Silk, cotton, and bright colors came into vogue, especially in coastal cities like New Orleans and Charleston; cravats, golden broaches, diamonds, and “the best stylings of Europe” became the standards of urban middle-class life in the South.25 Neighbors, friends, and business partners formed and joined the same benevolent societies, dedicated, in a form of self-aggrandizing virtue, to aiding the less fortunate in society—the orphans, the impoverished, the destitute. But in many cases these benevolent societies simply served as a way to keep other people out of middle-class circles, sustaining both wealth and social prestige within an insular, well-regulated community. Members and partners married each other’s sisters, stood as godparents for each other’s children, and served, when the time came, as executors of fellow members’ wills.

The city bred exclusivity. That was part of the rush, part of the fever of the time. Built upon the cotton trade, funded by European and Northeastern merchants, markets, and manufactories, Southern cities became headquarters of the nation’s largest and most profitable commodities—cotton and slaves. And they welcomed the world with open checkbooks and open arms.

 

V. Southern Cultures

Life, too, remained. The South, for all of its economic, agricultural, and technological growth, still housed many people, many cultures, and many individual lives. To understand the global and economic functions of the South, we also must understand the people who made the whole thing work, just by living in the region, going to work each day, whether forced or voluntary, and participating in the general dialogue of a community. The South, more than perhaps any other region in the United States, had a great diversity of cultures and situations. The South still relied on the existence of slavery; and as a result, it was home to nearly 4 million enslaved people by 1860, amounting to more than 45 percent of the entire Southern population.26 Naturally, these people, though fundamentally unfree in their movement, developed a culture all their own. They created kinship and family networks, systems of (often illicit) trade, linguistic codes, religious congregations, and even benevolent and social aid organizations—all this within the grip of slavery, a system dedicated to extraction rather than development, work and production rather than community and emotion.

The concept of family, more than anything else, played a crucial role in the daily lives of slaves. Family and kinship networks, and the benefits they carried, represented an institution through which slaves could piece together a sense of community, a sense of feeling and dedication, separate from the forced system of production that defined their daily lives. The creation of family units, distant relations, and communal traditions allowed slaves to maintain religious beliefs, ancient ancestral traditions, and even names passed down from generation to generation in a way that challenged the ubiquitous nature of enslavement. Ideas passed between relatives on different plantations, names given to children in honor of the deceased, and the basic forms of love and devotion that bind closely-knit societies created a sense of individuality, an identity that assuaged the loneliness and desperation of enslaved life. Family defined how each plantation, each community, functioned, grew, and labored.

Nothing under slavery lasted long, at least not in the same form. Slave families and networks were no exceptions to this rule. African-born slaves during the seventeenth and eighteenth centuries engaged in marriages—sometimes polygamous—with those of the same ethnic groups whenever possible. This, most importantly, allowed for the maintenance of cultural traditions, such as language, religion, name practices, and even the rare practice of bodily scarring. In some parts of the South, such as Louisiana and coastal South Carolina, ethnic homogeneity thrived, and as a result, traditions and networks survived relatively unchanged for decades. As the number of slaves arriving in the United States increased, and generations of American-born slaves overtook the original African-born populations, the practice of marriage, especially among members of the same ethnic group, or even simply the same plantation, became vital to the continuation of aging traditions. Marriage served as the single most important aspect of cultural and identity formation, as it connected slaves to their own pasts, and gave some sense of protection for the future.27 By the start of the Civil War, approximately two-thirds of slaves were members of nuclear households, each household averaging six people—mother, father, children, and often a grandparent, elderly aunt or uncle, and even “in-laws.” Those not members of a marriage bond, or even a nuclear family, still maintained family ties, most often living with a single parent, brother, sister, or grandparent.28

Free people of color were present throughout the American South, particularly in urban areas like Charleston and New Orleans. Some were relatively well off, like this femme de couleur libre who posed with her mixed-race child in front of her New Orleans home, maintaining a middling position between free whites and unfree blacks. As the nineteenth century progressed, however, free people of color lost their status and any rights they had as slavery expanded and strengthened. Free woman of color with quadroon daughter; late 18th century collage painting, New Orleans. Wikimedia.

Many slave marriages endured for many years, but, as with all things under slavery, the threat of disruption, often through sale, always loomed. As the internal slave trade increased following the constitutional ban on slave importation in 1808 and the rise of cotton in the 1830s and 1840s, slave families, especially those established prior to the slaves’ arrival in the United States, came under increased threat. Hundreds of thousands of marriages, many with children, fell victim to sale “downriver”—a euphemism for the near constant flow of slave laborers down the Mississippi River to the developing cotton belt in the Southwest—as cheap land turned into cheap cotton.29 In fact, during the Cotton Revolution alone, between one-fifth and one-third of all slave marriages were broken up through sale or forced migration. But this was not the only threat. Planters and slaveowners of all shapes and sizes recognized that marriage was, in the most basic and tragic sense, a privilege granted and defined by them for their slaves. As a result, many slaveholders used slaves’ marriages, or the threats thereto, to squeeze out more production, counteract disobedience, or simply make a gesture of power and superiority.

Threats to family networks, marriages, and household stability did not stop with the death of a master. A slave couple could live their entire lives together, having been born, raised, and married on the same plantation, and, following the death of their master, find themselves on opposite sides of the known world. It took only a single relative, executor, creditor, or friend of the deceased to make a claim against the estate to cause the sale and dispersal of an entire slave community.

Enslaved women were particularly vulnerable to the shifts of fate attached to slavery. In many cases, female slaves did the same work as men, spending the day—from sunup to sundown—in the fields picking and bundling cotton. In some rare cases, especially among the larger plantations, planters tended to use women as house servants more than men, but this was not universal. In both cases, however, female slaves’ experiences differed from those of their male counterparts, husbands, and neighbors. Sexual violence, unwanted pregnancies, and constant childrearing while continuing to work the fields all made life as a female slave more prone to disruption and uncertainty. Harriet Jacobs, an enslaved woman from North Carolina, chronicled her master’s attempts to sexually abuse her in her narrative, Incidents in the Life of a Slave Girl. Jacobs suggested that her successful attempts to resist sexual assault and her determination to love whom she pleased were “something akin to freedom.”30 But this “freedom,” however empowering and contextual, did not cast a wide net. Many enslaved women had no choice concerning love, sex, and motherhood. On plantations, small farms, and even in cities, rape was ever-present. Like the splitting of families, slaveowners used sexual violence as a form of terrorism, a way to promote increased production, obedience, and power relations. And this was not restricted only to unmarried women. In numerous contemporary accounts, particularly violent slaveowners forced men to witness the rape of their wives, daughters, and relatives, often as punishment, but occasionally as a sadistic expression of power and dominance.31

The key point is that, as property, enslaved women had no legal recourse, and society, by and large, did not see a crime in this type of violence. Racist pseudo-scientists claimed that whites could not physically rape Africans or African Americans, as the sexual organs of each were not compatible in that way. State law, in some cases, supported this view, claiming that rape could occur only between two white people or between a black man and a white woman. All other cases fell under silent acceptance.32 The consequences of rape, too, fell to the victim in the case of slaves. Pregnancies that resulted from rape did not always lead to a lighter workload for the mother. And if a slave acted out against a rapist, whether her master, mistress, or any other white attacker, her actions were seen as crimes rather than desperate acts of survival. For example, a 19-year-old slave named Celia fell victim to repeated rape by her master in Callaway County, Missouri. Between 1850 and 1855, Robert Newsom raped Celia hundreds of times, producing two children and several miscarriages. Sick and desperate in the fall of 1855, Celia took a club and struck her master in the head, killing him. But instead of sympathy and aid, or even an honest attempt to understand and empathize, the community rallied around their dead friend, calling for Celia’s execution. On November 16, 1855, after a trial of ten days, Celia, the 19-year-old rape victim and slave, was hanged for her crimes against her master.33

This photograph shows Selina Gray and two of her daughters. Gray was the enslaved housekeeper to Robert E. Lee. Via the National Park Service.

Gender inequality did not always fall along the same lines as racial inequality. Southern society, especially in the age of cotton, deferred to white men, by whom laws, social norms, and cultural practices were written, dictated, and maintained. White and free women of color lived in a society dominated, in nearly every aspect, by men. Denied voting rights, women of all statuses and colors had no direct representation in the creation and discussion of law. Husbands, it was said, represented their wives, as the public sphere was too violent, heated, and high-minded for the female intellectual and physical frame. Society expected women to represent the foundations of the republic, gaining respectability through their work at home, in support of their husbands and children, away from the rough and boisterous realm of masculinity. In many cases, too, the law did not protect women the same way it protected men. In most states, marriage, an act expected of any self-respecting, reasonable woman of any class, effectively transferred all of a woman’s property to her husband, forever, regardless of claim or command. Divorce existed, but it hardly worked in a woman’s favor; even when successful, it often ruined the wife’s standing in society and, in some well-known cases, led to suicide.34

Life on the ground in the cotton South, like the cities, systems, and networks within which it rested, defied the standard narrative of the Old South. Slavery existed to dominate, yet slaves formed bonds, maintained traditions, and crafted new culture. They fell in love, had children, and protected one another using the privileges granted them by their captors and the ingenuity common to all human beings. They were resourceful, brilliant, and vibrant, and they created freedom where freedom seemingly could not exist. But threats remained for all people in the cotton South, especially those frowned upon by the patriarchal system upon which Southern society was built. Within those communities, resilience and dedication often led to cultural sustenance. Among the enslaved, women, and the impoverished-but-free, culture thrived in ways that are difficult to see through the bales of cotton and the stacks of money sitting on the docks and in the counting houses of the South’s urban centers. But religion, honor, and pride transcended material goods, especially among those who could not express themselves that way.

The issue of emigration elicited disparate reactions from African Americans. Tens of thousands left the United States for Liberia, a map of which is shown here, to pursue greater freedoms and prosperity. Most emigrants did not experience such success, but Liberia continued to attract black settlers for decades. J. Ashmun, Map of the West Coast of Africa from Sierra Leone to Cape Palmas, including the colony of Liberia…, 1830. Library of Congress.

 

VI. Religion and Honor in the Slave South

Economic growth, violence, and exploitation coexisted with and mutually reinforced evangelical Christianity in the South. The revivals of the Second Great Awakening established the region’s prevailing religious culture. Led by Methodists, Baptists, and, to a lesser degree, Presbyterians, this intense period of religious regeneration swept along the southern backcountry. By the outbreak of the Civil War, the vast majority of southerners who affiliated with a religious denomination belonged to either the Baptist or Methodist faith.35 Both churches in the South briefly attacked slavery before transforming into some of the most vocal defenders of slavery and the southern social order.

Southern ministers contended that God himself had selected Africans for bondage but also considered the evangelization of slaves to be one of their greatest callings. Missionary efforts among southern slaves largely succeeded, and Protestantism spread rapidly among African Americans, leading to a proliferation of biracial congregations and prominent independent black churches. Some black and white southerners forged positive and rewarding biracial connections; more often, however, they described strained or superficial religious relationships.

As the institution of slavery hardened racism in the South, relationships between missionaries and Native Americans transformed as well. Missionaries of all denominations were among the first to represent themselves as “pillars of white authority.” After the Louisiana Purchase in 1803, plantation culture expanded into the Deep South, and mission work became a crucial element of Christian expansion. Frontier mission schools carried a continual flow of Christian influence into Native American communities. Some missionaries learned indigenous languages, but many more worked to prevent indigenous children from speaking their native tongues, insisting upon English for Christian understanding. By the time of the Indian removals of 1835 and the Trail of Tears in 1838, missionaries in the South preached a proslavery theology that emphasized obedience to masters, the biblical basis of racial slavery via the curse of Ham, and the “civilizing” paternalism of slave-owners.

Slaves most commonly received Christian instruction from white preachers or masters, whose religious message typically stressed slave subservience. Anti-literacy laws ensured that most slaves would be unable to read the Bible in its entirety and thus could not acquaint themselves with such inspirational stories as Moses delivering the Israelites out of slavery. Contradictions between God’s Word and the cruelty of masters and mistresses did not pass unnoticed by many enslaved African Americans. As former slave William Wells Brown declared, “slaveholders hide themselves behind the Church,” adding that “a more praying, preaching, psalm-singing people cannot be found than the slaveholders of the South.”36

Many slaves chose to create and practice their own version of Christianity, one that typically incorporated aspects of traditional African religions with limited input from the white community. Nat Turner, for example, found inspiration in religion early in life. Adopting an austere Christian lifestyle during his adolescence, Turner claimed to have been visited by “spirits” during his twenties and considered himself something of a prophet. He claimed to have had visions, in which he was called upon to do the work of God, leading some contemporaries (as well as historians) to question his sanity.37

Inspired by his faith, Turner led the deadliest slave rebellion in the antebellum South. On the morning of August 22, 1831, in Southampton County, Virginia, Nat Turner and six collaborators attempted to free the region’s enslaved population. Turner initiated the violence by killing his master with an axe blow to the head. By the end of the day, Turner and his band, which had grown to over fifty men, had killed fifty-seven white men, women, and children on eleven farms. By the next day, the local militia and white residents had captured or killed all of the participants except Turner, who hid for a number of weeks in nearby woods before being captured and executed. The white terror that followed Nat Turner’s rebellion transformed southern religion, as anti-literacy laws increased and black-led churches were broken up and placed under the supervision of white ministers.

This woodcut captured the terror felt by white southerners in the aftermath of Nat Turner’s rebellion. After the rebellion, fearful white reactionaries killed hundreds of enslaved people—most of whom were unconnected to the rebellion—and the state created stricter, more limiting laws concerning slavery. Via the African American Intellectual History Society.

Evangelical religion also shaped understandings of what it meant to be a southern man or a southern woman. Southern manhood was largely shaped by an obsession with masculine honor, whereas southern womanhood centered on expectations of sexual virtue or purity. Honor prioritized the public recognition of white masculine claims to reputation and authority. Southern men developed a code to ritualize their interactions with each other and to perform their expectations of honor. This code structured language and behavior and was designed to minimize conflict. But when conflict did arise, the code also provided rituals that would reduce the resulting violence.

The formal duel exemplified the code in action. If two men could not settle a dispute through the arbitration of their friends, they would meet on the field of honor to prove their equal status. Duelists arranged a secluded meeting, chose from a set of deadly weapons, and risked their lives as they clashed with swords or fired pistols at one another. Some of the most illustrious men in American history participated in a duel at some point during their lives, including President Andrew Jackson, Vice President Aaron Burr, and United States senators Henry Clay and Thomas Hart Benton. In all but Burr’s case, dueling assisted in elevating these men to prominence.

Violence among the lower classes, especially those in the backcountry, involved fistfights and shootouts. Tactics included the sharpening of fingernails and the filing of teeth into razor-sharp points, which would be used to gouge eyes and bite off ears and noses. In a duel, a gentleman achieved recognition by risking his life rather than killing his opponent, whereas those involved in rough-and-tumble fighting achieved victory through maiming their opponent.

The legal system was partially to blame for the prevalence of violence in the Old South. Although states and territories had laws against murder, rape, and various other forms of violence, including specific laws against dueling, upper-class southerners were rarely prosecuted, and juries often acquitted the accused. Although hundreds of duelists fought and killed one another, few faced prosecution, and only one, Timothy Bennett of Belleville, Illinois, was ever executed. By contrast, prosecutors routinely sought cases against lower-class southerners, who were found guilty in greater numbers than their wealthier counterparts.

The southern emphasis on honor affected women as well. While southern men worked to maintain their sense of masculinity, so too did southern women cultivate a sense of femininity. Femininity in the South was intimately tied to the domestic sphere, even more so than for women in the North. The cult of domesticity strictly limited the ability of wealthy southern women to engage in public life. While northern women began to organize reform societies, southern women remained bound to the home, where they were instructed to cultivate their families’ religious sensibility and manage their households. Managing the household was not easy work, however. For women on large plantations, it included directing a large bureaucracy of potentially rebellious slaves. For the vast majority of southern women, who did not live on plantations, it included nearly constant work in keeping families clean, fed, and well-behaved. On top of these duties, many southern women were required to assist with agricultural tasks.

Female labor was an important aspect of the southern economy, but the social position of women in southern culture was understood not through economic labor but rather through moral virtue. While men fought to get ahead in the turbulent world of the cotton boom, women were instructed to offer a calming, moralizing influence on husbands and children. The home was to be a place of quiet respite and spiritual solace. Under the guidance of a virtuous woman, the southern home would foster the values required for economic success and cultural refinement. Female virtue came to be understood largely as a euphemism for sexual purity, and southern culture, southern law, and southern violence largely centered on protecting that purity from any imagined threat. In a world saturated with the sexual exploitation of black women, southerners developed a paranoid obsession with protecting the sexual purity of white women. Black men were presented as an insatiable sexual threat. Racial systems of violence and domination were wielded with crushing intensity for generations, all in the name of keeping white womanhood as pure as the cotton that anchored southern society.

 

VII. Conclusion

Cotton created the antebellum South. The wildly profitable commodity opened a previously closed society to the grandeur, the profit, the exploitation, and the social dimensions of a larger, more connected, global community. In this way, the South, and the world, benefitted from the Cotton Revolution and the urban growth it sparked. But not all that glitters is gold. Slavery remained, and as a result of urbanization, the internal slave trade increased to untold heights as the 1860s approached. Politics, race relations, and the burden of slavery continued beneath the roar of steamboats, counting houses, and the exchange of goods. Underneath it all, many questions remained—chief among them, what to do if slavery somehow came under threat.

 

VIII. Reference Material

This chapter was edited by Andrew Wegmann, with content contributions by Ian Beamish, Amanda Bellows, Marjorie Brown, Matthew Byron, Steffi Cerato, Kristin Condotta, Mari Crabtree, Jeff Fortney, Robert Gudmestad, John Marks, Maria Montalvo, James Anthony Owen, Katherine Rohrer, Marie Stango, James Wellborn, Ben Wright, and Ashley Young.

Preferred Citation: Ian Beamish et al., “The Cotton Revolution,” Andrew Wegmann, ed., in The American Yawp, Joseph Locke and Ben Wright, eds., last modified August 1, 2016, http://www.AmericanYawp.com.

 

Recommended Reading

  • Edward E. Baptist, The Half Has Never Been Told: Slavery and the Making of American Capitalism (New York: Basic Books, 2014).
  • Sven Beckert, Empire of Cotton: A Global History (New York: Alfred A. Knopf, 2014).
  • John W. Blassingame, The Slave Community: Plantation Life in the Antebellum South (New York: Oxford University Press, 1979).
  • Wilma A. Dunaway, The African-American Family in Slavery and Emancipation (Cambridge, U.K.: Cambridge University Press, 2003).
  • Beth English, A Common Thread: Labor, Politics, and Capital Mobility in the Textile Industry (Athens: The University of Georgia Press, 2006).
  • Lacy K. Ford, Deliver Us From Evil: The Slavery Question in the Old South (New York: Oxford University Press, 1999).
  • Eugene D. Genovese, Roll, Jordan, Roll: The World the Slaves Made (New York: Vintage Books, 1974).
  • Barbara Hahn, Making Tobacco Bright: Creating an American Commodity, 1617-1937 (Baltimore, MD: The Johns Hopkins University Press, 2011).
  • Gwendolyn Midlo Hall, Slavery and African Ethnicities in the Americas: Restoring the Links (Chapel Hill: The University of North Carolina Press, 2005).
  • Samuel S. Hill, Southern Churches in Crisis Revisited (Tuscaloosa: University of Alabama Press, 1999).
  • Walter Johnson, River of Dark Dreams: Slavery and Empire in the Cotton Kingdom (Cambridge, MA: Belknap Press of Harvard University Press, 2013).
  • Jacqueline Jones, Labor of Love, Labor of Sorrow: Black Women, Work, and the Family, from Slavery to the Present (New York: Basic Books, 2010).
  • Peter Kolchin, American Slavery: 1619-1877 (New York: Hill and Wang, 1993).
  • Angela Lakwete, Inventing the Cotton Gin: Machine and Myth in Antebellum America (Baltimore, MD: The Johns Hopkins University Press, 2003).
  • Scott P. Marler, The Merchants’ Capital: New Orleans and the Political Economy of the Nineteenth-Century South (New York: Cambridge University Press, 2013).
  • Robin McDonald and Valerie Pope Burnes, Visions of the Black Belt: A Cultural Survey of the Heart of Alabama (Tuscaloosa: The University of Alabama Press, 2015).
  • Maurie D. McInnis, Slaves Waiting for Sale: Abolitionist Art and the American Slave Trade (Chicago: The University of Chicago Press, 2011).
  • Dylan C. Penningroth, The Claims of Kinfolk: African American Property and Community in the Nineteenth-Century South (Chapel Hill: The University of North Carolina Press, 2003).
  • Joshua D. Rothman, Flush Times and Fever Dreams: A Story of Capitalism and Slavery in the Age of Jackson (Athens: University of Georgia Press, 2012).
  • Diane Miller Sommerville, Rape and Race in the Nineteenth-Century South (Chapel Hill: The University of North Carolina Press, 2004).
  • Larry E. Tise, Proslavery: A History of the Defense of Slavery in America, 1701-1840 (Athens: The University of Georgia Press, 1987).
  • Marie Tyler-McGraw, At the Falls: Richmond, Virginia, and Its People (Chapel Hill: The University of North Carolina Press, 1994).
  • Andrew N. Wegmann, “Skin Color and Social Practice: The Problem of Race and Class Among New Orleans Creoles and Across the South, 1718-1862” (Ph.D. diss.: Louisiana State University, 2015).
  • Jonathan Daniel Wells and Jennifer R. Green, eds., The Southern Middle Class in the Long Nineteenth Century (Baton Rouge: Louisiana State University Press, 2011).
  • Emily West, Chains of Love: Slave Couples in Antebellum South Carolina (Urbana: University of Illinois Press, 2004).
  • Betty Wood, The Origins of American Slavery: Freedom and Bondage in the English Colonies (New York: Hill and Wang, 1997).

 

Notes

  1. See Sven Beckert, Empire of Cotton: A Global History (New York: Alfred A. Knopf, 2014), 103; and Angela Lakwete, Inventing the Cotton Gin: Machine and Myth in Antebellum America (Baltimore, MD: The Johns Hopkins University Press, 2003), 148-151.
  2. Walter Johnson, River of Dark Dreams: Slavery and Empire in the Cotton Kingdom (Cambridge, MA: Harvard University Press, 2013), 151-152; and John Solomon Otto, The Southern Frontiers, 1607-1860: The Agricultural Evolution of the Colonial and Antebellum South (Westport, CT: Greenwood Press, 1989), 94-96.
  3. Joshua D. Rothman, Flush Times and Fever Dreams: A Story of Capitalism and Slavery in the Age of Jackson (Athens: University of Georgia Press, 2012), 6-7; David J. Libby, Slavery and Frontier Mississippi, 1720-1835 (Jackson: University Press of Mississippi, 2004), 30-36; and Scott Reynolds Nelson, A Nation of Deadbeats: An Uncommon History of America’s Financial Disasters (New York: Alfred A. Knopf, 2012), 115-118.
  4. Joseph Holt Ingraham quoted in Rothman, Flush Times and Fever Dreams, 5.
  5. W. H. Sparks, Memories of Fifty Years (Philadelphia, PA: Claxton, Remsen & Haffelfinger, 1870), 364.
  6. Beckert, Empire of Cotton, 102-103.
  7. For more cotton statistics, see Rothman, Flush Times and Fever Dreams, 3-5, 96-103; Johnson, River of Dark Dreams, 254-260; Beckert, Empire of Cotton, 102-104; Avery Plaw, “Slavery,” in Cynthia Clark, ed., The American Economy: A Historical Encyclopedia (Santa Barbara, CA: ABC-Clio, 2011), 108-109, 787-798; William J. Phalen, The Consequences of Cotton in Antebellum America (Jefferson, NC: McFarland & Co., 2014), 110-114; and Gene Dattel, Cotton and Race in the Making of America: The Human Costs of Economic Power (Lanham, MD: Rowman & Littlefield, 2009), 370-371.
  8. For a valuable and approachable rundown of American slavery statistics, see Jenny Bourne, “Slavery in the United States,” at https://eh.net/encyclopedia/slavery-in-the-united-states/. For statistics earlier than 1790, see Morgan, American Slavery, American Freedom, appendix; and Kolchin, American Slavery, 252-257. All slavery statistics hereafter refer to Bourne’s “Slavery in the United States” unless otherwise noted.
  9. On antebellum land prices, especially in the cotton belt, see Phalen, Consequences of Cotton, 157-160; Otto, The Southern Frontiers, 86-99; Beth English, A Common Thread: Labor, Politics, and Capital Mobility in the Textile Industry (Athens: The University of Georgia Press, 2006), 40-44; and Harold D. Woodman, King Cotton and His Retainers: Financing and Marketing the Cotton Crop of the South, 1800-1925 (Columbia: University of South Carolina Press, 1990), chapter 11.
  10. See Brenda E. Stevenson, Life in Black and White: Family and Community in the Slave South (New York: Oxford University Press, 1996), 171-181.
  11. See Walter Johnson, Soul by Soul: Life Inside the Antebellum Slave Market (Cambridge, MA: Harvard University Press, 1999), 140-141; and John Brown, Slave Life in Georgia: A Narrative of the Life, Sufferings, and Escapes of John Brown, a Fugitive Now in England (London: L. A. Chamerovzow, 1855), 16-17.
  12. James L. Huston, “The Pregnant Economies of the Border South, 1840-1860: Virginia, Kentucky, Tennessee, and the Possibilities of Slave-Labor Expansion,” in L. Diane Barnes, Brian Schoen, and Frank Towers, eds., The Old South’s Modern Worlds: Slavery, Region, and Nation in the Age of Progress (New York: Oxford University Press, 2011), 132-134.
  13. See Joseph Holt Ingraham, The Southwest, By a Yankee (New York, 1835), II: 91, quoted in Woodman, King Cotton and His Retainers, 135. A similar quote, recorded in 1854 and attributed to Edward Russell, appears in Johnson, River of Dark Dreams, 12.
  14. James Stirling, Letters from the Slaves States (London: John W. Parker and Son, 1857), 179-180.
  15. Thomas Jefferson, Notes on the State of Virginia (Boston, MA: Lilly and Wait, 1832), 143-144.
  16. See “Excessive Slave Population: The Remedy,” De Bow’s Review, Vol. 12, No. 2 (Feb. 1852): 184-185, also quoted in Johnson, River of Dark Dreams, 13.
  17. See Anonymous, “Cotton and Its Prospects,” American Cotton Planter, Vol. 1, No. 8 (August 1853): 226, also quoted in Johnson, River of Dark Dreams, 246.
  18. See Thomas Prentice Kettel, Southern Wealth and Northern Profits, as Exhibited in Statistical Facts and Official Figures (New York: George W. and John A. Wood, 1860), 23.
  19. Johnson, River of Dark Dreams, 247, 244.
  20. On the populations of Southern cities, see Richard C. Wade, Slavery in the Cities: The South, 1820-1860 (New York: Oxford University Press, 1964), 325-327. The top three Southern cities, in terms of population in 1820, were Baltimore (62,738), New Orleans (27,176), and Charleston (24,780).
  21. See Wade, Slavery in the Cities, 326.
  22. For American import-export statistics, see Spencer C. Tucker, ed., The Encyclopedia of the Wars of the Early American Republic, 1783-1812 (Santa Barbara, CA: ABC-Clio, 2014), 670-671; and, among others, J. Bradford De Long, “Trade Policy and America’s Standard of Living: A Historical Perspective,” in Susan M. Collins, ed., Exports, Imports, and the American Worker (Washington, D.C.: The Brookings Institution, 1998), 354-357.
  23. See Johnson, River of Dark Dreams, 6, 73-88; Paskoff, Troubled Waters, 13-19; and Gudmestad, Steamboats and the Rise of the Cotton Kingdom, chapter 1, 174-180.
  24. See Scott P. Marler, The Merchants’ Capital: New Orleans and the Political Economy of the Nineteenth-Century South (New York: Cambridge University Press, 2013), part I; and Wade, Slavery in the Cities, 326-327.
  25. On the fashion of the Southern middle class, see Wegmann, “Skin Color and Social Practice,” chapter 4; Wells, The Origins of the Southern Middle Class, 74-80; and John G. Deal, “Middle-Class Benevolent Societies in Antebellum Norfolk, Virginia,” in Jonathan Daniel Wells and Jennifer R. Green, eds., The Southern Middle Class in the Long Nineteenth Century (Baton Rouge: Louisiana State University Press, 2011), 92-95.
  26. The enslaved population of the South in 1860 was 3,950,511 of a total Southern population of 8,289,782. For statistics on slavery, see Bourne, “Slavery in the United States,” at https://eh.net/encyclopedia/slavery-in-the-united-states/.
  27. See Stevenson, Life in Black and White, chapter 8, especially 231-238; and Emily West, Chains of Love: Slave Couples in Antebellum South Carolina (Urbana: University of Illinois Press, 2004), particularly 21-33.
  28. See Stephen Crawford, “The Slave Family: A View from the Slave Narratives,” in Claudia Goldin and Hugh Rockoff, eds., Strategic Factors in Nineteenth Century American Economic History: A Volume to Honor Robert W. Fogel (Chicago: The University of Chicago Press, 1992), 331-350.
  29. For a fascinating, visual treatment of “downriver” slave sales, see Maurie D. McInnis, Slaves Waiting for Sale: Abolitionist Art and the American Slave Trade (Chicago: The University of Chicago Press, 2011), chapter 3. More generally, see Johnson, River of Dark Dreams, 144-147; and Kolchin, American Slavery, 95-98.
  30. Harriet Jacobs, Incidents in the Life of a Slave Girl (Boston, 1861), 85.
  31. Kevin Bales and Jody Sarich, “The Paradox of Women, Children, and Slavery,” in Benjamin N. Lawrence and Richard L. Roberts, eds., Trafficking in Slavery’s Wake: Law and the Experience of Women and Children in Africa (Athens: Ohio University Press, 2012), 241-243; Sommerville, Rape and Race, 44-48; and Jacqueline Jones, Labor of Love, Labor of Sorrow: Black Women, Work, and the Family, from Slavery to the Present (New York: Basic Books, 2010), 35-38.
  32. See Clarence Walker, Mongrel Nation: The America Begotten by Thomas Jefferson and Sally Hemings (Charlottesville: University of Virginia Press, 2009), 30-46; and, among others, Hannah Rosen, Terror in the Heart of Freedom: Citizenship, Sexual Violence, and the Meaning of Race in the Postemancipation South (Chapel Hill: The University of North Carolina Press, 2009), 9-11, 75-82.
  33. See Melton A. McLaurin, Celia, a Slave: A True Story of Violence and Retribution in Antebellum Missouri (Athens: The University of Georgia Press, 1991), chapters 2, 5, and 6.
  34. On divorce, see Carol Lasser and Stacey Robertson, Antebellum Women: Private, Public, Partisan (Lanham, MD: Rowman & Littlefield, 2010), 5-8; Isenberg, Sex and Citizenship, 200-204; and David Silkenat, Moments of Despair: Suicide, Divorce, and Debt in Civil War Era North Carolina (Chapel Hill: The University of North Carolina Press, 2011), chapter 4, particularly 77-88.
  35. Samuel S. Hill, Southern Churches in Crisis Revisited (Tuscaloosa: University of Alabama Press, 1999), 33.
  36. William Wells Brown, Narrative of William W. Brown, a Fugitive Slave. Written by Himself (Reading, Massachusetts: Addison Wesley, 1969), 56.
  37. Nat Turner, Confessions of Nat Turner… (Baltimore: 1831), 9-11.

F17 – 12 Manifest Destiny

Emanuel Gottlieb Leutze, Westward the Course of Empire Takes Its Way, 1862. Mural, United States Capitol.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

John Louis O’Sullivan, a popular editor and columnist, articulated the long-standing American belief in the God-given mission of the United States to lead the world in the peaceful transition to democracy. In a little-read essay printed in The United States Magazine and Democratic Review, O’Sullivan outlined the importance of annexing Texas to the United States:

Why, were other reasoning wanting, in favor of now elevating this question of the reception of Texas into the Union, out of the lower region of our past party dissensions, up to its proper level of a high and broad nationality, it surely is to be found, found abundantly, in the manner in which other nations have undertaken to intrude themselves into it, between us and the proper parties to the case, in a spirit of hostile interference against us, for the avowed object of thwarting our policy and hampering our power, limiting our greatness and checking the fulfillment of our manifest destiny to overspread the continent allotted by Providence for the free development of our yearly multiplying millions. John Louis O’Sullivan (John O’Sullivan, “Annexation,” United States Magazine and Democratic Review 17, no. 1 (July-August 1845), 5-10.)

O’Sullivan and many others viewed expansion as necessary to achieve America’s destiny and to protect American interests. The quasi-religious call to spread democracy was coupled with the reality of thousands of settlers pressing westward. Manifest destiny was grounded in the belief that a democratic, agrarian republic would save the world.

John O’Sullivan, shown here in an 1874 Harper’s Weekly sketch, coined the phrase “manifest destiny” in an 1845 newspaper article. Interestingly, he did not advocate using force to expand westward, arguing vehemently in those and later years against war in America and abroad. Wikimedia.

Although the phrase was coined only in 1845, manifest destiny was a widely held but vaguely defined belief that dated back to the founding of the nation. First, many Americans believed that the strength of American values and institutions justified moral claims to hemispheric leadership. Second, the lands on the North American continent west of the Mississippi River (and later into the Caribbean) were destined for American-led political and agricultural improvement. Third, God and the Constitution ordained an irrepressible destiny to accomplish redemption and democratization throughout the world. All three of these claims pushed many Americans, whether they uttered the words “manifest destiny” or not, to actively seek the expansion of democracy. These beliefs and the resulting actions were often disastrous to anyone in the way of American expansion. The new religion of American democracy spread on the feet and in the wagons of those who moved west, imbued with the hope that their success would be the nation’s success.

The Young America movement, strongest among members of the Democratic Party but spanning the political spectrum, downplayed divisions over slavery and ethnicity by embracing national unity and emphasizing American exceptionalism, territorial expansion, democratic participation, and economic interdependence.1 Poet Ralph Waldo Emerson captured the political outlook of this new generation in a speech he delivered in 1844 entitled “The Young American”:

In every age of the world, there has been a leading nation, one of a more generous sentiment, whose eminent citizens were willing to stand for the interests of general justice and humanity, at the risk of being called, by the men of the moment, chimerical and fantastic. Which should be that nation but these States? Which should lead that movement, if not New England? Who should lead the leaders, but the Young American? Ralph Waldo Emerson2

However, many Americans, including Emerson, disapproved of aggressive expansion. For opponents of manifest destiny, the lofty rhetoric of the Young Americans was nothing other than a kind of imperialism that the American Revolution was supposed to have repudiated.3 Many members of the Whig Party (and later the Republican Party) argued that the United States’ mission was to lead by example, not by conquest. Abraham Lincoln summed up this criticism with a fair amount of sarcasm during a speech in 1859:

He (the Young American) owns a large part of the world, by right of possessing it; and all the rest by right of wanting it, and intending to have it…Young America had “a pleasing hope — a fond desire — a longing after” territory. He has a great passion — a perfect rage — for the “new”; particularly new men for office, and the new earth mentioned in the revelations, in which, being no more sea, there must be about three times as much land as in the present. He is a great friend of humanity; and his desire for land is not selfish, but merely an impulse to extend the area of freedom. He is very anxious to fight for the liberation of enslaved nations and colonies, provided, always, they have land…As to those who have no land, and would be glad of help from any quarter, he considers they can afford to wait a few hundred years longer. In knowledge he is particularly rich. He knows all that can possibly be known; inclines to believe in spiritual rappings, and is the unquestioned inventor of “Manifest Destiny.” Abraham Lincoln4

But Lincoln and other anti-expansionists would struggle to win popular opinion. The nation, fueled by the principles of manifest destiny, would continue westward. Along the way, Americans battled both native peoples and foreign nations, claiming territory to the very edges of the continent. But westward expansion did not come without a cost. It exacerbated the slavery question, pushed Americans toward civil war, and, ultimately, threatened the very mission of American democracy it was designed to aid.

Although the original painting was seen by only a small number of Americans, engravings of it were widely distributed, reinforcing the nationalistic ideals of manifest destiny. Columbia, the central female figure representing America, leads Americans into the West and thus into the future, carrying the values of republicanism (seen in her Roman garb) and progress (shown through technological innovations like the telegraph) while clearing the West of native peoples and animals, who are pushed into the darkness. Engraving after John Gast, American Progress, 1872. Wikimedia.

 

II. Antebellum Western Migration and Indian Removal

After the War of 1812, Americans settled the Great Lakes region rapidly thanks in part to aggressive land sales by the federal government.5 Missouri’s admission as a slave state presented the first major crisis over westward migration and American expansion in the antebellum period. Farther north, lead and iron ore mining spurred development in Wisconsin.6 By the 1830s and 1840s, increasing numbers of German and Scandinavian immigrants joined easterners in settling the Upper Mississippi watershed.7 Little settlement occurred west of Missouri as migrants viewed the Great Plains as a barrier to farming. Further west, the Rocky Mountains loomed as undesirable to all but fur traders, and all American Indians west of the Mississippi appeared too powerful to allow for white expansion.

“Do not lounge in the cities!” commanded publisher Horace Greeley in 1841, “There is room and health in the country, away from the crowds of idlers and imbeciles. Go west, before you are fitted for no life but that of the factory.”8 The New York Tribune often argued that American exceptionalism required the United States to benevolently conquer the continent as the prime means of spreading American capitalism and American democracy. However, the vast west was not empty. American Indians controlled much of the land east of the Mississippi River and almost all the West. Expansion hinged on a federal policy of Indian removal.

The harassment and dispossession of American Indians – whether driven by official U.S. government policy or the actions of individual Americans and their communities – depended on the belief in manifest destiny. Of course, a fair bit of racism was part of the equation as well. The political and legal processes of expansion always hinged on the belief that white Americans could best use new lands and opportunities. This conviction rested on the assumption that only Americans embodied the democratic ideals of yeoman agriculturalism extolled by Thomas Jefferson and expanded under Jacksonian democracy.

Florida was an early test case for the Americanization of new lands. The territory held strategic value for the young nation’s growing economic and military interests in the Caribbean. The most important factors that led to the annexation of Florida included anxieties over runaway slaves, Spanish neglect of the region, and the desire to defeat the Native American tribes who controlled large portions of lucrative farm territory.

During the early 19th century, Spain wanted to increase productivity in Florida and encouraged the migration of mostly Southern slave owners. By the second decade of the 1800s, Anglo settlers occupied plantations along the St. Johns River, from the border with Georgia to Lake George 100 miles upstream. Spain began to lose control as the area quickly became a haven for slave smugglers bringing illicit human cargo into the U.S. for lucrative sale to Georgia planters. Plantation owners grew apprehensive about the growing numbers of slaves running to the swamps and Indian-controlled areas of Florida. American slave owners pressured the U.S. government to confront the Spanish authorities. Southern slave owners refused to quietly accept the continued presence of armed black men in Florida. During the War of 1812, a ragtag assortment of Georgia slave owners joined by a plethora of armed opportunists raided Spanish and British-owned plantations along the St. Johns River. These private citizens received U.S. government help on July 27, 1816, when U.S. Army regulars attacked the Negro Fort (established as an armed outpost during the war by the British and located about 60 miles south of the Georgia border). The attack killed 270 of the fort’s inhabitants when a direct hit ignited the fort’s gunpowder stores. This conflict set the stage for General Andrew Jackson’s invasion of Florida in 1817 and the beginning of the First Seminole War.9

Americans also held that Creek and Seminole Indians, occupying the area from the Apalachicola River to the wet prairies and hammock islands of central Florida, were dangers in their own right. These tribes, known to the Americans collectively as “Seminoles,” migrated into the region over the course of the 18th century and established settlements, tilled fields, and tended herds of cattle in the rich floodplains and grasslands that dominated the northern third of the Florida peninsula. Envious eyes looked upon these lands. After bitter conflict that often pitted Americans against a collection of Native Americans and former slaves, Spain eventually agreed to transfer the territory to the U.S. The resulting Adams-Onís Treaty exchanged Florida for $5 million and other territorial concessions elsewhere. (Francis Newton Thorpe, ed., The Federal and State Constitutions, Colonial Charters, and Other Organic Laws of the States, Territories, and Colonies Now or Heretofore Forming the United States of America, Compiled and Edited Under the Act of Congress of June 30, 1906 (Washington, DC: Government Printing Office, 1909).)

After the purchase, planters from the Carolinas, Georgia, and Virginia entered Florida. However, the influx of settlers into the Florida territory was temporarily halted in the mid-1830s by the outbreak of the Second Seminole War (1835-1842). Free black men and women and escaped slaves also occupied the Seminole district, a situation that deeply troubled slave owners. Indeed, General Thomas Sidney Jesup, U.S. commander during the early stages of the Second Seminole War, labeled that conflict “a negro, not an Indian War,” fearful as he was that if the revolt “was not speedily put down, the South will feel the effect of it on their slave population before the end of the next season.”10 Florida became a state in 1845 and settlement expanded into the former Indian lands.

American action in Florida seized Indians’ eastern lands, reduced the lands available to runaway slaves, and killed or removed Indian peoples farther west. This became the template for future action. Presidents since at least Thomas Jefferson had long discussed removal, but President Andrew Jackson took the most dramatic action. Jackson believed, “It [speedy removal] will place a dense and civilized population in large tracts of country now occupied by a few savage hunters.”11 Desires to remove American Indians from valuable farmland motivated state and federal governments to cease trying to assimilate Indians and instead plan for forced removal.

Congress passed the Indian Removal Act in 1830, thereby granting the president authority to begin treaty negotiations that would give American Indians land in the West in exchange for their lands east of the Mississippi. Many advocates of removal, including President Jackson, paternalistically claimed that it would protect Indian communities from outside influences that jeopardized their chances of becoming “civilized” farmers. Jackson emphasized this paternalism—the belief that the government was acting in the best interest of Native peoples—in his 1830 State of the Union Address. “It [removal] will separate the Indians from immediate contact with settlements of whites…and perhaps cause them gradually, under the protection of the Government and through the influence of good counsels, to cast off their savage habits and become an interesting, civilized, and Christian community.”12

The experience of the Cherokee was particularly brutal. Despite many tribal members adopting some Euro-American ways, including intensified agriculture, slave ownership, and Christianity, state and federal governments pressured the Choctaw, Chickasaw, Creek, and Cherokee nations to sign treaties and surrender land. Many of these tribal nations used the law in hopes of protecting their lands. Most notable among these efforts was the Cherokee Nation’s attempt to sue the state of Georgia.

Beginning in 1826, Georgia officials asked the federal government to negotiate with the Cherokee to secure lucrative lands. The Adams administration resisted the state’s request, but harassment from local settlers against the Cherokee forced the Adams and then Jackson administrations to begin serious negotiations with the Cherokee. Georgia grew impatient with the process of negotiation and abolished existing state agreements with the Cherokee that had guaranteed rights of movement and jurisdiction of tribal law. Andrew Jackson penned a letter soon after taking office that encouraged the Cherokee, among others, to voluntarily relocate to the West. The discovery of gold in Georgia in the fall of 1829 further inflamed the situation.

The Cherokee defended themselves against Georgia’s laws by citing treaties signed with the United States that guaranteed the Cherokee nation both their land and independence. The Cherokee appealed to the Supreme Court against Georgia to prevent dispossession. The Court, while sympathizing with the Cherokees’ plight, ruled that it lacked jurisdiction to hear the case (Cherokee Nation v. Georgia, 1831). In an associated case, Worcester v. Georgia (1832), the Supreme Court ruled that Georgia laws did not apply within Cherokee territory.13 Regardless of these rulings, the state government ignored the Supreme Court and did little to prevent conflict between settlers and the Cherokee.

Jackson wanted a solution that might preserve peace and his reputation. He sent Secretary of War Lewis Cass to offer title to western lands and the promise of tribal governance in exchange for the relinquishment of the Cherokee’s eastern lands. These negotiations opened a rift within the Cherokee nation. Cherokee leader John Ridge believed removal was inevitable and pushed for a treaty that would secure the best terms. Others, called nationalists and led by John Ross, refused to consider removal in negotiations. The Jackson administration refused any deal that fell short of large-scale removal of the Cherokee from Georgia, thereby fueling a devastating and violent intra-tribal battle between the two factions. Eventually tensions grew to the point that several treaty advocates were assassinated by members of the national faction.14

In 1835, a portion of the Cherokee Nation, led by John Ridge and hoping to prevent further tribal bloodshed, signed the Treaty of New Echota. The treaty ceded Cherokee lands in Georgia for five million dollars, a deal the signatories hoped would limit future conflicts between the Cherokee and white settlers. However, most of the tribe refused to adhere to the terms, viewing the treaty as illegitimately negotiated. In response, John Ross pointed out the U.S. government’s hypocrisy. “You asked us to throw off the hunter and warrior state: We did so—you asked us to form a republican government: We did so. Adopting your own as our model. You asked us to cultivate the earth, and learn the mechanic arts. We did so. You asked us to learn to read. We did so. You asked us to cast away our idols and worship your god. We did so. Now you demand we cede to you our lands. That we will not do.”15

In 1838, President Martin Van Buren decided to press the issue beyond negotiation and court rulings and used the New Echota Treaty provisions to order the army to forcibly remove those Cherokee not obeying the treaty’s cession of territory. Sixteen thousand Cherokee began the journey, but harsh weather, poor planning, and difficult travel resulted in an estimated 10,138 deaths on what became known as the Trail of Tears.16 Not every instance was as treacherous as the Cherokee example, and some tribes resisted removal. But over 60,000 Indians had been forced west by the opening of the Civil War.17

The allure of manifest destiny encouraged expansion regardless of terrain or locale, and Indian removal also took place, to a lesser degree, in northern lands. In the Old Northwest, Odawa and Ojibwe communities in Michigan, Wisconsin, and Minnesota resisted removal, as many lived on land north of desirable farming land. Moreover, some Ojibwe and Odawa individuals purchased land independently. To advocate against removal, they formed successful alliances with missionaries, as well as with traders and merchants who depended on trade with Native peoples. Yet Indian removal occurred in the North as well—the “Black Hawk War” in 1832, for instance, led to the removal of many Sauk to Kansas.18

Despite the disaster of removal, tribal nations slowly rebuilt their cultures and in some cases even achieved prosperity in Indian Territory. Tribal nations blended traditional cultural practices, including common land systems, with western practices, including constitutional governments, common school systems, and an elite slaveholding class.

Some Indian groups remained too powerful to remove. Beginning in the late eighteenth century, the Comanche rose to power in the Southern Plains region of what is now the southwestern United States. By quickly adapting to the horse culture first introduced by the Spanish, the Comanche transitioned from a foraging economy into a mixed hunting and pastoral society. After 1821, the new Mexican nation-state claimed the region as part of its northern frontier, but it exercised little control. Instead, the Comanche remained in power and controlled the economy of the Southern Plains. A flexible political structure allowed the Comanche to dominate other Indian groups as well as Mexican and American settlers.

In the 1830s, the Comanche launched raids into northern Mexico, ending what had been an unprofitable but peaceful diplomatic relationship with Mexico. At the same time, they forged new trading relationships with Anglo-American traders in Texas. Throughout this period, the Comanche and several other independent Native groups, particularly the Kiowa, Apache, and Navajo, engaged in thousands of violent encounters with Northern Mexicans. Collectively, these encounters comprised an ongoing war during the 1830s and 1840s as tribal nations vied for power and wealth. By the 1840s, Comanche power had peaked, with an empire that controlled a vast territory in the trans-Mississippi west known as Comancheria. By trading in Texas and raiding in Northern Mexico, the Comanche controlled the flow of commodities, including captives, livestock, and trade goods. They practiced a fluid system of captivity and captive trading, rather than a rigid chattel system. The Comanche used captives for economic exploitation but also adopted captives into kinship networks. This allowed for the assimilation of diverse peoples in the region into the empire. The ongoing conflict in the region had sweeping consequences for both Mexican and American politics. The U.S.-Mexican War, beginning in 1846, can be seen as a culmination of this violence.19

“Map of the Plains Indians,” undated. Smithsonian Institution.

In the Great Basin region, Mexican independence also escalated patterns of violence. This region, on the periphery of the Spanish empire, was nonetheless integrated into the vast commercial trading network of the West. Mexican officials and Anglo-American traders entered the region with their own imperial designs. New forms of violence spread into the homelands of the Paiute and Western Shoshone. Traders, settlers, and Mormon religious refugees, aided by U.S. officials and soldiers, committed daily acts of violence and laid the groundwork for violent conquest. This expansion of the American state into the Great Basin meant groups such as the Ute, Cheyenne, and Arapahoe had to compete over land, resources, captives, and trade relations with Anglo-Americans. Eventually, white incursion and the ongoing Indian Wars resulted in the traumatic dispossession of land and a struggle for subsistence.

The federal government attempted more than the relocation of American Indians. Policies to “civilize” Indians coexisted with forced removal and served an important “Americanizing” vision of expansion that brought an ever-increasing population under the American flag and sought to balance aggression with the uplift of paternal care. Thomas L. McKenney, superintendent of Indian trade from 1816 to 1822 and the Superintendent of Indian Affairs from 1824 to 1830, served as the main architect of the “civilization policy.” He asserted that American Indians were morally and intellectually equal to whites, and he sought to establish a national Indian school system.

Congress rejected McKenney’s plan but instead passed the Civilization Fund Act in 1819. This act offered a $10,000 annual annuity to be allocated to societies that funded missionaries to establish schools among Indian tribes. However, providing schooling for American Indians under the auspices of the civilization program also allowed the federal government to justify taking more land. Treaties, such as the 1820 Treaty of Doak’s Stand made with the Choctaw nation, often included land cessions as requirements for education provisions. Removal and Americanization reinforced Americans’ sense of cultural dominance.20

After removal in the 1830s, the Cherokee, Choctaw, and Chickasaw began to collaborate with missionaries to build school systems of their own. Leaders hoped education would help ensuing generations to protect political sovereignty. In 1841, the Cherokee Nation opened a public school system that within two years included eighteen schools. By 1852, the system expanded to twenty-one schools with a national enrollment of 1,100 pupils.21 Many of the students educated in these tribally controlled schools later served their nations as teachers, lawyers, physicians, bureaucrats, and politicians.

 

III. Life and Culture in the West

The dream of creating a democratic utopia in the West ultimately rested on those who picked up their possessions and their families and moved west. Western settlers usually migrated as families and settled along navigable and potable rivers. Settlements often coalesced around local traditions, especially religion, carried from eastern settlements. These shared understandings encouraged a strong sense of cooperation among western settlers that forged communities on the frontier.

Before the Mexican War, the West for most Americans still referred to the fertile area between the Appalachian Mountains and the Mississippi River with a slight amount of overspill beyond its banks. With soil exhaustion and land competition increasing in the East, most early western migrants sought a greater measure of stability and self-sufficiency by engaging in small scale farming. Boosters of these new agricultural areas along with the U.S. government encouraged perceptions of the West as a land of hard-built opportunity that promised personal and national bounty.

Women migrants bore the unique double burden of travel while also being expected to conform to restrictive gender norms. The key virtues of femininity, according to the “cult of true womanhood,” included piety, purity, domesticity, and submissiveness. The concept of “separate spheres” dictated that women remain in the home. These values accompanied men and women as they traveled west to begin their new lives.

While many of these societal standards endured, there often existed an openness in frontier society that resulted in modestly more opportunities for women. Husbands needed partners in setting up a homestead and working in the field to provide food for the family. Suitable wives were often in short supply, enabling some women to informally negotiate more power in their households.22

Americans debated the role of government in westward expansion. This debate centered on the proper role of the U.S. government in paying for the internal improvements that soon became necessary to encourage and support economic development. Some saw frontier development as a self-driven undertaking that necessitated private risk and investment devoid of government interference. Others saw the federal government’s role as providing the infrastructural development needed to give migrants the push toward engagement with the larger national economy. In the end, federal aid proved essential for the conquest and settlement of the region.

American artist George Catlin traveled west to paint Native Americans. In 1832 he painted Eeh-nís-kim, Crystal Stone, wife of a Blackfoot leader. Via Smithsonian American Art Museum.


Economic busts constantly threatened western farmers and communities. The economy worsened after the Panic of 1819. Falling prices and depleted soil meant farmers were unable to make their loan payments. The dream of subsistence and stability abruptly ended as many migrants lost their land and felt the hand of the distant market economy forcing them even farther west to escape debt. As a result, the federal government consistently sought to increase access to land in the West, including efforts to lower the amount of land required for purchase. Smaller lots made it easier for more farmers to clear land and begin farming faster.23

More than anything else, new roads and canals provided conduits for migration and settlement. Improvements in travel and exchange fueled economic growth in the 1820s and 1830s. Canal improvements expanded in the East, while road building prevailed in the West. Congress continued to allocate funds for internal improvements. Federal money pushed the National Road, begun in 1811, farther west every year. The demand for laborers to construct these improvements increased employment opportunities and encouraged non-farmers to move to the West. The wealth promised by engagement with the new economy was hard to reject. However, roads were expensive to build and maintain, and some Americans strongly opposed spending money on these improvements.

The use of steamboats grew quickly throughout the 1810s and into the 1820s. As water trade and travel grew in popularity, local, state, and federal funds helped connect rivers and streams. Hundreds of miles of new canals cut through the eastern landscape. The most notable of these early projects was the Erie Canal. That project, completed in 1825, linked the Great Lakes to New York City. The profitability of the canal helped New York outpace its east coast rivals to become the center for commercial import and export in the United States.24

Early railroads like the Baltimore and Ohio line hoped to link mid-Atlantic cities with lucrative western trade routes. Railroad boosters encouraged the rapid growth of towns and cities along their routes. Not only did rail lines promise to move commerce faster, but they also encouraged towns to spread farther from traditional waterway locations. Technological limitations, constant repairs, conflicts with American Indians, and political disagreements all hampered railroading and kept canals and steamboats as integral parts of the transportation system. Nonetheless, this early establishment of railroads enabled a rapid expansion after the Civil War.

Economic chains of interdependence stretched over hundreds of miles of land and through thousands of contracts and remittances. America’s manifest destiny became wedded not only to territorial expansion, but also to economic development.25 

 

IV. Texas, Mexico, and America

The debate over slavery became one of the prime forces behind the Texas Revolution and the resulting republic’s annexation to the United States. After gaining its independence from Spain in 1821, Mexico hoped to attract new settlers to its northern areas to create a buffer between it and the powerful Comanche. New immigrants, mostly from the southern United States, poured into Mexican Texas. Over the next twenty-five years, concerns over growing Anglo influence and possible American designs on the area produced great friction between Mexicans and the American-born settlers in the region. In 1829, Mexico, hoping to quell both anger and immigration, outlawed slavery and required all new immigrants to convert to Catholicism. American immigrants, eager to expand their agricultural fortunes, largely ignored these requirements. In response, Mexican authorities closed their territory to any new immigration in 1830 – a prohibition ignored by Americans who often squatted on public lands.26

In 1834, an internal conflict between federalists and centralists in the Mexican government led to the political ascendancy of General Antonio Lopez de Santa Anna. Santa Anna, governing as a dictator, repudiated the federalist Constitution of 1824, pursued a policy of authoritarian central control, and crushed several revolts throughout Mexico. Anglo settlers in Mexican Texas, or Texians as they called themselves, opposed Santa Anna’s centralizing policies and met in November 1835. They issued a statement of purpose that emphasized their commitment to the Constitution of 1824 and declared Texas to be a separate state within Mexico. After the Mexican government angrily rejected this arrangement, Texian leaders abandoned their fight for the Constitution of 1824 and declared independence on March 2, 1836.27 The Texas Revolution of 1835-1836 was thus a successful secessionist movement in the northern district of the Mexican state of Coahuila y Tejas, one that resulted in an independent Republic of Texas.

At the Alamo and Goliad, Santa Anna crushed smaller rebel forces and massacred hundreds of Texian prisoners. The Mexican army pursued the retreating Texian army deep into East Texas, spurring a mass panic and evacuation by American civilians known as the “Runaway Scrape.” The confident Santa Anna consistently failed to make adequate defensive preparations, an oversight that eventually led to a surprise attack from the outnumbered Texian army led by Sam Houston on April 21, 1836. The Battle of San Jacinto lasted only eighteen minutes and resulted in a decisive victory for the Texians, who retaliated for previous Mexican atrocities by killing fleeing and surrendering Mexican soldiers for hours after the initial assault. Santa Anna was captured in the aftermath and compelled to sign the Treaty of Velasco on May 14, 1836, by which he agreed to withdraw his army from Texas and acknowledged Texas independence. Although a new Mexican government never recognized the Republic of Texas, the United States and several other nations gave the new country diplomatic recognition.28

Texas annexation had remained a political landmine since the Republic declared independence from Mexico in 1836. American politicians feared that adding Texas to the Union would provoke a war with Mexico and re-ignite sectional tensions by throwing off the balance between free and slave states. However, after his expulsion from the Whig party, President John Tyler saw Texas statehood as the key to saving his political career. In 1842, he began work on opening annexation to national debate. Harnessing public enthusiasm over the issue, Democrat James K. Polk rose from virtual obscurity to win the presidential election of 1844. Polk and his party campaigned on promises of westward expansion, with eyes toward Texas, Oregon, and California. In the final days of his presidency, Tyler at last extended an official offer to Texas on March 3, 1845. The republic accepted on July 4, and Texas entered the Union as the twenty-eighth state that December.

Mexico denounced annexation as “an act of aggression, the most unjust which can be found recorded in the annals of modern history.”29 Beyond the anger produced by annexation, the two nations both laid claim to a narrow strip of land between two rivers. Mexico drew the southwestern border of Texas at the Nueces River, but Texans claimed that the border lay roughly 150 miles farther south at the Rio Grande. Neither claim was realistic, since the sparsely populated area, known as the Nueces Strip, was in fact controlled by Native Americans.

In November 1845, President Polk secretly dispatched John Slidell to Mexico City to purchase the Nueces Strip along with large sections of New Mexico and California. The mission was an empty gesture, designed largely to pacify those in Washington who insisted on diplomacy before war. Predictably, officials in Mexico City refused to receive Slidell. In preparation for the assumed failure of the negotiations, Polk preemptively sent a 4,000-man army under General Zachary Taylor to Corpus Christi, Texas, just northeast of the Nueces River. Upon word of Slidell’s rebuff in January 1846, Polk ordered Taylor to cross into the disputed territory. The president hoped that this show of force would push the lands of California onto the bargaining table as well. He badly misread the situation, however. After losing Texas, the Mexican public strongly opposed surrendering any more ground to the United States. Popular opinion left the shaky government in Mexico City without room to negotiate. On April 24, Mexican cavalrymen attacked a detachment of Taylor’s troops in the disputed territory just north of the Rio Grande, killing eleven U.S. soldiers.

It took two weeks for the news to reach Washington. Polk sent a message to Congress on May 11 that summed up the assumptions and intentions of the United States.

Instead of this, however, we have been exerting our best efforts to propitiate her good will. Upon the pretext that Texas, a nation as independent as herself, thought proper to unite its destinies with our own, she has affected to believe that we have severed her rightful territory, and in official proclamations and manifestoes has repeatedly threatened to make war upon us for the purpose of reconquering Texas. In the meantime we have tried every effort at reconciliation. The cup of forbearance had been exhausted even before the recent information from the frontier of the Del Norte. But now, after reiterated menaces, Mexico has passed the boundary of the United States, has invaded our territory and shed American blood upon the American soil. She has proclaimed that hostilities have commenced, and that the two nations are now at war. James Knox Polk ((James K. Polk, “President Polk’s Mexican War Message,” quoted in The Statesman’s Manual: The Addresses and Messages of the Presidents of the United States, Inaugural, Annual, and Special, from 1789 to 1846: With a Memoir of Each of the Presidents and a History of Their Administrations; Also the Constitution of the United States, and a Selection of Important Documents and Statistical Information, Volume 2 (New York: Edward Walker, 1847), 1489.))

The cagey Polk knew that since hostilities already existed, political dissent would be dangerous – a vote against war became a vote against supporting American soldiers under fire. Congress passed a declaration of war on May 13. Only a few members of both parties, notably John Quincy Adams and John C. Calhoun, opposed the measure. Upon declaring war in 1846, Congress issued a call for 50,000 volunteer soldiers. Spurred by promises of adventure and conquest abroad, thousands of eager men flocked to assembly points across the country. However, opposition to “Mr. Polk’s War” soon grew.

In the early fall of 1846, the U.S. Army invaded Mexico on multiple fronts and within a year’s time General Winfield Scott’s men took control of Mexico City. However, the city’s fall did not bring an end to the war. Scott’s men occupied Mexico’s capital for over four months while the two countries negotiated. In the United States, the war had been controversial from the beginning. Embedded journalists sent back detailed reports from the front lines, and a divided press viciously debated the news. Volunteers found that war was not as they expected. Disease killed seven times as many American soldiers as combat.30 Harsh discipline, conflict within the ranks, and violent clashes with civilians led soldiers to desert in huge numbers. Peace finally came on February 2, 1848, with the signing of the Treaty of Guadalupe Hidalgo.

Entrance into Mexico City

“General Scott’s entrance into Mexico.” Lithograph, 1851. Originally published in George Wilkins Kendall and Carl Nebel, The War between the United States and Mexico Illustrated, Embracing Pictorial Drawings of all the Principal Conflicts (New York: D. Appleton, 1851). Wikimedia Commons.

The new American Southwest attracted a diverse group of entrepreneurs and settlers to the commercial towns of New Mexico, the fertile lands of eastern Texas, and the famed gold deposits of California and the Rocky Mountains. This postwar migration built on earlier paths dating back to the 1820s, when the lucrative Santa Fe trade enticed merchants to New Mexico and generous land grants brought numerous settlers to Texas. The Gadsden Purchase of 1854 further added to American gains north of Mexico.

The U.S.-Mexican War had an enormous impact on both countries. The American victory helped set the United States on the path to becoming a world power. It elevated Zachary Taylor to the presidency and served as a training ground for many of the Civil War’s future commanders. Most significantly, however, Mexico lost roughly half of its territory. Yet, the United States’ victory was not without danger. Ralph Waldo Emerson, an outspoken critic, predicted ominously at the beginning of the conflict, “We will conquer Mexico, but it will be as the man who swallows the arsenic which will bring him down in turn. Mexico will poison us.”31 Indeed, the conflict over whether or not to extend slavery into the newly won territory pushed the nation ever closer to disunion and civil war.

 

V. Manifest Destiny and the Gold Rush

California, which had belonged to Mexico prior to the war, was at least three arduous months’ travel from the nearest American settlements. There was some sparse settlement in the Sacramento valley, and missionaries made the trip occasionally. The fertile farmland of Oregon, like the black dirt lands of the Mississippi valley, attracted more settlers than California. Dramatized stories of Indian attacks filled migrants with a sense of foreboding, although the majority of settlers encountered no violence and often no Indians at all. The slow progress, disease, starvation among migrants and their oxen, poor trails, inadequate preparation, lack of guidebooks, threatening wildlife, vagaries of weather, and general confusion were all more formidable and frequent than Indian attacks. Despite the harshness of the journey, by 1848 there were approximately 20,000 Americans living west of the Rockies, with about three-fourths of that number in Oregon.

The great environmental and economic potential of the Oregon Territory led many to pack up their families and head west along the Oregon Trail. The Trail embodied the hopes of many for a better life, hopes represented and reinforced by images like Bierstadt’s idealistic Oregon Trail. In reality, the Trail was violent and dangerous, and many who attempted to cross never made it to the “Promised Land” of Oregon. Albert Bierstadt, Oregon Trail (Campfire), 1863. Wikimedia, http://commons.wikimedia.org/wiki/File:Bierstadt_Albert_Oregon_Trail.jpg.


Many who moved west nurtured a romantic vision of frontier life, and the region attracted Americans who sought more than agricultural routine and familial responsibility. The rugged individualism and military prowess of the West, encapsulated for some by service in the Mexican War, drew a new breed of migrants west of the Sierra Nevada to join the Californians already there, a breed different from the modest agricultural communities of the near West.

If the great draw of the West served as manifest destiny’s kindling, then the discovery of gold in California was the spark that set the fire ablaze. The vast majority of western settlers sought land ownership, but the lure of getting rich quick drew younger single men (and some women) to gold towns throughout the West. These adventurers and fortune seekers then served as magnets for the arrival of others providing services associated with the gold rush. Towns and cities grew rapidly throughout the West, notably San Francisco, whose population grew from about 500 in 1848 to almost 50,000 by 1853. Lawlessness, the predictable failure of most fortune seekers, racial conflict, and the slavery question all threatened manifest destiny’s promises.

On January 24, 1848, James W. Marshall, a contractor hired by John Sutter, discovered gold on Sutter’s sawmill land in the Sacramento valley area of the California Territory. Throughout the 1850s, Californians beseeched Congress for a transcontinental railroad to carry both passengers and goods from the Midwest and the East Coast. The potential economic benefits for communities along proposed routes made the debate over the railroad’s path rancorous. Growing dissent over the slavery issue heightened tensions further.

The great influx of diverse peoples clashed in a combative and aggrandizing atmosphere of individualistic pursuit of fortune. Linguistic, cultural, economic, and racial conflict roiled both urban and rural areas. By the end of the 1850s, Chinese and Mexican immigrants made up one fifth of the mining population in California. The ethnic patchwork of these frontier towns masked a clearly defined socioeconomic arrangement that saw whites on top as landowners and managers, with poor whites and ethnic minorities working the mines and assorted jobs. The competition for land, resources, and riches furthered individual and collective abuses, particularly against Indians and older Mexican communities. California’s towns, as well as those dotting the landscape throughout the West, such as Coeur d’Alene in Idaho and Tombstone in Arizona, struggled to balance security with economic development and the protection of civil rights and liberties.

This cartoon depicts a highly racialized image of a Chinese immigrant and an Irish immigrant “swallowing” the United States, in the form of Uncle Sam. Networks of railroads and the promise of American expansion can be seen in the background. “The great fear of the period That Uncle Sam may be swallowed by foreigners : The problem solved,” 1860-1869, Library of Congress.

 

VI. The Monroe Doctrine and Manifest Destiny

The expansion of influence and territory off the continent became an important corollary to westward expansion. The U.S. government sought to keep European countries out of the Western Hemisphere and applied the principles of manifest destiny to the rest of the Americas. As Secretary of State for President James Monroe, John Quincy Adams was responsible for resolving ongoing border disputes among the United States, England, Spain, and Russia. Adams’ view of American foreign policy found its clearest expression in the Monroe Doctrine, which he had great influence in crafting.

Increasingly aggressive incursions from Russians in the Northwest, ongoing border disputes with the British in Canada, the remote possibility of Spanish reconquest of South America, and British abolitionism in the Caribbean all triggered an American response. In a speech before the U.S. House of Representatives on July 4, 1821, Secretary of State Adams acknowledged the American need for a robust foreign policy that simultaneously protected and encouraged the nation’s growing and increasingly dynamic economy.

America…in the lapse of nearly half a century, without a single exception, respected the independence of other nations while asserting and maintaining her own…She is the well-wisher to the freedom and independence of all…She well knows that by once enlisting under other banners than her own, were they even the banners of foreign independence, she would involve herself beyond the power of extrication, in all the wars of interest and intrigue, of individual avarice, envy, and ambition, which assume the colors and usurp the standard of freedom. The fundamental maxims of her policy would insensibly change from liberty to force. The frontlet on her brows would no longer beam with the ineffable splendor of freedom and independence; but in its stead would soon be substituted an imperial diadem, flashing in false and tarnished lustre the murky radiance of dominion and power. She might become the dictatress of the world; she would be no longer the ruler of her own spirit. . . . Her glory is not dominion, but liberty. Her march is the march of the mind. She has a spear and a shield: but the motto upon her shield is, Freedom, Independence, Peace. This has been her Declaration: this has been, as far as her necessary intercourse with the rest of mankind would permit, her practice. John Quincy Adams ((John Quincy Adams, “Mr. Adams Oration, July 21, 1821,” quoted in Niles’ Weekly Register, Volume 20, (Baltimore: H. Niles, 1821), 332.))

Adams’ great fear was not territorial loss. He had no doubt that Russian and British interests in North America could be arrested. Adams held no reason to antagonize the Russians with grand pronouncements, nor was he generally called upon to do so. He enjoyed a good relationship with the Russian ambassador and stewarded through Congress most-favored trade status for the Russians in 1824. Rather, Adams worried gravely about the ability of the United States to compete commercially with the British in Latin America and the Caribbean. This worry deepened with the valid concern that America’s chief Latin American trading partner, Cuba, dangled perilously close to outstretched British claws. Cabinet debates surrounding establishment of the Monroe Doctrine and geopolitical events in the Caribbean focused attention on that part of the world as key to the future defense of U.S. military and commercial interests, with the British posing the main threat to those interests. Expansion of economic opportunity and protection from foreign pressures became the overriding goals of U.S. foreign policy.32 But despite the philosophical confidence present in the Monroe administration’s decree, the reality of limited military power meant the Monroe Doctrine remained an aspirational assertion.

Bitter disagreements over the expansion of slavery into the new lands won from Mexico began even before the war ended. Many Northern businessmen and Southern slave owners supported the idea of expanding slavery into the Caribbean as a useful alternative to continental expansion, since slavery already existed in these areas. Some were critical of these attempts, seeing them as evidence of a growing slave-power conspiracy. Many others supported attempts at expansion, like those previously seen in East Florida, even if these attempts were not exactly legal. Filibustering, as it was called, involved privately financed schemes directed at capturing and occupying foreign territory without the approval of the U.S. government.

Filibustering took greatest hold in the imagination of Americans as they looked toward Cuba. Fears of racialized revolution in Cuba (as in Haiti and Florida before it), as well as the presence of an aggressive British abolitionist influence in the Caribbean, energized the movement to annex Cuba and encouraged filibustering as an expedient alternative to lethargic official negotiations. Despite filibustering’s seemingly chaotic planning and destabilizing repercussions, those intellectually and economically guiding the effort imagined a willing and receptive Cuban population and expected an agreeable American business class. In Cuba, manifest destiny for the first time sought territory off the continent and hoped to put a new spin on the story of success in Mexico. Yet the annexation of Cuba, despite great popularity and some military attempts led by Narciso Lopez, a Cuban dissident, never succeeded.33

Other filibustering expeditions were launched elsewhere, including two by William Walker, a former American soldier. Walker seized portions of the Baja peninsula in Mexico and later took power in Nicaragua, where he reinstituted slavery. Eventually Walker was executed in Honduras.34 These missions violated the laws of the United States, but wealthy Americans financed various filibusters, and less wealthy adventurers were all too happy to sign up. Filibustering enjoyed its brief popularity into the late 1850s, at which point slavery and concerns over secession came to the fore. By the opening of the Civil War, most saw these attempts as simply territorial theft.

 

VII. Conclusion

Debates over expansion, economics, diplomacy, and manifest destiny exposed some of the weaknesses of the American system. The chauvinism of policies like Native American removal, the Mexican War, and filibustering existed alongside growing anxiety. Manifest destiny attempted to make a virtue of America’s lack of history and turn it into the very basis of nationhood. To locate such origins, John O’Sullivan and other champions of manifest destiny grafted biological and territorial imperatives – common among European definitions of nationalism – onto American political culture. The United States was the embodiment of the democratic ideal, they said. Democracy had to be timeless, boundless, and portable. New methods of transportation and communication, the rapidity of the railroad and the telegraph, the rise of the international market economy, and the growth of the American frontier provided shared platforms to help Americans think across local identities and reaffirm a national character.

 

VIII. Reference Material

This chapter was edited by Gregg Lightfoot, with content contributions by Ethan Bennett, Michelle Cassidy, Jonathan Grandage, Gregg Lightfoot, Jose Juan Perez Melendez, Jessica Moore, Nick Roland, Matthew K. Saionz, Rowan Steinecker, Patrick Troester, and Ben Wright.

Recommended citation: Ethan Bennett et al., “Manifest Destiny,” Gregg Lightfoot, ed., in The American Yawp, Joseph Locke and Ben Wright, eds., last modified August 1, 2016, http://www.AmericanYawp.com.

 

Recommended Reading

  • Blackhawk, Ned. Violence over the Land: Indians and Empires in the Early American West. Cambridge: Harvard University Press, 2008.
  • Brooks, James F. Captives and Cousins: Slavery, Kinship, and Community in the Southwest Borderlands. Chapel Hill: UNC Press, 2003.
  • Cusick, James G. The Other War of 1812: The Patriot War and the American Invasion of Spanish East Florida. Athens: University of Georgia Press, 2007.
  • DeLay, Brian. War of a Thousand Deserts: Indian Raids and the U.S.-Mexican War. New Haven: Yale University Press, 2009.
  • Exley, Jo Ella Powell. Frontier Blood: The Saga of the Parker Family. College Station: Texas A&M University Press, 2005.
  • Gómez, Laura E. Manifest Destinies: The Making of the Mexican American Race. New York: New York University Press, 2008.
  • Gordon, Sarah Barringer. The Mormon Question: Polygamy and Constitutional Conflict in Nineteenth-Century America. Chapel Hill: UNC Press, 2001.
  • Greenberg, Amy S. Manifest Manhood and the Antebellum American Empire. Cambridge: Cambridge University Press, 2005.
  • Haas, Lisbeth. Conquest and Historical Identities in California, 1769-1936. Berkeley: University of California Press, 1995. 
  • Hämäläinen, Pekka. The Comanche Empire. New Haven: Yale University Press, 2009.
  • Holmes, Kenneth L. Covered Wagon Women: Diaries & Letters from the Western Trails, 1840-1849. Lincoln: University of Nebraska Press, 1995.
  • Horsman, Reginald. Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism. Cambridge: Harvard University Press, 2009.
  • Hyde, Anne F. Empires, Nations, and Families: A History of the North American West, 1800-1860. Lincoln: University of Nebraska Press, 2011.
  • Johnson, Susan Lee. Roaring Camp: The Social World of the California Gold Rush. New York: W. W. Norton, 2000. 
  • Larson, John Lauritz. Internal Improvement: National Public Works and the Promise of Popular Government in the Early United States. Chapel Hill: UNC Press, 2001.
  • Lazo, Rodrigo. Writing to Cuba: Filibustering and Cuban Exiles in the United States. Chapel Hill: UNC Press, 2006.
  • May, Robert E. Manifest Destiny’s Underworld: Filibustering in Antebellum America. Chapel Hill: UNC Press, 2002.
  • Merry, Robert W. A Country of Vast Designs: James K. Polk, the Mexican War and the Conquest of the American Continent. New York: Simon and Schuster, 2009.
  • Namias, June. White Captives: Gender and Ethnicity on the American Frontier. Chapel Hill: University of North Carolina Press, 2005. 
  • Perdue, Theda. “Mixed Blood” Indians: Racial Construction in the Early South. Athens: University of Georgia Press, 2005. 
  • Peters, Virginia Pergman. Women of the Earth Lodges: Tribal Life on the Plains. Norman: University of Oklahoma Press, 2000. 
  • Peterson, Dawn. Indians in the Family: Adoption and the Politics of Antebellum Expansion. Cambridge: Harvard University Press, 2017. 
  • Richter, Daniel K. Facing East from Indian Country: A Native History of Early America. Cambridge: Harvard University Press, 2009.
  • Wilkins, David E. Hollow Justice: A History of Indigenous Claims in the United States. New Haven: Yale University Press, 2013.
  • Yarbrough, Faye. Race and the Cherokee Nation: Sovereignty in the Nineteenth Century. Philadelphia: University of Pennsylvania Press, 2008.

 

Notes

  1. Yonatan Eyal, The Young America Movement and the Transformation of the Democratic Party, 1828-1861 (New York: Cambridge University Press, 2007).
  2. Ralph Waldo Emerson, “The Young American: A Lecture read before the Mercantile Library Association, Boston, February 7, 1844,” accessed May 18, 2015, http://www.emersoncentral.com/youngam.htm.
  3. See Peter S. Onuf, “Imperialism and Nationalism in the Early American Republic,” in Empire’s Twin: U.S. Anti-imperialism from the Founding Era to the Age of Terrorism, Ian Tyrell and Jay Sexton, eds. (Ithaca: Cornell University Press, 2015), 21-40.
  4. Abraham Lincoln, “Lecture on Discoveries and Inventions: First Delivered April 6, 1858,” accessed May 18, 2015, http://www.abrahamlincolnonline.org/lincoln/speeches/discoveries.htm.
  5. Edmund Jefferson Danziger, Great Lakes Indian Accommodation and Resistance During the Early Reservation (Ann Arbor: University of Michigan Press, 2009), 11-13.
  6. Malcolm J. Rohrbough, Trans-Appalachian Frontier, Third Edition: People, Societies, and Institutions, 1775-1850 (Bloomington: Indiana University Press, 2008), 474-479.
  7. Mark Wyman, Immigrants in the Valley: Irish, Germans, and Americans in the Upper Mississippi Country, 1830-1860 (Carbondale: Southern Illinois University Press, 2016), 128, 148-149.
  8. Horace Greeley, New York Tribune, 1841. Although the phrase “Go West, Young Man” is often attributed to Greeley, the exhortation was most likely only popularized by the newspaper editor in numerous speeches, letters, and editorials, and always in the larger context of the comparable and superior health, wealth, and advantages to be had in the West.
  9. Robert V. Remini, Andrew Jackson: The Course of American Empire, 1767-1821 (Baltimore: Johns Hopkins University Press, 1977/1998), 344-355.
  10. Thomas Sidney Jesup quoted in Kenneth Wiggins Porter, “Negroes and the Seminole War, 1835-1842,” The Journal of Southern History Vol. 30, No. 4 (November 1964), 427-450, quote on 427.
  11. “President Andrew Jackson’s Message to Congress ‘On Indian Removal’ (1830),” accessed May 26, 2015, http://www.ourdocuments.gov/doc.php?flash=true&doc=25&page=transcript.
  12. Ibid.
  13. Tim A. Garrison, “Worcester v. Georgia (1832),” New Georgia Encyclopedia, available online at http://www.georgiaencyclopedia.org/articles/government-politics/worcester-v-georgia-1832.
  14. Fay A. Yarbrough, Race and the Cherokee Nation: Sovereignty in the Nineteenth Century (Philadelphia: University of Pennsylvania Press, 2008), 15-21.
  15. John Ross quoted in Brian Hicks, Toward the Setting Sun: John Ross, the Cherokees, and the Trail of Tears (New York: Atlantic Monthly Press, 2011), 210.
  16. Russell Thornton, The Cherokees: A Population History (Lincoln: University of Nebraska Press, 1990), 76.
  17. Senate Document # 512, 23 Cong., 1 Sess. Vol. IV, p. x. Available online at https://books.google.com/books?id=KSTlvxxCOkcC&dq=60,000+removal+indian&source=gbs_navlinks_s.
  18. John P. Bowes, Land Too Good for Indians: Northern Indian Removal (Norman: University of Oklahoma Press, 2016).
  19. Pekka Hämäläinen, The Comanche Empire (New Haven: Yale University Press, 2008).
  20. Samuel J. Wells, “Federal Indian Policy: From Accommodation to Removal,” in Carolyn Reeves, ed., The Choctaw Before Removal (Jackson: The University Press of Mississippi, 1985), 181-211.
  21. William C. Sturtevant, Handbook of North American Indians: History of Indian-White Relations, Vol. 4 (Smithsonian Institution, 1988), 289.
  22. Adrienne Caughfield, True Women and Westward Expansion (College Station: Texas A&M University Press, 2005).
  23. Murray Newton Rothbard, Panic of 1819: Reactions and Policies (New York: Columbia University Press, 1962).
  24. Carol Sheriff, The Artificial River: The Erie Canal and the Paradox of Progress, 1817-1862 (New York: Hill and Wang, 1996).
  25. For more on the technology and transportation revolutions, see Daniel Walker Howe, What Hath God Wrought: The Transformation of America, 1815–1848 (New York: Oxford University Press, 2007).
  26. David Reimers, Other Immigrants: The Global Origins of the American People (New York: New York University Press, 2005), 27.
  27. H. P. N. Gammel, ed., The Laws of Texas, 1822-1897, Volume 1 (Austin: 1898), 1063. Available online at https://texashistory.unt.edu/ark:/67531/metapth5872/m1/1071/.
  28. Randolph B. Campbell, An Empire for Slavery: The Peculiar Institution in Texas, 1821-1865 (Baton Rouge: Louisiana State University Press, 1989).
  29. Quoted in The Annual Register, Or, A View of the History and Politics of the Year 1846, Volume 88 (Washington: J.G. & F. Rivington, 1847), 377.
  30. James M. McCaffrey, Army of Manifest Destiny: The American Soldier in the Mexican War, 1846-1848 (New York: New York University Press, 1992), 53.
  31. Ralph Waldo Emerson quoted in James McPherson, Battle Cry of Freedom: The Civil War Era (New York: Oxford University Press, 1988), 51.
  32. Gretchen Murphy, Hemispheric Imaginings: The Monroe Doctrine and Narratives of U.S. Empire (Durham: Duke University Press, 2009).
  33. Tom Chaffin, Fatal Glory: Narciso López and the First Clandestine U.S. War against Cuba (Baton Rouge: Louisiana State University Press, 1996).
  34. Anne F. Hyde, Empires, Nations, and Families: A History of the North American West, 1800-1860 (Lincoln: University of Nebraska Press, 2011), 471.

F17 – 3 British North America

Unidentified artist, “The Old Plantation,” ca. 1790-1800, Abby Aldrich Rockefeller Folk Art Museum, via Wikimedia

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

Whether they came as servants, slaves, free farmers, religious refugees, or powerful planters, the men and women of the American colonies created new worlds. Native Americans saw fledgling settlements grow into unstoppable beachheads of vast new populations that increasingly monopolized resources and remade the land into something else entirely. Meanwhile, as colonial societies developed in the seventeenth and eighteenth centuries, fluid labor arrangements and racial categories solidified into the race-based, chattel slavery that increasingly defined the economy of the British Empire. The North American mainland originally occupied a small and marginal place in that broad empire, as even the output of its most prosperous colonies paled before the tremendous wealth of Caribbean sugar islands. And yet the colonial backwaters on the North American mainland, ignored by many imperial officials, were nevertheless deeply tied into these larger Atlantic networks. A new and increasingly complex Atlantic World connected the continents of Europe, Africa, and the Americas.

Events across the ocean continued to influence the lives of American colonists. Civil war, religious conflict, and nation building transformed seventeenth-century Britain and remade societies on both sides of the ocean. At the same time, colonial settlements grew and matured, developing into powerful societies capable of warring against Native Americans and subduing internal upheaval. Patterns and systems established during the colonial era would continue to shape American society for centuries. And none, perhaps, would be as brutal and destructive as the institution of slavery.

 

II. Slavery and the Making of Race

After his arrival as a missionary in Charles Town, Carolina, in 1706, Reverend Francis Le Jau quickly grew disillusioned by the horrors of American slavery. He met enslaved Africans ravaged by the Middle Passage, Indians traveling south to enslave enemy villages, and colonists terrified of invasions from French Louisiana and Spanish Florida. Slavery and death surrounded him.

Le Jau’s strongest complaints were reserved for his own countrymen, the English. English traders encouraged wars with Indians in order to purchase and enslave captives, and planters justified the use of an enslaved workforce by claiming white servants were “good for nothing at all.” Although the minister thought otherwise and baptized and educated a substantial number of slaves, he was unable to overcome masters’ fear that Christian baptism would lead to slave emancipation.1

The 1660s marked a turning point for black men and women in English colonies like Virginia in North America and Barbados in the West Indies. New laws gave legal sanction to the enslavement of people of African descent for life. The permanent deprivation of freedom and the separate legal status of enslaved Africans facilitated the maintenance of strict racial barriers. Skin color became more than superficial difference; it became the marker of a transcendent, all-encompassing division between two distinct peoples, two races, white and black.2

Not all seventeenth-century racial thought pointed directly toward modern classifications of racial hierarchy. Captain Thomas Phillips, master of a slave ship in 1694, did not justify his work with any such creed: “I can’t think there is any intrinsic value in one color more than another, nor that white is better than black, only we think it so because we are so.”3 For Phillips, the profitability of slavery was the only justification he needed.

Wars offered the most common means for colonists to acquire Native American slaves. Seventeenth-century European legal thought held that enslaving prisoners of war was not only legal but more merciful than killing the captives outright. After the Pequot War (1636-1637), Massachusetts Bay colonists sold hundreds of North American Indians into slavery in the West Indies. A few years later, Dutch colonists in New Netherland (New York and New Jersey) enslaved Algonquian Indians during both Governor Kieft’s War (1641-1645) and the two Esopus Wars (1659-1663). The Dutch sent these war captives to English-settled Bermuda as well as Curaçao, a Dutch plantation-colony in the southern Caribbean. An even larger number of Indian slaves were captured during King Philip’s War (1675-1676), a pan-Indian uprising against the encroachments of the New England colonies. Hundreds of Indians were bound and shipped into slavery. The New England colonists also tried to send Indian slaves to Barbados, but the Barbados Assembly refused to import the New England Indians for fear they would encourage rebellion.

In the eighteenth century, wars in Florida, South Carolina, and the Mississippi Valley produced even more Indian slaves. Some wars emerged from contests between Indians and colonists for land, while others were manufactured as pretenses for acquiring captives. Some were not wars at all, but merely illegal raids performed by slave traders. Historians estimate that between 24,000 and 51,000 Native Americans were forced into slavery throughout the southern colonies between 1670 and 1715. ((Alan Gallay, The Indian Slave Trade: The Rise of the English Empire in the American South 1670–1717 (New Haven: Yale University Press, 2002), 299.)) While some of the enslaved Indians remained in the region, many were exported through Charleston, South Carolina, to other ports in the British Atlantic—most likely to Barbados, Jamaica, and Bermuda. Many of the English colonists who wished to claim land in frontier territories were threatened by the violence inherent in the Indian slave trade. By the eighteenth century, colonial governments often discouraged the practice, although it never ceased entirely as long as slavery remained, in general, a legal institution.

Native American slaves died quickly, mostly from disease, but others were murdered or died from starvation. The demands of growing plantation economies required a more reliable labor force, and the transatlantic slave trade provided such a workforce. European slavers transported millions of Africans across the ocean in a terrifying journey known as the Middle Passage. Writing at the end of the eighteenth century, Olaudah Equiano recalled the fearsomeness of the crew, the filth and gloom of the hold, the inadequate provisions allotted for the captives, and the desperation that drove some slaves to suicide. (Equiano claimed to have been born in Igboland in modern-day Nigeria, but he may have been born in colonial South Carolina, where he collected memories of the Middle Passage from African-born slaves.) In the same time period, Alexander Falconbridge, a slave ship surgeon, described the sufferings of slaves from shipboard infections and close quarters in the hold. Dysentery, known as “the bloody flux,” left captives lying in pools of excrement. Chained in small spaces in the hold, slaves could lose so much skin and flesh from chafing against metal and timber that their bones protruded. Other sources detailed rapes, whippings, and diseases like smallpox and conjunctivitis aboard slave ships.4

“Middle” had various meanings in the Atlantic slave trade. For the captains and crews of slave ships, the Middle Passage was one leg in the maritime trade in sugar and other semi-finished American goods, manufactured European commodities, and African slaves. For the enslaved Africans, the Middle Passage was the middle leg of three distinct journeys from Africa to the Americas. First was an overland journey in Africa to a coastal slave-trading factory, often a trek of hundreds of miles. Second—and middle—was an oceanic trip lasting from one to six months in a slaver. Third was acculturation (known as “seasoning”) and transportation to the American mine, plantation, or other location where new slaves were forced to labor.

“Stowage of the British slave ship Brookes under the regulated slave trade act of 1788,” 1789, via Wikimedia. Slave ships transported 11-12 million Africans to destinations in North and South America, but it was not until the end of the 18th century that any regulation was introduced. The Brookes print dates to after the Regulated Slave Trade Act of 1788, but still shows enslaved Africans chained in rows using iron leg shackles. The slave ship Brookes was allowed to carry up to 454 slaves, allotting 6 feet (1.8 m) by 1 foot 4 inches (0.41 m) to each man; 5 feet 10 inches (1.78 m) by 1 foot 4 inches (0.41 m) to each woman; and 5 feet (1.5 m) by 1 foot 2 inches (0.36 m) to each child, but one slave trader alleged that before 1788, the ship carried as many as 609 slaves.


The impact of the Middle Passage on the cultures of the Americas remains evident today. Many foods associated with Africans, such as cassava, were originally imported to West Africa as part of the slave trade and were then adopted by African cooks before being brought to the Americas, where they are still consumed. West African rhythms and melodies live in new forms today in music as varied as religious spirituals and synthesized drumbeats. African influences appear in the basket making and language of the Gullah people on the Carolina Coastal Islands.

Recent estimates count between 11 and 12 million Africans forced across the Atlantic between the sixteenth and nineteenth centuries, with about 2 million deaths at sea as well as an additional several million dying in the trade’s overland African leg or during seasoning.5 Conditions in all three legs of the slave trade were horrible, but the first abolitionists focused especially on the abuses of the Middle Passage.

Southern European trading empires like the Catalans and Aragonese were brought into contact with a Levantine commerce in sugar and slaves in the fourteenth and fifteenth centuries. Europeans made the first steps toward an Atlantic slave trade in the 1440s when Portuguese sailors landed in West Africa in search of gold, spices, and allies against the Muslims who dominated Mediterranean trade. Beginning in the 1440s, ship captains carried African slaves to Portugal. These Africans were valued primarily as domestic servants, as peasants provided the primary agricultural labor force in Western Europe.6 European expansion into the Americas introduced both settlers and European authorities to a new situation—an abundance of land and a scarcity of labor. Portuguese, Dutch, and English ships became the conduits for Africans forced to America. The western coast of Africa, the Gulf of Guinea, and the west-central coast were the sources of African captives. Wars of expansion and raiding parties produced captives who could be sold in coastal factories. African slave traders bartered for European finished goods such as beads, cloth, rum, firearms, and metal wares.

The first trading post built on the Gulf of Guinea and the oldest European building south of the Sahara, Elmina Castle was established as a trade settlement by the Portuguese in the 15th century. The fort became one of the largest and most important markets for African slaves in the Atlantic slave trade. “View of the castle of Elmina on the north-west side, seen from the river. Located on the gold coast in Guinea,” in Atlas Blaeu van der Hem, c. 1665-1668. Wikimedia, http://commons.wikimedia.org/wiki/File:ElMina_AtlasBlaeuvanderHem.jpg.


Slavers often landed in the British West Indies, where slaves were seasoned in places like Barbados. Charleston, South Carolina, became the leading entry point for the slave trade on the mainland. The founding of Charleston (“Charles Town” until the 1780s) in 1670 was viewed as a serious threat by the Spanish in neighboring Florida, who began construction of Castillo de San Marcos in St. Augustine as a response. In 1693 the Spanish king issued the Decree of Sanctuary, which granted freedom to slaves fleeing the English colonies if they converted to Catholicism and swore an oath of loyalty to Spain.7 The presence of Africans who bore arms and served in the Spanish militia testifies to the different conceptions of race among the English and Spanish in America.

About 450,000 Africans landed in British North America, a relatively small portion of the 11 to 12 million victims of the trade.8 As a proportion of the enslaved population, there were more enslaved women in North America than in other colonial slave populations. Enslaved African women also bore more children than their counterparts in the Caribbean or South America, facilitating the natural reproduction of slaves on the North American continent.9 A 1662 Virginia law stated that an enslaved woman’s children inherited the “condition” of their mother; other colonies soon passed similar statutes.10 This economic strategy on the part of planters created a legal system in which all children born to slave women would be slaves for life, whether the father was white or black, enslaved or free.

Most fundamentally, the emergence of modern notions of race was closely related to the colonization of the Americas and the slave trade. African slave traders lacked a firm category of race that might have led them to think that they were selling their own people, in much the same way that Native Americans did not view other Indian groups as part of the same “race.” Similarly, most English citizens felt no racial identification with the Irish or even the Welsh. The modern idea of race as an inherited physical difference (most often skin color) that is used to support systems of oppression was new in the early modern Atlantic world.

In the early years of slavery, especially in the South, the distinction between indentured servants and slaves was often unclear. In 1643, however, a law was passed in Virginia that made African women “tithable.”11 This, in effect, associated African women’s work with difficult agricultural labor. There was no similar tax levied on white women; the law was an attempt to distinguish white women from African women. The English ideal was to have enough hired hands and servants working on a farm so that wives and daughters did not have to partake in manual labor. Instead, white women were expected to labor in dairy sheds, small gardens, and kitchens. Of course, due to the labor shortage in early America, white women did participate in field labor. But this idealized gendered division of labor contributed to the English conceiving of themselves as better than other groups who did not divide labor in this fashion, including the West Africans arriving in slave ships to the colonies. For many white colonists, the association of a gendered division of labor with Englishness provided a further justification for the enslavement and subordination of Africans.

Ideas about the rule of the household were informed by legal and customary understandings of marriage and the home in England. A man was expected to hold “paternal dominion” over his household, which included his wife, children, servants, and slaves. In contrast, slaves were not legally masters of a household, and were therefore subject to the authority of the white master. Slave marriages were not recognized in colonial law. Some enslaved men and women married “abroad”; that is, they married individuals who were not owned by the same master and did not live on the same plantation. These husbands and wives had to travel miles at a time, typically only once a week on Sundays, to visit their spouses. Legal or religious authority did not protect these marriages, and masters could refuse to let their slaves visit a spouse, or even sell a slave to a new master hundreds of miles away from their spouse and children. Within the patriarchal and exploitative colonial environment, enslaved men and women struggled to establish families and communities.

 

III. Turmoil in Britain

Religious conflict plagued sixteenth-century England. While Spain plundered the New World and built an empire, Catholic and Protestant English monarchs vied for supremacy and attacked their opponents as heretics. Queen Elizabeth cemented Protestantism as the official religion of the realm, but questions endured as to what kind of Protestantism would hold sway. Many radical Protestants (often called “Puritans” by their critics) looked to the New World as an opportunity to create a beacon of Calvinist Christianity, while others continued the struggle in England. By the 1640s, political and economic conflicts between Parliament and the Crown merged with long-simmering religious tensions, made worse by a King who seemed sympathetic to Catholicism. The result was a bloody civil war. Colonists reacted in a variety of ways as England waged war on itself, but all were affected by these decades of turmoil.

Between 1629 and 1640 the absolute rule of Charles I caused considerable friction between the English Parliament and the King. Conflict erupted in 1640 when a parliament called by Charles refused to grant him subsidies to suppress a rebellion in Scotland. The Irish rebelled the following year, and by 1642 strained relations between Charles and Parliament led to civil war in England. In 1649 Parliament won, Charles I was executed, and England became a republic and protectorate under Oliver Cromwell. These changes redefined England’s relationship with its American colonies, as the new government under Cromwell attempted to consolidate its hold over its overseas territories.

In 1642, no permanent British North American colony was more than 35 years old. The Crown and various proprietors controlled most of the colonies, but settlers from Barbados to Maine enjoyed a great deal of independence. This was especially true in Massachusetts Bay, where Puritan settlers governed themselves according to the colony’s 1629 charter. Trade in tobacco and naval stores tied the colonies to England economically, as did religion and political culture, but in general the English government left the colonies to their own devices.

The English Revolution of the 1640s forced settlers in America to reconsider their place within the empire. Older colonies like Virginia and proprietary colonies like Maryland sympathized with the Crown. Newer colonies like Massachusetts Bay, populated by religious dissenters taking part in the Great Migration of the 1630s, tended to favor Parliament. Yet during the war the colonies remained neutral, fearing that support for either side could involve them in war. Even Massachusetts Bay, which nurtured ties to radical Protestants in Parliament, remained neutral.

King Charles I, pictured with the blue sash of the Order of the Garter, listens to his commanders detail the strategy for what would be the first pitched battle of the First English Civil War. As all previous constitutional compromises between King Charles and Parliament had broken down, both sides raised large armies in the hopes of forcing the other side to concede their position. The Battle of Edgehill ended with no clear winner, leading to a prolonged war of over four years and an even longer series of wars (known generally as the English Civil War) that eventually established the Commonwealth of England in 1649. Charles Landseer, The Eve of the Battle of Edge Hill, 1642, 1845. Wikimedia, http://commons.wikimedia.org/wiki/File:Charles_Landseer_-_The_Eve_of_the_Battle_of_Edge_Hill,_1642_-_Google_Art_Project.jpg.


Charles’s execution in 1649 challenged American neutrality. Six colonies, including Virginia and Barbados, declared allegiance to the dead monarch’s son, Charles II. Parliament responded with an act in 1650 that leveled an economic embargo on the rebelling colonies, forcing them to accept Parliament’s authority. Parliament argued that America had been “planted at the Cost, and settled” by the English nation, and that it, as the embodiment of that commonwealth, possessed ultimate jurisdiction over the colonies.12 It followed up the embargo with the Navigation Act of 1651, which compelled merchants in every colony to ship goods directly to England in English ships. Parliament sought to bind the colonies more closely to England and to prevent other European nations, especially the Dutch, from interfering with its American possessions.

England found itself in crisis after the death of Oliver Cromwell in 1658, leading in time to the reestablishment of the monarchy. On his 30th birthday (May 29, 1660), Charles II sailed from the Netherlands to his restoration after nine years in exile. He was received in London to great acclaim, as depicted in this contemporary painting. Lieve Verschuier, The arrival of King Charles II of England in Rotterdam, 24 May 1660, c. 1660-1665. Wikimedia.

The monarchy was restored with Charles II, but popular suspicions of the Crown’s Catholic and French sympathies lingered. Charles II’s suppression of the religious and press freedoms that had flourished during the civil war years demonstrated the Crown’s desire to re-impose order and royal rule. But it was the openly Catholic and pro-French policies of his successor, James II, that once again led to the overthrow of the monarchy in 1688. In that year a group of bishops and Parliamentarians offered the English throne to the Dutch Prince William of Orange and his English bride, Mary, the daughter of James II. This relatively peaceful coup was called the Glorious Revolution.

In the decades before the Glorious Revolution English colonists experienced religious and political conflict that reflected transformations in Europe as well as distinctly colonial conditions. In the 1670s and early 1680s King Charles II tightened English control over North America and the West Indies through the creation of new colonies, the imposition of new Navigation Acts, and the establishment of a new executive Council called the Lords of Trade and Plantations.13 As imperial officials attempted to curb colonists’ autonomy, threats from Native Americans and New France on the continent led many colonists to believe Indians and Catholics sought to destroy English America. In New England an uprising beginning in 1675 led by the Wampanoag leader Metacom, or King Philip as the English called him, seemed to confirm these fears. Indian conflicts helped trigger the revolt against royal authorities known as Bacon’s Rebellion in Virginia the following year.

James II worked to place the colonies on firmer administrative and defensive footing by creating the Dominion of New England in 1686. The Dominion consolidated the New England colonies, New York, and New Jersey into one administrative unit to counter French Canada, but colonists strongly resented the loss of their individual provinces. The Dominion’s governor, Sir Edmund Andros, did little to assuage fears of arbitrary power when he forced colonists into military service for a campaign against Maine Indians in early 1687. Impressment into military service was a longstanding grievance among English commoners that was transplanted to the colonies.

In England, James’s push for religious toleration of Catholics and Dissenters brought him into conflict with Parliament and the Anglican establishment. After the 1688 invasion by the Protestant William of Orange, James fled to France. When colonists learned that imperial officials in Boston and New York City had attempted to keep news of the Glorious Revolution secret, simmering hostilities toward provincial leaders burst into the open. In Massachusetts, New York, and Maryland, colonists overthrew colonial governments as local social antagonisms fused with popular animosity toward imperial rule. Colonists in America quickly declared allegiance to the new monarchs. They did so in part to maintain order in their respective colonies. As one Virginia official explained, if there was “no King in England, there was no Government here.”14 A declaration of allegiance was therefore a means toward stability.

More importantly, colonists declared for William and Mary because they believed their accession marked the rejection of absolutism and confirmed the centrality of Protestantism and liberty in English life. Settlers joined in the revolution by overthrowing the Dominion government, restoring the provinces to their previous status, and forcing out the Catholic-dominated Maryland government. They launched several assaults against French Canada as part of “King William’s War,” and rejoiced in Parliament’s 1689 passage of a Bill of Rights, which curtailed the power of the monarchy and cemented Protestantism in England. For English colonists, it was indeed a “glorious” revolution, as it united them in a Protestant empire that stood counter to Catholic tyranny, absolutism, and French power.

 

IV. New Colonies

Despite the turmoil in Britain, colonial settlement grew considerably throughout the seventeenth century, and several new settlements joined the two original colonies of Virginia and Massachusetts.

In 1632, Charles I set aside a tract of about 12 million acres at the northern tip of the Chesapeake Bay for a second colony in America. Named for the king’s queen, Henrietta Maria, Maryland was granted to Charles’s friend and political ally Cecilius Calvert, the second Lord Baltimore. Calvert hoped to gain additional wealth from the colony, as well as to create a haven for fellow Catholics. In England, many of that faith found themselves harassed by the Protestant majority, and more than a few considered migrating to America. Charles I, a Catholic sympathizer, favored Lord Baltimore’s plan to create a colony that would demonstrate that Catholics and Protestants could live together peacefully.

In late 1633, both Protestant and Catholic settlers left England for the Chesapeake, arriving in Maryland in March 1634. Men of middling means found greater opportunities in Maryland, which prospered as a tobacco colony without the growing pains suffered by Virginia.

Unfortunately, Lord Baltimore’s hopes of a diverse Christian colony were thwarted. Most colonists were Protestants relocating from Virginia. Many of these Protestants were radical Quakers and Puritans who were frustrated with Virginia’s efforts to force adherence to the Anglican Church, also known as the Church of England. In 1650, Puritans revolted, setting up a new government that prohibited both Catholicism and Anglicanism. Governor William Stone attempted to put down the revolt in 1655, but order was not restored until 1658, when the Calverts regained control of the colony. Two years after the Glorious Revolution (1688-1689), the Calverts again lost control of Maryland and the province became a royal colony.

Religion was a motivating factor in the creation of several other colonies as well, including the New England colonies of Connecticut and Rhode Island. The colony that would eventually become Connecticut grew out of communities at Saybrook and New Haven. Thomas Hooker and his congregation left Massachusetts for Connecticut because the area around Boston was becoming increasingly crowded. The Connecticut River Valley was large enough for more cattle and agriculture. In June 1636, Hooker led one hundred people and a variety of livestock in settling an area they called Newtown (later Hartford).

New Haven Colony had a more directly religious origin, as the founders attempted a new experiment in Puritanism. In 1638, John Davenport, Theophilus Eaton, and other supporters of the Puritan faith settled in the Quinnipiac (New Haven) area on the northern shore of Long Island Sound. In 1643 New Haven Colony was officially organized, with Eaton named governor. In the early 1660s, three men who had signed the death warrant for Charles I were concealed in New Haven. This did not win the colony any favors, and it grew increasingly poor and weak. In 1665, New Haven was absorbed into Connecticut, but its singular religious tradition endured with the creation of Yale College.

Religious radicals similarly founded Rhode Island. After his exile from Massachusetts, Roger Williams created a settlement called Providence in 1636. He negotiated for the land with the local Narragansett sachems Canonicus and Miantonomi. Williams and his fellow settlers agreed on an egalitarian constitution and established religious and political freedom in the colony. The following year, another Massachusetts exile, Anne Hutchinson, and her followers settled near Providence. Others soon arrived, and the colony was granted a charter by Parliament in 1644. Persistently independent and with republican sympathies, the settlers refused a governor and instead elected a president and council. These separate communities passed laws abolishing witchcraft trials, imprisonment for debt and, in 1652, chattel slavery. Because of the colony’s policy of toleration, it became a haven for Quakers, Jews, and other persecuted religious groups. In 1663, Charles II granted the colony a royal charter establishing the colony of Rhode Island and Providence Plantations.

Until the middle of the seventeenth century, the English neglected the area between Virginia and New England despite its obvious environmental advantages. The climate was healthier than that of the Chesapeake and more temperate than New England’s. The mid-Atlantic had three highly navigable rivers: the Susquehanna, the Delaware, and the Hudson. The Swedes and Dutch established their own colonies in the region: New Sweden in the Delaware Valley and New Netherland in the Hudson Valley.

Compared to other Dutch colonies around the globe, the settlements on the Hudson River were relatively minor. The Dutch West India Company realized that in order to secure its fur trade in the area, it needed to establish a greater presence in New Netherland. Toward this end, the company formed New Amsterdam on Manhattan Island in 1625.

Although the Dutch extended religious tolerance to those who settled in New Netherland, the population remained small. This left the colony vulnerable to English attack during the 1650s and 1660s, culminating in the handover of New Netherland to England in 1664. The new colony of New York was named for its proprietor, James, the Duke of York, brother to Charles II and funder of the expedition against the Dutch in 1664. New York was briefly reconquered by the Netherlands in 1673, and class and ethnic conflicts in New York City contributed to the rebellion against English authorities during the Glorious Revolution of 1688-89. Colonists of Dutch ancestry resisted assimilation into English culture well into the eighteenth century, prompting New York Anglicans to note that the colony was “rather like a conquered foreign province.”15

After the acquisition of New Netherland, Charles II and the Duke of York wished to strengthen English control over the Atlantic seaboard. In theory, this was to better tax the colonies; in practice, the awarding of the new proprietary colonies of New Jersey, Pennsylvania, and the Carolinas was a payoff of debts and political favors.

In 1664, the Duke of York granted the area between the Hudson and Delaware rivers to two English noblemen. These lands were split into two distinct colonies, East Jersey and West Jersey. William Penn was among West Jersey’s proprietors. The ambitious Penn wanted his own, larger colony, the lands for which would be granted by both Charles II and the Duke of York. Pennsylvania consisted of about 45,000 square miles west of the Delaware River and the former New Sweden. Penn was a member of the Society of Friends, otherwise known as Quakers, and he intended his colony to be a “colony of Heaven for the children of Light.”16 Like New England’s aspirations to be a City Upon a Hill, Pennsylvania was to be an example of godliness. But Penn’s dream was to create not a colony of unity but rather a colony of harmony. He noted in 1685 that “the people are a collection of diverse nations in Europe, as French, Dutch, Germans, Swedes, Danes, Finns, Scotch, and English; and of the last equal to all the rest.”17 Because Quakers in Pennsylvania extended to others in America the same rights they had demanded for themselves in England, the colony attracted a diverse collection of migrants. Slavery was particularly troublesome for some pacifist Quakers of Pennsylvania on the grounds that it required violence. In 1688, members of the Society of Friends in Germantown, outside of Philadelphia, signed a petition protesting the institution of slavery among fellow Quakers.

The Pennsylvania soil did not lend itself to the slave-based agriculture of the Chesapeake, but other colonies would depend heavily on slavery from their very foundations. The creation of the colony of Carolina, later divided into North and South Carolina and Georgia, was part of Charles II’s scheme to strengthen the English hold on the eastern seaboard and pay off political and cash debts. The Lords Proprietor of Carolina—eight very powerful favorites of the king—used the model of the colonization of Barbados to settle the area. In 1670, three ships of colonists from Barbados arrived at the mouth of the Ashley River, where they founded Charles Town. This defiance of Spanish claims to the area signified England’s growing confidence as a colonial power.

To attract colonists, the Lords Proprietor offered alluring incentives: religious tolerance, political representation by assembly, exemption from fees, and large land grants. These incentives worked, and Carolina grew quickly, attracting not only middling farmers and artisans but also wealthy planters. Colonists who could pay their own way to Carolina were granted 150 acres per family member. The Lords Proprietor allowed for slaves to be counted as members of the family. This encouraged the creation of large rice and indigo plantations along the coast of Carolina; these crops proved more stable commodities than deerskins and Indian slaves. Because of the size of Carolina, the authority of the Lords Proprietor was especially weak in the northern reaches on the Albemarle Sound. This region had been settled by Virginians in the 1650s and was increasingly resistant to Carolina authority. As a result, the Lords Proprietor founded the separate province of North Carolina in 1691.18

Henry Popple, “A map of the British Empire in America with the French and Spanish settlements adjacent thereto,” 1733 via Library of Congress.

 

V. Riot, Rebellion, and Revolt

The seventeenth century saw the establishment and solidification of the British North American colonies, but this process did not occur peacefully. European settlements on the continent were rocked by explosions of violence, including the Pequot War and the Mystic massacre, King Philip’s War, the Susquehannock War and Bacon’s Rebellion, and, in the Spanish Southwest, the Pueblo Revolt.

In May 1637, an armed contingent of English Puritans from Massachusetts Bay, Plymouth, and Connecticut colonies trekked into Indian country in territory claimed by New England. Referring to themselves as the “Sword of the Lord,” this military force intended to attack “that insolent and barbarous Nation, called the Pequots.” In the resulting violence, Puritans put the Mystic community to the torch, beginning with the north and south ends of the town. As Pequot men, women, and children tried to escape the blaze, other soldiers waited with swords and guns. One commander estimated that of the “four hundred souls in this Fort…not above five of them escaped out of our hands,” although another counted near “six or seven hundred” dead. In a span of less than two months, the English Puritans boasted that the Pequot “were drove out of their country, and slain by the sword, to the number of fifteen hundred.”19

The foundations of the war lay in the rivalry between the Pequot, the Narragansett, and the Mohegan, who battled for control of the fur and wampum trades in the Northeast. This rivalry eventually forced the English and Dutch to choose sides. The war remained a conflict of Native interests and initiative, especially as the Mohegan allied with the English and reaped the rewards that came with displacing the Pequot.

Victory over the Pequot not only provided security and stability for the English colonies but also propelled the Mohegan to new heights of political and economic influence as the primary power in New England. Ironically, history seemingly repeated itself later in the century when the Narragansett and other Native peoples, desperate for a remedy to their diminishing strength, joined the Wampanoag war against the Puritans. This produced a more violent conflict in 1675, known as King Philip’s War, that brought a decisive end to Indian power in New England.

In the winter of 1675, the body of John Sassamon, a Christian, Harvard-educated Wampanoag, was found under the ice of a pond in Plymouth Colony. A fellow Christian Indian informed English authorities that three warriors under the local sachem named Metacom, known to the English as King Philip, had killed Sassamon, who had previously accused Metacom of planning an offensive against the English. The three alleged killers appeared before the Plymouth court in June 1675. They were found guilty of murder and executed. Several weeks later, a group of Wampanoags killed nine English colonists in the town of Swansea.

Metacom—like most other New England sachems—had entered into covenants of “submission” to various colonies, viewing the arrangements as relationships of protection and reciprocity rather than subjugation. Indians and English had lived, traded, worshiped, and arbitrated disputes in close proximity before 1675, but the execution of three of Metacom’s men at the hands of Plymouth Colony epitomized what many Indians viewed as the growing inequality of that relationship. The Wampanoags who attacked Swansea may have sought to restore balance or to retaliate for the recent executions. Neither they nor anyone else sought to engulf all of New England in war, but that is precisely what happened. Authorities in Plymouth sprang into action, enlisting help from the neighboring colonies of Connecticut and Massachusetts.

Metacom and his followers eluded colonial forces in the summer of 1675, striking more Plymouth towns as they moved northwest. Some groups joined his forces, while others remained neutral or supported the English. The war badly divided some Indian communities. Metacom himself had little control over events, as panic and violence spread throughout New England in the autumn of 1675. English mistrust of neutral Indians, sometimes accompanied by demands they surrender their weapons, pushed many into open war. By the end of 1675, most of the Indians of present-day western and central Massachusetts had entered the war, laying waste to nearby English towns like Deerfield, Hadley, and Brookfield. Hapless colonial forces, spurning the military assistance of Indian allies such as the Mohegans, proved unable to locate more mobile native communities or intercept Indian attacks.

The English compounded their problems by attacking the powerful and neutral Narragansetts of Rhode Island in December 1675. In an action called the Great Swamp Fight, 1,000 Englishmen put the main Narragansett village to the torch, gunning down as many as 1,000 Narragansett men, women, and children as they fled the maelstrom. The surviving Narragansetts joined the Indians already fighting the English. Between February and April 1676, Native forces devastated a succession of English towns closer and closer to Boston.

In the spring of 1676, the tide turned. The New England colonies took the advice of men like Benjamin Church, who urged the greater use of Native allies, including Pequots and Mohegans, to find and fight the mobile warriors. Unable to plant crops and forced to live off the land, Indians’ will to continue the struggle waned as companies of English and Native allies pursued them. Growing numbers of fighters fled the region, switched sides, or surrendered in the spring and summer. The English sold many of the latter group into slavery. Colonial forces finally caught up with Metacom in August 1676, and the sachem was slain by a Christian Indian fighting with the English.

The war permanently altered the political and demographic landscape of New England. Between 800 and 1,000 English and at least 3,000 Indians perished in the 14-month conflict. Thousands of other Indians fled the region or were sold into slavery. In 1670, Native Americans comprised roughly 25 percent of New England’s population; a decade later, they made up perhaps 10 percent.20 The war’s brutality also encouraged a growing hatred of all Indians among many New England colonists. Though the fighting ceased in 1676, the bitter legacy of King Philip’s War lived on. 

Sixteen years later, New England faced a new fear: the supernatural. Beginning in early 1692 and culminating in 1693, Salem Town, Salem Village, Ipswich, and Andover all tried women and men as witches. Paranoia swept through the region, and fourteen women and six men were executed. Five other individuals died in prison. The causes of the trials were numerous: local rivalries, political turmoil, the enduring trauma of war, faulty legal procedures in which accusing others became a method of self-defense, and perhaps even low-level environmental contamination. Enduring tensions with Indians framed the events, however, and an Indian or African woman named Tituba, enslaved by the local minister, was at the center of the tragedy.21

Native American communities in Virginia had already been decimated by wars in 1622 and 1644. But a new clash arose in Virginia the same year that New Englanders crushed Metacom’s forces. This conflict, known as Bacon’s Rebellion, grew out of tensions between Native Americans and English settlers as well as tensions between wealthy English landowners and the poor settlers who continually pushed west into Indian territory.

Bacon’s Rebellion began, appropriately enough, with an argument over a pig. In the summer of 1675, a group of Doeg Indians visited Thomas Mathew on his plantation in northern Virginia to collect a debt that he owed them. When Mathew refused to pay, they took some of his pigs to settle the debt. This “theft” sparked a series of raids and counter-raids. The Susquehannock Indians were caught in the crossfire when the militia mistook them for Doegs, killing fourteen of them. The pattern of escalating violence then repeated itself: the Susquehannocks retaliated by killing colonists in Virginia and Maryland, and the English marshaled their forces and laid siege to the Susquehannocks. The conflict became uglier after the militia executed a delegation of Susquehannock ambassadors under a flag of truce. A few parties of warriors intent on revenge launched raids along the frontier and killed dozens of English colonists.

The sudden and unpredictable violence of the Susquehannock War triggered a political crisis in Virginia. Panicked colonists fled en masse from the vulnerable frontiers, flooding into coastal communities and begging the government for help. But the cautious governor, Sir William Berkeley, did not send an army after the Susquehannocks. He worried that a full-scale war would inevitably drag other Indians into the conflict, turning allies into deadly enemies. Berkeley therefore insisted on a defensive strategy centered on a string of new fortifications to protect the frontier and strict instructions not to antagonize friendly Indians. It was a sound military policy but a public relations disaster. Terrified colonists condemned Berkeley. Building contracts for the forts went to Berkeley’s wealthy friends, who conveniently decided that their own plantations were the most strategically vital. Colonists denounced the government as a corrupt band of oligarchs more interested in lining their pockets than protecting the people.

By the spring of 1676, a small group of frontier colonists took matters into their own hands. Naming the charismatic young Nathaniel Bacon as their leader, these self-styled “volunteers” proclaimed that they took up arms in defense of their homes and families. They took pains to assure Berkeley that they intended no disloyalty, but Berkeley feared a coup and branded the volunteers as traitors. Berkeley finally mobilized an army—not to pursue Susquehannocks, but to crush the colonists’ rebellion. His drastic response catapulted a small band of anti-Indian vigilantes into full-fledged rebels whose survival necessitated bringing down the colonial government.

Bacon and the rebels stalked the Susquehannock as well as friendly Indians like the Pamunkeys and the Occaneechis. The rebels became convinced that there was a massive Indian conspiracy to destroy the English. Berkeley’s stubborn persistence in defending friendly Indians and destroying the Indian-fighting rebels led Bacon to accuse the governor of conspiring with a “powerful cabal” of elite planters and with “the protected and darling Indians” to slaughter his English enemies.22

In the early summer of 1676, Bacon’s neighbors elected him their burgess and sent him to Jamestown to confront Berkeley. Though the House of Burgesses enacted pro-rebel reforms like prohibiting the sale of arms to Indians and restoring suffrage rights to landless freemen, Bacon’s supporters remained unsatisfied. Berkeley soon had Bacon arrested and forced the rebel leader into the humiliating position of publicly begging forgiveness for his treason. Bacon swallowed this indignity, but turned the tables by gathering an army of followers and surrounding the State House, demanding that Berkeley name him the General of Virginia and bless his universal war against Indians. Instead, the 70-year-old governor stepped onto the field in front of the crowd of angry men, unafraid, and called Bacon a traitor to his face. Then he tore open his shirt and dared Bacon to shoot him in the heart if he was so intent on overthrowing his government. “Here!” he shouted before the crowd, “Shoot me, before God, it is a fair mark. Shoot!” When Bacon hesitated, Berkeley drew his sword and challenged the young man to a duel, knowing that Bacon could neither back down from a challenge without looking like a coward nor kill him without making himself into a villain. Instead, Bacon resorted to bluster and blasphemy. Threatening to slaughter the entire Assembly if necessary, he cursed, “God damn my blood, I came for a commission, and a commission I will have before I go.”23 Berkeley stood defiant, but the cowed burgesses finally prevailed upon him to grant Bacon’s request. Virginia had its general, and Bacon had his war.

After this dramatic showdown in Jamestown, Bacon’s Rebellion quickly spiraled out of control. Berkeley slowly rebuilt his loyalist army, forcing Bacon to divert his attention to the coasts and away from the Indians. But most rebels were more interested in defending their homes and families than in fighting other Englishmen and deserted in droves at every rumor of Indian activity. In many places, the “rebellion” was less an organized military campaign than a collection of local grievances and personal rivalries. Both rebels and loyalists smelled the opportunities for plunder, seizing their rivals’ estates and confiscating their property.

For a small but vocal minority of rebels, however, the rebellion became an ideological revolution: Sarah Drummond, wife of rebel leader William Drummond, advocated independence from England and the formation of a Virginian Republic, declaring “I fear the power of England no more than a broken straw.” Others struggled for a different kind of independence: white servants and black slaves fought side by side in both armies after promises of freedom for military service. Everyone accused everyone else of treason, rebels and loyalists switched sides depending on which side was winning, and the whole Chesapeake disintegrated into a confused melee of secret plots and grandiose crusades, sordid vendettas and desperate gambits, with Indians and English alike struggling for supremacy and survival. One Virginian summed up the rebellion as “our time of anarchy.”24

The rebels steadily lost ground and ultimately suffered a crushing defeat. Bacon died of typhus in the autumn of 1676, and his successors surrendered to Berkeley in January 1677. Berkeley summarily tried and executed the rebel leadership in a succession of kangaroo courts-martial. Before long, however, the royal fleet arrived, bearing over 1,000 red-coated troops and a royal commission of investigation charged with restoring order to the colony. The commissioners removed Berkeley from office and dispatched him to London, where he died in disgrace.

But the defeat of Bacon’s Rebellion resolved little, and the maintenance of order remained precarious for years afterward. The garrison of royal troops discouraged both incursions by hostile Indians and insurrection by discontented colonists, allowing the king to continue profiting from tobacco revenues. The end of armed resistance did not mean a resolution of the underlying tensions destabilizing colonial society. Indians inside Virginia remained an embattled minority, and Indians outside Virginia remained a terrifying threat. Elite planters continued to grow rich by exploiting their indentured servants and marginalizing small farmers. The vast majority of Virginians continued to resent their exploitation with a simmering fury. Virginia legislators did recognize the extent of popular hostility toward colonial rule, however, and improved the social and political conditions of poor white Virginians in the years after the rebellion. During the same period, the increasing availability of enslaved workers through the Atlantic slave trade contributed to planters’ large-scale adoption of slave labor in the Chesapeake.

Just a few years after Bacon’s Rebellion, the Spanish experienced their own tumult in the area of contemporary New Mexico. The Spanish had maintained control partly by suppressing Native American beliefs. Friars aggressively enforced Catholic practice, burning native idols, masks, and other sacred objects and banning traditional spiritual practices. In 1680 the Pueblo religious leader Popé, who had been arrested and whipped for “sorcery” five years earlier, led various Puebloan groups in rebellion. Several thousand Pueblo warriors razed the Spanish countryside and besieged Santa Fe. They killed 400 Spaniards, including 21 Franciscan priests, and allowed 2,000 other Spaniards and Christian Pueblos to flee. It was perhaps the greatest act of Indian resistance in North American history.

Built sometime between 1000 and 1450 AD, the Taos Pueblo, located near modern-day Taos, New Mexico, functioned as a base for the leader Popé during the Pueblo Revolt. Luca Galuzzi (photographer), Taos Pueblo, 2007. Wikimedia.

In New Mexico, the Pueblos eradicated all traces of Spanish rule. They destroyed churches and threw themselves into rivers to wash away their Christian baptisms. “The God of the Christians is dead,” Popé proclaimed, and the Pueblo resumed traditional spiritual practices.25 The Spanish were exiled for twelve years. They returned in 1692 to reconquer a weakened New Mexico.

The late seventeenth century was a time of great violence and turmoil. Bacon’s Rebellion turned white Virginians against one another, King Philip’s War shattered Indian resistance in New England, and the Pueblo Revolt struck a major blow to Spanish power. It would take several more decades before similar patterns erupted in Carolina and Pennsylvania, but the constant advance of European settlements provoked conflict in these areas as well.

In 1715, the Yamasees, Carolina’s closest allies and most lucrative trading partners, turned against the colony and nearly destroyed it entirely. Writing from Carolina to London, the settler George Rodd believed the Yamasees wanted nothing less than “the whole continent and to kill us or chase us all out.”26 Yamasees would eventually advance within miles of Charles Town.

The Yamasee War’s first victims were traders. The governor had dispatched two of the colony’s most prominent men to visit and pacify a Yamasee council following rumors of native unrest. Yamasees quickly proved the fears well founded by killing the emissaries and every English trader they could corral.

Yamasees, like many other Indians, had come to depend on English courts as much as on the flintlock rifles and ammunition that traders offered them for slaves and animal skins. Feuds between English agents in Indian country had crippled the court of trade and shut down all diplomacy, provoking the violent Yamasee reprisal. Most Indian villages in the southeast sent at least a few warriors to join what quickly became a pan-Indian cause against the colony.

Yet Charles Town ultimately survived the onslaught by preserving one crucial alliance with the Cherokees. By 1717, the conflict had largely died down, and the only remaining menace was roaming Yamasee bands operating from Spanish Florida. Most Indian villages returned to terms with Carolina and resumed trading. The lucrative trade in Indian slaves, however, which had consumed 50,000 souls in five decades, dwindled after the war. The danger was too high for traders, and the colonies discovered even greater profits by importing Africans to work new rice plantations. Herein lay the birth of the “Old South,” that expanse of plantations that created untold wealth and misery. Indians retained the strongest militaries in the region, but they never again threatened the survival of English colonies.

If a colony existed where peace with Indians might continue, it was Pennsylvania. At the colony’s founding, William Penn created a Quaker religious imperative for the peaceful treatment of Indians. While Penn never doubted that the English would appropriate Native lands, he demanded that his colonists obtain Indian territories through purchase rather than violence. Though Pennsylvanians maintained relatively peaceful relations with Native Americans, rising immigration and booming land speculation drove up the demand for land. Coercive and fraudulent methods of negotiation became increasingly prominent. The Walking Purchase of 1737 was emblematic of both colonists’ desire for cheap land and the changing relationship between Pennsylvanians and their Native neighbors.

Through treaty negotiation in 1737, native Delaware leaders agreed to sell Pennsylvania all of the land that a man could walk in a day and a half, a common measure Delawares used to reckon distances. John and Thomas Penn, joined by the land speculator and longtime friend of the Penns James Logan, hired a team of skilled runners to complete the “walk” on a prepared trail. The runners traveled from Wrightstown to present-day Jim Thorpe, and proprietary officials then drew the new boundary line perpendicular to the runners’ route, extending northeast to the Delaware River. The colonial government thus measured out a tract much larger than Delawares had originally intended to sell, roughly 1,200 square miles. As a result, Delaware-proprietary relations suffered. Many Delawares left the lands in question and migrated westward to join Shawnees and other Delawares already living in the Ohio Valley. There, they established diplomatic and trade relationships with the French. Memories of the suspect purchase endured into the 1750s and became a chief point of contention between the Pennsylvania government and Delawares during the coming Seven Years’ War.27

 

VI. Conclusion

The seventeenth century saw the creation and maturation of Britain’s North American colonies. Colonists endured a century of struggle against unforgiving climates, hostile natives, and imperial intrigue. They did so largely through ruthless expressions of power. Colonists conquered Native Americans, attacked European rivals, and joined a highly lucrative transatlantic economy rooted in slavery. After surviving a century of desperation and war, British North American colonists fashioned increasingly complex societies with unique religious cultures, economic ties, and political traditions. These societies would come to shape not only North America, but soon the entirety of the Atlantic World.

 

VII. Reference Material

This chapter was edited by Daniel Johnson, with content contributions by Gregory Ablavsky, James Ambuske, Carolyn Arena, L.D. Burnett, Lori Daggar, Daniel Johnson, Hendrick Isom, D. Andrew Johnson, Matthew Kruer, Joseph Locke, Samantha Miller, Melissa Morris, Bryan Rindfleisch, Emily Romeo, John Saillant, Ian Saxine, Marie Stango, Luke Willert, and Ben Wright.

Recommended citation: Gregory Ablavsky et al., “British North America,” Daniel Johnson, ed., in The American Yawp, Joseph Locke and Ben Wright, eds., last modified August 1, 2016, http://www.AmericanYawp.com.

 

Recommended Reading

  • Blackburn, Robin. The Making of New World Slavery: From the Baroque to the Modern, 1492-1800. London and New York: Verso, 1997.
  • Braddick, Michael. God’s Fury, England’s Fire: A New History of the English Civil Wars. New York: Penguin, 2008.
  • Brown, Kathleen M. Good Wives, Nasty Wenches, Anxious Patriarchs: Gender, Race, and Power in Colonial Virginia. Williamsburg, Va.: University of North Carolina Press, 1996.
  • Chaplin, Joyce. Subject Matter: Technology, the Body, and Science on the Anglo-American Frontier, 1500-1676. Cambridge, Mass.: Harvard University Press, 2001.
  • Donoghue, John. Fire Under the Ashes: An Atlantic History of the English Revolution. Chicago: University of Chicago Press, 2013.
  • Gallay, Alan. The Indian Slave Trade: The Rise of the English Empire in the American South, 1670-1717. New Haven: Yale University Press, 2003.
  • Goodfriend, Joyce D. Before the Melting Pot: Society and Culture in Colonial New York City, 1664-1730. Princeton, N.J.: Princeton University Press, 1992.
  • Heywood, Linda M. and John K. Thornton. Central Africans, Atlantic Creoles, and the Foundation of the Americas, 1585-1660. New York: Cambridge University Press, 2007.
  • Landsman, Ned C. Crossroads of Empire: The Middle Colonies in British North America. Baltimore: Johns Hopkins University Press, 2010.
  • Lepore, Jill. The Name of War: King Philip’s War and the Origins of American Identity. New York: Knopf Doubleday Publishing, 2009.
  • Little, Ann M. The Many Captivities of Esther Wheelwright. New Haven: Yale University Press, 2016.
  • Merrell, James H. Into the American Woods: Negotiations on the Pennsylvania Frontier. New York: W.W. Norton, 2000.
  • Mustakeem, Sowande’ M. Slavery at Sea: Terror, Sex, and Sickness in the Middle Passage. Urbana: University of Illinois Press, 2016.
  • O’Malley, Gregory E. Final Passages: The Intercolonial Slave Trade of British America, 1619-1807. Williamsburg, Va.: University of North Carolina Press, 2014.
  • Parent, Anthony S. Foul Means: The Formation of a Slave Society in Virginia, 1660-1740. Williamsburg, Va.: University of North Carolina Press, 2003.
  • Parrish, Susan Scott. American Curiosity: Cultures of Natural History in the Colonial British Atlantic World. Chapel Hill: University of North Carolina Press, 2006.
  • Pestana, Carla Gardina. The English Atlantic in an Age of Revolution, 1640–1661. Cambridge: Harvard University Press, 2004.
  • Pulsipher, Jenny Hale. Subjects unto the Same King: Indians, English, and the Contest for Authority in Colonial New England. Philadelphia: University of Pennsylvania Press, 2005.
  • Ramsey, William L. The Yamasee War: A Study of Culture, Economy, and Conflict in the Colonial South. Lincoln: University of Nebraska Press, 2008.
  • Rice, James D. Tales from a Revolution: Bacon’s Rebellion and the Transformation of Early America. New York: Oxford University Press, 2012.
  • Roney, Jessica Choppin. Governed by a Spirit of Opposition: The Origins of American Political Practice in Colonial Philadelphia. Baltimore: Johns Hopkins University Press, 2014.
  • Smallwood, Stephanie E. Saltwater Slavery: A Middle Passage from Africa to American Diaspora. Cambridge, Mass.: Harvard University Press, 2008.
  • Stanwood, Owen. The Empire Reformed: English America in the Age of the Glorious Revolution. Philadelphia: University of Pennsylvania Press, 2011.
  • Taylor, Alan. American Colonies: The Settling of North America. New York: Viking, 2002.
  • Wood, Peter H. Black Majority: Negroes in Colonial South Carolina from 1670 through the Stono Rebellion. New York: Norton, 1975.

 

Notes

  1. Edgar Legare Pennington, “The Reverend Francis Le Jau’s Work Among Indians and Negro Slaves,” Journal of Southern History 1, no. 4 (November 1935): 442-458.
  2. William Waller Hening, Statutes at Large; Being a Collection of all the Laws of Virginia (Richmond, Va., 1809-23), Vol. II, pp. 170, 260, 266, 270.
  3. Captain Thomas Phillips, “A Journal of a Voyage Made in the Hannibal of London, 1693-1694,” in Elizabeth Donnan, ed., Documents Illustrative of the Slave Trade to America: Volume 1, 1441-1700 (New York: Octagon Books, 1969), 403.
  4. Alexander Falconbridge, An Account of the Slave Trade on the Coast of Africa (London, 1788).
  5. Phillip Curtin estimated that 9 million Africans were carried across the Atlantic, Joseph E. Inikori estimated 15 million, and Patrick Manning estimated 12 million transported with 10.5 million surviving the voyage. See Phillip D. Curtin, The Atlantic Slave Trade: A Census (Madison: University of Wisconsin Press, 1969); Joseph E. Inikori, “Measuring the Atlantic slave trade: An assessment of Curtin and Anstey,” Journal of African History 17 (1976): 197-223; and Patrick Manning, “Historical datasets on Africa and the African Atlantic,” Journal of Comparative Economics 40 (2012): 604-607.
  6. Paul E. Lovejoy, Transformations in Slavery: A History of Slavery in Africa (Cambridge: Cambridge University Press, 1983/2000), 36.
  7. Jane Landers, “Slavery in the Lower South,” OAH Magazine of History 17, no. 3 (2003): 23-27.
  8. Lynn Dumenil, ed., The Oxford Encyclopedia of American Social History (New York: Oxford University Press, 2012), 512.
  9. “Facts about the Slave Trade and Slavery,” The Gilder Lehrman Institute of American History. Available online at https://www.gilderlehrman.org/history-by-era/slavery-and-anti-slavery/resources/facts-about-slave-trade-and-slavery.
  10. Willie Lee Nichols Rose, ed., A Documentary History of Slavery in North America (Athens: University of Georgia Press, 1999), 19.
  11. Stephanie M. H. Camp, Closer to Freedom: Enslaved Women and Everyday Resistance in the Plantation South (Chapel Hill: University of North Carolina Press, 2004), 63-64.
  12. John H. Elliott, Empires of the Atlantic World: Britain and Spain in America, 1492-1830 (New Haven: Yale University Press, 2006), 148-49.
  13. Paul Kléber Monod, Imperial Island: A History of Britain and Its Empire, 1660-1837 (Malden, MA: Wiley-Blackwell, 2009), 80.
  14. Owen Stanwood, “Rumours and Rebellions in the English Atlantic World, 1688-9,” in Tim Harris and Steven Taylor, eds., The Final Crisis of the Stuart Monarchy: The Revolutions of 1688-91 in Their British, Atlantic and European Contexts (Woodbridge: Boydell Press, 2013), 214.
  15. Joyce D. Goodfriend, Before the Melting Pot: Society and Culture in Colonial New York City, 1664-1730 (Princeton, N.J.: Princeton University Press, 1991), 54.
  16. Quoted in David Hackett Fischer, Albion’s Seed: Four British Folkways in America (New York: Oxford University Press, 1989), 459.
  17. Albert Cook Myers, ed., Narratives of Early Pennsylvania, West New Jersey, and Delaware, 1630-1707 (New York: Charles Scribner’s Sons, 1912), 260.
  18. Noeleen McIlvenna, A Very Mutinous People: The Struggle for North Carolina, 1660-1713 (Chapel Hill: University of North Carolina Press, 2009).
  19. John Mason, A Brief History of the Pequot War (Boston, 1736), available online through DigitalCommons@University of Nebraska-Lincoln: http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1042&context=etas.
  20. James David Drake, King Philip’s War: Civil War in New England, 1675-1676 (Amherst: University of Massachusetts Press, 1999), 169.
  21. Paul Boyer and Stephen Nissenbaum, Salem Possessed: The Social Origins of Witchcraft (Cambridge: Harvard University Press, 1993). For more on Tituba, see Elaine G. Breslaw, Tituba, Reluctant Witch of Salem: Devilish Indians and Puritan Fantasies (New York: New York University Press, 1996).
  22. Nathaniel Bacon, “Manifesto (1676),” in Myra Jehlen and Michael Warner, eds., The English Literatures of America: 1500-1800 (Routledge, 1996), 226.
  23. Mary Newton Stanard, The Story of Bacon’s Rebellion (New York, 1907), 77-78.
  24. Quoted in April Lee Hatfield, Atlantic Virginia: Intercolonial Relations in the Seventeenth Century (Philadelphia: University of Pennsylvania Press, 2004), 286 n27.
  25. Robert Silverberg, The Pueblo Revolt (Lincoln: University of Nebraska Press, 1994), 131.
  26. Calendar of State Papers, Colonial Series, America and West Indies, August 1714-December 1715 (London: Kraus Reprint Ltd., 1928), 168-69.
  27. Steven Craig Harper, Promised Land: Penn’s Holy Experiment, the Walking Purchase, and the Dispossession of Delawares, 1600-1763 (Bethlehem: Lehigh University Press, 2006).

F17 – 2 Colliding Cultures

Theodor de Bry, “Negotiating Peace With the Indians,” 1634, Virginia Historical Society.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

The Columbian Exchange transformed both sides of the Atlantic, but with dramatically disparate outcomes. New diseases wiped out entire civilizations in the Americas, while newly imported nutrient-rich foodstuffs enabled a European population boom. Spain benefited most immediately as the wealth of the Aztec and Incan Empires strengthened the Spanish monarchy. Spain used its new riches to gain an advantage over other European nations, but this advantage was soon contested.

Portugal, France, the Netherlands, and England all raced to the New World, eager to match the gains of the Spanish. Native peoples greeted the new visitors with responses ranging from welcoming cooperation to aggressive violence, but the ravages of disease and the possibility of new trading relationships enabled Europeans to create settlements all along the western rim of the Atlantic world. New empires would emerge from these tenuous beginnings, and by the end of the seventeenth century, Spain would lose its privileged position to its rivals. An age of colonization had begun and, with it, a great collision of cultures commenced.

 

II. Spanish America

Spain extended its reach in the Americas after reaping the benefits of its colonies in Mexico, the Caribbean, and South America. Expeditions slowly began combing the continent and bringing Europeans into the modern-day United States in the hopes of establishing religious and economic dominance in a new territory.

Juan Ponce de León arrived in 1513 in the area he named “La Florida,” home to between 150,000 and 300,000 Native Americans. Two and a half centuries of contact with European and African peoples–whether through war, slave raids, or, most dramatically, foreign disease–then decimated Florida’s indigenous population. European explorers, meanwhile, had hoped to find great wealth in Florida, but reality never aligned with their imaginations.

1513 Atlantic map by cartographer Martin Waldseemüller. Via Wikimedia.

In the first half of the sixteenth century, Spanish colonizers fought frequently with Florida’s native peoples as well as with other Europeans. In the 1560s Spain expelled French Protestants, called Huguenots, from the area near modern-day Jacksonville in northeast Florida. In 1586 English privateer Sir Francis Drake burned the wooden settlement of St. Augustine. At the dawn of the seventeenth century, Spain’s reach in Florida extended from the mouth of the St. Johns River south to the environs of St. Augustine—an area of roughly 1,000 square miles. The Spaniards attempted to duplicate methods for establishing control used previously in Mexico, the Caribbean, and the Andes. The Crown granted missionaries the right to live among Timucua and Guale villagers in the late 1500s and early 1600s and encouraged settlement through the encomienda system (grants of Indian labor).1 

In the 1630s, the mission system extended into the Apalachee district in the Florida panhandle. The Apalachee, one of the most powerful tribes in Florida at the time of contact, claimed the territory from the modern Florida-Georgia border to the Gulf of Mexico. Apalachee farmers grew an abundance of corn and other crops. Indian traders carried surplus products east along the Camino Real (the royal road) that connected the western anchor of the mission system with St. Augustine. Spanish settlers drove cattle eastward across the St. Johns River and established ranches as far west as Apalachee. Still, Spain held Florida tenuously.

Further west, in 1598, Juan de Oñate led 400 settlers, soldiers, and missionaries from Mexico into New Mexico. The Spanish Southwest had brutal beginnings. When Oñate sacked the Pueblo city of Acoma, the “sky city,” the Spaniards slaughtered nearly half of its roughly 1,500 inhabitants, including women and children. Oñate ordered one foot cut off every surviving male over 15, and he enslaved the remaining women and children.2

Santa Fe, the first permanent European settlement in the Southwest, was established in 1610. Few Spaniards relocated to the Southwest because of the distance from Mexico City and the dry and hostile environment. Thus, the Spanish never achieved a commanding presence in the region. By 1680, only about 3,000 colonists called Spanish New Mexico home.3 There, they traded with and exploited the local Puebloan peoples. The region’s Puebloan population had plummeted from as many as 60,000 in 1600 to about 17,000 in 1680.4

Spain shifted strategies after its military expeditions wove their way through the southern and western portions of North America. Missions became the engine of colonization. Missionaries, most of whom were members of the Franciscan religious order, provided Spain with an advance guard in North America. Catholicism had always justified Spanish conquest, and colonization always carried religious imperatives. Spanish friars established dozens of missions along the Rio Grande in New Mexico by the early seventeenth century and, much later, in California.

 

III. Spain’s Rivals Emerge

The earliest plan of New Amsterdam (now Manhattan). 1660. Wikimedia.

While Spain plundered the New World, unrest plagued Europe. The Reformation threw England and France, the two European powers capable of contesting Spain, into turmoil. Long and expensive conflicts drained time, resources, and lives. Millions died from religious violence in France alone. As the violence diminished in Europe, however, religious and political rivalries continued in the New World.

The Spanish exploitation of New Spain’s riches inspired European monarchs to invest in exploration and conquest. Reports of Spanish atrocities spread throughout Europe and provided a humanitarian justification for European colonization. An English reprint of the writings of Bartolomé de las Casas bore the sensational title “Popery Truly Display’d in its Bloody Colours: Or, a Faithful Narrative of the Horrid and Unexampled Massacres, Butcheries, and all manners of Cruelties that Hell and Malice could invent, committed by the Popish Spanish.” An English writer explained that the Indians “were simple and plain men, and lived without great labour,” but in their lust for gold the Spaniards “forced the people (that were not used to labour) to stand all the daie in the hot sun gathering gold in the sand of the rivers. By this means a great number of them (not used to such pains) died, and a great number of them (seeing themselves brought from so quiet a life to such misery and slavery) of desperation killed themselves. And many would not marry, because they would not have their children slaves to the Spaniards.”5 The Spanish accused their critics of fostering a “Black Legend.” The Black Legend drew on religious differences and political rivalries. Spain’s wars and conquests in France, Italy, Germany, and the Netherlands had left many in those nations yearning to break free from Spanish influence. English writers argued that Spanish barbarities were foiling a tremendous opportunity for the expansion of Christianity across the globe and that a benevolent conquest of the New World by non-Spanish monarchies offered the surest salvation of the New World’s pagan masses. With these religious justifications, and with obvious economic motives, Spain’s rivals arrived in the New World.

 

The French

The French crown subsidized exploration in the early sixteenth century. Early French explorers sought a fabled Northwest Passage, a mythical waterway passing through the North American continent to Asia. Despite the wealth of the New World, Asia’s riches still beckoned to Europeans. Canada’s Saint Lawrence River appeared to be such a passage, stretching deep into the continent and into the Great Lakes. French colonial possessions centered on these bodies of water (and, later, down the Mississippi River to the port of New Orleans).

French colonization developed through investment from private trading companies. Traders established Port-Royal in Acadia (Nova Scotia) in 1605 and launched trading expeditions that stretched down the Atlantic coast as far as Cape Cod. The needs of the fur trade set the future pattern of French colonization. Founded in 1608 under the leadership of Samuel de Champlain, Quebec provided the foothold for what would become New France. French fur traders placed a higher value on cooperating with the Indians than on establishing permanent settlements. Asserting dominance in the region could have been to their own detriment, as it might have compromised their access to skilled Indian trappers, and therefore to wealth. Few Frenchmen traveled to the New World to settle permanently. In fact, few traveled at all. Many persecuted French Protestants (Huguenots) sought to emigrate after France criminalized Protestantism in 1685, but non-Catholics were forbidden to settle in New France.6

Jean-Pierre Lassus, “Veüe et Perspective de la Nouvelle Orleans,” 1726, Centre des archives d’outre-mer, France, via Wikimedia. This depiction shows New Orleans in 1726, when it was an eight-year-old French frontier settlement.

The French preference for trade over permanent settlement fostered more cooperative and mutually beneficial relationships with Native Americans than was typical among the Spanish and English. Perhaps eager to debunk the anti-Catholic elements of the Black Legend, the French worked to cultivate cooperation with Indians. Jesuit missionaries, for instance, adopted different conversion strategies than the Spanish Franciscans. Spanish missionaries brought Indians into enclosed missions, whereas Jesuits more often lived with or alongside Indian groups. Many French fur traders married Indian women.7 The offspring of Indian women and French men were so common in New France that the French developed a word for these children: Métis. The Huron people developed a particularly close relationship with the French, and many converted to Christianity and engaged in the fur trade. But close relationships with the French came at a high cost. The Huron were decimated by the ravages of European disease, and entanglements in French and Dutch conflicts proved disastrous. Despite this, some native peoples maintained alliances with the French.

Pressure from the powerful Iroquois in the east pushed many Algonquian-speaking peoples toward French territory in the mid-seventeenth century, and together they crafted what historians have called a “middle ground,” a kind of cross-cultural space that allowed for native and European interaction, negotiation, and accommodation. French traders adopted, sometimes clumsily, the gift-giving and mediation strategies expected of native leaders. Natives similarly engaged the impersonal European market and adapted, often haphazardly, to European laws. The Great Lakes “middle ground” experienced tumultuous success throughout the late-seventeenth and early-eighteenth centuries until English colonial officials and American settlers swarmed the region. The pressures of European expansion strained even the closest bonds.9

 

The Dutch

The Netherlands, a small maritime nation with great wealth, achieved considerable colonial success. In 1581, the Netherlands had officially broken away from the Hapsburgs and won a reputation as the freest of the new European nations. Dutch women maintained separate legal identities from their husbands and could therefore hold property and inherit full estates.

Ravaged by the turmoil of the Reformation, the Dutch embraced greater religious tolerance and freedom of the press than other European nations.10 Radical Protestants, Catholics, and Jews flocked to the Netherlands. The English Pilgrims, for instance, fled first to the Netherlands before sailing to the New World years later. The Netherlands built its colonial empire through the work of experienced merchants and skilled sailors. The Dutch were the most advanced capitalists in the modern world and marshaled extensive financial resources by creating innovative financial organizations such as the Amsterdam Stock Exchange and the Dutch East India Company. Although the Dutch offered liberties, they offered very little democracy—power remained in the hands of only a few. And Dutch liberties certainly had their limits. The Dutch advanced the slave trade and brought African slaves with them to the New World. Slavery was an essential part of Dutch capitalist triumphs.

Sharing the European hunger for access to Asia, in 1609 the Dutch commissioned the Englishman Henry Hudson to discover the fabled Northwest Passage through North America. He failed, of course, but nevertheless found the Hudson River and claimed modern-day New York for the Dutch. There they established New Netherland, an essential part of the Dutch New World empire. The Netherlands chartered the Dutch West India Company in 1621 and established colonies in Africa, the Caribbean, and North America. The island of Manhattan provided a launching pad to support its Caribbean colonies and attack Spanish trade.

Spiteful of the Spanish and mindful of the “Black Legend,” the Dutch were determined not to repeat Spanish atrocities. They fashioned guidelines for New Netherland that conformed to the ideas of Hugo Grotius, a legal philosopher who believed native peoples possessed the same natural rights as Europeans. Colony leaders insisted that land be purchased; in 1626 Peter Minuit therefore “bought” Manhattan from Munsee Indians.11 Despite the seemingly honorable intentions, it is very likely that the Dutch paid the wrong Indians for the land (either intentionally or unintentionally) or that the Munsee and the Dutch understood the transaction in very different terms. Transactions like these illustrated both the Dutch attempt to find a more peaceful process of colonization and the inconsistency between European and Native American understandings of property.

Like the French, the Dutch sought to profit, not to conquer. Trade with Native peoples became New Netherland’s central economic activity. Dutch traders carried wampum along pre-existing Native trade routes and exchanged it for beaver pelts. Wampum consisted of shell beads fashioned by Algonquian Indians on the southern New England coast and was valued as a ceremonial and diplomatic commodity among the Iroquois. Wampum became a currency that could buy anything from a loaf of bread to a plot of land.12

In addition to developing these trading networks, the Dutch also established farms, settlements, and lumber camps. The West India Company directors implemented the patroon system to encourage colonization. The patroon system granted large estates to wealthy landlords, who then paid the passage of tenants to work their land. Expanding Dutch settlements correlated with deteriorating relations with local Indians. In the interior of the continent, the Dutch retained valuable alliances with the Iroquois to maintain Beverwijck, modern-day Albany, as a hub for the fur trade.13 In the places where the Dutch built permanent settlements, the ideals of peaceful colonization succumbed to the settlers’ increasing demand for land. Armed conflicts erupted as colonial settlements encroached on Native villages and hunting lands. Profit and peace, it seemed, could not coexist.

Labor shortages, meanwhile, crippled Dutch colonization. The patroon system failed to bring enough tenants and the colony could not attract a sufficient number of indentured servants to satisfy the colony’s backers. In response, the colony imported 11 company-owned slaves in 1626, the same year that Minuit purchased Manhattan. Slaves were tasked with building New Amsterdam (modern-day New York City), including a defensive wall along the northern edge of the colony (the site of modern-day Wall Street). They created its roads and maintained its all-important port. Fears of racial mixing led the Dutch to import enslaved women, enabling the formation of African Dutch families. The colony’s first African marriage occurred in 1641, and by 1650 there were at least 500 African slaves in the colony. By 1660 New Amsterdam had the largest urban slave population on the continent.14 

As was typical of the practice of African slavery in much of the early seventeenth century, Dutch slavery in New Amsterdam was less comprehensively exploitative than later systems of American slavery. Some enslaved Africans, for instance, successfully sued for back wages. When several company-owned slaves fought for the colony against the Munsee Indians, they petitioned for their freedom and won a kind of “half freedom” that allowed them to work their own land in return for paying a large tithe, or tax, to their masters. The children of these “half-free” laborers, however, remained in bondage to the West India Company. The Dutch, who so proudly touted their liberties, grappled with the reality of African slavery, and some New Netherlanders protested the enslavement of Christianized Africans. The economic goals of the colony slowly crowded out these cultural and religious objections, and the much-boasted liberties of the Dutch came to exist alongside increasingly brutal systems of slavery.

 

The Portuguese 

The Portuguese had been leaders in Atlantic navigation well ahead of Columbus’s voyage. But the incredible wealth flowing from New Spain piqued the rivalry between the two Iberian countries, and accelerated Portuguese colonization efforts. This rivalry created a crisis within the Catholic world as Spain and Portugal squared off in a battle for colonial supremacy. The Pope intervened and divided the New World with the Treaty of Tordesillas in 1494. Land east of the Tordesillas Meridian, an imaginary line dividing South America, would be given to Portugal, whereas land west of the line was reserved for Spanish conquest. In return for the license to conquer, both Portugal and Spain were instructed to treat the natives with Christian compassion and to bring them under the protection of the Church.

Lucrative colonies in Africa and India initially preoccupied Portugal, but by 1530 the Portuguese turned their attention to the land that would become Brazil, driving out French traders and establishing permanent settlements. Gold and silver mines dotted the interior of the colony, but two industries powered early colonial Brazil: sugar and the slave trade. In fact, over the entire history of the Atlantic slave trade, more Africans were enslaved in Brazil than in any other colony in the Atlantic World. Gold mines emerged in greater numbers throughout the eighteenth century but never rivaled the profitability of sugar or slave-trading.

Jesuit missionaries succeeded in bringing Christianity to Brazil, but strong elements of African and native spirituality mixed with orthodox Catholicism to create a unique religious culture. This culture resulted from the demographics of Brazilian slavery. High mortality rates on sugar plantations required a steady influx of new slaves, thus perpetuating the cultural connection between Brazil and Africa. The reliance on new imports of slaves increased the likelihood of resistance, however, and escaped slaves managed to create several free settlements, called quilombos. These settlements drew from both African and Native slaves, and despite frequent attacks, several endured throughout the long history of Brazilian slavery.15 

Despite the arrival of these new Europeans, Spain continued to dominate the New World. The wealth flowing from the exploitation of the Aztec and Incan Empires greatly eclipsed the profits of other European nations. But this dominance would not last long. By the end of the sixteenth century, the powerful Spanish Armada would be destroyed, and the English would begin to rule the waves.

 

IV. English Colonization

Nicholas Hilliard, The Battle of Gravelines, 1588, via National Geographic España.

Spain had a one-hundred-year head start on New World colonization, and a jealous England eyed the enormous wealth that Spain gleaned. The Protestant Reformation had shaken England, but Elizabeth I, who assumed the English crown in 1558, oversaw England’s so-called “golden age,” which included both the expansion of trade and exploration and the literary achievements of Shakespeare and Marlowe. English mercantilism, a state-assisted manufacturing and trading system, created and maintained markets. These markets provided a steady supply of consumers and laborers, stimulated economic expansion, and increased English wealth.

However, wrenching social and economic changes unsettled the English population. The island’s population increased from fewer than three million in 1500 to over five million by the middle of the seventeenth century.16 The skyrocketing cost of land coincided with plummeting farming income. Rents and prices rose but wages stagnated. Moreover, movements to enclose public land, sparked by the transition of English landholders from agriculture to livestock-raising, evicted tenants from the land and created hordes of landless, jobless peasants who haunted the cities and countryside. One-quarter to one-half of the population lived in extreme poverty.17

New World colonization won support in England amid a time of rising fortunes among the wealthy, a tense Spanish rivalry, and mounting internal social unrest. But supporters of English colonization always touted more than economic gains and mere national self-interest. They claimed to be doing God’s work. Many claimed that colonization would glorify God, England, and Protestantism by Christianizing the New World’s pagan peoples. Advocates such as Richard Hakluyt the Younger and John Dee drew upon The History of the Kings of Britain, written by the twelfth-century monk Geoffrey of Monmouth, and its mythical account of King Arthur’s conquest and Christianization of pagan lands to justify American conquest.18 Moreover, promoters promised that the conversion of New World Indians would satisfy God and glorify England’s “Virgin Queen,” Elizabeth I, who was seen as nearly divine by some in England. The English—and other European Protestant colonizers—imagined themselves superior to the Spanish, who still bore the Black Legend of inhuman cruelty. English colonization, supporters argued, would prove that superiority.

In his 1584 “Discourse on Western Planting,” Richard Hakluyt amassed the supposed religious, moral, and economic benefits of colonization. He repeated the “Black Legend” of Spanish New World terrorism and attacked the sins of Catholic Spain. He promised that English colonization could strike a blow against Spanish heresy and bring Protestant religion to the New World. English interference, Hakluyt suggested, might provide the only salvation from Catholic rule in the New World. The New World, too, he said, offered obvious economic advantages. Trade and resource extraction would enrich the English treasury. England, for instance, could find plentiful materials to outfit a world-class navy. Moreover, he said, the New World could provide an escape for England’s vast armies of landless “vagabonds.” Expanded trade, he argued, would not only bring profit but also provide work for England’s jobless poor. A Christian enterprise, a blow against Spain, an economic stimulus, and a social safety valve all beckoned the English toward a commitment to colonization.19

This noble rhetoric veiled the coarse economic motives that brought England to the New World. New economic structures and a new merchant class paved the way for colonization. England’s merchants lacked estates, but they had new plans to build wealth. By collaborating with new government-sponsored trading monopolies and employing financial innovations such as joint-stock companies, England’s merchants sought to improve on the Dutch economic system. Spain was extracting enormous material wealth from the New World; why shouldn’t England? Joint-stock companies, the ancestors of modern corporations, became the initial instruments of colonization. With government monopolies, shared profits, and managed risks, these money-making ventures could attract and manage the vast capital needed for colonization. In 1606 James I approved the formation of the Virginia Company (named after Elizabeth, the “Virgin Queen”).

Rather than formal colonization, however, the most successful early English ventures in the New World were a form of state-sponsored piracy known as privateering. Queen Elizabeth sponsored sailors, or “Sea Dogges,” such as John Hawkins and Francis Drake, to plunder Spanish ships and towns in the Americas. Privateers earned a substantial profit both for themselves and for the English crown. England practiced piracy on a scale, one historian wrote, “that transforms crime into politics.”20 Francis Drake harried Spanish ships throughout the Western Hemisphere and raided Spanish caravans as far away as the coast of Peru on the Pacific Ocean. In 1580 Elizabeth rewarded her skilled pirate with knighthood. But Elizabeth walked a fine line. With Protestant-Catholic tensions already running high, English privateering provoked Spain. Tensions worsened after the execution of Mary, Queen of Scots, a Catholic. In 1588, King Philip II of Spain unleashed the fabled Armada. With 130 ships, 8,000 sailors, and 18,000 soldiers, Spain launched the largest invasion in history to destroy the British navy and depose Elizabeth.

An island nation, England depended upon a robust navy for trade and territorial expansion. England had fewer ships than Spain, but its vessels were smaller and swifter. They successfully harassed the Armada, forcing it to retreat to the Netherlands for reinforcements. But then a fluke storm, celebrated in England as a “Protestant wind,” annihilated the remainder of the fleet.21 The destruction of the Armada changed the course of world history. It not only saved England and secured English Protestantism, but it also opened the seas to English expansion and paved the way for England’s colonial future. By 1600, England stood ready to embark upon its dominance over North America.

English colonization would look very different from Spanish or French colonization. England had long been trying to conquer Catholic Ireland. Rather than integrating with the Irish and trying to convert them to Protestantism, England more often simply seized land through violence and pushed out the former inhabitants, leaving them to move elsewhere or to die. These same tactics would later be deployed in North American invasions. 

English colonization, however, began haltingly. Sir Humphrey Gilbert labored throughout the late sixteenth century to establish a colony in Newfoundland but failed. In 1587, with a predominantly male cohort of 150 English colonizers, John White reestablished an abandoned settlement on North Carolina’s Roanoke Island. Supply shortages prompted White to return to England for additional support, but the Spanish Armada and the mobilization of English naval efforts stranded him there for several years. When he finally returned to Roanoke, he found the colony abandoned. What befell the failed colony? White found the word “Croatan” carved into a tree or a post in the abandoned colony. Historians presume the colonists, short of food, may have fled to a nearby island of that name and encountered its settled native population. Others offer violence as an explanation. Regardless, the English colonists were never heard from again. When Queen Elizabeth died in 1603, no Englishmen had yet established a permanent North American colony.

After King James made peace with Spain in 1604, privateering no longer held out the promise of cheap wealth. Colonization assumed a new urgency. The Virginia Company, established in 1606, drew inspiration from Cortés and the Spanish conquests. It hoped to find gold and silver as well as other valuable trading commodities in the New World: glass, iron, furs, pitch, tar, and anything else the country could supply. The Company planned to identify a navigable river with a deep harbor, away from the eyes of the Spanish. There they would find an Indian trading network and extract a fortune from the New World.

 

V. Jamestown

"Incolarum Virginiae piscandi ratio (The Method of Fishing of the Inhabitants of Virginia)," c1590, via the Encyclopedia Virginia.

“Incolarum Virginiae piscandi ratio (The Method of Fishing of the Inhabitants of Virginia),” c1590, via the Encyclopedia Virginia.

In April 1607 Englishmen aboard three ships—the Susan Constant, Godspeed, and Discovery—sailed forty miles up the James River (named for the English king) in present-day Virginia (named for Elizabeth I, the “Virgin Queen”) and settled upon just such a place. The peninsula they selected was upriver and out of sight of Spanish patrols. It offered easy defense against ground assaults and was both uninhabited and located close enough to many Indian villages and their potentially lucrative trade networks. But the location was a disaster. Indians had ignored the peninsula for two reasons: terrible soil hampered agriculture, and brackish tidal water led to debilitating disease. Despite these setbacks, the English built Jamestown, the first permanent English colony in the present-day United States.

The English had not entered a wilderness but had arrived amid a people they called the Powhatan Confederacy. Powhatan, or Wahunsenacawh, as he called himself, led nearly 10,000 Algonquian-speaking Indians in the Chesapeake. They burned vast acreage to clear brush and create sprawling artificial park-like grasslands so they could easily hunt deer, elk, and bison. The Powhatan raised corn, beans, squash, and possibly sunflowers, rotating acreage throughout the Chesapeake. Without plows, manure, or draft animals, the Powhatan achieved a remarkable number of calories cheaply and efficiently.

Jamestown was a profit-seeking venture backed by investors. The colonists were mostly gentlemen and proved entirely unprepared for the challenges ahead. They hoped for easy riches but found none. As John Smith later complained, they “would rather starve than work.”22 And so they did. Disease and starvation ravaged the colonists, thanks in part to the peninsula’s unhealthy location and the fact that supplies from England arrived sporadically or spoiled. Fewer than half of the original colonists survived the first nine months.

John Smith, a yeoman’s son and capable leader, took command of the crippled colony and promised, “He that will not work shall not eat.” He navigated Indian diplomacy, claiming that he was captured and sentenced to death but that Powhatan’s daughter, Pocahontas, intervened to save his life. She would later marry another colonist, John Rolfe, and die in England.

Powhatan kept the English alive that first winter. The Powhatan had welcomed the English and placed a high value on metal axe-heads, kettles, tools, and guns and eagerly traded furs and other abundant goods for them. With 10,000 confederated natives and with food in abundance, the Indians had little to fear and much to gain from the isolated outpost of sick and dying Englishmen.

John White, “Village of the Secotan,” 1585, via Wikimedia.

Despite reinforcements, the English continued to die. Four hundred settlers arrived in 1609, but the overwhelmed colony entered a desperate “starving time” in the winter of 1609-1610. Supplies were lost at sea. Relations with the Indians deteriorated and the colonists fought a kind of slow-burning guerrilla war with the Powhatan. Disaster loomed for the colony. The settlers ate everything they could, roaming the woods for nuts and berries. They boiled leather. They dug up graves to eat the corpses of their former neighbors. One man was executed for killing and eating his wife. Some years later, George Percy recalled the colonists’ desperation during these years, when he served as the colony’s president: “Having fed upon our horses and other beasts as long as they lasted, we were glad to make shift with vermin as dogs, cats, rats and mice … as to eat boots shoes or any other leather … And now famine beginning to look ghastly and pale in every face, that nothing was spared to maintain life and to doe those things which seam incredible, as to dig up dead corpses out of graves and to eat them.”23 Archaeological excavations in 2012 exhumed the bones of a fourteen-year-old girl that exhibited signs of cannibalism.24 All but 60 settlers would die by the summer of 1610.

Little improved over the next several years. By 1616, 80 percent of all English immigrants who had arrived in Jamestown had perished. England’s first American colony was a catastrophe. The colony was reorganized, and in 1614 the marriage of Pocahontas to John Rolfe eased relations with the Powhatan, though the colony still limped along as a starving, commercially disastrous tragedy. The colonists were unable to find any profitable commodities and remained dependent upon the Indians and sporadic shipments from England for food. But then tobacco saved Jamestown.

By the time King James I described tobacco as a “noxious weed, … loathsome to the eye, hateful to the nose, harmful to the brain, and dangerous to the lungs,” it had already taken Europe by storm. In 1616 John Rolfe crossed tobacco strains from Trinidad and Guiana and planted Virginia’s first tobacco crop. In 1617 the colony sent its first cargo of tobacco back to England. The “noxious weed,” a native of the New World, fetched a high price in Europe and the tobacco boom began in Virginia and then later spread to Maryland. Within fifteen years American colonists were exporting over 500,000 pounds of tobacco per year. Within forty, they were exporting fifteen million.25 

Tobacco changed everything. It saved Virginia from ruin, incentivized further colonization, and laid the groundwork for what would become the United States. With a new market open, Virginia drew not only merchants and traders but also settlers. Colonists came in droves. They were mostly young, mostly male, and mostly indentured servants who signed contracts called indentures that bonded them to employers for a period of years in return for passage across the ocean. Even the rough terms of servitude were no match for the promise of land and potential profits that beckoned English farmers. But still there were not enough of them. Tobacco was a labor-intensive crop, and ambitious planters, with seemingly limitless land before them, lacked only laborers to escalate their wealth and status. The colony’s great labor vacuum inspired the creation of the “headright policy” in 1618: any person who migrated to Virginia would automatically receive 50 acres of land, and any immigrant whose passage they paid would entitle them to 50 acres more.

In 1619 the Virginia Company established the House of Burgesses, a limited representative body composed of white landowners that first met in Jamestown. That same year, a Dutch slave ship sold 20 Africans to the Virginia colonists. Southern slavery was born.

Soon the tobacco-growing colonists expanded beyond the bounds of Jamestown’s deadly peninsula. When it became clear that the English were not merely intent on maintaining a small trading post, but sought a permanent, ever-expanding colony, conflict with the Powhatan Confederacy became almost inevitable. Powhatan died in 1618 and was succeeded by his brother, Opechancanough, who promised to drive the land-hungry colonists back into the sea. He launched a surprise attack and in a single day (March 22, 1622) killed 347 colonists, or one-fourth of all the colonists in Virginia. The colonists retaliated and revisited the massacres upon Indian settlements many times over. The massacre freed the colonists to drive the Indians off their land. The governor of Virginia declared it colonial policy to achieve the “expulsion of the savages to gain the free range of the country.”26 War and disease destroyed the remnants of the Chesapeake Indians and tilted the balance of power decisively toward the English colonizers.

English colonists brought to the New World particular visions of racial, cultural, and religious supremacy. Despite starving in the shadow of the Powhatan Confederacy, English colonists nevertheless judged themselves physically, spiritually, and technologically superior to native peoples in North America. Christianity, metallurgy, intensive agriculture, trans-Atlantic navigation, and even wheat all magnified the English sense of superiority. This sense of superiority, when coupled with outbreaks of violence, left the English feeling entitled to indigenous lands and resources.

Spanish conquerors established the framework for the Atlantic slave trade over a century before the first chained Africans arrived at Jamestown. Even Bartolomé de las Casas, celebrated for his pleas to save Native Americans from colonial butchery, for a time recommended that indigenous labor be replaced by importing Africans. Early English settlers in the Caribbean and on the Atlantic coast of North America mostly imitated European ideas of African inferiority. “Race” followed the expansion of slavery across the Atlantic world. Skin color and race suddenly seemed fixed. Englishmen equated Africans with categorical blackness and blackness with sin, “the handmaid and symbol of baseness.”27 An English essayist in 1695 wrote that “A negro will always be a negro, carry him to Greenland, feed him chalk, feed and manage him never so many ways.”28 More and more Europeans embraced the notion that Europeans and Africans were of distinct races. Others preached that the Old Testament God cursed Ham, the son of Noah, and doomed black people to perpetual enslavement.

And yet in the early years of American slavery, ideas about race were not yet fixed and the practice of slavery was not yet codified. The first generations of Africans in English North America faced miserable conditions but, in contrast to later American history, their initial servitude was not necessarily permanent, heritable, or even particularly disgraceful. Africans were definitively set apart as fundamentally different from their white counterparts, and faced longer terms of service and harsher punishments, but, like the indentured white servants whisked away from English slums, these first Africans in North America could also work for only a set number of years before becoming free landowners themselves. The Angolan Anthony Johnson, for instance, was sold into servitude but fulfilled his indenture and became a prosperous tobacco planter himself.29 

In 1622, at the dawn of the tobacco boom, Jamestown had still seemed a failure. But the rise of tobacco and the destruction of the Powhatan turned the tide. Colonists escaped the deadly peninsula and immigrants poured into the colony to grow tobacco and turn a profit for the Crown.

 

VI. New England

Seal of the Massachusetts Bay Colony.

Seal of the Massachusetts Bay Colony, via The History Project (UC Davis).

The English colonies in New England established from 1620 onward were founded with loftier goals than those in Virginia. Although migrants to New England expected economic profit, religious motives directed the rhetoric and much of the reality of these colonies. Not every English person who moved to New England during the seventeenth century was a Puritan, but Puritans dominated the politics, religion, and culture of New England. Even after 1700, the region’s Puritan inheritance shaped many aspects of its history.

The term Puritan began as an insult, and its recipients usually referred to each other as “the godly” if they used a specific term at all. Puritans believed that the Church of England had not distanced itself far enough from Catholicism after Henry VIII broke with Rome in the 1530s. They largely agreed with European Calvinists—followers of theologian Jean Calvin—on matters of religious doctrine. Calvinists (and Puritans) believed that mankind was redeemed by God’s grace alone and that the fate of an individual’s immortal soul was predestined. The happy minority God had already chosen to save were known among English Puritans as the Elect. Calvinists also argued that the decoration of churches, reliance on ornate ceremony, and a corrupt priesthood obscured God’s message. They believed that reading the Bible was the best way to understand God.

Puritans were stereotyped by their enemies as dour killjoys, and the exaggeration has endured. It is certainly true that the Puritans’ disdain for excess and opposition to many holidays popular in Europe (including Christmas, which, as Puritans never tired of reminding everyone, the Bible never told anyone to celebrate) lent themselves to caricature. But Puritans understood themselves as advocating a reasonable middle path in a corrupt world. It would never occur to a Puritan, for example, to abstain from alcohol or sex.

During the first century after the English Reformation (c. 1530-1630), Puritans sought to “purify” the Church of England of all practices that smacked of Catholicism, advocating a simpler worship service, the abolition of ornate churches, and other reforms. They had some success in pushing the Church of England in a more Calvinist direction, but with the coronation of King Charles I (r. 1625-1649), the Puritans gained an implacable foe who cast them as excessive and dangerous. Facing growing persecution, the Puritans began the Great Migration, during which about 20,000 people traveled to New England between 1630 and 1640. The Puritans (unlike the small band of separatist “Pilgrims” who founded Plymouth Colony in 1620) remained committed to reforming the Church of England, but temporarily decamped to North America to accomplish this task. Leaders like John Winthrop insisted they were not separating from, or abandoning, England, but were rather forming a godly community in America that would be a “City on a Hill” and an example for reformers back home.30 The Puritans did not seek to create a haven of religious toleration, a notion that they—along with nearly all European Christians—regarded as ridiculous at best and dangerous at worst.

While the Puritans did not succeed in building a godly utopia in New England, a combination of Puritan traits with several external factors created colonies wildly different from any other region settled by English people. Unlike those heading to Virginia, colonists in New England (Plymouth [1620], Massachusetts Bay [1630], Connecticut [1636], and Rhode Island [1636]) generally arrived in family groups. The majority of New England immigrants were small landholders in England, a class contemporaries called the “middling sort.” When they arrived in New England, they tended to replicate their home environments, founding towns composed of independent landholders. The New England climate and soil made large-scale plantation agriculture impractical, so the system of large landholders using masses of slaves or indentured servants to grow labor-intensive crops never took hold.

There is no evidence that the New England Puritans would have opposed such a system were it possible; other Puritans made their fortunes on the Caribbean sugar islands, and New England merchants profited as suppliers of provisions and slaves to those colonies. By accident of geography as much as by design, New England society was much less stratified than any of Britain’s other seventeenth-century colonies.

Although New England colonies could boast wealthy landholding elites, the disparity of wealth in the region remained narrow compared to the Chesapeake, Carolina, or the Caribbean. Instead, seventeenth-century New England was characterized by a broadly-shared modest prosperity based on a mixed economy dependent on small farms, shops, fishing, lumber, shipbuilding, and trade with the Atlantic World.

A combination of environmental factors and the Puritan social ethos produced a region of remarkable health and stability during the seventeenth century. New England immigrants avoided most of the deadly outbreaks of tropical disease that turned Chesapeake colonies into graveyards. Disease, in fact, aided English settlement and shaped its relations with Native Americans. In contrast to other English colonists who had to contend with powerful Native American neighbors, the Puritans confronted the stunned survivors of a biological catastrophe. A lethal pandemic of smallpox during the 1610s swept away as much as 90 percent of the region’s Native American population. Many survivors welcomed the English as potential allies against rival tribes who had escaped the catastrophe. The relatively healthy environment coupled with political stability and the predominance of family groups among early immigrants allowed the New England population to grow to 91,000 people by 1700 from only 21,000 immigrants. In contrast, 120,000 English went to the Chesapeake, and only 85,000 white colonists remained in 1700.31

The New England Puritans set out to build their utopia by creating communities of the godly. Groups of men, often from the same region of England, applied to the colony’s General Court for land grants.32 They generally divided part of the land for immediate use while keeping much of the rest as “commons” or undivided land for future generations. The town’s inhabitants collectively decided the size of each settler’s home lot based on their current wealth and status. Besides oversight of property, the town restricted membership, and new arrivals needed to apply for admission. Those who gained admittance could participate in town governments that, while not democratic by modern standards, nevertheless had broad popular involvement. All male property holders could vote in town meetings and choose the selectmen, assessors, constables, and other officials from among themselves to conduct the daily affairs of government. Upon their founding, towns wrote covenants, reflecting the Puritan belief in God’s covenant with His people. Towns sought to arbitrate disputes and contain strife, as did the church. Wayward or divergent individuals were persuaded, corrected, or coerced. Popular conceptions of Puritans as hardened authoritarians are exaggerated, but if persuasion and arbitration failed, people who did not conform to community norms were punished or removed. Massachusetts banished Anne Hutchinson, Roger Williams, and other religious dissenters like the Quakers.

Although by many measures colonization in New England succeeded, its Puritan leaders failed in their own mission to create a utopian community that would inspire their fellows back in England. They tended to focus their disappointment on the younger generation. “But alas!” Increase Mather lamented, “That so many of the younger Generation have so early corrupted their [the founders’] doings!”33 The Jeremiad, a sermon lamenting the fallen state of New England due to its straying from its early virtuous path, became a staple of late seventeenth-century Puritan literature.

Yet the Jeremiads could not stop the effects of prosperity. The population spread and grew more diverse. Many, if not most, New Englanders retained strong ties to their Calvinist roots into the eighteenth century, but the Puritans (who became Congregationalists) struggled against a rising tide of religious pluralism. On December 25, 1727, Judge Samuel Sewall noted in his diary that a new Anglican minister “keeps the day in his new Church at Braintrey: people flock thither.” Previously forbidden holidays like Christmas were celebrated publicly in church and privately in homes. Puritan divine Cotton Mather discovered on Christmas 1711 that “a number of young people of both sexes, belonging, many of them, to my flock, had…a Frolick, a reveling Feast, and a Ball, which discovers their Corruption.”34

Despite the Mathers’ and other Puritan leaders’ laments over their failure, they left a mark on New England culture and society that endured long after the region’s residents ceased to be called “Puritan.”

 

VII. Conclusion

The fledgling settlements in Virginia and Massachusetts paled in importance when compared to the sugar colonies of the Caribbean. Valued more as marginal investments and as social safety valves for releasing England’s poor, these colonies nonetheless created a foothold for Britain on a vast North American continent. And although the seventeenth century would be fraught for Britain–religious, social, and political upheavals would behead one king and force another to flee his throne–settlers in Massachusetts and Virginia were still tied together by the emerging Atlantic economy. While commodities such as tobacco and sugar fueled new markets in Europe, the economy grew increasingly dependent upon slave labor. Enslaved Africans transported across the Atlantic would further complicate the collision of cultures in the Americas. The creation and maintenance of a slave system would spark new understandings of human difference and new modes of social control. The economic exchanges of the new Atlantic economy would not only generate great wealth and exploitation, they would also lead to new cultural systems and new identities for the inhabitants of at least four continents.

 

VIII. Reference Materials

This chapter was edited by Ben Wright and Joseph Locke, with content contributions by Erin Bonuso, L.D. Burnett, Jon Grandage, Joseph Locke, Lisa Mercer, Maria Montalvo, Ian Saxine, Jennifer Tellman, Luke Willert, and Ben Wright.

Recommended citation: Erin Bonuso et al., “Colliding Cultures,” in The American Yawp, eds. Joseph L. Locke and Ben Wright, last modified August 1, 2016, http://www.AmericanYawp.com.

 

Recommended Reading

  • Armitage, David, and Michael J. Braddick, eds. The British Atlantic World, 1500-1800. New York: Palgrave Macmillan, 2002.
  • Barr, Juliana. Peace Came in the Form of a Woman: Indians and Spaniards in the Texas Borderlands. Chapel Hill: University of North Carolina Press, 2009.
  • Blackburn, Robin. The Making of New World Slavery: From the Baroque to the Modern, 1492-1800. London and New York: Verso, 1997.
  • Calloway, Colin G. New Worlds for All: Indians, Europeans, and the Remaking of Early America. Baltimore: Johns Hopkins University Press, 1997.
  • Cañizares-Esguerra, Jorge. Puritan Conquistadors: Iberianizing the Atlantic, 1550-1700. Stanford: Stanford University Press, 2006.
  • Cronon, William. Changes in the Land: Indians, Colonists, and the Ecology of New England. New York: Hill and Wang, 1983.
  • Daniels, Christine and Michael V. Kennedy, eds. Negotiated Empires: Centers and Peripheries in the Americas, 1500-1820. New York: Routledge, 2002.
  • Dubcovsky, Alejandra. Informed Power: Communication in the Early American South. Cambridge: Harvard University Press, 2016.
  • Elliott, John H. Empires of the Atlantic World: Britain and Spain in America, 1492-1830. New Haven: Yale University Press, 2006.
  • Fuentes, Marisa J. Dispossessed Lives: Enslaved Women, Violence, and the Archive. Philadelphia: University of Pennsylvania Press, 2016.
  • Goetz, Rebecca Anne. The Baptism of Early Virginia: How Christianity Created Race. Baltimore: Johns Hopkins University Press, 2012.  
  • Gould, Eliga H. “Entangled Histories, Entangled Worlds: The English-Speaking Atlantic as a Spanish Periphery.” American Historical Review 112, no. 3 (June 2007): 764-786.
  • Grandjean, Katherine. American Passage: The Communications Frontier in Early New England. Cambridge: Harvard University Press, 2015.
  • Mancall, Peter C. Hakluyt’s Promise: An Elizabethan’s Obsession for an English America. New Haven: Yale University Press, 2007.
  • Morgan, Edmund S. American Slavery, American Freedom: The Ordeal of Colonial Virginia. New York: W.W. Norton & Co., 1975.
  • Morgan, Jennifer. Laboring Women: Reproduction and Gender in New World Slavery. Philadelphia: University of Pennsylvania Press, 2004.
  • Reséndez, Andrés. The Other Slavery: The Uncovered Story of Indian Enslavement in America. New York: Houghton Mifflin Harcourt, 2017.
  • Seed, Patricia. Ceremonies of Possession in Europe’s Conquest of the New World, 1492-1640. New York: Cambridge University Press, 1995.
  • Snyder, Christina. Slavery in Indian Country: The Changing Face of Captivity in Early America. Cambridge: Harvard University Press, 2010.
  • Socolow, Susan Migden. The Women of Colonial Latin America. New York: Cambridge University Press, 2000.
  • Stoler, Ann Laura. “Tense and Tender Ties: The Politics of Comparison in North American History and (Post) Colonial Studies.” Journal of American History 88, no. 3 (December 2001): 829-897.
  • Thornton, John. Africa and Africans in the Making of the Atlantic World, 1400-1800. New York: Cambridge University Press, 1992.
  • Warren, Wendy. New England Bound: Slavery and Colonization in Early America. New York: W. W. Norton, 2016.
  • Weimer, Adrian. Martyrs’ Mirror: Persecution and Holiness in Early New England. New York, NY: Oxford University Press, 2011.
  • White, Richard. The Middle Ground: Indians, Empires, and Republics in the Great Lakes Region, 1650-1815. New York: Cambridge University Press, 1991.

 

Notes

  1. Stanley L. Engerman and Robert E. Gallman, eds., The Cambridge Economic History of the United States, Vol. I: The Colonial Era (New York: Cambridge University Press, 1996), 21.
  2. Andrew L. Knaut, The Pueblo Revolt of 1680: Conquest and Resistance in Seventeenth-Century New Mexico (Norman: University of Oklahoma Press, 1995), 46.
  3. John E. Kicza and Rebecca Horn, Resilient Cultures: America’s Native Peoples Confront European Colonization, 1500-1800 (New York: Routledge, 2013), 122.
  4. Andrew L. Knaut, The Pueblo Revolt of 1680: Conquest and Resistance in Seventeenth-Century New Mexico (Norman: University of Oklahoma Press, 1995), 155.
  5. John Ponet, A Short Treatise on Political Power: And of the true Obedience which Subjects owe to Kings, and other civil Governors (London: 1556), 43-44.
  6. Alan Greer, The People of New France (Toronto: University of Toronto Press, 1997).
  7. Susan Sleeper-Smith, Indian Women and French Men: Rethinking Cultural Encounter in the Western Great Lakes (Amherst: University of Massachusetts Press, 2001).
  8. Carole Blackburn, Harvest of Souls: The Jesuit Missions and Colonialism in North America, 1632-1659 (Montreal: McGill-Queen’s University Press, 2000), 116.
  9. Richard White, The Middle Ground: Indians, Empires, and Republics in the Great Lakes Region, 1650-1815 (New York: Cambridge University Press, 1991).
  10. Evan Haefeli, New Netherland and the Dutch Origins of American Religious Liberty (Philadelphia: University of Pennsylvania Press, 2012), 20-53.
  11. Allen W. Trelease, Indian Affairs in Colonial New York: The Seventeenth Century (Lincoln: University of Nebraska Press, 1960/1997), 36.
  12. Daniel K. Richter, Trade, Land, Power: The Struggle for Eastern North America (Philadelphia: University of Pennsylvania Press, 2013), 101.
  13. Janny Venema, Beverwijck: A Dutch Village on the American Frontier, 1652-1664 (Albany: SUNY Press, 2003).
  14. Leslie M. Harris, In the Shadow of Slavery: African Americans in New York City, 1626-1863 (Chicago: University of Chicago Press, 2003), 21.
  15. Alida C. Metcalf, Go-betweens and the Colonization of Brazil: 1500–1600 (Austin: University of Texas Press, 2005). See also James H. Sweet, Recreating Africa: Culture, Kinship, and Religion in the African-Portuguese World, 1441-1770 (Chapel Hill: University of North Carolina Press, 2003).
  16. Edmund S. Morgan, American Slavery, American Freedom: The Ordeal of Colonial Virginia (New York: Norton, 1975), 30.
  17. John Walter, Crowds and Popular Politics in Early Modern England (Manchester: Manchester University Press, 2006), 131-135.
  18. Christopher Hodgkins, Reforming Empire: Protestant Colonialism and Conscience in British Literature (Columbia: University of Missouri Press, 2002), 15.
  19. Richard Hakluyt, Discourse on Western Planting (1584). Available online from archive.org: https://archive.org/details/discourseonweste02hakl_0.
  20. Edmund S. Morgan, American Slavery, American Freedom: The Ordeal of Colonial Virginia (New York: W.W. Norton & Co., 1975), 9.
  21. Felipe Fernández-Armesto, The Spanish Armada: The Experience of War in 1588 (New York: Oxford University Press, 1988).
  22. John Smith, Advertisements for the Inexperienced Planters of New England, or Anywhere, or The Pathway to Experience to Erect a Plantation (London: 1631), 16.
  23. George Percy, “A True Relation of the Proceedings and Occurrents of Moment which Have Hap’ned in Virginia,” in Jamestown Narratives: Eyewitness Accounts of the Virginia Colony, the First Decade, 1607–1617, ed. Edward Wright Haile (Champlain, Va.: Round House, 1998), 505.
  24. Eric A. Powell, “Chilling Discovery at Jamestown,” Archaeology (June 10, 2013). Available online at: www.archaeology.org/issues/96-1307/trenches/973-jamestown-starving-time-cannibalism.
  25. Dennis Montgomery, 1607: Jamestown and the New World (Williamsburg: The Colonial Williamsburg Foundation, 2007), 126.
  26. Daniel K. Richter, Facing East from Indian Country: A Native History of Early America (Cambridge: Harvard University Press, 2009), 75.
  27. Winthrop Jordan, White Over Black: American Attitudes Toward the Negro, 1550-1812 (Chapel Hill: University of North Carolina Press, 1968), 7.
  28. Ibid., 16.
  29. T. H. Breen and Stephen Innes, “Myne Owne Ground”: Race and Freedom on Virginia’s Eastern Shore, 1640-1676 (New York: Oxford University Press, 1980/2005).
  30. John Winthrop, A Modell of Christian Charity (1630), first published in Collections of the Massachusetts Historical Society (Boston, 1838), 3rd series 7:31-48. Available online at http://history.hanover.edu/texts/winthmod.html. Accessed July 1, 2015.
  31. Alan Taylor, American Colonies: The Settling of North America (New York: Penguin, 2001), 170.
  32. Virginia DeJohn Anderson, New England’s Generation: The Great Migration and the Formation of Society and Culture in the Seventeenth Century (New York: Cambridge University Press, 1991), 90-91.
  33. Increase Mather, A Testimony Against Several Prophane and Superstitious Customs, Now Practised by Some in New-England (London: 1687).
  34. Diary of Cotton Mather, 1709-1724 (Boston: Massachusetts Historical Society, 1912), 146.

30. The Recent Past

Supporters of defeated U.S. President Donald Trump cheer the breaching of the U.S. Capitol on January 6, 2021. Via Wikimedia.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

The U.S. Capitol was stormed on January 6, 2021. Thousands of right-wing protestors, fueled by an onslaught of lies and fabrications and conspiracy theories surrounding the November 2020 elections, rallied that morning in front of the White House to “Stop the Steal.” Repeating a familiar litany of lies and distortions, the sitting president of the United States then urged them to march on the Capitol and stop the certification of the November electoral vote. “You’ll never take back our country with weakness,” he said. “Fight like hell,” he said. “If you don’t fight like hell, you’re not going to have a country anymore.”1 And so they did. They marched on the Capitol, armed themselves with metal pipes, baseball bats, hockey sticks, pepper spray, stun guns, and flag poles, and attacked the police officers barricading the building.

“It was like something from a medieval battle,” Capitol Police Officer Aquilino Gonell recalled.2 The mob pulled D.C. Metropolitan Police Officer Michael Fanone into the crowd, beat him with flagpoles, and tasered him. “Kill him with his own gun,” Fanone remembered the mob shouting just before he lost consciousness. “I can still hear those words in my head today,” he testified six months later.3

The mob breached the barriers and poured into the building, marking perhaps the greatest domestic assault on the American federal government since the Civil War. But the events of January 6 were rooted in history.

Revolutionary technological change, unprecedented global flows of goods and people and capital, an amorphous decades-long War on Terror, accelerating inequality, growing diversity, a changing climate, political stalemate: our present is not an island of circumstance but a product of history. Time marches forever on. The present becomes the past, but, as William Faulkner famously put it, “The past is never dead. It’s not even past.”4 The last several decades of American history have culminated in the present, an era of innovation and advancement but also of stark partisan division, racial and ethnic tension, protests, gender divides, uneven economic growth, widening inequalities, military interventions, bouts of mass violence, and pervasive anxieties about the present and future of the United States. Through boom and bust, national tragedy, foreign wars, and the maturation of a new generation, a new chapter of American history is busy being written.

 

II. American Politics before September 11, 2001

The conservative Reagan Revolution lingered over the presidential election of 1988. At stake was the legacy of a newly empowered conservative movement, a movement that would move forward with Reagan’s vice president, George H. W. Bush, who triumphed over Massachusetts governor Michael Dukakis with a promise to continue the conservative work that had commenced in the 1980s.

The son of a U.S. senator from Connecticut, George H. W. Bush was a World War II veteran, president of a successful oil company, chair of the Republican National Committee, director of the CIA, and member of the House of Representatives from Texas. After failing to best Reagan in the 1980 Republican primaries, he was elected as his vice president in 1980 and again in 1984. In 1988, Michael Dukakis, a proud liberal from Massachusetts, challenged Bush for the White House.

Dukakis ran a weak campaign. Bush, a Connecticut aristocrat who had never been fully embraced by movement conservatism, particularly the newly animated religious right, nevertheless hammered Dukakis with moral and cultural issues. Bush said Dukakis had blocked recitation of the Pledge of Allegiance in Massachusetts schools and that he was a “card-carrying member” of the ACLU. Bush meanwhile dispatched his eldest son, George W. Bush, as his ambassador to the religious right.5 Bush also infamously released a political ad featuring the face of Willie Horton, a Black Massachusetts man and convicted murderer who raped a woman after being released through a prison furlough program during Dukakis’s tenure as governor. “By the time we’re finished,” Bush’s campaign manager, Lee Atwater, said, “they’re going to wonder whether Willie Horton is Dukakis’ running mate.”6 Liberals attacked conservatives for perpetuating the ugly “code word” politics of the old Southern Strategy—the underhanded appeal to white racial resentments perfected by Richard Nixon in the aftermath of civil rights legislation.7 Buoyed by such attacks, Bush won a large victory and entered the White House.

Bush’s election signaled Americans’ continued embrace of Reagan’s conservative program and further evidenced the utter disarray of the Democratic Party. American liberalism, so stunningly triumphant in the 1960s, was now in full retreat. It was still, as one historian put it, the “Age of Reagan.”8

The Soviet Union collapsed during Bush’s tenure. Devastated by a stagnant economy, mired in a costly and disastrous war in Afghanistan, confronted with dissident factions in Eastern Europe, and rocked by internal dissent, the Soviet Union crumbled. Soviet leader and reformer Mikhail Gorbachev loosened the Soviet Union’s tight restrictions on speech and the press (glasnost) and liberalized the Soviet political machinery (perestroika). Eastern Bloc nations turned against their communist organizations and declared their independence from the Soviet Union. Gorbachev let them go. Soon, the Soviet Union unraveled. On December 25, 1991, Gorbachev resigned his office, declaring that the Soviet Union no longer existed. At the Kremlin—Russia’s center of government—the new tricolor flag of the Russian Federation was raised.9

The dissolution of the Soviet Union left the United States as the world’s only remaining superpower. Global capitalism seemed triumphant. Observers wondered if some final stage of history had been reached, if the old battles had ended and a new global consensus built around peace and open markets would reign forever. “What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such,” wrote Francis Fukuyama in his much-talked-about 1989 essay, “The End of History?”10 Assets in Eastern Europe were privatized and auctioned off as newly independent nations introduced market economies. New markets were rising in Southeast Asia and Eastern Europe. India, for instance, began liberalizing its economic laws and opening itself up to international investment in 1991. China’s economic reforms, advanced by Chairman Deng Xiaoping and his handpicked successors, accelerated as privatization and foreign investment proceeded.

The post–Cold War world was not without international conflicts, however. When Iraq invaded the small but oil-rich nation of Kuwait in 1990, Congress granted President Bush approval to intervene. The United States laid the groundwork for intervention (Operation Desert Shield) in August and commenced combat operations (Operation Desert Storm) in January 1991. With the memories of Vietnam still fresh, many Americans were hesitant to support military action that could expand into a protracted war or long-term commitment of troops. But the Gulf War was a swift victory for the United States. New technologies—including laser-guided precision bombing—amazed Americans, who could now watch twenty-four-hour live coverage of the war on the Cable News Network (CNN). The Iraqi army disintegrated after only a hundred hours of ground combat. President Bush and his advisors opted not to pursue the war into Baghdad and risk an occupation and insurgency. And so the war was won. Many wondered if the “ghosts of Vietnam” had been exorcised.11 Bush won enormous popular support. Gallup polls showed a job approval rating as high as 89 percent in the weeks after the end of the war.12

During the Gulf War, the Iraqi military set fire to Kuwait’s oil fields, many of which burned for months. March 21, 1991. Wikimedia.

President Bush’s popularity seemed to suggest an easy reelection in 1992, but Bush had still not won over the New Right, the aggressively conservative wing of the Republican Party, despite his attacks on Dukakis, his embrace of the flag and the pledge, and his promise, “Read my lips: no new taxes.” He faced a primary challenge from political commentator Patrick Buchanan, a former Reagan and Nixon White House advisor, who cast Bush as a moderate, as an unworthy steward of the conservative movement who was unwilling to fight for conservative Americans in the nation’s ongoing culture war. Buchanan did not defeat Bush in the Republican primaries, but he inflicted enough damage to weaken his candidacy.13

Still thinking that Bush would be unbeatable in 1992, many prominent Democrats passed on a chance to run, and the Democratic Party nominated a relative unknown, Arkansas governor Bill Clinton. Dogged by charges of marital infidelity and draft dodging during the Vietnam War, Clinton was a consummate politician with enormous charisma and a skilled political team. He framed himself as a New Democrat, a centrist open to free trade, tax cuts, and welfare reform. Twenty-two years younger than Bush, he was the first baby boomer to make a serious run at the presidency. Clinton presented the campaign as a generational choice. During the campaign he appeared on MTV, played the saxophone on The Arsenio Hall Show, and told voters that he could offer the United States a new way forward.

Bush ran on his experience and against Clinton’s moral failings. The GOP convention in Houston that summer featured speeches from Pat Buchanan and religious leader Pat Robertson decrying the moral decay plaguing American life. Clinton was denounced as a social liberal who would weaken the American family through both his policies and his individual moral character. But Clinton was able to convince voters that his moderated southern brand of liberalism would be more effective than the moderate conservatism of George Bush. Bush’s candidacy, of course, was perhaps most damaged by a sudden economic recession. As Clinton’s political team reminded the country, “It’s the economy, stupid.”

Clinton won the election, but the Reagan Revolution still reigned. Clinton and his running mate, Tennessee senator Albert Gore Jr., both moderate southerners, promised a path away from the old liberalism of the 1970s and 1980s (and the landslide electoral defeats of the 1980s). They were Democrats, but conservative Democrats, so-called New Democrats. In his first term, Clinton set out an ambitious agenda that included an economic stimulus package, universal health insurance, a continuation of the Middle East peace talks initiated by Bush’s secretary of state James A. Baker III, welfare reform, and a completion of the North American Free Trade Agreement (NAFTA) to abolish trade barriers between the United States, Mexico, and Canada. His moves to reform welfare, open trade, and deregulate financial markets were particular hallmarks of Clinton’s Third Way, a new Democratic embrace of heretofore conservative policies.14

With NAFTA, Clinton reversed decades of Democratic opposition to free trade and opened the nation’s northern and southern borders to the free flow of capital and goods. Critics, particularly in the Midwest’s Rust Belt, blasted the agreement for opening American workers to competition by low-paid foreign workers. Many American factories relocated and set up shops—maquilas—in northern Mexico that took advantage of Mexico’s low wages. Thousands of Mexicans rushed to the maquilas. Thousands more continued on past the border.

If NAFTA opened American borders to goods and services, people still navigated strict legal barriers to immigration. Policy makers believed that free trade would create jobs and wealth that would incentivize Mexican workers to stay home, and yet multitudes continued to leave for opportunities in el norte. The 1990s proved that prohibiting illegal migration was, if not impossible, exceedingly difficult. Poverty, political corruption, violence, and hopes for a better life in the United States—or simply higher wages—continued to lure immigrants across the border. Between 1990 and 2010, the proportion of foreign-born individuals in the United States grew from 7.9 percent to 12.9 percent, and the number of undocumented immigrants tripled from 3.5 million to 11.2 million. While large numbers continued to migrate to traditional immigrant destinations—California, Texas, New York, Florida, New Jersey, and Illinois—the 1990s also witnessed unprecedented migration to the American South. Among the fastest-growing immigrant destination states were Kentucky, Tennessee, Arkansas, Georgia, and North Carolina, all of which had immigration growth rates in excess of 100 percent during the decade.15

In response to the continued influx of immigrants and the vocal complaints of anti-immigration activists, policy makers enacted initiatives such as Operation Gatekeeper and Operation Hold the Line, which attempted to make crossing the border prohibitively difficult. The new strategy “funneled” immigrants toward dangerous and remote crossing areas, where immigration officials hoped the brutal landscape would serve as a natural deterrent. It didn’t. By 2017, hundreds of immigrants were dying each year of drowning, exposure, and dehydration.16

Clinton, meanwhile, sought to carve out a middle ground in his domestic agenda. In his first weeks in office, Clinton reviewed Department of Defense policies restricting homosexuals from serving in the armed forces. He pushed through a compromise plan, Don’t Ask, Don’t Tell, that removed questions about sexual orientation from induction interviews but also required that gay military personnel keep their sexual orientation private. The policy alienated many: social conservatives were outraged, Clinton’s standing among conservative southerners suffered, and many liberals recoiled at the continuation of antigay discrimination.

In his first term, Clinton also put forward universal healthcare as a major policy goal, and first lady Hillary Rodham Clinton played a major role in the initiative. But the push for a national healthcare law collapsed on itself. Conservatives revolted, the healthcare industry flooded the airwaves with attack ads, Clinton struggled with congressional Democrats, and voters bristled. A national healthcare system was again defeated.

The midterm elections of 1994 were a disaster for the Democrats, who lost the House of Representatives for the first time since 1952. Congressional Republicans, led by Georgia congressman Newt Gingrich and Texas congressman Dick Armey, offered a policy agenda they called the Contract with America. Republican candidates from around the nation gathered on the steps of the Capitol to pledge their commitment to a conservative legislative blueprint to be enacted if the GOP won control of the House. The strategy worked.

Social conservatives were mobilized by an energized group of religious activists, especially the Christian Coalition, led by Pat Robertson and Ralph Reed. Robertson was a television minister and entrepreneur whose 1988 long shot run for the Republican presidential nomination brought him a massive mailing list and a network of religiously motivated voters around the country. From that mailing list, the Christian Coalition organized around the country, seeking to influence politics on the local and national level.

In 1996 the generational contest played out again when the Republicans nominated another aging war hero, Senator Bob Dole of Kansas, but Clinton again won the election, becoming the first Democrat elected to back-to-back terms since Franklin Roosevelt. He was aided in part by having mollified conservatives by signing welfare reform legislation, the Personal Responsibility and Work Opportunity Reconciliation Act of 1996, which decreased welfare benefits, restricted eligibility, and turned over many responsibilities to states. Clinton said it would “break the cycle of dependency.”17

Clinton presided over a booming economy fueled by emergent computing technologies. Personal computer sales skyrocketed, and the Internet became a mass phenomenon. Communication and commerce were never again the same. Business drove the tech boom, and the 1990s saw robust innovation and entrepreneurship. Investors scrambled to find the next Microsoft or Apple, companies that had suddenly grown into computing giants. But it was the Internet that sparked the bonanza: the dot-com boom fueled enormous economic growth and rampant financial speculation as investors hunted for the next Google or Amazon.

Republicans, defeated at the polls in 1996 and 1998, looked for other ways to undermine Clinton’s presidency. Political polarization seemed unprecedented and a sensation-starved, post-Watergate media demanded scandal. The Republican Congress spent millions on investigations hoping to uncover some shred of damning evidence to sink Clinton’s presidency, whether it be real estate deals, White House staffing, or adultery. Rumors of sexual misconduct had always swirled around Clinton. The press, which had historically turned a blind eye to such private matters, saturated the media with Clinton’s sex scandals. Congressional investigations targeted the allegations and Clinton denied having “sexual relations” with Monica Lewinsky before a grand jury and in a statement to the American public. Republicans used the testimony to allege perjury. In December 1998, the House of Representatives voted to impeach the president. It was a wildly unpopular step. Two thirds of Americans disapproved, and a majority told Gallup pollsters that Republicans had abused their constitutional authority. Clinton’s approval rating, meanwhile, jumped to 78 percent.18 In February 1999, Clinton was acquitted by the Senate by a vote that mostly fell along party lines.

The 2000 election pitted Vice President Albert Gore Jr. against George W. Bush, the twice-elected Texas governor and son of the former president. Gore, wary of Clinton’s recent impeachment despite Clinton’s enduring approval ratings, distanced himself from the president and eight years of relative prosperity. Instead, he ran as a pragmatic, moderate liberal. Bush, too, ran as a moderate, claiming to represent a compassionate conservatism and a new faith-based politics. Bush was an outspoken evangelical. In a Republican primary debate, he declared Jesus Christ his favorite political philosopher. He promised to bring church leaders into government, and his campaign appealed to churches and clergy to get out the vote. Moreover, he promised to bring honor, dignity, and integrity to the Oval Office, a clear reference to Clinton. Utterly lacking the political charisma that had propelled Clinton, Gore withered under Bush’s attacks. Instead of trumpeting the Clinton presidency, Gore found himself answering the media’s questions about whether he was sufficiently an alpha male and whether he had invented the Internet.

Few elections have been as close and contentious as the 2000 election, which ended in a deadlock. Gore had won the popular vote by 500,000 votes, but the Electoral College hinged on a contested Florida election. On election night the media called Florida for Gore, but then Bush made late gains and news organizations reversed themselves by declaring the state for Bush—and Bush the probable president-elect. Gore conceded privately to Bush, then backpedaled as the counts edged back toward Gore yet again. When the nation awoke the next day, it was unclear who had been elected president. The close Florida vote triggered an automatic recount.

Lawyers descended on Florida. The Gore campaign called for manual recounts in several counties. Local election boards, Florida Secretary of State Katherine Harris, and the Florida Supreme Court all weighed in until the U.S. Supreme Court stepped in and, in an unprecedented 5–4 decision in Bush v. Gore, ruled that the recount had to end. Bush was awarded Florida by a margin of 537 votes, enough to win him the state and give him a majority in the Electoral College. He had won the presidency.

In his first months in office, Bush fought to push forward enormous tax cuts skewed toward America’s highest earners. The bursting of the dot-com bubble weighed down the economy. Old political and cultural fights continued to be fought. And then the towers fell.

 

III. September 11 and the War on Terror

On the morning of September 11, 2001, nineteen operatives of the al-Qaeda terrorist organization hijacked four passenger planes on the East Coast. American Airlines Flight 11 crashed into the North Tower of the World Trade Center in New York City at 8:46 a.m. Eastern Daylight Time (EDT). United Airlines Flight 175 crashed into the South Tower at 9:03. American Airlines Flight 77 crashed into the western façade of the Pentagon at 9:37. At 9:59, the South Tower of the World Trade Center collapsed. At 10:03, United Airlines Flight 93 crashed in a field outside Shanksville, Pennsylvania, brought down by passengers who had received news of the earlier hijackings. At 10:28, the North Tower collapsed. In less than two hours, nearly three thousand Americans had been killed.

Photograph of the smoldering ruins of the twin towers six days after the September 11th attacks.

Ground Zero six days after the September 11th attacks. Wikimedia.

The attacks stunned Americans. Late that night, Bush addressed the nation and assured the country that “the search is under way for those who are behind these evil acts.” At Ground Zero three days later, Bush thanked first responders for their work. When a worker said he couldn’t hear him, Bush shouted back, “I can hear you! The rest of the world hears you. And the people who knocked these buildings down will hear all of us soon.”

Photograph of President Bush addressing rescue workers at Ground Zero of the World Trade Center disaster.

President Bush addresses rescue workers at Ground Zero. 2001. FEMA Photo Library.

American intelligence agencies quickly identified the radical Islamic militant group al-Qaeda, led by the wealthy Saudi Osama bin Laden, as the perpetrators of the attack. Sheltered in Afghanistan by the Taliban, the country’s Islamic government, al-Qaeda was responsible for a 1993 bombing of the World Trade Center and a string of attacks at U.S. embassies and military bases across the world. Bin Laden’s Islamic radicalism and his anti-American aggression attracted supporters across the region and, by 2001, al-Qaeda was active in over sixty countries.

Although in his presidential campaign Bush had denounced foreign nation-building, he populated his administration with neoconservatives, firm believers in the expansion of American democracy and American interests abroad. Bush advanced what was sometimes called the Bush Doctrine, a policy in which the United States would have the right to unilaterally and preemptively make war on any regime or terrorist organization that posed a threat to the United States or to U.S. citizens. It would lead the United States into protracted conflicts in Afghanistan and Iraq and entangle the United States in nations across the world. Journalist Dexter Filkins called it a Forever War, a perpetual conflict waged against an amorphous and undefeatable enemy.19 The geopolitical realities of the twenty-first-century world were forever transformed.

The United States, of course, had a history in Afghanistan. When the Soviet Union invaded Afghanistan in December 1979 to quell an insurrection that threatened to topple Kabul’s communist government, the United States financed and armed anti-Soviet insurgents, the Mujahideen. In 1981, the Reagan administration authorized the CIA to provide the Mujahideen with weapons and training to strengthen the insurgency. An independently wealthy young Saudi, Osama bin Laden, also fought with and funded the Mujahideen. And they began to win. Afghanistan bled the Soviet Union dry. The costs of the war, coupled with growing instability at home, convinced the Soviets to withdraw from Afghanistan in 1989.20

Osama bin Laden relocated al-Qaeda to Afghanistan after the country fell to the Taliban in 1996. Under Bill Clinton, the United States launched cruise missiles at al-Qaeda camps in Afghanistan in retaliation for al-Qaeda bombings on American embassies in Africa.

After September 11, with a broad authorization of military force, Bush administration officials made plans for military action against al-Qaeda and the Taliban. What would become the longest war in American history began with the launching of Operation Enduring Freedom in October 2001. Air and missile strikes hit targets across Afghanistan. U.S. Special Forces joined with fighters in the anti-Taliban Northern Alliance. Major Afghan cities fell in quick succession. The capital, Kabul, fell on November 13. Bin Laden and al-Qaeda operatives retreated into the rugged mountains along the border of Pakistan in eastern Afghanistan. The American occupation of Afghanistan continued.

As American troops struggled to contain the Taliban in Afghanistan, the Bush administration set its sights on Iraq. After the conclusion of the Gulf War in 1991, American officials established economic sanctions, weapons inspections, and no-fly zones. By mid-1991, American warplanes were routinely patrolling Iraqi skies and coming under periodic fire from Iraqi missile batteries. The overall cost to the United States of maintaining the two no-fly zones over Iraq was roughly $1 billion a year. Related military activities in the region added almost another $500 million to the annual bill. On the ground in Iraq, meanwhile, Iraqi authorities clashed with UN weapons inspectors. Iraq had suspended its program for weapons of mass destruction, but Saddam Hussein fostered ambiguity about the weapons in the minds of regional leaders to forestall any possible attacks against Iraq.

In 1998, a standoff between Hussein and the United Nations over weapons inspections led President Bill Clinton to launch punitive strikes aimed at debilitating what was thought to be a developed chemical weapons program. Attacks began on December 16, 1998. More than two hundred cruise missiles fired from U.S. Navy warships and Air Force B-52 bombers flew into Iraq, targeting suspected chemical weapons storage facilities, missile batteries, and command centers. Airstrikes continued for three more days, unleashing in total 415 cruise missiles and 600 bombs against 97 targets. The number of bombs dropped was nearly double the number used in the 1991 conflict.

The United States and Iraq remained at odds throughout the 1990s and early 2000s, when Bush administration officials began championing “regime change.” The administration publicly denounced Saddam Hussein’s regime and its alleged weapons of mass destruction, and it deceptively tied Hussein to international terrorists; a majority of Americans came to link Hussein to the 9/11 attacks.21 The administration’s push for war was in full swing. Protests broke out across the country and all over the world, but majorities of Americans supported military action. In October 2002, Congress passed the Authorization for Use of Military Force Against Iraq resolution, giving Bush the power to make war in Iraq. Iraq began cooperating with UN weapons inspectors in late 2002, but the Bush administration pressed on. On February 5, 2003, Secretary of State Colin Powell, who had risen to public prominence as chairman of the Joint Chiefs of Staff during the Persian Gulf War in 1991, presented allegations of a robust Iraqi weapons program to the UN. Protests continued.

The first American bombs hit Baghdad on March 20, 2003. Several hundred thousand troops moved into Iraq and Hussein’s regime quickly collapsed. Baghdad fell on April 9. On May 1, 2003, aboard the USS Abraham Lincoln, beneath a banner reading Mission Accomplished, George W. Bush announced that “major combat operations in Iraq have ended.”22 No evidence of weapons of mass destruction was ever found. And combat operations had not ended, not really. The Iraqi insurgency had begun, and the United States would spend the next ten years struggling to contain it.

Photograph of a 2003 celebration aboard an aircraft carrier that featured a banner saying "MISSION ACCOMPLISHED"

Despite George W. Bush’s ill-conceived photo op under a Mission Accomplished banner in May 2003, combat operations in Iraq continued for years. Wikimedia.

Efforts by various intelligence gathering agencies led to the capture of Saddam Hussein, hidden in an underground compartment near his hometown, on December 13, 2003. The new Iraqi government found him guilty of crimes against humanity and he was hanged on December 30, 2006. But the war in Iraq was not over.

 

IV. The End of the Bush Years

The War on Terror was a centerpiece in the race for the White House in 2004. The Democratic ticket, headed by Massachusetts senator John F. Kerry, a Vietnam War hero who had entered the public consciousness through his subsequent testimony against that war, attacked Bush for the ongoing inability to contain the Iraqi insurgency or to find weapons of mass destruction, for the revelation, backed by photographic evidence, that American soldiers had abused prisoners at the Abu Ghraib prison outside Baghdad, and for the failure to find Osama bin Laden. Moreover, many enemy combatants who had been captured in Iraq and Afghanistan were “detained” indefinitely at a military prison in Guantanamo Bay, Cuba. “Gitmo” became infamous for its harsh treatment, indefinite detentions, and torture of prisoners. Bush defended the War on Terror, and his allies attacked critics for failing to “support the troops.” Kerry, moreover, had voted for the war and so had to attack the very thing that he had authorized. Bush won a close but clear victory.

The second Bush term saw the continued deterioration of the wars in Iraq and Afghanistan, but Bush’s presidency would take a bigger hit from his perceived failure to respond to the domestic tragedy that followed Hurricane Katrina’s devastating hit on the Gulf Coast. Katrina had been a category 5 hurricane. It was, the New Orleans Times-Picayune reported, “the storm we always feared.”23

New Orleans suffered a direct hit, the levees broke, and the bulk of the city flooded. Thousands of refugees flocked to the Superdome, where supplies, medical treatment, and evacuation were slow to come. Individuals died in the heat. Bodies wasted away. Americans saw poor Black Americans abandoned. Katrina became a symbol of a broken administrative system, a devastated coastline, and irreparable social structures that allowed escape and recovery for some and not for others. Critics charged that Bush had staffed his administration with incompetent supporters and had further ignored the displaced poor and Black residents of New Orleans.24

Photograph of hundreds of refugees from Hurricane Katrina living on cots in the Houston Astrodome.

Hurricane Katrina was one of the deadliest and most destructive hurricanes in U.S. history. It nearly destroyed New Orleans, Louisiana, as well as cities, towns, and rural areas across the Gulf Coast. It sent hundreds of thousands of refugees to nearby cities like Houston, Texas, where they temporarily resided in massive structures like the Astrodome. Photograph, September 1, 2005. Wikimedia.

Immigration, meanwhile, had become an increasingly potent political issue. The Clinton administration had overseen the implementation of several anti-immigration policies on the U.S.-Mexico border, but hunger and poverty were stronger incentives than border enforcement policies were deterrents. Illegal immigration continued, often at great human cost, and fanned widespread anti-immigration sentiment among many American conservatives. George W. Bush used the issue to win reelection, and Republicans used it in the 2006 midterms, passing legislation—with bipartisan support—that provided for a border “fence.” Seven hundred miles of towering steel barriers sliced through border towns and deserts. Many immigrants and their supporters tried to fight back. The spring and summer of 2006 saw waves of protests across the country. Hundreds of thousands marched in Chicago, New York, and Los Angeles, and tens of thousands marched in smaller cities around the country. Legal change, however, went nowhere. Moderate conservatives feared upsetting business interests’ demand for cheap, exploitable labor and alienating large voting blocs by stifling immigration, and moderate liberals feared upsetting anti-immigrant groups by pushing too hard for liberalization of immigration laws. The fence was built and the border was tightened.

At the same time, Iraq descended further into chaos as insurgents battled against American troops and groups such as Abu Musab al-Zarqawi’s al-Qaeda in Iraq bombed civilians and released video recordings of beheadings. In 2007, twenty-seven thousand additional U.S. forces deployed to Iraq under the command of General David Petraeus. The effort, “the surge,” employed more sophisticated anti-insurgency strategies and, combined with Sunni efforts, pacified many of Iraq’s cities and provided cover for the withdrawal of American forces. On December 4, 2008, the Iraqi government approved the U.S.-Iraq Status of Forces Agreement, and U.S. combat forces withdrew from Iraqi cities before June 30, 2009. The last U.S. combat forces left Iraq on December 18, 2011. Violence and instability continued to rock the country.

Afghanistan, meanwhile, had also continued to deteriorate. In 2006, the Taliban reemerged, as the Afghan government proved both highly corrupt and incapable of providing social services or security for its citizens. The Taliban began re-acquiring territory. Money and American troops continued to prop up the Afghanistan government until American forces withdrew hastily in August 2021. The Taliban immediately took over the remainder of the country, outlasting America’s twenty-year occupation.

 

V. The Great Recession

The Great Recession began, as most American economic catastrophes begin, with the bursting of a speculative bubble. Throughout the 1990s and into the new millennium, home prices continued to climb, and financial services firms looked to cash in on what seemed to be a safe but lucrative investment. After the dot-com bubble burst, investors searched for a secure investment rooted in clear value rather than in trendy technological speculation. What could be more secure than real estate? But mortgage companies began writing increasingly risky loans and then bundling them together and selling them over and over again, sometimes so quickly that it became difficult to determine exactly who owned what.

Decades of financial deregulation had rolled back Depression-era restraints and again allowed risky business practices to dominate the world of American finance. It was a bipartisan agenda. In the 1990s, for instance, Bill Clinton signed the Gramm-Leach-Bliley Act, repealing provisions of the 1933 Glass-Steagall Act separating commercial and investment banks, and the Commodity Futures Modernization Act, which exempted credit-default swaps—perhaps the key financial mechanism behind the crash—from regulation.

Mortgages had been so heavily leveraged that when American homeowners began to default on their loans, the whole system collapsed. Major financial services firms such as Bear Stearns and Lehman Brothers disappeared almost overnight. To prevent the crisis from spreading, President Bush signed the Emergency Economic Stabilization Act, and the federal government immediately began pouring billions of dollars into the industry, propping up hobbled banks. The massive giveaways to bankers created shock waves of resentment throughout the rest of the country, contributing to Obama’s 2008 election. Obama continued to oversee the bailout after his inauguration, and conservative members of the emerging Tea Party decried the cronyism of an Obama administration filled with former Wall Street executives. The same energies also motivated the Occupy Wall Street movement, as mostly young, left-leaning New Yorkers protested an American economy that seemed overwhelmingly tilted toward “the one percent.”25

The Great Recession only magnified already rising income and wealth inequalities. According to the chief investment officer at JPMorgan Chase, the largest bank in the United States, “profit margins have reached levels not seen in decades,” and “reductions in wages and benefits explain the majority of the net improvement.”26 A study from the Congressional Budget Office (CBO) found that since the late 1970s, the after-tax income of the wealthiest 1 percent had grown by over 300 percent, while the “average” American’s after-tax income had grown 35 percent. Economic trends had disproportionately benefited the wealthiest Americans. Still, despite political rhetoric, American frustration failed to generate anything like the social unrest of the early twentieth century. A weakened labor movement and a strong conservative bloc continued to stymie serious attempts at reversing or even slowing economic inequalities. Occupy Wall Street managed to generate a fair number of headlines and shift public discussion away from budget cuts and toward inequality, but its membership amounted to only a fraction of the far more influential and money-driven Tea Party. Its presence on the public stage was fleeting.

The Great Recession, however, was not. While American banks quickly recovered and recaptured their steady profits, and the American stock market climbed again to new heights, American workers continued to lag. Job growth was slow and unemployment rates would remain stubbornly high for years. Wages froze, meanwhile, and well-paying full-time jobs that were lost were too often replaced by low-paying, part-time work. A generation of workers coming of age within the crisis, moreover, had been savaged by the economic collapse. Unemployment among young Americans hovered for years at rates nearly double the national average.

 

VI. The Obama Years

Photograph of 5-year-old Jacob Philadelphia touching President Barack Obama's hair. The child said, “I want to know if my hair is just like yours.”

In 2008, Barack Obama became the first African American elected to the presidency. In this official White House photo from May 2009, 5-year-old Jacob Philadelphia said, “I want to know if my hair is just like yours.” The White House via Flickr.

By the 2008 election, with Iraq still in chaos, Democrats were ready to embrace the antiwar position and sought a candidate who had consistently opposed military action in Iraq. Senator Barack Obama had been only a member of the Illinois state senate when Congress debated war with Iraq, but he had publicly denounced the war, predicting the sectarian violence that would ensue, and remained critical of the invasion through his 2004 campaign for the U.S. Senate. He began running for president almost immediately after arriving in Washington.

A former law professor and community organizer, Obama became the first African American candidate ever to capture the nomination of a major political party.27 During the election, Obama won the support of an increasingly antiwar electorate. When an already fragile economy finally collapsed in 2007 and 2008, Bush’s policies were widely blamed. Obama’s opponent, Republican senator John McCain, was tied to those policies and struggled to fight off the nation’s desire for a new political direction. Obama won a convincing victory in the fall and became the nation’s first African American president.

President Obama’s first term was marked by domestic affairs, especially his efforts to combat the Great Recession and to pass a national healthcare law. Obama came into office as the economy continued to deteriorate. He continued the bank bailout begun under his predecessor and launched a limited economic stimulus plan to provide government spending to reignite the economy.

Despite Obama’s dominant electoral victory, national politics fractured, and a conservative Republican firewall quickly arose against the Obama administration. The Tea Party became a catch-all term for a diffuse movement of fiercely conservative and politically frustrated American voters. Typically whiter, older, and richer than the average American, flush with support from wealthy backers, and clothed with the iconography of the Founding Fathers, Tea Party activists registered their deep suspicions of the federal government.28 Tea Party protests dominated the public eye in 2009 and activists steered the Republican Party far to the right, capturing primary elections all across the country.

Obama’s most substantive legislative achievement proved to be a national healthcare law, the Patient Protection and Affordable Care Act (Obamacare). Presidents since Theodore Roosevelt had striven to pass national healthcare reform and failed. Obama’s plan forsook liberal models of a national healthcare system and instead adopted a heretofore conservative model of subsidized private care (similar plans had been put forward by Republicans Richard Nixon, Newt Gingrich, and Obama’s 2012 opponent, Mitt Romney). Beset by conservative protests, Obama’s healthcare reform narrowly passed through Congress. It abolished pre-existing conditions as a cause for denying coverage, scrapped junk plans, provided for state-run healthcare exchanges (allowing individuals without healthcare to pool their purchasing power), offered states funds to subsidize an expansion of Medicaid, and required all Americans to carry a health insurance plan that measured up to government-established standards (those who did not purchase a plan would pay a penalty tax, and those who could not afford insurance would be eligible for federal subsidies). The number of uninsured Americans remained stubbornly high, however, and conservatives spent most of the next decade attacking the bill.

In 2009, meanwhile, President Obama deployed seventeen thousand additional troops to Afghanistan as part of a counterinsurgency campaign that aimed to “disrupt, dismantle, and defeat” al-Qaeda and the Taliban, while U.S. Special Forces and CIA drones targeted al-Qaeda and Taliban leaders. In May 2011, U.S. Navy SEALs conducted a raid deep into Pakistan that led to the killing of Osama bin Laden. The United States and NATO began a phased withdrawal from Afghanistan in 2011, with an aim of removing all combat troops by 2014. Although weak militarily, the Taliban remained politically influential in southern and eastern Afghanistan. Al-Qaeda remained active in Pakistan but shifted its bases to Yemen and the Horn of Africa. As of December 2013, the war in Afghanistan had claimed the lives of 3,397 U.S. service members.

Photograph of former Taliban fighters surrendering their arms to the government of the Islamic Republic of Afghanistan during a reintegration ceremony.

Former Taliban fighters surrender their arms to the government of the Islamic Republic of Afghanistan during a reintegration ceremony at the provincial governor’s compound in May 2012. Wikimedia.

 

VII. Stagnation

In 2012, Barack Obama won a second term by defeating Republican Mitt Romney, the former governor of Massachusetts. However, Obama’s inability to control Congress and the ascendancy of Tea Party Republicans stunted the passage of meaningful legislation. Obama was a lame duck before he ever won reelection, and gridlocked government came to represent an acute sense that much of American life—whether in politics, economics, or race relations—had grown stagnant.

The economy continued its halfhearted recovery from the Great Recession. The Obama administration proposed little that specifically addressed the crisis and, faced with congressional intransigence, accomplished even less. While corporate profits climbed and stock markets soared, wages stagnated and employment sagged for years after the Great Recession. By 2016, the statistically average American worker had not received a raise in almost forty years. The average worker in January 1973 earned $4.03 an hour. Adjusted for inflation, that wage was about two dollars per hour more than the average American earned in 2014. Working Americans were losing ground. Moreover, most income gains in the economy had been captured by a small number of wealthy earners. Between 2009 and 2013, 85 percent of all new income in the United States went to the top 1 percent of the population.29

But if money no longer flowed to American workers, it saturated American politics. In 2000, George W. Bush raised a record $172 million for his campaign. In 2008, Barack Obama became the first presidential candidate to decline public funds (removing any applicable caps on his total fund-raising) and raised nearly three quarters of a billion dollars for his campaign. The average House seat, meanwhile, cost about $1.6 million, and the average Senate seat over $10 million.30 The Supreme Court, for its part, removed barriers to outside political spending. In 2002, Senators John McCain and Russ Feingold had crossed party lines to pass the Bipartisan Campaign Reform Act, bolstering campaign finance laws passed in the aftermath of the Watergate scandal in the 1970s. But political organizations—particularly PACs—exploited loopholes to raise large sums of money and, in 2010, the Supreme Court ruled in Citizens United v. FEC that no limits could be placed on political spending by corporations, unions, and nonprofits. Money flowed even deeper into politics.

The influence of money in politics only heightened partisan gridlock, further blocking bipartisan progress on particular political issues. Climate change, for instance, has failed to transcend partisan barriers. In the 1970s and 1980s, scientific experts and panels substantiated the theory of anthropogenic (human-caused) global warming. Eventually, the most influential of these panels, the UN’s Intergovernmental Panel on Climate Change (IPCC), concluded in 1995 that there was a “discernible human influence on global climate.”31 This conclusion, though stated conservatively, was by that point essentially a scientific consensus. By 2007, the IPCC considered the evidence “unequivocal” and warned that “unmitigated climate change would, in the long term, be likely to exceed the capacity of natural, managed and human systems to adapt.”32

Climate change became a permanent and major topic of public discussion and policy in the twenty-first century. Fueled by popular coverage, most notably, perhaps, the documentary An Inconvenient Truth, based on Al Gore’s book and presentations of the same name, addressing climate change became a plank of the American left and a point of denial for the American right. American public opinion and political action still lagged far behind the scientific consensus on the dangers of global warming. Conservative politicians, conservative think tanks, and energy companies waged war to sow doubt in the minds of Americans, who remain divided on this question, as on so many others.

Much of the resistance to addressing climate change is economic. As Americans looked over their shoulder at China, many refused to sacrifice immediate economic growth for long-term environmental security. Twenty-first-century relations with China remained characterized by contradictions and interdependence. After the collapse of the Soviet Union, China reinvigorated its efforts to modernize. By liberalizing and subsidizing much of its economy and drawing enormous foreign investments, China has posted massive growth rates during the last several decades. Enormous cities rise by the day. In 2000, China had a GDP around an eighth the size of U.S. GDP. Based on growth rates and trends, analysts suggest that China’s economy will soon surpass that of the United States. American concerns about China’s political system have persisted, but money sometimes matters more to Americans. China has become one of the country’s leading trade partners. Cultural exchange has increased, and more and more Americans visit China each year, with many settling down to work and study.

 

VIII. American Carnage

By 2016, American voters were fed up. In that year’s presidential race, Republicans spurned their political establishment and nominated a real estate developer and celebrity billionaire, Donald Trump, who, decrying the tyranny of political correctness and vowing to Make America Great Again, promised to build a wall to keep out Mexican immigrants and to bar Muslim immigrants. The Democrats, meanwhile, flirted with the candidacy of Senator Bernie Sanders, a self-described democratic socialist from Vermont, before ultimately nominating Hillary Clinton, who, after eight years as first lady in the 1990s, had served eight years in the Senate and four more as secretary of state. Voters despaired: Trump and Clinton were the most unpopular nominees in modern American history. Majorities of Americans viewed each candidate unfavorably, and majorities in both parties said, early in the election season, that they were motivated more by voting against their rival candidate than for their own.33 With incomes frozen, politics gridlocked, race relations tense, and headlines full of violence, such frustrations channeled a larger sense of stagnation that upset traditional political allegiances. In the end, despite winning nearly three million more votes nationwide, Clinton failed to carry key Midwestern states where frustrated white, working-class voters abandoned the Democratic Party—a Republican presidential candidate hadn’t carried Wisconsin, Michigan, or Pennsylvania since the 1980s—and swung their support to the Republicans. Donald Trump won the presidency.

Donald Trump speaks at a 2018 rally.

Donald Trump speaking at a 2018 rally. Photo by Gage Skidmore. Via Wikimedia.

Political divisions only deepened after the election. A nation already deeply split by income, culture, race, geography, and ideology continued to come apart. Trump’s presidency consumed national attention. Traditional print media and the consumers and producers of social media could not help but throw themselves at the ins and outs of Trump’s norm-smashing first years while seemingly refracting every major event through the prism of the Trump presidency. Robert Mueller’s investigation of Russian election-meddling and the alleged collusion of campaign officials in that effort produced countless headlines.

New policies, meanwhile, inflamed widening cultural divisions. Border apprehensions and deportations had reached record levels under the Obama administration, and Trump pushed even further. He pushed for a massive wall along the border to supplement the fence built under the Bush administration. He moved to end deportation protections for so-called Dreamers—young people who had been brought to the United States as children—and immigration officials separated asylum-seeking parents and children at the border. Trump’s border policies heartened his base and aggravated his opponents. While Trump inflamed America’s enduring culture war, his narrowly passed 2017 tax cut continued the redistribution of American wealth toward corporations and wealthy individuals. The tax cut grew the federal deficit and further exacerbated America’s widening economic inequality.

In his inaugural address, Donald Trump promised to end what he called “American carnage”—a nation ravaged, he said, by illegal immigrants, crime, and foreign economic competition. But under his presidency the nation only spiraled deeper into cultural and racial divisions, domestic unrest, and growing anxiety about the nation’s future. Trump represented an aggressive, pugilistic anti-liberalism and, as president, never missed an opportunity to fuel the fires of right-wing rage. Refusing to settle for the careful statement or defer to bureaucrats, Trump smashed many of the norms of the presidency and raged on his personal Twitter account. And he refused to be governed by the truth.

Few Americans, especially after the Johnson and Nixon administrations, believed that presidents never lied. But perhaps no president ever lied so boldly or so often as Donald Trump, who made, according to one accounting, an untrue statement every day for the first forty days of his presidency.34 By the latter years of his presidency, only about a third of Americans counted him as trustworthy.35 And that compulsive dishonesty led directly to January 6, 2021.

In November 2020, Joseph R. Biden, a longtime senator from Delaware and former vice president under Barack Obama, running alongside Kamala Harris, a California senator who would become the nation’s first female vice president, convincingly defeated Donald Trump at the polls: Biden won the popular vote by a margin of four percent and the electoral vote by a margin of 74 votes, marking the first time an incumbent president had been defeated in nearly thirty years. But Trump refused to concede the election. He said it had been stolen. He said votes had been manufactured. He said it was all rigged. The claims were easily debunked, but it didn’t seem to matter: months after the election, somewhere between one-half and two-thirds of self-identified Republicans judged the election stolen.36 So when, on the afternoon of January 6, 2021, the president again articulated a litany of lies about the election and told the crowd of angry, conspiracy-minded protesters to march to the Capitol and “fight like hell,” they did.

Thousands of Trump’s followers converged on the Capitol. Roughly one in seven of the more than 500 rioters later arrested were affiliated with extremist groups organized around conspiracy theories, white supremacy, and the right-wing militia movement.37 They waved American and Confederate flags, displayed conspiracy theory slogans and white supremacist icons, carried Christian iconography, and, above all, bore flags, hats, shirts, and other gear emblazoned with the name of Donald Trump.38 Arming themselves for hand-to-hand combat, they pushed past barriers and battled barricaded police officers, injuring about 150 of them.39 Officers suffered concussions, burns, bruises, stab wounds, and broken bones.40 One suffered a non-fatal heart attack after being shocked repeatedly by a stun gun. Capitol Police officer Brian D. Sicknick, sprayed with a chemical irritant during the melee, suffered two strokes and died the following day. Four other officers later died by suicide.

As the rioters breached the building, officers inside the House chamber moved furniture to barricade the doors as House members huddled together on the floor, waiting for a breach. Ashli Babbitt, a thirty-five-year-old Air Force veteran consumed by social media conspiracy theories and wearing a Trump flag around her neck, was shot and killed by a Capitol Police officer when she attempted to storm the chamber. The House chamber held, but attackers breached the Senate chamber on the opposite end of the building. Lawmakers had already been evacuated.

The rioters held the Capitol for several hours before the National Guard cleared it that evening. Congress, refusing to back down, stayed into the night to certify the results of the election. And yet, despite everything that had happened that day, the president’s unfounded claims of election fraud kept their grip on Republican lawmakers. Eleven Republican senators and 150 of the House’s 212 Republicans lodged objections to the certification. And a little more than a month later, the Senate declined to convict Donald Trump in his quickly organized second impeachment trial, this time for “incitement of insurrection.”

 

IX. The Pandemic

In the winter of 2019 and 2020, a novel coronavirus, so named for its spiky, crown-like appearance under a microscope, emerged in Wuhan, China, causing a new respiratory disease that came to be called Covid-19. Other coronaviruses had been identified and contained in previous years, but by December Chinese doctors were treating dozens of cases and, by January, hundreds. Wuhan shut down to contain the outbreak, but the virus escaped. In January, the United States confirmed its first case. Deaths were reported in the Philippines and in France. Outbreaks struck Italy and Iran. And American case counts grew. Countries began locking down. Air travel slowed.

The virus was highly contagious and could be spread before the onset of symptoms. Many who had the virus were asymptomatic: they didn’t exhibit any symptoms at all. But others, especially the elderly and those with “comorbidities,” were struck down. The virus attacked their airways, suffocating them. Doctors didn’t know what they were battling. They struggled to procure oxygen and ventilators and intubated the worst cases with what they had. But the deaths piled up.

The virus hit New York City in the spring. The city was devastated. Hospitals overflowed as doctors struggled to treat a disease they barely understood. By April, thousands of patients were dying every day. The city couldn’t keep up with the bodies. Dozens of “mobile morgues” were set up to house bodies which wouldn’t be processed for months.41

With medical-grade masks in short supply, Americans made their own homemade cloth masks. Many right-wing Americans notably refused to wear them at all, further exposing workers and family members to the virus.

Failing to contain the outbreak, the country shut down. Flights stopped. Schools and restaurants closed. White-collar workers transitioned to working from home when offices shut down. But others weren’t so lucky. By April, 10 million Americans had lost their jobs.42

But shutdowns were scattered and incomplete. States were left to fend for themselves, setting their own policies and competing with one another to acquire scarce personal protective equipment (PPE). Many workers couldn’t stay home. Hourly workers, lacking paid sick leave, often faced a choice between losing a paycheck and reporting to work even after being exposed or while showing symptoms. Mask-wearing, meanwhile, was politicized. By May, 100,000 Americans were dead. A new wave of cases hit the South in July and August, overwhelming hospitals across much of the region. But the worst came in the winter, when the outbreak went fully national. Hundreds of thousands tested positive for the virus every day, and nearly three thousand Americans died every day throughout January and much of February.

The outbreak retreated in the spring, and pharmaceutical labs, flush with federal dollars, released new, cutting-edge vaccines. By late spring, Americans were getting vaccinated by the millions. The virus looked like it could be defeated. But many Americans, variously swayed by conspiracy theories peddled on social media or simply politically radicalized into associating vaccinations with anti-Trump politics, refused them. By late summer, barely a majority of those eligible for vaccines were fully vaccinated. More contagious and elusive strains evolved and spread and the virus continued churning through the population, sending many, especially the elderly, chronically ill, and unvaccinated, to hospitals and to early deaths. By the end of the summer of 2021, according to official counts, over 600,000 Americans had died from Covid-19. By May 2022, the official death toll in the United States crossed one million.

X. New Horizons

Americans looked anxiously to the future, and yet also, often, to a new generation busy discovering, perhaps, that change was not impossible. Much public commentary in the early twenty-first century concerned “Millennials” and “Generation Z,” the generations that came of age during the new millennium. Commentators, demographers, and political prognosticators continued to ask what the new generations would bring. Time’s May 20, 2013, cover, for instance, read Millennials Are Lazy, Entitled Narcissists Who Still Live with Their Parents: Why They’ll Save Us All. Pollsters focused on features that distinguished millennials from older Americans: millennials, the pollsters said, were more diverse, more liberal, less religious, and wracked by economic insecurity. “They are,” as one Pew report read, “relatively unattached to organized politics and religion, linked by social media, burdened by debt, distrustful of people, in no rush to marry—and optimistic about the future.”43

Millennial attitudes toward homosexuality and gay marriage reflected one of the most dramatic changes in popular attitudes in recent years. After decades of advocacy, American attitudes shifted rapidly. In 2006, a majority of Americans still told Gallup pollsters that “gay or lesbian relations” were “morally wrong.”44 But prejudice against homosexuality plummeted, and greater public acceptance of coming out opened the culture: in 2001, 73 percent of Americans said they knew someone who was gay, lesbian, or bisexual; in 1983, only 24 percent did. Gay characters—and in particular, gay characters with depth and complexity—could be found across the cultural landscape. Attitudes shifted such that, by the 2010s, polls registered majority support for the legalization of gay marriage. A writer for the Wall Street Journal called it “one of the fastest-moving changes in social attitudes of this generation.”45

Such change was, in many respects, a generational one: on average, younger Americans supported gay marriage in higher numbers than older Americans. The Obama administration, meanwhile, moved tentatively. Refusing to push for national interventions on the gay marriage front, Obama did, however, direct a review of Defense Department policies that culminated in the repeal of Don’t Ask, Don’t Tell in 2011. Without the support of national politicians, gay marriage was left to the courts. Beginning in Massachusetts in 2003, state courts had slowly begun striking down gay marriage bans. Then, in June 2015, the Supreme Court ruled 5–4 in Obergefell v. Hodges that same-sex marriage was a constitutional right. Nearly two thirds of Americans supported the position.46

While liberal social attitudes marked the younger generation, perhaps nothing defined young Americans more than their embrace of technology. The Internet in particular, liberated from desktop modems, shaped more of daily life than ever before. The release of the Apple iPhone in 2007 popularized the concept of smartphones for millions of consumers, and by 2011 about a third of Americans owned a mobile computing device. Four years later, two thirds did.47

Together with the advent of social media, Americans used their smartphones and their desktops to stay in touch with old acquaintances, chat with friends, share photos, and interpret the world—as newspaper and magazine subscriptions dwindled, Americans increasingly turned to their social media networks for news and information.48 Ambitious new online media companies, hungry for clicks and the ad revenue they represented, churned out provocatively titled, easy-to-digest stories that could be linked and tweeted and shared widely among like-minded online communities,49 but even traditional media companies, forced to downsize their newsrooms to accommodate shrinking revenues, fought to adapt to their new online consumers.

The ability of individuals to share stories through social media apps revolutionized the media landscape—smartphone technology and the democratization of media reshaped political debates and introduced new political questions. The easy accessibility of video capture and the ability of stories to go viral outside traditional media, for instance, brought new attention to the tense and often violent relations between municipal police officers and African Americans. The 2014 death of Michael Brown in Ferguson, Missouri, sparked protests and focused the issue. It was perhaps a testament to the power of social media platforms such as Twitter that a hashtag, #blacklivesmatter, became a rallying cry for protesters, and counterhashtags, #alllivesmatter and #bluelivesmatter, for critics.50 A relentless stream of videos documenting the deaths of Black men at the hands of police officers continued to circulate across social media networks. The deaths of Eric Garner, twelve-year-old Tamir Rice, Philando Castile, and others were captured on cell phone cameras and went viral. So too did the stories of Breonna Taylor and Botham Jean. “Say their names,” went a popular chant at Black Lives Matter marches. And then George Floyd was murdered.

Crowds, holding homemade signs reading "Black Lives Matter" and "Enough is Enough," march in New York City.

George Floyd’s murder in 2020 sparked the largest protests in American history. Here, crowds holding homemade signs protest in New York City. Via Wikimedia.

On May 25, 2020, a teenager, Darnella Frazier, filmed Minneapolis police officer Derek Chauvin kneeling on the neck of George Floyd. “I can’t breathe,” Floyd said. Despite his pleas, and those of bystanders, Chauvin kept his knee on Floyd’s neck for more than nine minutes, long after Floyd’s body had gone limp. The horrific footage shocked much of the country. Despite state and local lockdowns to slow the spread of Covid-19, spontaneous demonstrations broke out across the country. Protests erupted not only in major cities but in small towns and rural communities. The demonstrations dwarfed, in raw numbers, any comparable protest in American history. Taken together, as many as 25 million Americans may have participated in racial justice demonstrations that summer.51 And yet, despite the marches, no great national policy changes quickly followed. The “system” resisted calls to address “systemic racism.” Localities made efforts, of course. Criminal justice reformers won elections as district attorneys. Police departments mandated that their officers carry body cameras. As cries of “defund the police” sounded among left-wing Americans, some cities experimented with alternative emergency services that emphasized mediation and mental health. Meanwhile, at a symbolic level, Democratic-leaning towns and cities in the South pulled down their Confederate iconography. But the intractable racial injustices embedded deeply within American life had not been uprooted, and racial disparities in wealth, education, health, and other measures persisted, as they had in the United States for hundreds of years.

As the Black Lives Matter movement captured national attention, another social media phenomenon, the #MeToo movement, emerged. It began by magnifying outrage at the past sexual crimes of notable male celebrities and then injected a greater intolerance toward those accused of sexual harassment and violence into much of the rest of American society. The sudden zero tolerance reflected the new political energies of many American women, sparked in large part by the candidacy and presidency of Donald Trump. The day after Trump’s inauguration, between five hundred thousand and one million people descended on Washington, D.C., for the Women’s March, and millions more demonstrated in cities and towns around the country to show a broadly defined commitment to the rights of women and others in the face of the Trump presidency. And with three appointments to the Supreme Court, Donald Trump’s legacy persisted past his presidency. On June 24, 2022, the new conservative majority decided Dobbs v. Jackson, overturning Roe v. Wade (1973) and Planned Parenthood v. Casey (1992), the cases that had established a constitutional right to abortion. Meanwhile, other avenues of sexual politics opened across the country. By the 2020s, the broader American culture increasingly featured transgender individuals in media, and many Americans began making their preferred pronouns explicit, as well as deploying “they” as a gender-neutral pronoun, to undermine fixed notions of gender. Many conservatives, however, fought back. State legislators around the country sponsored “bathroom bills” to keep transgender individuals out of the bathroom of their identified gender, alleging that they posed a violent sexual risk. In Texas, Attorney General Ken Paxton declared pediatric gender-affirming care to be child abuse.

As issues of race and gender captured much public discussion, immigration continued as a potent political issue. Even as anti-immigrant initiatives like California’s Proposition 187 (1994) and Arizona’s SB1070 (2010) reflected the anxieties of many white Americans, younger Americans proved far more comfortable with immigration and diversity, which made sense, given that they were the most diverse American generation in living memory. Since Lyndon Johnson’s Great Society liberalized immigration laws in the 1960s, the demographics of the United States have been transformed. In 2012, nearly one quarter of all Americans were immigrants or the sons and daughters of immigrants. Half came from Latin America. The ongoing Hispanicization of the United States and the ever-shrinking proportion of non-Hispanic whites have been the most talked-about trends among demographic observers. By 2013, 17 percent of the nation was Hispanic. In 2014, Latinos surpassed non-Latino whites to become the largest ethnic group in California. In Texas, the image of a white cowboy hardly captures the demographics of a minority-majority state in which Hispanic Texans will soon become the largest ethnic group. In Texas’s Rio Grande Valley, for instance, home to nearly 1.5 million people, most residents speak Spanish at home and a full three fourths of the population is bilingual.52 Political commentators often wonder what political transformations these populations will bring about when they come of age and begin voting in larger numbers.

 

IX. Conclusion

The collapse of the Soviet Union brought neither global peace nor stability, and the attacks of September 11, 2001, plunged the United States into interminable conflicts around the world. At home, economic recession, a slow recovery, stagnant wage growth, and general pessimism infected American life as contentious politics and cultural divisions poisoned social harmony, leading directly to the January 6, 2021, attack on the U.S. Capitol. And yet the stream of history changes its course. Trends shift, things change, and events turn. New generations bring with them new perspectives, and they share new ideas. Our world is not foreordained. It is the product of history, the ever-evolving culmination of a longer and broader story, of a larger history, of a raw, distinctive, American Yawp.

 

X. Primary Sources

1. Bill Clinton on Free Trade and Financial Deregulation (1993-2000)

During his time in office, Bill Clinton signed the North American Free Trade Agreement (NAFTA) in 1993, allowing for the free movement of goods among Mexico, the United States, and Canada; signed legislation repealing the Glass-Steagall Act, a major plank of Franklin Roosevelt’s New Deal banking regulation; and deregulated the trading of derivatives, including credit default swaps, complicated financial instruments that would play a key role in the 2007–2008 economic crash. In the following signing statements, Clinton offers his support of free trade and deregulation.

2. 9/11 Commission Report, “Reflecting On A Generational Challenge” (2004)

On July 22, 2004, the National Commission on Terrorist Attacks Upon the United States—or the 9/11 Commission—delivered a 500-plus-page report that investigated the origins of the 9/11 attacks and America’s response, and offered policy prescriptions for a post-9/11 world.

3. George W. Bush on the Post-9/11 World (2002)

In his 2002 State of the Union Address, George W. Bush proclaimed that the attacks of September 11 signaled a new, dangerous world that demanded American interventions. Bush identified an “Axis of Evil” and provided a justification for a broad “war on terror.”

4. Obergefell v. Hodges (2015)

In 2015, the Supreme Court ruled in Obergefell v. Hodges that prohibitions against same-sex marriage were unconstitutional. Gay marriage had been a divisive issue in American politics for well over a decade. Many states passed referendums and constitutional amendments barring same-sex marriages and, in 1996, Bill Clinton signed the Defense of Marriage Act, defining marriage at the federal level as between a man and a woman. In 2003, the Massachusetts Supreme Judicial Court struck down the state’s prohibition, making Massachusetts the first state to legally marry same-sex couples. More followed, and public opinion began to turn. Although President Obama still refused to support same-sex marriage, by 2011 a majority of Americans believed it should be legally recognized. Four years later, the Supreme Court issued its Obergefell decision. The majority opinion, written by Justice Anthony Kennedy, considered the relationship between history and shifting notions of liberty and injustice.

5. Pedro Lopez on His Mother’s Deportation (2008/2015)

Pedro Lopez immigrated to Postville, Iowa, with his family as a young child. On May 12, 2008, his mother, an undocumented immigrant from Mexico, was arrested, jailed, and deported to Mexico. Pedro was thirteen. Here, he describes the experience.

6. Chelsea Manning Petitions for a Pardon (2013)

Chelsea Manning, a U.S. Army intelligence analyst, was convicted in 2013 of violating the Espionage Act by leaking classified documents revealing the killing of civilians, the torture of prisoners, and other nefarious actions committed by the United States in the War on Terror. After being sentenced to thirty-five years in federal prison, she delivered a statement, through her attorney, explaining her actions and requesting a pardon from President Barack Obama. Manning’s sentence was commuted in 2017.

7. Emily Doe, Victim Impact Statement (2015)

On January 18, 2015, Stanford University student Brock Turner sexually assaulted an unconscious woman outside of a university fraternity house. At his sentencing on June 2, 2016, his unnamed victim (“Emily Doe”) read a 7,000-word victim impact statement describing the effect of the assault on her life. [Note: Chanel Miller identified herself publicly as Emily Doe in September 2019.]

8. Ground Zero (2001)

A worker stands in front of rubble from the World Trade Center at Ground Zero in Lower Manhattan several weeks after the September 11 attacks.

9. Barack Obama and a Young Boy (2009)

In 2008, Barack Obama became the first African American elected to the presidency. In this official White House photo from May 2009, five-year-old Jacob Philadelphia said, “I want to know if my hair is just like yours.”

XI. Reference Material

This chapter was edited by Michael Hammond, with content contributions by Eladio Bobadilla, Andrew Chadwick, Zach Fredman, Leif Fredrickson, Michael Hammond, Richara Hayward, Joseph Locke, Mark Kukis, Shaul Mitelpunkt, Michelle Reeves, Elizabeth Skilton, Bill Speer, and Ben Wright.

Recommended citation: Eladio Bobadilla et al., “The Recent Past,” Michael Hammond, ed., in The American Yawp, eds. Joseph Locke and Ben Wright (Stanford, CA: Stanford University Press, 2018).

 

Recommended Reading

  • Alexander, Michelle. The New Jim Crow: Mass Incarceration in the Age of Colorblindness. New York: New Press, 2012.
  • Canaday, Margot. The Straight State: Sexuality and Citizenship in Twentieth-Century America. Princeton, NJ: Princeton University Press, 2011.
  • Carter, Dan T. From George Wallace to Newt Gingrich: Race in the Conservative Counterrevolution, 1963–1994. Baton Rouge: LSU Press, 1996.
  • Cowie, Jefferson. Capital Moves: RCA’s 70-Year Quest for Cheap Labor. New York: New Press, 2001.
  • Ehrenreich, Barbara. Nickel and Dimed: On (Not) Getting By in America. New York: Metropolitan, 2001.
  • Evans, Sara. Tidal Wave: How Women Changed America at Century’s End. New York: Free Press, 2003.
  • Gardner, Lloyd C. The Long Road to Baghdad: A History of U.S. Foreign Policy from the 1970s to the Present. New York: Free Press, 2008.
  • Hinton, Elizabeth. From the War on Poverty to the War on Crime. Cambridge, MA: Harvard University Press, 2016.
  • Hollinger, David. Postethnic America: Beyond Multiculturalism. New York: Basic Books, 1995.
  • Hunter, James D. Culture Wars: The Struggle to Define America. New York: Basic Books, 1992.
  • Meyerowitz, Joanne. How Sex Changed: A History of Transsexuality in the United States. Cambridge, MA: Harvard University Press, 2004.
  • Mittelstadt, Jennifer. The Rise of the Military Welfare State. Cambridge, MA: Harvard University Press, 2015.
  • Moreton, Bethany. To Serve God and Walmart: The Making of Christian Free Enterprise. Cambridge, MA: Harvard University Press, 2009.
  • Nadasen, Premilla. Welfare Warriors: The Welfare Rights Movement in the United States. New York: Routledge, 2005.
  • Osnos, Evan. Age of Ambition: Chasing Fortune, Truth and Faith in the New China. New York: Farrar, Straus and Giroux, 2014.
  • Packer, George. The Unwinding: An Inner History of the New America. New York: Farrar, Straus and Giroux, 2013.
  • Patterson, James T. Restless Giant: The United States from Watergate to Bush v. Gore. New York: Oxford University Press, 2005.
  • Piketty, Thomas. Capital in the Twenty-First Century. Translated from the French by Arthur Goldhammer. Cambridge, MA: Belknap Press, 2013.
  • Ricks, Thomas E. Fiasco: The American Military Adventure in Iraq. New York: Penguin, 2006.
  • Schlosser, Eric. Fast Food Nation: The Dark Side of the All-American Meal. New York: Houghton Mifflin Harcourt, 2001.
  • Stiglitz, Joseph. Freefall: America, Free Markets, and the Sinking of the World Economy. New York: Norton, 2010.
  • Taylor, Paul. The Next America: Boomers, Millennials, and the Looming Generational Showdown. New York: Public Affairs, 2014.
  • Wilentz, Sean. The Age of Reagan: A History, 1974–2008. New York: HarperCollins, 2008.
  • Williams, Daniel K. God’s Own Party: The Making of the Christian Right. New York: Oxford University Press, 2007.
  • Wright, Lawrence. The Looming Tower: Al Qaeda and the Road to 9/11. New York: Knopf, 2006.

 

Notes

  1. https://apnews.com/article/ap-fact-check-donald-trump-capitol-siege-violence-elections-507f4febbadecb84e1637e55999ac0ea. []
  2. https://www.washingtonpost.com/dc-md-va/2021/01/14/dc-police-capitol-riot/. []
  3. https://www.nytimes.com/2021/07/27/us/jan-6-inquiry.html. []
  4. William Faulkner, Requiem for a Nun (New York: Random House, 1954), 73. []
  5. Bill Minutaglio, First Son: George W. Bush and the Bush Family Dynasty (New York: Random House, 1999), 210–224. []
  6. Roger Simon, “How a Murderer and Rapist Became the Bush Campaign’s Most Valuable Player,” Baltimore Sun, November 11, 1990. []
  7. See especially Dan T. Carter, From George Wallace to Newt Gingrich: Race in the Conservative Counterrevolution, 1963–1994 (Baton Rouge: LSU Press, 1996), 72–80. []
  8. Sean Wilentz, The Age of Reagan: A History, 1974–2008 (New York: HarperCollins, 2008). []
  9. James F. Clarity, “End of the Soviet Union,” New York Times, December 26, 1991. []
  10. Francis Fukuyama, “The End of History?” National Interest (Summer 1989). []
  11. William Thomas Allison, The Gulf War, 1990–91 (New York: Palgrave Macmillan, 2012), 145, 165. []
  12. Charles W. Dunn, The Presidency in the Twenty-first Century (Lexington: University Press of Kentucky, 2011), 152. []
  13. Robert M. Collins, Transforming America: Politics and Culture During the Reagan Years (New York: Columbia University Press, 2009), 171, 172. []
  14. For Clinton’s presidency and the broader politics of the 1990s, see James T. Patterson, Restless Giant: The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005); and Wilentz, Age of Reagan. []
  15. Patterson, Restless Giant, 298–299. []
  16. United Nations International Organization for Migration, “Migrant Deaths Remain High Despite Sharp Fall in US-Mexico Border Crossings in 2017,” press release, February 6, 2018. https://news.un.org/en/story/2018/02/1002101. []
  17. Carolyn Skorneck, “Final Welfare Bill Written,” Washington Post, July 30, 1996, A1. []
  18. Frank Newport, “Clinton Receives Record High Job Approval Rating,” Gallup, December 24, 1998. http://news.gallup.com/poll/4111/clinton-receives-record-high-job-approval-rating-after-impeachment-vot.aspx. []
  19. Dexter Filkins, The Forever War (New York: Vintage Books, 2009). []
  20. See, for instance, Lawrence Wright, The Looming Tower: Al Qaeda and the Road to 9/11 (New York: Knopf, 2006). []
  21. The Bush administration began pushing for a “pre-emptive” war in the fall of 2002. The administration alleged that Hussein was trying to acquire uranium and that it had aluminum tubes used for nuclear centrifuges. Public opinion was divided. George W. Bush said in October, “Facing clear evidence of peril, we cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud.” https://www.washingtonpost.com/archive/politics/2003/09/06/hussein-link-to-911-lingers-in-many-minds/7cd31079-21d1-42cf-8651-b67e93350fde/; Thomas R. Mockaitis, The Iraq War: A Documentary and Reference Guide (Santa Barbara, CA: ABC-Clio, 2012), 26. []
  22. Judy Keen, “Bush to Troops: Mission Accomplished,” USA Today, June 5, 2003. []
  23. Bruce Nolan, “Katrina: The Storm We’ve Always Feared,” New Orleans Times-Picayune, August 30, 2005. []
  24. Douglas Brinkley, The Great Deluge: Hurricane Katrina, New Orleans, and the Mississippi Gulf Coast (New York: HarperCollins, 2006). []
  25. On the Great Recession, see Joseph Stiglitz, Freefall: America, Free Markets, and the Sinking of the World Economy (New York: Norton, 2010); and Michael Lewis, The Big Short: Inside the Doomsday Machine (New York: Norton, 2010). []
  26. Harold Meyerson, “Corporate America’s Chokehold on Wages,” Washington Post, July 19, 2011. []
  27. Thomas J. Sugrue, Not Even Past: Barack Obama and the Burden of Race (Princeton, NJ: Princeton University Press, 2012). []
  28. Kate Zernike and Megan Thee-Brenan, “Poll Finds Tea Party Backers Wealthier and More Educated,” New York Times, April 14, 2010; Jill Lepore, The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History (Princeton, NJ: Princeton University Press, 2011). []
  29. Kerry Close, “The 1% Pocketed 85% of Post-Recession Income Growth,” Time, June 16, 2016. http://time.com/money/4371332/income-inequality-recession/. See also Justin Wolfers, “The Gains from the Economic Recovery Are Still Limited to the Top One Percent,” New York Times, January 27, 2015. http://www.nytimes.com/2015/01/28/upshot/gains-from-economic-recovery-still-limited-to-top-one-percent.html. []
  30. Julia Queen and Christian Hilland, “2008 Presidential Campaign Financial Activity Summarized: Receipts Nearly Double 2004 Total,” Federal Election Commission, June 8, 2009. http://www.fec.gov/press/press2009/20090608PresStat.shtml; Andre Tartar and Eric Benson, “The Forever Campaign,” New York Magazine (October 14, 2012). http://nymag.com/news/politics/elections-2012/timeline-2012-10/. []
  31. Intergovernmental Panel on Climate Change, Climate Change 2013: The Physical Science Basis (Cambridge, UK: Cambridge University Press, 2014). []
  32. Intergovernmental Panel on Climate Change, Climate Change 2014: Impacts, Adaptation and Vulnerability: Global and Sectoral Aspects (Cambridge, UK: Cambridge University Press, 2014). []
  33. Philip Bump, “A Quarter of Americans Dislike Both Major-Party Presidential Candidates,” Washington Post, July 14, 2016. https://www.washingtonpost.com/news/the-fix/wp/2016/07/14/a-quarter-of-americans-dislike-both-major-party-presidential-candidates/?tid=a_inl; Aaron Zitner and Julia Wolfe, “Trump and Clinton’s Popularity Problem,” Wall Street Journal, May 24, 2016. http://graphics.wsj.com/elections/2016/donald-trump-and-hillary-clintons-popularity-problem/. []
  34. https://www.nytimes.com/interactive/2017/06/23/opinion/trumps-lies.html. []
  35. https://news.gallup.com/poll/312737/americans-views-trump-character-firmly-established.aspx. []
  36. See, for instance, https://www.ipsos.com/sites/default/files/ct/news/documents/2021-04/topline_write_up_reuters_ipsos_trump_coattails_poll_-_april_02_2021.pdf. []
  37. https://www.cbsnews.com/news/capitol-riot-arrests-latest-2021-07-27/. []
  38. https://www.nytimes.com/2021/01/13/video/extremist-signs-symbols-capitol-riot.html. []
  39. https://www.cbsnews.com/news/capitol-police-injuries-riot/; https://www.nytimes.com/2021/02/11/us/politics/capitol-riot-police-officer-injuries.html. []
  40. https://www.nytimes.com/2021/02/11/us/politics/capitol-riot-police-officer-injuries.html. []
  41. https://www.nytimes.com/2020/04/02/nyregion/coronavirus-new-york-bodies.html. []
  42. https://www.nytimes.com/article/coronavirus-timeline.html. []
  43. Paul Taylor, The Next America: Boomers, Millennials, and the Looming Generational Showdown (New York: Public Affairs, 2014). []
  44. “Gay and Lesbian Rights,” Gallup, December 5–7, 2003. http://www.gallup.com/poll/1651/gay-lesbian-rights.aspx. []
  45. Janet Hook, “Support for Gay Marriage Hits All-Time High,” Wall Street Journal, March 9, 2015. []
  46. Ibid. []
  47. Monica Anderson, “Technology Device Ownership: 2015,” Pew Research Center, October 29, 2015. http://www.pewglobal.org/2016/02/22/smartphone-ownership-and-internet-usage-continues-to-climb-in-emerging-economies/. []
  48. Monica Anderson and Andrea Caumont, “How Social Media Is Reshaping News,” Pew Research Center, September 24, 2014. http://www.pewresearch.org/fact-tank/2014/09/24/how-social-media-is-reshaping-news/. []
  49. See, for instance, Nicholas G. Carr’s 2010 The Shallows: What the Internet Is Doing to Our Brains, a 2011 Pulitzer Prize finalist. []
  50. Bijan Stephen, “Social Media Helps Black Lives Matter Fight the Power,” Wired (November 2015). http://www.wired.com/2015/10/how-black-lives-matter-uses-social-media-to-fight-the-power/. []
  51. https://www.nytimes.com/interactive/2020/07/03/us/george-floyd-protests-crowd-size.html. []
  52. U.S. Census Bureau, 2016 American Community Survey 1-Year Estimates. https://factfinder.census.gov/bkmk/table/1.0/en/ACS/16_1YR/S1601/0500000US48061|0500000US48215. []

29. The Triumph of the Right

Photograph of the activist Phyllis Schlafly campaigning against the Equal Rights Amendment in 1977. She stands in front of stop signs that say “STOP ERA.”

Activist Phyllis Schlafly campaigns against the Equal Rights Amendment in 1977. Library of Congress

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

Speaking to Detroit autoworkers in October 1980, Republican presidential candidate Ronald Reagan described what he saw as the American Dream under Democratic president Jimmy Carter. The family garage may have still held two cars, cracked Reagan, but they were “both Japanese and they’re out of gas.”1 The charismatic former governor of California suggested that a once-proud nation was running on empty. But Reagan held out hope for redemption. Stressing the theme of “national decline,” he nevertheless promised to make the United States once again a glorious “city upon a hill.”2 In November, Reagan’s vision triumphed.

Reagan rode the wave of a powerful political movement referred to by historians as the New Right. More libertarian in its economics and more politically forceful in its conservative religious principles than the moderate brand of conservatism popular after World War II, the New Right had by the 1980s evolved into the most influential wing of the Republican Party. And it could claim increasing credit for Republican electoral successes. Building on the gradual unraveling of the New Deal political order in the 1960s and 1970s (see Chapter 28), the conservative movement not only enjoyed the guidance of skilled politicians like Reagan but drew tremendous energy from a broad range of grassroots activists. Countless ordinary citizens—newly mobilized Christian conservatives, in particular—helped the Republican Party steer the country rightward. Enduring conflicts over race, economic policy, sexual politics, and foreign affairs fatally fractured the liberal consensus that had dominated American politics since the presidency of Franklin Roosevelt, and the New Right attracted support from Reagan Democrats, blue-collar voters who had lost faith in the old liberal creed.

The rise of the right affected Americans’ everyday lives in numerous ways. The Reagan administration’s embrace of free markets dispensed with the principles of active income redistribution and social welfare spending that had animated the New Deal and Great Society in the 1930s and 1960s. As American liberals increasingly embraced a “rights” framework directed toward African Americans, Latinos, women, lesbians and gays, and other marginalized groups, conservative policy makers targeted the regulatory and legal landscape of the United States. Critics complained that Reagan’s policies served the interests of corporations and wealthy individuals and pointed to the sudden widening of economic inequality. But the New Right harnessed popular distrust of regulation, taxes, and bureaucrats, and conservative activists celebrated the end of hyperinflation and substantial growth in GDP.

In many ways, however, the rise of the right promised more than it delivered. Battered but intact, the social welfare programs of the New Deal and Great Society (for example, social security, Medicaid, and Aid to Families with Dependent Children) survived the 1980s. Despite Republican vows of fiscal discipline, both the federal government and the national debt ballooned. At the end of the decade, conservative Christians viewed popular culture as more vulgar and hostile to their values than ever before. And in the near term, the New Right registered only partial victories on a range of public policies and cultural issues. Yet from a long-term perspective, conservatives achieved a subtler and more enduring transformation of American politics and society. In the words of one historian, the conservative movement successfully “changed the terms of debate and placed its opponents on the defensive.”3 Liberals and their programs and policies did not disappear, but they increasingly fought battles on terrain chosen by the New Right.

 

II. Conservative Ascendance

The Reagan Revolution marked the culmination of a long process of political mobilization on the American right. In the first two decades after World War II the New Deal seemed firmly embedded in American electoral politics and public policy. Even two-term Republican president Dwight D. Eisenhower declined to roll back the welfare state. To be sure, William F. Buckley tapped into a deep vein of elite conservatism in 1955 by announcing in the first issue of National Review that his magazine “stands athwart history yelling Stop.”4 Senator Joseph McCarthy and John Birch Society founder Robert Welch stirred anticommunist fervor. But in general, the far right lacked organizational cohesion. Following Lyndon Johnson’s resounding defeat of Republican Barry Goldwater—“Mr. Conservative”—in the 1964 presidential election, many observers declared American conservatism finished. New York Times columnist James Reston wrote that Goldwater had “wrecked his party for a long time to come.”5

Despite these dire predictions, conservatism not only persisted, it prospered. Its growing appeal had several causes. The expansive social and economic agenda of Johnson’s Great Society reminded anticommunists of Soviet-style central planning, and its deficits alarmed fiscal conservatives. Race also drove the creation of the New Right. The civil rights movement, along with the Civil Rights Act and the Voting Rights Act, challenged the racial hierarchy of the Jim Crow South. All of these occurred under Democratic leadership, pushing white southerners toward the Republican Party. In the late 1960s and early 1970s, Black Power, affirmative action, and court-ordered busing of children between schools to achieve racial balance brought “white backlash” in the North, often in cities previously known for political liberalism. To many white Americans, the urban rebellions, antiwar protests, and student uprisings of the late 1960s signaled social chaos. At the same time, slowing wage growth, rising prices, and growing tax burdens threatened many working- and middle-class citizens who had long formed the core of the New Deal coalition. Liberalism no longer seemed to offer the great mass of white Americans a road map to prosperity, so they searched for new political solutions.

Former Alabama governor and conservative Democrat George Wallace masterfully exploited the racial, cultural, and economic resentments of working-class whites during his presidential runs in 1968 and 1972. Wallace’s record as a staunch segregationist made him a hero in the Deep South, where he won five states as a third-party candidate in the 1968 general election. Wallace’s populist message also resonated with blue-collar voters in the industrial North who felt left behind by the rights revolution. On the campaign stump, the fiery candidate lambasted hippies, antiwar protesters, and government bureaucrats. He assailed female welfare recipients for “breeding children as a cash crop” and ridiculed “over-educated, ivory-tower” intellectuals who “don’t know how to park a bicycle straight.”6 Wallace also advanced progressive proposals for federal job training programs, a minimum wage hike, and legal protections for collective bargaining. Running as a Democrat in 1972, Wallace captured the Michigan primary and polled second in the industrial heartland of Wisconsin, Pennsylvania, and Indiana. In May 1972, an assassin’s bullet left Wallace paralyzed and ended his campaign. Nevertheless, his amalgamation of older, New Deal–style proposals and conservative populism represented the rapid reordering of party loyalties in the late 1960s and early 1970s. Richard Nixon similarly harnessed the New Right’s sense of grievance through his rhetoric about “law and order” and the “silent majority.”7 But Nixon and his Republican successor, Gerald Ford, continued to accommodate the politics of the New Deal order. The New Right remained without a major public champion.

Christian conservatives also felt themselves under siege from liberalism. In the early 1960s, Supreme Court decisions prohibiting teacher-led prayer (Engel v. Vitale) and Bible reading in public schools (Abington v. Schempp) led some on the right to conclude that a liberal judicial system threatened Christian values. In the following years, the counterculture’s celebration of sex and drugs, along with relaxed obscenity and pornography laws, intensified the conviction that “permissive” liberalism encouraged immorality in private life. Evangelical Protestants—Christians who professed a personal relationship with Jesus Christ, upheld the Bible as an infallible source of truth, and felt a duty to convert, or evangelize, nonbelievers—composed the core of the so-called religious right.

With increasing assertiveness in the 1960s and 1970s, Christian conservatives mobilized to protect the “traditional” family. Women composed a striking number of the religious right’s foot soldiers. In 1968 and 1969 a group of newly politicized mothers in Anaheim, California, led a sustained protest against sex education in public schools.8 Catholic activist Phyllis Schlafly marshaled opposition to the ERA, while evangelical pop singer Anita Bryant drew national headlines for her successful fight to repeal Miami’s gay rights ordinance in 1977. In 1979, Beverly LaHaye (whose husband, Tim—an evangelical pastor in San Diego—later coauthored the wildly popular Left Behind Christian book series) founded Concerned Women for America, which linked small groups of local activists opposed to the ERA, abortion, homosexuality, and no-fault divorce.

Activists like Schlafly and LaHaye valorized motherhood as women’s highest calling. Abortion therefore struck at the core of their female identity. More than perhaps any other issue, abortion drew different segments of the religious right—Catholics and Protestants, women and men—together. The Supreme Court’s 1973 Roe v. Wade ruling outraged many devout Catholics and evangelicals (who had been less universally opposed to the procedure than their Catholic counterparts). Christian author Francis Schaeffer cultivated evangelical opposition to abortion through the 1979 documentary film Whatever Happened to the Human Race?, arguing that the “fate of the unborn is the fate of the human race.”9 With abortion framed in stark, existential terms, many evangelicals felt compelled to combat the procedure through political action.

Grassroots passion drove anti-abortion activism, but a set of religious and secular institutions turned the various strands of the New Right into a sophisticated movement. In 1979 Jerry Falwell—a Baptist minister and religious broadcaster from Lynchburg, Virginia—founded the Moral Majority, an explicitly political organization dedicated to advancing a “pro-life, pro-family, pro-morality, and pro-American” agenda. The Moral Majority skillfully wove together social and economic appeals to make itself a force in Republican politics. Secular, business-oriented institutions also joined the attack on liberalism, fueled by stagflation and by the federal government’s creation of new regulatory agencies like the Environmental Protection Agency and the Occupational Safety and Health Administration. Conservative business leaders bankrolled new “think tanks” like the Heritage Foundation and the Cato Institute. These organizations provided grassroots activists with ready-made policy prescriptions. Other business leaders took a more direct approach by hiring Washington lobbyists and creating political action committees (PACs) to press their agendas in the halls of Congress and federal agencies. Between 1976 and 1980 the number of corporate PACs rose from under three hundred to over twelve hundred.

Grassroots activists and business leaders received unlikely support from a circle of neoconservatives—disillusioned intellectuals who had rejected liberalism and the Left and become Republicans. Irving Kristol, a former Marxist who went on to champion free-market capitalism as a Wall Street Journal columnist, defined a neoconservative as a “liberal who has been mugged by reality.”10 Neoconservative journals like Commentary and Public Interest argued that the Great Society had proven counterproductive, perpetuating the poverty and racial segregation that it aimed to cure. By the middle of the 1970s, neoconservatives felt mugged by foreign affairs as well. As ardent Cold Warriors, they argued that Nixon’s policy of détente left the United States vulnerable to the Soviet Union.

In sum, several streams of conservative political mobilization converged in the late 1970s. Each wing of the burgeoning New Right—disaffected northern blue-collar workers, white southerners, evangelicals and devout Catholics, business leaders, disillusioned intellectuals, and Cold War hawks—turned to the Republican Party as the most effective vehicle for their political counterassault on liberalism and the New Deal political order. After years of mobilization, the domestic and foreign policy storms of the Carter administration provided the tailwinds that brought the conservative movement to shore.

 

III. The Conservatism of the Carter Years

The election of Jimmy Carter in 1976 brought a Democrat to the White House for the first time since 1969. Large Democratic majorities in Congress provided the new president with an opportunity to move aggressively on the legislative front. With the infighting of the early 1970s behind them, many Democrats hoped the Carter administration would update and expand the New Deal. But Carter won the presidency on a wave of post-Watergate disillusionment with government that did not translate into support for liberal ideas.

In its early days, the Carter administration embraced several policies backed by liberals. It pushed an economic stimulus package containing $4 billion for public works, extended food stamp benefits to 2.5 million new recipients, enlarged the Earned Income Tax Credit for low-income households, and expanded the Nixon-era Comprehensive Employment and Training Act (CETA).11 But the White House quickly realized that Democratic control of Congress did not guarantee support for its initially left-leaning economic proposals. Many of the Democrats elected to Congress in the aftermath of Watergate were more moderate than their predecessors, who had been trained in the New Deal gospel. These conservative Democrats sometimes partnered with congressional Republicans to oppose Carter, most notably in response to the administration’s proposal for a federal office of consumer protection.

Events outside Carter’s control certainly helped discredit liberalism, but the president’s own temperamental and philosophical conservatism hamstrung the administration and pushed national politics further to the right. In his 1978 State of the Union address, Carter lectured Americans that “government cannot solve our problems . . . it cannot eliminate poverty, or provide a bountiful economy, or reduce inflation, or save our cities, or cure illiteracy, or provide energy.”12 The statement neatly captured the ideological transformation of the country. Rather than leading a resurgence of American liberalism, Carter became, as one historian put it, “the first president to govern in a post–New Deal framework.”13 Organized labor felt abandoned by Carter, who remained cool to several of their highest legislative priorities. The president offered tepid support for a national health insurance proposal and declined to lobby aggressively for a package of modest labor law reforms. The business community rallied to defeat the latter measure, in what AFL-CIO chief George Meany described as “an attack by every anti-union group in America to kill the labor movement.”14 In 1977 and 1978, liberal Democrats rallied behind the Humphrey-Hawkins Full Employment and Training Act, which promised to end unemployment through extensive government planning. The bill aimed not only to guarantee a job to every American but also to reunite the interracial, working-class Democratic coalition that had been fractured by deindustrialization and affirmative action.15 But Carter’s lack of enthusiasm for the proposal allowed conservatives from both parties to water the bill down to a purely symbolic gesture. Liberals, like labor leaders, came to regard the president as an unreliable ally.

Carter also came under fire from Republicans, especially the religious right. His administration incurred the wrath of evangelicals in 1978 when the IRS established new rules revoking the tax-exempt status of racially segregated, private Christian schools. The rules only strengthened a policy instituted by the Nixon administration; however, the religious right accused Carter of singling out Christian institutions. Republican activist Richard Viguerie described the IRS controversy as the “spark that ignited the religious right’s involvement in real politics.”16 Race sat just below the surface of the IRS fight. After all, many of the schools had been founded to circumvent court-ordered desegregation. But the IRS ruling allowed the New Right to rain down fire on big government interference while downplaying the practice of segregation at the heart of the case.

While the IRS controversy flared, economic crises multiplied. Unemployment reached 7.8 percent in May 1980, up from 6 percent at the start of Carter’s first term.17 Inflation (the rate at which the cost of goods and services increases) jumped from 6 percent in 1978 to a staggering 20 percent by the winter of 1980.18 In another bad omen, the iconic Chrysler Corporation appeared close to bankruptcy. The administration responded to these challenges in fundamentally conservative ways. First, Carter proposed a tax cut for the upper middle class, which Congress passed in 1978. Second, the White House embraced a longtime goal of the conservative movement by deregulating the airline and trucking industries in 1978 and 1980, respectively. Third, Carter proposed balancing the federal budget—much to the dismay of liberals, who would have preferred that he use deficit spending to finance a new New Deal. Finally, to halt inflation, Carter’s appointee as chair of the Federal Reserve, Paul Volcker, raised interest rates and tightened the money supply—policies designed to reduce inflation in the long run, even as they increased unemployment in the short run. Liberalism was on the run.

The decade’s second “energy crisis,” which witnessed another spike in oil prices and oil shortages across the country, brought out the Southern Baptist moralist in Carter. On July 15, 1979, the president delivered a nationally televised speech on energy policy in which he attributed the country’s economic woes to a “crisis of confidence.” Carter lamented that “too many of us now tend to worship self-indulgence and consumption.”19 The country initially responded favorably to the push for energy conservation, yet Carter’s emphasis on discipline and sacrifice and his spiritual diagnosis of economic hardship sidestepped deeper questions of large-scale economic change and downplayed the harsh toll inflation had taken on regular Americans.

 

IV. The Election of 1980

These domestic challenges, combined with the Soviet invasion of Afghanistan and the hostage crisis in Iran, hobbled Carter heading into his 1980 reelection campaign. Many Democrats were dismayed by his policies. The president of the International Association of Machinists dismissed Carter as “the best Republican President since Herbert Hoover.”20 Angered by the White House’s refusal to back national health insurance, Massachusetts senator Ted Kennedy challenged Carter in the Democratic primaries. Running as the party’s liberal standard-bearer and heir to the legacy of his slain older brothers, Kennedy garnered support from key labor unions and left-wing Democrats. Carter ultimately vanquished Kennedy, but the close primary tally exposed the president’s vulnerability.

Carter’s opponent in the general election was Ronald Reagan, a former Hollywood actor who had served two terms as governor of California. Reagan ran as a staunch fiscal conservative and a Cold War hawk, vowing to reduce government spending and shrink the federal bureaucracy. Reagan also accused his opponent of failing to confront the Soviet Union and vowed steep increases in military spending. Carter responded by calling Reagan a warmonger, but the Soviet invasion of Afghanistan and the confinement of 52 American hostages in Iran discredited Carter’s foreign policy in the eyes of many Americans.

The incumbent fared no better on domestic affairs. Unemployment remained at nearly 8 percent.21 Meanwhile the Federal Reserve’s anti-inflation measures pushed interest rates to an unheard-of 18.5 percent.22 Reagan seized on these bad economic trends. On the campaign trail he brought down the house by proclaiming: “A recession is when your neighbor loses his job, and a depression is when you lose your job.” Reagan would then pause before concluding, “And a recovery is when Jimmy Carter loses his job.”23

Social and cultural issues presented yet another challenge for the president. Although a self-proclaimed “born-again” Christian and Sunday school teacher, Carter struggled to court the religious right. Carter scandalized devout Christians by admitting to lustful thoughts during an interview with Playboy magazine in 1976, telling the reporter he had “committed adultery in my heart many times.”24 Although Reagan was only a nominal Christian and rarely attended church, the religious right embraced him. Reverend Jerry Falwell directed the full weight of the Moral Majority behind Reagan. The organization registered an estimated two million new voters in 1980. Reagan also cultivated the religious right by denouncing abortion and endorsing prayer in school. The IRS tax exemption issue resurfaced as well, with the 1980 Republican platform vowing to “halt the unconstitutional regulatory vendetta launched by Mr. Carter’s IRS commissioner against independent schools.”25 Early in the primary season, Reagan condemned the policy during a speech at South Carolina’s Bob Jones University, which had recently sued the IRS after the school’s ban on interracial dating led to the loss of its tax-exempt status.

Photograph of Jerry Falwell, the wildly popular TV evangelist and founder of the Moral Majority.

Jerry Falwell, a wildly popular TV evangelist, founded the Moral Majority in the late 1970s. Decrying the demise of the nation’s morality, the organization gained a massive following and helped to cement the status of the New Christian Right in American politics. Wikimedia.

Reagan’s campaign appealed subtly but unmistakably to the racial hostilities of white voters. The candidate held his first post–nominating convention rally at the Neshoba County Fair near Philadelphia, Mississippi, the town where three civil rights workers had been murdered in 1964. In his speech, Reagan championed the doctrine of states’ rights, which had been the rallying cry of segregationists in the 1950s and 1960s. In criticizing the welfare state, Reagan had long employed thinly veiled racial stereotypes about a “welfare queen” in Chicago who drove a Cadillac while defrauding the government or a “strapping young buck” purchasing T-bone steaks with food stamps.26 Like George Wallace before him, Reagan exploited the racial and cultural resentments of struggling white working-class voters. And like Wallace, he attracted blue-collar workers in droves.

With the wind at his back on almost every issue, Reagan only needed to blunt Carter’s characterization of him as an angry extremist. Reagan did so during their only debate by appearing calm and amiable. “Are you better off than you were four years ago?” he asked the American people at the conclusion of the debate.27 The American people answered no. Reagan won the election with 51 percent of the popular vote to Carter’s 41 percent. (Independent John Anderson captured 7 percent.)28 Despite capturing only a slim majority of the overall popular vote, Reagan scored a decisive 489–49 victory in the Electoral College.29 Republicans gained control of the Senate for the first time since 1955 by winning twelve seats. Liberal Democrats George McGovern, Frank Church, and Birch Bayh went down in defeat, as did liberal Republican Jacob Javits. The GOP picked up thirty-three House seats, narrowing the Democratic advantage in the lower chamber.30 The New Right had arrived in Washington, D.C.

 

V. The New Right in Power

Photograph showing Ronald Reagan and his wife, Nancy Reagan, waving from a limousine during the inaugural parade in Washington, D.C., in 1981.

Ronald Reagan secured the presidency by appealing to the growing conservatism of much of the country. Here, Ronald Reagan and his wife, Nancy Reagan, wave from a limousine during the inaugural parade in Washington, D.C., in 1981. Wikimedia.

In his first inaugural address, Reagan proclaimed that “government is not the solution to our problem; government is the problem.”31 In reality, Reagan focused less on eliminating government than on redirecting government to serve new ends. In line with that goal, his administration embraced supply-side economic theories that had recently gained popularity among the New Right. While the postwar gospel of Keynesian economics had focused on stimulating consumer demand, supply-side economics held that lower personal and corporate tax rates would encourage greater private investment and production. Supply-side advocates promised that the resulting wealth would reach—or “trickle down” to, in the words of critics—lower-income groups through job creation and higher wages. Conservative economist Arthur Laffer predicted that lower tax rates would generate so much economic activity that federal tax revenues would actually increase. The administration touted the so-called Laffer Curve as justification for the tax cut plan that served as the cornerstone of Reagan’s first year in office. Republican congressman Jack Kemp, an early supply-side advocate and co-sponsor of Reagan’s tax bill, promised that it would unleash the “creative genius that has always invigorated America.”32

The Iranian hostage crisis ended literally during President Reagan’s inauguration speech. By a coincidence of timing, then, the Reagan administration received credit for ending the conflict. This group photograph shows the former hostages in the hospital before being released back to the U.S. Johnson Babela, Photograph, 1981. Wikimedia, http://commons.wikimedia.org/wiki/File:DF-SN-82-06759.jpg.

The Iranian hostage crisis ended literally during President Reagan’s inauguration speech. The Reagan administration received credit for bringing the hostages home. This group photograph shows the former hostages in the hospital in 1981 before being released back to the United States. Wikimedia.

The tax cut faced early skepticism from Democrats and even some Republicans. Vice President George H. W. Bush had belittled supply-side theory as “voodoo economics” during the 1980 Republican primaries.33 But a combination of skill and serendipity pushed the bill over the top. Reagan aggressively and effectively lobbied individual members of Congress for support on the measure. Then on March 30, 1981, Reagan survived an assassination attempt by a mentally unstable young man named John Hinckley. Public support swelled for the hospitalized president. Congress ultimately approved a $675 billion tax cut in July 1981 with significant Democratic support. The bill reduced overall federal taxes by more than one quarter and lowered the top marginal rate from 70 percent to 50 percent, with the bottom rate dropping from 14 percent to 11 percent. It also slashed the rate on capital gains from 28 percent to 20 percent.34 The next month, Reagan scored another political triumph in response to a strike called by the Professional Air Traffic Controllers Organization (PATCO). During the 1980 campaign, Reagan had wooed organized labor, describing himself as “an old union man” (he had led the Screen Actors Guild from 1947 to 1952) who still held Franklin Roosevelt in high regard.35 PATCO had been one of the few labor unions to endorse Reagan. Nevertheless, the president ordered the union’s striking air traffic controllers back to work and fired more than eleven thousand who refused. Reagan’s actions crippled PATCO and left the American labor movement reeling. For the rest of the 1980s the economic terrain of the United States—already unfavorable to union organizing—shifted decisively in favor of employers. The unionized portion of the private-sector workforce fell from 20 percent in 1980 to 12 percent in 1990.36 Reagan’s tax bill and the defeat of PATCO not only enhanced the economic power of corporations and high-income households but also confirmed that a new conservative age had dawned in American life.

The new administration appeared to be flying high in the fall of 1981, but developments challenged the rosy economic forecasts emanating from the White House. As Reagan ratcheted up tension with the Soviet Union, Congress approved his request for $1.2 trillion in new military spending.37 The combination of lower taxes and higher defense budgets caused the national debt to balloon. By the end of Reagan’s first term it equaled 53 percent of GDP, as opposed to 33 percent in 1981.38 The increase was staggering, especially for an administration that had promised to curb spending. Meanwhile, Federal Reserve chairman Paul Volcker continued his policy from the Carter years of combating inflation by maintaining high interest rates, which surpassed 20 percent in June 1981.39 The Fed’s action increased the cost of borrowing money and stifled economic activity.

As a result, the United States experienced a severe economic recession in 1981 and 1982. Unemployment rose to nearly 11 percent, the highest figure since the Great Depression.40 Reductions in social welfare spending heightened the impact of the recession on ordinary people. Congress had followed Reagan’s lead by reducing funding for food stamps and Aid to Families with Dependent Children and by removing a half million people from the Supplemental Security Income program for the physically disabled.41 The cuts exacted an especially harsh toll on low-income communities of color. The head of the NAACP declared that the administration’s budget cuts had rekindled “war, pestilence, famine, and death.”42 Reagan also received bipartisan rebuke in 1981 after proposing cuts to social security benefits for early retirees. The Senate voted unanimously to condemn the plan, and Democrats framed it as a heartless attack on the elderly. Confronted with recession and harsh public criticism, a chastened White House worked with Democratic House Speaker Tip O’Neill in 1982 on a bill that restored $98 billion of the previous year’s tax cuts.43 Despite compromising with the administration on taxes, Democrats railed against the so-called Reagan Recession, arguing that the president’s economic policies favored the most fortunate Americans. This appeal, which Democrats termed the “fairness issue,” helped them win twenty-six House seats in the autumn congressional races.44 The New Right appeared to be in trouble.

 

VI. Morning in America

President Ronald Reagan, a master of the "photo op," appears here with a row of American flags at his back at a 1982 rally for Senator David Durenberger in Minneapolis, Minnesota. President Ronald Reagan, 1982. Via National Archives (198527).

President Ronald Reagan, a master of the photo op, appears here with a row of American flags at his back at a 1982 rally for Senator David Durenberger in Minneapolis, Minnesota. National Archives (198527).

Reagan nimbly adjusted to the political setbacks of 1982. Following the rejection of his social security proposals, Reagan appointed a bipartisan panel to consider changes to the program. In early 1983, the commission recommended a onetime delay in cost-of-living increases, a new requirement that government employees pay into the system, and a gradual increase in the retirement age from sixty-five to sixty-seven. The commission also proposed raising state and federal payroll taxes, with the new revenue poured into a trust fund that would transform social security from a pay-as-you-go system to one with significant reserves.45 Congress quickly passed the recommendations into law, allowing Reagan to take credit for strengthening a program cherished by most Americans. The president also benefited from an economic rebound. Real disposable income rose 2.5 percent in 1983 and 5.8 percent the following year.46 Unemployment dropped to 7.5 percent in 1984.47 Meanwhile, the “harsh medicine” of high interest rates helped reduce inflation to 3.5 percent.48 While campaigning for reelection in 1984, Reagan pointed to the improving economy as evidence that it was “morning again in America.”49 His personal popularity soared. Most conservatives ignored the debt increase and tax hikes of the previous two years and rallied around the president.

The Democratic Party, on the other hand, stood at an ideological crossroads in 1984. The favorite to win the party’s nomination was Walter Mondale, a staunch ally of organized labor and the civil rights movement as a senator during the 1960s and 1970s. He later served as Jimmy Carter’s vice president. Mondale’s chief rivals were civil rights activist Jesse Jackson and Colorado senator Gary Hart, one of the young Democrats elected to Congress in 1974 following Nixon’s downfall. Hart and other “Watergate babies” still identified themselves as liberals but rejected their party’s faith in activist government and embraced market-based approaches to policy issues. In so doing, they conceded significant political ground to supply-siders and conservative opponents of the welfare state. Many Democrats, however, were not prepared to abandon their New Deal heritage, and so the ideological tension within the party played out in the 1984 primary campaign. Jackson offered a largely progressive program but won only two states. Hart’s platform—economically moderate but socially liberal—inverted the political formula of Mondale’s New Deal–style liberalism. Throughout the primaries, Hart contrasted his “new ideas” with Mondale’s “old-fashioned” politics. Mondale eventually secured his party’s nomination but suffered a crushing defeat in the general election. Reagan captured forty-nine of fifty states, winning 58.8 percent of the popular vote.50

Mondale’s loss seemed to confirm that the new breed of moderate Democrats better understood the mood of the American people. The future of the party belonged to post–New Deal liberals like Hart and to the constituency that supported him in the primaries: upwardly mobile, white professionals and suburbanites. In February 1985, a group of centrists formed the Democratic Leadership Council (DLC) as a vehicle for distancing the party from organized labor and Keynesian economics while cultivating the business community. Jesse Jackson dismissed the DLC as “Democrats for the Leisure Class,” but the organization included many of the party’s future leaders, including Arkansas governor Bill Clinton.51 The formation of the DLC illustrated the degree to which the New Right had transformed American politics: New Democrats looked a lot like old Republicans.

Reagan entered his second term with a much stronger mandate than in 1981, but the Grand Old Party (GOP) makeover of Washington, D.C., stalled. The Democrats regained control of the Senate in 1986, and Democratic opposition prevented Reagan from eliminating means-tested social welfare programs, although Congress declined to increase benefit levels for welfare programs or raise the minimum wage, allowing inflation to erode the real value of those benefits. Democrats and Republicans occasionally fashioned legislative compromises, as with the Tax Reform Act of 1986. The bill lowered the top corporate tax rate from 46 percent to 34 percent and reduced the highest marginal income tax rate from 50 percent to 28 percent, while also simplifying the tax code and eliminating numerous loopholes.52 The steep cuts to the corporate and individual rates certainly benefited wealthy individuals, but the legislation made virtually no net change to federal revenues. In 1986, Reagan also signed into law the Immigration Reform and Control Act. American policy makers hoped to do two things: deal with the millions of undocumented immigrants already in the United States and choke off future unsanctioned migration. The former goal was achieved (nearly three million undocumented workers received legal status), but the latter proved elusive.

One of Reagan’s most far-reaching victories occurred through judicial appointments. He named 368 district and federal appeals court judges during his two terms.53 Observers noted that almost all of the appointees were white men. (Seven were African American, fifteen were Latino, and two were Asian American.) Reagan also appointed three Supreme Court justices: Sandra Day O’Connor, who to the dismay of the religious right turned out to be a moderate; Anthony Kennedy, a solidly conservative Catholic who occasionally sided with the court’s liberal wing; and archconservative Antonin Scalia. The New Right’s transformation of the judiciary had limits. In 1987, Reagan nominated Robert Bork to fill a vacancy on the Supreme Court. Bork, a federal judge and former Yale University law professor, was a staunch conservative. He had opposed the 1964 Civil Rights Act, affirmative action, and the Roe v. Wade decision. After acrimonious confirmation hearings, the Senate rejected Bork’s nomination by a vote of 58–42.54

 

VII. African American Life in Reagan’s America

African Americans read Bork’s nomination as another signal of the conservative movement’s hostility to their social, economic, and political aspirations. Indeed, Ronald Reagan’s America presented African Americans with a series of contradictions. Black Americans achieved significant advances in politics, culture, and socioeconomic status. Continuing a trend begun in the late 1960s and 1970s, Black politicians gained control of major municipal governments across the country during the 1980s. In 1983, voters in Philadelphia and Chicago elected Wilson Goode and Harold Washington, respectively, as their cities’ first Black mayors. At the national level, civil rights leader Jesse Jackson became the first African American man to run for president when he campaigned for the Democratic Party’s nomination in 1984 and 1988. Propelled by chants of “Run, Jesse, run,” Jackson achieved notable success in 1988, winning nine state primaries and finishing second with 29 percent of the vote.55

Warren K. Leffler, “IVU w/ [i.e., interview with] Rev. Jesse Jackson,” July 1, 1983. Library of Congress, http://www.loc.gov/pictures/item/2003688127/.

Jesse Jackson, pictured here in 1983, was only the second African American to mount a national campaign for the presidency. His work as a civil rights activist and Baptist minister garnered him a significant following in the African American community but never enough to secure the Democratic nomination. Library of Congress.

The excitement created by Jackson’s campaign mirrored the acclaim received by a few prominent African Americans in media and entertainment. Comedian Eddie Murphy rose to stardom on television’s Saturday Night Live and achieved box office success with movies like 48 Hrs. and Beverly Hills Cop. In 1982, pop singer Michael Jackson released Thriller, the best-selling album of all time. Oprah Winfrey began her phenomenally successful nationally syndicated talk show in 1985. Comedian Bill Cosby’s sitcom about an African American doctor and lawyer raising their five children drew the highest ratings on television for much of the decade. The popularity of The Cosby Show revealed how class informed perceptions of race in the 1980s. Cosby’s fictional TV family represented a growing number of Black middle-class professionals in the United States. Indeed, income for the top fifth of African American households increased faster than that of white households for most of the decade. Middle-class African Americans found new doors open to them in the 1980s, but the poor and working class faced continued challenges. During Reagan’s last year in office the African American poverty rate stood at 31.6 percent, as opposed to 10.1 percent for whites.56 Black unemployment remained double that of whites throughout the decade.57 By 1990, the median income for Black families was $21,423, 42 percent below the median income for white households.58 The Reagan administration failed to address such disparities and in many ways intensified them.

New Right values threatened the legal principles and federal policies of the Great Society and the “rights revolution.” Reagan’s appointment of conservatives to agencies such as the Justice Department and the Equal Employment Opportunity Commission took aim at key policy achievements of the civil rights movement. When the 1965 Voting Rights Act came up for renewal during Reagan’s first term, the Justice Department pushed the president to oppose any extension. Only the intervention of more moderate congressional Republicans saved the law. The administration also initiated a plan to rescind federal affirmative action rules. In 1986, a broad coalition of groups—including the NAACP, the Urban League, the AFL-CIO, and even the National Association of Manufacturers—compelled the administration to abandon the effort. Despite the conservative tenor of the country, diversity programs were firmly entrenched in the corporate world by the end of the decade.

Americans increasingly embraced racial diversity as a positive value but most often approached the issue through an individualistic—not a systemic—framework. Certain federal policies disproportionately affected racial minorities. Spending cuts enacted by Reagan and congressional Republicans shrank Aid to Families with Dependent Children, Medicaid, food stamps, school lunch programs, and job training programs that provided crucial support to African American households. In 1982, the National Urban League’s annual “State of Black America” report concluded that “never [since the first report in 1976] . . . has the state of Black America been more vulnerable. Never in that time have black economic rights been under such powerful attack.”59 African American communities, especially in urban areas, also bore the stigma of violence and criminality. Homicide was the leading cause of death for Black males between ages fifteen and twenty-four, occurring at a rate six times that of other groups.60 Although African Americans were most often the victims of violent crime, sensationalist media reports incited fears about black-on-white crime in big cities. Ironically, such fear could by itself spark violence. In December 1984 a thirty-seven-year-old white engineer, Bernard Goetz, shot and seriously wounded four Black teenagers on a New York City subway car. The so-called Subway Vigilante suspected that the young men—armed with screwdrivers—planned to rob him. Pollsters found that 90 percent of white New Yorkers sympathized with Goetz.61 Echoing the law-and-order rhetoric (and policies) of the 1960s and 1970s, politicians—both Democratic and Republican—and law enforcement agencies implemented more aggressive policing of minority communities and mandated longer prison sentences for those arrested. The explosive growth of mass incarceration exacted a heavy toll on African American communities long into the twenty-first century.

VIII. Bad Times and Good Times

Working- and middle-class Americans, especially those of color, struggled to maintain economic equilibrium during the Reagan years. The growing national debt generated fresh economic pain. The federal government borrowed money to finance the debt, raising interest rates to heighten the appeal of government bonds. Foreign money poured into the United States, raising the value of the dollar and attracting an influx of goods from overseas. The imbalance between American imports and exports grew from $36 billion in 1980 to $170 billion in 1987.62 Foreign competition battered the already anemic manufacturing sector. The appeal of government bonds likewise drew investment away from American industry.

Continuing an ongoing trend, many steel and automobile factories in the industrial Northeast and Midwest closed or moved overseas during the 1980s. Bruce Springsteen, the self-appointed bard of blue-collar America, offered eulogies to Rust Belt cities in songs like “Youngstown” and “My Hometown,” in which the narrator laments that his “foreman says these jobs are going, boys / and they ain’t coming back.”63 Competition from Japanese carmakers spurred a “Buy American” campaign. Meanwhile, a “farm crisis” gripped the rural United States. Expanded world production meant new competition for American farmers, while soaring interest rates caused the already sizable debt held by family farms to mushroom. Farm foreclosures skyrocketed during Reagan’s tenure. In September 1985, prominent musicians including Neil Young and Willie Nelson organized Farm Aid, a benefit concert at the University of Illinois’s football stadium designed to raise money for struggling farmers.

At the other end of the economic spectrum, wealthy Americans thrived under the policies of the New Right. The financial industry found new ways to earn staggering profits during the Reagan years. Wall Street brokers like junk bond king Michael Milken reaped fortunes selling high-risk, high-yield securities. Reckless speculation helped drive the stock market steadily upward until the crash of October 19, 1987. On that day, known as Black Monday, the market plunged 508 points, erasing nearly 23 percent of its value. Investors lost more than $500 billion.64 An additional financial crisis loomed in the savings and loan (S&L) industry, and Reagan’s deregulatory policies bore significant responsibility. In 1982 Reagan signed a bill increasing the amount of federal insurance available to savings and loan depositors, making those financial institutions more popular with consumers. The bill also allowed S&Ls to engage in high-risk loans and investments for the first time. Many such deals failed catastrophically, while some S&L managers brazenly stole from their institutions. In the late 1980s, S&Ls failed with regularity, and ordinary Americans lost precious savings. The 1982 law left the government responsible for bailing out failed S&Ls at an eventual cost of $132 billion.65

IX. Culture Wars of the 1980s

Popular culture of the 1980s offered another venue in which conservatives and liberals waged a battle of ideas. The militarism and patriotism of Reagan’s presidency pervaded movies like Top Gun and the Rambo series, starring Sylvester Stallone as a Vietnam War veteran haunted by his country’s failure to pursue victory in Southeast Asia. In contrast, director Oliver Stone offered searing condemnations of the war in Platoon and Born on the Fourth of July. Television shows like Dynasty and Dallas celebrated wealth and glamour, reflecting the pride in conspicuous consumption that emanated from the White House and corporate boardrooms during the decade. At the same time, films like Wall Street and novels like Bret Easton Ellis’s Less Than Zero skewered the excesses of the rich.

The most significant aspect of much popular culture in the 1980s, however, was its lack of politics altogether. Steven Spielberg’s E.T.: The Extra-Terrestrial and his Indiana Jones adventure trilogy topped the box office. Cinematic escapism replaced the social films of the 1970s. Quintessential Hollywood leftist Jane Fonda appeared frequently on television but only to peddle exercise videos. Television viewership—once dominated by the big three networks of NBC, ABC, and CBS—fragmented with the rise of cable channels catering to particularized tastes. Few cable channels so captured the popular imagination as MTV, which debuted in 1981. Telegenic artists like Madonna, Prince, and Michael Jackson skillfully used MTV to boost their reputations and album sales. Conservatives condemned music videos for corrupting young people with vulgar, anti-authoritarian messages, but the medium only grew in stature. Critics of MTV targeted Madonna in particular. Her 1989 video “Like a Prayer” drew protests for what some people viewed as sexually suggestive and blasphemous scenes. The religious right increasingly perceived popular culture as hostile to Christian values.

Photograph of the Apple II computer.

The Apple II, introduced in 1977, was the first successful mass-produced microcomputer meant for home use. The smallest and sleekest personal computer model yet introduced, it revolutionized both the substance and design of personal computers. Wikimedia.

Cultural battles were even more heated in the realm of gender and sexual politics. American women pushed further into male-dominated spheres during the 1980s. By 1984, women in the workforce outnumbered those who worked at home.66 That same year, New York representative Geraldine Ferraro became the first woman to run on a major party’s presidential ticket when Democratic candidate Walter Mondale named her his running mate. Yet the triumph of the right placed fundamental questions about women’s rights near the center of American politics—particularly in regard to abortion. The issue increasingly divided Americans. Pro-life Democrats and pro-choice Republicans grew rare, as the National Abortion Rights Action League enforced pro-choice orthodoxy on the left and the National Right to Life Committee did the same with pro-life orthodoxy on the right. Religious conservatives took advantage of the Republican takeover of the White House and Senate in 1980 to push for new restrictions on abortion—with limited success. Senators Jesse Helms of North Carolina and Orrin Hatch of Utah introduced versions of a Human Life Amendment to the U.S. Constitution that defined life as beginning at conception. Both efforts failed.67 Reagan, more interested in economic issues than social ones, provided only lukewarm support for the anti-abortion movement. He further outraged anti-abortion activists by appointing Sandra Day O’Connor, a supporter of abortion rights, to the Supreme Court. Despite these setbacks, anti-abortion forces succeeded in defunding some abortion providers. The 1976 Hyde Amendment prohibited the use of federal funds to pay for abortions; by 1990 almost every state had its own version of the Hyde Amendment. Yet some anti-abortion activists demanded more. In 1988 evangelical activist Randall Terry founded Operation Rescue, an organization that targeted abortion clinics and pro-choice politicians with confrontational—and sometimes violent—tactics. Operation Rescue demonstrated that the fight over abortion would grow only more heated in the 1990s.

The emergence of a deadly new illness, acquired immunodeficiency syndrome (AIDS), simultaneously devastated, stigmatized, and energized the nation’s homosexual community. When AIDS appeared in the early 1980s, most of its victims were gay men. For a time the disease was known as GRID—gay-related immune deficiency. The epidemic rekindled older pseudoscientific ideas about the inherently diseased nature of homosexual bodies. The Reagan administration met the issue with indifference, leading liberal congressman Henry Waxman to rage that “if the same disease had appeared among Americans of Norwegian descent . . . rather than among gay males, the response of both the government and the medical community would be different.”68 Some religious figures seemed to relish the opportunity to condemn homosexual activity; Catholic columnist Patrick Buchanan remarked that “the sexual revolution has begun to devour its children.”69

Homosexuals were left to forge their own response to the crisis. Some turned to confrontation—like New York playwright Larry Kramer. Kramer founded the Gay Men’s Health Crisis, which demanded a more proactive response to the epidemic. Others sought to humanize AIDS victims; this was the goal of the AIDS Memorial Quilt, a commemorative project begun in 1985. By the middle of the decade the federal government began to address the issue haltingly. Surgeon General C. Everett Koop, an evangelical Christian, called for more federal funding on AIDS-related research, much to the dismay of critics on the religious right. By 1987 government spending on AIDS-related research reached $500 million—still only 25 percent of what experts advocated.70 In 1987 Reagan convened a presidential commission on AIDS; the commission’s report called for antidiscrimination laws to protect people with AIDS and for more federal spending on AIDS research. The shift encouraged activists. Nevertheless, on issues of abortion and gay rights—as with the push for racial equality—activists spent the 1980s preserving the status quo rather than building on previous gains. This amounted to a significant victory for the New Right.

AIDS awareness poster featuring a photograph of Patti LaBelle, the words "Don't listen to rumors about AIDS. Get the facts!" and the phone number 1-800-342-AIDS

The AIDS epidemic hit gay and African American communities particularly hard in the 1980s, prompting widespread social stigmatization, but also prompting awareness campaigns, such as this poster featuring singer Patti LaBelle. Wikimedia.

 

X. The New Right Abroad

The conservative movement gained ground on gender and sexual politics, but it captured the entire battlefield on American foreign policy in the 1980s, at least for a time. Ronald Reagan entered office a committed Cold Warrior. He held the Soviet Union in contempt, denouncing it in a 1983 speech as an “evil empire.”71 And he never doubted that the Soviet Union would end up “on the ash heap of history,” as he said in a 1982 speech to the British Parliament.72 Indeed, Reagan believed it was the duty of the United States to speed the Soviet Union to its inevitable demise. His Reagan Doctrine declared that the United States would supply aid to anticommunist forces everywhere in the world.73 To give this doctrine force, Reagan oversaw an enormous expansion in the defense budget. Federal spending on defense rose from $171 billion in 1981 to $229 billion in 1985, the highest level since the Vietnam War.74 He described this as a policy of “peace through strength,” a phrase that appealed to Americans who, during the 1970s, feared that the United States was losing its status as the world’s most powerful nation. Yet the irony is that Reagan, for all his militarism, helped bring the Cold War to an end through negotiation, a tactic he had once scorned.

Reagan’s election came at a time when many Americans feared their country was in an irreversible decline. American forces withdrew in disarray from South Vietnam in 1975. In 1978, over the protests of conservatives, the United States agreed to return sovereignty over the Panama Canal to Panama. Pro-American dictators were toppled in Iran and Nicaragua in 1979. The Soviet Union invaded Afghanistan that same year, leading conservatives to warn about American weakness in the face of Soviet expansion. Reagan spoke to fears of decline and warned, in 1976, that “this nation has become Number Two in a world where it is dangerous—if not fatal—to be second best.”75

Photograph of Margaret Thatcher with Ronald Reagan at Camp David, December 22, 1984. Wikimedia, http://commons.wikimedia.org/wiki/File:Thatcher_Reagan_Camp_David_sofa_1984.jpg.

Margaret Thatcher and Ronald Reagan, pictured here at Camp David in December 1984, led two of the world’s most powerful countries and formed an alliance that benefited both throughout their tenures in office. Wikimedia.

The Reagan administration made Latin America a showcase for its newly assertive policies. Jimmy Carter had sought to promote human rights in the region, but Reagan and his advisors scrapped this approach and instead focused on fighting communism—a term they applied to all Latin American left-wing movements. And so when communists with ties to Cuba overthrew the government of the Caribbean nation of Grenada in October 1983, Reagan dispatched the U.S. Marines to the island. Dubbed Operation Urgent Fury, the Grenada invasion overthrew the leftist government after less than a week of fighting. Despite the relatively minor nature of the mission, its success gave victory-hungry Americans something to cheer about after the military debacles of the previous two decades.

Photograph of U.S. Army Rangers parachuting into Grenada during Operation Urgent Fury, October 25, 1983. Wikimedia, http://commons.wikimedia.org/wiki/File:US_Army_Rangers_parachute_into_Grenada_during_Operation_Urgent_Fury.jpg.

Operation Urgent Fury, the U.S. invasion of Grenada, was broadly supported by the U.S. public. This photograph shows U.S. Army Rangers parachuting into Grenada on October 25, 1983. Wikimedia.

Grenada was the only time Reagan deployed the American military in Latin America, but the United States also influenced the region by supporting right-wing, anticommunist movements there. From 1981 to 1990, the United States gave more than $4 billion to the government of El Salvador in a largely futile effort to defeat the guerrillas of the Farabundo Martí National Liberation Front (FMLN).76 Salvadoran security forces equipped with American weapons committed numerous atrocities, including the slaughter of almost one thousand civilians at the village of El Mozote in December 1981.

The Reagan administration took a more cautious approach in the Middle East, where its policy was determined by a mix of anticommunism and hostility toward the Islamic government of Iran. When Iraq invaded Iran in 1980, the United States supplied Iraqi dictator Saddam Hussein with military intelligence and business credits—even after it became clear that Iraqi forces were using chemical weapons. Reagan’s greatest setback in the Middle East came in 1982, when, shortly after Israel invaded Lebanon, he dispatched Marines to the Lebanese city of Beirut to serve as a peacekeeping force. On October 23, 1983, a suicide bomber killed 241 U.S. service members, most of them Marines, stationed in Beirut. Congressional pressure and anger from the American public forced Reagan to recall the Marines from Lebanon in March 1984. Reagan’s decision demonstrated that, for all his talk of restoring American power, he took a pragmatic approach to foreign policy. He was unwilling to risk another Vietnam by committing American troops to Lebanon.

Though Reagan’s policies toward Central America and the Middle East aroused protest, his policy on nuclear weapons generated the most controversy. Initially Reagan followed the examples of presidents Nixon, Ford, and Carter by pursuing arms limitation talks with the Soviet Union. American officials participated in the Intermediate-Range Nuclear Forces (INF) talks that began in 1981 and the Strategic Arms Reduction Talks (START) in 1982. But the breakdown of these talks in 1983 led Reagan to proceed with plans to place Pershing II nuclear missiles in Western Europe to counter Soviet SS-20 missiles in Eastern Europe. Reagan went a step further in March 1983, when he announced plans for a Strategic Defense Initiative (SDI), a space-based system that could shoot down incoming Soviet missiles. Critics derided the program as a “Star Wars” fantasy, and even Reagan’s advisors harbored doubts. “We don’t have the technology to do this,” secretary of state George Shultz told aides.77 These aggressive policies fed a growing nuclear freeze movement throughout the world. In the United States, organizations like the Committee for a Sane Nuclear Policy organized protests that culminated in a June 1982 rally that drew almost a million people to New York City’s Central Park.

Image showing a series of satellites that would be a part of the proposed space-based Strategic Defense Initiative.

President Reagan proposed new space- and ground-based defense systems to protect the United States from nuclear missiles in his 1983 Strategic Defense Initiative (SDI). Scientists argued that it was technologically unfeasible, and it was lambasted in the media as the “Star Wars” program. Wikimedia.

Protests in the streets were echoed by resistance in Congress. Congressional Democrats opposed Reagan’s policies on the merits; congressional Republicans, though they supported Reagan’s anticommunism, were wary of the administration’s fondness for circumventing Congress. In 1982, the House voted 411–0 to approve the Boland Amendment, which barred the United States from supplying funds to the contras, a right-wing insurgency fighting the leftist Sandinista government in Nicaragua. Reagan, overlooking the contras’ brutal tactics, hailed them as the “moral equivalent of the Founding Fathers.”78 The Reagan administration’s determination to flout the Boland Amendment led to a scandal that almost destroyed Reagan’s presidency. Robert McFarlane, the president’s national security advisor, and Oliver North, a member of the National Security Council staff, raised money to support the contras by selling American missiles to Iran and funneling the money to Nicaragua. When their scheme was revealed in 1986, it was hugely embarrassing for Reagan. The president’s underlings had not only violated the Boland Amendment but had also, by selling arms to Iran, made a mockery of Reagan’s declaration that “America will never make concessions to the terrorists.” But while the Iran-Contra affair generated comparisons to the Watergate scandal, investigators were never able to prove Reagan knew about the operation. Without such a “smoking gun,” talk of impeaching Reagan remained simply talk.

Though the Iran-Contra scandal tarnished the Reagan administration’s image, it did not derail Reagan’s most significant achievement: easing tensions with the Soviet Union. This would have seemed impossible in Reagan’s first term, when the president exchanged harsh words with a rapid succession of Soviet leaders—Leonid Brezhnev, Yuri Andropov, and Konstantin Chernenko. In 1985, however, the aged Chernenko’s death handed leadership of the Soviet Union to Mikhail Gorbachev, who, while a true believer in socialism, nonetheless realized that the Soviet Union desperately needed to reform itself. He instituted a program of perestroika, which referred to the restructuring of the Soviet system, and of glasnost, which meant greater transparency in government. Gorbachev also reached out to Reagan in hopes of negotiating an end to the arms race, which was bankrupting the Soviet Union. Reagan and Gorbachev met in Geneva, Switzerland, in 1985 and Reykjavik, Iceland, in 1986. The summits failed to produce any concrete agreements, but the two leaders developed a relationship unprecedented in the history of U.S.-Soviet relations. This trust made possible the Intermediate-Range Nuclear Forces (INF) Treaty of 1987, which committed both sides to a sharp reduction in their nuclear arsenals.

By the late 1980s the Soviet empire was crumbling. Reagan successfully combined anticommunist rhetoric (such as his 1987 speech at the Berlin Wall, where he declared, “General Secretary Gorbachev, if you seek peace . . . tear down this wall!”) with a willingness to negotiate with Soviet leadership.79 But the most significant causes of collapse lay within the Soviet empire itself. Soviet-allied governments in Eastern Europe tottered under pressure from dissident organizations like Poland’s Solidarity and East Germany’s Neues Forum. Some of these countries, such as Poland, were also pressured from within by the Roman Catholic Church, which had turned toward active anticommunism under Pope John Paul II. When Gorbachev made it clear that he would not send the Soviet military to prop up these regimes, they collapsed one by one in 1989—in Poland, Hungary, Czechoslovakia, Romania, Bulgaria, and East Germany. Within the Soviet Union, Gorbachev’s proposed reforms unraveled the decaying Soviet system rather than bringing stability. By 1991 the Soviet Union itself had vanished, dissolving into a Commonwealth of Independent States.

 

XI. Conclusion

Reagan left office in January 1989 with the Cold War waning and the economy booming. Unemployment had dipped to 5 percent by 1988.80 Between 1981 and 1986, gas prices fell from $1.38 per gallon to 95¢.81 The stock market recovered from the crash, and the Dow Jones Industrial Average—which stood at 950 in 1981—reached 2,239 by the end of Reagan’s second term.82 Yet the economic gains of the decade were unequally distributed. The top fifth of households enjoyed rising incomes while the rest stagnated or declined.83 In constant dollars, annual chief executive officer (CEO) pay rose from $3 million in 1980 to roughly $12 million during Reagan’s last year in the White House.84 Between 1985 and 1989 the number of Americans living in poverty remained steady at thirty-three million.85 Real per capita money income grew at only 2 percent per year, a rate roughly equal to the Carter years.86 The American economy saw more jobs created than lost during the 1980s, but half of the jobs eliminated were in high-paying industries.87 Furthermore, half of the new jobs failed to pay wages above the poverty line. The economic divide was most acute for African Americans and Latinos, one third of whom qualified as poor.

The triumph of the right proved incomplete. The number of government employees actually increased under Reagan. With more than 80 percent of the federal budget committed to defense, entitlement programs, and interest on the national debt, the right’s goal of deficit elimination foundered for lack of substantial areas to cut.88 Between 1980 and 1989 the national debt rose from $914 billion to $2.7 trillion.89 Despite steep tax cuts for corporations and the wealthy, the overall tax burden of the American public remained essentially unchanged. Moreover, so-called regressive taxes on payroll and certain goods actually increased the tax burden on low- and middle-income Americans. Finally, Reagan slowed but failed to vanquish the five-decade legacy of liberal economics. Most New Deal and Great Society programs proved durable. Government still offered its neediest citizens a safety net, though one that continued to shrink.

Yet the discourse of American politics had irrevocably changed. The preeminence of conservative political ideas grew ever more pronounced, even when Democrats controlled Congress or the White House. In response to the conservative mood of the country, the Democratic Party adapted its own message to accommodate many of the Republicans’ Reagan-era ideas and innovations. The United States was on a rightward path.

 

XII. Primary Sources

1. First Inaugural Address of Ronald Reagan (1981)

Ronald Reagan, a former actor, corporate spokesperson, and California governor, won the presidency in 1980 with a potent mix of personal charisma and conservative politics. In his first inaugural address, Reagan famously declared that “government is not the solution to our problem; government is the problem.”

2. Jerry Falwell on the “Homosexual Revolution” (1981)

In this fund-raising letter, Jerry Falwell voices his opposition to homosexuality and asks supporters to help keep his “Old-Time Gospel Hour” television program on the air. Falwell writes that the Old Time Gospel Hour “is one of the few major ministries in America crying out against militant homosexuals.” The letter is printed on what appears to be lined yellow notepad paper.

3. Statements of AIDS Patients (1983)

HIV/AIDS confronted Americans in the 1980s. The disease was first associated with gay men (it was initially called gay-related immune deficiency, or GRID), and AIDS sufferers fought for recognition of the disease’s magnitude, petitioned for research funds, and battled against the popular stigma associated with the disease.

4. Statements from The Parents Music Resource Center (1985)

In 1985, the Senate held hearings on explicit music. The Parents Music Resource Center (PMRC), founded by the wives of prominent politicians in Washington, D.C., publicly denounced lyrics, album covers, and music videos dealing with sex, violence, and drug use. The PMRC pressured music publishers and retailers and singled out artists such as Judas Priest, Prince, AC/DC, Madonna, Black Sabbath, and Cyndi Lauper. The following is extracted from statements by Susan Baker, the wife of then-Treasury Secretary James Baker, and Tipper Gore, wife of Senator and later Vice President Al Gore, in support of warning labels on music packaging.

5. Pat Buchanan on the Culture War (1992)

Pat Buchanan was a conservative journalist who worked in the Nixon and Reagan administrations before running for the Republican presidential nomination in 1992. Although he lost the nomination to George H.W. Bush, he was invited to speak at that year’s Republican National Convention, where he delivered a fiery address criticizing liberals and declaring a “culture war” at the heart of American life.

6. Phyllis Schlafly on Women’s Responsibility for Sexual Harassment (1981)

Conservative activist Phyllis Schlafly fought against feminism and other liberal cultural trends for decades. Perhaps most notably, she led the campaign against the Equal Rights Amendment, turning what had seemed an inevitability into a failed effort. Here, she testified before Congress about what she saw as the largely imagined problem of sexual harassment.

7. Jesse Jackson on the Rainbow Coalition (1984)

After a groundbreaking yet unsuccessful campaign to capture the Democratic Party’s nomination for president, Jesse Jackson delivered the keynote speech at the 1984 Democratic National Convention in San Francisco. He had campaigned on the idea of a “rainbow coalition,” a political movement that drew upon the nation’s racial, religious, and economic diversity. He echoed that theme in his convention speech.

8. Satellites Imagined in Orbit (1981)

While Cold War fears still preyed upon Americans, satellite technology and advancements in telecommunications inspired hopes for an interconnected future. Here, an artist in 1981 depicts various satellites in orbit around the Earth.

9. Ronald Reagan and the American Flag (1982)

President Ronald Reagan, a master of the “photo op,” appears here with a row of American flags at his back at a 1982 rally for Senator David Durenberger in Minneapolis, Minnesota.

 

XIII. Reference Material

This chapter was edited by Richard Anderson and William J. Schultz, with content contributions by Richard Anderson, Laila Ballout, Marsha Barrett, Seth Bartee, Eladio Bobadilla, Kyle Burke, Andrew Chadwick, Aaron Cowan, Jennifer Donnally, Leif Fredrickson, Kori Graves, Karissa A. Haugeberg, Jonathan Hunt, Stephen Koeth, Colin Reynolds, William J. Schultz, and Daniel Spillman.

Recommended citation: Richard Anderson et al., “The Triumph of the Right,” Richard Anderson and William J. Schultz, eds., in The American Yawp, eds. Joseph Locke and Ben Wright (Stanford, CA: Stanford University Press, 2018).

Recommended Reading

  1. Brier, Jennifer. Infectious Ideas: U.S. Political Responses to the AIDS Crisis. Chapel Hill: University of North Carolina Press, 2009.
  2. Carter, Dan T. The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics. Baton Rouge: LSU Press, 1995.
  3. Chappell, Marisa. The War on Welfare: Family, Poverty, and Politics in Modern America. Philadelphia: University of Pennsylvania Press, 2009.
  4. Crespino, Joseph. In Search of Another Country: Mississippi and the Conservative Counterrevolution. Princeton, NJ: Princeton University Press, 2007.
  5. Critchlow, Donald. The Conservative Ascendancy: How the GOP Right Made Political History. Cambridge, MA: Harvard University Press, 2007.
  6. Dallek, Matthew. The Right Moment: Ronald Reagan’s First Victory and the Decisive Turning Point in American Politics. New York: Free Press, 2000.
  7. Hinton, Elizabeth. From the War on Poverty to the War on Crime. Cambridge, MA: Harvard University Press, 2016.
  8. Hunter, James D. Culture Wars: The Struggle to Define America. New York: Basic Books, 1992.
  9. Kalman, Laura. Right Star Rising: A New Politics, 1974–1980. New York: Norton, 2010.
  10. Kruse, Kevin M. White Flight: Atlanta and the Making of Modern Conservatism. Princeton, NJ: Princeton University Press, 2005.
  11. Lassiter, Matthew D. The Silent Majority: Suburban Politics in the Sunbelt South. Princeton, NJ: Princeton University Press, 2006.
  12. MacLean, Nancy. Freedom Is Not Enough: The Opening of the American Workplace. Cambridge, MA: Harvard University Press, 2008.
  13. Moreton, Bethany. To Serve God and Walmart: The Making of Christian Free Enterprise. Cambridge, MA: Harvard University Press, 2009.
  14. Nadasen, Premilla. Welfare Warriors: The Welfare Rights Movement in the United States. New York: Routledge, 2005.
  15. Nickerson, Michelle M. Mothers of Conservatism: Women and the Postwar Right. Princeton, NJ: Princeton University Press, 2012.
  16. Patterson, James T. Restless Giant: The United States from Watergate to Bush v. Gore. New York: Oxford University Press, 2005.
  17. Phillips-Fein, Kim. Invisible Hands: The Businessmen’s Crusade Against the New Deal. New York: Norton, 2010.
  18. Rodgers, Daniel T. Age of Fracture. Cambridge, MA: Belknap Press, 2011.
  19. Schoenwald, Jonathan. A Time for Choosing: The Rise of Modern American Conservatism. New York: Oxford University Press, 2001.
  20. Self, Robert O. All in the Family: The Realignment of American Democracy Since the 1960s. New York: Hill and Wang, 2012.
  21. Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980s. Princeton, NJ: Princeton University Press, 2005.
  22. Westad, Odd Arne. The Global Cold War: Third World Interventions and the Making of Our Times. New York: Cambridge University Press, 2005.
  23. Wilentz, Sean. The Age of Reagan: A History, 1974–2008. New York: HarperCollins, 2008.
  24. Williams, Daniel K. God’s Own Party: The Making of the Christian Right. New York: Oxford University Press, 2007.
  25. Zaretsky, Natasha. No Direction Home: The American Family and the Fear of National Decline. Chapel Hill: University of North Carolina Press, 2007.

 

Notes

  1. Ronald Reagan, quoted in Steve Neal, “Reagan Assails Carter On Auto Layoffs,” Chicago Tribune, October 20, 1980, 5. []
  2. Ronald Reagan, quoted in James T. Patterson, Restless Giant: The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), 152. []
  3. Robert Self, All in the Family: The Realignment of American Democracy Since the 1960s (New York: Hill and Wang, 2012), 369. []
  4. William F. Buckley, Jr., “Our Mission Statement,” National Review, November 19, 1955. http://www.nationalreview.com/article/223549/our-mission-statement-william-f-buckley-jr. []
  5. James Reston, “What Goldwater Lost: Voters Rejected His Candidacy, Conservative Cause and the G.O.P.,” New York Times, November 4, 1964, 23. []
  6. George Wallace, quoted in William Chafe, The Unfinished Journey: America Since World War II (New York: Oxford University Press, 1991), 377. []
  7. James Patterson, Grand Expectations: The United States, 1945–1974 (New York: Oxford University Press, 1996), 735–736. []
  8. Lisa McGirr, Suburban Warriors: The Origins of the New American Right (Princeton, NJ: Princeton University Press, 2001), 227–231. []
  9. Francis Schaeffer, quoted in Whatever Happened to the Human Race? (Episode I), Film, directed by Franky Schaeffer (1979, USA, Franky Schaeffer V Productions). https://www.youtube.com/watch?v=UQAyIwi5l6E. []
  10. Walter Goodman, “Irving Kristol: Patron Saint of the New Right,” New York Times Magazine, December 6, 1981. http://www.nytimes.com/1981/12/06/magazine/irving-kristol-patron-saint-of-the-new-right.html. []
  11. Patterson, Restless Giant, 113. []
  12. Jimmy Carter, 1978 State of the Union Address, January 19, 1978, Jimmy Carter Presidential Library and Museum, http://www.jimmycarterlibrary.gov/documents/speeches/su78jec.phtml. []
  13. Jefferson Cowie, Stayin’ Alive: The 1970s and the Last Days of the Working Class (New York: New Press, 2010), 12. []
  14. George Meany, quoted in ibid., 293. []
  15. Ibid., 268. []
  16. Richard Viguerie, quoted in Joseph Crespino, “Civil Rights and the Religious Right,” in Bruce J. Schulman and Julian Zelizer, eds., Rightward Bound: Making America Conservative in the 1970s (Cambridge, MA: Harvard University Press, 2008), 91. []
  17. Patterson, Restless Giant, 148. []
  18. Judith Stein, Pivotal Decade: How the United States Traded Factories for Finance in the Seventies (New Haven, CT: Yale University Press, 2010), 231. []
  19. Jimmy Carter, quoted in Chafe, Unfinished Journey, 453. []
  20. William Winpisinger, quoted in Cowie, Stayin’ Alive, 261. []
  21. Patterson, Restless Giant, 148. []
  22. Ibid. []
  23. Ibid. []
  24. Jimmy Carter, quoted in “Carter Tells of ‘Adultery in His Heart,’” Los Angeles Times, September 21, 1976, B6. []
  25. Crespino, “Civil Rights and the Religious Right,” 103. []
  26. Patterson, Restless Giant, 163; Jon Nordheimer, “Reagan Is Picking His Florida Spots: His Campaign Aides Aim for New G.O.P. Voters in Strategic Areas,” New York Times, February 5, 1976, 24. []
  27. Sean Wilentz, The Age of Reagan: A History, 1974–2008 (New York: HarperCollins, 2008), 124. []
  28. Meg Jacobs and Julian Zelizer, Conservatives in Power: The Reagan Years, 1981–1989: A Brief History with Documents (Boston: Bedford St. Martin’s, 2011), 2. []
  29. Patterson, Restless Giant, 150. []
  30. Ibid. []
  31. Ronald Reagan, quoted in Jacobs and Zelizer, Conservatives in Power, 20. []
  32. Jack Kemp, quoted in Jacobs and Zelizer, Conservatives in Power, 21. []
  33. Wilentz, Age of Reagan, 121. []
  34. Jacobs and Zelizer, Conservatives in Power, 25–26. []
  35. Ronald Reagan, quoted in Neal, “Reagan Assails Carter,” 5. []
  36. Stein, Pivotal Decade, 267. []
  37. Chafe, Unfinished Journey, 474. []
  38. Patterson, Restless Giant, 159. []
  39. Gil Troy, Morning in America: How Ronald Reagan Invented the 1980s (Princeton, NJ: Princeton University Press, 2005), 67. []
  40. Chafe, Unfinished Journey, 476. []
  41. Ibid., 474. []
  42. Margaret Bush Wilson, quoted in Troy, Morning in America, 93. []
  43. Ibid., 210. []
  44. Ibid., 110. []
  45. Patterson, Restless Giant, 163–164. []
  46. Troy, Morning in America, 208. []
  47. Chafe, Unfinished Journey, 477. []
  48. Patterson, Restless Giant, 162. Many people used the term harsh medicine to describe Volcker’s action on interest rates; see Art Pine, “Letting Harsh Medicine Work,” Washington Post, October 14, 1979, G1. []
  49. Patterson, Restless Giant, 189. []
  50. Ibid. []
  51. Ibid., 190–191. []
  52. Troy, Morning in America, 210; Patterson, Restless Giant, 165. []
  53. Patterson, Restless Giant, 173–174. []
  54. Ibid., 171. []
  55. 1988 Democratic Primaries, CQ Voting and Elections Collection, database accessed June 30, 2015. []
  56. The State of Black America, 1990 (New York: National Urban League, 1990), 34. []
  57. Andrew Hacker, Two Nations: Black and White, Separate, Hostile, Unequal (New York: Scribner, 1992), 102. []
  58. Ibid., 94. []
  59. Troy, Morning in America, 91. []
  60. American Social History Project, Who Built America? Vol. Two: Since 1877 (New York: Bedford St. Martin’s, 2000), 723. []
  61. Patterson, Restless Giant, 172–173. []
  62. Chafe, Unfinished Journey, 487. []
  63. Bruce Springsteen, “My Hometown,” Born in the U.S.A. (New York: Columbia Records, 1984). []
  64. Chafe, Unfinished Journey, 489. []
  65. Patterson, Restless Giant, 175. []
  66. Ruth Rosen, The World Split Open: How the Modern Women’s Movement Changed America (New York: Penguin, 2000), 337. []
  67. Self, All in the Family, 376–377. []
  68. Ibid., 387–388. []
  69. Ibid., 384. []
  70. Ibid., 389. []
  71. Wilentz, Age of Reagan, 163. []
  72. Lou Cannon, “President Calls for ‘Crusade’: Reagan Proposes Plan to Counter Soviet Challenge,” Washington Post, June 9, 1982, A1. []
  73. Conservative newspaper columnist Charles Krauthammer coined the phrase. See Wilentz, Age of Reagan, 157. []
  74. Patterson, Restless Giant, 205. []
  75. Laura Kalman, Right Star Rising: A New Politics, 1974–1980 (New York: Norton, 2010), 166–167. []
  76. Ronald Reagan, “Address to the Nation on United States Policy in Central America,” May 9, 1984. http://www.reagan.utexas.edu/archives/speeches/1984/50984h.htm. []
  77. Frances Fitzgerald, Way out There in the Blue: Reagan, Star Wars, and the End of the Cold War (New York: Simon and Schuster, 2000), 205. []
  78. Ronald Reagan, “Remarks at the Annual Dinner of the Conservative Political Action Conference,” March 1, 1985. http://www.presidency.ucsb.edu/ws/?pid=38274. []
  79. Lou Cannon, “Reagan Challenges Soviets to Dismantle Berlin Wall: Aides Disappointed at Crowd’s Lukewarm Reception,” Washington Post, June 13, 1987, A1. []
  80. Patterson, Restless Giant, 163. []
  81. Ibid. []
  82. Patterson, Restless Giant. []
  83. Jacobs and Zelizer, Conservatives in Power, 32. []
  84. Patterson, Restless Giant, 186. []
  85. Ibid., 164. []
  86. Ibid., 166. []
  87. Chafe, Unfinished Journey, 488. []
  88. Jacobs and Zelizer, Conservatives in Power, 31. []
  89. Patterson, Restless Giant, 158. []

28. The Unraveling

Photograph of an abandoned Packard Automotive Plant in Detroit, Michigan. Via Wikimedia.

Abandoned Packard Automotive Plant in Detroit, Michigan. Wikimedia.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

On December 6, 1969, an estimated three hundred thousand people converged on the Altamont Motor Speedway in Northern California for a massive free concert headlined by the Rolling Stones and featuring some of the era’s other great rock acts.1 Only four months earlier, Woodstock had shown the world the power of peace and love and American youth. Altamont was supposed to be “Woodstock West.”2

But Altamont was a disorganized disaster. Inadequate sanitation, a horrid sound system, and tainted drugs strained concertgoers. To save money, the organizers paid the Hells Angels biker gang $500 in beer to serve as the show’s “security team.” The crowd grew progressively angrier throughout the day. Fights broke out. Tensions rose. The Angels, drunk and high, armed themselves with sawed-off pool cues and indiscriminately beat concertgoers who tried to come on the stage. The Grateful Dead refused to play. Finally, the Stones came on stage.3

The crowd’s anger was palpable. Fights continued near the stage. Mick Jagger stopped in the middle of playing “Sympathy for the Devil” to try to calm the crowd: “Everybody be cool now, c’mon,” he pleaded. Then, a few songs later, in the middle of “Under My Thumb,” eighteen-year-old Meredith Hunter approached the stage and was beaten back. Pissed off and high on methamphetamines, Hunter brandished a pistol, charged again, and was stabbed and killed by an Angel. His lifeless body was stomped into the ground. The Stones just kept playing.4

If the more famous Woodstock music festival captured the idyll of the sixties youth culture, Altamont revealed its dark side. There, drugs, music, and youth were associated not with peace and love but with anger, violence, and death. While many Americans in the 1970s continued to celebrate the political and cultural achievements of the previous decade, a more anxious, conservative mood grew across the nation. For some, the United States had not gone nearly far enough to promote greater social equality; for others, the nation had gone too far, unfairly trampling the rights of one group to promote the selfish needs of another. Onto these brewing dissatisfactions, the 1970s dumped the divisive remnants of a failed war, the country’s greatest political scandal, and an intractable economic crisis. It seemed as if the nation was ready to unravel.

 

II. The Strain of Vietnam

Photograph of Vietnam War protestors in Washington DC. A sign says "Get the Hell out of Vietnam!"

Vietnam War protestors at the March on the Pentagon. Lyndon B. Johnson Library via Wikimedia.

Perhaps no single issue contributed more to public disillusionment than the Vietnam War. As the war deteriorated, the Johnson administration escalated American involvement by deploying hundreds of thousands of troops to prevent the communist takeover of the south. Stalemates, body counts, hazy war aims, and the draft catalyzed an antiwar movement and triggered protests throughout the United States and Europe. With no end in sight, protesters burned draft cards, refused to pay income taxes, occupied government buildings, and delayed trains loaded with war materials. By 1967, antiwar demonstrations were drawing hundreds of thousands. In one protest, hundreds were arrested after surrounding the Pentagon.5

Vietnam was the first “living room war.”6 Television, print media, and open access to the battlefield provided unprecedented coverage of the conflict’s brutality. Americans confronted grisly images of casualties and atrocities. In 1965, CBS Evening News aired a segment in which U.S. Marines burned the South Vietnamese village of Cam Ne with little apparent regard for the lives of its occupants, who had been accused of aiding Vietcong guerrillas. President Johnson berated the head of CBS, yelling over the phone, “Your boys just shat on the American flag.”7

While the U.S. government imposed no formal censorship on the press during Vietnam, the White House and military nevertheless used press briefings and interviews to paint a deceptive image of the war. The United States was winning the war, officials claimed. They cited numbers of enemies killed, villages secured, and South Vietnamese troops trained. However, American journalists in Vietnam quickly realized the hollowness of such claims (the press referred to afternoon press briefings in Saigon as “the Five o’Clock Follies”).8 Editors frequently toned down their reporters’ pessimism, often citing conflicting information received from their own sources, who were typically government officials. But the evidence of a stalemate mounted.

Stories like CBS’s Cam Ne piece exposed a credibility gap, the yawning chasm between the claims of official sources and the increasingly evident reality on the ground in Vietnam.9 Nothing did more to expose this gap than the 1968 Tet Offensive. In January, communist forces attacked more than one hundred American and South Vietnamese sites throughout South Vietnam, including the American embassy in Saigon. While U.S. forces repulsed the attack and inflicted heavy casualties on the Vietcong, Tet demonstrated that despite the repeated claims of administration officials, the enemy could still strike at will anywhere in the country, even after years of war. Subsequent stories and images eroded public trust even further. In 1969, investigative reporter Seymour Hersh revealed that U.S. troops had raped and/or massacred hundreds of civilians in the village of My Lai.10 Three years later, Americans cringed at Nick Ut’s wrenching photograph of a naked Vietnamese child fleeing a South Vietnamese napalm attack. More and more American voices came out against the war.

Reeling from the war’s growing unpopularity, on March 31, 1968, President Johnson announced on national television that he would not seek reelection.11 Eugene McCarthy and Robert F. Kennedy unsuccessfully battled against Johnson’s vice president, Hubert Humphrey, for the Democratic Party nomination (Kennedy was assassinated in June). At the Democratic Party’s national convention in Chicago, local police brutally assaulted protesters on national television.

For many Americans, the violent clashes outside the convention hall reinforced their belief that civil society was unraveling. Republican challenger Richard Nixon played on these fears, running on a platform of “law and order” and a vague plan to end the war. Well aware of domestic pressure to wind down the war, Nixon sought, on the one hand, to appease antiwar sentiment by promising to phase out the draft, train South Vietnamese forces to assume more responsibility for the war effort, and gradually withdraw American troops. Nixon and his advisors called it “Vietnamization.”12 At the same time, Nixon appealed to the so-called silent majority of Americans who still supported the war (and opposed the antiwar movement) by calling for an “honorable” end to U.S. involvement—what he later called “peace with honor.”13 He narrowly edged out Humphrey in the fall’s election.

Public assurances of American withdrawal, however, masked a dramatic escalation of conflict. Looking to incentivize peace talks, Nixon pursued a “madman strategy” of attacking communist supply lines across Laos and Cambodia, hoping to convince the North Vietnamese that he would do anything to stop the war.14 Conducted without public knowledge or congressional approval, the bombings failed to spur the peace process, and talks stalled before the American-imposed November 1969 deadline. News of the attacks renewed antiwar demonstrations. Police and National Guard troops killed six students in separate protests at Jackson State University in Mississippi, and, more famously, Kent State University in Ohio in 1970.

Another three years passed—and another twenty thousand American troops died—before an agreement was reached.15 After Nixon threatened to withdraw all aid and guaranteed to enforce a treaty militarily, the North and South Vietnamese governments signed the Paris Peace Accords in January 1973, marking the official end of U.S. force commitment to the Vietnam War. Peace was tenuous, and when war resumed North Vietnamese troops quickly overwhelmed southern forces. By 1975, despite nearly a decade of direct American military engagement, Vietnam was united under a communist government.

The Vietnam War profoundly influenced domestic politics. Moreover, it poisoned many Americans’ perceptions of their government and its role in the world. And yet, while the antiwar demonstrations attracted considerable media attention and stand today as a hallmark of the sixties counterculture, many Americans nevertheless continued to regard the war as just. Wary of the rapid social changes that reshaped American society in the 1960s and worried that antiwar protests threatened an already tenuous civil order, a growing number of Americans turned to conservatism.

 

III. Racial, Social, and Cultural Anxieties

A photograph of Los Angeles police violently arresting a Black man during the Watts riot on August 12, 1965

Los Angeles police violently arrest a man during the Watts riot on August 12, 1965. Wikimedia.

The civil rights movement looked dramatically different at the end of the 1960s than it had at the beginning. The movement had never been monolithic, but by the 1970s prominent, competing ideologies had fractured it. The rise of the Black Power movement challenged the integrationist dreams of many older activists, as the assassinations of Martin Luther King Jr. and Malcolm X fueled disillusionment and many alienated activists recoiled from liberal reformers.

The political evolution of the civil rights movement was reflected in American culture. The lines of race, class, and gender ruptured American “mass” culture. The monolith of popular American culture, pilloried in the fifties and sixties as exclusively white, male-dominated, conservative, and stifling, finally shattered and Americans retreated into ever smaller, segmented subcultures. Marketers now targeted particular products to ever smaller pieces of the population, including previously neglected groups such as African Americans.16 Subcultures often revolved around certain musical styles, whether pop, disco, hard rock, punk rock, country, or hip-hop. Styles of dress and physical appearance likewise aligned with cultures of choice.

If the popular rock acts of the sixties appealed to a new counterculture, the seventies witnessed the resurgence of cultural forms that appealed to a white working class confronting the social and political upheavals of the 1960s. Country hits such as Merle Haggard’s “Okie from Muskogee” evoked simpler times and places where people “still wave Old Glory down at the courthouse” and they “don’t let our hair grow long and shaggy like the hippies out in San Francisco.” (Haggard would claim the song was satirical, but it nevertheless took hold.) A popular television sitcom, All in the Family, became an unexpected hit among “middle America.” The show’s main character, Archie Bunker, was designed to mock reactionary middle-aged white men, but audiences embraced him. “Isn’t anyone interested in upholding standards?” he lamented in an episode dealing with housing integration. “Our world is coming crumbling down. The coons are coming!”17

Photograph of the interracial cast of the CBS television show All in the Family.

The cast of CBS’s All in the Family in 1973. Wikimedia.

As Bunker knew, African Americans were becoming much more visible in American culture. While Black cultural forms had been prominent throughout American history, they assumed new popular forms in the 1970s. Disco offered a new, optimistic, racially integrated pop music. Musicians such as Aretha Franklin, Andraé Crouch, and “fifth Beatle” Billy Preston brought their background in church performance to their own recordings as well as to the work of white artists like the Rolling Stones, with whom they collaborated. By the end of the decade, African American musical artists had introduced American society to one of the most significant musical innovations in decades: the Sugarhill Gang’s 1979 record, Rapper’s Delight. A lengthy paean to Black machismo, it became the first rap single to reach the Top 40.18

Just as rap represented a hypermasculine Black cultural form, Hollywood popularized its white equivalent. Films such as 1971’s Dirty Harry captured a darker side of the national mood. Clint Eastwood’s titular character exacted violent justice on clear villains, working within the sort of brutally simplistic ethical standard that appealed to Americans anxious about a perceived breakdown in “law and order.” (“The film’s moral position is fascist,” said critic Roger Ebert, who nevertheless gave it three out of four stars.19)

Perhaps the strongest element fueling American anxiety over “law and order” was the increasingly visible violence associated with the civil rights movement. No longer confined to the antiblack terrorism that struck the southern civil rights movement in the 1950s and 1960s, publicly visible violence now broke out among Black Americans in urban riots and among whites protesting new civil rights programs. In the mid-1970s, for instance, protests over the use of busing to overcome residential segregation and truly integrate public schools in Boston washed the city in racial violence. Stanley Forman’s Pulitzer Prize–winning photo, The Soiling of Old Glory, famously captured a Black civil rights attorney, Ted Landsmark, being attacked by a mob of anti-busing protesters, one of whom wielded an American flag as a weapon.20

Urban riots, though, rather than anti-integration violence, tainted many white Americans’ perception of the civil rights movement and urban life in general. Civil unrest broke out across the country, but the riots in Watts/Los Angeles (1965), Newark (1967), and Detroit (1967) were the most shocking. In each, a physical altercation between white police officers and African Americans spiraled into days of chaos and destruction. Tens of thousands participated in urban riots. Many looted and destroyed white-owned businesses. There were dozens of deaths, tens of millions of dollars in property damage, and an exodus of white capital that only further isolated urban poverty.21

In 1967, President Johnson appointed the Kerner Commission to investigate the causes of America’s riots. Its report became an unexpected best seller.22 The commission cited Black frustration with the hopelessness of poverty as the underlying cause of urban unrest. As the head of the Black National Business League testified, “It is to be more than naïve—indeed, it is a little short of sheer madness—for anyone to expect the very poorest of the American poor to remain docile and content in their poverty when television constantly and eternally dangles the opulence of our affluent society before their hungry eyes.”23 A Newark rioter who looted several boxes of shirts and shoes put it more simply: “They tell us about that pie in the sky but that pie in the sky is too damn high.”24 But white conservatives blasted the conclusion that white racism and economic hopelessness were to blame for the violence. African Americans wantonly destroying private property, they said, was not a symptom of America’s intractable racial inequalities but the logical outcome of a liberal culture of permissiveness that tolerated—even encouraged—nihilistic civil disobedience. Many white moderates and liberals, meanwhile, saw the explosive violence as a sign that African Americans had rejected the nonviolence of the earlier civil rights movement.

The unrest of the late sixties did, in fact, reflect a real and growing disillusionment among African Americans with the fate of the civil rights crusade. In the still-moldering ashes of Jim Crow, African Americans in Watts and other communities across the country bore the burdens of lifetimes of legally sanctioned discrimination in housing, employment, and credit. Segregation survived the legal dismantling of Jim Crow. The persistence to the present day of stark racial and economic segregation in nearly all American cities destroyed any simple distinction between southern de jure segregation and nonsouthern de facto segregation. Black neighborhoods became traps that too few could escape.

Political achievements such as the 1964 Civil Rights Act and the 1965 Voting Rights Act were indispensable legal preconditions for social and political equality, but for most, the movement’s long (and now often forgotten) goal of economic justice proved as elusive as ever. “I worked to get these people the right to eat cheeseburgers,” Martin Luther King Jr. supposedly said to Bayard Rustin as they toured the devastation in Watts in 1965, “and now I’ve got to do something . . . to help them get the money to buy it.”25 What good was the right to enter a store without money for purchases?

 

IV. The Crisis of 1968

To Americans in 1968, the country seemed to be unraveling. Martin Luther King Jr. was killed on April 4, 1968. He had been in Memphis to support striking sanitation workers. (Prophetically, he had reflected on his own mortality at a rally the night before. Confident that the civil rights movement would succeed without him, he brushed away fears of death. “I’ve been to the mountaintop,” he said, “and I’ve seen the promised land.”) The greatest leader in the American civil rights movement was lost. Riots broke out in over a hundred American cities. Two months later, on June 6, Robert F. Kennedy was killed campaigning in California. He had represented the last hope of liberal idealists. Anger and disillusionment washed over the country.

As the Vietnam War descended ever deeper into a brutal stalemate and the Tet Offensive exposed the lies of the Johnson administration, students shut down college campuses and government facilities. Protests enveloped the nation.

Protesters converged on the Democratic National Convention in Chicago at the end of August 1968, when a bitterly fractured Democratic Party gathered to assemble a passable platform and nominate a broadly acceptable presidential candidate. Demonstrators planned massive protests in Chicago’s public spaces. Initial protests were peaceful, but the situation quickly soured as police issued stern threats and young people began to taunt and goad officials. Many of the assembled students had experience with protests and sit-ins only in the relative safe havens of college campuses and were unprepared for Mayor Richard Daley’s aggressive and heavily armed police force and National Guard troops in full riot gear. Attendees recounted vicious beatings at the hands of police and Guardsmen, but many young people—convinced that much public sympathy could be won via images of brutality against unarmed protesters—continued stoking the violence. Clashes spilled from the parks into city streets, and eventually the smell of tear gas penetrated the upper floors of the opulent hotels hosting Democratic delegates. Chicago’s brutality overshadowed the convention and culminated in an internationally televised, violent standoff in front of the Hilton Hotel. “The whole world is watching,” the protesters chanted. The Chicago riots encapsulated the growing sense that chaos now governed American life.

For many sixties idealists, the violence of 1968 represented the death of a dream. Disorder and chaos overshadowed hope and progress. And for conservatives, it was confirmation of all of their fears and hesitations. Americans of 1968 turned their backs on hope. They wanted peace. They wanted stability. They wanted “law and order.”

 

V. The Rise and Fall of Richard Nixon

Photograph of Richard Nixon campaigning in Philadelphia during the 1968 presidential election. National Archives via Wikimedia

Richard Nixon campaigns in Philadelphia during the 1968 presidential election. National Archives.

Beleaguered by an unpopular war, inflation, and domestic unrest, President Johnson opted against reelection in March 1968—a stunning move in modern American politics. The forthcoming presidential election was shaped by Vietnam and the aforementioned unrest as much as by the campaigns of Democratic nominee Vice President Hubert Humphrey, Republican Richard Nixon, and third-party challenger George Wallace, the infamous segregationist governor of Alabama. The Democratic Party was in disarray in the spring of 1968, when senators Eugene McCarthy and Robert Kennedy challenged Johnson’s nomination and the president responded with his shocking announcement. Nixon’s candidacy was aided further by the riots that broke out across the country after the assassination of Martin Luther King Jr. and by the shock and dismay that followed the slaying of Robert Kennedy in June. The Republican nominee’s campaign was defined by the shrewd management of his public appearances and a pledge to restore peace and prosperity to what he called “the silent center: the millions of people in the middle of the political spectrum.” This campaign for the “silent majority” was carefully calibrated to attract suburban Americans by linking liberals with violence, protest, and rioting. Many embraced Nixon’s message; a September 1968 poll found that 80 percent of Americans believed public order had “broken down.”

Meanwhile, Humphrey struggled to distance himself from Johnson and maintain working-class support in northern cities, where voters were drawn to Wallace’s appeals for law and order and a rejection of civil rights. The vice president had a final surge in northern cities with the aid of union support, but it was not enough to best Nixon’s campaign. The final tally was close: Nixon won 43.3 percent of the popular vote (31,783,783), narrowly besting Humphrey’s 42.7 percent (31,266,006). Wallace, meanwhile, carried five states in the Deep South, and his 13.5 percent (9,906,473) of the popular vote constituted an impressive showing for a third-party candidate. The Electoral College vote was more decisive for Nixon; he earned 301 electoral votes, while Humphrey and Wallace received only 191 and 46 votes, respectively. Although Republicans won a few seats, Democrats retained control of both the House and Senate and made Nixon the first president in 120 years to enter office with the opposition party controlling both houses.

Once installed in the White House, Richard Nixon focused his energies on American foreign policy, publicly announcing the Nixon Doctrine in 1969. Nixon asserted the supremacy of American democratic capitalism and conceded that the United States would continue supporting its allies financially. But he denounced previous administrations’ willingness to commit American forces to Third World conflicts and warned other states to assume responsibility for their own defense. He was turning America away from the policy of active, anticommunist containment and toward a new strategy of détente.26

Promoted by national security advisor and eventual secretary of state Henry Kissinger, détente sought to stabilize the international system by thawing relations with Cold War rivals and bilaterally freezing arms levels. Taking advantage of tensions between communist China and the Soviet Union, Nixon pursued closer relations with both in order to de-escalate tensions and strengthen the United States’ position relative to each. The strategy seemed to work. In 1972, Nixon became the first American president to visit communist China and the first since Franklin Roosevelt to visit the Soviet Union. Direct diplomacy and cultural exchange programs with both countries grew, culminating in the formal normalization of U.S.-Chinese relations and the signing of two U.S.-Soviet arms agreements: the Anti-Ballistic Missile (ABM) Treaty and the Strategic Arms Limitation Treaty (SALT I). By 1973, after almost thirty years of Cold War tension, peaceful coexistence suddenly seemed possible.

Soon, though, a fragile calm gave way again to Cold War instability. In November 1973, Nixon appeared on television to inform Americans that energy had become “a serious national problem” and that the United States was “heading toward the most acute shortages of energy since World War II.”27 The previous month, Arab members of the Organization of the Petroleum Exporting Countries (OPEC), a cartel of the world’s leading oil producers, had embargoed oil exports to the United States in retaliation for American support of Israel in the Yom Kippur War. The embargo launched the first U.S. energy crisis. By the end of 1973, the global price of oil had quadrupled.28 Drivers waited in line for hours to fill up their cars. Individual gas stations ran out of gas. American motorists worried that oil could run out at any moment. A Pennsylvania man died when his emergency stash of gasoline ignited in his trunk and backseat.29 OPEC rescinded its embargo in 1974, but the economic damage had been done. The crisis extended into the late 1970s.

Like the Vietnam War, the oil crisis showed that small countries could still hurt the United States. At a time of anxiety about the nation’s future, Vietnam and the energy crisis accelerated Americans’ disenchantment with the United States’ role in the world and the efficacy and quality of its leaders. Furthermore, government scandals in the 1970s and early 1980s sapped trust in America’s public institutions. In 1971, the Nixon administration tried unsuccessfully to sue the New York Times and the Washington Post to prevent the publication of the Pentagon Papers, a confidential and damning history of U.S. involvement in Vietnam commissioned by the Defense Department and later leaked. The papers showed how presidents from Truman to Johnson repeatedly deceived the public on the war’s scope and direction.30 Nixon faced a rising tide of congressional opposition to the war, and Congress asserted unprecedented oversight of American war spending. In 1973, it passed the War Powers Resolution, which dramatically reduced the president’s ability to wage war without congressional consent.

However, no scandal did more to unravel public trust than Watergate. On June 17, 1972, five men were arrested inside the offices of the Democratic National Committee (DNC) in the Watergate Complex in downtown Washington, D.C. After being tipped off by a security guard, police found the men attempting to install sophisticated bugging equipment. One of those arrested was a former CIA employee then working as a security aide for the Nixon administration’s Committee to Re-elect the President (lampooned as “CREEP”).

While there is no direct evidence that Nixon ordered the Watergate break-in, he had been recorded in conversation with his chief of staff requesting that the DNC chairman be illegally wiretapped to obtain the names of the committee’s financial supporters. The names could then be given to the Justice Department and the Internal Revenue Service (IRS) to conduct spurious investigations into their personal affairs. Nixon was also recorded ordering his chief of staff to break into the offices of the Brookings Institution and take files relating to the war in Vietnam, saying, “Goddammit, get in and get those files. Blow the safe and get it.”31

Whether or not the president ordered the Watergate break-in, the White House launched a massive cover-up. Administration officials ordered the CIA to halt the FBI investigation and paid hush money to the burglars and White House aides. Nixon distanced himself from the incident publicly and went on to win a landslide election victory in November 1972. But, thanks largely to two persistent journalists at the Washington Post, Bob Woodward and Carl Bernstein, information continued to surface that tied the burglaries ever closer to the CIA, the FBI, and the White House. The Senate held televised hearings. Citing executive privilege, Nixon refused to comply with orders to produce tapes from the White House’s secret recording system. In July 1974, the House Judiciary Committee approved articles of impeachment against the president. Nixon resigned before the full House could vote on them. He became the first and only American president to resign from office.32

Vice President Gerald Ford was sworn in as his successor and a month later granted Nixon a full presidential pardon. Nixon disappeared from public life without ever publicly apologizing, accepting responsibility, or facing charges.

 

VI. Deindustrialization and the Rise of the Sunbelt

Photograph of an abandoned Youngstown factory.

Abandoned Youngstown factory. Stuart Spivack, via Flickr.

American workers had made substantial material gains throughout the 1940s and 1950s. During the so-called Great Compression, Americans of all classes benefited from postwar prosperity. Segregation and discrimination perpetuated racial and gender inequalities, but unemployment continually fell and a highly progressive tax system and powerful unions lowered general income inequality as working-class standards of living nearly doubled between 1947 and 1973.

But general prosperity masked deeper vulnerabilities. Perhaps no case better illustrates the decline of American industry and the creation of an intractable urban crisis than Detroit. Detroit boomed during World War II. When auto manufacturers like Ford and General Motors converted their assembly lines to build machines for the American war effort, observers dubbed the city the “arsenal of democracy.”

After the war, however, automobile firms began closing urban factories and moving to outlying suburbs. Several factors fueled the process. Some cities partly deindustrialized themselves. Municipal governments in San Francisco, St. Louis, and Philadelphia banished light industry to make room for high-rise apartments and office buildings. Mechanization also contributed to the decline of American labor. A manager at a newly automated Ford engine plant in postwar Cleveland captured the interconnections between these concerns when he glibly noted to United Automobile Workers (UAW) president Walter Reuther, “You are going to have trouble collecting union dues from all of these machines.”33 More importantly, however, manufacturing firms sought to reduce labor costs by automating, downsizing, and relocating to areas with “business friendly” policies like low tax rates, anti-union right-to-work laws, and low wages.

Detroit began to bleed industrial jobs. Between 1950 and 1958, Chrysler, which actually kept more jobs in Detroit than either Ford or General Motors, cut its Detroit production workforce in half. In the years between 1953 and 1960, East Detroit lost ten plants and over seventy-one thousand jobs.34 Because Detroit was a single-industry city, decisions made by the Big Three automakers reverberated across the city’s industrial landscape. When auto companies mechanized or moved their operations, ancillary suppliers like machine tool companies were cut out of the supply chain and likewise forced to cut their own workforce. Between 1947 and 1977, the number of manufacturing firms in the city dropped from over three thousand to fewer than two thousand. The labor force was gutted. Manufacturing jobs fell from 338,400 to 153,000 over the same three decades.35

Industrial restructuring decimated all workers, but deindustrialization fell heaviest on the city’s African Americans. Although many middle-class Black Detroiters managed to move out of the city’s ghettos, by 1960, 19.7 percent of Black autoworkers in Detroit were unemployed, compared to just 5.8 percent of whites.36 Overt discrimination in housing and employment had for decades confined African Americans to segregated neighborhoods where they were forced to pay exorbitant rents for slum housing. Subject to residential intimidation and cut off from traditional sources of credit, few could afford to follow industry as it left the city for the suburbs and other parts of the country, especially the South. Segregation and discrimination kept them stuck where there were fewer and fewer jobs. Over time, Detroit devolved into a mass of unemployment, crime, and crippled municipal resources. When riots rocked Detroit in 1967, 25 to 30 percent of Black residents between ages eighteen and twenty-four were unemployed.37

Deindustrialization in Detroit and elsewhere also went hand in hand with the long assault on unionization that began in the aftermath of World War II. Lacking the political support they had enjoyed during the New Deal years, labor organizations such as the CIO and the UAW shifted tactics and accepted labor-management accords in which cooperation, not agitation, was the strategic objective.

This accord held mixed results for workers. On the one hand, management encouraged employee loyalty through privatized welfare systems that offered workers health benefits and pensions. Grievance arbitration and collective bargaining also provided workers official channels through which to criticize policies and push for better conditions. At the same time, bureaucracy and corruption increasingly weighed down unions and alienated them from workers and the general public. Union management came to hold primary influence in what was ostensibly a “pluralistic” power relationship. Workers—though still willing to protest—by necessity pursued a more moderate agenda compared to the union workers of the 1930s and 1940s. Conservative politicians meanwhile seized on popular suspicions of Big Labor, stepping up their criticism of union leadership and positioning themselves as workers’ true ally.

While conservative critiques of union centralization did much to undermine the labor movement, labor’s decline also coincided with ideological changes within American liberalism. Labor and its political concerns undergirded Roosevelt’s New Deal coalition, but by the 1960s, many liberals had forsaken working-class politics. More and more saw poverty as stemming not from structural flaws in the national economy, but from the failure of individuals to take full advantage of the American system. Roosevelt’s New Deal might have attempted to rectify unemployment with government jobs, but Johnson’s Great Society and its imitators funded government-sponsored job training, even in places without available jobs. Union leaders in the 1950s and 1960s typically supported such programs and philosophies.

Internal racism also weakened the labor movement. While national CIO leaders encouraged Black unionization in the 1930s, white workers on the ground often opposed the integrated shop. In Detroit and elsewhere after World War II, white workers participated in “hate strikes” where they walked off the job rather than work with African Americans. White workers similarly opposed residential integration, fearing, among other things, that Black newcomers would lower property values.38

By the mid-1970s, widely shared postwar prosperity leveled off and began to retreat. Growing international competition, technological inefficiency, and declining productivity gains stunted working- and middle-class wages. As the country entered recession, wages decreased and the pay gap between workers and management expanded, reversing three decades of postwar contraction. At the same time, dramatic increases in mass incarceration coincided with the deregulation of prison labor, giving more private companies access to cheaper inmate labor—a process that, whatever its aggregate impact, hurt local communities where free-world jobs moved into prisons. The tax code became less progressive, and labor lost its foothold in the marketplace. Unions represented a third of the workforce in the 1950s, but only one in ten workers belonged to one as of 2015.39

Geography dictated much of labor’s fall, as American firms fled pro-labor states in the 1970s and 1980s. Some went overseas in the wake of new trade treaties to exploit low-wage foreign workers, but others turned to anti-union states in the South and West stretching from Virginia to Texas to Southern California. Factories shuttered in the North and Midwest, leading commentators by the 1980s to dub America’s former industrial heartland the Rust Belt. With it they contrasted the prosperous and dynamic Sun Belt.

In this 1973 photo, two subway riders sit amid a graffitied subway car in New York City. Erik Calonius, “Many Subway Cars in New York City Have Been Spray-Painted by Vandals,” 1973. Via National Archives (8464439).

Urban decay confronted Americans of the 1960s and 1970s. As the economy sagged and deindustrialization hit much of the country, Americans increasingly associated major cities with poverty and crime. In this 1973 photo, two subway riders sit amid a graffitied subway car in New York City. National Archives (8464439).

Coined by journalist Kevin Phillips in 1969, the term Sun Belt refers to the swath of southern and western states that saw unprecedented economic, industrial, and demographic growth after World War II.40 During the New Deal, President Franklin D. Roosevelt declared the American South “the nation’s No. 1 economic problem” and injected massive federal subsidies, investments, and military spending into the region. During the Cold War, Sun Belt politicians lobbied hard for military installations and government contracts for their states.41

Meanwhile, southern states’ hostility toward organized labor beckoned corporate leaders. The Taft-Hartley Act in 1947 facilitated southern states’ frontal assault on unions. Thereafter, cheap, nonunionized labor, low wages, and lax regulations pulled northern industries away from the Rust Belt. Skilled northern workers followed the new jobs southward and westward, lured by cheap housing and a warm climate slowly made more tolerable by modern air conditioning.

The South attracted business, but its people often struggled to share in the profits. Middle-class whites grew prosperous, but often these were recent transplants, not native southerners. As the cotton economy shed farmers and laborers, poor white and Black southerners found themselves mostly excluded from the fruits of the Sun Belt. Public investments were scarce. White southern politicians channeled federal funding away from primary and secondary public education and toward high-tech industry and university-level research. The Sun Belt inverted Rust Belt realities: the South and West had growing numbers of high-skill, high-wage jobs but lacked the social and educational infrastructure needed to train native poor and middle-class workers for those jobs.

Regardless, more jobs meant more people, and by 1972, southern and western Sun Belt states had more electoral votes than the Northeast and Midwest. This gap continues to grow.42 Though the region’s economic and political ascendance was a product of massive federal spending, New Right politicians who constructed an identity centered on “small government” found their most loyal support in the Sun Belt. These business-friendly politicians successfully synthesized conservative Protestantism and free market ideology, creating a potent new political force. Housewives organized reading groups in their homes, and from those reading groups sprouted new organized political activities. Prosperous and mobile, old and new suburbanites gravitated toward an individualistic vision of free enterprise espoused by the Republican Party. Some, especially those most vocally anticommunist, joined groups like the Young Americans for Freedom and the John Birch Society. Less radical suburban voters, however, still gravitated toward the more moderate brand of conservatism promoted by Richard Nixon.

 

VII. The Politics of Love, Sex, and Gender

Photograph of activists opposed to the Equal Rights Amendment standing in front of the White House. Signs say "Stop the Web of Deception," "ERA means AMY registers for the draft at 18," "Maternity Ward Persons (Women is crossed out) Only," and "Rosalyn Carter tear up your own social security card, not mine!"

Demonstrators opposed to the Equal Rights Amendment protest in front of the White House in 1977. Library of Congress.

The sexual revolution continued into the 1970s. Many Americans—feminists, gay men, lesbians, and straight couples—challenged strict gender roles and rejected the rigidity of the nuclear family. Cohabitation without marriage spiked, straight couples married later (if at all), and divorce levels climbed. Sexuality, decoupled from marriage and procreation, became for many not only a source of personal fulfillment but a worthy political cause.

At the turn of the decade, sexuality was considered a private matter yet rigidly regulated by federal, state, and local law. Statutes typically defined legitimate sexual expression within the confines of patriarchal, procreative marriage. Interracial marriage, for instance, was illegal in many states until 1967 and remained largely taboo long after. Same-sex intercourse and cross-dressing were criminalized in most states, and gay men, lesbians, and transgender people were vulnerable to violent police enforcement as well as discrimination in housing and employment.

Two landmark legal rulings in 1973 established the battle lines for the “sex wars” of the 1970s. First, the Supreme Court’s 7–2 ruling in Roe v. Wade (1973) struck down a Texas law that prohibited abortion in all cases when a mother’s life was not in danger. The Court’s decision built on the precedent of a 1965 ruling, Griswold v. Connecticut, which, in striking down a Connecticut law prohibiting married couples from using birth control, recognized a constitutional “right to privacy.”43 In Roe, the Court reasoned that “this right of privacy . . . is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.”44 The Court held that states could not interfere with a woman’s right to an abortion during the first trimester of pregnancy and could only fully prohibit abortions during the third trimester.

Other Supreme Court rulings, however, found that sexual privacy could be sacrificed for the sake of “public” good. Miller v. California (1973), a case over the unsolicited mailing of sexually explicit advertisements for illustrated “adult” books, held that the First Amendment did not protect “obscene” material, defined by the Court as anything with sexual appeal that lacked “serious literary, artistic, political, or scientific value.”45 The ruling expanded states’ abilities to pass laws prohibiting materials like hard-core pornography. However, uneven enforcement allowed pornographic theaters and sex shops to proliferate despite whatever laws states had on the books. Americans debated whether these represented the pinnacle of sexual liberation or, as poet and lesbian feminist Rita Mae Brown suggested, “the ultimate conclusion of sexist logic.”46

Of more tangible concern for most women, though, was the right to equal employment access. Thanks partly to the work of Black feminists like Pauli Murray, Title VII of the 1964 Civil Rights Act banned employment discrimination based on sex, in addition to race, color, religion, and national origin. “If sex is not included,” she argued in a memorandum sent to members of Congress, “the civil rights bill would be including only half of the Negroes.”47 Like most laws, Title VII’s full impact came about slowly, as women across the nation cited it to litigate and pressure employers to offer them equal opportunities compared to those they offered to men. For one, employers in the late sixties and seventies still viewed certain occupations as inherently feminine or masculine. NOW organized airline workers against a major company’s sexist ad campaign that showed female flight attendants wearing buttons that read, “I’m Debbie, Fly Me” or “I’m Cheryl, Fly Me.” Actual female flight attendants were required to wear similar buttons.48 Other women sued to gain access to traditionally male jobs like factory work. Protests prompted the Equal Employment Opportunity Commission (EEOC) to issue a more robust set of protections between 1968 and 1971. Though advancement came haltingly and partially, women used these protections to move eventually into traditional male occupations, politics, and corporate management.

The battle for sexual freedom was not just about the right to get into places, though. It was also about the right to get out of them—specifically, unhappy households and marriages. Between 1959 and 1979, the American divorce rate more than doubled. By the early 1980s, nearly half of all American marriages ended in divorce.49 The stigma attached to divorce evaporated and a growing sense of sexual and personal freedom motivated individuals to leave abusive or unfulfilling marriages. Legal changes also promoted higher divorce rates. Before 1969, most states required one spouse to prove that the other was guilty of a specific offense, such as adultery. The difficulty of getting a divorce under this system encouraged widespread lying in divorce courts. Even couples desiring an amicable split were sometimes forced to claim that one spouse had cheated on the other even if neither (or both) had. Other couples temporarily relocated to states with more lenient divorce laws, such as Nevada.50 Widespread recognition of such practices prompted reforms. In 1969, California adopted the first no-fault divorce law. By the end of the 1970s, almost every state had adopted some form of no-fault divorce. The new laws allowed for divorce on the basis of “irreconcilable differences,” even if only one party felt that he or she could not stay in the marriage.51

Gay men and women, meanwhile, negotiated a harsh world that stigmatized homosexuality as a mental illness or an immoral depravity. Building on postwar efforts by gay rights organizations to bring homosexuality into the mainstream of American culture, young gay activists of the late sixties and seventies began to challenge what they saw as the conservative gradualism of the “homophile” movement. Inspired by the burgeoning radicalism of the Black Power movement, the New Left protests of the Vietnam War, and the counterculture movement for sexual freedom, gay and lesbian activists agitated for a broader set of sexual rights that emphasized an assertive notion of liberation rooted not in mainstream assimilation but in pride of sexual difference.

Perhaps no single incident did more to galvanize gay and lesbian activism than the 1969 uprising at the Stonewall Inn in New York City’s Greenwich Village. Police regularly raided gay bars and hangouts. But when police raided the Stonewall in June 1969, the bar patrons protested and sparked a multiday street battle that catalyzed a national movement for gay liberation. Seemingly overnight, calls for homophile respectability were replaced with chants of “Gay Power!”52

Photograph of the window under the Stonewall sign that reads: “We homosexuals plead with our people to please help maintain peaceful and quiet conduct on the streets of the Village--Mattachine.”

The window under the Stonewall Inn sign reads: “We homosexuals plead with our people to please help maintain peaceful and quiet conduct on the streets of the Village–Mattachine.” Photograph 1969. Wikimedia.

In the following years, gay Americans gained unparalleled access to private and public spaces. Gay activists increasingly attacked cultural norms that demanded they keep their sexuality hidden. Citing statistics that sexual secrecy contributed to stigma and suicide, gay activists urged people to come out and embrace their sexuality. A step towards the normalization of homosexuality occurred in 1973, when the American Psychiatric Association stopped classifying homosexuality as a mental illness. Pressure mounted on politicians. In 1982, Wisconsin became the first state to ban discrimination based on sexual orientation. More than eighty cities and nine states followed suit over the following decade. But progress proceeded unevenly, and gay Americans continued to suffer hardships from a hostile culture.

Like all social movements, the sexual revolution was not free of division. Transgender people were often banned from participating in Gay Pride rallies and lesbian feminist conferences. They, in turn, mobilized to fight the high incidence of rape, abuse, and murder of transgender people. A 1971 newsletter denounced the notion that transgender people were mentally ill and highlighted the particular injustices they faced in and out of the gay community, declaring, “All power to Trans Liberation.”53

As events in the 1970s broadened sexual freedoms and promoted greater gender equality, so too did they generate sustained and organized opposition. Evangelical Christians and other moral conservatives, for instance, mobilized to reverse gay victories. In 1977, activists in Dade County, Florida, used the slogan “Save Our Children” to overturn an ordinance banning discrimination based on sexual orientation.54 A leader of the ascendant religious right, Jerry Falwell, said in 1980, “It is now time to take a stand on certain moral issues. . . . We must stand against the Equal Rights Amendment, the feminist revolution, and the homosexual revolution. We must have a revival in this country.”55

Much to Falwell’s delight, conservative Americans did, in fact, stand against and defeat the Equal Rights Amendment (ERA), their most stunning social victory of the 1970s. Versions of the amendment—which declared, “Equality of rights under the law shall not be denied or abridged by the United States or any state on account of sex”—had been introduced in Congress every year since 1923. It finally passed amid the upheavals of the sixties and seventies and went to the states for ratification in March 1972.56 With high approval ratings, the ERA seemed destined to pass swiftly through state legislatures and become the Twenty-Seventh Amendment. Hawaii ratified the amendment the same day it cleared Congress. Within a year, thirty states had done so. But then the amendment stalled. It took years for more states to pass it. In 1977, Indiana became the thirty-fifth and final state to ratify.57

By 1977, anti-ERA forces had successfully turned the political tide against the amendment. At a time when many women shared Betty Friedan’s frustration that society seemed to confine women to the role of homemaker, Phyllis Schlafly’s STOP ERA organization (“Stop Taking Our Privileges”) trumpeted the value and advantages of being a homemaker and mother.58 Marshaling the support of evangelical Christians and other religious conservatives, Schlafly worked tirelessly to stifle the ERA. She lobbied legislators and organized counter-rallies to ensure that Americans heard “from the millions of happily married women who believe in the laws which protect the family and require the husband to support his wife and children.”59 The amendment needed only three more states for ratification. It never got them. In 1982, the time limit for ratification expired—and along with it, the amendment.60

The failed battle for the ERA uncovered the limits of the feminist crusade. And it illustrated the women’s movement’s inherent incapacity to represent fully the views of 50 percent of the country’s population, a population riven by class differences, racial disparities, and cultural and religious divisions.

 

VIII. The Misery Index

Photograph of Jimmy Carter-supporters with pumpkins carved in the likeness of Carter.

Supporters rally with pumpkins carved in the likeness of President Jimmy Carter in Polk County, Florida, in October 1980. State Library and Archives of Florida via Flickr.

Although Nixon eluded prosecution, Watergate continued to weigh on voters’ minds. It netted big congressional gains for Democrats in the 1974 midterm elections, and Ford’s pardon damaged his chances in 1976. Former Georgia governor Jimmy Carter, a peanut farmer and navy veteran of the nuclear submarine program who represented the rising generation of younger, racially liberal “New South” Democrats, captured the Democratic nomination. Carter did not identify with either his party’s liberal or conservative wing; his appeal was more personal and moral than political. He ran on no great political issues, letting his background as a hardworking, honest, southern Baptist navy man ingratiate him to voters around the country, especially in his native South, where support for Democrats had wavered in the wake of the civil rights movement. Carter’s wholesome image was painted in direct contrast to the memory of Nixon and, by association, to the man who had pardoned him. Carter sealed his party’s nomination in June and won a close victory in November.61

When Carter took the oath of office on January 20, 1977, however, he became president of a nation in the midst of economic turmoil. Oil shocks, inflation, stagnant growth, unemployment, and sinking wages weighed down the nation’s economy. Some of these problems were traceable to the end of World War II when American leaders erected a complex system of trade policies to help rebuild the shattered economies of Western Europe and Asia. After the war, American diplomats and politicians used trade relationships to win influence and allies around the globe. They saw the economic health of their allies, particularly West Germany and Japan, as a crucial bulwark against the expansion of communism. Americans encouraged these nations to develop vibrant export-oriented economies and tolerated restrictions on U.S. imports.

Photograph of cars in long lines waiting to buy gas.

The 1979 energy crisis panicked consumers who remembered the 1973 oil shortage, prompting many Americans to buy gasoline in huge quantities. Library of Congress.

This came at great cost to the United States. As the American economy stalled, Japan and West Germany soared and became major forces in the global production of autos, steel, machine tools, and electrical products. By 1970, the United States began to run massive trade deficits. The value of American exports dropped and the prices of its imports skyrocketed. Coupled with the huge cost of the Vietnam War and the rise of oil-producing states in the Middle East, growing trade deficits sapped the United States’ dominant position in the global economy.

American leaders didn’t know how to respond. After a series of negotiations with leaders from France, Great Britain, West Germany, and Japan in 1970 and 1971, the Nixon administration allowed these rising industrial nations to continue flouting the principles of free trade. They maintained trade barriers that sheltered their domestic markets from foreign competition while at the same time exporting growing amounts of goods to the United States. By 1974, in response to U.S. complaints and their own domestic economic problems, many of these industrial nations overhauled their protectionist practices but developed even subtler methods (such as state subsidies for key industries) to nurture their economies.

The result was that Carter, like Ford before him, presided over a hitherto unimagined economic dilemma: the simultaneous onset of inflation and economic stagnation, a combination popularized as “stagflation.”62 Neither Ford nor Carter had the means or ambition to protect American jobs and goods from foreign competition. As firms and financial institutions invested, sold goods, and manufactured in rising economies like Mexico, Taiwan, Japan, and Brazil, American politicians allowed them to sell their often cheaper products in the United States.

As American officials institutionalized this new unfettered global trade, many American manufacturers perceived only one viable path to sustained profitability: moving overseas, often by establishing foreign subsidiaries or partnering with foreign firms. Investment capital, especially in manufacturing, fled the United States looking for overseas investments and hastened the decline in the productivity of American industry.

During the 1976 presidential campaign, Carter had touted the “misery index,” the simple addition of the unemployment rate to the inflation rate, as an indictment of Gerald Ford and Republican rule. But Carter failed to slow the unraveling of the American economy, and the stubborn and confounding rise of both unemployment and inflation damaged his presidency.
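
To illustrate the arithmetic with round, hypothetical figures: an unemployment rate of 7 percent added to an inflation rate of 12 percent would yield a misery index of 19.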

Just as Carter failed to offer or enact policies to stem the unraveling of the American economy, his idealistic vision of human rights–based foreign policy crumbled. He had not made human rights a central theme in his campaign, but in May 1977 he declared his wish to move away from a foreign policy in which “inordinate fear of communism” caused American leaders to “adopt the flawed and erroneous principles and tactics of our adversaries.” Carter proposed instead “a policy based on constant decency in its values and on optimism in our historical vision.”63

Carter’s human rights policy achieved real victories: the United States either reduced or eliminated aid to American-supported right-wing dictators guilty of extreme human rights abuses in places like South Korea, Argentina, and the Philippines. In September 1977, Carter negotiated the return of the Panama Canal to Panama, which cost him enormous political capital in the United States.64 A year later, in September 1978, Carter negotiated a peace treaty between Israeli prime minister Menachem Begin and Egyptian president Anwar Sadat. The Camp David Accords—named for the president’s rural Maryland retreat, where thirteen days of secret negotiations were held—represented the first time an Arab state had recognized Israel, and the first time Israel promised the Palestinians self-government. The accords had limits, for both Israel and the Palestinians, but they represented a major foreign policy coup for Carter.65

And yet Carter’s dreams of a human rights–based foreign policy crumbled before the Cold War and the realities of American politics. The United States continued to provide military and financial support for dictatorial regimes vital to American interests, such as the oil-rich state of Iran. When President Carter and First Lady Rosalynn Carter visited Tehran in January 1978, the president praised the nation’s dictatorial ruler, Shah Mohammad Reza Pahlavi, and remarked on the “respect and the admiration and love” Iranians had for their leader.66 The shah was deposed in January 1979, and when the United States admitted him for medical treatment later that year, revolutionaries stormed the American embassy in Tehran in November and took fifty-two Americans hostage. Americans not only experienced another oil crisis as Iran’s oil fields shut down; for 444 days, they also watched news programs that reminded them of the hostages and of America’s new global impotence. Carter could not win their release. A failed rescue mission ended only in the deaths of eight American servicemen. Already beset by a punishing economy, Carter watched his popularity plummet.

Carter’s efforts to ease the Cold War by achieving a new nuclear arms control agreement disintegrated under domestic opposition from conservative Cold War hawks such as Ronald Reagan, who accused Carter of weakness. A month after the Soviets invaded Afghanistan in December 1979, a beleaguered Carter committed the United States to defending its “interests” in the Middle East against Soviet incursions, declaring that “an assault [would] be repelled by any means necessary, including military force.” The Carter Doctrine not only signaled Carter’s ambivalent commitment to de-escalation and human rights, it testified to his increasingly desperate presidency.67

The collapse of American manufacturing, the stubborn rise of inflation, the sudden impotence of American foreign policy, and a culture ever more divided: the sense of unraveling pervaded the nation. “I want to talk to you right now about a fundamental threat to American democracy,” Jimmy Carter said in a televised address on July 15, 1979. “The threat is nearly invisible in ordinary ways. It is a crisis of confidence. It is a crisis that strikes at the very heart and soul and spirit of our national will.”

 

IX. Conclusion

Though American politics moved right after Lyndon Johnson’s administration, Nixon’s 1968 election was no conservative counterrevolution. American politics and society remained in flux throughout the 1970s. American politicians on the right and the left pursued relatively moderate courses compared to those in the preceding and succeeding decades. But a groundswell of anxieties and angers brewed beneath the surface. The world’s greatest military power had floundered in Vietnam and an American president stood flustered by Middle Eastern revolutionaries. The cultural clashes from the sixties persisted and accelerated. While cities burned, a more liberal sexuality permeated American culture. The economy crashed, leaving America’s cities prone before poverty and crime and its working class gutted by deindustrialization and globalization. American weakness was everywhere. And so, by 1980, many Americans—especially white middle- and upper-class Americans—felt a nostalgic desire for simpler times and simpler answers to the frustratingly complex geopolitical, social, and economic problems crippling the nation. The appeal of Carter’s soft drawl and Christian humility had signaled this yearning, but his utter failure to stop the unraveling of American power and confidence opened the way for a new movement, one with new personalities and a new conservatism—one that promised to undo the damage and restore the United States to its own nostalgic image of itself.

 

X. Primary Sources

1. Report of the National Advisory Commission on Civil Disorders (1968)

Riots rocked American cities in the mid- and late 1960s. Hundreds died, thousands were injured, and thousands of buildings were destroyed. Many communities never recovered. In 1967, devastating riots, particularly in Detroit, Michigan, and Newark, New Jersey, captivated national television audiences. President Lyndon Johnson appointed an eleven-member commission, chaired by Illinois governor Otto Kerner, to explain the origins of the riots and recommend policies to prevent them in the future.

2. Statement by John Kerry of Vietnam Veterans Against the War (1971)

On April 23, 1971, a young Vietnam veteran named John Kerry spoke on behalf of the Vietnam Veterans Against the War before the Senate Committee on Foreign Relations. Kerry, later a Massachusetts senator and the 2004 Democratic presidential nominee, articulated a growing disenchantment with the Vietnam War and delivered a blistering indictment of the reasoning behind its prosecution.

3. Nixon Announcement of China Visit (1971)

Richard Nixon, who built his political career on anticommunism, worked from the first day of his presidency to normalize relations with the communist People’s Republic of China. In 1971, he announced that he would make an unprecedented visit there to advance American-Chinese relations. Here, he explains his intentions.

4. Barbara Jordan, 1976 Democratic National Convention Keynote Address (1976)

On July 12, 1976, Texas Congresswoman Barbara Jordan delivered the keynote address at the Democratic National Convention. As Americans sensed a fracturing of American life in the 1970s, Jordan called for Americans to commit themselves to a “national community” and the “common good.” Jordan began by noting she was the first Black woman to ever deliver a keynote address at a major party convention and that such a thing would have been almost impossible even a decade earlier.

5. Jimmy Carter, “Crisis of Confidence” (1979)

On July 15, 1979, amid stagnant economic growth, high inflation, and an energy crisis, Jimmy Carter delivered a televised address to the American people. In it, Carter singled out a pervasive “crisis of confidence” preventing the American people from moving the country forward. A year later, Ronald Reagan would frame his optimistic political campaign in stark contrast to the tone of Carter’s speech, which would be remembered, especially by critics, as the “malaise speech.”

6. Gloria Steinem on Equal Rights for Women (1970)

The first congressional hearing on the Equal Rights Amendment (ERA) was held in 1923, but the push for the amendment stalled until the 1960s, when a revived women’s movement thrust it again into the national consciousness. Congress passed the ERA and sent it to the states for ratification on March 22, 1972. But the amendment failed, stalling three states short of the three-fourths of states required for ratification. Despite popular support for the amendment, activists such as Phyllis Schlafly outmaneuvered its supporters. In 1970, author Gloria Steinem argued that such opposition was rooted in outmoded ideas about gender.

7. Native Americans Occupy Alcatraz (1969)

In November 1969, Native American activists occupied Alcatraz Island and held it for nineteen months to bring attention to past injustices and contemporary issues confronting Native Americans, as stated in this proclamation, drafted largely by Adam Fortunate Eagle of the Ojibwa Nation.

8. New York City Subway (1973)

“Urban Decay” confronted Americans of the 1960s and 1970s. As the economy sagged and deindustrialization hit much of the country, many Americans associated major cities with poverty and crime. In this 1973 photo, two subway riders sit amid a graffitied subway car in New York City.

9. “Stop ERA” Protest (1977)

In the 1970s, conservative Americans defeated the Equal Rights Amendment (ERA). With high approval ratings, the ERA—which declared, “Equality of rights under the law shall not be denied or abridged by the United States or any state on account of sex”—seemed destined to pass swiftly through state legislatures and become the Twenty-Seventh Amendment, but conservative opposition stopped the amendment just short of ratification.

 

XI. Reference Material

This chapter was edited by Edwin Breeden, with content contributions by Seth Anziska, Jeremiah Bauer, Edwin Breeden, Kyle Burke, Brent Cebul, Alexandra Evans, Sean Fear, Anne Gray Fischer, Destin Jenkins, Matthew Kahn, Suzanne Kahn, Brooke Lamperd, Katherine McGarr, Matthew Pressman, Adam Parsons, Emily Prifogle, John Rosenberg, Brandy Thomas Wells, and Naomi R. Williams.

Recommended citation: Seth Anziska et al., “The Unraveling,” Edwin Breeden, ed., in The American Yawp, eds. Joseph Locke and Ben Wright (Stanford, CA: Stanford University Press, 2018).

Recommended Reading

  • Carter, Dan T. The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics. Baton Rouge: LSU Press, 1995.
  • Cowie, Jefferson R. Stayin’ Alive: The 1970s and the Last Days of the Working Class. New York: New Press, 2010.
  • Evans, Sara. Personal Politics: The Roots of Women’s Liberation in the Civil Rights Movement and the New Left. New York: Vintage Books, 1979.
  • Flamm, Michael W. Law and Order: Street Crime, Civil Unrest, and the Crisis of Liberalism in the 1960s. New York: Columbia University Press, 2005.
  • Formisano, Ronald P. Boston Against Busing: Race, Class, and Ethnicity in the 1960s and 1970s. Chapel Hill: University of North Carolina Press, 1991.
  • Greenberg, David. Nixon’s Shadow: The History of an Image. New York: Norton, 2004.
  • Harvey, David. The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Cambridge, UK: Blackwell, 1989.
  • Jenkins, Philip. Decade of Nightmares: The End of the Sixties and the Making of Eighties America. New York: Oxford University Press, 2008.
  • Kalman, Laura. Right Star Rising: A New Politics, 1974–1980. New York: Norton, 2010.
  • Lassiter, Matthew D. The Silent Majority: Suburban Politics in the Sunbelt South. Princeton, NJ: Princeton University Press, 2006.
  • MacLean, Nancy. Freedom Is Not Enough: The Opening of the American Workplace. Cambridge, MA: Harvard University Press, 2008.
  • Marable, Manning. Malcolm X: A Life of Reinvention. New York: Viking, 2011.
  • Matusow, Allen J. The Unraveling of America: A History of Liberalism in the 1960s. New York: Harper and Row, 1984.
  • Murch, Donna Jean. Living for the City: Migration, Education, and the Rise of the Black Panther Party in Oakland, California. Chapel Hill: University of North Carolina Press, 2010.
  • Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  • Perlstein, Rick. Nixonland: The Rise of a President and the Fracturing of America. New York: Scribner, 2008.
  • Phelps, Wesley. A People’s War on Poverty: Urban Politics, Grassroots Activists, and the Struggle for Democracy in Houston, 1964–1976. Athens: University of Georgia Press, 2014.
  • Rodgers, Daniel T. Age of Fracture. Cambridge, MA: Belknap Press, 2011.
  • Roth, Benita. Separate Roads to Feminism: Black, Chicana, and White Feminist Movements in America’s Second Wave. New York: Cambridge University Press, 2004.
  • Sargent, Daniel J. A Superpower Transformed: The Remaking of American Foreign Relations in the 1970s. Oxford, UK: Oxford University Press, 2015.
  • Schulman, Bruce J. The Seventies: The Great Shift in American Culture, Society, and Politics. New York: Free Press, 2001.
  • Springer, Kimberly. Living for the Revolution: Black Feminist Organizations, 1968–1980. Durham, NC: Duke University Press, 2005.
  • Stein, Judith. Pivotal Decade: How the United States Traded Factories for Finance in the 1970s. New Haven, CT: Yale University Press, 2010.
  • Thompson, Heather Ann. Blood in the Water: The Attica Prison Uprising of 1971 and Its Legacy. New York: Pantheon Books, 2016.
  • Zaretsky, Natasha. No Direction Home: The American Family and the Fear of National Decline. Chapel Hill: University of North Carolina Press, 2007.

 

Notes

  1. Acts included Santana; Jefferson Airplane; Crosby, Stills, Nash & Young; and the Flying Burrito Brothers. The Grateful Dead were scheduled but refused to play.
  2. Bruce J. Schulman, The Seventies: The Great Shift in American Culture, Society, and Politics (Cambridge, MA: Da Capo Press, 2002), 18.
  3. Allen J. Matusow, The Unraveling of America: A History of Liberalism in the 1960s, updated ed. (Athens: University of Georgia Press, 2009), 304–305.
  4. Owen Gleiberman, “Altamont at 45: The Most Dangerous Rock Concert,” BBC, December 5, 2014, http://www.bbc.com/culture/story/20141205-did-altamont-end-the-60s.
  5. Jeff Leen, “The Vietnam Protests: When Worlds Collided,” Washington Post, September 27, 1999, http://www.washingtonpost.com/wp-srv/local/2000/vietnam092799.htm.
  6. Michael J. Arlen, Living-Room War (New York: Viking, 1969).
  7. Tom Engelhardt, The End of Victory Culture: Cold War America and the Disillusioning of a Generation, rev. ed. (Amherst: University of Massachusetts Press, 2007), 190.
  8. Mitchel P. Roth, Historical Dictionary of War Journalism (Westport, CT: Greenwood, 1997), 105.
  9. David L. Anderson, The Columbia Guide to the Vietnam War (New York: Columbia University Press, 2002), 109.
  10. Guenter Lewy, America in Vietnam (New York: Oxford University Press, 1978), 325–326.
  11. Lyndon B. Johnson, “Address to the Nation Announcing Steps to Limit the War in Vietnam and Reporting His Decision Not to Seek Reelection,” March 31, 1968, Lyndon Baines Johnson Library, http://www.lbjlib.utexas.edu/johnson/archives.hom/speeches.hom/680331.asp.
  12. Lewy, America in Vietnam, 164–169; Henry Kissinger, Ending the Vietnam War: A History of America’s Involvement in and Extrication from the Vietnam War (New York: Simon and Schuster, 2003), 81–82.
  13. Richard Nixon, “Address to the Nation Announcing Conclusion of an Agreement on Ending the War and Restoring Peace in Vietnam,” January 23, 1973, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=3808.
  14. Richard Nixon, quoted in Walter Isaacson, Kissinger: A Biography (New York: Simon and Schuster, 2005), 163–164.
  15. Jussi Hanhimäki, The Flawed Architect: Henry Kissinger and American Foreign Policy (New York: Oxford University Press, 2004), 257.
  16. Cohen, Consumer’s Republic.
  17. Quotes from “Lionel Moves into the Neighborhood,” All in the Family, season 1, episode 8 (1971), http://www.tvrage.com/all-in-the-family/episodes/5587.
  18. Jim Dawson and Steve Propes, 45 RPM: The History, Heroes and Villains of a Pop Music Revolution (San Francisco: Backbeat Books, 2003), 120.
  19. Roger Ebert, “Review of Dirty Harry,” January 1, 1971, http://www.rogerebert.com/reviews/dirty-harry-1971.
  20. Ronald P. Formisano, Boston Against Busing: Race, Class, and Ethnicity in the 1960s and 1970s (Chapel Hill: University of North Carolina Press, 1991).
  21. Michael W. Flamm, Law and Order: Street Crime, Civil Unrest, and the Crisis of Liberalism in the 1960s (New York: Columbia University Press, 2005), 58–59, 85–93.
  22. Thomas J. Sugrue, Sweet Land of Liberty: The Forgotten Struggle for Civil Rights in the North (New York: Random House, 2008), 348.
  23. Cohen, Consumer’s Republic, 373.
  24. Ibid., 376.
  25. Martin Luther King, quoted in David J. Garrow, Bearing the Cross: Martin Luther King Jr. and the Southern Christian Leadership Conference (New York: Morrow, 1986), 439.
  26. Richard M. Nixon, “Address to the Nation on the War in Vietnam,” November 3, 1969, American Experience, http://www.pbs.org/wgbh/americanexperience/features/primary-resources/nixon-vietnam/.
  27. Richard Nixon, “Address to the Nation about Policies to Deal with Energy Shortages,” November 7, 1973, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=4034.
  28. Office of the Historian, “Oil Embargo, 1973–1974,” U.S. Department of State, https://history.state.gov/milestones/1969-1976/oil-embargo.
  29. “Gas Explodes in Man’s Car,” Uniontown (PA) Morning Herald, December 5, 1973, p. 12.
  30. Larry H. Addington, America’s War in Vietnam: A Short Narrative History (Bloomington: Indiana University Press, 2000), 140–141.
  31. Schulman, Seventies, 44.
  32. “Executive Privilege,” in John J. Patrick, Richard M. Pious, and Donald A. Ritchie, The Oxford Guide to the United States Government (New York: Oxford University Press, 2001), 227; Schulman, Seventies, 44–48.
  33. Sugrue, Origins of the Urban Crisis, 132.
  34. Ibid., 136, 149.
  35. Ibid., 144.
  36. Ibid., 144.
  37. Ibid., 261.
  38. Jefferson Cowie and Nick Salvatore, “The Long Exception: Rethinking the Place of the New Deal in American History,” International Labor and Working-Class History 74 (Fall 2008): 1–32, esp. 9.
  39. Quoctrung Bui, “50 Years of Shrinking Union Membership in One Map,” NPR, February 23, 2015, http://www.npr.org/sections/money/2015/02/23/385843576/50-years-of-shrinking-union-membership-in-one-map.
  40. Kevin P. Phillips, The Emerging Republican Majority (New Rochelle, NY: Arlington House, 1969), 17.
  41. Bruce J. Schulman, From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South, 1938–1980, 3rd printing (Durham, NC: Duke University Press, 2007), 3.
  42. William H. Frey, “The Electoral College Moves to the Sun Belt,” research brief, Brookings Institution, May 2005.
  43. Griswold v. Connecticut, 381 U.S. 479 (1965).
  44. Roe v. Wade, 410 U.S. 113 (1973).
  45. Miller v. California, 413 U.S. 15 (1973).
  46. Rita Mae Brown, quoted in David Allyn, Make Love, Not War—The Sexual Revolution: An Unfettered History (New York: Routledge, 2001), 239.
  47. Nancy MacLean, Freedom Is Not Enough: The Opening of the American Workplace (Cambridge, MA: Harvard University Press, 2008), 121.
  48. Ibid., 129.
  49. Arland Thornton, William G. Axinn, and Yu Xie, Marriage and Cohabitation (Chicago: University of Chicago Press, 2007), 57.
  50. Glenda Riley, Divorce: An American Tradition (New York: Oxford University Press, 1991), 135–139.
  51. Ibid., 161–165; Mary Ann Glendon, The Transformation of Family Law: State, Law, and Family in the United States and Western Europe (Chicago: University of Chicago Press, 1989), 188–189.
  52. David Carter, Stonewall: The Riots That Sparked the Gay Revolution (New York: St. Martin’s Press, 2004), 147.
  53. Trans Liberation Newsletter, quoted in Susan Stryker, Transgender History (Berkeley, CA: Seal Press, 2008), 96–97.
  54. William N. Eskridge, Dishonorable Passions: Sodomy Laws in America, 1861–2003 (New York: Viking, 2008), 209–212.
  55. Jerry Falwell, Listen, America! (Garden City, NY: Doubleday, 1980), 19.
  56. Donald Critchlow, Phyllis Schlafly and Grassroots Conservatism: A Woman’s Crusade (Princeton, NJ: Princeton University Press, 2005), 213–216.
  57. Ibid., 218–219; Joel Krieger, ed., The Oxford Companion to the Politics of the World, 2nd ed. (New York: Oxford University Press, 2001), 256.
  58. Critchlow, Phyllis Schlafly and Grassroots Conservatism, 219.
  59. Phyllis Schlafly, quoted in Christine Stansell, The Feminist Promise: 1792 to the Present (New York: Modern Library, 2010), 340.
  60. Critchlow, Phyllis Schlafly and Grassroots Conservatism, 281.
  61. Sean Wilentz, The Age of Reagan: A History, 1974–2008 (New York: HarperCollins, 2008), 69–72.
  62. Ibid., 75.
  63. Jimmy Carter, “University of Notre Dame—Address at the Commencement Exercises at the University,” May 22, 1977, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=7552.
  64. Wilentz, Age of Reagan, 100–102.
  65. Harvey Sicherman, Palestinian Autonomy, Self-Government, and Peace (Boulder, CO: Westview Press, 1993), 35.
  66. Jimmy Carter, “Tehran, Iran Toasts of the President and the Shah at a State Dinner,” December 31, 1977, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=7080.
  67. Jimmy Carter, “The State of the Union Address,” January 23, 1980, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=33079.

27. The Sixties

Demonstrators march from Selma to Montgomery, Alabama, in 1965 to champion African American civil rights. Library of Congress.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

Perhaps no decade is so immortalized in American memory as the 1960s. Couched in the colorful rhetoric of peace and love, complemented by stirring images of the civil rights movement, and fondly remembered for its music, art, and activism, the decade brought many people hope for a more inclusive, forward-thinking nation. But the decade was also plagued by strife, tragedy, and chaos. It was the decade of the Vietnam War, inner-city riots, and assassinations that seemed to symbolize the crushing of a new generation’s idealism. A decade of struggle and disillusionment rocked by social, cultural, and political upheaval, the 1960s are remembered because so much changed, and because so much did not.

II. Kennedy and Cuba

The decade’s political landscape began with a watershed presidential election. Americans were captivated by the 1960 race between Republican vice president Richard Nixon and Democratic senator John F. Kennedy, two candidates who pledged to move the nation forward and invigorate an economy experiencing the worst recession since the Great Depression. Kennedy promised to use federal programs to strengthen the economy and address pockets of longstanding poverty, while Nixon called for a reliance on private enterprise and reduction of government spending. Both candidates faced criticism as well; Nixon had to defend Dwight Eisenhower’s domestic policies, while Kennedy, who was attempting to become the first Catholic president, had to counteract questions about his faith and convince voters that he was experienced enough to lead.

One of the most notable events of the Nixon-Kennedy presidential campaign was their televised debate in September, the first of its kind between major presidential candidates. The debate focused on domestic policy and provided Kennedy with an important moment to present himself as a composed, knowledgeable statesman. In contrast, Nixon, an experienced debater who faced higher expectations, looked sweaty and defensive. Radio listeners famously thought the two men performed equally well, but the TV audience was much more impressed by Kennedy, giving him an advantage in subsequent debates. Ultimately, the election was extraordinarily close; in the largest voter turnout in American history up to that point, Kennedy bested Nixon by less than one percentage point (34,227,096 to 34,107,646 votes). Although Kennedy’s lead in electoral votes was more comfortable at 303 to 219, the Democratic Party’s victory did not carry over to Congress, where Democrats lost a few seats in both houses. As a result, Kennedy entered office in 1961 without the mandate necessary to achieve the ambitious agenda he would refer to as the New Frontier.

Kennedy also faced foreign policy challenges. The United States entered the 1960s unaccustomed to stark foreign policy failures, having emerged from World War II as a global superpower before waging a Cold War against the Soviet Union in the 1950s. In the new decade, unsuccessful conflicts in Cuba and Vietnam would yield embarrassment, fear, and tragedy, stunning a nation that expected triumph and altering the way many thought of America’s role in international affairs.

On January 8, 1959, Fidel Castro and his revolutionary army initiated a new era of Cuban history. Having ousted the corrupt Cuban president Fulgencio Batista, who had fled Havana on New Year’s Eve, Castro and his rebel forces made their way triumphantly through the capital city’s streets. The United States, which had long propped up Batista’s corrupt regime, had withdrawn support and, initially, expressed sympathy for Castro’s new government, which was immediately granted diplomatic recognition. But President Dwight Eisenhower and members of his administration were wary. The new Cuban government soon instituted leftist economic policies centered on agrarian reform, land redistribution, and the nationalization of private enterprises. Cuba’s wealthy and middle-class citizens fled the island in droves. Many settled in Miami, Florida, and other American cities.

The relationship between Cuba and the United States deteriorated rapidly. On October 19, 1960, the United States instituted a near-total trade embargo to economically isolate the Cuban regime, and in January 1961, the two nations broke off formal diplomatic relations. The Central Intelligence Agency (CIA), acting under the mistaken belief that the Castro government lacked popular support and that Cuban citizens would revolt if given the opportunity, began to recruit members of the exile community to participate in an invasion of the island. On April 17, 1961, an invasion force consisting primarily of Cuban émigrés landed on Girón Beach at the Bay of Pigs. Cuban soldiers and civilians quickly overwhelmed the exiles, many of whom were taken prisoner. The Cuban government’s success at thwarting the Bay of Pigs invasion did much to legitimize the new regime and was a tremendous embarrassment for the Kennedy administration.

As the political relationship between Cuba and the United States disintegrated, the Castro government became more closely aligned with the Soviet Union. This strengthening of ties set the stage for the Cuban Missile Crisis, perhaps the most dramatic foreign policy crisis in the history of the United States. In 1962, in response to the United States’ longtime maintenance of a nuclear arsenal in Turkey and at the invitation of the Cuban government, the Soviet Union deployed nuclear missiles in Cuba. On October 14, 1962, American spy planes detected the construction of missile launch sites, and on October 22, President Kennedy addressed the American people to alert them to this threat. Over the course of the next several days, the world watched in horror as the United States and the Soviet Union hovered on the brink of nuclear war. Finally, on October 28, the Soviet Union agreed to remove its missiles from Cuba in exchange for a U.S. agreement to remove its missiles from Turkey and a formal pledge that the United States would not invade Cuba, and the crisis was resolved peacefully.

The Cuban Missile Crisis was a time of great anxiety in America. Eight hundred women demonstrated outside the United Nations Building in 1962 to promote peace. Library of Congress.

Though the Cuban Missile Crisis temporarily halted the flow of Cuban refugees into the United States, emigration began again in earnest in the mid-1960s. In 1965, the Johnson administration and the Castro government brokered a deal that facilitated the reunion of families that had been separated by earlier waves of migration, opening the door for thousands to leave the island. In 1966 President Lyndon B. Johnson signed the Cuban Adjustment Act, a law allowing Cuban refugees to become permanent residents. Over the course of the 1960s, hundreds of thousands of Cubans left their homeland and built new lives in America.

 

III. The Civil Rights Movement Continues

So much of the energy and character of the sixties emerged from the civil rights movement, which won its greatest victories in the early years of the decade. The movement itself was changing. Many of the civil rights activists pushing for school desegregation in the 1950s were middle-class and middle-aged. In the 1960s, a new student movement arose whose members wanted swifter changes in the segregated South. Confrontational protests, marches, boycotts, and sit-ins accelerated.1

The tone of the modern U.S. civil rights movement changed at a North Carolina department store in 1960, when four African American students participated in a sit-in at a whites-only lunch counter. The 1960 Greensboro sit-ins set the pattern for a new wave of protest. Activists sat at segregated lunch counters in an act of defiance, refusing to leave until they were served and willing to be ridiculed, attacked, and arrested if they were not. The tactic drew fierce resistance but forced the desegregation of Woolworth’s lunch counters and prompted copycat demonstrations across the South. The protests offered evidence that student-led direct action could enact social change. Increasingly disenchanted with the seemingly distant, professionalized civil rights leadership of older southern ministers, Ella Baker left King’s Southern Christian Leadership Conference (SCLC) and helped organize the Student Nonviolent Coordinating Committee (SNCC, often pronounced “snick”) that year. She embraced the direct, grassroots action of student activists such as Julian Bond, Stokely Carmichael, Diane Nash, John Lewis, and countless others who would push the civil rights movement in a new, more confrontational direction.2

The following year, 1961, civil rights advocates attempted a bolder variation of the sit-in when they participated in the Freedom Rides. Activists in the Congress of Racial Equality (CORE) organized interstate bus rides following a Supreme Court decision outlawing segregation on public buses and trains. The rides were intended to test the court’s ruling, which many southern states had ignored. An interracial group of Freedom Riders boarded buses in Washington, D.C., with the intention of sitting in integrated patterns on the buses as they traveled through the Deep South. On the initial rides in May 1961, the riders encountered fierce resistance in Alabama. Angry mobs of Klansmen attacked riders in Anniston and Birmingham, burning one of the buses and beating the activists who escaped. Additional Freedom Rides launched through the summer and generated national attention amid continued violent resistance. Ultimately, the Interstate Commerce Commission enforced the integration of interstate buses and trains in November 1961.3

In the fall of 1961, civil rights activists descended on Albany, a small city in southwest Georgia. Known for entrenched segregation and racial violence, Albany seemed an unlikely place for Black Americans to rally and demand change. The activists there, however, formed the Albany Movement, a coalition of civil rights organizers that included members of SNCC, the SCLC, and the NAACP. But the movement was stymied by Albany police chief Laurie Pritchett, who carried out mass arrests but refused to engage in police brutality and bailed out jailed movement leaders to avoid negative media attention. It was a peculiar scene, and a lesson for southern activists.4

The Albany Movement included elements of a Christian commitment to social justice in its platform, with activists stating that all people were “of equal worth” in God’s family and that “no man may discriminate against or exploit another.” In many instances in the 1960s, Black Christianity propelled civil rights advocates to action and demonstrated the significance of religion to the broader civil rights movement. King’s rise to prominence underscored the role that African American religious figures played in the 1960s civil rights movement. Protesters sang hymns and spirituals as they marched. Preachers rallied the people with messages of justice and hope. Churches hosted meetings, prayer vigils, and conferences on nonviolent resistance. The moral thrust of the movement strengthened African American activists and confronted white society by framing segregation as a moral evil.5

As the civil rights movement garnered more followers and more attention, white resistance stiffened. In October 1962, James Meredith became the first African American student to enroll at the University of Mississippi. Meredith’s enrollment sparked riots on the Oxford campus, prompting President John F. Kennedy to send in U.S. Marshals and National Guardsmen to maintain order. On an evening known infamously as the Battle of Ole Miss, segregationists clashed with troops in the middle of campus, resulting in two deaths and hundreds of injuries. Violence served as a reminder of the strength of white resistance to the civil rights movement, particularly in the realm of education.6

James Meredith, accompanied by U.S. Marshals, walks to class at the University of Mississippi in 1962. Meredith was the first African American student admitted to the segregated university. Library of Congress.

The following year, 1963, was perhaps the decade’s most eventful year for civil rights. In April and May, the SCLC organized the Birmingham Campaign, a broad campaign of direct action aimed at toppling segregation in Alabama’s largest city. Activists used business boycotts, sit-ins, and peaceful marches as part of the campaign. SCLC leader Martin Luther King Jr. was jailed, prompting his famous “Letter from Birmingham Jail,” a handwritten defense not only of his nonviolent approach but of active confrontation to directly challenge injustice. The campaign added to King’s national reputation and featured powerful photographs and video footage of white police officers turning fire hoses and attack dogs on young African American protesters. It also yielded an agreement to desegregate public accommodations in the city: activists in Birmingham scored a victory for civil rights and drew international praise for their nonviolent approach in the face of police-sanctioned violence and bombings.7

White resistance intensified. While much of the rhetoric surrounding the 1960s focused on a younger, more liberal generation’s progressive ideas, conservatism maintained a strong presence on the American political scene. Few political figures in the decade embodied the working-class, conservative views held by millions of white Americans quite like George Wallace. Wallace’s vocal stance on segregation was immortalized in his 1963 inaugural address as Alabama governor with the phrase: “Segregation now, segregation tomorrow, segregation forever!” Just as the civil rights movement began to gain unprecedented strength, Wallace became the champion of the many white southerners opposed to the movement. Consequently, Wallace was one of the best examples of the very real opposition civil rights activists faced in the late twentieth century.8

As governor, Wallace loudly supported segregation. His efforts were symbolic, but they earned him national recognition as a political figure willing to fight for what many southerners saw as their traditional way of life. In June 1963, just five months after becoming governor, Wallace made his infamous “Stand in the Schoolhouse Door,” blocking the entrance to Foster Auditorium to protest the integration of the University of Alabama. President Kennedy addressed the nation that evening, criticizing Wallace and calling for a comprehensive civil rights bill. A day later, civil rights leader Medgar Evers was assassinated at his home in Jackson, Mississippi.

Alabama governor George Wallace stands defiantly at the door of the University of Alabama, blocking the attempted integration of the school. Wallace became the most notorious pro-segregation politician of the 1960s, proudly proclaiming, in his 1963 inaugural address, “Segregation now, segregation tomorrow, segregation forever.” Library of Congress.

That summer, civil rights leaders organized the August 1963 March on Washington. The march called for, among other things, civil rights legislation, school integration, an end to discrimination by public and private employers, job training for the unemployed, and a raise in the minimum wage. On the steps of the Lincoln Memorial, King delivered his famous “I Have a Dream” speech, an internationally renowned call for civil rights that raised the movement’s profile to new heights and put unprecedented pressure on politicians to pass meaningful civil rights legislation.9

This photograph shows Martin Luther King Jr. and other Black civil rights leaders arm-in-arm with leaders of the Jewish community during the March on Washington on August 28, 1963. Wikimedia.

Kennedy offered support for a civil rights bill, but southern resistance was intense and Kennedy was unwilling to expend much political capital on it. And so the bill stalled in Congress. Then, on November 22, 1963, President Kennedy was assassinated in Dallas. The nation’s youthful, popular president was gone. Vice President Lyndon Johnson lacked Kennedy’s youth, his charisma, his popularity, and his aristocratic upbringing, but no one knew Washington better and no one before or since fought harder and more successfully to pass meaningful civil rights legislation. Raised in poverty in the Texas Hill Country, Johnson scratched and clawed his way up the political ladder. He was both ruthlessly ambitious and keenly conscious of poverty and injustice. He idolized Franklin Roosevelt, whose New Deal had brought improvements for the impoverished central Texans Johnson grew up with.

President Lyndon Johnson, then, an old white southerner with a thick Texas drawl, embraced the civil rights movement. He took Kennedy’s stalled civil rights bill, ensured that it would have teeth, and navigated it through Congress. The following summer he signed the Civil Rights Act of 1964, widely considered to be among the most important pieces of civil rights legislation in American history. The comprehensive act barred segregation in public accommodations and outlawed discrimination based on race, ethnicity, gender, and national or religious origin.

Lyndon B. Johnson sits with Civil Rights Leaders in the White House. One of Johnson’s greatest legacies would be his staunch support of civil rights legislation. Wikimedia.

Johnson gives Senator Richard Russell the famous “Johnson Treatment.” Yoichi R. Okamoto, Photograph of Lyndon B. Johnson pressuring Senator Richard Russell, December 17, 1963. Wikimedia.

The civil rights movement created space for political leaders to pass legislation, and the movement continued pushing forward. Direct action continued through the summer of 1964, as student-run organizations like SNCC and CORE helped with Freedom Summer in Mississippi, a drive to register African American voters in a state with an ugly history of discrimination. Freedom Summer campaigners set up schools for African American children. Even with this progress, intimidation and violent resistance against civil rights continued, particularly in regions with longstanding traditions of segregation. Three young CORE activists, James Chaney, Michael Schwerner, and Andrew Goodman, were murdered by local law enforcement officers and Klan members in Neshoba County, near Philadelphia, Mississippi.10 In August, over 2,000 Black Mississippians assembled in Jackson and formed the Mississippi Freedom Democratic Party. They demanded that their delegates be seated at the Democratic National Convention. Denied anything more than two at-large seats, the party protested. “I question America,” co-founder Fannie Lou Hamer said in a nationally televised address. “Is this America?” she asked.

Activists kept fighting. In March 1965, activists attempted to march from Selma to Montgomery, Alabama, on behalf of local African American voting rights. In a narrative that had become familiar, “Bloody Sunday” featured peaceful protesters attacked by white law enforcement with batons and tear gas. After they were turned away violently a second time, marchers finally made the fifty-mile trek to the state capitol later in the month. Coverage of the first march prompted President Johnson to present the bill that became the Voting Rights Act of 1965, an act that abolished voting discrimination in federal, state, and local elections. In two consecutive years, landmark pieces of legislation had assaulted de jure (by law) segregation and disenfranchisement.11

Five leaders of the Civil Rights Movement in 1965. From left: Bayard Rustin, Andrew Young, N.Y. Congressman William Ryan, James Farmer, and John Lewis. Library of Congress.

 

IV. Lyndon Johnson’s Great Society

On a May morning in 1964, President Johnson laid out a sweeping vision for a package of domestic reforms known as the Great Society. Speaking before that year’s graduates of the University of Michigan, Johnson called for “an end to poverty and racial injustice” and challenged both the graduates and American people to “enrich and elevate our national life, and to advance the quality of our American civilization.” At its heart, he promised, the Great Society would uplift racially and economically disfranchised Americans, too long denied access to federal guarantees of equal democratic and economic opportunity, while simultaneously raising all Americans’ standards and quality of life.12

The Great Society’s legislation was breathtaking in scope, and many of its programs and agencies are still with us today. The Civil Rights Act of 1964 and the Voting Rights Act of 1965 codified federal support for many of the civil rights movement’s goals by prohibiting job discrimination, abolishing the segregation of public accommodations, and providing vigorous federal oversight of southern states’ election laws in order to guarantee minority access to the ballot. Nearly ninety years after Reconstruction, these measures effectively ended Jim Crow. Moreover, the Immigration and Nationality Act of 1965, or the Hart-Celler Act, abolished the national-origins quota regime established by the Johnson-Reed Act of 1924. American immigration law had for more than four decades effectively barred legal immigration to the United States from anywhere other than Northern and Western Europe; the new law finally opened the United States up to the world and forever reshaped the demographics of the nation.

In addition to civil rights and immigration, the Great Society took on a range of quality-of-life concerns that seemed suddenly solvable in a society of such affluence. It established the first federal food stamp program. Medicare and Medicaid would ensure access to quality medical care for the aged and poor. The Elementary and Secondary Education Act of 1965 made the first sustained and significant federal investment in public education, totaling more than $1 billion. Significant funds were poured into colleges and universities. The Great Society also established the National Endowment for the Arts and the National Endowment for the Humanities, federal investments in arts and letters that fund American cultural expression to this day.

While these programs persisted and even thrived, in the years immediately following this flurry of legislative activity, the national conversation surrounding Johnson’s domestic agenda largely focused on the $3 billion spent on War on Poverty programming within the Great Society’s Economic Opportunity Act (EOA) of 1964. No EOA program was more controversial than Community Action, considered the cornerstone antipoverty program. Johnson’s antipoverty planners felt that the key to uplifting disfranchised and impoverished Americans was involving poor and marginalized citizens in the actual administration of poverty programs, what they called “maximum feasible participation.” Community Action Programs would give disfranchised Americans a seat at the table in planning and executing federally funded programs that were meant to benefit them—a significant sea change in the nation’s efforts to confront poverty, which had historically relied on local political and business elites or charitable organizations for administration.13

In fact, Johnson himself had never conceived of poor Americans running their own poverty programs. While the president’s rhetoric offered a stirring vision of the future, he had singularly old-school notions for how his poverty policies would work. In contrast to “maximum feasible participation,” the president imagined a second New Deal: local elite-run public works camps that would instill masculine virtues in unemployed young men. Community Action almost entirely bypassed local administrations and sought to build grassroots civil rights and community advocacy organizations, many of which had originated in the broader civil rights movement. Despite widespread support for most Great Society programs, the War on Poverty increasingly became the focal point of domestic criticisms from the left and right. On the left, frustrated Americans recognized the president’s resistance to further empowering poor minority communities and also assailed the growing war in Vietnam, the cost of which undercut domestic poverty spending. As racial unrest and violence swept across urban centers, critics from the right lambasted federal spending for “unworthy” citizens.

Johnson had secured a series of meaningful civil rights laws, but then things began to stall. Days after the passage of the Voting Rights Act, race riots broke out in the Watts neighborhood of Los Angeles. Rioting in Watts stemmed from local African American frustrations with residential segregation, police brutality, and racial profiling. Waves of riots rocked American cities every summer thereafter. Particularly destructive riots occurred in 1967—two summers later—in Newark and Detroit. Each resulted in deaths, injuries, arrests, and millions of dollars in property damage. In spite of Black achievements, problems persisted for many African Americans. The phenomenon of “white flight”—when whites in metropolitan areas fled city centers for the suburbs—often resulted in resegregated residential patterns. Limited access to economic and social opportunities in urban areas bred discord. In addition to reminding the nation that the civil rights movement was a complex, ongoing event without a concrete endpoint, the unrest in northern cities reinforced the notion that the struggle did not occur solely in the South. Many Americans also viewed the riots as an indictment of the Great Society, President Johnson’s sweeping agenda of domestic programs that sought to remedy inner-city ills by offering better access to education, jobs, medical care, housing, and other forms of social welfare. The civil rights movement was never the same.14

The Civil Rights Act, the Voting Rights Act, and the War on Poverty provoked conservative resistance and were catalysts for the rise of Republicans in the South and West. However, subsequent presidents and Congresses have left intact the bulk of the Great Society, including Medicare and Medicaid, food stamps, federal spending for arts and literature, and Head Start. Even Community Action Programs, so fraught during their few short years of activity, inspired and empowered a new generation of minority and poverty community activists who had never before felt, as one put it, that “this government is with us.”15

 

V. The Origins of the Vietnam War

American involvement in the Vietnam War began during the postwar period of decolonization. The Soviet Union backed many nationalist movements across the globe, but the United States feared the expansion of communist influence and pledged to confront any revolutions aligned against Western capitalism. The Domino Theory—the idea that if a country fell to communism, then neighboring states would soon follow—governed American foreign policy. After the communist takeover of China in 1949, the United States financially supported the French military’s effort to retain control over its colonies in Vietnam, Cambodia, and Laos.

Between 1946 and 1954, France fought a counterinsurgency campaign against the nationalist Viet Minh forces led by Ho Chi Minh. The United States assisted the French war effort with funds, arms, and advisors, but it was not enough. On the eve of the Geneva Peace Conference in 1954, Viet Minh forces defeated the French army at Dien Bien Phu. The conference temporarily divided Vietnam into two separate states pending internationally monitored elections. But the United States feared a communist electoral victory and blocked the elections. The temporary partition became permanent. The United States established the Republic of Vietnam, or South Vietnam, with the U.S.-backed Ngo Dinh Diem as prime minister. Diem, who had lived in the United States, was a committed anticommunist.

Diem’s government, however, and its Army of the Republic of Vietnam (ARVN) could not contain the communist insurgency seeking the reunification of Vietnam. The Americans provided weapons and support, but despite a clear numerical and technological advantage, South Vietnam stumbled before insurgent Vietcong (VC) units. Diem, a corrupt leader propped up by the American government with little domestic support, was assassinated in 1963. A merry-go-round of military dictators followed as the situation in South Vietnam continued to deteriorate. The American public, though, remained largely unaware of Vietnam in the early 1960s, even as President John F. Kennedy deployed some sixteen thousand military advisors to help South Vietnam suppress a domestic communist insurgency.16

This all changed in 1964. On August 2, the USS Maddox reported incoming fire from North Vietnamese ships in the Gulf of Tonkin. Although the details of the incident are controversial, the Johnson administration exploited the event as a pretext for escalating American involvement in Vietnam. Congress passed the Gulf of Tonkin Resolution, granting President Johnson the authority to deploy the American military to defend South Vietnam. U.S. Marines landed in Vietnam in March 1965, and the American ground war began.

American forces under General William Westmoreland were tasked with defending South Vietnam against the insurgent VC and the regular North Vietnamese Army (NVA). But no matter how many troops the Americans sent or how many bombs they dropped, they could not win. This was a different kind of war. Progress was not measured by cities won or territory taken but by body counts and kill ratios. Although American officials like Westmoreland and secretary of defense Robert McNamara claimed a communist defeat was on the horizon, by 1968 half a million American troops were stationed in Vietnam, nearly twenty thousand had been killed, and the war was still no closer to being won. Protests, which would provide the backdrop for the American counterculture, erupted across the country.

 

VI. Culture and Activism

Epitomizing the folk music and protest culture of 1960s youth, Joan Baez and Bob Dylan are pictured here singing together at the March on Washington in 1963. Wikimedia.

The 1960s wrought enormous cultural change. The United States that entered the decade looked and sounded little like the one that left it. Rebellion rocked the supposedly hidebound conservatism of the 1950s as the youth counterculture became mainstream. Native Americans, Chicanos, women, and environmentalists participated in movements demonstrating that rights activism could be applied to ethnicity, gender, and nature. Even established religious institutions such as the Catholic Church underwent transformations, emphasizing freedom and tolerance. In each instance, the decade brought substantial progress and evidence that activism remained fluid and unfinished.

Much of the counterculture was filtered through popular culture and consumption. The fifties consumer culture still saturated the country, and advertisers continued to appeal to teenagers and the expanding youth market. During the 1960s, though, advertisers looked to a growing counterculture to sell their products. Popular culture and popular advertising in the 1950s had promoted an ethos of “fitting in” and buying products to conform. The new countercultural ethos touted individuality and rebellion. Some advertisers were subtle; ads for Volkswagens (VWs) acknowledged the flaws and strange look of their cars. One ad read, “Presenting America’s slowest fastback,” which “won’t go over 72 mph even though the speedometer shows a wildly optimistic top speed of 90.” Another stated, “And if you run out of gas, it’s easy to push.” By marketing the car’s flaws and reframing them as positive qualities, the advertisers commercialized young people’s resistance to commercialism, while simultaneously positioning the VW as a car for those wanting to stand out in a crowd. A more obviously countercultural ad for the VW Bug showed two cars: one black and one painted multicolor in the hippie style; the contrasting captions read, “We do our thing,” and “You do yours.”

Companies marketed their products as countercultural in and of themselves. One of the more obvious examples was a 1968 ad from Columbia Records, a hugely successful record label since the 1920s. The ad pictured a group of stock rebellious characters—a shaggy-haired white hippie, a buttoned-up Beat, two biker types, and a Black jazz man sporting an Afro—in a jail cell. The counterculture had been busted, the ad states, but “the man can’t bust our music.” Merely buying records from Columbia was an act of rebellion, one that brought the buyer closer to the counterculture figures portrayed in the ad.17

But it wasn’t just advertising: the culture was changing and changing rapidly. Conservative cultural norms were falling everywhere. The dominant style of women’s fashion in the 1950s, for instance, was the poodle skirt and the sweater, tight-waisted and buttoned up. The 1960s ushered in an era of much less restrictive clothing. Capri pants became popular casual wear. Skirts became shorter. When Mary Quant invented the miniskirt in 1964, she said it was a garment “in which you could move, in which you could run and jump.”18 By the late 1960s, the hippies’ more androgynous look became trendy. Such trends bespoke the new popular ethos of the 1960s: freedom, rebellion, and individuality.

In a decade plagued by social and political instability, the American counterculture also turned to psychedelic drugs as a remedy for alienation. For middle-class white teenagers, society had become stagnant and bureaucratic. The New Left, for instance, arose on college campuses frustrated with the lifeless bureaucracies that they believed strangled true freedom. Lysergic acid diethylamide (LSD) began its life as a drug used primarily in psychological research before trickling onto college campuses and out into society at large. The counterculture’s notion that American stagnation could be remedied by a spiritual-psychedelic experience drew heavily from psychologists and sociologists. The popularity of these drugs also spurred a political backlash. By 1966, enough incidents had been connected to LSD to spur a Senate hearing on the drug, and newspapers were reporting that hundreds of LSD users had been admitted to psychiatric wards.

The counterculture conquered popular culture. Rock ’n’ roll, liberalized sexuality, an embrace of diversity, recreational drug use, unalloyed idealism, and pure earnestness marked a new generation. Criticized by conservatives as culturally dangerous and by leftists as empty narcissism, the youth culture nevertheless dominated headlines and steered American culture. Perhaps one hundred thousand young people descended on San Francisco for the utopian promise of 1967’s Summer of Love. The 1969 Woodstock concert in New York became shorthand for the new youth culture and its mixture of politics, protest, and personal fulfillment. While the ascendance of the hippies would be both exaggerated and short-lived, and while Vietnam and Richard Nixon shattered much of its idealism, the counterculture’s liberated social norms and its embrace of personal fulfillment still define much of American culture.

 

VII. Beyond Civil Rights

Despite substantial legislative achievements, frustrations with the slow pace of change grew. Tensions continued to mount in cities, and the tone of the civil rights movement changed yet again. Activists became less conciliatory in their calls for progress. Many embraced the more militant message of the burgeoning Black Power Movement and Malcolm X, a Nation of Islam (NOI) minister who encouraged African Americans to pursue freedom, equality, and justice by “any means necessary.” Prior to his death in 1965, Malcolm X and the NOI emerged as the radical alternative to the racially integrated, largely Protestant approach of Martin Luther King Jr. Malcolm advocated armed resistance in defense of the safety and well-being of Black Americans, stating, “I don’t call it violence when it’s self-defense, I call it intelligence.” For their part, King and leaders from more mainstream organizations like the NAACP and the Urban League criticized both Malcolm X and the NOI for what they perceived to be racial demagoguery. King believed Malcolm X’s speeches were a “great disservice” to Black Americans, claiming that they lamented the problems of African Americans without offering solutions. The differences between King and Malcolm X represented a core ideological tension that would run through Black political thought throughout the 1960s and 1970s.19

Like Booker T. Washington and W. E. B. Du Bois before them, Martin Luther King Jr. and Malcolm X, pictured here in 1964, represented different strategies to achieve racial justice. Library of Congress.

By the late 1960s, SNCC, led by figures such as Stokely Carmichael, had expelled its white members and shunned the interracial effort in the rural South, focusing instead on injustices in northern urban areas. After President Johnson refused to take up the cause of the Black delegates in the Mississippi Freedom Democratic Party at the 1964 Democratic National Convention, SNCC activists became frustrated with institutional tactics and turned away from the organization’s founding principle of nonviolence. This evolving, more aggressive movement called for African Americans to play a dominant role in cultivating Black institutions and articulating Black interests rather than relying on interracial, moderate approaches. At a June 1966 civil rights march, Carmichael told the crowd, “What we gonna start saying now is black power!”20 The slogan not only resonated with audiences, it also stood in direct contrast to King’s “Freedom Now!” campaign. The political slogan of Black power could encompass many meanings, but at its core it stood for the self-determination of Black people in political, economic, and social organizations.

 

 

The Black Panther Party used radical and incendiary tactics to bring attention to the continued oppression of Black Americans. This 1970 poster captures their outlook. Wikimedia.

Carmichael asserted that “black power means black people coming together to form a political force.”21 To others it also meant violence. In 1966, Huey Newton and Bobby Seale formed the Black Panther Party in Oakland, California. The Black Panthers became the standard-bearers for direct action and self-defense, using the concept of decolonization in their drive to liberate Black communities from white power structures. The revolutionary organization also sought reparations and exemptions for Black men from the military draft. Citing police brutality and racist governmental policies, the Black Panthers aligned themselves with the “other people of color in the world” against whom America was fighting abroad. Although the party was perhaps best known for its open display of weapons, military-style dress, and Black nationalist beliefs, its Ten-Point Program also included demands for employment, housing, and education. The Black Panthers worked in local communities to run “survival programs” that provided food, clothing, medical treatment, and drug rehabilitation. They focused on modes of resistance that empowered Black activists on their own terms.22

But African Americans weren’t the only Americans struggling to assert themselves in the 1960s. The successes of the civil rights movement and growing grassroots activism inspired countless new movements. In the summer of 1961, for instance, frustrated Native American university students founded the National Indian Youth Council (NIYC) to draw attention to the plight of Indigenous Americans. In the Pacific Northwest, the council advocated for tribal fishermen to retain immunity from conservation laws on reservations and in 1964 held a series of “fish-ins”: activists and celebrities cast nets and waited for the police to arrest them.23 The NIYC’s militant rhetoric and use of direct action marked the beginning of what was called the Red Power movement, an intertribal movement designed to draw attention to Native issues and to protest discrimination. The American Indian Movement (AIM) and other activists staged dramatic demonstrations. In November 1969, dozens began a year-and-a-half-long occupation of the abandoned Alcatraz Island in San Francisco Bay. In 1973, hundreds occupied the town of Wounded Knee, South Dakota, site of the infamous 1890 massacre, for seventy-one days.24

Meanwhile, the Chicano movement in the 1960s emerged out of the broader Mexican American civil rights movement of the post–World War II era. The word Chicano was initially considered a derogatory term for Mexican immigrants, until activists in the 1960s reclaimed the term and used it as a catalyst to campaign for political and social change among Mexican Americans. The Chicano movement confronted discrimination in schools, politics, agriculture, and other formal and informal institutions. Organizations like the Mexican American Political Association (MAPA) and the Mexican American Legal Defense and Educational Fund (MALDEF) buoyed the Chicano movement and patterned themselves after similar influential groups in the African American civil rights movement.25

Cesar Chavez became the most well-known figure of the Chicano movement, using nonviolent tactics to campaign for workers’ rights in the grape fields of California. Chavez and activist Dolores Huerta founded the National Farm Workers Association, which eventually merged with the largely Filipino Agricultural Workers Organizing Committee and became the United Farm Workers of America (UFWA). The UFWA fused the causes of Chicano and Filipino activists protesting the subpar working conditions of California farmworkers. In addition to embarking on a hunger strike and a boycott of table grapes, Chavez led a three-hundred-mile march in March and April 1966 from Delano, California, to the state capital of Sacramento. The pro-labor campaign garnered the national spotlight and the support of prominent political figures such as Robert Kennedy. Today, Chavez’s birthday (March 31) is observed as a state holiday in California, Colorado, and Texas.

Rodolfo “Corky” Gonzales was another activist whose calls for Chicano self-determination resonated long past the 1960s. A former boxer and Denver native, Gonzales founded the Crusade for Justice in 1966, an organization that would establish the first annual Chicano Liberation Day at the National Chicano Youth Conference. The conference also yielded the Plan Espiritual de Aztlán, a Chicano nationalist manifesto that reflected Gonzales’s vision of Chicanos as a unified, historically grounded, all-encompassing group fighting against discrimination in the United States. By 1970, the Texas-based La Raza Unida political party had a strong foundation for promoting Chicano nationalism and continuing the campaign for Mexican American civil rights.26

The 1966 Rio Grande Valley Farm Workers March (“La Marcha”), August 27, 1966. The University of Texas-San Antonio Libraries’ Special Collections (MS 360: E-0012-187-D-16).

The feminist movement also grew in the 1960s. Women were active in both the civil rights movement and the labor movement, but their increasing awareness of gender inequality did not find a receptive audience among male leaders in those movements. In the 1960s, then, many of these women began to form a movement of their own. Soon the country experienced a groundswell of feminist consciousness.

An older generation of women who preferred to work within state institutions figured prominently in the early part of the decade. When John F. Kennedy established the Presidential Commission on the Status of Women in 1961, former first lady Eleanor Roosevelt headed the effort. The commission’s official report, a self-declared “invitation to action,” was released in 1963. Finding discriminatory provisions in the law and practices of industrial, labor, and governmental organizations, the commission advocated for “changes, many of them long overdue, in the conditions of women’s opportunity in the United States.”27 Change was recommended in areas of employment practices, federal tax and benefit policies affecting women’s income, labor laws, and services for women as wives, mothers, and workers. This call for action, if heeded, would ameliorate the types of discrimination primarily experienced by middle-class and elite white working women, all of whom were used to advocating through institutional structures like government agencies and unions.28 The specific concerns of poor and nonwhite women lay largely beyond the scope of the report.

Betty Friedan’s The Feminine Mystique hit bookshelves the same year the commission released its report. Friedan had been active in the union movement and was by this time a mother in the new suburban landscape of postwar America. In her book, Friedan identified the “problem that has no name,” and in doing so helped many white middle-class American women come to see their dissatisfaction as housewives not as something “wrong with [their] marriage, or [themselves],” but instead as a social problem experienced by millions of American women. Friedan observed that there was a “discrepancy between the reality of our lives as women and the image to which we were trying to conform, the image I call the feminine mystique.” No longer would women allow society to blame the “problem that has no name” on a loss of femininity, too much education, or too much female independence and equality with men.29

The 1960s also saw a different group of women pushing for change in government policy. Mothers on welfare began to form local advocacy groups in addition to the National Welfare Rights Organization, founded in 1966. Mostly African American, these activists fought for greater benefits and more control over welfare policy and implementation. Women like Johnnie Tillmon successfully advocated for larger grants for school clothes and household equipment in addition to gaining due process and fair administrative hearings prior to termination of welfare entitlements.

Yet another mode of feminist activism was the formation of consciousness-raising groups. These groups met in women’s homes and at women’s centers, providing a safe environment for women to discuss everything from experiences of gender discrimination to pregnancy, from relationships with men and women to self-image. The goal of consciousness-raising was to increase self-awareness and validate the experiences of women. Groups framed such individual experiences as examples of society-wide sexism, and claimed that “the personal is political.”30 Consciousness-raising groups created a wealth of personal stories that feminists could use in other forms of activism and crafted networks of women from which activists could mobilize support for protests.

The end of the decade was marked by the Women’s Strike for Equality, celebrating the fiftieth anniversary of women’s right to vote. Sponsored by the National Organization for Women (NOW), the 1970 protest focused on employment discrimination, political equality, abortion, free childcare, and equality in marriage. All of these issues foreshadowed the backlash against feminist goals in the 1970s. Not only would feminism face opposition from other women who valued the traditional homemaker role to which feminists objected, but the movement would also fracture internally as minority women challenged white feminists’ racism and lesbians vied for more prominence within feminist organizations.

Photograph of a women's rights march. Signs say "Women Demand EQUALITY," "I am a second class citizen," and "GWU Women's Liberation"

The women’s movement stalled during the 1930s and 1940s, but by the 1960s it was back in full force. Inspired by the civil rights movement and fed up with gender discrimination, women took to the streets to demand their rights as American citizens. Here, women march during the “Women’s Strike for Equality,” a nationwide protest launched on the 50th anniversary of women’s suffrage. Photograph, August 26, 1970. Library of Congress.

American environmentalism’s significant gains during the 1960s emerged in part from Americans’ recreational use of nature. Postwar Americans backpacked, went to the beach, fished, and joined birding organizations in greater numbers than ever before. These experiences, along with increased formal education, made Americans more aware of threats to the environment and, consequently, to themselves. Many of these threats increased in the postwar years as developers bulldozed open space for suburbs and new hazards emerged from industrial and nuclear pollutants.

By the time that biologist Rachel Carson published her landmark book, Silent Spring, in 1962, a nascent environmentalism had emerged in America. Silent Spring stood out as an unparalleled argument for the interconnectedness of ecological and human health. Pesticides, Carson argued, also posed a threat to human health, and their overuse threatened the ecosystems that supported food production. Carson’s argument was compelling to many Americans, including President Kennedy, but was virulently opposed by chemical industries that suggested the book was the product of an emotional woman, not a scientist.31

After Silent Spring, the social and intellectual currents of environmentalism continued to expand rapidly, culminating in the largest demonstration in history, Earth Day, on April 22, 1970, and in a decade of lawmaking that significantly restructured American government. Even before the massive gathering for Earth Day, lawmakers from the local to the federal level had pushed for and achieved regulations to clean up the air and water. President Richard Nixon signed the National Environmental Policy Act into law in 1970, requiring environmental impact statements for any project directed or funded by the federal government. He also created the Environmental Protection Agency, the first agency charged with studying, regulating, and disseminating knowledge about the environment. A raft of laws followed that were designed to offer increased protection for air, water, endangered species, and natural areas.

The decade’s activism manifested across the world. It even affected the Catholic Church. The Second Vatican Council, called by Pope John XXIII to modernize the church and bring it in closer dialogue with the non-Catholic world, operated from 1962 to 1965, when it proclaimed multiple reforms, including the vernacular mass (mass in local languages, rather than in Latin) and a greater role for laypeople, and especially women, in the Church. Many Catholic churches adopted more informal, contemporary styles. Many conservative Catholics recoiled at what they perceived as rapid and dangerous changes, but Vatican II’s reforms in many ways created the modern Catholic Church.

 

VIII. Conclusion

In 1969, Americans hailed the moon landing as a profound victory in the space race against the Soviet Union. This landmark achievement fulfilled the promise of the late John F. Kennedy, who had declared in 1961 that the United States would put a man on the moon by the end of the decade. But while Neil Armstrong said his steps marked “one giant leap for mankind,” and Americans marveled at the achievement, the brief moment of wonder only punctuated years of turmoil. The Vietnam War disillusioned a generation, riots rocked cities, protests hit campuses, and assassinations robbed the nation of many of its leaders. The forward-thinking spirit of a complex decade had waned. Uncertainty loomed.

 

IX. Primary Sources

1. Barry Goldwater, Republican Nomination Acceptance Speech (1964)

In 1964, Senator Barry Goldwater of Arizona accepted the Republican Party’s nomination for the presidency. In his speech, Goldwater refused to apologize for his strict conservative politics. “Extremism in the defense of liberty is no vice,” he said, and “moderation in the pursuit of justice is no virtue.”

2. Lyndon Johnson on Voting Rights and the American Promise (1965)

On March 15, 1965, Lyndon Baines Johnson addressed a joint session of Congress to push for the Voting Rights Act. In his speech, Johnson not only advocated policy, he borrowed the language of the civil rights movement and tied the movement to American history.

3. Lyndon Johnson, Howard University Commencement Address (1965)

On June 4, 1965, President Johnson delivered the commencement address at Howard University, the nation’s most prominent historically Black university. In his address, Johnson explained why “opportunity” was not enough to ensure the civil rights of disadvantaged Americans.

4. National Organization for Women, “Statement of Purpose” (1966)

The National Organization for Women was founded in 1966 by prominent American feminists, including Betty Friedan, Shirley Chisholm, and others. The organization’s “Statement of Purpose” laid out the goals of the organization and the targets of its feminist vision.

5. George M. Garcia, Vietnam Veteran, Oral Interview (2012/1969)

In 2012, George Garcia sat down to be interviewed about his experiences as a corporal in the United States Marine Corps during the Vietnam War. Alternating between English and Spanish, Garcia told of early life in Brownsville, Texas, his time as a U.S. Marine in Vietnam, and his experience coming home from the war.

6. The Port Huron Statement (1962)

The Port Huron Statement was a 1962 manifesto by the Students for a Democratic Society (SDS), written primarily by student activist Tom Hayden, that proposed a new form of “participatory democracy” to rescue modern society from destructive militarism and cultural alienation.

7. Fannie Lou Hamer: Testimony at the Democratic National Convention 1964

Civil rights activists struggled against the repressive violence of Mississippi’s racial regime. State NAACP head Medgar Evers was murdered in 1963. When Freedom Summer activists tried to register Black voters in 1964, three of them disappeared and were later found murdered. The Mississippi Democratic Party, meanwhile, continued to disfranchise the state’s African American voters. Civil rights activist Fannie Lou Hamer co-founded the Mississippi Freedom Democratic Party (MFDP) and traveled to the Democratic National Convention in 1964 to demand that the MFDP’s delegates, rather than the all-white Mississippi Democratic Party delegates, be seated at the convention. Although unsuccessful, her moving testimony was broadcast on national television and drew further attention to the plight of African Americans in the South.

8. Selma March (1965)

Civil rights activists protested against the injustice of segregation in a variety of ways. Here, in 1965, marchers, some carrying American flags, march from Selma to Montgomery, Alabama, to champion African American voting rights. 

9. LBJ and Civil Rights Leaders (1964)

As civil rights demonstrations rocked the American South, civil rights legislation made its way through Washington D.C. Here, President Lyndon B. Johnson sits with civil rights leaders in the White House.

10. Women’s Liberation March (1970)

American popular feminism accelerated throughout the 1960s. The slogan “Women’s Liberation” accompanied a growing women’s movement but also alarmed conservative Americans. In this 1970 photograph, women march during the “Women’s Strike for Equality,” a nationwide protest launched on the 50th anniversary of women’s suffrage, carrying signs reading, “Women Demand Equality,” “I’m a Second Class Citizen,” and “Women’s Liberation.”

 

X. Reference Material

This chapter was edited by Samuel Abramson, with content contributions by Samuel Abramson, Marsha Barrett, Brent Cebul, Michell Chresfield, William Cossen, Jenifer Dodd, Michael Falcone, Leif Fredrickson, Jean-Paul de Guzman, Jordan Hill, William Kelly, Lucie Kyrova, Maria Montalvo, Emily Prifogle, Ansley Quiros, Tanya Roth, and Robert Thompson.

Recommended citation: Samuel Abramson et al., “The Sixties,” Samuel Abramson, ed., in The American Yawp, eds. Joseph Locke and Ben Wright (Stanford, CA: Stanford University Press, 2018).

Recommended Reading

  1. Branch, Taylor. Parting the Waters: America in the King Years, 1954–1963. New York: Simon and Schuster, 1988.
  2. ———. Pillar of Fire: America in the King Years, 1963–65. New York: Simon and Schuster, 1998.
  3. Breines, Winifred. The Trouble Between Us: An Uneasy History of White and Black Women in the Feminist Movement. New York: Oxford University Press, 2006.
  4. Brick, Howard. The Age of Contradictions: American Thought and Culture in the 1960s. Ithaca, NY: Cornell University Press, 2000.
  5. Brown-Nagin, Tomiko. Courage to Dissent: Atlanta and the Long History of the Civil Rights Movement. New York: Oxford University Press, 2011.
  6. Carson, Clayborne. In Struggle: SNCC and the Black Awakening of the 1960s. Cambridge, MA: Harvard University Press, 1981.
  7. Chafe, William. Civilities and Civil Rights: Greensboro, North Carolina, and the Black Struggle for Freedom. New York: Oxford University Press, 1980.
  8. Dallek, Robert. Flawed Giant: Lyndon Johnson and His Times, 1961–1973. New York: Oxford University Press, 1993.
  9. D’Emilio, John. Sexual Politics, Sexual Communities: The Making of a Homosexual Minority in the United States, 1940–1970. Chicago: University of Chicago Press, 1983.
  10. Echols, Alice. Daring to Be Bad: Radical Feminism in America, 1967–1975. Minneapolis: University of Minnesota Press, 1989.
  11. Gitlin, Todd. The Sixties: Years of Hope, Days of Rage. New York: Bantam Books, 1987.
  12. Hall, Jacquelyn Dowd. “The Long Civil Rights Movement and the Political Uses of the Past.” Journal of American History 91 (March 2005): 1233–1263.
  13. Isserman, Maurice. If I Had a Hammer: The Death of the Old Left and the Birth of the New Left. Champaign: University of Illinois Press, 1987.
  14. Johnson, Troy R. The American Indian Occupation of Alcatraz Island: Red Power and Self-Determination. Lincoln: University of Nebraska Press, 2008.
  15. Joseph, Peniel. Waiting ’til the Midnight Hour: A Narrative History of Black Power in America. New York: Holt, 2006.
  16. Kazin, Michael, and Maurice Isserman. America Divided: The Civil War of the 1960s. New York: Oxford University Press, 2007.
  17. McGirr, Lisa. Suburban Warriors: The Origins of the New American Right. Princeton, NJ: Princeton University Press, 2001.
  18. Orleck, Annelise. Storming Caesar’s Palace: How Black Mothers Fought Their Own War on Poverty. Boston: Beacon Press, 2005.
  19. Patterson, James T. America’s Struggle Against Poverty in the Twentieth Century. Cambridge, MA: Harvard University Press, 1981.
  20. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  21. Perlstein, Rick. Before the Storm: Barry Goldwater and the Unmaking of the American Consensus. New York: Hill and Wang, 2001.
  22. Ransby, Barbara. Ella Baker and the Black Freedom Movement: A Radical Democratic Vision. Chapel Hill: University of North Carolina Press, 2000.
  23. Robnett, Belinda. How Long? How Long?: African American Women in the Struggle for Civil Rights. New York: Oxford University Press, 2000.
  24. Sugrue, Thomas. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Princeton, NJ: Princeton University Press, 2005.

 

Notes

  1. For the major events of the civil rights movement, see Taylor Branch, Parting the Waters: America in the King Years, 1954–63 (New York: Simon and Schuster, 1988); Taylor Branch, Pillar of Fire: America in the King Years, 1963–65 (New York: Simon and Schuster, 1998); and Taylor Branch, At Canaan’s Edge: America in the King Years, 1965–68 (New York: Simon and Schuster, 2007). []
  2. Branch, Parting the Waters. []
  3. Raymond Arsenault, Freedom Riders: 1961 and the Struggle for Racial Justice (New York: Oxford University Press, 2006). []
  4. Clayborne Carson, In Struggle: SNCC and the Black Awakening of the 1960s (Cambridge, MA: Harvard University Press, 1980); Adam Fairclough, To Redeem the Soul of America: The Southern Christian Leadership Conference & Martin Luther King (Athens: University of Georgia Press, 1987). []
  5. David L. Chappell, A Stone of Hope: Prophetic Religion and the Death of Jim Crow (Chapel Hill: University of North Carolina Press, 2005). []
  6. Branch, Parting the Waters. []
  7. Ibid. []
  8. Dan T. Carter, The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics (Baton Rouge: LSU Press, 2000). []
  9. Branch, Parting the Waters. []
  10. Branch, Pillar of Fire. []
  11. Branch, At Canaan’s Edge. []
  12. Lyndon Baines Johnson, “Remarks at the University of Michigan,” May 22, 1964, Public Papers of the Presidents of the United States: Lyndon B. Johnson, 1964 (Washington, DC: U.S. Government Printing Office, 1965), 704. []
  13. See, for instance, Wesley G. Phelps, A People’s War on Poverty: Urban Politics and Grassroots Activists in Houston (Athens: University of Georgia Press, 2014). []
  14. Ibid. []
  15. Guian A. McKee, “‘This Government is with Us’: Lyndon Johnson and the Grassroots War on Poverty,” in Annelise Orleck and Lisa Gayle Hazirjian, eds., The War on Poverty: A New Grassroots History, 1964–1980 (Athens: University of Georgia Press, 2011). []
  16. Michael P. Sullivan, The Vietnam War: A Study in the Making of American Foreign Policy (Lexington: University Press of Kentucky, 1985), 58. []
  17. Thomas Frank, The Conquest of Cool: Business Culture, Counterculture, and the Rise of Hip Consumerism (Chicago: University of Chicago Press, 1998), 7. []
  18. Brenda Polan and Roger Tredre, The Great Fashion Designers (New York: Berg, 2009), 103–104. []
  19. Manning Marable, Malcolm X: A Life of Reinvention (New York: Penguin, 2011). []
  20. Peniel E. Joseph, ed., The Black Power Movement: Rethinking the Civil Rights–Black Power Era (New York: Routledge, 2013), 2. []
  21. Gordon Parks, “Whip of Black Power,” Life (May 19, 1967), 82. []
  22. Joshua Bloom and Waldo E. Martin Jr., Black Against Empire: The History and Politics of the Black Panther Party (Berkeley: University of California Press, 2012). []
  23. In 1974, fishing rights activists and tribal leaders reached a legal victory in United States v. Washington, otherwise known as the Boldt Decision, which declared that Native Americans were entitled to up to 50 percent of the fish caught in the “usual and accustomed places,” as stated in 1850s treaties. []
  24. Paul Chaat Smith and Robert Allen Warrior, Like a Hurricane: The Indian Movement from Alcatraz to Wounded Knee (New York: New Press, 1997). []
  25. See, for instance, Juan Gómez-Quiñones and Irene Vásquez, Making Aztlán: Ideology and Culture of the Chicana and Chicano Movement, 1966–1977 (Albuquerque: University of New Mexico Press, 2014). []
  26. Armando Navarro, Mexican American Youth Organization: Avant-Garde of the Movement in Texas (Austin: University of Texas Press, 1995); Ignacio M. Garcia, United We Win: The Rise and Fall of La Raza Unida Party (Tucson: University of Arizona Mexican American Studies Research Center, 1989). []
  27. American Women: Report of the President’s Commission on the Status of Women (U.S. Department of Labor: 1963), 2, https://www.dol.gov/wb/American%20Women%20Report.pdf, accessed June 7, 2018. []
  28. Flora Davis, Moving the Mountain: The Women’s Movement in America Since 1960 (Champaign: University of Illinois Press, 1999); Cynthia Ellen Harrison, On Account of Sex: The Politics of Women’s Issues, 1945–1968 (Berkeley: University of California Press, 1988). []
  29. Betty Friedan, The Feminine Mystique (New York: Norton, 1963), 50. []
  30. Carol Hanisch, “The Personal Is Political,” in Shulamith Firestone and Anne Koedt, eds., Notes from the Second Year: Women’s Liberation (New York: Radical Feminism, 1970). []
  31. Rachel Carson, Silent Spring (New York: Houghton Mifflin, 1962); Linda Lear, Rachel Carson: Witness for Nature (New York: Holt, 1997). []

26. The Affluent Society

"Photograph shows an African American high school girl being educated via television during the period that the Little Rock schools were closed to avoid integration." 1958. Photograph by Thomas J. O'Halloran. Library of Congress (LC-U9- 1525F-28).

Little Rock schools closed rather than allow integration. This 1958 photograph shows an African American high school girl watching school lessons on television. Library of Congress (LC-U9- 1525F-28).

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

In 1958, Harvard economist and public intellectual John Kenneth Galbraith published The Affluent Society. Galbraith’s celebrated book examined America’s new post–World War II consumer economy and political culture. While noting the unparalleled riches of American economic growth, it criticized the underlying structures of an economy dedicated only to increasing production and the consumption of goods. Galbraith argued that the U.S. economy, based on an almost hedonistic consumption of luxury products, would inevitably lead to economic inequality as private-sector interests enriched themselves at the expense of the American public. Galbraith warned that an economy where “wants are increasingly created by the process by which they are satisfied” was unsound, unsustainable, and, ultimately, immoral. “The Affluent Society,” he said, was anything but.1

While economists and scholars debate the merits of Galbraith’s warnings and predictions, his analysis was so insightful that the title of his book has come to serve as a ready label for postwar American society. In the two decades after the end of World War II, the American economy witnessed massive and sustained growth that reshaped American culture through the abundance of consumer goods. Standards of living—across all income levels—climbed to unparalleled heights and economic inequality plummeted.2

And yet, as Galbraith noted, the Affluent Society had fundamental flaws. The new consumer economy that lifted millions of Americans into its burgeoning middle class also reproduced existing inequalities. Women struggled to claim equal rights as full participants in American society. The poor struggled to win access to good schools, good healthcare, and good jobs. The same suburbs that gave middle-class Americans new space left cities withering in spirals of poverty and crime and caused irreversible ecological disruptions. The Jim Crow South tenaciously defended segregation, and Black Americans and other minorities suffered discrimination all across the country.

The contradictions of the Affluent Society defined the decade: unrivaled prosperity alongside persistent poverty, life-changing technological innovation alongside social and environmental destruction, expanded opportunity alongside entrenched discrimination, and new liberating lifestyles alongside a stifling conformity.

 

II. The Rise of the Suburbs

Photograph of the houses in Levittown

Levittown in the early 1950s. Flickr/Creative Commons.

The seeds of a suburban nation were planted in New Deal government programs. At the height of the Great Depression, in 1932, some 250,000 households lost their property to foreclosure. A year later, half of all U.S. mortgages were in default. The foreclosure rate stood at more than one thousand per day. In response, FDR’s New Deal created the Home Owners’ Loan Corporation (HOLC), which began purchasing and refinancing existing mortgages at risk of default. The HOLC introduced the amortized mortgage, allowing borrowers to pay back interest and principal regularly over fifteen years instead of the then standard five-year mortgage that carried large balloon payments at the end of the contract. The HOLC eventually owned nearly one of every five mortgages in America. Though homeowners paid more for their homes under this new system, home ownership was opened to the multitudes who could now gain residential stability, lower monthly mortgage payments, and accrue wealth as property values rose over time.3

Additionally, the Federal Housing Administration (FHA), another New Deal organization, increased access to home ownership by insuring mortgages and protecting lenders from financial loss in the event of a default. Lenders, in turn, had to agree to offer low rates and terms of up to twenty or thirty years, which meant that even more consumers could afford homes. Though only slightly more than a third of homes had an FHA-backed mortgage by 1964, FHA loans had a ripple effect, with private lenders granting more and more home loans even to non-FHA-backed borrowers. Government programs and subsidies like the HOLC and the FHA fueled the growth of home ownership and the rise of the suburbs.

Government spending during World War II pushed the United States out of the Depression and into an economic boom that would be sustained after the war by continued government spending. Government expenditures provided loans to veterans, subsidized corporate research and development, and built the interstate highway system. In the decades after World War II, business boomed, unionization peaked, wages rose, and sustained growth buoyed a new consumer economy. The Servicemen’s Readjustment Act (popularly known as the G.I. Bill), passed in 1944, offered low-interest home loans, a stipend to attend college, loans to start a business, and unemployment benefits.

The rapid growth of home ownership and the rise of suburban communities helped drive the postwar economic boom. Builders created sprawling neighborhoods of single-family homes on the outskirts of American cities. William Levitt built the first Levittown, the prototypical suburban community, in 1946 on Long Island, New York. Purchasing large acreage, subdividing lots, and contracting crews to build countless homes at economies of scale, Levitt offered affordable suburban housing to veterans and their families. Levitt became the prophet of the new suburbs, and his model of large-scale suburban development was duplicated by developers across the country. The country’s suburban share of the population rose from 19.5 percent in 1940 to 30.7 percent by 1960. Home ownership rates rose from 44 percent in 1940 to almost 62 percent in 1960. Between 1940 and 1950, suburban communities with more than ten thousand people grew 22.1 percent, and planned communities grew at an astonishing rate of 126.1 percent.4 As historian Lizabeth Cohen notes, these new suburbs “mushroomed in territorial size and the populations they harbored.”5 Between 1950 and 1970, America’s suburban population nearly doubled to seventy-four million. Eighty-three percent of all population growth occurred in suburban places.6

The postwar construction boom fed into countless industries. As manufacturers converted from war materials back to consumer goods, and as the suburbs developed, appliance and automobile sales rose dramatically. Flush with rising wages and wartime savings, homeowners also used newly created installment plans to buy new consumer goods at once instead of saving for years to make major purchases. Credit cards, first issued in 1950, further increased access to credit. No longer stymied by the Depression or wartime restrictions, consumers bought countless washers, dryers, refrigerators, freezers, and, suddenly, televisions. The percentage of Americans that owned at least one television increased from 12 percent in 1950 to more than 87 percent in 1960. This new suburban economy also led to increased demand for automobiles. The percentage of American families owning cars increased from 54 percent in 1948 to 74 percent in 1959. Motor fuel consumption rose from some twenty-two million gallons in 1945 to around fifty-nine million gallons in 1958.7

On the surface, the postwar economic boom turned America into a land of abundance. For advantaged buyers, loans had never been easier to obtain, consumer goods had never been more accessible, single-family homes had never been so cheap, and well-paying jobs had never been more abundant. “If you had a college diploma, a dark suit, and anything between the ears,” a businessman later recalled, “it was like an escalator; you just stood there and you moved up.”8 But the escalator did not serve everyone. Beneath aggregate numbers, racial disparity, sexual discrimination, and economic inequality persevered, undermining many of the assumptions of an Affluent Society.

In 1939, real estate appraisers arrived in sunny Pasadena, California. Armed with elaborate questionnaires to evaluate the city’s building conditions, the appraisers were well versed in the policies of the HOLC. In one neighborhood, most structures were rated in “fair” repair, and appraisers noted a lack of “construction hazards or flood threats.” However, they concluded that the area “is detrimentally affected by 10 owner occupant Negro families.” While “the Negroes are said to be of the better class,” the appraisers concluded, “it seems inevitable that ownership and property values will drift to lower levels.”9

Wealth created by the booming economy filtered through social structures with built-in privileges and prejudices. Just when many middle- and working-class white American families began their journey of upward mobility by moving to the suburbs with the help of government programs such as the FHA and the G.I. Bill, many African Americans and other racial minorities found themselves systematically shut out.

A look at the relationship between federal organizations such as the HOLC, the FHA, and private banks, lenders, and real estate agents tells the story of standardized policies that produced a segregated housing market. At the core of HOLC appraisal techniques, which reflected the existing practices of private real estate agents, was the pernicious insistence that mixed-race and minority-dominated neighborhoods were credit risks. In partnership with local lenders and real estate agents, the HOLC created Residential Security Maps to identify high- and low-risk-lending areas. People familiar with the local real estate market filled out uniform surveys on each neighborhood. Relying on this information, the HOLC assigned every neighborhood a letter grade from A to D and a corresponding color code. The least secure, highest-risk neighborhoods for loans received a D grade and the color red. Banks limited loans in such “redlined” areas.10


Black communities in cities such as Detroit, Chicago, Brooklyn, and Atlanta (mapped here) experienced redlining, the process by which banks and other organizations demarcated minority neighborhoods on a map with a red line. Doing so made visible the areas they believed were unfit for their services, directly denying Black residents loans, but also, indirectly, housing, groceries, and other necessities of modern life. National Archives.


1938 Brooklyn redlining map. National Archives.

Phrases like “subversive racial elements” and “racial hazards” pervade the redlined-area description files of surveyors and HOLC officials. Los Angeles’s Echo Park neighborhood, for instance, had concentrations of Japanese and African Americans and a “sprinkling of Russians and Mexicans.” The HOLC security map and survey noted that the neighborhood’s “adverse racial influences which are noticeably increasing inevitably presage lower values, rentals and a rapid decrease in residential desirability.”11

While the HOLC was a fairly short-lived New Deal agency, the influence of its security maps lived on in the FHA and Veterans Administration (VA), the latter of which dispensed G.I. Bill–backed mortgages. Both of these government organizations, which reinforced the standards followed by private lenders, refused to back bank mortgages in “redlined” neighborhoods. On the one hand, FHA- and VA-backed loans were an enormous boon to those who qualified for them. Millions of Americans received mortgages that they otherwise would not have qualified for. But FHA-backed mortgages were not available to all. Racial minorities could not get loans for property improvements in their own neighborhoods and were denied mortgages to purchase property in other areas for fear that their presence would extend the red line into a new community. Levittown, the poster child of the new suburban America, only allowed whites to purchase homes. Thus, FHA policies and private developers increased home ownership and stability for white Americans while simultaneously creating and enforcing racial segregation.

The exclusionary structures of the postwar economy prompted protest from African Americans and other minorities who were excluded. Fair housing, equal employment, consumer access, and educational opportunity, for instance, all emerged as priorities of a brewing civil rights movement. In 1948, the U.S. Supreme Court sided with African American plaintiffs and, in Shelley v. Kraemer, declared racially restrictive neighborhood housing covenants—property deed restrictions barring sales to racial minorities—legally unenforceable. Discrimination and segregation continued, however, and activists would continue to push for fair housing practices.

During the 1950s and early 1960s many Americans retreated to the suburbs to enjoy the new consumer economy and search for some normalcy and security after the instability of depression and war. But many could not. It was both the limits and opportunities of housing, then, that shaped the contours of postwar American society. Moreover, the postwar suburban boom not only exacerbated racial and class inequalities, it precipitated a major environmental crisis.

The introduction of mass production techniques in housing wrought ecological destruction. Developers sought cheaper land ever farther away from urban cores, wreaking havoc on particularly sensitive lands such as wetlands, hills, and floodplains. “A territory roughly the size of Rhode Island,” historian Adam Rome wrote, “was bulldozed for urban development” every year.12 Innovative construction strategies, government incentives, high consumer demand, and low energy prices all pushed builders away from more sustainable, energy-conserving building projects. Typical postwar tract houses were difficult to cool in the summer and heat in the winter. Many were equipped with malfunctioning septic tanks that polluted local groundwater. Such destructiveness did not go unnoticed. By the time Rachel Carson published Silent Spring, a forceful denunciation of the excessive use of pesticides such as DDT in agricultural and domestic settings, in 1962, many Americans were already primed to receive her message. Stories of kitchen faucets spouting detergent foams and children playing in effluents brought the point home: comfort and convenience did not have to come at such cost. And yet most of the Americans who joined the early environmentalist crusades of the 1950s and 1960s rarely questioned the foundations of the suburban ideal. Americans increasingly relied upon automobiles and idealized the single-family home, blunting any major push to shift prevailing patterns of land and energy use.13

 

III. Race and Education

This photograph shows American soldiers escorting Black students into a school.

School desegregation was a tense experience for all involved, but none more so than the African American students who integrated white schools. The Little Rock Nine were the first to do so in Arkansas. Their escorts, the 101st Airborne Division of the U.S. Army, protected students who took that first step in 1957. Wikimedia.

Older battles over racial exclusion also confronted postwar American society. One long-simmering struggle targeted segregated schooling. In 1896, in Plessy v. Ferguson, the Supreme Court declared the principle of “separate but equal” constitutional. Segregated schooling, however, was rarely “equal”: in practice, Black Americans, particularly in the South, received fewer funds, attended inadequate facilities, and studied with substandard materials. African Americans’ battle against educational inequality stretched across half a century before the Supreme Court again took up the merits of “separate but equal.”

On May 17, 1954, after two years of argument, re-argument, and deliberation, Chief Justice Earl Warren announced the Supreme Court’s decision on segregated schooling in Brown v. Board of Education (1954). The court found by a unanimous 9–0 vote that racial segregation violated the Equal Protection Clause of the Fourteenth Amendment. The court’s decision declared, “Separate educational facilities are inherently unequal.” “Separate but equal” was made unconstitutional.14

Decades of African American–led litigation, local agitation against racial inequality, and liberal Supreme Court justices made Brown possible. In the early 1930s, the NAACP began a concerted effort to erode the legal underpinnings of segregation in the American South. Legal, or de jure, segregation subjected racial minorities to discriminatory laws and policies. Law and custom in the South hardened antiblack restrictions. But through a series of carefully chosen and contested court cases concerning education, disfranchisement, and jury selection, NAACP lawyers such as Charles Hamilton Houston, Robert L. Carter, and future Supreme Court Justice Thurgood Marshall undermined Jim Crow’s constitutional underpinnings. These attorneys initially sought to demonstrate that states systematically failed to provide African American students “equal” resources and facilities, and thus failed to live up to Plessy. By the late 1940s activists began to more forcefully challenge the assumption that “separate” was constitutional at all.

NAACP leaders, including Thurgood Marshall (who would become the first African American Supreme Court Justice), hold a poster saying "Stamp out MIssissippi-ism! Join NAACP"

The NAACP was a key organization in the fight to end legalized racial discrimination. In this 1956 photograph, NAACP leaders, including Thurgood Marshall, who would become the first African American Supreme Court Justice, hold a poster decrying racial bias in Mississippi in 1956. Library of Congress.

Though remembered as just one lawsuit, Brown v. Board of Education consolidated five separate cases that had originated in the southeastern United States: Briggs v. Elliott (South Carolina), Davis v. County School Board of Prince Edward County (Virginia), Gebhart v. Belton (Delaware), Bolling v. Sharpe (Washington, D.C.), and Brown v. Board of Education (Kansas). Working with local activists already involved in desegregation fights, the NAACP purposely chose cases with a diverse set of local backgrounds to show that segregation was not just an issue in the Deep South, and that a sweeping judgment on the fundamental constitutionality of Plessy was needed.

Briggs v. Elliott, the first case accepted by the NAACP, illustrated the plight of segregated Black schools. Briggs originated in rural Clarendon County, South Carolina, where taxpayers in 1950 spent $179 to educate each white student and $43 for each Black student. The district’s twelve white schools were cumulatively worth $673,850; the value of its sixty-one Black schools (mostly dilapidated, overcrowded shacks) was $194,575.15 While Briggs underscored the South’s failure to follow Plessy, the Brown suit focused less on material disparities between Black and white schools (which were significantly less than in places like Clarendon County) and more on the social and spiritual degradation that accompanied legal segregation. This case cut to the basic question of whether “separate” was itself inherently unequal. The NAACP said the two notions were incompatible. As one witness before the U.S. District Court of Kansas said, “The entire colored race is craving light, and the only way to reach the light is to start [black and white] children together in their infancy and they come up together.”16

To make its case, the NAACP marshaled historical and social scientific evidence. The Court found the historical evidence inconclusive and drew its ruling more heavily from the NAACP’s argument that segregation psychologically damaged Black children. To make this argument, association lawyers relied on social scientific evidence, such as the famous doll experiments of Kenneth and Mamie Clark. The Clarks demonstrated that while young white girls would naturally choose to play with white dolls, young Black girls would, too. The Clarks argued that Black children’s aesthetic and moral preference for white dolls demonstrated the pernicious effects and self-loathing produced by segregation.

Identifying and denouncing injustice, though, is different from rectifying it. Though Brown repudiated Plessy, the Court’s orders did not extend to segregation in places other than public schools and, even then, to preserve a unanimous decision for such an historically important case, the justices set aside the divisive yet essential question of enforcement. Their infamously ambiguous order in 1955 (what came to be known as Brown II) that school districts desegregate “with all deliberate speed” was so vague and ineffectual that it left the actual business of desegregation in the hands of those who opposed it.

Photograph of white anti-integration protestors. Signs read "Race mixing is communism" and "Stop the Race Mixing March of the Antichrist."

In 1959, photographer John Bledsoe captured this image of the crowd on the steps of the Arkansas state capitol building protesting the federally mandated integration of Little Rock’s Central High School. This image shows how worries about desegregation were bound up with other concerns, such as the reach of communism and government power. Library of Congress.

In most of the South, as well as the rest of the country, school integration did not occur on a wide scale until well after Brown. Only in the 1964 Civil Rights Act did the federal government finally implement some enforcement of the Brown decision by threatening to withhold funding from recalcitrant school districts, but even then southern districts found loopholes. Court decisions such as Green v. New Kent County (1968) and Alexander v. Holmes (1969) finally closed some of those loopholes, such as “freedom of choice” plans, to compel some measure of actual integration.

When Brown finally was enforced in the South, the quantitative impact was staggering. In 1968, fourteen years after Brown, some 80 percent of school-age Black southerners remained in schools that were 90 to 100 percent nonwhite. By 1972, though, just 25 percent were in such schools, and 55 percent remained in schools with a simple nonwhite minority. By many measures, the public schools of the South became, ironically, the most integrated in the nation.17

As a landmark moment in American history, Brown’s significance perhaps lies less in immediate tangible changes—which were slow, partial, and inseparable from a much longer chain of events—than in the idealism it expressed and the momentum it created. The nation’s highest court had attacked one of the fundamental supports of Jim Crow segregation and offered constitutional cover for the creation of one of the greatest social movements in American history.

 

IV. Civil Rights in an Affluent Society

Photograph of a Black boy drinking from a water fountain. Affixed to the tree next to it is a sign that reads "COLORED."

This segregated drinking fountain was located on the grounds of the Halifax County courthouse in North Carolina. Photograph, April 1938. Wikimedia.

Education was but one aspect of the nation’s Jim Crow machinery. African Americans had been fighting against a variety of racist policies, cultures, and beliefs in all aspects of American life. And while the struggle for Black inclusion had few victories before World War II, the war and the Double V campaign for victory against fascism abroad and racism at home, as well as the postwar economic boom, led to rising expectations for many African Americans. When persistent racism and racial segregation undercut the promise of economic and social mobility, African Americans began mobilizing on an unprecedented scale against the various discriminatory social and legal structures.

While many of the civil rights movement’s most memorable and important moments, such as the sit-ins, the Freedom Rides, and especially the March on Washington, occurred in the 1960s, the 1950s were a significant decade in the sometimes tragic, sometimes triumphant march of civil rights in the United States. In 1953, years before Rosa Parks’s iconic confrontation on a Montgomery city bus, an African American woman named Sarah Keys publicly challenged segregated public transportation. Keys, then serving in the Women’s Army Corps, traveled from her army base in New Jersey back to North Carolina to visit her family. When the bus stopped in North Carolina, the driver asked her to give up her seat for a white customer. Her refusal to do so landed her in jail in 1953 and led to a landmark 1955 decision, Sarah Keys v. Carolina Coach Company, in which the Interstate Commerce Commission ruled that segregated seating in interstate bus travel violated the nondiscrimination provisions of the Interstate Commerce Act. Poorly enforced, the ruling nevertheless gave legal cover to the Freedom Riders years later and motivated further assaults against Jim Crow.

But if some events encouraged civil rights workers with the promise of progress, others were so savage they convinced activists that they could do nothing but resist. In the summer of 1955, two white men in Mississippi kidnapped and brutally murdered fourteen-year-old Emmett Till. Till, visiting from Chicago and perhaps unfamiliar with the “etiquette” of Jim Crow, allegedly whistled at a white woman named Carolyn Bryant. Her husband, Roy Bryant, and another man, J. W. Milam, abducted Till from his relatives’ home, beat him, mutilated him, shot him, and threw his body in the Tallahatchie River. Emmett’s mother held an open-casket funeral so that Till’s disfigured body could make national news. The men were brought to trial. The evidence was damning, but an all-white jury found the two not guilty. Mere months after the decision, the two boasted of their crime, in all of its brutal detail, in Look magazine. “They ain’t gonna go to school with my kids,” Milam said. They wanted “to make an example of [Till]—just so everybody can know how me and my folks stand.”18 The Till case became an indelible memory for the young Black men and women soon to propel the civil rights movement forward.

On December 1, 1955, four months after Till’s death and six days after the Keys v. Carolina Coach Company decision, Rosa Parks refused to surrender her seat on a Montgomery city bus and was arrested. Montgomery’s public transportation system had longstanding rules requiring African American passengers to sit in the back of the bus and to give up their seats to white passengers if the buses filled. Parks was not the first to protest the policy by staying seated, but she was the first around whom Montgomery activists rallied.

Activists sprang into action. Jo Ann Robinson, who as head of the Women’s Political Council had long fought against the city’s segregated busing, worked long into the night with a colleague and two students from Alabama State College to mimeograph over 50,000 handbills calling for an immediate boycott. Montgomery’s Black community responded, and, in response, local ministers and civil rights workers formed the Montgomery Improvement Association (MIA) to coordinate an organized, sustained boycott of the city’s buses. The Montgomery Bus Boycott lasted from December 1955 until December 20, 1956, when the Supreme Court ordered the buses integrated. The boycott not only crushed segregation in Montgomery’s public transportation, it energized the entire civil rights movement and established the leadership of the MIA’s president, a recently arrived, twenty-six-year-old Baptist minister named Martin Luther King Jr.

Motivated by the success of the Montgomery boycott, King and other Black leaders looked to continue the fight. In 1957, King, fellow ministers such as Ralph Abernathy and Fred Shuttlesworth, and key staffers such as Ella Baker and Septima Clark helped create and run the Southern Christian Leadership Conference (SCLC) to coordinate civil rights groups across the South in their efforts to organize and sustain boycotts, protests, and other assaults against Jim Crow discrimination.

As pressure built, Congress passed the Civil Rights Act of 1957, the first such measure passed since Reconstruction. The act was compromised away nearly to nothing, although it did achieve some gains, such as creating the Civil Rights Division within the Department of Justice and the U.S. Commission on Civil Rights, the latter charged with investigating claims of racial discrimination. And yet, despite its weakness, the act signaled that pressure was finally mounting on Americans to confront the legacy of discrimination.

Despite successes at both the local and national level, the civil rights movement faced bitter opposition. Those opposed to the movement often used violent tactics to scare and intimidate African Americans and subvert legal rulings and court orders. For example, a year into the Montgomery bus boycott, angry white southerners bombed four African American churches as well as the homes of King and fellow civil rights leader E. D. Nixon. Though King, Nixon, and the MIA persevered in the face of such violence, it was only a taste of things to come. Such unremitting hostility and violence left the outcome of the burgeoning civil rights movement in doubt. Despite its successes, civil rights activists looked back on the 1950s as a decade of mixed results and incomplete accomplishments. While the bus boycott, Supreme Court rulings, and other civil rights activities signaled progress, church bombings, death threats, and stubborn legislators demonstrated the distance that still needed to be traveled.

 

V. Gender and Culture in the Affluent Society

There are two advertisements here, one for an oven and another for a washer and dryer set. Women appear happy in both ads. In the washer and dryer ad, a happy husband and two children also admire the appliances.

As shown in this 1958 advertisement for a “Westinghouse with Cold Injector,” a midcentury marketing frenzy targeted female consumers by touting technological innovations designed to make housework easier. Westinghouse.

America’s consumer economy reshaped how Americans experienced culture and shaped their identities. The Affluent Society gave Americans new experiences, new outlets, and new ways to understand and interact with one another.

“The American household is on the threshold of a revolution,” the New York Times declared in August 1948. “The reason is television.”19 Television was presented to the American public at the New York World’s Fair in 1939, but commercialization of the new medium in the United States lagged during the war years. In 1947, though, regular full-scale broadcasting became available to the public. Television was instantly popular, so much so that by early 1948 Newsweek reported that it was “catching on like a case of high-toned scarlet fever.”20 Indeed, between 1948 and 1955 close to two thirds of the nation’s households purchased a television set. By the end of the 1950s, 90 percent of American families had one and the average viewer was tuning in for almost five hours a day.21

The technological ability to transmit images via radio waves gave birth to television. Television borrowed radio’s organizational structure, too. The big radio broadcasting companies—NBC, CBS, and the American Broadcasting Company (ABC)—used their technical expertise and capital reserves to conquer the airwaves. They acquired licenses to local stations and eliminated their few independent competitors. The refusal of the Federal Communications Commission (FCC) to issue any new licenses between 1948 and 1955 was a de facto endorsement of the big three’s stranglehold on the market.

In addition to replicating radio’s organizational structure, television also looked to radio for content. Many of the early programs were adaptations of popular radio variety and comedy shows, including The Ed Sullivan Show and Milton Berle’s Texaco Star Theater. These were accompanied by live plays, dramas, sports, and situation comedies. Because of the cost and difficulty of recording, most programs were broadcast live, forcing stations across the country to air shows at the same time. And since audiences had a limited number of channels to choose from, viewing experiences were broadly shared. More than two thirds of television-owning households, for instance, watched popular shows such as I Love Lucy.

The limited number of channels and programs meant that networks selected programs that appealed to the widest possible audience to draw viewers and advertisers, television’s greatest financiers. By the mid-1950s, an hour of primetime programming cost about $150,000 (about $1.5 million in today’s dollars) to produce. This proved too expensive for most commercial sponsors, who began turning to a joint financing model of thirty-second spot ads. The need to appeal to as many people as possible promoted the production of noncontroversial shows aimed at the entire family. Programs such as Father Knows Best and Leave It to Beaver featured light topics, humor, and a guaranteed happy ending the whole family could enjoy.22

This still image from a game show shows the podium emblazoned with the company name Geritol.

Advertising was everywhere in the 1950s, including on TV shows such as the quiz show Twenty One, sponsored by Geritol, a dietary supplement. Library of Congress.

Television’s broad appeal, however, was about more than money and entertainment. Shows of the 1950s, such as Father Knows Best and I Love Lucy, idealized the nuclear family, “traditional” gender roles, and white, middle-class domesticity. Leave It to Beaver, which became the prototypical example of the 1950s television family, depicted its breadwinner father and homemaker mother guiding their children through life lessons. Such shows, and Cold War America more broadly, reinforced a popular consensus that such lifestyles were not only beneficial but the most effective way to safeguard American prosperity against communist threats and social “deviancy.”

 

Postwar prosperity facilitated, and in turn was supported by, the ongoing postwar baby boom. From 1946 to 1964, American fertility experienced an unprecedented spike. A century of declining birth rates abruptly reversed. Although popular memory credits the cause of the baby boom to the return of virile soldiers from battle, the real story is more nuanced. After years of economic depression, families were now wealthy enough to support larger families and had homes large enough to accommodate them, while women married younger and American culture celebrated the ideal of a large, insular family.

Underlying this “reproductive consensus” was the new cult of professionalism that pervaded postwar American culture, including the professionalization of homemaking. Mothers and fathers alike flocked to the experts for their opinions on marriage, sexuality, and, most especially, child-rearing. Psychiatrists held an almost mythic status as people took their opinions and prescriptions, as well as their vocabulary, into their everyday lives. Books like Dr. Spock’s Baby and Child Care (1946) were diligently studied by women who took their career as housewife as just that: a career, complete with all the demands and professional trappings of job development and training. And since most women had multiple children roughly the same age as their neighbors’ children, a cultural obsession with kids flourished throughout the era. Women bore the brunt of this pressure, chided if they did not give enough of their time to the children—especially if it was because of a career—yet cautioned that spending too much time would lead to “Momism,” producing “sissy” boys who would be incapable of contributing to society and extremely susceptible to the communist threat.

A new youth culture exploded in American popular culture. On the one hand, the anxieties of the atomic age hit America’s youth particularly hard. Keenly aware of the discontent bubbling beneath the surface of the Affluent Society, many youth embraced rebellion. The 1955 film Rebel Without a Cause demonstrated the restlessness and emotional incertitude of the postwar generation raised in increasing affluence yet increasingly unsatisfied with their comfortable lives. At the same time, perhaps yearning for something beyond the “massification” of American culture yet having few other options to turn to beyond popular culture, American youth embraced rock ’n’ roll. They listened to Little Richard, Buddy Holly, and especially Elvis Presley (whose sexually suggestive hip movements were judged subversive).

The popularity of rock ’n’ roll had not yet blossomed into the countercultural musical revolution of the coming decade, but it provided a magnet for teenage restlessness and rebellion. “Television and Elvis,” the musician Bruce Springsteen recollected, “gave us full access to a new language, a new form of communication, a new way of being, a new way of looking, a new way of thinking; about sex, about race, about identity, about life; a new way of being an American, a human being; and a new way of hearing music.” American youth had seen so little of Elvis’s energy and sensuality elsewhere in their culture. “Once Elvis came across the airwaves,” Springsteen said, “once he was heard and seen in action, you could not put the genie back in the bottle. After that moment, there was yesterday, and there was today, and there was a red hot, rockabilly forging of a new tomorrow, before your very eyes.”23

This photograph of Elvis depicts the rock star dancing.

While many Black musicians such as Chuck Berry helped pioneer rock ’n’ roll, white artists such as Elvis Presley brought it into the mainstream American culture. Elvis’s good looks, sensual dancing, and sonorous voice stole the hearts of millions of American teenage girls, who were at that moment becoming a central segment of the consumer population. Wikimedia.

Other Americans took larger steps to reject the expected conformity of the Affluent Society. The writers, poets, and musicians of the Beat Generation, disillusioned with capitalism, consumerism, and traditional gender roles, sought a deeper meaning in life. Beats traveled across the country, studied Eastern religions, and experimented with drugs, sex, and art.

Behind the scenes, Americans were challenging sexual mores. The gay rights movement, for instance, stretched back into the Affluent Society. While the country proclaimed homosexuality a mental disorder, gay men established the Mattachine Society in Los Angeles and gay women formed the Daughters of Bilitis in San Francisco as support groups. They held meetings, distributed literature, provided legal and counseling services, and formed chapters across the country. Much of their work, however, remained secretive because homosexuals risked arrest and abuse if discovered.24

Society’s “consensus,” on everything from the consumer economy to gender roles, did not go unchallenged. Much discontent was channeled through the machine itself: advertisers sold rebellion no less than they sold baking soda. And yet others were rejecting the old ways, choosing new lifestyles, challenging old hierarchies, and embarking on new paths.

 

VI. Politics and Ideology in the Affluent Society

Postwar economic prosperity and the creation of new suburban spaces inevitably shaped American politics. In stark contrast to the Great Depression, the new prosperity renewed belief in the superiority of capitalism, cultural conservatism, and religion.

In the 1930s, the ravages of the international economic catastrophe knocked the legs out from under the intellectual justifications for keeping government out of the economy. And yet pockets of true believers kept alive the gospel of the free market. The single most important was the National Association of Manufacturers (NAM). In the midst of the Depression, NAM reinvented itself and went on the offensive, initiating advertising campaigns supporting “free enterprise” and “The American Way of Life.”25 More importantly, NAM became a node for business leaders, such as J. Howard Pew of Sun Oil and Jasper Crane of DuPont Chemical Co., to network with like-minded individuals and take the message of free enterprise to the American people. The network of business leaders that NAM brought together in the midst of the Great Depression formed the financial, organizational, and ideological underpinnings of the free market advocacy groups that emerged and found ready adherents in America’s new suburban spaces in the postwar decades.

One of the most important advocacy groups that sprang up after the war was Leonard Read’s Foundation for Economic Education (FEE). Read founded FEE in 1946 on the premise that “The American Way of Life” was essentially individualistic and that the best way to protect and promote that individualism was through libertarian economics. Libertarianism took as its core principle the promotion of individual liberty, property rights, and an economy with a minimum of government regulation. FEE, whose advisory board and supporters came mostly from the NAM network of Pew and Crane, became a key ideological factory, supplying businesses, service clubs, churches, schools, and universities with a steady stream of libertarian literature, much of it authored by Austrian economist Ludwig von Mises.26

Shortly after FEE’s formation, the Austrian economist and libertarian intellectual Friedrich Hayek founded the Mont Pelerin Society (MPS) in 1947. The MPS brought together libertarian intellectuals from both sides of the Atlantic to challenge, within academia, Keynesian economics: the dominant notion that government fiscal and monetary policies were necessary economic tools. University of Chicago economist Milton Friedman became its president. Friedman (and his Chicago School of Economics) and the MPS became some of the most influential free market advocates in the world and helped legitimize for many the libertarian ideology so successfully evangelized by FEE, its descendant organizations, and libertarian popularizers such as the novelist Ayn Rand.27

Libertarian politics and evangelical religion were shaping the origins of a new conservative, suburban constituency. Suburban communities’ distance from government and other top-down community-building mechanisms (despite their reliance on government subsidies and programs) left a social void that evangelical churches eagerly filled. More often than not, the theology and ideology of these churches reinforced socially conservative views while simultaneously reinforcing congregants’ belief in economic individualism. The novelist Ayn Rand, meanwhile, whose novels The Fountainhead (1943) and Atlas Shrugged (1957) were two of the era’s best sellers, helped move the ideas of individualism, “rational self-interest,” and “the virtue of selfishness” outside the halls of business and academia and into suburbia. This ethos of individualism provided the building blocks for a new political movement. And yet, while the growing suburbs and their brewing conservative ideology eventually proved immensely important in American political life, their impact was not immediately felt. They did not yet have a champion.

In the post–World War II years the Republican Party faced a fork in the road. Its lack of success in presidential elections since the onset of the Depression led to a battle within the party over how to revive its electoral prospects. The more conservative faction, represented by Ohio senator Robert Taft (son of former president William Howard Taft) and backed by many party activists and financiers such as J. Howard Pew, sought to take the party further to the right, particularly in economic matters, by rolling back New Deal programs and policies. The more moderate wing, led by men such as New York governor Thomas Dewey and Nelson Rockefeller, sought instead to embrace and reform New Deal programs and policies. Party members further disagreed over how involved the United States should be in the world: issues such as foreign aid, collective security, and how best to fight communism divided the party.

In this photograph Harry Truman holds up a newspaper with the incorrect headline “DEWEY DEFEATS TRUMAN.”

Just as with the internet today, one could not always trust what one read in the newspapers. This famously incorrect banner from the front page of the Chicago Daily Tribune on November 3, 1948, made its own headlines as the newspaper’s most embarrassing gaffe. Photograph, 1948. http://media-2.web.britannica.com/eb-media/14/65214-050-D86AAA4E.jpg.

Initially, the moderates, or “liberals,” won control of the party with the nomination of Thomas Dewey in 1948. Dewey’s shocking loss to Truman, however, emboldened conservatives, who rallied around Taft as the 1952 presidential primaries approached. With the conservative banner riding high in the party, General Dwight Eisenhower (“Ike”), most recently the supreme commander of the North Atlantic Treaty Organization (NATO), felt obliged to join the race to beat back the conservatives and “prevent one of our two great parties from adopting a course which could lead to national suicide.” In addition to fearing that Taft and the conservatives would undermine collective security arrangements such as NATO, Eisenhower berated the “neanderthals” in his party for their anti–New Deal stance. He felt that the best way to stop communism was to undercut its appeal by alleviating the conditions under which it was most attractive. That meant supporting New Deal programs. There was also a political calculus to Eisenhower’s position. He observed, “Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history.”28

The primary contest between Taft and Eisenhower was close and controversial. Taft supporters claimed that Eisenhower stole the nomination from Taft at the convention. Eisenhower, attempting to placate the conservatives in his party, picked California congressman and virulent anticommunist Richard Nixon as his running mate. With the Republican nomination sewn up, the immensely popular Eisenhower swept to victory in the 1952 general election, easily besting Truman’s hand-picked successor, Adlai Stevenson. Eisenhower’s popularity boosted Republicans across the country, leading them to majorities in both houses of Congress.

The Republican sweep in the 1952 election, owing in part to Eisenhower’s popularity, nevertheless translated into few tangible legislative accomplishments. Within two years of his election, the moderate Eisenhower saw his legislative proposals routinely defeated by an unlikely alliance of conservative Republicans, who thought Eisenhower was going too far, and liberal Democrats, who thought he was not going far enough. In 1954, for example, Eisenhower proposed a national healthcare plan that would have provided federal support for expanding healthcare coverage across the nation without involving the government directly in regulating the healthcare industry. The proposal was defeated in the House by a 238–134 vote, with a swing bloc of seventy-five conservative Republicans joining liberal Democrats in voting against the plan.29 Eisenhower’s proposals in education and agriculture often suffered similar defeats. By the end of his presidency, Ike’s domestic legislative achievements were largely limited to expanding Social Security; making Health, Education, and Welfare (HEW) a cabinet-level department; passing the National Defense Education Act; and bolstering federal support for education, particularly in math and science.

As with any president, however, Eisenhower’s impact was bigger than his legislation. Ike’s “middle of the road” philosophy guided his foreign policy as much as his domestic agenda. He sought to keep the United States out of direct interventions abroad by bolstering anticommunist and procapitalist allies. Ike funneled money to the French fighting the Ho Chi Minh–led communists in Vietnam, walked a fine line between aiding Chiang Kai-shek’s Taiwan and overtly provoking Mao Zedong’s China, and materially backed groups that destabilized “unfriendly” governments in Iran and Guatemala. The centerpiece of Ike’s Soviet policy, meanwhile, was “massive retaliation,” the threat of nuclear force in the face of communist expansion, which checked Soviet ambitions without direct American involvement. While Ike’s “middle way” won broad popular support, his own party was slowly moving away from his positions. By 1964 the party had moved far enough to the right to nominate Arizona senator Barry Goldwater, the most conservative candidate in a generation. The political moderation of the Affluent Society proved little more than a way station on the road to liberal reforms and a more distant conservative ascendancy.

 

VII. Conclusion

The postwar American “consensus” held great promise. Despite the looming threat of nuclear war, millions experienced an unprecedented prosperity and an increasingly proud American identity. Prosperity seemed to promise ever higher standards of living. But things fell apart, and the center could not hold: wracked by contradiction, dissent, discrimination, and inequality, the Affluent Society stood on the precipice of revolution.

 

VIII. Primary Sources

1. Migrant Farmers and Immigrant Labor (1952)

During the labor shortages of World War II, the United States launched the Bracero (“laborer”) program to bring Mexican laborers into the United States. The program continued into the 1960s and brought more than a million workers into the United States on short-term contracts. Undocumented immigration continued, however. Congress held hearings and, in the selection below, a migrant worker named Juanita Garcia testifies to Congress about the state of affairs in California’s Imperial Valley. Beginning in 1954, Dwight Eisenhower’s administration oversaw, with the cooperation of the Mexican government, “Operation Wetback,” which empowered the Border Patrol to crack down upon illegal immigration.

2. Hernandez v. Texas (1954)

Pete Hernandez, a migrant worker, was tried for the murder of his employer, Joe Espinosa, in Edna, Texas, in 1950. Hernandez was convicted by an all-white jury. His lawyers appealed. They argued that Hernandez was entitled to a jury “of his peers” and that systematic exclusion of Mexican Americans violated constitutional law. In a unanimous decision, the United States Supreme Court ruled that Mexican Americans—and all “classes”—were entitled to the “equal protection” articulated in the Fourteenth Amendment.

3. Brown v. Board of Education of Topeka (1954)

In 1896, the United States Supreme Court declared in Plessy v. Ferguson that the doctrine of “separate but equal” was constitutional. In 1954, the United States Supreme Court overturned that decision and ruled unanimously against school segregation.

4. Richard Nixon on the American Standard of Living (1959)

As Cold War tensions eased, exhibitions allowed Americans and Soviets to survey each other’s culture and way of life. In 1959, the Russians held an exhibition in New York, and the Americans held one in Moscow. A videotaped discussion between Vice President Richard Nixon and Soviet premier Nikita Khrushchev, the so-called Kitchen Debate, won Nixon acclaim at home for his articulate defense of the American standard of living. In the following extract from July 24, 1959, Nixon opened the American Exhibition in Moscow.

5. John F. Kennedy on the Separation of Church and State (1960)

American anti-Catholicism had softened in the aftermath of World War II, but no Catholic had ever been elected president, and Protestant Americans remained suspicious of Catholic politicians when John F. Kennedy ran for the presidency in 1960. (Al Smith, the first Catholic presidential candidate, was roundly defeated in 1928, owing in large part to popular anti-Catholic prejudice.) On September 12, 1960, Kennedy addressed the Greater Houston Ministerial Association; he not only allayed popular fears of his Catholic faith but also delivered a seminal statement on the separation of church and state.

6. Congressman Arthur L. Miller Gives “the Putrid Facts” About Homosexuality (1950)

In 1950, Representative Arthur L. Miller, a Nebraska Republican, offered an amendment to a bill requiring background checks for employees of the Economic Cooperation Administration (ECA). Miller proposed to bar homosexuals from working with the ECA. Although his amendment was rejected, his views of homosexuality revealed much about postwar American views.

7. Rosa Parks on Life in Montgomery, Alabama (1956-1958)

In this unfinished correspondence and undated personal notes, Rosa Parks recounted living under segregation in Montgomery, Alabama, explained why she refused to surrender her seat on a city bus, and lamented the psychological toll exacted by Jim Crow.

8. Little Rock Rally (1959)

In 1959, photographer John Bledsoe captured this image of the crowd on the steps of the Arkansas state capitol building, protesting the federally mandated integration of Little Rock’s Central High School. This image shows how worries about desegregation were bound up with other concerns, such as the reach of communism and government power.

9. “In the Suburbs” (1957)

Redbook made this film to convince advertisers that the magazine would help them attract the white suburban consumers they desired. The “happy go spending, buy it now, young adults of today” are depicted by the film as flocking to the suburbs to escape global and urban turmoil. Redbook Magazine, “In The Suburbs” (1957). Via The Internet Archive.

 

IX. Reference Material

This chapter was edited by James McKay, with content contributions by Edwin C. Breeden, Aaron Cowan, Elsa Devienne, Maggie Flamingo, Destin Jenkins, Kyle Livie, Jennifer Mandel, James McKay, Laura Redford, Ronny Regev, and Tanya Roth.

Recommended citation: Edwin C. Breeden et al., “The Affluent Society,” James McKay, ed., in The American Yawp, eds. Joseph Locke and Ben Wright (Stanford, CA: Stanford University Press, 2018).

Recommended Reading

  • Boyle, Kevin. The UAW and the Heyday of American Liberalism, 1945–1968. Ithaca, NY: Cornell University Press, 1995.
  • Branch, Taylor. Parting the Waters: America in the King Years, 1954–1963. New York: Simon and Schuster, 1988.
  • Brown, Kate. Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters. New York: Oxford University Press, 2013.
  • Brown-Nagin, Tomiko. Courage to Dissent: Atlanta and the Long History of the Civil Rights Movement. New York: Oxford University Press, 2011.
  • Cohen, Lizabeth. A Consumer’s Republic: The Politics of Mass Consumption in Postwar America. New York: Knopf, 2003.
  • Coontz, Stephanie. The Way We Never Were: American Families and the Nostalgia Trap. New York: Basic Books, 1993.
  • Dudziak, Mary. Cold War Civil Rights: Race and the Image of American Democracy. Princeton, NJ: Princeton University Press, 2002.
  • Fried, Richard M. Nightmare in Red: The McCarthy Era in Perspective. New York: Oxford University Press, 1990.
  • Grisinger, Joanna. The Unwieldy American State: Administrative Politics Since the New Deal. Cambridge, UK: Cambridge University Press, 2012.
  • Hernández, Kelly Lytle. Migra! A History of the U.S. Border Patrol. Berkeley: University of California Press, 2010.
  • Horowitz, Daniel. Betty Friedan and the Making of the Feminine Mystique: The American Left, the Cold War, and Modern Feminism. Amherst: University of Massachusetts Press, 1998.
  • Jackson, Kenneth T. Crabgrass Frontier: The Suburbanization of the United States. New York: Oxford University Press, 1985.
  • Jumonville, Neil. Critical Crossings: The New York Intellectuals in Postwar America. Berkeley: University of California Press, 1991.
  • Levenstein, Lisa. A Movement Without Marches: African American Women and the Politics of Poverty in Postwar Philadelphia. Chapel Hill: University of North Carolina Press, 2009.
  • May, Elaine Tyler. Homeward Bound: American Families in the Cold War Era. New York: Basic Books, 1988.
  • McGirr, Lisa. Suburban Warriors: The Origins of the New American Right. Princeton, NJ: Princeton University Press, 2001.
  • Ngai, Mae. Impossible Subjects: Illegal Aliens and the Making of Modern America. Princeton, NJ: Princeton University Press, 2003.
  • Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  • Roberts, Gene, and Hank Klibanoff. The Race Beat: The Press, the Civil Rights Struggle, and the Awakening of a Nation. New York: Knopf, 2006.
  • Self, Robert. American Babylon: Race and the Struggle for Postwar Oakland. Princeton, NJ: Princeton University Press, 2005.
  • Sugrue, Thomas. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Princeton, NJ: Princeton University Press, 2005.
  • Von Eschen, Penny. Satchmo Blows Up the World: Jazz Ambassadors Play the Cold War. Cambridge, MA: Harvard University Press, 2004.
  • Wagnleitner, Reinhold. Coca-Colonization and the Cold War: The Cultural Mission of the United States in Austria After the Second World War. Chapel Hill: University of North Carolina Press, 1994.
  • Wall, Wendy. Inventing the “American Way”: The Politics of Consensus from the New Deal to the Civil Rights Movement. New York: Oxford University Press, 2008.
  • Whitfield, Stephen. The Culture of the Cold War. Baltimore: Johns Hopkins University Press, 1991.

 

Notes

  1. John Kenneth Galbraith, The Affluent Society (New York: Houghton Mifflin, 1958), 129.
  2. See, for example, Claudia Goldin and Robert A. Margo, “The Great Compression: The Wage Structure in the United States at Mid-Century,” Quarterly Journal of Economics 107 (February 1992), 1–34.
  3. Price Fishback, Jonathan Rose, and Kenneth Snowden, Well Worth Saving: How the New Deal Safeguarded Home Ownership (Chicago: University of Chicago Press, 2013).
  4. Leo Schnore, “The Growth of Metropolitan Suburbs,” American Sociological Review 22 (April 1957), 169.
  5. Lizabeth Cohen, A Consumers’ Republic: The Politics of Mass Consumption in Postwar America (New York: Random House, 2002), 202.
  6. Elaine Tyler May, Homeward Bound: American Families in the Cold War Era (New York: Basic Books, 1999), 152.
  7. Leo Fishman, The American Economy (Princeton, NJ: Van Nostrand, 1962), 560.
  8. John P. Diggins, The Proud Decades: America in War and in Peace, 1941–1960 (New York: Norton, 1989), 219.
  9. David Kushner, Levittown: Two Families, One Tycoon, and the Fight for Civil Rights in America’s Legendary Suburb (New York: Bloomsbury Press, 2009), 17.
  10. Thomas Sugrue, The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit (Princeton, NJ: Princeton University Press, 2005).
  11. Becky M. Nicolaides, My Blue Heaven: Life and Politics in the Working-Class Suburbs of Los Angeles, 1920–1965 (Chicago: University of Chicago Press, 2002), 193.
  12. Adam W. Rome, The Bulldozer in the Countryside: Suburban Sprawl and the Rise of American Environmentalism (Cambridge, UK: Cambridge University Press, 2001), 7.
  13. See also J. R. McNeill and Peter Engelke, The Great Acceleration: An Environmental History of the Anthropocene (Cambridge, MA: Harvard University Press, 2016); Andrew Needham, Power Lines: Phoenix and the Making of the Modern Southwest (Princeton, NJ: Princeton University Press, 2014); and Ted Steinberg, Down to Earth: Nature’s Role in American History (New York: Oxford University Press, 2002).
  14. Oliver Brown, et al. v. Board of Education of Topeka, et al., 347 U.S. 483 (1954).
  15. James T. Patterson and William W. Freehling, Brown v. Board of Education: A Civil Rights Milestone and Its Troubled Legacy (New York: Oxford University Press, 2001), 25; Pete Daniel, Standing at the Crossroads: Southern Life in the Twentieth Century (Baltimore: Johns Hopkins University Press, 1996), 161–164.
  16. Patterson and Freehling, Brown v. Board, xxv.
  17. Charles T. Clotfelter, After Brown: The Rise and Retreat of School Desegregation (Princeton, NJ: Princeton University Press, 2011), 6.
  18. William Bradford Huie, “The Shocking Story of Approved Killing in Mississippi,” Look (January 24, 1956), 46–50.
  19. Lewis L. Gould, Watching Television Come of Age: The New York Times Reviews by Jack Gould (Austin: University of Texas Press, 2002), 186.
  20. Gary Edgerton, Columbia History of American Television (New York: Columbia University Press, 2009), 90.
  21. Ibid., 178.
  22. Christopher H. Sterling and John Michael Kittross, Stay Tuned: A History of American Broadcasting (New York: Routledge, 2001), 364.
  23. Bruce Springsteen, “SXSW Keynote Address,” Rolling Stone (March 28, 2012), http://www.rollingstone.com/music/news/exclusive-the-complete-text-of-bruce-springsteens-sxsw-keynote-address-20120328.
  24. John D’Emilio, Sexual Politics, Sexual Communities, 2nd ed. (Chicago: University of Chicago Press, 2012), 102–103.
  25. See Richard Tedlow, “The National Association of Manufacturers and Public Relations During the New Deal,” Business History Review 50 (Spring 1976), 25–45; and Wendy Wall, Inventing the “American Way”: The Politics of Consensus from the New Deal to the Civil Rights Movement (New York: Oxford University Press, 2008).
  26. Gregory Eow, “Fighting a New Deal: Intellectual Origins of the Reagan Revolution, 1932–1952,” PhD diss., Rice University, 2007; Brian Doherty, Radicals for Capitalism: A Freewheeling History of the Modern American Libertarian Movement (New York: Public Affairs, 2007); and Kim Phillips-Fein, Invisible Hands: The Businessmen’s Crusade Against the New Deal (New York: Norton, 2009), 43–55.
  27. Angus Burgin, The Great Persuasion: Reinventing Free Markets Since the Depression (Cambridge, MA: Harvard University Press, 2012); Jennifer Burns, Goddess of the Market: Ayn Rand and the American Right (New York: Oxford University Press, 2009).
  28. Allan J. Lichtman, White Protestant Nation: The Rise of the American Conservative Movement (New York: Atlantic Monthly Press, 2008), 180, 201, 185.
  29. Steven Wagner, Eisenhower Republicanism: Pursuing the Middle Way (DeKalb: Northern Illinois University Press, 2006), 15.