United States

United States
a republic in the N Western Hemisphere comprising 48 conterminous states, the District of Columbia, and Alaska in North America, and Hawaii in the N Pacific. 267,954,767; conterminous United States, 3,022,387 sq. mi. (7,827,982 sq. km); with Alaska and Hawaii, 3,615,122 sq. mi. (9,363,166 sq. km). Cap.: Washington, D.C. Abbr.: U.S., US. Also called United States of America, America.

* * *

United States

Introduction United States
Background: Britain's American colonies broke with the mother country in 1776 and were recognized as the new nation of the United States of America following the Treaty of Paris in 1783. During the 19th and 20th centuries, 37 new states were added to the original 13 as the nation expanded across the North American continent and acquired a number of overseas possessions. The two most traumatic experiences in the nation's history were the Civil War (1861-65) and the Great Depression of the 1930s. Buoyed by victories in World Wars I and II and the end of the Cold War in 1991, the US remains the world's most powerful nation-state. The economy is marked by steady growth, low unemployment and inflation, and rapid advances in technology. Geography United States -
Location: North America, bordering both the North Atlantic Ocean and the North Pacific Ocean, between Canada and Mexico
Geographic coordinates: 38 00 N, 97 00 W
Map references: North America
Area: total: 9,629,091 sq km land: 9,158,960 sq km water: 470,131 sq km note: includes only the 50 states and District of Columbia
Area - comparative: about half the size of Russia; about three-tenths the size of Africa; about half the size of South America (or slightly larger than Brazil); slightly larger than China; about two and a half times the size of Western Europe
Land boundaries: total: 12,034 km border countries: Canada 8,893 km (including 2,477 km with Alaska), Mexico 3,141 km note: US Naval Base at Guantanamo Bay, Cuba is leased by the US and thus remains part of Cuba; the base boundary is 29 km
Coastline: 19,924 km
Maritime claims: contiguous zone: 24 NM continental shelf: not specified exclusive economic zone: 200 NM territorial sea: 12 NM
Climate: mostly temperate, but tropical in Hawaii and Florida, arctic in Alaska, semiarid in the Great Plains west of the Mississippi River, and arid in the Great Basin of the southwest; low winter temperatures in the northwest are ameliorated occasionally in January and February by warm chinook winds from the eastern slopes of the Rocky Mountains
Terrain: vast central plain, mountains in west, hills and low mountains in east; rugged mountains and broad river valleys in Alaska; rugged, volcanic topography in Hawaii
Elevation extremes: lowest point: Death Valley -86 m highest point: Mount McKinley 6,194 m
Natural resources: coal, copper, lead, molybdenum, phosphates, uranium, bauxite, gold, iron, mercury, nickel, potash, silver, tungsten, zinc, petroleum, natural gas, timber
Land use: arable land: 19.32% permanent crops: 0.22% other: 80.46% (1998 est.)
Irrigated land: 214,000 sq km (1998 est.)
Natural hazards: tsunamis, volcanoes, and earthquake activity around Pacific Basin; hurricanes along the Atlantic and Gulf of Mexico coasts; tornadoes in the midwest and southeast; mud slides in California; forest fires in the west; flooding; permafrost in northern Alaska, a major impediment to development
Environment - current issues: air pollution resulting in acid rain in both the US and Canada; the US is the largest single emitter of carbon dioxide from the burning of fossil fuels; water pollution from runoff of pesticides and fertilizers; very limited natural fresh water resources in much of the western part of the country require careful management; desertification
Environment - international agreements: party to: Air Pollution, Air Pollution-Nitrogen Oxides, Antarctic-Environmental Protocol, Antarctic-Marine Living Resources, Antarctic Seals, Antarctic Treaty, Climate Change, Desertification, Endangered Species, Environmental Modification, Marine Dumping, Marine Life Conservation, Nuclear Test Ban, Ozone Layer Protection, Ship Pollution, Tropical Timber 83, Tropical Timber 94, Wetlands, Whaling signed, but not ratified: Air Pollution-Persistent Organic Pollutants, Air Pollution-Volatile Organic Compounds, Biodiversity, Climate Change-Kyoto Protocol, Hazardous Wastes
Geography - note: world's third-largest country by size (after Russia and Canada) and by population (after China and India); Mt. McKinley is highest point in North America and Death Valley the lowest point on the continent People United States
Population: 280,562,489 (July 2002 est.)
Age structure: 0-14 years: 21% (male 30,116,782; female 28,765,183) 15-64 years: 66.4% (male 92,391,120; female 93,986,468) 65 years and over: 12.6% (male 14,748,522; female 20,554,414) (2002 est.)
Population growth rate: 0.89% (2002 est.)
Birth rate: 14.1 births/1,000 population (2002 est.)
Death rate: 8.7 deaths/1,000 population (2002 est.)
Net migration rate: 3.5 migrant(s)/1,000 population (2002 est.)
Sex ratio: at birth: 1.05 male(s)/female under 15 years: 1.05 male(s)/female 15-64 years: 0.98 male(s)/female 65 years and over: 0.72 male(s)/ female total population: 0.96 male(s)/ female (2002 est.)
Infant mortality rate: 6.69 deaths/1,000 live births (2002 est.)
Life expectancy at birth: total population: 77.4 years male: 74.5 years female: 80.2 years (2002 est.)
Total fertility rate: 2.07 children born/woman (2002 est.)
HIV/AIDS - adult prevalence rate: 0.61% (1999 est.)
HIV/AIDS - people living with HIV/AIDS: 850,000 (1999 est.)
HIV/AIDS - deaths: 20,000 (1999 est.)
Nationality: noun: American(s) adjective: American
Ethnic groups: white 77.1%, black 12.9%, Asian 4.2%, Amerindian and Alaska native 1.5%, native Hawaiian and other Pacific islander 0.3%, other 4% (2000) note: a separate listing for Hispanic is not included because the US Census Bureau considers Hispanic to mean a person of Latin American descent (especially of Cuban, Mexican, or Puerto Rican origin) living in the US who may be of any race or ethnic group (white, black, Asian, etc.)
Religions: Protestant 56%, Roman Catholic 28%, Jewish 2%, other 4%, none 10% (1989)
Languages: English, Spanish (spoken by a sizable minority)
Literacy: definition: age 15 and over can read and write male: 97% female: 97% (1979 est.) total population: 97%
People - note: data for the US are based on projections that do not take into consideration the results of the 2000 census Government United States
Country name: conventional long form: United States of America conventional short form: United States abbreviation: US or USA
Government type: federal republic; strong democratic tradition
Capital: Washington, DC
Administrative divisions: 50 states and 1 district*; Alabama, Alaska, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, District of Columbia*, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming
Dependent areas: American Samoa, Baker Island, Guam, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Islands, Navassa Island, Northern Mariana Islands, Palmyra Atoll, Puerto Rico, Virgin Islands, Wake Island note: from 18 July 1947 until 1 October 1994, the US administered the Trust Territory of the Pacific Islands, but recently entered into a new political relationship with all four political units: the Northern Mariana Islands is a commonwealth in political union with the US (effective 3 November 1986); Palau concluded a Compact of Free Association with the US (effective 1 October 1994); the Federated States of Micronesia signed a Compact of Free Association with the US (effective 3 November 1986); the Republic of the Marshall Islands signed a Compact of Free Association with the US (effective 21 October 1986)
Independence: 4 July 1776 (from Great Britain)
National holiday: Independence Day, 4 July (1776)
Constitution: 17 September 1787, effective 4 March 1789
Legal system: based on English common law; judicial review of legislative acts; accepts compulsory ICJ jurisdiction, with reservations
Suffrage: 18 years of age; universal
Executive branch: chief of state: President George W. BUSH (since 20 January 2001) and Vice President Richard B. CHENEY (since 20 January 2001); note - the president is both the chief of state and head of government head of government: President George W. BUSH (since 20 January 2001) and Vice President Richard B. CHENEY (since 20 January 2001); note - the president is both the chief of state and head of government cabinet: Cabinet appointed by the president with Senate approval elections: president and vice president elected on the same ticket by a college of representatives who are elected directly from each state; president and vice president serve four-year terms; election last held 7 November 2000 (next to be held 2 November 2004) election results: George W. BUSH elected president; percent of popular vote - George W. BUSH (Republican Party) 48%, Albert A. GORE, Jr. (Democratic Party) 48%, Ralph NADER (Green Party) 3%, other 1%
Legislative branch: bicameral Congress consists of the Senate (100 seats, one-third are renewed every two years; two members are elected from each state by popular vote to serve six-year terms) and the House of Representatives (435 seats; members are directly elected by popular vote to serve two-year terms) election results: Senate - percent of vote by party - NA%; seats by party - Democratic Party 50, Republican Party 49, independent 1; House of Representatives - percent of vote by party - NA%; seats by party - Republican Party 221, Democratic Party 211, independent 2, vacant 1 elections: Senate - last held 7 November 2000 (next to be held 4 November 2002); House of Representatives - last held 7 November 2000 (next to be held 4 November 2002)
Judicial branch: Supreme Court (its nine justices are appointed for life by the president with confirmation by the Senate); United States Courts of Appeal; United States District Courts; State and County Courts
Political parties and leaders: Democratic Party [Terence McAULIFFE, national committee chairman]; Green Party [leader NA]; Republican Party [Governor Marc RACICOT, national committee chairman]
Political pressure groups and leaders: NA
International organization participation: AfDB, ANZUS, APEC, ARF (dialogue partner), AsDB, ASEAN (dialogue partner), Australia Group, BIS, CCC, CE (observer), CERN (observer), CP, EAPC, EBRD, ECE, ECLAC, ESCAP, FAO, G-5, G-7, G-8, G-10, IADB, IAEA, IBRD, ICAO, ICC, ICFTU, ICRM, IDA, IEA, IFAD, IFC, IFRCS, IHO, ILO, IMF, IMO, Interpol, IOC, IOM, ISO, ITU, MINURSO, MIPONUH, NAM (guest), NATO, NEA, NSG, OAS, OECD, OPCW, OSCE, PCA, SPC, UN, UN Security Council, UNCTAD, UNHCR, UNIKOM, UNITAR, UNMEE, UNMIBH, UNMIK, UNMOVIC, UNOMIG, UNRWA, UNTAET, UNTSO, UNU, UPU, WCL, WHO, WIPO, WMO, WTrO, ZC
Flag description: thirteen equal horizontal stripes of red (top and bottom) alternating with white; there is a blue rectangle in the upper hoist-side corner bearing 50 small, white, five-pointed stars arranged in nine offset horizontal rows of six stars (top and bottom) alternating with rows of five stars; the 50 stars represent the 50 states, the 13 stripes represent the 13 original colonies; known as Old Glory; the design and colors have been the basis for a number of other flags, including Chile, Liberia, Malaysia, and Puerto Rico Economy United States -
Economy - overview: The US has the largest and most technologically powerful economy in the world, with a per capita GDP of $36,300. In this market-oriented economy, private individuals and business firms make most of the decisions, and the federal and state governments buy needed goods and services predominantly in the private marketplace. US business firms enjoy considerably greater flexibility than their counterparts in Western Europe and Japan in decisions to expand capital plant, lay off surplus workers, and develop new products. At the same time, they face higher barriers to entry in their rivals' home markets than the barriers to entry of foreign firms in US markets. US firms are at or near the forefront in technological advances, especially in computers and in medical, aerospace, and military equipment, although their advantage has narrowed since the end of World War II. The onrush of technology largely explains the gradual development of a "two-tier labor market" in which those at the bottom lack the education and the professional/technical skills of those at the top and, more and more, fail to get comparable pay raises, health insurance coverage, and other benefits. Since 1975, practically all the gains in household income have gone to the top 20% of households. The years 1994-2000 witnessed solid increases in real output, low inflation rates, and a drop in unemployment to below 5%. The year 2001 witnessed the end of the boom psychology and performance, with output increasing only 0.3% and unemployment and business failures rising substantially. The response to the terrorist attacks of September 11 showed the remarkable resilience of the economy. Moderate recovery is expected in 2002, with the GDP growth rate rising to 2.5% or more. A major short-term problem in first half 2002 was a sharp decline in the stock market, fueled in part by the exposure of dubious accounting practices in some major corporations. 
Long-term problems include inadequate investment in economic infrastructure, rapidly rising medical and pension costs of an aging population, sizable trade deficits, and stagnation of family income in the lower economic groups.
GDP: purchasing power parity - $10.082 trillion (2001 est.)
GDP - real growth rate: 0.3% (2001 est.)
GDP - per capita: purchasing power parity - $36,300 (2001 est.)
GDP - composition by sector: agriculture: 2% industry: 18% services: 80% (2001 est.)
Population below poverty line: 12.7% (2001 est.)
Household income or consumption by percentage share: lowest 10%: 1.8% highest 10%: 30.5% (1997)
Distribution of family income - Gini index: 40.8 (1997)
Inflation rate (consumer prices): 2.8% (2001)
Labor force: 141.8 million (includes unemployed) (2001)
Labor force - by occupation: managerial and professional 31%, technical, sales and administrative support 28.9%, services 13.6%, manufacturing, mining, transportation, and crafts 24.1%, farming, forestry, and fishing 2.4% (2001) note: figures exclude the unemployed
Unemployment rate: 5% (2001)
Budget: revenues: $1.828 trillion expenditures: $1.703 trillion, including capital expenditures of $NA (1999)
Industries: leading industrial power in the world, highly diversified and technologically advanced; petroleum, steel, motor vehicles, aerospace, telecommunications, chemicals, electronics, food processing, consumer goods, lumber, mining
Industrial production growth rate: -3.7% (2001 est.)
Electricity - production: 3,799.944 billion kWh (2000)
Electricity - production by source: fossil fuel: 70.76% hydro: 7.19% nuclear: 19.84% other: 2.21% (2000)
Electricity - consumption: 3.613 trillion kWh (2000)
Electricity - exports: 14.829 billion kWh (2000)
Electricity - imports: 48.879 billion kWh (2000)
Agriculture - products: wheat, other grains, corn, fruits, vegetables, cotton; beef, pork, poultry, dairy products; forest products; fish
Exports: $723 billion (f.o.b., 2001 est.)
Exports - commodities: capital goods, automobiles, industrial supplies and raw materials, consumer goods, agricultural products
Exports - partners: Canada 22.4%, Mexico 13.9%, Japan 7.9%, UK 5.6%, Germany 4.1%, France, Netherlands (2001)
Imports: $1.148 trillion (f.o.b., 2001 est.)
Imports - commodities: crude oil and refined petroleum products, machinery, automobiles, consumer goods, industrial raw materials, food and beverages
Imports - partners: Canada 19%, Mexico 11.5%, Japan 11.1%, China 8.9%, Germany 5.2%, UK, Taiwan (2001)
Debt - external: $862 billion (1995 est.)
Economic aid - donor: ODA, $6.9 billion (1997)
Currency: US dollar (USD)
Currency code: USD
Exchange rates: British pounds per US dollar - 0.6981 (January 2002), 0.6944 (2001), 0.6596 (2000), 0.6180 (1999), 0.6037 (1998), 0.6106 (1997); Canadian dollars per US dollar - 1.6003 (January 2002), 1.5488 (2001), 1.4851 (2000), 1.4857 (1999), 1.4835 (1998), 1.3846 (1997); French francs per US dollar - 5.65 (January 1999), 5.8995 (1998), 5.8367 (1997); Italian lire per US dollar - 1,668.7 (January 1999), 1,763.2 (1998), 1,703.1 (1997); Japanese yen per US dollar - 132.66 (January 2002), 121.53 (2001), 107.77 (2000), 113.91 (1999), 130.91 (1998), 120.99 (1997); German deutsche marks per US dollar - 1.69 (January 1999), 1.9692 (1998), 1.7341 (1997); euros per US dollar - 1.1324 (January 2002), 1.1175 (2001), 1.08540 (2000), 0.93863 (1999) note: financial institutions in France, Italy, and Germany and eight other European countries started using the euro on 1 January 1999 with the euro replacing the local currency in consenting countries for all transactions in 2002
Fiscal year: 1 October - 30 September Communications United States
Telephones - main lines in use: 194 million (1997)
Telephones - mobile cellular: 69.209 million (1998)
Telephone system: general assessment: a very large, technologically advanced, multipurpose communications system domestic: a large system of fiber-optic cable, microwave radio relay, coaxial cable, and domestic satellites carries every form of telephone traffic; a rapidly growing cellular system carries mobile telephone traffic throughout the country international: 24 ocean cable systems in use; satellite earth stations - 61 Intelsat (45 Atlantic Ocean and 16 Pacific Ocean), 5 Intersputnik (Atlantic Ocean region), and 4 Inmarsat (Pacific and Atlantic Ocean regions) (2000)
Radio broadcast stations: AM 4,762, FM 5,542, shortwave 18 (1998)
Radios: 575 million (1997)
Television broadcast stations: more than 1,500 (including nearly 1,000 stations affiliated with the five major networks - NBC, ABC, CBS, FOX, and PBS; in addition, there are about 9,000 cable TV systems) (1997)
Televisions: 219 million (1997)
Internet country code: .us
Internet Service Providers (ISPs): 7,800 (2000 est.)
Internet users: 166 million (2001) Transportation United States
Railways: total: 212,433 km mainline routes standard gauge: 212,433 km 1.435-m gauge note: represents the aggregate length of roadway of all line-haul railroads including an estimate for Class II and III railroads (1998)
Highways: total: 6,370,031 km paved: 5,733,028 km (including 74,091 km of expressways) unpaved: 637,003 km (1997)
Waterways: 41,009 km note: navigable inland channels, exclusive of the Great Lakes
Pipelines: petroleum products 276,000 km; natural gas 331,000 km (1991)
Ports and harbors: Anchorage, Baltimore, Boston, Charleston, Chicago, Duluth, Hampton Roads, Honolulu, Houston, Jacksonville, Los Angeles, New Orleans, New York, Philadelphia, Port Canaveral, Portland (Oregon), Prudhoe Bay, San Francisco, Savannah, Seattle, Tampa, Toledo
Merchant marine: total: 264 ships (1,000 GRT or over) totaling 6,911,641 GRT/9,985,660 DWT ships by type: barge carrier 1, bulk 11, cargo 14, chemical tanker 16, collier 1, combination bulk 4, combination tanker 11, container 86, multi-functional large-load carrier 4, passenger/cargo 2, petroleum tanker 81, roll on/roll off 28, specialized tanker 3, vehicle carrier 2 note: includes some foreign-owned ships registered here as a flag of convenience: Australia 1, Canada 4, Denmark 15, France 1, Germany 1, Netherlands 3, Norway 7, Puerto Rico 4, Singapore 11, Sweden 1, United Kingdom 3 (2002 est.)
Airports: 14,695 (2001)
Airports - with paved runways: total: 5,127 over 3,047 m: 183 2,438 to 3,047 m: 222 1,524 to 2,437 m: 1,342 914 to 1,523 m: 2,413 under 914 m: 967 (2001)
Airports - with unpaved runways: total: 9,568 over 3,047 m: 1 2,438 to 3,047 m: 7 1,524 to 2,437 m: 165 914 to 1,523 m: 1,679 under 914 m: 7,716 (2001)
Heliports: 132 (2001) Military United States
Military branches: Department of the Army, Department of the Navy (includes Marine Corps), Department of the Air Force note: the Coast Guard is normally subordinate to the Department of Transportation, but in wartime reports to the Department of the Navy
Military manpower - military age: 18 years of age (2002 est.)
Military manpower - availability: males age 15-49: 70,819,436 (2001 est.)
Military manpower - fit for military service: NA (2002 est.)
Military manpower - reaching military age annually: males: 2,053,179 (2002 est.)
Military expenditures - dollar figure: $276.7 billion (FY99 est.)
Military expenditures - percent of GDP: 3.2% (FY99 est.)
Military - note: 2002 estimates for military manpower are based on projections that do not take into consideration the results of the 2000 census Transnational Issues United States
Disputes - international: maritime boundary disputes with Canada (Dixon Entrance, Beaufort Sea, Strait of Juan de Fuca, Machias Seal Island); US Naval Base at Guantanamo Bay is leased from Cuba and only mutual agreement or US abandonment of the area can terminate the lease; Haiti claims Navassa Island; US has made no territorial claim in Antarctica (but has reserved the right to do so) and does not recognize the claims of any other state; Marshall Islands claims Wake Island
Illicit drugs: consumer of cocaine shipped from Colombia through Mexico and the Caribbean; consumer of heroin, marijuana, and increasingly methamphetamine from Mexico; consumer of high-quality Southeast Asian heroin; illicit producer of cannabis, marijuana, depressants, stimulants, hallucinogens, and methamphetamine; money-laundering center

* * *

Federal republic, North America.

It comprises 48 contiguous states occupying the mid-continent, Alaska at the northwestern extreme of North America, and the island state of Hawaii in the mid-Pacific Ocean. Area, including the U.S. share of the Great Lakes: 3,675,031 sq mi (9,518,287 sq km). Population (2002 est.): 287,602,000. Capital: Washington, D.C. The population includes people of European and Middle Eastern ancestry, African Americans, Hispanics, Asians, Pacific Islanders, American Indians (Native Americans), and Alaska Natives. Languages: English (predominant), Spanish. Religions: Protestantism, Roman Catholicism, Judaism, Islam. Currency: U.S. dollar. The country's regions encompass mountains, plains, lowlands, and deserts. Mountain ranges include the Appalachians, Ozarks, Rockies, Cascades, and Sierra Nevada. The lowest point is Death Valley, Calif. The highest point is Alaska's Mount McKinley; within the coterminous U.S. it is Mount Whitney, Calif. Chief rivers are the Mississippi system, the Colorado, the Columbia, and the Rio Grande. The Great Lakes, the Great Salt Lake, and Lake Okeechobee are the largest lakes. The U.S. is among the world's leading producers of several minerals, including copper, silver, zinc, gold, coal, petroleum, and natural gas; it is the chief exporter of food. Its manufactures include iron and steel, chemicals, electronic equipment, and textiles. Other important industries are tourism, dairying, livestock raising, fishing, and lumbering. The U.S. is a republic with two legislative houses; its head of state and government is the president. The territory was originally inhabited for several thousand years by numerous American Indian peoples who had probably emigrated from Asia. European exploration and settlement from the 16th century began displacement of the Indians. The first permanent European settlement, by the Spanish, was at Saint Augustine, Fla., in 1565; the British settled Jamestown, Va. (1607); Plymouth, Mass. 
(1620); Maryland (1634); and Pennsylvania (1681). The British took New York, New Jersey, and Delaware from the Dutch in 1664, a year after the Carolinas had been granted to British noblemen. The British defeat of the French in 1763 (see French and Indian War) assured British political control over its 13 colonies. Political unrest caused by British colonial policy culminated in the American Revolution (1775–83) and the Declaration of Independence (1776). The U.S. was first organized under the Articles of Confederation (1781), then finally under the Constitution (1787) as a federal republic. Boundaries extended west to the Mississippi River, excluding Spanish Florida. Land acquired from France by the Louisiana Purchase (1803) nearly doubled the country's territory. The U.S. fought the War of 1812 against the British and acquired Florida from Spain in 1819. In 1830 it legalized removal of American Indians to lands west of the Mississippi River. Settlement expanded into the Far West in the mid-19th century, especially after the discovery of gold in California in 1848 (see gold rush). Victory in the Mexican War (1846–48) brought the territory of seven more future states (including California and Texas) into U.S. hands. The northwestern boundary was established by treaty with Great Britain in 1846. The U.S. acquired southern Arizona by the Gadsden Purchase (1853). It suffered disunity during the conflict between the slavery-based plantation economy in the South and the free industrial and agricultural economy in the North, culminating in the American Civil War and the abolition of slavery under the 13th Amendment. After Reconstruction (1865–77) the U.S. experienced rapid growth, urbanization, industrial development, and European immigration. In 1887 it authorized allotment of American Indian reservation land to individual tribesmen, resulting in widespread loss of land to whites.
By the end of the 19th century, it had developed foreign trade and acquired outlying territories, including Alaska, Midway Island, the Hawaiian Islands, the Philippines, Puerto Rico, Guam, Wake Island, American Samoa, the Panama Canal Zone, and the Virgin Islands. The U.S. participated in World War I in 1917–18. It granted suffrage to women in 1920 and citizenship to American Indians in 1924. The stock market crash of 1929 led to the Great Depression. The U.S. entered World War II after the Japanese bombing of Pearl Harbor (Dec. 7, 1941). The U.S. dropped an atomic bomb on Hiroshima (Aug. 6, 1945) and another on Nagasaki (Aug. 9, 1945), bringing about Japan's surrender. Thereafter the U.S. was the military and economic leader of the Western world. In the first decade after the war, it aided the reconstruction of Europe and Japan and became embroiled in a rivalry with the Soviet Union known as the Cold War. It participated in the Korean War from 1950 to 1953. In 1952 it granted autonomous commonwealth status to Puerto Rico. Racial segregation in schools was declared unconstitutional in 1954. Alaska and Hawaii were made states in 1959. In 1964 Congress passed the Civil Rights Act and authorized U.S. entry into the Vietnam War. The mid- to late 1960s were marked by widespread civil disorder, including race riots and antiwar demonstrations. The U.S. accomplished the first manned lunar landing in 1969. All U.S. troops were withdrawn from Vietnam in 1973. The U.S. led a coalition of forces against Iraq in the First Persian Gulf War (1991), sent troops to Somalia (1992) to aid starving populations, and participated in NATO air strikes against Serbian forces in the former Yugoslavia in 1995 and 1999. In 1998 Pres. Bill Clinton became only the second president to be impeached by the House of Representatives; he was acquitted by the Senate in 1999. Administration of the Panama Canal was turned over to Panama in 1999. In 2000 George W.
Bush became the first person since 1888 to be elected president by the electoral college despite having won fewer popular votes than his opponent, Al Gore. After the September 11 attacks on the U.S. in 2001 destroyed the World Trade Center and part of the Pentagon, the U.S. attacked Afghanistan's Taliban government for harbouring and refusing to extradite the mastermind of the terrorism, Osama bin Laden. In 2003 the U.S. and the United Kingdom attacked Iraq and overthrew the government of Saddām Ḥussein, which they had accused of aiding terrorists and possessing and developing biological, chemical, and nuclear weapons. As the U.S. attempted to help reconstruct and bring democracy to Iraq, it faced an escalating Iraqi insurgency. In 2004 Bush narrowly defeated Democratic challenger John Kerry to win a second presidential term.

* * *

▪ 2009

Introduction
Area:
9,522,055 sq km (3,676,486 sq mi), including 204,083 sq km of inland water and 156,049 sq km of the Great Lakes that lie within U.S. boundaries but excluding 109,362 sq km of coastal water
Population
(2008 est.): 305,146,000
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      With the long-developing subprime-mortgage crisis as the proximate cause, the United States led the world into a historic economic recession in late 2008. The downturn was marked by the collapse of financial firms, a dramatic decline in equity prices, and a subsequent falloff in lending and economic activity. By September the malaise had spread to developed economies in Europe, Asia, and elsewhere, prompting Western governments to undertake extraordinary rescue measures, often by nationalizing private banks. The U.S. government abandoned traditional free-market boundaries as it struggled to fashion an effective response, providing billions in assistance to save some firms, lowering interest rates, injecting capital to encourage lending, and taking an unprecedented equity position in private companies. By year's end the heroic measures had stabilized the economy at least temporarily, but the U.S. was clearly deeply mired in a global economic slump of uncertain duration.

      The economic turmoil occurred against the backdrop of a national election, and the Republican administration's controversial response to the crisis, accompanied by a public demand for policy change, helped Democrats take full control in Washington. The deteriorating economy and an overextended military also helped to ensure that the U.S. enjoyed few diplomatic successes during the year. In the ongoing war on terrorism, one bright spot for the administration was the continued firming up of the security situation in Iraq and the completion of a road map for ending U.S. combat operations there. The progress in Iraq, however, was at least partially offset by deteriorating conditions in Afghanistan that would require an increased Western troop presence.

Economic Crisis.
      Waning confidence in the value of securitized home mortgages and derivatives finally caught up with the U.S. economy during the year, prompting a disastrous chain reaction that eventually infected financial markets worldwide. The mortgages were packaged together and sold in bundles, backed by intricate and highly leveraged financial contracts, in a scheme designed to mitigate risk. The instruments, designed by Wall Street lawyers outside government regulatory oversight, were complicated and lacked transparency. When cracks appeared, instead of spreading and minimizing risk, the system acted to amplify unease and created a domino effect that spread across the financial system, from housing to mortgage lending, to investment banks, to securities firms, and beyond.

      In January, amid gloomy news of plummeting home sales and the first annual decline in home prices in at least four decades, equity prices fell sharply. In response, Congress approved an economic stimulus package that provided a $600 cash rebate for most persons filing income-tax returns. The measure quickly put $168 billion into the economy but served only to delay more serious consequences. Institutions exposed to securitized mortgages and associated instruments saw their positions continue to deteriorate. In March, Bear Stearns, a venerable New York City investment bank, neared collapse and was sold in a fire sale backed by $30 billion in Federal Reserve funds. In July, Indymac Bancorp, the largest thrift institution in the Los Angeles area, was placed in receivership.

      On July 30, Pres. George W. Bush signed a bill designed to shore up mortgage lenders by guaranteeing up to $300 billion in new fixed-rate mortgages. The measure was ineffectual, however, and on September 7 the federal government essentially nationalized both Fannie Mae and Freddie Mac, which together owned or guaranteed half of the country's $12 trillion mortgage market. Instead of providing reassurance, the move only heightened investor worries about the economy and financial markets.

      In mid-September the dam broke. Merrill Lynch, the country's largest brokerage house, was sold to Bank of America under duress. Investment bank Lehman Brothers filed for bankruptcy, and federal regulators said that the firm owned so many toxic assets that a bailout attempt would be futile. A major money-market mutual fund, Reserve Primary, said that losses threatened its solvency; the Federal Reserve (Fed) offered $105 billion to shore up money funds, and the U.S. Treasury offered temporary insurance to money-fund investors. The Fed also pumped $85 billion into insurance giant AIG, which had provided backing for mortgage instruments, with the government taking a major equity position in return. Washington Mutual, the country's largest thrift institution, was seized as insolvent and sold for a fraction of its former value. By this time the contagion had spread to Europe and Asia, throwing the developed world economy into turmoil. (See Special Report (Financial Crisis of 2008 ).)

      U.S. Treasury Secretary Henry Paulson proposed a $700 billion rescue bill—initially written on only three pages—that was eventually approved by Congress on October 3. The Troubled Asset Relief Program (TARP) allowed federal authorities to purchase assets of failing banks and eased rules requiring strict valuation of distressed securities. The week of October 6–10, however, proved to be the worst one on Wall Street in at least 75 years, with the Dow Jones Industrial Average (DJIA) down 18%. Under pressure to prevent a complete financial collapse, during October the Fed extended more than $2.5 trillion in emergency loans to banks and nonfinancial firms, lowering interest rates and working with European central banks to contain the damage.

      In November, as confidence continued to erode, Paulson abandoned plans to buy troubled assets under TARP and instead launched a plan to recapitalize financial firms, mostly by purchasing preferred shares of banks. The Fed also pledged another $800 billion to shore up distressed mortgages, provided $45 billion in assistance to Citigroup, and vowed further cuts to already-low interest rates. Those actions, in addition to similar moves by European and Asian governments, appeared to stabilize investor confidence. The stock market hit bottom for the year on November 20, with the DJIA settling at just over half of its record level of a year earlier. Even so, all indicators were showing that the underlying U.S. economy—technically in recession since the previous December—would continue to suffer from the crisis for months to come.

      Other distressed U.S. industries began petitioning Washington for assistance. After Congress refused a request from Detroit automakers for a $14 billion package, in December the Bush administration awarded up to $17.4 billion in loans to General Motors and Chrysler. That effectively postponed the automakers' plight until 2009 and handed the problem over to a new administration. Aides to President-elect Barack Obama (Obama, Barack ) publicly contemplated another federal stimulus package of $850 billion or more, including money for government infrastructure projects, as an early 2009 priority.

      The wild economic year devastated the country's balance sheet. The federal deficit for the fiscal year that ended September 30 almost tripled, to $454.8 billion, and analysts predicted that it would top $1 trillion in 2009. Investors lost an estimated $7.3 trillion in value from the decline in the 5,000 largest stocks alone. Overall, the year produced a 13% drop in the median home resale price, and an estimated 1 in 10 homeowners was in financial distress. Unemployment started the year at a modest 5% but stood at 7.2% in December and was climbing. The accelerating recession at least temporarily erased fears over rising inflation, with the consumer price index up little more than 1% in 2008. At midyear, as international demand peaked, oil touched $147 a barrel, producing gasoline prices of more than $4 per gallon and widespread distress in American households. By year's end demand was down, crude was under $40 per barrel, and gasoline had dropped to around $1.60 a gallon.

      The economy took a final blow in December with the arrest of Bernard Madoff, a prominent New York City money manager. Madoff was accused of having run a giant Ponzi scheme, bilking his investors of up to $50 billion in what could be the largest financial scandal in history.

War on Terrorism.
      Five years after leading the invasion that toppled Saddam Hussein, the U.S. negotiated with the new Iraqi democratic government for an eventual end to allied combat operations. The agreement capped a year of declining violence and increased government control in Iraq and represented a dramatic turnaround for U.S. policy, which had seemed destined for a humiliating defeat only two years earlier. It also cleared the way for redeployment of U.S. troops elsewhere, particularly into resurgent terrorist areas of Afghanistan. Outgoing Pres. George W. Bush hailed the Iraqi developments as a major step forward for democracy and credited the 2007 U.S. military surge, but his year-end visit to Iraq was ironically marred by a dramatic political protest.

      Under U.S. pressure the Iraqi parliament took several steps to accommodate Iraq's Sunni minority and achieve ethnic reconciliation. In March the Shiʿite-dominated government deployed 30,000 Iraqi troops, accompanied by U.S. air support, into Basra in a successful thrust to dislodge the Mahdi Army, a radical Shiʿite militia that had long controlled the port city. Iraqi troops later entered and occupied Sadr City, a renegade Shiʿite section of Baghdad, without significant resistance.

      As violence ebbed markedly during the year, the Iraqi government took over increasing responsibility for its domestic security. In September Anbar province, once the cradle of the Sunni insurgency against the central government, was turned over to full Iraqi control. The following month Iraq assumed responsibility for some 100,000 (mostly Sunni) fighters; these Awakening Council forces had previously been paid and supervised by the U.S. military.

      At year's end Iraq and the U.S. signed a status-of-forces agreement that called for the removal of allied troops from Iraqi cities by mid-2009 and complete withdrawal of U.S. combat troops by the end of 2011. The agreement also gave Iraqi civilian authorities criminal jurisdiction over off-duty U.S. troops who committed infractions while away from their bases. Incoming U.S. president Barack Obama had campaigned for earlier withdrawal of U.S. forces within 16 months—or by May 2010. Obama later signaled, however, that he would listen to military advice and remain flexible on his timetable.

      By year's end allied forces were withdrawing from Iraq, and the U.S. military presence was diminishing toward presurge levels of 135,000. According to the Associated Press, U.S. troop deaths in 2008 stood at 314, down from more than 900 in 2007. (A total of 4,221 U.S. soldiers had died in the conflict since it began in 2003.) Some Middle East experts suggested that the security improvements were largely the result of internal Iraqi political reconciliation. In a final visit to Baghdad on December 14, however, President Bush declared that his administration's policies deserved credit and called the surge “one of the greatest successes in the history of the United States military.” At a press conference that same day with Iraqi Prime Minister Nuri al-Maliki, in a highly publicized incident, an Iraqi journalist threw two shoes at Bush as a sign of disrespect. Bush ducked the shoes; the journalist was temporarily jailed; and critics noted that such political protest would have been inconceivable in Saddam's Iraq.

      The military progress in Iraq was offset by renewed violence in Afghanistan as Sunni-dominated militant groups, including the Taliban and al-Qaeda, infiltrated more than half of the country and challenged NATO forces. In tacit recognition of the threat, U.S. Army Gen. David Petraeus, architect of the Iraq surge strategy, was elevated in October to head the U.S. Central Command, effectively taking control of allied military strategy in the war on terrorism, including the fighting in Afghanistan.

      As Afghan terrorist violence increased during the year, several NATO countries augmented troop deployments. At year's end the U.S. had about 32,000 of 62,000 NATO combat troops in Afghanistan, including an additional 1,000 sent by President Bush in November as part of what he termed a “quiet surge.” U.S. forces were concentrated in the east, on the dangerous border with Pakistan; the U.S. pursued an active counterinsurgency program on both sides of the border that involved the use of missile-equipped unmanned drone aircraft.

      The Bush administration's legal strategy toward suspected terrorists suffered setbacks during 2008. In June the U.S. Supreme Court ruled, in a 5–4 decision, that even enemy combatants held outside the U.S.—at the U.S. detention facility at Guantánamo Bay, Cuba—had a right to a review of their cases in U.S. civilian courts. The ruling declared unconstitutional parts of two laws approved by Congress after 9/11 that were designed to allow indefinite detention of suspects and their eventual trial by military commissions. It further complicated dozens of pending combatant cases that were already burdened with charges of torture, withholding of evidence, and violations of international law by the U.S. military.

      Two war crimes trials were concluded during the year, the first in the U.S. since World War II. Salim Hamdan, a former driver for Osama bin Laden, was convicted in August on reduced charges of having provided “material support for terrorism.” He received a modest sentence of five and a half years and was released at year's end. A second defendant, Ali Hamza al-Bahlul, a Yemeni accused of having produced propaganda for al-Qaeda, including videos, was convicted by a military commission at Guantánamo Bay in October and given a life sentence. Neither Bahlul nor his attorney participated in his defense.

      In U.S. civilian courts, federal prosecutors won convictions in two antiterrorism criminal cases. In November, after a previous trial ended in a hung jury, the Holy Land Foundation and five former organizers were found guilty in Dallas of having funneled $12 million to the terrorist group Hamas. One observer alleged that the Muslim foundation's practice of supplying cash payments to Palestinian terrorists' families was the moral equivalent of car bombing. In December five foreign-born Muslims were convicted in New Jersey on charges that included having planned to kill U.S. soldiers at Ft. Dix. Defense attorneys claimed that the men were only talking and had planned no real violence, but prosecutors said that the convictions proved the effectiveness of the U.S. post-9/11 strategy of infiltrating violence-prone groups.

Domestic Policy.
      As lawmakers awaited a new administration following the historic win of Barack Obama in the presidential contest, election-year political considerations dramatically slowed the U.S. legislative process. (See Special Report (U.S. Election of 2008 ).) Despite record farm and food prices, Congress approved a $289 billion farm bill renewal that expanded agriculture subsidies and food-assistance programs. Congress also postponed a scheduled 10.6% reduction in physician reimbursements for Medicare, paying for the measure by trimming payments to insurance companies that provided supplemental health care programs. Bush vetoed both measures, but his vetoes were overridden both times. Two bills augmenting veterans' benefits—for housing, health care, life insurance, and family allowances—were signed into law. Another law dramatically expanded G.I. Bill education awards, essentially providing a full college education to veterans who had at least three years of service and allowing benefits to be transferred to family members.

      Preparing to leave office, the Bush administration at year's end proposed several dozen regulatory changes. Among them were provisions for expanding federal land eligible for shale oil development, increasing allowable on-road hours for truck drivers, allowing health care workers to refuse to participate in procedures that violated their moral or religious beliefs, permitting the possession of licensed firearms in national parks, reducing access to Medicaid vision and dental benefits, eliminating factors such as greenhouse conditions in Endangered Species Act reviews, and slowing federal protection for workers exposed to toxic chemicals. Obama transition officials promised to review the entire list in 2009.

Foreign Policy.
      U.S. relations with a resurgent and energy-rich Russia deteriorated further in 2008. Effects of heightened tensions could be seen worldwide as the two countries sparred over missile defense, Latin America, Iraq, Iran, and Russia's invasion of Georgia. In one example, Russia almost single-handedly blocked U.S. efforts to ratchet up UN sanction pressure on Iran over its refusal to allow nuclear inspections. By year's end some commentators were saying that U.S.-Russia relations were at their lowest ebb since the end of the Cold War nearly two decades earlier.

      In April, under U.S. prodding, NATO agreed that it would eventually accept Georgia, Russia's southern neighbour, as a member—even though Russia opposed NATO's eastward expansion and viewed it as a security threat. Four months later, Russian troops invaded two rebellious Georgian provinces, South Ossetia and Abkhazia, and recognized them as independent states. NATO stepped up its military presence in the region, with U.S. warships delivering relief supplies to Georgia via the Black Sea. In what was widely viewed as a response, Russia dispatched a military flotilla to Venezuela in November in a show of support for Pres. Hugo Chávez, a critic of the U.S., and at year's end Moscow also staged a rare Russian navy visit to Cuba.

      With Chávez and Cuba's Raúl Castro in the lead, Latin American leaders formed a South American union (Unasur) and took other steps aimed at reducing U.S. influence in the region. A group of 33 countries staged a summit meeting in Brazil in December, pledging regional cooperation and welcoming Cuba; U.S. representatives were not invited.

      Efforts to prevent nuclear weapons proliferation suffered setbacks during the year. No progress was made in stopping nuclear development in either Iran or North Korea or in numerous Middle Eastern countries that were nervous about a potential threat from Iran; a number of Middle Eastern countries had initiated steps toward starting their own nuclear programs. Iran, continuing to insist that its nuclear development was solely for civilian energy purposes, persisted in stonewalling international watchdogs, even while Russia supplied Iran with uranium for enrichment and processing that could be diverted to weapons purposes. At midyear, in Geneva, U.S. authorities engaged in direct talks with Iranian nuclear negotiators for the first time and also joined major powers in offering yet another package of incentives for Iranian abandonment of its nuclear ambitions. Iran continued to obfuscate, however, and Congress tightened U.S. economic sanctions on Iran in September.

      After agreeing in 2005 to scrap its nuclear weapons program in return for normalized world relations, North Korea accepted promised food and fuel assistance from the U.S. and allies. As a show of good faith, Pres. George W. Bush removed Pyongyang from the U.S. list of state sponsors of terrorism. In December five countries met to persuade North Korea to accept a verification regime written by its ally, China. The talks collapsed, however, when the North Koreans refused to sign the agreement, with analysts speculating that they were waiting for more favourable terms from the new U.S. administration. Prior to the breakdown, the U.S., Russia, China, and South Korea had already delivered 500,000 tons of fuel oil promised to North Korea for its cooperation.

      The U.S. continued to push for rapprochement between India and Pakistan, both to facilitate critical support for antiterrorism efforts and to counter growing Chinese influence in Asia. In October the U.S. signed an agreement to supply technological aid for India's nuclear program, even though India had tested nuclear weapons and refused to sign the Non-proliferation Treaty. In November, after Pakistan-based terrorists staged a bloody raid on Mumbai (Bombay), U.S. Secretary of State Condoleezza Rice visited the subcontinent to pressure both countries to continue normalizing relations. (See Special Report (Terror in Mumbai ).)

David C. Beckwith

Developments in the States 2008
      The national economic recession hit U.S. states with a vengeance in late 2008, throwing budgets deeply into the red and prompting forecasts of even more financial trouble ahead. Forced to balance their books, a few states raised taxes or fees to generate new revenue. Most states, however, tightened their belts—postponing or canceling new programs, laying off state employees, and trimming spending across the board to weather the fiscal storm. The action came as state capitals continued to wrestle with a host of issues left unresolved on the federal level, including immigration, global warming, children's health insurance, and education reform. Regular legislative sessions were held in 44 states during the year, and 22 states staged one or more special sessions, often to deal with financial issues.

      Eleven states held gubernatorial elections, and Democrats took over the Missouri governorship previously held by Republicans; this left the prospective 2009 governorship lineup at 29 Democrats and 21 Republicans. Legislative elections were staged in 44 states and resulted in modest gains for Democrats. Republicans won control of the Montana Senate and the Tennessee House and Senate, all previously tied or held by the other party. Democrats, however, took charge in the Delaware House, New York Senate, Nevada Senate, Ohio House, and Wisconsin Assembly. The Alaska Senate, previously Republican, and the Montana House, previously Democratic, were tied. That meant that Democrats had two-chamber control of 27 state legislatures, Republicans dominated in 14 states, and control was split or tied in 8 others. Nebraska had a nonpartisan unicameral legislature.

Structures, Powers.
      Voters in three states—Connecticut, Hawaii, and Illinois—rejected ballot measures authorizing conventions to write new state constitutions. Opponents said that the conventions could be hijacked by special interests—including opponents of same-sex marriage—and were an inefficient way to resolve local governmental concerns. California and New York became the first states to create a cabinet-level position to oversee volunteer and charitable activity.

       Arkansas became the 45th state to authorize annual legislative sessions. South Dakota voters decided to keep the state's term limits for legislators. By a narrow margin, California voters endorsed a proposal to have state legislative districts drawn every 10 years by a citizen panel instead of by the legislature itself.

Finances.
      As a mid-decade housing boom turned to bust, state revenue projections declined early in 2008, sending state authorities scrambling for cost savings. The outlook turned even bleaker in the fall as the financial crisis accelerated the U.S. descent into recession and pushed most state budgets into deficit. States were hit particularly hard by economic slowdowns because sales taxes and property-transfer levies were adversely affected, while state spending on unemployment assistance, Medicaid, and other benefits rose quickly. Among the hardest-hit states were California, which was forced to lay off thousands of state workers, and New York, which was dependent upon Wall Street transactions for one-fifth of state revenue. (See Special Report (Financial Crisis of 2008 ).)

      Most states were required to balance their budgets every year. Spending restrictions were enacted in some 40 states, often targeting health care and even education, the biggest items in most state budgets. The National Conference of State Legislatures reported that states found $40 billion in cost savings or additional revenue during the year but still faced an additional $97 billion in deficits for the 2009 and 2010 fiscal years. At year's end, governors petitioned President-elect Barack Obama (Obama, Barack ) for federal infrastructure assistance and for increased federal funds to help defray fast-rising Medicaid, unemployment insurance, and food-stamp costs.

      In November balloting, Colorado voters refused to repeal the state's strict limits on increased spending. Voters in North Dakota turned down a proposal to halve the state's income tax, and Massachusetts voters rejected the abolition of the state income tax. Maine voters voided a legislative plan to increase taxes on beer, wine, and soft drinks.

      Several states took steps to mitigate the mortgage crisis. North Carolina approved a foreclosure-prevention law offering state mediation assistance for borrowers. Twenty-nine states tightened laws covering mortgage licensing, and four—Kentucky, Maryland, Utah, and Washington—established mortgage fraud as a crime. Seven others tried to curb unscrupulous foreclosure-rescue scams.

      At midyear, with energy prices at record levels, the country's governors sought a doubling of the federal government's low-income heating-assistance program. Energy prices dropped markedly in the fall, however, and the anticipated crisis disappeared. New York became the first state to force online retailers to collect sales taxes; e-commerce company Amazon quickly filed suit in an attempt to void the law.

Social Issues.
      The Supreme Courts of California and Connecticut established same-sex marriage as a state constitutional right during the year, making those states the first to legalize same-sex unions since the top Massachusetts court authorized full marriage rights for homosexuals in 2003. The California decision, which was announced in June, was quickly challenged, however, and in the November election was overridden (52–48%) by state voters. The ballot result sorely disappointed gay rights advocates who were hoping for the first major voter ratification of same-sex marriage, and it also called into question the legality of 18,000 marriages performed in California in the five months following the court decision. New York's governor announced that the state would recognize gay marriages performed elsewhere. Even so, voters in Arizona and Florida banned same-sex marriage in their states, and in a related measure Arkansas voters required that foster parents be a married couple. At year's end 40 states had specifically outlawed same-sex marriage, through either state law or constitutional amendment, while 11 states and the District of Columbia legally recognized some form of domestic partnerships, civil unions, or gay marriage.

      Nebraska became the fourth state to ban race-based preferences in state hiring, contracting, and educational admissions decisions. A similar referendum, however, failed on a close vote in Colorado, the first defeat for such an anti-affirmative-action measure.

      Right-to-life advocates suffered reverses during the year. Washington voters joined Oregon in approving “death with dignity” acts allowing physician-assisted suicide. Michigan voters terminated a long-standing ban on embryonic stem cell research. South Dakota voters turned down a highly restrictive proposition banning abortion except in cases of rape, incest, or danger to the mother's health. For the second time, California voted down a ballot measure requiring parental notification before a minor could obtain an abortion.

Law, Ethics.
      Deadlock within the federal government on immigration reform led to state legislative action, but no consistent pattern developed. The administration of Pres. George W. Bush moved to head off a growing revolt over Real ID, a 2005 federal law requiring states to verify the identity of all drivers and issue tamper-proof licenses, a measure that states said was too costly and infringed on privacy rights. Facing widespread foot-dragging and noncompliance, the administration gave all states two additional years to conform. Oregon and Texas banned illegal immigrants from obtaining driver's licenses, and California's governor vetoed a legislative measure allowing them to be licensed. Seeking to combat accidents involving undocumented immigrants, Georgia upgraded to felony status a repeat conviction of driving without a license. Georgia and Mississippi increased mandatory use of the federal E-verify system to combat the hiring of illegal immigrants, but a U.S. judge blocked a similar Oklahoma law. Arizona voters refused to amend a controversial law that cracked down on employers who knowingly hired illegal immigrants. Oregon voters defeated a ballot measure restricting bilingual education.

      Arkansas voters, seeking to fund college scholarships, approved the 43rd state lottery. Maryland legalized slot machines at racetracks. Ohio and Maine voters rejected new casinos, but Colorado and Missouri voters expanded casino games and hours of operation. Voters in Massachusetts decriminalized the possession of one ounce or less of marijuana, and Michigan became the 13th state to allow marijuana for medical use. California voters rejected a major drug-law rewrite that would have decriminalized possession of small amounts of marijuana.

      Six states increased penalties for dogfighting and other forms of animal fighting. Massachusetts banned greyhound racing. Concealed-carry gun laws continued to expand: Florida allowed permit holders to take weapons to work (provided that the weapons were left in a parked vehicle), and Georgia allowed guns in restaurants, parks, and public transit. Alaska, Indiana, Georgia, and Tennessee toughened laws against Internet predators.

      The year produced numerous ethics investigations, one involving Alaska Gov. Sarah Palin (Palin, Sarah ), whom state legislators accused of having improperly fired the state public-safety commissioner. A special counsel exonerated her one day before the November election, in which she ran as the Republican vice presidential candidate. New York Gov. Eliot Spitzer was forced to resign after he admitted having engaged a prostitute. Ohio Attorney General Marc Dann also resigned in a sexual-harassment scandal. In December, Illinois Gov. Rod Blagojevich was arrested by federal agents and charged with conspiracy to commit fraud and solicitation of bribery, including an alleged attempt to sell Barack Obama's vacated U.S. Senate seat.

      State use of the death penalty was suspended early in the year while the U.S. Supreme Court reviewed the constitutionality of lethal injections. After executions resumed in May, the use of capital punishment continued to decline. During 2008, 37 inmates were executed, down from 42 in 2007. Florida enacted a statute setting compensation for wrongful criminal convictions; the amount was $50,000 for every year served in prison.

Health, Welfare.
       Iowa became the 28th state to ban smoking in any public place, including bars and restaurants. Six additional states (for a total of 28) required cigarettes to be wrapped in self-extinguishing paper to prevent fires, effectively making the standard a national requirement.

      Health-conscious California became the first state to ban trans fats and also the first to require posting of calorie and nutritional content on fast-food menus. In another antiobesity move, five states boosted the mandatory time that schoolchildren must spend at recess or gym classes.

      Budget problems forced several states (including California, Illinois, Missouri, New Mexico, and Pennsylvania) to postpone expansion of state health insurance coverage. Iowa, Colorado, and Montana expanded children's health care, and Florida and Maine increased funding for their novel health insurance assistance programs. New Jersey became the first state to require that all children have health insurance, though the measure contained no enforcement clause.

      New Jersey joined California and Washington in mandating that employers provide up to six weeks of paid leave annually to care for family members, but funding for Washington's law never materialized during the year. Seeking to curb infant deaths, Nebraska on July 1 joined states providing a “safe haven” for unwanted children. The new law failed to specify any age parameters, however, and within a few months, nearly three dozen older children, several from other states and some as old as 17, had been legally handed over to state care. At year's end Nebraska legislators amended the law with a 30-day age limit.

Environment, Education.
      California became the first state to enact a law encouraging home building in areas near workplaces and public transportation; the measure was designed to curb suburban sprawl and air pollution. Connecticut joined four other states in capping greenhouse gas emissions, and Delaware, Florida, and New Hampshire also approved measures to reduce emissions blamed for global warming. Delaware approved a major offshore wind energy project.

      Massachusetts became the first state to exempt non-food-based biofuels from state gasoline taxes and also approved a unique plan to manage its waters as a wind, wave, and tidal energy resource. Alaska issued a license for a $20 billion natural gas pipeline. Meanwhile, California voters approved nearly $10 billion in bonds for high-speed-rail construction between Los Angeles and San Francisco.

      Minnesota voters set aside a percentage of state sales-tax revenue for wetland protection. Alaska voters, seeking to protect moose, approved a game-management program that allowed the shooting of wolves from airplanes. Missouri voters approved a measure that required utilities to produce 15% of energy through renewable sources by 2021, but Californians rejected a more drastic requirement of 50% by 2025. Hawaii became the first state to require solar-powered water heaters in new homes.

      Some 32 states increased funding for prekindergarten education programs, but some plans were trimmed in late-year budget cutting. A shortage of funds torpedoed Arizona Gov. Janet Napolitano's plan to grant free public college tuition to all high-school graduates who had at least a B average.

Regulation.
      Reacting to high accident rates, several states tightened restrictions on new drivers, particularly teenagers. Virginia established a “baby DUI” law for teen drivers, imposing a strict .02 blood-alcohol standard, one-quarter the allowable amount for adults. California banned teens from using cell phones while driving and also outlawed text messaging for all drivers. California and Washington joined three other states in banning motorist use of handheld cellular phones. Arizona and Ohio voters rejected proposals to tighten restrictions on “payday lenders” accused of predatory business practices.

David C. Beckwith

▪ 2008

Introduction
Area:
9,366,006 sq km (3,616,235 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2007 est.): 302,633,000
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      For four years the United States economy had expanded robustly and virtually without incident, shrugging off concerns about potential overextension in a costly and deteriorating military expedition in Iraq, but 2007 brought abrupt change. A long-shot plan to temporarily increase the U.S. military presence seemed to work, reestablishing hope for a stable Iraq and easing pressure on an unpopular president—even as a fast-appearing disaster in the U.S. housing and financial sectors disrupted world markets and threatened to plunge the U.S. economy into recession. (See Sidebar: Subprime Mortgages: A Catalyst for Global Chaos.)

War on Terrorism.
      As 2007 began, the U.S.-led international coalition in Iraq was fraying noticeably, American casualties were rising, and the newly elected Democratic congressional majority was demanding a prompt U.S. exit. Facing a humiliating forced withdrawal and likely defeat in his quest to establish a stable Middle East democracy, Pres. George W. Bush decided instead to replace his military leadership and escalate the U.S. military presence in the conflict. The bold plan attracted comparisons to the disastrous U.S. experience in Vietnam and ran counter to majority opinion—from Congress, the Iraq Study Group, and even U.S. public opinion polls. The conflict also led to the bloodiest year yet for U.S. troops fighting the war on terrorism.

      After a spike in violence at midyear, reinforced coalition forces in Iraq were able to forge cooperation pacts with numerous factions and root out terrorists in Baghdad and elsewhere. Security began improving dramatically across Iraq. As 2007 ended, Bush appeared to have won his last-minute gamble, at least temporarily, and bought more time for his policies.

 Bush vowed in January to augment the 132,000 Iraq-based U.S. forces with 30,000 reinforcements. Fresh troops began arriving within weeks, taking on both Sunni and Shiʿite militias for control of Baghdad neighbourhoods and creating alliances with tribal chiefs to combat suspected al-Qaeda fighters in outlying provinces. The military also provided financial support to Awakening Councils, formed by Sunni sheikhs designed to turn Iraqi neighbourhoods against foreign terrorist fighters by appealing to the residents' nationalist sentiment. The efforts were particularly successful in the unruly western Al-Anbar province, where previously hostile tribes began turning against al-Qaeda.

      By fall, as the U.S. surge reached its peak, some 160,000 U.S. troops were on Iraqi deployment, and congressional opposition to the plan grew ferocious. The new U.S. military commander, Army Gen. David Petraeus, was summoned to Washington in September to answer skeptics and defend his cautious claims of progress. One prominent antiwar group, MoveOn.org, ran a controversial full-page newspaper ad questioning General Petraeus's credibility and patriotism. Throughout the year Congress held more than 80 votes designed to reduce funding or force U.S. withdrawal from Iraq, but President Bush was able to obtain $200 billion for the war in three emergency spending bills that were eventually approved without strings attached.

      By October it had become obvious that the insurgency—bombings, attacks, and both civilian and military deaths—was losing momentum rapidly. At year's end violent incidents were down by two-thirds, Iraqis had taken over security in many areas, and officials were able to announce initial U.S. troop withdrawals. Even so, U.S. military deaths in Iraq reached 899 for the year, the highest number since the 2003 U.S.-led incursion. The U.S. monthly death toll peaked in May at 126, but it dropped to 37 in November and 23 in December. Although fighting between Islamic factions was also reduced during the year, critics pointed out that the Iraqi government had failed to make substantial progress in achieving national reconciliation.

      In Afghanistan Islamic radicals continued their resurgence following the 2001 U.S.-led invasion that toppled the Taliban from power. Sheltered in sanctuaries in lawless tribal areas of western Pakistan and financed in part by opium production, Taliban fighters escalated armed clashes in remote areas, at times retaking effective control of up to half the country. For the first time U.S. troop deaths topped 100, and the overall 232 coalition deaths for the year were almost evenly divided between U.S. and other NATO forces. For much of the year the U.S. encouraged its allies to substantially increase their troop commitments in Afghanistan, but with little success, and some observers suggested that the U.S. would soon be compelled to increase its own presence in Afghanistan even as it drew forces down in Iraq.

Domestic Policy.
       Democrats took control of Congress from scandal-plagued Republicans in January, promising major changes in national priorities, ethics, entitlements, health care, fiscal policy, and the war in Iraq. However, President Bush and the Republican minority, utilizing veto threats and procedural rules, managed to alter many Democratic initiatives and halt others altogether.

      The tone was set early in the year when the House of Representatives quickly approved virtually all of the Democrats' “Six in '06” campaign promises—including stepped-up stem-cell research, a minimum-wage increase, reduced-cost student loans, and mandatory negotiation on Medicare drug prices. All of the initiatives bogged down in the Senate, where a 60-vote supermajority was often necessary to move legislation. By May none of the Democratic legislation had reached Bush's desk.

      In the spring, antiwar Democrats attached an amendment to a $90 billion supplemental Iraq War appropriation requested by President Bush, setting a timeline for withdrawal of U.S. forces. Bush rejected the measure—only his second veto in more than six years in office. Amid GOP warnings that U.S. troops needed resupply, Democrats were forced to pass the supplemental without timeline amendments.

 Relations between Congress and the administration were contentious. The new congressional majority launched numerous investigations of past administration actions; one inquiry into the firing of eight U.S. attorneys in 2006 led to the resignation of Attorney General Alberto Gonzales, who was replaced in November by Michael Mukasey. Bush was also weakened by the conviction in March of former vice presidential aide I. Lewis (“Scooter”) Libby on charges of having lied to a special counsel about his involvement in the leak of a covert CIA officer's identity. Bush commuted Libby's two-and-a-half-year prison sentence on the eve of Libby's incarceration.

      The pace of congressional legislation was glacial. Immigration reform, frustrated by the U.S. House in 2006, appeared headed for passage early in the year when the Bush administration started negotiations with Republican and Democratic Senate leaders on a compromise bill. Initially, in a test vote, 69 Senators signaled willingness to consider the measure, but the bill rapidly lost support on the Senate floor. The measure allowed most illegal aliens to stay in the U.S. and earn permanent status by paying taxes, learning English, and avoiding criminal activity, but it was quickly denounced by opponents as an amnesty that would encourage future disrespect for border laws. After three weeks of contentious debate, supporters agreed to add a “touchback” provision requiring undocumented aliens to return to their home countries at least briefly before receiving legal status. In the critical procedural test, however, only 46 Senators (of 60 required) voted to pursue the legislation, and comprehensive reform died again.

      President Bush vetoed five additional bills during the year and threatened rejection of some 50 more while seeking to force legislative changes. Bush vetoed expansion of federal funding for embryonic-stem-cell research and thus killed the measure again. Republicans stymied the plan to force pharmaceutical companies to negotiate with the government over Medicare drug prices. An ambitious bipartisan bill to double expenditures on the State Children's Health Insurance Program was vetoed twice; at year's end Congress extended the existing program for 18 months, removing the issue from the 2008 election cycle. Congress was able to override only one Bush veto—an appropriations measure containing numerous local infrastructure projects, including funding for rebuilding the hurricane-devastated Gulf coast.

      The new Congress was ultimately successful at midyear in raising the minimum wage for the first time in a decade; the law increased the rate from $5.15 to $7.25 per hour by 2009. A Democrat-led effort to provide additional assistance to students by expanding Pell Grants and reducing interest rates on student loans also became law.

      Congress provided a record funding increase for veterans' health care programs and significantly tightened Washington lobbying and ethics rules. Critics noted that the new rules did not directly address concerns over rapidly expanding congressional earmarks—projects inserted in appropriations bills by individual lawmakers—and President Bush complained that a massive spending bill at year's end contained more than 9,800 such additions, with an estimated cost of $10 billion. Congress also provided a one-year fix for the Alternative Minimum Tax (AMT), a 1978 law originally written to ensure that the wealthy paid at least minimal taxes. For 2007, owing to inflation, the AMT threatened some 23 million taxpayers. After extended negotiations, the $50 billion AMT expansion was suspended, and the lost government revenue was to be added to the federal deficit.

      As oil prices moved close to $100 per barrel during the year, Congress passed new energy legislation to expand alternative energy sources, increase vehicle mileage standards by 40% (to an average of 35 mi per gal by 2020), and phase out incandescent light bulbs in favour of fluorescent lighting. Yet another threatened veto forced removal of provisions rolling back tax breaks to oil and gas companies, and Bush thus had successfully stopped any major tax increase during the year.

The Economy.
      Excesses in the domestic housing sector finally caught up with the U.S. economy during 2007, causing major disruption among financial firms and marring an otherwise solid sixth consecutive year of economic growth. The turmoil had worldwide ramifications. In most recent years the U.S. economic engine had pulled the global economy forward. In 2007, however, with the U.S. overextended and struggling, developing economies such as India, China, and Russia shared the mantle of world economic leadership.

      Spurred by nearly 5% growth in the third quarter, the U.S. economy expanded by almost 3% for the year, close to its long-range potential. Employment increased every month, setting a national record of 52 consecutive months of net job growth. Some 1.5 million new payroll jobs were created, and although the unemployment rate moved upward from 4.4% to 5.0% during the year, employers continued to report worker shortages in many areas.

      The positive statistics, however, masked tumult caused by a housing bubble earlier in the decade. Brokers had helped fuel a boom in home construction and resales by offering adjustable-rate mortgages at low initial interest rates. The easy cash drove home prices up markedly, producing an overheated real-estate market that peaked in 2005. The new mortgages were typically packaged together and resold as securities to banks and other investors in the U.S. and worldwide. By mid-2007, however, it had become clear that a substantial minority of homeowners could not make their payments when their interest rates were adjusted upward. That led to rising delinquency rates and foreclosures, and an estimated $500 billion worth of “subprime” mortgage securities were devalued, which reduced the lending capacity of financial institutions.

      With equity markets in turmoil, federal officials in August changed signals and began easing short-term interest rates, which had remained largely unchanged for a year. In an effort to avoid an economic slowdown, the U.S. Federal Reserve Board (Fed) lowered the key federal funds rate by one-half percentage point in September and followed with two additional quarter-point reductions in the fall. The administration also sought voluntary private-sector cooperation to ease the crisis, including a controversial plan to freeze interest rates temporarily. The action came too late, however, to forestall multibillion-dollar losses reported in the fall by holders of subprime paper. The chief executives of Citigroup and Merrill Lynch resigned under pressure, and several major financial institutions were forced to seek infusions of foreign funds to bolster their books.

      Other economic news was mixed. With energy prices again rising, the threat of inflation reappeared, and the consumer price index topped 3% for the year, well above the Fed's guidelines. National workplace productivity, a key measure of economic efficiency, resumed substantial growth after a brief slowdown. As expanding economic activity bolstered revenues, the federal budget deficit declined again in 2007, to $163 billion. As the U.S. trade deficit continued at a historic peak, the U.S. dollar suffered, losing 10% of its value to the euro during the year.

      Overall, despite turmoil among financial firms, Wall Street ended a turbulent year with solid, if unspectacular, gains. Equity markets rallied following the August interest-rate cut but gave back most of the year's gains later in the year. The broad Dow Jones Wilshire 5000 index finished up by 3.9% for the year, while the Dow Jones Industrial Average gained 6.4%. At year's end, however, consumer and investor confidence was dropping, and economists were divided on whether the national economic expansion would continue into 2008.

Foreign Policy.
      With its attention and resources concentrated on Iraq and Afghanistan, the U.S. was unable to focus sustained diplomatic attention on overseas issues and recorded little real progress during 2007. One apparent exception involved North Korea, which President Bush in 2002 had named as one of three “axis of evil” countries because of weapons exports and support for terrorism. For four years Japan, Russia, China, the U.S., and South Korea had negotiated with North Korea to dismantle its fledgling nuclear weapons capacity. On September 3, however, diplomatic negotiators announced that North Korea had agreed to catalog and dismantle its nuclear testing sites and would in turn receive a $300 million aid package. At year's end North Korea failed to honour yet another disclosure deadline, but diplomats remained optimistic that a breakthrough had been achieved.

      Efforts to prevent Iran from achieving nuclear capability were largely unavailing. After Iran denied UN inspectors access to suspected weapons sites, the Security Council in March unanimously approved a resolution tightening international economic sanctions—again, with few ascertainable results. In early December U.S. intelligence agencies released a surprise consensus National Intelligence Estimate (NIE) declaring with “high confidence” that Iran had abandoned its pursuit of nuclear weapons capacity in 2003—reversing a 2005 “high confidence” estimate by the same agencies that Iran was rapidly developing such weaponry. The new NIE undermined the international consensus for stopping Iranian nuclear development, and critics charged that U.S. intelligence officials were effectively overturning Bush administration policy.

       Russia maintained substantial trade with Iran, including its first delivery of uranium fuel, and U.S. relations with Russia continued to deteriorate slowly during the year. At midyear President Bush invited Russian Pres. Vladimir Putin to Kennebunkport, Maine, in an unsuccessful attempt to warm up bilateral relations. U.S. officials were openly critical of Putin's centralization of control over the Russian government, suggesting that democracy was being undermined. After the U.S. pressed ahead with plans to install missile defense installations in Poland and the Czech Republic, Putin announced that Russia would suspend its participation in the 1990 Conventional Forces in Europe (CFE) treaty, a key European arms-control agreement.

      U.S. policy toward Asia was dominated by the growing influence of China, a current trading partner viewed as a future economic or even military rival. The U.S. filed three World Trade Organization complaints during the year against China, which nonetheless continued to enjoy a huge export advantage in the bilateral trade balance. In the spring China was forced into massive recalls of substandard products shipped to the U.S., including defective tires, tainted pet food, and toys with lead paint. (See Special Report: Perils of China's Explosive Growth.)

      The U.S. concentrated on tightening relations with India and an increasingly unstable Pakistan in an effort to counter China's growing influence. The U.S. signed a controversial agreement with India to facilitate production of domestic nuclear power, even though the deal arguably infringed on international nuclear nonproliferation agreements. Relations with India's rival, Pakistan, were rockier. The U.S. pushed the Pakistani military regime for democratic reform even while seeking from Pakistan additional military action against Taliban fighters attempting to destabilize Afghanistan. The U.S. openly criticized Pakistani Pres. Pervez Musharraf's declaration of a short-lived state of emergency in November but was less outspoken when President Musharraf's chief rival, former prime minister Benazir Bhutto, was assassinated in late December.

      The U.S. was able to claim closer ties with one of its traditional major allies following the presidential elections in France. The warming was especially noteworthy because for years France had been openly critical of U.S. policies in Europe and the Middle East.

      With the U.S. Senate bogged down in partisan gridlock, international treaties received scant attention. Over opposition from trade unions, the Senate finally approved a free-trade agreement with Peru, but similar proposed pacts with Colombia, Panama, and South Korea languished at year's end. A Senate committee voted 17–4 in late October to ratify the decades-old Law of the Sea treaty, which had previously been signed by virtually every other country, but conservatives argued that the treaty would grant the UN powers that rightfully belonged under exclusive U.S. sovereignty. The full Senate did not take up the treaty by year's end.

 President Bush attempted to counter a distinct regional movement toward socialism and the growing influence of Venezuelan Pres. Hugo Chávez by visiting five Latin American countries in the spring. Chávez, the most visible manifestation of a discernible leftward shift in Latin American politics, stepped up his anti-American rhetoric during the year and established a close relationship with U.S. adversaries such as Iran. U.S. influence was bolstered when Chávez appeared to overreach and narrowly lost a December referendum that would have allowed him to rule the oil-rich country indefinitely.

      U.S. policy was severely tested in two international conferences at year's end. Responding to complaints about the lack of leadership toward Middle East peace, U.S. Secretary of State Condoleezza Rice convened a 40-country summit in Annapolis, Md., in late fall. The conference, which included the Israeli and Palestinian leaders, ended amiably with mutual vows to draft another framework for peace in 2008. The U.S. found itself isolated at a UN-sponsored conference on global warming held in Bali, Indon., in December. Criticized for its failure to ratify the 1997 Kyoto Protocol and largely abandoned in public sessions by other major industrialized countries, the U.S. delegation reversed itself in mid-conference and agreed to a new process that promised involvement of less-developed countries, speedier antipollution technology transfers to Third World countries, and development of a worldwide plan to combat global warming by the end of 2009.

David C. Beckwith

Developments in the States
      Reacting to perceived failures by the federal government, U.S. states moved forward in 2007 on several lawmaking fronts, including health care, immigration, security, climate change, and other areas heretofore considered national issues. The tension with Washington, D.C., enlivened an active year for state governments and resulted in a marked deterioration of fiscal balances by year's end. All 50 states staged legislative sessions during the year, and 22 states held one or more special sessions.

Party Strengths.
       Democrats recorded gains in limited state elections during the year. In governorships Republicans took over a Democratic seat in Louisiana but were ousted in Kentucky, thereby maintaining the Democratic advantage at 28–22. Democrats also added to their majority control in legislative balloting, having taken over the state Senates in Virginia and Mississippi and adding seats elsewhere. In 2008 Democrats would have control of both legislative chambers in 23 states, and the GOP would dominate in 14, with 12 split or tied. (Nebraska has a nonpartisan, unicameral legislature.)

Structures, Powers.
      Several states took action to safeguard ballot procedures. Montana and South Dakota moved to curb abuses in citizen initiatives by prohibiting signature gatherers from being paid by the signature. After enacting a model election-reform law in 2001 in response to the presidential election debacle the previous year, Florida was forced to revisit the subject; this time the state required a paper trail in all electronic voting machines. Iowa, Maryland, and Virginia approved similar mandates; as a result, 27 states required a paper record for auditing purposes.

Government Relations.
      A rare revolt over mandates in a federal law broke into the open during 2007 when several state legislatures—including New Hampshire, Montana, Oklahoma, and Washington—refused to comply with the Real ID Act. Many others also took preliminary steps in the same direction. The federal law required states to verify the identity of all 245 million licensed drivers and to impose other security features, at an estimated cost of $14 billion.

      Real ID had been under fire since its passage as an antiterrorism measure in 2005. States objected to its cost; civil libertarians raised privacy concerns; and immigrant rights groups objected to provisions impairing states' ability to grant driver's licenses to noncitizens. Under the act, licenses that did not comply could not be used as identification for entering airports or federal buildings. Tennessee followed North Carolina in denying driver's licenses to illegal immigrants during 2007, and a proposal by New York's governor to issue licenses to illegal residents was abandoned after widespread criticism.

 Tension over funding and control of state National Guard troops continued to simmer. After Louisiana's governor turned down a federal National Guard takeover in the wake of Hurricane Katrina in 2005, Congress authorized federalization of the Guard in future disasters, a change that prompted objections from numerous governors. Kansas Gov. Kathleen Sebelius implied that Iraq deployments had reduced the capacity of her state's National Guard to assist when tornadoes devastated Greensburg in May. Under pressure from the White House, however, Sebelius said that her real worry was preparedness for possible future needs.

Finances.
      After two years of robust revenue and spending growth, states tightened their belts in 2007 as national economic growth slowed at year's end. Tightening mortgage standards helped depress the housing market in many areas, and sales and real-estate tax collections slowed markedly. Arizona, California, Florida, Illinois, Maryland, Michigan, Virginia, and Wisconsin were among the states facing major budget deficits. (See Sidebar: Subprime Mortgages: A Catalyst for Global Chaos.) With its auto industry slumping, Michigan continued in what became known as a “one-state recession.”

      State general fund expenditures rose by 9%, paced by increases in Medicaid and pension spending. For the first time, owing to rising health care costs, state spending on Medicaid programs for low-income individuals in 2007 surpassed state expenditures on K–12 education. Numerous states adjusted tax rates, but overall changes in revenue collections were minor by historical standards. Twenty-four states reduced personal income taxes and four raised them, saving taxpayers $1 billion, while 22 states lowered sales taxes and two increased them. Nine states, led by Michigan and New York, increased corporate income taxes. The biggest revenue increases were assessed against tobacco products, with eight states raising cigarette taxes by $761 million. No state adjusted alcohol taxes during the year. Two states raised motor fuel taxes, and 13 boosted motor vehicle and other user fees.

      As the real-estate slump deepened late in the year, states began tapping their “rainy day” funds, carryover balances, and other reserves to combat looming budget deficits. More than a dozen states raised alarms over pending budget deficits, increasing the prospect of further belt tightening or tax increases in 2008.

Law and Justice.
      Opposition to capital punishment gained significant ground during 2007, fed by concerns over wrongful convictions and the humaneness of executions. In late September the U.S. Supreme Court announced that it would review whether the lethal-injection method used in virtually all executions constituted “cruel and unusual punishment.” States immediately suspended executions for the remainder of the year. Only 42 inmates were executed during 2007, the fewest since 1994. At year's end New Jersey became the first state in 42 years to abolish capital punishment; death penalty statutes remained on the books in 36 states.

      Though Congress failed to enact immigration reform, 46 states passed immigration-related legislation. Several moved to increase employer responsibility for ensuring that their workers were in the U.S. legally. Arizona, Nevada, Oklahoma, Tennessee, and West Virginia joined Colorado and Georgia in restricting immigrant services or increasing enforcement penalties against illegal aliens. The Arizona law suspended the license of any business convicted of having hired illegal aliens; a second offense was grounds for permanent revocation. Arizona's governor also deployed National Guard forces to assist in border enforcement. Oklahoma prohibited the hiring or transporting of illegal workers and banned undocumented aliens from receiving public benefits. A dozen other states approved a variety of measures that cracked down on identity fraud or required proof of legal status to receive public benefits. California specifically extended public benefits to migrant workers, and Illinois became the first state to prohibit officials from checking identities by using a federal database.

      After National Football League star Michael Vick was arrested on federal dogfighting charges, several states moved to bolster animal cruelty laws. New Mexico and Louisiana became the last two states to ban cockfighting, although the Louisiana law would not take effect until August 2008.

Gambling.
      States continued to expand legalized gaming during the year. Seeking a greater share of profits, Kansas became the first to authorize large-scale casino resorts owned and operated by the state. Indiana joined 11 other states that allowed slot machines at horse tracks, and legislation was pending in Maryland and Michigan; West Virginia added table games at casino racetracks. Maine voters rejected a harness-racing track with slot machines, but Florida and California were among the states that allowed expanded gaming in Native American casinos.

Health and Welfare.
  Health care issues—including access, cost, and delivery—dominated legislative agendas during 2007. The debate came as the federal government moved slowly to expand and reform its State Children's Health Insurance Program (SCHIP), which some states had used to cover parents, single adults, and middle-class families. New York and New Jersey helped to fuel the controversy by seeking federal matching SCHIP funds for families earning up to 400% of the poverty income level.

      Illinois became the first state to guarantee health insurance to all children. Florida and Indiana initiated closely watched experiments in Medicaid reform, expanding coverage while trying to hold down costs through insurer competition and requiring recipients to contribute to personal health savings accounts.

      More states moved toward universal health insurance coverage. California's governor proposed a $12 billion plan to cover all state residents, drawing on universal coverage experiments under way in Maine, Massachusetts, Vermont, and Hawaii, but the initiative bogged down in the state legislature.

      Eight additional states—Illinois, Maryland, Minnesota, New Hampshire, New Mexico, Oklahoma, Oregon, and Tennessee—banned smoking in public areas and places of employment, including restaurants and bars. By 2008 a total of 31 states would mandate smoke-free environments.

      Bucking a national trend, Oregon voters turned down a proposal that would raise tobacco taxes to finance increased health insurance for children. State stem-cell research had a mixed year; voters in New Jersey rejected a major bond issue related to such research, but New York budgeted $600 million over 10 years. Texas voters approved $3 billion in bonds for cancer research. Texas and Florida joined New Jersey in testing high-school athletes for steroid use. Nearly half the states considered requiring schoolgirls to be vaccinated against the human papillomavirus (HPV), which causes cervical cancer, but only one—Virginia—enacted a statewide program (parents were allowed to opt out, however).

Education.
      School-choice programs in K–12 education, which had been gaining ground in previous years, made minimal progress during 2007. Utah became the first state to enact a universal voucher law that allowed any child to receive public funds to attend private school, but Utah voters repealed the measure in November. Three states expanded voucher programs, but only for students with disabilities.

Rights.
      Washington joined California in requiring employers to grant paid leave of up to $250 per week for parents of newborn children. Illinois became the 12th state to require a mandatory daily moment of silence in public schools. Maryland became the first to enact a “living wage” law, which required state contractors to pay their employees up to $11.30 per hour. The Alabama, Maryland, North Carolina, and Virginia legislatures expressed remorse for their states' past support of slavery.

      Advocates of equal rights for homosexuals made progress during the year. New Hampshire became the fourth state to approve civil unions, giving same-sex couples all rights granted under traditional marriage laws. Oregon and Washington joined California, Maine, and Hawaii in enacting domestic-partnership laws, with many of the same benefits. Rhode Island's attorney general declared that his state would recognize marriages performed in Massachusetts, the only state that recognized same-sex marriages. Iowa and Colorado banned discrimination in the workplace on the basis of sexual orientation, and Colorado specified that homosexuals could adopt children.

      Oregon voters rolled back a controversial 2004 initiative that required that the government compensate property owners for land-use restrictions; the measure had produced demands for $19 billion in little more than two years. Florida and Maryland restored voting rights for convicted felons who had served their time.

Environment.
       Global-warming fears, augmented by a perception that the federal government was foot-dragging on environmental protection, spurred significant state legislation during the year. Hawaii, New Jersey, Minnesota, and Washington endorsed a 2006 California law that limited smokestack emissions from power plants and industrial sources. After President Bush signed a law boosting automobile fuel economy standards over 12 years, the administration formally rejected a tougher 2002 California law requiring an even faster reduction in auto carbon-dioxide emissions. The state initiative had been endorsed by a dozen additional states, including Maryland in 2007, and at year's end California announced new plans to sue the federal government.

      States continued to boost goals for producing electricity from renewable sources; Minnesota, New Hampshire, and Oregon officially set targets of 25% renewable production by 2025. A total of 23 states had renewable-energy standards.

Consumer Protection.
      State laws requiring that cigarettes be self-extinguishing gained rapidly in popularity. Fifteen states approved new “fire-safe” measures, bringing to 21 the number of states that required manufacturers to add bands of paper that quickly snuffed the flame of a cigarette not being smoked.

      Following the collapse of a Minnesota I-35W highway bridge on August 1, states nationwide moved to reinspect similar structures and propose infrastructure-repair plans. Even though studies showed that more than one-quarter of the country's bridges were rated either structurally deficient or obsolete, minimal additional funding was allocated during the year.

      A battle continued in state legislatures between telephone and cable companies over regulation and taxation of multichannel television; more than a dozen states moved from local to statewide control. Telephone firms wanted to bypass complicated local requirements as they attempted to compete with cable on Internet access as well as telephone and television delivery.

      With $40 billion in insurance claims from Hurricane Katrina, insurers moved to raise rates or reduce coverage, creating a serious backlash in several states, particularly along the Gulf Coast. Louisiana and South Carolina offered tax breaks to insurers, and in a controversial move, Florida dramatically expanded its state-run “insurer of last resort” to cover more than one million residents. Critics warned that the state was taking on excessive risk. Nevada, New Mexico, and Oregon increased regulation of short-term, high-interest “payday” lenders. Washington state voters approved a measure that allowed triple-damage lawsuits against insurers who wrongfully rejected claims.

David C. Beckwith

▪ 2007

Introduction
Area:
9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2006 est.): 299,330,000; on October 17 the population passed 300,000,000
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

 Following the terrorist attacks in the U.S. on Sept. 11, 2001, the administration of Pres. George W. Bush mounted an aggressive international response, organizing a military coalition of willing industrialized countries to root out international terrorism. By 2006, however, the effort had been bloodied by religiously inspired violence, and even though there was again no terrorist attack on U.S. soil, U.S. defeat on the central battlefield appeared possible. For the third consecutive year, more than 800 U.S. soldiers died in Iraq (more than 3,000 had died since the conflict began in 2003), and the patience of the American public was exhausted. Following electoral reverses in November, President Bush faced the real prospect that his legacy in the war on terrorism would be one of overreach and failure. (See Sidebar. (U.S. 2006 Midterm Elections ))

Iraq.
      As the year began amid signs of easing tensions, top U.S. commander Gen. George Casey was planning a yearlong gradual reduction in the 155,000 U.S. troops on the ground in Iraq. At that time minority Sunni Muslims, aided by non-Iraqi terrorists, were carrying out repeated bombing and kidnapping attacks against majority Shiʿite members, but U.S. and Iraqi government forces were handling the violence. On February 21, however, seven al-Qaeda terrorists staged a predawn attack on the revered al-Askari shrine in Samarraʾ, north of Baghdad, blowing apart the mosque's famed golden dome. The sacrilege at one of Shiʿism's most holy sites started an outburst of retaliatory violence that escalated steadily during the year, drawing much of Iraq into a sectarian civil war. As thousands of Iraqis were murdered and U.S. troop losses mounted, military reinforcements were ordered, and domestic support for President Bush and his Iraq policy contracted rapidly. Antiwar sentiment rose, and as elections approached, even loyal Republicans began to break with the president, many decrying the absence of a strategy to win in Iraq.

 After voters emphatically repudiated Republican leadership in congressional elections, Bush accepted the resignation of Secretary of Defense Donald Rumsfeld, a clear indication that the administration's Iraq policy had failed. Bush appointed as Rumsfeld's replacement former director of central intelligence Robert M. Gates (Gates, Robert ). (See Biographies.) A bipartisan Iraq Study Group of government elders cochaired by former secretary of state James A. Baker III and former congressman Lee H. Hamilton issued a report calling for increased regional diplomacy and phased withdrawal of the overstretched U.S. military from Iraq. The report was designed to provide political cover for disengagement. At year's end, however, Bush appeared to be pondering instead a “surge,” or temporary escalation in U.S. forces, led by aggressive new field commanders, in a last-ditch effort to win the peace in Iraq and establish democracy in the Arab world.

      Amid increasing pessimism, there were occasional signs of progress in the war on terrorism. A U.S. air strike in June killed Abu Musab al-Zarqawi (Zarqawi, Abu Musab al- ) (see Obituaries), the leader of al-Qaeda in Iraq. British authorities in August broke up a London-based plot by Islamic terrorists to carry small liquid bombs, disguised as sports drinks, on up to 12 U.S.-bound jumbo jetliners. The episode was a stark reminder of the stakes faced by the West in the terrorism struggle. In December a U.S. aircraft-carrier task force supported Ethiopian army forces that routed Islamic fighters from Somalia.

      The war on terrorism also exposed deep internal divisions over constitutional protections during wartime. News stories detailed government surveillance techniques against terrorism, including wiretapping of overseas calls and monitoring of international bank records—measures that alarmed civil libertarians. After persistent rumours, Bush announced that CIA officials had been holding high-value terror detainees in secret prisons around the world and subjecting them to aggressive interrogation methods that some critics labeled torture. In September, following weeks of contentious debate, Congress approved new legislation allowing the president latitude in approving interrogation techniques and granting detainees only a well-regulated military-commission prosecution, not a full hearing in federal courts.

Domestic Policy.
      With public-opinion polls showing popular disdain for the political leadership in Washington, Congress accomplished little during 2006 and earned comparison with “do-nothing” legislatures of earlier eras. Republicans blamed unfinished business on opposition obstructionism, but Democrats replied that congressional leaders needed to work harder and listen more closely to public opinion. In November voters opted for new congressional management.

      With the Bush administration distracted by Iraq and investigations, the White House was unable to furnish strong leadership for much of the year. In June the president's top strategist, Karl Rove, was finally cleared by a special prosecutor in the investigation of the 2003 publication of a CIA employee's identity. It was revealed later that a top Department of State official, Richard Armitage, had inadvertently leaked the name—a fact known by prosecutor Patrick Fitzgerald before he started his three-year investigation.

      The legislative year got off to a fast start. In late January, by a 58–42 vote, New Jersey federal judge Samuel Alito (see Biographies) was confirmed as a U.S. Supreme Court justice, replacing retired justice Sandra Day O'Connor. The Bush nomination produced the most contentious high-court confirmation battle in 15 years, with opponents suggesting that Alito would expand presidential powers at the expense of Congress and curb abortion rights.

      A month later Congress overwhelmingly approved reauthorization of the Patriot Act, initially enacted after the 2001 terrorism attacks. The law, which had attracted widespread criticism, was changed only modestly to provide subpoena targets with additional procedural rights as information was gathered in terrorism investigations.

      The U.S. Senate conducted a tumultuous debate on immigration policy during the spring. Initially, the Judiciary Committee approved a bill that would allow most undocumented persons then residing in the country to stay in the U.S. and “earn citizenship” by paying $2,000 in fines, working for six years, learning English, undergoing a background check, and paying any back taxes owed. That bill was widely criticized as an amnesty that rewarded illegal conduct. Republicans soon substituted a new version on the Senate floor that also contained a guest-worker program but stiffened requirements for obtaining legal status and excluded the most recent arrivals altogether.

      During consideration of more than 40 amendments to the bill, senators barred immigrants from obtaining legal status if they had committed a felony or violated a court order; this provision alone eliminated an estimated 500,000 of the 12,000,000 undocumented persons then in the country. In a close vote the Senate allowed even workers who had used false identities to claim Social Security benefits on their earnings. The compromise bill, which had President Bush's tacit support, increased border and workplace enforcement and expanded visa authorization; it was approved 62–36. The debate prompted strong public reaction. On April 10, Hispanics and their sympathizers staged massive protest marches in 102 cities across the country. Some marchers carried Mexican flags, which generated a backlash in public opinion, and flags disappeared from later demonstrations. The protests led to a demand for stronger border enforcement. On May 15, President Bush announced that he was sending National Guard troops to the Mexican border to assist the U.S. Border Patrol.

      The U.S. House had previously approved an enforcement-only border-security plan, with stiffened penalties for immigration violations and no provision for guest workers. Instead of negotiating with the Senate, House leaders refused to appoint conferees, staging instead a series of 40 public hearings across the country designed to highlight perceived deficiencies in the Senate bill. As the election approached, Congress reconfirmed support only for 1,125 km (700 mi) of fencing along the U.S.-Mexico border, leaving comprehensive immigration reform for another year. (See Special Report (Immigration's Economic Impact ).)

      The year saw no progress on reforming Social Security, even as baby boomers began to retire and receive benefits. Despite persistent headlines from the corruption scandal involving lobbyist Jack Abramoff, lobbying reform was not seriously considered. Legislation to address global warming died, as did bills to combat identity theft and to increase the minimum wage. Congress approved one controversial measure, which authorized dramatically expanded funding for embryonic-stem-cell research, but right-to-life groups vigorously opposed the bill, and President Bush vetoed it in July. It was his first veto in more than five years in office.

      Congress renewed the 1965 Voting Rights Act for 25 additional years. It also overhauled national pension legislation and allowed travelers to bring back up to three months' supply of prescription drugs from low-price retailers in Canada. Only 2 of 11 appropriations bills were approved—in large part because taxpayer groups found more than 10,000 congressional earmarks in the drafts. This meant that government would be largely funded into 2007 via a resolution that would continue existing programs. At year's end, Americans were mourning the loss of former U.S. president Gerald R. Ford (Ford, Gerald Rudolph, Jr. ), who died in Rancho Mirage, Calif., on December 26. (See Obituaries.)

The Economy.
      Despite a slowdown in key auto and housing sectors and renewed turmoil in corporate executive suites, the U.S. economy expanded for the fifth consecutive year. Some 1.8 million new payroll jobs were created, and equity markets headed substantially higher. The jobless rate fell further, from 5% to 4.5%, creating virtual full employment in most areas of the country, even as tens of thousands of new undocumented workers from abroad joined the workforce.

      Bustling economic activity continued to stir inflationary fears at the country's central bank. The U.S. Federal Reserve continued a two-year policy of nudging up interest rates and boosted the federal funds rate by 0.25% on four occasions early in the year. With higher rates pinching housing sales and gasoline prices again heading over $3 per gallon because of political unrest and summer demand, the Fed halted further increases for the year. Even so, the expansion was slowed, and inflation dropped to the lowest rate in three years. GDP jumped by 5.6% in the first quarter before settling down to a healthy 2–3% growth range for the remainder of the year. The spring energy price rise proved short-lived, and by mid-October the price of gasoline was dropping rapidly back down to $2.20. With consumer prices rising at 2.5% for the year and energy prices again under control, national financial markets staged a substantial late-year rally. The S&P 500 finished the year up 13.6%, and the closely watched Dow Jones Industrial Average did even better, gaining 16.3%.

      Some critics asserted that the year's rosy financial news concealed deep underlying problems in the U.S. economy. Productivity growth, a key measure of economic efficiency, slowed during 2006 after five years of major gains. Rapid growth in government revenues slashed the U.S. budget deficit for fiscal 2006 to $248 billion, well under early estimates, but the remaining shortfall was still a worry to some economists.

      After overheating in 2005, the nation's housing market continued to cool dramatically, with average prices in some areas falling by up to 10%. That, plus higher interest rates, reduced the ability of consumers to borrow against their home equity, a major source of economic liquidity in recent years. Even so, American consumers continued to spend heavily on foreign goods, including automobiles, and the three major U.S. auto companies joined housing on the short list of industries that failed to join in the year's economic good news. Consumer demand for foreign goods coupled with a decline of U.S. energy reserves meant that the country's trade deficit set another record during the year. The U.S. dollar rose slightly against the Japanese yen and dropped more than 10% against the euro.

      Corporate America was hit by another major internal scandal in 2006, this one centring on executive stock options. Under federal investigation, nearly 200 publicly held companies reported irregularities in the granting of options on corporate stock, often involving backdating of options to maximize their worth. By mid-December more than 55 executives and directors had been forced out and others implicated in questionable transactions, some in major firms. In one particularly dramatic case, the CEO of Comverse Technology fled to Namibia to avoid extradition and questioning about option grants.

      The country's most successful retailer, Wal-Mart, endured a rare tumultuous year capped by disappointing year-end holiday sales. The company was the target of a major public-relations attack by union activists objecting to the company's pay and benefits policies, its reliance on Chinese merchandise, and its adverse effect on local small retailers.

Foreign Policy.
      With American power tied down by sectarian conflict in Iraq, diverting both focus and resources, U.S. diplomacy suffered through a forgettable year in 2006. Although relations with some allies, including India, improved during the year, the globe's sole remaining superpower appeared impotent at times, captive to events in most areas, and unable to exert accustomed will on world events.

      In an attempt to relieve stress on the U.S. military, complete control of security in Afghanistan was turned over to NATO during the year. Taliban rebels continued, however, to stage a fierce resurgence punctuated by bombings and suicide attacks, making 2006 the country's bloodiest year since the Taliban was ousted from power in 2001. The Afghan drug trade, technically illegal but tolerated by the government, flourished as security deteriorated. About half of the 40,000 troops in the country were American, and despite repeated calls for assistance, most NATO countries were unable or unwilling to step up their commitment. Efforts to capture Osama bin Laden, believed to be hiding in a lawless area of northern Pakistan near the Afghan border, went nowhere during the year.

      Two rogue states with nuclear ambitions and unstable leadership, Iran and North Korea, took particular advantage of the overstretched U.S. military and preoccupation with Iraq. For most of the year, North Korea declined to participate in six-nation diplomatic efforts designed to stop development of its nuclear weapons program. Instead, on July 5, North Korea test-fired seven missiles, including a Taepodong-2 long-range version that some analysts said was capable of hitting the western United States. The missile failed after 40 seconds, however, landing in the East Sea (Sea of Japan), but not before the U.S. had activated still-unproven interceptor missile systems in Alaska and California. North Korea then shocked even its closest ally, China, by detonating its first confirmed nuclear device inside a Korean mountain tunnel on October 9. Following universal criticism, North Korea agreed to resume international talks, but the outlook was unpromising. North Korean negotiators initially declined to discuss the nuclear program and instead limited discussion to economic sanctions previously imposed on the regime for counterfeiting and illegal technology transfers. (See Korea, Democratic People's Republic of , above.)

      International attempts to persuade Iran to curb its nuclear program, which the oil-rich country insisted was necessary for civilian energy production, were met with continued stalling by Tehran. With Russia and China competing actively for Iranian contracts and trade, Iran was able to play world powers against each other and elude international sanctions. As the year began, a Russian offer to defuse the crisis by enriching uranium for Iran was rejected. The UN Security Council then gave Tehran until August 31 to stop enrichment or prove its program was peaceful; member countries offered a package of economic and political concessions as encouragement. When International Atomic Energy Agency inspectors, previously ejected from Iran, returned in mid-August, Iranians refused to release a key 15-page report on possible uranium shaping for weapons uses. Amid arguments over what should be done, the Security Council unanimously voted in December to impose economic sanctions on Iran—but only after watering down, at the behest of China and Russia, provisions for freezing assets and restricting travel of Iranian officials. At year's end, despite internal political problems, Iran's nuclear program remained intact, and its influence in the Middle East appeared to be growing rapidly. (See Iran: Special Report (Iran's Power Dilemma ), above.)

      U.S. efforts to stop a civil war that had claimed more than 200,000 lives in Darfur, the westernmost region in The Sudan, also proved largely ineffectual. President Bush signed a law in October imposing economic sanctions on The Sudan following the central government's refusal to admit 17,000 United Nations peacekeeping troops. During 2006 the U.S. contributed more than $400 million in humanitarian aid to Darfur, but some aid was intercepted, and the Sudanese government continued to ignore international protests.

 With attention focused on Iraq, U.S. diplomats were unable to apply significant new influence on Russia or China during the year. In a May speech that proved controversial, Vice Pres. Dick Cheney accused the government of Russian Pres. Vladimir Putin of rolling back human rights and using its energy reserves as “tools of intimidation or blackmail” against its neighbours. The Putin government rejected U.S. suggestions that authoritarianism was returning to Russia and threatening democracy in Eastern Europe.

      The year produced a record U.S. trade deficit with China of more than $215 billion, severely hampering U.S. efforts to influence perceived human rights, currency, and environmental issues in the world's largest country. The Chinese economy continued its rapid growth, challenging U.S. economic power in Asia, and the U.S. moved notably closer to India during the year. A treaty granting technological assistance to India's fledgling civilian nuclear-power program was approved by the U.S. Senate in December.

       Venezuelan Pres. Hugo Chávez won a substantial reelection victory in December, increasing his prestige in the Western Hemisphere. Chávez repeatedly referred to President Bush as “the devil” in public speeches and led an effort to reduce U.S. economic and political influence in the region. Following his reelection, Chávez moved to nationalize key industries and shut down opposition media, ignoring U.S. criticism in the process.

David C. Beckwith

Developments in the States
      In 2006 U.S. states enjoyed a relatively tranquil year, marked by strengthening fiscal conditions, improved intergovernmental relations, and a respite from major natural disasters. Responding to perceived inaction by the federal government, states took action on numerous issues previously considered outside their province, including illegal immigration, climate change, the minimum wage, stem-cell research, and health care. In the November midterm elections, states followed a national trend in opting for a major shift in partisan control of state capitals. (See Sidebar (U.S. 2006 Midterm Elections ).) Regular legislative sessions were staged by 44 states during the year, and 20 states held one or more special sessions.

Party Strengths.
       Democrats made substantial gains in 2006 state elections, winning 20 of 36 governorships at stake and establishing a clear advantage in state legislatures. Going into the election, Republicans enjoyed a 28–22 advantage among governors, but the lineup for 2007 would be reversed, with 28 Democratic governors. In legislative elections Democrats reestablished majority control after several years of virtual parity between the parties. Prior to the election, Republicans had a two-house majority in 20 legislatures, Democrats were dominant in 19, and the remaining states were split or tied. The new breakdown included two-chamber Democratic majorities in 23 states, GOP control in 15, and 11 split or tied. (Nebraska has a nonpartisan unicameral legislature.)

Structures, Powers.
      By a 58–42% margin, voters in Michigan approved a constitutional amendment banning race-based affirmative action in college admissions and government hiring. The initiative, opposed overwhelmingly by educators and opinion leaders, was similar to measures approved by voters in California in 1996 and Washington in 1998. Support for antitax and state spending-cap measures waned during 2006, continuing a recent trend. Voters in Maine, Nebraska, and Oregon rejected Taxpayer Bill of Rights measures that would have limited spending increases to population growth plus inflation and mandated a legislative supermajority to increase taxes. As comprehensive immigration reform languished in Congress, numerous states grappled with rules for illegal aliens. Arizona voters approved a series of get-tough measures that included denial of day-care or tuition benefits. Colorado and Georgia targeted employers, prohibiting tax deductions for payments to illegal workers and sanctioning a state lawsuit against the federal government for lack of immigration enforcement. A gubernatorial veto voided the California legislature's attempt to allow licenses for undocumented drivers. (See Special Report (Immigration's Economic Impact ).) Arizona voters rejected an innovative attempt to encourage voting; the proposal would have entered voters in an annual million-dollar lottery.

Finances.
      With the national economy continuing to expand, state revenues across the country exceeded expectations and led to the first overall net tax decrease in six years. Some 40 states posted significant surpluses, and officials responded by restoring funding for critical programs, such as education, and by rebuilding “rainy day” funds and addressing perceived inequities in taxation.

      Overall, 24 states cut taxes, while 15 states increased rates, for an overall modest $2.1 billion reduction in tax revenue. The largest tax increase occurred in New Jersey, where a budget dispute between newly elected Gov. Jon Corzine and the legislature led to a six-day shutdown of state government at midyear that idled 45,000 state employees, closed state parks, and even shuttered Atlantic City casinos. The standoff ended with agreement to raise the state sales tax immediately from 6% to 7%, followed by a promised reduction later in state property taxes, the highest in the nation.

       Ohio, pursuing a five-year reduction plan, led 18 states that decreased personal income taxes; only 2 states raised income-tax rates during 2006. Five other states raised sales taxes, and 15 reduced them, usually by cutting levies on food, clothing, or other essentials. Six states raised cigarette and tobacco taxes, and Idaho boosted alcohol revenues. As energy prices spiked at midyear, most states left motor-fuel taxes unchanged.

 Seventeen states made adjustments to corporate taxes, most of them modest. Texas, however, enacted a new business tax as part of a school-finance-reform plan, boosting revenues by more than $400 million. Alaska enacted a major increase in oil-company taxes. Although soaring home prices leveled out or even declined nationwide, house appraisals often rose, and major protests occurred over increased property taxes in numerous states. Texas joined Arizona, New York, New Jersey, Pennsylvania, Georgia, Indiana, Rhode Island, Idaho, and South Carolina in promising relief during the year, often by transferring funds to local governments to offset property-tax revenue.

      With memories of the 2001–03 state budget crisis still fresh, legislators were often wary of enacting measures associated with new spending. As unemployment declined and welfare rolls stabilized, mandatory programs, such as Medicaid, grew at a modest rate during 2006. Numerous states replenished reserves and restored spending on K–12 education, higher education, highways, and other general expenditures that had been trimmed earlier.

Social Issues.
      Although supporters of traditional marriage won most ballot contests during the year, advocates of legal rights for same-sex couples claimed progress in establishing protection for homosexual unions. New Jersey became the third state, after Vermont and Connecticut, to establish civil unions as a formal alternative to traditional marriage. The change came after New Jersey's highest court unanimously declared that homosexual couples deserved all rights and privileges of marriage and ordered the state to either legalize same-sex marriage or provide equal statutory treatment for gay couples. During the year voters in 7 additional states, for a total of 27, amended their constitutions to define marriage as being between one man and one woman. Arizona, however, became the first state to reject a gay-marriage ban after opponents stressed that the measure would ban governments from recognizing domestic-partnership arrangements between heterosexual couples as well. Massachusetts remained the only state to have legalized same-sex marriage. California, Hawaii, and Maine maintained domestic-partnership registries that conferred specific benefits on any couples who registered, same sex or opposite sex. (Colorado voters turned down a “domestic-partnership” proposal in November balloting.) Lawsuits seeking the legalization of same-sex marriage were pending in several states, and high-court decisions were being awaited in Maryland, Connecticut, and California.

      The South Dakota legislature approved a total ban on abortion, seeking to set up a new challenge in the U.S. Supreme Court to Roe v. Wade. In the November election, however, state voters voided the measure by a 56–44% margin. Louisiana enacted a near-total halt to abortions except those required to save the life of the mother. Countering a national trend, voters in California and Oregon rejected measures requiring parental notification when minors sought an abortion. A total of 35 states required either notification or consent by parents in such cases.

Law, Ethics.
       Gun rights advocates flexed their political muscle during 2006, and 14 states joined Florida in approving measures specifying that crime victims need not retreat before using deadly force against attackers. Supporters of the measures called the bills “stand your ground” legislation, but critics labeled them “shoot first” laws. Keying off reports from the Hurricane Katrina disaster, 10 states prohibited authorities from confiscating personal weapons during natural disaster recovery efforts. Two additional states, Nebraska and Kansas, joined the 46 that allowed “concealed-carry” gun permits to be issued to qualified applicants. Only Illinois and Wisconsin prohibited the carrying of a hidden weapon.

      Responding to disturbances staged by anti-homosexual-rights activists, 27 states banned picketing and demonstrations at funeral and memorial services for U.S. servicemen and women.

 Several states grappled with ethics issues. After state legislators became embroiled in financial scandals, North Carolina and Tennessee enacted sweeping ethics-reform legislation. Kentucky's governor was indicted on misdemeanor charges of having hired workers on the basis of their political loyalties, but the charges were later dropped. Former Illinois governor George Ryan was sentenced to six and a half years in a federal prison after his conviction on 18 federal felony corruption charges dating from his tenure as secretary of state. Outgoing Ohio Gov. Robert Taft was reprimanded by the state's Supreme Court for failing to report gifts.

       Tennessee became the first state to require retailers to check identification of all beer purchasers, regardless of how old they looked. Alaska's legislature attempted to recriminalize possession of small amounts of marijuana, but the attempt was largely voided by an Alaskan court. Following an accidental death, Florida prohibited military-style juvenile detention camps.

      Numerous states approved measures cracking down on sexual predators, including those using the Internet. California and New York joined Florida in enacting “Jessica's Law,” which imposed harsher prison sentences on convicted sex offenders and mandated that they be electronically monitored during their lifetime. The California version prohibited offenders from living within 600 m (2,000 ft) of a school or park, but a federal court declared that the law could not be applied retroactively; it would affect future moves of residence by registered sex offenders but not pertain to existing addresses.

      Imposition of the death penalty continued to decline across the country. California and Florida suspended capital punishment in December, after officials mishandled executions employing lethal injections. Federal courts ruled injection methods in Missouri and California to be unconstitutional. During the year, 53 convicts (including 24 in Texas) were executed, down from 98 in 1999. Countering a trend, Wisconsin voters approved a nonbinding referendum to restore the death penalty.

Health, Welfare.
      Massachusetts and Vermont approved innovative strategies for achieving universal health care coverage. The Massachusetts law provided subsidies for purchase of health insurance and levied fines on employers who failed to provide insurance to employees. Vermont's plan required private insurers to offer coverage to all, overseen by a new state board. Though California's legislature approved what would have been the nation's first publicly financed universal health care system, the measure was vetoed by Gov. Arnold Schwarzenegger, who signed a bill that pressured drug makers to negotiate discounts or risk losing contracts under the state's medical system.

      Following a contentious campaign, voters in Missouri, by 51–49%, granted legal protections to researchers studying embryonic stem cells. Antiabortion groups opposed destruction of embryos, and the referendum was closely watched nationally as an important political test on a divisive subject. Seven states endorsed stem cell research measures beyond limits established by the federal government.

      New statewide bans on smoking in public places were approved in Arkansas, Arizona, Colorado, Hawaii, Louisiana, Nevada, New Jersey, Ohio, and Pennsylvania, bringing to 21 the number of states prohibiting tobacco use in public places. Illinois, Massachusetts, and New Hampshire joined New York, Vermont, and California in requiring all cigarettes to be “fire safe”—to extinguish themselves if left unattended. The laws in Illinois and Massachusetts would become effective in 2008.

Environment, Education.
      Amid further complaints about federal inaction, California enacted the nation's first significant measure designed to combat global warming. The controversial measure ordered that greenhouse-gas emissions in the state be cut by 25% by 2020 through a cap-and-trade system. Washington became the first state to ban phosphates in residential dishwashing detergent.

      As Congress prepared to reauthorize the landmark 2001 No Child Left Behind education act, states continued to wrestle with federal mandates, including testing and accountability requirements. A series of federal waivers granted to state officials markedly reduced intergovernmental conflicts over the act. Illinois effectively created the first statewide preschool program, open to children three and four years old. In a controversial move that raised the spectre of resegregating classrooms, Nebraska divided Omaha into three racially distinct school districts for the purpose of restoring local control of education.

Regulation.
      Reacting to federal inaction on the minimum wage, frozen since 1997, legislators in 11 states and voters in 6 more approved increases in state minimum-wage rates. Four states provided for automatic increases with inflation. At year's end 29 states mandated, or were set to mandate, rates above the $5.15 federal minimum.

      A nationwide campaign headed by union activists against Wal-Mart, the largest American retailer, created turmoil and legislative proposals in numerous states. Maryland's legislature overrode a gubernatorial veto and mandated that Wal-Mart increase employee health care benefits, but a federal court later overturned the law.

      States stepped up protections for private property in the wake of the 2005 Supreme Court Kelo v. City of New London decision, which allowed the government to condemn property, arguably for private purposes. Two dozen additional states limited local eminent domain powers, bringing to 27 the number of states curbing property appropriation over the past two years. In a November referendum Arizona joined Oregon in allowing compensation for property owners subject to government land-use restrictions. Similar initiatives in California, Idaho, and Washington failed by substantial margins, however.

      In an effort to aid victims of identity theft, 26 states allowed such individuals to put a security freeze on their credit reports to inhibit thieves from opening new accounts under their names. West Virginia approved a tough underground coal-mine-safety law following an accident in 2005 that killed nine miners. The measure was a model for a U.S. statute signed into law at midyear. Ohio, Oregon, Rhode Island, and Tennessee outlawed predatory practices by mortgage and payday lenders.

David C. Beckwith

▪ 2006

Introduction
Area:
9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population:
(2005 est.): 296,748,000
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      In 2005, amid world skepticism and domestic opposition, the administration of U.S. Pres. George W. Bush forged ahead with its bold and aggressive response to international terrorism. Progress in pacifying a determined Iraqi insurgency and in establishing capable Iraqi security forces proved far more difficult than expected, however. American deaths in Iraq continued at a rate of nearly three per day. A drumbeat of criticism from a unified Democratic opposition helped tax American patience and weaken Bush's base of support. Even a purring U.S. economy failed to assuage doubters. By the fall of 2005, with more than 60% of Americans disapproving of his job performance and his conduct of the Iraq war, President Bush appeared to be in serious political danger, perhaps lacking the support necessary to continue pursuing his agenda.

War on Terrorism.
      The American-led effort to establish a functioning democracy in Iraq again dominated world news during 2005. A determined resistance, including both Iraqi and foreign fighters, continued incessant bombing, small-arms, and suicide attacks, and U.S. military deaths—846—were only slightly fewer than the 848 recorded in 2004.

 Iraq showed unmistakable signs of progress during the year, starting with a historic January 30 election for the National Assembly, in which 57% of voters turned out. Voter turnout in the October referendum to ratify the constitution was even higher (63%), and a third “purple-finger” election, held on December 15, produced a voter turnout of 70%. (See Iraq, above.)

      Allegations of widespread illegality in the UN's Iraq oil-for-food program in the months leading up to the U.S.-led 2003 invasion produced an independent investigation led by former U.S. Federal Reserve chairman Paul Volcker. The inquiry found “corrosive corruption” at the UN and blamed UN Secretary-General Kofi Annan for mismanagement. The report stated that Saddam Hussein had collected at least $229 million in bribes from a majority of companies involved in the program and that $10 billion in Iraqi oil had been illegally smuggled into adjacent countries. The report showed that French and Russian companies received $23.7 billion in Iraqi contracts from 1996 to 2003, during the period when both countries were strong critics of Iraqi sanctions and ultimately opposed the U.S.-led invasion.

      Even while violence continued in Iraq and Afghanistan, a potent political battle was being waged in the U.S. over the war on terrorism. Democrats continued to hammer at President Bush's decision to invade Iraq, suggesting that his stated fear of Iraq's harbouring weapons of mass destruction had been concocted. The controversy eroded Bush's polling numbers, and by October surveys were finding that the majority of the public believed that the decision to invade Iraq was a mistake.

      After Rep. John Murtha, a Democrat from Pennsylvania, called for U.S. withdrawal from Iraq in November, the public focus turned from President Bush's 2003 decision and his credibility to the future. Murtha's remarks delighted antiwar activists. Polling soon showed, however, that many Americans disagreed with that assessment and believed that the U.S. should stay the course in the war on terrorism. Defense Secretary Donald Rumsfeld announced plans to reduce U.S. troop strength from 160,000 to below 138,000 in early 2006, saying that trained Iraqi security forces would make up the difference. At year's end Bush's approval rating stood at 40%, up five percentage points from a month earlier.

Domestic Policy.
      President Bush laid out an unusually ambitious agenda following his second inauguration. He announced plans to regularize the national system of immigration and border control, which had fallen into disrepair. He promised a revamping of the nation's tax code and offered proposals to reform controversial legal liability procedures covering medical malpractice, class-action lawsuits, and asbestos cases. Finally, as the centrepiece of his 2005 agenda, Bush tackled the “third rail” of American politics, the Social Security retirement system, by suggesting an alteration of the current scheme, in which wage earners effectively fund benefits paid to retired Americans. Instead, Bush proposed that workers be given the opportunity to fund their own private retirement accounts, which they would own.

      Little of Bush's agenda became law. Instead of receding after their 2004 election defeat, congressional Democrats showed unusual unity and organized to stop the administration agenda; they were occasionally joined by key Republicans. Ethics problems sapped the majority party. When the U.S. House's GOP leader, Tom DeLay, was forced to step down after a Texas grand jury indicted him on election-law charges, Republican effectiveness frayed noticeably. The result was the worst political and legislative season of Bush's presidency.

      In early 2005 Bush traveled the country extensively, touting his Social Security proposals to enthusiastic, carefully selected crowds. He claimed that reform was needed to avoid the system's bankruptcy as baby boomers retired and laid claim to system payments. Democratic critics, however, rallied opposition by suggesting that Bush was attempting to “privatize” the system, throwing guaranteed benefits into doubt, and by pointing out that the transition period in Bush's plan would actually require more funding than the current plan. Political support for Bush's program was so anemic that the president never offered specific legislation, and the issue had died by year's end.

      Bush's immigration proposals also met with a storm of criticism from both the left and the right, with the most-heated comments coming from his own party. Instead of amnesty for the estimated 12 million illegal immigrants living within the U.S., Bush proposed establishing a “guest worker” program that would grant them legal status and the opportunity for eventual citizenship. Outraged conservatives said that the Bush plan rewarded illegality and called instead for tighter border security and enforcement of often-ignored immigration statutes. The U.S. House, in a largely symbolic vote before adjourning, approved the establishment of a 1,100-km (700-mi) fence along key portions of the U.S.–Mexico border, and Bush was forced to add border-security language to his proposal for congressional consideration in 2006.

      Congress approved a limited portion of Bush's legal reform, moving many class-action lawsuits from state to federal courts, which had historically been less receptive to innovative claims from plaintiffs' lawyers. No progress was made, however, on administration proposals to reform the tax system, asbestos litigation, or medical malpractice lawsuits.

      Some significant legislation passed Congress, but little of it met with Bush's full approval. After nearly a decade under consideration, a bankruptcy-reform bill was signed into law; supporters claimed that by requiring more overextended debtors to adopt a long-term repayment plan instead of having their debts discharged, the measure would reduce credit abuse. Another long-stalled measure, a national energy bill, was approved amid claims that it mostly benefited highly profitable energy companies. Moderate Republicans joined most Democrats to strip from the bill an administration-backed provision allowing energy exploration in the Arctic National Wildlife Refuge.

      After promising to veto any highway-construction legislation that exceeded $256 billion over five years, the president in August signed a $286 billion measure that contained a record 6,371 congressional “earmarks”—special provisions that individual senators and representatives had inserted for pet projects. One earmark inserted by powerful Alaska legislators was funding for a $223 million bridge from Ketchikan (pop. 8,000) to Gravina Island (pop. 50), currently served by an efficient ferry. After a nationwide protest, the bridge spending was rescinded, but Alaska authorities were allowed to take control of the funds for use on any project—including a Gravina bridge. Despite taxpayer group complaints over excessive spending by Congress, Bush completed his fifth consecutive year in office without casting his first veto.

 Legislative setbacks were almost directly tied to public antipathy over Bush's handling of the Iraq war. As violence continued and U.S. casualties mounted, Democrats concentrated on Bush's credibility, suggesting that he had deliberately misled the country about the threat of Iraqi weapons of mass destruction, never found following the Iraq invasion. When Bush spent his usual August recess month at his ranch in Crawford, Texas, he was dogged by Cindy Sheehan, the antiwar mother of a slain U.S. serviceman, who attracted daily news media attention as she demanded to meet with Bush. He declined.

      Bush's poll ratings, adversely affected by growing public impatience over Iraq, declined even further when government authorities proved incapable of dealing promptly with the fallout from Hurricane Katrina, a major disaster that devastated parts of Louisiana, Alabama, Mississippi, and Florida. Bush eventually took responsibility for the failed federal effort and promised a broad rebuilding package that some experts thought would reach $200 billion. Louisiana's congressional delegation proposed federal aid for that state alone that exceeded $250 billion. By year's end Congress had set aside about $64 billion for storm relief. (See Economic Affairs: Special Report: Preparing for Emergencies.)

 Republicans were hard hit by a series of scandals. Shortly after DeLay was indicted, Senate Majority Leader Bill Frist revealed that he was being investigated by two federal agencies for having sold stock in a hospital company controlled by his family, shortly before bad news drove its stock price down. A long-running special counsel investigation into the 2003 naming of an undercover CIA operative by Washington columnist Robert Novak culminated in the indictment of a top White House aide. Lewis (“Scooter”) Libby, chief of staff to Vice Pres. Dick Cheney, was indicted for lying to Special Prosecutor Patrick J. Fitzgerald (see Biographies) before a grand jury; Libby immediately resigned. Fitzgerald's probe continued into 2006. In a development that threatened to expose corrupt fund-raising and trading of favours on Capitol Hill, federal investigators in November obtained a guilty plea on a conspiracy charge and $19.7 million in restitution from Michael Scanlon, a former DeLay aide. Scanlon promised to testify against another grand jury target, lobbyist Jack Abramoff, over alleged bilking of Indian tribe clients whom they represented on gambling issues.

 A long-running dispute over confirmation of federal appellate judges was at least partially resolved during the summer, with a “Gang of 14” centrist senators, 7 from each party, agreeing to a compromise that seated eight contested Bush nominees. The agreement came just before two seats opened on the U.S. Supreme Court, one caused by the death of Chief Justice William Rehnquist (see Obituaries) and the other by the retirement of Justice Sandra Day O'Connor. Under terms of the agreement forbidding filibusters except in “exceptional circumstances,” a Washington, D.C., judge, John Roberts (see Biographies), was quickly confirmed as chief justice. Bush suffered another setback when his choice to replace O'Connor—Bush confidant and White House counsel Harriet Miers—was judged unacceptable by conservative activists and withdrew. Bush then nominated New Jersey appellate judge Samuel Alito, whose confirmation was being opposed at year's end by an alliance of liberal interest groups.

      The administration suffered a final setback in December when Congress attempted to renew expiring portions of the 2001 USA PATRIOT Act designed to update law-enforcement tools against terrorism. After House and Senate conferees approved a compromise extension, a bipartisan coalition of senators refused to sign off, with four key Republicans claiming that the renewal potentially infringed on civil liberties. As the vote approached, the New York Times published details of a National Security Agency eavesdropping program on international calls; although technically unrelated, the article reinforced fears about the PATRIOT Act's reach. After applying political pressure by threatening to veto any temporary extension, President Bush in late December signed a mere five-week extension.

The Economy.
      On paper the U.S. economy enjoyed a banner 2005, shaking off natural disasters and spiking energy prices and growing at a robust 3.5% rate for the third consecutive year. Nearly two million new jobs were created, and the nation's unemployment rate fell from 5.4% to 4.9%. Interest rates and inflation, while rising modestly, remained at historically low levels. Labour productivity rose for a fifth consecutive year.

      The economic performance was particularly impressive in the third quarter as Hurricanes Katrina and Rita devastated the Gulf Coast region. The storms eliminated 600,000 jobs, disrupted shipping traffic, and shut down refining and energy infrastructure, sending gasoline prices nationwide temporarily over $3 per gallon. Relief from the federal government and from private insurers helped to jump-start rebuilding efforts, and the national economy grew by a healthy 4.1% during the August–October period.

      As the U.S. again provided its traditional economic leadership among industrialized nations, however, there were disquieting signs of excess. The U.S. trade deficit, which had hit a record $618 billion in 2004, topped $700 billion in 2005.

      As the U.S. economy expanded, the Federal Reserve pursued its 18-month policy of nudging short-term interest rates higher to combat anticipated inflation. The key federal funds rate was boosted by 0.25 percentage point on eight occasions during the year, to 4.25%, up from 1% in early 2004. U.S. consumer price inflation, pushed by rising fossil-fuel prices, rose more than 4% for the year, but core inflation (excluding food and energy) remained at modest levels, just over 2%. The gradual interest-rate rise finally contributed at year's end to a cooling of an extended boom in housing construction, sales, and refinancing. Meanwhile, property values in some major urban areas had doubled over the previous five years.

      In another cautionary sign, the solid economic growth failed to impress major equity markets. Stock averages dipped during the spring, recovered later in the year, but ended 2005 with only slight gains. Overall, smaller companies outperformed major firms. Most broad market gauges rose less than 5%, and the Dow Jones Industrial Average actually dropped by nearly 0.5% for the year.

Foreign Policy.
      As the year began, the U.S., Japan, India, and Australia led the world's humanitarian response to the December 2004 tsunami disaster in the Indian Ocean, which claimed an estimated 212,000 lives. U.S. Navy helicopter carriers arrived off Aceh, Indon., only five days after the devastation and were particularly effective in preventing additional disease and hardship by delivering fresh water, medical care and supplies, food, and other relief. The U.S. allocated about $1 billion in official aid, and private U.S. citizens donated another $700 million to the relief effort. The U.S. also provided significant aid when a cataclysmic earthquake struck Kashmir on Oct. 8, 2005, killing more than 87,000 people. (See Pakistan: Sidebar, above.)

      In his second inaugural address, President Bush ambitiously pledged to end tyranny around the globe and spread liberty and freedom “to the darkest corners of the world.” As he spoke, the U.S. was fully extended, financially and militarily, in Iraq and Afghanistan, arguably doing what Bush promised, but the strenuous effort seriously hampered U.S. ability to deliver further on Bush's goal.

      Even so, the administration could point to numerous advances in self-government, human rights, and democracy worldwide, all encouraged by U.S. policy. The breakthroughs included Syria's withdrawal from Lebanon, political progress by women in Muslim countries such as Kuwait and Saudi Arabia, advances toward free elections in Egypt and Liberia, and the historic seating of the first democratic national parliament in Afghanistan. The scheduled Palestinian vote, in addition to Israel's unilateral withdrawal from the occupied Gaza Strip, provided a glimmer of hope for that region.

      International efforts to stop persistent rogue nuclear-weapons-development programs in Iran and North Korea went nowhere during 2005. President Bush had dubbed both countries, with Iraq, “the axis of evil” in 2002, in part because of their nuclear ambitions. With allied military efforts overextended in Iraq and Afghanistan, the U.S. was forced to rely on diplomacy to bring pressure on North Korea and Iran.

      When six-nation talks were belatedly resumed in Beijing in July, North Korea agreed to curb its nuclear program and return to international safeguards provided that it received trade concessions, economic assistance, and security guarantees. Within days, however, the apparent deal broke down as the North Koreans demanded renewed assistance on two substitute light-water reactors, and the U.S. publicly accused North Korea of counterfeiting currency and assisting illegal nuclear proliferation. Pyongyang repudiated its concessions and claimed openly that it had already manufactured several atomic weapons in apparent violation of international law.

      Iran successfully stalled ongoing efforts by France, Great Britain, and Germany to negotiate an end to an illegal enrichment plan. The U.S. favoured a hard-line approach, threatening to seek economic sanctions against Iran at the UN Security Council, but did not press the issue because Russia and China, both with veto power over UN sanctions, opposed the move. At year's end, in an effort to break the impasse, Russia offered to host Iran's enrichment efforts and ensure that the uranium would be used only for energy production.

      U.S. relations with the United Nations, never smooth, suffered through an especially tumultuous year. As details of bribery and corruption in the UN's Iraq oil-for-food program came to light, the Bush administration appointed a vocal UN critic, conservative John Bolton (see Biographies), as U.S. ambassador, over substantial U.S. Senate opposition. Bolton arrived at UN headquarters in August and immediately began pushing for significant reforms in transparency and efficiency. At one point Bolton unsuccessfully sought postponement of the UN budget until the management, finance, and appointment changes enacted at a September UN summit had been approved by the General Assembly.

      With China rapidly emerging as a world economic and military power, U.S. policy makers attempted to find a delicate balance in bilateral relations that were superficially correct but laden with serious tensions just below the surface. As the country's trade deficit with China topped a record $200 billion, its options were narrow in pursuing complaints about Chinese currency manipulation, political suppression, DVD and computer software piracy, and arms exports. The U.S. forged historically strong ties with Japan, Pakistan, and especially India in an attempt to counter steadily increasing Chinese influence all over Asia.

      As a wave of populism swept across Latin America, U.S. policy suffered several setbacks. President Bush's attempt to expand a free-trade zone was rejected by major South American countries at a November Western Hemisphere summit in Buenos Aires. A vocal critic of the U.S., Pres. Hugo Chávez (see Biographies) of oil-rich Venezuela, continued to taunt the U.S.; to highlight U.S. internal problems, he sent subsidized heating oil to low-income families in Boston and New York City. A Chávez admirer, Evo Morales, was elected president of Bolivia after promising to defy U.S. antidrug objections and facilitate coca-leaf production.

David C. Beckwith

Developments in the States

Party Strengths.
      An often-difficult relationship with the federal government marked 2005 for the 50 U.S. states; differences over funding, power, and responsibility frequently roiled the federalism partnership. State officials stepped up complaints over unfunded federal mandates and U.S. preemption of authority over traditional state powers. Uneven state/federal response to major natural disasters created major news, but the differences extended to numerous additional areas, including education, health care, and economic development. Meanwhile, the national economic recovery allowed states to restore some services that had been cut in previous years and prompted setbacks for antitax activists. All 50 states held regular legislative sessions during the year, and 24 of them staged special sessions on matters ranging from hurricane relief to school finance.

 Democrats fared well in limited 2005 state elections, capturing a handful of legislative seats and retaining governorships in Virginia and New Jersey. The partisan gubernatorial lineup across the country was therefore maintained at 28 Republicans and 22 Democrats. State legislatures remained at virtual parity between the parties nationwide. Republicans would enter 2006 with two-house control of 20 states, Democrats with 19, and the two parties would split legislative authority in 10 states, all unchanged from 2005. Nebraska had a nonpartisan unicameral legislature.

Structures, Powers.
      Voters decided a record 18 citizen initiatives during off-year elections and rejected 16 of them. A recent trend toward limiting state spending, pushed by low-tax advocates, stalled during the year as states recovered from a national economic downturn.

      Voters in California and Ohio decisively rejected proposals to shift contentious legislative redistricting authority away from the state legislature. The California initiative would have turned redistricting over to a panel of retired judges, while Ohio's measure would have substituted a nonpartisan citizen commission.

      New Jersey became the 43rd state to establish the office of lieutenant governor, with power to succeed when the governorship became vacant. In 2004 when that state's governor resigned, the job had devolved to the state Senate president, who simultaneously served as acting governor and as a legislator. New York voters rejected a proposal to overhaul the state's chronically tardy budget process; the measure would have shifted significant budget responsibility from the governor to the legislature. Washington voters approved an initiative requiring periodic audits of local governments.

      In a late-night July vote, the Pennsylvania legislature approved a pay raise for legislators and judges without public notice or comment. Although no legislative elections were scheduled, the resulting public furor resulted in one state Supreme Court justice's losing his position in November balloting—the first judicial rejection in state history. The pay raise was rescinded later that month.

      Alabama, Delaware, and Texas approved new laws restricting eminent domain powers of local officials. The laws were approved after a divided U.S. Supreme Court, in the controversial Kelo v. City of New London (Conn.) decision, affirmed that local governments could condemn and seize private property to make way for commercial development that paid higher taxes. (See Law, Crime, and Law Enforcement: Court Decisions.)

Government Relations.
      Arguments over allocation of power between state and federal governments were front-page news during most of 2005. With fallout from Hurricane Katrina the most glaring example, state officials struggled to maintain productive relationships—and their traditional lines of authority in the U.S. system of federalism—during often-contentious dealings with Washington. Some state officials claimed that the federal government was neglecting its responsibility in vital areas, such as curbing global warming, lowering the prices of costly drugs, and funding stem-cell research. In other instances states asserted that federal authorities were not providing resources to pay for mandates that they imposed on the states. The National Conference of State Legislatures claimed that over a two-year period it had identified $51 billion in largely uncompensated annual costs that states incurred as a result of federal mandates, not including the additional mandates that were on the drawing board. The officials also complained about increased federal preemption of state power to regulate health care, land use, technology, and other programs.

      In May Congress approved the REAL ID Act, which set rigorous national standards for documents needed to obtain a driver's license. The new law effectively prohibited licenses for undocumented aliens, which a dozen states allowed. The law mandated costly new documentation requirements without providing any funding for state compliance.

 After Hurricane Katrina swept over Louisiana, Mississippi, Florida, and Alabama in late August, the devastation was exacerbated by arguments over responsibility for rescue, relief, and rebuilding. Disaster planning had traditionally been the purview of states, but the federal government had taken a steadily expanding role in recent years, blurring lines of authority and responsibility. With news media accounts blaming FEMA (the Federal Emergency Management Agency) for delays in providing relief services and supplies, federal officials made ill-disguised attempts to take control. Officials in Louisiana, Florida, and other affected states pushed back—even while demanding that the U.S. government pay for virtually all rebuilding efforts. The year ended in an uneasy truce, with lines of authority and responsibility remaining largely undefined. (See Economic Affairs: Special Report: Preparing for Emergencies.)

      Pennsylvania, Connecticut, and Illinois sued the U.S. government in an attempt to save Air National Guard aircraft from being transferred to other states during the federal government's periodic Base Realignment and Closure procedure. National Guard units were controlled by state governors during peacetime but were susceptible to federal call-up in time of war. State officials also threatened lawsuits over provisions of the 2005 national energy bill that granted the federal government broad authority over the siting of liquefied natural gas ports and power lines.

Finances.
      States completed their recovery from the 2001–03 economic downturn during the year. An expanding economy generated revenue beyond projections, outpaced increased outlays for programs such as Medicaid, and allowed states to replenish “rainy day” reserve funds that had been tapped in previous years. Legislatures avoided significant tax changes. Several states produced large surpluses, notably California, which boasted $3.4 billion of black ink and its first surplus since 2000. The year saw only a modest overall increase in state taxes, and a majority of the states were preparing for tax reductions in 2006.

      As fiscal restrictions eased, many states increased spending on both K–12 and higher education, which had been targeted for unpopular reductions in previous years. Often by tightening eligibility and reducing some benefits, states managed to slow the growth of Medicaid spending from its nearly 15% increase in 2004. Tennessee, for example, started trimming 190,000 recipients from its generous TennCare program. State expenditures on correctional facilities increased, but also at a slower rate, as a 10-year prison expansion stalled. Hurricane-battered Louisiana was forced to make major reductions across the board in state expenditures.

      Ohio was the only state to increase overall taxes significantly, enacting a new commercial-activities tax and boosting both sales and tobacco taxes. Idaho, Iowa, and Virginia approved modest tax reductions. Seven states increased cigarette taxes, and most states increased fees for motor vehicles, driver's licenses, court costs, and other state services.

      Efforts to curb state spending suffered setbacks in several state elections. In a significant defeat for antitax enthusiasts, Colorado voters approved a suspension of a landmark 1992 Taxpayer Bill of Rights law that limited revenue increases to population growth plus inflation. Though the law had returned more than $3 billion in refunds to state taxpayers, it had also shrunk state government relative to the state's economy and crimped state education and highway funding. The Colorado law had been eyed as a model by several other state legislatures.

 California voters rejected an initiative backed by Gov. Arnold Schwarzenegger that would have capped state spending and given additional budget authority to the governor. Washington voters turned down a spending limit and refused to overturn a 9.5-cent gasoline-tax increase approved by the state legislature.

Marriage, Gay Rights.
      Activists seeking equal marital and other rights for homosexuals made additional progress during the year in the aftermath of a 2003 Massachusetts high-court decision that legalized gay marriage. Maneuvering to exploit or blunt the ruling's effect accelerated in courts, in legislatures, and at the ballot box across the country. Voters in two additional states, Kansas and Texas, overwhelmingly approved state constitutional amendments banning recognition of same-sex unions, bringing to 19 the number of states that rejected gay marriage in their basic state documents.

      Equal-rights advocates also made breakthroughs, however. Connecticut's legislature voluntarily joined Vermont in recognizing same-sex civil unions. A similar measure, approved by the Maryland legislature, was vetoed by the state's governor. The Alaska Supreme Court ordered state and local governments to grant the same benefits to employees' same-sex partners as those offered to spouses. A federal judge in Nebraska added a new wrinkle to the debate by striking down that state's prohibition of same-sex marriage; the ruling said that state law went impermissibly beyond regulating marriage and denied gay couples fundamental rights guaranteed by the U.S. Constitution. California lawmakers failed in an attempt to recognize same-sex marriage. Their bill, the first approved by a state legislature without a court order, passed even though California voters had rejected the concept in a 2000 statewide referendum; Governor Schwarzenegger vetoed it, however, saying that he preferred that the state Supreme Court decide the matter. Maine voters rejected a measure that would have overturned a legislature-approved state law banning discrimination against homosexuals in housing, employment, and education.

Ethics.
      Ohio Gov. Robert Taft pleaded no contest to four misdemeanour counts of violating state ethics laws by failing to report golf outings and other gifts. Taft, a Republican, was found guilty and fined $4,000. The ethics probe began after it was discovered that an Ohio Republican fund-raiser had lost more than $10 million of the $50 million of state money that he had invested in rare coins.

Law, Justice.
      Continuing a recent trend, states including California, Montana, and New Hampshire toughened laws governing sex crimes against children. Iowa's new law was particularly dramatic, mandating life imprisonment for a second serious offense.

       Arkansas, Nevada, North Dakota, and Texas joined California in prohibiting government use of data from chip-recording devices that were contained in most new cars. South Dakota authorities had used information from the chip—which recorded speed, brake and seat-belt use, and other data recoverable after a crash—to convict Gov. William Janklow of vehicular homicide in 2003. The new state laws required an owner's permission or a court order before insurers or law-enforcement personnel could access the data.

Health, Welfare.
      Continuing a recent trend, 11 states approved new laws that further restricted abortion. Mississippi, a state with only one abortion clinic, required that an abortion be done in a hospital or surgical centre in cases in which the pregnancy had exceeded three months. Arkansas, Florida, and Idaho approved new laws requiring consent of a parent or guardian before a minor could receive an abortion. California voters, however, rejected a similar law. Though 35 states now required parental involvement for abortions obtained by minors, courts had struck down such laws in 9 additional states. Georgia mandated a 24-hour waiting period for most abortions; Indiana required doctors to offer ultrasound images to prospective abortion seekers; and Arkansas ordered that women seeking abortions after the 20th week of pregnancy receive mandatory counseling on the possibility of fetal pain during the procedure.

      States were badly split on their approach to the “morning-after” pill to prevent pregnancies. New Hampshire and Massachusetts became the seventh and eighth states to allow purchase of the pill without a prescription; Bay State legislators had to override a gubernatorial veto to do so. New York Gov. George Pataki vetoed a similar bill. Some pharmacists balked at dispensing the drug, but Illinois Gov. Rod Blagojevich and the California legislature enacted measures that required pharmacies that sold birth-control pills to stock the morning-after pill as well. Mississippi joined Arkansas, Georgia, and South Dakota in giving pharmacists the right to refuse to dispense the pill.

      State relationships with federal authorities on health care were uneven at best. The federal government's 2003 reform of Medicare included a new prescription-drug benefit that was initially expected to save significant state funds. Congress imposed a last-minute “clawback” provision, however, that required offsetting state payments, and nearly 30 states instead projected increased costs from the program. The U.S. Supreme Court, in a major blow to states' rights, declared that laws in California and 10 other states that allowed the medical use of marijuana had to give way to federal antidrug enforcement laws.

Education.
      A grassroots rebellion over federal mandates for K–12 schools simmered in numerous states throughout the year, despite Washington's efforts to accommodate complaints. Critics charged that the No Child Left Behind (NCLB) Act and the Individuals with Disabilities Education Act (IDEA) were excessively costly and underfunded and usurped traditional local and state control of public schools. Connecticut and Michigan filed unsuccessful legal challenges to require full NCLB reimbursement from the federal government; states estimated that the unfunded mandates would cost $18 billion annually. Utah's legislature allowed school districts to ignore NCLB requirements that necessitated state financing or conflicted with state test guidelines. The Texas education commissioner declared that the state would ignore NCLB guidelines on testing special-education students.

      Federal officials attempted to mollify state critics by granting increased flexibility. The U.S. Department of Education announced that up to 10 states would be allowed to use a “growth-based” NCLB assessment scheme similar to Utah's testing regimen.

      Texas became the first state to require public schools to spend 65% of funding on classroom expenses. The gubernatorial mandate came after the state legislature had turned down the proposal. Legislatures in Kansas and Louisiana also approved measures that encouraged the “65-cent solution.” The proposal, which was aimed at reducing administrative spending, also affected funding for school buses, counselors, libraries, and ancillary educational services. Support for another reform idea, school vouchers, remained sluggish. Utah joined Florida in enacting a statewide voucher program but limited its application to special-education students.

Consumer Protection.
 Georgia and Washington approved tough statewide smoking bans, bringing to 13 the number of states that prohibited smoking in most public areas. The Washington ballot initiative was particularly sweeping; it outlawed smoking in all public buildings and workplaces, including private clubs, and even lighting up within 7.6 m (25 ft) of doorways, windows, and air vents of public buildings. New York, in an attempt to protect students, prohibited the “unrestricted marketing” of credit cards on college campuses. Georgia declared the sending of multiple unsolicited “spam” e-mails—10,000 in a month or 1,000,000 in a year—to be a felony punishable by up to five years in prison.

David C. Beckwith

▪ 2005

Introduction
Area:
9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2004 est.): 293,850,000
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      For a third consecutive year, the strategic response to the Sept. 11, 2001, terrorist attacks by the administration of Pres. George W. Bush (see Biographies (Bush, George W.)) dominated world affairs. The U.S. plan included two highly controversial initiatives—a proclaimed right of preemptive attack, to forestall perceived threats against U.S. interests, and a long-term objective of exporting democracy worldwide, to bring human rights to such areas as Afghanistan and Iraq, which had previously known mainly tyranny and despotism.

      The administration's initiatives caused deep divisions abroad. Support came from the U.K., Australia, and emerging Eastern Europe, but other nations voiced strong opposition and resentment. At home the body politic was also split, and President Bush's foreign policies, particularly the occupation and rehabilitation of Iraq, became the central issue in the 2004 national elections.

      Costs of the Iraq intervention continued to mount during the year. At times the U.S.-led effort appeared greatly overextended, putting unsustainable strain on U.S. resources, even the well-functioning U.S. economy. Domestic critics were unable to put forward an attractive alternative path as, in one sense, the November election became a referendum on the Bush terrorism strategy. In a high turnout of more than 60% of U.S. voters, Bush won reelection by a relatively narrow margin, 51–48%. (See Special Report (U.S. Election of 2004).)

War on Terrorism.
      The Bush administration could point to substantial progress in Iraq, from construction and infrastructure rebuilding to election preparations, but the U.S. was again on the defensive for most of 2004. Pentagon officials reported that 848 Americans died in Iraq during the year, and another 6,000 were wounded, a casualty rate nearly twice that of 2003, a year that included the military invasion that toppled Saddam Hussein.

      Early in 2004, in an assessment that cast a pall over the administration's rationale for the war, former U.S. arms inspector David Kay reported that allied prewar intelligence on Iraqi weapons of mass destruction was “almost all wrong.” Under pressure, President Bush had reluctantly agreed to the appointment of a bipartisan commission to study the 9/11 attacks and their aftermath. The commission, headed by Republican former New Jersey governor Thomas Kean, proved activist and highly critical, and its periodic public hearings and reports roiled the domestic political landscape through the year.

      In late March, as the U.S.-dominated occupation attempted to prepare Iraq for elections and a handover to Iraqi control, authorities in Baghdad closed down a newspaper controlled by Muqtada al-Sadr, a militant Shiʿite cleric. A few days later four U.S. security contractors were ambushed and killed while driving in Fallujah, a city controlled by Islamic militants, and their bodies were publicly defiled. Militia forces loyal to Sadr then launched coordinated attacks in five Iraqi cities. The rebellion was particularly disheartening because Shiʿites, who had long been suppressed, were seen as the major beneficiaries of the transition to democracy.

      Allied forces eventually decimated the militia, retook several cities, and, with the tacit backing of a more senior Shiʿite cleric, Ayatollah Ali al-Sistani (see Biographies (Sistani, Ali al-)), arranged a cease-fire with Sadr. Allied plans to pacify Fallujah, the apparent heart of the opposition, proved highly divisive, however, prompting the resignation of two Iraqi Governing Council members. In a controversial step, the U.S. then postponed a planned major offensive on Fallujah.

      In late April photographs showing apparent U.S. military abuse of detainees at the notorious Abu Ghraib prison in Baghdad began circulating on the Internet, setting off a firestorm of criticism around the world against the U.S. occupation. The photos, taken by fellow soldiers, became key to a dozen investigations, including inquiries by both houses of Congress. Seven U.S. military personnel, most of them low-ranking, were prosecuted on abuse charges. One report called the Abu Ghraib abuse the result of “fundamental failures” in military oversight, but claims by some critics that the abuse stemmed from official U.S. policy, approved by Bush appointees, were never proved. (See Military Affairs: Special Report (POWs and the Global War on Terrorism).)

      Coalition authorities handed over nominal control of Iraq on June 28, two days ahead of schedule, to an Iraqi interim government headed by Prime Minister Ayad Allawi (see Biographies (Allawi, Ayad)), a neurosurgeon allied with the U.S. Under the unusual arrangement, U.S. forces continued to lead security operations but operated technically under Iraqi supervision. The arrangement proved workable but did little to slow a continuing, apparently growing guerrilla insurgency, especially in Sunni areas.

      In early September, in a tacit acknowledgement of ongoing problems, the Bush administration asked Congress to reprogram funds designated for Iraqi reconstruction and shift $3.5 billion to law-enforcement and security accounts. At that point, largely owing to dangerous conditions, only 6% of the $18.4 billion appropriated in 2003 for rebuilding had actually been spent.

      Less than a week after the U.S. election, some 10,000 U.S. troops surrounded Fallujah and began a house-to-house campaign to uproot heavily armed insurgents. The assault took little more than a week to overrun the rebel area, and authorities announced that some 1,600 suspected insurgents had been killed, but most resistance leaders escaped the allied dragnet.

      Bombings, surprise attacks, and even frontal military assaults continued at a high level through the end of 2004. U.S. authorities, under continuing criticism for failing to supply adequate troop strength and supplies, including body and vehicle armour, announced plans to increase the U.S. presence to 150,000 in early 2005.

Domestic Policy.
      In 2004 numerous bills bogged down in partisan wrangling as both political parties maneuvered for electoral advantage, and congressional productivity was light.

       Democrats continued to throw up roadblocks to Bush appellate court nominees deemed excessively conservative, preventing 10 of 34 named by Bush during his first term from gaining an up-or-down vote on the Senate floor. The gridlock became an issue in the fall elections, with Senate Majority Leader Bill Frist, in a break from tradition, traveling in May to South Dakota, the home state of Sen. Tom Daschle, his Democratic counterpart, to campaign for Daschle's GOP opponent. Daschle was defeated. Following the election, Republican Sen. Arlen Specter of Pennsylvania, slated to become chairman of the Senate Judiciary Committee, seemed to warn President Bush in an interview against nominating antiabortion judges; following a storm of protest that reached his Senate colleagues, Specter withdrew his statement.

      With few exceptions, only relatively minor legislation was approved prior to November. One significant election-eve law awarded $140 billion in tax relief to U.S. business, including a $10 billion buyout for tobacco growers. Another bill temporarily extended four middle-class tax cuts previously won by the Bush administration but scheduled to expire, including the $1,000-per-child tax credit, expansion of the lowest (10%) tax bracket, exemptions from the alternative minimum tax, and relief from the so-called marriage penalty for two-income families.

      Reacting to increased abuse in the computer age, Congress increased penalties for identity theft, a growing source of fraud. At the urging of the Bush administration, and over objections of abortion rights advocates, Congress also specified that an individual alleged to have committed a violent crime against a pregnant woman could also be charged with a second offense, against the unborn child.

      Four hurricanes—Charley, Frances, Ivan, and Jeanne—rolled over Florida, a hotly contested presidential battleground state, during a six-week period in the fall, causing an estimated $50 billion in property damage. Congress responded with a $2 billion disaster-relief appropriation for the Federal Emergency Management Agency, followed later by another $11 billion in hurricane aid.

      As Massachusetts became the first state to legalize same-sex marriage, Congress struggled to fashion a federal legislative response. A proposed U.S. constitutional amendment defining marriage as only between a man and a woman went nowhere; the House approved the measure by only 227–186, less than the two-thirds required, and the Senate, by 48–50, failed even to gain sufficient votes to stop debate on the measure. The House pursued an alternative idea, approving a measure to prohibit federal courts from hearing challenges to the 1996 Defense of Marriage Act. The Senate, however, never took up the bill. (See Law, Crime, and Law Enforcement: Special Report (Legal Debate over Same-Sex Marriages).)

      Numerous congressional bills died or were postponed, including ones regarding bankruptcy reform, the banning of assault weapons, welfare reform, asbestos lawsuits, class-action and medical-malpractice legislation, and increased funding for federal highway construction.

      Congress adjourned in early October without having made major changes to the highly decentralized U.S. intelligence structure. Pressure generated by the 9/11 commission, however, helped prompt a congressional lame-duck session in early December. The result was a bipartisan reorganization of national intelligence operations under a single director, along with new surveillance and antiterrorism powers for the new agency.

The Economy.
      World turmoil affected the nation's domestic business climate but failed to stop a continued expansion of the resilient U.S. economy. Dramatically higher oil prices put a damper on otherwise strong U.S. economic growth. The U.S., spending heavily at home and abroad, resumed its place as the world's main economic engine in 2004, at least temporarily shrugging off heavy costs associated with homeland security and the war on terrorism and finally reversing a decline in employment that had started with the 2001 recession.

      As the year began, the economy was growing at a robust pace. Expansion was stimulated by federal tax cuts and outlays from a record federal budget deficit and aided by low interest rates, modest inflation, and oil selling for $32.50 per barrel. Energy supplies, however, tightened under demand pressure from growing economies worldwide, especially in China. The growing insurgency in Iraq threatened supplies, as did less-violent uncertainty during the year in other major petroleum-producing countries, including Saudi Arabia, Russia, Nigeria, and Venezuela. By late October oil topped $55 per barrel, which acted as a major drain on the U.S. economy and helped turn what might have been an extraordinary economic year into a mere solid one.

      The U.S. GDP grew by 4.5% in the first quarter and readily topped 3.5% for the remainder of the year. The Federal Reserve Board increased historically low short-term interest rates by a modest 0.25% on five separate occasions, ending the year at 2.25%. The consumer price index rose by more than 3.5% for the year, higher than in recent years, but nearly half of that increase was attributable to higher energy prices.

      The national prosperity was fueled in part by unprecedented and disquieting red ink. The 2004 federal budget deficit, swollen by war, homeland security, and tax-cut measures, was $422 billion, less than forecast early in the year but easily topping the previous record 2003 deficit of $377 billion. U.S. imports of petroleum and Asian consumer goods paced trade deficits that exceeded $50 billion a month through the year, another record pace. The weight of both deficits helped drive down the value of the U.S. dollar, a drop that accelerated after the November elections. The dollar finished the year at a historic low against the euro.

       Unemployment drifted lower during 2004, from 5.7% to 5.4%. About two million new jobs were created in the U.S. during the year, a creditable performance but not sufficient to offset fully the jobs lost during the recession. In addition, jobs were being “offshored” to countries that had lower labour costs. (See Economic Affairs: Special Report (Offshoring).)

      The nation's equity markets followed a major bounce back in 2003 with a solid, if unspectacular, upward move in 2004. Broad indicators demonstrated that overall, share prices rose nearly 10% during the year, but some indexes were lower. The Dow Jones Industrial Average started the year above 10,400, but energy price increases and election uncertainty caused a sell-off to 9,750 in late October. With election jitters settled, the Dow started a year-end rally and finished at 10,783, a gain of 3%.

      Business news was dominated by continued fallout from 2001–02 corporate scandals. Two onetime business titans, Kenneth Lay of Enron and Bernie Ebbers of WorldCom, were indicted for their roles in accounting irregularities that afflicted their companies. John Rigas, CEO of Adelphia, a major cable company, was convicted on 18 felony counts for misappropriation of corporate funds. Martha Stewart, head of a successful marketing and publishing company carrying her name, was convicted of having lied about stock trades and sentenced to five months' imprisonment. Stewart appealed the decision but began serving the sentence in October at a West Virginia penal facility in hopes of limiting damage to her firm.

      New York Attorney General Eliot Spitzer (see Biographies (Spitzer, Eliot)), who had rocked the mutual-fund industry in 2003 with allegations of after-hours trading and other improprieties, turned his attention to insurance in 2004. In a wide-ranging investigation affecting almost all types of insurance, Spitzer charged two companies with civil fraud for alleged bid rigging and steering of business. At year-end several insurers, while acknowledging problems in their industry, called for Congress to take over regulation of insurance companies from the states.

Foreign Policy.
      With maneuvering ability almost nonexistent, owing to the war in Iraq, and constricted by domestic political considerations, U.S. diplomacy struggled through a dark 2004. Resentment toward perceived U.S. unilateralism coloured relationships with several countries, and despite earnest efforts, only marginal progress was recorded in expanding international participation in Iraq's security and reconstruction. The year saw some bright moments, particularly in nurturing democracy in Afghanistan, Indonesia, and Ukraine, but overall the year was replete with frustrations.

      U.S. attempts to stop Iran's and North Korea's progress in developing nuclear weapons capability met little success. Early in the year Iran reneged on 2003 promises to cease uranium enrichment, a process that can produce either low-grade nuclear fuel or raw material for nuclear weapons. The U.S. pressed the International Atomic Energy Agency for punitive sanctions. The U.K., France, and Germany, however, offered Iran a trade pact with the European Union instead. Iran eventually agreed to a temporary halt in enrichment activities, one that critics said would be meaningless in the country's drive for weapons capability.

      A long-running effort to dismantle North Korean nuclear designs made even less progress during 2004. The U.S. again refused North Korean demands for bilateral negotiations, insisting instead on six-party talks that included Japan, Russia, China, South Korea, North Korea, and the U.S. A June meeting produced no notable result, and North Korea then refused further negotiations, openly suggesting that the U.S. election might produce a new U.S. administration. The talks remained stalled at year's end.

      The brightest chapter in international cooperation came in Afghanistan, which had lacked a democratic tradition. With the assistance of numerous countries, however, Afghans set up a voter-registration system and attracted nearly eight million voters, with substantial participation by previously disenfranchised women. The Afghan success, along with democratic electoral progress in Indonesia and Ukraine, was considered a major accomplishment in the Bush administration's campaign to spread democracy worldwide.

      U.S. relations with Russia deteriorated amid charges that Russian Pres. Vladimir Putin was eroding democratic reforms, confiscating private property, and interfering in the internal affairs of European neighbours. Russia was also suspected of providing assistance to Iran in its nuclear ambitions. U.S. authorities maintained a public facade of cooperation with the Putin regime but expressed private dismay over a variety of Russian actions, including nationalization of the giant Yukos oil company and heavy-handed—and ultimately unsuccessful—attempts to influence the election in Ukraine. (See Ukraine.)

      Bush administration relations with the UN were also superficially correct but deteriorated significantly. The international organization was rocked by scandal, ranging from harassment allegations against ranking officials at the UN headquarters in New York to sexual mistreatment of women and girls by UN peacekeepers in the Democratic Republic of the Congo to culpability in having allowed Saddam Hussein to divert an estimated $21 billion from the “oil for food” program. A Republican-led congressional inquiry into oil for food was largely stonewalled by UN officials, and prominent U.S. legislators publicly called for the resignation of UN Secretary-General Kofi Annan.

      The U.S. also fumed over lack of UN support for Iraq. UN relief officials had largely departed from Iraq in 2003 following a bombing attack on their headquarters and, citing ongoing security concerns, failed to return in 2004. In a notable interview in mid-September, only weeks before U.S. elections, Annan declared the 2003 U.S.-led invasion of Iraq to have been an illegal act, a declaration that Bush officials judged excessively political.

      The UN's largely ineffectual response to humanitarian concerns in the Darfur region of The Sudan was yet another issue. More than 100,000 Darfur residents, mostly non-Arab Muslims, were driven from their homes by government-backed Arab militias, and thousands died. U.S. Secretary of State Colin Powell called the situation “genocide” and facilitated U.S. aid, but UN efforts to stop the ethnic disruption were minimal.

      The tsunami disaster that followed the December 26 earthquake near Sumatra, Indon., also strained U.S.-UN relations. As the magnitude of the disaster began to unfold, the U.S. pledged an initial $15 million to the relief effort, and a ranking UN official labeled donations by wealthy countries as “stingy.” Within hours of the disaster, however, the U.S. began deploying military resources and mounted a major humanitarian-relief campaign to affected areas in conjunction with Australia and Japan, often bypassing the UN relief bureaucracy. The U.S. contributed $350 million to the relief effort, and Americans gave more than $200 million in private funds; donations were still rising at year's end. (See Disasters: Sidebar (Deadliest Tsunami).)

      The long-stalled Middle East peace process appeared close to renewal late in the year with the November death of Palestinian leader Yasir Arafat (see Obituaries (Arafat, Yasir)), whose intransigence and encouragement of violence against Israel were widely blamed for the breakdown of a key 2000 U.S.-sponsored peace accord.

David C. Beckwith

Developments in the states
      A long-awaited economic expansion finally ended a serious budget crisis in U.S. state governments in 2004. The recovery was modest, however; it allowed replenishment of exhausted accounts but little expansion of services. States continued to wrestle with the federal government over education, health care, and prescription-drug reimbursement, among other problems.

Party Strengths.
       Democrats made notable gains in 2004 in state legislative elections, and Republicans appeared to increase their control of governorships. The results left the two parties at virtual parity in state governments nationwide at year's end. In 2005 Republicans would control both state legislative chambers in 20 states, down from 21 in 2004, and Democrats would dominate both bodies in 19 states, up from 17 in 2004. Ten states were split, with neither party organizing both chambers, and Nebraska had a nonpartisan legislature.

      Republicans enjoyed a 28–22 edge in governorships for most of the year. In the November balloting Democrats took away GOP seats in Montana and New Hampshire, but Republicans were awarded previously Democratic governorships in Indiana and Missouri. In Washington, after the closest gubernatorial election in state history, it appeared after the first recount of 2.9 million ballots that Republican Dino Rossi had bested Democrat Christine Gregoire by 42 votes, but the Democrats challenged the results. Following a second recount, Gregoire was declared the winner by 129 votes in December. That left the Republican prospective advantage for 2005 at 28–22.

Structures, Powers.
      An attempt to divide Colorado's presidential votes in the electoral college proportionately, abandoning the winner-take-all system, was soundly defeated in November voting. Citizens in Arkansas and Montana rejected November ballot proposals to relax term-limit laws for state officials. Wyoming's Supreme Court invalidated that state's term-limit law just as it began to take effect. Of the 21 states that had approved term-limit laws in recent years, 6 had seen them thrown out or repealed.

      Numerous states expanded early-voting opportunities, and Missouri, North Dakota, and Utah allowed overseas military personnel to vote by e-mail. South Dakota established a constitutional review commission.

Government Relations.
      State relationships with the federal government, which had always been strained, were tumultuous during 2004, particularly on public-education policy. Congress again extended a ban on state taxation of Internet services, this time until 2008. In another controversial action, a federal ban on the manufacture and sale of certain semiautomatic weapons was allowed to expire; only five states had enacted curbs on so-called assault rifles.

      The U.S. Supreme Court, in a 5–4 decision affecting 13 states, prohibited judges from extending sentences beyond jury verdicts on the basis of aggravating factors not found by the jury. In a bow to seven states that impose no personal income tax, Congress approved a two-year measure to allow optional deduction of sales taxes on federal income-tax returns.

Finances.
      Pressure on state budgets eased markedly in 2004 as the national economy recovered, and this led to an uneventful year for tax legislation. States still faced substantial budget shortfalls, but most were able to balance their books without raising taxes or substantially cutting state spending. With budgets tight, few states expanded social services.

      Only nine states raised taxes during the year. Arkansas and Virginia increased their sales taxes. Alabama, Colorado, Michigan, Oklahoma, and Rhode Island raised their tobacco taxes. Two states boosted personal-income levies on their highest-income taxpayers; California dedicated the added revenue to expanding mental health programs, and New Jersey funded a property-tax-rebate plan. Oregon voters repealed substantial personal and corporate tax increases approved by the 2003 legislature, and legislators in Iowa and New Hampshire reduced state taxes.

      Overall, states began rebuilding “rainy-day” funds and repaying accounts that had been used to steer state budgets through the 2001–03 down cycle. In recent years California, which was particularly hard-hit by the bursting of the dot-com bubble, had accounted for nearly 40% of state budget shortfalls. At the urging of California Gov. Arnold Schwarzenegger, voters approved a $15 billion bond issue early in the year to resolve the immediate crisis. The state worked through the down period by reducing spending (particularly on education), raiding other state funds, and increasing revenue incrementally via a tax-amnesty plan.

      Some 35 legislatures considered bills designed to curb outsourcing of jobs abroad, usually by banning out-of-state or foreign companies from doing state work. Only Tennessee enacted an antioutsourcing law, however, while governors in Maryland and Massachusetts vetoed similar measures. (See Economic Affairs: Special Report (Economic Affairs ).)

Marriage.
      Fallout from the November 2003 Massachusetts Supreme Judicial Court decision making same-sex marriage a state constitutional right created turmoil nationwide throughout the year. Backers of traditional marriage took vigorous steps to overturn the decision and to limit its effect to Massachusetts, with only partial success.

      When the decision became effective on May 17, state officials forestalled a nationwide influx by declaring that only Massachusetts residents were eligible for marriage licenses. The state legislature took initial steps toward placing the issue on the 2006 statewide ballot, obtaining 105 votes (with 101 required) for a constitutional amendment permitting civil unions but not same-sex marriage. Another legislative vote in 2005 was required before the ballot measure would be scheduled.

      Reaction in some states was sympathetic. New Jersey, anticipating a similar court decision in an ongoing lawsuit, joined Vermont in recognizing same-sex civil unions. Two lower court decisions in Washington state also declared the state ban on same-sex marriage to be unconstitutional, but the case was appealed. Local authorities in several jurisdictions, including San Francisco and Portland, Ore., began issuing same-sex marriage licenses before state authorities intervened; the San Francisco action was voided by the state Supreme Court.

      Other states began taking legal steps to prevent the Massachusetts decision from being recognized under the U.S. Constitution's “full faith and credit” clause. Louisiana and Missouri voters and state legislators in Wisconsin joined four other states in amending their state constitutions to ban same-sex marriages. On November 2, voters in 11 additional states overwhelmingly approved constitutional amendments: Oregon, Mississippi, and Montana barred same-sex marriages; Arkansas, Georgia, Kentucky, Michigan, North Dakota, Oklahoma, and Utah banned civil unions as well as domestic partnerships; and Ohio outlawed any benefits to same-sex couples. (See Crime and Law Enforcement: Law: Special Report (Legal Debate over Same-Sex Marriages ).)

Ethics.
      Two governors, John Rowland of Connecticut and James McGreevey of New Jersey, were forced to resign under a cloud of scandal during the year. Rowland, a Republican, quit June 21 as a federal grand jury probed multiple charges that he had steered state contracts to favoured firms and received free remodeling services from state contractors. His resignation halted impeachment proceedings initiated by the state legislature. In December Rowland pleaded guilty to a single federal felony count of conspiracy to deprive the public of honest services.

      McGreevey, a Democrat, became the first governor in history to be forced out over a sex scandal. On August 12, after a male former aide threatened him with sexual-harassment litigation, McGreevey announced that “I am a gay American” and declared that he would quit three months later. He was succeeded by the state Senate president, a Democrat, who would serve until January 2006; if McGreevey had left immediately, a special election in November would have filled the vacancy.

Law and Justice.
      States moved aggressively to combat escalating medical-malpractice insurance premiums, which were widely blamed on personal-injury lawsuits. Thirteen legislatures approved malpractice-relief bills, but governors in three states (Connecticut, Iowa, and Missouri) vetoed them. Florida voters approved a far-reaching plan to curb lawsuits and place a ceiling on noneconomic damage awards, and Nevada voters embraced a cap on noneconomic damages, but similar measures in Oregon and Wyoming were rejected in November balloting.

      Ohio became the first jurisdiction to reform asbestos-exposure litigation, which in recent years had led to the bankruptcy of more than 70 corporations. The new law required that plaintiffs prove that they were actually ill before they could receive compensation; up to two-thirds of current asbestos claimants had not been diagnosed with cancer or other diseases.

      Voters in Alaska rejected a proposal to effectively legalize and regulate marijuana use. Montana became the 11th state, most of them in the West, to allow the use of marijuana for medicinal purposes, but Oregon voters rejected an expansion of the state's similar program. Voters in Alaska and Maine turned down proposals to stop using baited traps in the hunting of bears.

      State-sponsored gambling enjoyed mixed luck during the year. Oklahoma and Pennsylvania allowed slot machines or video lottery terminals at horse-racing tracks. Oklahoma voters approved a new state lottery, with proceeds dedicated to education. Michigan voters, however, demanded veto power over any further expansion of gambling. Nebraska voters rejected a casino gambling plan approved by the state legislature, and California and Washington voters turned down revenue plans funded by expansion of Native American casinos.

      Loopholes exposed in the highly publicized case involving basketball player Kobe Bryant of the Los Angeles Lakers prompted California and Colorado to strengthen their shield laws protecting the identity of rape victims. Wisconsin barred police from requiring that rape victims submit to a lie-detector test.

      California became the first state to order suspects to submit DNA samples for testing after a felony arrest. Voters also narrowly defeated a proposal to relax the state's “three strikes” law, which mandated life imprisonment on a third felony conviction. A downward trend in application of the death penalty continued during 2004. During the year only 59 convicts were executed nationwide, down from 98 in 1999.

Health and Welfare.
      Conflict between state and federal approaches to health care policy was high during 2004, particularly over prescription drugs. A growing number of states—including Illinois, Minnesota, North Dakota, New Hampshire, and Wisconsin—actively defied a Food and Drug Administration (FDA) ban on the importation of drugs from abroad, particularly Canada, by setting up Internet sites to assist with such purchases. Oregon floated a plan to license foreign pharmacies; Minnesota waived co-payments for state employees who ordered Canadian drugs; and Vermont filed a lawsuit against the U.S. government seeking permission to import drugs directly. At year's end the FDA was continuing to battle the state actions, asserting that uninspected imported drugs were not safe.

      The limits imposed by the administration of Pres. George W. Bush on federal stem-cell research were challenged in several states. New Jersey expanded funding for a state stem-cell institute, and in November California voters approved $3 billion in state bonds to support embryonic stem-cell research over 10 years. Delaware established a novel $10 million anticancer research program, which would guarantee health benefits for uninsured patients.

      States reacted warily as initial benefits began flowing from the federal government's 2003 reform of Medicare. Prescription-drug discount cards were offered to seniors nationwide, which created some confusion in 22 states that already assisted seniors with drug costs via discount or subsidy programs. Twelve states approved new legislation to help transition seniors into expanded federal drug benefits expected in early 2006. (See Social Protection: Sidebar (Medicare's New Prescription-Drug Program ).)

      Georgia and Wisconsin became the first states to grant a major tax credit to encourage organ donation. Illinois allowed organ transplants from HIV-positive donors to HIV-infected patients. Colorado, Tennessee, and Washington joined four other states that restricted student access to candy-, snack-, and soda-vending machines in public schools.

      State spending on Medicaid low-income health assistance—the states' fastest-growing program—continued to strain budgets, with a fourth consecutive year of double-digit increases. States continued to react by trimming benefits and eligibility, and Tennessee contemplated a wholesale revamping of its signature TennCare plan.

      California issued regulations aimed at fighting global warming by mandating reduced greenhouse-gas emissions, including carbon dioxide, from automobiles. Seven northeastern states tied their emission standards to California's. Arizona voters approved a law barring undocumented aliens from voting or applying for social services.

Education.
      State officials chafed under increasing pressure from the 2002 federal No Child Left Behind Act, which mandated gradually increasing standards for teachers and students. One-quarter of public schools failed initial testing requirements, and states sought exemptions from requirements for stepped-up teacher certification and achievement for at-risk and minority students. Bills protesting the law's estimated $9 billion in annual costs, its penalties, and its unprecedented federal oversight were introduced in more than 20 legislatures. Only Maine and Utah, however, enacted legislation promising critical review of the Bush administration initiative.

Consumer Protection.
      Utah became the first state to ban “spyware,” software installed on a computer without the owner's consent. New Jersey joined New York in banning the use of handheld cellular phones while driving. Massachusetts became the sixth state to outlaw smoking in virtually all public places, and Idaho also approved public-smoking curbs, with the exception of bars.

David C. Beckwith

▪ 2004

Introduction
Area:
9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2003 est.): 291,587,000; based on 2000 unadjusted census results
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      Even as the U.S. struggled for months during 2003 with a sluggish economy and the multiple burdens of an unprecedented war on terrorism, overextension of unrivaled U.S. military and economic power seemed a remote prospect. In March, however, the United States initiated its second major military incursion in a Muslim country in 18 months when it led an invasion of Iraq. (U.S. troops were still committed to Afghanistan.) While major combat was over quickly, an untidy aftermath in Iraq seriously strained both American resources and the national will. The aggressive U.S. action, grounded in a new assertion of the right to wage “preemptive war” against terrorists, badly divided the country's traditional allies and energized a long-dormant antiwar faction in the domestic American body politic. By year's end, although there were signs of stabilization in Iraq, the U.S. was scaling back ambitious plans to transition Iraq into a Western-style democracy, and the ultimate outcome of the U.S. commitment was very much in doubt.

      Backed by a handful of major countries, dubbed the “coalition of the willing” by Pres. George W. Bush, the U.S. in early spring overran Iraq in a little over three weeks. The invasion was at least partially justified on the basis of fears, fueled by reports compiled by Western intelligence agencies, about Iraq's possession of weapons of mass destruction, which by year's end had not been found. The liberation of Iraq was a clear humanitarian triumph, however, and a tonic for the U.S. economy as well. By coincidence or not, U.S. business expansion resumed with a vengeance in the weeks following the war, emphatically ending a 30-month economic malaise.

War in Iraq.
      During January and February some 300,000 U.S. and British troops and 1,150 coalition aircraft were deployed near Iraq—even while 200 newly admitted United Nations inspectors under Hans Blix scoured suspected Iraqi sites, looking for evidence of nuclear, chemical, and biological weaponry and banned missile systems. (See Military Affairs: Sidebar (Defining Weapons of Mass Destruction ).) The inspection team had limited success; it located and began arranging the destruction of 120 al-Samoud 2 missiles but found no evidence of an active nuclear-weapons program. Additionally, Iraq could not account for chemical and biological agents, including anthrax, that had been in its possession in the late 1990s.

      Several influential countries, including France, Germany, and Russia, viewed the inspections as a major step forward in disarming Iraq; they counseled patience and additional diplomacy. Bush and British Prime Minister Tony Blair, however—their armed forces extended on combat readiness—declared the Iraqis to be stalling and continued the allied military buildup. The coalition suffered a major setback on March 1 when the Turkish parliament narrowly rejected a plan to allow U.S. troops to use Turkey, on Iraq's northern border, as a staging area.

      On March 17 President Bush gave Iraqi Pres. Saddam Hussein (see Biographies (Hussein, Saddam )) and his family 48 hours to leave the country so that all UN weapons-disarmament decrees could be fully enforced. Two days later, with no explicit UN approval, the U.S. began launching Tomahawk missile strikes on suspected Iraqi leadership sites. Coalition troops began crossing the Iraqi border from Kuwait on March 20. The attack moved quickly toward Baghdad from the west and southwest, covering 300 km (186 mi) in less than a week. Direct resistance was light, although guerrilla attacks behind supply lines inflicted some casualties on coalition forces. A week later U.S. airborne forces opened a third front from the north.

      By April 4 the U.S. expeditionary force had captured Saddam International Airport near Baghdad. Threats of block-by-block Iraqi resistance in crowded urban areas of Baghdad proved illusory. Repeated armoured probes of the capital failed to encounter major resistance, and the city was largely under coalition control by April 9. By the middle of the month, the final remnants of Iraqi military forces had been dispersed. That led to the toppling of statues of Saddam all over Iraq even while looters ravaged government offices and cultural centres that occupation troops had left unprotected. Fewer than 200 allied service personnel, including 138 Americans, died from hostile action during the invasion period.

      Almost immediately, however, hit-and-run attacks began on coalition forces even as the allies appointed an Iraqi Governing Council to oversee the transition to Iraqi civilian rule. The death of Saddam's sons Uday and Qusay on July 22 did little to stop sabotage and resistance. (See Obituaries (Hussein, Uday, and Hussein, Qusay ).) The attacks reached a crescendo in November when a series of bombing and missile attacks on helicopters, planes, and military vehicles left 81 Americans dead. On December 13, however, U.S. forces discovered Saddam hiding in an underground “spider hole” near his hometown, Tikrit. He was captured and held for trial. Even so, by year's end U.S. casualties had reached 480, and attacks on U.S. troops were continuing daily.

      The war on terrorism, including the Iraq invasion, dominated both U.S. domestic politics and foreign policy throughout the year. The war split Democrats and roiled the Democratic presidential campaign, leading directly to the emergence of former Vermont governor Howard Dean as the front-runner for the 2004 nomination. Traditional allies of the U.S., led by France, declined to share in the costs of putting Iraq back on its feet. In September the Bush administration acknowledged reluctantly that reconstruction costs in Iraq and Afghanistan would require $86 billion in additional U.S. funds. After extended controversy, Congress eventually approved the outlay.

Domestic Issues.
      Although Bush had vowed to bring a bipartisan civility to Washington, partisan divisions in Congress deepened during the year as the country prepared for the 2004 elections. In the U.S. Senate, Democrats expanded a campaign to block administration judicial nominees they considered excessively conservative from being confirmed to circuit courts of appeal. By the end of the year, an unprecedented six nominations were being stalled by threat of filibuster. The Congress also again failed, owing to regional and partisan differences, to approve long-considered legislation to stimulate U.S. energy supplies.

      After 15 years of discussion, however, legislators approved reform of the national Medicare system for older Americans, adding a controversial prescription-drug benefit and introducing private-sector competition to the plan. The price tag for the new drug-benefit entitlement was $400 billion over 10 years, an amount deemed inadequate by liberals and excessive by fiscal conservatives. Republicans were not keen to expand Medicare, but with medical costs rising rapidly and shifting heavily toward drug therapy, public support for a drug benefit was rising. In his 2000 campaign, candidate Bush had promised action on the measure, and in 2003 he pushed aggressively for its passage prior to the 2004 election year. The bill was approved only after it was endorsed by AARP (formerly the American Association of Retired Persons), an influential lobbying group for seniors, which vowed to seek improvements, including expansion of benefits, in future years.

      The Federal Trade Commission established a national “do not call” registry for persons wishing to avoid unsolicited sales calls over the telephone. More than 60 million telephone numbers were quickly registered, and Congress endorsed the registry in later legislation. Adopting an idea pioneered by state governments, Congress also approved a bill that permitted consumers free access to their credit reports. After a series of forest fires in California killed 22, destroyed 4,800 buildings, and burned nearly 400,000 ha (1 million ac) of land, Congress approved the Bush administration's Healthy Forests initiative. The measure, which was signed into law in December, provided for active federal land management, including thinning of undergrowth and planned burns, to reduce fire damage.

      Following an adverse ruling from the World Trade Organization, the Bush administration moved to rescind protective tariffs on steel imports first imposed in 2002. The tariffs had fulfilled another Bush campaign promise, this time to steel-manufacturing areas, but they were wildly unpopular among consumers of steel, including automobile manufacturers.

      The U.S. Supreme Court, in a 5–4 decision, upheld most of the 2002 McCain-Feingold campaign-finance law designed to reduce the influence of special-interest money in federal elections. The high court approved the law's ban on national “soft money” donations by corporations and labour unions and endorsed curbs on advertising by third-party groups that benefited individual candidates. In another landmark decision, also by a 5–4 vote, the high court approved limited use of affirmative-action policies to benefit minority candidates for admission to institutions of higher learning. (See Law, Crime, and Law Enforcement: Court Decisions.)

      The struggling national economy and controversy over President Bush's handling of Iraq drew a large field for the 2004 Democratic presidential nomination—10 candidates at one point, before their ranks were reduced to 9. The 10 were Dean; Rep. Richard Gephardt of Missouri, the former U.S. House leader; Rep. Dennis Kucinich of Ohio; North Carolina Sen. John Edwards; Florida Sen. Bob Graham (who dropped out after five months); Massachusetts Sen. John Kerry; Connecticut Sen. Joseph Lieberman; former Illinois senator Carol Moseley Braun; U.S. Army Gen. Wesley Clark; and African American leader Al Sharpton. Washington outsiders soon established themselves as the front-runners, however. Dean distinguished himself with a strong antiwar stance and savvy use of the Internet for fund-raising and organization; by year's end he was ahead in public-opinion polls but under assault from other contenders as excessively liberal and unelectable against Bush. Late entry Clark was viewed by some Democrats as best positioned to challenge Bush; he received backing from party moderates and aides to former president Bill Clinton.

      The U.S. space program sustained a catastrophic loss on February 1 when the space shuttle Columbia orbiter disintegrated on reentry over Texas, killing all seven astronauts on board. (For Obituaries of Columbia astronauts, see Michael P. Anderson (Anderson, Michael P. ), David M. Brown (Brown, David M. ), Kalpana Chawla (Chawla, Kalpana ), Laurel Blair Salton Clark (Clark, Laurel Blair Salton ), Rick D. Husband (Husband, Rick D. ), William C. McCool (McCool, William C. ), and Ilan Ramon (Ramon, Ilan ).) The tragedy was eventually traced to a 680-g (24-oz) section of foam insulation that had broken away from an external fuel tank on liftoff, damaging Columbia's left wing and dooming the mission. A commission of inquiry later criticized NASA for having a culture that allowed schedule requirements to dominate safety concerns. (See Physical Sciences: Space Exploration (Physical Sciences ).)

The Economy.
      With the world watching its main economic engine nervously, the U.S. economy finally shrugged off a lingering hangover from dot-com overexuberance and resumed serious growth during 2003. The revival ended two agonizing years of national economic drift and arrived even as the business community was wrestling with new allegations of wrongdoing in financial markets.

      The year started sluggishly, with the economy technically expanding but at such an anemic rate that jobs continued to disappear overall. Economic growth averaged only about 2% for the first six months, and unemployment rose from 5.7% in January to 6.4% by midyear. Government officials appeared to have exhausted their ability to recharge the economy. The Federal Reserve System reduced the already-nominal federal funds interest rate by one-quarter point to 1% in June.

      A coalition victory in Iraq, however, appeared to inspire an early spring revival in the equity markets, and by the third quarter the national economy was growing at an 8.2% rate, the fastest clip in two decades. The brisk expansion was fully under way by summer, spurred by low interest rates and inflation and the stimulus of major tax cuts and government spending flowing through the economy. Growth was also aided by a sizable jump in worker productivity, reflecting business economizing and efficiencies.

      Another stimulative factor was a record federal budget deficit. The shortfall was estimated at $455 billion in July, but rapid second-half growth lowered the actual deficit to $374 billion by October. Reversing their historic role, Democrats criticized the Bush administration for fiscal irresponsibility, alleging that Republican tax cuts mostly benefited the wealthy and were creating debt to be paid by future generations. Republicans attributed most of the shortfall to temporary costs associated with the moribund economy and the war on terrorism. Even so, the deficit and the rapid growth failed to produce any revival of inflation, with consumer prices growing less than 2% for the year. Though unemployment fell back to 5.7% in December, only 1,000 jobs were created that month, and about 309,000 people stopped looking for work.

      By year's end the equity markets had posted substantial gains, their first in three years. The Dow Jones Industrial Average finished the year at 10,453.92, more than 3,000 points higher than the March low, and the technology-heavy Nasdaq average rose from 1253.22 in March to above 2000 at year's end. Even so, both averages remained well under their record highs, established in 2000.

      Incidence of corporate wrongdoing and accounting irregularities, widespread during 2002, subsided during the year, but the national system of market regulation sustained major strains. Richard Grasso, chairman of the New York Stock Exchange, was forced to step down after his $187.5 million compensation package was revealed. The nation's $7 trillion mutual-fund industry was rocked by allegations of misconduct, including after-hours and insider trading. The mutual-fund investigation was spearheaded not by the federal Securities and Exchange Commission, which normally took the lead role in market regulation, but by Eliot Spitzer, the controversial and aggressive New York state attorney general.

Foreign Policy.
      The new U.S. preemptive-war policy, and particularly U.S. action in Iraq, threatened to fracture U.S. relations with several European powers. In March, France, Germany, and Russia refused to allow a United Nations vote authorizing the Iraq incursion. After the U.S.-led coalition victory, France was among the countries refusing to contribute security forces to restore law and order in Iraq and declining to assist in that country's economic reconstruction.

      With costs rising, including financial outlays and U.S. troop casualties, diplomacy came close to a breakdown. At one point U.S. Defense Secretary Donald Rumsfeld derisively dismissed recalcitrant major powers as “Old Europe,” contrasting their foot dragging with the actions of new democracies such as Poland, Romania, and Bulgaria, as well as other countries that wholeheartedly supported the coalition effort. The Pentagon then explicitly refused to consider corporate construction and supply bids for Iraq from countries that had failed to support the war effort, which further angered French, German, and Russian interests. At year's end, however, President Bush dispatched former secretary of state James Baker to negotiate a reduction of the $120 billion external debt left by the Saddam regime. Baker was largely successful, and U.S. diplomatic relations with its estranged allies improved.

      Another major effort to resolve the long-standing Israeli-Palestinian standoff foundered during the year. The European Union, Russia, the United States, and the United Nations devised a “road map to peace” and obtained nominal agreement to it from both sides. To aid in breaking the deadlock, Palestinian leader Yasir Arafat was forced to share power by appointing a prime minister. The new official, Mahmoud Abbas (see Biographies (Abbas, Mahmoud )), was not able to assert his authority, however, and he resigned his post, leaving the Middle East peace process with no significant progress for the year.

      Concerns over nuclear proliferation in Third World countries continued to preoccupy U.S. diplomats. As the year began, North Korea withdrew from the Nuclear Non-proliferation Treaty, the first signatory ever to do so, and threatened concerted efforts toward building up its nuclear-weapons program. North Korea insisted on direct negotiations with the U.S., preceded by a U.S. nonaggression guarantee. Six-country talks, including North Korea's ally China, were held at midyear, without apparent progress, but after the U.S. offered limited security promises, negotiations were again resumed at year's end.

      Iran and Libya, under international pressure, promised to open their long-running and secretive nuclear programs to inspection during the year. Iran revealed that its efforts had been under way for 18 years, which prompted U.S. calls for punitive measures, but UN authorities elected instead to push only for more effective future inspections. Libya, struggling to escape UN economic sanctions, agreed to pay $2.7 billion to families of victims of the 1988 airline tragedy in Lockerbie, Scot. Later in the year a shipment of centrifuge equipment heading to Libya was intercepted at an Italian port, the first action under a U.S.-led 11-nation Proliferation Security Initiative. Within weeks the Libyan regime publicly disclosed its own nuclear-weapons-development program and promised to dismantle it. Bush administration backers attributed progress on nuclear nonproliferation to the U.S. hard line on Iraq.

      U.S. relations with China continued to warm despite concerns over a major trade imbalance and Taiwan. As the Chinese economy expanded rapidly, creating a massive trade surplus with the U.S., the Bush administration suggested that China was manipulating its currency to make the trade imbalance even more one-sided. Later, however, as Taiwan politicians talked of independence, the U.S. forcefully reminded them that the U.S. “one China” policy opposed any complete and permanent Taiwan-China break.

David C. Beckwith

Developments in the States
      A roller-coaster national economy and unsettled relations with the federal government made 2003 a turbulent year for U.S. state governments. Already severe budget problems deteriorated further early in the year, prompting a variety of measures to balance revenue and spending. The national economy stabilized and began growing rapidly at midyear, which eased financial pressures on state governments, but not before the tumult helped produce a rare event, the recall of a state governor.

Party Strengths.
      Democrats made modest gains overall in limited state legislative balloting in 2003; though they lost seats in Mississippi, they made gains in New Jersey and Virginia. Those results left the two major parties at virtually equal strength across the country, with Republicans holding a slight advantage of fewer than 1% of overall legislative seats.

      For 2004, Republicans would continue to control both state legislative chambers in 21 states. Democrats would dominate both bodies in 17 states, up from 16 in 2003. Eleven states were split, with neither party organizing both chambers. Nebraska has a nonpartisan legislature.

      For most of the year, Republicans had a 26–24 advantage in governorships. In October voters in California recalled Democratic Gov. Gray Davis and replaced him with Austrian-born actor Arnold Schwarzenegger (see Biographies (Schwarzenegger, Arnold )), a Republican. The next month Republicans won two of three gubernatorial elections, prevailing in Kentucky and Mississippi but losing in Louisiana. The gubernatorial lineup for 2004 would thus include 28 Republicans and 22 Democrats.

Structure, Powers.
      The chief justice of Alabama, Roy Moore, was removed from office after a judicial evaluation commission determined that he had failed to heed a federal court order. In 2001 Moore had installed a 2,400-kg (5,300-lb) granite monument to the Ten Commandments in the state judicial building lobby, and Moore later ignored a federal court order that it be removed.

      The Colorado Supreme Court, in a controversial ruling, declared that the state constitution prohibits mid-decade redistricting. Following the 2000 census, the legislature had deadlocked on drawing a new district map, leaving the task to the courts. With Republicans in full control in 2003, the legislature attempted to strengthen its hold on seven of nine seats, but the state high court ruled that no further redistricting could be done until after the 2010 census.

      Republicans were more successful in Texas. After having taken majority control of the legislature in 2002 elections, Republicans started redrawing U.S. House of Representatives district lines. Democratic House members fled to Ardmore, Okla., for four days and thereby prevented a quorum from assembling during the regular session. During a subsequent special session, Senate Democrats flew to Albuquerque, N.M., and stayed out of state for more than a month, which also prevented a quorum. In a third special session, however, a new map was approved that promised to add at least five new Republicans to a delegation previously controlled 17–15 by Democrats.

Government Relations.
      Relations between states and the federal government, always contentious, were uneven during 2003. After having mandated improvements in public education, homeland security, election procedures, and other local concerns, the U.S. government made only partial reimbursement for costs, and this had an adverse impact on deteriorating state budgets. With some state taxes tied to federal levies, administration-backed tax cuts eroded state revenue collections. Congress extended a ban on taxation of some Internet service providers, depriving states of a needed, growing revenue source.

      The administration of U.S. Pres. George W. Bush at midyear proposed converting six existing federal programs—Medicaid, low-income housing, workforce development, child protection, transportation, and Head Start—into block grants administered by the states. Backers suggested that local control would eliminate overhead and provide needed flexibility in the administration of social programs. No action was taken on the proposal during 2003.

      Citing excessive expense, legislators in Colorado, Kansas, Maine, New Mexico, North Dakota, Washington, and Utah canceled their states' presidential primaries, which had been scheduled for 2004. Governors in Arizona and Missouri vetoed similar bills and restored primary-election funding.

Finances.
      An underperforming national economy continued to limit state revenue growth and increase social-service costs, and many relatively painless budget adjustments were quickly exhausted. At one point 45 states faced budget shortfalls, and the cumulative state deficit nationwide was estimated at a record $70 billion. More than half of that, $38.2 billion, was the responsibility of California.

      States responded with a wide variety of measures. Nearly 30 states raised taxes, 8 of them by more than 5%. Alabama attempted to raise taxes by nearly 10%, but the measure was rejected by voters. Although 20 states were able to avoid significant tax increases, only Hawaii was able to reduce overall taxation levels during the year. Most states increased user fees on everything from health care and motor-vehicle licensing to court costs. Many states showed creativity in finding new revenue sources; Massachusetts, for example, increased fees for skating-rink licenses and for taking the bar exam. Fifteen states increased tuition at public colleges. Eight states raised revenue by expanding state-sponsored gambling, but Maine voters rejected a referendum to allow Indian-owned casinos. Other revenue measures included exhausting rainy-day savings, diverting other appropriated money, and enacting tax-amnesty or stepped-up tax-enforcement programs.

      Some 35 states slashed spending, usually by a reduction in workforce. The cuts even extended to previously sacrosanct areas such as public-school funding and safety-net expenditures. With health costs rising rapidly, many states trimmed Medicaid and children's health insurance, usually eliminating some coverage, reducing benefits, or establishing waiting periods.

      For months the Bush administration opposed federal assistance to hard-pressed state treasuries, urging states instead to reduce spending. In May, however, as part of a tax-cut compromise, Washington agreed to send $20 billion to state governments, roughly half in flexible grants and half in additional Medicaid funding. Those payments coincided with a midyear economic pickup that dramatically improved the outlook for state budgets. By year's end a majority of states were running ahead of budget projections, most states were recovering, and only California among major states was still projecting a significant deficit.

California Recall.
      Prior to 2003, citizens of only a single U.S. state—North Dakota in 1921—had ever recalled their governor by popular election. A “perfect storm” of economic and political troubles had enveloped California Governor Davis only months after his November 2002 reelection, however, and it prompted his recall and replacement by a political newcomer.

      Davis, faulted for a relatively colourless personal style, was weakened by his handling of California's electricity crisis in 2001 and his perceived failure to rein in state spending after the “dot-com boom” ended and government revenues plunged. After his reelection Davis boosted state-deficit estimates and then encountered gridlock in budget negotiations—Republicans refused to raise taxes, and Democrats resisted major cuts in spending. By midyear, after the state had tripled an unpopular automobile tax, opinion polls showed Davis's approval ratings hitting record lows.

      Recall advocates needed 897,000 voter signatures to force a recall election. Aided by funding from a wealthy Republican gubernatorial hopeful who later dropped out, anti-Davis forces gathered more than 1.3 million valid signatures. The election was eventually set for October 7 to decide two questions: should Davis be recalled, and, if so, who should replace him?

      During the campaign, Democrats were badly split; some concentrated on retaining Davis, but others backed Lieut. Gov. Cruz Bustamante in case Davis was recalled. On October 7 Davis was ousted by a margin of 55.4% to 44.6%. On the second question, voters chose from among 135 candidates of wildly varying backgrounds. The winner was Schwarzenegger, with a plurality of 48.7%; Bustamante was second with 31.6%. Schwarzenegger was sworn in after the results were certified on November 14.

Laws and Justice.
      With business groups warning of potential job losses, Washington voters overturned a legislature-approved ergonomics law that provided workers with strong protection against repetitive-motion injuries. Maryland joined 13 other states in providing protection to users of marijuana for medical purposes.

      Budget pressure spurred review of state corrections policies, and a recent prison-construction boom slowed. States executed 65 death-row inmates during the year, 24 of them in Texas. Illinois Gov. George Ryan, two days before leaving office in January, issued a blanket statewide clemency to all 167 convicts on death row. Ryan had suspended the imposition of capital punishment in 2000, saying it was applied arbitrarily. At year's end, Ryan was indicted on federal corruption charges, which were unrelated to his death-penalty actions.

Health and Welfare.
      States struggled to contain medical costs, particularly for expensive prescription drugs. Some states attempted to negotiate prices directly with pharmaceutical companies on behalf of low-income or elderly users, and the U.S. Supreme Court approved a closely watched Maine plan that drug companies alleged was coercive. Other states formed pools to facilitate bulk purchases of popular medications. Officials in several states moved to reimport American drugs from Canada, where prices were often lower, but the federal Food and Drug Administration rejected the idea. (See Canada: Sidebar (Filling Prescriptions for Americans-Big Business in Canada ).)

      New York and Massachusetts joined California, Connecticut, Delaware, and Maine in banning smoking in virtually all workplaces, including taverns and restaurants.

Education.
      A trend toward more competition in K–12 education expanded during 2003. Colorado's legislature approved a school-voucher plan, although a federal judge later struck it down as an unconstitutional interference in the local control of education. Officials in Arkansas, California, and Texas banned the sale of candy, gum, and soft drinks in public elementary and secondary schools. Tuition savings plans that guaranteed future state-university enrollment at current fees were a budget casualty in several states; Kentucky, Ohio, Texas, and West Virginia suspended new enrollments, and Colorado terminated its plan.

      States struggled with mandates of the federal No Child Left Behind Act, which required “high stakes” testing, upgraded teacher-qualification requirements, and prescribed penalties for lagging schools. The Bush administration said the tumult was an expected product of significant reform of public education, however, and state requests for waivers from or amendments to the act were postponed until after the 2004 election.

Equal Rights.
      At the urging of embattled Governor Davis, the California legislature approved a law allowing illegal aliens to obtain state drivers' licenses. The measure was widely viewed as having facilitated Davis's recall, and at year's end legislators repealed it by a near-unanimous vote. In a widely anticipated ruling based on two cases from the University of Michigan, the U.S. Supreme Court permitted affirmative action benefiting minorities in university admissions. The ruling had no effect in California and Washington, where voters had banned race-conscious state policies, but it allowed the resumption of affirmative action in Texas, Louisiana, and Alabama, where lower federal courts had ruled it unconstitutional.

      Supporters of homosexual rights made major gains during the year. The U.S. Supreme Court, in a Texas case, invalidated state sodomy statutes on privacy grounds. Critics charged that the ruling would inevitably lead to judicial sanction of same-sex marriage. Later in the year, in a 4–3 decision, the Massachusetts Supreme Judicial Court ruled that the state constitution forbade denying homosexual couples the right to marry. Similar rulings had been overturned by state constitutional amendments in Hawaii and Alaska and by a “civil unions” law in Vermont that granted only marriagelike rights. Though amendments to the Massachusetts constitution required at least two years for passage, the state high court gave the legislature only six months to comply. Supporters cheered the ruling as providing equality for homosexuals in hospital visits, inheritance rights, and even Social Security entitlements.

      The decision also created uncertainty nationwide on both state and federal levels. Reacting to the Hawaii decision, 37 states had approved laws defining marriage as a union between a man and a woman. The U.S. Constitution, however, requires states to give “full faith and credit” to the laws of other states, and some observers inferred that a homosexual marriage performed in Massachusetts would therefore have to be recognized everywhere, casting the validity of those 37 state laws into doubt. At year's end, traditional-family proponents vowed support for a U.S. constitutional amendment that would overturn the Massachusetts ruling, which they predicted would undermine traditional marriage, harm children, and threaten social stability.

David C. Beckwith

▪ 2003

Introduction
Area:
9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2002 est.): 287,602,000; based on 2000 unadjusted census results
Capital:
Washington, D.C.
Head of state and government:
President George W. Bush

      In the decade following the collapse of the Soviet Union, the reign of the United States as the world's sole superpower was largely positive, with little apparent downside. The U.S. military created a Pax Americana, its might virtually unchallenged, complementing a dependable U.S. economic engine that seemed to pull the global economy through good times and bad. In 2002, however, Americans came to understand that leadership was costly and often involved disquieting risk.

      The year started with the U.S. determinedly addressing fallout from the Sept. 11, 2001, terrorist attacks and apparently emerging from a mild economic recession. By year-end, however, both external and internal problems appeared far more complicated. Confrontation with the al-Qaeda terrorist network produced modest progress, but the overall terrorism conflict actually expanded; the U.S. was preparing for a potential military assault on Iraq and attempting to defuse a nuclear crisis with North Korea. The national economy, plagued by war jitters and corporate accounting irregularities, stalled in midrecovery, with stock prices plunging and unemployment edging upward, which threw the federal budget back into long-term deficit.

      Contributing to the national malaise were a series of crises suffered by major American institutions. Virtually unprecedented revelations of dishonesty in corporate executive suites, accompanied by a wave of major business bankruptcies, shook confidence in the foundations of U.S. economic prosperity. A sexual-abuse scandal rocked the Roman Catholic Church. (See Religion: Sidebar (Roman Catholic Church Scandal ).) In addition, the competency of the CIA and the FBI was questioned during inquiries into intelligence lapses before September 11.

      Nevertheless, Pres. George W. Bush managed to solidify his position with the American people, in large part owing to his purposeful handling of the “war on terrorism.” He announced a new policy favouring preemptive strikes against increased terrorist threats, expanding the national right of self-defense, and his allies steered several measures through Congress that increased U.S. preparedness. The U.S. Senate, however, controlled by Democrats, delayed approval of several administration initiatives, including terrorism-related bills. Bush took the issue into the midterm election in November, and his party regained total control of Congress. (See Sidebar (U.S. 2002 Midterm Elections ).)

War on Terrorism.
      In his January state of the union address, President Bush effectively broadened the antiterrorist struggle by declaring that nations attempting to produce “weapons of mass destruction” were part of the world terrorist threat. He specifically named Iraq, Iran, and North Korea as “an axis of evil” developing nuclear, chemical, or biological weaponry, and he challenged other governments to confront these states as well. The speech set the tone for a year in which the new terrorist threat dominated foreign relations as well as U.S. domestic politics.

      Dramatic developments in the war on terrorism were rare during 2002. U.S. forces led a successful March coalition military effort in Afghanistan, dubbed Operation Anaconda, that claimed an estimated 500 Taliban and al-Qaeda dead. The top al-Qaeda and Taliban leaders, Osama bin Laden and Mullah Mohammad Omar, remained at large throughout the year, however, and rumours of Bin Laden's death were never confirmed. Despite plentiful warnings and alarms, there were no new terrorist attacks on American soil. The perpetrator of anthrax attacks through U.S. postal facilities, which killed five Americans in late 2001, was never identified, nor was any connection with the September 11 events established. Nonetheless, a political consensus developed behind the main elements of the president's drive to increase domestic precautions against terrorist attacks—to beef up military preparedness and to lead the world response to the threat.

      Bush proposed a 14% increase—to $379 billion annually—for defense spending, the largest increase in two decades, and he sought a doubling of expenditures for homeland security, to $37.7 billion. Some proposals became entangled in politics. Numerous U.S. allies, including top officials of the European Union and France, faulted Bush's approach as excessively unilateral and jingoistic. Two key parts of Bush's antiterrorism legislative package—establishment of a new federal Department of Homeland Security and the provision of federal terrorism reinsurance—became stalled in the U.S. Senate owing to objections from labour unions and trial lawyers. They were belatedly approved only after the November election, along with a measure creating a bipartisan commission to study intelligence failures prior to the September 11 attacks. Most administration initiatives, however, including a major bioterrorism defense bill that increased vaccine stockpiles and protected water and food supplies, were swiftly put into place.

      Congress also accepted Bush's expanded definition of the war on terrorism, including his call for a “regime change” in Iraq. In October, only days before national elections, both chambers overwhelmingly approved a resolution authorizing the use of force against Saddam Hussein and Iraq. After an extended delay led by Russia, France, and other countries, the United Nations also agreed to demand Iraqi compliance with inspections to ensure that weapons prohibited in the 1991 peace agreement were not being developed. The inspectors were not scheduled to report their findings until early 2003, but by year's end a U.S.-dominated coalition had more than 100,000 troops deployed or en route to the region.

Domestic Issues.
      Election-year maneuvering had always had an impact on U.S. federal legislation, but the close division in the U.S. House and Senate made 2002 notable for bills that failed to become law. Only 2 of 13 final appropriation bills were cleared by year's end, for example, and partisan gridlock became a major issue in November balloting.

      Both chambers approved separate energy bills during the year, but conference negotiators failed to agree on a compromise; the Republican-controlled House insisted on oil exploration in Alaska's Arctic National Wildlife Refuge, a measure opposed by environmentalists. A major bankruptcy reform measure, approved by both the House and the Senate in 2001, also died over a partisan argument on the treatment of bankrupt abortion protesters. Congress also failed to agree on prescription drug benefits for Medicare recipients, on denying tax benefits to companies incorporating in offshore tax havens, on reforming medical malpractice liability, and on reauthorizing a successful 1996 welfare-reform law.

      Political considerations were apparent in legislation affecting corporate fraud and farm subsidies. Early in the year, amid early indications that Republicans would suffer from the 2001 Enron bankruptcy and other corporate malfeasance, Democrats pressed for punitive measures to address business accounting problems, corporate governance, and securities-law fraud. Public opinion polls showed, however, that neither political party had an advantage on the issue of corporate dishonesty; Congress easily approved a compromise bill tightening securities regulation and establishing an oversight board for the accounting industry. In renewing farm legislation, Republicans initially resisted a proposal to increase agricultural subsidies dramatically. A $248 billion, six-year bill was approved, however, after party strategists noted that most federal payments would go to states that had voted for Bush in 2000.

      Two measures regulating elections also became law, but their impact was in doubt. A campaign finance reform bill was approved that banned unrestricted “soft-money” donations from corporations and labour unions to national political parties and regulated campaign advertising by outside groups. The bill was quickly challenged in federal court, however, as a violation of First Amendment free-speech protections. Critics noted that the law continued to allow soft-money donations to other groups, including state political parties, and reform supporters complained that Federal Election Commission members had begun watering down the reform via regulations. Congress later approved a long-delayed reform law, inspired by year 2000 problems in Florida and elsewhere, setting national standards for voting rules and equipment. The law envisioned $3.9 billion in federal aid to states to meet the standards, but Congress failed to appropriate those funds.

      After Republicans made unexpected gains in November, Democratic House Minority Leader Richard Gephardt of Missouri, a moderate who had sided with the president on national security issues, resigned his leadership post. Gephardt later announced his candidacy for president in 2004. He was replaced by Democratic Rep. Nancy Pelosi of California. The Senate Republican leader, Trent Lott of Mississippi, was forced to resign his post in a bizarre controversy that started at a 100th birthday party in December for Republican Sen. Strom Thurmond of South Carolina. Lott implied to the crowd that the U.S. might have been better off if Thurmond, who had run as an archsegregationist, had been elected president in 1948 instead of Harry S. Truman. Criticism of Lott's remarks started slowly but snowballed, and he resigned as presumptive Senate majority leader two weeks later.

      FBI statistics indicated that the incidence of serious crime in the U.S. began inching up again in 2002 following nine years of decline. The figures showed that while violent crimes dropped during the first six months of the year, crimes against property rose significantly, and the result was an overall 1.3% increase in seven index crimes. The body of former intern Chandra Levy, victim of the most notorious crime of 2001, was found in a Washington, D.C., park in May. She had apparently been strangled, but authorities brought no charges in the case. The U.S. congressman from her Modesto, Calif., district, Gary Condit, who had admitted to a relationship with Levy, was defeated in his reelection bid in the Democratic primary.

      The national capital area was again traumatized during 2002 by apparently random sniper shooting attacks that killed 10 people and wounded 3 in Maryland, Virginia, and Washington, D.C., over a 20-day period. The crime spree ended on October 24 with the arrest of John Allen Muhammad, a former army infantryman, and his teenage companion, John Lee Malvo. The pair, later named suspects in other crimes in Alabama, Louisiana, Arizona, and Georgia, apparently operated out of a 1990 Chevrolet Caprice that had been modified to allow rifle shots from a hiding place in the car's trunk. (See Law, Crime, and Law Enforcement: Crime.)

The Economy.
      For most of the previous decade, while other countries were suffering economic hard times, the U.S. economy had continued to expand, providing a market and needed economic activity that benefited global economic health. In 2002, however, the U.S. economic beacon flickered markedly, the strain aggravated by a declining stock market, fears over war and terrorism, government uncertainty, a historic wave of corporate dishonesty, and a near breakdown in the system of regulation that framed American economic success.

      The economic landscape was littered with casualties. Technically, the U.S. economy continued to expand during 2002, although anemically, but in little more than a year, 6 of the 10 largest corporate bankruptcies in U.S. history were recorded. Widespread accounting irregularities were reported, and Arthur Andersen LLP, one of the “Big Five” accounting firms, went out of business after its criminal conviction on obstruction of justice charges regarding the Enron investigation. (See Economic Affairs: Business Overview: Sidebar (Enron-What Happened? ).) Some 250 companies, a record by far, were forced to restate their earnings. Prominent businessmen were arrested, and some were led off in handcuffs, doing the “perp walk” for news cameras. The nation's stock markets declined for the third consecutive demoralizing year. At year's end, as problems mounted, President Bush replaced his economic team leadership, including the chairman of the Securities and Exchange Commission, Harvey Pitt (see Biographies (Pitt, Harvey )), in search of a fresh start.

      Some analysts blamed the debacle on a hangover from the 10-year expansion, the longest in U.S. history, that ended in March 2001 shortly after the technology-dominated dot-com bubble was deflated. Alan Greenspan, chairman of the Board of Governors of the Federal Reserve System, however, attributed the stock decline to “infectious greed” that corrupted even those who should police it: analysts, credit-rating agencies, and auditors. Others placed the blame on the rise of incentives for managers, especially stock options, which prompted a focus on short-term results rather than long-range strategy.

      As the year began, the national economy appeared to be rebounding smartly from a short-lived recession and adverse consequences of the September 2001 terrorist assault. Both interest rates and inflation remained low, and the economy expanded at a healthy 5% rate in the first quarter. Although business investment contracted, consumer spending, especially for homes and automobiles, remained vigorous, spurred by low interest rates. In April, however, the continuing wave of devastating corporate business news sent equity markets reeling. The Dow Jones Industrial Average dropped from 10,600 to 7,200 over the next six months.

      Because the U.S. economy had proved so resilient in the past, government response was muted. The recession helped produce a federal deficit for fiscal 2002 of $159 billion, the first government red ink in four years. Federal Reserve officials had little room to maneuver: they had lowered interest rates 11 times in 2001, and they dropped the key federal funds rate another one-half point, to 1.25%, as markets deteriorated. After extensive discussion, Congress approved a corporate fraud reform law, known as the Sarbanes-Oxley bill, that provided for accounting standard oversight, banned auditors from supplying other services, and required audit committee board members to be independent company directors. The law also required corporate chief executive and financial officers to attest personally, with their signature, to the accuracy of their financial reports. For his part President Bush replaced his treasury secretary and his top economic adviser.

Foreign Policy.
      U.S. allies overwhelmingly supported the 2001 incursion into Afghanistan, but the Bush administration's stepped-up aggressiveness toward perceived terrorist threats in 2002, targeted initially at Iraq, attracted numerous skeptics. Especially in Europe, critics complained about U.S. arrogance and unilateralism. The new U.S. line was formalized in September in a document, “National Security Strategy of the United States—2002,” that promised U.S. preemptive removal of weapons of mass destruction from those deemed to be a national enemy. “The gravest danger our nation faces lies at the crossroads of radicalism and technology.…In the new world we have entered, the only path to peace and security is the path of action,” the Bush administration declared.

      Only a handful of countries, including Britain and Australia, endorsed the preemption policy openly. Reaction in France and Germany was hostile. German Chancellor Gerhard Schröder, running for reelection, repeatedly promised that his administration would never join any U.S. war effort against Iraq. President Bush early on demanded “regime change” in Iraq, but following domestic and international criticism, he appeared before the United Nations in September to urge multilateral support for merely disarming Iraq in accordance with agreements made following the 1991 Persian Gulf War. After an uncomfortable delay, the UN Security Council unanimously approved a strong resolution demanding that Saddam Hussein admit UN weapons inspectors with intrusive authority. Both France and Russia made it clear, however, that their involvement in any potential military action against Iraq would require specific UN approval.

      Hussein's government eventually agreed to—and did—provide a catalog of facilities, products, and scientists and submit to an inspection regime. At year's end the U.S.-Iraqi face-off intensified as inspectors examined Iraqi sites. Meanwhile, both sides waged a clamorous public relations campaign, with U.S. authorities proclaiming that Iraqis were violating their obligations by resisting enforcement of U.S.-led no-fly zones and Iraqis insisting that inspections had found nothing incriminating.

      A decade-old border conflict between India and Pakistan, two nuclear powers, threatened to escalate into open combat at midyear. At one point the two populous countries had one million troops massed on their common border. Top Bush administration officials, including Secretary of Defense Donald Rumsfeld (see Biographies (Rumsfeld, Donald )), led an international mediation effort that defused the immediate crisis.

      The Bush administration's tilt toward Israel in its half-century conflict with Palestinian interests—another issue dividing the U.S. from much of Europe—became more pronounced during the year. After a particularly bloody series of terrorist bombings that killed more than 30 Israelis in three days, the government of Ariel Sharon mounted a determined incursion into Palestinian territory. President Bush urged moderation on Israel but pointedly continued to refuse to meet with Palestinian leader Yasir Arafat or to intervene decisively to stop the Israeli action.

      U.S. relations with Russia under Pres. Vladimir Putin continued to improve. The two countries finally signed a delayed nuclear arms treaty reducing warheads on both sides. Nevertheless, U.S. exhortations failed to dissuade Russia from assisting Iran in weapons-capable nuclear-power projects.

      In early fall, even as the U.S. was focusing diplomatic and military efforts on Iraq, the third axis of evil country lurched again into world headlines. Confronted with evidence that its scientists had been working on a uranium-enrichment program in apparent violation of a 1994 promise, North Korean officials freely admitted the violation and implied that they were working on nuclear weapons as well. Under the 1994 pact, negotiated in part by former U.S. president Jimmy Carter, North Korea had agreed to accept two light-water reactors and 500,000 tons of heavy fuel oil annually in exchange for a freeze on weapons-capable nuclear power. North Korean officials followed the admission with further breaches, expelling International Atomic Energy Agency inspectors, removing surveillance cameras and seals from key sites, and restarting a nuclear plant using plutonium-generating spent fuel rods.

      Some analysts suggested that North Korean strongman Kim Jong Il was using a renewed nuclear threat to extort additional concessions from the West. North Korea, a land of scant resources, in recent years had devoted most of them to military purposes and depended on outside assistance to stave off famine, power shortages, and hardship for its 22 million citizens. Other analysts suggested that Kim, sensing that North Korea would be the next target of President Bush's campaign against the axis of evil, was arming himself with a nuclear deterrent. In any event, the Bush administration refused to negotiate with the North Koreans, and Rumsfeld pointedly warned that the Pentagon was prepared to fight a second war if Kim felt “emboldened” because of the world's preoccupation with Iraq.

      At year-end the threat of immediate conflict was receding. North Korea had 500 Scud missiles, plus additional Nodong and Taepodong-2 ballistic missiles capable of reaching Japan, Alaska, and eastern Russia. Since signing the 1994 agreement, according to Western intelligence reports, North Korea had gained the capability of producing both chemical and biological weapons. In December former president Carter was awarded the Nobel Prize for Peace, in part for his work on the North Korea situation. (See Nobel Prizes .)

David C. Beckwith

Developments in the States
      A decade-long revenue boom for state governments came to an abrupt halt in 2002 after events conspired to produce the most drastic state fiscal crisis in a half century. After having expanded spending programs freely and cut taxes in sunny economic times, officials were forced to reverse course sharply during the year, raising revenue and reducing services, even for essential programs, across the board.

      The hard economic times were exacerbated by continuing state struggles with the federal government, usually over which level should fund expensive initiatives such as those covering low-income persons' health coverage, election reform, education mandates, homeland security, and prescription drug costs. Although public education traditionally had been the purview of states, the year saw enactment of a significant new federal law addressing K–12 education, and the U.S. Supreme Court approved state tax support for private schools. The court also banned state execution of the mentally impaired.

      Forty-four states held regular legislative sessions during the year, and more than two dozen held special sessions, often to deal with budget problems.

Party Strengths.
      Republicans made notable gains in state legislative elections and edged ahead of Democrats in total state legislative seats for the first time in five decades. Democrats, however, continued to erode a recent GOP advantage in governorships, particularly in larger states. The net result was that the two major parties were at virtual parity nationwide at year's end.

      After newly elected legislators assumed office in January 2003, Republicans would hold both state legislative chambers in 21 states, up from 17 before the election. Democrats would have control in 16 states, down from 18 in 2002. Twelve states were split, with neither party organizing both chambers. Nebraska had a unicameral, nonpartisan legislature.

      The incumbent party was turned out in half of the 36 gubernatorial elections nationwide, and Democrats made modest gains overall. Republicans had a 27–21 advantage (with two independents) prior to November balloting. In 2003 the party lineup would be 26 Republicans and 24 Democrats. (See Sidebar (U.S. 2002 Midterm Elections ).)

Government Structures, Powers.
      Efforts to limit the service of state officials, a popular cause in the 1990s, suffered setbacks during the year. Idaho voters endorsed the legislature's repeal of the state's term-limit law, making Idaho the first state to repeal one. Oregon failed to overturn a late 2001 court decision invalidating that state's term limits.

      Rhode Island and North Dakota reduced the size of their legislatures. In Rhode Island the House saw a reduction of 25% (from 100 to 75), while the Senate was reduced from 50 members to 38. The reduction in North Dakota was smaller, the number in the House moving from 98 to 94 and that in the Senate from 49 to 47.

Government Relations.
      Controversy over the appropriate balance of responsibilities between states and the federal government, always fluid in the U.S. federalist system, escalated during 2002. States continued to protest unfunded mandates from Washington and complained that promises of added federal funding had been broken. State officials also campaigned specifically for additional U.S. funds to combat the state fiscal crisis. They noted that, in the absence of federal help, state budget-balancing efforts—raising taxes and cutting spending—would actually aggravate problems caused by the lagging national economy. A measure to provide temporary assistance to states was approved by the U.S. Senate but died owing to opposition from the administration of Pres. George W. Bush.

      States continued to complain about federal foot-dragging in homeland security reimbursement. President Bush proposed spending $3.5 billion to train local first responders, but Congress failed to appropriate the funds. Though Congress approved a law to clean up election procedures nationwide, no money was sent to states for new election machinery or for training poll workers to comply with the law.

      Nevada Gov. Kenny C. Guinn became the first state chief executive to veto a U.S. presidential decision, having turned down an executive order to establish a nuclear-waste repository at Yucca Mountain, near Las Vegas, Nev. After Congress reversed the state action and reinstated the executive order, the repository battle moved to federal courts. In another federalism struggle, a federal judge enjoined efforts by the U.S. Department of Justice to overturn Oregon's unique assisted-suicide law, which had been approved by state voters twice in the 1990s.

Finances.
      Long-term trends and cyclic events combined to thrust states into their most dramatic budget crisis since World War II. Revenues plunged as the national economy remained sluggish, and structural problems with state tax sources belatedly surfaced. Even as officials rushed to trim outlays, expenditures continued to rise owing to the escalating costs of medical, security, education, and other programs. The situation was aggravated by actions taken during the 1990s, when an economic bonanza allowed states to reduce tax rates and increase spending extensively.

      Several tax sources continued to deteriorate during the year. States, having found it difficult to tax services—which played a growing role in the modern U.S. economy—experienced a lag in sales-tax revenue. In addition, corporate income taxes dropped owing to the increasingly sophisticated measures used by corporations to move profits to low-tax jurisdictions. Income taxes garnered from capital gains and the exercise of stock options dried up as capital markets edged lower. A survey by the National Governors Association in late 2002 declared that “nearly every state is in fiscal crisis,” with a cumulative budget shortfall for the year of more than $40 billion.

      With 49 states required to balance their budgets, officials moved to stanch the red ink mainly by cutting costs—freezing employee salaries, laying off employees, and cutting Medicaid. Twelve states increased higher-education tuition. Though some 30 states had created a “rainy-day fund” to weather economic hard times, those savings were used to cushion the immediate impact of the downturn. States also tapped funds from the 1998 settlement with tobacco companies to shore up revenues.

      Most states resisted significant politically unpopular tax increases during the election year, although 19 states boosted cigarette levies. At year's end, however, the budget shortfalls in many states continued to accelerate, and fiscal experts predicted that tax-increase legislation was inevitable in many jurisdictions. In late December, California Gov. Gray Davis announced that the state's two-year deficit projection had been raised to just under $35 billion. State workers vowed to resist pay cuts and job losses, and conservative legislators declared that the shortfall was the result of profligate spending and promised to stop tax increases.

Health and Welfare.
      As state finances deteriorated, officials increasingly looked to Medicaid for savings. Outlays for the program, targeted at low-income individuals, rose more than 13% owing to rising medical costs and additional enrollees, even as increasing numbers of middle-class Americans lost their health coverage. Many states responded by reducing medical reimbursements and tightening eligibility, measures that further roiled an embattled health care system.

      Proposals for major reform were debated in several states, but progress was slow as states awaited relief from the federal government. Oregon voters turned down a referendum that would have established the nation's first universal health care program.

Education.
      A historic federal education law, which was titled No Child Left Behind, dramatically increased accountability requirements for states and their local school districts. The U.S. Education Department issued the regulations late in the year, however, and some states complained about inadequate direction and funding from Washington. At year's end several states were seeking temporary waivers from federal requirements, but critics viewed the law as a major step forward in improving public education. (See Education .)

      In a landmark 5–4 decision, the U.S. Supreme Court declared that it was constitutional to utilize public funds to assist elementary and secondary students in private and even parochial schools. The ruling upheld a pilot “voucher” program in Cleveland, Ohio, and appeared to settle a key issue in providing additional choice in education. No new states joined Florida, Ohio, and Wisconsin in allowing private school assistance during the year, but the high-court ruling ensured that the idea would be widely considered in 2003.

      Massachusetts voters joined California and Arizona in banning bilingual education, but Colorado rejected a similar measure. In Florida voters approved a measure limiting the number of pupils in a classroom. California endorsed funding for a new after-school enrichment program.

Law and Justice.
      Responding to perceived abuses, West Virginia, Pennsylvania, Mississippi, and Nevada approved new measures to reform their civil liability systems. Critics claimed that sizable jury awards in lawsuits brought by plaintiffs' trial lawyers were creating “jackpot justice” that distorted the economy, caused bankruptcies, and drove some lawsuit targets, including physicians, out of business. The new laws brought to 17 the number of states that had established limits on punitive and noneconomic damage awards and boosted standards of proof in order to stabilize lawsuit risks.

      Voters in Arizona, Ohio, and Nevada rejected marijuana-liberalization proposals. The Nevada measure would have allowed possession of three ounces of the substance for personal use. A federal judge endorsed an antitrust suit settlement between the Bush administration's Department of Justice and Microsoft Corp., but intervening attorneys general in Massachusetts and West Virginia vowed to appeal, saying that the deal did not adequately address the software giant's alleged monopolistic practices.

      In a controversial ruling, the U.S. Supreme Court told 20 states that they could no longer execute mentally retarded convicts. The court cited changing public standards, including action by several state legislatures to eliminate the death penalty for those with low IQs.

      The high court decision did not quiet controversy over capital punishment. Maryland joined Illinois in imposing a moratorium on all executions pending a review of procedures. (See Law, Crime, and Law Enforcement: Special Report (Death Penalty on Trial ).)

Energy and Environment.
      The bankruptcy of energy giant Enron Corp., a politically active backer of deregulatory policies, helped stall the spread of electricity deregulation in state legislatures. No new states were added in 2002 to the 26 that had initiated a free market for electricity in previous years. (See Economic Affairs: Sidebar (Enron-What Happened? ).) Oregon voters rejected a proposal to require labeling of genetically modified foods.

David C. Beckwith

▪ 2002

Introduction
Area:
9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2001 est.): 286,067,000; significant revision based on the 2000 census
Capital:
Washington, D.C.
Head of state and government:
Presidents Bill Clinton and, from January 20, George W. Bush

      Resilience had been a fundamental element of the American character from colonial times, but in 2001 the United States' ability to recover from adversity was severely tested. Its national economy, weary from years as the engine of world growth, finally slipped into recession. An energy crisis threatened further disruption, producing major bankruptcies. Terrorist attacks on September 11 coupled with a subsequent public health scare sent shock waves across the nation; dispirited national morale slowed economic activity further, and the U.S. was soon plunged into a distant Asian war against an implacable fundamentalist regime.

      Within weeks, however, the country had righted its listing self-confidence. Security measures gradually began restoring trust in public institutions. A series of government economic measures, including 11 interest-rate reductions and substantial emergency spending, established a foundation under the rocky economy. The Taliban regime in Afghanistan was rapidly uprooted and dispersed by a devastating show of American military technology. By year's end the United States was on the road to recovery, its position as the world's economic, cultural, and military leader not only restored but burnished in a year of challenges.

Security Crisis.
      Authorities responded immediately to the September 11 events, bolstering safety measures at public buildings, upgrading screening at airports, freezing assets of groups with suspected terrorist ties, and detaining more than 1,000 noncitizens for questioning. The measures included such extraordinary steps as the granting of authority to air force generals to shoot down hijacked civilian airliners and a provision for wartime military tribunals to try suspected alien terrorists. Some measures prompted criticism from civil liberties groups, but public opinion polls showed that the measures were widely supported.

      In a September 20 address to Congress, Pres. George W. Bush announced the creation of an Office of Homeland Security under former Pennsylvania governor Tom Ridge to coordinate the antiterrorism efforts of 40 federal agencies. (See Biographies (Ridge, Tom ).) Within days the new agency confronted a new threat when several employees of a tabloid newspaper publisher in Florida contracted anthrax, an infectious disease ordinarily confined to farm animals, via suspicious mail. Additional anthrax spores were soon discovered in a variety of places, including the offices of Senate Majority Leader Tom Daschle (Daschle, Tom ) (see Biographies), post offices, and various news organizations. Most of the spores were traced to mail originating near Trenton, N.J., but a connection to the September 11 terrorism was never established. By year's end two forms of anthrax had killed 5 persons and sickened 14, prompting authorities to extend precautionary drug treatment to 32,000 persons and to update inadequate public health emergency-preparedness laws.

      Congress approved a variety of measures to counter economic and security concerns following the terrorist attacks; $15 billion was appropriated to assist U.S. airline firms, including $5 billion in grants; lawmakers appropriated an immediate $40 billion in additional spending for a variety of causes, including stepped-up military activity and assistance to affected areas, such as New York City; and President Bush received authority to expend half of the funds at his discretion. Congress also authorized the use of force to respond to the attacks, provided for federal takeover of some 28,000 airport security workers, and approved an antiterrorism law that allowed expanded law-enforcement powers over money laundering, electronic and telephone eavesdropping, and detention of suspected terrorists.

      By year's end the death toll from the attacks had been revised sharply downward. At one point unofficial estimates had projected up to 10,000 deaths in New York and 500 or more at the Pentagon near Washington. Authorities in December, while cautioning that the precise number of deaths might never be known, put the toll at nearly 2,900 in New York City, with an additional 189 at the Pentagon and 44 in Pennsylvania, where another hijacked plane crashed after passengers attempted to overpower the terrorists.

Domestic Issues.
      The September 11 events proved to be a critical turning point for President Bush and his administration. Bush was inaugurated in January after having lost the popular vote and with the weakest mandate of any recent U.S. president. (See Sidebar (Election Reform Debate in the U.S. ).) Congress was nominally in Republican hands but was almost evenly divided. Bush surprised many observers by pushing an aggressively conservative agenda, including a 10-year, $1.6 trillion tax cut, expanded energy exploration, a faith-based social assistance initiative, and withdrawal from several international treaties.

      Following compromise with congressional Democrats, Bush signed an 11-year, $1.35 trillion tax-reduction bill on June 7 that provided instant $300–$600 rebates to most taxpayers, reduced the four major marginal rates, repealed the estate tax, increased the child-care credit, and provided relief for married couples and incentives for savings.

      In late May veteran Republican lawmaker Sen. James M. Jeffords of Vermont announced that he would leave the GOP and become an Independent caucusing with Senate Democrats. That turned the Senate, previously divided 50–50 but under Republican organization, over to a 50–49–1 configuration under Democratic control. Jeffords cited disappointment with conservative GOP policies, including inadequate spending for education, and allies noted that the White House had slighted him by failing to invite him to a ceremony honouring a Vermont schoolteacher. With Congress now officially divided along partisan lines, Bush's agenda bogged down over the summer, and the president, while still enjoying general popular support, was widely viewed as tentative and ineffective in his public appearances.

      Within days of September 11, however, Bush had shed that image. He delivered a thoughtful eulogy to victims at the National Cathedral service in Washington, D.C., and won praise for his presence in an early visit to the World Trade Center site. Bush's September 20 speech to a special joint session of Congress received widespread acclaim for its eloquence and delivery and helped launch Bush's personal approval ratings in public opinion polls to record levels through the remainder of the year.

      During the fall, measures responding to the terrorist assault were approved by Congress with only modest opposition, particularly legislation covering military preparedness and disaster relief. In the realm of ongoing domestic policy, however, entrenched partisan arguments stopped passage of numerous bills, including several that had been debated for years. Among legislation failing to pass Congress during 2001 were the president's energy security bill (which included oil exploration in an Alaskan wilderness area), campaign finance reform, fast-track trade-negotiation authority, Bush's faith-based social initiative, an agriculture subsidy bill, a federal patients' bill of rights, and a fiscal stimulus bill that administration partisans said was vital to the national economic recovery.

      At year's end Congress did approve a compromise education-reform act cosponsored by Democratic Sen. Edward Kennedy of Massachusetts. The bill required for the first time annual reading and mathematics testing for students in grades three through eight nationwide. It also required school districts to close the gap between poor and middle-class achievement and mandated that consistently underperforming schools allot part of their federal financial assistance to tutoring or providing transportation to other schools. (See Special Report. (Does Testing Deserve a Passing Grade? ))

      Debate over the wisdom and ethics of advanced scientific research grew in intensity during the year. The U.S. House of Representatives approved a bill banning cloning of humans from embryos and prohibiting creation of cloned embryos for research, but the Senate delayed the measure. Under pressure to take a position, President Bush announced in August that he would allow federal funding only for research on the approximately 60 lines of embryonic cells that had already been created, saying he did not believe taxpayer dollars should support further destruction of human embryos. A National Academy of Sciences panel quickly published a report detailing problems with the Bush position, and little was settled on the subject.

      Recent FBI figures revealed that the incidence of serious crime had remained virtually unchanged following eight years of significant decline. The figures showed a modest 0.3% reduction in seven index crimes during the first half of 2001. On June 11 Timothy McVeigh (see Obituaries (McVeigh, Timothy James ))—the main perpetrator of the 1995 Oklahoma City, Okla., bombing of the Alfred P. Murrah Federal Building that killed 168 persons—was executed at a U.S. prison in Terre Haute, Ind. It was the first federal execution since 1963. A second federal prisoner, Juan Raul Garza, convicted of three 1993 drug-related murders, was put to death eight days later in the same prison.

      Republican businessman Michael Bloomberg (see Biographies (Bloomberg, Michael )) prevailed in the highest-profile election of 2001, the race to succeed Rudolph Giuliani as mayor of New York City. Bloomberg spent a record $69 million of personal funds on the campaign. The year's most bizarre political story involved the disappearance from Washington, D.C., of a 24-year-old government intern, Chandra Levy, shortly before she was to return home to Modesto, Calif. Her parents hired lawyers and investigators and turned a glaring media spotlight on their hometown congressman, Democratic Rep. Gary Condit, who eventually admitted to a “close relationship” with the missing woman. Levy remained missing at year's end, and Condit announced that he would launch an uphill bid for reelection.

The Economy.
      The national economic expansion ended with a whimper during 2001. A panel of the National Bureau of Economic Research (NBER) declared in November that the nation's economic growth had ended the previous March, exactly 10 years after it had started, which made it the longest-running expansion since the organization began keeping records in 1854. Government figures showed that gross domestic product had increased by a modest 1.2% in the first quarter and an anemic 0.3% in the second, followed by a 1.3% contraction in the third quarter. Though recessions had traditionally been declared after two consecutive quarters of negative growth, NBER economists, noting continued economic deterioration, cited other factors in their assessment.
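      The distinction the NBER drew can be made concrete with a little arithmetic. A minimal sketch (illustrative only, not the NBER's actual methodology, which weighs employment, income, and other indicators) shows that the quarterly figures cited above never satisfy the traditional two-consecutive-quarters rule:

```python
def two_quarter_rule(growth_rates):
    """Traditional rule of thumb: a recession is signaled when any two
    consecutive quarters both show negative growth."""
    return any(a < 0 and b < 0 for a, b in zip(growth_rates, growth_rates[1:]))

# 2001 quarterly GDP growth figures from the text (percent): +1.2, +0.3, -1.3.
# Only one quarter is negative, so the traditional rule is not triggered --
# which is why the NBER panel had to cite other factors in dating the recession.
print(two_quarter_rule([1.2, 0.3, -1.3]))  # prints False
```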

      The trauma of September 11 effectively kicked the national economy while it was down. The events further shook consumer confidence, which had been declining, and markedly reduced personal and business travel, entertainment expenditures, and other economic activity. The national jobless rate, which had bottomed at 3.9% in 2000, had started to climb early in the year; it jumped from 4.9% to 5.4% in October, the biggest one-month jump in two decades. By December unemployment had soared to 5.8%, the highest level in six years. Another victim of the terrorist-exacerbated recession was the short-lived federal budget surplus: after a record $237 billion in black ink during fiscal 2000, the U.S. ended fiscal 2001 on September 30 with a fast-diminishing $127 billion surplus, with many fiscal 2002 projections anticipating a return to deficit spending.

      Even so, the recession's impact was cushioned by several events. Fearing an overexuberant stock market and inflation, the nation's Federal Reserve System had nudged up interest rates six times in 1999–2000. In 2001, however, the Fed sharply reversed field and lowered its key federal funds rate on 11 occasions, from 6.5% to 1.75%, in a desperate attempt to revive the failing national economy. The actions provided a ripple effect that lowered borrowing costs across the board for credit cards, mortgages, and businesses. Additionally, as the recession reduced energy demand, oil prices began dropping worldwide, providing further relief to consumers. The nation's major automobile manufacturers began offering no-interest loans in a successful effort to maintain high demand, and new auto sales continued through the last months of 2001 at record levels. The federal government further contributed with cash tax rebates and at least $60 billion in emergency spending following the terrorist attacks.

      By year's end some economists were predicting imminent resumption of national economic expansion. Two major measurements of consumer confidence were rising sharply in December. The Dow Jones Industrial Average, which had dipped as low as 8,235 in the wake of September 11, finished the year over 10,000 and rising. The national inflation rate dropped back to a modest 2.6%, and productivity gains remained strong, which led several economists to predict an end to the recession as the country put memories of the attacks behind it.

      The recession helped avoid a widely predicted energy disaster in California and neighbouring states. As the year began, California was suffering under a mishandled deregulation of electricity that led to severe power shortages and the bankruptcy of a major state public utility. Rolling blackouts plagued the state during January, and many analysts predicted further outages and economic disruption during the summer, when air-conditioner use would be high. A combination of state government assistance to the utilities, a cool summer, upgrading of electrical distribution line efficiency, reduced usage due to recession and conservation, and the worldwide energy surplus largely prevented serious incidents.

      During the height of the crisis, California Gov. Gray Davis denounced out-of-state energy companies for taking advantage of the state and its consumers, and he specifically named the Houston, Texas-based Enron Corp. Late in 2001 Enron—the seventh largest American corporation, with over $100 billion in revenue in 2000—filed for Chapter 11 bankruptcy protection. The company, listing $49.5 billion in assets, became the largest company in U.S. history to go under. The failure was only tangentially related to its long-running exploitation of deregulated markets for wholesale natural gas and electricity. Analysts discovered that key company officials, while operating largely unregulated marketplaces trading derivative energy contracts, were simultaneously running private off-book partnerships and profiting personally, even as they overstated Enron's profits. The company's failure was particularly hard on employees, many of whom had retirement funds tied up in near-worthless company stock.

      The world's leading software company, Microsoft Corp., avoided a court-ordered breakup by settling its antitrust case with the Bush administration Justice Department. The company had been found guilty of monopolistic practices in a case brought by the Bill Clinton administration and ordered divided into at least two parts. An appeals court panel in June confirmed that Microsoft had monopoly power but disqualified the original trial judge for injudicious comments outside the courtroom. After Microsoft allowed computer makers to disable some parts of its Windows operating system and replace them with software from other firms, the replacement judge approved a settlement allowing the company to stay intact.

Foreign Policy.
      Within hours of the September 11 attacks, the Bush administration began preparations for a military assault on the al-Qaeda network in Afghanistan and started assembling international support for the mission. The U.S. received immediate and strong support from British Prime Minister Tony Blair, who helped the U.S. rally world opinion. The partners took pains to assure that it was international terrorists and their protectors who were targets, not Islam. In the end some 60 countries offered tangible assistance, including Muslim Pakistan as well as Russia, which provided access to military bases in nearby Tajikistan. The U.S. doubled its military presence in the region to 50,000 during the month the hostilities began.

      Demands that the Afghan Taliban regime locate and turn over Osama bin Laden (see Biographies (bin Laden, Osama )) to international forces were met by evasion, then refusal. U.S.-dominated military action started with cruise missile, bomber, and fighter jet attacks throughout Afghanistan on October 7, followed by continued military operations in support of the Northern Alliance Afghan resistance fighters. At the beginning U.S. preparations were met with a storm of criticism and doubt; critics suggested that Americans would be repeating Russian mistakes in Afghanistan or would be bogged down in a Vietnam-style Asian conflict. Instead, the operation was largely completed in 11 weeks as the Taliban was driven from power and replaced by a UN-brokered coalition; Bin Laden's fighting forces, which included Arabs, Pakistanis, and Chechens, were killed or dispersed.

      In his September 20 congressional speech, President Bush declared, “From this day forward, any nation that continues to harbor or support terrorism will be regarded by the U.S. as a hostile regime.” By year's end neither bin Laden nor Taliban leader Mullah Mohammad Omar had been located. While continuing to search for Taliban and al-Qaeda leadership in the area, the U.S. turned its attention toward other countries facilitating terrorist activity. The ongoing confrontation with rogue organizations and states, especially those believed to be developing chemical, biological, or nuclear weapons, continued to dominate world affairs.

      Cooperation on Afghanistan was a highlight of improved U.S. relations with Russia. In mid-November, Russian Pres. Vladimir Putin visited Washington and Bush's ranch in Crawford, Texas. Talks appeared promising when Putin said he would consider allowing the U.S. to test a missile defense system even though the test would be an apparent violation of the 1972 Anti-Ballistic Missile (ABM) Treaty, provided the two countries could agree on nuclear weapons reductions. Bush announced that the U.S. would slash nuclear warheads from 7,000 to the 1,700–2,200 range over the next decade, and Putin hinted at similar reductions in the Russian 5,800-warhead arsenal. The two were never able to hammer out an agreement on the antimissile test, however, and in December Bush announced that the U.S. would withdraw from the treaty and thereby leave the way open for missile defense testing.

      The long-running U.S. effort to broker a lasting peace in the Middle East appeared to collapse during the year. Talks between Israeli and Palestinian leaders sponsored by former president Clinton had fallen apart in late 2000, producing violence that escalated during 2001. After a period of inaction, the Bush administration attempted to revive talks but without success, and after September 11 Israeli advocates successfully likened Palestinian bombing and assaults to the terrorist attacks in the U.S. President Bush pointedly declined to condemn Israeli military responses against the Palestinian population and refused to meet with Palestinian leader Yasir Arafat.

      Free-trade advocates scored a major advance at an international meeting in Doha, Qatar, when major countries agreed to begin a new three-year round of trade negotiations. The talks would be aimed at reducing agricultural trade barriers and industrial tariffs. The U.S. made concessions, putting its antidumping law under review in spite of opposition from American steel interests and agreeing that less-developed nations could override drug patents in the interests of public health. Most analysts declared that no new trade agreement could be negotiated, however, unless the U.S. Senate voted fast-track negotiating authority to President Bush. The U.S. normalized trade relations with China during the year after having cleared the way for China's membership in the World Trade Organization.

      During 2001 the focus of war concerns shifted to Asia, including Afghanistan. U.S. military efforts aggravated the decades-long conflict between India and Pakistan, and the two countries, both possessing nuclear weapons, were at the brink of war at year's end. Ironically, in an effort to encourage cooperation in the Afghan operation, the U.S. had lifted sanctions imposed on both countries following their 1998 nuclear tests. Problems with North Korea, one of the world's last communist regimes, continued to fester and led to periodic threats of war against the U.S. and its allies, including Japan.

      The Bush administration's efforts to build a coalition to support military measures in Afghanistan reversed what critics had labeled U.S. rejection of international solutions to world problems, including its intention to withdraw from the ABM Treaty and its refusal to ratify the Comprehensive Test Ban Treaty. Earlier in the year the Bush administration had officially rejected the Kyoto Protocol, suggesting that the anti-global-warming treaty would affect the global economy disproportionately. In late summer the U.S. sent a delegation to the UN World Conference Against Racism in Durban, S.Af., but walked out in protest against proposed conference resolutions calling for reparations to blacks for slavery and for condemnation of Israel for alleged racism against Palestinians.

David C. Beckwith

▪ 2001

Introduction
Area:
9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population
(2000 est.): 275,372,000
Capital:
Washington, D.C.
Head of state and government:
President Bill Clinton

      The United States stormed into 2000 full of energy and confidence, its economy purring, its world leadership role unchallenged, and its two-century-old democratic experiment still vigorous. Incidence of crime, welfare dependency, and joblessness were down, and the stock market was soaring.

      In February economic expansion surged through its 108th straight month, surpassing the nation's consecutive growth record set in the 1960s. A month later national capital markets hit all-time highs. A spirited battle was under way as both major political parties eagerly vied to supply the successor to Pres. Bill Clinton, whose legacy of economic prosperity and centrist-policy successes had been diminished only by personal scandal. Optimism was soaring, and the U.S. was the envy of the world in the realms of democracy, economy, cultural offerings, and military might.

      By year's end, however, the national mood had markedly changed. The new tone was one of bewilderment, even creeping pessimism. The national election, far from confirming a clear new path, had ended in a puzzling stalemate capped by an unprecedented and dispiriting legal challenge. The stock market was slumping badly; consumer confidence was shaken; and economic statistics had suddenly turned ominous. The effectiveness of American world leadership was under challenge. Americans seemed badly divided, even rudderless, and commentators had difficulty pinning down a precise cause.

Politics and the Election.
      Ever since the end of the Cold War a decade earlier, the U.S. had struggled to find a sense of national direction. Secure in the dominance of its economy and national security apparatus, the country internally split into two relatively equal political camps. The Democratic Party favoured the government's moving more actively to assist those citizens left behind in the general prosperity, whereas the Republican Party (GOP) believed that government should step back and allow the ingenuity of the American people to produce without interference. The 2000 election, if anything, muddled the debate further—the most equivocal result in U.S. history, a near 50–50 split on virtually every level of government, with no clear call for any political party or ideology.

      If any trend emerged from the national balloting, it was that the incumbent party lost. The Republican ticket of George W. Bush (Bush, George W. ) and Richard B. Cheney (Cheney, Richard B. ) narrowly defeated Democratic challengers Albert A. Gore, Jr. (Gore, Albert A., Jr. ) and Joseph I. Lieberman (Lieberman, Joseph I. ). (See Biographies.) Democrats, however, narrowed their deficit in the U.S. House of Representatives for the third election in a row, leaving the Republicans with less than a 10-seat advantage. Democrats also erased the Republican lead among U.S. senators, creating a 50–50 tie in the upper chamber. It was much the same on the state level—Republicans made gains in state legislatures (where Democrats had enjoyed a slight advantage), creating a virtual tie in party control nationwide, and the GOP lost part of its sizable lead in governorships. (See Special Report (U.S. Election of 2000 ).)

      For the first time, a presidential spouse entered elective politics. Hillary Rodham Clinton, though outspent by her Republican opponent, Rick Lazio, won the open U.S. Senate seat in her adopted state of New York. Only six weeks prior to the election, special counsel Robert Ray had concluded a six-year investigation of the Clintons, pointing out untruthful testimony by Hillary Clinton but concluding that there was insufficient evidence to prove indictable criminal wrongdoing. In another unusual congressional contest, a plane crash claimed the life of Missouri Gov. Mel Carnahan, Democratic challenger for the Senate seat held by John Ashcroft, only days before the election. Nonetheless, Carnahan won a narrow victory after the new governor promised to appoint Carnahan's widow, Jean, to the Senate seat.

      Awaiting a signal from voters, Congress approved almost no major legislation during the year. With both parties contesting for support from the technology-driven “new economy,” two bills sought by Silicon Valley were easily approved. They expanded H-1B visas for highly skilled foreign workers and settled the legality of electronic signatures for commercial transactions.

      Other legislative accomplishments in a divided government were scarce. Neither Congress nor President Clinton made any serious attempt to reform Social Security or Medicare. For the third year, legislators could not establish a “patient's bill of rights” in dealing with health maintenance organizations, provide prescription-drug coverage for seniors, or enact more than nominal campaign-finance-reform legislation. Late in December, however, Clinton unveiled sweeping new rules to guard the privacy of patients' medical records; doctors and hospitals would be required to secure a patient's consent before disclosing health information to a third party. Congress also was unable to undo a June U.S. Supreme Court decision that voided state attempts to outlaw “partial-birth” abortions.

      Amid charges of election-year posturing, the Republican Congress approved bills eliminating the national estate and gift tax and ending the penalty imposed on two-income married families. Both were vetoed by President Clinton, who claimed the measures disproportionately favoured the wealthy. Clinton also vetoed a measure establishing a long-sought repository for nuclear waste at Yucca Mountain, Nev., 160 km (100 mi) northwest of Las Vegas. None of the vetoes was overridden.

The Economy.
      The country's historic economic expansion finally ran out of steam late in the year. The slowdown arrived abruptly, without overt warning, and economists later blamed a combination of causes, including higher oil prices, violence in the Middle East, the uncertain election, delayed effects of multiple interest-rate increases, a stock market decline, the bursting of the Internet bubble, and the simple age of the up cycle. By year's end, with economic activity slowing and consumer confidence dropping, most analysts were predicting a period of reduced growth or even an economic recession.

      Fueled by world leadership in telecommunications and high tech, the U.S. economic engine actually accelerated early in the year. Effects of a widely feared year 2000 computer problem proved minimal, thanks to expensive remedial preparations. After gross domestic product (GDP) posted a robust 4.2% gain in 1999, the economy expanded by an extraordinary 5.6% in the first half of 2000. This led economists to worry anew over the potential revival of inflation, which crept steadily upward after several years in the nominal 2% range.

      Under Chairman of the Board of Governors of the U.S. Federal Reserve System (Fed) Alan Greenspan (see Biographies (Greenspan, Alan )), the inflation-fighting Fed had enacted three small interest-rate increases in 1999 and followed that with three more in early 2000, including a full 0.5% boost in May. That left interest rates 1.75% higher than in 1999, driving up costs for corporations and individuals alike. Even as those increases flowed through the system, the economy was further shocked by rapid increases in oil prices worldwide, the result of a 1999 cutback in production by OPEC cartel countries. At one point oil prices topped $35 per barrel, three times the price level in December 1998. Increased energy costs affected everyone, particularly in the Midwest, where supply and refinery problems sent gasoline prices spiraling above $2 per gallon.

      The twin blows from interest and energy increases produced a marked effect on financial markets. The National Association of Securities Dealers automated quotations (Nasdaq) stock market index, heavy with high-flying technology companies and overbought dot-coms, topped out above 5000 in March and then began an erratic and prolonged descent. By year's end the average had been halved—the worst performance in Nasdaq's nearly 30-year history. (See Economic Affairs .) The year's biggest economic story was the long-anticipated shakeout in dot-coms, companies attempting to capitalize commercially on surging use of the Internet. Dozens of once-high-flying firms exhausted their start-up funds without showing a profit during the year, and their bankruptcies or mergers contributed to the darkening mood by year's end.

      By the third quarter, GDP growth was down to 2.2% and slowing. Joblessness stayed near the 30-year low rate of 3.9% established during the year, but many companies were announcing layoffs and cutbacks at year's end. Inflation rose a modest 3.5%, but it too was trending upward.

Domestic Issues.
      Serious crime, which had declined in the U.S. for eight consecutive years, leveled out during 2000. Incidence of eight major personal and property offenses reported to local law-enforcement authorities dropped 0.3% during the first half of the year, compared with a 9.5% decrease in 1999. Analysts noted that demographic trends spurring the decrease over the previous decade—including a reduction in the crime-prone 15–25-year-old male population—would be reversed in coming years.

      Researchers funded by the federal government announced in June that they had virtually completed deciphering the entire human genetic code, well ahead of schedule. (See Life Sciences: Special Report (Human Genome Project:Road Map for Science and Medicine ).) A jury in Florida awarded a record $145 billion in punitive damages in a class-action suit brought against major tobacco companies by smokers afflicted with tobacco-related illnesses. Tobacco company officials warned that the verdict could prompt bankruptcies in the industry and adversely affect the 25-year, $246 billion settlement negotiated with states in 1998.

      In Washington, D.C., Judge Thomas P. Jackson, who had earlier declared software giant Microsoft Corp. guilty of antitrust violations, ordered the company broken up. The decision was immediately appealed by Microsoft, which started the year as the world's most valuable enterprise. If sustained on appeal, the ruling would produce the country's largest government-mandated breakup since AT&T was restructured in 1984.

      Reports of numerous deaths and injuries—eventually totaling 148 and 500, respectively—on Ford Motor Co. products, particularly the popular Explorer sport utility vehicle, prompted a historic recall of 6.5 million Firestone tires. As federal officials investigated, Ford and Firestone parent Bridgestone Corp. each blamed the other firm's manufacturing or design process for the problems. Lawmakers criticized both companies at separate House and Senate hearings. Firestone's chief executive officer made a public apology, while Ford said it would not rest until every faulty tire had been replaced. By year's end, Ford had settled at least seven lawsuits and planned to settle more cases stemming from accidents involving Explorer vehicles and Firestone tires.

      Another major economic setback occurred when forest fires swept a dozen western states in the summer, charring more than 400,000 ha (1,000,000 ac). One particularly virulent fire, which caused an estimated $300 million in damage to the federal Los Alamos (N.M.) National Laboratory alone, started as a “controlled burn” set by the U.S. Forest Service in New Mexico.

      The booming economy and a landmark 1996 federal law helped spur a continued reduction in public-assistance rolls during the year. President Clinton noted that welfare caseloads nationwide had dropped by eight million, or 60%, during his presidency, most after passage of a bipartisan welfare-reform act just prior to the 1996 election. Apparently ending a decade-long controversy, the Food and Drug Administration (FDA) approved the U.S. sale of the European-developed abortion drug RU-486. The drug, also known as mifepristone, allows women to terminate a pregnancy within the first several weeks of gestation. Republican presidential candidate George W. Bush said that he opposed the FDA move, but he stopped short of promising to reverse it.

      Preliminary results of the April census, released on December 28, showed that the U.S. population had swelled to 281,421,906.

Foreign Policy.
      No challenger emerged during the year to the U.S.'s claim as the sole world superpower. Russia, Japan, and China continued to struggle with internal economic weakness, and European attempts to consolidate were hampered by an underperforming currency and intramural political difficulties. Throughout the year the U.S. military was deployed around the world to keep the peace, and its superiority in any pitched engagement was unquestioned.

      The resulting U.S. vulnerability to terrorism was underscored anew on October 12, however, when an explosives-laden rubber boat rammed a U.S. destroyer, the USS Cole, docked in Yemen for refueling. The resulting charge tore a major hole amidships, killing 17 American sailors and wounding 39. American investigative authorities rushed to the scene but received only desultory cooperation from sovereignty-minded Yemeni officials. U.S. forces were placed on alert worldwide, and, fearing sabotage, American authorities temporarily stopped military vessels from using Egypt's vulnerable Suez Canal.

      No credible group claimed responsibility for the assault. U.S. investigators soon focused suspicion on Osama bin Laden, a Saudi dissident operating a terrorist-training organization under protection of Taliban authorities in Afghanistan. Bin Laden had reportedly planned coordinated terrorist assaults on U.S. interests worldwide on Jan. 1, 2000, including an attack on a U.S. ship visiting Yemen, but most plans had been at least temporarily thwarted.

      The probe of a mysterious October 1999 EgyptAir plane crash off the coast of Nantucket, Mass., stalled as American and Egyptian investigators produced conflicting interpretations of available evidence. U.S. officials attributed the cause of the crash to a suicide by an off-duty co-pilot, Gamil al-Batouti, who was at the controls as the jumbo jet stalled and went into a fatal dive. Egyptians suggested that equipment failure prompted the disaster.

      U.S. foreign-policy makers could claim a major victory when Yugoslav Pres. Slobodan Milosevic resigned on October 6. U.S.-led NATO forces conducted a major bombing campaign against the Milosevic regime in early 1999 to stop mistreatment of ethnic Albanians in Kosovo (a province of Serbia) and had maintained economic sanctions against his regime following cessation of military action. Milosevic lost an election in late September but was holding out for a runoff when Serbian citizens stormed government buildings in Belgrade, prompting an immediate change in government.

      Two other peace initiatives championed by President Clinton suffered setbacks during the year. A peace plan in Ireland, which Clinton helped negotiate in 1998, stalled as the Irish Republican Army refused to decommission (surrender or destroy) its heavy weapons.

      The long-running Middle East peace process, on the verge of a major breakthrough at midyear, virtually collapsed despite major efforts by Clinton and his administration. Clinton summoned Israeli Prime Minister Ehud Barak and Palestinian leader Yasir Arafat to Camp David, Maryland, on July 11–25 for intensive discussions. With Clinton shuttling between the two and exerting maximum pressure, the principals edged close to an agreement before an impasse was ultimately declared. The major sticking point was the legal status of Jerusalem, which both Arabs and Jews claimed as their capital.

      In ensuing weeks the process broke down completely. Palestinian rioting began after conservative former general Ariel Sharon visited Temple Mount, technically in a neutral zone but traditionally off-limits for prominent Jewish visitors. Barak, suffering political criticism for excessive accommodation at Camp David, responded with force. As violence escalated, Israel suspended participation in the peace process, and Barak announced new national elections in 2001.

      Fidel Castro, the target of U.S. economic sanctions since shortly after he took over Cuba in 1959, enjoyed propaganda victories at the U.S.'s expense. Castro mobilized Cuban public opinion to demand the return of Elián González, a six-year-old boy whose mother had died at sea while fleeing Cuba for the U.S. in late 1999. The administration announced it would comply in early January, but the boy's Miami, Fla.-based relatives sued, tying his fate up in legal wrangling for months. In April Elián's father, Juan Miguel González, traveled to the U.S. to escort his son home.

      Following a legal ruling, armed agents of the Immigration and Naturalization Service stormed the Miami home of the boy's relatives in the early hours of April 22, seizing the child at gunpoint and reuniting him with his father in the Washington, D.C., area. U.S. authorities, however, prohibited the Cubans (now joined by several of Elián's Cuban classmates) from leaving until court appeals had been exhausted. Finally, on June 28, after the U.S. Supreme Court refused to issue a stay, Elián and his father returned to Havana and a highly publicized Castro welcome. Castro later poked fun at election difficulties in Florida, offering to send election assistance to ensure that democracy prevailed.

      U.S. relations with China continued on an uneven path. Trade relations between the two countries were finally normalized in October, overcoming U.S. concerns over Chinese human rights problems, China's militant attitude toward Taiwan, and the exclusion of U.S. investment. China lodged vigorous objections to U.S. prosecution of Wen Ho Lee, a scientist at the Los Alamos (N.M.) National Laboratory accused of having sent U.S. nuclear secrets to China. After having publicly proclaimed overwhelming evidence against Lee, the U.S. abruptly allowed the Taiwan native to plead guilty to reduced charges and thus seemingly confirmed China's reservations.

      The 1997 Kyoto global warming treaty, which would require the U.S. and other industrialized countries to reduce greenhouse gas emissions markedly, suffered a major setback at a conference at The Hague. Complications over higher oil prices, the collapse of the Russian economy, and a plan to allow wealthy nations to buy “credits” for excessive emissions from less-developed countries prompted a near collapse of ongoing negotiations.

David C. Beckwith

▪ 2000

Introduction
Area:
9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population
(1999 est.): 273,131,000
Capital:
Washington, D.C.
Head of state and government:
President Bill Clinton

      The 20th century had become widely known as the American Century, and the United States ended it by implanting an exclamation point on that concept. Even while its national government was effectively mired in gridlock—perhaps because of it—the U.S. economy in 1999 roared ahead in a ninth consecutive year of vibrant expansion, its most enduring ever. U.S. leadership was recognized worldwide. Seldom in history had a country so dominated the globe in so many ways—militarily, culturally, economically, scientifically. Commentator Alan Murray, writing in The Wall Street Journal, encapsulated the country's enviable position by saying, “The U.S. enters the 21st century in a position of unrivaled dominance that surpasses anything it experienced in the 20th. Coming out of World War II, the U.S. may have controlled a larger share of world output; but, it also faced threats to its security and its ideology. Today, those threats are gone, and the nation far outstrips its nearest rivals in economic and military power and cultural influence. America's free-market ideology is now the world's ideology; and the nation's Internet and biotechnology businesses are pioneering the technologies of tomorrow.”

      The national economic prosperity, however, masked an internal disquiet and raised difficult, perhaps unanswerable questions about the country's direction. Some concerned the U.S. responsibility as unquestioned world leader to act as a global policeman and confront human rights abuses abroad. Other questions addressed perceived inequity and deterioration in American society. The gap between rich and poor continued to widen, and evidence of breakdown in the traditional American family mounted as well. A series of mass shooting incidents across the country, highlighted by a major tragedy in Littleton, Colo., shocked a nation that still cherished its frontier heritage. At century's end the United States was still seeking internal harmony to accompany its economic might.

The Economy.
      The purring U.S. economy seemed to defy gravity during the year. Economic expansion continued at an average 3.5% rate for the eighth consecutive year. Such old-line measures as housing starts and vehicle production recorded unprecedented results; unemployment drifted down to 4.1%, its lowest level since 1970; and consumer confidence again hit a record high. These figures, which historically would have aroused inflationary fears, were now accompanied by low interest rates and a slow 2% growth in the consumer price index, a combination that economists said was unprecedented.

      To a great extent the economic boom in the U.S. was powered by its unquestioned premier position in technology, which allowed major gains in productivity and made near-instant millionaires out of thousands of entrepreneurs and investors. Some 570 new companies—half of them Internet-related enterprises—sold stock in initial public offerings during the year, raising a record $69.2 billion in new capital. Technology also powered the stock market; while the Standard & Poor's index of 500 stocks rose 20% during 1999, the tech-dominated Nasdaq index climbed 85%, and some Internet mutual funds rose 300% or more. To some critics the soaring values confirmed the market's faith in future technology profits. Others, however, viewed the soaring equity prices as a mania, the triumph of greed over common sense, a speculative bubble that would eventually have to burst.

      Unlike past years, the U.S. economic leadership enjoyed favourable global tail winds, with most major economies in Europe and Asia also posting positive growth. Fine-tuning of the U.S. economy was again supervised by the U.S. Federal Reserve System, which nudged federal fund interest rates upward three times during the year—from 4.75% to 5%, to 5.25%, and to 5.5%. These minihikes precisely reversed three identical stimulative cuts in 1998 and this time served to quench inflationary fears inflamed by the country's torrid economic performance. The irrepressible economy virtually ignored several major natural disasters. These included a record five category-four hurricanes along the east coast, including Floyd, which inflicted $6 billion in damage in North Carolina, where 30,000 homes were inundated and 42 people were killed.

      The economy also shrugged off two major challenges: the Year 2000 computer problem and a major antitrust ruling against Microsoft Corp., the country's largest company in terms of market value. Any adversity for Microsoft was serious, since the Redmond, Wash.-based firm, along with chip maker Intel Corp., had largely created the standards by which the U.S. dominated world technology. In findings of fact issued by a U.S. district judge in Washington, D.C., on November 5, after a 13-month trial, Microsoft was determined to have abused its near monopoly on personal computer operating systems, the software that runs computers. The judge also found that Microsoft had misused its dominant position to try for a similar advantage in marketing “browser” software used to explore the Internet. Remedies, which could include the breakup of the company, were to be decided in 2000. In the financial markets, however, the ruling was a nonevent. Prices of technology issues, including Microsoft, rose by more than 20% in the six weeks following the ruling.

      Concern over Y2K problems was intense in early 1999, with some experts predicting disasters ranging from a breakdown of the international banking system to a near-complete shutdown of world power supplies. The worry abated during the year, however, as companies and governments spent billions correcting computer programming and reassuring consumers about their efforts. Still, the year ended with some apprehension. Major airlines canceled one-third of their December 31 schedule, and numerous commercial parties celebrating the new millennium (see Mathematics and Physical Sciences: Sidebar (New Millennium-Just When Is It Anyway? ).) were canceled as potential patrons decided to stay safely at home.

Domestic Affairs.
      Antipathy between the Democratic president and the Republican Congress led to a virtual legislative stalemate during the year. The list of major measures either defeated or deferred was far longer than the number of significant legislative accomplishments. “This was a session that was postimpeachment and preelection,” observed U.S. Sen. Joseph Lieberman, a Democrat from Connecticut. Both sides entered 1999 in a weak position; Clinton faced an impeachment trial, and the GOP control over the U.S. House had been reduced to only five seats in late 1998 elections. As 1999 ended in near gridlock, both Clinton and the GOP Congress were, if anything, even weaker; Clinton battled oncoming lame-duck status and declining support in the polls, and many commentators predicted that Democrats would regain control of the U.S. House of Representatives in 2000 elections.

      A long-delayed reform of the nation's banking laws was signed into law, largely breaking down barriers to entry between the banking, financial, and insurance industries. Congress also gave flexibility to states in using federal education dollars and, following years of contentious debate, committed to development of a ballistic missile defense system for U.S. territory and armed forces.

      For the fourth consecutive year, Senate Republicans killed an overhaul of the nation's campaign finance laws. A bill approved by the House trimmed back “soft money” contributions to major political parties but was judged by GOP senators as restricting free-speech rights of their supporters, including corporations. Congress also turned down major legislative initiatives to restrict sales of handguns and to reform the nation's bankruptcy laws.

      President Clinton vetoed a 10-year, $792 billion tax-cut measure approved by Congress, calling the measure inequitable and excessive. The Senate rejected a Comprehensive Test Ban Treaty submitted by the administration; Republican-led opponents maintained that the treaty would hinder U.S. defense efforts without providing any real benefits. Owing to preelection maneuvering on both sides, no serious attempt was made to address badly needed reform of both Social Security retirement and Medicare systems, which were financially endangered by an imminent influx of baby-boomer recipients. Election-year considerations also delayed deliberation of two other popular ideas—a proposal to add a prescription drug benefit to the Medicare program for seniors and various bills regulating health maintenance organizations (HMOs), including the establishment of a “bill of rights” for health-plan patients. In both cases Democrats advocating the measures decided that debate on the proposals would be more useful during election year 2000.

      Two unexplained airplane crashes received overwhelming news coverage. On July 16 a small plane piloted by John F. Kennedy, Jr. (see Obituaries (Kennedy, John Fitzgerald, Jr. )), which had taken off from New Jersey, crashed on approach to Martha's Vineyard, Mass., killing Kennedy, his wife, Carolyn, and her sister. On October 31 EgyptAir Flight 990 fell into the sea shortly after takeoff from New York's Kennedy Airport en route to Cairo, killing all 217 aboard. U.S. investigators found no evidence of explosion or mechanical failure aboard the Boeing 767-300 and initially pointed to a suicide attempt by a relief co-pilot as the possible cause. When angry Egyptians blamed anti-Arab bias for this theory, however, U.S. officials backed away. The mystery remained unsolved at year's end.

      In a tragedy that dramatically affected the national mood, two heavily armed students terrorized Columbine High School in Littleton, Colo., on April 20, killing 12 fellow students and a teacher before turning their weapons on themselves. Although Eric Harris, 18, and Dylan Klebold, 17, were apparently reacting to personal social rejection, their actions highlighted the long-running U.S. struggle to reconcile constitutional protection of gun ownership with the realities of modern gun violence.

      Mass shootings also erupted across the country during the year—at a high school in Conyers, Ga., at two Atlanta, Ga., brokerage firms, on city and suburban streets in Indiana and Illinois, at a Jewish community centre in Los Angeles, at a Baptist church in Fort Worth, Texas, and at a Xerox warehouse in Honolulu. In several cases hatred of minorities appeared to fuel the attacks. The incidents renewed the national debate over gun control, reversed a trend toward liberalized gun-possession laws, and prompted searching examination, even as the U.S. enjoyed unprecedented prosperity, of the country's future direction.

      Even so, preliminary FBI figures indicated that the incidence of serious crime in the U.S. dropped by 10% during 1999, the seventh consecutive year of declining crime rates. Analysts attributed the trend to a healthy economy, tougher laws, longer sentences, and added prison capacity.

Clinton and Politics.
      As the year began, the second presidential impeachment trial in U.S. history opened with pomp and ceremony in the U.S. Senate chamber. Conviction of the president by the required two-thirds vote was never a serious possibility in the partisan, closely divided upper chamber, particularly with public opinion polls showing nearly two-thirds of Americans opposed to removing Clinton from office on the two impeachment counts submitted by the U.S. House of Representatives. At one point the Senate came close to postponing the proceedings indefinitely by majority vote. A compromise approved largely along party lines, however, allowed the trial to proceed but permitted new testimony from only three witnesses.

      Most of the five-week trial was rhetorical, with 13 impeachment managers from the House summarizing previously recorded evidence against the president, followed by rebuttal from Clinton's personal and White House attorneys. The final vote was not close, with only 45 of 100 senators supporting conviction on Article One, the perjury count, and 50 voting guilty on Article Two, obstruction of justice. Following acquittal, senators of both parties nonetheless condemned Clinton's conduct, but a Democrat-led effort to issue a resolution of censure against the president was blocked by Sen. Phil Gramm, a Republican from Texas. (See Sidebar (Prosecuting the President ).)

      Although the intensity diminished, Clinton's image troubles continued during the year. In February the NBC television network broadcast a detailed interview with an Arkansas woman, Juanita Broaddrick, who alleged that in 1978 Clinton had sexually assaulted her in a Little Rock, Ark., motel room. On April 12 U.S. Judge Susan Webber Wright held Clinton in contempt of court for having provided “intentionally false” testimony in the Paula Jones sexual harassment case. Clinton's sworn statement denying “sexual relations” with a White House intern, Monica Lewinsky, had helped persuade Wright to dismiss the Jones case, but Clinton later admitted to “inappropriate intimate contact” with Lewinsky. Clinton, who had paid Jones $850,000 to settle the case in 1998, was ordered to pay an additional sum of nearly $89,000 in legal expenses as compensation for the false testimony.

      Clinton's reputation hung heavily over early maneuvering for the 2000 U.S. presidential election. Texas Gov. George W. Bush, son of former president George Bush, sprinted to a commanding early lead for the Republican nomination, in part by pledging to restore dignity to the Oval Office. Bush raised nearly $70 million in contributions during the year, double the previous record, and announced he would forgo matching federal funds in order to increase his flexibility in campaign spending. After several hopefuls dropped out, five other GOP candidates remained in the race at year's end, notably maverick Sen. John McCain of Arizona, a former prisoner of war in Vietnam.

      Former senator Bill Bradley of New Jersey, a onetime professional basketball player, mounted an unexpectedly serious challenge to Vice Pres. Al Gore for the Democratic nomination. Gore was endorsed by Clinton and enjoyed backing from many party regulars, but Bradley's campaign was lifted by “Clinton fatigue,” a feeling that the Clinton-Gore administration had worn out its welcome.

Foreign Affairs.
      With no real challenger for world economic leadership, the U.S. nonetheless struggled to find its proper role in post-Cold War political affairs. At home, there was no clear consensus on when the United States should use its power and influence to intervene in conflicts abroad. Beset by domestic politics, the U.S. also suffered setbacks in its efforts to build international consensus.

      The West's long-running confrontation with Yugoslav Pres. Slobodan Milosevic (see Biographies (Milosevic, Slobodan )) over ethnic persecution in his country boiled over into major violence. Milosevic repeatedly stalled efforts to enforce United Nations resolutions seeking the exit of Serbian forces from the province of Kosovo, where Serbs dominated a population that was 90% ethnic Albanian. A stream of Albanian refugees into neighbouring areas turned into a flood when local Serbs, apparently with military backing, stepped up a campaign of terror, property destruction, and killings early in the year. Up to one million Albanians were displaced.

      On March 24 U.S.-led NATO forces began devastating bombing and missile attacks on Yugoslav positions. The strikes continued for 78 days and finally caused Milosevic to agree to the withdrawal of Serbian forces, the safe return of Albanian refugees, and the introduction of armed UN peacekeepers to ensure an end to the violence. In the process, however, NATO forces made numerous mistakes, bombing broadcast facilities, civilian residential areas, and bridges. In one notable miscue, a U.S. B-2 stealth bomber destroyed the Chinese embassy in Belgrade, Yugos., killing three Chinese civilians and wounding 20 others. It was later revealed that the CIA had obtained the correct street address of the intended Serbian-controlled target but had mapped it to the wrong building, the embassy. President Clinton expressed “regrets and profound condolences,” but the incident further strained the U.S.'s already rocky relations with China. In mid-December the U.S. promised to pay China $28 million in compensation for the May bombing.

      In the decade since the fall of the Berlin Wall and the subsequent breakup of the Soviet Union, the U.S. had made special efforts to influence Russia, in large part to encourage the dismantlement of thousands of Russian nuclear warheads. Relations deteriorated markedly during 1999, however, as suspicions grew that individuals in Russia had embezzled and squandered billions of dollars in foreign assistance. In October three Russian immigrants and their companies were indicted on charges stemming from an investigation of money laundering at the Bank of New York. At year-end, over U.S. objections, Russian military forces launched another attempt to subdue the breakaway republic of Chechnya, which further strained U.S.-Russian relations.

      China, seen by many as the eventual challenger to U.S. world domination, continued to provide major headaches for U.S. policy makers. (See World Affairs: China: Special Report (China: Asia's Emerging Superpower ).) Despite U.S. entreaties, Chinese leaders provided no substantive satisfaction on charges that they had supervised the theft of U.S. nuclear lab secrets, improperly financed the 1996 U.S. election, threatened Taiwan militarily, violated human rights by suppressing political dissent and imposing population control measures, and unfairly barred U.S. businesses from operating in China. Chinese Premier Zhu Rongji visited Washington in April, but President Clinton declined to sign a trade pact with China because he feared a political backlash in the U.S. The Chinese embassy bombing a month later caused relations to deteriorate further and prompted virulent anti-American demonstrations across China. By November, however, Clinton had agreed to a wide-ranging trade agreement that promised Chinese membership in the World Trade Organization (WTO) and increased access for American business in Chinese markets. The agreement required congressional approval in 2000 of “normal trade relations,” or permanent most-favoured-nation status, however, and prospects for passage appeared anything but assured.

      With China and Russia reestablishing a close relationship after 30 years of estrangement—motivated in part by mutual antipathy toward the U.S.—world leaders looked to a November WTO meeting in Seattle to provide evidence of comity in the world community. Organizers had hoped for international ratification of a major agreement reducing trade barriers worldwide, a document that trade officials had prepared over a period of years. Instead, sometimes violent protests by masked demonstrators rocked the city, at times confining delegates to their hotels. President Clinton had long championed the trade-agreement process despite opposition from his supporters in labour unions and environmental groups; in Seattle, however, he abruptly voiced sympathy with some of the protesters' concerns, and the meeting ended without agreement. His stance was widely derided as caving in to domestic political pressure and was denounced by numerous world leaders.

David C. Beckwith

▪ 1999

Introduction
      Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries

      Population (1998 est.): 270,262,000

      Capital: Washington, D.C.

      Head of state and government: President Bill Clinton

      In 1998 the United States experienced the best of times and the worst of times. On one level the national economy moved steadily forward through its eighth consecutive year of vigorous expansion, accompanied by remarkably low and declining inflation, interest rates, and unemployment. On an individual basis it was a great time to be an American, with the economy producing record real personal income, hundreds of thousands of new jobs, and lofty financial market prices for a prosperous and satisfied public. On another level, however, the national body politic was in turmoil. Years of investigations into various charges against Pres. Bill Clinton (see BIOGRAPHIES (Clinton, Bill )) coalesced during 1998 into a focused probe of his efforts to evade a sexual harassment lawsuit, which led to his impeachment at year's end by a partisan and divided U.S. House of Representatives. The disconnection between the sunny economic conditions and the stormy wrangling in the capital split the country into two camps: a larger one happy with its lot under Clinton and bored by Washington's seeming obsession with scandal, and a smaller one outraged by Clinton's conduct and determined to see him removed.

      The fragmented national mood confounded public opinion pollsters and helped produce an inconclusive national midterm election in November. With his personal popularity incongruously bolstered by the assaults on his character, President Clinton won almost every important battle with the Republican Congress when final tax and spending measures were enacted in October. The setbacks seemed to demoralize Republicans and energize the president's core supporters. Although most commentators predicted that Republicans would add to their majorities among governors and in the U.S. House and Senate, the election produced no change in the Senate and a reduction in the slim Republican advantage in the House. That result in turn prompted another major surprise: Clinton's chief Republican nemesis, controversial Speaker of the House Newt Gingrich, resigned his post and thereby effectively became the first major victim of the Clinton scandal.

The Economy.
      Although Clinton's problems captured more headlines, the most significant news of 1998 was the enduring strength of the American economy, which loomed like a colossus over a troubled world. Two million new American jobs were created, many of them high-paying positions in technology, pharmaceuticals, finance, and health care. The national economy shook off a spate of bad external news and grew at a 3.5% rate for a third consecutive year, well above the pace at which economists begin to fear overheating. Yet inflation remained well below 2%, even as unemployment sank to a 29-year low of 4.5%, real incomes rose at near-historic rates, housing construction boomed, and consumer confidence reached a record high.

      If the U.S. had merely been leading worldwide economic growth, its business performance would have been impressive enough. More remarkably, the muscular American economic engine surged forward even as trouble enveloped much of the world, refusing to slow significantly as other major economies faltered and stalled. When the year started, Japan was mired in a long-term economic malaise, and other once-vibrant Asian economies, especially those of Thailand, South Korea, Indonesia, and Malaysia, were still reeling from the 1997 currency crisis. By midyear the small but emerging Russian economy had begun to unravel as the ruble lost value, and rumours of trouble in Brazil and Argentina had reached Wall Street.

      In July the U.S. stock market seemed to lose heart under the accumulated weight of world adversity and began a long, steady price drop, erasing in two short months a 20% gain posted early in the year. The Dow Jones industrial average fell from 9200 to 7400 by September, and a major hedge fund, Long-Term Capital Management, heavily exposed to Russian securities, was threatened with bankruptcy. Once again, as it had through much of the record American peacetime expansion, the U.S. Federal Reserve System rode to the rescue. Under Chairman Alan Greenspan's supervision, a consortium of private lenders poured liquidity into the fund, effectively taking it over. Greenspan also orchestrated three small but rapid reductions in short-term interest rates over a seven-week period from September to November. The cuts in the federal funds rate—from 5.5% to 5.25% to 5% to 4.75%—helped Wall Street recover its footing and reverse the downturn. By year's end the Dow had made its most spectacular comeback in history, eliminating the entire summer decline and again threatening record territory, even as the country's political leadership seemed to be disintegrating.

      The hearty American economic performance produced a political side benefit: elimination of the federal government's chronic budget deficit. The red ink had hit a high of $290 billion in 1992, and administration economists had projected permanent $200 billion deficits as recently as 1996. The relentless surge of the national economy, however, reduced social expenditures (especially as the 1996 national welfare-reform law took full effect) and increased tax revenues far more rapidly than economists had predicted. When President Clinton and Congress agreed on a balanced-budget deal in August 1997, the 1998 deficit was estimated at $90 billion. By January 1998 the projected deficit had shrunk to $22 billion, but when the fiscal year ended in October, authorities announced a surplus of $70 billion and forecast future black ink as far as the eye could see.

Domestic Affairs.
      In Washington, D.C., the vibrant economy did little to soften the increasingly bitter partisanship that infected the nation's capital. Lawmakers were distracted during the year by the Clinton inquiry and worried about upcoming elections. For his part, the president was unable to provide effective and consistent leadership throughout the year. As a result, the year was more notable for legislation defeated than for initiatives approved. Congress did approve a massive $216 billion highway and transit reauthorization that provided many new public works projects in every congressional district over the next six years. Also, when polls showed overwhelming public support, Clinton signed a Republican-backed reform bill ordering the Internal Revenue Service to become more responsive to taxpayer concerns.

      Most other major legislative initiatives died in partisan crossfire. Going down to defeat were plans to fund a massive national missile defense effort and to cap punitive damages in product-liability cases (both blocked by Democrats) and efforts to raise the minimum wage, to overturn the president's partial-birth-abortion veto, to reform bankruptcy laws, and to expand patients' rights in their dealings with health care providers and employers (all opposed by Republicans).

      An effort to reform the nation's easily evaded campaign finance laws also died on the U.S. Senate floor. In August the House approved a measure limiting both "soft money" and "independent expenditures," two major loopholes in campaign laws. Unlimited soft money flowed from corporations, labour unions, and other interested groups directly to major political party coffers; independent expenditures allowed groups to spend without limit as long as they purported to advocate issue positions rather than individual candidates. Republicans feared the plan would not curb Democratic-oriented donors such as labour unions and environmental activists as much as it would inhibit GOP-leaning contributors such as businesses. Consequently, a month later a minority of 48 Republican senators killed the Senate version by filibuster, blocking a final vote until the measure was abandoned.

      An attempt to fashion a comprehensive national settlement with tobacco companies over the costs of smoking-related health problems met a similar fate. A bill implementing a $368 billion settlement proposed in 1997 was shepherded easily through the Senate Commerce Committee by Chairman John McCain. It would have raised federal cigarette taxes by $1.10 per pack, restricted tobacco advertising, ordered Food and Drug Administration regulation of tobacco, and established fines if the incidence of teenage smoking failed to drop. Antismoking senators then raised the industry cost to $516 billion and dropped company protections against future litigation. At that point the four major tobacco firms ceased negotiations and started a $40 million advertising campaign attacking the Senate bill as a large tax increase to fund new government spending programs. Even though a majority of senators continued to favour the bill, they could not break another filibuster, and the measure died.

      The incidence of crime, particularly violent offenses, dropped in the U.S. for the sixth consecutive year, but several high-profile acts nonetheless raised fears about trends in American society. In July a deranged gunman attempting to enter the U.S. Capitol building in Washington killed two Capitol police officers; the offender was critically wounded and was later found mentally incompetent to stand trial. Young students with firearms were involved in two highly publicized tragedies. In Jonesboro, Ark., two boys, 12 and 13, killed 4 students and a teacher and wounded 10 others in a shooting spree at their junior high school in March. Two months later a 15-year-old Springfield, Ore., boy shot and killed his parents and then took their guns to his high-school cafeteria, where he shot 24 students, 2 of them fatally. Three white men, two wearing tattoos identifying them as members of a white racist prison gang, were charged with murder after an African-American man was dragged to death behind a pickup truck near Jasper, Texas, in June. In Laramie, Wyo., an openly gay University of Wyoming student was kidnapped from a bar, tied to a fence in a rural area, and beaten to death by two men. The incidents prompted renewed calls for new laws to combat so-called hate crimes and illegal possession of firearms.

Investigating the President.
      Throughout his long political career, President Clinton had benefited from his extraordinary communication skills, a talent that enabled him to demonstrate empathy with his audience and to turn close arguments decisively in his favour. This skill often infuriated his rivals, who complained that his verbal adroitness hid a lack of character and appreciation for the truth. For years Clinton sailed serenely over those criticisms. In 1998, however, through a series of wildly improbable and unexpected events, he became enmeshed in a credibility crisis that culminated at year's end with his becoming only the second president in history to be impeached by the U.S. House of Representatives.

      Clinton was compelled to give a sworn deposition as defendant in a sexual harassment civil rights lawsuit filed by Paula Jones, a former Arkansas state employee. Jones, who alleged that then governor Clinton had improperly propositioned her in a Little Rock hotel room in 1991, sought evidence of other extramarital adventures by Clinton. At the January 17 deposition, Clinton generally denied all but one such affair (he belatedly acknowledged a previously denied liaison with Gennifer Flowers), and he specifically rejected suggestions that he had dallied with a former White House intern named Monica Lewinsky. An Arkansas federal judge, relying in part on Clinton's denials, subsequently dismissed Jones's case.

      Unknown to Clinton, however, Linda Tripp, a former White House employee, had secretly recorded some 20 hours of conversations with her friend Lewinsky in which the young woman detailed an intimate relationship with the president. Tripp took her evidence to Kenneth Starr (see BIOGRAPHIES (Starr, Kenneth W. )), the court-appointed independent counsel who had been investigating Clinton and associates for three years. Starr received court approval to expand his investigation to the Lewinsky matter. (See Sidebar (Limits of Power of the Independent Counsel ).)

      Through the spring, as Starr's grand juries gathered evidence on Lewinsky, the administration attempted to block testimony of various Clinton aides and security officers by asserting privilege claims, but most of the claims were rejected by the courts. After months of delay Lewinsky hired new lawyers and eventually began cooperating with Starr. On August 17 Clinton testified to the Starr grand jury by video link, admitting for the first time an "improper relationship" with Lewinsky but invoking precise word definitions to deny that he had lied or committed perjury in his previous sworn statements. Far from being damaged, Clinton saw his high public-approval ratings actually rise following these concessions.

      In the fall a series of unexpected and virtually unprecedented events rocked Washington. Starr sent the U.S. House two truckloads of documents with a message alleging that Clinton may have committed at least five impeachable offenses in his handling of the Jones suit and the subsequent investigation. Most commentators predicted the charges would aid Republicans in the November elections, but the GOP actually lost five House seats. House Speaker Gingrich, a severe Clinton critic, then announced his resignation. At that point pundits declared any impeachment inquiry dead, killed in effect by the will of the voters. Instead, however, the House Judiciary Committee under Chairman Henry Hyde conducted hearings and voted along strict party lines to recommend impeachment of the president on four counts.

      In December the matter moved to the full Republican-controlled House, with Democrats continuing to complain that Clinton was being charged with personal offenses in his private life that had nothing to do with public conduct of his office. Only a few minutes before the actual voting, Gingrich's designated successor as speaker of the House, Louisiana Rep. Robert Livingston, announced that he would resign from the House after acknowledging he had engaged in extramarital affairs. The House then impeached Clinton on two counts, perjury and obstruction of justice, again largely along partisan lines, with only a half dozen representatives from each party straying from the party-line vote.

Foreign Affairs.
      The cease-fire agreement ending the 1991 Persian Gulf War specified that international sanctions against Iraq should remain in place until United Nations inspectors could verify that Saddam Hussein's missile, biological, chemical, and nuclear weapons programs had been completely dismantled. As the year began, the U.S. and Great Britain sent a major military task force to the Persian Gulf to force compliance with inspection demands. In February UN Secretary-General Kofi Annan flew to Baghdad and hammered out an 11th-hour agreement calling for "unconditional and unrestricted" access for inspections. That pact began to fray almost immediately, as Iraq demanded a firm date for the conclusion of weapons monitoring. In August Saddam Hussein suspended cooperation with inspectors, which precipitated yet another slowly evolving crisis. Meanwhile, the top American inspector, Scott Ritter, resigned his post, accusing the Clinton administration of deliberately canceling aggressive inspections to avoid provoking Hussein. On November 13, following another allied military buildup in the region, President Clinton ordered U.S. forces to attack Iraq. After B-52 bombers were airborne, however, Clinton announced that Iraq had "backed down" and had promised full cooperation with inspectors.

      That agreement lasted little more than a month. UN inspectors were turned away from political and military sites in inspection attempts the Iraqis called deliberately provocative. In mid-December, just one day before the House was to vote on his impeachment, Clinton again issued orders for joint U.S.-British air strikes on Iraq. The resulting 70-hour bombardment produced uncertain damage to Iraqi installations but an apparently decisive political result; at year's end Hussein declared he would no longer allow UN inspectors to operate within his country.

      Within minutes of each other on August 7, U.S. embassies in Nairobi, Kenya, and Dar es Salaam, Tanz., were hit by terrorist bombs that killed 262 people, including 12 American citizens. Within days, authorities in a dozen countries developed information linking the attacks to Al-Qaeda, a militant Islamist network run by Saudi-born millionaire Osama bin Laden. (See BIOGRAPHIES (Bin Laden, Osama ).) On August 20 the U.S. military fired 75 Tomahawk cruise missiles at a bin Laden military training compound in eastern Afghanistan and at a pharmaceuticals factory in Khartoum, Sudan, that U.S. authorities claimed manufactured the "precursors" of chemical weapons.

      The U.S. policy of stopping nuclear weapons proliferation suffered several setbacks during the year. Catching U.S. intelligence largely unaware, India conducted a series of underground nuclear tests on May 11 and 13. President Clinton announced economic sanctions against India and dispatched a high-level delegation to dissuade rival Pakistan from duplicating the feat. Pakistan, however, performed six weapons tests of its own on May 28 and 30, again prompting U.S.-led world economic sanctions. Critics charged the U.S. with hypocrisy for partnering with China, another declared nuclear power. Both India and Pakistan later declared moratoriums on future tests, and by early November Clinton had canceled the short-lived sanctions.

      President Clinton, who had criticized his predecessor George Bush for preoccupation with foreign policy, spent an unprecedented 86 days traveling abroad during the year. He was periodically accused of taking action on a variety of international issues as a means of distracting attention from his personal legal problems. Even so, Clinton achieved a number of unqualified foreign policy successes during 1998. He was universally credited with a vital role in brokering a historic agreement to end 80 years of sectarian strife in Northern Ireland. He achieved a similar apparent breakthrough in the Middle East peace process in late October, when Israeli Prime Minister Benjamin Netanyahu and Palestinian leader Yasir Arafat signed an interim agreement for Israeli withdrawal from part of the occupied West Bank. Clinton brought the two sides together and laid plans for what turned out to be an intense nine-day series of closed meetings at the Wye River Conference Center on Maryland's Eastern Shore. Although skeptics questioned the Wye Memorandum's durability, it appeared to provide a basis for historic cooperation between adversaries.

      As the Asian currency crisis appeared to bottom out during the year, the U.S. Congress reluctantly endorsed International Monetary Fund efforts to shore up foundering world economies. Republican congressmen faulted the IMF's secrecy and claimed that the organization's stringent lending requirements helped compound the troubles faced by some countries. The U.S.'s regular $3.5 billion dues were eventually authorized, however, and another $17.9 billion special contribution was approved as part of the year-end budget deal in October.

DAVID C. BECKWITH

▪ 1998

Introduction
      Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries

      Population (1997 est.): 267,839,000

      Capital: Washington, D.C.

      Head of state and government: President Bill Clinton

      In 1997 the United States experienced a truly vintage year: a time of peace, prosperity, relative harmony, and rising prospects—favourable indicators that had not been seen for at least 25 years. On the world stage the U.S. stood unchallenged as the globe's sole superpower, and at home a business expansion already some seven years old continued. The U.S. was also at the centre of a global reorganization of production—the so-called new economy of computers and the Internet. As financial storms battered other parts of the world, U.S. stock markets were at an all-time high, and unemployment was at a 25-year low and shrinking. Inflation, the bane of fiscal conservatives during any economic surge, was virtually nonexistent, even though wages, for years stagnant as the economy endured painful restructuring, were finally on the rise. Unlike 25 years earlier, no great social or political conflicts shook the nation. Crime, the blight of the urban U.S., was on a sustained decline, and welfare rolls were shrinking dramatically.

Domestic Affairs.
      In Washington, D.C., Pres. Bill Clinton showed himself to be less a master bridge builder than a shrewd fence straddler. In the wake of his resounding 1996 election victory, Clinton, the first Democrat to have won reelection since Franklin D. Roosevelt, continued to follow his "triangulation" strategy—placing himself to the right of most Democrats and to the left of most Republicans. His popularity ratings stayed consistently above 50% through much of the year, despite the seemingly interminable scandals and investigations that had become a hallmark of his presidency. Even with a Republican-dominated Congress, Clinton achieved a goal that had eluded presidents since 1969—an extraordinary bipartisan agreement to balance the federal budget by the year 2002. In the process he presided over the largest U.S. tax cut since 1981, including reductions in capital gains taxes (the maximum rate would drop from 28% to 20%) and estate taxes (the basic $600,000 exemption would double over time). In all, the tax reductions were estimated to be worth $96 billion over five years and $282 billion over a decade. In addition, Clinton doled out billions in additional subsidies for middle-class college education and health insurance for children.

      The main parts of the deal included a $58 billion reduction in nonmilitary spending, about $12 billion more than Clinton had originally proposed. More than $115 billion was also anticipated in savings from Medicare programs. Despite the austerity, the agreement provided $34 billion for important presidential priorities, including health insurance for up to 10 million children not covered by private or public plans. It also allowed for restoration of welfare benefits to legal immigrants who had been dropped during the budgetary wars of 1996, expansion of student loan programs, and new funding for early childhood assistance through Head Start programs. The $135 billion in tax cuts was offset somewhat by $50 billion raised through higher taxes on airline tickets and the closing of alleged tax loopholes. Both the spending and the tax portions of the budget passed the two houses of Congress by wide margins.

      The sudden breakthrough in fiscal probity was attributed to economic growth, which changed government projections for social outlays and tax inflows and reduced the estimated budget deficit in 1997 to a comparatively paltry $22.6 billion. The agreement on such a sweeping deal between Clinton and Congress was a tribute to the president's political skills as well as a sign that the nation had retreated from a confrontational mood and expected politicians to do the same.

      The tangible decentralization of power showed itself in a multitude of ways, but one of the most obvious was welfare reform. Since 1996, when Congress passed the welfare-reform law, state and local governments had used their power to change dramatically their systems of social protection. Revised work and eligibility rules for welfare had cut rolls in Wisconsin by 55% since the start of the decade. Oregon, Indiana, West Virginia, Rhode Island, and Connecticut all experienced decreases of 40% or more. Throughout the Midwest and most of the old South, welfare rolls fell anywhere from 20% to 40%. Only California registered an increase.

      Americans endorsed mayors who followed federal and state trends toward spinning off government services to private contractors, balancing budgets, and reshaping old-fashioned labour-management relations while dealing briskly with crime. As a result, such urban areas as Philadelphia and Cleveland, Ohio, cities that had been fiscal sinkholes in the 1980s, were reporting substantial surpluses, better services for residents, and renewed optimism.

      In Los Angeles low-key Republican Richard Riordan soundly defeated Democratic state Sen. Tom Hayden to win reelection to a second term as mayor in a city where Democrats outnumbered Republicans by two to one. In New York City Republican Rudolph Giuliani coasted to a similar victory in an even more stalwart Democratic stronghold.

The Economy.
      The reinvigoration, however, would not have been possible without the phenomenal performance of the economy, which entered the year growing at nearly a 4% rate, with unemployment hovering around 5.3%, and the Dow Jones industrial average heading toward 7000. Debate grew over whether the pace could be sustained without a revival in inflation, which had hit a meagre 2.4% in 1996. As growth surged at 5.9% in the first quarter of the year, Federal Reserve Board Chairman Alan Greenspan fired a warning shot by raising the federal funds rate by 25 basis points, to 5.5%, the first interest-rate rise in two years. During the first half of the year, the economy continued to boom at a 4.1% rate—roughly double the pace at which economists generally feared a reignition of inflation. Yet Greenspan took no further action.

      The most striking economic phenomenon of 1997 was an enormous surge in jobs that did not bring about a corresponding rise in prices, even as real U.S. wages began to climb. By November the unemployment rate had fallen to 4.7%, the lowest since 1973. Meanwhile, over the 12-month period ended in November, Americans' incomes rose 4.1%, a real gain of 2% when adjusted for inflation—the highest rate recorded since the mid-1970s.

      The combined effect on the U.S. stock market of high growth, low unemployment, rising wages, and low inflation was galvanic. The Dow Jones industrial average broke through 8000 in July, and economists predicted that it might reach 10000 or even 12000 without a significant retrenchment. As a major financial crash in Southeast Asia cast clouds on the horizon, American optimism continued undiminished—until a minicrash came on October 27 that knocked 554 points off the Dow Jones in a single day. Yet 24 hours later the bull market regained momentum as stocks climbed 337 points in a single session, the biggest rise in a decade.

      Organized labour, however, showed its resentment at Clinton's perceived bias in favour of conservative economic policies and corporate globalism. During his first term Clinton had strongly supported passage of the North American Free Trade Agreement between the U.S., Canada, and Mexico. During the deal making, he had traded away renewal of the administration's "fast-track" authority to negotiate trade agreements that Congress could approve or reject but not amend. Without such authority the president's hand in reaching trade agreements with other nations was weakened. In November, however, the White House was forced to announce that it would not seek renewal of the fast-track authorization, chiefly because of opposition from Democrats, heavily supported by organized labour, who opposed free trade because they believed it resulted in job losses for Americans. Clinton vowed to seek the authorization again in early 1998, but the setback was a blow to his international prestige.

Ethics in Government.
      The president and his administration continued to be troubled by a number of scandals. The most personal accusation was the charge of sexual harassment made by Paula Corbin Jones, who had been an Arkansas state employee when Clinton was governor. The White House argued before the Supreme Court that a president in office should be allowed to postpone until the end of his term civil suits derived from past actions. The court, however, did not agree, and by the end of the year, the country was facing the prospect of the president's being forced to give testimony in court.

      First lady Hillary Rodham Clinton also sought, and failed, to create a Supreme Court precedent in the Whitewater affair. Her attorneys argued unsuccessfully that notes taken by White House lawyers during conversations with her were privileged under lawyer-client confidentiality and could be withheld from Kenneth Starr, the independent counsel investigating the case. The court gave Starr access to the documents, but they did not lead to any startling changes in the three-year, $30 million probe of various real-estate deals conducted while Clinton was in Arkansas.

      All paled, however, before the outcry that arose, both in Republican circles and in the press, over the financing of the 1996 election campaign. At no time in U.S. history had more money been spent on electoral politics—$2.2 billion at all levels. A substantial amount of Democratic campaign funds, it appeared, had come from questionable sources, especially from businessmen with Asian backgrounds and often, it seemed, with interests in China. Revelations about Democratic Party fund-raising, which had trickled out even during the campaign, caused the Democratic National Committee (DNC) eventually to return $2.8 million in donations. The accusations became even more serious as various members of the U.S. national security establishment questioned the appropriateness of visits by some of the donors to the White House.

      Much was made of the activities of Charles Yah Lin Trie, a Taiwanese-born entrepreneur who ran a Little Rock, Ark., restaurant and who eventually became a top Democratic fund-raiser; he had visited the White House 23 times. Clinton admitted that it was "clearly inappropriate" for Trie, who had helped raise a substantial amount of money for the Democrats and for the Clintons' legal defense, to have escorted a known Chinese weapons dealer through the White House.

      Another figure in the fund-raising effort was California businessman Johnny Chung, who had donated a total of $366,000 to the DNC, all of which was later returned because the source of the money could not be verified. Among other indiscretions, Chung had managed to pass on a $50,000 check to the DNC through Hillary Clinton's then chief of staff, Margaret Williams. Two days later he escorted a number of Chinese business associates to a taping of Clinton's weekly radio address. The donation raised the issue of possible impropriety on the part of Williams for having accepted a campaign contribution on government property.

      The Republican-led furor over these and other revelations took on a shriller tone after it was discovered that Clinton and Vice Pres. Al Gore had made a number of fund-raising calls from their executive offices. The actions raised the spectre of a possible violation of the Pendleton Act, which forbids federal employees to solicit contributions on federal property. Although both denied wrongdoing, Gore said that he made calls on only "a few occasions," and the president claimed little recollection.

      Eventually, the campaign fund-raising issue came before a Senate investigating committee, chaired by Fred Thompson of Tennessee, who charged that the alleged scandal involved a plot on the part of China's government to influence U.S. politics. His committee issued 52 subpoenas, and fund-raiser Trie, for one, fled to China rather than testify. The hearings aired secret communications intercepts that indicated that Chinese officials in Beijing and Washington at least discussed how to increase their government's influence with U.S. local, state, and federal officials. In addition, the committee heard testimony that the Republican National Committee (RNC) had also received questionable support from abroad, dating back to 1994. The major donor was Hong Kong businessman Ambrous Tung Young, who had introduced Haley Barbour, then chairman of the RNC, to top Chinese officials. The RNC ultimately returned a $100,000 Young donation.

      Serious strains developed between Attorney General Janet Reno and FBI Director Louis Freeh over the fund-raising controversy. The dispute involved different interpretations of the 1978 Independent Counsel Act. Freeh believed that the act could be read broadly to ensure an impartial investigation; he urged Reno to turn the entire fund-raising matter over to an independent prosecutor because she, as a Cabinet official, faced a conflict of interest in investigating her own boss. Reno, however, took a narrower view of the legal grounds for appointing a special prosecutor. She insisted, with the backing of departmental attorneys, that only clear evidence of wrongdoing could trigger an independent investigation. Reno was shaken, however, when soon after she had made one of her clearest assertions of the lack of need for outside investigation, the White House began releasing videotapes of Clinton's meetings with various campaign donors, including controversial figures. Although none revealed anything illicit, Reno had not been informed of the existence of the tapes. In the end she remained firm—an independent counsel would not be appointed.

      The entire fund-raising issue clearly established that U.S. campaign-financing laws were a quagmire, with bewildering distinctions between "hard" and "soft" campaign donations. As Clinton declared, reform of some kind was highly desirable, and several proposals were aired in Congress.

Foreign Affairs.
      In a globalized economy, with greater numbers of Asians immigrating to the U.S. and more business being done with China, relations with that country showed how difficult it had become to distinguish between foreign and domestic affairs. Greater commercial dealing with Asia's authoritarian regimes also raised larger questions of how to impress upon them the need for increased observance of human rights. All of these issues came to a head in late October and early November when Chinese Pres. Jiang Zemin made his first trip to the U.S., the first by a Chinese head of state in 12 years. His visit, coming only months after the return of Hong Kong to Chinese sovereignty, raised the issues of democracy and trade to a special level of sensitivity. In more than two hours of conversations at the White House, and again in public, Clinton took unusual pains to stress to Jiang that on the issue of democracy China's leadership was "on the wrong side of history." Jiang seemed unfazed by the admonition. On a more practical level, China pledged to cut off nuclear aid to Iran in exchange for future sales of American nuclear reactors to China.

      Late in the year the Clinton administration orchestrated a series of multibillion-dollar bailouts to shore up the short-circuited economies of Thailand, Malaysia, Indonesia, and South Korea, among others, which were caught up in a dominoes-style financial collapse. The International Monetary Fund was called in to provide what could prove to be upwards of $100 billion in interim financing, and the U.S. was embarrassed as a recalcitrant Congress refused to approve $3.5 billion in IMF contributions.

      In Europe U.S. foreign policy was on surer ground. In July the U.S.-led NATO alliance invited three countries—Poland, Hungary, and the Czech Republic—to join; all would become members in 1999. The enlargement of NATO had been preceded by a lively debate within the administration over its advisability and had encountered vociferous Russian opposition. Nonetheless, the move proceeded as planned, with the alliance promising that it would deploy no combat troops or nuclear weapons in its new regions.

      Although most of the world's nations gathered in Ottawa in December to sign a treaty banning the use of antipersonnel land mines, the United States was not among the signatories. Clinton explained that treaty negotiators would not allow an interim exemption for the U.S., which had requested the continued use of antipersonnel mines to protect antitank defenses of vital importance in the Korean peninsula, where 40,000 U.S. troops and their South Korean allies were vastly outnumbered by the forces of North Korea.

      Clinton's most difficult foreign-policy challenge involved an old nemesis—Iraqi dictator Saddam Hussein, who had repeatedly shown an uncanny ability to win political advantage while still enduring the military and economic straitjacket imposed by a U.S.-led alliance after the Persian Gulf War. When teams of UN weapons inspectors apparently closed in on secret stocks of biological and chemical weapons, Hussein declared that American inspectors would no longer be allowed on the UN team hunting for Iraqi weapons of mass destruction. He eventually forced UN personnel to leave the country. The UN Security Council unanimously condemned Iraq but initially refused to follow the American lead of imposing further sanctions on Baghdad; it later imposed additional sanctions because of Hussein's continued unwillingness to cooperate and his threats to U.S. reconnaissance aircraft. When Hussein threatened to shoot down American U-2 spy planes overflying sensitive Iraqi areas, Clinton ordered three carrier groups to operate within striking range and massed aircraft in Saudi Arabia and Turkey. Hussein shrewdly backed down and invited UN inspectors back into the country but refused to grant them entry to some 47 rebuilt presidential compounds. Although the U.S. sought to balance threats of continued economic embargo against incentives for further Iraqi cooperation, the consensus was that U.S. dependence on coalition building had perhaps resulted in a shift of political momentum toward its most dangerous regional adversary.

GEORGE RUSSELL
      See also Dependent States.

▪ 1997

Introduction
      The United States of America is a federal republic composed of 50 states. Area: 9,362,753 sq km (3,614,979 sq mi), including 203,679 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1996 est.): 265,455,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 11, 1996) a free rate of U.S. $1.58 to £1 sterling. President in 1996, Bill Clinton.

      In 1996 Bill Clinton (see BIOGRAPHIES: Clinton, Bill) showed that he was a master at gauging shifts in national mood, and indeed at helping to create them, as he maneuvered in Washington, D.C., and campaigned across the country to become the first two-term U.S. president from the Democratic Party since Franklin D. Roosevelt 60 years earlier. Clinton's victory over his Republican opponent, former senator Bob Dole (see BIOGRAPHIES: Dole, Robert Joseph), was all the more remarkable in that voters, in the lowest turnout since 1924, also reelected a Republican-majority Congress for the first time since 1930. Never before had a Democrat won the nation's highest office with the Congress controlled by his opponents. Once again, however, the people had opted for the U.S. equivalent of minority government. (See Special Report: The U.S. Presidential Election.)

      Nonetheless, Clinton could claim a clear victory. He won 49.2% of the popular vote, compared with 40.8% for his Republican rival; the remainder went to maverick populist Ross Perot, who ran as the Reform Party candidate. According to exit polls, Clinton was particularly favoured by women, who endorsed him 54% to 38%; by African-Americans, who voted for him 83% to 12%; and by the elderly, who voted Democratic 50% to 43%. The Republican majority, by contrast, was shaved marginally in the House of Representatives and expanded slightly in the Senate. More than half of the Republican casualties came from among the representatives who had first been elected in 1994.

      Clinton won his victory by moving with agility to the right, a talent he had demonstrated throughout his national political career but never against such odds as those he faced in 1996. In the process he managed to emerge once again in the public eye as a moderate. To many he seemed more moderate than Dole and his fellow Republicans, especially the aggressive speaker of the House of Representatives, Newt Gingrich, whom Clinton brilliantly demonized in the presidential campaign as an avatar of mean-minded radical conservatism, threatening the poor, the elderly, and the middle class with cuts in federally mandated entitlements. The net result was that Clinton, who began the year almost passively, with the government paralyzed through a budget deadlock, emerged as a mediating chief executive who could urge his defeated adversaries to join him in seeking a "common ground" during his upcoming term.

      Clinton, moreover, achieved this feat despite a continuing rain of scandals great and small upon his administration. They covered everything from the continuing investigation into the decade-old Whitewater land deal to more sinister questions about the abuse of confidential FBI files on political opponents and the improper raising of campaign funds from non-U.S. sources. As the year closed, the U.S. Supreme Court was prepared to hear arguments on whether the president should be allowed, on account of his office, to postpone a civil suit leveled against him by Paula Corbin Jones, a former Arkansas state employee who alleged that Clinton had made sexual advances to her while he was governor. It was one sign of the administration's political skills that, although none of the scandals had gone away by the end of 1996 and some might return to hurt the president in his second term, none proved fatal to Clinton's reelection.

The Economy.
      The fact was that, however many questions were raised about the president's character or that of his administration, other, more fundamental factors weighed heavily in favour of his reelection. The nation was at peace, and, above all, it was prosperous. The monetary manipulations of the Federal Reserve System (Fed) chairman, Alan Greenspan, and his Open Market Committee ensured that economic growth continued. The Fed cut short-term rates just before the new year began, with the aim of keeping growth in the range of 2.5% for 1996. Any fears of flat growth or recession were thus dispelled, and the president signaled his approval for this course by renominating Greenspan, a Republican, for his third four-year term as Fed chairman and naming two other economic moderates to the seven-member board.

      The steady growth put further downward pressure on the U.S. jobless rate, which was only 5.6% when the year began. By the time the year ended, it was 5.3%, not much changed but nonetheless at the lowest level since the 1970s. Inflation, too, was contained, staying at roughly 2.5%. Blue-collar workers registered a real, if marginal, rise in income, as wage increases averaged 2.8%, and white-collar workers saw a 3.1% increase in pay. Overall economic productivity rose at a 1.2% rate, while productivity in manufacturing rose 3.2%. Thus, the nation's economic progress was steady, if not muscular. One of the more negative signals, however, was the steady rise in personal bankruptcies, which reached more than one million during the year. There was also continued volatility in sectoral employment as large-scale corporate downsizing continued.

      The most dynamic sector of the economy was the high-tech, particularly the computer-oriented, firms that continued to drive the stock market to new heights. In the first half of 1996, initial public offerings continued to be one of the fastest avenues of growth for new companies, which went public at a rate of 70 or more per month. In the process many suddenly became worth 200 or 300 times their previous value, creating a steady procession of new millionaires. The same frothy optimism continued to affect more traditional stocks, as the Dow Jones industrial average continued its steady rise past 6,000. Among other things, the rise reflected a steady flow of money into equities from members of the baby-boomer generation, who were skeptical of the value of Social Security and were replacing it with contributions to such vehicles as 401(k) accounts. In midyear, however, there was a sudden correction in stock prices, and the high-tech over-the-counter market, in particular, swooned. Nonetheless, by year's end the market had recovered, albeit selectively.

Developments in Government.
      If the November elections underlined anything, it was that the American people were eager to pull back from extremes that might erode their sense of stability, however transitory that might prove to be. The nation had been badly shocked in 1995 by signs that the social and political consensus was fraying in ways not seen since the Vietnam War. In Washington the tension was symbolized by the trench warfare between the White House and Congress over the 1996 budget, which had left the government essentially inoperative. Some 280,000 government workers were furloughed, and another half million were working but not being paid. At issue were the differing ways in which the two sides proposed to close the budget deficit over seven years, chiefly in terms of taxes and in slowing the growth of such huge entitlement programs as Medicare and Medicaid. The Republicans wanted to cut $270 billion from Medicare growth, for example, while the president wanted to pare only $124 billion. Clinton had also rejected Republican efforts to give the military more than the $256 billion he had originally proposed.

      The standoff, which had begun in mid-December 1995, continued for 18 days before the Republican wall began to crack. It was Clinton's soon-to-be presidential rival, Senator Dole, who first urged his party to begin providing funds on a continuing basis so that the government could get back to work. He was then joined by Speaker Gingrich, who broke with more radical members of his party to do so. Both men realized that the American people, while sympathetic to the goal of cutting the size and scope of the government, were profoundly uneasy at its paralysis. After 21 days the funding cutoff ended on January 6, with both sides having submitted their proposals for seven-year reductions in spending. The squabbling over the actual 1996 budget continued until late April, however, with 13 separate temporary spending bills required for keeping the government functioning while the horse trading went on.

      In general, the outcome of the exhausting battle confirmed the thinking that had propelled the congressional Republicans to power in 1994. In the final budget more than 200 federal programs were abolished, mostly in the Labor and the Health and Human Services departments. Funding for the Corporation for Public Broadcasting, a longtime target of conservative ire, was slashed, though the corporation survived. So did such Clinton programs as the subsidized national service for youth, funding to put 100,000 extra police on the streets, and extra money to improve the quality of education, viewed by conservatives as a federal prop to a pillar of the Democratic Party, the National Education Association.

      The president was quick to turn the situation to political advantage and to articulate the theme that was to dominate the electoral politics of the year. "The era of big government is over," he told Congress and the nation in his annual state of the union address. He added, however, that "we cannot go back to the time when our citizens were left to fend for themselves." Clearly positioning himself as a moderate, he called for such achievements as bipartisan welfare reform, an increase in the minimum wage, and portability of health insurance so that workers would not lose coverage if they changed jobs. He also asked for a line-item veto of the kind already wielded by 43 of 50 state governors and endorsed by the Republican congressional majority.

      The limited nature of Clinton's 1996 goals stood in sharp contrast to the grandiose first-term proposals he had outlined for health care reform, which died ignominiously in 1994. The fate of the new proposals was also different. In May Congress endorsed the first hike in the federally mandated minimum wage in five years, from $4.25 to $4.75 an hour, with another rise to $5.15 a year later. Some 3.7 million Americans were affected by the measure, most of them women. The change was fiercely opposed by small business lobbies, but in the end Republicans split over the issue.

      At virtually the same time, Clinton won approval of the line-item veto, which allowed the president to strike a limited number of items from an appropriation bill rather than veto the entire document. The veto was highly limited, however. It applied to tax concessions only if they affected no more than 100 taxpayers and specifically could not be used on entitlement programs like Medicare and Social Security. Nor could it be used to block major tax reductions, and it could be overturned by a two-thirds congressional majority. Nonetheless, the veto was decried by Sen. Mark Hatfield, chairman of the Senate Appropriations Committee and one of only three Senate Republicans to vote against it, as "the greatest effort to shift the balance of power to the White House since Franklin Roosevelt attempted to pack the Supreme Court." Opponents immediately promised a constitutional challenge.

      If the president was able to win incremental victories that gave solace to the liberal constituencies within his party, he also made moves that set him apart from them. None was more symbolic, or fraught with more sweeping potential to affect American society, than his decision to sign the welfare reform act passed by Congress, the first comprehensive overhaul of the system in over 60 years. Momentum for some sort of change was clearly unstoppable. In polls the American people had frequently showed their unhappiness with welfare, particularly the $16 billion program known as Aid to Families with Dependent Children (AFDC). Clinton had already declared his willingness to accept a two-year limit on recipients in the program, but liberal members of his party had long argued that a welfare cutoff was meaningless, and perhaps dangerous, unless it was matched with expensive job-creation measures, probably in the public sector. The Republican Congress would have none of that. In the long wrangle over the bill, the White House was able to add a number of palliatives to the notion of a welfare cutoff: child nutrition programs, extra aid for recession-hit states, and money for child care and foster care. The overall direction of reform, however, was to take the federal government out of the social welfare business where possible and to hand its administration over to the states.

      Under the provisions of the measure, states were to receive block grants for all welfare expenditures, set in relation to 1996 levels, with added money to take account of recessions or unusual population growth. The act abolished the AFDC program entirely and gave the states until July 1, 1997, to come up with plans that required welfare recipients to go to work within two years, while setting a total limit on welfare assistance of five years per family. After six years states that failed to put welfare families in work of some kind would lose their federal funds, although 20% of a state's caseload could be exempted. The law contained a number of clauses aimed at reinforcing the work ethic. Administrators could cut payments to teenage mothers who did not finish high school, for example, or who did not live with an adult (a response to frequent criticisms that the AFDC program encouraged broken families and illegitimacy). State legislatures would need to provide a waiver to add payments for children born while their mothers were on welfare. On the other side, the measure set aside $400 million in bonuses for states that reduced or contained rates of illegitimate birth, including $250 million for education in abstinence as a form of birth control. The bill also barred legal immigrants who had not applied for citizenship from receiving food stamps and other forms of assistance. The law recognized that many states had long been trying to find more workable formulas, and it gave 44 states a year to wind down various experiments already under way.

      Some questioned whether this welfare reform was actually an answer to the problem or merely a means of shuffling the issue onto lower levels of government. Most experts agreed that without substantial levels of job training and placement, the two-year limit on federal funding might merely shift an immense burden onto state budgets. Many child-care advocates warned that the reforms would strike hardest at the children of those on welfare, perhaps adding millions to the rolls of a permanent underclass. Of course, the full impact of the welfare changes was not likely to be felt for several years, a point that was often made by its opponents, some of whom were closely aligned with the president's wife, Hillary Rodham Clinton. That, however, did not deter the president from signing the measure.

      Clinton also took a variety of conservative postures on other social and so-called family-values issues, especially those related to crime and drugs. He appointed a four-star army general, Barry R. McCaffrey, previously commander of the Pentagon's Southern Command in Panama, as the nation's drug czar. He raised the possibility of a mandatory drug test for teenagers seeking to obtain a driver's license. The president caused a fierce storm of protest among homosexuals when he announced his support for legislation that would ban the provision of federal benefits to the partners in a same-sex marriage. When the Defense of Marriage Act passed, Clinton signed it.

      The issue of same-sex partnerships proved a heated one across the country in an election year. The immediate reason for the furor was a series of court decisions in Hawaii, culminating in the state's Supreme Court, that ruled the prohibition of same-gender unions to be in violation of the state constitution's equal protection clause. The decisions led to conservative warnings that the ruling would usher in homosexual marriages across the nation as states were forced to recognize their legality under the "full faith and credit" provisions of the U.S. Constitution. In fact, the likelihood of such legitimacy was small, for 15 states had laws explicitly banning such marriages, and others were considering similar bans.

Law Enforcement.
      While a looming election raised temperatures on some divisive social issues, the country clearly was in no mood to countenance a radicalism that threatened social war. The 1995 bombing of the Alfred P. Murrah Federal Building in Oklahoma City, Okla., which killed 169 people, had savagely underlined the horrors of extremism, and the nation clearly wanted no part of it. The two men charged with the crime, allegedly fringe members of a heavily armed antigovernment militia, awaited trial in 1996. There were no similar bombings during the year, but in July the federal Bureau of Alcohol, Tobacco and Firearms arrested 10 men and 2 women—members of a little-known Phoenix, Ariz., splinter group called the Viper Militia—who seemingly had like plans. The authorities confiscated two machine guns, six rifles, hundreds of rounds of ammunition, and hundreds of kilograms of chemicals similar to those used in the Oklahoma City bombing. They also impounded videotape of sundry Vipers giving guided tours of nearby federal buildings, with detailed instructions on how to blow them up.

      Federal authorities pulled off an even bigger coup when they staged a raid on a remote Montana mountain cabin and announced that they had arrested Theodore J. Kaczynski, thought to be the anonymous bomber who had eluded them for 18 years. Intermittently since 1978, the so-called Unabomber had mailed handmade explosive devices to a number of academics and business executives, killing 3 people and injuring 23. In the wake of the Oklahoma City bombing, he sent a bomb to the president of the California Forestry Association and threatened to blow up an aircraft leaving the Los Angeles airport unless the New York Times and Washington Post published his manifesto against industrialized society. The publication proved Kaczynski's undoing when his brother recognized the rhetoric and notified authorities. Kaczynski had no link to any organized causes.

      The arrest won back some lustre for federal law-enforcement agencies, which had suffered a great loss of prestige as a result of their handling of the 1993 siege near Waco, Texas, of the headquarters of the Branch Davidian sect, in which 82 members had died, and for the bungled 1992 arrest of a white separatist in Idaho, in which his wife and 14-year-old son had been killed. The FBI used markedly different tactics in 1996 against a group of self-described libertarian Freemen holed up on a ranch outside Jordan, Mont. The Freemen faced federal charges of writing millions of dollars' worth of bad checks and money orders and of threatening to kidnap and kill a federal judge involved in foreclosure proceedings on the ranch. Mindful of the innocent women and children in the beleaguered camp, the FBI simply outwaited the defenders until they surrendered.

      The FBI's prestige was once again tarnished, however, this time in the midst of the year's most festive occasion, the Centennial Olympic Games in Atlanta, Ga. The Games had just finished their seventh day when, early in the morning, a homemade pipe bomb exploded in Centennial Olympic Park, killing one person and wounding 111. It was the first violence to occur at the Olympics since the massacre that had taken place in Munich, Ger., in 1972, and it happened despite unprecedented security. The bomb was contained in a knapsack left against a television broadcast tower in the park, a central meeting place. About 18 minutes before the explosion, an anonymous caller had phoned in a warning, and security personnel were trying to clear the area when the bomb went off. Official suspicion soon focused on Richard Jewell, an Olympics security guard, who was detained, interrogated, and investigated for months before being told that he was no longer a suspect. Jewell sued not only the authorities but also news media who publicized suspicions of his guilt. No other suspect was named in the bombing, despite a $500,000 FBI reward.

      The Olympics bombing came on the heels of a much greater disaster. On July 17 a TWA flight from New York City to Paris suddenly exploded over the Atlantic Ocean near Long Island, with 230 passengers and crew aboard. All perished. A massive underwater search across 620 sq km (240 sq mi) of ocean eventually recovered most of the bodies and about 95% of the Boeing 747 aircraft. Authorities worked to determine whether a bomb or a mechanical problem had caused the calamity aboard Flight 800. By the end of the year, the investigation was far from over, but some authorities were venturing that the cause was a buildup of explosive vapour in a fuel tank.

Foreign Affairs.
      Terrorism nonetheless remained a potent concern for Americans. A month before the TWA disaster, a small group of men wheeled a large tanker truck up against a chain-link fence in front of an apartment building in Dhahran, Saudi Arabia, and then fled before an enormous explosion tore the face off the building. The edifice housed U.S. Air Force personnel involved in interdicting flights in southern Iraq in the wake of the 1990-91 Persian Gulf War. A total of 19 airmen were killed and 50 hospitalized by the blast. The explosion was believed to be the work of Saudi Islamic militants.

      The Saudi attack was no doubt on President Clinton's mind two months later when he declared terrorism to be "the enemy of our generation" while signing a new law ordering sanctions against any nation investing in Iran and Libya, both considered terrorist states by the U.S. In fact, Clinton's action did nothing to lessen terrorist dangers, while it infuriated some of the closest U.S. allies. The law specifically penalized foreign firms that made investments in oil in the two countries, which were major petroleum suppliers to Europe. Clinton declared that the lesson for U.S. allies was "You cannot do business with countries that practice commerce with you by day while funding or protecting the terrorists who kill you and your innocent civilians by night." The allies said that this was posturing and an attempt to limit their sovereignty, and they filed a protest at The Hague.

      In fact, when it came to actual outrages perpetrated by tyrants, the administration's policy seemed singularly feckless. In a test of U.S. will, Iraq's Saddam Hussein sent some 40,000 armoured troops north from Baghdad on an incursion into ethnic Kurdish territories specifically declared a "no-go" zone by the victors in the Gulf War. Hussein effectively installed a puppet regime beholden to himself, wiped out bases where the CIA had launched covert actions against his government, and then withdrew. In retaliation, Clinton ordered two strikes, totaling 44 cruise missiles, against replaceable Iraqi air defenses far to the south and expanded the no-go zone in that region. The symbolic action did nothing to restore the status quo.

      Clinton had irked allies earlier in the year with his posturing toward another old enemy, Fidel Castro. The U.S. was shocked when Cuba shot down two small, unarmed civilian planes as they flew over Cuban territorial waters from airfields in Miami, Fla. The aircraft were flown by members of the so-called Brothers to the Rescue, who had earlier goaded Castro by dropping anticommunist leaflets on Havana. In the wake of the shoot-down, Clinton threw his support behind the so-called Helms-Burton law, which allowed Cuban Americans whose businesses had been taken over during the 1959 revolution to file suit against foreign companies that bought or leased the assets from the Castro government. The law also mandated that the U.S. government deny a visa to any foreigner with a stake in such property. Clinton waived the more onerous sections of the law, but businesspeople from Canada and other countries were warned that they could face such sanctions. Their irate governments created countervailing sanctions in case the law was applied, and they filed suit against the U.S. before the World Trade Organization.

      In a further bow to conservative sentiment that irked many U.S. allies, not to mention many in the Third World, the Clinton administration cast a veto against the reelection of UN Secretary-General Boutros Boutros-Ghali. The U.S. was vexed at his secretive style, slowness to implement financial reforms, and ill-advised efforts to make the UN into a peacemaker in areas such as Bosnia and Herzegovina where peace might not be had without force. Boutros-Ghali's successor, Kofi Annan of Ghana, was applauded in the U.S. as a more open and reform-minded choice, but the move was resented, particularly by France.

      Such actions discomfited friends of the U.S., but in general the country's foreign policy during 1996 was aimed at avoiding political harm. Clinton endured criticism for his administration's continued support for the government of Russian Pres. Boris Yeltsin, but the policy seemed justified after Yeltsin won reelection over the resurgent Communist candidate, Gennady Zyuganov. (See BIOGRAPHIES (Zyuganov, Gennady Andreyevich ).) Yeltsin's health, however, continued to make the Clinton policy an open issue after the Russian president later underwent quintuple bypass surgery. Clinton's 1995 gamble to send U.S. troops to Bosnia in the aftermath of the Dayton Accords that ended the slaughter in former Yugoslavia likewise paid off as peaceful elections were carried out. The results followed predictable ethnic lines, and virtually no action was taken before world courts against the authors of acknowledged genocide. Growing public protests against the Serbian president, Slobodan Milosevic, whose irredentist ambitions were a prime cause of the Bosnian catastrophe, further seemed to vindicate the Clinton approach. The major loss to the U.S. in the Balkans during the year was the death of Commerce Secretary Ron Brown (see OBITUARIES (Brown, Ronald Harmon )), who died in an airplane crash near Dubrovnik, Croatia, as he led a group of business executives exploring the possibilities of economic reconstruction in the shattered area.

      The Middle East peace process, which Clinton had proudly midwifed, suffered a severe setback with the election in Israel of the conservative Likud government of Benjamin Netanyahu. The West Bank became embroiled in the worst Israeli-Palestinian violence in years. Nonetheless, by the end of the year, an uneasy peace had returned, and it seemed that progress was being made. Late in the year, Clinton also shuffled his foreign policy team, among other changes replacing Warren Christopher with the first woman to serve as secretary of state, former UN ambassador Madeleine Albright, and naming Bill Richardson as chief delegate to the UN.

      The area where U.S. foreign policy grew most convoluted was Asia, and once again election considerations lay behind it. The U.S. launched no major initiatives across the Pacific, where an immense industrial boom was under way. The administration, however, had not come to a clear view of how to deal with this rising economic power, much of it the result of investments by U.S. businesses, or with an increasingly assertive China. In 1996 China replaced Japan as the largest single source of the U.S. trade deficit, and the U.S. frequently locked horns with China over that country's alleged violation of copyright laws, software piracy, and other economic issues. Despite allegations that the Chinese had sold magnets to Pakistan that could be used in developing nuclear weapons and the charge that U.S. businesses lost more than $2 billion annually to factories that illicitly copied software, films, and other intellectual property, the administration backed the extension of most-favoured-nation trading status for China.

      If Asian wealth was complicating foreign policy, it was also making a mockery of U.S. election law. As the election drew near, attention focused on the activities of John Huang, an Asian-American with ties to a wealthy Indonesian family that had business connections with China. Huang had raised more than $4 million for the Democratic Party during 1996. Possessed of a top security clearance, he had collected, among other things, an illegal $250,000 from a South Korean firm and $450,000 from an Indonesian couple. Another Asian-American fund-raiser and Clinton acquaintance, Taiwan-born Charles Yah Lin Trie, was revealed to have once taken a major Chinese arms dealer to the White House. Trie had also raised funds for the Clintons' steep legal bills in the Whitewater affair, some in the form of cash and checks in plain brown envelopes. Much of the money was returned, and there was no evidence of favours having been granted in return for the funds. Nonetheless, at year's end the Department of Justice had issued subpoenas to the White House for records on as many as 20 Democratic Party fund-raisers. (GEORGE RUSSELL)

      See also Dependent States .

▪ 1996

Introduction
      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1995 est.): 263,057,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 6, 1995) a free rate of U.S. $1.58 to £1 sterling. President in 1995, Bill Clinton.

      By all rights 1995 should have marked a political nadir for U.S. Pres. Bill Clinton. As a result of the 1994 congressional elections, he had become chief executive in what amounted, in U.S. terms, to a minority government. Control of the legislative agenda shifted to Congress, dominated, for the first time in 40 years, by Republicans, and especially to the combative speaker of the House of Representatives, Newt Gingrich. (See BIOGRAPHIES (Gingrich, Newt ).) A massive rollback of welfare legislation and federal dominance was set in motion as the Republicans moved to fulfill the conservative "Contract with America" within their first 100 days in office. (See Special Report .) The president seemed reduced to the role of a bystander. Defections from the Democratic Party continued apace; in all, five Democrats switched parties after the elections. Nonetheless, by the end of the year, the president, while giving considerable ground, had managed to achieve more of a stalemate with Congress than many had believed possible.

      In November the president's veto of the Republican budget led to a standoff that idled 800,000 employees and shut down so-called nonessential functions of the federal government for six days. Treasury Secretary Robert Rubin nimbly raided selected federal pension funds in the interim to forestall default on the government's obligations, while the two sides reached accommodation on issues such as the Republican-demanded target of balancing the budget in seven years.

      The president and the Congress remained far apart on the specifics of how to achieve that aim, however, with Republicans looking for more than $1 trillion in spending cuts, largely from social welfare programs, along with $245 billion in tax relief, spearheaded by a $500-per-child tax credit. Along with the tax issue, one of the central disagreements was over controlling Medicare and Medicaid costs. The Republicans wanted to save $270 billion over seven years by cutting back increases in Medicare spending from 10% to 7% annually. Clinton deemed that unacceptable and proposed savings of $124 billion. On Medicaid, Congress was determined to make cutbacks in spending, convert the remainder into block grants to the states, and allow each state to set eligibility requirements. The president insisted on keeping Medicaid as an entitlement. When agreement was not reached by mid-December, those parts of the government not yet funded were again forced to shut down while the president and congressional leaders attempted to work out a compromise. This time some 280,000 government employees were furloughed, and thousands who did government work on a contract basis also were not paid. In spite of a series of meetings between Clinton and top congressional leaders, no solution to the impasse had been reached by the time the year ended. Bipartisan attempts by Senate leaders to reach a compromise failed to gain backing from hard-line Republicans in the House of Representatives.

American Disaffection.
      While the budget dominated headlines, the forces swirling in the American political cauldron in 1995 were more dramatically epitomized in an event far from Washington, D.C. The country was stunned on April 19 when a rented truck parked outside the Alfred P. Murrah Federal Building in Oklahoma City, Okla., erupted shortly after 9 AM, tearing the front off the nine-story structure and leaving 168 people dead, including 19 children. In addition, a nurse was killed during rescue efforts. The truck had contained homemade explosives, a mixture of diesel fuel and ammonium nitrate. The man who was allegedly responsible for the bomb was a former member of the U.S. Army and a veteran of the Persian Gulf War, Timothy J. McVeigh, a fringe member of a heavily armed American subculture of "militia" that espoused antigovernment views. His alleged coconspirator was Terry Lynn Nichols, a farmer from Herington, Kan. Both men were charged with offenses that carried the death penalty.

      The Oklahoma bombing drew attention to a radical degree of disaffection with the government in general and a number of federal agencies in particular. In its most extreme form, the disaffected militia movement claimed about 100,000 members who expressed hostility to the federal government, believed in foreign conspiracies to erode the sovereignty or even the territory of the nation, and often stored food and arms and practiced military training in anticipation of either invasion or some form of federal police state. All such groups disclaimed anything to do with the Oklahoma City bombing.

      Like McVeigh, however, almost all militia members were virulently opposed to gun-control laws, like the 1994 federal assault weapons ban, and many saw the antigun actions of the FBI and the Bureau of Alcohol, Tobacco and Firearms as being, in the words of a National Rifle Association official, the work of "jack-booted government thugs" intent on tearing down what they saw as the Second Amendment's guarantee of the right to bear arms. In particular, they saw the 1993 siege in Waco, Texas, of the Branch Davidian compound, in which 82 cult members died, as being evidence of a sinister and cold-blooded federal government attitude toward like-minded dissidents. Authorities investigating the Oklahoma tragedy were convinced that the date of the crime—the anniversary of the federal raid at Waco—was no coincidence.

      At a series of congressional hearings, Attorney General Janet Reno justified her endorsement of the assault on Waco, but she did not convince many skeptics. The FBI, however, took a more self-critical view in another case that had aroused a similar furor: the 1992 attempt to arrest a heavily armed Idaho man named Randall Weaver, a believer in white racial separatism, at his mountain cabin. After Weaver's 14-year-old son was killed in the clash, an FBI sharpshooter killed Weaver's wife as she stood behind a door with their 10-month-old daughter in her arms. Three years after the firefight, the agency paid Weaver and his surviving children $3.1 million in a civil settlement. FBI Director Louis Freeh also suspended his close friend and the number two man at the FBI, Larry Potts, while probing Potts's involvement in a change of the rules of engagement at the shoot-out.

      The militias were only the most highly charged manifestation of a deep-rooted anger with the encroachments of the federal government that also showed itself in hostility to those wearing its civil uniforms, from the FBI to the Bureau of Land Management and the Forest Service. The anger led to a sense of siege among many members of the federal bureaucracy. In some parts of the country—notably the West, where feelings ran high against federal control of as much as 80% of the land in certain jurisdictions—some federal officials refused to be seen in their work clothes for fear of attracting sniper fire. Others faced lawsuits and even disobedience from state officials, who claimed that they, rather than federal authorities, should hold ownership of such public property.

      Much like the fringe anti-Vietnam War radicalism of the 1970s, the antigovernment terrorism and civil disobedience of 1995 represented the overheated froth of a much broader and more moderate consensus—that government, particularly the federal government, had taken more than its share of resources and political space and had to be reduced. The consensus, however, was coupled with a continuing sense of disquiet and uncertainty about the future that gave a sharp edge to the national debate in many arenas, including the jostling leading up to the 1996 elections. Anti-Washington sentiment and a desire for leadership outside the traditional mold powered a deep groundswell of support for the idea of a presidential candidacy by Gen. Colin Powell, a black man who had retired as chairman of the Joint Chiefs of Staff. Powell, who declared himself a Republican, eventually declined to run, leaving Senate Majority Leader Robert Dole as the Republican front-runner. The same sentiment also fueled renewed candidacies by Texas billionaire H. Ross Perot, who announced the Independence Party as his political vehicle, and by the nativist conservative Patrick Buchanan, a combative orator with a strong anti-immigrant and anti-free-trade platform. Both dissident candidates reflected an isolationist uncertainty about the U.S. political and economic role in the world that paralleled the domestic uncertainty.

The Economy.
      There was considerable uncertainty on the economic front. In July, for the first time since 1992, the Federal Reserve Board (Fed) announced a cut in short-term interest rates, from 6% to 5.75%. Chairman Alan Greenspan and the Fed's Open Market Committee then made another, year-end rate cut, to 5.5%. The Fed actions signaled that the economy, in Greenspan's view, had achieved the so-called soft landing that he had tried to manage through seven previous interest-rate hikes. Growth for 1995 appeared to be headed for the 2.5% level that Greenspan deemed optimal. The unemployment rate was hovering in the range of 5.4%, and inflation seemed likely to be no more than 2.5% for the year. Flat retail sales and weakness in a number of leading indicators, however, gave some warning of slightly lower growth in early 1996.

      Meanwhile, in the midst of the budget battle, the Dow Jones industrial average rose past 5,000 after having pushed through 4,000 early in the year. Low interest rates, the prospect of reduced government spending, and a welter of high-performing high-tech issues had a lot to do with the performance, as did a continuing wave of mergers and acquisitions. Hikes in stock prices and merger mania went hand in hand with economies of scale, however, and the continuing globalization of the U.S. economy produced pink slips and fear alongside the bullishness. Typical of the paradox was the behaviour of AT&T, a profitable $75 billion megalith, which announced that it would break itself into three separate companies and shed 78,000 jobs.

      In the atmosphere of uncertainty amid fast-changing economic forces, many Americans found it easy to believe that stability was indeed eroding and that their government was not doing enough to stem the advantages wielded by foreign countries that "gained" the jobs lost at home. Mindful of the sentiment, the Clinton administration used the threat of 100% tariffs on luxury-car imports to pressure the Japanese into expanding their North American auto production and buying more U.S.-made parts and also threatened China with $1 billion in tariffs to force the government into policing the rights of U.S. manufacturers of such often-pirated goods as computer software.

      One of Clinton's earlier international economic initiatives came back to haunt him, however. When the Mexican peso collapsed in December 1994, the U.S. had rushed to bail out its partner in the hard-won North American Free Trade Agreement (NAFTA). The administration helped to cobble together a $50 billion international credit arrangement that included $20 billion worth of U.S. guarantees, and Congress grudgingly went along with the fiscal legerdemain. By international standards the bailout was a considerable success in stemming a financial hemorrhage from Mexico and in restoring investment confidence. The country's living standards, currency values, and labour costs swooned, however. Purchases of foreign-made goods, especially from the U.S., collapsed, while exports, boosted by a cheap peso, took off. The result was that after years of enjoying trade surpluses with Mexico, the U.S. suddenly found itself running a deficit, and a number of U.S. companies announced that they would forsake the U.S. for Mexico's cheaper labour. At the same time, the number of Mexicans entering the U.S. illegally in search of work rose sharply.

      One effect of the Mexican crisis was a likely halt to further expansion of NAFTA. A more dramatic effect was the boost that Mexico's plight gave to opponents of immigration to the U.S., both nationally and in states like California that were particularly hard hit by the influx. In the 1994 elections California residents had already given approval to Proposition 187, a measure that would deny schooling and other benefits to the children of illegal immigrants. The proposition was endorsed by Gov. Pete Wilson, but parts of the measure, notably the schooling ban, were declared unconstitutional by a federal judge. Meanwhile, the U.S. Congress also seemed intent on cutting back benefits to legal immigrants as part of its budget tightening. In a bow to the same anti-immigrant sentiments, the Clinton administration announced that it would end the policy of giving Cuban boat people special status as political refugees and would instead return them to their homeland.

Developments in Government.
      President Clinton had long been notorious in his critics' eyes for trimming sails to suit whatever political breezes were blowing, but the new Republican majority in Congress made that tendency a sometimes helpful tool of statecraft. While it caused considerable anguish in left-wing Democratic circles, the president, a native of a region where states' rights were still a shibboleth, found it easier to accept many of the decentralizing initiatives of the Republican legislators. On the other hand, the president also seemed capable of taking advantage of splits in his opponents' ranks. He was able, for example, to head off some cutbacks in the Environmental Protection Agency, long a demon of many Republicans, after a number of more moderate Senate Republicans reconsidered the measure.

      In the midst of the new federal diffidence toward expanding or defending its reach, more initiatives emerged from the states. Some were nothing less than reactionary, like the decision of Alabama to restore prison road gangs and bring back leg irons (though other states concurred with the notion of a tougher prison regimen less solicitous of prisoner comfort). On issues of broader import, however, many states had shown the way in endorsing programs of voucher-driven education and "workfare" for welfare recipients, but many also began to tackle other areas. One of the touchiest and most explosive issues was race-based preference. In California, Governor Wilson signed an executive order that abolished almost all affirmative action policies. (President Clinton ordered a review of federal affirmative action policies but then declared that most should continue.)

      The issue of race, perhaps the most sensitive tissue in the body politic, seemed to be undergoing a different kind of examination on each side of the black-white divide. While whites debated affirmative action, the largest black demonstration in Washington, D.C.'s history—larger than the 1963 march led by Martin Luther King, Jr.—took place under the auspices of the black separatist Louis Farrakhan, head of the Nation of Islam. The "Million Man March" was a powerful demonstration of the concerns of black males about family disintegration and personal responsibility and endorsed personal and spiritual, rather than governmental, solutions to such ills. The demonstration also gave a powerful boost to the standing of Farrakhan, hitherto considered a mesmerizing but marginal racial demagogue.

      Race also played an underlying role in the trial of O.J. Simpson, a black television pitchman and former football star, for the slaying of his white former wife, Nicole Brown Simpson, and her acquaintance Ronald Goldman. Simpson was acquitted after less than four hours of jury deliberation. The trial's turning points were the fiery, racially tinged address of Simpson defense counsel Johnnie Cochran and the discrediting of the Los Angeles police detective Mark Fuhrman, an investigator of the slaying who had, long before the trial, boasted to an interviewer of his racial prejudice and his planting of evidence to convict other alleged criminals. Enthusiasm or dismay at the trial outcome seemed to split largely along racial lines, which reinforced the notion that blacks and whites had entirely different views about the nature of the justice system.

      In looking anew at affirmative action, both federal and state governments were following the lead of the Supreme Court. In 1995 the court agreed that affirmative action programs had to meet tests of strict judicial scrutiny to be constitutional. By a 5-4 vote the justices also struck down a Georgia statute that allowed the gerrymandering of electoral districts to compensate for past racial segregation. In a setback for homosexual activists, the court ruled that the organizers of private parades, such as Boston's St. Patrick's Day celebration, could exclude groups they did not want to participate.

      In a decision that could prove to be one of the more far-reaching of its term, the court set a limit on the federal government's ability to use the interstate commerce clause of the U.S. Constitution to impinge on matters otherwise outside its jurisdiction. The clause, which became a cornerstone of federal activism in the era of Franklin D. Roosevelt, had been used to justify everything from food standards to civil rights investigations. In overturning the federal Gun-Free School Zones Act of 1990, which had used the clause to make possession of firearms near schools a federal crime, the justices ended its infinite elasticity. On the other hand, the court agreed that no limits could be set on reelection to Congress without a constitutional amendment, a blow to the term-limits movement.

      In another development relating to interstate commerce, the Interstate Commerce Commission (ICC), once the most powerful bureaucracy in Washington, closed its doors at the end of the year. As of the first day of 1996, it would be no longer in existence. Established in 1887 to curb the power of the railroad "robber barons," the commission at one time had the power to regulate almost everything that moved across state lines. The deregulation of transportation in the 1980s had deprived the ICC of most of its reason for existing, but it had survived several attempts to close it. The remaining employees and commissioners were transferred to the Department of Transportation.

      The House of Representatives passed a nearly total ban on gifts from lobbyists, following in the wake of a less stringent Senate ban. The measure did little, however, to stem the most questionable source of money for influence, donations to political action committees, and other devices that congressmen used to finance their political survival. House Speaker Gingrich, who had earlier given up a multimillion-dollar book advance from communications mogul Rupert Murdoch, whose vast holdings were much affected by federal oversight, drew a House ethics investigation after questions were raised about his alleged use of GOPAC, a Republican political action committee, to funnel money to Republican causes. Congress proved itself tough on matters of legislators' sexual behaviour. Sen. Robert Packwood, a Republican who headed the Finance Committee, resigned after the Senate Ethics Committee voted for his expulsion. Packwood had been charged with sexual harassment by 19 women, including a 17-year-old.

      On the most high-profile ethics issue, the turgid Whitewater scandal, little new light was shed. Much of the focus of congressional concern had long since shifted from the original property deal, which took place long before the Clintons reached Washington, to the behaviour of administration officials after the July 1993 suicide of Vincent Foster, the deputy White House counsel and overseer of the Clintons' personal finances. Deputy Attorney General Philip Heymann told a Senate investigating committee that his department had been forced to stand by while White House Counsel Bernard Nussbaum entered Foster's office and took files related to the Clinton family's personal affairs. The senators were intrigued by telephone logs that showed long conversations between Hillary Rodham Clinton and two of the intruders immediately after the entry. After initially balking, President Clinton agreed at the end of the year to turn over to Senate investigators notes from meetings on the matter.

Foreign Affairs.
      Nothing a president does is likely to affect the feelings of the American people as much as his decision to send U.S. troops into harm's way. In this, Clinton crossed the Rubicon with his Bosnian policy. The war in the Balkans between Serbs, Croats, and Muslims had been a frustration and a challenge to U.S. diplomacy since its inception. A Vietnam-era protester who had not served in the military, Clinton was sensitive to the difficulty, frequently underlined by his military advisers, in becoming involved in a civil war in a country where American high-tech superiority might count for little and the possibility of casualties was high. The scale of the Balkan atrocities—perhaps 250,000 killed and 3 million displaced in "ethnic cleansing"—and the inability of European allies in NATO to find a solution prompted Clinton to act, however.

      At first Clinton did so rhetorically, urging a relatively safe bombing campaign against the Bosnian Serbs—considered the chief aggressors—as a way of halting the war. This did not suit American allies, who pointed out that the U.S. had no UN peacekeeping troops on the ground to worry about. Eventually, however, when the Bosnian Serbs began overrunning protected "safe areas" and killing or expelling Muslim inhabitants, Clinton acted, with unhappy results. As NATO aircraft bombed Bosnian Serb artillery positions, the Serbs took more than 300 UN peacekeepers hostage and threatened to kill them if the bombing did not stop.

      In August a sudden Croatian military offensive regained territory previously taken by the Serbs. The offensive, it turned out, was the result of a covert U.S. retraining and reorganizing of the army of Croatian Pres. Franjo Tudjman, part of a policy advocated by Assistant Secretary of State Richard Holbrooke, who had emerged as the maestro of Balkan realpolitik. The next important stage was to bring together Tudjman with Serbian Pres. Slobodan Milosevic and Bosnian Pres. Alija Izetbegovic at Wright-Patterson Air Force Base near Dayton, Ohio, for talks in November that ended after three weeks with a fragile treaty. The agreement was to be overseen by a 60,000-member NATO force that would keep the enemies apart along 4-km (2.5-mi) cease-fire zones. In the long run, the U.S. would train the weaker Muslim army to underpin the peace with a credible balance of power.

      The peace accord was a dramatic vindication of the U.S.'s role as the only remaining superpower and a huge political risk for Clinton as he entered an election year. Despite assurances that the troops would depart from Bosnia and Herzegovina within a year and would be able to respond with maximum force if attacked, the likelihood of at least some U.S. casualties seemed high, and no vital U.S. interest appeared to be served. Public opinion polls registered a great deal of opposition, but Clinton received support for his initiative from his likely presidential rival, Senator Dole. Other prominent Republicans attacked him for the risky venture.

      Twenty years after the end of the Vietnam War, Clinton extended diplomatic recognition to Hanoi. The action was greeted with protest by disaffected U.S. military veterans, but it was hailed by American business, which rushed in to make deals long available to European and Asian competitors. Skeptics also growled as the U.S. and North Korea signed a deal under which a U.S.-led consortium would provide two light-water nuclear reactors in exchange for an agreement by the economically battered regime of Kim Jong Il to dismantle its existing nuclear program, widely seen as a prelude to acquiring nuclear weapons.

      Under congressional pressure, Clinton reversed a decade-old policy that had kept Taiwan's head of state, Pres. Lee Teng-hui, from setting foot on U.S. soil, a bow to China's claim to be the sole legitimate government. The administration decided to allow Lee to visit his alma mater, Cornell University, Ithaca, N.Y., to receive an honorary degree. The action led to strong statements from China about subversive American intentions, the punitive awarding of lucrative automotive contracts to non-American firms, and a tougher stance toward selected dissidents. China's continuing desire to gain entry to the world trading community, however, made it unlikely that the U.S. gesture would permanently mar relations with the world's most populous nation. (GEORGE RUSSELL)

      See also Dependent States .

▪ 1995

Introduction
      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1994 est.): 260,967,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 7, 1994) a free rate of U.S. $1.59 to £ 1 sterling. President in 1994, Bill Clinton.

      U.S. Pres. Bill Clinton must have been hard pressed to discern much cause for cheer by the time 1994 wore to a close. Battered by allegations of sexual and financial misconduct—the latter focused as well on first lady Hillary Rodham Clinton—the president also saw the centrepiece of his legislative program, health care reform, die in Congress. Within the White House, a new chief of staff failed to bring much-needed discipline or prevent a steady string of resignations by top aides under attack for alleged improprieties or conflicts of interest. By the end of the year, the president was deemed anathema even by considerable numbers of fellow Democrats, who declined his campaign support during the November elections. Paradoxically enough, the man elected in 1992 to solve the nation's festering domestic problems could take solace as 1994 ended chiefly in a string of foreign policy successes and a hard-won victory in expanding the global free-market system.

      For the first time since 1954, the Democrats lost control of both houses of Congress. (See Sidebar (UNITED STATES: The 1994 Midterm Elections ).) Newt Gingrich of Georgia, who would become the new speaker of the House, was hailed as the chief architect of the Republican triumph. The trend continued among the states, where Republicans had a net gain of 11 governorships, boosting their total to 30 and ousting such powerful figures as Mario Cuomo of New York and Ann Richards of Texas.

The Economy.
      The sentiment that seemed to motivate voters was not, on the surface, inspired by dire economic facts. The economic outlook in 1994 generally appeared to be good. The unemployment rate in December, 5.4%, was at a four-year low, down dramatically from a high of 7.8% two years earlier, and the economy was generating an average of some 275,000 new jobs every month, some 3.5 million for the year. The U.S. share of world manufactured exports, a time-honoured measure of national economic strength, was rising toward 16%, while those of Japan and Germany were in decline. Per capita disposable income was rising steadily, and so were corporate profits. General Motors, for example, the world's biggest industrial company, which had reported a titanic $4.9 billion loss in 1991, was showing a $2.8 billion profit by mid-1994, more than for all of 1993. A new wave of mergers and acquisitions gripped a number of U.S. business sectors, notably the telecommunications and health care industries. Inflation remained under control—the consumer price index rose 2.7% during the year—and price stability seemed more or less assured, at least for the short term.

      There was, however, a steady ratcheting up of interest rates by the Federal Reserve (Fed), from a short-term figure of 3% at the beginning of the year to 5.5% at year-end. Between February and November the Fed raised rates six times, and at one point it hiked its key interest rate twice in little more than a month. The main reason for the Fed's action was the feeling on the part of its chairman, Alan Greenspan, and a majority of the members of the Open Market Committee that the continuing economic expansion might lead to eventual overheating and supply bottlenecks, which would, in turn, refuel inflation. By making money more expensive and thereby slowing the rate of expansion, the Fed aimed to keep the underlying potential for inflation under control. The moves spread turmoil in the financial markets, however, always sensitive to interest-rate hikes, and early in the year the Dow Jones industrial average suffered its biggest single-day drop since 1991.

      The effects were even more parlous in the bond markets, which had become highly dependent on mathematically complicated forms of futures contracts, known as derivatives, that offered substantial gains—and equally severe losses—depending on how successfully investors bet on interest rates and other financial bellwethers. With the change in Fed policy, large numbers of institutional investors—from corporate treasurers to managers of college endowment funds—bet spectacularly wrong. In a move that rocked the municipal bond market, Orange County, in southern California, filed for bankruptcy protection after highly leveraged investments went sour and cost the county $2 billion. Lesser shocks were felt by millions of individual investors who had moved money out of traditional, low-interest forms of insured savings into mutual funds that held derivatives. The effect was to dispel some of the feeling of security and well-being that might have been inspired by the economic performance of goods, services, and jobs.

      As is common in economic recoveries, U.S. productivity and profitability increased in large measure because workers stayed on the job for more overtime hours—more so in 1994 than in previous business cycles. In the third quarter of the year, for example, the factory workweek reached a near-record 42 hours, including almost 5 hours of overtime. Among debt-laden consumers, however, the resulting income gains were offset by hikes in the interest costs for credit card purchases, mortgages, and car payments. Moreover, despite the swelling number of available jobs, many corporations continued to cut payrolls to maintain their competitive advantage. Consumer spending remained strong through most of the year, with an annual rise of 7.6% in 1994, but retail sales unexpectedly slumped in December. Overall the improved economic picture was marred by a continuing, deep-rooted sense among individuals that all was not as well as it should have been or as secure as it had been in the past.

Health Care.
      It was just such a feeling of insecurity that Clinton had addressed during his successful 1992 election campaign and that his proposal for universal health care seemed designed to allay. At first the nation seemed willing to make the changes required for providing health coverage for the 37 million or more Americans said to be uninsured. At the same time, there was a strong feeling that the patchwork U.S. health care system—with its welter of private insurers, employer-sponsored insurance plans, private doctors and hospitals, plus a government subsidy system for the poor and elderly—was far too expensive. Nonetheless, the plan the Clintons had unveiled in September 1993—with Mrs. Clinton as the overseer—ran into a minefield of opposition after it was presented to Congress. Its sheer complexity—the original document weighed in at 1,368 pages and included radical innovations such as national price controls, huge mandatory health care alliances, and government-mandated coverage by employers—brought together a broad array of opposition forces.

      In fact, a number of dramatic changes had been occurring in the health care system. Spurred by the prospect of widespread government intervention, private care providers had begun to rein in spiralling costs. More and more employers had enrolled workers in health maintenance organizations (HMOs)—networks of doctors and hospitals that closely monitored costs and rewarded caregivers for keeping them under control. The HMOs were sometimes bureaucratic and unwieldy, but their rapid expansion through start-ups, mergers, and acquisitions was one of the salient features of economic activity during the year. Further, the more large-scale employers began to get their costs under control, the less enthusiastic they became about endorsing enhanced government control. For example, the Business Roundtable, a group of 200 of the largest U.S. corporations, endorsed a rival congressional scheme that did not place emphasis on controlling prices or on universal coverage. There was also opposition from other groups, including small businesses, insurance companies, and the elderly.

      As the president faced an increasing number of opponents to the proposal, he frequently tried to be conciliatory to all sides at once, even while trying to talk Congress into doing his bidding on the issue. At various times he declared almost every aspect of the Clinton health care plan to be negotiable. Universal coverage itself, however, the president declared to be inviolable—until he eventually gave a nod to a competing proposal that would settle for 95% coverage over several years' time. Opponents came up with even more alternative schemes to bleed momentum from the reform movement, and at one point more than 150 different health care bills clogged the congressional system. Eventually none of the proposals picked up the legislative support necessary to force a bill through Congress.

Welfare Reform and Crime.
      In his state of the union address, Clinton also turned his attention to two other social issues of long-standing concern, welfare reform and crime. Welfare reform in particular was a notion that stirred enthusiasm across the country, where it was assumed to mean a cutback in support payments to the poor and near poor, including such programs as Medicaid and food stamps. Of particular concern in the public mind was Aid to Families with Dependent Children, a program that cost $16 billion annually—not much in the overall budget but symbolic to many of the culture of welfare dependency, involving unwed mothers, neglected children, and unemployed teenagers. Various states were already experimenting with "workfare" programs involving mandated employment when Clinton announced in his address that he would propose a similar scheme, including a welfare payment cutoff after two years coupled with aggressive programs of job training and retraining. Traditional constituencies within his party objected, however, and Congress took no action.

      Even though various violent crime rates were still declining, Americans continued to see a growing threat to their way of life and to demand ever more draconian punishments. By 1994 the number of people sentenced to federal, state, and local prisons had far outstripped the nation's capacity to jail them. Federal and state prisons held some 925,000 inmates, about double the population of a decade earlier. Local jails held another 450,000, or triple the number held 10 years earlier. The average cost of holding that population was $23,500 per inmate, yet the public demanded more: more police, more prisons, and more mandatory sentences.

      Clinton's 1994 crime bill attempted to ride the law-and-order wave by endorsing the controversial proposal of mandatory life sentences for offenders convicted of a third violent felony. It also included $28 billion for additional prisons and police, which Congress speedily bid up to $33.5 billion—and, after a series of horrifying massacres around the country, a proposal to ban outright, for the first time, 19 different so-called assault weapons, semiautomatic firearms capable of rapid fire. The ban was vehemently opposed by the National Rifle Association but was supported by law-enforcement agencies, and it narrowly passed the House 216 to 214. It eventually became law separate from the crime bill. The overall bill, however, went down to defeat when Republicans attacked it for containing excessive amounts of pork-barrel funding. After lobbying by the White House, a slightly trimmed version, calling for expenditures of $30.2 billion, became law.

Personnel and Personal Problems.
      Such near disasters only contributed to the Clinton White House's reputation for ill discipline, fecklessness, and lack of attention to the minutiae of pushing a program through Congress. The Clintons, loyal to the team of Arkansans and other friends they had brought to Washington, resolutely rejected the idea of a major administrative shake-up until the clamour grew too strong to ignore. The president in effect fired his boyhood chum, White House Chief of Staff Thomas ("Mac") McLarty, and replaced him with the head of the Office of Management and Budget, Leon Panetta. The anticipated broader shake-up failed to take place, however. Instead, the heads of top administration officials began to roll in connection with a variety of alleged scandals—none involving much hard evidence of wrongdoing—that had mostly been over long before the Clintons went to Washington and that were collectively known as the Whitewater affair.

      The details of Whitewater rivaled, in their numbing complexity, the details of the Iran-contra scandal of the Reagan era but without the grave implications for the institution of the presidency, since most of the Whitewater action had taken place during 1978-91, while Clinton mainly occupied the attorney general's office and the governor's mansion in Little Rock, Ark. The finger-pointing mostly revolved around the Clintons' failed investment in a small-scale rural land development north of Little Rock in partnership with James McDougal, owner of the Madison Guaranty Savings and Loan. Madison Guaranty eventually went bankrupt, costing taxpayers $45 million, and McDougal was charged with, but eventually acquitted of, bank fraud. There was no evidence that the Clintons, who claimed to have lost almost $69,000 in the land deal, were aware of any wrongdoing, but critics made much of their association with McDougal at a time when Clinton was ultimately responsible for banking oversight in the state and when his wife, then an attorney with the Rose Law Firm in Little Rock, at one point performed minor legal work for Madison Guaranty.

      The accusations of scandal had percolated without much result in 1993 until the apparent suicide that July of Vincent Foster, a Rose Law Firm partner who had gone to Washington as deputy White House counsel and the Clintons' personal lawyer. It was discovered that in the wake of the suicide a number of top Clinton aides, including White House Counsel Bernard Nussbaum, had entered Foster's office and taken files related to the Clinton family's personal affairs. As critics cried cover-up, the Clintons spent much of 1994 in a determined effort to protect the privacy of their past dealings—which only convinced many, particularly in the press, that they had something to hide. The situation became even more difficult when a number of White House officials were subpoenaed to appear before Congress to explain their attempts to ride herd on the Whitewater scandal. Many of the officials suffered lapses of memory during their testimony, and one of them, Deputy Secretary of the Treasury Roger Altman, resigned after being accused of intentionally misleading Congress about his reports to the White House while serving as the acting head of the Resolution Trust Corporation, which was investigating the Madison Guaranty failure. An independent prosecutor continued investigation of Whitewater throughout the year.

      Another matter that continued in the news was a series of investments in 1978 and 1979 by Mrs. Clinton in cattle futures, which netted a profit of about $100,000 on an investment of $1,000, less than the usual minimum for such high-risk trading. She had been advised in her moves by an attorney associated with the Tyson food-processing empire, Arkansas's largest private company and one regulated by both state and federal governments. The clamour went up that the investment was an apparent conflict of interest, and eventually the stain spread to include Secretary of Agriculture Mike Espy, who resigned after it was revealed that he had accepted favours from Tyson while in office.

      On December 28 a federal district court judge ruled that a sexual harassment lawsuit filed against Clinton by a former Arkansas state employee should not proceed to trial until after the president left office.

Other Developments.
      One domestic triumph that stood out was the president's choice to replace Supreme Court Justice Harry Blackmun, who stepped down from the bench at age 85. In seeking a successor, Clinton first looked to Senate Majority Leader George Mitchell, who had decided to retire, but Mitchell declined. A month later Clinton named Boston federal appeals court judge Stephen Breyer (see BIOGRAPHIES (Breyer, Stephen )) to the post. Breyer, a onetime chief counsel to the Senate Judiciary Committee, an antitrust specialist, and an expert on administrative law, was almost universally applauded for his intellect and his consensus-making skills.

      On three occasions during 1994, the White House was the object of physical attacks. In September a small plane crash-landed on the grounds, killing the pilot. A month later a man, subsequently charged with several felonies, fired on the residence with a semiautomatic weapon. Near the end of the year, in December, shots were fired that reached the grounds and the White House itself, one bullet piercing a window in the State Dining Room. In none of these incidents was the president injured or placed in immediate danger.

Foreign Affairs.
      In his first year in office, Clinton had gone to great lengths to avoid involvement in foreign affairs while pursuing his domestic agenda. In 1994, however, the sense of priorities was gradually reversed. The president began the year at a foreign policy summit, meeting with Russian Pres. Boris Yeltsin in Moscow in January and scoring a major national security triumph when the U.S. and Russia formally ended their mutual nuclear terror by agreeing to point their strategic missiles at empty oceans rather than at any country's territory. Ukrainian Pres. Leonid Kravchuk added further lustre to the trip when he agreed to dismantle about 175 former Soviet intercontinental ballistic missiles on his territory, along with their attendant 1,800 nuclear warheads, in exchange for $1 billion in aid. Soon thereafter, Clinton ended another decades-old enmity when he formally dropped the 19-year U.S. trade (and investment) embargo against Vietnam, citing the Hanoi government's cooperation in the search for U.S. servicemen still missing in action in Southeast Asia. Clinton then cauterized the embarrassment of the intervention in Somalia, undertaken by his predecessor, George Bush, by ordering U.S. troops out of the warlord-riddled country.

      As much as possible, Clinton installed trade and economics rather than military and ideological considerations at the centre of his foreign policy. Among other things he scrapped almost all export controls on previously sensitive telecommunications devices and computers to Russia, Eastern Europe, and China. In the case of China, he ended the linkage between human rights and most-favoured-nation trading status. Later in the year he met again with the other leaders of the 18-nation Asia-Pacific Economic Cooperation forum, and he agreed to join in the creation of an enormous trans-Pacific free-trade zone by 2020. Similar action for the Western Hemisphere was taken at the 34-nation Summit of the Americas held in December. In the wake of the punishing midterm election results, the president successfully lobbied for passage by Congress of the General Agreement on Tariffs and Trade.

      Throughout the year the administration kept up arduous and often frustrating negotiations with North Korea. (See East Asia and the Transition in North Korea (Spotlight: East Asia and the Transition in North Korea ).) The U.S. tried a wide variety of blandishments and threats to persuade the North Koreans to once again allow international inspections of their nuclear facilities. After Pres. Kim Il Sung died (see OBITUARIES (Kim Il Sung )) and was replaced by his son Kim Jong Il (see BIOGRAPHIES (Kim Jong Il )), former U.S. president Jimmy Carter resumed talks he had begun in June and successfully brokered an arrangement whereby North Korea would give up its outmoded plutonium-producing reactors in exchange for less dangerous light-water power reactors and agree to full inspections in 10 years' time. In December, however, another crisis developed when a U.S. helicopter was downed on North Korean territory. One crew member was killed in the crash, while the other was released unharmed after 13 days of tense negotiations.

      In the Middle East, long a focus of U.S. preoccupation, Clinton did not have a major role to play in 1994, yet for the second year in a row, he witnessed the signing of a historic peace accord. This time the pact was between Jordan and Israel, and it left the issue of the Golan Heights and peace between Israel and Syria as the major unmet goal of diplomacy in the region. Clinton himself made a bid to move the process along at a meeting with Syrian Pres. Hafez al-Assad, but to little effect. Yet when it seemed appropriate to draw the sword in the Middle East, Clinton reacted with energy and dispatch. After Iraqi Pres. Saddam Hussein ordered 50,000 heavily armed troops toward the frontier with Kuwait in October, Clinton airlifted thousands of U.S. troops to the region, and the Iraqi dictator quickly backed away.

      The same could not be said for the warring sides in the Balkans, who scoffed at half-hearted efforts by NATO forces to impose limits on the long-running war in Bosnia and Herzegovina through ineffectual air strikes at nearly valueless targets. The NATO effort reflected a deep split between the U.S. and its chief European allies, notably Britain and France, which had peacekeeping forces on the ground in Bosnia, as the U.S. did not. The rift deepened and even threatened the foundations of the North Atlantic alliance as the year wore on, and the U.S., prompted by sentiment in Congress, tried to redress the military balance between the beleaguered Bosnian Muslim forces and the Bosnian Serbs, who had essentially won the genocidal war. The U.S. unilaterally ended its own arms embargo against both sides (which meant effectively against the Muslims) and said that it would not help its allies to enforce their ban. Later, the U.S. pressed for NATO air strikes. Finally, however, Washington acknowledged that NATO solidarity was more important than the integrity of Bosnia and backed down amid admissions from Secretary of State Warren Christopher that the entire crisis had been bungled. At the invitation of the Bosnian Serbs, Carter went to the area in December to broker a tentative cease-fire.

      The president was faced with equally thorny choices in defending U.S. borders from a flood of Cuban and Haitian refugees who took to the Caribbean in virtually anything that would float in order to escape conditions at home. In the case of the Cubans, Clinton at first hesitated and then reversed decades of U.S. policy that embraced such escapees automatically as legitimate seekers of political asylum. Some 30,000 were interned at U.S. bases at Guantánamo Bay and in Panama while the White House negotiated with the regime of Fidel Castro (see BIOGRAPHIES (Castro, Fidel )) to stanch the flow, to which the Cuban government had turned a blind eye. The two sides eventually agreed to an increase of 20,000 per year in the quota of Cubans allowed into the U.S. through proper channels.

      The Haitian tide was harder to stem. Throughout much of the year, the Clinton administration hoped that an effective economic embargo of Haiti would cause the regime of Gen. Raoul Cédras, the Haitian army commander, to accept the return of ousted Pres. Jean-Bertrand Aristide (see BIOGRAPHIES (Aristide, Jean-Bertrand )). For his part, Aristide fumed that the U.S. did not object to Cédras' remaining in control. As thousands of boat people washed up on the coast of Florida, however, the administration came to the view that only military intervention would work. In September the U.S. assembled a fleet of 23 warships and 20,000 troops and set out for Port-au-Prince. Once again a last-minute intercession by Carter proved to be decisive. With U.S. warships in sight, Cédras and his cohorts agreed to allow the troops ashore. The U.S. soldiers quickly took control, ferried the top military leadership into exile, reinstalled Aristide, and began the longer-term, and more difficult, task of helping to rebuild the poorest country in the Western Hemisphere from the ground up.

      (GEORGE RUSSELL)

      See also Dependent States .

▪ 1994

Introduction
      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1993 est.): 258,233,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 4, 1993) a free rate of U.S. $1.52 to £ 1 sterling. Presidents in 1993, George Bush and, from January 20, Bill Clinton.

      William Jefferson ("Bill") Clinton (see BIOGRAPHIES (Clinton, William Jefferson )) swept into the White House in 1993 on a wave of high expectations. As the candidate of "change," a word he used often during his presidential campaign against incumbent George Bush, President Clinton was committed to a dramatic reversal of the economic and political stagnation he had blamed on 12 years of conservative Republican rule. Within weeks of the January 1993 inauguration, however, Clinton's new administration was wobbling badly, the victim of ineptitude, bad judgment, and a knack for needless controversy. Fortunately for Clinton, the freshman jitters were eventually dispelled, and the 42nd president of the United States finished the year with an impressive record of accomplishment. According to Congressional Quarterly, for instance, he succeeded in moving more legislation through Congress in his first year than any other president since Dwight Eisenhower in 1953. And he did not have to use his veto power even once, a feat not seen since Richard Nixon's first year in 1969.

The Presidency.
      President Clinton's start was one of the shakiest in recent history. Among his first acts was his declaration that he would seek an end to the U.S. military's long-standing ban on homosexuals in the ranks. Though the move was popular among gays and many other Americans and Clinton had promised it during the election campaign, few Washington analysts thought he would move on such a potentially explosive issue so quickly. Indeed, Clinton's declaration put him at odds with top military leaders and with a number of key civilians who had oversight responsibilities for the armed forces. Chief among the latter was Sen. Sam Nunn, the Georgia Democrat who headed the Senate Armed Services Committee. After heated debate, Clinton managed to gain support for a compromise measure under which homosexual servicemen and servicewomen could remain in the military if they did not openly declare their sexual preference, a policy that quickly became known as "don't ask, don't tell." Yet military officers were overwhelmingly opposed to that approach, fearing that the mere presence of homosexuals in the armed forces would undermine morale. The policy was further undermined by lawsuits in which courts upheld the right of gays to serve in the military without fear of discrimination. The controversy helped send Clinton's approval ratings plunging to the lowest levels ever recorded for a first-year president and distracted the administration as it struggled to assemble its initial legislative agenda.

      The White House also encountered exasperating difficulty in filling a number of high-level positions in the new government. Two successive nominations for the job of attorney general, the nation's top law-enforcement officer, were derailed by disclosures involving the hiring of domestic help. Zoë Baird, a Connecticut insurance lawyer, was accused by Republicans of not having paid proper payroll taxes for a child-care worker; though the offense was minor and the taxes were eventually paid, the ensuing furor forced her to withdraw. Kimba Wood, a federal judge in New York, was reported to have hired an undocumented foreigner for her household; though the practice had not been illegal at the time and Wood had paid the required payroll taxes, she too was forced to withdraw. The job eventually went to Janet Reno, the state's attorney for Dade County, Fla. (See BIOGRAPHIES.) The Baird and Wood incidents angered many women, who felt that such accusations would not have been brought up in connection with a male candidate. Indeed, Ron Brown, the former Democratic National Committee chairman whose nomination as commerce secretary sailed through Congress, admitted later—to no ill effect on his appointment—that he, too, had been less than punctilious in hiring domestic help.

      One other female nominee was sidelined by Republican opposition, though in this case ostensibly for ideological reasons. Lani Guinier, a law professor at the University of Pennsylvania, withdrew from consideration as the Justice Department's top civil rights official after conservatives objected to what they described as Guinier's radical positions on voting rights and related issues. Though Guinier's supporters protested that her views had been distorted and were hardly controversial, Clinton chose not to stand by her.

      And so it went throughout the early months of the administration. The White House would announce a nomination, Republican opposition would coalesce, and the candidate would withdraw. The failure rate was remarkable for a Democratic president whose party controlled both houses of Congress. Clinton was widely criticized for his timidity in confronting the Republicans. One crucial problem for him was an unusual degree of cohesion among the opposition. Far from being in disarray after losing the White House, the Republicans were lining up en bloc against administration initiatives. Conservative Republicans hinted that they were simply giving Clinton appointees the same sort of harassment they felt that three Republican nominees for the Supreme Court, Robert Bork, Douglas Ginsburg, and Clarence Thomas, had suffered at the hands of Democrats during the Reagan-Bush years (Bork and Ginsburg withdrew; Thomas eventually won confirmation, but only after televised hearings into allegations that he had sexually harassed a colleague, Anita Hill). The Democrats, meanwhile, were just as independent-minded as ever. Under long-standing House and Senate rules designed to limit abuses by the majority, a determined minority could prevent appointments and legislation from even coming to a vote. The Republicans acted cohesively enough to take advantage of those rules; the Democrats were too fractious to stop them. As a consequence, Clinton began to look ineffective.

      One of the president's first major pieces of legislation, an economic stimulus plan, was killed by a Republican filibuster. Clinton's next big initiative, a deficit-reduction package, ran into an early blitz of opposition from Republicans and from various special interests. That was not surprising, given its content: substantial tax increases and modest spending cuts that would affect many industries and individuals adversely. After months of wrangling, a watered-down version of the measure passed with the narrowest of margins; Vice Pres. Al Gore, in his role as president of the Senate, cast the tie-breaking vote.

      The package was expected to cut $500 billion from the federal budget deficit over five years. It included stiff tax increases for upper-income Americans, a slight boost in the corporate tax rate, and a 4-cent-a-gallon (1 gal = 3.8 litres) increase in the federal excise on gasoline. Americans barely noticed the latter, since a softness in global petroleum prices and notoriously low U.S. petroleum taxes had helped keep U.S. gasoline among the world's cheapest—about 25-30 cents a litre. The spending cuts ranged widely across the federal budget, though no serious reductions were made in such major and sacrosanct items as social security and Medicare.

      As the months wore on, Clinton began to gain expertise at wooing and arm-twisting. He succeeded in gaining adoption of his $1.5 billion national service plan, under which 100,000 young Americans would earn cash and credits toward college tuition by working in public service jobs. By autumn, when he faced one of the biggest tests of his administration, he was ready to wheel and deal. The issue was congressional approval of the North American Free Trade Agreement. NAFTA had been painstakingly negotiated by the administrations of Ronald Reagan and George Bush, and Clinton had declared his support for it during the 1992 election campaign. The treaty would reduce tariffs between the U.S., Canada, and Mexico on a wide array of products and, in effect, create the world's largest free-trade zone. Business executives and economists supported the measure by a wide margin, confident that it would spur trade and thus prosperity in all three countries. Trade union leaders, environmentalists, and a variety of other interest groups opposed the measure, fearing, among other things, that it would prompt U.S. companies to move their operations to Mexico, where wages were lower than in the U.S. and Canada and environmental standards less rigorous.

      Prominent among NAFTA's opponents was H. Ross Perot, the Texas billionaire who a year earlier had made a run for the presidency. Perot's prediction that the measure would produce "a giant sucking sound" as U.S. jobs were lost to Mexico became a rallying cry of the treaty's critics. As the congressional vote on the agreement approached, chances of passage seemed dim. In apparent desperation, the White House accepted Perot's proposal that he and Vice President Gore debate the issue on national television. They appeared together on interviewer Larry King's Cable News Network talk show, and Gore was credited by many pundits and pollsters with having got the better of his challenger. In any case, public opinion began to swing toward the treaty. Meanwhile, Clinton was wooing legislators with intimate dinners at the White House and promises of federal largesse for their home districts. In the end Clinton prevailed, and NAFTA was passed by both houses.

      The victory provided the president with a measure of momentum that had previously eluded him. Capitalizing on it, he successfully pressed for the passage of a major anticrime bill that included a controversial waiting period on handgun purchases. He also intervened decisively to end a strike by American Airlines flight attendants that threatened to disrupt travel over the Thanksgiving holiday weekend. By year's end it appeared that Clinton, a newcomer to Washington whose previous job had been governor of Arkansas, had figured out how to do business in the nation's capital.

Health Care.
      Perhaps the most important initiative of the new administration, health care, had not yet been formally debated by Congress by the end of 1993, but it nonetheless carried the potential for dramatically changing the way many Americans lived. Unlike most industrial countries, the individualistic, free-enterprise U.S. did not have a comprehensive government health care system. Instead, Americans made do with a patchwork of private insurers, employer-paid insurance, private doctors, private and tax-supported hospitals, and government subsidies for the poor and the elderly. For years the system worked satisfactorily. Though infant mortality rates were relatively high, Americans were generally healthy, and U.S. medical technology was the envy of the world. Yet the system was not without its critics. As Clinton noted during the election campaign, an estimated 37 million Americans had no health insurance coverage at all, and costs were rising sharply throughout the health care industry. In recent years costs had far outpaced the overall rate of inflation. By 1992 the U.S. was spending more than 14% of its gross domestic product (GDP) on health care, up from less than 6% in 1965 and double the percentages in Britain and Japan.

      The reasons for that explosive growth in spending were clear enough: insurance plans provided for virtually unlimited coverage, so hardly anyone in the health care system had an incentive to control costs, and Americans found it difficult to deny themselves access to the most expensive medical technology. Patients wanted the best care possible, and doctors gave it to them without regard to price because someone else, either an insurance company or the government's programs of Medicare (for the elderly) or Medicaid (for the poor), would pay a share of the bill. Yet costs were rising so steeply that the share that individuals had to pay was soaring. Opinion polls showed that while Americans were generally satisfied with the quality of care they were receiving, the costs worried them deeply.

      In a dramatic move to address those concerns, Clinton unveiled a thorough overhaul of the U.S. health care system. The plan, which had been formulated under the supervision of first lady Hillary Rodham Clinton (see BIOGRAPHIES), had three basic elements: universal coverage for all Americans; employer mandates, under which companies would pay 80% of their workers' health insurance premiums; and a system of controls on medical costs. The plan had other details certain to be altered in the expected give-and-take with Congress and interest groups. For instance, the proposal would cover mental health costs, which could prove unacceptably expensive. Likewise, the plan called for a national health board that would enforce price controls, a notion that doctors and hospitals opposed and that economists called unworkable. Another feature of the plan, the creation of giant health alliances that would purchase coverage from private insurers on behalf of nearly all people in a particular region, was so radical that it faced months of study and debate, as well as a likelihood of being dropped.

The Economy.
      The pall of gloom that had hung over the U.S. economy for years was lifting. The recovery had actually begun during the Bush administration, but public perception did not catch up with reality until late in 1993. Most measures of business and consumer confidence were rising, and the stock markets hit new highs several times during the year.

      Signs of renewed vigour were almost everywhere. Consumer spending in the third quarter was up 4.2% from a year earlier. Investment in plant and equipment hit levels not seen since 1984. Unemployment dropped from 7% at the beginning of the year to 6.4% in December, and an average of 150,000 new jobs were created every month (despite a number of highly publicized mass layoffs announced by leading companies). GDP, the total value of goods and services produced in the country, rose at an inflation-adjusted rate of more than 3%, about the same pace as in 1992. In September, sales of new single-family homes hit their highest monthly level since December 1986. The average price of those homes was up 7.9% from a year earlier, a clear sign of increased demand in a sector of the economy that had long been depressed. Even the American auto industry, battered for years by declining profits and rising imports from Japan, turned in its best year since 1989.

      All this activity raised fears that inflation might return, though prices remained remarkably stable throughout the year. The annual rate of increase of the Consumer Price Index hovered around 3%, one of the lowest levels in two decades. Partly as a consequence, interest rates remained extraordinarily low (lenders were willing to charge lower rates because they expected that the loans would be repaid in dollars that retained their value). Fixed-rate home mortgages, for instance, were carrying annual interest rates under 7%, a situation that had not prevailed in the adult lives of many home buyers.

      The economic picture might have been even brighter were it not for two major natural disasters. In the summer, heavy rains sent the Missouri, Mississippi, and other midwestern rivers surging over their banks. More than two million hectares (five million acres) of farmland were inundated; hundreds of cities and towns were flooded; and thousands of homes and factories were swept away. In the fall, wildfires devastated southern California, burning at least 61,500 ha (152,000 ac) and forcing 25,000 people from their homes. Damage from the two disasters totaled in the billions of dollars, and economists figured that the resulting dislocation may have shaved half a percentage point off the increase in GDP. Some of that was expected to be regained in 1994 as money spent to restore the damage flowed into the economy.

Social Issues.
      The year brought some major advances for women in the U.S. as they continued to gain important posts in business and government. The Senate confirmed President Clinton's appointment of Ruth Bader Ginsburg (see BIOGRAPHIES), a judge on the U.S. Court of Appeals for the District of Columbia Circuit, to the Supreme Court, where she became the second woman on the nine-member panel. In addition, Congress enacted 30 major bills related to women and family issues, compared with 5 in 1989, according to the Congressional Caucus for Women's Issues. Prominent among the new laws was the Family and Medical Leave Act, which provided up to 12 weeks of job-guaranteed leave for workers to care for themselves or sick family members or to have or adopt a baby. The caucus, which at year's end comprised the 7 female members of the Senate and the 47 congresswomen (both numbers were up sharply from the previous legislative session), nonetheless failed in an effort to repeal a measure that banned Medicaid funds from being used for abortions.

      Despite the inauguration of a Democratic president committed to reproductive rights, foes of abortion continued their campaign of disruption and intimidation against clinics where the procedure was performed. The administration loosened some federal restrictions on terminating pregnancies, but abortion foes hoped to make it difficult for women to obtain the procedure. In response, abortion rights advocates sought the intervention of local authorities and the courts. In one closely watched case, they sought to have clinic blockaders prosecuted under the 1970 Racketeer Influenced and Corrupt Organizations (RICO) Act, which was normally used against organized crime. The Supreme Court was expected to rule on the matter in 1994. The court had previously upheld the broadening of RICO to prosecute commodity traders and gang members, although in 1993 the justices ruled that federal courts may not stop abortion clinic blockades by invoking an 1871 civil rights law.

      A number of well-publicized incidents had the effect of polarizing popular opinion along gender lines. One was the disclosure that Sen. Robert Packwood, a veteran Oregon Republican, may have made sexual advances toward more than two dozen women over 20 years and tried to intimidate some of his alleged victims into silence. Women in Congress demanded Packwood's resignation, and the Senate launched an investigation. At year's end Packwood hinted that he might resign.

      In December Energy Secretary Hazel O'Leary announced that her department would investigate reports that a number of major medical institutions and U.S. government research laboratories had exposed civilians to radiation without having fully informed them of the nature of the experiments. More than 1,000 subjects were involved in various programs dating from the late 1940s to the early 1970s.

Crime.
      A perennial concern among Americans, crime became almost a national obsession in 1993. Highly publicized reports of gang- and drug-related violence, carjackings that ended in death, innocent bystanders killed in gun battles, children bringing guns to school for protection, and foreign tourists killed during robberies in Florida all fanned the flames of public concern. In one typical survey nearly 90% of those polled said they believed that the country's crime problem was growing, and nearly half reported that there was more crime in their neighbourhoods than a year earlier.

      That fear of crime was seemingly at odds with reality. FBI statistics indicated a 4% drop in overall reported crime in 1992, and major cities reported declines in several categories of violent crime, including murder, rape, and robbery. Yet many Americans did not believe such reports, and their concerns led to a number of dramatic steps toward making their localities safer. Sharon Pratt Kelly, the mayor of Washington, D.C., asked the Clinton administration to provide National Guard troops to help police the city's more crime-ridden precincts (the request was denied). Voters in several states approved stiffer sentences for many crimes, as well as money to build more prisons.

      Criminal justice and public safety had long been a matter of state and local responsibility in the U.S., with only a modest federal role. As the clamour for relief from crime rose, however, Washington was listening. Congress passed the Clinton administration's crime bill, which went far beyond previous measures. It lengthened the list of offenses that could be prosecuted by federal authorities, including, as critics of the measure noted with derision, the murder of a federal chicken inspector. On a more positive note, the bill also provided funds to hire 100,000 new police. The most remarkable feature was the bill's inclusion of a long-standing proposal to require a five-day waiting period for the purchase of a handgun. That measure was known as the Brady bill, after James Brady, the White House press secretary who was seriously injured in the 1981 attack on Ronald Reagan. Brady, confined to a wheelchair and unable to resume his duties, campaigned hard for the bill, but it was fiercely opposed by the National Rifle Association (NRA), one of Washington's most formidable interest groups. Even supporters of the Brady bill conceded that it was unlikely to have a major effect on crime, but they welcomed its passage as a step toward more limits on the easy availability of handguns in the U.S. and as a major setback for the NRA.

      The agency responsible for federal criminal enforcement, the U.S. Justice Department, was widely criticized for the way it handled a standoff near Waco, Texas, between federal agents and heavily armed members of a religious cult known as the Branch Davidians and their charismatic leader, David Koresh. Four agents of the Bureau of Alcohol, Tobacco and Firearms were killed in an ill-planned attempt to storm the cult's 31-ha (77-acre) compound. That raid led to a nationally televised 51-day siege and a fiery conflagration after which it was discovered that some 75 people inside the compound, including at least 17 children, had died; a number had been shot. Nearly all the deaths appeared to have been caused by the Branch Davidians, but an official review concluded that federal officials had handled the situation ineptly.

Foreign Affairs.
      With an administration focused on its domestic policy agenda, international matters receded into the background of public attention. One reason was that since the fall of the Berlin Wall in 1989 and the general collapse of communism around the world, the Cold War no longer served as a framework for U.S. foreign policy and as a focus for public anxiety about the possibility of superpower confrontation. Another reason was that the international conflicts that did occupy the year's headlines in Somalia, the Balkans, Haiti, and the Middle East were mostly protracted regional affairs and were maddeningly resistant to the application of U.S. power.

      In Somalia, for instance, the U.S. began pulling out the more than 25,000 troops it had sent a year earlier to help ensure the distribution of relief supplies to a populace suffering from starvation and from the depredations of feuding warlords. U.S. forces were surprised to encounter hostility from the very people they had been sent to save. When an angry crowd of Somalis attacked a United Nations convoy, American helicopters fired into the crowd, killing and wounding more than 100 people. Then troops under the control of a leading warlord, Muhammad Farah Aydid (see BIOGRAPHIES), whom the U.S. had been trying to capture, killed 18 Americans in a gun battle. President Clinton quickly announced a pullout of all remaining U.S. forces by March 1994. In an ironic twist to the unhappy American experience in Somalia, the U.S. not only dropped its attempt to seize Aydid but gave him preferential treatment and passage on a U.S. plane to attend peace talks in neighbouring Ethiopia.

      In the Balkans, President Clinton indicated his willingness to send U.S. troops to help maintain order if the warring factions in Bosnia could settle their differences. The offer was not taken up, in part because the conflict dragged on and European countries could not agree on a role for themselves and the U.S. On other matters Europe and the U.S. did appear to be in agreement. Among them was an American proposal to expand the membership of NATO, possibly at some future time, to include former members of the Warsaw Pact, the now-defunct alliance of Soviet-bloc states. That was an astonishing development, given the four decades of enmity between the two blocs. In addition, after years of sometimes desultory talks, the U.S. and Europe resolved most of their differences on trade and in December concluded an agreement under the General Agreement on Tariffs and Trade. (See Economic Affairs.)

      In Haiti the U.S. found itself in the position of supporting exiled Pres. Jean-Bertrand Aristide but unable to arrange his return. Haitian army commander Raoul Cédras, who had ousted Aristide in 1991 after the former Roman Catholic priest was democratically elected in 1990, refused to yield power. Cédras did participate in a UN-brokered agreement that would allow Aristide to return to office, and the U.S. and Canada promised to send a small contingent of lightly armed troops to help police the arrangement. Yet when the U.S. troop ship arrived in Haiti, a violent mob of army-backed civilians refused to let it dock, and the troops returned home. Clinton ordered six American ships into the region to enforce a UN arms and oil embargo against Haiti. Meanwhile, forces loyal to Cédras continued to intimidate and even murder their opponents with impunity.

      In the Middle East, where the U.S. had long played a major role, Clinton presided over the historic meeting in Washington of Palestine Liberation Organization chief Yasir Arafat and Israeli Prime Minister Yitzhak Rabin. The two leaders met for the signing of an agreement allowing an unprecedented measure of Palestinian autonomy in the Israeli-occupied West Bank and Gaza Strip. The U.S. had little directly to do with arranging the agreement, and at one point late in the year, Rabin asked the U.S. to refrain from direct involvement in Israel's talks with the Palestinians.

      As the year came to a close, President Clinton shifted U.S. attention to North Korea. That country, ruled by the reclusive Kim Il Sung and dedicated to a brand of highly regimented Stalinist communism, was refusing to allow international inspections of its nuclear energy facilities. American policy makers, concerned that Kim was developing a nuclear weapons program, indicated that the U.S. might take military action if Kim's government did not comply with the inspections. North Korea declared that it was prepared to endure war or economic sanctions; in response, the U.S. said it would increase its military activities in South Korea, Kim's neighbour and bitter foe. Tensions eased somewhat when North Korea said that it might allow some inspections and that it would turn over the remains of U.S. soldiers killed four decades earlier in the Korean War.

      The book was finally closed on one of the country's most enduring political scandals: the Iran-contra affair. The final report of the independent counsel investigating the matter indicated that former presidents Reagan and Bush were far more complicit than they had asserted. The scandal involved the sale of arms to Iran and the diversion of the resulting profits to provide arms for the contra rebels fighting the leftist government of Nicaragua in the 1980s. Though the Reagan and Bush administrations publicly favoured the contras, Congress had banned military support for them. The report, by Lawrence Walsh, concluded that Reagan had set the stage for the illegal activities and that Bush was less than truthful when he declared that he was "out of the loop" and not kept informed about the matter. Neither man, however, was said to be guilty of a crime. (DONALD MORRISON)

      See also Dependent States, below.

* * *

Introduction
officially United States of America, abbreviations U.S. or U.S.A., byname America
The United States of America is a country of North America, a federal republic of 50 states. Besides the 48 contiguous states that occupy the middle latitudes of the continent, the United States includes the state of Alaska, at the northwestern extreme of North America, and the island state of Hawaii, in the mid-Pacific Ocean. The coterminous states are bounded on the north by Canada, on the east by the Atlantic Ocean, on the south by the Gulf of Mexico and Mexico, and on the west by the Pacific Ocean. The United States is the fourth largest country in the world in area (after Russia, Canada, and China). The national capital is Washington, which is coextensive with the District of Columbia, the federal capital region created in 1790.

      The major characteristic of the United States is probably its great variety. Its physical environment ranges from the Arctic to the subtropical, from the moist rain forest to the arid desert, from the rugged mountain peak to the flat prairie. Although the total population of the United States is large by world standards, its overall population density is relatively low; the country embraces some of the world's largest urban concentrations as well as some of the most extensive areas that are almost devoid of habitation.

      The United States contains a highly diverse population; but, unlike a country such as China that largely incorporated indigenous peoples, its diversity has to a great degree come from an immense and sustained global immigration. Probably no other country has a wider range of racial, ethnic, and cultural types than does the United States. In addition to the presence of surviving native Americans (including American Indians, Aleuts, and Eskimo) and the descendants of Africans taken as slaves to America, the national character has been enriched, tested, and constantly redefined by the tens of millions of immigrants who by and large have gone to America hoping for greater social, political, and economic opportunities than they had in the places they left.

      The United States is the world's greatest economic power, measured in terms of gross national product (GNP). The nation's wealth is partly a reflection of its rich natural resources and its enormous agricultural output, but it owes more to the country's highly developed industry. Despite its relative economic self-sufficiency in many areas, the United States is the most important single factor in world trade by virtue of the sheer size of its economy. Its exports and imports represent major proportions of the world total. The United States also impinges on the global economy as a source of and as a destination for investment capital. The country continues to sustain an economic life that is more diversified than any other on Earth, providing the majority of its people with one of the world's highest standards of living.

      The United States is relatively young by world standards, being barely more than 200 years old; it achieved its current size only in the mid-20th century. America was the first of the European colonies to separate successfully from its motherland, and it was the first nation to be established on the premise that sovereignty rests with its citizens and not with the government. In its first century and a half, the country was mainly preoccupied with its own territorial expansion and economic growth and with social debates that ultimately led to civil war and a healing period that is still not complete. In the 20th century the United States emerged as a world power, and since World War II it has been one of the preeminent powers. It has not accepted this mantle easily nor always carried it willingly; the principles and ideals of its founders have been tested by the pressures and exigencies of its dominant status. Although the United States still offers its residents opportunities for unparalleled personal advancement and wealth, the depletion of its resources, contamination of its environment, and continuing social and economic inequality that perpetuates areas of poverty and blight all threaten the fabric of the country.

Ed.
      The District of Columbia is discussed in the article Washington. For discussion of other major U.S. cities, see the articles Boston, Chicago, Los Angeles, New Orleans, New York City, Philadelphia, and San Francisco. Political units in association with the United States include Puerto Rico, discussed in the article Puerto Rico, and several Pacific islands, discussed in Guam, Northern Mariana Islands, and American Samoa.

The land (United States)
  The two great sets of elements that mold the physical environment of the United States are, first, the geologic, which determines the main patterns of landforms, drainage, and mineral resources and influences soils to a lesser degree, and, second, the atmospheric, which dictates not only climate and weather but also in large part the distribution of soils, plants, and animals. Although these elements are not entirely independent of one another, each produces on a map patterns that are so profoundly different that essentially they remain two separate geographies. (Since this article covers only the coterminous United States, see also the articles Alaska and Hawaii.)

Relief
      The centre of the coterminous United States is a great sprawling interior lowland, reaching from the ancient shield of central Canada on the north to the Gulf of Mexico on the south. To east and west this lowland rises, first gradually and then abruptly, to mountain ranges that divide it from the sea on both sides. The two mountain systems differ drastically. The Appalachian Mountains on the east are low, almost unbroken, and in the main set well back from the Atlantic. From New York to the Mexican border stretches the low Coastal Plain, which faces the ocean along a swampy, convoluted coast. The gently sloping surface of the plain extends out beneath the sea, where it forms the continental shelf, which, although submerged beneath shallow ocean water, is geologically identical to the Coastal Plain. Southward the plain grows wider, swinging westward in Georgia and Alabama to truncate the Appalachians along their southern extremity and separate the interior lowland from the Gulf.

      West of the Central Lowland is the mighty Cordillera, part of a global mountain system that rings the Pacific Basin. The Cordillera encompasses fully one-third of the United States, with an internal variety commensurate with its size. At its eastern margin lie the Rocky Mountains, a high, diverse, and discontinuous chain that stretches all the way from New Mexico to the Canadian border. The Cordillera's western edge is a Pacific coastal chain of rugged mountains and inland valleys, the whole rising spectacularly from the sea without benefit of a coastal plain. Pent between the Rockies and the Pacific chain is a vast intermontane complex of basins, plateaus, and isolated ranges so large and remarkable that they merit recognition as a region separate from the Cordillera itself.

      These regions—the Interior Lowlands and their upland fringes, the Appalachian Mountain system, the Atlantic Plain, the Western Cordillera, and the Western Intermontane Region—are so various that they require further division into 24 major subregions, or provinces (see map).

The Interior Lowlands and their upland fringes
      Andrew Jackson is supposed to have remarked that the United States begins at the Alleghenies, implying that only west of the mountains, in the isolation and freedom of the great Interior Lowlands, could people finally escape Old World influences. Whether or not the lowlands constitute the country's cultural core is debatable, but there can be no doubt that they comprise its geologic core and in many ways its geographic core as well.

      This enormous region rests upon an ancient, much-eroded platform of complex crystalline rocks that have for the most part lain undisturbed by major orogenic (mountain-building) activity for more than 600,000,000 years. Over much of central Canada, these Precambrian rocks are exposed at the surface and form the continent's single largest topographical region, the formidable and ice-scoured Canadian Shield.

      In the United States most of the crystalline platform is concealed under a deep blanket of sedimentary rocks. In the far north, however, the naked Canadian Shield extends into the United States far enough to form two small but distinctive landform regions: the rugged and occasionally spectacular Adirondack Mountains of northern New York; and the more subdued but austere Superior Uplands of northern Minnesota, Wisconsin, and Michigan. As in the rest of the shield, glaciers have stripped soils away, strewn the surface with boulders and other debris, and obliterated preglacial drainage systems. Most attempts at farming in these areas have been abandoned, but the combination of a comparative wilderness in a northern climate, clear lakes, and white-water streams has fostered the development of both regions as year-round outdoor recreation areas.

      Mineral wealth in the Superior Uplands is legendary. Iron lies near the surface and close to the deepwater ports of the upper Great Lakes. Iron is mined both north and south of Lake Superior, but best known are the colossal deposits of Minnesota's Mesabi Range, for more than a century one of the world's richest and a vital element in America's rise to industrial power. In spite of depletion, the Minnesota and Michigan mines still yield a major proportion of the country's iron and a significant percentage of the world's supply.

      South of the Adirondack Mountains and Superior Uplands lies the boundary between crystalline and sedimentary rocks; abruptly, everything is different. The core of this sedimentary region—the heartland of the United States—is the great Central Lowland, which stretches for 1,500 miles (2,400 kilometres) from New York to central Texas and north another 1,000 miles to the Canadian province of Saskatchewan. To some, the landscape may seem dull, for heights of more than 2,000 feet (600 metres) are unusual, and truly rough terrain is almost lacking. Landscapes are varied, however, largely as the result of glaciation that directly or indirectly affected most of the subregion. North of the Missouri–Ohio river line, the advance and readvance of continental ice left an intricate mosaic of boulders, sand, gravel, silt, and clay and a complex pattern of lakes and drainage channels, some abandoned, some still in use. The southern part of the Central Lowland is quite different, covered mostly with loess (wind-deposited silt) that further subdued the already low-relief surface. In places, especially near major rivers, postglacial streams carved the loess into rounded hills, and visitors have aptly compared their billowing shapes to the waves of the sea. Above all, the loess produces soil of extraordinary fertility. Just as the Mesabi iron has been a major source of America's industrial wealth, so Midwestern loess has been the root of its agricultural prosperity.

      The Central Lowland resembles a vast saucer, rising gradually to higher lands on all sides. Southward and eastward, the land rises gradually to three major plateaus. Beyond the reach of glaciation to the south, the sedimentary rocks have been raised into two broad upwarps, separated from one another by the great valley of the Mississippi River. The Ozark Plateau lies west of the river and occupies most of southern Missouri and northern Arkansas; on the east the Interior Low Plateaus dominate central Kentucky and Tennessee. Except for two nearly circular patches of rich limestone country—the Nashville Basin of Tennessee and the Kentucky Bluegrass region—most of both plateau regions consists of sandstone uplands, intricately dissected by streams. Local relief runs to several hundred feet in most places, and visitors to the region must travel winding roads along narrow stream valleys. The soils there are poor, and mineral resources are scanty.

      Eastward from the Central Lowland the Appalachian Plateau—a narrow band of dissected uplands that strongly resembles the Ozark Plateau and Interior Low Plateaus in steep slopes, wretched soils, and endemic poverty—forms a transition between the interior plains and the Appalachian Mountains. Usually, however, the Appalachian Plateau is considered a subregion of the Appalachian Mountains, partly on grounds of location, partly because of geologic structure. Unlike the other plateaus, where rocks are warped upward, the rocks there form an elongated basin, wherein bituminous coal has been preserved from erosion. This Appalachian coal, like the Mesabi iron that it complements in U.S. industry, is extraordinary. Extensive, thick, and close to the surface, it has stoked the furnaces of northeastern steel mills for decades and helps explain the huge concentration of heavy industry along the lower Great Lakes.

      The western flanks of the Interior Lowlands are the Great Plains, a territory of awesome bulk that spans the full distance between Canada and Mexico in a swath nearly 500 miles wide. The Great Plains were built by successive layers of poorly cemented sand, silt, and gravel—debris laid down by parallel east-flowing streams from the Rocky Mountains. Seen from the east, the surface of the Great Plains rises inexorably from about 2,000 feet near Omaha, Neb., to more than 6,000 feet at Cheyenne, Wyo., but the climb is so gradual that popular legend holds the Great Plains to be flat. True flatness is rare, although the High Plains of western Texas, Oklahoma, Kansas, and eastern Colorado come close. More commonly, the land is broadly rolling, and parts of the northern plains are sharply dissected into badlands.

      The main mineral wealth of the Interior Lowlands derives from fossil fuels. Coal occurs in structural basins protected from erosion—high-quality bituminous in the Appalachian, Illinois, and western Kentucky basins; and subbituminous and lignite in the eastern and northwestern Great Plains. Petroleum and natural gas have been found in nearly every state between the Appalachians and the Rockies, but the Midcontinent Fields of western Texas and the Texas Panhandle, Oklahoma, and Kansas surpass all others. Aside from small deposits of lead and zinc, metallic minerals are of little importance.

The Appalachian Mountain system
 The Appalachians dominate the eastern United States and separate the Eastern Seaboard from the interior with a belt of subdued uplands that extends nearly 1,500 miles from northeastern Alabama to the Canadian border. They are old, complex mountains, the eroded stumps of much greater ranges. Present topography results from erosion that has carved weak rocks away, leaving a skeleton of resistant rocks behind as highlands. Geologic differences are thus faithfully reflected in topography. In the Appalachians these differences are sharply demarcated and neatly arranged, so that all the major subdivisions except New England lie in strips parallel to the Atlantic and to one another.

      The core of the Appalachians is a belt of complex metamorphic and igneous rocks that stretches all the way from Alabama to New Hampshire. The western side of this belt forms the long slender rampart of the Blue Ridge Mountains, containing the highest elevations in the Appalachians (Mount Mitchell, N.C., 6,684 feet [2,037 metres]) and some of its most handsome mountain scenery. On its eastern, or seaward, side the Blue Ridge descends in an abrupt and sometimes spectacular escarpment to the Piedmont, a well-drained, rolling land—never quite hills, but never quite a plain. Before the settlement of the Midwest the Piedmont was the most productive agricultural region in the United States, and several Pennsylvania counties still consistently report some of the highest farm yields per acre in the entire country.

      West of the crystalline zone, away from the axis of primary geologic deformation, sedimentary rocks have escaped metamorphism but are compressed into tight folds. Erosion has carved the upturned edges of these folded rocks into the remarkable Ridge and Valley country of the western Appalachians. Long linear ridges characteristically stand about 1,000 feet from base to crest and run for tens of miles, paralleled by broad open valleys of comparable length. In Pennsylvania, ridges run unbroken for great distances, occasionally turning abruptly in a zigzag pattern; by contrast, the southern ridges are broken by faults and form short, parallel segments that are lined up like magnetized iron filings. By far the largest valley—and one of the most important routes in North America—is the Great Valley, an extraordinary trench of shale and limestone that runs nearly the entire length of the Appalachians. It provides a lowland passage from the middle Hudson valley to Harrisburg, Pa., and on southward, where it forms the Shenandoah and Cumberland valleys, and has been one of the main paths through the Appalachians since pioneer times. In New England it is floored with slates and marbles and forms the Valley of Vermont, one of the few fertile areas in an otherwise mountainous region.

      Topography much like that of the Ridge and Valley is found in the Ouachita Mountains of western Arkansas and eastern Oklahoma, an area generally thought to be a detached continuation of Appalachian geologic structure, the intervening section buried beneath the sediments of the lower Mississippi valley.

      The once-glaciated New England section of the Appalachians is divided from the rest of the chain by an indentation of the Atlantic. Although almost completely underlain by crystalline rocks, New England is laid out in north–south bands, reminiscent of the southern Appalachians. The rolling, rocky hills of southeastern New England are not dissimilar to the Piedmont, while, farther northwest, the rugged and lofty White Mountains are a New England analogue to the Blue Ridge. (Mount Washington, N.H., at 6,288 feet [1,917 metres], is the highest peak in the northeastern United States.) The westernmost ranges—the Taconics, Berkshires, and Green Mountains—show a strong north–south lineation like the Ridge and Valley. Unlike the rest of the Appalachians, however, New England has been glaciated; ice has scoured its crystalline rocks much as in the Canadian Shield, so that the region is best known for its picturesque landscape, not for its fertile soil.

      Typical of diverse geologic regions, the Appalachians contain a great variety of minerals. Only a few occur in quantities large enough for sustained exploitation, notably iron in Pennsylvania's Blue Ridge and Piedmont and the famous granites, marbles, and slates of northern New England. In Pennsylvania the Ridge and Valley region contains one of the world's largest deposits of anthracite coal, once the basis of a thriving mining economy; many of the mines are now shut, oil and gas having replaced coal as the major fuel used to heat homes.

The Atlantic Plain
      The eastern and southeastern fringes of the United States are part of the outermost margins of the continental platform, repeatedly invaded by the sea and veneered with layer after layer of young, poorly consolidated sediments. Part of this platform now lies slightly above sea level and forms a nearly flat and often swampy coastal plain, which stretches from Cape Cod, Mass., to beyond the Mexican border. Most of the platform, however, is still submerged, so that a band of shallow water, the continental shelf, parallels the Atlantic and Gulf coasts, in some places reaching 250 miles out to sea.

      The Atlantic Plain slopes so gently that even slight crustal upwarping can shift the coastline far out to sea at the expense of the continental shelf. The peninsula of Florida is just such an upwarp; nowhere in its 400-mile length does the land rise more than 350 feet above sea level; much of the southern and coastal areas rise less than 10 feet and are poorly drained and dangerously exposed to Atlantic storms. Downwarps can result in extensive flooding. North of New York City, for example, the weight of glacial ice depressed most of the Coastal Plain beneath the sea, and the Atlantic now beats directly against New England's rock-ribbed coasts. Cape Cod, Long Island (N.Y.), and a few offshore islands are all that remain of New England's drowned Coastal Plain. Another downwarp lies perpendicular to the Gulf coast and guides the course of the lower Mississippi. The river, however, has filled with alluvium what otherwise would be an arm of the Gulf, forming a great inland salient of the Coastal Plain called the Mississippi Embayment.

      South of New York the Coastal Plain gradually widens, but ocean water has invaded the lower valleys of most of the coastal rivers and has turned them into estuaries. The greatest of these is Chesapeake Bay, merely the flooded lower valley of the Susquehanna River and its tributaries, but there are hundreds of others. Offshore a line of sandbars and barrier beaches stretches intermittently the length of the Coastal Plain, hampering entry of shipping into the estuaries but providing the eastern United States with a playground that is more than 1,000 miles long.

      Poor soils are the rule on the Coastal Plain, though rare exceptions have formed some of America's most famous agricultural regions—for example, the citrus country of central Florida's limestone uplands and the Cotton Belt of the Old South, once centred on the alluvial plain of the Mississippi and belts of chalky black soils of eastern Texas, Alabama, and Mississippi. The Atlantic Plain's greatest natural wealth derives from petroleum and natural gas trapped in domal structures that dot the Gulf Coast of eastern Texas and Louisiana. Onshore and offshore drilling have revealed colossal reserves of oil and natural gas.

The Western Cordillera
      West of the Great Plains the United States seems to become a craggy land whose skyline is rarely without mountains—totally different from the open plains and rounded hills of the East. On a map the alignment of the two main chains—the Rocky Mountains on the east, the Pacific ranges on the west—tempts one to assume a geologic and hence topographic homogeneity. Nothing could be farther from the truth, for each chain is divided into widely disparate sections.

      The Rockies are typically diverse. The Southern Rockies are composed of a disconnected series of lofty elongated upwarps, their cores made of granitic basement rocks, stripped of sediments, and heavily glaciated at high elevations. In New Mexico and along the western flanks of the Colorado ranges, widespread volcanism and deformation of colourful sedimentary rocks have produced rugged and picturesque country, but the characteristic central Colorado or southern Wyoming range is impressively austere rather than spectacular. The Front Range west of Denver is prototypical, rising abruptly from its base at about 6,000 feet to rolling alpine meadows between 11,000 and 12,000 feet. Peaks appear as low hills perched on this high-level surface, so that Colorado, for example, boasts 53 mountains over 14,000 feet but not one over 14,500 feet.

      The Middle Rockies cover most of west central Wyoming. Most of the ranges resemble the granitic upwarps of Colorado, but thrust faulting and volcanism have produced varied and spectacular country to the west, some of which is included in Grand Teton and Yellowstone national parks. Much of the subregion, however, is not mountainous at all but consists of extensive intermontane basins and plains—largely floored with enormous volumes of sedimentary waste eroded from the mountains themselves. Whole ranges have been buried, producing the greatest gap in the Cordilleran system, the Wyoming Basin—resembling in geologic structure and topography an intermontane peninsula of the Great Plains. As a result, the Rockies have never posed an important barrier to east–west transportation in the United States; all major routes, from the Oregon Trail to interstate highways, funnel through the basin, essentially circumventing the main ranges of the Rockies.

      The Northern Rockies contain the most varied mountain landscapes of the Cordillera, reflecting a corresponding geologic complexity. The region's backbone is a mighty series of batholiths—huge masses of molten rock that slowly cooled below the surface and were later uplifted. The batholiths are eroded into rugged granitic ranges, which, in central Idaho, compose the most extensive wilderness country in the coterminous United States. East of the batholiths and opposite the Great Plains, sediments have been folded and thrust-faulted into a series of linear north–south ranges, a southern extension of the spectacular Canadian Rockies. Although elevations run 2,000 to 3,000 feet lower than the Colorado Rockies (most of the Idaho Rockies lie well below 10,000 feet), increased rainfall and northern latitude have encouraged glaciation—there as elsewhere a sculptor of handsome alpine landscape.

      The western branch of the Cordillera directly abuts the Pacific Ocean. This coastal chain, like its Rocky Mountain cousins on the eastern flank of the Cordillera, conceals bewildering complexity behind a facade of apparent simplicity. At first glance the chain consists merely of two lines of mountains with a discontinuous trough between them. Immediately behind the coast is a line of hills and low mountains—the Pacific Coast Ranges. Farther inland, averaging 150 miles from the coast, the line of the Sierra Nevada and the Cascade Range includes the highest elevations in the coterminous United States. Between these two unequal mountain lines is a discontinuous trench, the Troughs of the Coastal Margin.

 The apparent simplicity disappears under the most cursory examination. The Pacific Coast Ranges actually contain five distinct sections, each of different geologic origin and each with its own distinctive topography. The Transverse Ranges of southern California are a crowded assemblage of islandlike faulted ranges, with peak elevations of more than 10,000 feet but sufficiently separated by plains and low passes so that travel through them is easy. From Point Conception to the Oregon border, however, the main California Coast Ranges are entirely different, resembling the Appalachian Ridge and Valley region, with low linear ranges that result from erosion of faulted and folded rocks. Major faults run parallel to the low ridges, and the greatest—the notorious San Andreas Fault—was responsible for the earthquake that all but destroyed San Francisco in 1906. Along the California–Oregon border, everything changes again. In this region, the wildly rugged Klamath Mountains represent a western salient of interior structure reminiscent of the Idaho Rockies and the northern Sierra Nevada. In western Oregon and southwestern Washington the Coast Ranges are also different—a gentle, hilly land carved by streams from a broad arch of marine deposits interbedded with tabular lavas. In the northernmost part of the Coast Ranges and the remote northwest, a domal upwarp has produced the Olympic Mountains; their serrated peaks tower nearly 8,000 feet above Puget Sound and the Pacific, and the heavy precipitation on their upper slopes supports the largest active glaciers in the United States outside of Alaska.

      East of these Pacific Coast Ranges the Troughs of the Coastal Margin contain the only extensive lowland plains of the Pacific margin—California's Central Valley, Oregon's Willamette River valley, and the half-drowned basin of Puget Sound in Washington. Parts of an inland trench that extends for great distances along the east coast of the Pacific, similar valleys occur in such diverse areas as Chile and the Alaska panhandle. These valleys are blessed with superior soils, easily irrigated, and very accessible from the Pacific. They have enticed settlers for more than a century and have become the main centres of population and economic activity for much of the U.S. West Coast.

      Still farther east rise the two highest mountain chains in the coterminous United States—the Cascades and the Sierra Nevada. Aside from elevation, geographic continuity, and spectacular scenery, however, the two ranges differ in almost every important respect. Except for its northern section, where sedimentary and metamorphic rocks occur, the Sierra Nevada is largely made of granite, part of the same batholithic chain that creates the Idaho Rockies. The range is grossly asymmetrical, the result of massive faulting that has gently tilted the western slopes toward the Central Valley but has uplifted the eastern side to confront the interior with an escarpment nearly two miles high. At high elevations glaciers have scoured the granites to a gleaming white, while on the west the ice has carved spectacular valleys such as the Yosemite. The loftiest peak in the Sierras is Mount Whitney, which at 14,494 feet (4,418 metres) is the highest mountain in the coterminous states. The upfaulting that produced Mount Whitney was accompanied by downfaulting that formed nearby Death Valley, at 282 feet (86 metres) below sea level the lowest point in North America.

      The Cascades are made largely of volcanic rock; those in northern Washington contain granite like the Sierras, but the rest are formed from relatively recent lava outpourings of dun-coloured basalt and andesite. The Cascades are in effect two ranges. The lower, older range is a long belt of upwarped lava, rising unspectacularly to elevations between 6,000 and 8,000 feet. Perched above the “low Cascades” is a chain of lofty volcanoes that punctuate the horizon with magnificent glacier-clad peaks. The highest is Mount Rainier, which at 14,410 feet (4,392 metres) is all the more dramatic for rising from near sea level. Most of these volcanoes are quiescent, but they are far from extinct. Mount Lassen in northern California erupted violently in 1914, as did Mount St. Helens in the state of Washington in 1980. Most of the other high Cascade volcanoes exhibit some sign of seismic activity.

The Western Intermontane Region
 The Cordillera's two main chains enclose a vast intermontane region of arid basins, plateaus, and isolated mountain ranges that stretches from the Mexican border nearly to Canada and extends 600 miles from east to west. This enormous territory contains three huge subregions, each with a distinctive geologic history and its own striking topography.

      The Colorado Plateau, nestled against the western flanks of the Southern Rockies, is an extraordinary island of geologic stability set in the turbulent sea of Cordilleran tectonic activity. Stability was not absolute, of course, so that parts of the plateau are warped and injected with volcanics, but in general the landscape results from the erosion by streams of nearly flat-lying sedimentary rocks. The result is a mosaic of angular mesas, buttes, and steplike canyons intricately cut from rocks that often are vividly coloured. Large areas of the plateau are so improbably picturesque that they have been set aside as national preserves. The Grand Canyon of the Colorado River is the most famous of several dozen such areas.

      West of the plateau and abutting the Sierra Nevada's eastern escarpment lies the arid Basin and Range subregion, among the most remarkable topographic provinces of the United States. The Basin and Range extends from southern Oregon and Idaho into northern Mexico. Rocks of great complexity have been broken by faulting, and the resulting blocks have tumbled, eroded, and been partly buried by lava and alluvial debris accumulating in the desert basins. The eroded blocks form mountain ranges that are characteristically dozens of miles long, several thousand feet from base to crest, with peak elevations that rarely rise to more than 10,000 feet, and almost always aligned roughly north–south. The basin floors are typically alluvium and sometimes salt marshes or alkali flats.

      The third intermontane region, the Columbia Basin, is literally the last, for in some parts its rocks are still being formed. Its entire area is underlain by innumerable tabular lava flows that have flooded the basin between the Cascades and Northern Rockies to undetermined depths. The volume of lava must be measured in thousands of cubic miles, for the flows blanket large parts of Washington, Oregon, and Idaho and in southern Idaho have drowned the flanks of the Northern Rocky Mountains in a basaltic sea. Where the lavas are fresh, as in southern Idaho, the surface is often nearly flat, but more often the floors have been trenched by rivers—conspicuously the Columbia and the Snake—or by glacial floodwaters that have carved an intricate system of braided canyons in the remarkable Channeled Scablands of eastern Washington. In surface form the eroded lava often resembles the topography of the Colorado Plateau, but the gaudy colours of the Colorado are replaced here by the sombre black and rusty brown of weathered basalt.

      Most large mountain systems are sources of varied mineral wealth, and the American Cordillera is no exception. Metallic minerals have been taken from most crystalline regions and have furnished the United States with both romance and wealth—the Sierra Nevada gold that provoked the 1849 gold rush, the fabulous silver lodes of western Nevada's Basin and Range, and gold strikes all along the Rocky Mountain chain. Industrial metals, however, are now far more important; copper and lead are among the base metals, and the more exotic molybdenum, vanadium, and cadmium are mainly useful in alloys.

      In the Cordillera, as elsewhere, the greatest wealth stems from fuels. Most major basins contain oil and natural gas, conspicuously the Wyoming Basin, the Central Valley of California, and the Los Angeles Basin. The Colorado Plateau, however, has yielded some of the most interesting discoveries—considerable deposits of uranium and colossal occurrences of oil shale. Oil from the shale, however, probably cannot be economically removed without widespread strip-mining and correspondingly large-scale damage to the environment. Wide exploitation of low-sulfur bituminous coal has been initiated in the Four Corners area of the Colorado Plateau, and open-pit mining has already devastated parts of this once-pristine country as completely as it has West Virginia.

Drainage
      As befits a nation of continental proportions, the United States has an extraordinary network of rivers and lakes, including some of the largest and most useful in the world. In the humid East they provide an enormous mileage of cheap inland transportation; westward, most rivers and streams are unnavigable but are heavily used for irrigation and power generation. Both East and West, however, traditionally have used lakes and streams as public sewers, and despite efforts to clean them up, most large waterways are laden with vast, poisonous volumes of industrial, agricultural, and human wastes.

The Eastern systems
      Chief among U.S. rivers is the Mississippi, which, with its great tributaries, the Ohio and the Missouri, drains most of the midcontinent. The Mississippi is navigable to Minneapolis, nearly 1,200 miles by air from the Gulf of Mexico, and along with the Great Lakes–St. Lawrence system it forms the world's greatest network of inland waterways. The Mississippi's eastern branches, chiefly the Ohio and the Tennessee, are also navigable for great distances. From the west, however, many of its numerous Great Plains tributaries are too seasonal and choked with sandbars to be used for shipping. The Missouri, for example, though longer than the Mississippi itself, was essentially without navigation until the mid-20th century, when a combination of dams, locks, and dredging opened the river to barge traffic.

      The Great Lakes–St. Lawrence system, the other half of the midcontinental inland waterway, is connected to the Mississippi–Ohio via Chicago by canals and the Illinois River. The five Great Lakes (four of which are shared with Canada) constitute by far the largest freshwater lake group in the world and carry a larger tonnage of shipping than any other. The three main barriers to navigation—the St. Marys Rapids, at Sault Sainte Marie; Niagara Falls; and the rapids of the St. Lawrence—are all bypassed by locks, whose 27-foot draft lets ocean vessels penetrate 1,300 miles into the continent, as far as Duluth, Minnesota, and Chicago.

      The third group of Eastern rivers drains the coastal strip along the Atlantic Ocean and the Gulf of Mexico. Except for the Rio Grande, which rises west of the Rockies and flows about 1,900 circuitous miles to the Gulf, few of these coastal rivers measure more than 300 miles, and most flow in an almost straight line to the sea. Except in glaciated New England and in arid southwestern Texas, most of the larger coastal streams are navigable for some distance.

The Pacific systems
      West of the Rockies, nearly all of the rivers are strongly influenced by aridity. In the deserts and steppes of the intermontane basins, most of the scanty runoff disappears into interior basins, only one of which, the Great Salt Lake, holds any substantial volume of water. Aside from a few minor coastal streams, only three large river systems manage to reach the sea—the Columbia, the Colorado, and the San Joaquin–Sacramento system of California's Central Valley. All three of these river systems are exotic: that is, they flow for considerable distances across dry lands from which they receive little water. Both the Columbia and the Colorado have carved awesome gorges, the former through the sombre lavas of the Cascades and the Columbia Basin, the latter through the brilliantly coloured rocks of the Colorado Plateau. These gorges lend themselves to easy damming, and the once-wild Columbia has been turned into a stairway of placid lakes whose waters irrigate the arid plateaus of eastern Washington and power one of the world's largest hydroelectric networks. The Colorado is less extensively developed, and proposals for new dam construction have met fierce opposition from those who want to preserve the spectacular natural beauty of the river's canyon lands.

Climate
      Climate affects human habitats both directly and indirectly through its influence on vegetation, soils, and wildlife. In the United States, however, the natural environment has been altered drastically by nearly four centuries of European settlement, as well as thousands of years of Indian occupancy.

      Wherever land is abandoned, however, “wild” conditions return rapidly, achieving over the long run a dynamic equilibrium among soils, vegetation, and the inexorable strictures of climate. Thus, though Americans have created an artificial environment of continental proportions, the United States still can be divided into a mosaic of bioclimatic regions, each of them distinguished by peculiar climatic conditions and each with a potential vegetation and soil that eventually would return in the absence of humans. The main exception to this generalization applies to fauna, so drastically altered that it is almost impossible to know what sort of animal geography would redevelop in the areas of the United States if humans were removed from the scene.

Climatic controls
      The pattern of U.S. climates is largely set by the location of the coterminous United States almost entirely in the middle latitudes, by its position with respect to the continental landmass and its fringing oceans, and by the nation's gross pattern of mountains and lowlands. Each of these geographic controls operates to determine the character of air masses and their changing behaviour from season to season.

      The coterminous United States lies entirely between the Tropic of Cancer and 50° N latitude, a position that confines Arctic climates to the high mountaintops and genuine tropics to a small part of southern Florida. By no means, however, is the climate literally temperate, for the middle latitudes are notorious for extreme variations of temperature and precipitation.

      The great size of the North American landmass tends to reinforce these extremes. Since land heats and cools more rapidly than bodies of water, places distant from an ocean tend to have continental climates; that is, they alternate between extremes of hot summers and cold winters, in contrast to the marine climates, which are more equable. Most U.S. climates are markedly continental, the more so because the Cordillera effectively confines the moderating Pacific influence to a narrow strip along the West Coast. Extremes of continentality occur near the centre of the country, and in North Dakota temperatures have ranged between a summer high record of 121 °F (49 °C) and a winter low of −60 °F (−51 °C). Moreover, the general eastward drift of air over the United States carries continental temperatures all the way to the Atlantic coast. Bismarck, N.D., for example, has a great annual temperature range. Boston, on the Atlantic but largely exempt from its influence, has a lesser but still-continental range, while San Francisco, which is under strong Pacific influence, has only a small summer–winter differential.
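The temperatures above, like those throughout this article, are quoted in degrees Fahrenheit with Celsius equivalents in parentheses. A minimal sketch of the standard conversion, checked against the North Dakota extremes cited in the preceding paragraph:

```python
def f_to_c(temp_f: float) -> float:
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32) * 5 / 9

# The North Dakota record extremes quoted above:
print(round(f_to_c(121)))   # 49, i.e., 121 °F = 49 °C
print(round(f_to_c(-60)))   # -51, i.e., −60 °F = −51 °C
```

The parenthetical Celsius figures in the text are rounded to the nearest whole degree, as here.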

      In addition to confining Pacific temperatures to the coastal margin, the Pacific Coast Ranges are high enough to make a local rain shadow in their lee, although the main barrier is the great rampart formed by the Sierra Nevada and Cascade ranges. Rainy on their western slopes and barren on the east, this mountain crest forms one of the sharpest climatic divides in the United States.

      The rain shadow continues east to the Rockies, leaving the entire Intermontane Region either arid or semiarid, except where isolated ranges manage to capture leftover moisture at high altitudes. East of the Rockies the westerly drift brings mainly dry air, and as a result, the Great Plains are semiarid. Still farther east, humidity increases owing to the frequent incursion from the south of warm, moist, and unstable air from the Gulf of Mexico, which produces more precipitation in the United States than the Pacific and Atlantic oceans combined.

      Although the landforms of the Interior Lowlands have been termed dull, there is nothing dull about their weather conditions. Air from the Gulf of Mexico can flow northward across the Great Plains, uninterrupted by topographical barriers, but continental Canadian air flows south by the same route, and, since these two air masses differ in every important respect, the collisions often produce disturbances of monumental violence. Plainsmen and Midwesterners are accustomed to sudden displays of furious weather—tornadoes, blizzards, hailstorms, precipitous drops and rises in temperature, and a host of other spectacular meteorological displays, sometimes dangerous but seldom boring.

The change of seasons
      Most of the United States is marked by sharp differences between winter and summer. In winter, when temperature contrasts between land and water are greatest, huge masses of frigid, dry Canadian air periodically spread far south over the midcontinent, bringing cold, sparkling weather to the interior and generating great cyclonic storms where their leading edges confront the shrunken mass of warm Gulf air to the south. Although such cyclonic activity occurs throughout the year, it is most frequent and intense during the winter, parading eastward out of the Great Plains to bring the Eastern states practically all their winter precipitation. Winter temperatures differ widely, depending largely on latitude. Thus, New Orleans, La., at 30° N latitude, and International Falls, Minn., at 49° N, have respective January temperature averages of 55 °F (13 °C) and 3 °F (−16° C). In the north, therefore, precipitation often comes as snow, often driven by furious winds; farther south, cold rain alternates with sleet and occasional snow. Southern Florida is the only dependably warm part of the East, though “polar outbursts” have been known to bring temperatures below 0 °F (−18 °C) as far south as Tallahassee. The main uniformity of Eastern weather in wintertime is the expectation of frequent change.

      Winter climate on the West Coast is very different. A great spiraling mass of relatively warm, moist air spreads south from the Aleutian Islands of Alaska, its semipermanent front producing gloomy overcast and drizzles that hang over the Pacific Northwest all winter long, occasionally reaching southern California, which receives nearly all of its rain at this time of year. This Pacific air brings mild temperatures along the length of the coast; the average January day in Seattle, Wash., ranges between 33 and 44 °F (1 and 7 °C) and in Los Angeles between 45 and 64 °F (7 and 18 °C). In southern California, however, rains are separated by long spells of fair weather, and the whole region is a winter haven for those seeking refuge from less agreeable weather in other parts of the country. The Intermontane Region is similar to the Pacific Coast, but with much less rainfall and a considerably wider range of temperatures.

      During the summer there is a reversal of the air masses, and east of the Rockies the change resembles the summer monsoon of Southeast Asia. As the midcontinent heats up, the cold Canadian air mass weakens and retreats, pushed north by an aggressive mass of warm, moist air from the Gulf. The great winter temperature differential between North and South disappears as the hot, soggy blanket spreads from the Gulf coast to the Canadian border. Heat and humidity are naturally most oppressive in the South, but there is little comfort in the more northern latitudes. In Houston, Texas, the temperature on a typical July day reaches 93 °F (34 °C), with relative humidity averaging near 75 percent, but Minneapolis, Minn., more than 1,000 miles north, is only slightly cooler and less humid.

      Since the Gulf air is unstable as well as wet, convectional and frontal summer thunderstorms are endemic east of the Rockies, accounting for a majority of total summer rain. These storms usually drench small areas with short-lived, sometimes violent downpours, so that crops in one Midwestern county may prosper, those in another shrivel in drought, and those in yet another be flattened by hailstones. Relief from the humid heat comes in the northern Midwest from occasional outbursts of cool Canadian air; small but more consistent relief is found downwind from the Great Lakes and at high elevations in the Appalachians. East of the Rockies, however, U.S. summers are distinctly uncomfortable, and air conditioning is viewed as a desirable amenity in most areas.

      Again, the Pacific regime is different. The moist Aleutian air retreats northward, to be replaced by mild, stable air from over the subtropical but cool waters of the Pacific, and except in the mountains the Pacific Coast is nearly rainless though often foggy. In the meanwhile, a small but potent mass of dry hot air raises temperatures to blistering levels over much of the intermontane Southwest. In Yuma, Ariz., for example, the normal temperature in July reaches 107 °F (42 °C), while nearby Death Valley, Calif., holds the national record, 134 °F (57 °C). During its summer peak this scorching air mass spreads from the Pacific margin as far as Texas on the east and Idaho to the north, turning the whole interior basin into a summer desert.

      Over most of the United States, as in most continental climates, spring and autumn are agreeable but disappointingly brief. Autumn is particularly idyllic in the East, with a romantic Indian summer of ripening corn and brilliantly coloured foliage and of mild days and frosty nights. The shift in dominance between marine and continental air masses, however, spawns furious weather in some regions. Along the Atlantic and Gulf coasts, for example, autumn is the season for hurricanes—the American equivalent of typhoons of the Asian Pacific—which rage northward from the warm tropics to create havoc along the Gulf and Atlantic coasts as far north as New England. The Mississippi valley holds the dubious distinction of recording more tornadoes than any other area on Earth. These violent and often deadly storms usually occur over relatively small areas and are confined largely to spring and early summer.

The bioclimatic regions
      Three first-order bioclimatic zones encompass most of the coterminous United States—regions in which climatic conditions are similar enough to dictate similar conditions of mature (zonal) soil and potential climax vegetation (i.e., the assemblage of plants that would grow and reproduce indefinitely given stable climate and average conditions of soil and drainage). These are the Humid East, the Humid Pacific Coast, and the Dry West. In addition, the boundary zone between the Humid East and the Dry West is so large and important that it constitutes a separate region, the Humid–Arid Transition. Finally, because the Western Cordillera contains an intricate mosaic of climatic types, largely determined by local elevation and exposure, it is useful to distinguish the Western Mountain Climate. The first three zones, however, are very diverse and require further breakdown, producing a total of 10 main bioclimatic regions. For two reasons, the boundaries of these bioclimatic regions are much less distinct than boundaries of landform regions. First, climate varies from year to year, especially in boundary zones, whereas landforms obviously do not. Second, regions of climate, vegetation, and soils coincide generally but sometimes not precisely. Boundaries, therefore, should be interpreted as zonal and transitional, and rarely should be considered as sharp lines in the landscape.

      For all of their indistinct boundaries, however, these bioclimatic regions have strong and easily recognized identities. Such regional identity is strongly reinforced when a particular area falls entirely within a single bioclimatic region and at the same time a single landform region. The result—as in the Piedmont South, the central Midwest, or the western Great Plains—is a landscape with an unmistakable regional personality.

The Humid East
      The largest and in some ways the most important of the bioclimatic zones, the Humid East was where the Europeans first settled, tamed the land, and adapted to American conditions. In early times almost all of this territory was forested, a fact of central importance in American history that profoundly influenced both soils and wildlife. As in most of the world's humid lands, soluble minerals have been leached from the earth, leaving a great family of soils called pedalfers, rich in relatively insoluble iron and aluminum compounds.

      Both forests and soils, however, differ considerably within this vast region. Since rainfall is ample and summers are warm everywhere, the main differences result from the length and severity of winters, which determine the length of the growing season. Winter, obviously, differs according to latitude, so that the Humid East is sliced into four great east–west bands of soils and vegetation, with progressively more amenable winters as one travels southward. These changes occur very gradually, however, and the boundaries therefore are extremely subtle.

      The Sub-Boreal Forest Region is the northernmost of these bands. It is only a small and discontinuous part of the United States, representing the tattered southern fringe of the vast Canadian taiga—a scrubby forest dominated by evergreen needle-leaf species that can endure the ferocious winters and reproduce during the short, erratic summers. Average growing seasons are less than 120 days, though localities in Michigan's Upper Peninsula have recorded frost-free periods lasting as long as 161 days and as short as 76 days. Soils of this region that survived the scour of glaciation are miserably thin podzols—heavily leached, highly acid, and often interrupted by extensive stretches of bog. Most attempts at farming in the region long since have been abandoned.

      Farther south lies the Humid Microthermal Zone of milder winters and longer summers. Large broadleaf trees begin to predominate over the evergreens, producing a mixed forest of greater floristic variety and economic value that is famous for its brilliant autumn colours. As the forest grows richer in species, sterile podzols give way to more productive gray-brown podzolic soils, stained and fertilized with humus. Although winters are warmer than in the Sub-Boreal zone, and although the Great Lakes help temper the bitterest cold, January temperatures ordinarily average below freezing, and a winter without a few days of subzero temperatures is uncommon. Everywhere, the ground is solidly frozen and snow covered for several months of the year.

      Still farther south are the Humid Subtropics. The region's northern boundary is one of the country's most significant climatic lines: the approximate northern limit of a growing season of 180–200 days, the outer margin of cotton growing, and, hence, of the Old South. Most of the South lies in the Piedmont and Coastal Plain, for higher elevations in the Appalachians cause a peninsula of Northern mixed forest to extend as far south as northern Georgia. The red-brown podzolic soil, once moderately fertile, has been severely damaged by overcropping and burning. Thus much of the region that once sustained a rich, broadleaf-forest flora now supports poor piney woods. Throughout the South, summers are hot, muggy, long, and disagreeable; Dixie's “frosty mornings” bring a welcome respite in winter.

      The southern margins of Florida contain the only real tropics in the coterminous United States; it is an area in which frost is almost unknown. Hot, rainy summers alternate with warm and somewhat drier winters, with a secondary rainfall peak during the autumn hurricane season—altogether a typical monsoonal regime. Soils and vegetation are mostly immature, however, since southern Florida rises so slightly above sea level that substantial areas, such as the Everglades, are swampy and often brackish. Peat and sand frequently masquerade as soil, and much of the vegetation is either salt-loving mangrove or sawgrass prairie.

The Humid Pacific Coast
      The western humid region differs from its eastern counterpart in so many ways as to be a world apart. Much smaller, it is crammed into a narrow littoral belt to the windward of the Sierra–Cascade summit, dominated by mild Pacific air, and chopped by irregular topography into an intricate mosaic of climatic and biotic habitats. Throughout the region rainfall is extremely seasonal, falling mostly in the winter half of the year. Summers are droughty everywhere, but the main regional differences come from the length of drought—from about two months in humid Seattle, Wash., to nearly five months in semiarid San Diego, Calif.

      Western Washington, Oregon, and northern California lie within a zone that climatologists call Marine West Coast. Winters are raw, overcast, and drizzly—not unlike northwestern Europe—with subfreezing temperatures restricted mainly to the mountains, upon which enormous snow accumulations produce local alpine glaciers. Summers, by contrast, are brilliantly cloudless, cool, and frequently foggy along the West Coast and somewhat warmer in the inland valleys. This mild marine climate produces some of the world's greatest forests of enormous straight-boled evergreen trees that furnish the United States with much of its commercial timber. Mature soils are typical of humid midlatitude forestlands, a moderately leached gray-brown podzol.

      Toward the south, with diminishing coastal rain the moist marine climate gradually gives way to California's tiny but much-publicized Mediterranean regime. Although mountainous topography introduces a bewildering variety of local environments, scanty winter rains are quite inadequate to compensate for the long summer drought, and much of the region has a distinctly arid character. For much of the year, cool, stable Pacific air dominates the West Coast, bringing San Francisco its famous fogs and Los Angeles its infamous smoggy temperature inversions. Inland, however, summer temperatures reach blistering levels, so that in July, while Los Angeles expects a normal daily maximum of 83 °F (28 °C), Fresno expects 100 °F (38 °C) and is climatically a desert. As might be expected, Mediterranean California contains a huge variety of vegetal habitats, but the commonest perhaps is the chaparral, a drought-resistant, scrubby woodland of twisted hard-leafed trees, picturesque but of little economic value. Chaparral is a pyrophytic (fire-loving) vegetation—i.e., under natural conditions its growth and form depend on regular burning. These fires constitute a major environmental hazard in the suburban hills above Los Angeles and San Francisco Bay, especially in autumn, when hot dry Santa Ana winds from the interior regularly convert brush fires into infernos. Soils are similarly varied, but most of them are light in colour and rich in soluble minerals, qualities typical of subarid soils.

The Dry West
      In the United States, to speak of dry areas is to speak of the West. It covers an enormous region beyond the dependable reach of moist oceanic air, occupying the entire Intermontane area and sprawling from Canada to Mexico across the western part of the Great Plains. To Americans nurtured in the Humid East, this vast territory across the path of all transcontinental travelers has been harder to tame than any other—and no region has so gripped the national imagination as this fierce and dangerous land.

      In the Dry West nothing matters more than water. Thus, though temperatures may differ radically from place to place, the really important regional differences depend overwhelmingly on the degree of aridity, whether an area is extremely dry and hence desert or semiarid and therefore steppe.

      Americans of the 19th century were preoccupied by the myth of a Great American Desert, which supposedly occupied more than one-third of the entire country. True desert, however, is confined to the Southwest, with patchy outliers elsewhere, all without exception located in the lowland rain shadows of the Cordillera. Vegetation in these desert areas ranges from nothing at all (a rare circumstance confined mainly to salt flats and sand dunes) to a low cover of scattered woody scrub and short-lived annuals that burst into flamboyant bloom after rains. Soils are usually thin, light-coloured, and very rich in mineral salts. In some areas wind erosion has removed fine-grained material, leaving behind desert pavement, a barren veneer of broken rock.

      Most of the West, however, lies in the semiarid region, in which rainfall is scanty but adequate to support a thin cover of short bunchgrass, commonly alternating with scrubby brush. Here, as in the desert, soils fall into the large family of the pedocals, rich in calcium and other soluble minerals, but in the slightly wetter environments of the West, they are enriched with humus from decomposed grass roots. Under the proper type of management, these chestnut-coloured steppe soils have the potential to be very fertile.

      Weather in the West resembles that of other dry regions of the world, often extreme, violent, and reliably unreliable. Rainfall, for example, obeys a cruel natural law: as total precipitation decreases, it becomes more undependable. John Steinbeck's novel The Grapes of Wrath describes the problems of a family enticed to the arid frontier of Oklahoma during a wet period only to be driven out by the savage drought of the 1930s that turned the western Great Plains into the great American Dust Bowl. Temperatures in the West also fluctuate convulsively within short periods, and high winds are infamous throughout the region.

The Humid–Arid Transition
      East of the Rockies all climatic boundaries are gradational. None, however, is so important or so imperceptibly subtle as the boundary zone that separates the Humid East from the Dry West and that alternates unpredictably between arid and humid conditions from year to year. Stretching approximately from Texas to North Dakota in an ill-defined band between the 95th and 100th meridians, this transitional region deserves separate recognition, partly because of its great size, and partly because of the fine balance between surplus and deficit rainfall, which produces a unique and valuable combination of soils, flora, and fauna. The native vegetation, insofar as it can be reconstructed, was prairie, the legendary sea of tall, deep-rooted grass now almost entirely tilled and planted to grains. Soils, often of loessial derivation, include the enormously productive chernozem (black earth) in the north, with reddish prairie soils of nearly equal fertility in the south. Throughout the region temperatures are severely continental, with bitterly cold winters in the north and scorching summers everywhere.

      The western edge of the prairie fades gradually into the shortgrass steppe of the High Plains, the change a function of diminishing rainfall. The eastern edge, however, represents one of the few major discordances between a climatic and biotic boundary in the United States, for the grassland penetrates the eastern forest in a great salient across humid Illinois and Indiana. Many scholars believe this part of the prairie was artificially induced by repeated burning and consequent destruction of the forest margins by Indians.

The Western mountains
      Throughout the Cordillera and Intermontane regions, irregular topography shatters the grand bioclimatic pattern into an intricate mosaic of tiny regions that differ drastically according to elevation and exposure. No small- or medium-scale map can accurately record such complexity, and mountainous parts of the West are said, noncommittally, to have a “mountain climate.” Lowlands are usually dry, but increasing elevation brings lower temperature, decreased evaporation, and—if a slope faces prevailing winds—greater precipitation. Soils vary wildly from place to place, but vegetation is fairly predictable. From the desert or steppe of intermontane valleys, a climber typically ascends into parklike savanna, then through an orderly sequence of increasingly humid and boreal forests until, if the range is high enough, one reaches the timberline and Arctic tundra. The very highest peaks are snow-capped, although permanent glaciers rarely occur outside the cool humid highlands of the Pacific Northwest.

Peirce F. Lewis

Plant life
      The dominant features of the vegetation are indicated by the terms forest, grassland, desert, and alpine tundra.

      A coniferous forest of white and red pine, hemlock, spruce, jack pine, and balsam fir extends interruptedly in a narrow strip near the Canadian border from Maine to Minnesota and southward along the Appalachian Mountains. There may be found smaller stands of tamarack, spruce, paper birch, willow, alder, and aspen or poplar. Southward, a transition zone of mixed conifers and deciduous trees gives way to a hardwood forest of broad-leaved trees. This forest, with varying mixtures of maple, oak, ash, locust, linden, sweet gum, walnut, hickory, sycamore, beech, and the more southerly tulip tree, once extended uninterruptedly from New England to Missouri and eastern Texas. Pines are prominent on the Atlantic and Gulf coastal plain and adjacent uplands, often occurring in nearly pure stands called pine barrens. Pitch, longleaf, slash, shortleaf, Virginia, and loblolly pines are commonest. Hickory and various oaks combine to form a significant part of this forest, with magnolia, white cedar, and ash often seen. In the frequent swamps, bald cypress, tupelo, and white cedar predominate. Pines, palmettos, and live oaks are replaced at the southern tip of Florida by the more tropical royal and thatch palms, figs, satinwood, and mangrove.

      The grasslands occur principally in the Great Plains area and extend westward into the intermontane basins and benchlands of the Rocky Mountains. Numerous grasses such as buffalo, grama, side oat, bunch, needle, and wheat grass, together with many kinds of herbs, make up the plant cover. Coniferous forests cover the lesser mountains and high plateaus of the Rockies, Cascades, and Sierra Nevada. Ponderosa (yellow) pine, Douglas fir, western red cedar, western larch, white pine, lodgepole pine, several spruces, western hemlock, grand fir, red fir, and the lofty redwood are the principal trees of these forests. The densest growth occurs west of the Cascade and Coast ranges in Washington, Oregon, and northern California, where the trees are often 100 feet or more in height. There the forest floor is so dark that only ferns, mosses, and a few shade-loving shrubs and herbs may be found.

      The alpine tundra, located in the coterminous United States only in the mountains above the limit of trees, consists principally of small plants that bloom brilliantly for a short season. Sagebrush is the most common plant of the arid basins and semideserts west of the Rocky Mountains, but juniper, nut pine, and mountain mahogany are often found on the slopes and low ridges. The desert, extending from southeastern California to Texas, is noted for the many species of cactus, some of which grow to the height of trees, and for the Joshua tree and other yuccas, creosote bush, mesquite, and acacias.

      The United States is rich in the variety of its native forest trees, some of which, such as the species of sequoia, are the most massive known. More than 1,000 species and varieties have been described, of which almost 200 are of economic value, either because of the timber and other useful products that they yield or because of their importance in forestry.

      Besides the native flowering plants, estimated at between 20,000 and 25,000 species, many hundreds of species introduced from other regions—chiefly Europe, Asia, and tropical America—have become naturalized. A large proportion of these are common annual weeds of fields, pastures, and roadsides. In some districts these naturalized “aliens” constitute 50 percent or more of the total plant population.

Paul H. Oehser Reed C. Rollins Ed.

Animal life
      Along with most of North America, the United States lies in the Nearctic faunistic realm, a region whose assemblage of species is similar to that of Eurasia and North Africa but sharply different from those of the tropical and subtropical zones to the south. Main regional differences correspond roughly with primary climatic and vegetal patterns. Thus, for example, the animal communities of the Dry West differ sharply from those of the Humid East and from those of the Pacific Coast. Because animals tend to range over wider areas than plants, faunal regions are generally coarser than vegetal regions and harder to delineate sharply.

      The animal geography of the United States, however, is far from a natural pattern, for European settlement produced a series of environmental changes that grossly altered the distribution of animal communities. First, many species were hunted to extinction or near extinction, most conspicuously, perhaps, the American bison, which ranged by the millions nearly from coast to coast but now rarely lives outside of zoos and wildlife preserves. Second, habitats were upset or destroyed throughout most of the country—forests cut, grasslands plowed and overgrazed, and migration paths interrupted by fences, railroads, and highways. Third, certain introduced species found hospitable niches and, like the English sparrow, spread over huge areas, often preempting the habitats of native animals. Fourth, though their effects are not fully understood, chemical biocides such as DDT were used for so long and in such volume that they are believed at least partly responsible for catastrophic mortality rates among large mammals and birds, especially predators high on the food chain. Fifth, there has been a gradual northward migration of certain tropical and subtropical insects, birds, and mammals, perhaps encouraged by gradual climatic warming. In consequence, many native animals have been reduced to tiny fractions of their former ranges or exterminated completely, while other animals, both native and introduced, have found the new anthropocentric environment well suited to their needs, with explosive effects on their populations. The coyote, opossum, armadillo, and several species of deer are among the animals that now occupy much larger ranges than they once did.

Peirce F. Lewis
      Arranging the account of faunal distribution according to the climatic and vegetal regions has the merit that it can be compared further with the distribution of insects and of other invertebrates, some of which may be expected to fall into the same patterns as the vertebrates, while others, with different modes or different ages of dispersal, have geographic patterns of their own.

      The transcontinental zone of coniferous forest at the north, the taiga (boreal forest), and the tundra zone into which it merges at the northern limit of tree growth are strikingly paralleled by similar vertical zones in the Rockies and, in the east, on Mount Washington, where the area above the timberline and below the snow line is often inhabited by tundra animals such as the ptarmigan and the white Parnassius butterflies, while the spruce and other conifers below the timberline form a belt sharply set off from the grassland, hardwood forest, or desert at still lower altitudes.

      A whole series of important types of animals spread beyond the limits of such regions or zones, sometimes over most of the continent. Aquatic animals, in particular, may live equally in forest and plains, in the Gulf states, and at the Canadian border. Such widespread animals include the white-tailed (Virginia) deer and black bear, the puma (though only in the remotest parts of its former range) and bobcat, the river otter (though now rare in inland areas south of the Great Lakes) and mink, and the beaver and muskrat. The distinctive coyote ranges over all of western North America and eastward as far as Maine. The snapping turtle ranges from the Atlantic coast to the Rocky Mountains.

      In the northern coniferous forest zone, or taiga, the relations of animals with European or Eurasian representatives are numerous, and this zone is also essentially circumpolar. The relations are less close than in the Arctic forms, but the moose, beaver, hare, red fox, otter, wolverine, and wolf are recognizably related to Eurasian animals. Even some fishes, like the whitefishes (Coregonidae), the yellow perch, and the pike, exhibit this kind of Old World–New World relation. A distinctively North American animal in this taiga assemblage is the Canadian porcupine.

      The hardwood forest area of the east and the pinelands of the southeast compose the most important of the faunal regions within the United States. A great variety of fishes, amphibians, and reptiles of this region have related forms in East Asia, a pattern of representation likewise found in the flora. The area is rich in catfishes, minnows, and suckers. The curious ganoid fishes, the bowfin and the gar, are ancient types. The spoonbill cat, or paddlefish, a remarkable relative of the sturgeons found in the lower Mississippi, is represented elsewhere in the world only in the Yangtze in China. The Appalachian region is headquarters for the salamanders of the world, with no fewer than seven of the eight families of this large group of amphibians represented; no other continent has more than three of the eight families together. The eellike sirens and amphiumas (congo snakes) are confined to the southeastern states. The lungless salamanders of the family Plethodontidae exhibit a remarkable variety of genera and a number of species centring in the Appalachians. There is a great variety of frogs, including tree frogs whose main development is South American and Australian. The emydid freshwater turtles of the southeast parallel those of East Asia to a remarkable degree, though the genus Clemmys is the only one represented in both regions. Much the same is true of the water snakes, pit vipers, rat snakes, and green snakes, though still others are peculiarly American. The familiar alligator likewise has an Asian relative, the only other living true alligator being a species in central China.

      In its mammals and birds the southeastern fauna is less sharply distinguished from the life to the north and west and is less directly related to that of East Asia. The forest is the home of the white-tailed deer, the black bear, the gray fox, the raccoon, and the common opossum. The wild turkey and the extinct hosts of the passenger pigeon were characteristic. There is a remarkable variety of woodpeckers. The birdlife in general tends to differ from that of Eurasia in the presence of birds, like the tanagers, American orioles, and hummingbirds, that belong to South American families. Small mammals abound with types of the worldwide rodent family Cricetidae, and with distinctive moles and shrews.

      Most distinctive of the grassland animals proper is the American bison, whose nearly extinct European relative, the wisent, is a forest dweller. The most distinctive of the American hoofed animals is the pronghorn, or prongbuck, which represents a family intermediate between the deer and the true antelopes in that it sheds its horns like a deer but retains the bony horn cores. The pronghorn is perhaps primarily a desert mammal, but it formerly ranged widely into the shortgrass plains. Everywhere in open country in the West there are conspicuous and distinctive rodents. The burrowing pocket gopher is peculiarly American; rarely seen, it makes its presence known by pushed-out mounds of earth. The ground squirrels of the genus Citellus are related to those of Central Asia and resemble them in habit; in North America the gregarious prairie dog is a closely related form. The American badger, not especially related to the badger of Europe, has its headquarters in the grasslands. The prairie chicken is a bird distinctive of the plains region, which is invaded everywhere by birds from both the east and the west.

      The Southwestern deserts are a paradise for reptiles. Distinctive lizards such as the venomous Gila monster abound, and the rattlesnakes, of which only a few species are found elsewhere in the United States, are common there. Desert reptile species often range to the Pacific Coast and northward into the Great Basin. Noteworthy mammals are the graceful bipedal kangaroo rat, which is almost exclusively nocturnal; the ring-tailed cat, a relative of the raccoon; and the piglike peccary.

      The Rocky Mountains and other western ranges afford distinctive habitats for rock- and cliff-dwelling hoofed animals and rodents. The small pikas, related to the rabbit, inhabit talus areas at high altitudes as they do in the mountain ranges of East Asia. Marmots live in the Rockies as in the Alps. Every western range formerly had its own race of mountain sheep. At the north the Rocky Mountain goat lives at high altitudes—it is more properly a goat antelope, related to the takin of the mountains of western China. The dipper, remarkable for its habit of feeding in swift-flowing streams, though otherwise a bird without special aquatic adaptations, is a Rocky Mountain form with relatives in Asia and Europe.

      In the Pacific region the extremely distinctive primitive tailed frog Ascaphus, which inhabits icy mountain brooks, represents a family by itself, perhaps more nearly related to the frogs of New Zealand than to more familiar types. The Cascades and Sierras form centres for salamanders of the families Ambystomatidae and Plethodontidae second only to the Appalachians, and there are also distinctive newts. The burrowing lizards of the well-defined family Anniellidae are found only in a limited area of coastal California. The only family of birds distinctive of North America, that of the wren-tits, Chamaeidae, is found in the chaparral of California. The mountain beaver, or sewellel (which is not at all beaverlike), is likewise a type peculiar to North America, confined to the Cascades and Sierras, and there are distinct kinds of moles in the Pacific area.

      The mammals of the two coasts are strikingly different, though true seals (the harbour seal and the harp seal) are found on both. The sea lions, with longer necks and with projecting ears, are found only in the Pacific—the California sea lion, the more northern Steller's sea lion, and the fur seal. On the East Coast the larger rivers of Florida are inhabited by the Florida manatee, or sea cow, a close relative of the more widespread and more distinctively marine West Indian species.

Karl Patterson Schmidt Ed.

Settlement patterns
      Although the land that now constitutes the United States was occupied and much affected by diverse Indian cultures over many millennia, these pre-European settlement patterns have had virtually no impact upon the contemporary nation—except locally, as in parts of New Mexico. A benign habitat permitted a huge contiguous tract of settled land to materialize across nearly all the eastern half of the United States and within substantial patches of the West. The vastness of the land, the scarcity of labour, and the abundance of migratory opportunities in a land replete with raw physical resources contributed to exceptional human mobility and a quick succession of ephemeral forms of land use and settlement. Human endeavours have greatly transformed the landscape, but such efforts have been largely destructive. Most of the pre-European landscape in the United States was so swiftly and radically altered that it is difficult to conjecture intelligently about its earlier appearance.

      The overall impression of the settled portion of the American landscape, rural or urban, is one of disorder and incoherence, even in areas of strict geometric survey. The individual landscape unit is seldom in visual harmony with its neighbour, so that, however sound in design or construction the single structure may be, the general effect is untidy. These attributes have been intensified by the acute individualism of the American, vigorous speculation in land and other commodities, a strongly utilitarian attitude toward the land and the treasures above and below it, and government policy and law. The landscape is also remarkable for its extensive transportation facilities, which have greatly influenced the configuration of the land.

      Another special characteristic of American settlement, one that became obvious only by the mid-20th century, is the convergence of rural and urban modes of life. The farmsteads—and rural folk in general—have become increasingly urbanized, and agricultural operations have become more automated, while the metropolis grows more gelatinous, unfocused, and pseudo-bucolic along its margins.

Rural settlement
      Patterns of rural settlement indicate much about the history, economy, society, and minds of those who created them as well as about the land itself. The essential design of rural activity in the United States bears a strong family resemblance to that of other neo-European lands, such as Canada, Australia, New Zealand, South Africa, Argentina, or tsarist Siberia—places that have undergone rapid occupation and exploitation by immigrants intent upon short-term development and enrichment. In all such areas, under novel social and political conditions and with a relative abundance of territory and physical resources, ideas and institutions derived from a relatively stable medieval or early modern Europe have undergone major transformation. Further, these are nonpeasant countrysides, alike in having failed to achieve the intimate symbiosis of people and habitat, the humanized rural landscapes characteristic of many relatively dense, stable, earthbound communities in parts of Asia, Africa, Europe, and Latin America.

Early models of land allocation
      From the beginning the prevalent official policy of the British (except between 1763 and 1776) and then of the U.S. government was to promote agricultural and other settlement—to push the frontier westward as fast as physical and economic conditions permitted. The British crown's grants of large, often vaguely specified tracts to individual proprietors or companies enabled the grantees to draw settlers by the sale or lease of land at attractive prices or even by outright gift.

      Of the numerous attempts at group colonization, the most notable effort was the theocratic and collectivist New England town that flourished, especially in Massachusetts, Connecticut, and New Hampshire, during the first century of settlement. The town, the basic unit of government and comparable in area to townships in other states, allotted both rural and village parcels to single families by group decision. Contrary to earlier scholarly belief, in all but a few cases settlement was spatially dispersed in the socially cohesive towns, at least until about 1800. The relatively concentrated latter-day villages persist today as amoeba-like entities straggling along converging roads, neither fully rural nor agglomerated in form. The only latter-day settlement experiment of notable magnitude to achieve enduring success was a series of Mormon settlements in the Great Basin region of Utah and adjacent states, with their tightly concentrated farm villages reminiscent of the New England model. Other efforts have been made along ethnic, religious, or political lines, but success has been at best brief and fragile.

Creating the national domain
      With the coming of independence and after complex negotiations, the original 13 states surrendered to the new national government nearly all their claims to the unsettled western lands beyond their boundaries. Some tracts, however, were reserved for disposal to particular groups. Thus, the Western Reserve of northeastern Ohio gave preferential treatment to natives of Connecticut, while the military tracts in Ohio and Indiana were used as bonus payments to veterans of the American Revolution.

      A federally administered national domain was created, to which the great bulk of the territory acquired in 1803 in the Louisiana Purchase and later beyond the Mississippi and in 1819 in Florida was consigned. The only major exceptions were the public lands of Texas, which were left within that state's jurisdiction; such earlier French and Spanish land grants as were confirmed, often after tortuous litigation; and some Indian lands. In sharp contrast to the slipshod methods of colonial land survey and disposal, the federal land managers expeditiously surveyed, numbered, and mapped their territory in advance of settlement, beginning with Ohio in the 1780s, then sold or deeded it to settlers under inviting terms at a number of regional land offices.

      The design universally followed in the new survey system (except within the French, Spanish, and Indian grants) was a simple, efficient rectangular scheme. Townships were laid out as blocks, each six by six miles in size, oriented with the compass directions. Thirty-six sections, each one square mile, or 640 acres (260 hectares), in size, were designated within each township; and public roads were established along section lines and, where needed, along half-section lines. At irregular intervals, offsets in survey lines and roads were introduced to allow for the Earth's curvature. Individual property lines were coincident with, or parallel to, survey lines, and this pervasive rectangularity generally carried over into the geometry of fields and fences or into the townsites later superimposed upon the basic rural survey.
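The arithmetic of the rectangular survey can be sketched in a few lines (an illustrative check only, not part of any historical surveying procedure; the figures are those given above, plus the 160-acre quarter section familiar from later homestead legislation):

```python
# Rectangular survey arithmetic: a township is a 6 x 6 mile block,
# divided into 36 sections of one square mile (640 acres) each.
ACRES_PER_SQ_MILE = 640

township_side_mi = 6
sections_per_township = township_side_mi ** 2            # 6 x 6 = 36 sections
acres_per_section = 1 * ACRES_PER_SQ_MILE                # 640 acres
quarter_section_acres = acres_per_section // 4           # 160 acres (the later
                                                         # homestead allotment)
acres_per_township = sections_per_township * acres_per_section

print(sections_per_township)   # 36
print(quarter_section_acres)   # 160
print(acres_per_township)      # 23040
```

A full township thus contains 23,040 acres, and the half- and quarter-section lines along which roads and property boundaries ran follow directly from this subdivision.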

      This all-encompassing checkerboard pattern is best appreciated from an airplane window over Iowa or Kansas. There, one sees few streams or other natural features and few diagonal highways or railroads interrupting the overwhelming squareness of the landscape. A systematic rectangular layout, rather less rigorous in form, also appears in much of Texas and in those portions of Maine, western New York and Pennsylvania, and southern Georgia that were settled after the 1780s.

Distribution of rural lands
      Since its formation, Congress has enacted a series of complex schemes for distribution of the national domain. The most famous of these plans was the Homestead Act of 1862, which offered title to 160 acres to individual settlers, subject only to residence for a certain period of time and to the making of minimal improvements to the land thus acquired. The legal provisions of such acts have varied over time as farming technology and the nature of the remaining lands have changed, but their general effect has been to perpetuate the Jeffersonian ideal of a republic in which yeoman farmers own and till self-sufficient properties.

      The program was successful in providing private owners with relatively choice lands, aside from parcels reserved for schools and various township and municipal uses. More than one-third of the national territory, however, is still owned by federal and state governments, with much of this land in forest and wildlife preserves. A large proportion of this land is in the West and is unsuited for intensive agriculture or grazing because of the roughness, dryness, or salinity of the terrain; much of it is leased out for light grazing or for timber cutting.

Patterns of farm life
      During the classic period of American rural life, around 1900, the typical American lived or worked on a farm or was economically dependent upon farmers. In contrast to rural life in many other parts of the world, the farm family lived on an isolated farmstead some distance from town and often from farm neighbours; its property averaged less than one-quarter square mile. This farmstead varied in form and content with local tradition and economy. In particular, barn types were localized—for example, the tobacco barns of the South, the great dairy barns of Wisconsin, or the general-purpose forebay barns of southeastern Pennsylvania—as were modes of fencing. In general, however, the farmstead contained dwelling, barn, storage and sheds for small livestock and equipment, a small orchard, and a kitchen garden. A woodlot might be found in the least-accessible or least-fertile part of the farm.

      Successions of such farms were connected with one another and with the towns by means of a dense, usually rectangular lattice of roads, largely unimproved at the time. The hamlets, villages, and smaller cities were arrayed at relatively regular intervals, with size and affluence determined in large part by the presence and quality of rail service or status as the county seat. But, among people who have been historically rural, individualistic, and antiurban in bias, many services normally located in urban places might be found in rustic settings. Thus, much retail business was transacted by means of itinerant peddlers, while small shops for the fabrication, distribution, or repair of various items were often located in isolated farmsteads, as were many post offices.

      Social activity also tended to be widely dispersed among numerous rural churches, schools, or grange halls; and the climactic event of the year might well be the county fair, political rally, or religious encampment—again on a rural site. Not least among the signs of this strong tendency toward spatial isolation are the countless family burial plots or community cemeteries so liberally distributed across the countryside.

Regional small-town patterns
      There has been much regional variation among smaller villages and hamlets, but such phenomena have received relatively little attention from students of American culture or geography. The distinctive New England village, of course, is generally recognized and cherished: it consists of a loose clustering of white frame buildings, including a church (usually Congregationalist or Unitarian), town hall, shops, and stately homes with tall shade trees around the central green, or commons—a grassy expanse that may contain a bandstand and monuments or flowers. Derivative village forms were later carried westward to sections of the northern Midwest.

      Less widely known but equally distinctive is the town morphology characteristic of the Midland, or Pennsylvanian, culture area and most fully developed in southeastern and central Pennsylvania and Piedmont Maryland. It differs totally from the New England model in density, building materials, and general appearance. Closely packed, often contiguous buildings—mostly brick, but sometimes stone, frame, or stucco—abut directly on a sidewalk, which is often paved with brick and usually thickly planted with maple, sycamore, or other shade trees. Such towns are characteristically linear in plan, have dwellings intermingled with other types of buildings, have only one or two principal streets, and may radiate outward from a central square lined with commercial and governmental structures.

      The most characteristic U.S. small town is the one whose pattern evolved in the Midwest. Its simple scheme is usually based on the grid plan. Functions are rigidly segregated spatially, with the central business district, consisting of closely packed two- or three-story brick buildings, limited exclusively to commercial and administrative activity. The residences, generally set well back within spacious lots, are peripheral in location, as are most rail facilities, factories, and warehouses.

      Even the modest urbanization of the small town came late to the South. Most urban functions long were spatially dispersed—almost totally so in the early Chesapeake Bay country or North Carolina—or were performed entirely by the larger plantations dominating the economic life of much of the region. When city and town began to materialize in the 19th and 20th centuries, they tended to follow the Midwestern model in layout.

      Although quite limited in geographic area, the characteristic villages of the Mormon and Hispanic-American districts are of considerable interest. The Mormon settlement uncompromisingly followed the ecclesiastically imposed grid plan composed of square blocks, each with perhaps only four very large house lots, and the block surrounded by extremely wide streets. Those villages in New Mexico in which population and culture were derived from Old Mexico were often built according to the standard Latin-American plan. The distinctive feature is a central plaza dominated by a Roman Catholic church and encircled by low stone or adobe buildings.

The rural–urban transition

Weakening of the agrarian ideal
      The United States has had little success in achieving or maintaining the ideal of the family farm. Through purchase, inheritance, leasing, and other means, some of dubious legality, smaller properties have been merged into much larger entities. By the late 1980s, for example, when the average farm size had surpassed 460 acres, farms containing 2,000 or more acres accounted for almost half of all farmland and 20 percent of the cropland harvested, even though they comprised less than 3 percent of all farms. At the other extreme were those 60 percent of all farms that contained fewer than 180 acres and reported less than 15 percent of cropland harvested. This trend toward fewer but larger farms has continued.

      The huge, heavily capitalized “neoplantation,” essentially a factory in the field, is especially conspicuous in parts of California, Arizona, and the Mississippi Delta, but examples can be found in any state. There are also many smaller but intensive operations that call for large investments and advanced managerial skills. This trend toward large-scale, capital-intensive farm enterprise has been paralleled by a sharp drop in rural farm population—a slump from the all-time high of some 32,000,000 in the early 20th century to about 5,000,000 in the late 1980s; but even in 1940, when farm folk still numbered more than 30,000,000, nearly 40 percent of farm operators were tenants, and another 10 percent were only partial owners.

      As the agrarian population has dwindled, its immediate impact on economic and political matters has also lessened, though less swiftly. The rural United States, however, has been the source of many of the nation's values and images. The United States has become a highly urbanized, technologically advanced society far removed in daily life from cracker barrel, barnyard, corral, or logging camp. Although Americans have gravitated, sometimes reluctantly, to the big city, the memory of a rapidly vanishing agrarian America persists in the daydreams and assumptions that guide many sociopolitical decisions. This is revealed not only in the works of contemporary novelists, poets, and painters but also throughout the popular arts: in movies, television, soap operas, folklore, country music, political oratory, and in much leisure activity.

Impact of the motor vehicle
      Since about 1920 more genuine change has occurred in American rural life than during the preceding three centuries of European settlement in North America. Although the basic explanation is the profound social and technological transformations engulfing most of the world, the most immediate agent of change has been the internal-combustion engine. The automobile, truck, bus, and paved highway have more than supplanted a moribund passenger and freight railroad system. While many local rail depots have been boarded up and scores of secondary lines have been abandoned, hundreds of thousands of miles of old dirt roads have been paved, and a vast system of interstate highways has been constructed to connect major cities in a single nonstop network. The net result has been a shrinking of travel time and an increase in miles traveled for the individual driver, rural or urban.

      Small towns in the United States have undergone a number of changes. Before 1970 towns near highways and urban centres generally prospered, while in less-fortunate towns, where residents lingered on for the sake of relatively cheap housing, downtown businesses often withered away. From the late 1960s until about 1981 the rural and small-town population grew at a faster rate than the metropolitan population, the so-called metro–nonmetro turnaround, thus reversing more than a century of relatively greater urban growth. Subsequent evidence, however, suggests an approach toward equilibrium between the urban and rural sectors.

      As Americans have become increasingly mobile, the visual aspect of rural America has altered drastically. The highway has become the central route, and many of the functions once confined to the local town or city now stretch for many miles along major roads.

Reversal of the classic rural dominance
      The metropolitanization of life in the United States has not been limited to city, suburb, or exurb; it now involves most of the rural area and population. The result has been the decline of local crafts and regional peculiarities, quite visibly in such items as farm implements, fencing, silos, and housing and in commodities such as clothing or bread. In many ways, the countryside is now economically dependent on the city.

      The city dweller is the dominant consumer for products other than those of field, quarry, or lumber mill; and city location tends to determine patterns of rural economy rather than the reverse. During weekends and the vacation seasons, swarms of city folk stream out to second homes in the countryside and to campgrounds, ski runs, beaches, boating areas, or hunting and fishing tracts. For many large rural areas, recreation is the principal source of income and employment; and such areas as northern New England and upstate New York have become playgrounds and sylvan refuges for many urban residents.

      The larger cities reach far into the countryside for their vital supplies of water and energy. There is an increasing reliance upon distant coalfields to provide fuel for electrical power plants, and cities have gone far afield in seeking out rural disposal sites for their ever-growing volumes of garbage.

      The majority of the rural population now lives within daily commuting range of a sizable city. This enables many farm residents to operate their farms while, at the same time, working part- or full-time at a city job, and it thus helps to prevent the drastic decline in rural population that has occurred in remoter parts of the country. Similarly, many small towns within the shadow of a metropolis, with fewer and fewer farmers to service, have become dormitory satellites, serving residents from nearby cities and suburbs.

Urban settlement
      The United States has moved from being a predominantly rural society to an urban one. In so doing, it has followed the general path that other advanced nations have traveled and one along which developing nations have begun to hasten. About three-fourths of the population live clustered within officially designated urban places and urbanized areas, which account for less than 2 percent of the national territory. At least another 15 percent live in dispersed residences that are actually urban in economic or social orientation.

Classic patterns of siting and growth
      Although more than 95 percent of the population was rural during the colonial period and the first years of independence, cities were crucial elements in the settlement system from the earliest days. Boston; New Amsterdam (later New York City); Jamestown, Va.; Charleston, S.C.; and Philadelphia were founded at the same time as the colonies they served. Like nearly all other North American colonial towns of consequence, they were ocean ports. Until at least the beginning of the 20th century the historical geography of U.S. cities was intimately related to that of successive transportation systems. The location of successful cities with respect to the areas they served, as well as their internal structure, was determined largely by the nature of these systems.

      The colonial cities acted as funnels for the collection and shipment of farm and forest products and other raw materials from the interior to trading partners in Europe, the Caribbean, or Africa and for the return flow of manufactured goods and other locally scarce items, as well as immigrants. Such cities were essentially marts and warehouses, and only minimal attention was given to social, military, educational, or religious functions. The inadequacy and high cost of overland traffic dictated sites along major ocean embayments or river estuaries; the only pre-1800 nonports worthy of notice were Lancaster and York, both in Pennsylvania, and Williamsburg, Va. With the populating of the interior and the spread of a system of canals and improved roads, such new cities as Pittsburgh, Pa.; Cincinnati, Ohio; Buffalo, N.Y.; and St. Louis, Mo., mushroomed at junctures between various routes or at which modes of transport were changed. Older ocean ports, such as New Castle, Del.; Newport, R.I.; Charleston, S.C.; Savannah, Ga.; and Portland, Maine, whose locations prevented them from serving large hinterlands, tended to stagnate.

      From about 1850 to 1920 the success of new cities and the further growth of older ones in large part were dependent on their location within the new steam railroad system and on their ability to dominate a large tributary territory. Such waterside rail hubs as Buffalo; Toledo, Ohio; Chicago; and San Francisco gained population and wealth rapidly, while such offspring of the rail era as Atlanta, Ga.; Indianapolis, Ind.; Minneapolis, Minn.; Fort Worth, Texas; and Tacoma, Wash., also grew dramatically. Much of the rapid industrialization of the 19th and early 20th centuries occurred in places already favoured by water or rail transport systems; but in some instances, such as in the cities of northeastern Pennsylvania's anthracite region, some New England mill towns, and the textile centres of the Carolina and Virginia Piedmont, manufacturing brought about rapid urbanization and the consequent attraction of transport facilities. The extraction of gold, silver, copper, coal, iron, and, in the 20th century, gas and oil led to rather ephemeral centres—unless these places were able to capitalize on local or regional advantages other than minerals.

      A strong early start, whatever the initial economic base may have been, was often the key factor in competition among cities. With sufficient early momentum, urban capital and population tended to expand almost automatically. The point is illustrated perfectly by the larger cities of the northeastern seaboard, from Portland, Maine, through Baltimore, Md. Their nearby physical endowment is poor to mediocre, and they now lie far off-centre on the national map; but a prosperous mercantile beginning, good land and sea connections with distant places, and a rich local accumulation of talent, capital, and initiative were sufficient to bring about the growth of one of the world's largest concentrations of industry, commerce, and people.

New factors in municipal development
      The pre-1900 development of the American city was almost completely a chronicle of the economics of the production, collection, and distribution of physical commodities and basic services dictated by geography, but there have been striking deviations from this pattern. The physical determinants of urban location and growth have given way to social factors. Increasingly, the most successful cities are oriented toward the more advanced modes for the production and consumption of services, specifically the knowledge, managerial, and recreational industries. The largest cities have become more dependent upon corporate headquarters, communications, and the manipulation of information for their sustenance. Washington, D.C., is the most obvious example of a metropolis in which government and ancillary activities have been the spur for vigorous growth; but almost all of the state capitals have displayed a similar demographic and economic vitality. Further, urban centres that contain a major college or university often have enjoyed remarkable expansion.

      With the coming of relative affluence and abundant leisure to the population and a decrease of labour input in industrial processes, a new breed of cities has sprouted across the land: those that cater to the pleasure-seeker, vacationer, and the retired—for example, the young, flourishing cities of Florida or Nevada and many locations in California, Arizona, and Colorado.

      The automobile as a means of personal transportation was developed about the time of World War I, and the American city was catapulted into a radically new period, both quantitatively and qualitatively, in the further evolution of physical form and function. The size, density, and internal structure of the city were previously constrained by the limitations of the pedestrian and early mass-transit systems. Only the well-to-do could afford horse and carriage or a secluded villa in the countryside. Cities were relatively small and compact, with a single clearly defined centre, and they grew by accretion along their edges, without any significant spatial hiatuses except where commuter railroads linked outlying towns to the largest of metropolises. Workers living beyond the immediate vicinity of their work had to locate within reach of the few horse-drawn omnibuses or the later electric street railways.

      The universality of the automobile, even among the less affluent, and the parallel proliferation of service facilities and highways greatly loosened and fragmented the American city, which spread over surrounding rural lands. Older, formerly autonomous towns grew swiftly. Many towns became satellites of the larger city or were absorbed. Many suburbs and subdivisions arose with single-family homes on lots larger than had been possible for the ordinary householder in the city. These communities were almost totally dependent on the highway for the flow of commuters, goods, and services, and many were located in splendid isolation, separated by tracts of farmland, brush, or forest from other such developments. At the major interchanges of the limited-access highways, a new form of agglomerated settlement sprang up. In a further elaboration of this trend, many larger cities have been girdled by a set of mushrooming complexes. These creations of private enterprise embody a novel concept of urban existence: a metropolitan module no longer reliant on the central city or its downtown. Usually anchored on a cluster of shopping malls and office parks, these “hypersuburbs,” whose residents and employees circulate freely within the outer metropolitan ring, offer virtually all of the social and economic facilities needed for the modern life-style.

The new look of the metropolitan area
      The outcome has been a broad, ragged, semiurbanized belt of land surrounding each city, large or small, and quite often blending imperceptibly into the suburban-exurban halo encircling a neighbouring metropolitan centre. There is a great similarity in the makeup and general appearance of all such tracts: the planless intermixture of scraps of the rural landscape with the fragments of the scattered metropolis; the randomly distributed subdivisions or single homes; the vast shopping centres, the large commercial cemeteries, drive-in theatres, junkyards, and golf courses and other recreational enterprises; and the regional or metropolitan airport, often with its own cluster of factories, warehouses, or travel-oriented businesses. The traditional city—unitary, concentric in form, with a single well-defined middle—has been replaced by a relatively amorphous, polycentric metropolitan sprawl.

      The inner city of a large U.S. metropolitan area displays some traits that are common to the larger centres of all advanced nations. A central business district, almost always the oldest section of the city, is surrounded by a succession of roughly circular zones, each distinctive in economic and social-ethnic character. The symmetry of this scheme is distorted by the irregularities of surface and drainage or the effects of radial highways and railroads. Land is most costly, and hence land use is most intensive, toward the centre. Major business, financial and governmental offices, department stores, and specialty shops dominate the downtown, which is usually fringed by a band of factories and warehouses. The outer parts of the city, like the suburbs, are mainly residential.

      With some exceptions—e.g., large apartment complexes in downtown Chicago—people do not reside in the downtown areas, and there is a steady downward gradient in population density per unit area (and more open land and single-family residences) as one moves from the inner city toward the open country. Conversely, there is a general rise in income and social status with increasing distance from the core. The sharply defined immigrant neighbourhoods of the 19th century generally persist in a somewhat diluted form, though specific ethnic groups may have shifted their location. Later migrant groups, notably Southern blacks and Latin Americans, generally dominate the more run-down neighbourhoods of the inner cities.

Individual and collective character of cities
      American cities, more so than the small-town or agrarian landscape, tend to be the product of a particular period rather than of location. The relatively venerable centres of the Eastern Seaboard—Boston; Philadelphia; Baltimore, Md.; Albany, N.Y.; Chester, Pa.; Alexandria, Va.; or Georgetown (a district of Washington, D.C.), for example—are virtual replicas of the fashionable European models of their early period rather than the fruition of a regional culture, unlike New Orleans and Santa Fe, N.M., which reflect other times and regions. The townscapes of Pittsburgh; Detroit, Mich.; Chicago; and Denver, Colo., depict national modes of thought and the technological development of their formative years, just as Dallas, Texas; Las Vegas, Nev.; San Diego, Calif.; Tucson, Ariz.; and Albuquerque, N.M., proclaim contemporary values and gadgetry more than any local distinctiveness. When strong-minded city founders instituted a highly individual plan and their successors managed to preserve it—as, for example, in Savannah, Ga.; Washington, D.C.; and Salt Lake City, Utah—or when there is a happy combination of a spectacular site and appreciative residents—as in San Francisco or Seattle, Wash.—a genuine individuality does seem to emerge. Such an identity also may develop where immigration has been highly selective, as in such places as Miami, Fla.; Phoenix, Ariz.; and Los Angeles.

      As a group, U.S. cities differ from cities in other countries in both type and degree. The national political structure, the social inclinations of the people, and the strong outward surge of urban development have led to the political fragmentation of metropolises that socially and economically are relatively cohesive units. The fact that a single metropolitan area may sprawl across numerous incorporated towns and cities, several townships, and two or more counties and states has a major impact upon both its appearance and the way it functions. Not the least of these effects is a dearth of overall physical and social planning (or its ineffectuality when attempted), and the rather chaotic, inharmonious appearance of both inner-city and peripheral zones painfully reflects the absence of any effective collective action concerning such matters.

      The American city is a place of sharp transitions. Construction, demolition, and reconstruction go on almost ceaselessly, though increasing thought has been given to preserving monuments and buildings. From present evidence, it would be impossible to guess that New York City and Albany date from the 1620s or that Detroit was founded in 1701. Preservation and restoration do occur, but often only when they make sense in terms of tourist revenue. Physical and social blight has reached epidemic proportions in the slum areas of the inner city; but, despite the wholesale razing of such areas and the subsequent urban-renewal projects (sometimes as apartment or commercial developments for the affluent), the belief has become widespread that the ills of the U.S. city are incurable, especially with the increasing flight of capital, tax revenue, and the more highly educated, affluent elements of the population to suburban areas and the spatial and political polarization of whites and nonwhites.

      In the central sections of U.S. cities, there is little sense of history or continuity; instead, one finds evidence of the dominance of the engineering mentality and of the credo that the business of the city is business. Commercial and administrative activities are paramount, and usually there is little room for church buildings or for parks or other nonprofit enterprises. The role of the cathedral, so central in the medieval European city, is filled by a U.S. invention serving both utilitarian and symbolic purposes, the skyscraper. Some cities have felt the need for other bold secular monuments; hence the Gateway Arch looming over St. Louis, Seattle's Space Needle, and Houston's Astrodome. Future archaeologists may well conclude from their excavations that American society was ruled by an oligarchy of highway engineers, architects, and bulldozer operators. The great expressways converging upon, or looping, the downtown area and the huge amount of space devoted to parking lots and garages are even more impressive than the massive surgery executed upon U.S. cities a century ago to hack out room for railroad terminals and marshaling yards.

      Within many urban sites there has been radical physical transformation of shoreline, drainage systems, and land surface that would be difficult to match elsewhere in the world. Thus, in their physical lineaments, Manhattan and inner Boston bear scant resemblance to the landscapes seen by their initial settlers. The surface of downtown Chicago has been raised several feet above its former swamp level, the city's lakefront extensively reshaped, and the flow of the Chicago River reversed. Los Angeles, notorious for its disregard of the environment, has its concrete arroyo bottoms, terraced hillsides and landslides, and its own artificial microclimate.

The supercities
      The unprecedented outward sprawl of American urban settlement has created some novel settlement forms, for the quantitative change has been so great as to induce qualitative transformation. The conurbation—a territorial coalescence of two or more sizable cities whose peripheral zones have grown together—may have first appeared in early 19th-century Europe. There are major examples in Great Britain, the Low Countries, and Germany, as well as in Japan.

      Nothing elsewhere, however, rivals in size and complexity the aptly named megalopolis, that supercity stretching along the Atlantic from Portland, Maine, past Richmond, Va. Other large conurbations include, in the Great Lakes region, one centred on Chicago and containing large slices of Illinois, Wisconsin, and Indiana; another based in Detroit, embracing large parts of Michigan and Ohio and reaching into Canada; and a third stretching from Buffalo through Cleveland and back to Pittsburgh. All three are reaching toward one another and may form another megalopolis that, in turn, may soon be grafted onto the seaboard megalopolis by a corridor through central New York state.

      Another example of a growing megalopolis is the huge southern California conurbation reaching from Santa Barbara, through a dominating Los Angeles, to the Mexican border. The solid strip of urban territory that lines the eastern shore of Puget Sound is a smaller counterpart. Quite exceptional in form is the slender linear multicity occupying Florida's Atlantic coastline, from Jacksonville to Miami, and the loose swarm of medium-sized cities clustering along the Southern Piedmont, from south-central Virginia to Birmingham, Ala.; also of note are the Texas cities of Dallas–Fort Worth, Houston, and San Antonio, which have formed a rapidly growing—though discontinuous—urbanized triangle.

      One of the few predictions that seem safe in so dynamic and innovative a land as the United States is that, unless severe and painful controls are placed on land use, the shape of the urban environment will be increasingly megalopolitan: a small set of great constellations of polycentric urban zones, each complexly interlocked socially and physically with its neighbours.

Traditional regions of the United States
      The differences among America's traditional regions, or culture areas, tend to be slight and shallow as compared with such areas in most older, more stable countries. The muted, often subtle nature of interregional differences can be ascribed to the relative newness of American settlement, a perpetually high degree of mobility, a superb communications system, and the galloping centralization of economy and government. It might even be argued that some of these regions are quaint vestiges of a vanishing past, of interest only to antiquarians.

      Yet, in spite of the nationwide standardization in many areas of American thought and behaviour, the lingering effects of the older culture areas do remain potent. In the case of the South, for example, the differences helped to precipitate the gravest political crisis and bloodiest military conflict in the nation's history. More than a century after the Civil War, the South remains a powerful entity in political, economic, and social terms, and its peculiar status is recognized in religious, educational, athletic, and literary circles.

      Even more intriguing is the appearance of a series of essentially 20th-century regions. Southern California is the largest and perhaps the most distinctive region, and its special culture has attracted large numbers of immigrants to the state. Similar trends are visible in southern Florida; in Texas, whose mystique has captured the national imagination; and to a certain degree in the more ebullient regions of New Mexico and Arizona as well. At the metropolitan level, it is difficult to believe that such distinctive cities as San Francisco, Las Vegas, Dallas, Tucson, and Seattle have become like all other American cities. A detailed examination, however, would show significant if sometimes subtle interregional differences in terms of language, religion, diet, folklore, folk architecture and handicrafts, political behaviour, social etiquette, and a number of other cultural categories.

The hierarchy of culture areas
      A multitiered hierarchy of culture areas might be postulated for the United States; but the most interesting levels are, first, the nation as a whole and, second, the five to 10 large subnational regions, each embracing several states or major portions thereof. There is a remarkably close coincidence between the political United States and the cultural United States. Crossing into Mexico, the traveler passes across a cultural chasm. If the contrasts are less dramatic between the two sides of the U.S.-Canadian boundary, they are nonetheless real, especially to the Canadian. Erosion of the cultural barrier has been largely limited to the area that stretches from northern New York state to Aroostook County, Maine. There, a vigorous demographic and cultural immigration by French-Canadians has gone far toward eradicating international differences.

      While the international boundaries act as a cultural container, the interstate boundaries are curiously irrelevant. Even when the state had a strong autonomous early existence—as happened with Massachusetts, Virginia, or Pennsylvania—subsequent economic and political forces have tended to wash away such initial identities. Actually, it could be argued that the political divisions of the 48 coterminous states are anachronistic in the context of contemporary socioeconomic and cultural forces. Partially convincing cases might be built for equating Utah and Texas with their respective culture areas because of exceptional historical and physical circumstances, or perhaps Oklahoma, given its very late European occupation and its dubious distinction as the territory to which exiled Indian tribes of the East were relegated. In most instances, however, the states either contain two or more distinctly different culture and political areas or fragments thereof or are part of a much larger single culture area. Thus sharp North–South dichotomies characterize California, Missouri, Illinois, Indiana, Ohio, and Florida, while Tennessee advertises that there are really three Tennessees. In Virginia the opposing cultural forces were so strong that actual fission took place in 1863 (with the admission to the Union of West Virginia) along one of those rare interstate boundaries that approximate a genuine cultural divide.

      Much remains to be learned about the cause and effect relations between economic and culture areas in the United States. If the South or New England could at one time be correlated with a specific economic system, this is no longer easy to do. Cultural systems appear to respond more slowly to agents of change than do economic or urban systems. Thus the Manufacturing Belt, a core region for many social and economic activities, now spans parts of four traditional culture areas—New England, the Midland, the Midwest, and the northern fringes of the South. The great urban sprawl, from southern Maine to central Virginia, blithely ignores the cultural slopes that are still visible in its more rural tracts.

The cultural hearths
      The culture areas of the United States are generally European in origin, the result of importing European colonists and ways of life and the subsequent adaptation of social groups to new habitats. The aboriginal cultures have had relatively little influence on the nation's modern culture. In the Southwestern and the indistinct Oklahoman subregions, the Indian element merits consideration only as one of several ingredients making up the regional mosaic. With some exceptions, the map of American culture areas in the East can be explained in terms of the genesis, development, and expansion of the three principal colonial cultural hearths along the Atlantic seaboard. Each was basically British in character, but their personalities remain distinct because of, first, different sets of social and political conditions during the critical period of first effective settlement and, second, local physical and economic circumstances. The cultural gradients between them tend to be much steeper and the boundaries more distinct than is true for the remainder of the nation.

      New England was the dominant region during the century of rapid expansion following the American Revolution, and not merely in demographic or economic terms. In social and cultural life—in education, politics, theology, literature, science, architecture, and the more advanced forms of mechanical and social technology—the area exercised its primacy. New England was the leading source of ideas and styles for the nation from about 1780 to 1880; it furnishes an impressive example of the capacity of strongly motivated communities to rise above the constraints of a harsh environment.

      During its first two centuries, New England had an unusually homogeneous population. With some exceptions, the British immigrants shared the same nonconformist religious beliefs, language, social organization, and general outlook. A distinctive regional culture took form, most noticeably in terms of dialect, town morphology, and folk architecture. The personality of the people also took on a regional coloration both in folklore and in actuality; there is sound basis for the belief that the traditional New England Yankee is self-reliant, thrifty, inventive, and enterprising. The influx of immigrants that began in the 1830s diluted and altered the New England identity, but much of its early personality survived.

      By virtue of location, wealth, and seniority, the Boston metropolitan area has become the cultural and economic centre of New England. This sovereignty is shared to some degree, however, with two other old centres, the lower Connecticut Valley and the Narragansett Bay region of Rhode Island.

      The early westward demographic and ideological expansion of New England was so influential that it is justifiable to call New York, northern New Jersey, northern Pennsylvania, and much of the Upper Midwest “New England Extended.” Further, the energetic endeavours of New England whalers, merchants, and missionaries had a considerable impact on the cultures of Hawaii, various other Pacific isles, and several points in the Caribbean. New Englanders also were active in the Americanization of early Oregon and Washington, with results that are still visible. Later, the overland diffusion of New England natives and practices meant a recognizable New England character not only for the Upper Midwest, from Ohio to the Dakotas, but also in the Pacific Northwest in general, though to a lesser degree.

The South
      By far the largest of the three original Anglo-American culture areas, the South is also the most idiosyncratic with respect to national norms—or the slowest to accept them. The South was once so distinct from the non-South in almost every observable or quantifiable feature and so fiercely proud of its peculiarities that for some years the question of whether it could maintain political and social unity with the non-South was in serious doubt. These differences are still observable in almost every realm of human activity, including rural economy, dialect, diet, costume, folklore, politics, architecture, social customs, and recreation. Only in the 20th century can it be argued that the South has achieved a decisive convergence with the rest of the nation, at least in terms of economic behaviour and material culture.

      A persistent deviation from the national mainstream probably began in the first years of settlement. The first settlers of the South were almost purely British, not outwardly different from those who flocked to New England or the Midland, but almost certainly distinct in terms of motives and social values and more conservative in retaining the rurality and the family and social structure of premodern Europe. The vast importation of African slaves was also a major factor, as was a degree of contact with the Indians that was less pronounced farther north. In addition, the unusual pattern of economy (much different from that of northwestern Europe), settlement, and social organization, which were in part an adaptation to a starkly unfamiliar physical habitat, accentuated the South's deviation from other culture areas.

      In both origin and spatial structure, the South has been characterized by diffuseness. In the search for a single cultural hearth, the most plausible choice is the Chesapeake Bay area and the northeastern corner of North Carolina, the earliest area of recognizably Southern character. Early components of Southern population and culture also arrived from other sources. A narrow coastal strip from North Carolina to the Georgia–Florida border and including the Sea Islands is decidedly Southern in character, yet it stands apart self-consciously from other parts of the South. Though colonized directly from Great Britain, it also had significant connections with the West Indies, a relationship in which the African cultural contribution was strongest and purest. Charleston and Savannah, which nurtured their own distinctive civilizations, dominated this subregion. Similarly, French Louisiana received elements of culture and population—to be stirred into the special Creole mixture—not only, putatively, from the Chesapeake Bay hearth area but also indirectly from France, French Nova Scotia, the French West Indies, and Africa. In south central Texas, the Germanic and Hispanic influx was so heavy that a special subregion can be designated.

      It would seem, then, that the Southern culture area may be an example of convergent, or parallel, evolution of a variety of elements arriving along several paths but subject to some single general process that could mold one larger regional consciousness and way of life.

      Because of its slowness in joining the national technological mainstream, the South can be subdivided into a much greater number of subregions than is possible for any of the other older traditional regions. Those described above are of lesser order than the two principal Souths, variously called Upper and Lower (or Deep) South, Upland and Lowland South, or Yeoman and Plantation South.

      The Upland South, which comprises the southern Appalachians, the upper Appalachian Piedmont, the Cumberland and other low interior plateaus, and the Ozarks and Ouachitas, was colonized culturally and demographically from the Chesapeake Bay hearth area and the Midland; it is most emphatically white Anglo-Saxon Protestant (WASP) in character. The Lowland South, which contains a large black population, includes the greater part of the South Atlantic and Gulf coastal plains and the lower Appalachian Piedmont. Its early major influences came from the Chesapeake Bay area, with only minor elements from the coastal Carolina–Georgia belt, Louisiana, and elsewhere. The division between the two subregions remains distinct from Virginia to Texas, but each region can be further subdivided. Within the Upland South, the Ozark region might legitimately be detached from the Appalachian; and, within the latter, the proud and prosperous Kentucky Bluegrass, with its emphasis on tobacco and Thoroughbreds, certainly merits special recognition.

      Toward the margins of the South, the difficulties in delimiting subregions become greater. The outer limits themselves are a topic of special interest. There seems to be more than an accidental relation between these limits and various climatic factors. The fuzzy northern boundary, definitely not associated with the conventional Mason and Dixon Line or the Ohio River, seems most closely associated with length of frost-free season or with temperature during the winter. As the Southern cultural complex was carried to the West, it not only retained its strength but became more intense, in contrast to the influence of New England and the Midland. But the South finally fades away as one approaches the 100th meridian, with its critical decline in annual precipitation. The apparent correlation between the cultural South and a humid subtropical climatic regime is in many ways valid.

      The Texas subregion is so large, distinctive, vigorous, and self-assertive that it presents some vexing classificatory questions. Is Texas simply a subregion of the Greater South, or has it acquired so strong and divergent an identity that it can be regarded as a major region in its own right? It is likely that a major region has been born in a frontier zone in which several distinct cultural communities confront one another and in which the mixture has bred the vigorous, extroverted, aggressive Texas personality so widely celebrated in song and story. Similarly, peninsular Florida may be considered either within or juxtaposed to the South but not necessarily part of it. In the case of Florida, an almost empty territory began to receive significant settlement only after about 1890, and if, like Texas, most of it came from the older South, there were also vigorous infusions from elsewhere.

The Midland
      The significance of this region has been no less than that of New England or the South, but its characteristics are the least conspicuous to outsiders as well as to its own residents—reflecting, perhaps, its centrality in the course of U.S. development. The Midland (a term not to be confused with Midwest) comprises portions of Middle Atlantic and Upper Southern states: Pennsylvania, New Jersey, Delaware, and Maryland. Serious European settlement of the Midland began a generation or more after that of the other major cultural centres and after several earlier, relatively ineffectual trials by the Dutch, Swedes, Finns, and British. But once begun late in the 17th century by William Penn and his associates, the colonization of the area was a success. Within southeastern Pennsylvania this culture area first assumed its distinctive character: a prosperous, sober, industrious agricultural society that quickly became a mixed economy as mercantile and later industrial functions came to the fore. By the mid-18th century much of the region had acquired a markedly urban character, resembling in many ways the more advanced portions of the North Sea countries. In this respect, at least, the Midland was well ahead of neighbouring areas to the north and south.

      It differed also in its polyglot ethnicity. From almost the beginning, the various ethnic and religious groups of the British Isles were joined by immigrants from the European mainland. This diversity has grown and is likely to continue. The mosaic of colonial ethnic groups has persisted in much of Pennsylvania, New York, New Jersey, and Maryland, as has the remarkable variety of nationalities and churches in coalfields, company towns, cities, and many rural areas. Much of the same ethnic heterogeneity can be seen in New England, the Midwest, and a few other areas, but the Midland stands out as perhaps the most polyglot region of the nation. The Germanic element has always been notably strong, if irregularly distributed, in the Midland, accounting for more than 70 percent of the population of many towns. Had the Anglo-American culture not triumphed, the area might well have been designated Pennsylvania German.

      Physiography and migration carried the Midland culture area into the Maryland Piedmont. Although its width tapers quickly below the Potomac, it reaches into parts of Virginia and West Virginia, with traces legible far down the Appalachian zone and into the South.

      The northern half of the greater Midland region (the New York subregion, or New England Extended) cannot be assigned unequivocally to either New England or this Midland. Essentially it is a hybrid formed mainly from two regional strains of almost equal strength: New England and the post-1660 British element moving up the Hudson valley and beyond. There has also been a persistent, if slight, residue of early Dutch culture and some subtle filtering northward of Pennsylvanian influences. Apparently within the New York subregion occurred the first major fusion of American regional cultures, especially within the early 19th-century “Burned-Over District,” around the Finger Lakes and Genesee areas of central and western New York. This locality, the seedbed for a number of important social innovations, was a major staging area for westward migration and possibly a major source for the people and notions that were to build the Midwestern culture area.

      Toward the west the Midland retains its integrity for only a short distance—certainly no further than eastern Ohio—as it becomes submerged within the Midwest. Still, its significance in the genesis of the Midwest and the national culture should not be minimized. Its success in projecting its image upon so much of the country may have drawn attention away from the source area. As both name and location suggest, the Midland is intermediate in character in many respects, lying between New England and the South. Its residents are much less concerned with, or conscious of, a strong regional identity (excepting the Pennsylvania Dutch caricatures) than is true for the other regions, and, in addition, the Midland lacks their strong political and literary traditions, though it is unmistakable in its distinctive townscapes and farmsteads.

The newer culture areas

The Midwest (Middle West)
 There is no such self-effacement in the Midwest, that large triangular region justly regarded as the most nearly representative of the national average. Everyone within or outside of the Midwest knows of its existence, but no one is certain where it begins or ends. The older apex of the eastward-pointing triangle appears to rest around Pittsburgh, while the two western corners melt away somewhere in the Great Plains, possibly in southern Manitoba in the north and southern Kansas in the south. The eastern terminus and the southern and western borders are broad, indistinct transitional zones.

      Serious study of the historical geography of the Midwest began only in the 20th century, but it seems likely that this culture region was the combination of all three colonial regions and that this combination first took place in the upper Ohio valley. The early routes of travel—the Ohio and its tributaries, the Great Lakes, and the low, level corridor along the Mohawk and the coastal plains of Lake Ontario and Lake Erie—converge upon Ohio. There, the people and cultural traits from New England, the Midland, and the South were first funneled together. There seems to have been a fanlike widening of the new hybrid area into the West as settlers worked their way frontierward.

      Two major subregions are readily discerned, the Upper and Lower Midwest. They are separated by a line, roughly approximating the 41st parallel, that persists as far west as Colorado in terms of speech patterns and indicates differences in regional provenance in ethnic and religious terms as well. Much of the Upper Midwest retains a faint New England character, although Midland influences are probably as important. A rich mixture of German, Scandinavian, Slavic, and other non-WASP elements has greatly diversified a stock in which the British element usually remains dominant and the range of church denominations is great. The Lower Midwest, except for the relative scarcity of blacks, tends to resemble the South in its predominantly Protestant and British makeup. There are some areas with sizable Roman Catholic and non-WASP populations, but on the whole the subregion tends to be more WASP in inclination than most other parts of the nation.

The problem of “the West”
      The foregoing culture areas account for roughly the eastern half of the coterminous United States. There is a dilemma in classifying the remaining half. The concept of the American West, strong in the popular imagination, is reinforced constantly by romanticized cinematic and television images of the cowboy. It is facile, however, to accept the widespread Western livestock complex as epitomizing the full gamut of Western life: although the cattle industry may once have accounted for more than one-half of the active Western domain as measured in acres, it employed only a relatively small fraction of the total population. As a single subculture, it cannot represent the total regional culture.

      It is not clear whether there is a genuine, single, grand Western culture region. Unlike the East, where virtually all the land is developed and culture areas and subregions abut and overlap in splendid confusion, the eight major and many lesser nodes of population in the western United States resemble oases, separated from one another by wide expanses of nearly unpopulated mountain or arid desert. The only obvious properties these isolated clusters have in common are, first, the intermixture of several strains of culture, primarily from the East but with additions from Europe, Mexico, and East Asia, and, second, except for one subregion, a general modernity, having been settled in a serious way no earlier than the 1840s. Some areas may be viewed as inchoate, or partially formed, cultural entities; the others have acquired definite personalities but are difficult to classify as first-order or lesser order culture areas.

      There are several major tracts in the western United States that reveal a genuine cultural identity: the Upper Rio Grande region, the Mormon region, southern California, and, by some accounts, northern California. To this group one might add the anomalous Texan and Oklahoman subregions, which have elements of both the West and the South.

 The term Upper Rio Grande region was coined to denote the oldest and strongest of the three sectors of Hispanic-American activity in the Southwest, the others being southern California and portions of Texas. Although covering the valley of the upper Rio Grande, the region also embraces segments of Arizona and Colorado as well as other parts of New Mexico. European communities and culture have been present there, with only one interruption, since the late 16th century. The initial sources were Spain and Mexico, but after 1848 at least three distinct strains of Anglo-American culture were increasingly well represented—the Southern, Mormon, and a general undifferentiated Northeastern culture—plus a distinct Texan subcategory. For once this has occurred without obliterating the Indians, whose culture endures in various stages of dilution, from the strongly Americanized or Hispanicized to the almost undisturbed.

      The general mosaic is a fabric of Indian, Anglo, and Hispanic elements, and all three major groups, furthermore, are complex in character. The Indian component is made up of Navajo, Pueblo, and several smaller groups, each of which is quite distinct from the others. The Hispanic element is also diverse—modally Mexican mestizo, but ranging from pure Spanish to nearly pure pre-Spanish aboriginal.

  The Mormon region is expansive in the religious and demographic realms, though it has ceased to expand territorially as it did in the decades after the first settlement in the Salt Lake valley in 1847. Despite its Great Basin location and an exemplary adaptation to environmental constraints, this cultural complex appears somewhat non-Western in spirit: the Mormons may be in the West, but they are not entirely of it. Their historical derivation from the Midwest and from ultimate sources in New York and New England is still apparent, along with the generous admixture of European converts to their religion.

      As in New England, the power of the human will and an intensely cherished abstract design have triumphed over an unfriendly habitat. The Mormon way of life is expressed in the settlement landscape and economic activities within a region more homogeneous internally than any other U.S. culture area.

 In contrast, northern California has yet to gain its own strong cultural coloration. From the beginning of the great 1849 gold rush the area drew a diverse population from Europe and Asia as well as the older portions of the United States. Whether the greater part of northern California has produced a culture amounting to more than the sum of the contributions brought by immigrants is questionable. San Francisco, the regional metropolis, may have crossed the qualitative threshold. An unusually cosmopolitan outlook that includes an awareness of the Orient stronger than that of any other U.S. city, a fierce self-esteem, and a unique townscape may be symptomatic of a genuinely new, emergent local culture.

      Southern California is the most spectacular of the Western regions, not only in terms of economic and population growth but also for the luxuriance, regional particularism, and general avant-garde character of its swiftly evolving cultural pattern. Until the coming of a direct transcontinental rail connection in 1885, the region was remote, rural, and largely inconsequential. Since then, the invasion by persons from virtually every corner of North America and from around the world has been massive, but since the 1960s in-migration has slackened perceptibly, and many residents have begun to question the doctrine of unlimited growth. In any event, a loosely articulated series of urban and suburban developments continues to encroach upon what little is left of arable or habitable land in the Coast Ranges and valleys from Santa Barbara to the Mexican border.

      Although every major ethnic and racial group and every other U.S. culture area is amply represented in southern California, there is reason to suspect that a process of selection for certain types of people, attitudes, and personality traits may have been at work at both source and destination. The region is distinct from, or perhaps in the vanguard of, the remainder of the nation. One might view southern California as the super-American region or the outpost of a postindustrial future, but its cultural distinctiveness is very evident in landscape and social behaviour. Southern California in no way approaches being a “traditional region,” or even the smudged facsimile of such; it is rather the largest, boldest experiment in creating a “voluntary region,” one built through the self-selection of immigrants and their subsequent interaction.

 The remaining identifiable Western regions—the Willamette valley of Oregon, the Puget Sound region, the Inland Empire of eastern Washington and adjacent tracts of Idaho and Oregon, central Arizona, and the Colorado Piedmont—can be treated jointly as potential, or emergent, culture areas, still too close to the national mean to display any cultural distinctiveness. In all of these regions is evident the arrival of a cross section of the national population and the growth of regional life around one or more major metropolises. A New England element is noteworthy in the Willamette valley and Puget Sound regions, while a Hispanic-American component appears in the Colorado Piedmont and central Arizona. Only time and further study will reveal whether any of these regions, so distant from the historic sources of U.S. population and culture, have the capacity to become an independent cultural area.

Wilbur Zelinsky

The people
      A nation for little more than 225 years, the United States is a relatively new member of the global community, but its rapid growth since the 18th century is unparalleled. The early promise of the New World as a refuge and land of opportunity was realized dramatically in the 20th century with the emergence of the United States as a world power. With a total population exceeded only by those of China and India, the United States is also characterized by an extraordinary diversity in ethnic and racial ancestry. A steady stream of immigration, notably from the 1830s onward, formed a pool of foreign-born persons unmatched by any other nation; some 60 million people immigrated to U.S. shores in the 19th and 20th centuries. Many were driven to emigrate by political or economic hardship, while others were drawn by a demand for workers, abundant natural resources, and expansive cheap land. Most arrived hoping to remake themselves in the New World.

      Americans also have migrated internally with great vigour, exhibiting a restlessness that thrived in the open lands and on the frontier. Initially, migratory patterns ran east to west and from rural areas to cities, then, in the 20th century, from the South to the Northeast and Midwest. Since the 1950s, though, movement has been primarily from the cities to outlying suburbs, and from aging northern metropolises to the growing urban agglomerations of the South, Southwest, and West.

      At the dawn of the 21st century, the majority of the U.S. population had achieved a high level of material comfort, prosperity, and security. Nonetheless, Americans struggled with the unexpected problems of relative affluence, as well as the persistence of residual poverty. Crime, drug abuse, affordable energy sources, urban sprawl, voter apathy, pollution, high divorce rates, AIDS, and excessive litigation remained continuing subjects of concern, as were inequities and inadequacies in education and managed health care. Among the public policies widely debated were abortion, gun ownership, welfare reforms, and the death penalty.

      Many Americans perceive social tension as the product of their society's failure to extend the traditional dream of equality of opportunity to all people. Ideally, social, political, economic, and religious freedom would assure the like treatment of everyone, so that all could achieve goals in accord with their individual talents, if only they worked hard enough. This strongly held belief has united Americans throughout the centuries. The fact that some groups have not achieved full equality troubles citizens and policy-makers alike.

Ethnic distribution
      After decades of immigration and acculturation, many U.S. citizens can trace no discernible ethnic identity, describing themselves generically only as "American," while others claim mixed identities. The 2000 U.S. census introduced a new category for those who identified themselves as a member of more than one race; of 281.4 million counted, 2.4 percent chose this multiracial classification.

Ethnic European-Americans
      Although the term "ethnic" is frequently confined to the descendants of the newest immigrants, its broader meaning applies to all groups unified by their cultural heritage and experience in the New World. In the 19th century, Yankees formed one such group, marked by common religion and by habits shaped by the original Puritan settlers. From New England, the Yankees spread westward through New York, northern Ohio, Indiana, Illinois, Iowa, and Kansas. Tightly knit communities, firm religious values, and a belief in the value of education resulted in prominent positions for Yankees in business, in literature and law, and in cultural and philanthropic institutions. They long identified with the Republican Party. Southern whites and their descendants, by contrast, remained preponderantly rural as migration took them westward across Tennessee and Kentucky to Arkansas, Missouri, Oklahoma, and Texas. These people inhabited small towns until the industrialization of the South in the 20th century, and they preserved affiliations with the Democratic Party until the 1960s.

      The colonial population also contained other elements that long sustained their group identities. The Pennsylvania Germans, held together by religion and language, still pursue their own way of life after three centuries, as exemplified by the Amish. The great 19th-century German migrations, however, were made up of families who dispersed in the cities as well as in the agricultural areas to the West; to the extent that ethnic ties have survived, they are largely sentimental. That is also true of the Scots, Scotch-Irish, Welsh, and Dutch, whose colonial nuclei received some reinforcement after 1800 but who gradually adapted to the ways of the larger surrounding groups.

      Distinctive language and religion preserved some coherence among the descendants of the Scandinavian newcomers of the 19th century. Where these people clustered in sizeable settlements, as in Minnesota, they transmitted a sense of identity beyond the second generation; and emotional attachments to the lands of origin lingered.

      Religion was a powerful force for cohesion among the Roman Catholic Irish and the Jews, both tiny groups before 1840, both reinforced by mass migration thereafter. Both have now become strikingly heterogeneous, displaying a wide variety of economic and social conditions, as well as a degree of conformity to the styles of life of other Americans. But the pull of external concerns—in the one case, unification of Ireland; in the other, Israel's security—has helped to preserve group loyalty.

      Indeed, by the 1970s "ethnic" (in its narrow connotation) had come to be used to describe the Americans of Polish, Italian, Lithuanian, Czech, and Ukrainian extraction, along with those of other eastern and southern European ancestry. Tending to be Roman Catholic and middle-class, most settled in the North and Midwest. The city neighbourhoods in which many of them lived initially had their roots in the "Little Italys" and "Polish Hills" established by the immigrants. By the 1980s and '90s a significant number had left these enclaves for nearby suburbs. The only European ethnic group to arrive in large numbers at the end of the 20th century was the Russians, especially Russian Jews who benefited from perestroika.

      In general, a pattern of immigration, self-support, and then assimilation was typical. Recently established ethnic groups often preserve greater visibility and greater cohesion. Their group identity is based not only upon a common cultural heritage but also on the common interests, needs, and problems they face in the present-day United States. Like earlier immigrants and their descendants, most have been taught to believe that the road to success in the United States is achieved through individual effort. They tend to believe in equality of opportunity and self-improvement and attribute poverty to the failing of the individual and not to inequities in society. As the composition of the U.S. population changed, it was projected that sometime in the 21st century, Americans of European descent would be outnumbered by those from non-European ethnic groups.

African-Americans
      From colonial times, African-Americans arrived in large numbers as slaves and lived primarily on plantations in the South. In 1790 slave and free blacks together comprised about one-fifth of the U.S. population. As the nation split between southern slave and northern free states prior to the American Civil War, the Underground Railroad spirited thousands of escaped slaves from South to North. In the century following abolition, this migration pattern became more pronounced as 6.5 million blacks moved from rural areas of the South to northern cities between 1910 and 1970. On the heels of this massive internal shift came new immigrants from West Africa and the black Caribbean, principally Haiti, Jamaica, and the Dominican Republic.

      The civil rights movement in the 1950s and '60s awakened the nation's conscience to the plight of African-Americans, who had long been denied first-class citizenship. The movement used nonviolence and passive resistance to change discriminatory laws and practices, primarily in the South. As a result, increases in median income and college enrollment among the black population were dramatic in the late 20th century. Widening access to professional and business opportunities included noteworthy political victories. By the early 1980s black mayors in Chicago, Los Angeles, Cleveland, Baltimore, Atlanta, and Washington, D.C., had gained election with white support. In 1984 and 1988 Jesse Jackson ran for U.S. president; he was the first African-American to contend seriously for a major party nomination. However, despite an expanding black middle class and equal-opportunity laws in education, housing, and employment, African-Americans continue to face staunch social and political challenges, especially those living in the inner cities, where some of American society's most difficult problems (such as crime and drug trafficking) are acute.

The Hispanics
      Like African-Americans, Hispanics (Latinos) make up about one-eighth of the U.S. population. Although they generally share Spanish as a second (and sometimes first) language, Hispanics are hardly a monolithic group. The majority, nearly three-fifths, are of Mexican origin—some descended from settlers in portions of the United States that were once part of Mexico (Texas, Arizona, New Mexico, and California), others legal and illegal migrants from across the loosely guarded Mexico–U.S. border. The greater opportunities and higher living standards in the United States have long attracted immigrants from Mexico and Central America.

      The Puerto Rican experience in the United States is markedly different from that of Mexican Americans. Most importantly, Puerto Ricans are American citizens by virtue of the island commonwealth's association with the United States. As a result, migration between Puerto Rico and the United States has been fairly fluid, mirroring the continuous process by which Americans have always moved to where chances seem best. While most of that migration traditionally has been toward the mainland, by the end of the 20th century in- and out-migration between the island and the United States equalized. Puerto Ricans now make up about one-tenth of the U.S. Latino population.

      Quite different, though also Spanish-speaking, are the Cubans who fled Fidel Castro's communist revolution of 1959 and their descendants. While representatives of every social group are among them, the initial wave of Cubans was distinctive because of the large number of professional and middle-class people who migrated. Their social and political attitudes differ significantly from those of Mexican Americans and Puerto Ricans, though this distinction was lessened by an influx of 120,000 Cuban refugees in the 1980s, known as the Mariel immigrants.

      After 1960 easy air travel and political and economic instability stimulated a significant migration from the Caribbean, Central America, and South America. The arrivals from Latin America in earlier years were often political refugees; more recently they usually have been economic refugees. Constituting about one-fourth of the Hispanic diaspora, this group comprises largely Central Americans, Colombians, and Dominicans, the last of whom have acted as a bridge between the black and Latino communities. Latinos have come together for better health, housing, and municipal services, for bilingual school programs, and for improved educational and economic opportunities.

Asian-Americans
      Asian-Americans as a group have confounded earlier expectations that they would form an indigestible mass in American society. The Chinese, earliest to arrive (in large numbers from the mid-19th century, principally as labourers, notably on the transcontinental railroad), and the Japanese were long victims of racial discrimination. In 1924 federal law barred further entries; those already in the United States had been ineligible for citizenship since the previous year. In 1942 thousands of Japanese, many born in the United States and therefore American citizens, were interned in relocation camps because their loyalty was suspect after the United States entered World War II against Japan. Subsequently, anti-Asian prejudice largely dissolved, and Chinese and Japanese, along with others such as the Vietnamese and Taiwanese, have adjusted and advanced. Among generally more recent arrivals, many Koreans, Filipinos, and Asian Indians have quickly enjoyed economic success. Though enumerated separately by the U.S. census, Pacific Islanders, such as native Hawaiians, constitute a small minority but contribute to making Hawaii and California the states with the largest percentages of Asian-Americans.

Middle Easterners
      Among the trends of Arab immigration in the 20th century were the arrival of Lebanese Christians in the first half of the century and Palestinian Muslims in the second half. Initially Arab immigrants settled on the East Coast, but by the end of the century there was a large settlement of Arabs in the greater Detroit area. Armenians, also from southwest Asia, arrived in large numbers in the early 20th century, eventually congregating largely in California, where, later in the century, Iranians were also concentrated. Some recent arrivals from the Middle East maintain national customs such as traditional dress.

Native Americans
      Native Americans form an ethnic group only in a very general sense. In the East, centuries of coexistence with whites have led to some degree of intermarriage and assimilation and to various patterns of stable adjustment. In the West the hasty expansion of agricultural settlement crowded the Native Americans into reservations, where federal policy has vacillated between efforts at assimilation and the desire to preserve tribal cultural identity, with unhappy consequences. The Native American population has risen from its low point of 235,000 in 1900 to 2.5 million at the turn of the 21st century.

      The reservations are often enclaves of deep poverty and social distress, although the many casinos operated on their land have created great wealth in some instances. The physical and social isolation of the reservation prompted many Native Americans to migrate to large cities, but, by the end of the 20th century, a modest repopulation occurred in rural counties of the Great Plains. In census numerations Native Americans are categorized with Alaskan natives, notably Aleuts and Eskimos. In the latter half of the 20th century, intertribal organizations were founded to give Native Americans a unified, national presence.

Religious groups
      The U.S. government has never supported an established church, and the diversity of the population has discouraged any tendency toward uniformity in worship. As a result of this individualism, thousands of religious denominations thrive within the country. Only about one-sixth of religious adherents are not Christian, and although Roman Catholicism is the largest single denomination (about one-fifth of the U.S. population), the many churches of Protestantism constitute the majority. Some are the products of native development—among them the Disciples of Christ (founded in the early 19th century), Church of Jesus Christ of Latter-day Saints (Mormons; 1830), Seventh-day Adventists (officially established 1863), Jehovah's Witnesses (1872), Christian Scientists (1879), and the various Pentecostal churches (late 19th century).

      Other denominations had their origins in the Old World, but even these have taken distinctive American forms. Affiliated Roman Catholics look to Rome for guidance, although there are variations in practice from diocese to diocese. More than 5.5 million Jews are affiliated with three national organizations (Orthodox, Conservative, and Reform), as well as with many smaller sects. Most Protestant denominations also have European roots, the largest being the Baptists, Pentecostals, and Methodists. Among other groups are Lutherans, Presbyterians, Episcopalians, various Eastern churches (including Orthodox), Congregationalists, Reformed, Mennonites and Amish, various Brethren, Unitarians, and the Friends (Quakers). By 2000 substantial numbers of recent immigrants had increased the Muslim, Buddhist, and Hindu presence to about 4 million, 2.5 million, and 1 million believers, respectively.

Immigration
  Immigration legislation began in earnest in the late 19th century, but it was not until after World War I that the era of mass immigration came to an abrupt end. The Immigration Act of 1924 set an annual quota (fixed in 1929 at 150,000) and established the national-origins system, which was to characterize immigration policy for the next 40 years. Under it, quotas were assigned to each country based on the number of persons of that national origin who were living in the United States in 1920. The quotas reduced drastically the flow of immigrants from southeastern Europe in favour of the countries of northwestern Europe. The quota system was abolished in 1965 in favour of a predominantly first-come, first-served policy. An annual ceiling of immigrant visas was established for nations outside the Western Hemisphere (170,000, with 20,000 allowed to any one nation) and for all persons from the Western Hemisphere (120,000).

      The new policy radically changed the pattern of immigration. For the first time, non-Europeans formed the dominant immigrant group, with new arrivals from Asia, Latin America, the Caribbean, and the Middle East. In the 1980s and '90s immigration was further liberalized by granting amnesty to illegal aliens, raising admission limits, and creating a system for validating refugees. In recent years the plurality of immigrants, both legal and illegal, has come from Mexico and elsewhere in Latin America, though Asians form a significant percentage.

Ed. John Naisbitt, Thea K. Flaum, Oscar Handlin

Economy
 The United States is the world's greatest economic power in terms of gross domestic product (GDP) and is among the greatest powers in terms of GDP per capita. With less than 5 percent of the world's population, the United States produces about one-fifth of the world's economic output.

      The sheer size of the U.S. economy makes it the most important single factor in global trade. Its exports represent more than one-tenth of the world total. The United States also influences the economies of the rest of the world because it is a significant source of investment capital. Just as direct investment, primarily by the British, was a major factor in 19th-century U.S. economic growth, so direct investment abroad by U.S. firms is a major factor in the economic well-being of Canada, Mexico, China, and many countries in Latin America, Europe, and Asia.

Strengths and weaknesses
      The U.S. economy is marked by resilience, flexibility, and innovation. In the first decade of the 21st century, the economy was able to withstand a number of costly setbacks. These included the collapse of stock markets following an untenable run-up in technology shares, losses from corporate scandals, the September 11 attacks in 2001, wars in Afghanistan and Iraq, and a devastating hurricane along the Gulf Coast near New Orleans in 2005.

      For the most part, the U.S. government plays only a small direct role in running the nation's economic enterprises. Businesses are free to hire or fire employees and open or close operations. Unlike the situation in many other countries, new products and innovative practices can be introduced with minimal bureaucratic delays. The government does, however, regulate various aspects of all U.S. industries. Federal agencies oversee worker safety and work conditions, air and water pollution, food and prescription drug safety, transportation safety, and automotive fuel economy—to name just a few examples. Moreover, the Social Security Administration operates the country's pension system, which is funded through payroll taxes. The government also operates public health programs such as Medicaid (for the poor) and Medicare (for the elderly).

      In an economy dominated by privately owned businesses, there are still some government-owned companies. These include the U.S. Postal Service, the Nuclear Regulatory Commission, the National Railroad Passenger Corporation (Amtrak), and the Tennessee Valley Authority.

      The federal government also influences economic activity in other ways. As a purchaser of goods, it exerts considerable leverage on certain sectors of the economy—most notably in the defense and aerospace industries. It also implements antitrust laws to prevent companies from colluding on prices or monopolizing market shares.

      Despite its ability to weather economic shocks, in the earliest years of the 21st century, the U.S. economy developed many weaknesses that pointed to future risks. The country faces a chronic trade deficit; the value of imports greatly outweighs the value of U.S. goods and services exported to other countries. For many citizens, household incomes have effectively stagnated since the 1970s, while indebtedness reached record levels. Rising energy prices made it more costly to run businesses, heat homes, and transport goods and people. The country's aging population placed new burdens on public health spending and pension programs (including Social Security). At the same time, the burgeoning federal budget deficit limited the amount of funding available for social programs.

      Nearly all of the federal government's revenues come from taxes, with total income from federal taxes representing about one-fifth of GDP. The most important source of tax revenue is the personal income tax, accounting for roughly half of federal revenue. Gross receipts from corporate income taxes yield a far smaller fraction (about one-eighth) of total federal receipts. Excise duties yield yet another small portion (less than one-tenth) of total federal revenue; individual states also levy their own excise and sales taxes. Federal excises fall most heavily on alcohol, gasoline, and tobacco. Other sources of revenue include Medicare and Social Security payroll taxes (which account for almost two-fifths of federal revenue) and estate and gift taxes (yielding only about 1 percent of the total).

Labour force
      With an unemployment rate of roughly 5 percent, the U.S. labour market is in line with those of other developed countries. The service sector accounts for more than three-fourths of the country's jobs, whereas industrial and manufacturing trades employ less than one-fifth of the labour force.

      After peaking in the 1950s, when 36 percent of American workers were enrolled in unions, union membership at the beginning of the 21st century had fallen to less than 15 percent of U.S. workers, nearly half of them government employees. The transformation in the late 20th century to a service-based economy changed the nature of labour unions. Organizational efforts, once aimed primarily at manufacturing industries, are now focused on service industries. The country's largest union, the National Education Association (NEA), represents teachers. In 2005 three large labour unions broke their affiliation with the American Federation of Labor–Congress of Industrial Organizations (AFL-CIO), the nationwide federation of unions, and formed a new federation, the Change to Win coalition, with the goal of reviving union influence in the labour market. Although the freedom to strike is qualified with provisions requiring cooling-off periods and in some cases compulsory arbitration, major unions are able and sometimes willing to embark on long strikes.

Agriculture, forestry, and fishing
 Despite the enormous productivity of U.S. agriculture, the combined outputs of agriculture, forestry, and fishing contribute only a small percentage of GDP. Advances in farm productivity (stemming from mechanization and organizational changes in commercial farming) have enabled a smaller labour force to produce greater quantities than ever before. Improvements in yields have also resulted from the increased use of fertilizers, pesticides, and herbicides and from changes in agricultural techniques (such as irrigation). Among the most important crops are corn (maize), soybeans, wheat, cotton, grapes, and potatoes.

      The United States is the world's major producer of timber. More than four-fifths of the trees harvested are softwoods such as Douglas fir and southern pine. The major hardwood is oak.

      The United States also ranks among the world's largest producers of edible and nonedible fish products. Fish for human consumption accounts for more than half of the tonnage landed. Shellfish account for less than one-fifth of the annual catch but for nearly half the total value.

      Less than one-fiftieth of the GDP comes from mining and quarrying, yet the United States is a leading producer of coal, petroleum, and some metals.

Resources and power
      The United States is one of the world's leading producers of energy, but it is also the world's biggest consumer of energy. Because consumption outstrips domestic production, it relies on other countries for many energy sources, petroleum products in particular. The country is nonetheless notable for its efficient use of natural resources, and it excels in transforming its resources into usable products.

Minerals
      With major producing fields in Alaska, California, the Gulf of Mexico, Louisiana, and Oklahoma, the United States is one of the world's leading producers of refined petroleum and has important reserves of natural gas. It is also one of the world's leading coal exporters. Recoverable coal deposits are concentrated largely in the Appalachian Mountains and in Wyoming. Nearly half the bituminous coal is mined in West Virginia and Kentucky, while Pennsylvania produces the country's only anthracite. Illinois, Indiana, and Ohio also produce coal.

      Iron ore is mined predominantly in Minnesota and Michigan. The United States also has important reserves of copper, magnesium, lead, and zinc. Copper production is concentrated in the mountainous western states of Arizona, Utah, Montana, Nevada, and New Mexico. Zinc is mined in Tennessee, Missouri, Idaho, and New York. Lead mining is concentrated in Missouri. Other metals mined in the United States are gold, silver, molybdenum, manganese, tungsten, bauxite, uranium, vanadium, and nickel. Important nonmetallic minerals produced are phosphates, potash, sulfur, stone, and clays.

Biological resources
      More than two-fifths of the total land area of the United States is devoted to farming (including pasture and range). Tobacco is produced in the Southeast and in Kentucky and cotton in the South and Southwest; California is noted for its vineyards, citrus groves, and truck gardens; the Midwest is the centre of corn and wheat farming, while dairy herds are concentrated in the Northern states. The Southwestern and Rocky Mountain states support large herds of livestock.

      Most of the U.S. forestland is located in the West (including Alaska), but significant forests also grow elsewhere. Almost half of the country's hardwood forests are located in Appalachia. Of total commercial forestland, more than two-thirds is privately owned. About one-fifth is owned or controlled by the federal government, the remainder being controlled by state and local governments.

      Hydroelectric resources are heavily concentrated in the Pacific and Mountain regions. Hydroelectricity, however, contributes less than one-tenth of the country's electricity supply. Coal-burning plants provide more than half of the country's power; nuclear generators contribute about one-fifth.

Manufacturing
      Since the mid-20th century, services (such as health care, entertainment, and finance) have grown faster than any other sector of the economy. Nevertheless, while manufacturing jobs have declined since the 1960s, advances in productivity have caused manufacturing output, including construction, to remain relatively constant, at about one-fifth of GDP.

      Manufacturing activity spans a wide range of industries. The manufacture of transportation equipment (including motor vehicles, aircraft, and space equipment) represents a leading sector. Computer and telecommunications firms (including software and hardware) remain strong, despite a downturn in the early 21st century. Other important sectors include drug manufacturing and biotechnology, health services, food products, chemicals, electrical and nonelectrical machinery, energy, and insurance.

Finance
      Under the Federal Reserve System, which regulates bank credit and influences the money supply, central banking functions are exercised by 12 regional Federal Reserve banks. The Board of Governors, appointed by the U.S. president, supervises these banks. Based in Washington, D.C., the board does not necessarily act in accord with the administration's views on economic policy. The U.S. Treasury also influences the working of the monetary system through its management of the national debt (which can affect interest rates) and by changing its own deposits with the Federal Reserve banks (which can affect the volume of credit). While only about two-fifths of all commercial banks belong to the Federal Reserve System, these banks hold almost three-fourths of all commercial bank deposits. Banks incorporated under national charter must be members of the system, while banks incorporated under state charters may become members. Member banks must maintain minimum legal reserves and must deposit a percentage of their savings and checking accounts with a Federal Reserve bank. There are also thousands of nonbank credit agencies such as personal credit institutions and savings and loan associations (S&Ls).

      Although banks supply less than half of the funds used for corporate finance, bank loans represent the country's largest source of capital for business borrowing. A liberalizing trend in state banking laws in the 1970s and '80s encouraged both intra- and interstate expansion of bank facilities and bank holding companies. Subsequent mergers among the country's largest banks led to the formation of large regional and national banking and financial services corporations. In serving both individual and commercial customers, these institutions accept deposits, provide checking accounts, underwrite securities, originate loans, offer mortgages, manage investments, and sponsor credit cards.

      Financial services are also provided by insurance companies and security brokerages. The federal government sponsors credit agencies in the areas of housing (home mortgages), farming (agricultural loans), and higher education (student loans). New York City has three organized stock exchanges—the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), and the National Association of Securities Dealers Automated Quotations (NASDAQ) Stock Market—which account for the bulk of all stock sales in the United States. The country's leading markets for commodities, futures, and options are the Chicago Board of Trade (CBOT), the Chicago Mercantile Exchange (CME), and the Chicago Board Options Exchange (CBOE). The Chicago Climate Exchange (CCX) specializes in futures contracts for greenhouse gas emissions (carbon credits). Smaller exchanges operate in a number of American cities.

Foreign trade
      International trade is crucial to the national economy, with the combined value of imports and exports equivalent to about one-sixth of the gross national product. Canada, Mexico, Japan, China, and the United Kingdom are the principal trading partners. Leading exports include electrical and office machinery, chemical products, motor vehicles, airplanes and aviation parts, and scientific equipment. Major imports include manufactured goods, petroleum and fuel products, and machinery and transportation equipment.

E.I.U. Ed.

Transportation
      The economic and social complexion of life in the United States mirrors the nation's extraordinary mobility. A pervasive transportation network has helped transform the vast geographic expanse into a surprisingly homogeneous and close-knit social and economic environment. Another aspect of mobility is flexibility, and this freedom to move is often seen as a major factor in the dynamism of the U.S. economy. Mobility has also had destructive effects: it has accelerated the deterioration of older urban areas, multiplied traffic congestion, intensified pollution of the environment, and diminished support for public transportation systems.

Roads and railroads
      Central to the U.S. transportation network is the 45,000-mile Interstate System, now known as the Dwight D. Eisenhower System of Interstate and Defense Highways. The system connects about nine-tenths of all cities of at least 50,000 population. Begun in the 1950s, the highway system carries about one-fifth of the country's motor traffic. Nearly nine-tenths of all households own at least one automobile or truck. At the end of the 20th century, these added up to more than 100 million privately owned vehicles. While most trips in metropolitan areas are made by automobile, public transit and commuter rail lines play an important role in the most populous cities, with the majority of home-to-work commuters traveling by public carriers in such cities as New York City, Chicago, Philadelphia, and Boston. Although railroads once dominated both freight and passenger traffic in the United States, government regulation and increased competition from trucking reduced their role in transportation. Railroads move about one-third of the nation's intercity freight traffic. The most important items carried are coal, grain, chemicals, and motor vehicles. Many rail companies had given up passenger service by 1970, when Congress created the National Railroad Passenger Corporation (known as Amtrak), a government corporation, to take over passenger service. Amtrak operates a 21,000-mile system serving more than 500 stations across the country.

Water and air transport
 Navigable waterways are extensive and centre upon the Mississippi River system in the country's interior, the Great Lakes–St. Lawrence Seaway system in the north, and the Gulf Coast waterways along the Gulf of Mexico. Barges carry more than two-thirds of domestic waterborne traffic, transporting petroleum products, coal and coke, and grain. The country's largest ports by tonnage handled are the Port of South Louisiana; the Port of Houston, Texas; the Port of New York/New Jersey; and the Port of New Orleans, Louisiana.

      Air traffic has experienced spectacular growth in the United States since the mid-20th century. From 1970 to 1999, passenger traffic on certified air carriers increased 373 percent. Much of this growth occurred after airline deregulation, which began in 1978. There are more than 14,000 public and private airports, the busiest being in Atlanta, Ga., and Chicago for passenger traffic. Airports in Memphis, Tenn. (the hub of package-delivery company Federal Express), and Los Angeles handle the most freight cargo.

Government and society

Constitutional framework
      The Constitution of the United States, written to redress the deficiencies of the country's first constitution, the Articles of Confederation (1781–89), defines a federal system of government in which certain powers are delegated to the national government and others are reserved to the states. The national government consists of executive, legislative, and judicial branches that are designed to ensure, through separation of powers and through checks and balances, that no one branch of government is able to subordinate the other two. All three branches are interrelated, each with overlapping yet quite distinct authority.

      The U.S. Constitution, the world's oldest written national constitution still in effect, was officially ratified on June 21, 1788 (when New Hampshire became the ninth state to ratify the document), and formally entered into force on March 4, 1789; George Washington was sworn in as the country's first president the following month. Although the Constitution contains several specific provisions (such as age and residency requirements for holders of federal offices and powers granted to Congress), it is vague in many areas and could not have comprehensively addressed the myriad of issues (historical, technological, and otherwise) that have arisen in the centuries since its ratification. Thus, the Constitution is considered a living document, its meaning changing over time as a result of new interpretations of its provisions. In addition, the framers allowed for changes to the document, outlining in Article V the procedures required to amend the Constitution. Amending the Constitution requires a proposal by a two-thirds vote of each house of Congress or by a national convention called for at the request of the legislatures of two-thirds of the states, followed by ratification by three-fourths of the state legislatures or by conventions in as many states.
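The Article V fractions translate into concrete vote counts. As a rough arithmetic sketch (assuming the current 435-seat House, 100-seat Senate, and 50 states, and that fractional thresholds round up to the next whole vote):

```python
import math

HOUSE_SEATS = 435   # statutory size of the House since the 1910s
SENATE_SEATS = 100  # two senators per state
STATES = 50

def two_thirds(n):
    """Votes needed for a two-thirds supermajority of n members."""
    return math.ceil(n * 2 / 3)

def three_fourths(n):
    """Count needed to reach three-fourths of n states."""
    return math.ceil(n * 3 / 4)

print(two_thirds(HOUSE_SEATS))    # 290 House votes to propose an amendment
print(two_thirds(SENATE_SEATS))   # 67 Senate votes to propose an amendment
print(three_fourths(STATES))      # 38 states needed to ratify
```
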

      In the more than two centuries since the Constitution's ratification, there have been 27 amendments. All successful amendments have been proposed by Congress, and all but one—the Twenty-first Amendment (1933), which repealed prohibition—have been ratified by state legislatures. The first 10 amendments, proposed by Congress in September 1789 and adopted in 1791, are known collectively as the Bill of Rights, which places limits on the federal government's power to curtail individual freedoms. The First Amendment, for example, provides that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” Though the First Amendment's language appears absolute, it has been interpreted to mean that the federal government (and later the state governments) cannot place undue restrictions on individual liberties but can regulate speech, religion, and other rights. The Second and Third amendments, which, respectively, guarantee the people's right to bear arms and limit the quartering of soldiers in private houses, reflect the hostility of the framers to standing armies. The Fourth through Eighth amendments establish the rights of the criminally accused, including safeguards against unreasonable searches and seizures, protection from double jeopardy (being tried twice for the same offense), the right to refuse to testify against oneself, and the right to a trial by jury. The Ninth and Tenth amendments underscore the general rights of the people. The Ninth Amendment protects the unenumerated residual rights of the people (i.e., those not explicitly granted in the Constitution), and the Tenth Amendment reserves to the states or to the people those powers not delegated to the United States nor denied to the states.

      The guarantees of the Bill of Rights are steeped in controversy, and debate continues over the limits that the federal government may appropriately place on individuals. One source of conflict has been the ambiguity in the wording of many of the Constitution's provisions—such as the Second Amendment's right “to keep and bear arms” and the Eighth Amendment's prohibition of “cruel and unusual punishments.” Also problematic is the Tenth Amendment's apparent contradiction of the body of the Constitution; Article I, Section 8, enumerates the powers of Congress but also allows that it may make all laws “which shall be necessary and proper,” while the Tenth Amendment stipulates that “powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.” The distinction between what powers should be left to the states or to the people and what is a necessary and proper law for Congress to pass has not always been clear.

      Between the ratification of the Bill of Rights and the American Civil War (1861–65), only two amendments were passed, and both were technical in nature. The Eleventh Amendment (1795) forbade suits against the states in federal courts, and the Twelfth Amendment (1804) corrected a constitutional error that came to light in the presidential election of 1800, when Democratic-Republicans Thomas Jefferson and Aaron Burr each won 73 electors because electors were unable to cast separate ballots for president and vice president. The Thirteenth, Fourteenth, and Fifteenth amendments were passed in the aftermath of the Civil War. The Thirteenth (1865) abolished slavery, while the Fifteenth (1870) forbade denial of the right to vote to former male slaves. The Fourteenth Amendment, which granted citizenship rights to former slaves and guaranteed to every citizen due process and equal protection of the laws, was regarded for a while by the courts as limiting itself to the protection of freed slaves, but it has since been used to extend protections to all citizens. Initially, the Bill of Rights applied solely to the federal government and not to the states. In the 20th century, however, many (though not all) of the provisions of the Bill of Rights were extended by the Supreme Court through the Fourteenth Amendment to protect individuals from encroachments by the states. Notable amendments since the Civil War include the Sixteenth (1913), which enabled the imposition of a federal income tax; the Seventeenth (1913), which provided for the direct election of U.S. senators; the Nineteenth (1920), which established woman suffrage; the Twenty-fifth (1967), which established succession to the presidency and vice presidency; and the Twenty-sixth (1971), which extended voting rights to all citizens 18 years of age or older.

The executive branch
      The executive branch is headed by the president, who must be a natural-born citizen of the United States, at least 35 years old, and a resident of the country for at least 14 years. A president is elected indirectly by the people through an electoral college system to a four-year term and is limited to two elected terms of office by the Twenty-second Amendment (1951). The president's official residence and office is the White House, located at 1600 Pennsylvania Avenue N.W. in Washington, D.C. The formal constitutional responsibilities vested in the presidency include serving as commander in chief of the armed forces; negotiating treaties; appointing federal judges, ambassadors, and cabinet officials; and acting as head of state. In practice, presidential powers have expanded to include drafting legislation, formulating foreign policy, conducting personal diplomacy, and leading the president's political party.

      The members of the president's cabinet—the attorney general and the secretaries of State, Treasury, Defense, Homeland Security, Interior, Agriculture, Commerce, Labor, Health and Human Services, Housing and Urban Development, Transportation, Education, Energy, and Veterans Affairs—are appointed by the president with the approval of the Senate. Although they are described in the Twenty-fifth Amendment as “the principal officers of the executive departments,” significant power has flowed to non-cabinet-level presidential aides, such as those serving in the Office of Management and Budget (OMB), the Council of Economic Advisers, the National Security Council (NSC), and the office of the White House chief of staff; cabinet-level rank may be conferred on the heads of such institutions at the discretion of the president. Members of the cabinet and presidential aides serve at the pleasure of the president and may be dismissed at any time.

      The executive branch also includes independent regulatory agencies such as the Federal Reserve System and the Securities and Exchange Commission. Governed by commissions appointed by the president and confirmed by the Senate (commissioners may not be removed by the president), these agencies protect the public interest by enforcing rules and resolving disputes over federal regulations. Also part of the executive branch are government corporations (e.g., the Tennessee Valley Authority, the National Railroad Passenger Corporation [Amtrak], and the U.S. Postal Service), which supply services to consumers that could be provided by private corporations, and independent executive agencies (e.g., the Central Intelligence Agency, the National Science Foundation, and the National Aeronautics and Space Administration), which comprise the remainder of the federal government.

The legislative branch
      The U.S. Congress (Congress of the United States), the legislative branch of the federal government, consists of two houses: the Senate and the House of Representatives (Representatives, House of). Powers granted to Congress under the Constitution include the power to levy taxes, borrow money, regulate interstate commerce, impeach and convict the president, declare war, discipline its own membership, and determine its rules of procedure.

      With the exception of revenue bills, which must originate in the House of Representatives, legislative bills may be introduced in and amended by either house, and a bill—with its amendments—must pass both houses in identical form and be signed by the president before it becomes law. The president may veto a bill, but a veto can be overridden by a two-thirds vote of both houses. The House of Representatives may impeach a president or another public official by a majority vote; trials of impeached officials are conducted by the Senate, and a two-thirds majority is necessary to convict and remove the individual from office. Congress is assisted in its duties by the Government Accountability Office (GAO; known until 2004 as the General Accounting Office), which examines all federal receipts and expenditures by auditing federal programs and assessing the fiscal impact of proposed legislation, and by the Congressional Budget Office (CBO), a legislative counterpart to the OMB, which assesses budget data, analyzes the fiscal impact of alternative policies, and makes economic forecasts.

      The House of Representatives is chosen by the direct vote of the electorate in single-member districts in each state. The number of representatives allotted to each state is based on its population as determined by a decennial census; states sometimes gain or lose seats, depending on population shifts. The overall membership of the House has been 435 since the 1910s, though it was temporarily expanded to 437 after Hawaii and Alaska were admitted as states in 1959. Members must be at least 25 years old, residents of the states from which they are elected, and citizens of the United States for at least seven years. It has become a practical imperative—though not a constitutional requirement—that a member be an inhabitant of the district that elects him. Members serve two-year terms, and there is no limit on the number of terms they may serve. The speaker of the House, who is chosen by the majority party, presides over debate, appoints members of select and conference committees, and performs other important duties; he is second in the line of presidential succession (following the vice president). The parliamentary leaders of the two main parties are the majority floor leader and the minority floor leader. The floor leaders are assisted by party whips, who are responsible for maintaining contact between the leadership and the members of the House. Bills introduced by members in the House of Representatives are received by standing committees, which can amend, expedite, delay, or kill legislation. Each committee is chaired by a member of the majority party, who traditionally attained this position on the basis of seniority, though the importance of seniority has eroded somewhat since the 1970s. Among the most important committees are those on Appropriations, Ways and Means, and Rules.
The Rules Committee, for example, has significant power to determine which bills will be brought to the floor of the House for consideration and whether amendments will be allowed on a bill when it is debated by the entire House.
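The census-based apportionment of the 435 seats described above has, since 1941, used the method of equal proportions (the Huntington–Hill method): every state first receives one seat, and each remaining seat goes to the state with the highest priority value, its population divided by the geometric mean of its current and next seat counts. A minimal sketch of that procedure, using made-up state names and populations rather than census data:

```python
import heapq
import math

def apportion(populations, seats):
    """Method of equal proportions: give every state one seat, then award
    each remaining seat to the state with the highest priority value
    pop / sqrt(n * (n + 1)), where n is the state's current seat count."""
    alloc = {state: 1 for state in populations}
    # Max-heap via negated priorities: (-priority, state)
    heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats - len(populations)):
        _, state = heapq.heappop(heap)
        alloc[state] += 1
        n = alloc[state]
        heapq.heappush(heap, (-populations[state] / math.sqrt(n * (n + 1)), state))
    return alloc

# Toy example with hypothetical populations (illustrative only)
print(apportion({"A": 6_000_000, "B": 3_000_000, "C": 1_000_000}, seats=10))
# {'A': 6, 'B': 3, 'C': 1}
```

The geometric-mean divisor is what distinguishes this method from simpler largest-remainder schemes; it guarantees every state at least one seat and tends to treat large and small states even-handedly.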

      Each state elects two senators at large. Senators must be at least 30 years old, residents of the state from which they are elected, and citizens of the United States for at least nine years. They serve six-year terms, which are arranged so that one-third of the Senate is elected every two years. Senators also are not subject to term limits. The vice president serves as president of the Senate, casting a vote only in the case of a tie, and in his absence the Senate is chaired by a president pro tempore, who is elected by the Senate and is third in the line of succession to the presidency. Among the Senate's most prominent standing committees are those on Foreign Relations, Finance, Appropriations, and Governmental Affairs. Debate is almost unlimited and may be used to delay a vote on a bill indefinitely. Such a delay, known as a filibuster, can be ended by three-fifths of the Senate through a procedure called cloture. Treaties negotiated by the president with other governments must be ratified by a two-thirds vote of the Senate. The Senate also has the power to confirm or reject presidentially appointed federal judges, ambassadors, and cabinet officials.
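The three-fifths and two-thirds requirements above work out to familiar vote counts. A quick arithmetic sketch (assuming all 100 senators are present and voting; the treaty threshold is formally two-thirds of senators present):

```python
import math

SENATE_SEATS = 100  # two senators from each of the 50 states

# Three-fifths of the full Senate to invoke cloture and end a filibuster
cloture = math.ceil(SENATE_SEATS * 3 / 5)

# Two-thirds (of those present; here, the full chamber) to approve a treaty
treaty = math.ceil(SENATE_SEATS * 2 / 3)

print(cloture, treaty)  # 60 67
```
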

The judicial branch (judiciary)
      The judicial branch is headed by the Supreme Court of the United States, which interprets the Constitution and federal legislation. The Supreme Court consists of nine justices (including a chief justice) appointed to life terms by the president with the consent of the Senate. It has appellate jurisdiction over the lower federal courts and over state courts if a federal question is involved. It also has original jurisdiction (i.e., it serves as a trial court) in cases involving foreign ambassadors, ministers, and consuls and in cases to which a U.S. state is a party.

      Most cases reach the Supreme Court through its appellate jurisdiction. The Judiciary Act of 1925 gave the justices sole discretion to determine their caseload. In order to issue a writ of certiorari, which grants a court hearing to a case, at least four justices must agree (the “Rule of Four”). Three types of cases commonly reach the Supreme Court: cases involving litigants from different states, cases involving the interpretation of federal law, and cases involving the interpretation of the Constitution. The court can take official action with as few as six justices joining in deliberation, and a majority vote of the entire court is decisive; a tie vote sustains a lower-court decision. The official decision of the court is often supplemented by concurring opinions from justices who support the majority decision and dissenting opinions from justices who oppose it.

      Because the Constitution is vague and ambiguous in many places, it is often possible for critics to fault the Supreme Court for misinterpreting it. In the 1930s, for example, the Republican-dominated court was criticized for overturning much of the New Deal legislation of Democratic President Franklin D. Roosevelt. In the area of civil rights, the court has received criticism from various groups at different times. Its 1954 ruling in Brown v. Board of Education of Topeka, which declared school segregation unconstitutional, was harshly attacked by Southern political leaders, who were later joined by Northern conservatives. A number of decisions involving the pretrial rights of prisoners, including the granting of Miranda rights and the adoption of the exclusionary rule, also came under attack on the ground that the court had made it difficult to convict criminals. On divisive issues such as abortion, affirmative action, school prayer, and flag burning, the court's decisions have aroused considerable opposition and controversy, with opponents sometimes seeking constitutional amendments to overturn the court's decisions.

      At the lowest level of the federal court system are district courts (see United States District Court). Each state has at least one federal district court and at least one federal judge. District judges are appointed to life terms by the president with the consent of the Senate. Appeals from district-court decisions are carried to the U.S. courts of appeals (see United States Court of Appeals). Losing parties at this level may appeal for a hearing from the Supreme Court. Special courts handle property and contract damage suits against the United States (United States Court of Federal Claims), review customs rulings (United States Court of International Trade), hear complaints by individual taxpayers (United States Tax Court) or veterans (United States Court of Appeals for Veterans Claims), and apply the Uniform Code of Military Justice (United States Court of Appeals for the Armed Forces).

State and local government
      Because the U.S. Constitution establishes a federal system, the state governments enjoy extensive authority. The Constitution outlines the specific powers granted to the national government and reserves the remainder to the states. However, because of ambiguity in the Constitution and disparate historical interpretations by the federal courts, the powers actually exercised by the states have waxed and waned over time. Beginning in the last decades of the 20th century, for example, decisions by conservative-leaning federal courts, along with a general trend favouring the decentralization of government, increased the power of the states relative to the federal government. In some areas, the authority of the federal and state governments overlap; for example, the state and federal governments both have the power to tax, establish courts, and make and enforce laws. In other areas, such as the regulation of commerce within a state, the establishment of local governments, and action on public health, safety, and morals, the state governments have considerable discretion. The Constitution also denies to the states certain powers; for example, the Constitution forbids states to enter into treaties, to tax imports or exports, or to coin money. States also may not adopt laws that contradict the U.S. Constitution.

      The governments of the 50 states have structures closely paralleling those of the federal government. Each state has a governor, a legislature, and a judiciary. Each state also has its own constitution.

      Mirroring the U.S. Congress, all state legislatures are bicameral (bicameral system) except Nebraska's, which is unicameral. Most state judicial systems are based upon elected justices of the peace (justice of the peace) (although in many states this term is not used), above whom are major trial courts, often called district courts, and appellate courts. Each state has its own supreme court. In addition, there are probate courts concerned with wills, estates, and guardianships. Most state judges are elected, though some states use an appointment process similar to the federal courts and some use a nonpartisan selection process known as the Missouri Plan.

      State governors are directly elected and serve varying terms (generally ranging from two to four years); in some states, the number of terms a governor may serve is limited. The powers of governors also vary, with some state constitutions ceding substantial authority to the chief executive (such as appointment and budgetary powers and the authority to veto legislation). In a few states, however, governors have highly circumscribed authority, with the constitution denying them the power to veto legislative bills.

      Most states have a lieutenant governor, who is often elected independently of the governor and is sometimes not a member of the governor's party. Lieutenant governors generally serve as the presiding officer of the state Senate. Other elected officials commonly include a secretary of state, state treasurer, state auditor, attorney general, and superintendent of public instruction.

      State governments have a wide array of functions, encompassing conservation, highway and motor vehicle supervision, public safety and corrections, professional licensing, regulation of agriculture and of intrastate business and industry, and certain aspects of education, public health, and welfare. The administrative departments that oversee these activities are headed by the governor.

      Each state may establish local governments to assist it in carrying out its constitutional powers. Local governments exercise only those powers that are granted to them by the states, and a state may redefine the role and authority of local government as it deems appropriate. The country has a long tradition of local democracy (e.g., the town meeting), and even some of the smallest areas have their own governments. There are some 85,000 local government units in the United States. The largest local government unit is the county (called a parish in Louisiana or a borough in Alaska). Counties range in population from as few as 100 people to millions (e.g., Los Angeles county). They often provide local services in rural areas and are responsible for law enforcement and keeping vital records. Smaller units include townships, villages, school districts, and special districts (e.g., housing authorities, conservation districts, and water authorities).

      Municipal, or city, governments are responsible for delivering most local services, particularly in urban areas. At the beginning of the 21st century there were some 20,000 municipal governments in the United States. They are more diverse in structure than state governments. There are three basic types: mayor-council (mayor and council system), commission, and council-manager governments. The mayor-council form, which is used in Boston, New York City, Philadelphia, Chicago, and thousands of smaller cities, consists of an elected mayor and council. The powers of mayors and councils vary from city to city; in most cities the mayor has limited powers and serves largely as a ceremonial leader, but in some cities (particularly large urban areas) the council is nominally responsible for formulating city ordinances, which the mayor enforces, though in practice the mayor often controls the actions of the council. In the commission type, used less frequently now than it was in the early 20th century, voters elect a number of commissioners, each of whom serves as head of a city department; the presiding commissioner is generally the mayor. In the council-manager type, used in large cities such as Charlotte (North Carolina), Dallas (Texas), Phoenix (Arizona), and San Diego (California), an elected council hires a city manager to administer the city departments. The mayor, elected by the council, simply chairs the council and officiates at important functions.

      As society has become increasingly urban, politics and government have become more complex. Many problems of the cities, including transportation, housing, education, health, and welfare, can no longer be handled entirely on the local level. Because even the states do not have the necessary resources, cities have often turned to the federal government for assistance, though proponents of local control have urged that the federal government provide block-grant aid to state and local governments without federal restrictions.

Political process
      The framers of the U.S. Constitution focused their efforts primarily on the role, power, and function of the state and national governments, only briefly addressing the political and electoral process. Indeed, three of the Constitution's four references to the election of public officials left the details to be determined by Congress or the states. The fourth reference, in Article II, Section 1, prescribed the role of the electoral college in choosing the president, but this section was soon amended (in 1804 by the Twelfth Amendment) to remedy the technical defects that had arisen in 1800, when all Democratic-Republican Party electors cast their votes for Thomas Jefferson and Aaron Burr, thereby creating a tie because electors were unable to differentiate between their presidential and vice presidential choices. (The election of 1800 was finally settled by the House of Representatives, which chose Jefferson as president on the 36th ballot.)

      In establishing the electoral college, the framers stipulated that “Congress may determine the Time of chusing [sic] the Electors, and the Day on which they shall give their votes; which Day shall be the same throughout the United States.” In 1845 Congress established that presidential electors would be appointed on the first Tuesday after the first Monday in November; the electors cast their ballots on the Monday following the second Wednesday in December. Article I, establishing Congress, merely provides (Section 2) that representatives are to be “chosen every second Year by the People of the several States” and that voting qualifications are to be the same for Congress as for the “most numerous Branch of the State Legislature.” Initially, senators were chosen by their respective state legislatures (Section 3), though this was changed to popular election by the Seventeenth Amendment in 1913. Section 4 leaves to the states the prescription of the “Times, Places and Manner of holding Elections for Senators and Representatives” but gives Congress the power “at any time by Law [to] make or alter such Regulations, except as to the Places of chusing Senators.” In 1875 Congress designated the first Tuesday after the first Monday in November in even years as federal election day.
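The statutory rule described above, the first Tuesday after the first Monday in November, is a mechanical calendar calculation. As a minimal sketch (the function name is illustrative, not from any statute or library), it can be computed in Python as follows:

```python
from datetime import date, timedelta

def federal_election_day(year: int) -> date:
    """First Tuesday after the first Monday in November of the given year."""
    nov1 = date(year, 11, 1)
    # Days from November 1 to the first Monday (Monday is weekday 0).
    days_to_monday = (0 - nov1.weekday()) % 7
    first_monday = nov1 + timedelta(days=days_to_monday)
    # Election day is the Tuesday immediately following that Monday.
    return first_monday + timedelta(days=1)
```

Note that the rule is not simply "the first Tuesday in November": when November 1 falls on a Tuesday, election day is pushed to the following week, because the first Monday then falls on November 7.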

      All citizens at least 18 years of age are eligible to vote. (Prisoners, ex-felons, and individuals on probation or parole are prohibited, sometimes permanently, from voting in some states.) The history of voting rights in the United States has been one of gradual extension of the franchise. Religion, property ownership, race, and gender have disappeared one by one as legal barriers to voting. In 1870, through the Fifteenth Amendment, former slaves were granted the right to vote, though African Americans were subsequently still denied the franchise (particularly in the South) through devices such as literacy tests, poll taxes, and grandfather clauses. Only in the 1960s, through the Twenty-fourth Amendment (barring poll taxes) and the Voting Rights Act, were the full voting rights of African Americans guaranteed. Though universal manhood suffrage had theoretically been achieved following the American Civil War, woman suffrage was not fully guaranteed until 1920 with the enactment of the Nineteenth Amendment (several states, particularly in the West, had begun granting women the right to vote and to run for political office beginning in the late 19th century). Suffrage was also extended by the Twenty-sixth Amendment (1971), which lowered the minimum voting age to 18.

Voting and elections
      Voters go to the polls in the United States not only to elect members of Congress and presidential electors but also to cast ballots for state and local officials, including governors, mayors, and judges, and on ballot initiatives and referendums that may range from local bond issues to state constitutional amendments (see referendum and initiative). The 435 members of the House of Representatives are chosen by the direct vote of the electorate in single-member districts in each state. State legislatures (sometimes with input from the courts) draw congressional district boundaries, often for partisan advantage (see gerrymandering); incumbents have always enjoyed an electoral advantage over challengers, but, as computer technology has made redistricting more sophisticated and easier to manipulate, elections to the House of Representatives have become even less competitive, with more than 90 percent of incumbents who choose to run for reelection regularly winning—often by significant margins. By contrast, Senate elections are generally more competitive.

      Voters indirectly elect the president (presidency of the United States of America) and vice president through the electoral college. Instead of choosing a candidate, voters actually choose electors committed to support a particular candidate. Each state is allotted one electoral vote for each of its senators and representatives in Congress; the Twenty-third Amendment (1961) granted electoral votes to the District of Columbia, which does not have congressional representation. A candidate must win a majority (270) of the 538 electoral votes to be elected president. If no candidate wins a majority, the House of Representatives selects the president, with each state delegation receiving one vote; the Senate elects the vice president if no vice presidential candidate secures an electoral college majority. A candidate may lose the popular vote but be elected president by winning a majority of the electoral vote (as George W. Bush (Bush, George W.) did in 2000), though such inversions are rare. Presidential elections are costly and generate much media and public attention—sometimes years before the actual date of the general election. Indeed, some presidential aspirants have declared their candidacies years in advance of the first primaries and caucuses, and some White House hopefuls drop out of the grueling process long before the first votes are cast.
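The allocation arithmetic above can be summarized in a short sketch (the names here are illustrative, assuming the figures stated in the text: 435 representatives, 100 senators, and 3 electors for the District of Columbia):

```python
def electoral_votes(house_seats: int) -> int:
    """A state's electoral votes: its House seats plus its 2 senators."""
    return house_seats + 2

# 435 representatives + 100 senators + 3 for D.C. (Twenty-third Amendment).
TOTAL_ELECTORS = 435 + 100 + 3   # 538
MAJORITY = TOTAL_ELECTORS // 2 + 1  # 270 votes required to win outright

def wins_presidency(candidate_electors: int) -> bool:
    """True if the candidate holds an electoral college majority."""
    return candidate_electors >= MAJORITY
```

Because even the least populous state has one representative and two senators, no state casts fewer than three electoral votes, which is part of why the electoral and popular vote outcomes can diverge.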

      Voting in the United States is not compulsory, and, in contrast to most other Western countries, voter turnout is quite low. In the late 20th and the early 21st century, about 50 percent of Americans cast ballots in presidential elections; turnout was even lower for congressional and state and local elections, with participation dropping under 40 percent for most congressional midterm elections (held midway through a president's four-year term). Indeed, in some local elections (such as school board elections or bond issues) and primaries or caucuses, turnout has sometimes fallen below 10 percent. High abstention rates led to efforts to encourage voter participation by making voting easier. For example, in 1993 Congress passed the National Voter Registration Act (the so-called "motor-voter law"), which required states to allow citizens to register to vote when they received their driver's licenses, and in 1998 voters in Oregon approved a referendum that established a mail-in voting system. In addition, some states now allow residents to register to vote on election day, polls are open on multiple days and in multiple locations in some states, and Internet voting has even been introduced on a limited basis for some elections.

Money and campaigns
      Campaigns for all levels of office are expensive in the United States compared with those in most other democratic countries. In an attempt to reduce the influence of money in the political process, reforms were instituted in the 1970s that required public disclosure of contributions and limited the amounts of contributions to candidates for federal office. Individuals were allowed to contribute directly to a candidate no more than $1,000 in so-called “hard money” (i.e., money regulated by federal election law) per candidate per election. The law, however, allowed labour unions, corporations, political advocacy groups, and political parties to raise and spend unregulated “soft money,” so long as funds were not spent specifically to support a candidate for federal office (in practice, this distinction was often blurry). Because there were no limits on such soft money, individuals or groups could contribute to political parties any sum at their disposal or spend limitlessly to advocate policy positions (often to the benefit or detriment of particular candidates). In the 2000 election cycle, it is estimated that more than $1 billion was spent by the Democratic and Republican parties and candidates for office, with more than two-fifths of this total coming from soft money contributions.

      Concerns about campaign financing led to the passage of the Bipartisan Campaign Reform Act of 2002 (popularly called the “McCain-Feingold law” for its two chief sponsors in the Senate, Republican John McCain and Democrat Russell Feingold), which banned national political parties from raising soft money. The law also increased the amount individuals could contribute to candidates (indexing the amount for inflation) and prevented interest groups from broadcasting advertisements that specifically referred to a candidate within 30 days of a primary election and 60 days of a general election.

      There are no federal limits on how much an individual may spend on his or her own candidacy. In 1992, for example, Ross Perot (Perot, Ross) spent more than $60 million of his fortune on his unsuccessful bid to become president of the United States, and Michael Bloomberg was elected mayor of New York City in 2001 after spending nearly $70 million of his own funds. The campaign finance law of 2002 allowed candidates for federal office to raise amounts greater than the normal limit on individual hard money contributions when running against wealthy, largely self-financed opponents.

      The United States has two major national political parties (political party), the Democratic Party and the Republican Party. Although the parties contest presidential elections every four years and have national party organizations, between elections they are often little more than loose alliances of state and local party organizations. Other parties have occasionally challenged the Democrats and Republicans. Since the Republican Party's rise to major party status in the 1850s, however, minor parties have had only limited electoral success, generally restricted either to influencing the platforms of the major parties or to siphoning off enough votes from a major party to deprive that party of victory in a presidential election. In the 1912 election, for example, former Republican president Theodore Roosevelt (Roosevelt, Theodore) challenged Republican President William Howard Taft (Taft, William Howard), splitting the votes of Republicans and allowing Democrat Woodrow Wilson (Wilson, Woodrow) to win the presidency with only 42 percent of the vote. Similarly, the 2.7 percent of the vote won by Green Party nominee Ralph Nader (Nader, Ralph) in 2000 may have tipped the presidency toward Republican George W. Bush by attracting votes that otherwise would have been cast for Democrat Al Gore (Gore, Al).

      There are several reasons for the failure of minor parties and the resilience of America's two-party system. In order to win a national election, a party must appeal to a broad base of voters and a wide spectrum of interests. The two major parties have tended to adopt centrist political programs, and sometimes there are only minor differences between them on major issues, especially those related to foreign affairs. Each party has both conservative and liberal wings, and on some issues (e.g., affirmative action) conservative Democrats have more in common with conservative Republicans than with liberal Democrats. The country's “winner-take-all” plurality system, in contrast to the proportional representation used in many other countries (whereby a party, for example, that won 5 percent of the vote would be entitled to roughly 5 percent of the seats in the legislature), has penalized minor parties by requiring them to win a plurality of the vote in individual districts in order to gain representation. The Democratic and Republican Party candidates are automatically placed on the general election ballot, while minor parties often have to expend considerable resources collecting enough signatures from registered voters to secure a position on the ballot. Finally, the cost of campaigns, particularly presidential campaigns, often discourages minor parties. Since the 1970s, presidential campaigns (primaries and caucuses, national conventions, and general elections) have been publicly funded through a tax checkoff system, whereby taxpayers can designate whether a portion of their federal taxes (in the early 21st century, $3 for an individual and $6 for a married couple) should be allocated to the presidential campaign fund. 
Whereas the Democratic and Republican presidential candidates receive full federal financing (nearly $75 million in 2004) for the general election, a minor party is eligible for a portion of the federal funds only if its candidate surpassed 5 percent in the prior presidential election (all parties with at least 25 percent of the national vote in the prior presidential election are entitled to equal funds). A new party contesting the presidential election is entitled to federal funds after the election if it received at least 5 percent of the national vote.

      Both the Democratic and Republican parties have undergone significant ideological transformations throughout their histories. The modern Democratic Party traditionally supports organized labour, minorities, and progressive reforms. Nationally, it generally espouses a liberal political philosophy, supporting greater governmental intervention in the economy and less governmental regulation of the private lives of citizens. It also generally supports higher taxes (particularly on the wealthy) to finance social welfare benefits that provide assistance to the elderly, the poor, the unemployed, and children. By contrast, the national Republican Party supports limited government regulation of the economy, lower taxes, and more conservative (traditional) social policies.

  At the state level, political parties reflect the diversity of the population. Democrats in the Southern states are generally more conservative than Democrats in New England or the Pacific Coast states; likewise, Republicans in New England or the mid-Atlantic states generally adopt more liberal positions than Republicans in the South or the mountain states of the West. Large urban centres are more likely to support the Democratic Party, whereas rural areas, small cities, and suburban areas tend more often to vote Republican. Some states have traditionally given majorities to one particular party. For example, because of the legacy of the Civil War and its aftermath, the Democratic Party dominated the 11 Southern states of the former Confederacy (Confederate States of America) until the mid-20th century. Since the 1960s, however, the South and the mountain states of the West have heavily favoured the Republican Party; in other areas, such as New England, the mid-Atlantic, and the Pacific Coast, support for the Democratic Party is strong. Compare, for example, the 1960 and 2000 presidential elections.

      Both the Democratic and Republican parties select their candidates for office through primary elections. Traditionally, individuals worked their way up through the party organization, belonging to a neighbourhood party club, helping to raise funds, getting out the vote, watching the polls, and gradually rising to become a candidate for local, state, and—depending on chance, talent, political expediency, and a host of other factors—higher office. Because American elections are now more heavily candidate-centred rather than party-centred and are less susceptible to control by party bosses, wealthy candidates have often been able to circumvent the traditional party organization to win their party's nomination.

Security
National security
      The September 11 attacks of 2001 precipitated the creation of the Department of Homeland Security, which is charged with protecting the United States against terrorist attacks. The legislation establishing the department—the largest government reorganization in 50 years—consolidated much of the country's security infrastructure, integrating the functions of more than 20 agencies under Homeland Security. The department's substantive responsibilities are divided into four directorates: border and transportation security, emergency preparedness, information analysis and infrastructure protection, and science and technology. The Secret Service, which protects the president, vice president, and other designated individuals, is also under the department's jurisdiction.

      The country's military forces consist of the U.S. Army (United States Army, The), Navy (United States Navy, The) (including the Marine Corps (United States Marine Corps, The)), and Air Force (United States Air Force, The), under the umbrella of the Department of Defense, which is headquartered in the Pentagon building in Arlington county, Virginia. (A related force, the Coast Guard (United States Coast Guard), is under the jurisdiction of the Department of Homeland Security.) Conscription was ended in 1973, and since that time the United States has maintained a wholly volunteer military force; since 1980, however, all male citizens (as well as immigrant alien males) between 18 and 25 years of age have been required to register for selective service in case a draft is necessary during a crisis. The armed services also maintain reserve forces that may be called upon in time of war. Each state has a National Guard consisting of reserve groups subject to call at any time by the governor of the state.

      Because a large portion of the military budget, which generally constitutes about 15 to 20 percent of government expenditures, is spent on matériel and research and development, military programs have considerable economic and political impact. The influence of the military also extends to other countries through a variety of multilateral and bilateral treaties and organizations (e.g., the North Atlantic Treaty Organization) for mutual defense and military assistance. The United States has military bases in Africa, Asia, Europe, and Latin America.

      The National Security Act of 1947 created a coordinated command for security and intelligence-gathering activities. The act established the National Security Council (NSC) and the Central Intelligence Agency (CIA), the latter under the authority of the NSC and responsible for foreign intelligence. The National Security Agency, an agency of the Department of Defense, is responsible for cryptographic and communications intelligence. The Department of Homeland Security analyzes information gathered by the CIA and its domestic counterpart, the Federal Bureau of Investigation (FBI), to assess threat levels against the United States.

Domestic law enforcement
      Traditionally, law enforcement in the United States has been concentrated in the hands of local police officials, though the number of federal law-enforcement officers began to increase in the late 20th century. The bulk of the work is performed by police and detectives in the cities and by sheriffs and constables in rural areas. Many state governments also have law-enforcement agencies, and all of them have highway-patrol systems for enforcing traffic law.

      The investigation of crimes that come under federal jurisdiction (e.g., those committed in more than one state) is the responsibility of the FBI (Federal Bureau of Investigation), which also provides assistance with fingerprint identification and technical laboratory services to state and local law-enforcement agencies. In addition, certain federal agencies—such as the Drug Enforcement Administration of the Department of Justice and the Bureau of Alcohol, Tobacco, and Firearms of the Department of the Treasury—are empowered to enforce specific federal laws.

Health and welfare
      Despite the country's enormous wealth, poverty remains a reality for many people in the United States, though programs such as social security and Medicare (Medicare and Medicaid) have significantly reduced the poverty rate among senior citizens. In the early 21st century, more than one-tenth of the general population—and about one-sixth of children under 18 years of age—lived in poverty. About half the poor live in homes in which the head of the household is a full- or part-time wage earner. Of the others living in poverty, many are too old to work or are disabled, and a large percentage are mothers of young children. The states provide assistance to the poor in varying amounts, and the United States Department of Agriculture subsidizes the distribution of low-cost food and food stamps to the poor through the state and local governments. Unemployment assistance, provided for by the 1935 Social Security Act, is funded through worker and employer contributions.

      Increasing public concern with poverty and welfare led to new federal legislation beginning in the 1960s, especially the Great Society programs of the presidential administration of Lyndon B. Johnson (Johnson, Lyndon B.). Work, training, and rehabilitation programs were established in 1964 for welfare recipients. Between 1964 and 1969 the Office of Economic Opportunity began a number of programs, including the Head Start program for preschool children, the Neighborhood Youth Corps, and the Teacher Corps. Responding to allegations of abuse in the country's welfare system and charges that it encouraged dependency, the federal government introduced reforms in 1996, including limiting long-term benefits, requiring recipients to find work, and devolving much of the decision making to the states.

      Persons who have been employed are eligible for retirement pensions under the Social Security program, and their surviving spouses and dependent children are generally eligible for survivor benefits. Many employers provide additional retirement benefits, usually funded by worker and employer contributions. In addition, millions of Americans maintain individual retirement accounts, such as the popular 401(k) plan, which is organized by employers and allows workers (sometimes with matching funds from their employer) to contribute part of their earnings on a tax-deferred basis to individual investment accounts.

      With total health-care spending significantly exceeding $1 trillion annually, the provision of medical and health care is one of the largest industries in the United States. There are, nevertheless, many inadequacies in medical services, particularly in rural and poor areas. Some two-thirds of the population is covered by employer-based health-insurance plans, and about one-sixth of the population, including members of the armed forces and their families, receives medical care paid for or subsidized by the federal government, with that for the poor provided by Medicaid. Approximately one-sixth of the population is not covered by any form of health insurance. Though the United States spends a larger proportion of its gross domestic product (GDP) on health care than any other major industrialized country, it is the only such country that does not guarantee health-care coverage for all its citizens. During the late 20th and the early 21st century, rising health-care and prescription drug costs were major concerns for both workers and employers.

      The federal Department of Health and Human Services, through its National Institutes of Health, supports much of the biomedical research in the United States. Grants are also made to researchers in clinics and medical schools.

Housing
      About three-fifths of the housing units in the United States are detached single-family homes, and about two-thirds are owner-occupied. Most houses are constructed of wood, and many are covered with shingles or brick veneer. The housing stock is relatively modern; nearly one-third of all units have been constructed since 1980, while about one-fifth of units were built prior to 1940. The average home is relatively large, with more than two-thirds of homes consisting of five or more rooms.

      Housing has long been considered a private rather than a public concern. The growth of urban slums, however, led many municipal governments to enact stricter building codes and sanitary regulations. In 1934 the Federal Housing Administration was established to make loans to institutions that would build low-rent dwellings. However, efforts to reduce slums in large cities by developing low-cost housing in other areas were frequently resisted by local residents who feared a subsequent decline in property values. For many years the restrictive covenant, by which property owners pledged not to sell to certain racial or religious groups, served to bar those groups from many communities. In 1948 the Supreme Court declared such covenants unenforceable, and in 1962 President John F. Kennedy (Kennedy, John F.) issued an executive order prohibiting discrimination in housing built with federal aid. Since that time many states and cities have adopted fair-housing laws and set up fair-housing commissions. Nevertheless, there are considerable racial disparities in home ownership; about three-fourths of whites but only about half of Hispanics and African Americans own their housing units.

      During the 1950s and '60s large high-rise public housing units were built for low-income families in many large U.S. cities, but these often became centres of crime and unemployment, and minority groups and the poor continued to live in segregated urban ghettos (ghetto). During the 1990s and the early 21st century, efforts were made to demolish many of the housing projects and to replace them with joint public-private housing communities that would include varying income levels.

Education
      The interplay of local, state, and national programs and policies is particularly evident in education. Historically, education has been considered the province of the state and local governments. Of the approximately 4,000 colleges and universities (including branch campuses), the academies of the armed services are among the few federal institutions. (The federal government also administers, among others, the University of the Virgin Islands.) However, since 1862—when public lands were granted to the states to sell to fund the establishment of colleges of agricultural and mechanical arts, called land-grant colleges (land-grant college)—the federal government has been involved in education at all levels. Additionally, the federal government supports school lunch programs, administers American Indian education, makes research grants to universities, underwrites loans to college students, and finances education for veterans. It has been widely debated whether the government should also give assistance to private and parochial (religious) schools or tax deductions to parents choosing to send their children to such schools. Although the Supreme Court has ruled that direct assistance to parochial schools is barred by the Constitution's First Amendment—which states that “Congress shall make no law respecting an establishment of religion”—it has allowed the provision of textbooks and so-called supplementary educational centres on the grounds that their primary purpose is educative rather than religious.

      Public secondary and elementary education is free and provided primarily by local government. Education is compulsory, generally from age 7 through 16, though the age requirements vary somewhat among the states. The literacy rate exceeds 95 percent. In order to address the educational needs of a complex society, governments at all levels have pursued diverse strategies, including preschool programs, classes in the community, summer and night schools, additional facilities for exceptional children, and programs aimed at culturally deprived and disaffected students.

      Although primary responsibility for elementary education rests with local government, it is increasingly affected by state and national policies. The Civil Rights Act of 1964, for example, required federal agencies to discontinue financial aid to school districts that were not racially integrated, and in Swann v. Charlotte-Mecklenburg County (North Carolina) Board of Education (1971) the Supreme Court mandated busing to achieve racially integrated schools, a remedy that often required long commutes for African American children living in largely segregated enclaves. In the late 20th and the early 21st century, busing remained a controversial political issue, and many localities (including Charlotte) ended their busing programs or had them terminated by federal judges. In addition, the No Child Left Behind Act, enacted in 2002, increased the federal role in elementary and secondary education by requiring states to implement standards of accountability for public elementary and secondary schools.

James T. Harris Ed.

Cultural life
      The great art historian Sir Ernst Hans Josef Gombrich (Gombrich, Sir Ernst Hans Josef) once wrote that there is really no such thing as “art”; there are only artists. This is a useful reminder to anyone studying, much less setting out to try to define, anything as big and varied as the culture of the United States. For the culture that endures in any country is made not by vast impersonal forces or by unfolding historical necessities but by uniquely talented men and women, one-of-a-kind people doing one thing at a time—doing what they can, or must. In the United States, particularly, where there is no more a truly “established” art than an established religion—no real academies, no real official art—culture is where one finds it, and many of the most gifted artists have chosen to make their art far from the parades and rallies of worldly life.

 Some of the keenest students of the American arts have even come to dislike the word culture as a catchall for the plastic and literary arts, since it is a term borrowed from anthropology, with its implication that there is any kind of seamless unity to the things that writers and poets and painters have made. The art of some of the greatest American artists and writers, after all, has been made in deliberate seclusion and has taken as its material the interior life of the mind and heart that shapes and precedes shared “national” experience. It is American art before it is the culture of the United States. Even if it is true that these habits of retreat are, in turn, themselves in part traditions, and culturally shaped, it is also true that the least illuminating way to approach the poems of Emily Dickinson (Dickinson, Emily) or the paintings of Winslow Homer (Homer, Winslow), to take only two imposing instances, is as the consequence of large-scale sociological phenomena.

      Still, many, perhaps even most, American culture-makers have not only found themselves, as all Americans do, caught in the common life of their country—they have chosen to make the common catch their common subject. Their involvement with the problems they share with their neighbours, near and far, has given their art a common shape and often a common substance. And if one quarrel has absorbed American artists and thinkers more than any other, it has been that one between the values of a mass, democratic, popular culture and those of a refined elite culture accessible only to the few—the quarrel between “low” and “high.” From the very beginnings of American art, the “top down” model of all European civilization, with a fine art made for an elite class of patrons by a specialized class of artists, was in doubt, in part because many Americans did not want that kind of art, in part because, even if they wanted it, the social institutions—a court or a cathedral—just were not there to produce and welcome it. What came in its place was a commercial culture, a marketplace of the arts, which sometimes degraded art into mere commerce and at other times raised the common voice of the people to the level of high art.

      In the 20th century, this was, in some part, a problem that science left on the doorstep of the arts. Beginning at the turn of the century, the growth of the technology of mass communications—the movies, the phonograph, radio, and eventually television—created a potential audience for stories and music and theatre larger than anyone could previously have dreamed, one that allowed music and drama and pictures to reach more people than ever before. People in San Francisco could look at the latest pictures or hear the latest music from New York months, or even moments, after they were made; a great performance demanded a pilgrimage no longer than the path to a corner movie theatre. High culture had come to the American living room.

 But, though interest in a “democratic” culture that could compete with traditional high culture has grown in recent times, it is hardly a new preoccupation. One has only to read such 19th-century classics as Mark Twain (Twain, Mark)'s The Innocents Abroad (1869) to be reminded of just how long, and just how keenly, Americans have asked themselves if all the stained glass and sacred music of European culture is all it is cracked up to be, and if the tall tales and cigar-store Indians did not have more juice and life in them for a new people in a new land. Twain's whole example, after all, was to show that American speech as it was actually spoken was closer to Homer than imported finery was.

      In this way, the new machines of mass reproduction and diffusion that fill modern times, from the daguerreotype to the World Wide Web, came not simply as a new or threatening force but also as the fulfillment of a standing American dream. Mass culture seemed to promise a democratic culture: a cultural life directed not to an aristocracy but to all men and women. It was not that the new machines produced new ideals but that the new machines made the old dreams seem suddenly a practical possibility.

      The practical appearance of this dream began in a spirit of hope. Much American art at the turn of the 20th century and through the 1920s, from the paintings of Charles Sheeler (Sheeler, Charles) to the poetry of Hart Crane (Crane, Hart), hymned the power of the new technology and the dream of a common culture. By the middle of the century, however, many people recoiled in dismay at what had happened to the American arts, high and low, and thought that these old dreams of a common, unifying culture had been irrevocably crushed. The new technology of mass communications, for the most part, seemed to have achieved not a generous democratization but a bland homogenization of culture. Many people thought that the control of culture had passed into the hands of advertisers, people who used the means of a common culture just to make a buck. It was not only that most of the new music and drama that had been made for movies and radio, and later for television, seemed shallow; it was also that the high or serious culture that had become available through the means of mass reproduction seemed to have been reduced to a string of popularized hits, which concealed the real complexity of art. Culture, made democratic, had become too easy.

 As a consequence, many intellectuals and artists around the end of World War II began to try to construct new kinds of elite “high” culture, art that would be deliberately difficult—and to many people it seemed that this new work was merely difficult. Much of the new art and dance seemed puzzling and deliberately obscure. Difficult art happened, above all, in New York City. During World War II, New York had seen an influx of avant-garde artists escaping Adolf Hitler's Europe, including the painters Max Ernst (Ernst, Max), Piet Mondrian (Mondrian, Piet), and Joan Miró (Miró, Joan), as well as the composer Igor Stravinsky (Stravinsky, Igor). They imported many of the ideals of the European avant-garde, particularly the belief that art should always be difficult and “ahead of its time.” (It is a paradox that the avant-garde movement in Europe had begun, in the late 19th century, in rebellion against what its advocates thought were the oppressive and stifling standards of high, official culture in Europe and that it had often looked to American mass culture for inspiration.) In the United States, however, the practice of avant-garde art became a way for artists and intellectuals to isolate themselves from what they thought was the cheapening of standards.

      And yet this counterculture had, by the 1960s, become in large American cities an official culture of its own. For many intellectuals around 1960, this gloomy situation seemed to be all too permanent. One could choose between an undemanding low culture and an austere but isolated high culture. For much of the century, scholars of culture saw these two worlds—the public world of popular culture and the private world of modern art—as irreconcilable antagonists and thought that American culture was defined by the abyss between them.

      As the century and its obsessions closed, however, more and more scholars came to see in the most enduring inventions of American culture patterns of cyclical renewal between high and low. And as scholars have studied particular cases instead of abstract ideas, it has become apparent that the contrast between high and low has often been overdrawn. Instead of a simple opposition between popular culture and elite culture, it is possible to recognize in the prolix and varied forms of popular culture innovations and inspirations that have enlivened the most original high American culture—and to then see how the inventions of high culture circulate back into the street, in a spiraling, creative flow. In the astonishing achievements of the American jazz musicians, who took the popular songs of Tin Pan Alley and the Broadway musical and inflected them with their own improvisational genius; in the works of great choreographers like Paul Taylor and George Balanchine, who found in tap dances and marches and ballroom bebop new kinds of movement that they then incorporated into the language of high dance; in the “dream boxes” of the American avant-garde artist Joseph Cornell (Cornell, Joseph), who took for his material the mundane goods of Woolworth's and the department store and used them as private symbols in surreal dioramas: in the work of all of these artists, and so many more, we see the same kind of inspiring dialogue between the austere discipline of avant-garde art and the enlivening touch of the vernacular.

      This argument has been so widely resolved, in fact, that, in the decades bracketing the turn of the 21st century, the old central and shaping American debate between high and low has been in part replaced by a new and, for the moment, still more clamorous argument. It might be said that if the old debate was between high and low, this one is between the “centre” and the “margins.” The argument between high and low was what gave the modern era its special savour. A new generation of critics and artists, defining themselves as “postmodern,” have argued passionately that the real central issue of culture is the “construction” of cultural values, whether high or low, and that these values reflect less enduring truth and beauty, or even authentic popular taste, than the prejudices of professors. Since culture has mostly been made by white males praising dead white males to other white males in classrooms, they argue, the resulting view of American culture has been made unduly pale, masculine, and lifeless. It is not only the art of African Americans and other minorities that has been unfairly excluded from the canon of what is read, seen, and taught, these scholars argue, often with more passion than evidence; it is also the work of anonymous artists, particularly women, that has been “marginalized” or treated as trivial. This argument can conclude with a rational, undeniable demand that more attention be paid to obscure and neglected writers and artists, or it can take the strong and often irrational form that all aesthetic values are merely prejudices enforced by power. If the old debate between high and low asked if real values could rise from humble beginnings, the new debate about American culture asks if true value, as opposed to mere power, exists at all.

Literature
      Because the most articulate artists are, by definition, writers, most of the arguments about what culture is and ought to do have been about what literature is and ought to do—and this can skew our perception of American culture a little, because the most memorable American art has not always appeared in books and novels and stories and plays. In part, perhaps, this is because writing was the first art form to undergo a revolution of mass technology; books were being printed in thousands of copies, while one still had to make a pilgrimage to hear a symphony or see a painting. The basic dispute between mass experience and individual experience has been therefore perhaps less keenly felt as an everyday fact in writing in the 20th and 21st centuries than it has been in other art forms. Still, writers have seen and recorded this quarrel as a feature of the world around them, and the evolution of American writing in the past 50 years has shown some of the same basic patterns that can be found in painting and dance and the theatre.

      In the United States after World War II, many writers, in opposition to what they perceived as the bland flattening out of cultural life, made their subject all the things that set Americans apart from one another. Although for many Americans, ethnic and even religious differences had become increasingly less important as the century moved on—holiday rather than everyday material—many writers after World War II seized on these differences to achieve a detached point of view on American life. Beginning in the 1940s and '50s, three groups in particular seemed to be “outsider-insiders” who could bring a special vision to fiction: Southerners, Jews, and African Americans.

      Each group had a sense of uncertainty, mixed emotions, and stifled aspirations that lent a questioning counterpoint to the general chorus of affirmation in American life. The Southerners—William Faulkner (Faulkner, William), Eudora Welty (Welty, Eudora), and Flannery O'Connor (O'Connor, Flannery) most particularly—thought that a noble tradition of defeat and failure had been part of the fabric of Southern life since the Civil War. At a time when “official” American culture often insisted that the American story was one of endless triumphs and optimism, they told stories of tragic fate. Jewish writers—most prominently Chicago novelist Saul Bellow (Bellow, Saul), who won the Nobel Prize for Literature in 1976, Bernard Malamud (Malamud, Bernard), and Philip Roth (Roth, Philip)—found in the “golden exile” of Jews in the United States a juxtaposition of surface affluence with deeper unease and perplexity that seemed to many of their fellow Americans to offer a common predicament in a heightened form.

      For African Americans, of course, the promise of American life had in many respects never been fulfilled. “What happens to a dream deferred,” the poet Langston Hughes (Hughes, Langston) asked, and many African American writers attempted to answer that question, variously, through stories that mingled pride, perplexity, and rage. African American literature achieved one of the few unquestioned masterpieces of late 20th-century American fiction writing in Ralph Ellison (Ellison, Ralph)'s Invisible Man (1952). More recently, the rise of feminism as a political movement has given many women a sense that their experience too is richly and importantly outside the mainstream; since at least the 1960s, there has been an explosion of women's fiction, including the much-admired work of Toni Morrison (Morrison, Toni), the first African American woman to win the Nobel Prize for Literature (1993); Anne Tyler (Tyler, Anne); and Ann Beattie (Beattie, Ann).

      Perhaps precisely because so many novelists sought to make their fiction from experiences that were deliberately imagined as marginal, set aside from the general condition of American life, many other writers had the sense that fiction, and particularly the novel, might not any longer be the best way to try to record American life. For many writers the novel seemed to have become above all a form of private, interior expression and could no longer keep up with the extravagant oddities of the United States. Many gifted writers took up journalism with some of the passion for perfection of style that had once been reserved for fiction. The exemplars of this form of poetic journalism included the masters of The New Yorker (New Yorker, The) magazine, most notably A.J. Liebling, whose books included The Earl of Louisiana (1961), a study of an election in Louisiana, as well as Joseph Mitchell, who in his books The Bottom of the Harbor (1959) and Joe Gould's Secret (1965) offered dark and perplexing accounts of the life of the American metropolis. The dream of combining real facts and lyrical fire also achieved a masterpiece in the poet James Agee (Agee, James)'s Let Us Now Praise Famous Men (1941; with photographs by Walker Evans (Evans, Walker)), an account of sharecropper life in the South that is a landmark in the struggle for fact writing that would have the beauty and permanence of poetry.

      As the century continued, this genre of imaginative nonfiction (sometimes called the documentary novel or the nonfiction novel) continued to evolve and took on many different forms. In the writing of Calvin Trillin, John McPhee (McPhee, John), Neil Sheehan, and Truman Capote (Capote, Truman), all among Liebling's and Mitchell's successors at The New Yorker, this new form continued to seek a tone of subdued and even amused understatement. Tom Wolfe (Wolfe, Tom), whose influential books included The Right Stuff (1979), an account of the early days of the American space program, and Norman Mailer (Mailer, Norman), whose books included Miami and the Siege of Chicago (1968), a ruminative piece about the Republican and Democratic national conventions in 1968, deliberately took on huge public subjects and subjected them to the insights (and, many people thought, the idiosyncratic whims) of a personal sensibility.

      As the nonfiction novel often pursued extremes of grandiosity and hyperbole, the American short story assumed a previously unexpected importance in the life of American writing; the short story became the voice of private vision and private lives. The short story, with its natural insistence on the unique moment and the infrangible glimpse of something private and fragile, had a new prominence. The rise of the American short story is bracketed by two remarkable books: J.D. Salinger (Salinger, J D)'s Nine Stories (1953) and Raymond Carver (Carver, Raymond)'s collection What We Talk About When We Talk About Love (1981). Salinger inspired a generation by imagining that the serious search for a spiritual life could be reconciled with an art of gaiety and charm; Carver confirmed in the next generation their sense of a loss of spirituality in an art of taciturn reserve and cloaked emotions.

      Since Carver's death in 1988, the great novelist and man of letters John Updike (Updike, John) has remained perhaps the last undisputed master of literature in the high American sense that emerged with Ernest Hemingway (Hemingway, Ernest) and Faulkner. Yet in no area of the American arts, perhaps, have the claims of the marginal to take their place at the centre of the table been so fruitful, subtle, or varied as in literature. Perhaps because writing is inescapably personal, the trap of turning art into mere ideology has been most deftly avoided in its realm. This can be seen in the dramatically expanded horizons of the feminist and minority writers whose work first appeared in the 1970s and '80s, including the Chinese American Amy Tan (Tan, Amy). A new freedom to write about human erotic experience previously considered strange or even deviant shaped much new writing, from the comic obsessive novels of Nicholson Baker through the work of those short-story writers and novelists, including Edmund White (White, Edmund) and David Leavitt, who have made art out of previously repressed and unnarrated areas of homoerotic experience. Literature is above all the narrative medium of the arts, the one that still best relates What Happened to Me, and American literature, at least, has only been enriched by new “mes” and new narratives. (See also American literature.)

The visual arts and postmodernism
 Perhaps the greatest, and certainly the loudest, event in American cultural life since World War II was what the critic Irving Sandler has called “The Triumph of American Painting”—the emergence of a new form of art that allowed American painting to dominate the world. This dominance lasted for at least 40 years, from the birth of the so-called New York school, or Abstract Expressionism, around 1945 until at least the mid-1980s, and it took in many different kinds of art and artists. In its first flowering, in the epic-scaled abstractions of Jackson Pollock (Pollock, Jackson), Mark Rothko (Rothko, Mark), Willem de Kooning (de Kooning, Willem), and the other members of the New York school, this new painting seemed abstract, rarefied, and constructed from a series of negations, from saying “no!” to everything except the purest elements of painting. Abstract Expressionism seemed to stand at the farthest possible remove from the common life of American culture and particularly from the life of American popular culture. Even this painting, however, later came under a new and perhaps less-austere scrutiny; and the art historian Robert Rosenblum has persuasively argued that many of the elements of Abstract Expressionism, for all their apparent hermetic distance from common experience, are inspired by the scale and light of the American landscape and American 19th-century landscape painting—by elements that run deep and centrally in Americans' sense of themselves and their country.

 It is certainly true that the next generation of painters, who throughout the 1950s continued the unparalleled dominance of American influence in the visual arts, made their art aggressively and unmistakably of the dialogue between the studio and the street. Jasper Johns (Johns, Jasper), for instance, took as his subject the most common and even banal of American symbols—maps of the 48 continental states, the flag itself—and depicted the quickly read and immediately identifiable common icons with a slow, meditative, painterly scrutiny. His contemporary and occasional partner Robert Rauschenberg (Rauschenberg, Robert) took up the same dialogue in a different form; his art consisted of dreamlike collages of images silk-screened from the mass media, combined with personal artifacts and personal symbols, all brought together in a mélange of jokes and deliberately perverse associations. In a remarkably similar spirit, the eccentric surrealist Joseph Cornell (Cornell, Joseph) made little shoe-box-like dioramas in which images taken from popular culture were made into a dreamlike language of nostalgia and poetic reverie. Although Cornell, like William Blake (Blake, William), whom he in many ways resembled, worked largely in isolation, his sense of the poetry that lurks unseen in even the most absurd everyday objects had a profound effect on other artists.

      By the early 1960s, with the explosion of the new art form called Pop art, the engagement of painting and drawing with popular culture seemed so explicit as to be almost overwhelming and, at times, risked losing any sense of private life and personal inflection at all—it risked becoming all street and no studio. Artists such as Andy Warhol (Warhol, Andy), Roy Lichtenstein (Lichtenstein, Roy), and Claes Oldenburg (Oldenburg, Claes) took the styles and objects of popular culture—everything from comic books to lipstick tubes—and treated them with the absorption and grave seriousness previously reserved for religious icons. But this art too had its secrets, as well as its strong individual voices and visions. In his series of drawings called Proposals for Monumental Buildings, 1965–69, Oldenburg drew ordinary things—fire hydrants, ice-cream bars, bananas—as though they were as big as skyscrapers. His pictures combined a virtuoso's gift for drawing with a vision, at once celebratory and satirical, of the P.T. Barnum (Barnum, P.T.) spirit of American life. Warhol silk-screened images of popular movie stars and Campbell's soup cans; in replicating them, he suggested that their reiteration by mass production had emptied them of their humanity but also given them a kind of hieratic immortality. Lichtenstein used the techniques of comic-book illustration to paraphrase some of the monuments of modern painting, making a coolly witty art in which Henri Matisse (Matisse, Henri) danced with Captain Marvel.

      But these artists who self-consciously chose to make their art out of popular materials and images were not the only ones who had something to say about the traffic between mass and elite culture. The so-called minimalists, who made abstract art out of simple and usually hard-edged geometric forms, from one point of view carried on the tradition of austere abstraction. But it was also the minimalists, as art historians have pointed out, who carried over the vocabulary of the new International Style of unornamented architecture into the world of the fine arts; minimalism imagined the dialogue between street and studio in terms of hard edges and simple forms rather than in terms of imagery, but it took part in the same dialogue. In some cases, the play between high and low has been carried out as a dialogue between Pop and minimalist styles themselves. Frank Stella (Stella, Frank), thought by many to be the preeminent American painter of the late 20th century, began as a minimalist, making extremely simple paintings of black chevrons from which everything was banished except the barest minimum of painterly cues. Yet in his subsequent work he became almost extravagantly “maximalist” and, as he began to make bas-reliefs, added to the stark elegance of his early paintings wild, Pop-art elements of outthrusting spirals and Day-Glo colors—even sequins and glitter—that deliberately suggested the invigorating vulgarity of the Las Vegas Strip. Stella's flamboyant reliefs combine the spare elegance of abstraction with the greedy vitality of the American street.

      In the 1980s and '90s, it was in the visual arts, however, that the debates over postmodern marginality and the construction of a fixed canon became, perhaps, most fierce—yet, oddly, were at the same time least eloquent, or least fully realized in emotionally potent works of art. Pictures and objects do not “argue” particularly well, so the tone of much contemporary American art became debased, with the cryptic languages of high abstraction and conceptual art put in the service of narrow ideological arguments. It became a standard practice in American avant-garde art of the 1980s and '90s to experience an installation in which an inarguable social message—for instance, that there should be fewer homeless people in the streets—was encoded in a highly oblique, Surrealist manner, with the duty of the viewer then reduced to decoding the manner back into the message. The long journey of American art in the 20th century away from socially “responsible” art that lacked intense artistic originality seemed to have been short-circuited, without necessarily producing much of a gain in clarity or accessibility.

      No subject or idea has been as powerful, or as controversial, in American arts and letters at the end of the 20th century and into the new millennium as the idea of the “postmodern,” and in no sphere has the argument been as lively as in that of the plastic arts. The idea of the postmodern has been powerful in the United States exactly because the idea of the modern was so powerful; where Europe has struggled with the idea of modernity, in the United States it has been largely triumphant, thus leaving the question of “what comes next” all the more problematic. Since the 1960s, the ascendance of postmodern culture has been argued—now it is even sometimes said that a “post-postmodern” epoch has begun, but what exactly that means is remarkably vague.

      In some media, what is meant by postmodern is clear and easy enough to point to: it is the rejection of the utopian aspects of modernism, and particularly of the attempt to express that utopianism in ideal or absolute form—the kind experienced in Bauhaus architecture or in minimalist painting. Postmodernism is an attempt to muddy lines drawn falsely clear. In American architecture, for instance, the meaning of postmodern is reasonably plain. Beginning with the work of Robert Venturi (Venturi, Robert), Denise Scott Brown, and Peter Eisenman (Eisenman, Peter), postmodern architects deliberately rejected the pure forms and “truth to materials” of the modern architect and put in their place irony, ornament, historical reference, and deliberate paradox. Some American postmodern architecture has been ornamental and cheerfully cosmetic, as in the later work of Philip Johnson (Johnson, Philip C.) and the mid-1980s work of Michael Graves (Graves, Michael). Some has been demanding and deliberately challenging even to conventional ideas of spatial lucidity, as in Eisenman's Wexner Center in Columbus, Ohio. But one can see the difference just by looking.

      In painting and sculpture, on the other hand, it is often harder to know where exactly to draw the line—and why the line is drawn. In the paintings of the American artist David Salle or the photographs of Cindy Sherman, for instance, one sees apparently postmodern elements of pastiche, borrowed imagery, and deliberately “impure” collage. But all of these devices are also components of modernism and part of the heritage of Surrealism, though the formal devices of a Rauschenberg or Johns were used in a different emotional key. The true common element among the postmodern perhaps lies in a note of extreme pessimism and melancholy about the possibility of escaping from borrowed imagery into “authentic” experience. It is this emotional tone that gives postmodernism its peculiar register and, one might almost say, its authenticity.

      In literature, the postmodern is, once again, hard to separate from the modern, since many of its keynotes—for instance, a love of complicated artifice and obviously literary devices, along with the mixing of realistic and frankly fantastic or magical devices—are at least as old as James Joyce's founding modernist fictions. But certainly the expansion of possible sources, the liberation from the narrowly white male view of the world, and a broadening of testimony given and testimony taken are part of what postmodern literature has in common with other kinds of postmodern culture. It has been part of the postmodern transformation in American fiction as well to place authors previously marginalized as genre writers at the centre of attention. The African American crime writer Chester Himes, for example, has been given serious critical attention, while the strange visionary science-fiction writer Philip K. Dick was ushered, in 2007, from his long exile in paperback into the Library of America.

      What is at stake in the debates over modern and postmodern is finally the American idea of the individual. Where modernism in the United States placed its emphasis on the autonomous individual, the heroic artist, postmodernism places its emphasis on the “de-centred” subject, the artist as a prisoner, rueful or miserable, of culture. Art is seen as a social event rather than as communication between persons. If in modernism an individual artist made something that in turn created a community of observers, in the postmodern epoch the opposite is true: the social circumstance, the chain of connections that makes seeming opposites unite, keys off the artist and makes him what he is. In the work of the artist Jeff Koons, for instance—who makes nothing but has things, from kitsch figurines to giant puppies composed of flowers, made for him—this postmodern rejection of the handmade or authentic is given a weirdly comic tone, at once eccentric and humorous. It is the impurities of culture, rather than the purity of the artist's vision, that haunt contemporary art.

      Nonetheless, if the push and charge that had been so unlooked-for in American art since the 1940s seemed diminished, the turn of the 21st century was a rich time for second and even third acts. Richard Serra, John Baldessari, Elizabeth Murray, and Chuck Close were all American artists who continued to produce arresting, original work—most often balanced on that fine knife edge between the blankly literal and the disturbingly metaphoric—without worrying overmuch about theoretical fashions or fashionable theory.

      As recently as the 1980s, most surveys of American culture might not have thought photography of much importance. But at the turn of the century, photography began to lay a new claim to attention as a serious art form. For much of the first half of the 20th century, the most remarkable American photographers had, on the whole, tried to make photography into a “fine art” by divorcing it from its ubiquitous presence as a recorder of moments and by splicing it onto older, painterly traditions. A clutch of gifted photographers, however, have, since the end of World War II, been able to transcend the distinction between media image and aesthetic object—between art and photojournalism—to make from a single, pregnant moment a complete and enduring image. Walker Evans, Margaret Bourke-White, and Robert Frank (the latter, like so many artists of the postwar period, an immigrant), for instance, rather than trying to make of photography something as calculated and considered as the traditional fine arts, found in the instantaneous vision of the camera something at once personal and permanent. Frank's book The Americans (1956), the record of a tour of the United States that combined the sense of accident of a family slide show with a sense of the ominous worthy of the Italian painter Giorgio de Chirico, was the masterpiece of this vision; and no work of the postwar era was more influential in all fields of visual expression. Robert Mapplethorpe, Diane Arbus, and, above all, Richard Avedon and Irving Penn, who together dominated both fashion and portrait photography for almost half a century and straddled the lines between museum and magazine, high portraiture and low commercials, all came to seem, in their oscillations between glamour and gloom, exemplary of the predicaments facing the American artist.

The theatre
 Perhaps more than any other art form, the American theatre suffered from the invention of the new technologies of mass reproduction. Where painting and writing could choose their distance from (or intimacy with) the new mass culture, many of the age-old materials of the theatre had by the 1980s been subsumed by movies and television. What the theatre could do that could not be done elsewhere was not always clear. As a consequence, the Broadway theatre—which in the 1920s had still seemed a vital area of American culture and, in the high period of the playwright Eugene O'Neill, a place of cultural renaissance—had by the end of the 1980s become very nearly defunct. A brief and largely false spring had taken place in the period just after World War II. Tennessee Williams and Arthur Miller, in particular, both wrote movingly and even courageously about the lives of the “left-out” Americans, demanding attention for the outcasts of a relentlessly commercial society. Viewed from the 21st century, however, both seem more traditional and less profoundly innovative than their contemporaries in the other arts, more tied to the conventions of European naturalist theatre and less inclined or able to renew and rejuvenate the language of their form.

 Also much influenced by European models, though in his case by the absurdist theatre of Eugène Ionesco and Samuel Beckett, was Edward Albee, the most prominent American playwright of the 1960s. As Broadway's dominance of the American stage waned in the 1970s, regional theatre took on new importance, and cities such as Chicago, San Francisco, and Louisville, Ky., provided significant proving grounds for a new generation of playwrights. On those smaller but still potent stages, theatre continues to speak powerfully. An African American renaissance in the theatre has taken place, with its most notable figure being August Wilson, whose 1985 play Fences won the Pulitzer Prize. And, for the renewal and preservation of the American language, there is still nothing to equal the stage: David Mamet, in his plays, among them Glengarry Glen Ross (1983) and Speed-the-Plow (1988), both caught and created an American vernacular—verbose, repetitive, obscene, and eloquent—that combined the local colour of Damon Runyon and the bleak truthfulness of Harold Pinter. The one completely original American contribution to the stage, the musical theatre, blossomed in the 1940s and '50s in the works of Frank Loesser (especially Guys and Dolls, which the critic Kenneth Tynan regarded as one of the greatest of American plays) but became heavy-handed and exists at the beginning of the 21st century largely as a revival art and in the brave “holdout” work of composer and lyricist Stephen Sondheim (Company, Sweeney Todd, and Into the Woods).

Motion pictures
 In some respects the motion picture is the American art form par excellence, and no area of art has undergone a more dramatic revision in critical appraisal in the recent past. Throughout most of the 1940s and '50s, serious critics—even those who took the cinema seriously as a potential artistic medium—took it for granted, with a few honourable exceptions (notably James Agee and Manny Farber), that, excepting the work of D.W. Griffith and Orson Welles, the commercial Hollywood movie was, judged as art, hopelessly compromised by commerce. In the 1950s in France, however, a generation of critics associated with the magazine Cahiers du cinéma (many of whom later would become well-known filmmakers themselves, including François Truffaut and Claude Chabrol) argued that the American commercial film, precisely because its need to please a mass audience had helped it break out of the limiting gentility of the European cinema, had a vitality and, even more surprisingly, a set of master-makers (auteurs) without equal in the world. New studies and appreciations of such Hollywood filmmakers as John Ford, Howard Hawks, and William Wyler resulted. Imported back into the United States, this reevaluation changed and amended preconceptions that had hardened into prejudices—another demonstration that one country's low art can become another country's high art.

 The new appreciation of the individual vision of the Hollywood film was to inspire a whole generation of young American filmmakers, including Francis Ford Coppola, Martin Scorsese, and George Lucas, to attempt to use the commercial film as at once a form of personal expression and a means of empire building, with predictably mixed results. By the end of the century, another new wave of filmmakers (notably Spike Lee and Steven Soderbergh), like the previous generation mostly trained in film schools, had graduated from independent filmmaking to the mainstream, and the American tradition of film comedy stretching from Buster Keaton and Charlie Chaplin to Billy Wilder, Preston Sturges, and Woody Allen had come to include the quirky sensibilities of Joel and Ethan Coen and Wes Anderson. In mixing a kind of eccentric, off-focus comedy with a private, screw-loose vision, they came close to defining another kind of postmodernism, one that was as antiheroic as the more academic sort but cheerfully self-possessed in tone. As the gap widened between big studio-made entertainment—produced for vast international audiences—and the small “art” or independent film, the best of the independents came to have the tone and idiosyncratic charm of good small novels: Nicole Holofcener's Lovely & Amazing (2001) or Kenneth Lonergan's You Can Count on Me (2000) reached audiences that felt bereft by the steady run of Batmans and Lethal Weapons. But with that achievement came a sense too that the audience for such serious work as Francis Ford Coppola's Godfather films and Chinatown (1974), which had been intact as late as the 1970s, had fragmented beyond recomposition.

Television
 If the Martian visitor beloved of anthropological storytelling were to visit the United States at the beginning of the 21st century, all of the art forms listed and enumerated here—painting and sculpture and literature, perhaps even motion pictures and popular music—would seem like tiny minority activities compared with the great gaping eye of American life: “the box,” television. Since the mid-1950s, television has been more than just the common language of American culture; it has been a common atmosphere. For many Americans television is not the chief manner of interpreting reality but a substitute for it, a wraparound simulated experience that has come to be more real than reality itself. Indeed, beginning in the 1990s, American television was inundated with a spate of “reality” programs, a wildly popular format that employed documentary techniques to examine “ordinary” people placed in unlikely situations, from the game-show structure of Survivor (marooned contestants struggling for supremacy) to courtroom and police shows such as The People's Court and Cops, to American Idol, the often caustically judged talent show that made instant stars of some of its contestants. Certainly, no medium—not even motion pictures at the height of their popular appeal in the 1930s—has created so much hostility, fear, and disdain in some “right-thinking” people. Television has been dismissed as “chewing gum for the eyes” and was famously characterized as “a vast wasteland” in 1961 by Newton Minow, then chairman of the Federal Communications Commission. When someone in the movies is meant to be shown living a life of meaningless alienation, he is usually shown watching television.

      Yet television itself is, of course, no one thing, nor, despite the many efforts since the time of the Canadian philosopher Marshall McLuhan to define its essence, has it been shown to have a single nature that deforms the things it shows. Television can be everything from Monday Night Football to the Persian Gulf War's Operation Desert Storm to Who Wants to Be a Millionaire? The curious thing, perhaps, is that, unlike motion pictures, where unquestioned masters and undoubted masterpieces and a language of criticism had already emerged, television still waits for a way to be appreciated. Television is the dominant contemporary cultural reality, but it is still in many ways the poor relation. (It is not unusual for magazines and newspapers that keep on hand three art critics to have but one part-time television reviewer—in part because the art critic is in large part a cultural broker, a “cultural explainer,” and few think that television needs to be explained.)

      When television first appeared in the late 1940s, it threatened to be a “ghastly gelatinous nirvana,” in James Agee's memorable phrase. Yet the 1950s, the first full decade of television's impact on American life, was called then, and is still sometimes called, a “Golden Age.” Serious drama, inspired comedy, and high culture all found a place in prime-time programming. From Sid Caesar to Lucille Ball, the performers of this period retain a special place in American affections. Yet in some ways these good things were derivative of other, older media, adaptations of the manner and styles of theatre and radio. It was perhaps only in the 1960s that television came into its own, not just as a way of showing things in a new way but as a way of seeing things in a new way. Events as widely varied in tone and feeling as the broadcast of the Olympic Games and the assassination and burial of Pres. John F. Kennedy—extended events that took place in real time—brought the country together around a set of shared, collective images and narratives that often had neither an “author” nor an intended point or moral. The Vietnam War became known as the “living room war” because images (though still made on film) were broadcast every night into American homes; later conflicts, such as the Persian Gulf War and the Iraq War, were actually brought live and on direct video feed from the site of the battles into American homes. Lesser but still compelling live events, from the marriage of Charles, prince of Wales, and Lady Diana Spencer to the pursuit of then murder suspect O.J. Simpson in his white Bronco by the Los Angeles police in 1994, came to have the urgency and shared common currency that had once belonged exclusively to high art.
From ordinary television viewers to professors of the new field of cultural studies, many Americans sought in live televised events the kind of meaning and significance that they had once thought it possible to find only in highly wrought and artful myth. Beginning in the late 1960s with CBS's 60 Minutes, this epic quality also informed the TV newsmagazine; presented with an in-depth approach that emphasized narrative drama, the personality of the presenters as well as the subjects, and muckraking and malfeasance, it became one of television's most popular and enduring formats.

      Even in the countless fictional programs that filled American evening television, a sense of spontaneity and immediacy seemed to be sought and found. Though television produced many stars and celebrities, they lacked the aura of distance and glamour that had once attached to the great performers of the Hollywood era. Yet if this implied a certain diminishment in splendour, it also meant that, particularly as American film became more and more dominated by the demands of sheer spectacle, a space opened on television for a more modest and convincing kind of realism. Television series, comedy and drama alike, now play the role that movies played in the earlier part of the century or that novels played in the 19th century: they are the modest mirror of their time, where Americans see, in forms stylized or natural, the best image of their own manners. The most acclaimed of these series—whether produced for broadcast television and its diminishing market share (thirtysomething, NYPD Blue, and Seinfeld) or the creations of cable providers (The Sopranos and Six Feet Under)—seem as likely to endure as popular storytelling as any literature made in the late 20th and early 21st centuries.

Popular music
      Every epoch since the Renaissance has had an art form that seems to become a kind of universal language, one dominant artistic form and language that sweeps the world and becomes the common property of an entire civilization, from one country to another. Italian painting in the 15th century, German music in the 18th century, or French painting in the 19th and early 20th centuries—all of these forms seem to transcend their local sources and become the one essential soundscape or image of their time. Johann Sebastian Bach and George Frideric Handel, like Claude Monet and Édouard Manet, are local and more.

      At the beginning of the 21st century, and seen from a worldwide perspective, it is the American popular music that had its origins among African Americans at the end of the 19th century that, in all its many forms—ragtime, jazz, swing, jazz-influenced popular song, blues, rock and roll and its art legacy as rock, and later hip-hop—has become America's greatest contribution to the world's culture, the one indispensable and unavoidable art form of the 20th century.

      The recognition of this fact was a long time coming and has had to battle prejudice and misunderstanding that continues today. Indeed, jazz-inspired American popular music has not always been well served by its own defenders, who have tended to romanticize rather than explain and describe. In broad outlines, the history of American popular music is often told as the adulteration of a “pure” form of folk music, largely inspired by the work and spiritual and protest music of African Americans. But it involves less the adulteration of those pure forms by commercial motives and commercial sounds than the constant, fruitful hybridization of folk forms by other sounds, other musics—art and avant-garde and purely commercial, Bach and Broadway meeting at Birdland. Most of the watershed years turn out to be permeable; as the man who is by now recognized by many as the greatest of all American musicians, Louis Armstrong, once said, “There ain't but two kinds of music in this world. Good music and bad music, and good music you tap your toe to.”

 Armstrong's own career is a good model of the nature and evolution of American popular music at its best. Beginning in impossibly hard circumstances, he took up the trumpet at a time when it was the military instrument, filled with the marching sounds of another American original, John Philip Sousa. On the riverboats and in the brothels of New Orleans, as the protégé of King Oliver, Armstrong learned to play a new kind of syncopated ensemble music, decorated with solos. By the time he traveled to Chicago in the mid-1920s, his jazz had become a full-fledged art music, “full of a melancholy and majesty that were new to American music,” as Whitney Balliett has written. The duets he played with the renowned pianist Earl Hines, such as the 1928 version of "Weather Bird," have never been equaled in surprise and authority. This art music in turn became a kind of commercial or popular music, commercialized by the swing bands that dominated American popular music in the 1930s, one of which Armstrong fronted himself, becoming a popular vocalist who in turn influenced such white pop vocalists as Bing Crosby. The decline of the big bands led Armstrong back to a revival of his own earlier style, and, at the end, when he was no longer able to play the trumpet, he became, ironically, a still more celebrated straight “pop” performer, making hits out of Broadway tunes, among them the German-born Kurt Weill's "Mack the Knife" and Jerry Herman's "Hello, Dolly." Throughout his career, Armstrong engaged in a constant cycle of creative crossbreeding—Sousa and the blues and Broadway each adding its own element to the mix.

 By the 1940s, the craze for jazz as a popular music had begun to recede, and jazz itself evolved into an art music. Duke Ellington, considered by many the greatest American composer, assembled a matchless band to play his ambitious and inimitable compositions, and by the 1950s jazz had become dominated by such formidable and uncompromising creators as Miles Davis and John Lewis of the Modern Jazz Quartet.

      Beginning in the 1940s, it was the singers whom jazz had helped spawn—those who used microphones in place of pure lung power and who adapted the Viennese operetta-inspired songs of the great Broadway composers (who had, in turn, already been changed by jazz)—who became the bearers of the next dominant American style. Simply to list their names is to evoke a social history of the United States since World War II: Frank Sinatra, Nat King Cole, Mel Tormé, Ella Fitzgerald, Billie Holiday, Doris Day, Sarah Vaughan, Peggy Lee, Joe Williams, Judy Garland, Patsy Cline, Willie Nelson, Tony Bennett, and many others. More than any other single form or sound, it was their voices that created a national soundtrack of longing, fulfillment, and forever-renewed hope that sounded like America to Americans, and then sounded like America to the world.

 July 1954 is generally credited as the next watershed in the evolution of American popular music, when a recent high-school graduate and truck driver named Elvis Presley went into the Memphis Recording Service and recorded a series of songs for a small label called Sun. An easy, swinging mixture of country music, rhythm and blues, and pop ballad singing, these were, if not the first, then the seminal recordings of a new music that, it is hardly an exaggeration to say, would make all other kinds of music in the world a minority taste: rock and roll. What is impressive in retrospect is that, like Armstrong's leap a quarter century before, this was less the sudden shout of a new generation coming into being than, once again, the self-consciously eclectic manufacture of a hybrid thing. According to Presley's biographer Peter Guralnick, Presley and Sam Phillips, Sun's owner, knew exactly what they were doing when they blended country style, white pop singing, and African American rhythm and blues. What was new was the mixture, not the act of mixing.

      The subsequent evolution of this music into the single musical language of the last quarter of the 20th century hardly needs to be told—like jazz, it showed an even more accelerated evolution from folk to pop to art music, though, unlike jazz, this was an evolution that depended on new machines and technologies for the DNA of its growth. Where even the best-selling recording artists of the earlier generations had learned their craft in live performance, Presley was a recording artist before he was a performing one, and the British musicians who would feed on his innovations knew him first and best through records (and, in the case of the Beatles particularly, made their own innovations in the privacy of the recording studio). Yet once again, the lines between the new music and the old—between rock and roll and the pop and jazz that came before it—can be, and often are, much too strongly drawn. Instead, the evolution of American popular music has been an ongoing dialogue between past and present—between the African-derived banjo and bluegrass, Beat poets and bebop—that brought together the most heartfelt interests of poor black and white Americans in ways that Reconstruction could not, its common cause replaced for working-class whites by supremacist diversions. It became, to use Greil Marcus's phrase, an Invisible Republic, not only where Presley chose to sing Arthur (“Big Boy”) Crudup's song "That's All Right Mama" but where Chuck Berry, a brown-eyed handsome man (his own segregation-era euphemism), revved up Louis Jordan's jump blues to turn "Ida Red," a country-and-western ditty, into "Maybellene," along the way inventing a telegraphic poetry that finally coupled adolescent love and lust.
It was a crossroads where Delta bluesman Robert Johnson, more often channeled as a guitarist and singer, wrote songs that were as much a part of the musical education of Bob Dylan as were those of Woody Guthrie and Weill.

      Coined in the 1960s to describe a new form of African American rhythm and blues, a single, strikingly American descriptive term encompasses this extraordinary flowering of creativity: soul music. All good American popular music, from Armstrong forward, can fairly be called soul music, not only in the sense of emotional directness but with the stronger sense that great emotion can be created within simple forms and limited time, that the crucial contribution of soul is, perhaps, a willingness to surrender to feeling rather than calculating it, to appear effortless even at the risk of seeming simpleminded—to surrender to plain form, direct emotion, unabashed sentiment, and even what in more austere precincts of art would be called sentimentality. What American soul music, in this broad, inclusive sense, has, and what makes it matter so much in the world, is the ability to generate emotion without seeming to engineer emotion—to sing without seeming to sweat too much. The test of the truth of this new soulfulness is, however, its universality. Revered and catalogued in France and imitated in England, this American soul music is adored throughout the world.

      It is, perhaps, necessary for an American to live abroad to grasp how entirely American soul music had become the model and template for a universal language of emotion by the end of the 20th century. And for an American abroad, perhaps what is most surprising is how, for all the national reputation for energy, vim, and future-focused forgetfulness, the best of all this music—from the mournful majesty of Armstrong to the heartaching quiver of Presley—has a small-scale plangency and plaintive emotion that belies the national reputation for the overblown and hyperbolic. In every sense, American culture has given the world the gift of the blues.

Dance
 Serious dance hardly existed in the United States in the first half of the 20th century. One remarkable American, Isadora Duncan, had played as large a role at the turn of the century and after as anyone in the emancipation of dance from the rigid rules of classical ballet into a form of intense and improvisatory personal expression. But most of Duncan's work was done and her life spent in Europe, and she bequeathed to the American imagination a shining, influential image rather than a set of steps. Ruth St. Denis and Ted Shawn, throughout the 1920s, kept dance in America alive; but it was in the work of the choreographer Martha Graham that the tradition of modern dance in the United States that Duncan had invented found its first and most influential master. Graham's work, like that of her contemporaries among the Abstract Expressionist painters, sought a basic, timeless vocabulary of primal expression; but even after her own work seemed to belong only to a period, in the most direct sense she founded a tradition: a Graham dancer, Paul Taylor, became the most influential modern dance master of the next generation, and a Taylor dancer, Twyla Tharp, in turn the most influential choreographer of the generation after that. Where Graham had deliberately turned her back on popular culture, however, both Taylor and Tharp, typical of their generations, viewed it quizzically, admiringly, and hungrily. Whether the low inspiration comes from music—as in Tharp's Sinatra Songs, choreographed to recordings by Frank Sinatra and employing and transforming the language of the ballroom dance—or comes directly off the street—as in a famous section of Taylor's dance Cloven Kingdom, in which the dancers' movement is inspired by the way Americans walk and strut and fight—both Taylor and Tharp continue to feed upon popular culture without being consumed by it. Perhaps for this reason, their art continues to seem of increasing stature around the world; they are intensely local yet greatly prized elsewhere.

 A similar arc can be traced from the contributions of African American dance pioneers Katherine Dunham, beginning in the 1930s, and Alvin Ailey, who formed his own company in 1958, to Savion Glover, whose pounding style of tap dancing, known as “hitting,” was the rage of Broadway in the mid-1990s with Bring in 'Da Noise, Bring in 'Da Funk.

      George Balanchine, the choreographer who dominated the greatest of American ballet troupes, the New York City Ballet, from its founding in 1946 as the Ballet Society until his death in 1983, might be considered outside the bounds of purely “American” culture. Yet this only serves to remind us of how limited and provisional such national groupings must always be. For, though Mr. B., as he was always known, was born and educated in Russia and took his inspiration from a language of dance codified in France in the 19th century, no one has imagined the gestures of American life with more verve, love, or originality. His was an art made with every window in the soul open: to popular music (he choreographed major classical ballets to Sousa marches and George Gershwin songs) as well as to austere and demanding American classical music (as in Ivesiana, his work choreographed to the music of Charles Ives). He created new standards of beauty for both men and women dancers (and, not incidentally, helped spread those new standards of athletic beauty into the culture at large) and invented an audience for dance in the United States where none had existed before. By the end of his life, this Russian-born choreographer, who spoke all his life with a heavy accent, was perhaps the greatest and certainly among the most American of all artists.

Sports
      In many countries, the inclusion of sports, and particularly spectator sports, as part of “culture,” as opposed to the inclusion of recreation or medicine, would seem strange, even dubious. But no one can make sense of the culture of the United States without recognizing that Americans are crazy about games—playing them, watching them, and thinking about them. In no country have sports, especially commercialized, professional spectator sports, played so central a role as they have in the United States. Italy and England have their football (soccer) fanatics; the World Cups of rugby and cricket attract endless interest from the West Indies to Australia; but only in the United States do spectator sports, from “amateur” college (gridiron) football and basketball to the four major professional leagues—hockey, basketball, football, and baseball—play such a large role as a source of diversion, commerce, and, above all, shared common myth. In watching men (and sometimes women) play ball and comparing it with the way other men have played ball before, Americans have found their “proto-myth,” a shared common romantic culture that unites them in ways that merely procedural laws cannot.

      Sports are central to American culture in two ways. First, they are themselves a part of the culture, binding, unifying theatrical events that bring together cities, classes, and regions not only in a common cause, however cynically conceived, but in shared experience. They have also provided essential material for culture, the means for writing and movies and poetry. If there is a “Matter of America” in the way that the King Arthur stories were the “Matter of Britain” and La Chanson de Roland (Chanson de Roland, La) the “Matter of France,” then it lies in the lore of professional sports and, perhaps, above all in the lore of baseball.

       Baseball, more than any other sport played in the United States, remains the central national pastime and seems to attract mythmakers as Troy attracted poets. Some of the mythmaking has been naive or fatuous—onetime Major League Baseball commissioner Bartlett Giamatti wrote a book called Take Time for Paradise, finding in baseball a powerful metaphor for the time before the Fall. But the myths of baseball remain powerful even when they are not aided, or adulterated, by too-self-conscious appeals to poetry. The rhythm and variety of the game, the way in which its meanings and achievements depend crucially on a context, a learned history—the way that every swing of Hank Aaron (Aaron, Hank) was bound by the ghost of every swing by Babe Ruth (Ruth, Babe)—have served generations of Americans as their first contact with the nature of aesthetic experience, which, too, always depends on context and a sense of history, on what things mean in relation to other things that have come before. It may not be necessary to understand baseball to understand the United States, as someone once wrote, but it may be that many Americans get their first ideas about the power of the performing arts by seeing the art with which baseball players perform.

 Although baseball, with the declining and violent sport of boxing, remains by far the most literary of all American games, in recent decades it has been basketball—a sport invented as a small-town recreation more than a century ago and turned on American city playgrounds into the most spectacular and acrobatic of all team sports—that has attracted the most eager followers and passionate students. If baseball has provided generations of Americans with their first glimpse of the power of aesthetic context to make meaning—of the way that what happened before makes sense out of what happens next—then a new generation of spectators has often gotten its first essential glimpse of the poetry implicit in dance and sculpture, the unlimitable expressive power of the human body in motion, by watching such inimitable performers as Julius Erving (Erving, Julius), Magic Johnson (Johnson, Magic), and Michael Jordan (Jordan, Michael), a performer who, at the end of the 20th century, seemed to transcend not merely the boundaries between sport and art but even those between reality and myth, as larger-than-life as Paul Bunyan (Bunyan, Paul) and as iconic as Bugs Bunny, with whom he even shared the motion picture screen (Space Jam [1996]).

      By the beginning of the 21st century, the Super Bowl, professional football's championship game, American sports' gold standard of hype and commercial synergy, and the august “October classic,” Major League Baseball's World Series, had been surpassed for many as a shared event by college basketball's national championship. Mirroring a similar phenomenon on the high-school and state level, known popularly as March Madness, this single-elimination tournament whose early rounds feature David versus Goliath matchups and television coverage that shifts between a bevy of regional venues not only has been statistically proved to reduce the productivity of the American workers who monitor the progress of their brackets (predictions of winners and pairings on the way to the Final Four) but for a festive month both reminds the United States of its vanishing regional diversity and transforms the country into one gigantic community. In a similar way, the growth of fantasy baseball and football leagues—in which the participants “draft” real players—has created small communities while offering an escape, at least in fantasy, from the increasingly cynical world of commercial sports.

Audiences
      Art is made by artists, but it is possible only with audiences; and perhaps the most worrying trait of American culture in the past half century, with high and low dancing their sometimes happy, sometimes challenging dance, has been the threatened disappearance of a broad middlebrow audience for the arts. Many magazines (magazine) that had helped sustain a sense of community and debate among educated readers—Collier's, The Saturday Evening Post, Look—had all stopped publishing by the late 20th century or continued only as a newspaper insert (Life). Others, including Harper's and the Atlantic Monthly, continue principally as philanthropies.

      As the elephantine growth and devouring appetite of television have reduced the middle audience, there has also been a concurrent growth in the support of the arts in the university. The public support of higher education in the United States, although its ostensible purposes were often merely pragmatic and intended simply to produce skilled scientific workers for industry, has had the perhaps unintended effect of making the universities into cathedrals of culture. The positive side of this development should never be overlooked; things that began as scholarly pursuits—for instance, the enthusiasm for authentic performances of early music—have, after their incubation in the academy, given pleasure to increasingly larger audiences. The growth of the universities has also, for good or ill, helped decentralize culture; the Guthrie Theater in Minnesota, for instance, or the regional opera companies of St. Louis, Mo., and Santa Fe, N.M., are difficult to imagine without the support and involvement of local universities. But many people believe that the “academicization” of the arts has also had the negative effect of encouraging art made by college professors for other college professors. In literature, some people believe, for instance, this has led to the development of a literature that is valued less for its engagement with the world than for its engagement with other kinds of writing.

      Yet a broad, middle-class audience for the arts, if it is endangered, continues to flourish too. The establishment of the Lincoln Center for the Performing Arts in the early 1960s provided a model for subsequent centres across the country, including the John F. Kennedy Center for the Performing Arts in Washington, D.C., which opened in 1971. It is sometimes said, sourly, that the audiences who attend concerts and recitals at these centres are mere “consumers” of culture, rather than people engaged passionately in the ongoing life of the arts. But it seems probable that the motives that lead Americans to the concert hall or opera house are just as mixed as they have been in every other historical period: a desire for prestige, a sense of duty, and real love of the form all commingled together.

      The deeper problem that has led to one financial crisis after another for theatre companies and dance troupes and museums (the Twyla Tharp dance company, despite its worldwide reputation, for instance, and a popular orientation that included several successful seasons on Broadway, was compelled to survive only by being absorbed into American Ballet Theatre) rests on hard and fixed facts about the economics of the arts, and about the economics of the performing arts in particular. Ballet, opera, symphony, and drama are labour-intensive industries in an era of labour-saving devices. Other industries have remained competitive by substituting automated labour for human labour; but, for all that new stage devices can help cut costs, the basic demands of the old art forms are hard to alter. The corps of a ballet cannot be mechanized or stored on software; voices belong to singers, and singers cannot be replicated. Many Americans, accustomed to the simple connection between popularity and financial success, have had a hard time grasping this fact; perhaps this is one of the reasons for the uniquely impoverished condition of government funding for the arts in the United States.

      First the movies, then broadcast television, then cable television, and now the Internet—again and again, some new technology promises to revolutionize the delivery systems of culture and therefore change culture with it. Promising at once a larger audience than ever before (a truly global village) and a smaller one (e.g., tiny groups interested only in Gershwin having their choice today of 50 Gershwin Web sites), the Internet is only the latest of these candidates. Cable television, the most trumpeted of the more recent mass technologies, has so far failed sadly to multiply the opportunities for new experience of the arts open to Americans. The problem of the “lowest common denominator” is not that it is low but that it is common. It is not that there is no audience for music and dance and jazz. It is that a much larger group is interested in sex and violent images and action, and therefore the common interest is so easy to please.

      Yet the growing anxiety about the future of the arts reflects, in part, the extraordinary demands Americans have come to make on them. No country has ever before, for good or ill, invested so much in the ideal of a common culture; the arts for most Americans are imagined as therapy, as education, as a common inheritance, as, in some sense, the definition of life itself and the summum bonum. Americans have increasingly asked art to play the role that religious ritual played in older cultures.

      The problem of American culture in the end is inseparable from the triumph of liberalism and of the free-market, largely libertarian social model that, at least for a while at the end of the 20th century, seemed entirely ascendant and which much of the world, despite understandable fits and starts, emulated. On the one hand, liberal societies create liberty and prosperity and abundance, and the United States, as the liberal society par excellence, has not only given freedom to its own artists but allowed artists from elsewhere, from John James Audubon (Audubon, John James) to Marcel Duchamp (Duchamp, Marcel), to exercise their freedom: artists, however marginalized, are free in the United States to create weird forms, new dance steps, strange rhythms, free verse, and inverted novels.

      At the same time, however, liberal societies break down the consensus, the commonality, and the shared viewpoint that is part of what is meant by traditional culture, and what is left that is held in common is often common in the wrong way. The division between mass product and art made for small and specific audiences has perhaps never seemed so vast as it does at the dawn of the new millennium, and the odds of leaping past the divisions into common language or even merely a decent commonplace civilization have never seemed greater. Even those who are generally enthusiastic about the democratization of culture in American history are bound to find a catch in their throat of protest or self-doubt as they watch bad television reality shows become still worse or bad comic-book movies become still more dominant. The appeal of the lowest common denominator, after all, does not mean that all the people who are watching something have no other or better interests; it just means that the one thing they can all be interested in at once is this kind of thing.

      Liberal societies create freedoms and end commonalities, and that is why they are both praised for their fertility and condemned for their pervasive alienation of audiences from artists, and of art from people. The history of the accompanying longing for authentic community may be a dubious and even comic one, but anyone who has spent a night in front of a screen watching the cynicism and proliferation of gratuitous violence and sexuality at the root of much of what passes for entertainment for most Americans cannot help but feel a little soul-deadened. In this way, as the 21st century began, the cultural paradoxes of American society—the constant oscillation between energy and cynicism, the capacity to make new things and the incapacity to protect the best of tradition—seemed likely not only to become still more evident but also to become the ground for the worldwide debate about the United States itself. Still, if there were not causes for triumph, there were grounds for hope.

      It is in the creative life of Americans that all the disparate parts of American culture can, for the length of a story or play or ballet, at least, come together. What is wonderful, and perhaps special, in the culture of the United States is that the marginal and central, like the high and the low, are not in permanent battle but instead always changing places. The sideshow becomes the centre ring of the circus, the thing repressed the thing admired. The world of American culture, at its best, is a circle, not a ladder. High and low link hands.

Adam Gopnik

History
 The territory represented by the continental United States had, of course, been discovered, perhaps several times, before the voyages of Christopher Columbus (Columbus, Christopher). When Columbus arrived, he found the New World inhabited by peoples who in all likelihood had originally come from the continent of Asia. Probably these first inhabitants had arrived 20,000 to 35,000 years before in a series of migrations from Asia to North America by way of the Bering Strait. By the time the first Europeans appeared, the indigenous people (commonly referred to as Indians (American Indian)) had spread and occupied all portions of the New World.

      The foods and other resources available in each physiographic region largely determined the type of culture prevailing there. Fish and sea mammals, for example, contributed the bulk of the food supply of coastal peoples, although the acorn was a staple for California Indians (California Indian); plant life and wild game (especially the American bison, or buffalo) were sources for the Plains Indians (Plains Indian); and small-game hunting and fishing (depending again on local resources) provided for Midwestern and Eastern American Indian groups. These foods were supplemented by corn (maize), which was a staple food for the Indians of the Southwest. The procurement of these foods called for the employment of fishing, hunting, plant and berry gathering, and farming techniques, the application of which depended, in turn, upon the food resources utilized in given areas.

 Foods and other raw materials likewise conditioned the material culture of the respective regional groups. All Indians transported goods by human carrier; the use of dogs to pull sleds or travois was widespread; and rafts, boats, and canoes were used where water facilities were available. The horse, imported by the Spanish in the early 16th century, was quickly adopted by the Indians once it had made its appearance. Notably, it came to be used widely by the buffalo-hunting Indians of the Great Plains.

      American Indian culture groups were distinguished, among other ways, by house types. Dome-shaped ice houses (igloos) were developed by the Eskimos (Eskimo) (called Inuit in Canada) in what would become Alaska; rectangular plank houses were produced by the Northwest Coast Indians (Northwest Coast Indian); earth and skin lodges and tepees, by plains and prairie tribes; flat-roofed and often multistoried houses, by some of the Pueblo Indians of the Southwest; and barrel houses, by the Northeast Indians (Northeast Indian). Clothing, or the lack of it, likewise varied with native groups, as did crafts, weapons, and tribal economic, social, and religious customs.

      At the time of Columbus's arrival there were probably roughly 1.5 million American Indians in what is now the continental United States, although estimates vary greatly. In order to assess the role and the impact of the American Indian upon the subsequent history of the United States in any meaningful way, one must understand the differentiating factors between Native American peoples, such as those mentioned above. Generally speaking, it may be said, however, that the American Indians as a whole exercised an important influence upon the civilization transplanted from Europe to the New World. Indian foods and herbs, articles of manufacture, methods of raising some crops, war techniques, words, a rich folklore, and ethnic infusions are among the more obvious general contributions of the Indians to their European conquerors. The protracted and brutal westward-moving conflict caused by “white” expansionism and Indian resistance constitutes one of the most tragic chapters in the history of the United States.

Oscar O. Winther

Colonial America to 1763
The European background (European exploration)
      The English colonization of North America was but one chapter in the larger story of European expansion throughout the globe. The Portuguese, beginning with a voyage to Porto Santo off the coast of West Africa in 1418, were the first Europeans to promote overseas exploration and colonization. By 1487 the Portuguese had traveled all the way to the southern tip of Africa, establishing trading stations at Arguin, Sierra Leone, and El Mina. In 1497 Vasco da Gama (Gama, Vasco da) rounded the Cape of Good Hope (Good Hope, Cape of) and sailed up the eastern coast of Africa, laying the groundwork for Portugal's later commercial control of India. By 1500, when Pedro Álvares Cabral (Cabral, Pedro Álvares) stumbled across the coast of Brazil en route to India, Portuguese influence had expanded to the New World as well.

      Though initially lagging behind the Portuguese in the arts of navigation and exploration, the Spanish quickly closed that gap in the decades following Columbus's voyages to America. First in the Caribbean and then in spectacular conquests of New Spain (New Spain, Viceroyalty of) and Peru, they captured the imagination, and the envy, of the European world.

 France, occupied with wars in Europe to preserve its own territorial integrity, was not able to devote as much time or effort to overseas expansion as did Spain and Portugal. Beginning in the early 16th century, however, French fishermen established an outpost in Newfoundland (Newfoundland and Labrador), and in 1534 Jacques Cartier (Cartier, Jacques) began exploring the Gulf of St. Lawrence (Saint Lawrence, Gulf of). By 1543 the French had ceased their efforts to colonize the northeast portion of the New World. In the last half of the 16th century, France attempted to found colonies in Florida and Brazil, but each of these efforts failed, and by the end of the century Spain and Portugal remained the only two European nations to have established successful colonies in America.

 The English (England), although eager to duplicate the Spanish and Portuguese successes, nevertheless lagged far behind in their colonization efforts. The English possessed a theoretical claim to the North American mainland by dint of the 1497 voyage of John Cabot (Cabot, John) off the coast of Nova Scotia, but in fact they had neither the means nor the desire to back up that claim during the 16th century. Thus it was that England relied instead on private trading companies, which were interested principally in commercial rather than territorial expansion, to defend its interests in the expanding European world. The first of these commercial ventures began with the formation of the Muscovy Company in 1554. In 1576–78 the English mariner Martin Frobisher (Frobisher, Sir Martin) undertook three voyages in search of a Northwest Passage to the Far East. In 1577 Sir Francis Drake (Drake, Sir Francis) made his famous voyage around the world, plundering the western coast of South America en route. A year later Sir Humphrey Gilbert (Gilbert, Sir Humphrey), one of the most dedicated of Elizabethan imperialists, began a series of ventures aimed at establishing permanent colonies in North America. All his efforts met with what was, at best, limited success. Finally, in September 1583, Gilbert, with five vessels and 260 men, disappeared in the North Atlantic. With the failure of Gilbert's voyage, the English turned to a new man, Sir Walter Raleigh (Raleigh, Sir Walter), and a new strategy—a southern rather than a northern route to North America—to advance England's fortunes in the New World. Although Raleigh's efforts to found a permanent colony off the coast of Virginia did finally fail with the mysterious destruction of the Roanoke Island colony (Roanoke Island) in 1587, they awakened popular interest in a permanent colonizing venture.

      During the years separating the failure of the Roanoke attempt and the establishment in 1607 of Jamestown colony, English propagandists worked hard to convince the public that a settlement in America would yield instant and easily exploitable wealth. Even men such as the English geographer Richard Hakluyt (Hakluyt, Richard) were not certain that the Spanish colonization experience could or should be imitated but hoped nevertheless that the English colonies in the New World would prove to be a source of immediate commercial gain. There were, of course, other motives for colonization. Some hoped to discover the much-sought-after route to the Orient (East Asia) in North America. English imperialists thought it necessary to settle in the New World in order to limit Spanish expansion. Once it was proved that America was a suitable place for settlement, some Englishmen would travel to those particular colonies that promised to free them from religious persecution. There were also Englishmen, primarily of lower- and middle-class origin, who hoped the New World would provide them with increased economic opportunity in the form of free or inexpensive land. These last two motives, while they have been given considerable attention by historians, appear not to have been so much original motives for English colonization as they were shifts of attitude once colonization had begun.

  The leaders of the Virginia Company, a joint-stock company in charge of the Jamestown (Jamestown Colony) enterprise, were for the most part wealthy and wellborn commercial and military adventurers eager to find new outlets for investment. During the first two years of its existence, the Virginia colony, under the charter of 1607, proved an extraordinarily bad investment. This was principally due to the unwillingness of the early colonizers to do the necessary work of providing for themselves and to the chronic shortage of capital to supply the venture.

      A new charter in 1609 significantly broadened membership in the Virginia Company, thereby temporarily increasing the supply of capital at the disposal of its directors, but most of the settlers continued to act as though they expected the Indians to provide for their existence, a notion that the Indians fiercely rejected. As a result, the enterprise still failed to yield any profits, and the number of investors again declined.

      The crown issued a third charter in 1612, authorizing the company to institute a lottery to raise more capital for the floundering enterprise. In that same year, John Rolfe (Rolfe, John) harvested the first crop of a high-grade and therefore potentially profitable strain of tobacco. At about the same time, with the arrival of Sir Thomas Dale in the colony as governor in 1611, the settlers gradually began to practice the discipline necessary for their survival, though at an enormous personal cost.

      Dale carried with him the “Laws Divine, Morall, and Martial,” which were intended to supervise nearly every aspect of the settlers' lives. Each person in Virginia, including women and children, was given a military rank, with duties spelled out in minute detail. Penalties imposed for violating these rules were severe: those who failed to obey the work regulations were to be forced to lie with neck and heels together all night for the first offense, whipped for the second, and sent to a year's service in English galleys (convict ships) for the third. The settlers could hardly protest the harshness of the code, for that might be deemed slander against the company—an offense punishable by service in the galleys or by death.

      Dale's code brought order to the Virginia experiment, but it hardly served to attract new settlers. To increase incentive the company, beginning in 1618, offered 50 acres (about 20 hectares) of land to those settlers who could pay their transportation to Virginia and a promise of 50 acres after seven years of service to those who could not pay their passage. Concurrently, the new governor of Virginia, Sir George Yeardley, issued a call for the election of representatives to a House of Burgesses (Burgesses, House of), which was to convene in Jamestown in July 1619. In its original form the House of Burgesses was little more than an agency of the governing board of the Virginia Company, but it would later expand its powers and prerogatives and become an important force for colonial self-government.

      Despite the introduction of these reforms, the years from 1619 to 1624 proved fatal to the future of the Virginia Company. Epidemics, constant warfare with the Indians, and internal disputes took a heavy toll on the colony. In 1624 the crown finally revoked the charter of the company and placed the colony under royal control. The introduction of royal government into Virginia, while it was to have important long-range consequences, did not produce an immediate change in the character of the colony. The economic and political life of the colony continued as it had in the past. The House of Burgesses, though its future under the royal commission of 1624 was uncertain, continued to meet on an informal basis; by 1629 it had been officially reestablished. The crown also grudgingly acquiesced to the decision of the Virginia settlers to continue to direct most of their energies to the growth and exportation of tobacco. By 1630 the Virginia colony, while not prosperous, at least was showing signs that it was capable of surviving without royal subsidy.

  Maryland, Virginia's neighbour to the north, was the first English colony to be controlled by a single proprietor (proprietary colony) rather than by a joint-stock company. Lord Baltimore (George Calvert) (Baltimore, George Calvert, 1st Baron) had been an investor in a number of colonizing schemes before being given a grant of land from the crown in 1632. Baltimore was given a sizable grant of power to go along with his grant of land; he had control over the trade and political system of the colony so long as he did nothing to deviate from the laws of England. Baltimore's son Cecilius Calvert took over the project at his father's death and promoted a settlement at St. Mary's (Saint Marys City) on the Potomac. Supplied in part by Virginia, the Maryland colonists managed to sustain their settlement in modest fashion from the beginning. As in Virginia, however, the early 17th-century settlement in Maryland was often unstable and unrefined; composed overwhelmingly of young single males—many of them indentured servants—it lacked the stabilizing force of a strong family structure to temper the rigours of life in the wilderness.

 The colony was intended to serve at least two purposes. Baltimore, a Roman Catholic, was eager to found a colony where Catholics could live in peace, but he was also eager to see his colony yield him as large a profit as possible. From the outset, Protestants outnumbered Catholics, although a few prominent Catholics tended to own an inordinate share of the land in the colony. Despite this favouritism in the area of land policy, Baltimore was for the most part a good and fair administrator.

      Following the accession of William III and Mary II to the English throne, however, control of the colony was taken away from the Calvert family and entrusted to the royal government. Shortly thereafter, the crown decreed that Anglicanism would be the established religion of the colony. In 1715, after the Calvert family had renounced Catholicism and embraced Anglicanism, the colony reverted back to a proprietary form of government.

The New England colonies (New England)
      Although lacking a charter, the founders of Plymouth in Massachusetts (Massachusetts Bay Colony) were, like their counterparts in Virginia, dependent upon private investments from profit-minded backers to finance their colony. The nucleus of that settlement was drawn from an enclave of English émigrés in Leiden, Holland (now in The Netherlands). These religious Separatists (Separatist) believed that the true church was a voluntary company of the faithful under the “guidance” of a pastor and tended to be exceedingly individualistic in matters of church doctrine. Unlike the settlers of Massachusetts Bay, these Pilgrims (Pilgrim Fathers) chose to “separate” from the Church of England (England, Church of) rather than to reform it from within.

  In 1620, the first year of settlement, nearly half the Pilgrim settlers died of disease. From that time forward, however, and despite decreasing support from English investors, the health and the economic position of the colonists improved. The Pilgrims soon secured peace treaties with most of the Indians around them, enabling them to devote their time to building a strong, stable economic base rather than diverting their efforts toward costly and time-consuming problems of defending the colony from attack. Although none of their principal economic pursuits—farming, fishing, and trading—promised them lavish wealth, the Pilgrims in America were, after only five years, self-sufficient.

 Although the Pilgrims were always a minority in Plymouth, they nevertheless controlled the entire governmental structure of their colony during the first four decades of settlement. Before disembarking from the Mayflower in 1620, the Pilgrim founders, led by William Bradford (Bradford, William), demanded that all the adult males aboard who were able to do so sign a compact promising obedience to the laws and ordinances drafted by the leaders of the enterprise. Although the Mayflower Compact has been interpreted as an important step in the evolution of democratic government in America, it is a fact that the compact represented a one-sided arrangement, with the settlers promising obedience and the Pilgrim founders promising very little. Although nearly all the male inhabitants were permitted to vote for deputies to a provincial assembly and for a governor, the colony, for at least the first 40 years of its existence, remained in the tight control of a few men. After 1660 the people of Plymouth gradually gained a greater voice in both their church and civic affairs, and by 1691, when Plymouth colony (also known as the Old Colony) was annexed to Massachusetts Bay, the Plymouth settlers had distinguished themselves by their quiet, orderly ways.

      The Puritans of the Massachusetts Bay Colony, like the Pilgrims, sailed to America principally to free themselves from religious restraints. Unlike the Pilgrims, the Puritans did not desire to “separate” themselves from the Church of England but, rather, hoped by their example to reform it. Nonetheless, one of the recurring problems facing the leaders of the Massachusetts Bay colony was to be the tendency of some, in their desire to free themselves from the alleged corruption of the Church of England, to espouse Separatist doctrine. When these tendencies or any other hinting at deviation from orthodox Puritan doctrine developed, those holding them were either quickly corrected or expelled from the colony. The leaders of the Massachusetts Bay enterprise never intended their colony to be an outpost of toleration in the New World; rather, they intended it to be a “Zion in the wilderness,” a model of purity and orthodoxy, with all backsliders subject to immediate correction.

 The civil government of the colony was guided by a similar authoritarian spirit. Men such as John Winthrop, the first governor of Massachusetts Bay, believed that it was the duty of the governors of society not to act as the direct representatives of their constituents but rather to decide, independently, what measures were in the best interests of the total society. The original charter of 1629 gave all power in the colony to a General Court composed of only a small number of shareholders in the company. On arriving in Massachusetts, many disfranchised settlers immediately protested against this provision and caused the franchise to be widened to include all church members. These “freemen” were given the right to vote in the General Court once each year for a governor and a Council of Assistants. Although the charter of 1629 technically gave the General Court the power to decide on all matters affecting the colony, the members of the ruling elite initially refused to allow the freemen in the General Court to take part in the lawmaking process on the grounds that their numbers would render the court inefficient.

      In 1634 the General Court adopted a new plan of representation whereby the freemen of each town would be permitted to select two or three delegates and assistants, elected separately but sitting together in the General Court, who would be responsible for all legislation. There was always tension existing between the smaller, more prestigious group of assistants and the larger group of deputies. In 1644, as a result of this continuing tension, the two groups were officially lodged in separate houses of the General Court, with each house reserving a veto power over the other.

      Despite the authoritarian tendencies of the Massachusetts Bay colony, a spirit of community developed there as perhaps in no other colony. The same spirit that caused the residents of Massachusetts to report on their neighbours for deviation from the true principles of Puritan morality also prompted them to be extraordinarily solicitous about their neighbours' needs. Although life in Massachusetts was made difficult for those who dissented from the prevailing orthodoxy, it was marked by a feeling of attachment and community for those who lived within the enforced consensus of the society.

      Many New Englanders, however, refused to live within the orthodoxy imposed by the ruling elite of Massachusetts, and both Connecticut and Rhode Island were founded as a by-product of their discontent. The Rev. Thomas Hooker, who had arrived in Massachusetts Bay in 1633, soon found himself in opposition to the colony's restrictive policy regarding the admission of church members and to the oligarchic power of the leaders of the colony. Motivated both by a distaste for the religious and political structure of Massachusetts and by a desire to open up new land, Hooker and his followers began moving into the Connecticut valley in 1635. By 1636 they had succeeded in founding three towns—Hartford, Windsor, and Wethersfield. In 1638 the separate colony of New Haven was founded, and in 1662 Connecticut and New Haven merged under one charter.

      Roger Williams, the man closely associated with the founding of Rhode Island, was banished from Massachusetts because of his unwillingness to conform to the orthodoxy established in that colony. Williams's views conflicted with those of the ruling hierarchy of Massachusetts in several important ways. His own strict criteria for determining who was regenerate, and therefore eligible for church membership, finally led him to deny any practical way to admit anyone into the church. Once he recognized that no church could ensure the purity of its congregation, he ceased using purity as a criterion and instead opened church membership to nearly everyone in the community. Moreover, Williams showed distinctly Separatist leanings, preaching that the Puritan church could not possibly achieve purity as long as it remained within the Church of England. Finally, and perhaps most serious, he openly disputed the right of the Massachusetts leaders to occupy land without first purchasing it from the Native Americans.

      The unpopularity of Williams's views forced him to flee Massachusetts Bay for Providence in 1636. In 1639 William Coddington, another dissenter in Massachusetts, settled his congregation in Newport. Four years later Samuel Gorton, yet another minister banished from Massachusetts Bay because of his differences with the ruling oligarchy, settled in Shawomet (later renamed Warwick). In 1644 these three communities joined with a fourth, Portsmouth, under one charter to become one colony called Providence Plantations in Narragansett Bay.

      The early settlers of New Hampshire and Maine were also ruled by the government of Massachusetts Bay. New Hampshire was permanently separated from Massachusetts in 1692, although it was not until 1741 that it was given its own royal governor. Maine remained under the jurisdiction of Massachusetts until 1820.

The middle colonies
 New Netherland, founded in 1624 at Fort Orange (now Albany) by the Dutch West India Company, was but one element in a wider program of Dutch expansion in the first half of the 17th century. In 1664 the English captured the colony of New Netherland, renaming it New York after James, duke of York, brother of Charles II, and placing it under the proprietary control of the duke. In return for an annual gift to the king of 40 beaver skins, the duke of York and his resident board of governors were given extraordinary discretion in the ruling of the colony. Although the grant to the duke of York made mention of a representative assembly, the duke was not legally obliged to summon it and in fact did not summon it until 1683. The duke's interest in the colony was chiefly economic, not political, but most of his efforts to derive economic gain from New York proved futile. Indians, foreign interlopers (the Dutch actually recaptured New York in 1673 and held it for more than a year), and the success of the colonists in evading taxes made the proprietor's job a frustrating one.

      In February 1685 the duke of York found himself not only proprietor of New York but also king of England, a fact that changed the status of New York from that of a proprietary to a royal colony. The process of royal consolidation was accelerated when in 1688 the colony, along with the New England and New Jersey colonies, was made part of the ill-fated Dominion of New England. In 1689 Jacob Leisler, a German merchant living on Long Island, led a successful revolt against the rule of the deputy governor, Francis Nicholson. The revolt, which was a product of dissatisfaction with a small aristocratic ruling elite and a more general dislike of the consolidated scheme of government of the Dominion of New England, served to hasten the demise of the dominion.

  Pennsylvania, in part because of the liberal policies of its founder, William Penn, was destined to become the most diverse, dynamic, and prosperous of all the North American colonies. Penn himself was a liberal, but by no means radical, English Whig. His Quaker (Society of Friends) faith was marked not by the religious extremism of some Quaker leaders of the day but rather by an adherence to certain dominant tenets of the faith—liberty of conscience and pacifism—and by an attachment to some of the basic tenets of Whig doctrine. Penn sought to implement these ideals in his “holy experiment” in the New World.

 Penn received his grant of land along the Delaware River in 1681 from Charles II as a reward for his father's service to the crown. The first “frame of government” proposed by Penn in 1682 provided for a council and an assembly, each to be elected by the freeholders of the colony. The council was to have the sole power of initiating legislation; the lower house could only approve or veto bills submitted by the council. After numerous objections about the “oligarchic” nature of this form of government, Penn issued a second frame of government in 1683 and then a third in 1696, but even these did not wholly satisfy the residents of the colony. Finally, in 1701, a Charter of Privileges, giving the lower house all legislative power and transforming the council into an appointive body with advisory functions only, was approved by the citizens. The Charter of Privileges, like the three earlier frames of government, continued to guarantee the principle of religious toleration to all Protestants.

 Pennsylvania prospered from the outset. Although there was some jealousy between the original settlers (who had received the best land and important commercial privileges) and the later arrivals, economic opportunity in Pennsylvania was on the whole greater than in any other colony. Beginning in 1683 with the immigration of Germans into the Delaware valley and continuing with an enormous influx of Irish and Scotch-Irish in the 1720s and '30s, the population of Pennsylvania increased and diversified. The fertile soil of the countryside, in conjunction with a generous government land policy, kept immigration at high levels throughout the 18th century. Ultimately, however, the continuing influx of European settlers hungry for land spelled doom for the pacific Indian policy initially envisioned by Penn. “Economic opportunity” for European settlers often depended on the dislocation, and frequent extermination, of the American Indian residents who had initially occupied the land in Penn's colony.

  New Jersey remained in the shadow of both New York and Pennsylvania throughout most of the colonial period. Part of the territory ceded to the duke of York by the English crown in 1664 lay in what would later become the colony of New Jersey. The duke of York in turn granted that portion of his lands to John Berkeley and George Carteret, two close friends and allies of the king. In 1665 Berkeley and Carteret established a proprietary government under their own direction. Constant clashes, however, developed between the New Jersey and the New York proprietors over the precise nature of the New Jersey grant. The legal status of New Jersey became even more tangled when Berkeley sold his half interest in the colony to two Quakers, who in turn placed the management of the colony in the hands of three trustees, one of whom was Penn. The area was then divided into East Jersey, controlled by Carteret, and West Jersey, controlled by Penn and the other Quaker trustees. In 1682 the Quakers bought East Jersey. A multiplicity of owners and an uncertainty of administration caused both colonists and colonizers to feel dissatisfied with the proprietary arrangement, and in 1702 the crown united the two Jerseys into a single royal province.

      At about the same time that the Quakers purchased East Jersey, Penn acquired from the duke of York the tract of land that was to become Delaware, in order to protect his water route to Pennsylvania. That territory remained part of the Pennsylvania colony until 1704, when it was given an assembly of its own. It remained under the Pennsylvania governor, however, until the American Revolution.

The Carolinas and Georgia
      The English crown had issued grants to the Carolina territory as early as 1629, but it was not until 1663 that a group of eight proprietors—most of them men of extraordinary wealth and power even by English standards—actually began colonizing the area. The proprietors hoped to grow silk in the warm climate of the Carolinas, but all efforts to produce that valuable commodity failed. Moreover, it proved difficult to attract settlers to the Carolinas; it was not until 1718, after a series of violent Indian wars had subsided, that the population began to increase substantially. The pattern of settlement, once begun, followed two paths. North Carolina, which was largely cut off from the European and Caribbean trade by its unpromising coastline, developed into a colony of small to medium farms. South Carolina, with close ties to both the Caribbean and Europe, produced rice and, after 1742, indigo for a world market. The early settlers in both areas came primarily from the West Indian colonies. This pattern of migration was not, however, as distinctive in North Carolina, where many of the residents were part of the spillover from the natural expansion of Virginians southward.

      The original framework of government for the Carolinas, the Fundamental Constitutions, drafted in 1669 by Anthony Ashley Cooper (Lord Shaftesbury) with the help of the philosopher John Locke, was largely ineffective because of its restrictive and feudal nature. The Fundamental Constitutions was abandoned in 1693 and replaced by a frame of government diminishing the powers of the proprietors and increasing the prerogatives of the provincial assembly. In 1729, primarily because of the proprietors' inability to meet the pressing problems of defense, the Carolinas were converted into the two separate royal colonies of North and South Carolina.

      The proprietors of Georgia, led by James Oglethorpe, were wealthy philanthropic English gentlemen. It was Oglethorpe's plan to transport imprisoned debtors to Georgia, where they could rehabilitate themselves by profitable labour and make money for the proprietors in the process. Those who actually settled in Georgia—and by no means all of them were impoverished debtors—encountered a highly restrictive economic and social system. Oglethorpe and his partners limited the size of individual landholdings to 500 acres (about 200 hectares), prohibited slavery, forbade the drinking of rum, and instituted a system of inheritance that further restricted the accumulation of large estates. The regulations, though noble in intention, created considerable tension between some of the more enterprising settlers and the proprietors. Moreover, the economy did not live up to the expectations of the colony's promoters. The silk industry in Georgia, like that in the Carolinas, failed to produce even one profitable crop.

      The settlers were also dissatisfied with the political structure of the colony; the proprietors, concerned primarily with keeping close control over their utopian experiment, failed to provide for local institutions of self-government. As protests against the proprietors' policies mounted, the crown in 1752 assumed control over the colony; subsequently, many of the restrictions that the settlers had complained about, notably those discouraging the institution of slavery, were lifted.

Imperial organization
      British policy toward the American colonies was inevitably affected by the domestic politics of England; since the politics of England in the 17th and 18th centuries were never wholly stable, it is not surprising that British colonial policy during those years never developed along clear and consistent lines. During the first half century of colonization, it was even more difficult for England to establish an intelligent colonial policy because of the very disorganization of the colonies themselves. It was nearly impossible for England to predict what role Virginia, Maryland, Massachusetts, Connecticut, and Rhode Island would play in the overall scheme of empire because of the diversity of the aims and governmental structures of those colonies. By 1660, however, England had taken the first steps in reorganizing her empire in a more profitable manner. The Navigation Act of 1660, a modification and amplification of a temporary series of acts passed in 1651, provided that goods bound to England or to English colonies, regardless of origin, had to be shipped only in English vessels; that three-fourths of the personnel of those ships had to be Englishmen; and that certain “enumerated articles,” such as sugar, cotton, and tobacco, were to be shipped only to England, with trade in those items with other countries prohibited. This last provision hit Virginia and Maryland particularly hard; although those two colonies were awarded a monopoly over the English tobacco market at the same time that they were prohibited from marketing their tobacco elsewhere, there was no way that England alone could absorb their tobacco production.

      The 1660 act proved inadequate to safeguard the entire British commercial empire, and in subsequent years other navigation acts were passed, strengthening the system. In 1663 Parliament passed an act requiring all vessels with European goods bound for the colonies to pass first through English ports to pay customs duties. In order to prevent merchants from shipping the enumerated articles from colony to colony in the coastal trade and then taking them to a foreign country, in 1673 Parliament required that merchants post bond guaranteeing that those goods would be taken only to England. Finally, in 1696 Parliament established a Board of Trade to oversee Britain's commercial empire, instituted mechanisms to ensure that the colonial governors aided in the enforcement of trade regulations, and set up vice admiralty courts in America for the prosecution of those who violated the Navigation Acts. On the whole, this attempt at imperial consolidation—what some historians have called the process of Anglicization—was successful in bringing the economic activities of the colonies under closer crown control. While a significant amount of colonial trade continued to evade British regulation, it is nevertheless clear that the British were at least partially successful in imposing greater commercial and political order on the American colonies during the period from the late-17th to the mid-18th century.

      In addition to the agencies of royal control in England, there were a number of royal officials in America responsible not only for aiding in the regulation of Britain's commercial empire but also for overseeing the internal affairs of the colonies. The weaknesses of royal authority in the politics of provincial America were striking, however. In some areas, particularly in the corporate colonies of New England during the 17th century and in the proprietary colonies throughout their entire existence, direct royal authority in the person of a governor responsible to the crown was nonexistent. The absence of a royal governor in those colonies had a particularly deleterious effect on the enforcement of trade regulations. In fact, the lack of royal control over the political and commercial activities of New England prompted the Lords of Trade to overturn the Massachusetts Bay charter in 1684 and to consolidate Massachusetts, along with the other New England colonies and New York, into the Dominion of New England. After the colonists, aided by the turmoil of the Glorious Revolution of 1688 in England, succeeded in overthrowing the dominion scheme, the crown installed a royal governor in Massachusetts to protect its interests.

      In those colonies with royal governors—the number of those colonies grew from one in 1650 to eight in 1760—the crown possessed a mechanism by which to ensure that royal policy was enforced. The Privy Council issued each royal governor in America a set of instructions carefully defining the limits of provincial authority. The royal governors were to have the power to decide when to call the provincial assemblies together, to prorogue or dissolve the assemblies, and to veto any legislation passed by those assemblies. The governor's power over other aspects of the political structure of the colony was just as great. In most royal colonies he was the one official primarily responsible for the composition of the upper houses of the colonial legislatures and for the appointment of important provincial officials, such as the treasurer, attorney general, and all colonial judges. Moreover, the governor had enormous patronage powers over the local agencies of government. The officials of the county court, who were the principal agents of local government, were appointed by the governor in most of the royal colonies. Thus, the governor had direct or indirect control over every agency of government in America.

The growth of provincial power

Political growth
      The distance separating England and America, the powerful pressures exerted on royal officials by Americans, and the inevitable inefficiency of any large bureaucracy all served to weaken royal power and to strengthen the hold of provincial leaders on the affairs of their respective colonies. During the 18th century the colonial legislatures gained control over their own parliamentary prerogatives, achieved primary responsibility for legislation affecting taxation and defense, and ultimately took control over the salaries paid to royal officials. Provincial leaders also made significant inroads into the governor's patronage powers. Although theoretically the governor continued to control the appointments of local officials, in reality he most often automatically followed the recommendations of the provincial leaders in the localities in question. Similarly, the governor's councils, theoretically agents of royal authority, came to be dominated by prominent provincial leaders who tended to reflect the interests of the leadership of the lower house of assembly rather than those of the royal government in London.

      Thus, by the mid-18th century most political power in America was concentrated in the hands of provincial rather than royal officials. These provincial leaders undoubtedly represented the interests of their constituents more faithfully than any royal official could, but it is clear that the politics of provincial America were hardly democratic by modern standards. In general, both social prestige and political power tended to be determined by economic standing, and the economic resources of colonial America, though not as unevenly distributed as in Europe, were nevertheless controlled by relatively few men.

      In the Chesapeake Bay societies of Virginia and Maryland, and particularly in the regions east of the Blue Ridge Mountains, a planter class came to dominate nearly every aspect of those colonies' economic life. These same planters, joined by a few prominent merchants and lawyers, dominated the two most important agencies of local government—the county courts and the provincial assemblies. This extraordinary concentration of power in the hands of a wealthy few occurred in spite of the fact that a large percentage of the free adult male population (some have estimated as high as 80 to 90 percent) was able to participate in the political process. The ordinary citizens of the Chesapeake society, and those of most colonies, nevertheless continued to defer to those whom they considered to be their “betters.” Although the societal ethic that enabled power to be concentrated in the hands of a few was hardly a democratic one, there is little evidence, at least for Virginia and Maryland, that the people of those societies were dissatisfied with their rulers. In general, they believed that their local officials ruled responsively.

      In the Carolinas a small group of rice and indigo planters monopolized much of the wealth. As in Virginia and Maryland, the planter class came to constitute a social elite. As a rule, the planter class of the Carolinas did not have the same long tradition of responsible government as did the ruling oligarchies of Virginia and Maryland, and, as a consequence, they tended to be absentee landlords and governors, often passing much of their time in Charleston, away from their plantations and their political responsibilities.

      The western regions of both the Chesapeake and Carolina societies displayed distinctive characteristics of their own. Ruling traditions were fewer, accumulations of land and wealth less striking, and the social hierarchy less rigid in the west. In fact, in some western areas antagonism toward the restrictiveness of the east and toward eastern control of the political structure led to actual conflict. In both North and South Carolina armed risings of varying intensity erupted against the unresponsive nature of the eastern ruling elite. As the 18th century progressed, however, and as more men accumulated wealth and social prestige, the societies of the west came more closely to resemble those of the east.

      New England society was more diverse and the political system less oligarchic than that of the South. In New England the mechanisms of town government served to broaden popular participation in government beyond the narrow base of the county courts.

      The town meetings (town meeting), which elected the members of the provincial assemblies, were open to nearly all free adult males. Despite this, a relatively small group of men dominated the provincial governments of New England. As in the South, men of high occupational status and social prestige were closely concentrated in leadership positions in their respective colonies; in New England, merchants, lawyers, and to a lesser extent clergymen made up the bulk of the social and political elite.

      The social and political structure of the middle colonies was more diverse than that of any other region in America. New York, with its extensive system of manors and manor lords, often displayed genuinely feudal characteristics. The tenants on large manors often found it impossible to escape the influence of their manor lords. The administration of justice, the election of representatives, and the collection of taxes often took place on the manor itself. As a consequence, the large landowning families exercised an inordinate amount of economic and political power. The Great Rebellion of 1766, a short-lived outburst directed against the manor lords, was a symptom of the widespread discontent among the lower and middle classes. By contrast, Pennsylvania's governmental system was more open and responsive than that of any other colony in America. A unicameral legislature, free from the restraints imposed by a powerful governor's council, allowed Pennsylvania to be relatively independent of the influence of both the crown and the proprietor. This fact, in combination with the tolerant and relatively egalitarian bent of the early Quaker settlers and the subsequent immigration of large numbers of Europeans, made the social and political structure of Pennsylvania more democratic but more faction-ridden than that of any other colony.

Population growth
  The increasing political autonomy of the American colonies was a natural reflection of their increased stature in the overall scheme of the British Empire. In 1650 the population of the colonies had been about 52,000; in 1700 it was perhaps 250,000, and by 1760 it was approaching 1,700,000. Virginia had increased from about 54,000 in 1700 to approximately 340,000 in 1760. Pennsylvania had begun with about 500 settlers in 1681 and had attracted at least 250,000 people by 1760. And America's cities were beginning to grow as well. By 1765 Boston had reached 15,000; New York City, 16,000–17,000; and Philadelphia, the largest city in the colonies, 20,000.

      Part of that population growth was the result of the involuntary immigration of African slaves. During the 17th century, slaves remained a tiny minority of the population. By the mid-18th century, after Southern colonists discovered that the profits generated by their plantations could support the relatively large initial investments needed for slave labour, the volume of the slave trade increased markedly. In Virginia the slave population leaped from about 2,000 in 1670 to perhaps 23,000 in 1715 and reached 150,000 on the eve of the American Revolution. In South Carolina the growth was even more dramatic: in 1700 there were probably no more than 2,500 blacks in the population; by 1765 there were 80,000–90,000, with blacks outnumbering whites by about 2 to 1.

      One of the principal attractions for the immigrants who moved to America voluntarily was the availability of inexpensive arable land. The westward migration to America's frontier—in the early 17th century all of America was a frontier, and by the 18th century the frontier ranged anywhere from 10 to 200 miles (16 to 320 km) from the coastline—was to become one of the distinctive elements in American history. English Puritans, beginning in 1629 and continuing through 1640, were the first to immigrate in large numbers to America. Throughout the 17th century most of the immigrants were English; but, beginning in the second decade of the 18th century, a wave of Germans, principally from the Rhineland Palatinate, arrived in America: by 1770 between 225,000 and 250,000 Germans had immigrated to America, more than 70 percent of them settling in the middle colonies, where generous land policies and religious toleration made life more comfortable for them. The Scotch-Irish and Irish immigration, which began on a large scale after 1713 and continued past the American Revolution, was more evenly distributed. By 1750 both Scotch-Irish and Irish could be found in the western portions of nearly every colony. In almost all the regions in which Europeans sought greater economic opportunity, however, that same quest for independence and self-sufficiency led to tragic conflict with Indians over the control of land. And in nearly every instance the outcome was similar: the Europeans, failing to respect Indian claims either to land or to cultural autonomy, pushed the Indians of North America farther and farther into the periphery.

 Provincial America came to be less dependent upon subsistence agriculture and more on the cultivation and manufacture of products for the world market. Land, which initially served only individual needs, came to be the fundamental source of economic enterprise. The independent yeoman farmer continued to exist, particularly in New England and the middle colonies, but most settled land in North America by 1750 was devoted to the cultivation of a cash crop. New England turned its land over to the raising of meat products for export. The middle colonies were the principal producers of grains. By 1700 Philadelphia exported more than 350,000 bushels of wheat and more than 18,000 tons of flour annually. The Southern colonies were, of course, even more closely tied to the cash crop system. South Carolina, aided by British incentives, turned to the production of rice and indigo. North Carolina, although less oriented toward the market economy than South Carolina, was nevertheless one of the principal suppliers of naval stores. Virginia and Maryland steadily increased their economic dependence on tobacco and on the London merchants who purchased that tobacco, and for the most part they ignored those who recommended that they diversify their economies by turning part of their land over to the cultivation of wheat. Their near-total dependence upon the world tobacco price would ultimately prove disastrous, but for most of the 18th century Virginia and Maryland soil remained productive enough to make a single-crop system reasonably profitable.

 As America evolved from subsistence to commercial agriculture, an influential commercial class increased its power in nearly every colony. Boston was the centre of the merchant elite of New England, who not only dominated economic life but wielded social and political power as well. Merchants such as James De Lancey and Philip Livingston in New York and Joseph Galloway (Galloway, Joseph), Robert Morris (Morris, Robert), and Thomas Wharton in Philadelphia exerted an influence far beyond the confines of their occupations. In Charleston the Pinckney, Rutledge, and Lowndes families controlled much of the trade that passed through that port. Even in Virginia, where a strong merchant class was nonexistent, the people with the most economic and political power were the commercial farmers who best combined the occupations of merchant and farmer. And it is clear that the commercial importance of the colonies was increasing. During the years 1700–10, approximately £265,000 sterling was exported annually to Great Britain from the colonies, with roughly the same amount being imported by the Americans from Great Britain. By the decade 1760–70, that figure had risen to more than £1,000,000 sterling of goods exported annually to Great Britain and £1,760,000 annually imported from Great Britain.

Richard R. Beeman

      Although Frederick Jackson Turner (Turner, Frederick Jackson)'s 1893 “frontier thesis”—that American democracy was the result of an abundance of free land—has long been seriously challenged and modified, it is clear that the plentifulness of virgin acres and the lack of workers to till them did cause a loosening of the constraints of authority in the colonial and early national periods. Once it became clear that the easiest path to success for Britain's New World “plantations” lay in raising export crops, there was a constant demand for agricultural labour, which in turn spurred practices that—with the notable exception of slavery—compromised a strictly hierarchical social order.

      In all the colonies, whether governed directly by the king, by proprietors, or by chartered corporations, it was essential to attract settlers, and what governors had most plentifully to offer was land. Sometimes large grants were made to entire religious communities numbering in the hundreds or more. Sometimes tracts were allotted to wealthy men on the “head rights” (literally “per capita”) system of so many acres for each family member they brought over. Few Englishmen or Europeans had the means to buy farms outright, so the simple sale of homesteads by large-scale grantees was less common than renting. But there was another well-traveled road to individual proprietorship that also provided a workforce: the system of contract labour known as indentured service. Under it, an impecunious new arrival would sign on with a landowner for a period of service—commonly seven years—binding him to work in return for subsistence and sometimes for the repayment of his passage money to the ship captain who had taken him across the Atlantic (such immigrants were called “redemptioners”). At the end of this term, the indentured servant would in many cases be rewarded by the colony itself with “freedom dues,” a title to 50 or more acres of land in a yet-unsettled area. This somewhat biblically inspired precapitalist system of transfer was not unlike apprenticeship, the economic and social tool that added to the supply of skilled labour. The apprentice system called for a prepubescent boy to be “bound out” to a craftsman who would take him into his own home and there teach him his art while serving as a surrogate parent. (Girls were perennially “apprenticed” to their mothers as homemakers.) Both indentured servants and apprentices were subject to the discipline of the master, and their lot varied with his generosity or hard-fistedness. There must have been plenty of the latter type of master, as running away was common. 
The first Africans taken to Virginia, or at least some of them, appear to have worked as indentured servants. Not until the case of John Punch in the 1640s did it become legally established that black “servants” were to remain such for life. Having escaped, been caught, and brought to trial, Punch, an indentured servant of African descent, and two other indentured servants of European descent received very different sentences, with Punch's punishment being servitude for the “rest of his natural life” while that for the other two was merely an extension of their service.

      The harshness of New England's climate and topography meant that for most of its people the road to economic independence lay in trade, seafaring, fishing, or craftsmanship. But the craving for an individually owned subsistence farm grew stronger as the first generations of religious settlers who had “planted” by congregation died off. In the process the communal holding of land by townships—with small allotted family garden plots and common grazing and orchard lands, much in the style of medieval communities—yielded gradually to the more conventional privately owned fenced farm. The invitation that available land offered—individual control of one's life—was irresistible. Property in land also conferred civic privileges, so an unusually large number of male colonists were qualified for suffrage by the Revolution's eve, even though not all of them exercised the vote freely or without traditional deference to the elite.

      Slavery was the backbone of large-scale cultivation of such crops as tobacco and hence took strongest root in the Southern colonies. But thousands of white freeholders of small acreages also lived in those colonies; moreover, slavery on a small scale (mainly in domestic service and unskilled labour) was implanted in the North. The line between a free and a slaveholding America had not yet been sharply drawn.

      One truly destabilizing system of acquiring land was simply “squatting.” On the western fringes of settlement, it was not possible for colonial administrators to use police powers to expel those who helped themselves to acres technically owned by proprietors in the seaboard counties. Far from seeing themselves as outlaws, the squatters believed that they were doing civilization's work in putting new land into production, and they saw themselves as the moral superiors of eastern “owners” for whom land was a mere speculative commodity that they did not themselves, at great danger and hardship, cultivate. Squatting became a regular feature of westward expansion throughout early U.S. history.

Bernard A. Weisberger

Cultural and religious development

Colonial culture
  America's intellectual attainments during the 17th and 18th centuries, while not inferior to those of the countries of Europe, were nevertheless of a decidedly different character. It was the techniques of applied science (science, history of) that most excited the minds of Americans, who, faced with the problem of subduing an often wild and unruly land, saw in science the best way to explain, and eventually to harness, those forces around them. Ultimately this scientific mode of thought might be applied to the problems of civil society as well, but for the most part the emphasis in colonial America remained on science and technology (technology, history of), not politics or metaphysics. Typical of America's peculiar scientific genius was John Bartram (Bartram, John) of Pennsylvania, who collected and classified important botanical data from the New World. The American Philosophical Society, founded in 1743, is justly remembered as the focus of intellectual life in America. Men such as David Rittenhouse (Rittenhouse, David), an astronomer who built the first planetarium in America; Cadwallader Colden, the lieutenant governor of New York, whose accomplishments as a botanist and as an anthropologist probably outmatched his achievements as a politician; and Benjamin Rush (Rush, Benjamin), a pioneer in numerous areas of social reform as well as one of colonial America's foremost physicians, were among the many active members of the society. At the centre of the society was one of its founders, Benjamin Franklin (Franklin, Benjamin), who (in his experiments concerning the flow of electricity) proved to be one of the few American scientists to achieve a major theoretical breakthrough but who was more adept at the kinds of applied research that resulted in the manufacture of more efficient stoves and the development of the lightning rod.

 American cultural achievements in nonscientific fields were less impressive. American literature, at least in the traditional European forms, was nearly nonexistent. The most important American contribution to literature was neither in fiction nor in metaphysics but rather in such histories as Robert Beverley's History and Present State of Virginia (1705) or William Byrd (Byrd, William)'s History of the Dividing Line (1728–29, but not published until 1841). The most important cultural medium in America was not the book but the newspaper. The high cost of printing tended to eliminate all but the most vital news, and local gossip or extended speculative efforts were thus sacrificed so that more important material such as classified advertisements and reports of crop prices could be included. Next to newspapers, almanacs (almanac) were the most popular literary form in America, Franklin's Poor Richard's (Poor Richard) being only the most famous among scores of similar projects. Not until 1741 and the first installment of Franklin's General Magazine did literary magazines (magazine) begin to make their first appearance in America. Most of the 18th-century magazines, however, failed to attract subscribers, and nearly all of them collapsed after only a few years of operation.

 The visual and performing arts, though flourishing somewhat more than literature, were nevertheless slow to achieve real distinction in America. America did produce one good historical painter in Benjamin West (West, Benjamin) and two excellent portrait painters in John Copley (Copley, John Singleton) and Gilbert Stuart (Stuart, Gilbert), but it is not without significance that all three men passed much of their lives in London, where they received more attention and higher fees.

      The Southern colonies, particularly Charleston, seemed to be more interested in providing good theatre for their residents than did other regions, but in no colony did the theatre approach the excellence of that of Europe. In New England, Puritan influence was an obstacle to the performance of plays, and even in cosmopolitan Philadelphia the Quakers for a long time discouraged the development of the dramatic arts.

  If Americans in the colonial period did not excel in achieving a high level of traditional cultural attainment, they did manage at least to disseminate what culture they had in a manner slightly more equitable than that of most countries of the world. Newspapers and almanacs, though hardly on the same intellectual level as the Encyclopédie produced by the European philosophes, probably had a wider audience than any European cultural medium. The New England colonies, although they did not always manage to keep pace with population growth, pioneered in the field of public education (education). Outside New England, education remained the preserve of those who could afford to send their children to private schools, although the existence of privately supported but tuition-free charity schools and of relatively inexpensive “academies” made it possible for the children of the American middle class to receive at least some education. The principal institutions of higher learning—Harvard (Harvard University) (1636), William and Mary (William and Mary, College of) (1693), Yale (Yale University) (1701), Princeton (Princeton University) (1747), Pennsylvania (a college since 1755), King's College (1754, now Columbia University), Rhode Island College (1764, now Brown University) (Brown University), Queen's College (1766, now Rutgers University (Rutgers, The State University of New Jersey)), and Dartmouth (Dartmouth College) (1769)—served the upper class almost exclusively; and most of them had a close relationship with a particular religious point of view (e.g., Harvard (Harvard University) was a training ground for Congregational ministers, and Princeton (Princeton University) was closely associated with Presbyterianism).

Richard R. Beeman

From a city on a hill to the Great Awakening (Great Awakening)
      The part played by religion in the shaping of the American mind, while sometimes overstated, remains crucial. Over the first century and a half of colonial life, the strong religious impulses present in the original settlements—particularly those in New England—were somewhat secularized and democratized but kept much of their original power.

      When the Pilgrim Fathers signed the Mayflower Compact in 1620, resolving themselves into a “civil body politic,” they were explicitly making religious fellowship the basis of a political community. But even from the start, there were nonmembers of the Leiden Separatist congregation on the passenger list—the “strangers” among the “saints”—and they sought steady expansion of their rights in Plymouth colony until its absorption into Massachusetts in 1691.

      The Puritans were even more determined that their community be, as John Winthrop (Winthrop, John) called it in his founding sermon, “A Model of Christian Charity,” a “city on a hill,” to which all humankind should look for an example of heaven on earth. This theme, in various guises, resounds in every corner of American history. The traditional image of Massachusetts Puritanism is one of repressive authority, but what is overlooked is the consensus among Winthrop and his followers that they should be bound together by love and shared faith, an expectation that left them “free” to do voluntarily what they all agreed was right. It was a kind of elective theocracy for the insiders.

      The theocratic model, however, did not apply to nonmembers of the church, to whom the franchise was not originally extended, and problems soon arose in maintaining membership. Only those who had undergone a personal experience of “conversion” reassuring them of their salvation could be full members of the church and baptize their children. As the first generation died off, however, many of those children could not themselves personally testify to such conversion and so bring their own offspring into the church. They were finally allowed to do so by the Half-Way Covenant of 1662 but did not enjoy all the rights of full membership. Such apparent theological hair-splitting illustrated the power of the colony's expanding and dispersing population. As congregations hived off to different towns and immigration continued to bring in worshippers of other faiths, the rigidity of Puritan doctrine was forced to bend somewhat before the wind.

      Nevertheless, in the first few years of Massachusetts's history, Puritan disagreements over the proper interpretation of doctrine led to schisms, exilings, and the foundation of new colonies. Only in America could dissenters move into neighbouring “wilderness” and start anew, as they did in Rhode Island and Connecticut. So the American experience encouraged religious diversity from the start. Even the grim practice of punishing dissidents such as the Quakers (and “witches”) fell into disuse by the end of the 17th century.

      Toleration was a slow-growing plant, but circumstances sowed its seeds early in the colonial experience. Maryland's founders, the well-born Catholic Calvert family, extended liberty to their fellow parishioners and other non-Anglicans in the Toleration Act of 1649. Despite the fact that Anglicanism was later established in Maryland, it remained the first locus of American Catholicism, and the first “American” bishop named after the Revolution, John Carroll (Carroll, John), was of English stock. Not until the 19th century would significant immigration from Germany, Ireland, Italy, and Poland provide U.S. Catholicism its own “melting pot.” Pennsylvania was not merely a refuge for the oppressed community who shared William Penn (Penn, William)'s Quaker faith but by design a model “commonwealth” of brotherly love in general. And Georgia was founded by idealistic and religious gentlemen to provide a second chance in the New World for debtors in a setting where both rum and slavery were banned, though neither prohibition lasted long.

      American Protestantism was also diversified by immigration. The arrival of thousands of Germans early in the 18th century brought, especially to western Pennsylvania, islands of German pietism as practiced by Mennonites (Mennonite), Moravians (Moravian church), Schwenkfelders, and others.

      Anabaptists (Anabaptist), also freshly arrived from the German states, broadened the foundations of the Baptist church in the new land. French Huguenots (Huguenot) fleeing fresh persecutions after 1687 (they had already begun arriving in North America in the 1650s) added a Gallic brand of Calvinism to the patchwork quilt of American faith. Jews arrived in what was then Dutch New Amsterdam in 1654 and were granted asylum by the Dutch West India Company, to the dismay of Gov. Peter Stuyvesant (Stuyvesant, Peter), who gloomily foresaw that it would be a precedent for liberality toward Quakers, Lutherans, and “Papists.” By 1763, synagogues had been established in New York, Philadelphia, Newport (R.I.), Savannah (Ga.), and other seaport cities where small Jewish mercantile communities existed.

      Religious life in the American colonies already had a distinctive stamp in the 1740s. Some of its original zeal had cooled as material prosperity increased and the hardships of the founding era faded in memory. But then came a shake-up.

Bernard A. Weisberger
      A series of religious revivals (revivalism) known collectively as the Great Awakening swept over the colonies in the 1730s and '40s. Its impact was first felt in the middle colonies, where Theodore J. Frelinghuysen, a minister of the Dutch Reformed Church, began preaching in the 1720s. In New England in the early 1730s, men such as Jonathan Edwards (Edwards, Jonathan), perhaps the most learned theologian of the 18th century, were responsible for a reawakening of religious fervour. By the late 1740s the movement had extended into the Southern colonies, where itinerant preachers such as Samuel Davies (Davies, Samuel) and George Whitefield (Whitefield, George) exerted considerable influence, particularly in the backcountry.

      The Great Awakening represented a reaction against the increasing secularization of society and against the corporate and materialistic nature of the principal churches of American society. By making conversion the initial step on the road to salvation and by opening up the conversion experience to all who recognized their own sinfulness, the ministers of the Great Awakening, some intentionally and others unwittingly, democratized Calvinist theology. The technique of many of the preachers of the Great Awakening was to inspire in their listeners a fear of the consequences of their sinful lives and a respect for the omnipotence of God. This sense of the ferocity of God was often tempered by the implied promise that a rejection of worldliness and a return to faith would result in a return to grace and an avoidance of the horrible punishments of an angry God. There was a certain contradictory quality about these two strains of Great Awakening theology, however. Predestination, one of the principal tenets of the Calvinist theology of most of the ministers of the Great Awakening, was ultimately incompatible with the promise that man could, by a voluntary act of faith, achieve salvation by his own efforts. Furthermore, the call for a return to complete faith and the emphasis on the omnipotence of God was the very antithesis of Enlightenment thought, which called for a greater questioning of faith and a diminishing role for God in the daily affairs of man. On the other hand, Edwards, one of the principal figures of the Great Awakening in America, explicitly drew on the thought of men such as John Locke (Locke, John) and Isaac Newton (Newton, Sir Isaac) in an attempt to make religion rational.
Perhaps most important, the evangelical styles of religious worship promoted by the Great Awakening helped make the religious doctrines of many of the insurgent church denominations—particularly those of the Baptists and the Methodists—more accessible to a wider cross section of the American population. This expansion in church membership extended to blacks as well as to those of European descent, and the ritual forms of Evangelical Protestantism possessed features that facilitated the syncretism of African and American forms of religious worship.

Colonial America, England, and the wider world
      The American colonies, though in many ways isolated from the countries of Europe, were nevertheless continually subject to diplomatic and military pressures from abroad. In particular, Spain and France were always nearby, waiting to exploit any signs of British weakness in America in order to increase their commercial and territorial designs on the North American mainland. The Great War for the Empire—or the French and Indian War, as it is known to Americans—was but another round in a century of warfare between the major European powers. First in King William's War (1689–97), then in Queen Anne's War (1702–13), and later in King George's War (1744–48; the American phase of the War of the Austrian Succession (Austrian Succession, War of the)), Englishmen and Frenchmen had vied for control over the Indians, for possession of the territory lying to the north of the North American colonies, for access to the trade in the Northwest, and for commercial superiority in the West Indies. In most of these encounters, France had been aided by Spain. Because of its own holdings immediately south and west of the British colonies and in the Caribbean, Spain realized that it was in its own interest to join with the French in limiting British expansion. The culmination of these struggles came in 1754 with the Great War for the Empire. Whereas previous contests between Great Britain and France in North America had been mostly provincial affairs, with American colonists doing most of the fighting for the British, the Great War for the Empire saw sizable commitments of British troops to America. The strategy of the British under William Pitt (Pitt, William, the Elder) was to allow their ally, Prussia, to carry the brunt of the fighting in Europe and thus free Britain to concentrate its troops in America.

   Despite the fact that they were outnumbered 15 to 1 by the British colonial population in America, the French were nevertheless well equipped to hold their own. They had a larger military organization in America than did the English; their troops were better trained; and they were more successful than the British in forming military alliances with the Indians. The early engagements of the war went to the French; the surrender of George Washington (Washington, George) to a superior French force at Fort Necessity, the annihilation of Gen. Edward Braddock (Braddock, Edward) at the Monongahela River, and French victories at Oswego and Fort William Henry all made it seem as if the war would be a short and unsuccessful one for the British. Even as these defeats took place, however, the British were able to increase their supplies of both men and matériel in America. By 1758, with its strength finally up to a satisfactory level, Britain began to implement its larger strategy, which involved sending a combined land and sea force to gain control of the St. Lawrence and a large land force aimed at Fort Ticonderoga (Ticonderoga) to eliminate French control of Lake Champlain (Champlain, Lake). The first expedition against the French at Ticonderoga was a disaster, as Gen. James Abercrombie (Abercrombie, James) led about 15,000 British and colonial troops in an attack against the French before his forces were adequately prepared. The British assault on Louisburg (Louisbourg), the key to the St. Lawrence, was more successful. In July 1758 Lord Jeffrey Amherst (Amherst, Jeffery Amherst, 1st Baron) led a naval attack in which his troops landed on the shores from small boats, established beachheads, and then captured the fort at Louisburg.

 In 1759, after several months of sporadic fighting, the forces of James Wolfe (Wolfe, James) captured Quebec (Quebec, Battle of) from the French army led by the marquis de Montcalm. This was probably the turning point of the war. By the fall of 1760, the British had taken Montreal, and Britain possessed practical control of all of the North American continent. It took another two years for Britain to defeat its rivals in other parts of the world, but the contest for control of North America had been settled.

      In the Treaty of Paris of 1763 (Paris, Treaty of), Great Britain took possession of all of Canada, East and West Florida, all territory east of the Mississippi in North America, and St. Vincent, Tobago, and Dominica in the Caribbean. At the time, the British victory seemed one of the greatest in its history. The British Empire in North America had been not only secured but also greatly expanded. But in winning the war Britain had dissolved the empire's most potent material adhesives. Conflicts arose as the needs and interests of the British Empire began to differ from those of the American colonies; and the colonies, now economically powerful, culturally distinct, and steadily becoming more independent politically, would ultimately rebel before submitting to the British plan of empire.

Richard R. Beeman

The Native American response
      The other major players in this struggle for control of North America were, of course, the American Indians. Modern historians no longer see the encounters between Native Americans and Europeans through the old lens in which “discoverers of a New World” find a “wilderness” inhabited by “savages.” Instead they see a story of different cultures interacting, with the better-armed Europeans eventually subduing the local population, but not before each side had borrowed practices and techniques from the other and certainly not according to any uniform plan.

  The English significantly differed from the Spanish and French colonizers in North America. Spain's widespread empire in the Southwest relied on scattered garrisons and missions to keep the Indians under control and “usefully” occupied. The French in Canada dealt with “their” Indians essentially as the gatherers of fur, who could therefore be left in de facto possession of vast forest tracts. English colonies, in what would eventually become their strength, came around to encouraging the immigration of an agricultural population that would require the exclusive use of large land areas to cultivate—which would have to be secured from native possessors.

      English colonial officials began by making land purchases, but such transactions worked to the disadvantage of the Indians, to whom the very concept of group or individual “ownership” of natural resources was alien. After a “sale” was concluded with representatives of Indian peoples (who themselves were not always the “proprietors” of what they signed away), the Indians were surprised to learn that they had relinquished their hunting and fishing rights, and settlers assumed an unqualified sovereignty that Native American culture did not recognize.

      In time, conflict was inevitable. In the early days of settlement, Indian-European cooperation could and did take place, as with, for example, the assistance rendered by Squanto to the settlers of Plymouth colony or the semidiplomatic marriage of Virginia's John Rolfe (Rolfe, John) to Pocahontas, the daughter of Powhatan. The Native Americans taught the newcomers techniques of survival in their new environment and in turn were introduced to and quickly adopted metal utensils, European fabrics, and especially firearms. They were less adept in countering two European advantages—the possession of a common written language and a modern system of exchange—so purchases of Indian lands by colonial officials often turned into thinly disguised landgrabs. William Penn (Penn, William) and Roger Williams (Williams, Roger) made particular efforts to deal fairly with the Native Americans, but they were rare exceptions.

 The impact of Indian involvement in the affairs of the colonists was especially evident in the Franco-British struggle over Canada. For furs the French had depended on the Huron people settled around the Great Lakes, but the Iroquois Confederacy, based in western New York and southern Ontario, succeeded in crushing the Hurons and drove Huron allies such as the Susquehannocks (Susquehannock) and the Delawares (Delaware) southward into Pennsylvania. This action put the British in debt to the Iroquois because it diverted some of the fur trade from French Montreal and Quebec city to British Albany and New York City. European-Indian alliances also affected the way in which Choctaws (Choctaw), influenced by the French in Louisiana, battled with Spanish-supported Apalachees (Apalachee) from Florida and with the Cherokees (Cherokee), who were armed by the British in Georgia.

 The French and Indian War not only strengthened the military experience and self-awareness of the colonists but also produced several Indian leaders, such as Red Jacket and Joseph Brant (Brant, Joseph), who were competent in two or three languages and could negotiate deals between their own peoples and the European contestants. But the climactic Franco-British struggle was the beginning of disaster for the Indians. When the steady military success of the British culminated in the expulsion of France from Canada, the Indians no longer could play the diplomatic card of agreeing to support whichever king—the one in London or the one in Paris—would restrain westward settlement. Realizing this led some Indians to consider mounting a united resistance to further encroachments. This was the source of the rebellion led by the Ottawa chief Pontiac in 1763, but, like later efforts at cooperative Indian challenges to European and later U.S. power, it was simply not enough.

Bernard A. Weisberger

The American Revolution and the early federal republic
Prelude to revolution
      Britain's victory over France in the Great War for the Empire had been won at very great cost. British government expenditures, which had amounted to nearly £6.5 million annually before the war, rose to about £14.5 million annually during the war. As a result, the burden of taxation in England was probably the highest in the country's history, much of it borne by the politically influential landed classes. Furthermore, with the acquisition of the vast domain of Canada and the prospect of holding British territories both against the various nations of Indians and against the Spaniards to the south and west, the costs of colonial defense could be expected to continue indefinitely. Parliament, moreover, had voted to give Massachusetts a generous sum in compensation for its war expenses. It therefore seemed reasonable to British opinion that some of the future burden of payment should be shifted to the colonists themselves—who until then had been lightly taxed and indeed lightly governed.

      The prolonged wars had also revealed the need to tighten the administration of the loosely run and widely scattered elements of the British Empire. If the course of the war had confirmed the necessity, the end of the war presented the opportunity. The acquisition of Canada required officials in London to take responsibility for the unsettled western territories, now freed from the threat of French occupation. The British soon moved to take charge of the whole field of Indian (Native American) relations. By the royal Proclamation of 1763 (1763, Proclamation of), a line was drawn down the Appalachians marking the limit of settlement from the British colonies, beyond which Indian trade was to be conducted strictly through British-appointed commissioners. The proclamation sprang in part from a respect for Indian rights (though it did not come in time to prevent the uprising led by Pontiac). From London's viewpoint, leaving a lightly garrisoned West to the fur-gathering Indians also made economic and imperial sense. The proclamation, however, caused consternation among British colonists for two reasons. It meant that limits were being set to the prospects of settlement and speculation in western lands, and it took control of the west out of colonial hands. The most ambitious men in the colonies thus saw the proclamation as a loss of power to control their own fortunes. Indeed, the British government's huge underestimation of how deeply the halt in westward expansion would be resented by the colonists was one of the factors in sparking the 12-year crisis that led to the American Revolution. Indian efforts to preserve a terrain for themselves in the continental interior might still have had a chance with British policy makers, but they would be totally ineffective when the time came to deal with a triumphant United States of America.

The tax controversy
      George Grenville, who was named prime minister in 1763, was soon looking to meet the costs of defense by raising revenue in the colonies. The first measure was the Plantation Act of 1764, usually called the Revenue, or Sugar, Act, which reduced to a mere threepence the duty on imported foreign molasses but linked with this a high duty on refined sugar and a prohibition on foreign rum (the needs of the British treasury were carefully balanced with those of West Indies planters and New England distillers). The last measure of this kind (1733) had not been enforced, but this time the government set up a system of customs houses, staffed by British officers, and even established a vice-admiralty court. The court sat at Halifax, N.S., and heard very few cases, but in principle it appeared to threaten the cherished British privilege of trials by local juries. Boston further objected to the tax's revenue-raising aspect on constitutional grounds, but, despite some expressions of anxiety, the colonies in general acquiesced.

      Parliament next affected colonial economic prospects by passing a Currency Act (1764) to withdraw paper currencies, many of them surviving from the war period, from circulation. This was not done to restrict economic growth so much as to take out currency that was thought to be unsound, but it did severely reduce the circulating medium during the difficult postwar period and further indicated that such matters were subject to British control.

      Grenville's next move was a stamp duty, to be raised on a wide variety of transactions, including legal writs, newspaper advertisements, and ships' bills of lading. The colonies were duly consulted and offered no alternative suggestions. The feeling in London, shared by Benjamin Franklin, was that, after making formal objections, the colonies would accept the new taxes as they had the earlier ones. But the Stamp Act (1765) hit harder and deeper than any previous parliamentary measure. As some agents had already pointed out, because of postwar economic difficulties the colonies were short of ready funds. (In Virginia this shortage was so serious that the province's treasurer, John Robinson, who was also speaker of the assembly, manipulated and redistributed paper money that had been officially withdrawn from circulation by the Currency Act; a large proportion of the landed gentry benefited from this largesse.) The Stamp Act struck at vital points of colonial economic operations, affecting transactions in trade. It also affected many of the most articulate and influential people in the colonies (lawyers, journalists, bankers). It was, moreover, the first “internal” tax levied directly on the colonies by Parliament. Previous colonial taxes had been levied by local authorities or had been “external” import duties whose primary aim could be viewed as regulating trade for the benefit of the empire as a whole rather than raising revenue. Yet no one, either in Britain or in the colonies, fully anticipated the uproar that followed the imposition of these duties. Mobs in Boston and other towns rioted and forced appointed stamp distributors to renounce their posts; legal business was largely halted. 
Several colonies sent delegations to a Congress in New York in the summer of 1765, where the Stamp Act was denounced as a violation of the Englishman's right to be taxed only through elected representatives, and plans were adopted to impose a nonimportation embargo on British goods.

      A change of ministry facilitated a change of British policy on taxation. Parliamentary opinion was angered by what it perceived as colonial lawlessness, but British merchants were worried about the embargo on British imports. The marquess of Rockingham, succeeding Grenville, was persuaded to repeal the Stamp Act—for domestic reasons rather than out of any sympathy with colonial protests—and in 1766 the repeal was passed. On the same day, however, Parliament also passed the Declaratory Act, which declared that Parliament had the power to bind or legislate the colonies “in all cases whatsoever.” Parliament would not have voted the repeal without this assertion of its authority.

 The colonists, jubilant at the repeal of the Stamp Act, drank innumerable toasts, sounded peals of cannon, and were prepared to ignore the Declaratory Act as face-saving window dressing. John Adams, however, warned in his Dissertation on the Canon and Feudal Law that Parliament, armed with this view of its powers, would try to tax the colonies again; and this happened in 1767 when Charles Townshend became chancellor of the Exchequer in a ministry formed by Pitt, now earl of Chatham. The problem was that Britain's financial burden had not been lifted. Townshend, claiming to take literally the colonial distinction between external and internal taxes, imposed external duties on a wide range of necessities, including lead, glass, paint, paper, and tea, the principal domestic beverage. One ominous result was that colonists now began to believe that the British were developing a long-term plan to reduce the colonies to a subservient position, which they were soon calling “slavery.” This view was ill-informed, however. Grenville's measures had been designed as a carefully considered package; apart from some tidying-up legislation, Grenville had had no further plans for the colonies after the Stamp Act. His successors developed further measures, not as extensions of an original plan but because the Stamp Act had been repealed.

 Nevertheless, the colonists were outraged. In Pennsylvania the lawyer and legislator John Dickinson wrote a series of essays that, appearing in 1767 and 1768 as Letters from a Farmer in Pennsylvania, were widely reprinted and exerted great influence in forming a united colonial opposition. Dickinson agreed that Parliament had supreme power where the whole empire was concerned, but he denied that it had power over internal colonial affairs; he quietly implied that the basis of colonial loyalty lay in its utility among equals rather than in obedience owed to a superior.

      It proved easier to unite on opinion than on action. Gradually, after much maneuvering and negotiation, a wide-ranging nonimportation policy against British goods was brought into operation. Agreement had not been easy to reach, and the tensions sometimes broke out in acrimonious charges of noncooperation. In addition, the policy had to be enforced by newly created local committees, a process that put a new disciplinary power in the hands of local men who had not had much previous experience in public affairs. There were, as a result, many signs of discontent with the ordering of domestic affairs in some of the colonies—a development that had obvious implications for the future of colonial politics if more action was needed later.

Constitutional differences with Britain
 Very few colonists wanted or even envisaged independence at this stage. (Dickinson had hinted at such a possibility with expressions of pain that were obviously sincere.) The colonial struggle for power, although charged with intense feeling, was not an attempt to change government structure but an argument over legal interpretation. The core of the colonial case was that, as British subjects, they were entitled to the same privileges as their fellow subjects in Britain. They could not constitutionally be taxed without their own consent; and, because they were unrepresented in the Parliament that voted the taxes, they had not given this consent. James Otis, in two long pamphlets, ceded all sovereign power to Parliament with this proviso. Others, however, began to question whether Parliament did have lawful power to legislate over the colonies. These doubts were expressed by the late 1760s, when James Wilson, a Scottish immigrant lawyer living in Philadelphia, wrote an essay on the subject. Because of the withdrawal of the Townshend round of duties in 1770, Wilson kept this essay private until new troubles arose in 1774, when he published it as Considerations on the Nature and Extent of the Legislative Authority of the British Parliament. In this he fully articulated a view that had been gathering force in the colonies (it was also the opinion of Franklin) that Parliament's lawful sovereignty stopped at the shores of Britain.

      The official British reply to the colonial case on representation was that the colonies were “virtually” represented in Parliament in the same sense that the large voteless majority of the British public was represented by those who did vote. To this Otis snorted that, if the majority of the British people did not have the vote, they ought to have it. The idea of colonial members of Parliament, several times suggested, was never a likely solution because of problems of time and distance and because, from the colonists' point of view, colonial members would not have adequate influence.

      The standpoints of the two sides to the controversy could be traced in the language used. The principle of parliamentary sovereignty was expressed in the language of paternalistic authority; the British referred to themselves as parents and to the colonists as children. Colonial Tories, who accepted Parliament's case in the interests of social stability, also used this terminology. From this point of view, colonial insubordination was “unnatural,” just as the revolt of children against parents was unnatural. The colonists replied to all this in the language of rights. They held that Parliament could do nothing in the colonies that it could not do in Britain because the Americans were protected by all the common-law rights of the British. (When the First Continental Congress met in September 1774, one of its first acts was to affirm that the colonies were entitled to the common law of England.)

      Rights, as Richard Bland of Virginia insisted in The Colonel Dismounted (as early as 1764), implied equality. And here he touched on the underlying source of colonial grievance. Americans were being treated as unequals, which they not only resented but also feared would lead to a loss of control of their own affairs. Colonists perceived legal inequality when writs of assistance—essentially, general search warrants—were authorized in Boston in 1761 while closely related “general warrants” were outlawed in two celebrated cases in Britain. Townshend specifically legalized writs of assistance in the colonies in 1767. Dickinson devoted one of his Letters from a Farmer to this issue.

  When Lord North became prime minister early in 1770, George III had at last found a minister who could work both with himself and with Parliament. British government began to acquire some stability. In 1770, in the face of the American policy of nonimportation, the Townshend tariffs were withdrawn—all except the tax on tea, which was kept for symbolic reasons. Relative calm returned, though it was ruffled on the New England coastline by frequent incidents of defiance of customs officers, who could get no support from local juries. These outbreaks did not win much sympathy from other colonies, but they were serious enough to call for an increase in the number of British regular forces stationed in Boston. One of the most violent clashes occurred in Boston just before the repeal of the Townshend duties. Threatened by mob harassment, a small British detachment opened fire and killed five people, an incident soon known as the Boston Massacre. The soldiers were charged with murder and were given a civilian trial, in which John Adams conducted a successful defense.

      The other serious quarrel with British authority occurred in New York, where the assembly refused to accept all the British demands for quartering troops. Before a compromise was reached, Parliament had threatened to suspend the assembly. The episode was ominous because it indicated that Parliament was taking the Declaratory Act at its word; on no previous occasion had the British legislature intervened in the operation of the constitution in an American colony. (Such interventions, which were rare, had come from the crown.)

      British intervention in colonial economic affairs occurred again when in 1773 Lord North's administration tried to rescue the East India Company from difficulties that had nothing to do with America. The Tea Act gave the company, which produced tea in India, a monopoly of distribution in the colonies. The company planned to sell its tea through its own agents, eliminating the system of sale by auction to independent merchants. By thus cutting the costs of middlemen, it hoped to undersell the widely purchased inferior smuggled tea. This plan naturally affected colonial merchants, and many colonists denounced the act as a plot to induce Americans to buy—and therefore pay the tax on—legally imported tea. Boston was not the only port to threaten to reject the casks of taxed tea, but its reply was the most dramatic—and provocative.

 On Dec. 16, 1773, a party of Bostonians, thinly disguised as Mohawk Indians, boarded the ships at anchor and dumped some £10,000 worth of tea into the harbour, an event popularly known as the Boston Tea Party. British opinion was outraged, and America's friends in Parliament were immobilized. (American merchants in other cities were also disturbed. Property was property.) In the spring of 1774, with hardly any opposition, Parliament passed a series of measures, known as the Intolerable Acts, designed to reduce Massachusetts to order and imperial discipline. The port of Boston was closed, and, in the Massachusetts Government Act, Parliament for the first time actually altered a colonial charter, substituting an appointive council for the elective one established in 1691 and conferring extensive powers on the governor and council. The famous town meeting, a forum for radical thinkers, was outlawed as a political body. To make matters worse, Parliament also passed the Quebec Act for the government of Canada. To the horror of pious New England Calvinists, the Roman Catholic religion was recognized for the French inhabitants. In addition, Upper Canada (i.e., the southern section) was joined to the Mississippi valley for purposes of administration, permanently blocking the prospect of American control of western settlement.

 There was widespread agreement that this intervention in colonial government could threaten other provinces and could be countered only by collective action. After much intercolonial correspondence, a Continental Congress came into existence, meeting in Philadelphia in September 1774. Every colonial assembly except that of Georgia appointed and sent a delegation. The Virginia delegation's instructions were drafted by Thomas Jefferson and were later published as A Summary View of the Rights of British America (1774). Jefferson insisted on the autonomy of colonial legislative power and set forth a highly individualistic view of the basis of American rights. This belief that the American colonies and other members of the British Empire were distinct states united under the king and thus subject only to the king and not to Parliament was shared by several other delegates, notably James Wilson and John Adams, and strongly influenced the Congress.

      The Congress's first important decision was one on procedure: whether to vote by colony, each having one vote, or by wealth calculated on a ratio with population. The decision to vote by colony was made on practical grounds—neither wealth nor population could be satisfactorily ascertained—but it had important consequences. Individual colonies, no matter what their size, retained a degree of autonomy that translated immediately into the language and prerogatives of sovereignty. Under Massachusetts's influence, the Congress next adopted the Suffolk Resolves, recently voted in Suffolk county, Mass., which for the first time put natural rights into the official colonial argument (hitherto all remonstrances had been based on common law and constitutional rights). Apart from this, however, the prevailing mood was cautious.

      The Congress's aim was to put such pressure on the British government that it would redress all colonial grievances and restore the harmony that had once prevailed. The Congress thus adopted an Association that committed the colonies to a carefully phased plan of economic pressure, beginning with nonimportation, moving to nonconsumption, and finishing the following September (after the rice harvest had been exported) with nonexportation. A few New England and Virginia delegates were looking toward independence, but the majority went home hoping that these steps, together with new appeals to the king and to the British people, would avert the need for any further such meetings. If these measures failed, however, a second Congress would convene the following spring.

      Behind the unity achieved by the Congress lay deep divisions in colonial society. In the mid-1760s upriver New York was disrupted by land riots, which also broke out in parts of New Jersey; much worse disorder ravaged the backcountry of both North and South Carolina, where frontier people were left unprotected by legislatures that taxed them but in which they felt themselves unrepresented. A pitched battle at Alamance Creek in North Carolina in 1771 ended that rising, known as the Regulator Insurrection, and was followed by executions for treason. Although without such serious disorder, the cities also revealed acute social tensions and resentments of inequalities of economic opportunity and visible status. New York provincial politics were riven by intense rivalry between two great family-based factions, the DeLanceys, who benefited from royal government connections, and their rivals, the Livingstons. (The politics of the quarrel with Britain affected the domestic standing of these groups and eventually eclipsed the DeLanceys.) Another phenomenon was the rapid rise of dissenting religious sects, notably the Baptists; although they carried no political program, their style of preaching suggested a strong undercurrent of social as well as religious dissent. There was no inherent unity to these disturbances, but many leaders of colonial society were reluctant to ally themselves with these disruptive elements even in protest against Britain. They were concerned about the domestic consequences of letting the protests take a revolutionary turn; power shared with these elements might never be recovered.

 When British Gen. Thomas Gage sent a force from Boston to destroy American rebel military stores at Concord, Mass., fighting broke out between militia and British troops at Lexington and Concord on April 19, 1775. Reports of these clashes reached the Second Continental Congress, which met in Philadelphia in May. Although most colonial leaders still hoped for reconciliation with Britain, the news stirred the delegates to more radical action. Steps were taken to put the continent on a war footing. While a further appeal was addressed to the British people (mainly at Dickinson's insistence), the Congress raised an army, adopted a Declaration of the Causes and Necessity of Taking Up Arms, and appointed committees to deal with domestic supply and foreign affairs. In August 1775 the king declared a state of rebellion; by the end of the year, all colonial trade had been banned. Even yet, Gen. George Washington, commander of the Continental Army, still referred to the British troops as “ministerial” forces, indicating a civil war, not a war looking to separate national identity.

 Then in January 1776 the publication of Thomas Paine's irreverent pamphlet Common Sense abruptly shattered this hopeful complacency and put independence on the agenda. Paine's eloquent, direct language spoke people's unspoken thoughts; no pamphlet had ever made such an impact on colonial opinion. While the Congress negotiated urgently, but secretly, for a French alliance, power struggles erupted in provinces where conservatives still hoped for relief. The only form relief could take, however, was British concessions; as public opinion hardened in Britain, where a general election in November 1774 had returned a strong majority for Lord North, the hope for reconciliation faded. In the face of British intransigence, men committed to their definition of colonial rights were left with no alternative, and the substantial portion of colonists—about one-third according to John Adams, although contemporary historians believe the number to have been much smaller—who preferred loyalty to the crown, with all its disadvantages, were localized and outflanked. Where the British armies massed, they found plenty of loyalist support, but, when they moved on, they left the loyalists feeble and exposed.

      The most dramatic internal revolution occurred in Pennsylvania, where a strong radical party, based mainly in Philadelphia but with allies in the country, seized power in the course of the controversy over independence itself. Opinion for independence swept the colonies in the spring of 1776. The Congress recommended that colonies form their own governments and assigned a committee to draft a Declaration of Independence.

   This document, written by Thomas Jefferson but revised in committee, consisted of two parts. The preamble set the claims of the United States on a basis of natural rights, with a dedication to the principle of equality; the second was a long list of grievances against the crown—not Parliament now, since the argument was that Parliament had no lawful power in the colonies. On July 2 the Congress itself voted for independence; on July 4 it adopted the Declaration of Independence. (See also Founding Fathers.)

J.R. Pole

The American Revolutionary War
 The American Revolutionary War thus began as a civil conflict within the British Empire over colonial affairs, but, with France joining America in 1778, Spain in 1779, and the Netherlands in 1780, it became an international war. On land the Americans assembled both state militias and the Continental (national) Army, with approximately 20,000 men, mostly farmers, fighting at any given time. By contrast, the British army was composed of reliable and well-trained professionals, numbering about 42,000 regulars, supplemented by about 30,000 German (Hessian) mercenaries.

 After the fighting at Lexington and Concord that began the war, rebel forces began a siege of Boston that ended when the American Gen. Henry Knox arrived with artillery captured from Fort Ticonderoga, forcing Gen. William Howe, Gage's replacement, to evacuate Boston on March 17, 1776. An American force under Gen. Richard Montgomery invaded Canada in the fall of 1775, captured Montreal, and launched an unsuccessful attack on Quebec, in which Montgomery was killed. The Americans maintained a siege on the city until the arrival of British reinforcements in the spring and then retreated to Fort Ticonderoga.

      The British government sent Howe's brother, Richard, Adm. Lord Howe, with a large fleet to join his brother in New York, authorizing them to treat with the Americans and assure them pardon should they submit. When the Americans refused this offer of peace, General Howe landed on Long Island and on August 27 defeated the army led by Washington, who retreated into Manhattan. Howe drew him north, defeated his army at Chatterton Hill near White Plains on October 28, and then stormed the garrison Washington had left behind on Manhattan, seizing prisoners and supplies. Lord Charles Cornwallis, having taken Washington's other garrison at Fort Lee, drove the American army across New Jersey to the western bank of the Delaware River and then quartered his troops for the winter at outposts in New Jersey. On Christmas night Washington stealthily crossed the Delaware and attacked Cornwallis's garrison at Trenton, taking nearly 1,000 prisoners. Though Cornwallis soon recaptured Trenton, Washington escaped and went on to defeat British reinforcements at Princeton. Washington's Trenton-Princeton campaign roused the new country and kept the struggle for independence alive.

 In 1777 a British army under Gen. John Burgoyne moved south from Canada with Albany, N.Y., as its goal. Burgoyne captured Fort Ticonderoga on July 5, but, as he approached Albany, he was twice defeated by an American force led by Generals Horatio Gates and Benedict Arnold, and on Oct. 17, 1777, at Saratoga, he was forced to surrender his army. Earlier that fall Howe had sailed from New York to Chesapeake Bay, and once ashore he had defeated Washington's forces at Brandywine Creek on September 11 and occupied the American capital of Philadelphia on September 25.

  After an unsuccessful attack at Germantown, Pa., on October 4, Washington quartered his 11,000 troops for the winter at Valley Forge, Pa. Though the conditions at Valley Forge were bleak and food was scarce, a Prussian officer, Baron Friedrich Wilhelm von Steuben, was able to give the American troops valuable training in maneuvers and in the more efficient use of their weapons. Von Steuben's aid contributed greatly to Washington's success at Monmouth (now Freehold), N.J., on June 28, 1778. After that battle British forces in the north remained chiefly in and around the city of New York.

      While the French had been secretly furnishing financial and material aid to the Americans since 1776, in 1778 they began to prepare fleets and armies and in June finally declared war on Britain. With action in the north largely a stalemate, their primary contribution was in the south, where they participated in such undertakings as the siege of British-held Savannah and the decisive siege of Yorktown. Cornwallis destroyed an army under Gates at Camden, S.C., on Aug. 16, 1780, but suffered heavy setbacks at Kings Mountain, S.C., on October 7 and at Cowpens, S.C., on Jan. 17, 1781. After Cornwallis won a costly victory at Guilford Courthouse, N.C., on March 15, 1781, he entered Virginia to join other British forces there, setting up a base at Yorktown. Washington's army and a force under the French Count de Rochambeau placed Yorktown under siege, and Cornwallis surrendered his army of more than 7,000 men on Oct. 19, 1781.

      Thereafter, land action in America died out, though war continued on the high seas. Although a Continental Navy was created in 1775, the American sea effort lapsed largely into privateering, and after 1780 the war at sea was fought chiefly between Britain and America's European allies. Still, American privateers swarmed around the British Isles, and by the end of the war they had captured 1,500 British merchant ships and 12,000 sailors. After 1780 Spain and the Netherlands were able to control much of the water around the British Isles, thus keeping the bulk of British naval forces tied down in Europe.

      The military verdict in North America was reflected in the preliminary Anglo-American peace treaty of 1782, which was included in the Treaty of Paris of 1783. Franklin, John Adams, John Jay, and Henry Laurens served as the American commissioners. By its terms Britain recognized the independence of the United States with generous boundaries, including the Mississippi River on the west. Britain retained Canada but ceded East and West Florida to Spain. Provisions were inserted calling for the payment of American private debts to British citizens, for American access to the Newfoundland fisheries, and for a recommendation by the Continental Congress to the states in favour of fair treatment of the loyalists.

 Most of the loyalists remained in the new country; however, perhaps as many as 80,000 Tories migrated to Canada, England, and the British West Indies. Many of these had served as British soldiers, and many had been banished by the American states. The loyalists were harshly treated as dangerous enemies by the American states during the war and immediately afterward. They were commonly deprived of civil rights, often fined, and frequently relieved of their property. The more conspicuous were usually banished upon pain of death. The British government compensated more than 4,000 of the exiles for property losses, paying out almost £3.3 million. It also gave them land grants, pensions, and appointments to enable them to reestablish themselves. The less ardent and more cautious Tories, staying in the United States, accepted the separation from Britain as final and, after the passage of a generation, could not be distinguished from the patriots.

Foundations of the American republic
      It had been far from certain that the Americans could fight a successful war against the might of Britain. The scattered colonies had little inherent unity; their experience of collective action was limited; an army had to be created and maintained; they had no common institutions other than the Continental Congress; and they had almost no experience of continental public finance. The Americans could not have hoped to win the war without French help, and the French monarchy—whose interests were anti-British but not pro-American—had waited watchfully to see what the Americans could do in the field. Although the French began supplying arms, clothing, and loans surreptitiously soon after the Americans declared independence, it was not until 1778 that a formal alliance was forged.

      Most of these problems lasted beyond the achievement of independence and continued to vex American politics for many years, even for generations. Meanwhile, however, the colonies had valuable, though less visible, sources of strength. Practically all farmers had their own arms and could form into militia companies overnight. More fundamentally, Americans had for many years been receiving basically the same information, mainly from the English press, reprinted in identical form in colonial newspapers. The effect of this was to form a singularly wide body of agreed opinion about major public issues. Another force of incalculable importance was the fact that for several generations Americans had to a large extent been governing themselves through elected assemblies, which in turn had developed sophisticated experience in committee politics.

      This factor of “institutional memory” was of great importance in the forming of a mentality of self-government. Men became attached to their habitual ways, especially when these were habitual ways of running their own affairs, and these habits formed the basis of an ideology just as pervasive and important to the people concerned as republican theories published in Britain and the European continent. Moreover, colonial self-government seemed, from a colonial point of view, to be continuous and consistent with the principles of English government—principles for which Parliament had fought the English Civil Wars in the mid-17th century and which colonists believed to have been reestablished by the Glorious Revolution of 1688–89. It was equally important that experience of self-government had taught colonial leaders how to get things done. When the Continental Congress met in 1774, members did not have to debate procedure (except on voting); they already knew it. Finally, the Congress's authority was rooted in traditions of legitimacy. The old election laws were used. Voters could transfer their allegiance with minimal difficulty from the dying colonial assemblies to the new assemblies and conventions of the states.

Problems before the Second Continental Congress
      When the Second Continental Congress assembled in Philadelphia in May 1775, revolution was not a certainty. The Congress had to prepare for that contingency nevertheless and thus was confronted by two parallel sets of problems. The first was how to organize for war; the second, which proved less urgent but could not be set aside forever, was how to define the legal relationship between the Congress and the states.

 In June 1775, in addition to appointing Washington (who had made a point of turning up in uniform) commander in chief, the Congress provided for the enlistment of an army. It then turned to the vexatious problems of finance. An aversion to taxation being one of the unities of American sentiment, the Congress began by trying to raise a domestic loan. It did not have much success, however, for the excellent reason that the outcome of the operation appeared highly dubious. At the same time, authority was taken for issuing a paper currency. This proved to be the most important method of domestic war finance, and, as the war years passed, Congress resorted to issuing more and more Continental currency, which depreciated rapidly and had to compete with currencies issued by state governments. (People were inclined to prefer local currencies.) The Continental Army was a further source of a form of currency because its commission agents issued certificates in exchange for goods; these certificates bore an official promise of redemption and could be used in personal transactions. Loans raised overseas, notably in France and the Netherlands, were another important source of revenue.

      In 1780 Congress decided to call in all former issues of currency and replace them with a new issue on a 40-to-1 ratio. The Philadelphia merchant Robert Morris, who was appointed superintendent of finance in 1781 and came to be known as “the Financier,” guided the United States through its complex fiscal difficulties. Morris's personal finances were inextricably tangled up with those of the country, and he became the object of much hostile comment, but he also used his own resources to secure urgently needed loans from abroad. In 1781 Morris secured a charter for the first Bank of North America, an institution that owed much to the example of the Bank of England. Although the bank was attacked by radical egalitarians as an unrepublican manifestation of privilege, it gave the United States a firmer financial foundation.

      The problem of financing and organizing the war sometimes overlapped with Congress's other major problem, that of defining its relations with the states. The Congress, being only an association of states, had no power to tax individuals. The Articles of Confederation, a plan of government organization adopted and put into practice by Congress in 1777, although not officially ratified by all the states until 1781, gave Congress the right to make requisitions on the states proportionate to their ability to pay. The states in turn had to raise these sums by their own domestic powers to tax, a method that state legislators looking for reelection were reluctant to employ. The result was that many states were constantly in heavy arrears, and, particularly after the urgency of the war years had subsided, the Congress's ability to meet expenses and repay its war debts was crippled.

      The Congress lacked power to enforce its requisitions and fell badly behind in repaying its wartime creditors. When individual states (Maryland as early as 1782, Pennsylvania in 1785) passed legislation providing for repayment of the debt owed to their own citizens by the Continental Congress, one of the reasons for the Congress's existence had begun to crumble. Two attempts were made to get the states to agree to grant the Congress the power it needed to raise revenue by levying an impost on imports. Each failed for want of unanimous consent. Essentially, an impost would have been collected at ports, which belonged to individual states—there was no “national” territory—and therefore cut across the concept of state sovereignty. Agreement was nearly obtained on each occasion, and, if it had been, the Constitutional Convention might never have been called. But the failure sharply pointed up the weakness of the Congress and of the union between the states under the Articles of Confederation.

      The Articles of Confederation reflected strong preconceptions of state sovereignty. Article II expressly reserved sovereignty to the states individually, and another article even envisaged the possibility that one state might go to war without the others. Fundamental revisions could be made only with unanimous consent, because the Articles represented a treaty between sovereigns, not the creation of a new nation-state. Other major revisions required the consent of nine states. Yet state sovereignty principles rested on artificial foundations. The states could never have achieved independence on their own, and in fact the Congress had taken the first step both in recommending that the states form their own governments and in declaring their collective independence. Most important among its domestic achievements, by 1787 the Congress had enacted several ordinances establishing procedures for incorporating new territories. (It had been conflicts over western land claims that had held up ratification of the Articles. Eventually the states with western claims, principally New York and Virginia, ceded them to the United States.) The Northwest Ordinance of 1787 provided for the phased settlement and government of territories in the Ohio valley, leading to eventual admission as new states. It also excluded the introduction of slavery—though it did not exclude the retention of existing slaves.

      The states had constantly looked to the Congress for leadership in the difficulties of war; now that the danger was past, however, disunity began to threaten to turn into disintegration. The Congress was largely discredited in the eyes of a wide range of influential men, representing both old and new interests. The states were setting up their own tariff barriers against each other and quarreling among themselves; virtual war had broken out between competing settlers from Pennsylvania and Connecticut claiming the same lands. By 1786, well-informed men were discussing a probable breakup of the confederation into three or more new groups, which could have led to wars between the American republics.

State politics
      The problems of forming a new government affected the states individually as well as in confederation. Most of them established their own constitutions—formulated either in conventions or in the existing assemblies. The most democratic of these constitutions was the product of a virtual revolution in Pennsylvania, where a highly organized radical party seized the opportunity of the revolutionary crisis to gain power. Suffrage was put on a taxpayer basis, with nearly all adult males paying some tax; representation was reformed to bring in the populations of western counties; and a single-chamber legislature was established. An oath of loyalty to the constitution for some time excluded political opponents and particularly Quakers (who could not take oaths) from participation. The constitutions of the other states reflected the firm political ascendancy of the traditional ruling elite. Power ascended from a broad base in the elective franchise and representation through a narrowing hierarchy of offices restricted by property qualifications. State governors had in some cases to be men of great wealth. Senators were either wealthy or elected by the wealthy sector of the electorate. (These conditions were not invariable; Virginia, which had a powerful landed elite, dispensed with such restrictions.) Several states retained religious qualifications for office; the separation of church and state was not a popular concept, and minorities such as Baptists and Quakers were subjected to indignities that amounted in some places (notably Massachusetts and Connecticut) to forms of persecution.

      Elite power provided a lever for one of the most significant transformations of the era, one that took place almost without being either noticed or intended. This was the acceptance of the principle of giving representation in legislative bodies in proportion to population. It was made not only possible but attractive when the larger aggregations of population broadly coincided with the highest concentrations of property: great merchants and landowners from populous areas could continue to exert political ascendancy so long as they retained some sort of hold on the political process. The principle reemerged to dominate the distribution of voters in the House of Representatives and in the electoral college under the new federal Constitution.

      Relatively conservative constitutions did little to stem a tide of increasingly democratic politics. The old elites had to wrestle with new political forces (and in the process they learned how to organize in the new regime). Executive power was weakened. Many elections were held annually, and terms were limited. Legislatures quickly admitted new representatives from recent settlements, many with little previous political experience.

      The new state governments, moreover, had to tackle major issues that affected all classes. The needs of public finance led to emissions of paper money. In several states these were resumed after the war, and, since they tended (though not invariably) to depreciate, they led directly to fierce controversies. The treatment of loyalists was also a theme of intense political dispute after the war. Despite the protests of men such as Alexander Hamilton, who urged restoration of property and rights, in many states loyalists were driven out and their estates seized and redistributed at auction, providing opportunities for speculation rather than personal occupation. Many states were depressed economically. In Massachusetts, which remained under orthodox control, stiff taxation under conditions of postwar depression trapped many farmers into debt. Unable to meet their obligations, they rose late in 1786 under a Revolutionary War officer, Capt. Daniel Shays, in a movement to prevent the courts from sitting. Shays's Rebellion was crushed early in 1787 by an army raised in the state. The action caused only a few casualties, but the episode sent a shiver of fear throughout the country's propertied classes. It also seemed to justify the classical thesis that republics were unstable. It thus provided a potent stimulus to state legislatures to send delegates to the convention called (following a preliminary meeting in Annapolis) to meet at Philadelphia to revise the Articles of Confederation.

      The Philadelphia Convention, which met in May 1787, was officially called for by the old Congress solely to remedy defects in the Articles of Confederation. But the Virginia Plan presented by the Virginia delegates went beyond revision and boldly proposed to introduce a new, national government in place of the existing confederation. The convention thus immediately faced the question of whether the United States was to be a country in the modern sense or would continue as a weak federation of autonomous and equal states represented in a single chamber, which was the principle embodied in the New Jersey Plan presented by several small states. This decision was effectively made when a compromise plan for a bicameral legislature—one house with representation based on population and one with equal representation for all states—was approved in mid-July. Though neither plan prevailed, the new national government in its final form was endowed with broad powers that made it indisputably national and superior.

      The Constitution, as it emerged after a summer of debate, embodied a much stronger principle of separation of powers than was generally to be found in the state constitutions. The chief executive was to be a single figure (a composite executive was discussed and rejected) and was to be elected by an electoral college, meeting in the states. This followed much debate over the Virginia Plan's preference for legislative election. The principal control on the chief executive, or president, against violation of the Constitution was the rather remote threat of impeachment (to which James Madison attached great importance). The Virginia Plan's proposal that representation be proportional to population in both houses was severely modified by the retention of equal representation for each state in the Senate. But the question of whether to count slaves in the population was abrasive. After some contention, antislavery forces gave way to a compromise by which three-fifths of the slaves would be counted as population for purposes of representation (and direct taxation). Slave states would thus be perpetually overrepresented in national politics; provision was also added for a law permitting the recapture of fugitive slaves, though in deference to republican scruples the word slaves was not used. (See also Sidebar: The Founding Fathers and Slavery.)

      Contemporary theory expected the legislature to be the most powerful branch of government. Thus, to balance the system, the executive was given a veto, and a judicial system with powers of review was established. It was also implicit in the structure that the new federal judiciary would have power to veto any state laws that conflicted either with the Constitution or with federal statutes. States were forbidden to pass laws impairing obligations of contract—a measure aimed at encouraging capital—and the Congress could pass no ex post facto law. But the Congress was endowed with the basic powers of a modern—and sovereign—government. This was a republic, and the United States could confer no aristocratic titles of honour. The prospect of eventual enlargement of federal power appeared in the clause giving the Congress powers to pass legislation “necessary and proper” for implementing the general purposes of the Constitution.

 The states retained their civil jurisdiction, but there was an emphatic shift of the political centre of gravity to the federal government, of which the most fundamental indication was the universal understanding that this government would act directly on citizens, as individuals, throughout all the states, regardless of state authority. The language of the Constitution told of the new style: it began, “We the people of the United States,” rather than “We the people of New Hampshire, Massachusetts, etc.”

      The draft Constitution aroused widespread opposition. Anti-Federalists—so called because their opponents deftly seized the appellation of “Federalists,” though they were really nationalists—were strong in states such as Virginia, New York, and Massachusetts, where the economy was relatively successful and many people saw little need for such extreme remedies. Anti-Federalists also expressed fears—here touches of class conflict certainly arose—that the new government would fall into the hands of merchants and men of money. Many good republicans detected oligarchy in the structure of the Senate, with its six-year terms. The absence of a bill of rights aroused deep fears of central power. The Federalists, however, had the advantages of communications, the press, organization, and, generally, the better of the argument. Anti-Federalists also suffered the disadvantage of having no internal coherence or unified purpose.

      The debate gave rise to an intensive literature, much of it at a very high level. The most sustained pro-Federalist argument, written mainly by Hamilton and Madison (assisted by Jay) under the pseudonym Publius, appeared in the newspapers as The Federalist. These essays attacked the feebleness of the confederation and claimed that the new Constitution would have advantages for all sectors of society while threatening none. In the course of the debate, they passed from a strongly nationalist standpoint to one that showed more respect for the idea of a mixed form of government that would safeguard the states. Madison contributed assurances that a multiplicity of interests would counteract each other, preventing the consolidation of power continually charged by their enemies.

      The Bill of Rights, steered through the first Congress by Madison's diplomacy, mollified much of the latent opposition. These first 10 amendments, ratified in 1791, adopted into the Constitution the basic English common-law rights that Americans had fought for. But they did more. Unlike Britain, the United States secured a guarantee of freedom for the press and the right of (peaceable) assembly. Also unlike Britain, church and state were formally separated in a clause that seemed to set equal value on nonestablishment of religion and its free exercise. (This left the states free to maintain their own establishments.)

      In state conventions held through the winter of 1787 to the summer of 1788, the Constitution was ratified by the necessary minimum of nine states. But the vote was desperately close in Virginia and New York, respectively the 10th and 11th states to ratify, and without them the whole scheme would have been built on sand.

The social revolution
      The American Revolution was a great social upheaval but one that was widely diffused, often gradual, and different in different regions. The principles of liberty and equality stood in stark conflict with the institution of African slavery, which had built much of the country's wealth. One gradual effect of this conflict was the decline of slavery in all the Northern states; another was a spate of manumissions by liberal slave owners in Virginia. But with most slave owners, especially in South Carolina and Georgia, ideals counted for nothing. Throughout the slave states, the institution of slavery came to be reinforced by a white supremacist doctrine of racial inferiority. The manumissions did result in the development of new communities of free blacks, who enjoyed considerable freedom of movement for a few years and who produced some outstanding figures, such as the astronomer Benjamin Banneker and the religious leader Richard Allen, a founder of the African Methodist Episcopal Church. But in the 1790s and after, the condition of free blacks deteriorated as states adopted laws restricting their activities, residences, and economic choices. In general they came to occupy poor neighbourhoods and grew into a permanent underclass, denied education and opportunity.

      The American Revolution also dramatized the economic importance of women. Women had always contributed indispensably to the operation of farms and often businesses, while they seldom acquired independent status; but, when war removed men from the locality, women often had to take full charge, which they proved they could do. Republican ideas spread among women, influencing discussion of women's rights, education, and role in society. Some states modified their inheritance and property laws to permit women to inherit a share of estates and to exercise limited control of property after marriage. On the whole, however, the Revolution itself had only very gradual and diffused effects on women's ultimate status. Such changes as took place amounted to a fuller recognition of the importance of women as mothers of republican citizens rather than making them into independent citizens of equal political and civil status with men.

Willard M. Wallace
      Americans had fought for independence to protect common-law rights; they had no program for legal reform. Gradually, however, some customary practices came to seem out of keeping with republican principles. The outstanding example was the law of inheritance. The new states took steps, where necessary, to remove the old rule of primogeniture in favour of equal partition of intestate estates; this conformed to both the egalitarian and the individualist principles preferred by American society. Humanization of the penal codes, however, occurred only gradually, in the 19th century, inspired as much by European example as by American sentiment.

Religious revivalism
      Religion played a central role in the emergence of a distinctively “American” society in the first years of independence. Several key developments took place. One was the creation of American denominations independent of their British and European origins and leadership. By 1789 American Anglicans (renaming themselves Episcopalians), Methodists (formerly Wesleyans), Roman Catholics, and members of various Baptist, Lutheran, and Dutch Reformed congregations had established organizations and chosen leaders who were born in or full-time residents of what had become the United States of America. Another pivotal postindependence development was a rekindling of religious enthusiasm, especially on the frontier, that opened the gates of religious activism to the laity. Still another was the disestablishment of tax-supported churches in those states most deeply feeling the impact of democratic diversity. And finally, this period saw the birth of a liberal and socially aware version of Christianity uniting Enlightenment values with American activism.

 Between 1798 and 1800 a sudden burst of revitalization shook frontier Protestant congregations, beginning with a great revival in Logan county, Ky., under the leadership of men such as James McGready and the brothers John and William McGee. This was followed by a gigantic camp meeting at Cane Ridge, where thousands were “converted.” The essence of the frontier revival was that this conversion from mere formal Christianity to a full conviction in God's mercy for the sinner was a deeply emotional experience accessible even to those with much faith and little learning. So exhorters who were barely literate themselves could preach brimstone and fire and showers of grace, bringing repentant listeners to a state of excitement in which they would weep and groan, writhe and faint, and undergo physical transports in full public view.

      “Heart religion” supplanted “head religion.” For the largely Scotch-Irish Presbyterian ministers in the West, this led to dangerous territory, because the official church leadership preferred more decorum and biblical scholarship from its pastors. Moreover, the idea of winning salvation by noisy penitence undercut Calvinist predestination. In fact, the fracture along fault lines of class and geography led to several schisms. Methodism had fewer problems of this kind. It never embraced predestination, and, more to the point, its structure was democratic, with rudimentarily educated lay preachers able to rise from leading individual congregations to presiding over districts and regional “conferences,” eventually embracing the entire church membership. Methodism fitted very neatly into frontier conditions through its use of traveling ministers, or circuit riders, who rode from isolated settlement to settlement, saving souls and mightily liberalizing the word of God.

      The revival spirit rolled back eastward to inspire a “Second Great Awakening,” especially in New England, that emphasized gatherings that were less uninhibited than camp meetings but warmer than conventional Congregational and Presbyterian services. Ordained and college-educated ministers such as Lyman Beecher made it their mission to promote revivalism as a counterweight to the Deism of some of the Founding Fathers and the atheism of the French Revolution. (See Sidebar: The Founding Fathers, Deism, and Christianity.) Revivals also gave churches a new grasp on the loyalties of their congregations through lay participation in spreading the good word of salvation. This voluntarism more than offset the gradual state-by-state cancellation of taxpayer support for individual denominations.

      The era of the early republic also saw the growth, especially among the urban educated elite of Boston, of a gentler form of Christianity embodied in Unitarianism, which rested on the notion of an essentially benevolent God who made his will known to humankind through their exercise of the reasoning powers bestowed on them. In the Unitarian view, Jesus Christ was simply a great moral teacher. Many Christians of the “middling” sort viewed Unitarianism as excessively concerned with ideas and social reform and far too indulgent or indifferent to the existence of sin and Satan. By 1815, then, the social structure of American Protestantism, firmly embedded in many activist forms in the national culture, had taken shape.

Bernard A. Weisberger

The United States from 1789 to 1816

The Federalist administration and the formation of parties
      The first elections under the new Constitution were held in 1789. George Washington was unanimously voted the country's first president. His secretary of the treasury, Alexander Hamilton, formed a clear-cut fiscal program that soon gave substance to the old fears of the Anti-Federalists. Hamilton, who had believed since the early 1780s that a national debt would be “a national blessing,” both for economic reasons and because it would act as a “cement” to the union, used his new power base to realize the ambitions of the nationalists. He recommended that the federal government pay off the old Continental Congress's debts at par rather than at a depreciated value and that it assume state debts, drawing the interests of the creditors toward the central government rather than the state governments. This plan met strong opposition from the many who had sold their securities at great discount during the postwar depression and from Southern states, which had repudiated their debts and did not want to be taxed to pay other states' debts. A compromise was reached in Congress—thanks to the efforts of Secretary of State Jefferson—whereby Southern states approved Hamilton's plan in return for Northern agreement to fix the location of the new national capital on the banks of the Potomac, closer to the South. When Hamilton next introduced his plan to found a Bank of the United States, modeled on the Bank of England, opposition began to harden. Many argued that the Constitution did not confide this power to Congress. Hamilton, however, persuaded Washington that anything not expressly forbidden by the Constitution was permitted under implied powers—the beginning of “loose” as opposed to “strict” constructionist interpretations of the Constitution. The Bank Act passed in 1791. Hamilton also advocated plans for the support of nascent industry, which proved premature, and he imposed the revenue-raising whiskey excise that led to the Whiskey Rebellion, a minor uprising in western Pennsylvania in 1794.

      A party opposed to Hamilton's fiscal policies began to form in Congress. With Madison at its centre and with support from Jefferson, it soon extended its appeal beyond Congress to popular constituencies. Meanwhile, the French Revolution and France's subsequent declaration of war against Great Britain, Spain, and Holland further divided American loyalties. Democratic-Republican societies sprang up to express support for France, while Hamilton and his supporters, known as Federalists, backed Britain for economic reasons. Washington pronounced American neutrality in Europe, but to prevent a war with Britain he sent Chief Justice John Jay to London to negotiate a treaty. In the Jay Treaty (1794) the United States gained only minor concessions and—humiliatingly—accepted British naval supremacy as the price of protection for American shipping.

      Washington, whose tolerance had been severely strained by the Whiskey Rebellion and by criticism of the Jay Treaty, chose not to run for a third presidential term. In his Farewell Address, in a passage drafted by Hamilton, he denounced the new party politics as divisive and dangerous. Parties did not yet aspire to national objectives, however, and, when the Federalist John Adams was elected president, the Democratic-Republican Jefferson, as the presidential candidate with the second greatest number of votes, became vice president. Wars in Europe and on the high seas, together with rampant opposition at home, gave the new administration little peace. Virtual naval war with France had followed from American acceptance of British naval protection. In 1798 a French attempt to solicit bribes from American commissioners negotiating a settlement of differences (the so-called XYZ Affair) aroused a wave of anti-French feeling. Later that year the Federalist majority in Congress passed the Alien and Sedition Acts, which imposed serious civil restrictions on aliens suspected of pro-French activities and penalized U.S. citizens who criticized the government, making nonsense of the First Amendment's guarantee of a free press. The acts were most often invoked to prosecute Republican editors, some of whom served jail terms. These measures in turn called forth the Virginia and Kentucky Resolutions, drafted respectively by Madison and Jefferson, which invoked state sovereignty against intolerable federal powers. War with France often seemed imminent during this period, but Adams was determined to avoid issuing a formal declaration of war, and in this he succeeded.

      Taxation, which had been levied to pay anticipated war costs, brought more discontent, however, including a new minor rising in Pennsylvania led by John Fries. Fries's Rebellion was put down without difficulty, but widespread disagreement over issues ranging from civil liberties to taxation was polarizing American politics. A basic sense of political identity now divided Federalists from Republicans, and in the election of 1800 Jefferson drew on deep sources of Anti-Federalist opposition to challenge and defeat his old friend and colleague Adams. The result was the first contest over the presidency between political parties and the first actual change of government as a result of a general election in modern history.

The Jeffersonian Republicans in power
      Jefferson began his presidency with a plea for reconciliation: “We are all Republicans, we are all Federalists.” He had no plans for a permanent two-party system of government. He also began with a strong commitment to limited government and strict construction of the Constitution. All these commitments were soon to be tested by the exigencies of war, diplomacy, and political contingency.

      On the American continent, Jefferson pursued a policy of expansion. He seized the opportunity when Napoleon I decided to relinquish French ambitions in North America by offering the Louisiana territory for sale (Spain had recently ceded the territory to France). This extraordinary acquisition, the Louisiana Purchase, bought at a price of a few cents per acre, more than doubled the area of the United States. Jefferson had no constitutional sanction for such an exercise of executive power; he made up the rules as he went along, taking a broad construction view of the Constitution on this issue. He also sought opportunities to gain Florida from Spain, and, for scientific and political reasons, he sent Meriwether Lewis and William Clark on an expedition of exploration across the continent. This territorial expansion was not without problems. Various separatist movements periodically arose, including a plan for a Northern Confederacy formulated by New England Federalists. Aaron Burr, who had been elected Jefferson's vice president in 1800 but was replaced in 1804, led several western conspiracies. Arrested and tried for treason, he was acquitted in 1807.

 As chief executive, Jefferson clashed with members of the judiciary, many of whom had been late appointments by Adams. One of his primary opponents was Chief Justice John Marshall (Marshall, John), most notably in the case of Marbury v. Madison (1803), in which the Supreme Court (Supreme Court of the United States) first exercised the power of judicial review of congressional legislation.

      By the start of Jefferson's second term in office, Europe was engulfed in the Napoleonic Wars (French revolutionary and Napoleonic wars). The United States remained neutral, but both Britain and France imposed various orders and decrees severely restricting American trade with Europe and confiscated American ships for violating the new rules. Britain also conducted impressment raids in which U.S. citizens were sometimes seized. Unable to agree to treaty terms with Britain, Jefferson tried to coerce both Britain and France into ceasing to violate “neutral rights” with a total embargo on American exports, enacted by Congress in 1807. The results were catastrophic for American commerce and produced bitter alienation in New England, where the embargo (written backward as “O grab me”) was held to be a Southern plot to destroy New England's wealth. In 1809, shortly after Madison (Madison, James) was elected president, the embargo act was repealed.

Madison as president and the War of 1812
  Madison's presidency was dominated by foreign affairs. Both Britain and France committed depredations on American shipping, but Britain was more resented, partly because with the greatest navy it was more effective and partly because Americans were extremely sensitive to British insults to national honour. Certain expansionist elements looking to both Florida and Canada began to press for war and took advantage of the issue of naval protection. Madison's own aim was to preserve the principle of freedom of the seas and to assert the ability of the United States to protect its own interests and its citizens. While striving to confront the European adversaries impartially, he was drawn into war against Britain, which was declared in June 1812 (1812, War of) on a vote of 79–49 in the House and 19–13 in the Senate. There was almost no support for war in the strong Federalist New England states.

   The War of 1812 (1812, War of) began and ended in irony. The British had already rescinded the offending orders in council, but the news had not reached the United States at the time of the declaration. The Americans were poorly placed from every point of view. Ideological objections to armies and navies had been responsible for a minimal naval force. Ideological objections to banks had been responsible, in 1811, for the Senate's refusal to renew the charter of the Bank of the United States. Mercantile sentiment was hostile to the administration. Under the circumstances, it was remarkable that the United States succeeded in staggering through two years of war, eventually winning important naval successes at sea, on the Great Lakes, and on Lake Champlain (Champlain, Lake). On land a British raiding party burned public buildings in Washington, D.C., and drove President Madison to flee from the capital. The only action with long-term implications was Andrew Jackson (Jackson, Andrew)'s victory at the Battle of New Orleans (New Orleans, Battle of)—won in January 1815, two weeks after peace had been achieved with the signing of the Treaty of Ghent (Ghent, Treaty of) (Belg.). Jackson's political reputation rose directly from this battle.

      In historical retrospect, the most important aspect of the peace settlement was an agreement to set up a boundary commission for the Canadian border, which could thenceforth be left unguarded. It was not the end of Anglo-American hostility, but the agreement marked the advent of an era of mutual trust. The conclusion of the War of 1812, which has sometimes been called the Second War of American Independence, closed a historical cycle. It pacified the old feelings of pain and resentment against Great Britain and its people, a relationship that many Americans still felt to be a kind of paternal one. And, by freeing them of anxieties on this front, it also freed Americans to look to the West.

J.R. Pole

The Indian-American problem
 The young United States believed that it had inherited an “Indian problem,” but it would be equally fair to say that the victory at Yorktown confronted the Indians (Native American) with an insoluble “American problem.” Whereas they had earlier dealt with representatives of Europe-based empires seeking only access to selected resources from a distant continent, now they faced a resident, united people yearly swelling in numbers, determined to make every acre of the West their own and culturally convinced of their absolute title under the laws of God and history. There was no room for compromise. Even before 1776, each step toward American independence reduced the Indians' control over their own future. The Proclamation Line of 1763 (1763, Proclamation of) was almost immediately violated by men like Daniel Boone (Boone, Daniel) on the Kentucky frontier. In the western parts of Pennsylvania and New York, however, despite extensive Indian land concessions in the 1768 Treaty of Fort Stanwix (Fort Stanwix, Treaties of), the Indians still had enough power to bar an advance toward the Ohio Valley and the Great Lakes.

      For armed resistance to have had any hope of success, unity would have been required among all the Indians from the Appalachians to the Mississippi. This unity simply could not be achieved. The Shawnee leader Tenskwatawa, known as the Prophet (Prophet, The), and his brother Tecumseh attempted this kind of rallying movement, much as Pontiac had done some 40 years earlier, with an equal lack of success. Some help was forthcoming in the form of arms from British traders remaining in the Northwest Territory in violation of the peace treaty, but the Indians failed to secure victory in a clash with American militia and regulars at the Battle of Tippecanoe (Tippecanoe, Battle of) (near present-day West Lafayette, Ind.) in 1811.

 The outbreak of the War of 1812 (1812, War of) sparked renewed Indian hopes of protection by the crown, should the British win. Tecumseh himself was actually commissioned as a general in the royal forces, but, at the Battle of the Thames (Thames, Battle of the) in 1813, he was killed, and his dismembered body parts, according to legend, were divided between his conquerors as gruesome souvenirs.

      Meanwhile, in 1814, U.S. Gen. Andrew Jackson (Jackson, Andrew) defeated the British-supported Creeks (Creek) in the Southwest in the Battle of Horseshoe Bend. The war itself ended in a draw that left American territory intact. Thereafter, with minor exceptions, there was no major Indian resistance east of the Mississippi. After the lusty first quarter century of American nationhood, all roads left open to Native Americans ran downhill.

The United States from 1816 to 1850
The Era of Mixed Feelings
   The years between the election to the presidency of James Monroe (Monroe, James) in 1816 and of John Quincy Adams (Adams, John Quincy) in 1824 have long been known in American history as the Era of Good Feelings (Good Feelings, Era of). The phrase was conceived by a Boston editor during Monroe's visit to New England early in his first term. That a representative of the heartland of Federalism could speak in such positive terms of the visit by a Southern president whose decisive election had marked not only a sweeping Republican victory but also the demise of the national Federalist Party was dramatic testimony that former foes were inclined to put aside the sectional and political differences of the past.

Effects of the War of 1812
      Later scholars have questioned the strategy and tactics of the United States in the War of 1812, the war's tangible results, and even the wisdom of commencing it in the first place. To contemporary Americans, however, the striking naval victories and Jackson's victory over the British at New Orleans created a reservoir of “good feeling” on which Monroe was able to draw.

 Abetting the mood of nationalism was the foreign policy of the United States after the war. Florida was acquired from Spain (1819) in negotiations (Transcontinental Treaty), the success of which owed more to Jackson's indifference to such niceties as the inviolability of foreign borders and to the country's evident readiness to back him up than it did to diplomatic finesse. The Monroe Doctrine (1823), actually a few phrases inserted in a long presidential message (see original text (James Monroe: The Monroe Doctrine)), declared that the United States would not become involved in European affairs and would not accept European interference in the Americas; its immediate effect on other nations was slight, and that on its own citizenry was impossible to gauge, yet its self-assured tone in warning off the Old World from the New reflected well the nationalist mood that swept the country.

      Internally, the decisions of the Supreme Court (Supreme Court of the United States) under Chief Justice Marshall in such cases as McCulloch v. Maryland (1819) and Gibbons v. Ogden (1824) promoted nationalism by strengthening Congress and national power (federalism) at the expense of the states. The congressional decision to charter the second Bank of the United States (1816) was explained in part by the country's financial weaknesses, exposed by the War of 1812, and in part by the intrigues of financial interests. The readiness of Southern Jeffersonians—former strict constructionists—to support such a measure indicates, too, an amazing degree of nationalist feeling. Perhaps the clearest sign of a new sense of national unity was the victorious Republican Party, standing in solitary splendour on the national political horizon, its long-time foes the Federalists vanished without a trace (on the national level) and Monroe, the Republican standard-bearer, reelected so overwhelmingly in 1820 that it was long believed that the one electoral vote denied him had been held back only in order to preserve Washington's record of unanimous selection.

National disunity
      For all the signs of national unity and feelings of oneness, equally convincing evidence points in the opposite direction. The very Supreme Court decisions that delighted friends of strong national government infuriated its opponents, while Marshall's defense of the rights of private property was construed by critics as betraying a predilection for one kind of property over another. The growth of the West, encouraged by the conquest of Indian lands during the War of 1812, was by no means regarded as an unmixed blessing. Eastern conservatives sought to keep land prices high; speculative interests opposed a policy that would be advantageous to poor squatters; politicians feared a change in the sectional balance of power; and businessmen were wary of a new section with interests unlike their own. European visitors testified that, even during the so-called Era of Good Feelings (Good Feelings, Era of), Americans characteristically expressed scorn for their countrymen in sections other than their own.

      Economic hardship, especially the financial panic of 1819, also created disunity. The causes of the panic were complex, but its greatest effect was clearly the tendency of its victims to blame it on one or another hostile or malevolent interest—whether the second Bank of the United States, Eastern capitalists, selfish speculators, or perfidious politicians—each charge expressing the bad feeling that existed side by side with the good.

      If harmony seemed to reign on the level of national political parties, disharmony prevailed within the states. In the early 19th-century United States, local and state politics were typically waged less on behalf of great issues than for petty gain. That the goals of politics were often sordid did not mean that political contests were bland. In every section, state factions led by shrewd men waged bitter political warfare to attain or entrench themselves in power.

      The most dramatic manifestation of national division was the political struggle over slavery, particularly over its spread into new territories. The Missouri Compromise of 1820 eased the threat of further disunity, at least for the time being. The sectional balance between the states was preserved: in the Louisiana Purchase, with the exception of the new state of Missouri itself, slavery was to be confined to the area south of the 36°30′ line. Yet this compromise did not end the crisis but only postponed it. The determination by Northern and Southern senators not to be outnumbered by one another suggests that the people continued to believe in the conflicting interests of the various great geographic sections. The weight of evidence indicates that the decade after the Battle of New Orleans was not an era of good feelings so much as one of mixed feelings.

The economy
      The American economy expanded and matured at a remarkable rate in the decades after the War of 1812. The rapid growth of the West created a great new centre for the production of grains and pork, permitting the country's older sections to specialize in other crops. New processes of manufacture, particularly in textiles, not only accelerated an “industrial revolution” in the Northeast but also, by drastically enlarging the Northern market for raw materials, helped account for a boom in Southern cotton production. If by midcentury Southerners of European descent had come to regard slavery—on which the cotton economy relied—as a “positive good” rather than the “necessary evil” that they had earlier held the system to be, it was largely because of the increasingly central role played by cotton in earning profits for the region. Industrial workers organized the country's first trade unions and even workingmen's political parties early in the period. The corporate form thrived in an era of booming capital requirements, and older and simpler forms of attracting investment capital were rendered obsolete. Commerce became increasingly specialized, the division of labour in the disposal of goods for sale matching the increasingly sophisticated division of labour that had come to characterize production.

Edward Pessen
      The management of the growing economy was inseparable from political conflict in the emerging United States. At the start the issue was between agrarians (represented by Jeffersonian Republicans) wanting a decentralized system of easy credit and an investing community looking for stability and profit in financial markets. This latter group, championed by Hamilton and the Federalists, won the first round with the establishment of the first Bank of the United States (1791), jointly owned by the government and private stockholders. It was the government's fiscal agent, and it put the centre of gravity of the credit system in Philadelphia, its headquarters. Its charter expired in 1811, and the financial chaos that hindered procurement and mobilization during the ensuing War of 1812 demonstrated the importance of such centralization. Hence, even Jeffersonian Republicans were converted to acceptance of a second Bank of the United States, chartered in 1816.

      The second Bank of the United States faced constant political fire, but the conflict now was not merely between farming and mercantile interests but also between local bankers who wanted access to the profits of an expanding credit system and those who, like the president of the Bank of the United States, Nicholas Biddle (Biddle, Nicholas), wanted more regularity and predictability in banking through top-down control. The Constitution gave the United States exclusive power to coin money but allowed for the chartering of banks by individual states, and these banks were permitted to issue notes that also served as currency. The state banks, whose charters were often political plums, lacked coordinated inspection and safeguards against risky loans usually collateralized by land, whose value fluctuated wildly, as did the value of the banknotes. Overspeculation, bankruptcies, contraction, and panics were the inevitable result.

 Biddle's hope was that the large deposits of government funds in the Bank of the United States would allow it to become the major lender to local banks, and from that position of strength it could squeeze the unsound ones into either responsibility or extinction. But this notion ran afoul of the growing democratic spirit that insisted that the right to extend credit and choose its recipients was too precious to be confined to a wealthy elite. This difference of views produced the classic battle between Biddle and Jackson, culminating in Biddle's attempt to win recharter for the Bank of the United States, Jackson's veto and transfer of the government funds to pet banks, and the Panic of 1837. Not until the 1840s did the federal government place its funds in an independent treasury, and not until the Civil War was there legislation creating a national banking system. The country was strong enough to survive, but the politicization of fiscal policy making continued to be a major theme of American economic history.

Transportation revolution
   Improvements in transportation, a key to the advance of industrialization everywhere, were especially vital in the United States. A fundamental problem of the developing American economy was the great geographic extent of the country and the appallingly poor state of its roads. The broad challenge to weave the Great Lakes, Mississippi Valley, and Gulf and Atlantic coasts into a single national market was first met by putting steam to work on the rich network of navigable rivers. As early as 1787, John Fitch (Fitch, John) had demonstrated a workable steamboat to onlookers in Philadelphia; some years later, he repeated the feat in New York City. But it is characteristic of American history that, in the absence of governmental encouragement, private backing was needed to bring an invention into full play. As a result, popular credit for the first steamboat goes to Robert Fulton (Fulton, Robert), who found the financing to make his initial Hudson River run of the Clermont in 1807 more than a onetime feat. From that point forward, on inland waters, steam was king, and its most spectacular manifestation was the Mississippi River paddle wheeler, a unique creation of unsung marine engineers challenged to make a craft that could “work” in shallow swift-running waters. Their solution was to put cargo, engines, and passengers on a flat open deck above the waterline, which was possible in the mild climate of large parts of the drainage basin of the Father of Waters. The Mississippi River steamboat not only became an instantly recognizable American icon but also had an impact on the law. In Gibbons v. Ogden (1824), Chief Justice Marshall affirmed the exclusive right of the federal government to regulate traffic on rivers flowing between states.

 Canals (canals and inland waterways) and railroads were not as distinctively American in origin as the paddle wheeler, but, whereas 18th-century canals in England and continental Europe were simple conveniences for moving bulky loads cheaply at low speed, Americans integrated the country's water transport system by connecting rivers flowing toward the Atlantic Ocean with the Great Lakes and the Ohio-Mississippi River valleys. The best-known conduit, the Erie Canal, connected the Hudson River to the Great Lakes, linking the West to the port of New York City. Other major canals in Pennsylvania, Maryland, and Ohio joined Philadelphia and Baltimore to the West via the Ohio River and its tributaries. Canal building was the rage throughout the 1820s and '30s, sometimes financed by states or by a combination of state and private effort. But many overbuilt or unwisely begun canal projects collapsed, and states that were “burned” in the process became more wary of such ventures.

   Canal development was overtaken by the growth of the railroads (railroad), which were far more efficient in covering the great distances underserved by the road system and indispensable in the trans-Mississippi West. Work on the Baltimore and Ohio (Baltimore and Ohio Railroad) line, the first railroad in the United States, was begun in 1828, and a great burst of construction boosted the country's rail network from zero to 30,000 miles (about 48,000 km) by 1860. The financing alone, no less than the operation of the burgeoning system, had a huge political and economic impact. Adams was a decided champion of “national internal improvements”—the federally assisted development of turnpikes, lighthouses, and dredging and channel-clearing operations (that is, whatever it took to assist commerce). That term, however, was more closely associated with Henry Clay (Clay, Henry), like Adams a strong nationalist. Clay proposed an American System, which would, through internal improvements and the imposition of tariffs, encourage the growth of an industrial sector that exchanged manufactured goods for the products of U.S. agriculture, thus benefiting each section of the country. But the passionate opposition of many agrarians to the costs and expanded federal control inherent in the program created one battlefield in the long contest between the Democratic (Democratic Party) and Whig (Whig Party) parties that did not end until the triumph of Whig economic ideas in the Republican (Republican Party) party during the Civil War.

Beginnings of industrialization
      Economic, social, and cultural history cannot easily be separated. The creation of the “factory system” in the United States was the outcome of interaction between several characteristically American forces: faith in the future, a generally welcoming attitude toward immigrants, an abundance of resources linked to a shortage of labour, and a hospitable view of innovation. The pioneering textile industry, for example, sprang from an alliance of invention, investment, and philanthropy. Moses Brown (later benefactor of the College of Rhode Island, renamed Brown University in honour of his nephew Nicholas) was looking to invest some of his family's mercantile fortune in the textile business. New England wool and southern cotton were readily available, as was water power from Rhode Island's swiftly flowing rivers. All that was lacking to convert a handcraft industry into one that was machine-based was machinery itself; however, the new devices for spinning and weaving that were coming into use in England were jealously guarded there. But Samuel Slater (Slater, Samuel), a young English mechanic who immigrated to the United States in 1790 carrying the designs for the necessary machinery in his prodigious memory, became aware of Brown's ambitions and of the problems he was having with his machinery. Slater formed a partnership with Brown and others to reproduce the crucial equipment and build prosperous Rhode Island fabric factories.

  Local American inventive talent embodied in sometimes self-taught engineers was available too. One conspicuous example was Delaware's Oliver Evans (Evans, Oliver), who built a totally automatic flour mill in the 1780s and later founded a factory that produced steam engines; another was the ultimate Connecticut Yankee, Eli Whitney (Whitney, Eli), who not only fathered the cotton gin but built a factory for mass producing muskets by fitting together interchangeable parts on an assembly line. Whitney got help from a supportive U.S. Army, which sustained him with advances on large procurement contracts. Such governmental support of industrial development was rare, but, when it occurred, it was a crucial if often understated element in the industrializing of America.

      Francis Cabot Lowell (Lowell, Francis Cabot), who opened a textile factory in 1811 in the Massachusetts town later named for him, played a pathbreaking role as a paternalistic model employer. Whereas Slater and Brown used local families, living at home, to provide “hands” for their factories, Lowell brought in young women from the countryside and put them up in boardinghouses adjacent to the mills. The “girls”—most of them in or just out of their teens—were happy to be paid a few dollars for 60-hour workweeks that were less taxing than those they put in as farmers' daughters. Their moral behaviour was supervised by matrons, and they themselves organized religious, dramatic, musical, and study groups. The idea was to create an American labour force that would not resemble the wretched proletarians of England and elsewhere in Europe.

  Lowell was marveled at by foreign and domestic visitors alike but lost its idyllic character as competitive pressures within the industry resulted in larger workloads, longer hours, and smaller wages. When, in the 1840s and 1850s, Yankee young women formed embryonic unions and struck, they were replaced by French-Canadian and Irish immigrants. Nonetheless, early New England industrialism carried the imprint of a conscious sense of American exceptionalism.

Bernard A. Weisberger

Social developments
      In the decades before the American Civil War (1861–65), the civilization of the United States exerted an irresistible pull on visitors, hundreds of whom were assigned to report back to European audiences that were fascinated by the new society and insatiable for information on every facet of the “fabled republic.” What appeared to intrigue the travelers above all was the uniqueness of American society. In contrast to the relatively static and well-ordered civilization of the Old World, America seemed turbulent, dynamic, and in constant flux, its people crude but vital, awesomely ambitious, optimistic, and independent. Many well-bred Europeans were evidently taken aback by the self-assurance of lightly educated American common folk. Ordinary Americans seemed unwilling to defer to anyone on the basis of rank or status.

Birth of American culture
  “In the four quarters of the globe, who reads an American book?” asked an English satirist early in the 1800s. Had he looked beyond the limits of “high culture,” he would have found plenty of answers. As a matter of fact, the period between 1815 and 1860 produced an outpouring of traditional literary works now known to students of English-language prose and poetry everywhere—the verse of Henry Wadsworth Longfellow (Longfellow, Henry Wadsworth) and Edgar Allan Poe (Poe, Edgar Allan), the novels of James Fenimore Cooper (Cooper, James Fenimore), Nathaniel Hawthorne (Hawthorne, Nathaniel), and Herman Melville (Melville, Herman), as well as the essays of Ralph Waldo Emerson (Emerson, Ralph Waldo)—all expressing distinctively American themes and depicting distinctively American characters, such as Natty Bumppo, Hester Prynne, and Captain Ahab, who now belong to the world.

  But setting these aside, Nathaniel Bowditch (Bowditch, Nathaniel)'s The New American Practical Navigator (1802), Matthew Fontaine Maury (Maury, Matthew Fontaine)'s Physical Geography of the Sea (1855), and the reports from the Lewis and Clark Expedition and the various far Western explorations made by the U.S. Army's Corps of Engineers, as well as those of U.S. Navy Antarctic explorer Charles Wilkes (Wilkes, Charles), were the American books on the desks of sea captains, naturalists, biologists, and geologists throughout the world. By 1860 the international scientific community knew that there was an American intellectual presence.

   At home Noah Webster (Webster, Noah)'s An American Dictionary of the English Language (American Dictionary of the English Language, An) (1828) included hundreds of words of local origin to be incorporated in the former “King's English.” Webster's blue-backed “Speller,” published in 1783, the geography textbooks of Jedidiah Morse (Morse, Jedidiah), and the Eclectic Readers of William Holmes McGuffey (McGuffey, William Holmes) became staples in every 19th-century American classroom. Popular literature included the humorous works of writers such as Seba Smith (Smith, Seba), Joseph G. Baldwin, Johnson Jones Hooper, and Artemus Ward (Ward, Artemus), which featured frontier tall tales and rural dialect. In the growing cities there were new varieties of mass entertainment, including the blatantly racist minstrel shows (minstrel show), for which ballads like those of Stephen Foster (Foster, Stephen) were composed. The “museums” and circuses of P.T. Barnum (Barnum, P.T.) also entertained the middle-class audience, and the spread of literacy sustained a new kind of popular journalism, pioneered by James Gordon Bennett (Bennett, James Gordon), whose New York Herald mingled its up-to-the-moment political and international news with sports, crime, gossip, and trivia. Popular magazines such as Harper's Weekly, Frank Leslie's Illustrated Newspaper, and Godey's Lady's Book, edited by Sarah Josepha Hale with a keen eye toward women's wishes, also made their mark in an emerging urban America. All these added up to a flourishing democratic culture that could be dismissed as vulgar by foreign and domestic snobs but reflected a vitality loudly sung by Walt Whitman (Whitman, Walt) in Leaves of Grass (1855).

Bernard A. Weisberger

The people
      American society was rapidly changing. Population grew at a rate of between three-tenths and one-third per decade, a pace that amazed Europeans although it was the normal rhythm of American population growth for the antebellum decades. After 1820 the rate of growth was not uniform throughout the country. New England and the Southern Atlantic states languished—the former region because it was losing settlers to the superior farmlands of the Western Reserve, the latter because its economy offered too few places to newcomers.

      The special feature of the population increase of the 1830s and '40s was the extent to which it was composed of immigrants. Whereas about 250,000 Europeans had arrived in the first three decades of the 19th century, there were 10 times as many between 1830 and 1850. The newcomers were overwhelmingly Irish and German. Traveling in family groups rather than as individuals, they were attracted by the dazzling opportunities of American life: abundant work, land, food, and freedom on the one hand and the absence of compulsory military service on the other.

Edward Pessen

      The mere statistics of immigration do not, however, tell the whole story of its vital role in pre-Civil War America. The intermingling of technology, politics, and accident produced yet another “great migration.” By the 1840s the beginnings of steam transportation on the Atlantic and improvements in the sailing speed of the last generation of windjammers made oceanic passages more frequent and regular. It became easier for hungry Europeans to answer the call of America to take up the farmlands and build the cities. Irish migration would have taken place in any case, but the catastrophe of the Irish Potato Famine of 1845–49 turned a stream into a torrent. Meanwhile, the steady growth of the democratic idea in Europe produced the Revolutions of 1848 (1848, Revolutions of) in France, Italy, Hungary, and Germany. The uprisings in the last three countries were brutally suppressed, creating a wave of political refugees. Hence, many of the Germans who traveled over in the wake of the revolutions—the Forty-Eighters—were refugees who took liberal ideals, professional educations, and other intellectual capital to the American West. Overall German contributions to American musical, educational, and business life simply cannot be measured in statistics. Neither can one quantify the impact of the Irish politicians, policemen, and priests on American urban life or the impact of the Irish in general on Roman Catholicism in the United States.

 Besides the Irish and Germans, there were thousands of Norwegians and Swedes who immigrated, driven by agricultural depression in the 1850s, to take up new land on the yet-unbroken Great Plains. And there was a much smaller migration to California in the 1850s of Chinese seeking to exchange hard times for new opportunities in the gold fields. These people too indelibly flavoured the culture of the United States.

  Mention must also be made of utopian immigrant colonies planted by thinkers who wanted to create a new society in a New World. Examples include Nashoba, Tenn., and New Harmony, Ind., founded by two British newcomers, Frances Wright and Robert Dale Owen, respectively. There also were German planned settlements at Amana, Iowa, and in New Ulm, Minn., and New Braunfels, Texas. If the growth of materialistic and expansionist bumptiousness represented by the Manifest Destiny movement was fueled in part by the immigration-fed expansion of the American populace, these experiments in communal living added to the less materialistic forces driving American thought. They fit the pattern of searching for heaven on earth that marked the age of reform.

Bernard A. Weisberger
      Most African Americans in the North possessed theoretical freedom and little else. Confined to menial occupations for the most part, they fought a losing battle against the inroads of Irish competition in northeastern cities. The struggle between the two groups erupted spasmodically into ugly street riots. The hostility shown to free African Americans by the general community was less violent but equally unremitting. Discrimination in politics, employment, education, housing, religion, and even cemeteries resulted in a cruelly oppressive system. Unlike slaves, free African Americans in the North could criticize and petition against their subjugation, but this proved fruitless in preventing the continued deterioration of their situation.

      Most Americans continued to live in the country. Although improved machinery had resulted in expanded farm production and had given further impetus to the commercialization of agriculture, the way of life of independent agriculturists had changed little by midcentury. The public journals put out by some farmers insisted that their efforts were unappreciated by the larger community. The actuality was complex. Many farmers led lives marked by unremitting toil, cash shortage, and little leisure. Farm workers received minuscule wages. In all sections of the country, much of the best land was concentrated in the hands of a small number of wealthy farmers. The proportion of farm families who owned their own land, however, was far greater in the United States than in Europe, and varied evidence points to a steady improvement in the standard and style of living of agriculturalists as midcentury approached.

Cities
 Cities, both old and new, thrived during the era, their growth in population outstripping the spectacular growth rate of the country as a whole and their importance and influence far transcending the relatively small proportions of citizens living in them. Whether on the “urban frontier” or in the older seaboard region, antebellum cities were the centres of wealth and political influence for their outlying hinterlands. New York City, with a population approaching 500,000 by midcentury, faced problems of a different order of magnitude from those confronting such cities as Poughkeepsie, N.Y., and Newark, N.J. Yet the pattern of change during the era was amazingly similar for eastern cities or western, old cities or new, great cities or small. The lifeblood of them all was commerce. Old ideals of economy in town government were grudgingly abandoned by the merchant, professional, and landowning elites who typically ruled. Taxes were increased in order to deal with pressing new problems and to enable the urban community of midcentury to realize new opportunities. Harbours were improved, police forces professionalized, services expanded, waste more reliably removed, streets improved, and welfare activities broadened, all as the result of the statesmanship and the self-interest of property owners who were convinced that amelioration was socially beneficial.

Edward Pessen

Education and the role of women
  Cities were also centres of educational and intellectual progress. The emergence of a relatively well-financed public educational system, free of the stigma of “pauper” or “charity” schools, and the emergence of a lively “penny press,” made possible by a technological revolution, were among the most important developments. The role of women in America's expanding society was intriguingly shaped by conflicting forces. On one hand, there were factors that abetted emancipation. For example, the growing cities offered new job opportunities as clerks and shop assistants for girls and young women with elementary educations furnished by the public schools. And the need for trained teachers for those schools offered another avenue to female independence. At higher levels, new rungs on the ladder of upward mobility were provided by the creation of women's colleges, such as Mount Holyoke in South Hadley, Mass. (1837), and by the admission of women to a very few coeducational colleges, such as Oberlin (1833) and Antioch (1852), both in Ohio. A rare woman or two even broke into professional ranks, including Elizabeth Blackwell, considered the first woman physician of modern times, and the Rev. Olympia Brown, one of the first American women whose ordination was sanctioned by a full denomination.

      On the other hand, traditionally educated women from genteel families remained bound by silken cords of expectation. The “duties of womanhood” expounded by popular media included, to the exclusion of all else, the conservation of a husband's resources, the religious and moral education of children and servants, and the cultivation of higher sensibilities through the proper selection of decorative objects and reading matter. The “true woman” made the home an island of tranquility and uplift to which the busy male could retreat after a day's struggle in the hard world of the marketplace. In so doing, she was venerated but kept in a clearly noncompetitive role.

Bernard A. Weisberger

      The brilliant French visitor Alexis de Tocqueville, in common with most contemporary observers, believed American society to be remarkably egalitarian. Most rich American men were thought to have been born poor; “self-made” was the term Henry Clay popularized for them. The society was allegedly a very fluid one, marked by the rapid rise and fall of fortunes, with room at the top accessible to all but the most humble; opportunity for success seemed freely available to all, and, although material possessions were not distributed perfectly equally, they were, in theory, dispersed so fairly that only a few poor and a few rich men existed at either end of the social spectrum.

      The actuality, however, was far different. While the rich were inevitably not numerous, America by 1850 had more millionaires than all of Europe. New York, Boston, and Philadelphia each had perhaps 1,000 individuals admitting to assets of $100,000 or more, at a time when wealthy taxpayers kept secret from assessors the bulk of their wealth. Because an annual income of $4,000 or $5,000 enabled a person to live luxuriously, these were great fortunes indeed. Typically, the wealthiest 1 percent of urban citizens owned approximately one-half the wealth of the great cities of the Northeast, while the great bulk of their populations possessed little or nothing. In what has long been called the “Age of the Common Man,” rich men were almost invariably born not into humble or poor families but into wealthy and prestigious ones. In western cities too, class lines increasingly hardened after 1830. The common man lived in the age, but he did not dominate it. It appears that contemporaries, overimpressed with the absence of a titled aristocracy and with the democratic tone and manner of American life, failed to see the extent to which money, family, and status exerted power in the New World even as they did in the Old.

The democratization of politics
      Nevertheless, American politics became increasingly democratic during the 1820s and '30s. Local and state offices that had earlier been appointive became elective. Suffrage was expanded as property and other restrictions on voting were reduced or abandoned in most states. The freehold requirement that had denied voting to all but holders of real estate was almost everywhere discarded before 1820, while the taxpaying qualification was also removed, if more slowly and gradually. In many states a printed ballot replaced the earlier system of voice voting, while the secret ballot also grew in favour. Whereas in 1800 only two states provided for the popular choice of presidential electors, by 1832 only South Carolina still left the decision to the legislature. Conventions of elected delegates increasingly replaced legislative or congressional caucuses as the agencies for making party nominations. By the latter change, a system for nominating candidates by self-appointed cliques meeting in secret was replaced by a system of open selection of candidates by democratically elected bodies.

      These democratic changes were not engineered by Andrew Jackson (Jackson, Andrew) and his followers, as was once believed. Most of them antedated the emergence of Jackson's Democratic Party, and in New York, Mississippi, and other states some of the reforms were accomplished over the objections of the Jacksonians. There were men in all sections who feared the spread of political democracy, but by the 1830s few were willing to voice such misgivings publicly. Jacksonians effectively sought to fix the impression that they alone were champions of democracy, engaged in mortal struggle against aristocratic opponents. The accuracy of such propaganda varied according to local circumstances. The great political reforms of the early 19th century in actuality were conceived by no one faction or party. The real question about these reforms concerns the extent to which they truly represented the victory of democracy in the United States.

      Small cliques or entrenched “machines” dominated democratically elected nominating conventions as earlier they had controlled caucuses. While by the 1830s the common man—of European descent—had come into possession of the vote in most states, the nomination process continued to be outside his control. More important, the policies adopted by competing factions and parties in the states owed little to ordinary voters. The legislative programs of the “regencies” and juntos that effectively ran state politics were designed primarily to reward the party faithful and to keep them in power. State parties extolled the common people in grandiloquent terms but characteristically focused on prosaic legislation that awarded bank charters or monopoly rights to construct transportation projects to favoured insiders. That American parties would be pragmatic vote-getting coalitions, rather than organizations devoted to high political principles, was due largely to another series of reforms enacted during the era. Electoral changes that rewarded winners or plurality gatherers in small districts, in contrast to a previous system that divided a state's offices among the several leading vote getters, worked against the chances of “single issue” or “ideological” parties while strengthening parties that tried to be many things to many people.

The Jacksonians
      To his army of followers, Jackson was the embodiment of popular democracy. A truly self-made man of strong will and courage, he personified for many citizens the vast power of nature and Providence, on the one hand, and the majesty of the people, on the other. His very weaknesses, such as a nearly uncontrollable temper, were political strengths. Opponents who branded him an enemy of property and order only gave credence to the claim of Jackson's supporters that he stood for the poor against the rich, the plain people against the interests.

      Jackson, like most of his leading antagonists, was in fact a wealthy man of conservative social beliefs. In his many volumes of correspondence he rarely referred to labour. As a lawyer and man of affairs in Tennessee prior to his accession to the presidency, he aligned himself not with have-nots but with the influential, not with the debtor but with the creditor. His reputation was created largely by astute men who propagated the belief that his party was the people's party and that the policies of his administrations were in the popular interest. Savage attacks on those policies by some wealthy critics only fortified the belief that the Jacksonian movement was radical as well as democratic.

  At its birth in the mid-1820s, the Jacksonian, or Democratic, Party was a loose coalition of diverse men and interests united primarily by a practical vision. They held to the twin beliefs that Old Hickory, as Jackson was known, was a magnificent candidate and that his election to the presidency would benefit those who helped bring it about. His excellence as candidate derived in part from the fact that he appeared to have no known political principles of any sort. In this period there were no distinct parties on the national level. Jackson, Clay, John C. Calhoun, John Quincy Adams, and William H. Crawford—the leading presidential aspirants—all portrayed themselves as “Republicans,” followers of the party of the revered Jefferson. The National Republicans were the followers of Adams and Clay; the Whigs, who emerged in 1834, were, above all else, the party dedicated to the defeat of Jackson.

The major parties
      The great parties of the era were thus created to attain victory for men rather than measures. Once the parties were in being, their leaders understandably sought to convince the electorate of the primacy of principles. It is noteworthy, however, that former Federalists at first flocked to the new parties in largely equal numbers and that men on opposite sides of such issues as internal improvements or a national bank could unite behind Jackson. With the passage of time, the parties did come increasingly to be identified with distinctive, and opposing, political policies.

      By the 1840s, Whig and Democratic congressmen voted as rival blocs. Whigs supported and Democrats opposed a weak executive, a new Bank of the United States, a high tariff, distribution of land revenues to the states, relief legislation to mitigate the effects of the depression, and federal reapportionment of House seats. Whigs voted against and Democrats approved an independent treasury, an aggressive foreign policy, and expansionism. These were important issues, capable of dividing the electorate just as they divided the major parties in Congress. Certainly it was significant that Jacksonians were more ready than their opponents to take punitive measures against African Americans or abolitionists or to banish the southern Indian tribes and use other forceful measures against them, brushing aside treaties protecting Native American rights. But these differences do not substantiate the belief that the Democrats and Whigs were divided ideologically, with only the former somehow representing the interests of the propertyless.

 Party lines earlier had been more easily broken, as during the crisis that erupted over South Carolina's bitter objections to the high Tariff of 1828. Jackson's firm opposition to Calhoun's policy of nullification (i.e., the right of a state to nullify a federal law, in this case the tariff) had commanded wide support within and outside the Democratic Party. Clay's solution to the crisis, a compromise tariff, represented not an ideological split with Jackson but Clay's ability to conciliate and to draw political advantage from astute tactical maneuvering.

      The Jacksonians depicted their war on the second Bank of the United States as a struggle against an alleged aristocratic monster that oppressed the West, debtor farmers, and poor people generally. Jackson's decisive reelection in 1832 was once interpreted as a sign of popular agreement with the Democratic interpretation of the Bank War, but more recent evidence discloses that Jackson's margin was hardly unprecedented and that Democratic success may have been due to other considerations. The second Bank was evidently well thought of by many Westerners, many farmers, and even Democratic politicians who admitted that they opposed it primarily so as not to incur the wrath of Jackson.

      Jackson's reasons for detesting the second Bank and its president, Nicholas Biddle, were complex. Anticapitalist ideology would not explain a Jacksonian policy that replaced a quasi-national bank as repository of government funds with dozens of state and private banks, equally controlled by capitalists and even more dedicated than was Biddle to profit making. The saving virtue of these “pet banks” appeared to be the Democratic political affiliations of their directors. Perhaps the pragmatism as well as the large degree of similarity between the Democrats and Whigs is best indicated by their frank adoption of the “spoils system.” The Whigs, while out of office, denounced the vile Democratic policy for turning lucrative customhouse and other posts over to supporters, but once in office they resorted to similar practices. It is of interest that the Jacksonian appointees were hardly more plebeian than were their so-called aristocratic predecessors.

Minor parties
      The politics of principle was represented during the era not by the major parties but by the minor ones. The Anti-Masons aimed to stamp out an alleged aristocratic conspiracy. The Workingmen's Party called for “social justice.” The Locofocos (so named after the matches they used to light up their first meeting in a hall darkened by their opponents) denounced monopolists in the Democratic Party and out. The variously named nativist parties accused the Roman Catholic Church of all manner of evil. The Liberty Party opposed the spread of slavery. All these parties were ephemeral because they proved incapable of mounting a broad appeal that attracted masses of voters in addition to their original constituencies. The Democratic and Whig parties thrived not in spite of their opportunism but because of it, reflecting well the practical spirit that animated most American voters.

An age of reform
 Historians have labeled the period 1830–50 an “age of reform.” At the same time that the pursuit of the dollar was becoming so frenzied that some observers called it the country's true religion, tens of thousands of Americans joined an array of movements dedicated to spiritual and secular uplift. There is not yet agreement as to why a rage for reform erupted in the antebellum decades. A few of the explanations cited, none of them conclusive, include an outburst of Protestant Evangelicalism, a reform spirit that swept across the Anglo-American community, a delayed reaction to the perfectionist teachings of the Enlightenment, and the worldwide revolution in communications that was a feature of 19th-century capitalism.

      What is not in question is the amazing variety of reform movements that flourished simultaneously in the North. Women's rights, pacifism, temperance, prison reform, abolition of imprisonment for debt, an end to capital punishment, improving the conditions of the working classes, a system of universal education, the organization of communities that discarded private property, improving the condition of the insane and the congenitally enfeebled, and the regeneration of the individual were among the causes that inspired zealots during the era.

Edward Pessen
      The strangest thing about American life was its combination of economic hunger and spiritual striving. Both rested on the conviction that the future could be controlled and improved. Life might have been cruel and harsh on the frontier, but there was a strong belief that the human condition was sure to change for the better: human nature itself was not stuck in the groove of perpetual shortcoming, as old-time Calvinism had predicted.

      The period of “freedom's ferment” from 1830 to 1860 combined the humanitarian impulses of the late 18th century with the revivalistic pulse of the early 19th century. The two streams flowed together. For example, the earnest Christians who founded the American Christian Missionary Society believed it to be their duty to bring the good news of salvation through Jesus Christ to the “heathens” of Asia. But in carrying out this somewhat arrogant assault on the religions of the poor in China and India, they founded schools and hospitals that greatly improved the earthly lot of their Chinese and “Hindoo” converts in a manner of which Jefferson might have approved.

   Millennialism—the belief that the world might soon end and had to be purged of sin before Christ's Second Coming (as preached by revivalists such as Charles Grandison Finney)—found its counterpart in secular perfectionism, which held that it was possible to abolish every form of social and personal suffering through achievable changes in the way the world worked. Hence, a broad variety of crusades and crusaders flourished. Universal education was seen as the key to it all, which accounted for many college foundings and for the push toward universal free public schooling led by Horace Mann, who went from being secretary of the Massachusetts State Board of Education to being president of Antioch College, where he told his students to “be ashamed to die until you have won some victory for humanity.”

      One way to forge such victories was to improve the condition of those whom fate had smitten and society had neglected or abused. There was, for example, the movement to provide special education for the blind, led by Samuel Gridley Howe at an institute founded with the support of Boston merchant Thomas Handasyd Perkins, who found philanthropy a good way for a Christian businessman to show his appreciation for what he saw as God's blessings on his enterprises. There also was the work of Dorothea Lynde Dix to humanize the appalling treatment of the insane, which followed up on the precedent set by Benjamin Rush, signer of the Declaration of Independence, a devout believer in God and science.

 As the march of industrialization made thousands of workers dependent on the uncontrollable ups and downs of the business cycle and the generosity of employers—described by some at the time as “putting the living of the many in the hands of the few”—the widening imbalance between classes spurred economic reformers to action. Some accepted the permanence of capitalism but tried to enhance the bargaining power of employees through labour unions. Others rejected the private enterprise model and looked to a reorganization of society on cooperative rather than competitive lines. Such was the basis of Fourierism and utopian socialism. One labour reformer, George Henry Evans, proposed that wages be raised by reducing the supply of labourers through awarding some of them free farms, “homesteads” carved from the public domain. Even some of the fighters for immigration restriction who belonged to the Know-Nothing party had the same aim—namely, to preserve jobs for the native-born. Other reformers focused on peripheral issues such as the healthier diet expounded by Sylvester Graham or the sensible women's dress advocated by Amelia Jenks Bloomer, both of whom saw these small steps as leading toward more-rational and gentle human behaviour overall.

      Whatever a reform movement's nature, whether as pragmatic as agricultural improvement or as utopian as universal peace, the techniques that spread the message over America's broad expanses were similar. Voluntary associations were formed to spread the word and win supporters, a practice that Tocqueville, in 1841, found to be a key to American democracy. Even when church-affiliated, these groups were usually directed by professional men rather than ministers, and lawyers were conspicuously numerous. Next came publicity through organizational newspapers, which were easy to found on small amounts of capital and sweat. So when, as one observer noted, almost every American had a plan for the universal improvement of society in his pocket, every other American was likely to be aware of it.

 Two of these crusades lingered in strength well beyond the Civil War era. Temperance was one, probably because it invoked lasting values—moralism, efficiency, and health. Drinking was viewed as a sin; overindulgence led to alcoholism, incurred social costs, hurt productivity, and harmed the body. The women's rights crusade, which first came to national attention at the Seneca Falls Convention of 1848, persisted because it touched upon a perennial and universal question of the just allotment of gender roles.

Bernard A. Weisberger

      Finally and fatally there was abolitionism, the antislavery movement. Passionately advocated and resisted with equal intensity, it appeared as late as the 1850s to be a failure in politics. Yet by 1865 it had succeeded in embedding its goal in the Constitution by amendment, though at the cost of a civil war. At its core lay the issue of “race,” over which Americans have shown their best and worst faces for more than three centuries. When it became entangled in this period with the dynamics of American sectional conflict, its full explosive potential was released. If the reform impulse was a common one uniting the American people in the mid-19th century, its manifestation in abolitionism finally split them apart for four bloody years.

   Abolition itself was a diverse phenomenon. At one end of its spectrum was William Lloyd Garrison, an “immediatist,” who denounced not only slavery but the Constitution of the United States for tolerating the evil. His newspaper, The Liberator, lived up to its promise that it would not equivocate in its war against slavery. Garrison's uncompromising tone infuriated not only the South but many Northerners as well and was long treated as though it were typical of abolitionism in general. Actually it was not. At the other end of the abolitionist spectrum and in between stood such men and women as Theodore Weld, James Gillespie Birney, Gerrit Smith, Theodore Parker, Julia Ward Howe, Lewis Tappan, Salmon P. Chase, and Lydia Maria Child, all of whom represented a variety of stances, all more conciliatory than Garrison's. James Russell Lowell, whose emotional balance was cited by a biographer as proof that abolitionists need not have been unstable, urged in contrast to Garrison that “the world must be healed by degrees.” Also of importance was the work of free blacks such as David Walker and Robert Forten and ex-slaves such as Frederick Douglass, who had the clearest of all reasons to work for the cause but who shared some broader humanitarian motives with their white coworkers.

      Whether they were Garrisonians or not, abolitionist leaders have been scorned either as cranks working out their own personal maladjustments or as people using the slavery issue to restore a status that, as an alleged New England elite, they feared they were losing. The truth may be simpler. Few neurotics and few members of the northern socioeconomic elite became abolitionists. For all the movement's zeal and propagandistic successes, it was bitterly resented by many Northerners, and the masses of free whites were indifferent to its message. In the 1830s urban mobs, typically led by “gentlemen of property and standing,” stormed abolitionist meetings, wreaking violence on the property and persons of African Americans and their white sympathizers, evidently indifferent to the niceties distinguishing one abolitionist theorist from another. The fact that abolition leaders were remarkably similar in their New England backgrounds, their Calvinist self-righteousness, their high social status, and the relative excellence of their educations is hardly evidence that their cause was either snobbish or elitist. Ordinary citizens were more inclined to loathe African Americans and to preoccupy themselves with personal advance within the system.

Support of reform movements
 The existence of many reform movements did not mean that a vast number of Americans supported them. Abolition did poorly at the polls. Some reforms were more popular than others, but by and large none of the major movements had mass followings. The evidence indicates that few persons actually participated in these activities. Utopian communities such as Brook Farm and those in New Harmony, Ind., and Oneida, N.Y., did not succeed in winning over many followers or in inspiring many other groups to imitate their example. The importance of these and the other movements derived neither from their size nor from their achievements. Reform reflected the sensitivity of a small number of persons to imperfections in American life. In a sense, the reformers were “voices of conscience,” reminding their materialistic fellow citizens that the American Dream was not yet a reality, pointing to the gulf between the ideal and the actuality.

Religious-inspired reform
      Notwithstanding the wide impact of the American version of secular perfectionism, it was the reform inspired by religious zeal that was most apparent in the antebellum United States. Not that religious enthusiasm was invariably identified with social uplift; many reformers were more concerned with saving souls than with curing social ills. The merchant princes who played active roles in—and donated large sums of money to—the Sunday school unions, home missionary societies, and Bible and tract societies did so in part out of altruism and in part because the latter organizations stressed spiritual rather than social improvement while teaching the doctrine of the “contented poor.” In effect, conservatives who were strongly religious found no difficulty in using religious institutions to fortify their social predilections. Radicals, on the other hand, interpreted Christianity as a call to social action, convinced that true Christian rectitude could be achieved only in struggles that infuriated the smug and the greedy. Ralph Waldo Emerson was an example of the American reformer's insistence on the primacy of the individual. The great goal according to him was the regeneration of the human spirit, rather than a mere improvement in material conditions. Emerson and reformers like him, however, acted on the premise that a foolish consistency was indeed the hobgoblin of little minds, for they saw no contradiction in uniting with like-minded idealists to act out or argue for a new social model. The spirit was to be revived and strengthened through forthright social action undertaken by similarly independent individuals.

Expansionism and political crisis at midcentury
      Throughout the 19th century, eastern settlers kept spilling over into the Mississippi valley and beyond, pushing the frontier farther westward. The Louisiana Purchase territory offered ample room to pioneers and those who came after. American wanderlust, however, was not confined to that area. Throughout the era Americans in varying numbers moved into regions south, west, and north of the Louisiana Territory. Because Mexico and Great Britain held or claimed most of these lands, disputes inevitably broke out between these governments and the United States.

  The growing nationalism of the American people was effectively engaged by the Democratic presidents Jackson and James K. Polk (served 1845–49) and by the expansionist Whig president John Tyler (served 1841–45) to promote their goal of enlarging the “empire for liberty.” Each of these presidents performed shrewdly. Jackson waited until his last day in office to establish formal relations with the Republic of Texas, one year after his friend Sam Houston had succeeded in dissolving the ties between Mexico and the newly independent state of Texas. On the Senate's overwhelming repudiation of his proposed treaty of annexation, Tyler resorted to the use of a joint resolution so that each house could vote by a narrow margin for incorporation of Texas into the Union. Polk succeeded in getting the British to negotiate a treaty (1846) whereby the Oregon country south of the 49th parallel would revert to the United States. These were precisely the terms of his earlier proposal, which had been rejected by the British. Ready to resort to almost any means to secure the Mexican territories of New Mexico and upper California, Polk used a border incident as a pretext for commencing a war with Mexico. The Mexican-American War was not widely acclaimed, and many congressmen disliked it, but few dared to oppose the appropriations that financed it.

      Although there is no evidence that these actions had anything like a public mandate, clearly they did not evoke widespread opposition. Nonetheless, the expansionists' assertion that Polk's election in 1844 could be construed as a popular clamour for the annexation of Texas was hardly a solid claim; Clay was narrowly defeated and would have won but for the defection from Whig ranks of small numbers of Liberty Party and nativist voters. The nationalistic idea, conceived in the 1840s by a Democratic editor, that it was the “Manifest Destiny” of the United States to expand westward to the Pacific undoubtedly prepared public opinion for the militant policies undertaken by Polk shortly thereafter. It has been said that this notion represented the mood of the American people; it is safer to say it reflected the feelings of many of the people.

Edward Pessen
      The continuation of westward expansion naturally came at the further expense of the American Indians. The sociocultural environment of “young America” offered fresh rationales for the dispossession of Native Americans; the broadening of federal power provided administrative machinery to carry it out; and the booming economy spurred the demand to bring ever more “virgin land” still in Indian hands into the orbit of “civilization.”

      After 1815, control of Indian affairs was shifted from the State Department to the War Department (and subsequently to the Department of the Interior, created in 1849). The Indians were no longer treated as peoples of separate nations but were considered wards of the United States, to be relocated at the convenience of the government when necessary. The acquisition of the Louisiana Territory in 1803 and Florida in 1819 removed the last possibilities of outside help for the Indians from France or Spain; moreover, they opened new areas for “resettlement” of unassimilable population elements.

  The decimated and dependent Indian peoples of Michigan, Indiana, Illinois, and Wisconsin were, one after another, forced onto reservations within those states in areas that Americans of European descent did not yet see as valuable. There was almost no resistance, except for the Sauk and Fox uprising led by Black Hawk (the Black Hawk War) in 1832 and put down by local militia whose ranks included a young Abraham Lincoln. It was a slightly different story in the Southeast, where the so-called Five Civilized Tribes (the Chickasaw, Cherokee, Creek, Choctaw, and Seminole peoples) were moving toward assimilation. Many individual members of these groups had become landholders and even slaveowners. The Cherokee, under the guidance of their outstanding statesman Sequoyah, had even developed a written language and were establishing U.S.-style communal institutions on lands in north Georgia ceded to them by treaty. The Treaty of New Echota was violated by squatters on Indian land, but when the Cherokees went to court—not to war—and won their case in the Supreme Court (Worcester v. Georgia), Pres. Andrew Jackson supported Georgia in contemptuously ignoring the decision. The national government moved on inexorably toward a policy of resettlement in the Indian Territory (later Oklahoma) beyond the Mississippi, and, after the policy's enactment into law in 1830, the Southeast Indian peoples were driven westward along the Trail of Tears. The Seminole, however, resisted and fought the seven-year-long Second Seminole War in the swamps of Florida before the inevitable surrender in 1842.

 That a policy of “population transfer” foreshadowing some of the later totalitarian infamies of the 20th century should be so readily embraced in democratic 19th-century America is comprehensible in the light of cultural forces. The revival-inspired missionary movement, while Native American-friendly in theory, assumed that the cultural integrity of Indian life would and should disappear when the Indians were “brought to Christ.” A romantic sentimentalization of the “noble red man,” evidenced in the literary works of James Fenimore Cooper and Henry Wadsworth Longfellow, called attention to positive aspects of Indian life but saw Native Americans as essentially a vanishing breed. Far more common in American thought was the concept of the “treacherous redskin,” which lifted Jackson and William Henry Harrison to the presidency in 1828 and 1840, respectively, partly on the strength of their military victories over Indians. Popular celebration of allegedly Anglo-Saxon characteristics of energy and independence helped to brand other “races”—Indians as well as Africans, Asians, and Hispanics—as inferiors who would have to yield to progress. In all, the historical moment was unkind to the Indians, as some of the values that in fact did sustain the growth and prosperity of the United States were the same ones that worked against any live-and-let-live arrangement between the original Americans and the newcomers.

Bernard A. Weisberger

Attitudes toward expansionism
 Public attitudes toward expansion into Mexican territories were very much affected by the issue of slavery. Those opposed to the spread of slavery or simply not in favour of the institution joined abolitionists in discerning a proslavery policy in the Mexican-American War. The great political issue of the postwar years concerned slavery in the territories. Calhoun and spokesmen for the slave-owning South argued that slavery could not be constitutionally prohibited in the Mexican cession. “Free Soilers” supported the Wilmot Proviso idea—that slavery should not be permitted in the new territory. Others supported the proposal that popular sovereignty (called “squatter sovereignty” by its detractors) should prevail—that is, that settlers in the territories should decide the issue. Still others called for the extension westward of the 36°30′ line of demarcation for slavery that had resolved the Missouri controversy in 1820. Now, 30 years later, Clay again pressed a compromise on the country, supported dramatically by the aging Daniel Webster and by moderates in and out of the Congress. As the events in the California gold fields showed (beginning in 1849), many people had things other than political principles on their minds. The Compromise of 1850, as the separate resolutions resolving the controversy came to be known, infuriated those of high principle on both sides of the issue—Southerners resented that the compromise admitted California as a free state, abolished the slave trade in the District of Columbia, and gave territories the theoretical right to deny existence to their “peculiar institution,” while antislavery men deplored the same theoretical right of territories to permit the institution and abhorred the new, more-stringent federal fugitive-slave law.
That Southern political leaders ceased talking secession shortly after the enactment of the compromise indicates who truly won the political skirmish. The people probably approved the settlement—but as subsequent events were to show, the issues had not been met but had been only deferred.

Edward Pessen

The Civil War
Prelude to war, 1850–60
      Before the Civil War the United States experienced a whole generation of nearly unremitting political crisis. Underlying the problem was the fact that America in the early 19th century had been a country, not a nation. The major functions of government—those relating to education, transportation, health, and public order—were performed on the state or local level, and little more than a loose allegiance to the government in Washington, D.C., a few national institutions such as churches and political parties, and a shared memory of the Founding Fathers of the republic tied the country together. Within this loosely structured society every section, every state, every locality, every group could pretty much go its own way.

      Gradually, however, changes in technology and in the economy were bringing all the elements of the country into steady and close contact. Improvements in transportation—first canals, then toll roads, and especially railroads—broke down isolation and encouraged the boy from the country to wander to the city, the farmer from New Hampshire to migrate to Iowa. Improvements in the printing press, which permitted the publication of penny newspapers, and the development of the telegraph system broke through the barriers of intellectual provincialism and made everybody almost instantaneously aware of what was going on throughout the country. As the railroad network proliferated, it had to have central direction and control; and national railroad corporations—the first true “big businesses” in the United States—emerged to provide order and stability.

      For many Americans the wrench from a largely rural, slow-moving, fragmented society in the early 1800s to a bustling, integrated, national social order at midcentury was an abrupt and painful one, and they often resisted it. Sometimes resentment against change manifested itself in harsh attacks upon those who appeared to be the agents of change—especially immigrants, who seemed to personify the forces that were altering the older America. Vigorous nativist movements appeared in most cities during the 1840s; but not until the 1850s, when the huge numbers of Irish and German immigrants of the previous decade became eligible to vote, did the antiforeign fever reach its peak. Directed both against immigrants and against the Roman Catholic church, to which so many of them belonged, the so-called Know-Nothings emerged as a powerful political force in 1854 and increased the resistance to change.

Sectionalism and slavery
      A more enduring manifestation of hostility toward the nationalizing tendencies in American life was the reassertion of strong feelings of sectional loyalty. New Englanders felt threatened by the West, which drained off the ablest and most vigorous members of the labour force and also, once the railroad network was complete, produced wool and grain that undersold the products of the poor New England hill country. The West, too, developed a strong sectional feeling, blending its sense of its uniqueness, its feeling of being looked down upon as raw and uncultured, and its awareness that it was being exploited by the businessmen of the East.

 The most conspicuous and distinctive section, however, was the South—an area set apart by climate, by a plantation system designed for the production of such staple crops as cotton, tobacco, and sugar, and, especially, by the persistence of slavery, which had been abolished or prohibited in all other parts of the United States. It should not be thought that all or even most white Southerners were directly involved in the section's “peculiar institution.” Indeed, in 1850 there were only 347,525 slaveholders in a total white population of about 6,000,000 in the slave states. Half of these owned four slaves or fewer and could not be considered planters. In the entire South there were fewer than 1,800 persons who owned more than 100 slaves.

      Nevertheless, slavery did give a distinctive tone to the whole pattern of Southern life. If the large planters were few, they were also wealthy, prestigious, and powerful; often they were the political as well as the economic leaders of their section; and their values pervaded every stratum of Southern society. Far from opposing slavery, small farmers thought only of the possibility that they too might, with hard work and good fortune, some day join the ranks of the planter class—to which they were closely connected by ties of blood, marriage, and friendship. Behind this virtually unanimous support of slavery lay the universal belief—shared by many whites in the North and West as well—that blacks were an innately inferior people who had risen only to a state of barbarism in their native Africa and who could live in a civilized society only if disciplined through slavery. Though by 1860 there were in fact about 250,000 free blacks in the South, most Southern whites resolutely refused to believe that the slaves, if freed, could ever coexist peacefully with their former masters. With shuddering horror, they pointed to an insurrection of blacks that had occurred in Santo Domingo, to a brief slave rebellion led by the African American Gabriel in Virginia in 1800, to a plot of Charleston, South Carolina, blacks headed by Denmark Vesey in 1822, and, especially, to a bloody and determined Virginia insurrection led by Nat Turner in 1831 as evidence that African Americans had to be kept under iron control. Facing increasing opposition to slavery outside their section, Southerners developed an elaborate proslavery argument, defending the institution on biblical, economic, and sociological grounds.

A decade of political crises
      In the early years of the republic, sectional differences had existed, but it had been possible to reconcile or ignore them because distances were great, communication was difficult, and the powerless national government had almost nothing to do. The revolution in transportation and communication, however, eliminated much of the isolation, and the victory of the United States in its brief war with Mexico left the national government with problems that required action.

 The Compromise of 1850 was an uneasy patchwork of concessions to all sides that began to fall apart as soon as it was enacted. In the long run the principle of popular sovereignty proved to be the most unsatisfactory of all, making each territory a battleground where the supporters of the South contended with the defenders of the North and West.

      The seriousness of those conflicts became clear in 1854, when Stephen A. Douglas introduced his Kansas bill in Congress, establishing a territorial government for the vast region that lay between the Missouri River and the Rocky Mountains. In the Senate the bill was amended to create not one but two territories—Kansas and Nebraska (the Kansas-Nebraska Act)—from the part of the Louisiana Purchase from which the Missouri Compromise of 1820 had forever excluded slavery. Douglas, who was unconcerned over the moral issue of slavery and desirous of getting on with the settling of the West and the construction of a transcontinental railroad, knew that the Southern senators would block the organization of Kansas as a free territory. Recognizing that the North and West had outstripped their section in population and hence in the House of Representatives, Southerners clung desperately to an equality of votes in the Senate and were not disposed to welcome any new free territories, which would inevitably become additional free states (as California had done through the Compromise of 1850). Accordingly, Douglas thought that the doctrine of popular sovereignty, which had been applied to the territories gained from Mexico, would avoid a political contest over the Kansas territory: it would permit Southern slaveholders to move into the area, but, since the region was unsuited for plantation slavery, it would inevitably result in the formation of additional free states. His bill therefore allowed the inhabitants of the territory self-government in all matters of domestic importance, including the slavery issue. This provision in effect allowed the territorial legislatures to mandate slavery in their areas and was directly contrary to the Missouri Compromise. With the backing of President Franklin Pierce (served 1853–57), Douglas bullied, wheedled, and bluffed congressmen into passing his bill.

Polarization over slavery
      Northern sensibilities were outraged. Although disliking slavery, Northerners had made few efforts to change the South's “peculiar institution” so long as the republic was loosely articulated. (Indeed, when William Lloyd Garrison began his Liberator in 1831, urging the immediate and unconditional emancipation of all slaves, he had only a tiny following; and a few years later he had actually been mobbed in Boston.) But with the sections, perforce, being drawn closely together, Northerners could no longer profess indifference to the South and its institutions. Sectional differences, centring on the issue of slavery, began to appear in every American institution. During the 1840s the major national religious denominations, such as the Methodists and the Presbyterians, split over the slavery question. The Whig Party, which had once allied the conservative businessmen of the North and West with the planters of the South, divided and virtually disappeared after the election of 1852. When Douglas's bill opened up to slavery Kansas and Nebraska—land that had long been reserved for the westward expansion of the free states—Northerners began to organize into an antislavery political party, called in some states the Anti-Nebraska Democratic Party, in others the People's Party, but in most places, the Republican Party.

      Events of 1855 and 1856 further exacerbated relations between the sections and strengthened this new party. Kansas, once organized by Congress, became the field of battle between the free and the slave states in a contest in which concern over slavery was mixed with land speculation and office seeking. A virtual civil war (Bleeding Kansas) broke out, with rival free- and slave-state legislatures both claiming legitimacy. Disputes between individual settlers sometimes erupted into violence. A proslavery mob sacked the town of Lawrence, an antislavery stronghold, on May 21, 1856. On May 24–25 John Brown, a free-state partisan, led a small party in a raid upon some proslavery settlers on Pottawatomie Creek, murdered five men in cold blood, and left their gashed and mutilated bodies as a warning to the slaveholders. Not even the U.S. Capitol was safe from the violence. On May 22 Preston S. Brooks, a South Carolina congressman, brutally attacked Senator Charles Sumner of Massachusetts at his desk in the Senate chamber because Sumner had allegedly insulted the Carolinian's “honour” in a speech given in support of Kansas abolitionists. The 1856 presidential election made it clear that voting was becoming polarized along sectional lines. Though James Buchanan, the Democratic nominee, was elected, John C. Frémont, the Republican candidate, received a majority of the votes in the free states.

      The following year the Supreme Court of the United States tried to solve the sectional conflicts that had baffled both the Congress and the president. Hearing the case of Dred Scott, a Missouri slave who claimed freedom on the ground that his master had taken him to live in free territory, the majority of the court, headed by Chief Justice Roger B. Taney, found that African Americans were not citizens of the United States and that Scott hence had no right to bring suit before the court. Taney also concluded that the U.S. laws prohibiting slavery in the territories were unconstitutional. Two Northern antislavery judges on the court bitterly attacked Taney's logic and his conclusions. Acclaimed in the South, the Dred Scott decision was condemned and repudiated throughout the North.

      By this point many Americans, North and South, had come to the conclusion that slavery and freedom could not much longer coexist in the United States. For Southerners the answer was withdrawal from a Union that no longer protected their rights and interests; they had talked of it as early as the Nashville Convention of 1850, when the compromise measures were under consideration, and now more and more Southerners favoured secession. For Northerners the remedy was to change the social institutions of the South; few advocated immediate or complete emancipation of the slaves, but many felt that the South's “peculiar institution” must be contained. In 1858 William H. Seward, the leading Republican of New York, spoke of an “irrepressible conflict” between freedom and slavery; and in Illinois a rising Republican politician, Abraham Lincoln, who unsuccessfully contested Douglas for a seat in the Senate, announced that “this government cannot endure, permanently half slave and half free.”

 That it was not possible to end the agitation over slavery became further apparent in 1859 when on the night of October 16, John Brown, who had escaped punishment for the Pottawatomie massacre, staged a raid on Harpers Ferry, Virginia (now in West Virginia), designed to free the slaves and, apparently, to help them begin a guerrilla war against the Southern whites. Even though Brown was promptly captured and Virginia slaves gave no heed to his appeals, Southerners feared that this was the beginning of organized Northern efforts to undermine their social system. The fact that Brown was a fanatic and an inept strategist whose actions were considered questionable even by abolitionists did not lessen Northern admiration for him.

      The presidential election of 1860 occurred, therefore, in an atmosphere of great tension. Southerners, determined that their rights should be guaranteed by law, insisted upon a Democratic candidate willing to protect slavery in the territories; and they rejected Stephen A. Douglas, whose popular-sovereignty doctrine left the question in doubt, in favour of John C. Breckinridge. Douglas, backed by most of the Northern and border-state Democrats, ran on a separate Democratic ticket. Elderly conservatives, who deplored all agitation of the sectional questions but advanced no solutions, offered John Bell as candidate of the Constitutional Union Party. Republicans, confident of success, passed over the claims of Seward, who had accumulated too many liabilities in his long public career, and nominated Lincoln instead. Voting in the subsequent election was along markedly sectional patterns, with Republican strength confined almost completely to the North and West. Though Lincoln received only a plurality of the popular vote, he was an easy winner in the electoral college.

Secession and the politics of the Civil War, 1860–65

The coming of the war
      In the South, Lincoln's election was taken as the signal for secession, and on December 20 South Carolina became the first state to withdraw from the Union. Promptly the other states of the lower South followed. Feeble efforts on the part of Buchanan's administration to check secession failed, and one by one most of the federal forts in the Southern states were taken over by secessionists. Meanwhile, strenuous efforts in Washington to work out another compromise failed. (The most promising plan was John J. Crittenden's proposal to extend the Missouri Compromise line, dividing free from slave states, to the Pacific.)

      Neither extreme Southerners, now intent upon secession, nor Republicans, intent upon reaping the rewards of their hard-won election victory, were really interested in compromise. On February 4, 1861—a month before Lincoln could be inaugurated in Washington—six Southern states (South Carolina, Georgia, Alabama, Florida, Mississippi, Louisiana) sent representatives to Montgomery, Alabama, to set up a new independent government. Delegates from Texas soon joined them. With Jefferson Davis of Mississippi at its head, the Confederate States of America came into being, set up its own bureaus and offices, issued its own money, raised its own taxes, and flew its own flag. Not until May 1861, after hostilities had broken out and Virginia had seceded, did the new government transfer its capital to Richmond.

 Faced with a fait accompli, Lincoln when inaugurated was prepared to conciliate the South in every way but one: he would not recognize that the Union could be divided. The test of his determination came early in his administration, when he learned that the Federal troops under Major Robert Anderson in Fort Sumter, South Carolina—then one of the few military installations in the South still in Federal hands—had to be promptly supplied or withdrawn. After agonized consultation with his cabinet, Lincoln determined that supplies must be sent even if doing so provoked the Confederates into firing the first shot. On April 12, 1861, just before Federal supply ships could reach the beleaguered Anderson, Confederate guns in Charleston opened fire upon Fort Sumter, and the war began.

The political course of the war
      For the next four years the Union and the Confederacy were locked in conflict—by far the most titanic war ever waged in the Western Hemisphere.

      The policies pursued by the governments of Abraham Lincoln and Jefferson Davis were astonishingly similar. Both presidents at first relied upon volunteers to man the armies, and both administrations were poorly prepared to arm and equip the hordes of young men who flocked to the colours in the initial stages of the war. As the fighting progressed, both governments reluctantly resorted to conscription—the Confederates first, in early 1862, and the Federal government more slowly, with an ineffective measure of late 1862 followed by a more stringent law in 1863. Both governments pursued an essentially laissez-faire policy in economic matters, with little effort to control prices, wages, or profits. Only the railroads were subject to close government regulation in both regions; and the Confederacy, in constructing some of its own powder mills, made a few experiments in “state socialism.” Neither Lincoln's nor Davis's administration knew how to cope with financing the war; neither developed an effective system of taxation until late in the conflict, and both relied heavily upon borrowing. Faced with a shortage of funds, both governments were obliged to turn to the printing press and to issue fiat money; the U.S. government issued $432,000,000 in “greenbacks” (as this irredeemable, non-interest-bearing paper money was called), while the Confederacy printed over $1,554,000,000 in such paper currency. In consequence, both sections experienced runaway inflation, which was much more drastic in the South, where, by the end of the war, flour sold at $1,000 a barrel.

      Even toward slavery, the root cause of the war, the policies of the two warring governments were surprisingly similar. The Confederate constitution, which was in most other ways similar to that of the United States, expressly guaranteed the institution of slavery. Despite pressure from abolitionists, Lincoln's administration was not initially disposed to disturb the “peculiar institution,” if only because any move toward emancipation would upset the loyalty of Delaware, Maryland, Kentucky, and Missouri—the four slave states that remained in the Union.

Moves toward emancipation
      Gradually, however, under the pressure of war, both governments moved to end slavery. Lincoln came to see that emancipation of African Americans would favourably influence European opinion toward the Northern cause, might deprive the Confederates of their productive labour force on the farms, and would add much-needed recruits to the Federal armies. In September 1862 he issued his preliminary proclamation of emancipation, promising to free all slaves in rebel territory by January 1, 1863, unless those states returned to the Union; and when the Confederates remained obdurate, he followed it with his promised final proclamation. A natural accompaniment of emancipation was the use of African American troops, and by the end of the war the number of blacks who served in the Federal armies totaled 178,895. Uncertain of the constitutionality of his Emancipation Proclamation, Lincoln urged Congress to abolish slavery by constitutional amendment; but this was not done until January 31, 1865, with the Thirteenth Amendment, and the actual ratification did not take place until after the war.

      Meanwhile the Confederacy, though much more slowly, was also inexorably drifting in the direction of emancipation. The South's desperate need for troops caused many military men, including Robert E. Lee, to demand the recruitment of blacks; finally, in March 1865 the Confederate congress authorized the raising of African American regiments. Though a few blacks were recruited for the Confederate armies, none actually served in battle because surrender was at hand. In yet another way Davis's government showed its awareness of slavery's inevitable end when, in a belated diplomatic mission to seek assistance from Europe, the Confederacy in March 1865 promised to emancipate the slaves in return for diplomatic recognition. Nothing came of the proposal, but it is further evidence that by the end of the war both North and South realized that slavery was doomed.

Sectional dissatisfaction
      As war leaders, both Lincoln and Davis came under severe attack in their own sections. Both had to face problems of disloyalty. In Lincoln's case, the Irish immigrants to the eastern cities and the Southern-born settlers of the northwestern states were especially hostile to African Americans and, therefore, to emancipation, while many other Northerners became tired and disaffected as the war dragged on interminably. Residents of the Southern hill country, where slavery never had much of a foothold, were similarly hostile toward Davis. Furthermore, in order to wage war, both presidents had to strengthen the powers of central government, thus further accelerating the process of national integration that had brought on the war. Both administrations were, in consequence, vigorously attacked by state governors, who resented the encroachment upon their authority and who strongly favoured local autonomy.

      The extent of Northern dissatisfaction was indicated in the congressional elections of 1862, when Lincoln and his party sustained a severe rebuff at the polls and the Republican majority in the House of Representatives was drastically reduced. Similarly in the Confederacy the congressional elections of 1863 went so strongly against the administration that Davis was able to command a majority for his measures only through the continued support of representatives and senators from the states of the upper South, which were under control of the Federal army and consequently unable to hold new elections.

 As late as August 1864, Lincoln despaired of his reelection to the presidency and fully expected that the Democratic candidate, General George B. McClellan, would defeat him. Davis, at about the same time, was openly attacked by Alexander H. Stephens, the vice president of the Confederacy. But Federal military victories, especially William Tecumseh Sherman's capture of Atlanta, greatly strengthened Lincoln; and, as the war came to a triumphant close for the North, he attained new heights of popularity. Davis's administration, on the other hand, lost support with each successive defeat, and in January 1865 the Confederate congress insisted that Davis make Robert E. Lee the supreme commander of all Southern forces. (Some, it is clear, would have preferred to make the general dictator.)

David Herbert Donald

Fighting the Civil War
      Following the capture of Fort Sumter, both sides quickly began raising and organizing armies. On July 21, 1861, some 30,000 Union troops marching toward the Confederate capital of Richmond, Virginia, were stopped at Bull Run (Manassas) and then driven back to Washington, D.C., by Confederates under General Thomas J. “Stonewall” Jackson and General P.G.T. Beauregard. The shock of defeat galvanized the Union, which called for 500,000 more recruits. General George B. McClellan was given the job of training the Union's Army of the Potomac.

  The first major campaign of the war began in February 1862, when the Union general Ulysses S. Grant captured the Confederate strongholds of Fort Henry and Fort Donelson in western Tennessee; this action was followed by the Union general John Pope's capture of New Madrid, Missouri, a bloody but inconclusive battle at Shiloh (Pittsburg Landing), Tennessee, on April 6–7, and the occupation of Corinth, Mississippi, and Memphis, Tennessee, in June. Also in April, the Union naval commodore David G. Farragut gained control of New Orleans. In the East, McClellan launched a long-awaited offensive with 100,000 men in another attempt to capture Richmond. Opposed first by General Joseph E. Johnston and then, after Johnston was wounded, by General Robert E. Lee and his able lieutenant Jackson, McClellan moved cautiously and in the Seven Days' Battles (June 25–July 1) was turned back, his Peninsular Campaign a failure. At the Second Battle of Bull Run (August 29–30), Lee drove another Union army, under Pope, out of Virginia and followed up by invading Maryland. McClellan was able to check Lee's forces at Antietam (or Sharpsburg, September 17). Lee withdrew, regrouped, and dealt McClellan's successor, A.E. Burnside, a heavy defeat at Fredericksburg, Virginia, on December 13.

    Burnside was in turn replaced as commander of the Army of the Potomac by General Joseph Hooker, who took the offensive in April 1863. He attempted to outflank Lee's position at Chancellorsville, Virginia, but was completely outmaneuvered (May 1–5) and forced to retreat. Lee then undertook a second invasion of the North. He entered Pennsylvania, and a chance encounter of small units developed into a climactic battle at Gettysburg (July 1–3), where the new Union commander, General George G. Meade, held defensive positions. Lee's forces were repulsed at the Battle of Gettysburg and fell back into Virginia. At nearly the same time, a turning point was reached in the West. After two months of masterly maneuvering, Grant captured Vicksburg, Mississippi, on July 4, 1863. Soon the Mississippi River was entirely under Union control, effectively cutting the Confederacy in two. In October, after a Union army under General W.S. Rosecrans had been defeated at Chickamauga Creek, Georgia (September 19–20), Grant was called to take command in that theatre. Ably assisted by General William Tecumseh Sherman and General George Thomas, Grant drove Confederate general Braxton Bragg out of Chattanooga (November 23–25) and out of Tennessee; Sherman subsequently secured Knoxville.

  In March 1864 Lincoln gave Grant supreme command of the Union armies. Grant took personal command of the Army of the Potomac in the east and soon formulated a strategy of attrition based upon the Union's overwhelming superiority in numbers and supplies. He began to move in May, suffering extremely heavy casualties in the battles of the Wilderness, Spotsylvania, and Cold Harbor, all in Virginia, and by mid-June he had Lee pinned down in fortifications before Petersburg, Virginia. For nearly 10 months the siege of Petersburg continued, while Grant slowly closed around Lee's positions. Meanwhile, Sherman faced the only other Confederate force of consequence in Georgia. Sherman captured Atlanta early in September, and in November he set out on his 300-mile (480-km) march through Georgia, leaving a swath of devastation behind him. He reached Savannah on December 10 and soon captured that city.

 By March 1865 Lee's army was thinned by casualties and desertions and was desperately short of supplies. Grant began his final advance on April 1 at Five Forks, captured Richmond on April 3, and accepted Lee's surrender at nearby Appomattox Court House on April 9. Sherman had moved north into North Carolina, and on April 26 he received the surrender of J.E. Johnston. The war was over.

      Naval operations in the Civil War were secondary to the war on land, but there were nonetheless some celebrated exploits. David Farragut was justly hailed for his actions at New Orleans and at Mobile Bay (August 5, 1864), and the battle of the ironclads Monitor and Merrimack (March 9, 1862) is often held to have opened the modern era of naval warfare. For the most part, however, the naval war was one of blockade as the Union attempted, largely successfully, to stop the Confederacy's commerce with Europe.

      Davis and many Confederates expected recognition of their independence and direct intervention in the war on their behalf by Great Britain and possibly France. But they were cruelly disappointed, in part through the skillful diplomacy of Lincoln, Secretary of State Seward, and the Union ambassador to England, Charles Francis Adams, and in part through Confederate military failure at a crucial stage of the war.

      The Union's first trouble with Britain came when Captain Charles Wilkes halted the British steamer Trent on November 8, 1861, and forcibly removed two Confederate envoys, James M. Mason and John Slidell, bound for Europe. Only the eventual release of the two men prevented a diplomatic rupture with Lord Palmerston's government in London. Another crisis erupted between the Union and England when the Alabama, built in the British Isles, was permitted upon completion to sail and join the Confederate navy, despite Adams's protestations. And when word reached the Lincoln government that two powerful rams were being constructed in Britain for the Confederacy, Adams reputedly sent his famous “this is war” note to Palmerston, and the rams were seized by the British government at the last moment.

      The diplomatic crisis of the Civil War came after Lee's striking victory at the Second Battle of Bull Run in late August 1862 and subsequent invasion of Maryland. The British government was set to offer mediation of the war and, if this was refused by the Lincoln administration (as it would have been), forceful intervention on behalf of the Confederacy. Only a victory by Lee on Northern soil was needed, but he was stopped by McClellan in September at Antietam, the Union's most needed success. The Confederate defeats at Gettysburg and Vicksburg the following summer ensured the continuing neutrality of Britain and France, especially when Russia seemed inclined to favour the Northern cause. Even the growing British shortage of cotton from the Southern states did not force Palmerston's government into Davis's camp, particularly when British consuls in the Confederacy were more closely restricted toward the close of the war. In the final act, even the Confederate offer to abolish slavery in early 1865 in return for British recognition fell on deaf ears.

      The war was horribly costly for both sides. The Federal forces sustained more than a half million casualties (including nearly 360,000 deaths); the Confederate armies suffered about 483,000 casualties (approximately 258,000 deaths). Both governments, after strenuous attempts to finance loans, were obliged to resort to the printing press to make fiat money. While separate Confederate figures are lacking, the war finally cost the United States more than $15 billion. The South, especially, where most of the war was fought and which lost its labour system, was physically and economically devastated. In sum, although the Union was preserved and restored, the cost in physical and moral suffering was incalculable, and some spiritual wounds caused by the war still have not been healed.

Warren W. Hassler, Jr. Ed.

Reconstruction and the New South, 1865–1900
Reconstruction, 1865–77

Reconstruction under Abraham Lincoln
 The original Northern objective in the Civil War was the preservation of the Union—a war aim with which virtually everybody in the free states agreed. As the fighting progressed, the Lincoln government concluded that emancipation of the slaves was necessary in order to secure military victory; and thereafter freedom became a second war aim for the members of the Republican Party. The more radical members of that party—men like Charles Sumner and Thaddeus Stevens—believed that emancipation would prove a sham unless the government guaranteed the civil and political rights of the freedmen; thus, equality of all citizens before the law became a third war aim for this powerful faction. The fierce controversies of the Reconstruction era raged over which of these objectives should be insisted upon and how these goals should be secured.

      Lincoln himself had a flexible and pragmatic approach to Reconstruction, insisting only that the Southerners, when defeated, pledge future loyalty to the Union and emancipate their slaves. As the Southern states were subdued, he appointed military governors to supervise their restoration. The most vigorous and effective of these appointees was Andrew Johnson, a War Democrat whose success in reconstituting a loyal government in Tennessee led to his nomination as vice president on the Republican ticket with Lincoln in 1864. In December 1863 Lincoln announced a general plan for the orderly Reconstruction of the Southern states, promising to recognize the government of any state that pledged to support the Constitution and the Union and to emancipate the slaves if it was backed by at least 10 percent of the number of voters in the 1860 presidential election. In Louisiana, Arkansas, and Tennessee loyal governments were formed under Lincoln's plan; and they sought readmission to the Union with the seating of their senators and representatives in Congress.

The Radicals' plan
      Radical Republicans were outraged at these procedures, which savoured of executive usurpation of congressional powers, which required only minimal changes in the Southern social system, and which left political power essentially in the hands of the same Southerners who had led their states out of the Union. The Radicals put forth their own plan of Reconstruction in the Wade–Davis Bill, which Congress passed on July 2, 1864; it required not 10 percent but a majority of the white male citizens in each Southern state to participate in the reconstruction process, and it insisted upon an oath of past, not just of future, loyalty. Finding the bill too rigorous and inflexible, Lincoln pocket vetoed it; and the Radicals bitterly denounced him. During the 1864–65 session of Congress, they in turn defeated the president's proposal to recognize the Louisiana government organized under his 10 percent plan. At the time of Lincoln's assassination, therefore, the president and the Congress were at loggerheads over Reconstruction.

Reconstruction under Andrew Johnson
      At first it seemed that Johnson might be able to work more cooperatively with Congress in the process of Reconstruction. A former representative and a former senator, he understood congressmen. A loyal Unionist who had stood by his country even at the risk of his life when Tennessee seceded, he was certain not to compromise with secession; and his experience as military governor of that state showed him to be politically shrewd and tough toward the slaveholders. “Johnson, we have faith in you,” Radical Benjamin F. Wade assured the new president on the day he took the oath of office. “By the gods, there will be no trouble running the government.”

Johnson's policy
      Such Radical trust in Johnson proved misplaced. The new president was, first of all, himself a Southerner. He was a Democrat who looked for the restoration of his old party partly as a step toward his own reelection to the presidency in 1868. Most important of all, Johnson shared the white Southerners' attitude toward African Americans, considering black men innately inferior and unready for equal civil or political rights. On May 29, 1865, Johnson made his policy clear when he issued a general proclamation of pardon and amnesty for most Confederates and authorized the provisional governor of North Carolina to proceed with the reorganization of that state. Shortly afterward he issued similar proclamations for the other former Confederate states. In each case a state constitutional convention was to be chosen by the voters who pledged future loyalty to the U.S. Constitution. The conventions were expected to repeal the ordinances of secession, to repudiate the Confederate debt, and to accept the Thirteenth Amendment, abolishing slavery. The president did not, however, require them to enfranchise African Americans.

“Black Codes”
      Given little guidance from Washington, Southern whites turned to the traditional political leaders of their section for guidance in reorganizing their governments; and the new regimes in the South were suspiciously like those of the antebellum period. To be sure, slavery was abolished; but each reconstructed Southern state government proceeded to adopt a “black code,” regulating the rights and privileges of freedmen. Varying from state to state, these codes in general treated African Americans as inferiors, relegated to a secondary and subordinate position in society. Their right to own land was restricted, they could not bear arms, and they might be bound out in servitude for vagrancy and other offenses. The conduct of white Southerners indicated that they were not prepared to guarantee even minimal protection of African American rights. In riots in Memphis (May 1866) and New Orleans (July 1866), African Americans were brutally assaulted and promiscuously killed.

Civil rights legislation
      Watching these developments with forebodings, Northern Republicans during the congressional session of 1865–66 inevitably drifted into conflict with the president. Congress attempted to protect the rights of African Americans by extending the life of the Freedmen's Bureau, a welfare agency established in March 1865 to ease the transition from slavery to freedom; but Johnson vetoed the bill. An act to define and guarantee African Americans' basic civil rights met a similar fate, but Republicans succeeded in passing it over the president's veto. While the president, from the porch of the White House, denounced the leaders of the Republican Party as “traitors,” Republicans in Congress tried to formulate their own plan to reconstruct the South. Their first effort was the passage of the Fourteenth Amendment, which guaranteed the basic civil rights of all citizens, regardless of colour, and which tried to persuade the Southern states to enfranchise African Americans by threatening to reduce their representation in Congress.

      The president, the Northern Democrats, and the Southern whites spurned this Republican plan of Reconstruction. Johnson tried to organize his own political party in the National Union Convention, which met in Philadelphia in August 1866; and in August and September he visited many Northern and Western cities in order to defend his policies and to attack the Republican leaders. At the president's urging, every Southern state except Tennessee overwhelmingly rejected the Fourteenth Amendment.

      Victorious in the fall elections, congressional Republicans moved during the 1866–67 session to devise a second, more stringent program for reconstructing the South. After long and acrimonious quarrels between Radical and moderate Republicans, the party leaders finally produced a compromise plan in the First Reconstruction Act of 1867. Expanded and clarified in three supplementary Reconstruction acts, this legislation swept away the regimes the president had set up in the South, put the former Confederacy back under military control, called for the election of new constitutional conventions, and required the constitutions adopted by these bodies to include both African American suffrage and the disqualification of former Confederate leaders from officeholding. Under this legislation, new governments were established in all the former Confederate states (except Tennessee, which had already been readmitted); and by July 1868 Congress agreed to seat senators and representatives from Alabama, Arkansas, Florida, Louisiana, North Carolina, and South Carolina. By July 1870 the remaining Southern states had been similarly reorganized and readmitted.

      Suspicious of Andrew Johnson, Republicans in Congress did not trust the president to enforce the Reconstruction legislation they passed over his repeated vetoes, and they tried to deprive him of as much power as possible. Congress limited the president's control over the army by requiring that all his military orders be issued through the general of the army, Ulysses S. Grant, who was believed loyal to the Radical cause; and in the Tenure of Office Act (1867) they limited the president's right to remove appointive officers. When Johnson continued to do all he could to block the enforcement of Radical legislation in the South, the more extreme members of the Republican Party demanded his impeachment. The president's decision in February 1868 to remove the Radical secretary of war Edwin M. Stanton from the Cabinet, in apparent defiance of the Tenure of Office Act, provided a pretext for impeachment proceedings. The House of Representatives voted to impeach the president, and after a protracted trial the Senate acquitted him by the margin of only one vote.

The South during Reconstruction
      In the South the Reconstruction period was a time of readjustment accompanied by disorder. Southern whites wished to keep African Americans in a condition of quasi-servitude, extending few civil rights and firmly rejecting social equality. African Americans, on the other hand, wanted full freedom and, above all, land of their own. Inevitably, there were frequent clashes. Some erupted into race riots, but acts of terrorism against individual African American leaders were more common.

      During this turmoil, Southern whites and blacks began to work out ways of getting their farms back into operation and of making a living. Indeed, the most important developments of the Reconstruction era were not the highly publicized political contests but the slow, almost imperceptible changes that occurred in Southern society. African Americans could now legally marry, and they set up conventional and usually stable family units; they quietly seceded from the white churches and formed their own religious organizations, which became centres for the African American community. Without land or money, most freedmen had to continue working for white masters; but they were now unwilling to labour in gangs or to live in the old slave quarters under the eye of the plantation owner.

      Sharecropping gradually became the accepted labour system in most of the South—planters, short of capital, favoured the system because it did not require them to pay cash wages; African Americans preferred it because they could live in individual cabins on the tracts they rented and because they had a degree of independence in choosing what to plant and how to cultivate. The section as a whole, however, was desperately poor throughout the Reconstruction era; and a series of disastrously bad crops in the late 1860s, followed by the general agricultural depression of the 1870s, hurt both whites and blacks.

      The governments set up in the Southern states under the congressional program of Reconstruction were, contrary to traditional clichés, fairly honest and effective. Though the period has sometimes been labeled “Black Reconstruction,” the Radical governments in the South were never dominated by African Americans. There were no black governors, only two black senators and a handful of congressmen, and only one legislature controlled by blacks. Those African Americans who did hold office appear to have been similar in competence and honesty to the whites. It is true that these Radical governments were expensive, but large state expenditures were necessary to rebuild after the war and to establish—for the first time in most Southern states—a system of common schools. Corruption there certainly was, though nowhere on the scale of the Tweed Ring, which at that time was busily looting New York City; but it is not possible to show that Republicans were more guilty than Democrats, or blacks than whites, in the scandals that did occur.

      Though some Southern whites in the mountainous regions and some planters in the rich bottomlands were willing to cooperate with the African Americans and their Northern-born “carpetbagger” allies in these new governments, there were relatively few such “scalawags”; the mass of Southern whites remained fiercely opposed to African American political, civil, and social equality. Sometimes their hostility was expressed through such terrorist organizations as the Ku Klux Klan, which sought to punish so-called “uppity Negroes” and to drive their white collaborators from the South. More frequently it was manifested through support of the Democratic Party, which gradually regained its strength in the South and waited for the time when the North would tire of supporting the Radical regimes and would withdraw federal troops from the South.

The Ulysses S. Grant administrations, 1869–77
      During the two administrations of President Grant there was a gradual attrition of Republican strength. As a politician the president was passive, exhibiting none of the brilliance he had shown on the battlefield. His administration was tarnished by the dishonesty of his subordinates, whom he loyally defended. As the older Radical leaders—men like Sumner, Wade, and Stevens—died, leadership in the Republican Party fell into the hands of technicians like Roscoe Conkling and James G. Blaine, men devoid of the idealistic fervour that had marked the early Republicans. At the same time, many Northerners were growing tired of the whole Reconstruction issue and were weary of the annual outbreaks of violence in the South that required repeated use of federal force.

      Efforts to shore up the Radical regimes in the South grew increasingly unsuccessful. The adoption of the Fifteenth Amendment (1870), prohibiting discrimination in voting on account of race, had little effect in the South, where terrorist organizations and economic pressure from planters kept African Americans from the polls. Nor, in the long run, were the three Force Acts passed by the Republicans (1870–71) more successful; these gave the president the power to suspend the writ of habeas corpus and imposed heavy penalties upon terrorist organizations. If the acts succeeded in dispersing the Ku Klux Klan as an organization, they also drove its members, and their tactics, more than ever into the Democratic camp.

      Growing Northern disillusionment with Radical Reconstruction and with the Grant administration became evident in the Liberal Republican movement of 1872, which resulted in the nomination of the erratic Horace Greeley for president. Though Grant was overwhelmingly reelected, the true temper of the country was demonstrated in the congressional elections of 1874, which gave the Democrats control of the House of Representatives for the first time since the outbreak of the Civil War. Despite Grant's hope for a third term in office, most Republicans recognized by 1876 that it was time to change both the candidate and his Reconstruction program, and the nomination of Rutherford B. Hayes of Ohio, a moderate Republican of high principles and of deep sympathy for the South, marked the end of the Radical domination of the Republican Party.

      The circumstances surrounding the disputed election of 1876 strengthened Hayes's intention to work with the Southern whites, even if it meant abandoning the few Radical regimes that remained in the South. In an election marked by widespread fraud and many irregularities, the Democratic candidate, Samuel J. Tilden, received the majority of the popular vote; but the vote in the electoral college was long in doubt. In order to resolve the impasse, Hayes's lieutenants had to enter into agreement with Southern Democratic congressmen, promising to withdraw the remaining federal troops from the South, to share the Southern patronage with Democrats, and to favour that section's demands for federal subsidies in the building of levees and railroads. Hayes's inauguration marked, for practical purposes, the restoration of “home rule” for the South—i.e., that the North would no longer interfere in Southern elections to protect African Americans and that the Southern whites would again take control of their state governments.

The New South, 1877–90

The era of conservative domination, 1877–90
      The Republican regimes in the Southern states began to fall as early as 1870; by 1877 they had all collapsed. For the next 13 years the South was under the leadership of white Democrats whom their critics called Bourbons because, like the French royal family, they supposedly had learned nothing and forgotten nothing from the revolution they had experienced. For the South as a whole, the characterization is neither quite accurate nor quite fair. In most Southern states the new political leaders represented not only the planters but also the rising Southern business community, interested in railroads, cotton textiles, and urban land speculation.

      Even on racial questions the new Southern political leaders were not so reactionary as the label Bourbon might suggest. Though whites were in the majority in all but two of the Southern states, the conservative regimes did not attempt to disfranchise African Americans. Partly their restraint was caused by fear of further federal intervention; chiefly, however, it stemmed from a conviction on the part of conservative leaders that they could control African American voters, whether through fraud, intimidation, or manipulation.

      Indeed, African American votes were sometimes of great value to these regimes, which favoured the businessmen and planters of the South at the expense of the small white farmers. These “Redeemer” governments sharply reduced or even eliminated the programs of the state governments that benefited poor people. The public school system was starved for money; in 1890 the per capita expenditure in the South for public education was only 97 cents, as compared with $2.24 in the country as a whole. The care of state prisoners, the insane, and the blind was also neglected; and measures to safeguard the public health were rejected. At the same time these conservative regimes were often astonishingly corrupt, and embezzlement and defalcation on the part of public officials were even greater than during the Reconstruction years.

      The small white farmers resentful of planter dominance, residents of the hill country outvoted by Black Belt constituencies, and politicians excluded from the ruling cabals tried repeatedly to overthrow the conservative regimes in the South. During the 1870s they supported Independent or Greenback Labor candidates, but without notable success. In 1879 the Readjuster Party in Virginia—so named because its supporters sought to readjust the huge funded debt of that state so as to lessen the tax burden on small farmers—gained control of the legislature and secured in 1880 the election of its leader, General William Mahone, to the U.S. Senate. Not until 1890, however, when the powerful Farmers' Alliance, hitherto devoted exclusively to the promotion of agricultural reforms, dropped its ban on politics, was there an effective challenge to conservative hegemony. In that year, with Alliance backing, Benjamin R. Tillman was chosen governor of South Carolina and James S. Hogg was elected governor of Texas; the heyday of Southern populism was at hand.

Jim Crow legislation
      African American voting in the South was a casualty of the conflict between Redeemers and Populists. Although some Populist leaders, such as Tom Watson in Georgia, saw that poor whites and poor blacks in the South had a community of interest in the struggle against the planters and the businessmen, most small white farmers exhibited vindictive hatred toward African Americans, whose votes had so often been instrumental in upholding conservative regimes. Beginning in 1890, when Mississippi held a new constitutional convention, and continuing through 1908, when Georgia amended its constitution, every state of the former Confederacy moved to disfranchise African Americans. Because the U.S. Constitution forbade outright racial discrimination, the Southern states excluded African Americans by requiring that potential voters be able to read or to interpret any section of the Constitution—a requirement that local registrars waived for whites but rigorously insisted upon when an audacious black wanted to vote. Louisiana, more ingenious, added the “grandfather clause” to its constitution, which exempted from this literacy test all of those who had been entitled to vote on Jan. 1, 1867—i.e., before Congress imposed African American suffrage upon the South—together with their sons and grandsons. Other states imposed stringent property qualifications for voting or enacted complex poll taxes.

      Socially as well as politically, race relations in the South deteriorated as farmers' movements rose to challenge the conservative regimes. By 1890, with the triumph of Southern populism, the African American's place was clearly defined by law; he was relegated to a subordinate and entirely segregated position. Not only were legal sanctions (some reminiscent of the “Black Codes”) being imposed upon African Americans, but informal, extralegal, and often brutal steps were also being taken to keep them in their “place.” From 1889 to 1899, lynchings in the South averaged 187.5 per year.

Booker T. Washington and the Atlanta Compromise
      Faced with implacable and growing hostility from Southern whites, many African Americans during the 1880s and '90s felt that their only sensible course was to avoid open conflict and to work out some pattern of accommodation. The most influential African American spokesman for this policy was Booker T. Washington, the head of Tuskegee Institute in Alabama, who urged his fellow African Americans to forget about politics and college education in the classical languages and to learn how to be better farmers and artisans. With thrift, industry, and abstention from politics, he thought that African Americans could gradually win the respect of their white neighbours. In 1895, in a speech at the opening of the Atlanta Cotton States and International Exposition, Washington most fully elaborated his position, which became known as the Atlanta Compromise. Abjuring hopes of federal intervention on behalf of African Americans, Washington argued that reform in the South would have to come from within. Change could best be brought about if blacks and whites recognized that “the agitation of questions of social equality is the extremest folly”; in social life the races in the South could be as separate as the fingers, but in economic progress as united as the hand.

      Enthusiastically received by Southern whites, Washington's program also found many adherents among Southern blacks, who saw in his doctrine a way to avoid head-on, disastrous confrontations with overwhelming white force. Whether or not Washington's plan would have produced a generation of orderly, industrious, frugal African Americans slowly working themselves into middle-class status is not known because of the intervention of a profound economic depression throughout the South during most of the post-Reconstruction period. Neither poor whites nor poor blacks had much opportunity to rise in a region that was desperately impoverished. By 1890 the South ranked lowest in every index that compared the sections of the United States—lowest in per capita income, lowest in public health, lowest in education. In short, by the 1890s the South, a poor and backward region, had yet to recover from the ravages of the Civil War or to reconcile itself to the readjustments required by the Reconstruction era.

David Herbert Donald

The transformation of American society, 1865–1900
National expansion

Growth of the nation
      The population of the continental United States in 1880 was slightly above 50,000,000. In 1900 it was just under 76,000,000, a gain of more than 50 percent, but still the smallest rate of population increase for any 20-year period of the 19th century. The rate of growth was unevenly distributed, ranging from less than 10 percent in northern New England to more than 125 percent in the 11 states and territories of the Far West. Most of the states east of the Mississippi reported gains slightly below the national average.

      Much of the population increase was due to the more than 9,000,000 immigrants who entered the United States in the last 20 years of the century, the largest number to arrive in any comparable period up to that time. From the earliest days of the republic until 1895, the majority of immigrants had always come from northern or western Europe. Beginning in 1896, however, the great majority of the immigrants were from southern or eastern Europe. Nervous Americans, already convinced that immigrants wielded too much political power or were responsible for violence and industrial strife, found new cause for alarm, fearing that the new immigrants could not easily be assimilated into American society. Those fears gave added stimulus to agitation for legislation to limit the number of immigrants eligible for admission to the United States and led, in the early 20th century, to quota laws favouring immigrants from northern and western Europe.

      Until that time, the only major restriction against immigration was the Chinese Exclusion Act, passed by Congress in 1882, prohibiting for a period of 10 years the immigration of Chinese labourers into the United States. This act was both the culmination of more than a decade of agitation on the West Coast for the exclusion of the Chinese and an early sign of the coming change in the traditional U.S. philosophy of welcoming virtually all immigrants. In response to pressure from California, Congress had passed an exclusion act in 1879, but it had been vetoed by President Hayes on the ground that it abrogated rights guaranteed to the Chinese by the Burlingame Treaty of 1868. In 1880 these treaty provisions were revised to permit the United States to suspend the immigration of Chinese. The Chinese Exclusion Act was renewed in 1892 for another 10-year period, and in 1902 the suspension of Chinese immigration was made indefinite.

Westward migration
      The United States completed its North American expansion in 1867, when Secretary of State Seward persuaded Congress to purchase Alaska from Russia for $7,200,000. Thereafter, the development of the West progressed rapidly, with the percentage of American citizens living west of the Mississippi increasing from about 22 percent in 1880 to 27 percent in 1900. New states were added to the Union throughout the century, and by 1900 there were only three territories still awaiting statehood in the continental United States: Oklahoma, Arizona, and New Mexico.

Urban growth
      In 1890 the Bureau of the Census discovered that a continuous line could no longer be drawn across the West to define the farthest advance of settlement. Despite the continuing westward movement of population, the frontier had become a symbol of the past. The movement of people from farms to cities more accurately predicted the trends of the future. In 1880 about 28 percent of the American people lived in communities designated by the Bureau of the Census as urban; by 1900 that figure had risen to 40 percent. In those statistics could be read the beginning of the decline of rural power in America and the emergence of a society built upon a burgeoning industrial complex.

The West
      Abraham Lincoln once described the West as the “treasure house of the nation.” In the 30 years after the discovery of gold in California, prospectors found gold or silver in every state and territory of the Far West.

The mineral empire
      There were few truly rich “strikes” in the post-Civil War years. Of those few, the most important were the fabulously rich Comstock Lode of silver in western Nevada (first discovered in 1859 but developed more extensively later) and the discovery of gold in the Black Hills of South Dakota (1874) and at Cripple Creek, Colo. (1891).

      Each new discovery of gold or silver produced an instant mining town to supply the needs and pleasures of the prospectors. If most of the ore was close to the surface, the prospectors would soon extract it and depart, leaving behind a ghost town—empty of people but a reminder of a romantic moment in the past. If the veins ran deep, organized groups with the capital to buy the needed machinery would move in to mine the subsoil wealth, and the mining town would gain some stability as the centre of a local industry. In a few instances, those towns gained permanent status as the commercial centres of agricultural areas that first developed to meet the needs of the miners but later expanded to produce a surplus that they exported to other parts of the West.

      At the close of the Civil War, the price of beef in the Northern states was abnormally high. At the same time, millions of cattle grazed aimlessly on the plains of Texas. A few shrewd Texans concluded that there might be greater profits in cattle than in cotton, especially because it required little capital to enter the cattle business—only enough to employ a few cowboys to tend the cattle during the year and to drive them to market in the spring. No one owned the cattle, and they grazed without charge upon the public domain.

      The one serious problem was the shipment of the cattle to market. The Kansas Pacific resolved that problem when it completed a rail line that ran as far west as Abilene, Kan., in 1867. Abilene was 200 miles (320 kilometres) from the nearest point in Texas where the cattle grazed during the year, but Texas cattlemen almost immediately instituted the annual practice of driving that portion of their herds that was ready for market overland to Abilene in the spring. There they met representatives of Eastern packinghouses, to whom they sold their cattle.

      The open-range cattle industry prospered beyond expectations and even attracted capital from conservative investors in the British Isles. By the 1880s the industry had expanded along the plains as far north as the Dakotas. In the meantime, a new menace had appeared in the form of the advancing frontier of population; but the construction of the Santa Fe Railway through Dodge City, Kan., to La Junta, Colo., permitted the cattlemen to move their operations westward ahead of the settlers; Dodge City replaced Abilene as the principal centre for the annual meeting of cattlemen and buyers. Despite sporadic conflicts with settlers encroaching upon the high plains, the open range survived until a series of savage blizzards struck the plains with unprecedented fury in the winter of 1886–87, killing hundreds of thousands of cattle and forcing many owners into bankruptcy. Those who still had some cattle and some capital abandoned the open range, gained title to lands farther west, where they could provide shelter for their livestock, and revived a cattle industry on land that would be immune to further advances of the frontier of settlement. Their removal to these new lands had been made possible in part by the construction of other railroads connecting the region with Chicago and the Pacific coast.

The expansion of the railroads
      In 1862 Congress authorized the construction of two railroads that together would provide the first railroad link between the Mississippi valley and the Pacific coast. One was the Union Pacific, to run westward from Council Bluffs, Iowa; the other was the Central Pacific, to run eastward from Sacramento, Calif. To encourage the rapid completion of those roads, Congress provided generous subsidies in the form of land grants and loans. Construction was slower than Congress had anticipated, but the two lines met, with elaborate ceremonies, on May 10, 1869, at Promontory, Utah.

      In the meantime, other railroads had begun construction westward, but the panic of 1873 and the ensuing depression halted or delayed progress on many of those lines. With the return of prosperity after 1877, some railroads resumed or accelerated construction; and by 1883 three more rail connections between the Mississippi valley and the West Coast had been completed—the Northern Pacific, from St. Paul to Portland; the Santa Fe, from Chicago to Los Angeles; and the Southern Pacific, from New Orleans to Los Angeles. The Southern Pacific had also acquired, by purchase or construction, lines from Portland to San Francisco and from San Francisco to Los Angeles.

      The construction of the railroads from the Midwest to the Pacific coast was the railroad builders' most spectacular achievement in the quarter century after the Civil War. No less important, in terms of the national economy, was the development in the same period of an adequate rail network in the Southern states and the building of other railroads that connected virtually every important community west of the Mississippi with Chicago.

      The West developed simultaneously with the building of the Western railroads, and in no part of the nation was the importance of railroads more generally recognized. The railroad gave vitality to the regions it served, but, by withholding service, it could doom a community to stagnation. The railroads appeared to be ruthless in exploiting their powerful position: they fixed prices to suit their convenience; they discriminated among their customers; they attempted to gain a monopoly of transportation wherever possible; and they interfered in state and local politics to elect favourites to office, to block unfriendly legislation, and even to influence the decisions of the courts.

Indian policy
      Large tracts of land in the West were reserved by law for the exclusive use of specified Indian tribes. By 1870, however, the invasion of these lands by hordes of prospectors, by cattlemen and farmers, and by the transcontinental railroads had resulted in the outbreak of a series of savage Indian wars and had raised serious questions about the government's Indian policies. Many agents of the Bureau of Indian Affairs were lax in their responsibility for dealing directly with the tribes, and some were corrupt in the discharge of their duties. Most Westerners and some army officers contended that the only satisfactory resolution of the Indian question was the removal of the tribes from all lands coveted by the whites.

      In the immediate postwar years, reformers advocated adoption of programs designed to prepare the Indians for ultimate assimilation into American society. In 1869 the reformers persuaded President Grant and Congress to establish a nonpolitical Board of Indian Commissioners to supervise the administration of relations between the government and the Indians. The board, however, encountered so much political opposition that it accomplished little. The reformers then proposed legislation to grant title for specific acreages of land to the head of each family in those tribes thought to be ready to adopt a sedentary life as farmers. Congress resisted that proposal until land-hungry Westerners discovered that, if the land were thus distributed, a vast surplus of land would result that could be added to the public domain. When land speculators joined the reformers in support of the proposed legislation, Congress in 1887 enacted the Dawes Act (Dawes General Allotment Act), which empowered the president to grant title to 160 acres (65 hectares) to the head of each family, with smaller allotments to single members of the tribe, in those tribes believed ready to accept a new way of life as farmers. With the grant of land, which could not be alienated by the Indians for 25 years, they were to be granted U.S. citizenship. Reformers rejoiced that they had finally given the Indians an opportunity to have a dignified role in U.S. society, overlooking the possibility that there might be values in Indian culture worthy of preservation. Meanwhile, the land promoters placed successive presidents under great pressure to accelerate the application of the Dawes Act in order to open more land for occupation or speculation.

Industrialization of the U.S. economy

The growth of industry
      By 1878 the United States had reentered a period of prosperity after the long depression of the mid-1870s. In the ensuing 20 years the volume of industrial production, the number of workers employed in industry, and the number of manufacturing plants all more than doubled. A more accurate index to the scope of this industrial advance may be found in the aggregate annual value of all manufactured goods, which increased from about $5,400,000,000 in 1879 to perhaps $13,000,000,000 in 1899. The expansion of the iron and steel industry, always a key factor in any industrial economy, was even more impressive: from 1880 to 1900 the annual production of steel in the United States went from about 1,400,000 to more than 11,000,000 tons. Before the end of the century, the United States surpassed Great Britain in the production of iron and steel and was providing more than one-quarter of the world's supply of pig iron.

      Many factors combined to produce this burst of industrial activity. The exploitation of Western resources, including mines and lumber, stimulated a demand for improved transportation, while the gold and silver mines provided new sources of capital for investment in the East. The construction of railroads, especially in the West and South, with the resulting demand for steel rails, was a major force in the expansion of the steel industry and increased the railroad mileage in the United States from 93,262 miles (150,151 kilometres) in 1880 to about 190,000 miles (310,000 kilometres) in 1900. Technological advances, including the utilization of the Bessemer and open-hearth processes in the manufacture of steel, resulted in improved products and lower production costs. A series of major inventions, including the telephone, typewriter, linotype, phonograph, electric light, cash register, air brake, refrigerator car, and the automobile, became the bases for new industries, while many of them revolutionized the conduct of business. The use of petroleum products in industry as well as for domestic heating and lighting became the cornerstone of the most powerful of the new industries of the period, while the trolley car, the increased use of gas and electric power, and the telephone led to the establishment of important public utilities that were natural monopolies and could operate only on the basis of franchises granted by state or municipal governments. The widespread employment of the corporate form of business organization offered new opportunities for large-scale financing of business enterprise and attracted new capital, much of it furnished by European investors. Over all this industrial activity, there presided a colourful and energetic group of entrepreneurs, who gained the attention, if not always the commendation, of the public and who appeared to symbolize for the public the new class of leadership in the United States. Of this numerous group the best known were John D. Rockefeller in oil, Andrew Carnegie in steel, and such railroad builders and promoters as Cornelius Vanderbilt, Leland Stanford, Collis P. Huntington, Henry Villard, and James J. Hill.

The dispersion of industry
      The period was notable also for the wide geographic distribution of industry. The Eastern Seaboard from Massachusetts to Pennsylvania continued to be the most heavily industrialized section of the United States, but there was a substantial development of manufacturing in the states adjacent to the Great Lakes and in certain sections of the South.

      The experience of the steel industry reflected this new pattern of diffusion. In 1880 two-thirds of the iron and steel industry was concentrated in the area of western Pennsylvania and eastern Ohio. After 1880, however, the development of iron mines in northern Minnesota (the Vermilion Range in 1884 and the Mesabi Range in 1892) and in Tennessee and northern Alabama was followed by the expansion of the iron and steel industry in the Chicago area and by the establishment of steel mills in northern Alabama and in Tennessee.

      Most manufacturing in the Midwest was in enterprises closely associated with agriculture and represented expansion of industries that had first been established before 1860. Meat-packing, which in the years after 1875 became one of the major industries of the nation in terms of the value of its products, was almost a Midwestern monopoly, with a large part of the industry concentrated in Chicago. Flour milling, brewing, and the manufacture of farm machinery and lumber products were other important Midwestern industries.

      The industrial invasion of the South was spearheaded by textiles. Cotton mills became the symbol of the New South, and mills and mill towns sprang up in the Piedmont region from Virginia to Georgia and into Alabama. By 1900 almost one-quarter of all the cotton spindles in the United States were in the South, and Southern mills were expanding their operations more rapidly than were their well-established competitors in New England. The development of lumbering in the South was even more impressive, though less publicized; by the end of the century the South led the nation in lumber production, contributing almost one-third of the annual supply.

Industrial combinations
      The geographic dispersal of industry was part of a movement that was converting the United States into an industrial nation. It attracted less attention, however, than the trend toward the consolidation of competing firms into large units capable of dominating an entire industry. The movement toward consolidation received special attention in 1882 when Rockefeller and his associates organized the Standard Oil Trust under the laws of Ohio. A trust was a new type of industrial organization, in which the voting rights of a controlling number of shares of competing firms were entrusted to a small group of men, or trustees, who thus were able to prevent competition among the companies they controlled. The stockholders presumably benefited through the larger dividends they received. For a few years the trust was a popular vehicle for the creation of monopolies, and by 1890 there were trusts in whiskey, lead, cottonseed oil, and salt.

      In 1892 the courts of Ohio ruled that the trust violated that state's antimonopoly laws. Standard Oil then reincorporated as a holding company under the more hospitable laws of New Jersey. Thereafter, holding companies or outright mergers became the favourite forms for the creation of monopolies, though the term trust remained in the popular vocabulary as a common description of any monopoly. The best-known mergers of the period were those leading to the formation of the American Tobacco Company (1890) and the American Sugar Refining Company (1891). The latter was especially successful in stifling competition, for it quickly gained control of most of the sugar refined in the United States.

Foreign commerce
      The foreign trade of the United States, if judged by the value of exports, kept pace with the growth of domestic industry. Exclusive of gold, silver, and reexports, the annual value of exports from the United States in 1877 was about $590,000,000; by 1900 it had increased to approximately $1,371,000,000. The value of imports also rose, though at a slower rate. When gold and silver are included, there was only one year in the entire period in which the United States had an unfavourable balance of trade; and, as the century drew to a close, the excess of exports over imports increased perceptibly.

      Agriculture continued to furnish the bulk of U.S. exports. Cotton, wheat, flour, and meat products were consistently the items with the greatest annual value among exports. Of the nonagricultural products sent abroad, petroleum was the most important, though by the end of the century its position on the list of exports was being challenged by machinery.

      Despite the expansion of foreign trade, the U.S. merchant marine was a major casualty of the period. While the aggregate tonnage of all shipping flying the U.S. flag remained remarkably constant, the tonnage engaged in foreign trade declined sharply, dropping from more than 2,400,000 tons on the eve of the Civil War to a low point of only 726,000 tons in 1898. The decline began during the Civil War when hundreds of ships were transferred to foreign registries to avoid destruction. Later, cost disadvantages in shipbuilding and repair and the American policy of registering only American-built ships hindered growth until World War I.

      The expansion of industry was accompanied by increased tensions between employers and workers and by the appearance, for the first time in the United States, of national labour unions.

Formation of unions
      The first effective labour organization that was more than regional in membership and influence was the Knights of Labor, organized in 1869. The Knights believed in the unity of the interests of all producing groups and sought to enlist in their ranks not only all labourers but everyone who could be truly classified as a producer. They championed a variety of causes, many of them more political than industrial, and they hoped to gain their ends through politics and education rather than through economic coercion.

      The hardships suffered by many workers during the depression of 1873–78 and the failure of a nationwide railroad strike, which was broken when President Hayes sent federal troops to suppress disorders in Pittsburgh and St. Louis, caused much discontent in the ranks of the Knights. In 1879 Terence V. Powderly, a railroad worker and mayor of Scranton, Pa., was elected grand master workman of the national organization. He favoured cooperation over a program of aggressive action, but the effective control of the Knights shifted to regional leaders who were willing to initiate strikes or other forms of economic pressure to gain their objectives. The Knights reached the peak of their influence in 1884–85, when much-publicized strikes against the Union Pacific, Southwest System, and Wabash railroads attracted substantial public sympathy and succeeded in preventing a reduction in wages. At that time they claimed a national membership of nearly 700,000. In 1885 Congress, taking note of the apparently increasing power of labour, acceded to union demands to prohibit the entry into the United States of immigrants who had signed contracts to work for specific employers.

      The year 1886 was a troubled one in labour relations. There were nearly 1,600 strikes, involving about 600,000 workers, with the eight-hour day the most prominent item in the demands of labour. About half of these strikes were called for May Day; some of them were successful, but the failure of others and internal conflicts between skilled and unskilled members led to a decline in the Knights' popularity and influence.

      The most serious blow to the unions came from a tragic occurrence with which they were only indirectly associated. One of the strikes called for May Day in 1886 was against the McCormick Harvesting Machine Company in Chicago. Fighting broke out along the picket lines on May 3, and, when police intervened to restore order, several strikers were injured or killed. Union leaders called a protest meeting at Haymarket Square for the evening of May 4; but, as the meeting was breaking up, a group of anarchists took over and began to make inflammatory speeches. The police quickly intervened, and a bomb exploded, killing seven policemen and injuring many others. Eight of the anarchists were arrested, tried, and convicted of murder. Four of them were hanged, and one committed suicide. The remaining three were pardoned in 1893 by Governor John P. Altgeld, who was persuaded that they had been convicted in such an atmosphere of prejudice that it was impossible to be certain that they were guilty.

      The public tended to blame organized labour for the Haymarket tragedy, and many persons had become convinced that the activities of unions were likely to be attended by violence. The Knights never regained the ground they lost in 1886, and, until after the turn of the century, organized labour seldom gained any measure of public sympathy. Aggregate union membership did not again reach its 1885–86 figure until 1900. Unions, however, continued to be active; and in each year from 1889 through the end of the century there were more than 1,000 strikes.

      As the power of the Knights declined, the leadership in the trade union movement passed to the American Federation of Labor (AFL). This was a loose federation of local and craft unions, organized first in 1881 and reorganized in 1886. For a few years there was some nominal cooperation between the Knights and the AFL, but the basic organization and philosophy of the two groups made cooperation difficult. The AFL appealed only to skilled workers, and its objectives were those of immediate concern to its members: hours, wages, working conditions, and the recognition of the union. It relied on economic weapons, chiefly the strike and boycott, and it eschewed political activity, except for state and local election campaigns. The central figure in the AFL was Samuel Gompers, a New York cigar maker, who was its president from 1886 to his death in 1924.

National politics
      The dominant forces in American life in the last quarter of the 19th century were economic and social rather than political. This fact was reflected in the ineffectiveness of political leadership and in the absence of deeply divisive issues in politics, except perhaps for the continuing agrarian agitation for inflation. There were colourful political personalities, but they gained their following on a personal basis rather than as spokesmen for a program of political action. No president of the period was truly the leader of his party, and none apparently aspired to that status except Grover Cleveland during his second term (1893–97). Such shrewd observers of U.S. politics as Woodrow Wilson and James Bryce agreed that great men did not become presidents; and it was clear that the nominating conventions of both major parties commonly selected candidates who were “available” in the sense that they had few enemies.

      Congress had been steadily increasing in power since the Johnson administration and, in the absence of leadership from the White House, was largely responsible for formulating public policy. As a result, public policy commonly represented a compromise among the views of many congressional leaders—a compromise made all the more necessary by the fact that in only four of the 20 years from 1877 to 1897 did the same party control the White House, the Senate, and the House.

      The Republicans appeared to be the majority party in national politics. From the Civil War to the end of the century, they won every presidential election save those of 1884 and 1892, and they had a majority in the Senate in all but three Congresses during that same period. The Democrats, however, won a majority in the House in eight of the 10 Congresses from 1875 to 1895. The success of the Republicans was achieved in the face of bitter intraparty schisms that plagued Republican leaders from 1870 until after 1890 and despite the fact that, in every election campaign after 1876, they were forced to concede the entire South to the opposition. The Republicans had the advantage of having been the party that had defended the Union against secession and had freed the slaves. When all other appeals failed, Republican leaders could salvage votes in the North and West by reviving memories of the war. A less tangible but equally valuable advantage was the widespread belief that the continued industrial development of the nation would be more secure under a Republican than under a Democratic administration. Except in years of economic adversity, the memory of the war and confidence in the economic program of the Republican Party were normally enough to ensure Republican success in most of the Northern and Western states.

The Rutherford B. Hayes administration
      President Hayes (served 1877–81) willingly carried out the commitments made by his friends to secure the disputed Southern votes needed for his election. He withdrew the federal troops still in the South, and he appointed former senator David M. Key of Tennessee to his Cabinet as postmaster general. Hayes hoped that these conciliatory gestures would encourage many Southern conservatives to support the Republican Party in the future. But the Southerners' primary concern was the maintenance of white supremacy; this, they believed, required a monopoly of political power in the South by the Democratic Party. As a result, the policies of Hayes led to the virtual extinction rather than the revival of the Republican Party in the South.

      Hayes's efforts to woo the South irritated some Republicans, but his attitude toward patronage in the federal civil service was a more immediate challenge to his party. In June 1877 he issued an executive order prohibiting political activity by those who held federal appointments. When two friends of Senator Roscoe Conkling defied this order, Hayes removed them from their posts in the administration of the Port of New York. Conkling and his associates showed their contempt for Hayes by bringing about the election of one of the men (Alonzo B. Cornell) as governor of New York in 1879 and by nominating the other (Chester A. Arthur) as Republican candidate for the vice presidency in 1880.

      One of the most serious issues facing Hayes was that of inflation. Hayes and many other Republicans were staunch supporters of a sound-money policy, but the issues were sectional rather than partisan. In general, sentiment in the agricultural South and West was favourable to inflation, while industrial and financial groups in the Northeast opposed any move to inflate the currency, holding that this would benefit debtors at the expense of creditors.

      In 1873 Congress had discontinued the minting of silver dollars, an action later stigmatized by friends of silver as the Crime of '73. As the depression deepened, inflationists began campaigns to persuade Congress to resume coinage of silver dollars and to repeal the act providing for the redemption of Civil War greenbacks in gold after Jan. 1, 1879. By 1878 the sentiment for silver and inflation was so strong that Congress passed, over the president's veto, the Bland–Allison Act, which renewed the coinage of silver dollars and, more significantly, included a mandate to the secretary of the treasury to purchase silver bullion at the market price in amounts of not less than $2,000,000 and not more than $4,000,000 each month.

      Opponents of inflation were somewhat reassured by the care with which Secretary of the Treasury John Sherman was making preparation to have an adequate gold reserve to meet any demands on the Treasury for the redemption of greenbacks. Equally reassuring were indications that the nation had at last recovered from the long period of depression. These factors reestablished confidence in the financial stability of the government; and, when the date for the redemption of greenbacks arrived, there was no appreciable demand upon the Treasury to exchange them for gold.

      Hayes chose not to run for reelection. Had he sought a second term, he would almost certainly have been denied renomination by the Republican leaders, many of whom he had alienated through his policies of patronage reform and Southern conciliation. Three prominent candidates contended for the Republican nomination in 1880: Grant, the choice of the “Stalwart” faction led by Senator Conkling; James G. Blaine, the leader of the rival “Half-Breed” faction; and Secretary of the Treasury Sherman. Grant had a substantial and loyal bloc of delegates in the convention, but their number was short of a majority. Neither of the other candidates could command a majority, and on the 36th ballot the weary delegates nominated a compromise candidate, Congressman James A. Garfield of Ohio. To placate the Stalwart faction, the convention nominated Chester A. Arthur of New York for vice president.

      The Democrats probably would have renominated Samuel J. Tilden in 1880, hoping thereby to gain votes from those who believed Tilden had lost in 1876 through fraud. But Tilden declined to become a candidate again, and the Democratic convention nominated General Winfield S. Hancock. Hancock had been a Federal general during the Civil War, but he had no political record and little familiarity with questions of public policy.

      The campaign failed to generate any unusual excitement and produced no novel issues. As in every national election of the period, the Republicans stressed their role as the party of the protective tariff and asserted that Democratic opposition to the tariff would impede the growth of domestic industry. Actually, the Democrats were badly divided on the tariff, and Hancock surprised political leaders of both parties by declaring that the tariff was an issue of only local interest.

      Garfield won the election with an electoral margin of 214 to 155, but his plurality in the popular vote was a slim 9,644. The election revealed the existence of a new “solid South,” for Hancock carried all the former Confederate states and three of the former slave states that had remained loyal to the Union.

The administrations of James A. Garfield and Chester A. Arthur
      Garfield had not been closely identified with either the Stalwarts or the Half-Breeds, the two major factions within the Republican Party, but, upon becoming president, he upset the Stalwarts by naming the Half-Breed Blaine secretary of state. He gave even more serious offense to the Stalwart faction by appointing as collector of customs at New York a man who was unacceptable to the two senators from that state, Conkling and Thomas Platt. The two senators showed their displeasure by resigning their Senate seats, expecting to be triumphantly reelected by the legislature of New York; in this they were disappointed.

      The tragic climax to this intraparty strife came on July 2, 1881, when Garfield was shot in Washington, D.C., by a disappointed and mentally deranged office seeker and Stalwart supporter. For two months the president lingered between life and death. He died on September 19 and was succeeded by Vice President Arthur.

      Arthur's accession to the presidency caused widespread concern. He had held no elective office before becoming vice president, and he had been closely associated with the Stalwart wing of the party. It was assumed that, like others in that group, he would be hostile to civil service reform, and his nomination for the vice presidency had been generally regarded as a deliberate rebuke to President Hayes. The members of Garfield's Cabinet immediately tendered their resignations, but Arthur asked them to continue in office for a time. By mid-April 1882, however, all but one of the Cabinet officers had been replaced.

      Arthur soon surprised his critics and the country by demonstrating an unexpected independence of his former political friends. In his first annual message to Congress, in December 1881, he announced his qualified approval of legislation that would remove appointments to the federal civil service from partisan control. In January 1883 Congress passed and Arthur signed the Pendleton Civil Service Act, which established the Civil Service Commission and provided that appointments to certain categories of offices should be made on the basis of examinations and the appointees given an indefinite tenure in their positions.

      By 1884, when the next presidential election was held, Arthur's administration had won the respect of many who had viewed his accession to office with misgivings. It had not, however, gained him any strong following among the leaders of his party. The foremost candidate for the Republican nomination was the perennially powerful Blaine, who was nominated on the fourth ballot despite opposition from those who believed he was too partisan in spirit or vulnerable to charges of corrupt actions during his earlier years as speaker of the House.

      The Democratic candidate, Governor Grover Cleveland of New York, was in many respects the antithesis of Blaine. He was a relative newcomer to politics. He had been elected mayor of Buffalo in 1881 and governor of New York in 1882. In both positions he had earned a reputation for political independence, inflexible honesty, and an industrious and conservative administration. His record made him an attractive candidate for persons who accepted the dictum that “a public office is a public trust.” This was, in 1884, a valuable asset; and it won for Cleveland the support of a few outstanding Republicans and some journals of national circulation that usually favoured Republican nominees for office.

      As in 1880, the campaign was almost devoid of issues of public policy: only the perennial question of the tariff appeared to separate the two parties. Cleveland had not served in the army during the Civil War, and Republicans made an effort to use this fact, together with the power of the South in the Democratic Party, to arouse sectional prejudices against Cleveland. During the campaign it was revealed that Cleveland, a bachelor, was the father of an illegitimate son, an indiscretion that gave the Republicans a moral issue with which to counteract charges of corruption against their own candidate.

      The election was very close. On the evening of the voting it was apparent that the result depended upon the vote in New York state, but not until the end of the week was it certain that Cleveland had carried New York by the narrow margin of some 1,100 votes (out of more than one million) and been elected president.

Grover Cleveland's first term
      Cleveland was the first Democratic president since James Buchanan a quarter of a century earlier. More than two-thirds of the electoral votes he received came from Southern or border states, so that it appeared that his election marked the close of one epoch and the beginning of a new political era in which the South could again hope to have a major voice in the conduct of national affairs. Because of his brief career in politics, Cleveland had only a limited acquaintance with leaders of his own party. He accepted literally the constitutional principle of the separation of powers, and he opened his first annual message to Congress, in December 1885, with an affirmation of his devotion to “the partitions of power between our respective departments.” This appeared to be a disavowal of presidential leadership, but it quickly became apparent that Cleveland intended to defend vigorously the prerogatives that he believed belonged to the executive.

      During his first term (1885–89) Cleveland was confronted with a divided Congress—a Republican Senate and a Democratic House. This added to the complexities of administration, especially in the matter of appointments. Cleveland was a firm believer in a civil service based on merit rather than on partisan considerations, but, as the first Democratic president in a quarter of a century, he was under great pressure to replace Republicans in appointive offices with Democrats. He followed a line of compromise. In his first two years he removed the incumbents from about two-thirds of the offices subject to his control, but he scrutinized the qualifications of Democrats recommended for appointment and in a number of instances refused to abide by the recommendations of his party leaders. He thus offended both the reformers, who wished no partisan removals, and his fellow Democrats, whose nominees he rejected. Although his handling of the patronage alienated some powerful Democrats, he scored a personal triumph when he persuaded Congress to repeal the obsolete Tenure of Office Act of 1867, which Republican senators had threatened to revive in order to embarrass him.

      Cleveland was a conservative on all matters relating to money, and he was inflexibly opposed to wasteful expenditure of public funds. This caused him to investigate as many as possible of the hundreds of private bills passed by Congress to compensate private individuals, usually Federal veterans, for claims against the federal government. When, as was frequently the case, he judged these claims to be ill-founded, he vetoed the bill. He was the first president to use the veto power extensively to block the enactment of this type of private legislation.

The surplus and the tariff
      The flurry of private pension bills had been stimulated, in part, by a growing surplus in the Treasury. In every year since the Civil War, there had been an excess of revenue over expenditures, a circumstance that encouraged suggestions for appropriations of public funds for a variety of purposes. The surplus also focused attention upon the tariff, the principal source of this excess revenue. In 1883 Congress had reviewed the tariff and made numerous changes in the rates, increasing the tariff on some items and reducing it on others, without materially decreasing the revenue received. Cleveland believed that the surplus presented a very real problem. It hoarded in the Treasury money that could have been in circulation, and it encouraged reckless spending by the government. Like many other Democrats, he disliked the high protective tariff. After waiting in vain for two years for Congress to meet this issue boldly, Cleveland adopted the extraordinary tactic of devoting his entire annual message in 1887 to a discussion of this question and to an appeal for a lowering of the tariff. The House then passed a bill generally conforming to Cleveland's views on the tariff; but the Senate rejected it, and the tariff became a leading issue in the presidential campaign of 1888.

The public domain
      After 1877 hundreds of thousands of agricultural settlers went westward to the Plains, where they came into competition for control of the land with the cattlemen, who hitherto had dominated the open range. The pressure of population as it moved into the Plains called attention to the diminishing supply of good arable land still open to settlement, thus presaging the day when there would no longer be a vast reservoir of land in the West awaiting the farmer. It also drew attention to the fact that millions of acres of Western land were being held for speculative purposes and that other millions of acres had been acquired by questionable means or were still in the possession of railroads that failed to fulfill the obligations they had assumed when the land was granted to them. Upon assuming office, Cleveland was confronted with evidence that some of these claims had been fraudulently obtained by railroads, speculators, cattlemen, or lumbering interests. He ordered an investigation, and for more than a year agents of the Land Office roamed over the West uncovering evidence of irregularities and neglected obligations. Cleveland acted firmly. By executive orders and court action he succeeded in restoring more than 81,000,000 acres (33,000,000 hectares) to the public domain.

The Interstate Commerce Act
      The railroads were vital to the nation's economy, but, because in so many regions a single company enjoyed a monopoly of rail transportation, many of the railroads adopted policies that large numbers of their customers felt to be unfair and discriminatory. Before 1884 it was clear that the Granger laws of the preceding decade (state laws prohibiting various abuses by the railroads) were ineffective, and pressure groups turned to the federal government for relief. In this, Western farm organizations were joined by influential Eastern businessmen who believed that they, too, were the victims of discrimination by the railroads. This powerful political alliance persuaded both parties to include regulation of the railroads in their national platforms in 1884 and induced Congress to enact the Interstate Commerce Act in 1887.

      This law, designed to prevent unjust discrimination by the railroads, prohibited the pooling of traffic and profits, made it illegal for a railroad to charge more for a short haul than for a longer one, required that the roads publicize their rates, and established the Interstate Commerce Commission to supervise the enforcement of the law. The rulings of the commission were subject to review by the federal courts, the decisions of which tended to narrow the scope of the act. The commission was less effective than the sponsors of the act had hoped, but the act in itself was an indication of the growing realization that only the federal government could cope with the new economic problems of the day.

The election of 1888
      Cleveland's plea for a reduction of the tariff in his annual message of 1887 made it certain that the tariff would be the central issue in the presidential campaign of 1888. The Democrats renominated Cleveland, although it was thought that he had endangered his chances of reelection by his outspoken advocacy of tariff reduction. The Republicans had their usual difficulty in selecting a candidate. Blaine refused to enter the race, and no other person in the party commanded substantial support. From among the many who were willing to accept the nomination, the Republicans selected Benjamin Harrison of Indiana, a Federal general in the Civil War and the grandson of President William Henry Harrison.

      Cleveland had won respect as a man of integrity and courage, but neither he nor Harrison aroused any great enthusiasm among the voters. One feature of the campaign noted by observers was the extensive use of money to influence the outcome; this was not a new phenomenon, but the spending of money to carry doubtful states and the apparent alliance between business and political bosses had never before been so open.

      The results were again close. Cleveland had a plurality of about 100,000 popular votes, but the Republicans carried two states, New York and Indiana, which they had lost in 1884, and in the electoral college Harrison won by a margin of 233 to 168.

The Benjamin Harrison administration
      The Republicans also gained control of both houses of the 51st Congress. Their margin in the House of Representatives, however, was so small that it seemed uncertain whether they could carry controversial legislation through it. This obstacle was overcome by the speaker of the House, Thomas B. Reed of Maine. Reed refused to recognize dilatory motions, and, contrary to precedent, he counted as present all members who were in the chamber. Using that tactic, he ruled, on occasion, that a quorum was present even though fewer than a majority had actually answered a roll call. His iron rule of the House earned him the sobriquet Czar Reed, but only through his firm control of the House could the Republicans pass three controversial bills in the summer and early autumn of 1890. One dealt with monopolies, another with silver, and the third with the tariff.

The Sherman Anti-Trust Act
      The first of these major measures declared illegal all combinations that restrained trade between states or with foreign nations. This law, known as the Sherman Anti-Trust Act, was passed by Congress early in July. It was the congressional response to evidence of growing public dissatisfaction with the development of industrial monopolies, which had been so notable a feature of the preceding decade.

      More than 10 years passed before the Sherman Act was used to break up any industrial monopoly. It was invoked by the federal government in 1894 to obtain an injunction against a striking railroad union accused of restraint of interstate commerce, and the use of the injunction was upheld by the Supreme Court in 1895. Indeed, it is unlikely that the Senate would have passed the bill in 1890 had not the chairman of the Senate Judiciary Committee, George F. Edmunds of Vermont, felt certain that unions were combinations in restraint of trade within the meaning of the law. To those who hoped that the Sherman Act would inhibit the growth of monopoly, the results were disappointing. The passage of the act only three years after the Interstate Commerce Act was, however, another sign that the public was turning from state capitals to Washington for effective regulation of industrial giants.

The silver issue
      Less than two weeks after Congress passed the antitrust law, it enacted the Sherman Silver Purchase Act, which required the secretary of the treasury to purchase each month 4,500,000 ounces (130,000 kilograms) of silver at the market price. This act superseded the Bland–Allison Act of 1878, effectively increasing the government's monthly purchase of silver by more than 50 percent. It was adopted in response to pressure from mineowners, who were alarmed by the falling price of silver, and from Western farmers, who were always favourable to inflationary measures and who, in 1890, were also suffering from the depressed prices of their products.

The McKinley tariff
      Most Republican leaders had been lukewarm to the proposal to increase the purchase of silver and had accepted it only to assure Western votes for the measure in which they were most interested—upward revision of the protective tariff. This was accomplished in the McKinley Tariff Act of October 1890, passed by Congress one month before the midterm elections of that year. The tariff was designed to appeal to the farmers because some agricultural products were added to the protected list. A few items, notably sugar, were placed on the free list, and domestic sugar planters were to be compensated by a subsidy of two cents a pound. The central feature of the act, however, was a general increase in tariff schedules, with many of these increases applying to items of general consumption.

      The new tariff immediately became an issue in the congressional elections. It failed to halt the downward spiral of farm prices, but there was an almost immediate increase in the cost of many items purchased by the farmers. With discontent already rife in the agricultural regions of the West and South, the McKinley tariff added to the agrarian resentment. The outcome of the elections was a major defeat for the Republicans, whose strength in the House of Representatives was reduced by almost half.

The agrarian revolt
      Political disaster befell the Republicans in the trans-Mississippi West, the result of an economic and psychological depression that enveloped the region after widespread crop failures and the collapse of inflated land prices in the summer of 1887. The Western boom had begun in the late 1870s, when a tide of migration into the farmlands beyond the Mississippi quickly led to the settlement of hitherto unoccupied parts of Iowa and Minnesota and pushed the frontier westward across the Plains almost to the shadow of the Rocky Mountains.

      Westward expansion was encouraged by the railroads that served the region. It was supported by the satisfactory price and encouraging foreign market for wheat, the money crop of the Plains. For 10 years, from 1877 through 1886, the farmers on the Plains had the benefit of an abnormally generous rainfall, leading many to assume that climatic conditions had changed and that the rain belt had moved westward to provide adequate rainfall for the Plains. Confidence was followed by unrestrained optimism that engendered wild speculation and a rise in land prices. Lured on by these illusions, the settlers went into debt to make improvements on their farms while small-town leaders dreamed of prodigious growth and authorized bond issues to construct the public improvements they felt certain would soon be needed.

      The collapse of these dreams came in 1887. The year opened ominously when the Plains were swept by a catastrophic blizzard in January that killed thousands of head of cattle and virtually destroyed the cattle industry of the open range. The following summer was dry and hot; crops were poor; and, to compound the woes of the farmers, the price of wheat began to slide downward. The dry summer of 1887 was the beginning of a 10-year cycle of little rainfall and searingly hot summers. By the autumn of 1887 the exodus from the Plains had begun; five years later, areas of western Kansas and Nebraska that had once been thriving agricultural centres were almost depopulated. The agricultural regions east of the Plains were less directly affected, though there the farmers suffered from the general decline in farm prices.

      Although the disaster on the Plains bred a sense of distress and frustration, the lure of good land was still strong. When the central portion of the present state of Oklahoma was opened to settlement in April 1889, an army of eager settlers, estimated to have numbered 100,000, rushed into the district to claim homesteads and build homes.

The Populists
      The collapse of the boom and the falling prices of agricultural products forced many farmers to seek relief through political action. In 1888 and again in 1890 this discontent was expressed through local political groups, commonly known as Farmers' Alliances, which quickly spread through parts of the West and in the South, where economic problems had been aggravated by the shift following the Civil War from a plantation system to sharecrop and crop-lien systems. The alliances won some local victories and contributed to the discomfiture of the Republicans in 1890. They were not, however, an effective vehicle for concerted political action; and in 1891 the leaders of the alliances formed the People's (Populist) Party.

      The Populists aspired to become a national party and hoped to attract support from labour and from reform groups generally. In practice, however, they continued through their brief career to be almost wholly a party of Western farmers. (Southern farmers, afraid of splitting the white vote and thereby allowing blacks into power, largely remained loyal to the Democratic Party.) The Populists demanded an increase in the circulating currency, to be achieved by the unlimited coinage of silver, a graduated income tax, government ownership of the railroads, a tariff for revenue only, the direct election of U.S. senators, and other measures designed to strengthen political democracy and give the farmers economic parity with business and industry. In 1892 the Populists nominated General James B. Weaver of Iowa for president.

The election of 1892
      The nominees of the two major parties for president in 1892 were the same as in the election of 1888: Harrison and Cleveland. The unpopularity of the McKinley tariff gave Cleveland an advantage, as did the discontent in the West, which was directed largely against the Republican Party. From the beginning of the campaign it appeared probable that the Democrats would be successful, and Cleveland carried not only the Southern states but also such key Northern states as New York and Illinois. His electoral vote was 277 to 145 for Harrison. Weaver carried only four Western states, three of them states with important silver mines, and received 22 electoral votes.

Cleveland's second term
      When Cleveland was inaugurated for his second term in March 1893, the country hovered on the brink of financial panic. Six years of depression in the trans-Mississippi West, the decline of foreign trade after the enactment of the McKinley tariff, and an abnormally high burden of private debt were disquieting features of the situation. Most attention was centred, however, on the gold reserve in the federal Treasury. It was assumed that a minimum reserve of $100,000,000 was necessary to assure redemption of government obligations in gold. When on April 21, 1893, the reserve fell below that amount, the psychological impact was far-reaching. Investors hastened to convert their holdings into gold; banks and brokerage houses were hard-pressed; and many business houses and financial institutions failed. Prices dropped, employment was curtailed, and the nation entered a period of severe economic depression that continued for more than three years.

      The causes of this disaster were numerous and complex, but the attention that focused on the gold reserve tended to concentrate concern upon a single factor—the restoration of the Treasury's supply of gold. It was widely believed that the principal cause of the drain on the Treasury was the obligation to purchase large amounts of silver. To those who held this view, the obvious remedy was the repeal of the Sherman Silver Purchase Act.

      The issue was political as well as economic. It divided both major parties, but most of the leading advocates of existing silver policies were Democrats. Cleveland, however, had long been opposed to the silver-purchase policy, and in the crisis he resolved upon repeal as an essential step in protecting the Treasury. He therefore called Congress to meet in special session on Aug. 7, 1893.

      The new Congress had Democratic majorities in both houses, and, if it had any mandate, it was to repeal the McKinley tariff. It had no mandate on the silver issue, and more than half of its Democratic members came from constituencies that favoured an increase in the coinage of silver. Cleveland faced a herculean task in forcing repeal through Congress, but, by the use of every power at his command, he gained his objective. The Sherman Silver Purchase Act was repealed at the end of October by a bill that made no compensating provision for the coinage of silver. Cleveland had won a personal triumph, but he had irrevocably divided his party; and in some sections of the nation he had become the most unpopular president of his generation.

      The extent to which Cleveland had lost control of his party became apparent when Congress turned from silver to the tariff. The House passed a bill that would have revised tariff rates downward in accordance with the president's views. In the Senate, however, the bill was so altered that it bore little resemblance to the original measure, and on some items it imposed higher duties than had the McKinley Tariff Act. It was finally passed in August 1894, but Cleveland was so dissatisfied that he refused to sign it; and it became law without his signature. The act contained a provision for an income tax, but this feature was declared unconstitutional by the Supreme Court in 1895.

      In the midterm elections of 1894 the Republicans recaptured control of both houses of Congress. This indicated the discontent produced by the continuing depression. It also guaranteed that, with a Democratic president and Republican Congress, there would be inaction in domestic legislation while both parties looked forward to the election of 1896.

      At their convention in St. Louis the Republicans selected Governor William McKinley of Ohio as their presidential nominee. He had served in the Federal army during the Civil War, and his record as governor of Ohio tended to offset his association with the unpopular tariff of 1890. His most effective support in winning the nomination, however, was provided by Mark Hanna, a wealthy Cleveland businessman who was McKinley's closest friend.

      The Democratic convention in Chicago was unusually exciting. It was controlled by groups hostile to Cleveland's financial policies, and it took the unprecedented step of rejecting a resolution commending the administration of a president of its own party. The debate on the party platform featured an eloquent defense of silver and agrarian interests by William Jennings Bryan, which won him not only a prolonged ovation but also his party's presidential nomination. Bryan was a former congressman from Nebraska, and at 36 he was the youngest man ever to be the nominee for president of a major party. By experience and conviction he shared the outlook of the agrarian elements that dominated the convention and whose principal spokesman he became.

      Bryan conducted a vigorous campaign. For the first time a presidential candidate carried his case to the people in all parts of the country, and for a time it appeared that he might win. The worried conservatives charged that Bryan was a dangerous demagogue, and they interpreted the campaign as a conflict between defenders of a sound economic system that would produce prosperity and dishonest radicals who championed reckless innovations that would undermine the financial security of the nation. On this interpretation they succeeded in raising large campaign funds from industrialists who feared their interests were threatened. With this money, the Republicans were able to turn the tide and win a decisive victory. Outside the South, Bryan carried only the Western silver states and Kansas and Nebraska.

Economic recovery
      Soon after taking office on March 4, 1897, McKinley called Congress into special session to revise the tariff once again. Congress responded by passing the Dingley Tariff Act, which eliminated many items from the free list and generally raised duties on imports to the highest level they had yet reached.

      Although the preservation of the gold standard had been the chief appeal of the Republicans in 1896, it was not until March 1900 that Congress enacted the Gold Standard Act, which required the Treasury to maintain a minimum gold reserve of $150,000,000 and authorized the issuance of bonds, if necessary, to protect that minimum. In 1900 such a measure was almost anticlimactic, for an adequate gold supply had ceased to be a practical problem. Beginning in 1893, the production of gold in the United States had increased steadily; by 1899 the annual value of gold added to the American supply was double that of any year between 1881 and 1892. The chief source of the new supply of gold was the Klondike, where important deposits of gold had been discovered during the summer of 1896.

      By 1898 the depression had run its course; farm prices and the volume of farm exports were again rising steadily, and Western farmers appeared to forget their recent troubles and to regain confidence in their economic prospects. In industry, the return of prosperity was marked by a resumption of the move toward more industrial combinations, despite the antitrust law; and great banking houses, such as J.P. Morgan and Company of New York, played a key role in many of the most important of these combinations by providing the necessary capital and receiving, in return, an influential voice in the management of the companies created by this capital.

Harold Whitman Bradley Ed.

Imperialism, the Progressive era, and the rise to world power, 1896–1920
American imperialism

      Militarily speaking, the Spanish-American War of 1898 was so brief and relatively bloodless as to have been a mere passing episode in the history of modern warfare. Its political and diplomatic consequences, however, were enormous: it catapulted the United States into the arena of world politics and set it, at least briefly, on the new road of imperialism. To be sure, specific events drove the United States to hostilities in 1898; but the stage had already been set by profound changes in thought about the nation's mission and its destiny.

      Before the 1890s, roughly speaking, most Americans had adhered stubbornly to the belief, as old as the Revolution itself, that their country should remain aloof from European affairs and offer an example of democracy and peace to the rest of the world; but slowly in the 1880s, and more rapidly in the 1890s, new currents of thought eroded this historic conviction. The United States had become a great power by virtue of its prodigious economic growth since the Civil War; numerous publicists said that it ought to begin to act like one. Propagandists of sea power, above all, Captain Alfred T. Mahan, argued that future national security and greatness depended upon a large navy supported by bases throughout the world. After the disappearance of the American frontier in 1890, the conviction grew that the United States would have to find new outlets for an ever-increasing population and agricultural and industrial production; this belief was particularly rife among farmers in dire distress in the 1890s. Social Darwinists said that the world is a jungle, with international rivalries inevitable, and that only strong nations could survive. Added to these arguments were those of idealists and religious leaders that Americans had a duty to “take up the white man's burden” and to carry their assertedly superior culture and the blessings of Christianity to the backward peoples of the world.

      It was against this background that the events of 1898 propelled the United States along the road to war and empire. Cuban rebels had begun a violent revolution against Spanish rule in 1895, set off by a depression caused by a decline in U.S. sugar purchases from Cuba. Rebel violence led progressively to more repressive Spanish countermeasures. Cuban refugees in the United States spread exaggerated tales of Spanish atrocities, and these tales were reprinted widely (particularly by William Randolph Hearst's New York American and Joseph Pulitzer's New York World, then engaged in a fierce battle for circulation). President Cleveland resisted the rising public demand for intervention, but by early 1898 the pressure, then on his successor, McKinley, was too great to be defied. When an explosion—caused by a submarine mine, according to a U.S. naval court of inquiry—sank the USS Maine with large loss of life in Havana harbour on Feb. 15, 1898, events moved beyond the president's control. Though Spain was willing to make large concessions to avoid war, it adamantly resisted what had become the minimum public and official U.S. demand—Spanish withdrawal from Cuba and recognition of the island's independence. Hence Congress in mid-April authorized McKinley to use the armed forces to expel the Spanish from Cuba.

      For Americans it was, as Secretary of State John Hay put it in a letter to Theodore Roosevelt, “a splendid little war.” An American expeditionary force, after quickly overcoming the Spaniards in Cuba, turned against Spain's last island in the Caribbean, Puerto Rico. Meanwhile, on May 1, 1898, the American commodore George Dewey, with his Asiatic squadron, destroyed a decrepit Spanish flotilla in the harbour of Manila in the Philippines.

      The fighting was over by August 12, when the United States and Spain signed a preliminary peace treaty in Washington, D.C. Negotiators met in Paris in October to draw up a definitive agreement. Spain recognized the independence of Cuba and ceded Puerto Rico and Guam to the United States, but the disposition of the Philippines was another matter. Business interests in the United States, which had been noticeably cool about a war over Cuba, demanded the acquisition of the entire Philippine archipelago in the hope that Manila would become the entrepôt for a great Far Eastern trade; chauvinists declaimed against lowering the flag under Spanish pressure. Concluding that he had no alternative, McKinley forced the Spanish to “sell” the Philippines to the United States for $20,000,000.

      But a strong reaction in the United States against acquisition of the Philippines had already set in by the time the Treaty of Paris was signed on Dec. 10, 1898; and anti-imperialists declared that the control and governance of distant alien peoples violated all American traditions of self-determination and would even threaten the very fabric of the republic. Though there were more than enough votes in the Senate to defeat the treaty, that body gave its consent to ratification largely because William Jennings Bryan, the Democratic leader, wanted Democrats to approve the treaty and then make imperialism the chief issue of the 1900 presidential campaign.

The new American empire
      McKinley easily defeated Bryan in 1900. The victory, however, was hardly a mandate for imperialism, and, as events were soon to disclose, the American people were perhaps the most reluctant imperialists in history. No sooner had they acquired an overseas empire than they set in motion the process of its dissolution or transformation.

      By the so-called Teller Amendment to the war resolution, Congress had declared that the United States would not annex Cuba. This pledge was kept, although Cuba was forced in 1903 to sign a treaty making it virtually a protectorate of the United States. The Hawaiian Islands, annexed by Congress on July 7, 1898, were made a territory in 1900 and were hence, technically, only briefly part of the American empire. Puerto Rico was given limited self-government in 1900; and the Jones Act of 1917 conferred full territorial status on the island, gave U.S. citizenship to its inhabitants, and limited its self-government only by the veto of a governor appointed by the president of the United States. Establishing any kind of government in the Philippines was much more difficult because a large band of Filipinos resisted American rule as bravely as they had fought the Spanish. The Philippine insurrection was over by 1901, however, and the Philippine Government Act of 1902 inaugurated partial self-government, which was transformed into almost complete home rule by the Jones Act of 1916.

The Open Door in the Far East
      Although Americans were reluctant imperialists, the United States was an important Pacific power after 1898, and American businessmen had inflated ambitions to tap what they thought was the huge Chinese market. The doors to that market were being rapidly closed in the 1890s, however, as Britain, France, Russia, and Japan carved out large so-called spheres of influence all the way from Manchuria to southern China. With Britain's support (the British stood to gain the most from equal trade opportunities), on Sept. 6, 1899, Secretary of State Hay addressed the first so-called Open Door note to the powers with interests in China; it asked them to accord equal trade and investment opportunities to all nationals in their spheres of interest and leased territories. With considerable bravado, Hay announced that all the powers had agreed to respect the Open Door, even though the Russians had declined to give any pledges. On July 3, 1900, after the Boxer Rebellion—an uprising in China against foreign influence—Hay circulated a second Open Door note announcing that it was American policy to preserve Chinese territorial and political integrity.

      Such pronouncements had little effect because the United States was not prepared to support the Open Door policy with force; successive administrations to the 1940s, however, considered it the cornerstone of their Far Eastern policy. President Theodore Roosevelt reluctantly mediated the Russo-Japanese War in 1905 in part to protect the Open Door as well as to maintain a balance of power in the Far East. When Japan attempted in 1915 to force a virtual protectorate on China, President Woodrow Wilson intervened sternly and in some measure successfully to protect Chinese independence. Victory for American policy seemed to come with the Nine-Power Treaty of Washington of 1922, when all nations with interests in China promised to respect the Open Door.

Building the Panama Canal and American domination in the Caribbean
      Strategic necessity and the desire of Eastern businessmen to have easy access to Pacific markets combined in the late 1890s to convince the president, Congress, and a vast majority of Americans that an isthmian canal linking the Atlantic and Pacific oceans was vital to national security and prosperity. In the Hay–Pauncefote Treaty of 1901, the British government gave up the rights to joint construction with the United States that it had gained under the Clayton–Bulwer Treaty of 1850. A French company, which had tried unsuccessfully to dig a canal across the Isthmus of Panama, was eager to sell its right-of-way to the United States. Thus, the only obstacle to the project was the government of Colombia, which owned Panama. When Colombia was slow to cooperate, Roosevelt, in 1903, covertly supported a Panamanian revolution engineered by officials of the French company. A treaty was quickly negotiated between the United States and the new Republic of Panama; construction began, and the canal was opened to shipping on Aug. 15, 1914.

      Concern over what Americans regarded increasingly as their “lifeline” increased in proportion to progress in the construction of the canal. An early manifestation of that concern came in 1902–03, when Britain, Germany, and Italy blockaded Venezuela to force the payment of debts, and particularly when the Germans bombarded and destroyed a Venezuelan town; so agitated was American opinion that Roosevelt used a veiled threat to force Germany to accept arbitration of the debt question by the Hague Court. When the Dominican Republic defaulted on its foreign debt to several European countries in 1904, Roosevelt quickly established an American receivership of the Dominican customs in order to collect the revenues to meet the country's debt payments. Moreover, in his annual message to Congress of 1904, the president announced a new Latin-American policy, soon called the Roosevelt Corollary to the Monroe Doctrine—because the Monroe Doctrine forbade European use of force in the New World, the United States would itself take whatever action necessary to guarantee that Latin-American states gave no cause for such European intervention. It was, in fact, a considerable extension of the Monroe Doctrine, not a correct historical interpretation of it; but it remained the cornerstone of American policy in the Caribbean at least until 1928.

      Actually, Roosevelt was reluctant to interfere in the domestic affairs of neighbouring states; his one significant intervention after 1904—the administration of the Cuban government from 1906 to 1909—was undertaken in order to prevent civil war and at the insistence of Cuban authorities. Roosevelt's successor, however, William Howard Taft, had more ambitious plans to guarantee American hegemony in the approaches to the Panama Canal. Adopting a policy called Dollar Diplomacy, Taft hoped to persuade American private bankers to displace European creditors in the Caribbean area and thereby to increase American influence and encourage stability in countries prone to revolution. Dollar Diplomacy was a total failure; its one result was to involve the United States in a civil war in Nicaragua with the effect of perpetuating a reactionary and unpopular regime. (Similar initiatives by the Taft administration in the Far East—most notably a plan for the internationalization of the railroads of Manchuria—also failed.)

      The accession of Woodrow Wilson in 1913 seemed to augur the beginning of a new era in Latin-American relations; the new president and his secretary of state, William Jennings Bryan, were idealists who had strongly condemned interventions and Dollar Diplomacy. But, although Wilson did negotiate a treaty with Colombia to make reparation for U.S. complicity in the Panamanian revolution, it was defeated by the Senate. Wilson also tried hard to promote a Pan-American nonaggression pact; but it foundered on the opposition of Chile, which had a long-standing border dispute with Peru.

      When crises threatened the domestic stability of the Caribbean area, however, Wilson revealed that he was just as determined to protect American security as Roosevelt and Taft had been and that he was perhaps even more willing to use force. Frequent revolutions and the fear of European intervention led Wilson to impose a protectorate and a puppet government upon Haiti in 1915 and a military occupation of the Dominican Republic in 1916. He concluded a treaty with Nicaragua making that country a protectorate of the United States. Moreover, he purchased the Danish Virgin Islands in 1916 at the inflated price of $25,000,000 in order to prevent their possible transfer from Denmark to Germany.

The Progressive era

The character and variety of the Progressive movement
      The inauguration of President McKinley in 1897 had seemed to mark the end of an era of domestic turmoil and the beginning of a new period of unparalleled tranquility. Prosperity was returning after the devastating panic of 1893. The agrarian uprising led by Bryan in the election of 1896 had been turned back, and the national government was securely in the hands of friends of big business. The Dingley Tariff Act of 1897 greatly increased tariff rates; the Gold Standard Act of 1900 dashed the hopes of advocates of the free coinage of silver; and McKinley did nothing to stop a series of industrial combinations in defiance of the Sherman Anti-Trust Act.

Origins of progressivism
      Never were superficial signs more deceiving. Actually, the United States already was in the first stages of what historians came to call the Progressive movement. Generally speaking, progressivism was the response of various groups to problems raised by the rapid industrialization and urbanization that followed the Civil War. These problems included the spread of slums and poverty; the exploitation of labour; the breakdown of democratic government in the cities and states caused by the emergence of political organizations, or machines, allied with business interests; and a rapid movement toward financial and industrial concentration. Many Americans feared that their historic traditions of responsible democratic government and free economic opportunity for all were being destroyed by gigantic combinations of economic and political power.

      Actually there was not, either in the 1890s or later, any single Progressive movement. The numerous movements for reform on the local, state, and national levels were too diverse, and sometimes too mutually antagonistic, ever to coalesce into a national crusade. But they were generally motivated by common assumptions and goals—e.g., the repudiation of individualism and laissez-faire, concern for the underprivileged and downtrodden, the control of government by the rank and file, and the enlargement of governmental power in order to bring industry and finance under a measure of popular control.

      The origins of progressivism were as complex, and are as difficult to describe, as the movement itself. In the vanguard were various agrarian crusaders, such as the Grangers and the Populists and Democrats under Bryan, with their demands for stringent railroad regulation and national control of banks and the money supply. At the same time a new generation of economists, sociologists, and political scientists was undermining the philosophical foundations of the laissez-faire state and constructing a new ideology to justify democratic collectivism; and a new school of social workers was establishing settlement houses and going into the slums to discover the extent of human degradation. Allied with them was a growing body of ministers, priests, and rabbis—proponents of what was called the Social Gospel—who struggled to arouse the social concerns and consciences of their parishioners. Finally, journalists called “muckrakers” probed into all the dark corners of American life and carried their message of reform through mass-circulation newspapers and magazines.

      Two specific catalytic agents set off the Progressive movement—the agrarian depression of the early 1890s and the financial and industrial depression that began in 1893. Low prices drove farmers by the hundreds of thousands into the People's Party of 1892. Widespread suffering in the cities beginning in 1893 caused a breakdown of many social services and dramatized for the increasing number of urban middle-class Americans the gross inefficiency of most municipal governments.

Urban reforms
      A movement already begun, to wrest control of city governments from corrupt political machines, was given tremendous impetus by the panic of 1893. The National Municipal League, organized in 1894, united various city reform groups throughout the country; corrupt local governments were overthrown in such cities as New York in 1894, Baltimore in 1895, and Chicago in 1896–97. And so it went all over the country well into the 20th century.

      Despite initial differences among urban reformers, by the early 1900s the vast majority of them were fighting for and winning much the same objectives—more equitable taxation of railroad and corporate property, tenement house reform, better schools, and expanded social services for the poor. Even big-city machines like Tammany Hall became increasingly sensitive to the social and economic needs of their constituents. Reformers also devised new forms of city government to replace the old mayor–city-council arrangement that had proved to be so susceptible to corrupt influences. One was the commission form, which vested all responsibility in a small group of commissioners, each responsible for a single department; another was the city-manager form, which provided administration by a professionally trained expert, responsible to a popularly elected council (these two forms were in widespread use in small and medium-sized cities by 1920).

Reform in state governments
      The reform movement spread almost at once to the state level, for it was in state capitals that important decisions affecting the cities were made. Entrenched and very professional political organizations, generously financed by officeholders and businessmen wanting special privileges, controlled most state governments in the late 1890s; everywhere, these organizations were challenged by a rising generation of young and idealistic antiorganization leaders, ambitious for power. They were most successful in the Midwest, under such leaders as Robert M. La Follette of Wisconsin; but they had counterparts all over the country—e.g., Charles Evans Hughes of New York, Woodrow Wilson of New Jersey, Andrew J. Montague of Virginia, and Hiram W. Johnson of California.

      These young leaders revolutionized the art and practice of politics in the United States, not only by exercising strong leadership but also by effecting institutional changes such as the direct primary, direct election of senators (rather than by state legislatures), the initiative, referendum, and recall—which helped restore and revitalize political democracy. More important, perhaps, progressives to a large degree achieved their economic and social objectives—among them, strict regulation of intrastate railroads and public utilities, legislation to prevent child labour and to protect women workers, penal reform, expanded charitable services to the poor, and accident insurance systems to provide compensation to workers and their families.

Theodore Roosevelt and the Progressive movement
      By 1901 the reform upheaval was too strong to be contained within state boundaries. Moreover, certain problems with which only the federal government was apparently competent to deal cried out for solution. McKinley might have succeeded in ignoring the rising tide of public opinion had he served out his second term; but McKinley's assassination in September 1901 brought to the presidency an entirely different kind of man—Theodore Roosevelt, at age 42 the youngest man yet to enter the White House. Roosevelt had broad democratic sympathies; moreover, thanks to his experience as police commissioner of New York City and governor of New York state, he was the first president to have an intimate knowledge of modern urban problems. Because Congress was securely controlled by a group of archconservative Republicans, the new president had to feel his way cautiously in legislative matters; but he emerged full-grown as a tribune of the people after his triumph in the presidential election of 1904. By 1906 he was the undisputed spokesman of national progressivism and by far its best publicity agent. (The White House was, he said, “a bully pulpit.”) Meanwhile, by his leadership of public opinion and by acting as a spur on Congress, he had revived the presidency and made it incomparably the most powerful force in national politics.

      In 1901, Americans were perhaps most alarmed about the spread of so-called trusts, or industrial combinations, which they thought were responsible for the steady price increases that had occurred each year since 1897. Ever alert to the winds of public opinion, Roosevelt responded by activating the Sherman Anti-Trust Act of 1890, which had lain dormant because of Cleveland's and McKinley's refusal to enforce it and also because of the Supreme Court's ruling of 1895 that the measure did not apply to combinations in manufacturing. Beginning in 1902 with a suit to dissolve a northwestern railroad monopoly, Roosevelt moved next against the so-called Beef Trust, then against the oil, tobacco, and other monopolies. In every case the Supreme Court supported the administration, going so far in the oil and tobacco decisions of 1911 as to reverse its 1895 decision. In addition, in 1903 Roosevelt persuaded a reluctant Congress to establish a Bureau of Corporations with sweeping power to investigate business practices; the bureau's thoroughgoing reports were of immense assistance in antitrust cases. While establishing the supremacy of the federal government in the industrial field, Roosevelt in 1902 also took action unprecedented in the history of the presidency by intervening on labour's behalf to force the arbitration of a strike by the United Mine Workers of America against the Pennsylvania anthracite coal operators.

      Roosevelt moved much more aggressively after his 1904 election. Public demand for effective national regulation of interstate railroad rates had been growing since the Supreme Court had emasculated the Interstate Commerce Commission's (ICC) rate-making authority in the 1890s. Determined to bring the railroads—the country's single greatest private economic interest—under effective national control, Roosevelt waged an unrelenting battle with Congress in 1905–06. The outcome—the Hepburn Act of 1906—was his own personal triumph; it greatly enlarged the ICC's jurisdiction and forbade railroads to increase rates without its approval. By using the same tactics of aggressive leadership, Roosevelt in 1906 also obtained passage of a Meat Inspection Act and a Pure Food and Drug Act. Passage of the former was aided by the publication of Upton Sinclair's famous novel, The Jungle (1906), which revealed in gory detail the unsanitary conditions of the Chicago stockyards and meat-packing plants.

      Meanwhile, almost from his accession to the presidency, Roosevelt had been carrying on a crusade, often independent of Congress, to conserve the nation's fast-dwindling natural resources and to make them available for exploitation under rigorous national supervision. He withdrew from the public domain some 148,000,000 acres of forest lands, 80,000,000 acres of mineral lands, and 1,500,000 acres of water-power sites. Moreover, adoption of the National Reclamation Act of 1902 made possible the beginning of an ambitious federal program of irrigation and hydroelectric development in the West.

Republican troubles under William Howard Taft
      Roosevelt was so much the idol of the masses in 1908 that he could easily have gained the Republican nomination in that year. After his election in 1904, however, he had announced that he would not be a candidate four years later; adhering stubbornly to his pledge, he arranged the nomination of his secretary of war, William Howard Taft of Ohio, who easily defeated Bryan.

      Taft might have made an ideal president during a time of domestic tranquility, but his tenure in the White House was far from peaceful. National progressivism was nearly at high tide; and a large group of Republican progressives, called “insurgents,” sat in both houses of Congress.

The Republican insurgents
      These Republicans, like a majority of Americans, demanded such reforms as tariff reductions, an income tax, the direct election of senators, and even stricter railroad and corporation regulations. Taft, who had strongly supported Roosevelt's policies, thought of himself as a progressive. Actually he was temperamentally and philosophically a conservative; moreover, he lacked the qualities of a dynamic popular leader. In the circumstances, his ineptness, indecision, and failure to lead could only spell disaster for his party.

      Taft's troubles began when he called Congress into special session in 1909 to take up the first item on his agenda—tariff reform. The measure that emerged from Congress actually increased rates. Republican insurgents and a majority of Americans were outraged, but Taft signed the bill and called it the best tariff law the Republicans had ever enacted. Conflicts and misunderstandings over conservation and legislative procedure caused the rift between Taft Republicans and the insurgents to grow. By 1910 the Republican insurgents were clearly in the ascendancy in the Congress. Taking control of the president's railroad-regulation measure, they added new provisions that greatly enlarged the ICC's authority. The following year they bitterly opposed Taft's measure for tariff reciprocity with Canada; it passed with Democratic support in Congress, only to go down to defeat at the hands of the Canadian electorate.

The 1912 election
      Republican insurgents were determined to prevent Taft's renomination in 1912. They found their leader in Roosevelt, who had become increasingly alienated from Taft and who made a whirlwind campaign for the presidential nomination in the winter and spring of 1912. Roosevelt swept the presidential primaries, even in Taft's own state of Ohio; but Taft and conservative Republicans controlled the powerful state organizations and the Republican National Committee and were able to nominate Taft by a narrow margin. Convinced that the bosses had stolen the nomination from him, Roosevelt led his followers out of the Republican convention. In August they organized the Progressive (“Bull Moose”) Party and named Roosevelt to lead the third-party cause.

      Democrats had swept the 1910 congressional and gubernatorial elections; and, after the disruption of the Republican Party in the spring of 1912, it was obvious that almost any passable Democrat could win the presidency in that year. Woodrow Wilson, former president of Princeton University, who had made a brilliant Progressive record as governor of New Jersey, was nominated by the Democrats on the 46th ballot.

      Taft's single objective in the 1912 campaign was to defeat Roosevelt. The real contest was between Roosevelt and Wilson for control of the Progressive majority. Campaigning strenuously on a platform that he called the New Nationalism, Roosevelt demanded effective control of big business through a strong federal commission, radical tax reform, and a whole series of measures to put the federal government squarely into the business of social and economic reform. By contrast Wilson seemed conservative with a program he called the New Freedom; it envisaged a concerted effort to destroy monopoly and to open the doors of economic opportunity to small businessmen through drastic tariff reduction, banking reform, and severe tightening of the antitrust laws. Roosevelt outpolled Taft in the election, but he failed to win many Democratic Progressives away from Wilson, who won by a huge majority of electoral votes, though receiving only about 42 percent of the popular vote.

The New Freedom and its transformation
      A trained political scientist and historian, Wilson believed that the president should be the leader of public opinion, the chief formulator of legislative policy, and virtually sovereign in the conduct of foreign relations. With the support of an aroused public opinion and a compliant Democratic majority, he was able to put his theories of leadership into effect with spectacular success.

      The first item in Wilson's program was tariff reform, a perennial Democratic objective since the Civil War; the president's measure, the Underwood Tariff Act of 1913, reduced average rates from 40 percent to 25 percent, greatly enlarged the free list, and included a modest income tax. Next came adoption of the president's measure for banking and monetary reform, the Federal Reserve Act of 1913, which created a federal reserve system to mobilize banking reserves and issue a flexible new currency—federal reserve notes—based on gold and commercial paper; uniting and supervising the entire system was a federal reserve board of presidential appointees.

      The third, and Wilson thought the last, part of the New Freedom program was antitrust reform. In his first significant movement toward Roosevelt's New Nationalism, Wilson reversed his position that merely strengthening the Sherman Anti-Trust Act would suffice to prevent monopoly. Instead, he took up and pushed through Congress the Progressive-sponsored Federal Trade Commission Act of 1914. It established an agency—the Federal Trade Commission (FTC)—with sweeping authority to prevent business practices that would lead to monopoly. Meanwhile, Wilson had abandoned his original measure, the Clayton Anti-Trust Act, passed by Congress in 1914; its severe provisions against interlocking directorates and practices tending toward monopoly had been gravely weakened by the time the president signed it. The Clayton Act included a declaration that labour unions, as such, were not to be construed as conspiracies in restraint of trade in violation of the antitrust laws; but what organized labour wanted, and did not get, was immunity from prosecution for such measures as the sympathetic strike and the secondary boycott, which the courts had proscribed as violations of the Sherman Act.

      In a public letter in November 1914, the president announced that his reform program was complete. But various groups were still demanding the advanced kind of social and economic legislation that Roosevelt had advocated in 1912; also, by early 1916 the Progressive Party had largely disintegrated, and Wilson knew that he could win reelection only with the support of a substantial minority of Roosevelt's former followers. Consequently—and also because his own political thinking had been moving toward a more advanced Progressive position—Wilson struck out upon a new political course in 1916. He began by appointing Louis D. Brandeis, the leading critic of big business and finance, to the Supreme Court. Then in quick succession he obtained passage of a rural-credits measure to supply cheap long-term credit to farmers; anti-child-labour and federal workmen's-compensation legislation; the Adamson Act, establishing the eight-hour day for interstate railroad workers; and measures for federal aid to education and highway construction. With such a program behind him, Wilson was able to rally a new coalition of Democrats, former Progressives, independents, social workers, and a large minority of Socialists; and he narrowly defeated his Republican opponent, Charles Evans Hughes, in the 1916 presidential election.

The rise to world power

Woodrow Wilson and the Mexican Revolution
      Although Wilson's consuming interest was in domestic politics, he had to deal primarily with foreign affairs while in the White House; and before the end of his presidency he had developed into a diplomatist of great skill as well as one of the commanding figures in world affairs. He was a “strong” president in the conduct of foreign policy, writing most of the important diplomatic correspondence of his government and making all important decisions himself. He usually worked well with his secretaries of state, William Jennings Bryan and Robert Lansing, and often relied for advice upon his confidential counselor, Colonel Edward M. House of Texas.

      Wilson served his apprenticeship by having to deal at the outset of his administration with an uprising in Mexico, set off when a military usurper, Victoriano Huerta, murdered liberal president Francisco Madero and seized the executive power in February 1913. It was difficult for the United States to remain aloof because Americans had invested heavily in Mexico and 40,000 U.S. citizens resided there.

      If Wilson had followed conventional policy and the urgings of Americans with interests in Mexico, he would have recognized Huerta (as most European governments did), who promised to respect and protect all foreign investments and concessions. But Wilson was revolted by Huerta's bloody rise to power; moreover, he believed that the revolution begun by Madero in 1910 was a glorious episode in the history of human liberty. Wilson thus not only refused to recognize Huerta but also tried to persuade the dictator to step down from office and permit the holding of free elections for a new democratic government. When Huerta refused to cooperate, Wilson gave open support to the Constitutionalists—Huerta's opponents under Madero's successor, Venustiano Carranza—and, when it seemed that the Constitutionalists could not themselves drive Huerta from power, Wilson seized the port of Veracruz in April 1914 to cut off Huerta's supplies and revenues. This stratagem succeeded, and Carranza and his army occupied Mexico City in August.

      The revolutionary forces then divided between Carranza's followers and those of his chief rival and most colorful general, Pancho Villa; and civil war raged for another year. Wilson refused to interfere. Carranza emerged victorious by the summer of 1915, and Wilson accorded him de facto recognition in October. In January 1916, however, Villa executed about 17 U.S. citizens at Santa Isabel to demonstrate Carranza's lack of control in northern Mexico. Then, seeking to provoke war between the United States and Mexico, he raided Columbus, New Mexico, on March 9, 1916, burning the town and killing some 17 inhabitants. Wilson sent a punitive expedition under General John J. Pershing into Mexico in hot pursuit of Villa; but the wily guerrilla eluded Pershing, and, the deeper the U.S. forces penetrated into Mexican territory, the more agitated the Carranza government became. There were two serious skirmishes between regular Mexican and U.S. troops in the spring, and full-scale war was averted only when Wilson withdrew Pershing's column some months later. Relations between the two governments were greatly improved when Wilson extended de jure recognition to Carranza's new Constitutional regime in April 1917. Thereafter, Wilson adamantly rejected all further foreign and American suggestions for intervention in Mexico.

The struggle for neutrality
      The outbreak of general war in Europe in August 1914 raised grave challenges to Wilson's skill and leadership in foreign affairs. In spite of the appeals of propagandists for the rival Allies and Central Powers, the great majority of Americans were doggedly neutral and determined to avoid involvement unless American rights and interests were grossly violated. This, too, was Wilson's own feeling, and in August he issued an official proclamation of neutrality and two weeks later appealed to Americans to be “impartial in thought as well as in action.”

Loans and supplies for the Allies
      Difficulties arose first with the British government, which at once used its vast fleet to establish a long-range blockade of Germany. The U.S. State Department sent several strong protests to London, particularly against British suppression of American exports of food and raw materials to Germany. Anglo-American blockade controversies were not acute, however, because the British put their blockade controls into effect gradually, always paid for goods seized, argued persuasively that in a total war food and raw materials were as essential as guns and ammunition, and pointed out that they, the British, were simply following blockade precedents established by the United States itself during the American Civil War. As a result of a tacit Anglo-American agreement, the United States soon became the chief external source of supply for the food, raw materials, and munitions that fed the British and French war machines. In addition, and in accordance with the strict rules of neutrality, the Wilson administration permitted the Allied governments to borrow more than $2,000,000,000 in the United States in order to finance the war trade. At the same time, the president resisted all efforts by German Americans for an arms embargo on the ground that such a measure would be grossly unneutral toward the Allies.

      There was no possibility of conflict between Germany and the United States so long as the former confined its warfare to the continent of Europe; a new situation full of potential danger arose, however, when the German authorities decided to use their new weapon, the submarine, to challenge British control of the seas. The German admiralty announced in February 1915 that all Allied vessels would be torpedoed without warning in a broad area and that even neutral vessels were not safe. Wilson replied at once that he would hold Germany to “strict accountability” (a conventional diplomatic term) if submarines destroyed American ships and lives without warning. The Germans soon gave broad guarantees concerning American ships, and their safety against illegal submarine attacks was not an issue between the two countries before 1917.

      An issue much more fraught with danger was the safety of Americans traveling and working on Allied ships. A German submarine sank the unarmed British liner Lusitania without warning on May 7, 1915, killing, among others, 128 Americans. Wilson at first appealed to the Germans on broad grounds of humanity to abandon submarine warfare, but in the subsequent negotiations he narrowed the issue to one of safety for unarmed passenger liners against violent undersea attack. The controversy flared anew when a submarine sank the unarmed British liner Arabic in August. Wilson warned that he would break diplomatic relations if such attacks continued, and the Germans grudgingly promised not to attack unarmed passenger ships without warning. The controversy escalated to a more dangerous level when a submarine torpedoed the packet steamer Sussex in the English Channel with heavy loss of life in March 1916. In an ultimatum to Berlin, Wilson threatened to break diplomatic relations if the Germans did not cease attacking liners and merchantmen without warning; once again the Germans capitulated, but they threatened to resume unrestricted submarine warfare if the United States failed to force the British to observe international law in their blockade practices.

      The Allies complicated the submarine controversy in late 1915 by arming many of their liners and merchantmen sailing to American ports. Wilson tried to arrange a compromise by which the Allies would disarm their ships in return for a German promise not to sink them without warning. When the British rejected the proposal, the president gave the impression that he would hold Germany accountable for American lives lost on armed ships, setting off a rebellion in Congress and the near passage of resolutions forbidding American citizens to travel on armed ships. Actually, the president had no intention of permitting armed ships to become a serious issue; their status was never a subject of serious controversy between the United States and Germany.

Arming for war
      Meanwhile, the increasingly perilous state of relations with Germany had prompted Wilson, in December 1915, to call for a considerable expansion in the nation's armed forces. A violent controversy over preparedness ensued, both in Congress and in the country at large. The army legislation of 1916 was a compromise, with Wilson obtaining only a modest increase in the army and a strengthening of the National Guard; but the Naval Appropriations Act of 1916 provided for more ships than the administration had requested.

The United States enters the Great War
      Wilson's most passionate desire, aside from avoiding belligerency, was to bring an end to the war through his personal mediation. He sent Colonel House to Europe in early 1915 to explore the possibilities of peace and again early in 1916 to press for a plan of Anglo-American cooperation for peace. The British refused to cooperate, and the president, more than ever eager to avoid a final confrontation with Germany on the submarine issue, decided to press forward with independent mediation. He was by this time also angered by the intensification of British blockade practices and convinced that both sides were fighting for world domination and spoils. On Dec. 18, 1916, Wilson asked the belligerents to state the terms upon which they would be willing to make peace. Soon afterward, in secret, high-level negotiations, he appealed to Britain and Germany to hold an early peace conference under his leadership.

Break with Germany
      Chances for peace were blasted by a decision of the German leaders, made at an imperial conference on Jan. 9, 1917, to inaugurate an all-out submarine war against all commerce, neutral as well as belligerent. The Germans knew that such a campaign would bring the United States into the war; but they were confident that their augmented submarine fleet could starve Britain into submission before the United States could mobilize and participate effectively.

      The announcement of the new submarine blockade in January left the president no alternative but to break diplomatic relations with Germany, which he did on February 3. At the same time, and in subsequent addresses, the president made it clear that he would accept unrestricted submarine warfare against belligerent merchantmen and would act only if American ships were sunk. In early March he put arms on American ships in the hope that this would deter submarine attacks. The Germans began to sink American ships indiscriminately in mid-March, and on April 2 Wilson asked Congress to recognize that a state of war existed between the United States and the German Empire. Congress approved the war resolution quickly, and Wilson signed it on April 6. (For U.S. military involvement in World War I, see the article World War I.)

      Generally speaking, the efforts at mobilization went through two stages. During the first, lasting roughly from April to December 1917, the administration relied mainly on voluntary and cooperative efforts. During the second stage, after December 1917, the government moved rapidly to establish complete control over every important phase of economic life. Railroads were nationalized; a war industries board established ironclad controls over industry; food and fuel were strictly rationed; an emergency-fleet corporation began construction of a vast merchant fleet; and a war labour board used coercive measures to prevent strikes. Opposition to the war was sternly suppressed under the Espionage Act of 1917. At the same time, the Committee on Public Information, headed by the progressive journalist George Creel, mobilized publicists, scholars, and others in a vast prowar propaganda effort. By the spring of 1918, the American people and their economy had been harnessed for total war (a near miracle, considering the lack of preparedness only a year before).

America's role in the war
      The American military contribution, while small compared to that of the Allies during the entire war, was in two respects decisive in the outcome. The U.S. Navy, fully prepared at the outset, provided the ships that helped the British overcome the submarine threat by the autumn of 1917. The U.S. Army, some 4,000,000 men strong, was raised mainly by conscription under the Selective Service Act of 1917; the American Expeditionary Force of more than 1,200,000 men under General Pershing reached France by September 1918, and this huge infusion of manpower tipped the balance on the Western Front and helped to end the war in November 1918, a year earlier than military planners had anticipated.

Wilson's vision of a new world order
      In one of the most ambitious rhetorical efforts in modern history, President Wilson attempted to rally the people of the world in a movement for a peace settlement that would remove the causes of future wars and establish machinery to maintain peace. In an address to the Senate on Jan. 22, 1917, he called for a “peace without victory” to be enforced by a league of nations that the United States would join and strongly support. He reiterated this program in his war message, adding that the United States wanted above all else to “make the world safe for democracy.” And when he failed to persuade the British and French leaders to join him in issuing a common statement of war aims, he went to Congress on Jan. 8, 1918, to make, in his Fourteen Points address, his definitive avowal to the American people and the world.

      In his general points Wilson demanded an end to the old diplomacy that had led to wars in the past. He proposed open diplomacy instead of entangling alliances, and he called for freedom of the seas, an impartial settlement of colonial claims, general disarmament, removal of artificial trade barriers, and, most important, a league of nations to promote peace and protect the territorial integrity and independence of its members. On specific issues he demanded, among other things, the restoration of a Belgium ravaged by the Germans; sympathetic treatment of the Russians, then involved in a civil war; establishment of an independent Poland; the return of Alsace-Lorraine to France; and autonomy or self-determination for the subject peoples of the Austro-Hungarian and Ottoman empires. A breathtaking pronouncement, the Fourteen Points gave new hope to millions of liberals and moderate socialists who were fighting for a new international order based upon peace and justice.

The Paris Peace Conference and the Versailles Treaty
      With their armies reeling under the weight of a combined Allied and American assault, the Germans appealed to Wilson in October 1918 for an armistice based on the Fourteen Points and other presidential pronouncements. The Allies agreed to conclude peace on this basis, except that the British entered a reservation about freedom of the seas, and Wilson agreed to an Anglo-French demand that the Germans be required to make reparation for damages to civilian property.

      Wilson led the U.S. delegation and a large group of experts to the peace conference, which opened in Paris in January 1919. He fought heroically for his Fourteen Points against the Allied leaders—David Lloyd George of Britain, Georges Clemenceau of France, and Vittorio Orlando of Italy—who, under heavy pressure from their own constituencies, were determined to divide the territories of the vanquished and make Germany pay the full cost of the war. Wilson made a number of compromises that violated the spirit if not the letter of the Fourteen Points, including the imposition of an indefinitely large reparations bill upon Germany. Moreover, the Allies had intervened in the Russian Civil War against the dominant revolutionary socialist faction, the Bolsheviks; and Wilson had halfheartedly cooperated with the Allies by dispatching small numbers of troops to northern Russia, to protect military supplies against the advancing Germans, and to Siberia, mainly to keep an eye on the Japanese, who had sent a large force there. But Wilson won many more of his Fourteen Points than he lost; his greatest victories were to prevent the dismemberment of Germany in the west and further intervention in Russia and, most important, to obtain the incorporation of the Covenant of the League of Nations into the Versailles Treaty. He was confident that the League, under American leadership, would soon rectify the injustices of the treaty.

The fight over the treaty and the election of 1920
      Public opinion in the United States seemed strongly in favour of quick ratification of the Versailles Treaty when the president presented that document to the Senate in July 1919. Traditional isolationist sentiment was beginning to revive, however, and a small minority of 16 senators, irreconcilably opposed to U.S. membership in the League, vowed to oppose the treaty to the bitter end. In addition, a crucial controversy developed between the president and a majority of the Republican senators, led by Henry Cabot Lodge of Massachusetts. Lodge insisted upon adding 14 reservations to the treaty. The second reservation declared that the United States assumed no obligations under Article X of the Covenant, which guaranteed the integrity and independence of members of the League; moreover, it said that the president could not use the armed forces to support the Covenant without the explicit consent of Congress.

      Calling this reservation a nullification of the treaty, Wilson in September made a long speaking tour of the West to build up public support for unconditional ratification. He suffered a breakdown at the end of his tour and a serious stroke on October 2. The president's illness, which incapacitated him for several months, increased his intransigence against the Lodge reservations; with equal stubbornness, the Massachusetts senator refused to consent to any compromise. The result was failure to obtain the necessary two-thirds majority for ratification, with or without reservations, when the Senate voted on Nov. 19, 1919, and again on March 19, 1920.

      Wilson had suggested that the ensuing presidential campaign and election should be a “great and solemn referendum” on the League. The Democratic candidate, James M. Cox of Ohio, fought hard to make it the leading issue; but the Republican candidate, Warren G. Harding of Ohio, was evasive on the subject, and a group of 31 leading Republican internationalists assured the country that Harding's election would be the best guarantee of U.S. membership in the League of Nations. Harding swamped Cox, and his victory ended all hopes for U.S. membership. In his inaugural address Harding announced that the United States would not be entangled in European affairs; he emphasized this determination by concluding a separate peace with Germany in 1921.

Arthur S. Link

The United States from 1920 to 1945
The postwar Republican administrations

      After the end of World War I, many Americans were left with a feeling of distrust toward foreigners and radicals, whom they held responsible for the war. The Russian Revolution of 1917 and the founding of the communists' Third International in 1919 further fanned American fears of radicalism. Race riots and labour unrest added to the tension. Thus, when a series of strikes and indiscriminate bombings began in 1919, the unrelated incidents were all assumed—incorrectly in most cases—to be communist-inspired. During the ensuing Red Scare, civil liberties were sometimes grossly violated and many innocent aliens were deported. The Red Scare was over within a year, but a general distrust of foreigners, liberal reform movements, and organized labour remained throughout the 1920s. In fact, many viewed Harding's landslide victory in 1920 as a repudiation of Wilson's internationalism and of the reforms of the Progressive era.

Peace and prosperity
      Harding took office with a clear mandate to restore business as usual, a condition he termed “normalcy.” Americans wished to put reminders of the Great War behind them, as well as the brutal strikes, the Red Scare, and the sharp recession of Wilson's last years in office. Peace and prosperity were what people desired, and these would be achieved under Harding.

      As part of his policy of returning America to prewar conditions, Harding pardoned many individuals who had been convicted of antiwar activities or of radicalism. His main concern, however, was business. Reversing progressive and wartime trends, the Harding administration strove to establish probusiness policies. Attorney General Harry M. Daugherty obtained injunctions against striking workers. The Supreme Court sided with management in disputes over unions, minimum wage laws, child labour, and other issues. Secretary of Commerce Herbert Hoover expanded the size of his department fourfold during the next eight years in attempts to foster business growth and efficiency and to encourage trade associations and business–labour cooperation. Secretary of the Treasury Andrew W. Mellon, one of the nation's richest men, drastically cut taxes, especially on the wealthy; he also cut federal spending to reduce the national debt.

      In foreign affairs the Harding administration tried to ensure peace by urging disarmament, and at the Washington Naval Conference in 1921 Secretary of State Charles Evans Hughes negotiated the first effective arms-reduction agreement in history. On the whole, however, the policies of the United States were narrow and nationalistic. It did not cooperate with the League of Nations. It insisted that Europeans pay their American debts but in 1922 passed the Fordney–McCumber Tariff, which raised duties so high that foreigners had great difficulty earning the necessary dollars. When immigration reached prewar levels (some 800,000 people entered the country between June 1920 and June 1921), Congress gave in to the protests of organized labour, which believed immigrants were taking jobs away from American citizens, and to the objections of business leaders and patriotic organizations, who feared that some of the immigrants might be radicals. Reversing traditional American policy, Congress passed first an emergency restriction bill and then in 1924 the National Origins Act. The act set a quota limiting the number of immigrants to 164,000 annually (150,000 after July 1, 1927); it discriminated against immigrants from southern and eastern Europe and barred Asians completely. The quota did not pertain to North Americans, however.

      Harding's policies, his genial nature, and the return of prosperity made the president extremely popular. His sudden death, of a cerebral embolism, in the summer of 1923 resulted in a national outpouring of grief. Yet it soon became evident that his administration had been the most corrupt since Grant's. Harding had appointed venal mediocrities, many of them old cronies, to office, and they had betrayed his trust. The most publicized scandal was the illegal leasing of naval oil reserves at Teapot Dome, Wyo., which led to the conviction of Secretary of the Interior Albert B. Fall for accepting a bribe.

      Calvin Coolidge, Harding's vice president and successor, was a taciturn, parsimonious New Englander who restored honesty to government. His administration suffered none of the stigma of the Harding scandals, and Coolidge, thanks to a buoyant economy and a divided Democratic Party, easily defeated the conservative Democrat John W. Davis in the election of 1924. Even though an independent campaign by Senator Robert M. La Follette of Wisconsin drew off insurgent Republicans, Coolidge received more popular, and electoral, votes than his opponents combined.

      Coolidge followed Harding's policies, and prosperity continued for most of the decade. From 1922 to 1929, stock dividends rose by 108 percent, corporate profits by 76 percent, and wages by 33 percent. In 1929, 4,455,100 passenger cars were sold by American factories, one for every 27 members of the population, a record that was not broken until 1950. Productivity was the key to America's economic growth. Because of improvements in technology, overall labour costs declined by nearly 10 percent, even though the wages of individual workers rose.

      The prosperity was not solidly based, however. The wealthy benefited most, and agriculture and several industries, such as textiles and bituminous coal mining, were seriously depressed; after 1926 construction declined.

New social trends
      For millions of Americans, the sober-minded Coolidge was a more appropriate symbol for the era than the journalistic terms Jazz Age or Roaring Twenties. These terms were exaggerations, but they did have some basis in fact. Many young men and women who had been disillusioned by their experiences in World War I rebelled against what they viewed as unsuccessful, outmoded prewar conventions and attitudes. Women who had been forced to work outside the home because of labour shortages during the war were unwilling to give up their social and economic independence after the war had ended. Having won the right to vote when the Nineteenth Amendment was ratified in 1920, the new “emancipated” woman, the flapper, demanded to be recognized as man's equal in all areas. She adopted a masculine look, bobbing her hair and abandoning corsets; she drank and smoked in public; and she was more open about sex.

      Social changes were not limited to the young. Productivity gains brought most Americans up to at least a modest level of comfort. People were working fewer hours a week and earning more money than ever before. New consumer goods—radios, telephones, refrigerators, and above all the motor car—made life better, and they were easier to buy thanks to a vastly expanded consumer credit system. Leisure activities became more important, professional sports boomed, and the rapid growth of tabloid newspapers, magazines, movies, and radios enabled millions to share in the exciting world of speakeasies, flappers, and jazz music, even if only vicariously.

      On the darker side, antiforeign sentiment led to the revival of the racist, anti-Semitic, and anti-Catholic Ku Klux Klan, especially in rural areas. During the early 1920s the Klan achieved a membership of some 5,000,000 and gained control of, or influence over, many city and state governments. Rural areas also provided the base for a Christian fundamentalist movement, as farmers and small-town dwellers who felt threatened and alienated by the rapidly expanding, socially changing cities fought to preserve American moral standards by stressing religious orthodoxy. The movement grew steadily until 1925, when John T. Scopes, a biology teacher in Dayton, Tenn., was tried for violating a law common to many Southern states prohibiting the teaching of the theory of evolution. Although Scopes was found guilty of breaking the law, both the law itself and fundamentalist beliefs were ridiculed during the course of the trial, which attracted national attention.

      One fundamentalist goal that was achieved was the passage in 1919 of the Prohibition (Eighteenth) Amendment, which prohibited the manufacture, sale, or transportation of intoxicating liquors. Millions of mostly Protestant churchgoers hailed Prohibition as a moral advance, and the liquor consumption of working people, as well as the incidence of alcohol-related diseases and deaths, does seem to have dropped during the period. On the other hand, millions of otherwise law-abiding citizens drank the prohibited liquor, prompting the growth of organized crime. The illegal liquor business was so lucrative and federal prohibition enforcement machinery was so slight that gangsters were soon engaged in the large-scale smuggling, manufacture, and sale of alcoholic beverages.

      As in legitimate business, the highest profits came from achieving economies of scale, so gangsters engaged in complex mergers and takeovers; but, unlike corporate warfare, the underworld used real guns to wipe out competition. In 1931 a national law-enforcement commission, formed to study the flouting of prohibition and the activities of gangsters, reported that prohibition was virtually unenforceable; and, with the coming of the Great Depression, prohibition ceased to be a key political issue. In 1933 the Twenty-first Amendment brought its repeal.

      In the meantime, prohibition and religion were the major issues of the 1928 presidential campaign between the Republican nominee, Herbert Hoover (Hoover, Herbert), and the Democrat, Governor Alfred E. Smith (Smith, Al) of New York. Smith was an opponent of prohibition and a Roman Catholic. His candidacy brought enthusiasm and a heavy Democratic vote in the large cities, but a landslide against him in the dry and Protestant hinterlands secured the election for Hoover.

The Great Depression (Great Depression)
      In October 1929, only months after Hoover took office, the stock market crashed (stock market crash of 1929), the average value of 50 leading stocks falling by almost half in two months. Despite occasional rallies, the slide persisted until 1932, when stock averages were barely a fourth of what they had been in 1929. Industrial production soon followed the stock market, giving rise to the worst unemployment the country had ever seen. By 1933 at least a quarter of the work force was unemployed. Adjusted for deflation, salaries had fallen by 40 percent and industrial wages by 60 percent.

      The causes of the Great Depression were many and various. Agriculture had collapsed in 1919 and was a continuing source of weakness. Because of poor regulatory policies, many banks were overextended. Wages had not kept up with profits, and by the late 1920s consumers were reaching the limits of their ability to borrow and spend. Production had already begun to decline and unemployment to rise before the crash. The crash, which was inevitable since stock prices were much in excess of real value, greatly accelerated every bad tendency, destroying the confidence of investors and consumers alike.

      Hoover met the crisis energetically, in contrast to earlier administrations, which had done little to cope with panics except reduce government spending. He extracted promises from manufacturers to maintain production. He signed legislation providing generous additional sums for public works. He also signed the infamous Smoot–Hawley Tariff (Smoot-Hawley Tariff Act) of 1930, which raised duties to an average level of 50 percent. These steps failed to ease the depression, however, while the tariff helped to export it. International trade had never recovered from World War I. Europe still depended on American sales and investments for income and on American loans to maintain the complicated structure of debt payments and reparations erected in the 1920s. After the crash Americans stopped investing in Europe, and the tariff deprived foreigners of their American markets. Foreign nations struck back with tariffs of their own, and all suffered from the resulting anarchy.

      In the 1930 elections the Democratic Party won control of the House of Representatives and, in combination with liberal Republicans, the Senate as well. Soon afterward a slight rise in production and employment made it seem that the worst of the depression was over. Then, in the spring of 1931, another crisis erupted. The weakening western European economy brought down a major bank in Vienna, and Germany defaulted on its reparations payments. Hoover proposed a one-year moratorium on reparations and war-debt payments, but, even though the moratorium was adopted, it was too little too late. In the resulting financial panic most European governments went off the gold standard (gold-exchange standard) and devalued their currencies, thus destroying the exchange system, with devastating effects upon trade. Europeans withdrew gold from American banks, leading the banks to call in their loans to American businesses. A cascade of bankruptcies ensued, bank customers collapsing first and after them the banks.

      Hoover tried hard to stabilize the economy. He persuaded Congress to establish a Reconstruction Finance Corporation to lend funds to banks, railroads, insurance companies, and other institutions. At the same time, in January 1932, new capital was arranged for federal land banks. The Glass–Steagall Act provided gold to meet foreign withdrawals and liberalized Federal Reserve credit. The Federal Home Loan Bank Act sought to prop up threatened building and loan associations. But these measures failed to promote recovery or to arrest the rising tide of unemployment. Hoover, whose administrative abilities had masked severe political shortcomings, made things worse by offering negative leadership to the nation. His public addresses were conspicuously lacking in candor. He vetoed measures for direct federal relief, despite the fact that local governments and private charities, the traditional sources for welfare, were clearly incapable of providing adequate aid for the ever-rising numbers of homeless and hungry. When unemployed veterans refused to leave Washington after their request for immediate payment of approved bonuses was denied, Hoover sent out the army, which dispersed the protesters at bayonet point and burned down their makeshift quarters.

      Hoover's failures and mistakes guaranteed that whoever the Democrats nominated in 1932 would become the next president. Their candidate was Governor Franklin Delano Roosevelt (Roosevelt, Franklin D.) of New York. He won the election by a large margin, and the Democrats won majorities in both branches of Congress.

The New Deal (New Deal)

The first New Deal
      Roosevelt took office amid a terrifying bank crisis that had forced many states to suspend banking activities. He acted quickly to restore public confidence. On Inaugural Day, March 4, 1933, he declared that “the only thing we have to fear is fear itself.” The next day he halted trading in gold and declared a national “ bank holiday.” On March 9 he submitted to Congress an Emergency Banking Bill authorizing the government to strengthen, reorganize, and reopen solvent banks. The House passed the bill by acclamation, sight unseen, after only 38 minutes of debate. That night the Senate passed it unamended, 73 votes to 7. On March 12 Roosevelt announced that, on the following day, sound banks would begin to reopen. On March 13, deposits exceeded withdrawals in the first reopened banks. “Capitalism was saved in eight days,” Raymond Moley, a member of the president's famous “brain trust,” later observed.

      In fact, the legal basis for the bank holiday was doubtful. The term itself was a misnomer, intended to give a festive air to what was actually a desperate last resort. Most of the reopened banks were not audited to establish their solvency; instead the public was asked to trust the president. Nevertheless, the bank holiday exemplified brilliant leadership at work. It restored confidence where all had been lost and saved the financial system. Roosevelt followed it up with legislation that did actually put the banking structure on a solid footing. The Glass–Steagall Act of 1933 separated commercial from investment banking and created the Federal Deposit Insurance Corporation to guarantee small deposits. The Banking Act of 1935 strengthened the Federal Reserve System, the first major improvement since its birth in 1913.

      With the country enthusiastically behind him, Roosevelt kept Congress in special session and piece by piece sent it recommendations that formed the basic recovery program of his first 100 days in office. From March 9 to June 16, 1933, Congress enacted all of Roosevelt's proposals. Among the bills passed was one creating the Tennessee Valley Authority, which would build dams and power plants and in many other ways salvage a vast, impoverished region. The Securities Act of 1933 gave the Federal Trade Commission broad new regulatory powers, which in 1934 were passed on to the newly created Securities and Exchange Commission. The Home Owners Loan Act established a corporation that refinanced one of every five mortgages on urban private residences. Other bills passed during the Hundred Days, as well as subsequent legislation, provided aid for the unemployed and the working poor and attacked the problems of agriculture and business.

      Nothing required more urgent attention than the masses of unemployed workers who, with their families, had soon overwhelmed the miserably underfinanced bodies that provided direct relief. On May 12, 1933, Congress established a Federal Emergency Relief Administration to distribute half a billion dollars to state and local agencies. Roosevelt also created the Civil Works Administration, which by January 1934 was employing more than 4,000,000 men and women. Alarmed by rising costs, Roosevelt dismantled the CWA in 1934, but the persistence of high unemployment led him to make another about-face. In 1935 the Emergency Relief Appropriation Act provided almost $5,000,000,000 to create work for some 3,500,000 persons. The Public Works Administration (PWA), established in 1933, provided jobs on long-term construction projects, and the Civilian Conservation Corps put 2,500,000 young men to work planting or otherwise improving huge tracts of forestland. For homeowners, the Federal Housing Administration began insuring private home-improvement loans to middle-income families in 1934; in 1938 it became a home-building agency as well.

Agricultural recovery
      Hoover's Federal Farm Board had tried to end the long-standing agricultural depression by raising prices without limiting production. Roosevelt's Agricultural Adjustment Act (AAA) of 1933 was designed to correct the imbalance. Farmers who agreed to limit production would receive “parity” payments to balance prices between farm and nonfarm products, based on prewar income levels. Farmers benefited also from numerous other measures, such as the Farm Credit Act of 1933, which refinanced a fifth of all farm mortgages in a period of 18 months, and the creation in 1935 of the Rural Electrification Administration (REA), which did more to bring farmers into the 20th century than any other single act. Thanks to the REA, nine out of 10 farms were electrified by 1950, compared to one out of 10 in 1935.

      These additional measures were made all the more important by the limited success of the AAA. Production did fall as intended, aided by the severe drought of 1933–36, and prices rose in consequence; but many, perhaps a majority, of farmers did not prosper as a result. The AAA was of more value to big operators than to small family farmers, who often could not meet their expenses if they restricted their output and therefore could not qualify for parity payments. The farm corporation, however, was able to slash its labour costs by cutting acreage and could cut costs further by using government subsidies to purchase machinery. Thus, even before the Supreme Court invalidated the AAA in 1936, support for it had diminished.

Business recovery
      As the economic crisis was above all an industrial depression, business recovery headed the New Deal's list of priorities. Working toward that goal, the administration drafted the National Industrial Recovery Act of 1933, which, among other things, created a National Recovery Administration to help business leaders draw up and enforce codes governing prices, wages, and other matters (coded industries would be exempt from the antitrust laws). Labour was offered protection from unfair practices and given the right to bargain collectively. A large-scale public works appropriation, administered through the PWA, was intended to pour sufficient money into the economy to increase consumer buying power while prices and wages went up.

      Despite great initial enthusiasm for the NRA program, it was a failure. The codes became too numerous and complex for proper enforcement, and they were resented because they tended to favour the leading producers in each regulated industry. The protections afforded labour proved illusory, while the PWA, despite an impressive building record that included not only dams, bridges, and schools but also aircraft carriers, was too slow and too small to have much effect on the economy as a whole.

      Yet, even if the NRA had overcome its technical problems, failure would probably still have resulted. What the country needed was economic growth, but the NRA assumed that the United States had a mature economic structure incapable of further expansion. Accordingly, it worked to stabilize the economy, eliminate wasteful or predatory competition, and protect the rights of labour. Encouraging growth was not on its agenda.

The second New Deal and the Supreme Court
      In reaction to pressures from the left and hostility from the right, the New Deal shifted more toward reform in 1935–36. Popular leaders, promising more than Roosevelt, threatened to pull sufficient votes from him in the 1936 election to bring Republican victory. Senator Huey P. Long (Long, Huey) of Louisiana was building a national following with a “Share the Wealth” program. The poor in Northern cities were attracted to the Roman Catholic priest Charles E. Coughlin (Coughlin, Charles E.), who later switched from a program of nationalization and currency inflation to an antidemocratic, anti-Semitic emphasis. Many older people supported Francis E. Townsend's plan to provide $200 per month for everyone over age 60. At the same time, conservatives, including such groups as the American Liberty League, founded in 1934, attacked the New Deal as a threat to states' rights, free enterprise, and the open shop.

      Roosevelt's response in 1935 was to propose greater aid to the underprivileged and extensive reforms. Congress created the Works Progress Administration, which replaced direct relief with work relief; between 1935 and 1941 the WPA employed an annual average of 2,100,000 workers, including artists and writers, who built or improved schools, hospitals, airports, and other facilities by the tens of thousands. The National Youth Administration created part-time jobs for millions of college students, high-school students, and other youngsters. Of long-range significance was the Social Security Act of 1935, which provided federal aid for the aged, retirement annuities, unemployment insurance, aid for persons who were blind or crippled, and aid to dependent children; the original act suffered from various inadequacies, but it was the beginning of a permanent, expanding national program. A tax reform law fell heavily upon corporations and well-to-do people. The National Labor Relations Act, or Wagner Act, gave organized labour federal protection in collective bargaining; it prohibited a number of “unfair practices” on the part of employers and created the strong National Labor Relations Board to enforce the law.

      In the 1936 elections Roosevelt, aided by his reform program, formed a coalition that included liberals, urban ethnics, farmers, trade unionists, and the elderly. He easily defeated the Republican nominee for president, Governor Alfred (“Alf”) M. Landon of Kansas, receiving more than 60 percent of the popular vote and the electoral votes of every state except Maine and Vermont. The Democratic majorities in the House and Senate were also strengthened. Viewing his decisive victory as an electoral mandate for continued reform, Roosevelt sought to neutralize the Supreme Court (Supreme Court of the United States), which in 1935 had invalidated several early New Deal reform measures and now seemed about to strike down the Wagner Act and the Social Security Act. In February 1937 Roosevelt created a furor by proposing a reorganization of the court system that would have included giving him the power to appoint up to six new justices, thus giving the court a liberal majority. Some Democrats and a few liberal Republicans in Congress supported the proposal, but a strong coalition of Republicans and conservative Democrats, backed by much public support, fought the so-called court-packing plan.

      Meanwhile the court itself in a new series of decisions began upholding as constitutional measures involving both state and federal economic regulation. These decisions, which began an extensive revision of constitutional law concerning governmental regulation, made the reorganization plan unnecessary; the Senate defeated it in July 1937 by a vote of 70 to 22. Roosevelt had suffered a stinging political defeat, even though he no longer had to fear the court. Turnover on the court was rapid as older members retired or died; by 1942 all but two of the justices were Roosevelt appointees.

The culmination of the New Deal
      Roosevelt lost further prestige in the summer of 1937, when the nation plunged into a sharp recession. Economists had feared an inflationary boom as industrial production moved up to within 7.5 percent of its 1929 level. Other indices were high except for a lag in capital investment and continued heavy unemployment. Roosevelt, fearing a boom and eager to balance the budget, cut government spending, which most economists felt had brought the recovery. The new Social Security taxes removed an additional $2,000,000,000 from circulation. Between August 1937 and May 1938 the index of production fell from 117 to 76 (on a 1929 base of 100), and unemployment increased by perhaps 4,000,000 persons. Congress voted an emergency appropriation of $5,000,000,000 for work relief and public works, and by June 1938 recovery once more was under way, although unemployment remained higher than before the recession.

      Roosevelt's loss of power became evident in 1938, when his attempts to defeat conservative congressional Democrats in the primaries failed. In the fall Republicans gained 80 seats in the House and seven in the Senate. The Democratic Party retained nominal control of Congress, but conservative Democrats and Republicans voting together defeated many of Roosevelt's proposals. A few last bills slipped through. The U.S. Housing Authority was created in 1937 to provide low-cost public housing. In 1938 the Fair Labor Standards Act established a minimum wage and a maximum work week. Otherwise, the president seldom got what he asked for.

      Apart from the New Deal itself, no development in the 1930s was more important than the rise of organized labour. This too had negative, or at least mixed, effects upon Roosevelt's political power. When the depression struck, only 5 percent of the work force was unionized, compared to 12 percent in 1920. The great change began in 1935 when the American Federation of Labor's Committee for Industrial Organization broke away from its timid parent and, as the Congress of Industrial Organizations (after 1938), began unionizing the mass production industries. The CIO had a unique tool, the sit-down strike. Instead of picketing a plant, CIO strikers closed it down from inside, taking the factory hostage and preventing management from operating with nonunion workers. This, together with the new reluctance of authorities, many of them Roosevelt Democrats, to act against labour, made sit-down strikes highly successful. On Feb. 11, 1937, after a long sit-down strike, General Motors (General Motors Corporation), the country's mightiest corporation, recognized the United Auto Workers (United Automobile Workers). The United States Steel Corporation caved in less than a month later, and by 1941 some 10,500,000 workers were unionized, three times as many as a decade before. The CIO became a mainstay of the New Deal coalition, yet it also aroused great resentment among middle-class Americans, who opposed strikes in general but the CIO's tactics especially. This further narrowed Roosevelt's political base.

An assessment of the New Deal
      The New Deal established federal responsibility for the welfare of the economy and the American people. At the time, conservative critics charged it was bringing statism or even socialism. Left-wing critics of a later generation charged just the reverse—that it bolstered the old order and prevented significant reform. Others suggested that the New Deal was no more than the extension and culmination of progressivism. In its early stages, the New Deal did perhaps begin where progressivism left off and built upon the Hoover program for fighting the depression. But Roosevelt soon took the New Deal well beyond Hoover and progressivism, establishing a precedent for large-scale social programs and for government participation in economic activities. Despite the importance of this growth of federal responsibility, the New Deal's greatest achievement was to restore faith in American democracy at a time when many people believed that the only choice left was between communism and fascism. Its greatest failure was its inability to bring about complete economic recovery. Some economists, notably John Maynard Keynes of Great Britain, were calling for massive deficit spending to promote recovery; and by 1937 the New Deal's own experience proved that pump priming worked, whereas spending cutbacks only hurt the economy. Roosevelt remained unpersuaded, however, and the depression lingered on until U.S. entry into World War II brought full employment.

The road to war
      After World War I most Americans concluded that participating in international affairs had been a mistake. They sought peace through isolation and throughout the 1920s advocated a policy of disarmament and nonintervention. As a result, relations with Latin-American (Latin America, history of) nations improved substantially under Hoover, an anti-imperialist. This enabled Roosevelt to establish what became known as the Good Neighbor Policy, which repudiated altogether the right of intervention in Latin America. By exercising restraint in the region as a whole and by withdrawing American occupation forces from the Caribbean, Roosevelt increased the prestige of the United States in Latin America to its highest level in memory.

      As the European (Europe, history of) situation became more tense, the United States continued to hold to its isolationist policy. Congress, with the approval of Roosevelt and Secretary of State Cordell Hull, enacted a series of neutrality laws that legislated against the factors that supposedly had taken the United States into World War I. As Italy prepared to invade Ethiopia, Congress passed the Neutrality Act of 1935, embargoing shipment of arms to either aggressor or victim. Stronger legislation followed the outbreak of the Spanish Civil War in 1936, in effect penalizing the Spanish government, whose fascist enemies were receiving strong support from Benito Mussolini and Adolf Hitler.

      In the Pacific Roosevelt continued Hoover's policy of nonrecognition of Japan's conquests in Asia. When Japan invaded China in 1937, however, he seemed to begin moving away from isolationism. He did not invoke the Neutrality Act, which had just been revised, and in October he warned that war was like a disease and suggested that it might be desirable for peace-loving nations to “quarantine” aggressor nations. He then quickly denied that his statement had any policy implications, and by December, when Japanese aircraft sank a U.S. gunboat in the Yangtze River, thoughts of reprisal were stifled by public apathy and by Japan's offer of apologies and indemnities. With strong public opposition to foreign intervention, Roosevelt concentrated on regional defense, continuing to build up the navy and signing mutual security agreements with other governments in North and South America.

      When Germany's invasion of Poland in 1939 touched off World War II, Roosevelt called Congress into special session to revise the Neutrality Act to allow belligerents (in reality only Great Britain and France, both on the Allied side) to purchase munitions on a cash-and-carry basis. With the fall of France to Germany in June 1940, Roosevelt, with heavy public support, threw the resources of the United States behind the British. He ordered the War and Navy departments to resupply British divisions that had been rescued at Dunkirk minus their weaponry, and in September he agreed to exchange 50 obsolescent destroyers for 99-year leases on eight British naval and air bases in the Western Hemisphere.

      The question of how much and what type of additional aid should be given to the Allies became a major issue of the election of 1940, in which Roosevelt ran for an unprecedented third term. Public opinion polls, a new influence upon decision makers, showed that most Americans favoured Britain but still wished to stay out of war. Roosevelt's opponent, Wendell Willkie (Willkie, Wendell L.), capitalized on this and rose steadily in the polls by attacking the president as a warmonger. An alarmed Roosevelt fought back, going so far as to make what he knew was an empty promise. “Your boys,” he said just before the election, “are not going to be sent into any foreign wars.” In truth, both candidates realized that U.S. intervention in the war might become essential, contrary to their public statements. Roosevelt won a decisive victory.

      Upon being returned to office, Roosevelt moved quickly to aid the Allies. His Lend-Lease Act (lend-lease), passed in March 1941 after vehement debate, committed the United States to supply the Allies on credit. When Germany, on March 25, extended its war zone to include Iceland and the Denmark Strait, Roosevelt retaliated in April by extending the American Neutrality Patrol to Iceland. In July the United States occupied Iceland, and U.S. naval vessels began escorting convoys of American and Icelandic ships. That summer Lend-Lease was extended to the Soviet Union (Union of Soviet Socialist Republics) after it was invaded by Germany. In August Roosevelt met with the British prime minister, Winston Churchill, off the coast of Newfoundland to issue a set of war aims known as the Atlantic Charter. It called for national self-determination, larger economic opportunities, freedom from fear and want, freedom of the seas, and disarmament.

      Although in retrospect U.S. entry into World War II seems inevitable, in 1941 it was still the subject of great debate. Isolationism was a great political force, and many influential individuals were determined that U.S. aid policy stop short of war. In fact, as late as Aug. 12, 1941, the House of Representatives extended the Selective Training and Service Act of 1940 by a vote of only 203 to 202. Despite isolationist resistance, Roosevelt pushed cautiously forward. In late August the navy added British and Allied ships to its Icelandic convoys. Its orders were to shoot German and Italian warships on sight, thus making the United States an undeclared participant in the Battle of the Atlantic. During October one U.S. destroyer was damaged by a German U-boat and another was sunk. The United States now embarked on an undeclared naval war against Germany, but Roosevelt refrained from asking for a formal declaration of war. According to public opinion polls, a majority of Americans still hoped to remain neutral.

      The war question was soon resolved by events in the Pacific. As much as a distant neutral could, the United States had been supporting China in its war against Japan, yet it continued to sell Japan products and commodities essential to the Japanese war effort. Then, in July 1940, the United States applied an embargo on the sale of aviation gas, lubricants, and prime scrap metal to Japan. When Japanese armies invaded French Indochina in September with the apparent purpose of establishing bases for an attack on the East Indies, the United States struck back by embargoing all types of scrap iron and steel and by extending a loan to China. Japan promptly retaliated by signing a limited treaty of alliance, the Tripartite Pact, with Germany and Italy. Roosevelt extended a much larger loan to China and in December embargoed iron ore, pig iron, and a variety of other products.

      Japan and the United States then entered into complex negotiations in the spring of 1941. Neither country would compromise on the China question, however, Japan refusing to withdraw and the United States insisting upon it. Believing that Japan intended to attack the East Indies, the United States stopped exporting oil to Japan at the end of the summer. In effect an ultimatum, since Japan had limited oil stocks and no alternative source of supply, the oil embargo confirmed Japan's decision to eliminate the U.S. Pacific Fleet and to conquer Southeast Asia, thereby becoming self-sufficient in crude oil and other vital resources. By the end of November Roosevelt and his military advisers knew (through intercepted Japanese messages) that a military attack was likely; they expected it to be against the East Indies or the Philippines. To their astonishment, on December 7 Japan directed its first blow against naval and air installations in Hawaii. In a bold surprise attack, Japanese aircraft destroyed or damaged 18 ships of war at Pearl Harbor (Pearl Harbor attack), including the entire battleship force, and 347 planes. Total U.S. casualties amounted to 2,403 dead and 1,178 wounded.

 On Dec. 8, 1941, Congress with only one dissenting vote declared war against Japan. Three days later Germany and Italy declared war against the United States; and Congress, voting unanimously, reciprocated. As a result of the attack on Pearl Harbor, the previously divided nation entered into the global struggle with virtual unanimity.

The United States at war
      Although isolationism died at Pearl Harbor, its legacy of unpreparedness lived on. Anticipating war, Roosevelt and his advisers had been able to develop and execute some plans for military expansion, but public opinion prohibited large-scale appropriations for armament and defense. Thus, when Pearl Harbor was attacked, the United States had some 2,200,000 men under arms, but most were ill-trained and poorly equipped. Barely a handful of army divisions even approached a state of readiness. The Army Air Corps possessed only 1,100 combat planes, many of which were outdated. The navy was better prepared, but it was too small to fight a two-ocean war and had barely been able to provide enough ships for convoy duty in the North Atlantic. Eventually more than 15,000,000 men and women would serve in the armed forces, but not until 1943 would the United States be strong enough to undertake large-scale offensive operations. (For U.S. military involvement in World War II, see the article World War II.)

War production
      Roosevelt had begun establishing mobilization agencies in 1939, but none had sufficient power or authority to bring order out of the chaos generated as industry converted to war production. He therefore created the War Production Board in January 1942 to coordinate mobilization, and in 1943 an Office of War Mobilization was established to supervise the host of defense agencies that had sprung up in Washington, D.C. Gradually, a priorities system was devised to supply defense plants with raw materials; a synthetic rubber industry was developed from scratch; rationing conserved scarce resources; and the Office of Price Administration kept inflation under control.

      After initial snarls and never-ending disputes, by the beginning of 1944 production was reaching astronomical totals—double those of all the enemy countries combined. Hailed at the time as a production miracle, this increase was about equal to what the country would have produced in peacetime, assuming full employment. War production might have risen even higher if regulation of civilian consumption and industry had been stricter.

      Scientists, under the direction of the Office of Scientific Research and Development, played a more important role in production than in any previous war, making gains in rocketry, radar and sonar, and other areas. Among the new inventions was the proximity fuse (proximity fuze), which contained a tiny radio that detonated an artillery shell in the vicinity of its target, making a direct hit unnecessary. Of greatest importance was the atomic bomb, developed by scientists in secrecy and first tested on July 16, 1945.

Financing the war (war finance)
      The total cost of the war to the federal government between 1941 and 1945 was about $321,000,000,000 (10 times as much as World War I). Taxes paid 41 percent of the cost, less than Roosevelt requested but more than the World War I figure of 33 percent. The remainder was financed by borrowing from financial institutions, an expensive method but one that Congress preferred over the alternatives of raising taxes even higher or making war bond purchases compulsory. In consequence the national debt increased fivefold, amounting to $259,000,000,000 in 1945. The Revenue Act of 1942 revolutionized the tax structure by increasing the number who paid income taxes from 13,000,000 to 50,000,000. At the same time, through taxes on excess profits and other sources of income, the rich were made to bear a larger part of the burden, making this the only period in modern history when wealth was significantly redistributed.

Social consequences of the war
      Despite the vast number of men and women in uniform, civilian employment rose from 46,000,000 in 1940 to more than 53,000,000 in 1945. The pool of unemployed men dried up in 1943, and further employment increases consisted of women, minorities, and over- or underage males. These were not enough to meet all needs, and by the end of the year a manpower shortage had developed.

      One result of this shortage was that blacks made significant social and economic progress. Although the armed forces continued to practice segregation, as did Red Cross blood banks, Roosevelt, under pressure from blacks, who were outraged by the refusal of defense industries to integrate their labour forces, signed Executive Order 8802 on June 25, 1941. It prohibited racial discrimination in job training programs and by defense contractors and established a Fair Employment Practices Committee to ensure compliance. By the end of 1944 nearly 2,000,000 blacks were at work in defense industries. As black contributions to the military and industry increased, so did their demands for equality. This sometimes led to racial hostilities, as on June 20, 1943, when mobs of whites invaded the black section of Detroit. Nevertheless, the gains offset the losses. Lynching virtually died out, several states outlawed discriminatory voting practices, and others adopted fair employment laws.

      Full employment also resulted in raised income levels, which, through a mixture of price and wage controls (wage-price control), were kept ahead of inflation. Despite both this increase in income and a no-strike pledge given by trade union leaders after Pearl Harbor, there were numerous labour actions. Workers resented wage ceilings because much of their increased income went to pay taxes and was earned by working overtime rather than through higher hourly rates. In consequence, there were almost 15,000 labour stoppages during the war at a cost of some 36,000,000 man-days. Strikes were greatly resented, particularly by the armed forces, but their effects were more symbolic than harmful. The time lost amounted to only one-ninth of 1 percent of all hours worked.

      Because Pearl Harbor had united the nation, few people were prosecuted for disloyalty or sedition, unlike during World War I. The one glaring exception to this policy was the scandalous treatment of Japanese immigrants and American citizens of Japanese descent (Nisei). In 1942, on the basis of groundless racial fears and suspicions, virtually the entire Japanese-American population of the West Coast, amounting to 110,000 persons, was rounded up and imprisoned in “relocation” centres, which the inmates regarded as concentration camps. The Japanese-Americans lost their liberty, and in most cases their property as well, despite the fact that the Federal Bureau of Investigation, which had already arrested those individuals it considered security risks, had verified their loyalty.

The 1944 election
      Roosevelt soundly defeated Governor Thomas E. Dewey of New York in the 1944 election, but his margin of victory was smaller than it had been previously. His running mate, chosen by party leaders who disliked former vice president Henry A. Wallace for his extreme liberalism, was Senator Harry S. Truman (Truman, Harry S.) of Missouri, a party Democrat who had distinguished himself by investigating fraud and waste among war contractors.

The new U.S. role in world affairs
      The U.S. entry into World War II had brought an end to isolation, and President Roosevelt was determined to prevent a retreat into isolationism once the war was over. After a series of conferences beginning in December 1941, Roosevelt and Prime Minister Churchill (Churchill, Sir Winston) announced on January 1, 1942, the formation of the United Nations, a wartime alliance of 26 nations. In 1943 Roosevelt began planning the organization of a postwar United Nations, meeting with congressional leaders to assure bipartisan support. The public supported Roosevelt's efforts, and that fall Congress passed resolutions committing the United States to membership in an international body “with power adequate to establish and to maintain a just and lasting peace.” Finally, in the spring of 1945, delegates from 50 nations signed the charter for a permanent United Nations. In addition to political harmony, Roosevelt promoted economic cooperation, and, with his full support, in 1944 the World Bank and the International Monetary Fund were created to bar a return of the cutthroat economic nationalism that had prevailed before the war.

      Throughout the war Roosevelt met with Churchill and Stalin (Stalin, Joseph) to plan military strategy and postwar policy. His last great conference with them took place at Yalta (Yalta Conference) in the Crimea in February 1945. There policies were agreed upon to enforce the unconditional surrender of Germany, to divide it into zones for occupation and policing by the respective Allied forces, and to provide democratic regimes in eastern European nations. A series of secret agreements were also made at Yalta; chief among these was the Soviet pledge to enter the war against Japan after the German surrender, in return for concessions in East Asia.

 Roosevelt (Roosevelt, Franklin D.) died suddenly of a cerebral hemorrhage on April 12 and was succeeded by Truman. In the following months the German armed forces collapsed, and on May 7 all German forces surrendered. In the Pacific the invasions of Iwo Jima and Okinawa in early 1945 brought Japan under a state of siege. In the summer, before an invasion could take place, the United States dropped atomic bombs on Hiroshima and Nagasaki. On September 2 the surrender of Japan was signed in Tokyo Bay on the battleship Missouri.

Frank Freidel William L. O'Neill

The United States since 1945
The peak Cold War years, 1945–60

The Truman Doctrine and containment
 Truman, who had been chosen as vice president for domestic political reasons, was poorly prepared to assume the presidency. He had no experience of foreign affairs, knew little about Roosevelt's intentions, and was daunted by the prospect of filling Roosevelt's shoes. His first decisions were dictated by events or plans already laid. In July, two months after the German forces surrendered, he met at Potsdam (Potsdam Conference), Germany, with Stalin and Churchill (who was succeeded at the conference by Clement Attlee) to discuss future operations against Japan and a peace settlement for Europe. Little was accomplished, and there would not be another meeting between Soviet and American heads of state for 10 years.

      Hopes that good relations between the superpowers would ensure world peace soon faded as a result of the Stalinization of eastern Europe and Soviet support of communist insurgencies in various parts of the globe. Events came to a head in 1947 when Britain, weakened by a failing economy, decided to pull out of the eastern Mediterranean. This would leave both Greece (Greece, history of), where a communist-inspired civil war was raging, and Turkey to the mercies of the Soviet Union. Truman now came into his own as a national leader, asking Congress to appropriate aid to Greece and Turkey and asserting, in effect, that henceforth the United States must help free peoples in general to resist communist aggression. This policy, known as the Truman Doctrine, has been criticized for committing the United States to the support of unworthy regimes and for taking on greater burdens than it was safe to assume. At first, however, the Truman Doctrine was narrowly applied. Congress appropriated $400,000,000 for Greece and Turkey, saving both from falling into unfriendly hands, and thereafter the United States relied mainly on economic assistance to support its foreign policy.

      The keystone of this policy, and its greatest success, was the European Recovery Program, usually called the Marshall Plan. Europe's economy had failed to recover after the war, its paralysis being worsened by the exceptionally severe winter of 1946–47. Thus, in June 1947 Secretary of State George C. Marshall (Marshall, George Catlett) proposed the greatest foreign-aid program in world history in order to bring Europe back to economic health. In 1948 Congress created the Economic Cooperation Administration and over the next five years poured some $13,000,000,000 worth of aid into western Europe. (Assistance was offered to Eastern-bloc countries also, but they were forced by Stalin to decline.) The plan restored economic vitality and confidence to the region, while undermining the local communist parties. In 1949 Truman proposed extending similar aid to underdeveloped nations throughout the world, but the resulting Point Four Program was less successful than the Marshall Plan. Experience showed that it was easier to rebuild a modern industrial economy than to develop one from scratch.

 U.S. policy for limiting Soviet expansion had developed with remarkable speed. Soon after the collapse of hopes for world peace in 1945 and 1946, the Truman administration had accepted the danger posed by Soviet aggression and resolved to shore up noncommunist defenses at their most critical points. This policy, known as containment, a term suggested by its principal framer, George Kennan (Kennan, George F.), resulted in the Truman Doctrine and the Marshall Plan, as well as in the decision to make the western zones of Germany (later West Germany) a pillar of strength. When the Soviet Union countered this development in June 1948 by blocking (Berlin blockade and airlift) all surface routes into the western-occupied zones of Berlin, Britain and the United States supplied the sectors by air for almost a year until the Soviet Union called off the blockade. A logical culmination of U.S. policy was the creation in 1949 of the North Atlantic Treaty Organization (NATO), a military alliance among 12 (later 16) nations to resist Soviet aggression.

      Containment worked less well in Asia. In December 1945 Truman sent General Marshall to China with instructions to work out an agreement between the Communist rebels and the Nationalist government of Chiang Kai-shek. This was an impossible task, and in the subsequent fighting Mao Zedong's Communist forces prevailed. The Nationalist government fled to Taiwan in 1949, and the United States then decided to concentrate its East Asian policy upon strengthening occupied Japan, with much better results.

Postwar domestic reorganization
      After the end of World War II the vast U.S. military establishment was dismantled, its strength falling from 12,000,000 men and women to about 1,500,000 in 1947. The navy and army air forces remained the world's strongest, however, and the U.S. monopoly of atomic weapons seemed to ensure security. In 1946 the United States formed an Atomic Energy Commission for purposes of research and development. The armed forces were reorganized under a secretary of defense by the National Security Act of 1947, which also created the U.S. Air Force (United States Air Force, The) as an independent service. In 1949 the services were brought together in a single Department of Defense, though each retained considerable autonomy. In that same year the Soviet Union exploded its own atomic device, opening an era of intense nuclear, and soon thermonuclear, competition.

      Peace brought with it new fears. Demobilizing the armed forces might result in massive unemployment and another depression. Or, conversely, the huge savings accumulated during the war could promote runaway inflation. The first anxiety proved groundless, even though government did little to ease the transition to a peacetime economy. War contracts were canceled, war agencies diminished or dissolved, and government-owned war plants sold to private parties. But, after laying off defense workers, manufacturers rapidly tooled up and began producing consumer goods in volume. The housing industry grew too, despite shortages of every kind, thanks to mass construction techniques pioneered by the firm of Levitt and Sons, Inc., and other developers. All this activity created millions of new jobs. The Servicemen's Readjustment Act of 1944, known as the G.I. Bill of Rights, also helped ease military personnel back into civilian life. It provided veterans with loans, educational subsidies, and other benefits.

      Inflation was more troublesome. Congress lacked enthusiasm for wartime price controls and in June 1946 passed a bill preserving only limited controls. Truman vetoed the bill as inadequate, controls expired, and prices immediately soared. Congress then passed an even weaker price-control bill, which Truman signed. Nevertheless, by the end of the year, most price and wage controls (wage-price control) had been lifted. In December the Office of Price Administration began to close down. As a result the consumer price index did not stabilize until 1948, when prices were more than a third above the 1945 level, while wage and salary income had risen by only about 15 percent.

      Truman's difficulties with Congress had begun in September 1945 when he submitted a 21-point domestic program, including proposals for an expansion of social security and public housing and for the establishment of a permanent Fair Employment Practices Act banning discrimination. These and subsequent liberal initiatives, later known as the Fair Deal, were rejected by Congress, which passed only the Employment Act of 1946. This clearly stated the government's responsibility for maintaining full employment and established a Council of Economic Advisers to advise the president.

      Truman's relations with Congress worsened after the 1946 elections. Voters, who were angered by the price-control debacle, a wave of strikes, and Truman's seeming inability to lead or govern, gave control of both houses of Congress to Republicans for the first time since 1928. The president and the extremely conservative 80th Congress battled from beginning to end, not over foreign policy, where bipartisanship prevailed, but over domestic matters. Congress passed two tax reductions over Truman's vetoes and in 1947, again over Truman's veto, passed the Taft–Hartley Act, which restricted unions while extending the rights of management. Congress also rejected various liberal measures submitted by Truman, who did not expect the proposals to pass but wanted Congress on record as having opposed important social legislation.

  By 1948, Truman had won support for his foreign policy, but he was expected to lose the presidential election that year because of his poor domestic record. Polls showed him lagging behind Dewey, again the Republican nominee, and to make matters worse the Democratic Party splintered. Former vice president Henry A. Wallace headed the Progressive Party ticket, which pledged to improve Soviet-American relations whatever the cost. Southerners, known as Dixiecrats (Dixiecrat), who were alienated by the Democratic Party's strong civil rights plank, formed the States' Rights Democratic Party and nominated Governor Strom Thurmond of South Carolina for president. These defections appeared to ensure Truman's defeat. Instead Truman won handily, receiving almost as many votes as his opponents combined. His support came largely from labour, which was upset by the Republican passage of the Taft–Hartley Act, from blacks, who strongly supported the Democrats' civil rights provisions, and from farmers, who preferred the higher agricultural subsidies promised by the Democrats, especially at a time when commodity prices were falling.

      The Democrats regained control of Congress in 1948, but Truman's relations with that body continued to be troubled. In January 1949 he asked for a broad range of Fair Deal measures, with uneven results. Congress did approve a higher minimum wage, the extension of social security to 10,000,000 additional persons, more public works, larger sums for the TVA and for rural electrification, and the Housing Act of 1949, which authorized construction of 810,000 units for low-income families. Truman failed, however, to persuade Congress to repeal Taft–Hartley, to reform the agricultural subsidy system, to secure federal aid to education, to adopt his civil rights program, or, most importantly, to accept his proposal for national health insurance. He succeeded nevertheless in protecting the New Deal principle of federal responsibility for social welfare, and he helped form the Democratic agenda for the 1960s.

The Red Scare
      Truman's last years in office were marred by charges that his administration was lax about, or even condoned, subversion and disloyalty and that communists, called “reds,” had infiltrated the government. These accusations were made despite Truman's strongly anticommunist foreign policy and his creation, in 1947, of an elaborate Federal Employee Loyalty Program, which resulted in hundreds of federal workers being fired and in several thousand more being forced to resign.

 The excessive fear of communist subversion was fed by numerous sources. China's fall to communism and the announcement of a Soviet atomic explosion in 1949 alarmed many, and fighting between communist and U.S.-supported factions in Korea heightened political emotions as well. Real cases of disloyalty and espionage also contributed, notably the theft of atomic secrets, for which Soviet agent Julius Rosenberg and his wife Ethel were convicted in 1951 and executed two years later. Republicans had much to gain from exploiting these and related issues.

 Senator Joseph R. McCarthy (McCarthy, Joseph R.) of Wisconsin stood out among those who held that the Roosevelt and Truman administrations amounted to “20 years of treason.” In February 1950 McCarthy claimed that he had a list (whose number varied) of State Department employees who were loyal only to the Soviet Union. McCarthy offered no evidence to support his charges and revealed only a single name, that of Owen Lattimore (Lattimore, Owen), who was not in the State Department and would never be convicted of a single offense. Nevertheless, McCarthy enjoyed a highly successful career, and won a large personal following, by making charges of disloyalty that, though mostly undocumented, badly hurt the Democrats. Many others promoted the scare in various ways, leading to few convictions but much loss of employment by government employees, teachers, scholars, and people in the mass media.

      On June 25, 1950, a powerful invading force from the Soviet-supported Democratic People's Republic of Korea (Korean War) (North Korea) swept south of the 38th parallel into the Republic of Korea (Korean War) (South Korea). Within days, President Truman resolved to defend South Korea, even though there were few Americans in Korea and few troops ready for combat. The UN Security Council, acting during a Soviet boycott, quickly passed a resolution calling upon UN members to resist North Korean aggression.

  After almost being driven into the sea, UN forces, made up largely of U.S. troops and commanded by U.S. General Douglas MacArthur (MacArthur, Douglas), counterattacked successfully and in September pushed the North Korean forces back across the border. Not content with this victory, the United States attempted to unify Korea by force, advancing almost to the borders of China and the Soviet Union. China, after its warnings were ignored, then entered the war, driving the UN forces back into South Korea. The battle line was soon stabilized along the 38th parallel, and armistice talks began on July 10, 1951, three months after Truman had relieved MacArthur for openly challenging U.S. policies. The talks dragged on fruitlessly, interrupted by outbreaks of fighting, until Eisenhower became president. The United States sustained some 142,000 casualties in this limited war, most of them occurring after China's entry.

      In addition to militarizing the Cold War, the Korean conflict widened its field. The United States assumed responsibility for protecting Taiwan against invasion from mainland China. Additional military aid was extended to the French in Indochina. In December 1950 Truman called for a crash program of rearmament, not just to support the forces in Korea but especially to expand the U.S. presence in Europe (Europe, history of). As a result defense expenditures rose to $53,600,000,000 in 1953, four times the pre-Korean level, and would decline only modestly after the armistice.

      The stalemated Korean War, a renewal of inflation, and the continuing Red Scare persuaded Truman not to stand for reelection in 1952 and also gravely handicapped Governor Adlai E. Stevenson of Illinois, the Democratic nominee. His opponent, General Dwight D. Eisenhower (Eisenhower, Dwight D.), was an immensely popular war hero with great personal charm and no political record, making him extremely hard to attack. Although he disliked their methods, Eisenhower allowed Republican campaigners, including his running mate, Senator Richard M. Nixon (Nixon, Richard M.) of California, to capitalize on the Red Scare by accusing the Truman administration of disloyalty. Eisenhower himself charged the administration with responsibility for the communist invasion of Korea and won wide acclaim when he dramatically promised that if elected he would visit Korea in person to end the war.

 Eisenhower won over many farmers, ethnic whites, workers, and Roman Catholics who had previously voted Democratic. He defeated Stevenson by a large margin, carrying 39 states, including three in the once solidly Democratic South. Despite Eisenhower's overwhelming victory, Republicans gained control of the House by just eight votes and managed only a tie in the Senate. Because the Republican margin was so slight, and because many right-wing Republicans in Congress disagreed with his policies, Eisenhower would increasingly depend upon Democrats to realize his objectives.

      Eisenhower had promised to end the Korean War, hold the line on government spending, balance the budget, abolish inflation, and reform the Republican Party. On July 27, 1953, an armistice was signed in Korea freezing the status quo. By cutting defense spending while taxes remained fairly high, and by keeping a tight rein on credit, Eisenhower was able to avoid serious deficits, abolish inflation, and, despite several small recessions, encourage steady economic growth that made Americans more prosperous than they had ever been before. Eisenhower also supported public works and a modest expansion of government social programs. In 1954 the St. Lawrence Seaway Development Corporation was established by Congress. In 1956 Congress authorized the National System of Interstate and Defense Highways, Eisenhower's pet project and the largest public works program in history. Amendments to the Social Security Act in 1954 and 1956 extended benefits to millions not previously covered. Thus, Eisenhower achieved all but the last of his goals, and even in that he was at least partially successful.

      At first Eisenhower did little to check the Red Scare, but in 1954 Senator McCarthy (McCarthy, Joseph R.) unwisely began to investigate the administration and the U.S. Army. This led to a full-scale investigation of McCarthy's own activities, and on December 2 the Senate, with Eisenhower playing a behind-the-scenes role, formally censured McCarthy for abusing his colleagues. McCarthy soon lost all influence, and his fall did much to remove the poison that had infected American politics. In short, Eisenhower was so successful in restoring tranquillity that, by the end of his first term, some people were complaining that life had become too dull.

      Tensions eased in foreign affairs as well. On March 5, 1953, Joseph Stalin died, opening the door to better relations with the Soviet Union. In 1955 the Soviets agreed to end the four-power occupation of Austria, and in that July Eisenhower met in Geneva with the new Soviet leader, Nikita S. Khrushchev (Khrushchev, Nikita Sergeyevich), for talks that were friendly though inconclusive.

      As for military policy, Eisenhower instituted the “New Look,” which entailed reducing the army from 1,500,000 men in 1953 to 900,000 in 1960. The navy experienced smaller reductions, while air force expenditures rose. Eisenhower was primarily interested in deterring a nuclear attack and to that end promoted expensive developments in nuclear weaponry and long-range missiles.

Eisenhower's second term
 Despite suffering a heart attack in 1955 and a case of ileitis that required surgery the next year, Eisenhower stood for reelection in 1956. His opponent was once again Stevenson. Two world crises dominated the campaign. On October 23, Hungarians revolted against communist rule, an uprising that was swiftly crushed by Red Army tanks. On October 29, Israel invaded Egypt, supported by British and French forces looking to regain control of the Suez Canal (Suez Crisis) and, perhaps, to destroy Egypt's president, Gamal Abdel Nasser, who had nationalized the canal in July. Eisenhower handled both crises deftly, forcing the invaders to withdraw from Egypt and preventing events in Hungary from triggering a confrontation between the superpowers. Owing in part to these crises, Eisenhower carried all but seven states in the election. It was a purely personal victory, however, for the Democrats retained control of both houses of Congress.

Domestic issues
      Although the Eisenhower administration can, in general, be characterized as a period of growth and prosperity, some problems did begin to arise during the second term. In 1957–58 an economic recession hit and unemployment rose to its highest level since 1941. Labour problems increased in intensity, with some 500,000 steelworkers going on strike for 116 days in 1959. There was even evidence of corruption on the Eisenhower staff. The president remained personally popular, but public discontent was demonstrated in the large majorities gained by the Democrats in the congressional elections of 1958.

      Problems associated with postwar population trends also began to be recognized. The U.S. population, which had grown markedly throughout the 1950s, passed 179,000,000 in 1960. Growth was concentrated in the West, and the country became increasingly urbanized as the middle class moved from the cities to new suburban developments. The migration eroded the cities' tax base while leaving them responsible for an increasing number of poor residents. It also resulted in a huge increase in commuters, which in turn led to continuing problems of traffic and pollution.

  During Eisenhower's second term, race became a central national concern for the first time since Reconstruction. Some civil rights advances had been made in previous years. In 1954 the Supreme Court (Supreme Court of the United States) had ruled that racially segregated schools were unconstitutional. The decision provoked intense resistance in the South but was followed by a chain of rulings and orders that continually narrowed the right to discriminate. In 1955 Martin Luther King, Jr. (King, Martin Luther, Jr.), led a boycott of segregated buses in Montgomery, Alabama, giving rise to the nonviolent civil rights movement. But neither the president nor Congress became involved in the race issue until 1957, when the segregationist governor of Arkansas blocked the integration of a high school in Little Rock. Eisenhower then sent federal troops to enforce the court's order for integration. Congress was similarly prompted to pass the first civil rights law in 82 years, the Civil Rights Act of 1957, which made a serious effort to protect black voters.

 On October 4, 1957, the Soviet Union orbited the first artificial satellite (Sputnik), arousing fears that the United States was falling behind the Soviets technologically. This prompted Eisenhower, who generally held the line on spending, to sign the National Defense Education Act of 1958, which provided extensive aid to schools and students in order to bring American education up to what were regarded as Soviet levels of achievement. The event also strengthened demands for the acceleration of the arms and space races, which eventually led to the U.S. Moon landing on July 20, 1969, and to a remarkable expansion of scientific knowledge. In 1958, threatened and actual conflicts between governments friendly to Western powers and unfriendly or communist forces in Lebanon, the islands of Quemoy and Matsu off the coast of China, Berlin, and Cuba caused additional concern. Only a minority believed that the United States was still ahead in military and space technology, though in fact this was true.

      The illness of Secretary of State John Foster Dulles in March 1959, and his subsequent resignation, led the president to increase his own activity in foreign affairs. He now traveled more and met more often with heads of state. The most important meeting was to be a summit in 1960 with Khrushchev (Khrushchev, Nikita Sergeyevich) and Western leaders to discuss such matters as Berlin, German reunification, and arms control. But two weeks before the scheduled date an American U-2 spy plane was shot down deep inside the Soviet Union. Wrangling over this incident destroyed both the Paris summit and any hopes of bettering U.S.-Soviet relations.

An assessment of the postwar era
 Despite great differences in style and emphasis, the administrations of Truman and Eisenhower were notable for their continuity. Both were essentially periods of reconstruction. After 15 years of depression and war, people were not interested in social reform but in rebuilding and expanding the educational and transportation systems, achieving stable economic growth, and, in the case of the younger generation whose lives had been most disrupted by World War II, in marrying and having children. Thus, the postwar era was the age of the housing boom, the television boom, and the baby boom, of high birth and comparatively low divorce rates, of proliferating suburbs and a self-conscious emphasis upon family “togetherness.” Though frustrating to social reformers, this was probably a necessary phase of development. Once the country had been physically rebuilt, the practical needs of a rapidly growing population had been met, and standards of living had risen, there would come another age of reform.

 The arrival of this new age was indicated in 1960 by the comparative youth of the presidential candidates chosen by the two major parties. The Democratic nominee, Senator John F. Kennedy (Kennedy, John F.) of Massachusetts, was 43; the Republican, Vice President Nixon, was 47. They both were ardent cold warriors and political moderates. Kennedy's relative inexperience and his religion (he was the first Roman Catholic presidential nominee since Al Smith) placed him at an initial disadvantage. But the favourable impression he created during a series of televised debates with Nixon and the support he received from blacks after he helped the imprisoned black leader Martin Luther King, Jr., enabled him to defeat Nixon in a closely contested election.

Edgar Eugene Robinson William L. O'Neill

The Kennedy and Johnson administrations

The New Frontier
  During the campaign Kennedy had stated that America was “on the edge of a New Frontier”; in his inaugural speech he spoke of “a new generation of Americans”; and during his presidency he seemed to be taking government in a new direction, away from the easygoing Eisenhower style. His administration was staffed by strong, dedicated personalities. The Kennedy staff was also predominantly young. Its energy and commitment revitalized the nation, but its competence was soon called into question.

      In April 1961 Kennedy authorized a plan that had been initiated under Eisenhower for a covert invasion of Cuba to overthrow the newly installed, Soviet-supported Communist regime of Fidel Castro. The invasion was repulsed at the Bay of Pigs (Bay of Pigs invasion), embarrassing the administration and worsening relations between the United States and the Soviet Union. These deteriorated further at a private meeting between Kennedy and Khrushchev in June 1961 when the Soviet leader was perceived as attempting to bully his young American counterpart. Relations hit bottom in October 1962 when the Soviets secretly began to install long-range offensive missiles in Cuba (Cuban missile crisis), which threatened to tip the balance of nuclear power. Kennedy forced the removal of the missiles, gaining back the status he had lost at the Bay of Pigs and in his meeting with Khrushchev. Kennedy then began to work toward improving international relations, and in July 1963 he concluded a treaty with Britain and the Soviet Union banning atomic tests in the atmosphere and underwater. His program of aid to Latin America (Latin America, history of), the Alliance for Progress, raised inter-American relations to their highest level since the days of Franklin Roosevelt.

      Kennedy's domestic policies were designed to stimulate international trade, reduce unemployment, provide medical care for the aged, reduce federal income taxes, and protect the civil rights of blacks. The latter issue, which had aroused national concern in 1962 when federal troops were employed to assure the admission of a black student at the University of Mississippi (Mississippi, University of), caused further concern in 1963, when similar action was taken at the University of Alabama (Alabama, University of) and mass demonstrations were held in support of desegregation. Although the Democrats controlled both houses of Congress, the administration's proposals usually encountered strong opposition from a coalition of Republicans and Southern Democrats. With Congress's support, Kennedy was able to increase military spending substantially. This led to greater readiness but also to a significant rise in the number of long-range U.S. missiles, which prompted a similar Soviet response.

 On November 22, 1963, President Kennedy was assassinated in Dallas, Texas, most probably by a lone gunman, though conspiracy theories abounded. Vice President Lyndon B. Johnson (Johnson, Lyndon B.) took the oath of office immediately.

 Johnson's first job in office was to secure enactment of New Frontier bills that had been languishing in Congress. By far the most important of these was the Civil Rights Act of 1964, which Johnson pushed through despite a filibuster by Southern senators that lasted 57 days. The act provided machinery to secure equal access to accommodations, to prevent discrimination in employment by federal contractors, and to cut off funds to segregated school districts. It also authorized the Justice Department to take a more active role in civil rights cases. Johnson went beyond the New Frontier in 1964 by declaring war on poverty. His Economic Opportunity Act provided funds for vocational training, created a Job Corps to train youths in conservation camps and urban centres, encouraged community action programs, extended loans to small businessmen and farmers, and established a domestic peace corps, the counterpart of a popular foreign program created by President Kennedy.

      Johnson provided dynamic and successful leadership at a time of national trauma, and in the election of 1964 he won a landslide victory over his Republican opponent, the conservative senator Barry Goldwater of Arizona. More importantly, the Democrats gained 38 seats in the House of Representatives that year, enough to override the conservative bloc and enact a body of liberal social legislation.

 With this clear mandate, Johnson submitted the most sweeping legislative program to Congress since the New Deal. He outlined his plan for achieving a “Great Society” in his 1965 State of the Union address, and over the next two years he persuaded Congress to approve most of his proposals. The Appalachian Regional Development Act provided aid for that economically depressed area. The Housing and Urban Development Act of 1965 established a Cabinet-level department to coordinate federal housing programs. Johnson's Medicare (Medicare and Medicaid) bill fulfilled President Truman's dream of providing health care for the aged. The Elementary and Secondary Education Act of 1965 provided federal funding for public and private education below the college level. The Higher Education Act of 1965 provided scholarships for more than 140,000 needy students and authorized a National Teachers Corps. The Immigration Act of 1965 abolished the discriminatory national-origins quota system. The minimum wage was raised and its coverage extended in 1966. In 1967, social security pensions were raised and coverage expanded. The Demonstration Cities and Metropolitan Area Redevelopment Act of 1966 provided aid to cities rebuilding blighted areas. Other measures dealt with mass transit, truth in packaging and lending, beautification, conservation, water and air quality, safety, and support for the arts.

Race relations
      The civil rights revolution (civil rights movement) came to a head under the Johnson administration. Despite the Civil Rights Act of 1964, most Southern blacks found it difficult to exercise their voting rights. In 1965, mass demonstrations were held to protest the violence and other means used to prevent black voter registration. After a peaceful protest march at Selma, Alabama, was violently broken up by white authorities, Johnson responded with the Voting Rights Act of 1965, which abolished literacy tests and other voter restrictions and authorized federal intervention against voter discrimination. The subsequent rise in black voter registration transformed politics in the South.

 Despite these gains, many blacks (black nationalism) remained dissatisfied by the slow progress. The nonviolent civil rights movement was challenged by “black power” advocates, who expelled or alienated whites and crippled the movement. Race riots broke out in most of the nation's large cities, notably in 1965 in the Watts district of Los Angeles, leaving 34 dead, and two years later in Newark and Detroit. Four summers of violence resulted in many deaths and property losses that left whole neighborhoods ruined and their residents more distressed than ever. After a final round provoked by the assassination of Martin Luther King, Jr., in April 1968, the rioting abated.

Social changes
 The 1960s were marked by the greatest changes in morals and manners since the 1920s. Young people, college students in particular, rebelled against what they viewed as the repressed, conformist society of their parents. They advocated a sexual revolution, aided by the birth control pill and later by Roe v. Wade (1973), a Supreme Court ruling that legalized abortion. “Recreational” drugs such as marijuana and LSD were increasingly used. Opposition to U.S. involvement in Vietnam promoted the rise of a New Left, which was anticapitalist as well as antiwar. A “counterculture” sprang up that legitimized radical standards of taste and behaviour in the arts as well as in life. Feminism (women's movement) was reborn and joined the ranks of radical causes.

 Except for feminism, most organized expressions of the counterculture and the New Left did not long survive the sixties. Nevertheless they changed American life. Drug taking, previously confined largely to ghettos, became part of middle-class life. The sexual revolution reduced government censorship, changed attitudes toward traditional sexual roles, and enabled homosexuals (gay rights movement) to organize and acknowledge their identities as never before. Unrestrained individualism played havoc with family values. People began marrying later and having fewer children. The divorce rate accelerated to the point that the number of divorces per year was roughly half the number of marriages. The number of abortions rose, as did the illegitimacy rate. By the 1980s one in six families was headed by a single woman, and over half of all people living in poverty, including some 12,000,000 children, belonged to such families. Because inflation and recession made it hard to support even intact families on a single income, a majority of mothers entered the work force. Thus the stable, family-oriented society of the 1950s became a thing of the past.

The Vietnam War
 U.S. involvement in Vietnam dated to the Truman administration, when economic and military aid was provided to deter a communist takeover of French Indochina. When France withdrew and Vietnam was divided in two in 1954, the United States continued to support anticommunist forces in South Vietnam. By 1964, communist insurgents were winning their struggle against the government of South Vietnam, which a decade of American aid had failed to strengthen or reform. In August, following an allegedly unprovoked attack on U.S. warships patrolling the Gulf of Tonkin (Gulf of Tonkin Resolution), a resolution pledging complete support for American action in Vietnam was passed unanimously in the House of Representatives and with only two dissenting votes in the Senate.

      After the fall elections, Johnson began deploying a huge force in Vietnam (more than half a million troops in 1968, together with strong air and naval units). This power was directed not only against the Viet Cong insurgents but also against North Vietnam, which increased its efforts as American participation escalated. Despite massive U.S. bombing of North Vietnam, the communists refused to yield. On January 30, 1968, disregarding a truce called for the Tet (lunar new year) holiday, the communists launched an offensive against every major urban area in South Vietnam. Although the Tet Offensive was a military failure, it proved to be a political victory for the communists because it persuaded many Americans that the war could not be ended at a bearable price. Opposition to U.S. involvement became the major issue of the 1968 election. After Senator Eugene McCarthy (McCarthy, Eugene J.), a leading critic of the war, ran strongly against him in the New Hampshire primary, Johnson announced that he would not seek or accept renomination. He also curtailed bombing operations, opened peace talks with the North Vietnamese, and on November 1 ended the bombing of North Vietnam.

      While war efforts were being reduced, violence within the United States seemed to be growing. Just two months after King's assassination, Senator Robert F. Kennedy (Kennedy, Robert F.), a leading contender for the Democratic presidential nomination, was assassinated. President Johnson then secured the nomination of Vice President Hubert H. Humphrey (Humphrey, Hubert H.) at the Democratic National Convention in Chicago, where violence again erupted as antiwar demonstrators were manhandled by local police. Humphrey lost the election to the Republican nominee, former vice president Richard Nixon. The narrowness of Nixon's margin resulted from a third-party campaign by the former governor of Alabama, George Wallace (Wallace, George C.), who attracted conservative votes that would otherwise have gone to Nixon. Democrats retained large majorities in both houses of Congress.

The 1970s

The Richard M. Nixon (Nixon, Richard M.) administration

Foreign affairs
 Nixon and his national security adviser, Henry Kissinger (Kissinger, Henry A.), believed that American power relative to that of other nations had declined to the point where a fundamental reorientation was necessary. They sought improved relations with the Soviet Union to make possible reductions in military strength while at the same time enhancing American security. In 1969 the Nixon Doctrine called for allied nations, especially in Asia, to take more responsibility for their own defense. Nixon's policy of détente led to Strategic Arms Limitation Talks (SALT), which resulted in a treaty with the Soviet Union all but terminating antiballistic missile systems. In 1972 Nixon and Kissinger negotiated an Interim Agreement that limited the number of strategic offensive missiles each side could deploy in the future. Nixon also dramatically reversed Sino-American relations with a secret visit by Kissinger to Peking in July 1971. This led to a presidential visit the following year and to the establishment of strong ties between the two nations. Nixon then visited Moscow as well, showing that détente with the rival communist powers did not mean that he would play them off against one another.

      The limits of détente were tested by the Arab-Israeli (Arab-Israeli wars) Yom Kippur War of October 1973, in which the United States supported Israel and the Soviet Union the Arabs. Nixon managed the crisis well, preventing the confrontation with the Soviets from getting out of hand and negotiating a cease-fire that made possible later improvements in Israeli-Egyptian relations. Nixon and Kissinger dramatically altered U.S. foreign relations, modifying containment, reducing the importance of alliances, and making the balance of power and the dual relationship with the Soviet Union and China keystones of national policy.

 Meanwhile, inconclusive fighting continued in Vietnam (Vietnam War), and unproductive peace talks continued in Paris. Although in 1969 Nixon announced his policy of “Vietnamization,” according to which more and more of the fighting was to be assumed by South Vietnam itself, he began by expanding the fighting in Southeast Asia with a 1970 “incursion” into Cambodia. This incident aroused strong protest; student demonstrations at Kent State University in Ohio led on May 4 to a confrontation with troops of the Ohio National Guard, who fired on the students without orders, killing four and wounding several others. National revulsion at this act led to serious disorders at many universities and forced some of them to close for the remainder of the term. Further antiwar demonstrations followed the 1971 U.S. invasion of Laos and Nixon's decision to resume intensive bombing of North Vietnam in 1972.

      Peace negotiations with North Vietnam slowly progressed, and a cease-fire agreement was finally signed on January 27, 1973. The agreement, which provided for exchange of prisoners of war and for U.S. withdrawal from South Vietnam without any similar commitment from the North Vietnamese, ended 12 years of U.S. military effort that had taken some 58,000 American lives.

Domestic affairs
      When Chief Justice Earl Warren (Warren, Earl), who had presided over the most liberal Supreme Court in history, retired in 1969, Nixon replaced him with the conservative Warren Burger (Burger, Warren E.). Three other retirements enabled Nixon to appoint a total of four moderate or conservative justices. Though many expected it to, the Burger court did not reverse the policies laid down by its predecessor.

      Congress enacted Nixon's revenue-sharing program, which provided direct grants to state and local governments. Congress also expanded social security and federally subsidized housing. In 1972 the Congress, with the support of the president, adopted a proposed constitutional amendment guaranteeing equal rights for women. Despite widespread support, the Equal Rights Amendment, or ERA, as it was called, failed to secure ratification in a sufficient number of states. (Subsequent legislation and court decisions, however, gave women in substance what the ERA had been designed to secure.)

      The cost of living continued to rise, until by June 1970 it was 30 percent above the 1960 level; industrial production declined, as did the stock market. By mid-1971 unemployment reached a 10-year peak of 6 percent, and inflation continued. Wage and price controls (wage-price control) were instituted, the dollar was devalued, and the limitation on the national debt was raised three times in 1972 alone. The U.S. trade deficit improved, but inflation remained unchecked.

      A scandal surfaced in June 1972, when five men were arrested for breaking into the Democratic national headquarters at the Watergate office-apartment building in Washington. When it was learned that the burglars had been hired by the Committee to Re-Elect the President (CRP), John Mitchell (Mitchell, John), a former U.S. attorney general, resigned as director of CRP. These events, however, had no effect on the election that fall. Even though the Democrats retained majorities in both the Senate and the House, Nixon won a landslide victory over Democratic nominee Senator George McGovern (McGovern, George S.) of South Dakota, who won only Massachusetts and the District of Columbia.

  In 1973, however, it was revealed that an attempt to suppress knowledge of the connection between the Watergate affair and CRP involved highly placed members of the White House staff. In response, a Senate select committee was formed and opened hearings in May, and Nixon appointed Archibald Cox as a special prosecutor to investigate the scandal. Amid conflicting testimony, almost daily disclosures of further scandals, and continuing resignations of administrative personnel, a battle developed between the legislative and executive branches of government. Nixon attempted to stop the investigation by ordering Cox dismissed; Attorney General Elliot Richardson and Deputy Attorney General William D. Ruckelshaus resigned rather than carry out the order, and Cox was fired by Solicitor General Robert Bork, acting as attorney general. This “Saturday night massacre” of Justice Department officials did not, however, stem the flow of damaging revelations, confessions, and indictments.

 The Watergate affair itself was further complicated by the revelation of other irregularities. It became known that a security unit in the White House had engaged in illegal activities under the cloak of national security. Nixon's personal finances were questioned, and Vice President Spiro T. Agnew (Agnew, Spiro T.) resigned after pleading no contest to charges of income tax evasion. On December 6, 1973, Nixon's nominee, Congressman Gerald R. Ford of Michigan, was approved by Congress as the new vice president.

      On May 9, 1974, the Judiciary Committee of the House of Representatives began hearing evidence relating to a possible impeachment proceeding. On July 27–30 it voted to recommend that Nixon be impeached on three charges. On August 5 Nixon obeyed a Supreme Court order to release transcripts of three tape-recorded conversations, and he admitted that, as evidenced in the recordings, he had taken steps to direct the Federal Bureau of Investigation away from the White House when its inquiries into the Watergate burglary were leading it toward his staff.

 Nixon's support in Congress vanished, and it seemed probable that he would be impeached. On the evening of August 8, in a television address, Nixon announced his resignation, effective the next day. At noon on August 9, Vice President Ford was sworn in as his successor, the first president not elected either to the office or to the vice presidency.

The Gerald R. Ford (Ford, Gerald R.) administration
 Ford's was essentially a caretaker government. He had no mandate and no broad political base, his party was tainted by Watergate, and he angered many when he granted Nixon an unconditional pardon on September 8, 1974. Henry Kissinger remained secretary of state and conducted foreign policy along the lines previously laid down by Nixon and himself. Ford's principal concern was the economy, which had begun to show signs of weakness. A brief Arab oil embargo during the Yom Kippur War had led to a quadrupling of oil prices, and the oil shock produced both galloping inflation and a recession. Prices rose more than 10 percent in 1974, and unemployment reached 9.2 percent in May 1975. Ford was no more able than Nixon to deal with the combination of inflation and recession, called “stagflation,” and Congress had no remedies either. For the most part Congress and the president were at odds. Ford vetoed no fewer than 50 bills during his short term in office.

 In the election of 1976 Ford won the nomination of his party, fighting off a strong challenge by Ronald Reagan, the former governor of California. In a crowded field of contenders, the little-known ex-governor of Georgia, Jimmy Carter, won the Democratic nomination by starting early and making a virtue of his inexperience. Despite Watergate and stagflation, Ford nearly won the election; Carter's margin in the electoral college was the smallest since 1916.

The Jimmy Carter (Carter, Jimmy) administration

Foreign affairs
      More than any other president, Carter used diplomacy to promote human rights, especially with regard to the governments of South Korea, Iran, Argentina, South Africa, and Rhodesia (Zimbabwe). Efforts to continue the détente with the U.S.S.R. foundered as the Soviets supported revolutions in Africa, deployed medium-range nuclear weapons in Europe, and occupied Afghanistan. Relations with the People's Republic of China, on the other hand, improved, and full diplomatic recognition of the Communist government took effect on January 1, 1979. In September 1977 the United States and Panama signed two treaties giving control of the Panama Canal to Panama in the year 2000 and providing for the neutrality of the waterway.

 Carter's most noted achievement was to sponsor a great step toward peace in the Middle East. In September 1978 he met with Egyptian President Anwar el-Sādāt (Sādāt, Anwar el-) and Israeli Prime Minister Menachem Begin (Begin, Menachem) at a two-week negotiating session at Camp David (Camp David Accords), Maryland, and on September 17 Carter announced that two accords had been signed establishing the terms for a peace treaty between Egypt and Israel. Further tortuous negotiations followed before the peace treaty was signed in Washington, D.C., on March 26, 1979.

      Carter's greatest defeat was administered by Iran. Following the overthrow of Mohammad Reza Shah Pahlavi, who had been supported by the United States, the Islāmic Republic of Iran was proclaimed in Iran on February 1, 1979, under the leadership of Ayatollah Ruhollah Khomeini (Khomeini, Ruhollah). In November militants seized the U.S. embassy in Tehrān and held its occupants hostage (Iran hostage crisis). An attempt to rescue the hostages in April 1980 failed, and the hostages were not released until Carter left office in January 1981. Carter's inability either to resolve the hostage crisis or to manage American perceptions of it crippled his leadership.

Domestic policy
      Carter's effectiveness in domestic affairs was generally hampered by his failure to establish good relations with Congress, his frequent changes of course, the distractions caused by foreign problems, and his inability to inspire public confidence. His major domestic effort was directed against the energy crisis, though with indifferent results. Inflation continued to rise, and in the summer of 1979 Carter appointed Paul Volcker (Volcker, Paul) as chairman of the Federal Reserve Board. Volcker raised interest rates to unprecedented levels, which resulted in a severe recession but brought inflation under control.

      In the election of 1980 Ronald Reagan was the Republican nominee, while Republican congressman John B. Anderson of Illinois ran as an independent and received 5,600,000 votes. Reagan easily defeated the discredited Carter, and the Republicans gained control of the Senate for the first time since 1954.

William L. O'Neill

The late 20th century

The Ronald Reagan (Reagan, Ronald W.) administration
 Reagan took office pledging to reverse the trend toward big government and to rejuvenate the economy on the theory that cutting taxes would stimulate so much growth that tax revenues would actually rise. In May 1981, two months after an assassination attempt on Reagan, Congress approved his program, which would reduce income taxes by 25 percent over a three-year period, cut federal spending on social programs, and greatly accelerate a military buildup that had begun under Carter. The recession that had resulted from Volcker's policy of ending inflation through high interest rates deepened in 1981, but by 1984 it was clearly waning, without a resurgence of inflation. The U.S. economy experienced a strong recovery.

      In foreign affairs Reagan often took bold action, but the results were usually disappointing. His effort to unseat the leftist Sandinista regime in Nicaragua through aid to the Contras, a rebel force seeking to overthrow the government, was unpopular and unsuccessful. U.S.-Soviet relations were the chilliest they had been since the height of the Cold War. Reagan's decision to send a battalion of U.S. marines to Lebanon in support of a cease-fire resulted in a terrorist attack in 1983 in which 241 marines and other service members were killed. On October 25, 1983, he launched an invasion of the Caribbean nation of Grenada, where Cuban influence was growing. U.S. forces prevailed, despite much bungling. Popular at home, the invasion was criticized almost everywhere else. Relations with China worsened at first but improved in 1984 with an exchange of state visits.

      Reagan benefited in the election of 1984 from a high degree of personal popularity, from the reduction in inflation, and from the beginnings of economic recovery. This combination proved too much for the Democratic nominee, former vice president Walter Mondale (Mondale, Walter) of Minnesota, and his running mate, Congresswoman Geraldine Ferraro (Ferraro, Geraldine A.) of New York, the first female vice presidential candidate ever to be named by a major party.

 Reagan's second term was more successful than his first in regard to foreign affairs. In 1987 he negotiated an intermediate-range nuclear forces (INF) treaty (Intermediate-Range Nuclear Forces Treaty) with the Soviet Union, eliminating two classes of weapon systems that each nation had deployed in Europe. This was the first arms-limitation agreement ever to result in the actual destruction of existing weapons. Relations between the superpowers had improved radically by 1988, owing primarily to the new Soviet leader, Mikhail Gorbachev (Gorbachev, Mikhail), whose reforms at home were matched by equally great changes in foreign policy. An exchange of unusually warm state visits in 1988 was followed by Soviet promises of substantial force reductions, especially in Europe.

      Reagan's domestic policies were unchanged. His popularity remained consistently high, dipping only briefly in 1987 after it was learned that his administration had secretly sold arms to Iran (Iran-Contra Affair) in exchange for American hostages and then had illegally used the profits to subsidize the Contras. In the short run his economic measures succeeded. Inflation remained low, as did unemployment, while economic growth continued. Nonetheless, while spending for domestic programs fell, military spending continued to rise, and revenues did not increase as had been predicted. The result was a staggering growth in the budget deficit. The United States, which had been a creditor nation in 1980, was by the late 1980s the world's largest debtor nation.

      Furthermore, although economic recovery had been strong, individual income in constant dollars was still lower than in the early 1970s, and family income remained constant only because many more married women were in the labour force. Savings were at an all-time low, and productivity gains were averaging only about 1 percent a year. Reagan had solved the short-term problems of inflation and recession, but he did so with borrowed money and without touching the deeper sources of America's economic decline. In 1988 Vice President George Bush (Bush, George) of Texas defeated the Democratic nominee, Michael Dukakis (Dukakis, Michael S.), the governor of Massachusetts.

The George Bush (Bush, George) administration
 In foreign affairs Bush continued the key policies of the Reagan administration, especially by retaining cordial relations with the Soviet Union and its successor states. In December 1989 Bush ordered U.S. troops to seize control of Panama and arrest its de facto ruler, General Manuel Noriega (Noriega, Manuel), who faced drug trafficking and racketeering charges in the United States.

      Bush's leadership and diplomatic skills were severely tested by the Iraqi invasion of Kuwait, which began on August 2, 1990. At risk was not only the sovereignty of this small sheikhdom but also U.S. interests in the Persian Gulf (Persian Gulf War), including access to the region's vast oil supplies. Fearing that Iraqi aggression would spill over into Saudi Arabia, Bush swiftly organized a multinational coalition composed mostly of NATO and Arab countries. Under the auspices of the United Nations, some 500,000 U.S. troops (the largest mobilization of U.S. military personnel since the Vietnam War) were brought together with other coalition forces in Saudi Arabia. Lasting from January 16 to February 28, 1991, the war was easily won by the coalition at only slight material and human cost, but its sophisticated weapons caused heavy damage to Iraq's military and civilian infrastructure and left many Iraqi soldiers dead. With the declining power (and subsequent collapse in 1991) of the Soviet Union (Union of Soviet Socialist Republics), the war also emphasized the role of the United States as the world's single military superpower.

      This short and relatively inexpensive war, paid for largely by U.S. allies, was popular while it lasted but stimulated a recession that ruined Bush's approval rating. The immense national debt ruled out large federal expenditures, the usual cure for recessions. The modest bills Bush supported failed in Congress, which was controlled by the Democrats. Apart from a budget agreement with Congress in 1990, which broke Bush's promise not to raise taxes, little was done to control the annual deficits, made worse by the recession.

      In the 1992 presidential election, Democrat William (Bill) Clinton (Clinton, Bill), the governor of Arkansas, defeated Bush in a race in which independent candidate Ross Perot (Perot, Ross) won 19 percent of the popular vote—more than any third-party candidate had received since Theodore Roosevelt (Roosevelt, Theodore) in 1912.

William L. O'Neill Ed.

The Bill Clinton (Clinton, Bill) administration
  The beginning of the 1990s was a difficult time for the United States. The country was plagued not only by a sluggish economy but by violent crime (much of it drug-related), poverty, welfare dependency, problematic race relations, and spiraling health costs. Although Clinton promised to boost both the economy and the quality of life, his administration got off to a shaky start, the victim of what some critics called ineptitude and bad judgment. One of Clinton's first acts was to attempt to fulfill a campaign promise to end discrimination against gay men and lesbians (homosexuality) in the military. After encountering strong criticism from conservatives and some military leaders—including Colin Powell (Powell, Colin), the chairman of the Joint Chiefs of Staff—Clinton was eventually forced to support a compromise policy—summed up by the phrase “don't ask, don't tell”—that was viewed as being at once ambiguous, satisfactory to neither side of the issue, and possibly unconstitutional. (The practical effect of the policy was actually to increase the number of men and women discharged from the military for homosexuality.) His first two nominees for attorney general withdrew over ethics questions, and two major pieces of legislation—an economic stimulus package and a campaign finance reform bill—were blocked by a Republican filibuster in the Senate. In the hope that he could avoid a major confrontation with Congress, he set aside any further attempts at campaign finance reform. During the presidential campaign, Clinton promised to institute a system of universal health insurance. His appointment of his wife, Hillary Rodham Clinton (Clinton, Hillary Rodham), to chair a task force on health care reform drew sharp criticism from Republicans, who objected both to the propriety of the arrangement and to what they considered her outspoken feminism. They campaigned fiercely against the task force's eventual proposal, and none of its numerous recommendations was formally submitted to Congress.

      Despite these early missteps, the Clinton administration had numerous policy and personnel successes. Although Perot had spoken vividly of the effects of the North American Free Trade Agreement, which he said would produce a “giant sucking sound” as American jobs were lost to Mexico, Congress passed the measure and Clinton signed it into law, thereby creating a generally successful free-trade zone between the United States, Canada, and Mexico. During Clinton's first term, Congress enacted with Clinton's support a deficit reduction package to reverse the spiraling debt that had been accrued during the 1980s and '90s, and he signed some 30 major bills related to women and family issues, including the Family and Medical Leave Act and the Brady Handgun Violence Prevention Act. Clinton also changed the face of the federal government, appointing women and minorities to significant posts throughout his administration, including Janet Reno as the first woman attorney general, Donna Shalala as secretary of Health and Human Services, Joycelyn Elders as surgeon general, Madeleine Albright (Albright, Madeleine) as the first woman secretary of state, and Ruth Bader Ginsburg (Ginsburg, Ruth Bader) as a justice on the Supreme Court.

      With Clinton's popularity sagging after the health care debacle, the 1994 elections resulted in the opposition Republican Party winning a majority in both houses of Congress for the first time in 40 years. This historic victory was viewed by many—especially the House Republicans led by Speaker Newt Gingrich—as the voters' repudiation of the Clinton presidency. A chastened Clinton subsequently accommodated some of the Republican proposals—offering a more aggressive deficit reduction plan and a massive overhaul of the nation's welfare system—while opposing Republican efforts to slow the growth of government spending on popular programs such as Medicare (Medicare and Medicaid). Ultimately the uncompromising and confrontational behaviour of the congressional Republicans produced the opposite of what they intended, and after a budget impasse between the Republicans and Clinton in 1995 and 1996—which forced two partial government shutdowns, including one for 22 days (the longest closure of government operations to date)—Clinton won considerable public support for his more moderate approach.

      Clinton's foreign policy ventures included a successful effort in 1994 to reinstate Haitian President Jean-Bertrand Aristide, who had been ousted by a military coup in 1991; a commitment of U.S. forces to a peacekeeping initiative in Bosnia and Herzegovina; and a leading role in the ongoing initiatives to bring a permanent resolution to the dispute between Palestinians and Israelis. In 1993 he invited Israeli Prime Minister Yitzhak Rabin (Rabin, Yitzhak) (who was later assassinated by a Jewish extremist opposed to territorial concessions to the Palestinians) and Palestine Liberation Organization (PLO) chairman Yāsir ʿArafāt (ʿArafāt, Yāsir) to Washington to sign a historic agreement that granted limited Palestinian self-rule in the Gaza Strip and Jericho.

 During the Clinton administration the United States remained a target for international terrorists with bomb attacks on the World Trade Center in New York City (1993), on U.S. embassies in Kenya and Tanzania (1998), and on the U.S. Navy destroyer USS Cole in Yemen (2000). The domestic front, though, was the site of unexpected antigovernment violence when on April 19, 1995, an American, Timothy McVeigh, detonated a bomb in a terrorist attack (Oklahoma City bombing) on the Alfred P. Murrah Federal Building in Oklahoma City, Oklahoma, killing 168 and injuring more than 500.

      Although scandal was never far from the White House—a fellow Arkansan who had been part of the administration committed suicide; there were rumours of financial irregularities that had occurred while Clinton was governor of Arkansas; opponents charged that the first lady engineered the firing of staff in the White House travel office (“Travelgate”); former associates were indicted and convicted of crimes; and rumours of sexual impropriety persisted—the economy made a slow but steady recovery after 1991, marked by dramatic gains in the stock market in the mid-1990s. Buoyed by the economic growth, Clinton was easily reelected in 1996, capturing 49 percent of the popular vote to 41 percent for Republican challenger Bob Dole (Dole, Bob) and 8 percent for Perot. In the electoral college Clinton won 379 votes to Dole's 159.

      Economic growth continued during Clinton's second term, eventually setting a record for the nation's longest peacetime economic expansion. After enormous budget deficits throughout the 1980s and early 1990s—including a $290 billion deficit in 1992—by 1998 the Clinton administration oversaw the first balanced budget and budget surpluses since 1969. The vibrant economy produced a tripling in the value of the stock market, historically high levels of home ownership, and the lowest unemployment rate in nearly 30 years.

      During Clinton's first term Attorney General Reno approved an investigation into Clinton's business dealings in Arkansas. The resulting inquiry, known as Whitewater—the name of the housing development corporation at the centre of the controversy—was led from 1994 by independent counsel Kenneth Starr. Although the investigation lasted several years and cost more than $50 million, Starr was unable to find conclusive evidence of wrongdoing by the Clintons. When a three-judge panel allowed him to expand the scope of his investigation, however, he uncovered evidence of an affair between Clinton and Monica Lewinsky, a White House intern. Clinton repeatedly and publicly denied that the affair had taken place. After conclusive evidence of the affair surfaced, Clinton admitted it and apologized to his family and to the American public. On the basis of Starr's 445-page report and supporting evidence, impeachment hearings begun before the 1998 midterm elections culminated, after the election, in Clinton's impeachment for perjury and obstruction of justice by a lame-duck session of the House of Representatives. Clinton was acquitted of the charges by the Senate in 1999. During the impeachment proceedings, foreign policy also dominated the headlines. In December 1998 Clinton, citing Iraqi noncompliance with UN resolutions and obstruction of weapons inspectors, ordered a four-day bombing campaign against Iraq; the military action prompted Iraq to halt further weapons inspections.

      When the dust had settled, the Clinton administration was damaged but not broken. Bill Clinton's job approval rating remained high during the final years of his presidency, and in 1999 Hillary Clinton launched a successful campaign for the U.S. Senate seat being vacated by Democrat Daniel Patrick Moynihan (Moynihan, Daniel Patrick) in New York, thereby becoming the first first lady to win elective office. During the final year of his presidency, Clinton invited Yāsir ʾArafāt and Israeli Prime Minister Ehud Barak to the United States in an attempt to broker a final settlement between the Israelis and the Palestinians. The eventual breakdown of the talks, along with subsequent events in Jerusalem and elsewhere, resulted in some of the deadliest conflicts between Israelis and Palestinians in more than a decade. Clinton also became the first American president to visit Vietnam since the end of the Vietnam War.

 Despite continued economic growth, the 2000 presidential election between Vice President Al Gore (Gore, Al) and Texas Governor George W. Bush (Bush, George W.), the former president's eldest son, was one of the closest and most controversial in the republic's history. Although Gore won the nationwide popular vote by more than 500,000 votes, the presidency hinged on the outcome in Florida, whose 25 electoral votes would give the winner of that state a narrow majority in the electoral college. With Bush leading in Florida by fewer than 1,000 votes after a mandatory statewide recount, the presidency remained undecided for five weeks as Florida state courts and federal courts heard numerous legal challenges. After a divided Florida Supreme Court ordered a statewide manual recount of the approximately 45,000 “undervotes” (i.e., ballots that machines recorded as not clearly expressing a presidential vote) and the inclusion of hand-counted ballots in two counties that had not been previously certified by Florida's secretary of state—which reduced Bush's margin to under 200 votes before the manual recounting began—the Bush campaign quickly filed an appeal to halt the manual recount, which the U.S. Supreme Court granted by a 5–4 vote pending oral arguments. Concluding (7–2) that a quick statewide recount could not be performed fairly unless elaborate ground rules were established, the court issued a controversial 5-to-4 decision to reverse the Florida Supreme Court's recount order, effectively awarding the presidency to Bush (see Bush v. Gore). With his 271-to-266 victory in the electoral college, Bush became the first president since 1888 to win the election despite losing the nationwide popular vote.

The George W. Bush (Bush, George W.) administration
 Bush became the first Republican president since the 1950s to enjoy a majority in both houses of Congress. Among the initial domestic challenges that faced the Bush administration were a weakening national economy and an energy crisis in California. Bush, who had campaigned as a “compassionate conservative,” promoted traditionally conservative policies in domestic affairs, the centrepiece of which was a $1.35 trillion tax-cut bill he signed into law in June 2001. That month, however, Republican Senator Jim Jeffords became an independent, giving the Democrats control of the Senate. Subsequently Bush encountered strong congressional resistance to some of his initiatives, such as an educational voucher program that would provide subsidies to parents who send their children to private schools, the creation of a nuclear missile defense system, and federal funding for selected social programs of religious groups. In foreign affairs, the administration attempted to liberalize U.S. immigration policy with regard to Mexico, with which it struck closer ties. But it faced sharp criticism from China for its outspoken support of Taiwan and from Europe and elsewhere for its abandonment of the Kyoto Protocol, a 1997 treaty aimed at reducing the emission of greenhouse gases, and for its declared intention to withdraw from the 1972 Treaty on the Limitation of Anti-Ballistic Missile Systems (it formally withdrew from the treaty in 2002).

 The greatest challenge of Bush's first year in office came on the heels of a massive terrorist attack on September 11 (September 11 attacks), 2001, in which hijacked commercial airliners were employed as suicide bombs. Two of the four hijacked planes leveled the twin towers of the World Trade Center and collapsed or damaged many of the surrounding buildings in New York City, another destroyed a large section of the Pentagon outside Washington, D.C., and still another crashed in the southern Pennsylvania countryside. Some 3,000 people were killed in this, the worst act of terrorism in U.S. history (see September 11 attacks). Bush responded with a call for a global war on terrorism. Identifying exiled Saudi millionaire and terrorist mastermind Osama bin Laden (bin Laden, Osama) as the primary suspect in the acts, Bush built an international coalition against bin Laden (who later claimed responsibility for the attacks) and his network, al-Qaeda (Qaeda, al-) (“the Base”), and the Taliban government of Afghanistan, which had harboured bin Laden and his followers. On October 7 the United States launched aerial attacks against Afghanistan; by the end of the year the Taliban and bin Laden's forces were routed or forced into hiding, and the Bush administration was negotiating with Afghanistan's many factions in an attempt to establish a stable regime there.

      In 2002 the U.S. economy worsened, as consumer confidence and the stock market continued to fall and corporate scandals dominated the headlines. Nevertheless, Bush remained popular, and he led the Republican Party to majorities in both the House and Senate in the midterm elections of 2002.

      Despite the economic difficulties, foreign affairs continued to dominate the Bush administration's agenda. In 2002 Bush focused world attention on Iraq, accusing Ṣaddām Ḥussein's government of having ties to al-Qaeda and of continuing to possess and develop weapons of mass destruction, contrary to UN mandates. In November Bush's secretary of state, Colin Powell (Powell, Colin), engineered a UN Security Council (Security Council, United Nations) resolution authorizing the return of weapons inspectors to Iraq. Soon thereafter Bush declared that Iraq was in breach of the new resolution for its failure to cooperate fully with the inspectors. In mid-March, declaring that diplomacy was at an end, he issued an ultimatum giving Ṣaddām 48 hours to leave Iraq or face removal by force (though he indicated that, even if Ṣaddām chose to leave, U.S.-led military forces would enter the country to search for weapons of mass destruction and to stabilize the new government). On March 20 (local time), following Ṣaddām's public refusal to leave, the United States and allied forces launched an attack on Iraq, called Operation Iraqi Freedom (Iraq War).

 With some international assistance, notably from the United Kingdom, the United States launched a brief air bombing campaign in Iraq followed by a massive ground invasion, advancing from Kuwait in the south. The resistance encountered was heavier than expected, especially in the major cities, which nevertheless capitulated and fell under U.S. or British control by the end of April; on May 1 President Bush declared an end to major combat. Armed resistance, however, continued and even increased, primarily as guerrilla attacks on U.S. soldiers and on Iraqis assuming positions of leadership. The American goal of a rebuilt, democratic state in Iraq proved elusive, as U.S. administrators struggled to restore basic infrastructure in the country following the victory. Just as elusive were Iraq's former leader, Ṣaddām Ḥussein, who was eventually captured in December, and hard evidence of weapons of mass destruction. The lack of such evidence and continuing American casualties emboldened critics of the administration, who questioned the prewar intelligence gathered to support the invasion.

      As a result, the Iraq War became a major issue in the campaign for the 2004 presidential election between Bush and his Democratic challenger, U.S. Senator John Kerry (Kerry, John) of Massachusetts. Other campaign issues included joblessness, homeland security, free trade, health care, and the role of the country in the international community, as well as debates over religion, abortion, marriage, and civil rights. Candidate spending, voter turnout, and partisan dissension were high, and Bush defeated Kerry in a contentious and close election, which seemed, like the 2000 election, to hinge on the electoral votes of a single state, this time Ohio.

      Bush began his second term emboldened by a larger Republican majority in both the House of Representatives and the Senate, with promises to prop up the sagging economy, allay domestic security fears, reduce the national debt, lower unemployment, and help usher in an era of democracy in Iraq. In particular, he sought to privatize Social Security and overhaul the tax system.

      By mid-decade the economy showed strong signs of revival, based partly on the continuing upsurge of the housing market. Bush's plan for Social Security reform, however, proved unpopular and never even came to a vote. The president's personal popularity and that of his party began to wane as the party was beset with a series of ethics-related scandals. In 2005 Republican House majority leader Tom DeLay was forced to step down after a Texas grand jury indicted him on election-law violations; later, he was further linked to influence-peddling indiscretions that led to the conviction and imprisonment of lobbyist Jack Abramoff. In 2006, reports of national security-related government wiretapping and allegations of torture of some suspected terrorists alarmed civil libertarians. The next year Attorney General Alberto Gonzales (Gonzales, Alberto R.) was forced to resign after a probe into the “political” firing of eight U.S. attorneys; and Lewis (“Scooter”) Libby, special assistant to Vice President Dick Cheney (Cheney, Dick), was convicted of lying to a special counsel regarding his involvement in the politically motivated leak of a CIA agent's covert identity.

      Even more damaging to Bush's standing with many Americans was what was widely seen as the federal government's failure to deal promptly and effectively with the fallout from Hurricane Katrina (Katrina, Hurricane), which devastated parts of Alabama, Mississippi, Florida, and Louisiana, especially New Orleans, in late August 2005. Moreover, with casualties mounting in Iraq, more people had come to believe that the Bush administration had misled the country into war. As a result of all these factors, the Democrats were able to win narrow majorities in both houses of Congress following the 2006 midterm election. Determined to stay the course in Iraq and in spite of strong Democratic opposition, Bush authorized a “surge” of an additional 30,000 troops that brought the total of U.S. combatants in the country to some 160,000 by autumn 2007. But even as the surge reduced violence in Iraq, the war and the president remained unpopular.

      The election to succeed Bush was between Senator John McCain (McCain, John) of Arizona, the Republican candidate, and Senator Barack Obama (Obama, Barack) of Illinois, who had triumphed over the favourite, Senator Hillary Clinton (Clinton, Hillary Rodham) of New York, in a long primary battle to win the Democratic nomination. At the height of the contest, the U.S. economy was thrown into turmoil by a financial crisis. From September 19 to October 10, the Dow Jones Industrial Average dropped 26 percent. At the same time, there was a severe contraction of liquidity in credit markets worldwide, caused in part by a debacle related to subprime mortgages. While the housing market boomed, individuals lacking the credit ratings necessary for conventional mortgages had been able to obtain subprime mortgages, most of which were adjustable-rate mortgages (ARMs) at low, so-called teaser, interest rates that ballooned after a few years. The rates for many of those ARMs jumped at the same time that overbuilding undercut the housing market; foreclosures mounted, and investment banks that under recent deregulation had been allowed to overleverage their assets foundered, resulting in the bankruptcy or sale of several major financial institutions. The U.S. economic and political establishment reacted by passing (after an unsuccessful first attempt) the Emergency Economic Stabilization Act (Emergency Economic Stabilization Act of 2008), which sought to prevent further collapse and to bail out the economy. In the process, the U.S. government provided loans to, and in some cases took an ownership stake in, financial institutions through the Troubled Assets Relief Program (TARP), which allocated $700 billion to the recovery effort.

The Barack Obama administration
      The crisis worked against McCain, whom many voters associated with the unpopular policies of the administration, and worked for the highly charismatic Obama, whose campaign from its outset had been based on the theme of sweeping political change. Obama defeated McCain, becoming the first African American elected to the presidency. He captured nearly 53 percent of the popular vote and 365 electoral votes—defending those states that had gone Democratic in the 2004 election, taking the lion's share of battleground states, and winning several states that had been reliably Republican in recent presidential elections.

      In the interim between the election and Obama's inauguration as president on January 20, 2009, the Bush administration's handling of the distribution of the first half of the TARP funds came under considerable criticism. There were accusations that it had infused too much money into large banks without placing adequate conditions on them, rather than purchasing “toxic” assets as it had promised. In the lead-up to the inauguration, Obama and his transition team, working with Bush, persuaded the Senate to release the last half of the TARP funds, promising that they would be targeted at relief for homeowners and at stimulating the credit markets. Because blocking the release of the funds required the assent of both houses of Congress, the Senate's refusal to do so made a vote by the House of Representatives unnecessary.

Ed.

Presidents of the United States
       The table provides a chronological list of the presidents of the United States.

Vice presidents of the United States
       The table provides a chronological list of the vice presidents of the United States.

First ladies of the United States
       The table provides a chronological list of the first ladies of the United States.

State maps, flags, and seals
       The table provides a list of state maps, flags, and seals.

State nicknames and symbols
       The table provides a list of state nicknames and symbols.

Additional Reading

Geography
The land
(Landforms and geology): The standard work on the landform regions of the United States is William D. Thornbury, Regional Geomorphology of the United States (1965). Walter Sullivan, Landprints: On the Magnificent American Landscape (1984), is a lively, authoritative, and well-illustrated treatment. An elementary, illustrated textbook is E.C. Pirkle and W.H. Yoho, Natural Landscapes of the United States, 4th ed. (1985). Nevin M. Fenneman, Physiography of Western United States (1931), and Physiography of Eastern United States (1938), are exhaustive and still standard references. William L. Graf (ed.), Geomorphic Systems of North America (1987), is a highly technical discussion. Recommended atlases include Charles O. Paullin, Atlas of the Historical Geography of the United States (1932, reprinted 1975); and Geological Survey (U.S.), The National Atlas of the United States of America (1970).

(Climate): Stephen S. Visher, Climatic Atlas of the United States (1954, reprinted 1966), contains more than 1,000 maps. United States National Oceanic and Atmospheric Administration, Climates of the United States, 2nd ed., 2 vol. (1980), makes available physical and climatic data in narrative, tabular, and map form. Scholarly discussions are found in Reid A. Bryson and F. Kenneth Hare (eds.), Climates of North America (1974).

(Plant and animal life): An authoritative regional treatment of plant and animal ecology is Victor E. Shelford, The Ecology of North America (1963, reprinted 1978). Michael G. Barbour and William Dwight Billings (eds.), North American Terrestrial Vegetation (1988), covers all major types. The relationship between climate and natural vegetation is obvious but far from simple; the most ambitious cartographic attempt to correlate them in a North American setting is explained in Robert G. Bailey (comp.), Description of the Ecoregions of the United States (1978).

(Human geography): A general text covering the human geography of the continent is J. Wreford Watson, North America, Its Countries and Regions, rev. ed. (1967). D.W. Meinig, The Shaping of America, vol. 1, Atlantic America, 1492–1800 (1986), is indispensable for an understanding of the origins of America's human geography. Joel Garreau, The Nine Nations of North America (1981), is a lively, highly readable description of the emerging socioeconomic regions.

(Landscape and land use): Stephen S. Birdsall and John W. Florin, Regional Landscapes of the United States and Canada, 3rd ed. (1985), is a general introduction. John R. Stilgoe, Common Landscape of America, 1580 to 1845 (1982), offers a valuable account of the early evolution of settlement. John Brinckerhoff Jackson, Discovering the Vernacular Landscape (1984), delves into the meaning of everyday man-made environments. The growth and development of America's cities and towns are detailed in Alexander B. Callow, Jr. (ed.), American Urban History, 3rd ed. (1982); and Richard Lingeman, Small Town America: A Narrative History, 1620–the Present (1980).

Peirce F. Lewis Wilbur Zelinsky

The people
The Statistical Abstract of the United States, published annually by the United States Bureau of the Census, is the standard summary of statistics on the country's social, political, and economic composition. Interpretations of demographic data include Edward G. Stockwell, Population and People (1968); and Richard M. Scammon and Ben J. Wattenberg, The Real Majority (1970). For an analysis of national values, a classic account is Gunnar Myrdal, An American Dilemma: The Negro Problem and Modern Democracy, 2 vol. (1944, reprinted 1975). Inquiries into the nature of American society include Seymour Martin Lipset, American Exceptionalism: A Double-Edged Sword (1996); and Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community (2000). Immigration is discussed in John Isbister, The Immigration Debate: Remaking America (1996); and Joel Millman, The Other Americans: How Immigrants Renew Our Country, Our Economy, and Our Values (1997). Oscar Handlin, The Uprooted, 2nd ed. enlarged (1973), covers the era of mass immigration, 1860–1920. Examinations of contemporary American minority groups include Stephan Thernstrom (ed.), Harvard Encyclopedia of American Ethnic Groups (1980); and Frank D. Bean and W. Parker Frisbie (eds.), The Demography of Racial and Ethnic Groups (1978). Ethnic patterns are treated in James Paul Allen and Eugene James Turner, We the People: An Atlas of America's Ethnic Diversity (1988).

Oscar Handlin

The economy
Anthony S. Campagna, U.S. National Economic Policy, 1917–1985 (1987), chronicles changes in U.S. economic policies through much of the 20th century. Aspects of the economy are treated in Howard F. Gregor, Industrialization of U.S. Agriculture (1982), an atlas emphasizing aspects of industrialized farming, mainly from U.S. census information; and Robert J. Newman, Growth in the American South (1984), on the shift of U.S. manufacturing to the Southern states in the 1960s and '70s. Two good sources of data on the U.S. economy are the Economic Report of the President (published every year), and the Statistical Abstract of the United States. W. Michael Cox and Richard Alm, Myths of Rich and Poor (1999), gives a century-long perspective on the U.S. economy.

Administration and social conditions
The United States Government Manual (annual) offers a broad overview of the federal structure; while the Congressional Quarterly Weekly Report and National Journal (weekly) provide closer views of the public record of the federal legislature. Congressional Quarterly's Guide to Congress, 3rd ed. (1982), details the development and organization of Congress. See also The Book of the States, published biennially by the Council of State Governments. Donald R. Whitnah (ed.), Government Agencies (1983), contains essays on the agencies' purposes and histories, with bibliographies. Discussions of election politics include Fred I. Greenstein and Frank B. Feigert, The American Party System and the American People, 3rd ed. (1985); the series by Theodore H. White, begun with The Making of the President, 1960 (1961), which continued by covering subsequent presidential elections; and Jack P. Greene (ed.), Encyclopedia of American Political History, 3 vol. (1984). Alexander DeConde (ed.), Encyclopedia of American Foreign Policy, 3 vol. (1978), also contains useful bibliographies. Neal R. Peirce and Jerry Hagstrom, The Book of America: Inside 50 States Today, rev. and updated ed. (1984), is an insightful look at persistent social differences among various regions of the country. Ed.

Cultural life
Kirk Varnedoe and Adam Gopnik, High & Low: Modern Art, Popular Culture (1990), was an early attempt to address the “high and low” question unemotionally. Robert Hughes, American Visions: The Epic History of Art in America (1997, reissued 1999), tried to use the broader social context now demanded to chronicle the ambitions and limitations of American art. Important “post-structuralist” views of American art have also been offered by Arthur C. Danto, The Madonna of the Future: Essays in a Pluralistic Art World (2001). The broader questions of the future of American culture in a time of multicultural transformation have been engaged in many places, memorably in Richard Rorty, Essays on Heidegger and Others (1991). The debate over “political correctness” has been examined in Roger Kimball, Tenured Radicals: How Politics Has Corrupted Our Higher Education, rev. ed. (1998); Dinesh D'Souza, Illiberal Education: The Politics of Race and Sex on Campus (1991); and Robert Hughes, Culture of Complaint: The Fraying of America (1993). Louis Menand, The Metaphysical Club (2001), attempts to track the crucial influence on American culture of America's most distinct philosophical movement, Pragmatism. The classic statement of the American vision in literary criticism is Lionel Trilling, The Liberal Imagination (1950, reissued 1976). See also Leslie Fiedler, What Was Literature? (1982), a radical egalitarian polemic against the division of American literature into “high” and “low” forms. Emory Elliott (ed.), Columbia Literary History of the United States (1988), covers the many aspects of American literature. Daniel Hoffman (ed.), Harvard Guide to Contemporary American Writing (1979), is a general introduction. Useful works on art include Dore Ashton, American Art Since 1945 (1982); Irving Sandler, The Triumph of American Painting: A History of Abstract Expressionism (1970, reissued 1982); and Milton W. Brown et al., American Art (1979). The renaissance of American dance has produced two great dance critics, Arlene Croce and Edwin Denby. Their works include Arlene Croce, Going to the Dance (1982), and Sight Lines (1987); and Edwin Denby, Dance Writings (1986). For a history of America's unique contribution to the theatre arts, see Gerald Bordman, American Musical Theater, expanded ed. (1986). James Agee, Agee on Film, vol. 1, Reviews and Comments (1958, reprinted 1983), is still the most eloquent writing about American movies. Stephen Mamber, Cinema Verite in America: Studies in Uncontrollable Documentary (1974), is a good introduction to alternative theories about alternative film. H. Wiley Hitchcock and Stanley Sadie (eds.), The New Grove Dictionary of American Music, 4 vol. (1986), is an excellent starting point for research. Gilbert Chase, America's Music, from the Pilgrims to the Present, rev. 3rd ed. (1987), is invaluable and readable. Geoffrey C. Ward, Jazz: A History of America's Music (2000)—based on a film by Ken Burns—is a stimulating and serious history of America's most original art form. Whitney Balliett, Collected Works: A Journal of Jazz (2000), is a personal history of the achievement of Ellington and Armstrong.

Adam Gopnik

History
Among the many overviews of U.S. history, the following are representative: Samuel Eliot Morison, Henry Steele Commager, and William E. Leuchtenburg, The Growth of the American Republic, 7th ed. (1980); and John A. Garraty and Robert A. McCaughey, The American Nation, 6th ed. (1987). Reference sources include Dictionary of American History, rev. ed., 8 vol. (1976–78); and Richard B. Morris (ed.), Encyclopaedia of American History, 6th ed. (1982).

Discovery and exploration
Useful introductions include Samuel Eliot Morison, The European Discovery of America, 2 vol. (1971–74); and David B. Quinn, North America from Earliest Discovery to First Settlements: The Norse Voyages to 1612 (1977). Ed.

Colonial development to 1763
Charles M. Andrews, The Colonial Period of American History, 4 vol. (1934–38, reprinted 1964), is the starting point for an understanding of the structure of the British Empire in America. Lawrence Henry Gipson, The British Empire Before the American Revolution, 15 vol. (1936–70), represents the culmination of the “British Imperial” school of interpretation. Gary B. Nash, Red, White, and Black: The Peoples of Early America, 2nd ed. (1982); and Jack P. Greene and J.R. Pole (eds.), Colonial British America (1984), are excellent surveys. (Settlement): Perry Miller, The New England Mind: The Seventeenth Century (1939, reissued 1983), and a sequel, The New England Mind: From Colony to Province (1953, reissued 1967), together constitute perhaps the finest work of intellectual history ever written by an American historian. Francis Jennings, The Invasion of America (1975); and James Axtell, The European and the Indian (1982), are important accounts of white–Indian relations. (Imperial organization): Useful surveys include Michael Kammen, Empire and Interest: The American Colonies and the Politics of Mercantilism (1970); and Stephen Saunders Webb, 1676, the End of American Independence (1984). (The growth of provincial power): James A. Henretta, The Evolution of American Society, 1700–1815 (1973), is an excellent survey of the American economic and political order. Jack P. Greene, Pursuits of Happiness (1988), seeks to demonstrate the variety of colonial social developments. Carl Bridenbaugh, Myths and Realities: Societies of the Colonial South (1952, reprinted 1981), argues persuasively that the colonial South consisted of not one but three sections. Rhys Isaac, The Transformation of Virginia, 1740–1790 (1982), imaginatively surveys the social order of 18th-century Virginia. Gary B. Nash, The Urban Crucible: Social Change, Political Consciousness, and the Origins of the American Revolution (1979), surveys the growth of American cities in the 18th century. John J. McCusker and Russell R. Menard, The Economy of British America, 1607–1789 (1985), is a good survey. (Cultural and religious development): Daniel J. Boorstin, The Americans: The Colonial Experience (1958, reissued 1988), gives a brilliant, if overstated, account of American uniqueness. Henry F. May, The Enlightenment in America (1976), provocatively examines American intellectual development. See also Brooke Hindle, The Pursuit of Science in Revolutionary America, 1735–1789 (1956, reprinted 1974). Alan Heimert, Religion and the American Mind, from the Great Awakening to the Revolution (1966), makes an important though polemical contribution to the understanding of the Great Awakening. (America, England, and the wider world): Overviews are found in Francis Parkman, A Half-Century of Conflict, 2 vol. (1892, reprinted 1965); Howard H. Peckham, The Colonial Wars, 1689–1762 (1964); and Alan Rogers, Empire and Liberty: American Resistance to British Authority, 1755–1763 (1974). Richard R. Beeman

The American Revolution
Richard L. Blanco (ed.), The American Revolution, 1775–1783: An Encyclopedia, 2 vol. (1993), is a valuable reference source. Edward Countryman, The American Revolution (1985), considers American social history in the explanation of how American resistance developed. P.G.D. Thomas, British Politics and the Stamp Act Crisis (1975), is a scholarly account of British objectives and methods, and The Townshend Duties Crisis (1987) is the most comprehensive account of this episode. Jerrilyn Greene Marston, King and Congress (1987), studies how Congress acquired formal “legitimacy” in the course of rebellion. Morton White, The Philosophy of the American Revolution (1978), analyzes the concepts that took shape in the Declaration of Independence. Jack N. Rakove, The Beginnings of National Politics (1979), interprets the complex politics of the Continental Congress. Willard M. Wallace; J.R. Pole

The early federal republic
Peter S. Onuf, The Origins of the Federal Republic (1983), stresses the jurisdictional problems of relations among states and between states and the Confederation. Gordon S. Wood, The Creation of the American Republic, 1776–1787 (1969), provides a comprehensive “ideological” interpretation emphasizing the transformation of political thought into action. David F. Epstein, The Political Theory of The Federalist (1984); and the lengthy introduction to Cecelia M. Kenyon, The Antifederalists (1966, reprinted 1985), are excellent studies. Jackson Turner Main, The Antifederalists: Critics of the Constitution, 1781–1788 (1961, reprinted 1974), analyzes the social origins and aspirations of the Anti-Federalists. Joyce Appleby, Capitalism and a New Social Order (1984), argues that capitalism was seen as a liberating force by Jeffersonians as well as by Hamiltonians. Other studies of the period include Gerald Stourzh, Alexander Hamilton and the Idea of Republican Government (1970); James M. Banner, Jr., To the Hartford Convention: The Federalists and the Origins of Party Politics in Massachusetts, 1789–1815 (1970); John Zvesper, Political Philosophy and Rhetoric (1977); Richard Hofstadter, The Idea of a Party System (1969); and Noble E. Cunningham, The Jeffersonian Republicans (1957), The Process of Government Under Jefferson (1978), and The Jeffersonian Republicans in Power (1963). J.R. Pole

From 1816 to 1850
(The Era of Mixed Feelings): A comprehensive overview of the politics of this period is George Dangerfield, The Era of Good Feelings (1952, reprinted 1973). Shaw Livermore, Jr., The Twilight of Federalism: The Disintegration of the Federalist Party, 1815–1830 (1962, reissued 1972), is an excellent analysis. Glover Moore, The Missouri Controversy, 1819–1821 (1953, reissued 1967), skillfully untangles that complex problem. (Economic development): Still valuable and informative are Bray Hammond, Banks and Politics in America, from the Revolution to the Civil War (1957, reissued 1967); Edward Pessen, Most Uncommon Jacksonians: The Radical Leaders of the Early Labor Movement (1967, reprinted 1970); and Walter Buckingham Smith, Economic Aspects of the Second Bank of the United States (1953, reissued 1969). (Blacks, slave and free): Particularly noteworthy studies are Eugene D. Genovese, Roll, Jordan, Roll: The World the Slaves Made (1974); Herbert G. Gutman, The Black Family in Slavery and Freedom, 1750–1925 (1976); Leon F. Litwack, North of Slavery: The Negro in the Free States, 1790–1860 (1961, reprinted 1970); and Ira Berlin, Slaves Without Masters: The Free Negro in the Antebellum South (1974, reissued 1981). (Social and intellectual developments): Lightly documented but brilliantly insightful is Alexis de Tocqueville, Democracy in America, 2 vol. (1835–40; originally published in French, 1835–40), available in many later editions. Edward Pessen, Riches, Class, and Power Before the Civil War (1973), challenges Tocqueville's version of equality in Jacksonian America. Other useful treatments are William H. Pease and Jane H. Pease, The Web of Progress: Private Values and Public Styles in Boston and Charleston, 1828–1843 (1985); Barbara Welter, Dimity Convictions: The American Woman in the Nineteenth Century (1976); Rush Welter, The Mind of America, 1820–1860 (1975); Martin Duberman (ed.), The Antislavery Vanguard (1965); and David Brion Davis (comp.), Ante-Bellum Reform (1967). (Jacksonian politics): Arthur M. Schlesinger, Jr., The Age of Jackson (1945, reissued 1953), is an influential study that stimulated a great array of refutations of its pro-Jackson interpretation, including Edward Pessen, Jacksonian America, new ed. (1978, reprinted 1985). A stimulating if not always convincing comparison of Jacksonian and earlier America is Robert H. Wiebe, The Opening of American Society: From the Adoption of the Constitution to the Eve of Disunion (1984). Richard P. McCormick, The Second American Party System (1966, reissued 1973), is an influential study. Michael Paul Rogin, Fathers and Children: Andrew Jackson and the Subjugation of the American Indian (1975), is brilliant, original, and controversial. John M. Belohlavek, Let the Eagle Soar!: The Foreign Policy of Andrew Jackson (1985), fills a void in the Jacksonian literature. (Expansionism): Bernard De Voto, The Year of Decision, 1846 (1942, reissued 1989); and K. Jack Bauer, The Mexican War, 1846–1848 (1974), are scholarly treatments. Edward Pessen

The Civil War
Syntheses of modern scholarship are James M. McPherson, Ordeal by Fire (1982); and J.G. Randall and David Donald, The Civil War and Reconstruction, 2nd ed. rev. (1969). Allan Nevins, Ordeal of the Union, 8 vol. (1947–71), provides a comprehensive history. Clement Eaton, A History of the Old South, 3rd ed. (1975, reissued 1988), is a general history of the region. Full, critical assessments of slavery are provided by Kenneth M. Stampp, The Peculiar Institution (1956, reprinted 1978); and the study on slavery by Genovese, cited in the section covering 1816 to 1850. A perceptive account of the political conflicts of the late 1850s is Roy F. Nichols, The Disruption of American Democracy (1948, reissued 1967); while Don E. Fehrenbacher, The Dred Scott Case (1978), offers an analysis of the constitutional issues. Jean H. Baker, Affairs of Party (1983), discusses the strong partisan attachments of ordinary citizens. James M. McPherson, Battle Cry of Freedom (1988), is an engrossing narrative history of the Civil War. Comprehensive coverage of the Confederate military effort in the East is Douglas Southall Freeman, Lee's Lieutenants, a Study in Command, 3 vol. (1942–44, reissued 1970–72); while Warren W. Hassler, Jr., Commanders of the Army of the Potomac (1962, reprinted 1979), does the same for the Federals. Studies of the war in the Mississippi valley include Thomas L. Connelly, Army of the Heartland: The Army of Tennessee, 1861–1862 (1967), and Autumn of Glory: The Army of Tennessee, 1862–1865 (1971). An examination of the Gettysburg battle is Edwin B. Coddington, The Gettysburg Campaign: A Study in Command (1968, reissued 1984). Virgil Carrington Jones, The Civil War at Sea, 3 vol. (1960–62), describes the naval war. David Herbert Donald; Warren W. Hassler, Jr.

Excellent syntheses of scholarship on the Reconstruction period are Rembert W. Patrick, The Reconstruction of the Nation (1967); John Hope Franklin, Reconstruction (1961); and Kenneth M. Stampp, The Era of Reconstruction, 1865–1877 (1965, reprinted 1975). The fullest accounts of blacks' experience in the postwar years are Leon F. Litwack, Been in the Storm So Long: The Aftermath of Slavery (1979); and Eric Foner, Reconstruction: America's Unfinished Revolution, 1863–1877 (1988). C. Vann Woodward, Reunion and Reaction (1951, reissued 1966), covers behind-the-scenes political and economic negotiations in the disputed 1876–77 election. A definitive account of the South in the post-Reconstruction era is C. Vann Woodward, Origins of the New South, 1877–1913 (1951, reissued 1971). Important studies of postwar race relations include C. Vann Woodward, The Strange Career of Jim Crow, 3rd rev. ed. (1974, reissued 1982); and Joel Williamson, The Crucible of Race (1984). David Herbert Donald

The transformation of American society, 1865–1900
(National expansion): A comprehensive study of the American “frontiers” of the period is Harold E. Briggs, Frontiers of the Northwest: A History of the Upper Missouri Valley (1940, reissued 1950). Walter Prescott Webb, The Great Plains (1931, reprinted 1981), is a scholarly classic; see also Ray Allen Billington and Martin Ridge, Westward Expansion, 5th ed. (1982); and Rodman W. Paul, The Far West and the Great Plains in Transition, 1859–1900 (1988). Henry E. Fritz, The Movement for Indian Assimilation, 1860–1890 (1963, reprinted 1981), traces the development of this policy after the Civil War. Studies of the occupation of the Plains by the farmers are Fred A. Shannon, The Farmer's Last Frontier: Agriculture, 1860–1897 (1945, reprinted 1977); and Gilbert C. Fite, The Farmers' Frontier, 1865–1900 (1966, reissued 1987). (Industrial development): Edward C. Kirkland, Industry Comes of Age (1961), recounts development from the Civil War to 1897. Samuel P. Hays, The Response to Industrialism, 1885–1914 (1957), offers a perceptive appraisal of the impact of industry on American life. Discussion of the trade unions during the second half of the 19th century is Norman J. Ware, The Labor Movement in the United States, 1860–1895 (1929, reprinted 1964). (Politics): Sean Dennis Cashman, America in the Gilded Age: From the Death of Lincoln to the Rise of Theodore Roosevelt, 2nd ed. (1988), provides an overview of the era. Leonard D. White, The Republican Era, 1869–1901 (1958, reissued 1965), presents a careful and useful analysis. H. Wayne Morgan, From Hayes to McKinley: National Party Politics, 1877–1896 (1969); and Harold U. Faulkner, Politics, Reform, and Expansion, 1890–1900 (1959, reissued 1963), are also valuable. Studies of populism include John D. Hicks, The Populist Revolt (1931, reprinted 1981); and Lawrence Goodwyn, Democratic Promise: The Populist Moment in America (1976). Harold Whitman Bradley; Ed.

Imperialism, progressivism, and America's rise to power in the world, 1896–1920
(American imperialism): Varying interpretations of imperialism are presented by Ernest R. May, Imperial Democracy (1961, reissued 1973); Walter LaFeber, The New Empire: An Interpretation of American Expansion, 1860–1898 (1963); and Richard E. Welch, Jr., Response to Imperialism: The United States and the Philippine-American War, 1899–1902 (1979). David F. Trask, The War with Spain (1981), is an account of the Spanish-American War. Julius W. Pratt, America's Colonial Experiment (1950, reissued 1964), discusses the administration of the American overseas empire. A. Whitney Griswold, The Far Eastern Policy of the United States (1938, reissued 1966), remains the standard work; but, for the Open Door policy and relations with China, see also Tyler Dennett, John Hay: From Poetry to Politics (1933, reissued 1963). The U.S. penetration and domination of the Caribbean is most authoritatively recounted in Dana G. Munro, Intervention and Dollar Diplomacy in the Caribbean, 1900–1921 (1964, reprinted 1980). (The Progressive era): Introductions to the United States during the Progressive era are John Whiteclay Chambers II, The Tyranny of Change (1980); and Arthur S. Link and Richard L. McCormick, Progressivism (1983). (The rise to world power): An overview of the period is John M. Dobson, America's Ascent: The United States Becomes a Great Power, 1880–1914 (1978). Surveys of American national politics from Roosevelt through Wilson are George E. Mowry, The Era of Theodore Roosevelt, 1900–1912 (1958, reprinted 1962); Arthur S. Link, Woodrow Wilson and the Progressive Era, 1910–1917 (1954, reprinted 1963); and Robert H. Ferrell, Woodrow Wilson and World War I, 1917–1921 (1985). On the neutrality issue, see Ernest R. May, The World War and American Isolation, 1914–1917 (1959); and Arthur S. Link, Wilson, 5 vol. (1947–65), especially the last three volumes. American mobilization is well covered by Daniel R. Beaver, Newton D. Baker and the American War Effort, 1917–1919 (1966); and Neil A. Wynn, From Progressivism to Prosperity: World War I and American Society (1986). Arno J. Mayer, Political Origins of the New Diplomacy, 1917–1918 (1959, reissued 1970), and a sequel, Politics and Diplomacy of Peacemaking: Containment and Counterrevolution at Versailles, 1918–1919 (1967), include a brilliant account of the development of Wilson's peace program in its worldwide context. A study on Wilson and American diplomacy at the Paris peace conference is Arthur Walworth, Wilson and His Peacemakers (1986). For an account of the fight over the treaty in the United States, see William C. Widenor, Henry Cabot Lodge and the Search for an American Foreign Policy (1980). Wesley M. Bagby, The Road to Normalcy: The Presidential Campaign and Election of 1920 (1962), is an excellent study. Arthur S. Link

From 1920 to 1945
Geoffrey Perrett, America in the Twenties (1982), gives extensive overviews of political, social, and cultural aspects of this period. A scholarly history is William E. Leuchtenburg, The Perils of Prosperity, 1914–32 (1958). Norman H. Clark, Deliver Us from Evil (1976), provides a challenging revisionist history of Prohibition. Frederick Lewis Allen, Only Yesterday (1931, reprinted 1986), is a contemporaneous account, covering all aspects of the years 1919–31; its companion volume is Since Yesterday (1940, reprinted 1986), on the 1930s. The standard account of politics in the 1930s is William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal, 1932–1940 (1963). J.C. Furnas, Stormy Weather: Crosslights on the Nineteen Thirties (1977), is a complete survey. Irving Bernstein, Turbulent Years: A History of the American Worker, 1933–1941 (1969), is authoritative. Geoffrey Perrett, Days of Sadness, Years of Triumph (1973, reprinted 1985), comprehensively covers the war years 1939–45. John Morton Blum, V Was for Victory: Politics and American Culture During World War II (1976), offers a critique of the war period. Military history is provided by Kenneth S. Davis, Experience of War: The United States in World War II (1965; also published as The American Experience of War, 1939–1945, 1967). A comprehensive study is I.C.B. Dear and M.R.D. Foot (eds.), The Oxford Companion to World War II (also published as The Oxford Companion to the Second World War, 1995). Civil and military history is discussed in William L. O'Neill, A Democracy at War: America's Fight at Home and Abroad in World War II (1993, reissued 1995).

From 1945 to the present
A general discussion of U.S. history since 1945 is Michael Schaller, Virginia Scharff, and Robert D. Schulzinger, Present Tense: The United States Since 1945, 2nd ed. (1996). A critical perspective is Melvyn Dubofsky and Athan Theoharis, Imperial Democracy: The United States Since 1945, 2nd ed. (1988). An overview of the early postwar years is John Patrick Diggins, The Proud Decades: America in War and in Peace, 1941–1960 (1988). James Gilbert, Another Chance: Postwar America, 1945–1985, 2nd ed. edited by R. Jackson Wilson (1986), is a useful survey. Coverage of the Cold War is provided by Ralph B. Levering, The Cold War, 1945–1987, 2nd ed. (1988); and John Lewis Gaddis, Strategies of Containment (1982), a brilliant analysis of U.S. Cold War policies. Burton I. Kaufman, The Korean War (1986), is a reliable overview. One of the most useful histories of the Civil Rights Movement is Taylor Branch, Parting the Waters: America in the King Years, 1954–1963 (1988). George C. Herring, America's Longest War: The United States and Vietnam, 1950–1975, 2nd ed. (1986), is solid. William L. O'Neill, Coming Apart: An Informal History of America in the 1960's (1971), is a study of the quality of American life under the impact of changing social values. Frederick F. Siegel, Troubled Journey: From Pearl Harbor to Ronald Reagan (1984), analyzes the relationship between American social and cultural life and government policy. Lyndon Johnson is the subject of Robert Dallek, Lyndon Johnson and His Times, 2 vol. (1991–98). An examination of American Cold War foreign policy is John Lewis Gaddis, The Long Peace: Inquiries into the History of the Cold War (1988, reprinted 1989). William L. O'Neill

* * *


Universalium. 2010.
