Earth Sciences
▪ 2009


Geology and Geochemistry
      The theme of the 33rd International Geological Congress, which was held in Norway in August 2008, was “Earth System Science: Foundation for Sustainable Development.” It was attended by nearly 6,000 scientists from 113 countries. In addition to the standard symposia, there were seven sessions—on such topics as geohazards, resources (water, minerals, and energy), and climate change—that highlighted the relevance of geology to society. The OneGeology global project was officially launched during the meeting. The project was a breakthrough in international scientific cooperation, with more than 90 countries participating to create a global database of geologic map data that could be accessed on the World Wide Web.

      Not only was human society dependent upon geology, but humans had become a significant force in geologic processes. Members of the stratigraphy commission of the Geological Society of London published a paper that explored the idea that the Earth had entered a new geologic epoch—the Anthropocene—characterized by a global environment dominated by human activity. With the beginning of the Industrial Revolution, as global population exploded, agricultural and industrial activities began to leave distinctive stratigraphic signatures that included novel sedimentary, geochemical, biotic, and climatic changes. One such change was the dramatic increase in erosion and denudation of the continents that by the 21st century had exceeded the natural production of sediments by an order of magnitude. Population growth with industrialization had disrupted the biogeochemical carbon cycle by leading to the burning, within a few hundred years, of fossil carbon fuels that had accumulated within rocks through hundreds of millions of years. The resultant carbon emissions were causing significant changes in global temperature, ocean acidity, and the geochemistry of the biosphere.

      Going back in geologic time, Jonathan O'Neil of McGill University, Montreal, and coauthors published a geochemical study of geologically complex rocks from a portion of bedrock in northern Quebec. Their analysis of the rocks' content of neodymium and samarium, two rare-earth elements, indicated that the rocks were 4.28 billion years old and suggested that they might represent the oldest preserved crustal rocks on Earth. The oldest rocks previously known were about 4.03 billion years old. (Zircon crystals as old as 4.36 billion years had also been identified, but only as tiny mineral grains embedded in younger rock.)

      Despite society's influence (and dependence) on geology and geochemistry, humans remained vulnerable to the power of geologic processes, as demonstrated by the earthquake of moment magnitude 7.9 that devastated Sichuan province, China, on May 12, 2008. (See Geophysics.) This earthquake had not been expected on the basis of standard geophysical criteria. In a report published in July, Eric Kirby of Pennsylvania State University and coauthors described how their geomorphic analysis of the mountains along the eastern margin of the Tibetan Plateau had identified locations of active rock uplift indicating seismic risk. Tectonic signatures of active displacement included dramatic changes in the steepness of river profiles along the rugged margins of the mountain ranges that coincided precisely with faults. They concluded that the Sichuan earthquake provided compelling evidence that the landscape contained much information about rates of tectonic activity. Quantitative geologic analyses of similar information in other locations could become a useful tool for refining estimates of earthquake risk in regions where the hazard was not evident from satellite measurements of displacement rates.

      Rebecca Flowers and colleagues at the California Institute of Technology changed the widely accepted interpretation for the uplift history of the Colorado Plateau and its incision by the Colorado River to form the Grand Canyon. The accepted interpretation had been that the plateau began to rise to its present elevation of about 2,100 m (7,000 ft) some 6 million years ago, with the river cutting downward as the land rose. The new results demonstrated that the uplift process began more than 55 million years ago. Between about 550 million and 250 million years ago, the layers of sediments forming the Colorado Plateau accumulated beneath a sea. The sediments increased in temperature as they became deeply buried but then cooled as they later were uplifted slowly while erosion stripped away the overlying rocks. The researchers used a new geochemical technique to analyze and date the mineral apatite that existed in trace amounts within the sediments. The helium-uranium-thorium dating procedure determined when the apatite crystal in the heated rock cooled to about 70 °C (160 °F). The crystal typically reached that temperature when the buried rock had risen to about 1.6 km (1 mi) beneath the eroded surface. By dating the apatite minerals from within canyons and across the plateau surface, the researchers were able to correlate through time the elevations of sediments in different locations. For example, they found that sediments at the bottom of an eastern part of the canyon had the same apatite-derived age—55 million years ago—as the sediments on the plateau above. This demonstrated that a canyon had already been carved through a plateau that existed at that time. The study revealed many historical complexities, including the unexpected result that while the canyon was being cut deeper (through about 1,500 m [5,000 ft] of rock), the adjacent plateau sediments were also being eroded away.
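
The depth figure in the dating procedure can be rationalized with back-of-envelope arithmetic. A minimal sketch, assuming a mean surface temperature and a typical continental geothermal gradient (neither value is given in the study):

```python
# Illustrative sketch: why apatite helium ages record the time a rock
# passed within roughly 1.6 km of the surface. The surface temperature
# and geothermal gradient below are assumptions, not values from the study.
SURFACE_TEMP_C = 20.0        # assumed mean surface temperature, deg C
GEOTHERMAL_GRADIENT = 30.0   # assumed gradient, deg C per km of depth
CLOSURE_TEMP_C = 70.0        # approximate temperature at which apatite
                             # begins to retain radiogenic helium

# Depth at which rising, cooling rock crosses the helium closure temperature
closure_depth_km = (CLOSURE_TEMP_C - SURFACE_TEMP_C) / GEOTHERMAL_GRADIENT
print(f"apatite He closure depth ~ {closure_depth_km:.1f} km")  # ~1.7 km
```

With these round numbers the closure depth comes out close to the 1.6 km cited in the study, which is why the apatite ages date the time at which each sample rose to within about a mile of the eroding surface.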

      Geologic and geochemical studies of sediments could yield many historical records, including temperature change through time. Jean-Noel Proust and other members of a France–New Zealand research program presented some initial results from 31 sediment cores recovered from the Tasman Sea near New Zealand. The objective was “to disentangle the impact of tectonics and climate on the landscape evolution of New Zealand over the past million years…relating to events such as earthquakes, tsunamis, and cyclones.” New Zealand is associated with active tectonic plate boundaries, mountain building, and earthquakes. It occupies a unique position in the system of global ocean currents and in the westerly atmospheric wind belt. During the past one million years, it experienced drastic glacial-interglacial climatic changes. Large amounts of sediment were deposited into the adjacent seas because of this confluence of tectonic and climatic conditions, and these sediments reflected the conditions of erosion, transportation, and submarine deposition. The high sedimentation rates permitted high-resolution chronological studies in steps as small as 100 years, and preliminary results confirmed complex interactions between tectonics and climate.

      Achim Brauer of the German Research Centre for Geosciences and coauthors provided a precise date for a sudden episode of cooling, called the Younger Dryas, that occurred about 12,700 years ago. From their analyses of annually laminated sediments from the bottom of a deep volcanic lake in Germany, they defined and dated (using carbon isotopes) thin layers that spanned a 230-year period around the start of the episode. Their microscopic and geochemical studies of minerals and fossils, carried out to a resolution of 50 microns, permitted interpretation of lake level and wind speed from year to year and even between seasons. Following a series of annual and decadal oscillations, a final abrupt increase in winter storminess occurred 12,679 years before the present. The researchers suggested that the event marked a shift in the North Atlantic westerly winds, which caused the climate to topple within one year into a completely different mode—one of extreme cooling.

      Mark Schaefer and six other former officials from a variety of U.S. federal agencies proposed that the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) be merged to form a new Earth Systems Science Agency (ESSA). The sciences of geology and geochemistry extended from solid rock through the hydrosphere and into the atmosphere and thereby overlapped the domains of the USGS and NOAA. Under the proposal, ESSA would build a strong collaboration with the Earth-science programs of NASA, especially its space-based Earth Observing System. The authors made the case that this reorganization would be more efficient and effective in meeting future threats to the economic security of the United States and other countries—threats represented by risks concerning geologic resources (such as minerals, fossil fuels, and water supply) and the environment (such as natural disasters and climate change).

Peter J. Wyllie

Geophysics
      A devastating earthquake occurred on May 12, 2008, near the town of Wenchuan in Sichuan province, China. The earthquake, which had a moment magnitude of 7.9, involved a 280-km (174-mi) rupture along the Longmenshan fault and a relative motion of as much as 10 m (33 ft) between the two sides of the fault. More than 87,000 people were killed and 300,000 were injured, with about 5,000,000 left homeless. Shaking from the earthquake triggered many landslides in the mountainous area, and 34 temporary lakes were created by debris that clogged rivers and streams. The economic loss associated with the earthquake was estimated at $86 billion. Although the Wenchuan earthquake occurred in the interior of the Eurasian tectonic plate, it was directly related to the ongoing collision between the Indian and Eurasian tectonic plates. The northward motion of India had strongly deformed Eurasia and created the Himalayan mountains and Tibetan Plateau. This region, however, had reached its upper topographic limit in terms of gravitational stability, and the continuing northward motion of the Indian plate was being accommodated by the east-west extension and extrusion of the Eurasian lithosphere. This process, known as escape tectonics, had caused the compression that led to the Wenchuan earthquake.
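
The reported magnitude is consistent with the rupture dimensions under standard seismological assumptions. A hedged sketch of the seismic-moment arithmetic, in which the rupture width, average slip, and rock rigidity are illustrative values not given in the text (only the 280-km length and 10-m peak slip are from the report):

```python
import math

# Sketch: relating fault dimensions and slip to moment magnitude via the
# standard definitions M0 = rigidity * area * average slip and
# Mw = (2/3) * (log10(M0) - 9.1), with M0 in newton-metres.
RIGIDITY = 3.2e10   # shear modulus of crustal rock, Pa (assumed)
LENGTH_M = 280e3    # rupture length along the Longmenshan fault (reported)
WIDTH_M = 20e3      # down-dip rupture width, m (assumed)
AVG_SLIP_M = 5.0    # average slip, m (assumed; the peak was about 10 m)

seismic_moment = RIGIDITY * LENGTH_M * WIDTH_M * AVG_SLIP_M  # N*m
moment_magnitude = (2.0 / 3.0) * (math.log10(seismic_moment) - 9.1)
print(f"Mw ~ {moment_magnitude:.1f}")  # ~7.9, matching the reported value
```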

      In February 2008 seismologists from the Japan Meteorological Agency reported on the initial results of an earthquake early warning (EEW) system that had become fully operational in October 2007 after several years of preliminary work. The system was designed to locate and estimate the size of a local earthquake very quickly. Although damaging seismic waves from an earthquake travel at a speed of several kilometres per second, alerts sent immediately by electronic communication (such as radio or television) to neighbouring regions that were expected to have strong shaking could provide a warning up to tens of seconds in advance of the seismic waves. This short warning time could greatly reduce damage and injuries associated with an earthquake. For example, it was enough time for people to take shelter under a desk away from windows, for elevators to stop at the nearest floor and open their doors, or for doctors to halt surgical procedures. During a two-year trial run, the EEW system issued 855 alerts, of which only 26 were false alarms. The Japanese EEW system relied on data taken by over 1,000 seismometers that were spaced at intervals of about 20 km (12 mi) and continuously recorded the movement of the ground across Japan.
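
The "tens of seconds" figure follows from simple arithmetic on wave speeds. In this sketch the wave speeds and the detection delay are round illustrative numbers, not parameters of the JMA system:

```python
# Warning time = (arrival time of strong shaking) - (time to issue the alert).
# Alerts travel electronically at effectively infinite speed compared with
# seismic waves, so distant sites gain the most warning.
P_WAVE_SPEED = 7.0     # km/s, fast first-arriving wave used for detection
S_WAVE_SPEED = 4.0     # km/s, slower wave carrying most of the strong shaking
DETECTION_DELAY = 5.0  # s, assumed time to detect, locate, and broadcast

def warning_time(distance_km):
    """Seconds of warning a site at distance_km gets before strong shaking."""
    s_arrival = distance_km / S_WAVE_SPEED
    return s_arrival - DETECTION_DELAY

for d in (40, 100, 200):
    print(f"{d:3d} km from the epicentre: ~{warning_time(d):3.0f} s of warning")
```

Sites very close to the epicentre may receive no useful warning at all, which is an inherent limitation of any EEW system.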

      Measuring the state of stress in the Earth's crust is an important goal of geophysicists, primarily because earthquakes occur when the stress along a fault zone crosses some critical threshold. Traditionally, instruments called strainmeters have been used to measure the deformation near the Earth's surface and to infer details about the stress regime. Fenglin Niu of Rice University, Houston, and colleagues announced the development of a new, indirect type of strainmeter that was potentially more precise than previous instruments. Using two holes that had been drilled into the San Andreas Fault Zone to depths of about 1 km (about 0.62 mi), the researchers placed a seismometer in one hole and a piezoelectric sound emitter in the other. Over the course of two months, the seismologists repeatedly measured the time it took for the seismic waves produced by the emitter to travel to the seismometer with a precision of about one ten-millionth of a second. The travel time was not constant but varied according to changing geologic conditions. The variation was directly related to the opening and closing of minuscule cracks (called microcracks) in the rock between the two holes, which in turn was related to changes in the ambient stress level in the rock. The scientists found that most of the variation was caused by daily temperature changes, but two large excursions from the normal measurements occurred at the times of two small nearby earthquakes. Remarkably, the stress anomalies began hours before the earthquakes took place. If these results could be verified and expanded to other regions where earthquakes occur, seismologists would possess a powerful new tool for forecasting earthquake hazard.
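
To see why such timing precision matters, consider a rough sensitivity estimate. The inter-borehole path length and wave speed below are assumptions for illustration; only the timing precision comes from the report:

```python
# A fractional change dv/v in seismic wave speed shifts the travel time t
# by dt = -t * (dv/v), so the smallest resolvable velocity change is set by
# the ratio of the timing precision to the travel time itself.
PATH_LENGTH_M = 100.0      # assumed distance between the two boreholes
WAVE_SPEED = 3000.0        # assumed seismic wave speed in rock, m/s
TIMING_PRECISION_S = 1e-7  # reported travel-time precision

travel_time = PATH_LENGTH_M / WAVE_SPEED            # ~0.03 s per transit
min_dv_over_v = TIMING_PRECISION_S / travel_time    # smallest detectable dv/v
print(f"resolvable dv/v ~ {min_dv_over_v:.1e}")     # a few parts per million
```

Velocity changes of a few parts per million correspond to the opening or closing of only a tiny fraction of the microcracks along the path, which is what makes the method sensitive to subtle stress changes.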

      It was well known that small earthquakes occur in association with the flow of magma in volcanic areas. Although the precise mechanism by which such quakes are produced was controversial, it had generally been assumed that they occur in the rock that surrounds the underground conduits of magma. In two papers on the phenomenon published in May, Yan Lavallée of Ludwig-Maximilians University (Munich), Hugh Tuffen of Lancaster (Eng.) University, and their colleagues presented some surprising results. The two research groups found that when silicic magmas were heated and deformed under real-world conditions, the magmas produced acoustic emissions. In other words, the fluid magma deformed in a brittle manner, similar to the way in which normal rock fails during a tectonic earthquake. The magma behaved in this way because it had high viscosity, and the rapid changes in strain expected to occur in volcanic systems caused it to act as a solid. The pattern of acoustic emissions, also known as microseismicity, changed markedly as strain rate was increased, so these results might help volcanologists better understand eruptive processes. In particular, the results might change how the material failure forecast method was applied to dome-building eruptions.

      Understanding the origin of the Earth's magnetic field continued to be one of the most difficult problems in geophysics. Because of the great complexity of the geomagnetic dynamo (the magnetohydrodynamic system that generates the Earth's magnetic field), computer simulations of the process had to use stringent approximations of some of the governing parameters. A breakthrough in this area was reported in August by Akira Kageyama and co-workers at the Japan Agency for Marine-Earth Science and Technology (Yokohama). They used a supercomputer known as the Earth Simulator to model the geomagnetic dynamo for a period of 2,000 simulated years. The calculation used 4,096 microprocessors and took several months to run. By using such tremendous computing power, the researchers achieved the most realistic simulation of the geomagnetic dynamo to date. Interestingly, they found that the shape of the flow of molten material in the Earth's liquid outer core took the form of elongated sheets that emanated outward from the Earth's rotation axis. This structure was very different from the classical model of columnar flow parallel to the rotation axis. Nevertheless, the sheetlike flow was able to generate a magnetic field.

Keith D. Koper

Meteorology and Climate
      In 2008, the year after the United Nations panel of experts on global change completed its Fourth Assessment Report, the U.S. government issued a report—“Weather and Climate Extremes in a Changing Climate”—that focused on climate change in North America. The report, which was released by the U.S. Climate Change Science Program and the Subcommittee on Global Change Research, provided the first comprehensive study of observed and projected changes in North American weather and climate extremes. Citing human activity as the primary cause of global warming over the past 50 years, the U.S. study indicated that weather and climate extremes were likely to become more commonplace as human-induced increases in the concentration of carbon dioxide and other greenhouse gases in the atmosphere continued. More specifically, the report stated that it was “very likely” that in the 21st century most areas of North America would see more frequent hot days and nights and heat waves and that many areas would see more frequent and intense heavy downpours. Although there had been no overall average change in the area affected by drought in North America during the past 50 years, in the southwestern United States and parts of Mexico and the Caribbean, the area affected by drought was likely to increase. Regarding the issue of hurricane intensity, the report indicated that more intense hurricanes were likely but that the linkage of human activity to observed changes in hurricanes required further study in order to make a confident assessment.

      Another report from the Climate Change Science Program examined the impacts of climate change on agriculture and land resources in the U.S. The growing season—the period between the last spring freeze and first autumn freeze—had increased by 10 to 14 days over the previous 19 years across temperate latitudes. The study also found that elevated CO2 concentrations would spur the growth of weeds and that young forests on fertile soils would achieve higher productivity. Rising temperatures would also increase the risk of crop failures, particularly if precipitation decreased or became more variable.

      The debate on the link between global warming and tropical-cyclone intensity and frequency was accentuated by the billions of dollars of damage in the United States and the hundreds of deaths in the Caribbean that hurricanes caused in 2008. A report published in Natural Hazards Review by Roger Pielke, Jr., of the Center for Science and Technology Policy Research (Boulder, Colo.) and colleagues found that hurricane damage in the United States had been increasing because of growing population, infrastructure, and wealth along coastlines and not as the result of any spike in the number or intensity of hurricanes. The study showed that damage caused by hurricanes in the U.S. had been doubling every 10 to 15 years and that future economic losses might be far greater than previously thought if people continued to move to coastal areas. Research by Chunzai Wang and Sang-Ki Lee of the National Oceanic and Atmospheric Administration (NOAA) challenged the idea that warming oceans might lead to more tropical cyclones. They showed that in the primary region in the Atlantic where tropical cyclones develop, the warming of the oceans was associated with a long-term increase of vertical wind shear (changes in wind speed or direction with altitude). Wind shear is the enemy of cyclone development, since it suppresses the concentration of the heat energy that fuels the storms, and the increased shear coincided with a decrease in the number of hurricanes that made landfall in the United States. Another study, however, indicated that the strongest tropical cyclones were increasing in intensity. James Elsner of Florida State University and colleagues used wind-speed data derived from an archive of satellite records to examine global trends in the intensity of tropical cyclones for the years 1981–2006. All areas where hurricanes develop, with the exception of the South Pacific Ocean, showed increases in the highest maximum wind speeds attained by the strongest storms. 
The greatest increases occurred for storms over the North Atlantic and northern Indian Ocean.

      During the 2008 hurricane season, NOAA scientists made abundant use of a variety of observing technologies to collect data that might be of use in predicting the intensity of tropical cyclones. As part of the NOAA Intensity Forecast Experiment, three aircraft flew a total of 65 missions and logged 605 hours to gather data in a number of such storms, including Hurricanes Dolly, Gustav, and Ike. The aircraft deployed a total of 453 airborne expendable bathythermographs to obtain ocean temperatures from the surface down to a depth of 200 m (656 ft), and the data were used to initialize and verify ocean models for studying hurricane development. During Hurricanes Gustav and Ike, aircraft transmitted three-dimensional analyses of Doppler-radar data for use by forecasters.

      The El Niño/Southern Oscillation (ENSO) phenomenon, which is associated with the warming and cooling of the equatorial Pacific Ocean, plays a major role in climate variability. The ENSO influences temperature patterns and the occurrence of drought and floods in many parts of the world, but changes in the ENSO over very long time scales were not well understood. Geli Wang of the Chinese Academy of Sciences (Beijing) and Anastasios Tsonis of the University of Wisconsin at Milwaukee studied the record from a sediment core retrieved from Laguna Pallcacocha, a lake in southern Ecuador. From variations in the sedimentation, the scientists were able to create a time series of El Niño and La Niña events that spanned the past 11,000 years. They found that El Niño events had been more frequent and stronger during the past 5,000 years than in the previous 6,000 years, when La Niña was dominant, and they suggested that these long-lasting extremes may have had serious consequences for many cultures in the past. For example, drought associated with persisting El Niño events 3,500–3,000 years ago may have contributed to the demise of the Minoan civilization on the island of Crete.

Douglas Le Comte

▪ 2008

Diamond inclusions in ancient terrestrial rock provided clues about the early history of Earth's crust. Scientists studied slow earthquakes and the crystalline structure of Earth's inner core. International scientific studies documented global warming, and new research anticipated climate-change effects in specific areas of the U.S.

Geology and Geochemistry
      The oldest diamonds known in terrestrial rock were described in 2007 by Martina Menneken of the Institute for Mineralogy, Münster, Ger., and colleagues. The diamonds appeared as tiny inclusions in zircon crystals extracted from ancient metamorphosed sediments from Jack Hills in Western Australia. The scientists studied 1,000 zircon grains and found diamonds in 45 of them. Isotope dating of these zircons had previously established their extreme age, with some being as old as 4.25 billion years. Although geologic processes had destroyed all rocks from so long ago, the resistant zircons passed from one rock cycle to the next. The trapped inclusions were therefore the only known source of physical information about conditions on early Earth. The prevailing view had been that these early conditions were dominated by hot basaltic lavas. The geochemistry of a variety of silicate-mineral inclusions found in the zircon crystals had recently established that the inclusions had grown within water-bearing granitic magmas 4.25 billion–4 billion years ago and at temperatures as low as 680 °C (1,256 °F), which was a surprising indication that a continental crust was already present. An analysis of light scattered by the diamond inclusions (using Raman spectroscopy) revealed distinctive structural and chemical properties that were matched only by microdiamonds that had been found in zircon crystals of ultrahigh-pressure metamorphic rocks, and evidence had shown that these microdiamonds probably grew at depths of at least 100 km (62 mi). The authors' preferred interpretation of the origin of the Australian zircons with diamonds was that the zircons grew deep within a thickened continental lithosphere more than 4.25 billion years ago before they were caught up in the relatively cool granitic magmas in continental crust.

 Stromboli Island's volcano is spectacular for its explosive blasts of gas and lava, which typically occur every 10 to 20 minutes. The flying chunks of lava generally rise from shallow depth and fall within the crater, but occasional larger explosions from deeper sources threaten volcano visitors and nearby inhabitants. Mike Burton of the National Institute of Geophysics and Volcanology, Catania, Sicily, and colleagues published the results of gas measurements made in 2000–02 with geochemical remote sensing. They used an infrared spectrometer to take a series of continuous records from a distance of about 250 m (820 ft). The results demonstrated that gas composition and temperature changed abruptly during the explosive eruptions. In particular, the temperature and the ratios of carbon dioxide to water and to sulfur dioxide increased. The composition of the gas dissolved in the original magma was determined from analyses of glass inclusions that had been trapped in olivine at a depth of about 10 km (6.2 mi). Using the known solubility of gases as a function of pressure and temperature, a computer simulation of degassing during the rise of the magma helped explain the volcano's plumbing system. About 99% of the gases are released quietly as the magma rises to levels of reduced pressure. The other 1% coalesces into large bubbles, or slugs, that accumulate and intermittently clog the volcanic conduit until they rise rapidly and burst out explosively from about 250 m below the surface. The results showed that the gas slugs are generated at a considerable depth of about 3 km (1.8 mi) and are decoupled from the slower rise and degassing of the magma. Improved understanding of the mechanisms controlling strombolian explosive activity in Sicily and other areas was a high priority for civil defense.
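
The pressure dependence of gas solubility that underlies such degassing simulations can be sketched with a toy model. The square-root solubility law is a common approximation for water in silicate melt; the solubility constant, melt density, and comparison depths here are illustrative assumptions, not values from the study:

```python
import math

# Toy degassing sketch: dissolved water content falls roughly with the
# square root of pressure, so most of the gas exsolves during ascent.
SOLUBILITY_CONST = 0.2   # wt% H2O per sqrt(MPa), assumed
MELT_DENSITY = 2700.0    # kg/m3, assumed
G = 9.8                  # m/s2

def pressure_mpa(depth_m):
    """Lithostatic pressure at a given depth, in MPa."""
    return MELT_DENSITY * G * depth_m / 1e6

def dissolved_h2o(depth_m):
    """Approximate dissolved water content (wt%) at that pressure."""
    return SOLUBILITY_CONST * math.sqrt(pressure_mpa(depth_m))

# Compare water held in solution at 10 km (olivine-inclusion depth)
# with what the melt can still hold near the surface, at 250 m.
deep, shallow = dissolved_h2o(10e3), dissolved_h2o(250)
print(f"retained at 250 m: {shallow / deep:.0%} of the 10-km value")
```

Whatever constant is chosen, the square-root law implies that only a small fraction of the deep gas load can remain dissolved near the surface, consistent with the observation that the great majority of the gas escapes during ascent.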

 According to evidence published by Sanjeev Gupta of Imperial College, London, and colleagues, the folded chalk ridge that once formed a land bridge between England and France near the Dover Strait was disrupted twice, generating torrential floods that scoured the land surface that became the seafloor beneath the English Channel. High-resolution sonar mapping of the seafloor, supported by older charts from the U.K. Hydrographic Office, revealed an intricate array of features, including incised channels around elevated regions (former islands), scarps, cataracts, and hanging tributaries. Geomorphic interpretation of these features indicated the occurrence of two successive megafloods. During periods of maximum glaciation and lowest sea level, the floor of the English Channel was continuous low-lying land. The chalk ridge acted as a dam that retained a large glacial lake over part of what is now the North Sea, south of the edge of the Scandinavian ice sheet. The rising lake eventually broke over and through the chalk dam and rapidly drained in a catastrophic flood about 450,000 years ago. During a second glacial maximum about 250,000 years ago, a restabilized dam was ruptured, and the second megaflood incised its history across that of the first flood. The seafloor between England and continental Europe was alternately flooded and exposed as sea level rose and fell during the glacial cycles, and additional paleogeographic changes were associated with the huge glacial lakes, land bridges, and dam bursts. These changes greatly influenced subsequent plant, animal, and human migrations.

      Predictions about future climate change depend critically on knowledge about the timing of past climate changes. Progress in dating the fluctuations recorded in many geologic climatic proxies (indicators) was reviewed and evaluated at a workshop in 2007 on “Radiocarbon and Ice-Core Chronologies During Glacial and Deglacial Times.” Results were outlined by Bernd Kromer of Heidelberg (Ger.) Academy of Sciences and others. Accurate timescales with high resolution were essential for correlation of the various geochemical measurements made on cores from marine and lake sediments, ice, trees, caves, and corals. Radiocarbon dating furnished a common timescale for terrestrial and marine materials. Tree-ring analyses provided good calibration back to about 12,500 years ago, but the radiocarbon calibration curve that was extended back to about 26,000 years ago on the basis of studies of coral and foraminifera in marine sediment was less certain. There was no accepted older chronology because calibrations of the carbon- and oxygen-isotope records from ice cores (back to 650,000 years ago) and marine sediments (back to about 100 million years ago) had generated timescales with significant discrepancies. Important advances were presented on carbon-isotope measurements for tree-ring chronology. New carbon-isotope data from corals and uranium-thorium dating of stalagmites from Chinese caves provided the prospect of extending reliable radiocarbon dating beyond 26,000 years, well into the glacial period preceding the current interglacial period.
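
A minimal sketch of the conventional radiocarbon age computation that these calibration curves correct: the decay formula and the Libby mean life are standard, while the sample's carbon-14 fraction is an arbitrary example.

```python
import math

# Conventional radiocarbon ages use the Libby half-life of 5,568 years,
# i.e., a mean life of 5,568 / ln(2) ~ 8,033 years. Calibration against
# tree rings, corals, etc. then converts this to calendar years.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(fraction_modern):
    """Conventional 14C age (years BP) from the measured 14C content,
    expressed as a fraction of the modern atmospheric ratio."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining half of its original 14C dates to one Libby half-life.
print(f"{radiocarbon_age(0.5):.0f} conventional 14C years BP")  # 5568
```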

      Jean-Daniel Stanley of the Smithsonian Institution, Washington, D.C., and four coauthors reviewed the results of recent geologic, geochemical, and archaeological studies of seven sediment cores obtained from the east harbour of Alexandria, Egypt. Alexander the Great founded the city in 332 BC. A town had already existed at the site for at least seven centuries, but direct evidence of human activity had previously been limited to periods later than about 400 BC. The sediments were classified and dated by radiocarbon analyses. Potsherds and ceramic fragments in the sediments that were dated to between 940 and 420 BC were typical of the cooking vessels, bowls, and jars used in the southeastern Mediterranean during the 9th to 7th centuries BC. The contents of lead, heavy minerals, and organic material in the sediments began to increase at about 900 BC, which provided signals of early human-related activity. Lead concentrations increased from less than 10 parts per million (ppm) up to about 60 ppm by 330 BC and then exceeded 100 ppm during the swift expansion of Alexandria. Heavy-mineral and organic contents followed similar patterns, with abrupt increases through the three centuries after Alexander arrived. The heavy minerals were derived from imported construction rocks, and the organic material was derived from increased sewage runoff from the booming city.

Peter J. Wyllie

Geophysics
      On Aug. 15, 2007, an earthquake of moment magnitude 8.0 occurred off the coast of southern Peru, near the city of Pisco. The dimensions of the fault plane were about 100 km by 200 km (60 mi by 120 mi), and the relative movement between the two sides of the fault was 8 m (26 ft). Over 35,000 buildings were destroyed, and more than 500 persons were killed. Seismic waves from the earthquake were felt in all of the countries that border Peru, and a tsunami with wave heights generally ranging from 10 to 30 cm (4 to 12 in) was recorded throughout the Pacific basin. Large earthquakes are common in Peru because of its location next to a convergent plate boundary where the Nazca tectonic plate subducts (descends) eastward beneath the South American plate at an average rate of 7.7 cm (3 in) annually.
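
Taken together, the slip in the earthquake and the plate convergence rate suggest a crude recurrence estimate, assuming (simplistically, and not stated in the report) that the accumulated slip deficit is released entirely in similar earthquakes:

```python
# Back-of-envelope recurrence estimate for the Pisco segment: the time for
# steady plate convergence to re-accumulate the slip released in one event.
SLIP_PER_EVENT_M = 8.0        # fault slip in the 2007 earthquake (reported)
CONVERGENCE_M_PER_YR = 0.077  # Nazca-South America convergence, 7.7 cm/yr

recurrence_yr = SLIP_PER_EVENT_M / CONVERGENCE_M_PER_YR
print(f"~{recurrence_yr:.0f} years to re-accumulate 8 m of slip deficit")
```

Real subduction zones release strain in events of many sizes, and some convergence is accommodated aseismically, so such estimates give only an order of magnitude.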

      Satoshi Ide of the University of Tokyo and colleagues discovered a new scaling law for what are known as slow earthquakes. In contrast to normal earthquakes, which last only tens to at most hundreds of seconds, these seismic events occur over a span of a few hours to a year. Slow earthquakes generally cannot be felt and were discovered only in the preceding decade, thanks to the large numbers of broadband seismometers that had been deployed in Japan and the western United States. The researchers found that the size of slow earthquakes increased in direct proportion to the duration of the event, whereas for normal earthquakes the size was proportional to the cube of the event's duration. This observation unified a diverse group of slow seismic phenomena that had previously been thought to be distinct. Although slow earthquakes do not pose a direct hazard to society, they do significantly affect the amount of strain at convergent plate boundaries and therefore influence when and where large damaging “normal” earthquakes occur.
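
The two scaling laws can be contrasted in a few lines. The prefactor constants below are purely illustrative; only the exponents reflect the finding:

```python
# For normal earthquakes, seismic moment grows roughly as the cube of the
# event duration; for slow earthquakes it grows linearly with duration.
def normal_moment(duration_s, c=1e16):
    """Moment (N*m) of a normal earthquake lasting duration_s seconds."""
    return c * duration_s ** 3

def slow_moment(duration_s, c=1e13):
    """Moment (N*m) of a slow-slip event lasting duration_s seconds."""
    return c * duration_s

# Doubling the duration multiplies a normal quake's moment by 8,
# but a slow event's moment only by 2.
assert normal_moment(20) / normal_moment(10) == 8
assert slow_moment(20) / slow_moment(10) == 2
```

The contrast explains why a slow event must persist for days or months to release as much strain as a normal earthquake does in seconds.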

      Since 2003 American seismologists had operated a network of 12 ocean-bottom seismometers along a 4-km (2.5-mi) stretch of the East Pacific Rise west of Central America where the Pacific and Cocos tectonic plates spread apart at an average rate of 11 cm (4.3 in) per year. During a data-recovery cruise in April 2006, the researchers were chagrined to find that only 4 of the 12 instruments could be recovered. Seismic data from 2 of the recovered instruments showed a gradual increase in microseismicity (tremors) with a large peak of activity about Jan. 22, 2006, followed by a sharp cutoff. In 2007 Maya Tolstoy of the Lamont-Doherty Earth Observatory, Palisades, N.Y., and colleagues reported that subsequent measurements of ocean-water temperature and light scattering, dating of rock samples, and seafloor digital images of new lava rock indicated that an eruption had taken place at the mid-ocean ridge.

      A new volcanic island was recently born in the Pacific Ocean near Home Reef in the Vava'u island group of Tonga. These islands were formed by magma created during the westward subduction of the Pacific tectonic plate beneath the Australian plate. Evidence of the volcanic eruption that created the new island was first noticed by a passing ship in August 2006, and the eruption was subsequently monitored via satellite imagery. In early 2007 R. Greg Vaughan of the California Institute of Technology and co-workers reported on data from the Aqua and Terra satellites, which were used to monitor the size of the new island, the sea-surface temperature in the vicinity of the volcano, and the dispersal of pumice rafts (masses of floating pumice rock). The volcano had previously erupted in 1984, when it created a small island, but the island had eroded away prior to the 2006 eruption.

      Katrin Mierdel of the Institute for Geosciences, Tübingen, Ger., and colleagues reported the results of a series of mineral physics experiments that provided a novel explanation for the existence of Earth's asthenosphere, a layer in the upper mantle that is softer and less viscous than the lithospheric plates that override it. The scientists found significant differences in the water solubility of the two main minerals of the upper mantle, olivine and enstatite, as a function of temperature and pressure. Water solubility in olivine increases continuously with depth, whereas in enstatite it decreases sharply with depth before gradually increasing. The combination of the two behaviours leads to a pronounced solubility minimum for the overall composition of the upper mantle at the depth of the asthenosphere beneath both continents and oceans. The investigators suggested, therefore, that the partial melting of rock prevalent in Earth's asthenosphere is likely caused not by volatile enrichment but by the inability of the rock at that depth to chemically bind large amounts of water; the excess water remains free and lowers the melting temperature of the surrounding rock.

 An international team of geophysicists led by Leonid Dubrovinsky of the Bayerisches Geoinstitut, University of Bayreuth, Ger., reported new evidence that the crystalline structure of Earth's solid inner core is body-centred cubic (bcc) as opposed to hexagonal close-packed (hcp). Scientists had traditionally believed hcp to be the stable phase of iron at the extremely high pressures and temperatures near the centre of the Earth. The researchers placed samples of an iron-nickel alloy that contained 10% nickel in heated diamond-anvil cells and used X-ray diffraction to image the internal structure of the samples as pressure was increased to more than 225 gigapascals (4.7 billion lb per sq ft) and as temperature was raised to more than 3,100 °C (5,600 °F). The team's results confirmed earlier suggestions that the presence of modest amounts of nickel alters the pressure-temperature stability of iron such that the bcc crystalline structure becomes the stable phase. On the basis of evidence from meteorites, scientists believed that Earth's core contains 5–15% nickel, so the new experiments strongly implied that bcc crystals exist within the core.
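The geometric difference between the two candidate structures is straightforward to quantify: ideal hcp packs atoms slightly more densely than bcc, one reason hcp iron had long been favoured for core conditions. The textbook packing fractions (pure geometry, independent of the experiments) can be checked directly:

```python
import math

# Atomic packing fraction = volume of atoms per cell / cell volume.
# bcc: 2 atoms per cubic cell, atoms touching along the body diagonal
# (4r = a*sqrt(3)), giving pi*sqrt(3)/8.
# hcp (ideal c/a ratio): pi/(3*sqrt(2)), the same fraction as fcc.
bcc = math.pi * math.sqrt(3) / 8        # ~0.68
hcp = math.pi / (3 * math.sqrt(2))      # ~0.74

print(round(bcc, 3), round(hcp, 3))  # 0.68 0.74
```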

Keith D. Koper

Meteorology and Climate
 In 2007 the United Nations panel of experts on climate change issued its Fourth Assessment Report and concluded that the cumulative evidence since the previous assessment, released in 2001, more strongly indicated that human activities were affecting global climate. In the first part of the assessment, which covered the physical-science basis of climate change, the Intergovernmental Panel on Climate Change (IPCC) stated that a variety of observations had revealed that the warming of the climate system was “unequivocal” and that there was “very high confidence that the globally averaged net effect of human activities since 1750 has been one of warming.” It indicated that 11 of the past 12 years (1995–2006) ranked among the 12 warmest years in the instrumental record of global surface temperatures, that mountain glaciers and snow cover had declined on average in both the Northern and Southern hemispheres, and that losses from the ice sheets of Greenland and Antarctica had “very likely contributed to sea-level rise over 1993 to 2003.” According to the report, however, not all aspects of climate were changing. For example, although the minimum extent of the Arctic ice cap during the summer months was shrinking—with the 2007 minimum shattering previous records—there were no significant trends in Antarctic sea-ice extent, a result consistent with the observation that atmospheric temperatures had not been warming when averaged across the Antarctic region. Also, there were no consistent trends in daily temperature ranges, since day and night temperatures had risen at about the same rate.

      As for the future, the IPCC projected a warming of about 0.2 °C (0.36 °F) per decade through the mid-2020s for a range of emission scenarios, but continued greenhouse-gas emissions “at or above current rates would cause further warming and induce many changes in the global climate system.” Estimated temperature increases for the end of the 21st century relative to 1980–99 global averages ranged from 1.8 °C (3.2 °F) for the low-emissions scenario to 4 °C (7.2 °F) for the high-emissions scenario. The IPCC narrowed the ranges for forecast sea-level rises in the current report compared with previous reports. For the years 2090–99 the rise varied from 18–38 cm (7–15 in) for the low-emissions scenario to 26–59 cm (10–23 in) for the high scenario, all relative to 1980–99 mean sea levels. (See Special Report.)

      In research news concerning climate change in the United States, a study by Martin Hoerling and colleagues from the National Oceanic and Atmospheric Administration Earth System Research Lab, Boulder, Colo., found that greenhouse gases likely accounted for more than half the above-average warmth experienced across the country in 2006. According to NOAA's National Climatic Data Center, mean U.S. temperatures in 2006 were the second highest since record keeping began in 1895, tying 1934 for second place and coming in slightly cooler than the record warm year of 1998. A study by Barry Lynn of NASA's Goddard Institute for Space Studies and colleagues suggested that greenhouse-gas warming might raise average summer temperatures by about 5.5 °C (10 °F) in the eastern part of the country by the 2080s. This conclusion was based on a climate simulation that used a weather-prediction model coupled to a global-climate model.

      A study published by a team of authors headed by scientists at the Lamont-Doherty Earth Observatory, Palisades, N.Y., suggested that more drought could be in store for the U.S. Southwest. A broad consensus of climate models indicated that during the 21st century the region would become drier than it had been and that it might already be undergoing the change. If the models were correct, the implication was that the levels of dryness seen in the droughts of the 1930s, 1950s, and 2000–04 could become the established climate in this region within years or decades. In a separate study by the U.S. National Academy of Sciences, researchers indicated that future droughts in the Colorado River Basin could be longer and more severe because of regional warming and that this would reduce the river's flow and the amount of water that it supplied.

      Most climate models were not initialized with observed conditions and did not take into account internally generated natural variability. Doug Smith and colleagues from the U.K. Hadley Centre for Climate Prediction and Research presented a new modeling system that took into account both internal variability and external forcing from such factors as solar radiation and human-related increases in greenhouse gases. The result was a decadal (10-year) forecast of global temperature fluctuations from 2005 that indicated that warming might be subdued for several years by internal variability but that the climate would continue to warm, so that the average global temperature in at least one-half of the years from 2010 to 2014 would exceed that of the warmest year on record, 1998.

      On Nov. 2, 2007, a pilotless aircraft flew into a hurricane for the first time. The 1.5-m (5-ft)-long aircraft, with a wingspan of 3 m (10 ft), took off from Wallops Island, Virginia, and was guided by remote control into the eye of Hurricane Noel off the U.S. coast. The low-altitude flight allowed continuous observations in parts of the storm where a manned hurricane-hunter aircraft mission would have risked the lives of the crew.

Douglas Le Comte

▪ 2007

Studies documented historically high ocean temperatures in the western Pacific, disappearing ice in Antarctica, and accelerating glaciers on Greenland. Scientific seafloor drilling penetrated the upper crust, and satellite mapping confirmed an earthquake threat for the southern San Andreas Fault. Research revealed unsuspected carbonate volcanic activity and identified melting reactions for ultrahigh-pressure metamorphic rocks.

Geology and Geochemistry
 Evidence from geochemistry and glacial geology in 2006 provided new insights into paleoclimates, with implications for current and future climate change. The geochemistry of exposed rock surfaces reported by Joerg Schaefer of the Lamont-Doherty Earth Observatory and coauthors resolved a problem in dating the climatic warming at the end of the last glacial period. Drilling through ice sheets in Greenland and Antarctica had provided ice-core records that showed that the climatic warming had occurred later in Greenland (about 15,000 years ago) than in Antarctica (about 18,000 years ago). Because exposed rock surfaces are bombarded by cosmic rays that form an isotope of beryllium at a constant rate, measurements of beryllium isotope ratios in rocks once covered by glaciers could be used to determine when the rocks were left exposed. Surface-exposure dating of such rocks throughout the world indicated that glaciers began to retreat in both the Northern and Southern hemispheres at the same time (about 17,500 years ago), which correlated closely with the time when air temperatures in Antarctica were rising and levels of atmospheric carbon dioxide were increasing. The geochemists suggested that the delayed warming of Greenland was probably the result of changes in ocean currents in the North Atlantic caused by the massive discharge of icebergs associated with retreat of Northern Hemisphere ice sheets.
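The logic of surface-exposure dating is simple accumulation arithmetic: cosmogenic beryllium-10 builds up in an exposed rock surface at a roughly known production rate, so the measured concentration can be inverted for the exposure time. A sketch, ignoring erosion and shielding and using an assumed production rate for illustration:

```python
import math

# 10Be accumulates as N = (P / lam) * (1 - exp(-lam * t)), where P is the
# production rate and lam the decay constant; solving for t gives the age.
HALF_LIFE_10BE_YR = 1.39e6
LAM = math.log(2) / HALF_LIFE_10BE_YR

def exposure_age_yr(conc_atoms_per_g, prod_rate=5.0):
    # prod_rate ~5 atoms/g/yr is an assumed illustrative value; real work
    # scales it for latitude, altitude, and topographic shielding.
    return -math.log(1 - conc_atoms_per_g * LAM / prod_rate) / LAM

# A concentration consistent with ~17,500 years of exposure inverts correctly:
n = 5.0 / LAM * (1 - math.exp(-LAM * 17500))
print(round(exposure_age_yr(n)))  # 17500
```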

      Two reports in 2006 concerning the dynamics of Greenland glaciers that flow into the ocean provided dramatic insights into the status of the Greenland ice sheet. Using satellite radar and interferometry data, Eric Rignot of the Jet Propulsion Laboratory, Pasadena, Calif., and Pannir Kanagaratnam of the University of Kansas demonstrated that in 2005 Greenland ice was being lost at a rate about two times faster than in 1996, primarily because of accelerating glaciers and associated iceberg discharge. By 2005 the accelerated flow of the glaciers was accounting for about 75% of the total loss of Greenland ice (with melting accounting for the rest), and from 2000 to 2005 the zone of accelerating glaciers spread northward from about 66° N to 70° N. The authors suggested that the acceleration was at least in part the result of climatic warming and the enhanced production of meltwater that drained to glacier beds, where the water combined with subglacial sediments to provide lubrication for glacier flow. The discovery that the flow of glaciers could increase so rapidly raised the prospect that a major meltdown and accompanying rise in sea levels might be accomplished much faster than many scientists had expected—in centuries rather than millennia. In the second report Göran Ekström of Harvard University and colleagues studied the motion of glaciers in Greenland by means of global seismic records of “glacial earthquakes,” low-frequency earthquakes that they had discovered in 2003. They reported the epicentres of 182 such earthquakes on Greenland from 1993 to 2005. All were associated with fast-moving glaciers. They analyzed the glacial earthquakes in the same way as landslides, which involved a mass-sliding model, and calculated that a representative glacial earthquake corresponded to the movement of a section of ice 10 km (6 mi) long, 1 km (0.6 mi) wide, and 1 km thick lurching forward through 10 m (33 ft) in one minute.
The study showed that glacial earthquakes were most frequent during the summer and that their annual rate of occurrence had risen sharply from 2002 to 2005, which suggested a dynamic response to a warming climate. The monitoring of glacial earthquakes might provide another way of remotely gathering data on fast-moving glaciers for use in theoretical models of climate change.

      Research by Andreas Mulch and colleagues at Stanford University provided strong evidence that the Sierra Nevada mountain range had stood taller than 2,200 m (7,200 ft) for at least 40 million years, in contrast to the view held by some geologists that the mountains had been uplifted only during the past 3 million to 5 million years. The researchers based their study on the geochemistry of clay minerals in ancient soils from river valleys where gold-mining operations had sliced deeply through successive river deposits. The ratios of hydrogen isotopes in rainfall vary with the elevation at which the rain forms. The clay minerals had incorporated water from rainfall when they formed during weathering and thus preserved a record of the elevation at which they formed. The researchers speculated that the Sierra Nevada had been the western edge of a high-elevation plateau that later collapsed. Topographic information of this type was critical for the evaluation of tectonic processes and global climate models.
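The underlying arithmetic is that of stable-isotope paleoaltimetry: hydrogen isotope ratios (deltaD) in precipitation grow more negative as air masses rise, so the deltaD locked into the clays records the elevation of the rainfall. A sketch with an assumed isotopic lapse rate (illustrative, not the study's calibration):

```python
# deltaD of rain decreases with elevation; dividing the isotopic offset
# between a sample and coeval lowland rain by a lapse rate yields elevation.
LAPSE_PERMIL_PER_KM = -20.0   # assumed illustrative lapse rate for deltaD

def paleo_elevation_km(delta_d_sample, delta_d_lowland):
    return (delta_d_sample - delta_d_lowland) / LAPSE_PERMIL_PER_KM

# A sample 44 permil lighter than lowland rain implies ~2.2 km of elevation:
print(paleo_elevation_km(-114.0, -70.0))  # 2.2
```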

      Ken Bailey of the University of Bristol, Eng., and coauthors published astonishing results concerning volcanic rocks from central France known as peperites, rocks characterized by black lava grains in a pale matrix rich in carbonate. Two centuries of intermittent petrographic studies of peperites had assumed that the carbonate was derived from near-surface sediments into which lava had been injected. The authors, however, showed that the carbonate was igneous in origin. Back-scattered electron images revealed the coexistence of silicate and carbonate melts, and the presence of material derived from subcrustal mantle indicated that the melts had been formed at depths of 100–150 km (60–95 mi). The finding of widespread carbonate volcanism in France called for a reexamination of other alkaline igneous regions worldwide, and according to the authors, “Should similar levels of carbonate activity be revealed, this might herald a revolution in the science of intraplate magmatism across the planet.”

      A wide-ranging review of adakites by Paterno Castillo of the Scripps Institution of Oceanography, La Jolla, Calif., concluded that in using this term, “caution is necessary.” The term was first applied in 1990 to silicic volcanic rocks with specific patterns of trace elements that suggested an origin by partial melting of young, shallow subducted slabs of oceanic crust. The identification and presumed origin of adakites carried significant tectonic implications with respect to converging plate boundaries, which stimulated research into understanding the formation of these rocks. Confusion arose, however, because it came to be recognized that the content of trace elements defined by the name adakite could be attributed to specific source rocks in several environments unrelated to slab melting, and interpretations of the origin of adakites changed. It was safer to retain the classical method of identifying igneous rocks on the basis of mineralogy and general chemical composition rather than on trace elements and deduced process of origin. Interpretations of rock origin sometimes changed with new data, as illustrated by the new thinking about peperites.

      The results from laboratory studies of mineral reactions at high pressures published by Estelle Auzanneau of the Université Blaise Pascal in Clermont-Ferrand, France, and coauthors included a new melting reaction with applications to ultrahigh-pressure metamorphic rocks. Their detailed experiments provided the best prospect yet for understanding the formation of granitic magmas from subducted continental crust that rose from depths of about 100–120 km (60–75 mi). The sediment the authors used as starting material contained about 1% water stored in the mineral biotite. As pressure on the sediment was increased to correspond to depths of 70–80 km (45–50 mi), the biotite was replaced by phengite, another hydrated mineral. At temperatures above about 800 °C (1,500 °F), this near-isobaric reaction involved a liquid phase of granitic composition; the phengite formed by extracting water from the liquid and the biotite, which significantly decreased the amount of melt. Therefore, as a rising mass of deeply subducted continental crust containing phengite underwent decompression at a depth of about 75 km (47 mi), it would experience a pulse of melting as the phengite rock was converted back to biotite rock. The authors provided detailed comparisons of their experimental reactions with rocks from several well-known ultrahigh-pressure metamorphic regions.

Peter J. Wyllie

 In 2006 an international team of scientists in the Integrated Ocean Drilling Program announced that they had reached a milestone in the scientific drilling of the oceanic crust. Nearly four decades after the first scientific investigations conducted through seafloor drilling, the scientists had penetrated the geologic boundary in the oceanic crust between sheeted dikes of basaltic rocks of the upper crust and underlying coarse-grained rocks called gabbro. The achievement—described in a report by Douglas Wilson of the University of California, Santa Barbara, and colleagues—took place at a depth of about 1,500 m (4,900 ft) below the seafloor in a drill hole about 800 km (500 mi) off the west coast of Central America. The drilling site had been specially chosen to be near the fast-spreading mid-ocean ridge that delineates the boundary between the Cocos and Nazca tectonic plates. Mid-ocean ridges are the birthplace of oceanic crust and are formed where warm upwelling material from the Earth's mantle is cooled by the ocean and spreads laterally away from the ridge. The details of the process were poorly known, and the information gained by drilling beneath the overlying sediment into the newly formed oceanic crust (10 million–15 million years old) was expected to provide scientists with important insights. Initial results from the drilling project indicated that reflections of seismic waves that were commonly observed in geophysical surveys of the oceanic crust were largely unrelated to the boundaries between fundamental types of rock (such as the basalt-gabbro boundary) but instead were caused by changes in such characteristics as the porosity of rock materials. The data from these direct geologic observations would be used to help calibrate marine seismic data, which were easier and less expensive to obtain.

      A devastating earthquake occurred on May 27, 2006, about 20 km (12 mi) south of Yogyakarta, Indon. Because of its shallow depth (10 km [6 mi]) under the heavily populated island of Java, the earthquake caused extraordinary damage even though it had a moment magnitude of only 6.3. More than 6,000 persons were killed, more than 38,000 injured, and as many as 600,000 left homeless. The total economic loss was estimated at $3.1 billion. The earthquake was related to the northward subduction of the Australian plate beneath the Sunda plate; however, it occurred about 100 km (60 mi) north of the plate boundary, well within the Sunda plate. Furthermore, the focal mechanism of the earthquake showed lateral, or strike-slip, motion, as opposed to the convergent motion expected for an earthquake occurring near a subduction zone. The Mt. Merapi volcano, located 30–40 km (19–25 mi) to the north of the earthquake, had been erupting at the same time, but geophysicists were unsure if there was a causal link between the two events.

      An important step in quantifying the seismic risk in southern California was accomplished in 2006. Using a technique called InSAR (Interferometric Synthetic Aperture Radar) with data collected by two European Space Agency satellites, Yuri Fialko of Scripps Institution of Oceanography, La Jolla, Calif., was able to observe the motion and deformation of the Earth's crust on either side of the southern San Andreas Fault zone. He found that the overall relative motion between the Pacific and North American tectonic plates along the fault zone was about 45 mm (1.7 in) per year, with the San Andreas Fault and the nearby San Jacinto Fault accommodating this motion in nearly equal amounts. More important, the relative motion changed sharply near the two major faults, which indicated that the crust in that region was undergoing significant strain (deformation). Because there had been no major earthquake along the southern San Andreas Fault in 250 years, Fialko calculated that this strain implied a slip deficit of 5–7 m (16–23 ft), which was essentially the same amount of motion that scientists expected would take place when an earthquake next occurred on this fault segment. In other words, the rocks along the southern segment of the San Andreas Fault had been strained about as much as they could take, and a significant earthquake was likely to occur within the next few decades.
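The quoted slip deficit follows from back-of-envelope arithmetic: roughly half of the 45 mm per year of plate motion loads the San Andreas strand, and no major rupture has released it for about 250 years.

```python
# Slip deficit = loading rate on the fault x time since the last rupture.
total_rate_mm_per_yr = 45.0   # Pacific-North American motion across the zone
san_andreas_share = 0.5       # roughly half; the rest is on the San Jacinto Fault
years_since_rupture = 250

deficit_m = total_rate_mm_per_yr * san_andreas_share * years_since_rupture / 1000.0
print(deficit_m)  # 5.625 m, within the reported 5-7 m range
```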

       Göran Ekström of Harvard University and colleagues reported a new method of studying the accelerated flow of glaciers that suddenly slip against the Earth's surface and produce low-frequency seismic waves. By analyzing the occurrence of such glacial earthquakes in Greenland, the seismologists showed that the dynamic responses of glaciers to climate change could be quite rapid, much faster than commonly assumed. (See Geology and Geochemistry.)

      By studying how the magnetic field at the Earth's surface varies over time, scientists had learned about the flow of molten iron within Earth's core, the source of the magnetic field. Direct measurements of the intensity of the Earth's magnetic field began around 1840, and since that time geophysicists had observed a steady decline in the strength of the field. David Gubbins and colleagues at the University of Leeds, Eng., completed an analysis of historical data that showed that the intensity of the magnetic field had been relatively constant before 1840, going as far back as 1590. To deduce this fact, the group used ships' logbooks from the period that recorded the direction of the magnetic-field lines at the Earth's surface for the purpose of navigation. The geophysicists combined these angular measurements with 315 rough measurements of overall field strength derived from magnetic minerals in such materials as ceramics and volcanic rock to make a high-precision calculation of magnetic-field strength in the pre-1840 era.

Keith D. Koper

Meteorology and Climate
 Significant progress was made in 2006 in the field of short-term weather forecasting as computer forecast models continued to grow in complexity and sophistication. The U.S. National Center for Atmospheric Research (NCAR) and the U.S. Air Force Weather Agency announced that National Weather Service and air force weather forecasters had adopted the Weather Research and Forecasting Model (WRF) for day-to-day operational use. The WRF improved upon previous models in predictions of several types of extreme weather and reduced errors by more than 50% in nighttime temperature and humidity forecasts. It was the first weather model in the U.S. to serve as both the basis for public weather forecasts and a tool for weather research, which meant that research findings could be translated more easily into making better forecasts. The WRF was also being adopted by the national weather agencies of Taiwan, South Korea, China, and India.

      In the climate realm a number of significant studies were released that dealt with the impact of greenhouse gases on global climate—a major issue related to changes in weather and climate patterns over the course of years, decades, and centuries. Two studies by Bette Otto-Bliesner of NCAR and Jonathan Overpeck of the University of Arizona and colleagues blended computer modeling with paleoclimate records and suggested that climatic warming could cause the ice sheets across both the Arctic and the Antarctic to melt much more quickly than was generally expected. Though many uncertainties remained, the scientists determined that increases in greenhouse gases could make Arctic summers as warm by 2100 as they were nearly 130,000 years ago, when sea levels had risen as much as 6 m (20 ft) higher than they were in 2006. The finding presented the possibility that the ongoing rise in global sea level, which was about 2 mm (0.08 in) per year, might greatly accelerate during the 21st century.

      Several Earth-science satellite missions were being used to help understand sea-level rise. Satellite data indicated that from 1993 to 2005 about one-half of the global sea-level rise had been caused by thermal expansion of the ocean and about one-half had been caused by melting ice. Future melting—particularly of the ice sheets in Greenland and Antarctica—was shown to have the greatest potential to raise sea level. Data from the Gravity Recovery and Climate Experiment (GRACE), which used tiny gravity-induced variations in the orbits of two satellites to determine changes in the Earth's mass, indicated that from 2002 through late 2005 there was an annual net loss of ice in both Antarctica and Greenland. In a separate study, which used satellite radar and interferometry data, researchers found that the loss of ice from glaciers in Greenland had doubled between 1996 and 2005, mainly because of the accelerated flow of the glaciers into the sea. (See Geology and Geochemistry.)

      A study by James Hansen of NASA's Goddard Institute for Space Studies and colleagues showed that global surface temperatures had increased about 0.2 °C (0.36 °F) per decade from 1975 to 2005, which was in agreement with the warming predicted in the 1980s by climate models that considered increases in the amount of greenhouse gases in the atmosphere. A comparison of sea-surface temperatures in the western Pacific with paleoclimate data from microscopic marine organisms suggested that this ocean region was approximately as warm as it had been at any time in the past 12,000 years and within about 1 °C (1.8 °F) of the maximum temperature of the past million years.

      Among the problems that climate scientists had faced concerning global warming were discrepancies between satellite-based and surface-based temperature measurements. A study issued by the U.S. Climate Change Science Program, a collaborative interagency program, reconciled these discrepancies. The findings lent support to the evidence of substantial human impact on global temperature increases and showed that temperatures at the surface and in the low and middle atmosphere had all warmed since 1950.

      A warming climate in the western United States appeared to be leading to increased forest wildfire activity, according to a study done by Anthony Westerling of the Scripps Institution of Oceanography, La Jolla, Calif., and colleagues. A comparison of the wildfire seasons of 1987–2003 with those of 1970–86 showed an average increase of 78 days (64%) in season length and an increase from 7.5 to 37.1 days in the duration of large wildfires. Snowpacks were melting one to four weeks earlier than they had 50 years before, and years with early melting had five times as many wildfires as years with late melting. An increase in spring and summer temperatures of about 0.9 °C (1.6 °F) had contributed to the earlier snowmelt.

      Thanks in part to Atlantic-basin wind shear, which was attributed to a developing El Niño in the equatorial Pacific Ocean, the 2006 Atlantic tropical storm season ended with only nine named storms—far fewer than the record 2005 season. Nevertheless, the debate over the role of global warming in producing more powerful hurricanes continued as various studies were published that either supported or countered the idea. Separating the natural multidecadal climatic variability in oceanic temperatures from any long-term human-induced warming trend had proved difficult. A study released by Benjamin Santer of the Lawrence Livermore National Laboratory, Livermore, Calif., and an international group of scientists used 22 climate computer models to study the causes of increases in sea-surface temperatures where tropical cyclones form in the Atlantic and Pacific. The results confirmed that the most likely driver for most of the rise in temperatures was a human-induced increase in greenhouse-gas emissions. A major remaining issue was whether improvements in storm monitoring had distorted perceived trends in tropical-cyclone intensity, and researchers continued to analyze historical storm data to resolve the question.

Douglas Le Comte

▪ 2006

New findings on mantle plumes were reported, while the very existence of mantle plumes came into question. Geoneutrinos were detected for the first time. The North Atlantic region experienced a record-breaking hurricane season, and scientists debated possible tropical-cyclone effects from global warming.

Geology and Geochemistry
 The dramatic red Navajo sandstone cliffs of the Colorado Plateau in southern Utah contain many iron-rich concretions, some of which appear to be very similar to the small spherical gray rocks, known as “blueberries,” that were discovered on Mars by the Mars Exploration Rover Opportunity. In 2005 Marjorie Chan of the University of Utah and coauthors presented a detailed geologic and geochemical study of the processes that led to the formation of the concretions in the Navajo sandstone. The processes involved the breakdown of iron minerals in a source rock by the action of groundwater and the formation of thin films of iron oxide (hematite), which coloured the rocks red. At a later time a different aqueous solution percolated through the rock and dissolved some of the iron oxide. Where the solution reached locations with conditions more oxidizing than the solution itself, iron minerals precipitated and formed solid concretions of various shapes, including marble-shaped bodies that resembled the Mars blueberries. Listing six characteristics that indicated that the spherical concretions found in Utah were a good analog for the Mars blueberries, Chan's team concluded that the formation of the blueberries required the percolation of two separate aqueous solutions. The question that remained was whether the water that percolated through the rock on Mars also supported life.

      A 2005 paper by Aivo Lepland of the Geological Survey of Norway and coauthors delivered a strong shock in the continuing debate about whether traces of early life are recorded by the geochemistry of 3.85-billion-year-old rocks found in southwestern Greenland. In 1996 it had been reported that apatite, a mineral that was widely distributed in these old rocks, had inclusions of graphite (a carbon mineral) whose isotope ratios indicated that the carbon was of biogenic origin, and it was proposed that the apatite-graphite combination was derived from bacteria. Geologic and geochemical evidence supported the view that some of the rocks were sedimentary in origin and therefore indicative of the presence of water necessary for the existence of early bacteria.

      Although the geochemistry of tiny mineral inclusions can facilitate the interpretation of the geologic environment in which rocks formed, the Greenland rocks had been strongly metamorphosed during their long history, which obscured the environment of their original formation. The new results, which used optical microscopy and electron microscopy, denied the existence of any graphite in the apatite minerals described in the 1996 report, despite a diligent search that was extended to many associated rocks. The authors of the 2005 paper concluded that claims for the existence of early life in the rocks “cannot be founded on an occurrence of graphite inclusions in apatite.”

      Joseph V. Smith of the University of Chicago in 2005 presented evidence to support the hypothesis that mineralogy and geochemistry, particularly as related to volcanic eruptions, played significant roles in the emergence and evolution of a self-replicating biochemical system—that is, life. After the first living cells were generated by geochemistry on internal mineral surfaces about four billion years ago, life evolved through the utilization of energy from the Sun and the incorporation of selected chemical elements. Smith described how volcanic activity would have been a major source of such biologically important elements as carbon, phosphorus, sulfur, iron, zinc, and manganese. Drawing on emerging evidence from studies of metabolism, gene regulation, and medicine, he noted a connection between geochemistry and the evolution of large-brained hominids. The East African Rift Valley, which opened about 30 million years ago, is associated with alkali-rich carbonatite volcanoes. Local soils derived from material erupted from these volcanoes would have been abundant in phosphorus and other trace elements that are known to be biochemical nutrients essential for the growth and enhancement of primate brains. Only in the Rift Valley was there the unique coincidence of this rather rare type of volcano and an evolving large-brained primate population. A test of the possible influence of alkali-rich volcanism in the evolution of hominids in Africa might come from advanced synchrotron X-ray measurements of the trace elements in the mineral apatite of fossil teeth.

      In 2005 Ralf Tappert of the University of Alberta and coauthors demonstrated how major geologic processes might be elucidated by the geochemistry of diamonds and their inclusions. Diamonds from Jagersfontein, S.Af., contain tiny inclusions of garnet with two geochemical properties of interest. First, their content of the trace element europium showed that they grew from material of the Earth's crust. Second, their unusual composition (majoritic garnet) proved that they nucleated and grew at depths of 250–500 km (about 155–310 mi) or more. This evidence indicated that crustal rocks were carried into the Earth's interior. The most likely geologic process that satisfied these observations was subduction of the oceanic crust. In addition, the ratio of carbon isotopes in the diamonds indicated that the source of the carbon may have been organic. Organic carbon would have been introduced from surface rocks (such as from dead organisms buried in the seafloor), which was consistent with the inferred subduction process. The study confirmed the idea of the long-term survival of crustal material within a heterogeneous mantle, at least to a depth of 500 km.

      The geochemistry of lavas from Hawaii provided information about the mantle plume that many geologists assumed transports source rocks from deep in the mantle to a near-surface hotspot, where melting occurs. The Hawaiian volcanoes comprise the parallel Loa and Kea chains, whose lavas are distinguished by slightly different but overlapping geochemical properties. Two papers in 2005 countered previous interpretations that described the mantle plume as having a concentrically zoned structure in terms of composition. Wafa Abouchami of the Max Planck Institute for Chemistry, Mainz, Ger., and coauthors presented high-precision lead-isotope data from the lavas and demonstrated that the plume had a bilateral composition structure between the two chains and that there were small-scale variations in composition along the chains. The results indicated that there were compositional bands less than 50 km (about 30 mi) in diameter within the plume and that they stretched out vertically like spaghetti over tens to hundreds of kilometres. Zhong-Yuan Ren of the Tokyo Institute of Technology and coauthors analyzed trace elements in inclusions of magma solidified within olivine crystals, which recorded the complexities of the magma sources in the mantle during the process of melting, magma uprise, and crystallization. They inferred that the plume was not concentrically zoned and that the geochemistry was controlled by the thermal structure of the plume, which contained streaks or ribbons of deformed ocean crust that had been subducted much earlier.

      The phenomenon of volcanism within tectonic plates, such as that which occurs in Hawaii, was understood by most geologists to be caused by plumes of material that rises from the mantle to the Earth's surface. Two 2005 publications, however, presented powerful challenges to the existence of mantle plumes and suggested that geologists had reached an important revolutionary stage in theories of mantle dynamics and plate tectonics. Yaoling Niu of the University of Durham, Eng., organized an issue of the Chinese Science Bulletin that featured the “Great Plume Debate,” and the Geological Society of America published Plates, Plumes, and Paradigms, a compendium that included several articles by one of the most influential skeptics of mantle plumes, Don L. Anderson of the California Institute of Technology.

Peter J. Wyllie

      Geophysicists of many different stripes spent much of the year in 2005 sifting through data from the great earthquake of Dec. 26, 2004, which produced the tsunami that devastated coastal regions of the Indian Ocean. Seismologists determined that the earthquake lasted about 500 seconds, rupturing a 1,200-km (about 750-mi) segment of plate boundary from Sumatra, Indonesia, to the Andaman Islands, India, with maximum offsets of 15–20 m (49–66 ft). Debate continued about the precise moment magnitude of the event. It was originally inferred to be 9.0, but later analyses suggested values ranging from 9.15 to 9.3. Although these differences appear to be numerically small, they actually represent a large difference in the amount of energy released in the earthquake because earthquake magnitude scales are logarithmic. Geodesists contributed to the debate by using GPS (global positioning system) stations to measure the offset of the ground, which suggested a moment magnitude of 9.2. Remarkably, they found measurable offsets at distances as far as 4,500 km (2,800 mi) from the epicentre. Oceanographers used coastal tide-gauge records and satellite altimetry records to delineate the region where the tsunami originated and found several “hot-spot” regions of variable slip (motion) that acted as distinct tsunami sources. Geodynamicists calculated that the redistribution of mass that occurred during the earthquake should have decreased the length of day by 2.68 microseconds and shifted the rotation axis of the Earth so that the North Pole would have moved by about 2 cm (0.8 in). The change in rotational speed was probably too small to observe; however, the change in rotational axis might be detectable with observations made over an extended period of time. Some geomagneticists also speculated that the earthquake altered conditions in the fluid core of the Earth, and they were expecting a “jerk” in the strength of Earth's magnetic field to become observable within the following few years.
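      The weight carried by these seemingly small differences in magnitude can be illustrated with the standard Hanks-Kanamori moment-magnitude relation; the relation itself is a conventional assumption of this sketch, not a detail from the analyses described above:

```python
# Seismic moment M0 (newton-metres) from moment magnitude Mw, via the
# standard Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1)
def seismic_moment(mw):
    return 10 ** (1.5 * mw + 9.1)

# Ratio between the competing estimates for the 2004 Sumatra event:
ratio = seismic_moment(9.3) / seismic_moment(9.0)
print(round(ratio, 2))  # ~2.82: a 0.3-magnitude step nearly triples the moment
```

Because each unit of magnitude corresponds to a factor of about 32 in moment (and energy release), a spread of 9.0 to 9.3 is anything but trivial.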

      On March 28, 2005, an earthquake of moment magnitude 8.7 occurred off the west coast of Sumatra. It was located on the boundary of the Australia and Sunda tectonic plates, about 160 km (100 mi) to the southeast of the epicentre of the earthquake of December 26. Some seismologists considered this earthquake an aftershock because it was likely triggered by a change in stress induced by the December event. As an aftershock it would have the distinction of being the largest ever recorded. Incredibly, a group of seismologists in the United Kingdom had forecast such an event in a paper published on March 17, just 11 days before the earthquake occurred. The technique used by the scientists did not allow for specific earthquake predictions (for example, a forecast of a magnitude–7.4 earthquake next Tuesday at 11:40 AM in southern California), but it might be able to provide information that would be useful in preparing for future earthquakes and so reduce the damage they could cause. The slip of the March 28 earthquake was concentrated beneath the Indonesian islands of Nias and Simeulue. It caused widespread damage and the deaths of about 1,300 persons, but the fact that it occurred mainly beneath these islands may have kept the death toll from being even larger. Scientists who modeled the earthquake found that the presence of the islands severely reduced the amount of water displaced during the earthquake so that only a mild, and largely unnoticed, tsunami was produced.

      Although most earthquakes happen at the boundaries of tectonic plates, large damaging earthquakes occasionally occur within a tectonic plate. A notable example is the New Madrid seismic zone, which lies approximately in the middle of the North America tectonic plate. Four large earthquakes occurred near New Madrid, Mo., in 1811–12, and debate about the present-day seismic hazard in the region was vigorous. In June researchers published results from a four-year study of ground motion in the New Madrid region. The scientists drove H-beams 20 m (66 ft) into the ground and continuously tracked their relative positions, using GPS equipment. They found relative motion of about 3 mm (0.12 in) per year for two H-beams on opposite sides of an active fault and argued that this implied a strain (deformation) in the New Madrid region as great as that found in plate-boundary regions such as the San Andreas Fault zone in California. This interpretation, though it was at odds with previous GPS studies in the region, was consistent with previous geologic results that suggested that large, damaging earthquakes happened in the New Madrid seismic zone about every 500 years. If the new interpretation came to be supported by future work, the seismic hazard to residents of the New Madrid region, including the city of Memphis, Tenn., would be recognized to be just as high as for those living in earthquake-prone California.
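      The comparison of intraplate and plate-boundary strain reduces to velocity-over-baseline arithmetic; the baseline lengths below are illustrative assumptions for this sketch, not figures from the New Madrid study:

```python
def strain_rate_per_year(relative_velocity_mm_per_yr, baseline_km):
    """Average strain rate across a baseline: relative velocity / baseline length."""
    return (relative_velocity_mm_per_yr / 1e3) / (baseline_km * 1e3)

# Hypothetical 10-km baseline spanning the New Madrid fault, moving 3 mm/yr:
new_madrid = strain_rate_per_year(3, 10)
# Rough, commonly quoted San Andreas figures: ~35 mm/yr of relative motion
# distributed across a boundary zone on the order of 100 km wide:
san_andreas = strain_rate_per_year(35, 100)
print(f"{new_madrid:.1e} {san_andreas:.1e}")  # comparable orders of magnitude
```

Under these assumed baselines the two strain rates come out within a factor of two of each other, which is the sense in which the New Madrid measurements were said to rival plate-boundary deformation.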

      A new subfield of geophysics was established in 2005 when an international team of scientists announced the first-ever detection of geoneutrinos. Neutrinos are nearly massless subatomic particles that travel close to the speed of light and interact very weakly with matter. They are emitted during the radioactive decay of certain elements, such as uranium and thorium, and solar neutrinos from nuclear reactions on the Sun had been detected and studied on Earth for many years. Using a detector buried in a mine in Japan and cleverly screening out neutrinos emitted by nearby nuclear-power plants, the scientists were able to conclusively identify neutrinos that were emitted by the decay of radioactive elements within the Earth. Ultimately, the scientists hoped to be able to use geoneutrino observations to deduce the amount of radioactive heat generated within the Earth, which is generally thought to represent 40–60% of the total heat the Earth dissipates each year. Furthermore, by combining geoneutrino observations from many detectors, scientists might be able to make tomographic maps of radiogenic heat production within the Earth. Such maps would lead to a better understanding of the convection currents within the mantle that drive the motion of tectonic plates at the surface of the Earth.

Keith D. Koper

Meteorology and Climate
 The devastation that resulted from Hurricanes Katrina and Rita along the Gulf of Mexico during the 2005 hurricane season increased interest in short-term and seasonal forecasts of tropical storms and hurricanes as well as the role that climate might be playing in the recent surge in storm activity. (A tropical storm is a tropical cyclone with sustained winds of 63–118 km/hr [39–73 mph]; a hurricane is a tropical cyclone with sustained winds of 119 km/hr [74 mph] or greater.) Although the seasonal forecasting of where tropical storms and hurricanes might make landfall remained a difficult task, forecasts of broader measures of storm activity had become quite successful. A press release by the National Oceanic and Atmospheric Administration (NOAA) on May 16, 2005, forecast an active tropical-cyclone season, with 12–15 named storms and 7–9 hurricanes. (A tropical cyclone is named when it reaches tropical-storm status.) This forecast was in contrast to the long-term mean of 11 named storms and 6 hurricanes. On August 2, only two months into the six-month-long tropical-cyclone season, the forecast was updated to 18–21 named storms and 9–11 hurricanes. When the phenomenal season officially ended November 30, a record 26 named storms had formed, including a record 13 hurricanes. (See Map.) (On December 2 the 26th storm, Epsilon, became the 14th hurricane of the year, and one additional tropical storm, Zeta, formed on December 30. After the original, preselected list of 21 names from Arlene to Wilma was exhausted, letters of the Greek alphabet were used.) The NOAA forecasts, which have been issued since 1998, were a combined effort of the National Hurricane Center, the Climate Prediction Center, and the Hurricane Research Division. Forecasts of above-normal activity were also made within the private sector and by academics. For example, a team led by William Gray, a professor at Colorado State University (CSU), forecast as early as December 2004 that there was an above-average probability that a major hurricane would make landfall in the U.S. during 2005. By the end of May, the CSU team had bumped up their forecast from 11 named storms to 15, and on August 5 the team raised it to 20.

      The ability to make such seasonal forecasts hinges on the fact that several large-scale oceanic and atmospheric patterns have been identified as having an influence on tropical-cyclone activity. The NOAA forecasts rely upon observations in the Atlantic basin of wind and air-pressure patterns and of multidecadal (decade-to-decade) variations in such environmental factors as sea-surface temperatures. In addition, most seasonal-storm forecasters closely monitored the status of the El Niño/Southern Oscillation (ENSO), a large-scale weather pattern associated with the warming and cooling of the equatorial Pacific Ocean, because it can affect the strength of wind shear—which inhibits storm development—over the Atlantic Ocean. In 1995 the Atlantic multidecadal signal turned favourable for storm development, and all the tropical-storm seasons from that year through 2005 exhibited above-normal activity except for the years 1997 and 2002, when there were ENSO-related increases in Atlantic-basin wind shear.

      Although the idea that multidecadal climate variations play a role in tropical-storm activity in the Atlantic had become generally accepted, the role of long-term climate change and global warming was under debate. Since warmer ocean waters tend to fuel hurricane development, it was tempting to consider possible links between a warmer climate and more frequent or intense hurricanes. Kerry Emanuel of the Massachusetts Institute of Technology determined that there was a high correlation between an increase in tropical ocean temperatures and an increase in an index that he developed to gauge the potential destructiveness of hurricanes. His results suggested that future warming could lead to a further increase in the destructive potential of hurricanes. Kevin Trenberth of the National Center for Atmospheric Research noted that human-influenced changes in climate were evident and that they should affect hurricane intensity and rainfall. He cautioned, however, that there was no sound theoretical basis for determining how these changes would affect the number of hurricanes that would form or the number that would make landfall.

      Theoretical and numerical simulations of global warming on hurricanes by Thomas Knutson of NOAA's Geophysical Fluid Dynamics Laboratory in Princeton, N.J., and colleagues suggested that hurricane wind intensity would gradually increase by about 5% over the next 80 years. Given the normal large multidecadal variations that occur in hurricane frequency and intensity, it appeared therefore that any effects of global warming on the impact of hurricanes would be difficult to determine for some time. Another study, however, presented observational evidence that an increase in storm intensity might already be occurring. Using hurricane data from weather satellites, Peter Webster of the Georgia Institute of Technology and colleagues found nearly a doubling in the number of the most severe (category 4 and 5) storms worldwide in the previous 35 years. Yet they also cautioned that a longer period of observations was needed in order to attribute the increase to global warming.

      Less controversial was the steady improvement in the forecasts of tropical-storm tracks. Accurate and timely landfall forecasts are crucial to the effectiveness of evacuations in the face of dangerous storms. In the early 1970s the mean 48-hour error in the storm tracks forecast by the National Hurricane Center was about 510 km (320 mi). With steady improvement through the years, the mean error shrank to less than 290 km (180 mi) in the late 1990s, and the mean error of 175 km (108 mi) in 2004 was the best to date. Both statistical and numerical forecast models had contributed to the improving forecasts, with numerical forecast models taking the lead since the 1990s. Hurricane forecasting is clearly a case where better models resulting from advances in physics and computational power have the potential to save lives.

Douglas Le Comte

▪ 2005


Geology and Geochemistry
      In August 2004 thousands of geologists from all over the world shared recent developments in Earth science at the quadrennial International Geological Congress (the 32nd) in Florence. The themes of the congress were the renaissance of geology and the application of geology to mitigate natural risks and preserve cultural heritage. The organizers' message stressed that societies face complex problems for which the geologic sciences must help find solutions, and that geologists must communicate both with the public, to build awareness of the role of geology, and with governments, to ensure the long-term sustainability of the Earth for human habitation.

      Among the many symposia on environmental geology were presentations that demonstrated how geology affects human health. People breathe in and drink substances that have been incorporated into the atmosphere and water from rocks and soils. Health can suffer from either an excess or a deficiency of some of these substances, including iodine, fluorine, arsenic, dust, radon gas, and asbestos. In recent years, for example, the litigation arising from the lung problems caused by just one of the substances—asbestos—had led to huge financial losses for the companies that mined it or manufactured asbestos products. Growing recognition of the significance of such health-related issues was manifested by the launching of a new organization to deal with them: the International Medical Geology Association.

      Enrico Bonatti of the Institute of Marine Science of the Italian National Research Council in Venice delivered one of seven plenary lectures, “The Internal Breathing of the Earth.” He described the relationships between volatile materials in the mantle, plate tectonics, and the Earth's climate with many complex geologic illustrations. One example bearing on current concerns about global warming was the enhancement of volcanism about 100 million years ago through deep-Earth thermal effects. This episode increased the amount of carbon dioxide in the atmosphere, which could have caused the unusually hot climate, the existence of which scientists had deduced from an analysis of the oxygen isotopes found in deep-sea sediments of that age.

      The potential for volcanoes to influence long-term global climatic changes by the emission of carbon dioxide had been discussed for many years, but it was in 1986 that geologists learned of the devastating short-term effects of volcanic carbon-dioxide emission. Volcanic carbon dioxide escaped from solution in the waters deep within Lake Nyos, which occupies an old volcanic crater in Cameroon, and killed 1,800 people by asphyxiation. In 2004 Michel Halbwachs of the Université de Savoie, France, and coauthors reported the results of their continuing studies on the causes and mechanisms of such events, which are called limnic eruptions. The seepage of carbon dioxide into the lake is less than one-tenth the flow of carbon dioxide into the air in the volcanic area of Mammoth Mountain in California, for example, but the deep, stagnant layers of water in Lake Nyos trap the gas under pressure. A large volume of the gas can suddenly bubble to the surface and spread over the surrounding area. The scientists reported on mitigation procedures that they had developed in which a vertical plastic pipe carried deep CO2-rich water up toward the surface. The degassing of the CO2 from the water as it rose created a self-sustaining flow of water through the pipe.

      The Geological Society of America's presidential address by Clark Burchfiel of the Massachusetts Institute of Technology discussed how GPS (Global Positioning System) data from parts of India and China were forcing field geologists to look in new ways at crustal structures and geologic processes. International cooperative studies with Chinese geologists through the previous decade or so had been directed toward sorting out the complex geologic rearrangements arising from the collision of the Indian landmass with that of Asia. The mapping of enormous and intricate fault systems by field geologists had begun to be complemented in dramatic fashion by GPS-derived information giving the direction and rate of motion of many individual points across the vast terrane.

      Reports by Matthew Pritchard and Mark Simons of Princeton University and the California Institute of Technology and Alessandro Ferretti of Tele-Rilevamento, Milan, and colleagues from Italy and the U.S. demonstrated how measurements from satellite radar instruments, which complement GPS studies, had revolutionized the tectonic study of the topography and deformation of large and small areas of the crust. Applications included the study of volcanoes, active faults, landslides, oil fields, and glaciers. The technique that was used, called InSAR, involved successive imaging of a given area using synthetic aperture radar (SAR). The images were then superposed to generate interferograms, revealing changes in elevation that had occurred during the time between measurements.
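      The phase-to-displacement conversion behind an interferogram can be sketched as follows; the 5.6-cm C-band wavelength is an assumed, typical value for SAR satellites of that era, not one stated in the reports above:

```python
import math

WAVELENGTH_CM = 5.6  # assumed C-band radar wavelength

def los_displacement_cm(delta_phase_rad, wavelength_cm=WAVELENGTH_CM):
    """Line-of-sight surface displacement implied by an interferometric
    phase change, for a two-way (round-trip) radar path:
    d = (wavelength / (4 * pi)) * delta_phase."""
    return wavelength_cm / (4 * math.pi) * delta_phase_rad

# One full interferogram fringe (a 2*pi phase cycle) equals half a wavelength:
print(round(los_displacement_cm(2 * math.pi), 1))  # 2.8 cm of motion per fringe
```

Each coloured fringe in an interferogram thus marks a few centimetres of ground motion, which is why the technique resolves the centimetre-scale volcanic inflation and millimetre-per-year fault creep described here.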

      Pritchard and Simons summarized the InSAR results gathered over 11 years from the central subduction arc of South America, a region along the Pacific coast containing about 900 volcanic structures. They studied the deformation within four circular volcanic structures having diameters of 40–60 km (25–37 mi). The deformation within each structure was greatest at the centre, with a displacement of 10–20 cm (4–8 in), and decreased symmetrically from the centre. Of the four structures (none of which was an actively erupting volcano), two were associated with the inflation of large stratovolcanoes and one was associated with the sinking of a large volcanic caldera. The scientists calculated that these deformations could be explained by the injection or withdrawal, respectively, of magma at a depth of 8–13 km (5–8 mi) below the surface. The connection between fairly frequent short-lived pulses of magma movement at depth and surface eruptions remained uncertain. Monitoring deformations through the use of InSAR was expected to become a critical tool for understanding volcanic hazards, elucidating the processes at depth that lead to an eruption.

      Ferretti and colleagues modified the InSAR technique, improving its precision sufficiently to measure surface motions with an accuracy of better than one millimetre per year (0.04 in per year). Using this technique to reveal complex patterns of surface motions in the San Francisco Bay area, they found that the San Andreas strike-slip fault was accommodating 40 mm per year of relative motion and the Hayward fault was slipping by about 5 mm per year. Throughout the area the rate of tectonic uplift was generally less than one millimetre per year, with some local regions of more rapid uplift. Areas of unconsolidated sediment and fill flanking the bay exhibited the highest rates of change, with a subsidence of about two centimetres per year. Superimposed on the slow tectonic uplift of the East Bay Hills area of 0.4 mm per year were deep-seated creeping landslides in the Berkeley Hills moving downhill at an average speed of 27–38 mm per year, accelerating during wet months and ceasing during summer months.

Peter J. Wyllie

      On December 26, 2004, an undersea earthquake with an epicentre west of the northern end of Sumatra in Indonesia had a moment magnitude of 9.0, the largest since the 9.2-magnitude Alaska earthquake in 1964. A portion of the ocean floor shifted upward along more than 1,000 km (600 mi) of the fault that lies between the Burma and Indian tectonic plates. The movement displaced an enormous volume of seawater and created a tsunami—a series of long-period ocean waves. A tsunami can travel a great distance at speeds as fast as 800 km/hr (500 mph), but as the waves reach a shoreline, their speed is reduced and they build in height. As an example, Sri Lanka, though located approximately 1,200 km (750 mi) from the fault, was struck some 2 hours later by waves that were reported to have reached a height of 9 m (30 ft). With deadly and devastating force the Indian Ocean tsunami overran the coastal areas of many countries, from Malaysia in Southeast Asia to Tanzania in East Africa. The greatest loss of life occurred in Banda Aceh and other coastal cities in northern Sumatra. (See Disasters: Sidebar (Deadliest Tsunami ).)
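      The travel time quoted above is consistent with the shallow-water (long-wave) approximation, in which wave speed depends only on water depth; the 4,000-m mean depth used below is an illustrative assumption for the open-ocean leg of the journey:

```python
import math

def tsunami_speed_kmh(depth_m, g=9.81):
    """Shallow-water (long-wave) speed v = sqrt(g * d), converted to km/h."""
    return math.sqrt(g * depth_m) * 3.6

v = tsunami_speed_kmh(4000)       # assumed mean open-ocean depth
hours = 1200 / v                  # approximate fault-to-Sri Lanka distance
print(round(v), round(hours, 1))  # ~713 km/h, ~1.7 h in deep water
```

The roughly two hours actually observed reflects the waves slowing sharply, and building in height, as the seafloor shoals toward the coast.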

      In January 2004 a group of seismologists from Princeton University published a new 3-D tomographic model of the Earth's interior. They produced the model by processing earthquake-generated seismic waves, in much the same manner that images of a fetus within the womb are made by processing ultrasound waves. Using an innovative algorithm to process the seismic-wave data, the seismologists were able to reveal the existence of cylindrical plumes of material that extend from the core-mantle boundary, some 2,900 km (1,800 mi) beneath the surface of the Earth, to “hot spots” of volcanic activity at the Earth's surface. (The volcanic activity of hot spots is generally unrelated to the volcanic activity that occurs at tectonic plate boundaries.) The material in the plumes was thought to be slowly rising through the mantle and to be hundreds of degrees warmer than its surroundings, remaining solid until it is within a few kilometres of the surface. Not all the plumes in the model originate at the core-mantle boundary. The plume associated with the Icelandic hot spot, for example, begins at a depth of only about 700 km (430 mi).

      A new crystalline structure for the mineral MgSiO3 was discovered during the year by a group of Japanese researchers. This mineral is the predominant component of the Earth's lower mantle. The researchers found that when the common form of MgSiO3, called perovskite, is subjected to extreme pressure and temperature (specifically, 125 gigapascals and 2,230 °C), its crystalline structure changes into a denser form called the post-perovskite phase. Conditions necessary for the formation of the post-perovskite phase exist in the mantle at and below a depth of 2,700 km (1,700 mi). The presence of this phase deep within the mantle may explain many of the enigmatic seismological properties of that region. For example, the reflection of seismic waves from what appears to be a structure above the core-mantle boundary may be caused by the difference in density between perovskite and the post-perovskite phase, and the fact that the elastic properties of the post-perovskite phase are anisotropic (vary with direction) may be the reason the velocity of seismic waves in the lower mantle depends on the waves' polarization.

      In 2002 two identical satellites were launched into orbit for an Earth-science mission called GRACE (Gravity Recovery and Climate Experiment), operated through a partnership between NASA and the German Aerospace Centre. The satellites orbited the Earth at an altitude of about 500 km (300 mi) and provided data for measuring the Earth's gravitational field. In 2004 scientists published the first results from the GRACE mission, presenting a global map of gravity anomalies with a spatial resolution about 10 times greater than that of previous maps. The gravity anomalies are largely caused by variations in the density of materials from place to place within the Earth, and they give clues to understanding the creeping convective motion of material within the Earth's mantle. Some gravity anomalies vary with time, and the scientists reported a strong seasonal variation in South America. This variation appeared to be related to the flow of groundwater in the Amazon basin, and the new observations would help hydrologists merge models of well-studied local systems of water flow into a continental-scale model.

      In mid-2004 scientists reported the discovery of an impact crater in the shallow waters off the northwestern coast of Australia. The geologic feature, known as the Bedout High, is overlain by a layer of sediment about 3 km (2 mi) deep and was first identified as a potential impact site from an analysis of data from a marine seismological experiment. The scientists used a combination of geologic, geochemical, and geophysical observations to confirm the identity of the crater and to link it with the mass extinction that occurred between the Permian and Triassic geologic periods about 250 million years ago. Although this extinction event was less well known than the one that included the demise of the dinosaurs between the Cretaceous and Tertiary periods, it was the more severe of the two—about 90% of all marine species and 70% of land vertebrate species became extinct. The scientists pointed out that a massive amount of volcanic activity in Siberia produced large flows of basalt at about the time the Bedout High impact crater was formed, and they speculated that there might be a connection between the two events.

Keith Koper

Meteorology and Climate
      In 2004 abrupt climate change was a topic widely discussed in news reports and was the subject of a popular disaster movie. A number of scientists believed there was reason to be concerned that within a matter of decades a warming of the climate in the Arctic could lead to cooler climates in Europe and parts of North America. In theory, an increase in Arctic air temperature would lead to greater rainfall and to the melting of ice in the Arctic, which in turn would increase the flow of fresh water into the northern Atlantic Ocean in the area south of Greenland. Because fresh water is more buoyant than salt water, it would interfere with the surface ocean currents of the oceanic circulation system known as the Atlantic conveyor belt, which transports warm water northward from the tropics. Without this warm water, the climates of Europe and parts of North America would become colder, and precipitation patterns would change in various parts of the world.

      Although evidence existed that the northern Atlantic Ocean was becoming significantly less salty, scientists did not know how great the change in salinity would have to be in order to trigger a major shift in climate. A number of scientists were skeptical that abrupt climate change was a near-term threat. David Battisti of the University of Washington noted that at the rate at which the salinity was decreasing, it would take 200 years or more to slow the circulation of the Atlantic conveyor belt. In addition, warming of the upper layers of the ocean might substantially offset the loss of buoyancy and moderate the effects associated with a decrease in salinity. A recent report from the U.S. Climate Change Science Program suggested that recent changes in the distribution of fresh and saline ocean waters were occurring in ways that might be linked to global warming.

      Various studies involving the measurement of global sea level indicated that it was rising. The rise was believed to be caused by the thermal expansion of the oceans (which would correspond to recent warming trends) and by the melting of continental ice, such as glaciers, with a subsequent increase of the volume of the oceans. Researchers Peter Wadhams of the University of Cambridge and Walter Munk of the Scripps Institution of Oceanography, La Jolla, Calif., determined that the warming of the oceans was causing a rise in sea level of about 0.5 mm (1 mm = about 0.04 in) per year and that glacial melting contributed about another 0.6 mm per year—resulting in a total rate of 1.1 mm per year. Other researchers calculated higher rates. For example, John A. Church and co-workers of CSIRO Marine Research, Hobart, Tas., Australia, found a global increase of 1.8 mm per year for the period 1950 to 2000. Scientists in the U.S. Climate Change Science Program found a similar overall rate of increase (1.5 to 2 mm per year) and noted that their research provided evidence suggesting that the melting of polar ice sheets could play an important role in rising sea levels.
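The budget quoted above is simple arithmetic, sketched here for concreteness; the century projection assumes, purely for illustration, that the quoted rates remain constant.

```python
# The Wadhams-Munk sea-level budget from the text, as plain arithmetic.
thermal_expansion = 0.5  # mm/yr, from warming of the oceans
glacial_melt = 0.6       # mm/yr, from melting of continental ice

total = thermal_expansion + glacial_melt
print(total)  # 1.1 mm/yr, as stated in the text

# A century of rise at this rate, assuming (illustration only) that
# the rate stays constant:
print(total * 100.0)  # about 110 mm, i.e., roughly 11 cm
```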

      Additional research conducted as part of the U.S. Climate Change Science Program used satellite data to show that the portion of the Arctic Ocean covered by perennial sea ice had declined by about 9% per decade since 1978 and that the decline could have large-scale consequences for climate. No direct evidence was found that greenhouse gases were responsible for the melting of sea ice or for a reduction of snow cover in the Arctic, but some evidence showed that the natural weather pattern known as the North Atlantic Oscillation/Northern Annular Mode might have contributed to the overall decrease in Arctic sea ice. Weather patterns that changed from year to year were a major cause of variability in snow and ice coverage. For example, a pattern of cold weather that persisted in central and eastern North America during the summer resulted in the lingering of ice on the waters of Hudson Bay through the end of August for the first time since 1994. An Arctic Climate Impact Assessment study issued in November 2004 concluded that the “Arctic is now experiencing some of the most rapid and severe climate change on Earth,” and it indicated that climate change is expected to accelerate over the next 100 years.

      In September 2004 the United States released the first draft of its plan to monitor the Earth as part of the U.S. Integrated Earth Observation System, a component of the Global Earth Observation System involving nearly 50 countries. The draft plan, produced through the collaborative effort of 18 federal agencies under the auspices of the National Science and Technology Council, focused on nine areas of study with potential benefit to society, including weather forecasting and the prediction and mitigation of climate variability and change. The plan was to be incorporated within a larger intergovernmental document to be presented at the third global Earth Observation Summit in Brussels in February 2005.

      A large portion of the annual rainfall across the southwestern United States and northwestern Mexico occurs during thunderstorms generated by a seasonal shift of wind patterns between June and the end of September. Improved forecasts of this summer monsoon were seen as an important goal for meteorologists to help predict drought in these water-scarce areas. The field phase of the North American Monsoon Experiment began in June 2004. For nearly four months, scientists from the United States, Mexico, and several Central American countries collaborated in collecting extensive atmospheric, oceanic, and land-surface observations in northwestern Mexico, the southwestern United States, and adjacent oceanic areas. Scientists hoped to use the data to explore improvements in global models of weather and climate, potentially resulting in better forecasts of summer precipitation months to seasons in advance.

Douglas LeComte

▪ 2004


Geology and Geochemistry
      A team of earth scientists in 2003 reported the success of an experiment, begun in 1999 in western Washington state, that was revolutionizing investigations of surface-rupturing faults, landslide hazards, surface processes such as runoff and flooding, and past continental glaciation in the region. The dense forest cover in the Puget Sound area had frustrated high-resolution topographic mapping of the land surface by conventional photographic techniques. In an alternate approach, Ralph Haugerud of the U.S. Geological Survey (USGS) and five colleagues from the USGS, NASA, and the Puget Sound Regional Council synthesized topographic survey data collected from aircraft by lidar (light detection and ranging). Analogous to radar mapping with microwaves, the lidar technique measured the distance to a target by timing the round-trip travel of short laser pulses scanned across and reflected from the target area. The narrow laser beam, operating at a typical pulse rate of 30,000 per second, was able to probe between trees to reveal variations in surface height with a remarkable accuracy of 10–20 cm (4–8 in). The Puget Sound Lidar Consortium, an informal group of planners and researchers supported by the USGS, NASA, and local government, acquired the lidar topographic data for more than 10,000 sq km (3,860 sq mi) of lowlands around Puget Sound.
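The round-trip timing underlying lidar ranging reduces to range = c·t/2. The following sketch uses illustrative values (not figures from the Puget Sound survey) to show the arithmetic and why centimetre-scale height accuracy demands sub-nanosecond timing.

```python
# Lidar ranging: time a laser pulse's round trip, then range = c * t / 2.
# Values below are illustrative, not from the Puget Sound survey.

C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds):
    """One-way distance implied by a round-trip pulse travel time."""
    return C * t_seconds / 2.0

# A pulse returning after about 6.67 microseconds reflected from a
# surface roughly 1 km away:
print(range_from_round_trip(6.67e-6))  # metres

# The quoted 10-20 cm height accuracy implies that round-trip timing
# must be resolved to better than about 2 * 0.1 m / c:
print(2 * 0.10 / C)  # seconds, i.e., under a nanosecond
```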

      The effort resulted in the discovery of many previously unidentified geologic features, including ruptures along five known fault zones, some of which were later explored on the ground to investigate the frequency of past breaks and associated earthquakes. Among the evidence for glacial processes revealed by the lidar mapping were two intersecting sets of roughly parallel grooves spaced hundreds of metres apart in the solid land surface and having amplitudes (distances from ridge crests to valley bottoms) of metres to tens of metres. One set of older north-south grooves was overprinted by another set of younger grooves having a northeast-southwest orientation. This topography was caused by flowing ice and clearly demonstrated a significant change in the direction of ice flow.

      The glaciers that spread across North America and Eurasia during the last ice age were trivial compared with the ice postulated to have covered all of Earth during so-called Snowball Earth periods about 2.4 billion years ago and again between 890 million and 580 million years ago. In 2003 John Higgins and Daniel Schrag of Harvard University used the results of a computer model of the ocean-atmosphere system to interpret the geochemistry of the global, characteristic sequence of limestone deposits—carbonate “cap rocks”—that overlie the glacial deposits of the more recent snowball period. Previous arguments had assumed that the carbon isotopes cycling between the atmosphere, the ocean, and exposed carbonate rock would be in a steady state, which did not explain the unusual changes in concentrations of carbon isotopes found in the cap rocks. The new model calculations accounted for the changes in terms of an increase in sea-surface temperature, which affected the exchange between carbon dioxide and the carbonates that form the rocks. In the early 1990s, Joseph Kirschvink of the California Institute of Technology had solved the dilemma of how Earth could have escaped its snowball condition through accumulation of carbon dioxide in the atmosphere from volcanic degassing and consequent trapping of solar radiation via the greenhouse effect. The new calculations emphasized the importance of the effect of high atmospheric carbon dioxide concentration on seawater chemistry and its relationship to the formation of the limestone cap rocks associated with the thawing of a frozen Earth.

      Geochemical evidence from the deep drill cores extracted from the remaining ice sheets in Greenland and Antarctica since the 1970s has transformed investigations of Earth's climatic history. In the Vostok ice core drilled from Antarctica in the 1980s, the concentrations of hydrogen isotopes in the ice vary with increasing depth (corresponding to increasing time since the ice was deposited) in regular fluctuations that are associated with climatic cycles. Mechanisms to explain climatic cycles also involve the ocean, but scientists had found it difficult to correlate time periods that had been determined for depths within the ice core with time periods recorded in ocean sediments. During the year P. Graham Mortyn of the Scripps Institution of Oceanography, La Jolla, Calif., and four colleagues from Scripps and the University of Florida reported finding such a correlation. In so doing they clarified relationships between the Antarctic polar climate, air-sea interactions, and variability in the deep ocean and again demonstrated the important role of carbon dioxide as a greenhouse gas in past climate change. Mortyn and associates compared hydrogen isotope records from the Vostok core with their own detailed geochemical measurements of oxygen isotopes present in selected deep-sea sediment cores from the South Atlantic Ocean adjacent to Antarctica. They confirmed that the timing of the oscillations in both was synchronous over the past 60,000 years, and they extended the study of temperature oscillations through the past 400,000 years, using data from a previously drilled sediment core. The results suggested that during the last four major deglaciation events (ice-sheet retreats), changes in the temperature of polar air were synchronous with those of the nearby deep ocean and with changes in atmospheric content of carbon dioxide.

      Uncertainties about Earth's internal temperature were reduced in 2003 as a result of two independent sets of experiments, by Charles Lesher of the University of California, Davis, and three colleagues and by Kyoko Matsukage of Ibaraki University, Mito, Japan, and Keiko Kubo of the Tokyo Institute of Technology. Experimental determination of the temperature at which peridotite in Earth's mantle begins to melt as a function of depth (i.e., its melting curve, or solidus) provides some calibration for thermal models of Earth's interior and for the temperatures and types of melting experienced by peridotite rising in mantle plumes. Results published between 1986 and 2000 had differed by 150 °C (270 °F) in the pressure range of 4–6 gigapascals (GPa; corresponding to depths of 120–180 km [75–110 mi]). Lesher and colleagues concluded after intricate tests that the differences had arisen from the misbehaviour of thermocouples (temperature-measuring devices) used in some of the earlier experiments. Their results indicated that the solidus temperature of mantle peridotite at the investigated pressure range was as much as 150 °C lower than usually assumed, which had significant implications for estimated temperatures in connection with mantle convection and magma generation in general.

      In related experiments at lower pressures (1–2.5 GPa, corresponding to depths of 35–80 km [20–50 mi]), Matsukage and Kubo determined for the first time the systematic variation of chromium content in the mineral spinel as its parent rock, a type of dry (water-free) peridotite, was progressively melted. They compared their results with the measured chromium content in spinel from many natural peridotites and demonstrated that most of these rocks, which were known to have come from the mantle, had undergone more than one episode of partial melting. This supported the view that such rocks had experienced a complex history of successive episodes of magma generation and separation. In addition, the results of their dry experiments, compared with other experiments that included water under pressure, confirmed the generally accepted hypothesis that partial melting of peridotites originating from subduction zones (where the oceanic tectonic plates are sinking into the mantle) was accompanied by an influx of water-bearing fluid or melt. The source of the water presumably was ocean water that earlier had generated hydrated minerals in the basalt of the sinking oceanic plate.

Peter J. Wyllie

      During 2003, there were 14 major earthquakes (those of moment magnitude [Mw] 7.0–7.9) and one great earthquake (Mw 8.0 or higher). On December 26 the deadliest earthquake of the year (Mw 6.6) struck southeastern Iran, killing at least 26,000 people and injuring a comparable number. The city of Bam was hardest hit, with 85% of buildings damaged or destroyed. Another earthquake (Mw 6.8) with a high death toll rocked northern Algeria on May 21, taking more than 2,200 lives and injuring at least 10,200. Other earthquakes with significant fatalities occurred on January 22 (Mw 7.6) in Colima state, Mex.; February 24 (Mw 6.4) in southern Xinjiang province, China; and May 1 (Mw 6.4) in eastern Turkey. The great earthquake of the period (Mw 8.3) struck the southeastern Hokkaido region of Japan on September 25; because its epicentre was about 60 km (40 mi) offshore, injuries and damage were comparatively light.
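As background not given in the text, moment magnitude is defined from seismic moment M0 (in newton-metres) by the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 − 9.1); a short sketch shows how steeply moment grows with magnitude, and why a "great" Mw 8.3 event dwarfs an Mw 6.6 shock.

```python
# Standard Hanks-Kanamori relation, inverted to give seismic moment
# (N*m) from moment magnitude. Background relation, not from the text.

def moment_from_mw(mw):
    """Seismic moment (N*m) implied by a moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.1)

# The Hokkaido great earthquake (Mw 8.3) versus the Bam earthquake
# (Mw 6.6): a 1.7-unit difference means hundreds of times the moment.
ratio = moment_from_mw(8.3) / moment_from_mw(6.6)
print(ratio)
```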

      Old observations regarding the connections between earthquakes and hydrology were discussed in new ways during the year. For instance, it had long been regarded as little more than a scientific curiosity that after big earthquake tremors, nearby streams sometimes flowed more rapidly for a few days and wells located thousands of kilometres away showed permanent falls or rises in water levels. In a review of recent research on the hydrologic effects of earthquake-caused crustal deformation and ground shaking, Michael Manga of the University of California, Berkeley, and David Montgomery of the University of Washington suggested that in some instances of stream-flow surges following earthquakes, shallow seismic waves pass through groundwater-sodden soil, shaking and compacting it and squeezing the water into streams. In cases of wells drilled into solid bedrock, the researchers described how seismic waves can riddle the rock with fractures, whereupon water seeps in and well-water levels drop. In cases of wells drilled into aquifers made of unconsolidated deposits, seismic waves can compact the deposits and shrink the aquifer volume, pushing the water table upward. Manga and Montgomery concluded that the complex interactions between earthquakes and hydrologic systems offered unique opportunities for learning more about the workings of both.

      An important event in seismological research was the initiation of the San Andreas Fault Observatory at Depth (SAFOD), a 3.9-km (2.4-mi)-deep instrumented borehole through California's infamous San Andreas Fault Zone, where the Pacific and North American tectonic plates are slowly slipping past each other. Sited on private land near Parkfield, Calif., the hole would begin on the western (Pacific) side of the fault, descend vertically and then angle to the east, and eventually pierce the fault zone to end on the eastern (North American) side. It would enable scientists to install sensitive seismometers and other instruments in the fault zone to monitor seismic activity and real-time changes in rock deformation, temperature, fluid pressure, and other physical and chemical properties that occur prior to earthquakes. The findings were expected to shed new light on exactly how earthquakes work.

      The year was exciting for earth scientists in Italy, considered to be the “cradle of volcanology.” Stromboli Island's volcano erupted with a once-in-a-century level of intensity on April 5, showering parts of the coastline with scoria and blocks up to 2 m (about 6.5 ft) in diameter but causing no human fatalities. The event was part of an unusual series of violent eruptions that had begun in December 2002. Sicily's Mt. Etna experienced major flank eruptions between October 2002 and January 2003. Lava flows destroyed ski facilities on northern and southern slopes of the volcano, and near-continuous ash falls plagued two regional airports for a period of six weeks. Other significant eruptions occurred in Ecuador (Reventador), Montserrat (Soufrière Hills), Guatemala (Fuego), and the Mariana Islands (Anatahan).

      Beneath the deep ocean waters lies a vast province in which Earth's crust is continually renewed by volcanism and hydrothermal activity along the mid-oceanic ridge systems. Following planning meetings and workshops attended by more than 300 scientists engaged in a range of specialties in geophysics, geology, biology, chemistry, and oceanography, an integrated initiative, RIDGE 2000, was launched in late 2001 under the auspices of the U.S. National Science Foundation. The focus of the effort was “a comprehensive, integrated understanding of the relationships among the geological and geophysical processes of planetary renewal on oceanic spreading centers and the seafloor and subseafloor ecosystems that they support,” and it involved far-reaching collaboration between scientists to develop whole-system models through exploration, mapping, and sampling at a limited number of representative sites.

      As of 2003 three sites had been designated for the initial integrated studies: the 8°–11° N segment of the East Pacific Rise, off Central America; the Endeavor Segment of the Juan de Fuca Ridge, in the eastern Pacific Ocean off Vancouver Island, B.C.; and a segment of the East Lau Spreading Center in the Lau Basin in the western Pacific, near Fiji. Among the fundamental questions to be addressed were the relationships between mantle flow, mantle composition, and morphology and segmentation of the mid-oceanic ridges; the organization of the flow of magma in the mantle and crust underlying the seafloor; the effects of biological activity, particularly that of microorganisms, on the chemistry of hydrothermal vents and hydrothermal circulation; and the role of hydrothermal flow in influencing the physical, chemical, and biological characteristics of the biosphere from deep in the seafloor to the overlying water column.

      A highlight of research related to the second question, concerning the distribution and transport of melt in the oceanic crust, was a seismic tomography study carried out by Douglas Toomey and Laura Magde of the University of Oregon and co-workers. By processing velocity data from seismic waves in a way similar to the processing of X-ray data in medical tomography, they produced vivid three-dimensional images of the magma “plumbing system” in the crust below a segment of the Mid-Atlantic Ridge.
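The principle behind such a study is that travel times of waves crossing a region can be inverted for the velocity structure inside it. The toy sketch below, with geometry, ray paths, and velocities invented purely for illustration, recovers cell velocities from straight-ray travel times by least squares.

```python
# Toy travel-time tomography: straight rays cross a 2x2 grid of cells,
# each with an unknown "slowness" (1/velocity); observed travel times
# are inverted by least squares. All values invented for illustration.
import numpy as np

# Path length (km) of each ray through each of the 4 cells (one row
# per ray): two horizontal rays, two vertical rays, one diagonal ray.
L = np.array([
    [10.0, 10.0, 0.0, 0.0],
    [0.0, 0.0, 10.0, 10.0],
    [10.0, 0.0, 10.0, 0.0],
    [0.0, 10.0, 0.0, 10.0],
    [14.14, 0.0, 0.0, 14.14],
])

# "True" model: one slow cell (5.5 km/s) mimicking a magma-rich zone.
true_slowness = np.array([1 / 6.0, 1 / 6.0, 1 / 5.5, 1 / 7.0])  # s/km
times = L @ true_slowness  # synthetic observed travel times, s

# Invert the travel times for the slowness of each cell.
est, *_ = np.linalg.lstsq(L, times, rcond=None)
print(1.0 / est)  # recovered velocities, km/s
```

Real tomography uses many thousands of curved ray paths and regularized inversion, but the linear-algebra core is the same.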

Murli H. Manghnani

Meteorology and Climate
      The drought that gripped southern Europe, southwestern Asia, and the U.S. between 1998 and 2002 appeared to be connected to temperatures in the tropical Pacific and Indian oceans, according to a study reported in 2003. Martin Hoerling and Arun Kumar of the U.S. National Oceanic and Atmospheric Administration found that during the drought years, surface waters in the eastern tropical Pacific Ocean were cooler than normal, while those in the western Pacific and Indian oceans were warmer than average. When they ran computer simulations of Earth's atmospheric circulation using the actual ocean temperature data, the jet stream in the models shifted northward, pushing wet weather north and away from midlatitude regions. Extended La Niña conditions in the east-central tropical Pacific explained the cooling observed there. In contrast, the western Pacific and Indian oceans were unprecedentedly warm, which the researchers attributed to the ocean's response to increased greenhouse gases in the atmosphere—an effect that they thought likely to continue. The results of the study reinforced the necessity for improved understanding of the links between ocean and atmosphere.

      Another part of the world ocean may have been associated with drought and climate variability. Research reported during the year on the causes of multiyear “megadroughts” hinted that opposing shifts in temperatures in the tropical Pacific and North Atlantic oceans occurred while disastrous long-term droughts persisted across the North American continent. Stephen Gray of the University of Wyoming and colleagues used seven centuries of tree-ring data from the central and southern Rocky Mountains as indicators of precipitation changes with oscillation periods of 40–70 years. Their results suggested that the Great Plains, the Rockies, and the U.S. Southwest were stricken by a widespread megadrought when the tropical Pacific cooled at the same time that the North Atlantic warmed. This pattern could help explain both the long large-scale drought of the 1950s and the recent 1998–2002 drought; in each case, cool waters spread over the eastern Pacific while warmth covered portions of the North Atlantic.

      The record-breaking heat wave experienced in Europe during August (see Calendar; Disasters), though not necessarily related to climate change, gave added impetus to scientists researching the extent and causes of the observed trends in rising global temperatures. Although much press attention was given to the possible effects of greenhouse gases, the size of the contribution that land use makes to global climate change may have been underestimated, according to a study by two investigators from the University of Maryland. Eugenia Kalnay and Ming Cai compared two sets of 50-year temperature records for the entire U.S., one set collected from surface stations and the other from above-surface instruments (satellites and weather balloons). They concluded that not only the growth of cities but also that of agricultural activities make the world seem warmer than what could be attributed to the effects of greenhouse gases alone. The overall rise in U.S. mean surface temperatures due to such changes in land use could be as much as 0.27 °C (0.5 °F) per century—a value at least twice as high as previous estimates based on urbanization alone.

      Not only do cities warm the atmosphere, but they also affect rainfall patterns. “Urban heat islands,” created from solar-heat-retaining streets and buildings, were known to increase the amount and frequency of rainfall in and downwind of a number of cities. During the year a NASA-funded analysis of data from the Tropical Rainfall Measuring Mission satellite and from rain gauges on the ground corroborated this effect for the Houston, Texas, area. Average rainfall from 1998 to 2002 was 44% higher downwind of Houston than upwind and 29% higher over the city than upwind. Another study showed that the combination of increased particle pollution and higher air temperatures over large cities likely was enhancing cloud-to-ground lightning strikes in those locales. Analyzing three summer seasons (2000–2002) of lightning-flash data from three large urban areas in southeastern Brazil, Kleber Naccarato and colleagues of Brazil's National Institute for Space Research observed a 60–100% increase in flash density over the urban areas compared with surrounding regions.

      Although long-term temperature trends vary widely from region to region, evidence mounted that climate change could be affecting plants and animals across the globe. The results of one study charting the biological impact of the average rise in global temperature of 0.6 °C (1 °F) in the past 100 years suggested that the warming was moving species' ranges northward and shifting spring events earlier. After mining data from previous studies involving 1,700 species, Camille Parmesan of the University of Texas at Austin and Gary Yohe of Wesleyan University, Middletown, Conn., reported that ranges were creeping toward cooler latitudes about 6 km (3.7 mi) on average per decade. In addition, spring events such as breeding in frogs, bird nesting, bursting of tree buds, and arrival of migrating butterflies and birds were taking place about two days earlier per decade. (For discussion of a study assessing the effects of climate change on plant productivity, see Life Sciences: Botany.)

      All research is based on data, and accurate global data are essential for sound climate research. In late July representatives of approximately 30 countries and 20 international organizations assembled at the Earth Observation Summit, a conference hosted by the U.S. with the goal of establishing a comprehensive and coordinated Earth observation system. The new system would focus on providing critical scientific data to help policy makers come to more-informed decisions regarding climate and the environment. Linking and expanding the many current disparate observation systems were expected to lead to better observations and models, which in turn would benefit fundamental earth science and improve its predictive power in such applications as climate change, crop production, energy and water use, disease outbreaks, and natural-hazard assessment.

Douglas Le Comte

▪ 2003


Geology and Geochemistry
      A comprehensive 2002 publication by Ali Aksu of the Memorial University of Newfoundland with six coauthors (from the U.K., Canada, the U.S., and Turkey) contradicted the popular Noah's Flood Hypothesis. In 1996 William Ryan, Walter Pitman, and co-workers (Columbia University, New York City) had discovered that mollusk shells from the Mediterranean Sea suddenly appeared on the shelves of the Black Sea about 7,500 years ago. They developed the case—the Flood Hypothesis—that while the connecting channels between the Mediterranean and Black seas were closed, with bedrock bottoms exposed to the atmosphere during glacial periods, the isolated Black Sea had evaporated down to about 150 m (1 m = 3.28 ft) lower than modern sea level. About 7,500 years ago, they surmised, water broke through, causing a catastrophic flood of Mediterranean waters that refilled the Black Sea in about two years and washed in the Mediterranean mollusks that then settled on the Black Sea shelves. They suggested that this event could be the historical basis for Noah's Flood.

      Aksu and coauthors reported on geologic and geochemical results from sedimentary cores drilled beneath the Sea of Marmara, a gateway that connects the Black Sea with the Mediterranean. They compiled a history of the water flowing through the Sea of Marmara during the past 10,000–25,000 years on the basis of seismic profiles of the submarine sediments and the geochemistry and sequential contents (sediment types, carbon isotopes, salinity, fossils, and pollen) of the one- to two-metre-long cores drilled from the sediments. They found no evidence for a catastrophic flood and were convinced that the evidence rather supported an outflow hypothesis, which involved continuous overflow of water from the Black Sea into the Mediterranean over almost 10,000 years. The sudden appearance of Mediterranean fossils in the Black Sea was explained, they suggested, by changes in salinity 7,500 years ago that permitted the opportunistic mollusks to populate the shallow Black Sea shelves.

      In 2002 a controversy over interpretation of rocks famous for evidence of early life drew attention to the continuing importance of classical geology in these days of near-magical geochemical instruments. Efforts to decipher the origin of life have often focused on the investigation of ancient rocks in southwestern Greenland, in particular the banded-iron formation (BIF) rocks of the Isua greenstone belt. These were originally sedimentary rocks formed beneath water. Tectonic activity altered their original structure and mineralogy, but their origin as sedimentary rocks is not disputed. In 1996 Stephen J. Mojzsis (then a graduate student at Scripps Institution of Oceanography, La Jolla, Calif.) and colleagues had reported that rocks from nearby Akilia island were also BIFs, with crosscutting veins of an igneous rock that yielded an age of 3.85 billion years. The researchers concluded that the values of carbon isotopes measured in small inclusions of graphite were a signature for the existence of 3.85-billion-year-old life in the original sediments. Christopher M. Fedo of George Washington University, Washington, D.C., and Martin J. Whitehouse of the Swedish Museum of Natural History, while engaged in a multiscientist investigation of the Isua belt, also visited Akilia. The rocks there did not look like the metamorphosed BIFs with which they were familiar. The researchers' geochemical analyses, published in 2002, together with the field relationships, satisfied them that the rocks were igneous, not sedimentary BIFs. Such rocks would have formed at a temperature much too high for the graphite inclusions to represent original life. Resolution of the controversy would require a satisfactory explanation for the presence of the iron oxide mineral magnetite in quartz-rich layers, which would involve traditional detailed tectonic, petrographic, and mineralogical investigation of the rocks in addition to geochemical analyses.

      In a 2002 review of metamorphism, Michael Brown of the University of Maryland wrote that excitement remained focused on the extreme conditions of pressure and temperature to which some crustal rocks have been subjected. The conventional diagrams for metamorphic facies extend to 10 kilobars (1 kilobar = 1,000 atmospheres) for rocks metamorphosed at a depth of 25–30 km (1 km = 0.62 mi) and temperatures up to about 850 °C (1,560 °F). The discovery of crustal rocks containing minerals such as coesite and diamond indicated that these rocks reached depths of 100 km (and corresponding pressures of 30 kilobars) or more in ultrahigh-pressure metamorphism (UHPM). The mineralogy of some other rocks indicated the attainment of 1,100 °C (2,000 °F) in ultrahigh-temperature metamorphism (UHTM). UHPM rocks provide information about the subduction of crustal rocks to extreme depths, and UHTM rocks provide information about the involvement of crustal rocks with hot, shallow asthenospheric mantle, perhaps through the breaking off and sinking of crustal rocks. The oldest-known UHPM rocks are dated at about 620 million years, and the oldest-known UHTM rocks are about 2.5 billion years old. Brown noted that these dates correspond roughly to boundaries between the three eons—the Archean, the Proterozoic, and the Phanerozoic—that have always been recognized as distinctive. Further documentation of UHPM and UHTM rocks through time may indicate whether these three geologic eons are characterized by different styles of global geodynamics, a possibility that has been much debated.
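The pressure-depth equivalences quoted above follow from the lithostatic relation P = ρgh; a rough sketch, assuming an average overburden density of about 3,300 kg/m³ (a round value chosen for illustration):

```python
# Lithostatic pressure-depth conversion, P = rho * g * h, solved for
# depth. The overburden density is an assumed round value.

RHO = 3300.0  # average density of overlying rock, kg/m^3 (assumed)
G = 9.81      # gravitational acceleration, m/s^2

def depth_km(pressure_gpa):
    """Depth (km) at which lithostatic pressure reaches the given value."""
    return pressure_gpa * 1e9 / (RHO * G) / 1000.0

print(depth_km(1.0))  # 10 kilobars (1 GPa): roughly 30 km
print(depth_km(3.0))  # 30 kilobars (3 GPa): roughly 90-100 km
```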

      In 2002 Ethan F. Baxter, Donald J. DePaolo, and Paul R. Renne of the University of California, Berkeley, published a significant advance in the interpretation of mineralogical ages based on argon isotopes. Biotites sampled across the boundary between an amphibolite and a contemporaneous pelitic rock in the Alps yielded different apparent ages. The biotite ages in the pelite averaged 12 million years—consistent with known geology—but those in the amphibolite ranged from 15 million to 18 million years. The anomaly of the greater ages in the amphibolite was ascribed to “excess argon.” The origin of excess argon was poorly understood, but it was a bane for geochronologists because frequently the only way to confirm its presence was to make independent age determinations. As a rock cools, argon-40 produced or incorporated within minerals at high temperatures is able to diffuse away until “closure” occurs, at a temperature where diffusivity slows effectively to zero. Subsequently, additional argon-40 is produced from potassium-40 at a known rate and remains trapped in the mineral. Measuring the ratio of argon-40 to potassium thus provides the time at which closure occurred—that is, the “closure age” of the mineral; the presence of excess argon indicates exceptions to these assumptions. Baxter and his co-workers established equations that took into account not only the diffusive properties of the minerals but also the characteristics of the intergranular medium (typically a fluid) through which argon must diffuse after exiting the minerals. Numerical modeling showed that excess argon depends on “bulk rock argon diffusivity,” a factor not included in standard geochronological thinking. Quantitative modeling provides numerical limits for this diffusivity and suggests that it decreased rapidly about 15 million years ago in the amphibolite, corresponding to the geologically known onset of rapid exhumation and rheological changes of the rocks. In the pelite, with its different mineralogy and texture, the bulk rock diffusivity was not affected by the tectonic uplift, and diffusive escape of argon continued until the closure temperature was reached 12 million years ago. With this kind of understanding, patterns of excess argon may be exploited to learn more about the properties and history of geologic systems.
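      The closure-age calculation described above rests on the standard potassium-argon decay relation. The following is a minimal sketch using published decay constants for potassium-40; the specific ratios and the 40% excess-argon figure are illustrative assumptions, not values from Baxter and co-workers:

```python
import math

# Decay constants for potassium-40 (per year); standard literature values.
LAMBDA_TOTAL = 5.543e-10   # total decay of 40K
LAMBDA_EC = 0.581e-10      # branch producing radiogenic 40Ar

def closure_age(ar40_k40):
    """Apparent closure age (years) from the radiogenic 40Ar/40K ratio."""
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_k40) / LAMBDA_TOTAL

def ratio_after(t_years):
    """Radiogenic 40Ar/40K accumulated t years after closure."""
    return (LAMBDA_EC / LAMBDA_TOTAL) * (math.exp(LAMBDA_TOTAL * t_years) - 1.0)

r = ratio_after(12e6)              # ratio in a biotite that closed 12 Myr ago
age = closure_age(r)               # inverting recovers ~12 Myr
age_excess = closure_age(1.4 * r)  # an assumed 40% excess argon-40 inflates the
                                   # apparent age into the 15-18-Myr range seen
                                   # in the amphibolite biotites
```

The point of the sketch is that excess argon enters the numerator of the measured ratio directly, so even a modest amount shifts an apparent age by millions of years.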

      The Galapagos Rift 2002 Expedition reported via satellite from the research ship Atlantis to journalists at the May meeting of the American Geophysical Union. The expedition marked the 25th anniversary of the discovery of submarine hydrothermal vents, those fascinating localities on the oceanic ridges where water circulates through the crust, is heated, and emerges as hot springs. The hot water contains material dissolved from the ocean crust, and as it encounters the cold ocean water, it precipitates sulfide-rich chimneys and provides chemical sustenance for bacterial mats and oases of exotic fauna. The expedition was continuing long-term investigations in the Galapagos Rift region that aimed to reconstruct the history of the formation of vents and the population of submarine oases, which are intermittently destroyed by lava flows. The scientists used a remarkable instrument, the Autonomous Benthic Explorer (ABE), a deep-swimming robot not attached to the surface ship. Following a preplanned path, the ABE mapped the seafloor by using sonar and made other measurements. Very detailed maps were produced, with a vertical resolution of one metre. In 25 years of study in this region, no chimney vents had been found, but with its sensitive thermometry the ABE discovered and tracked a trail of water only 0.02 °C (0.036 °F) warmer than the surrounding ocean water. This trail led to two extinct sulfide-bearing chimneys that must have required water of at least 200 °C (392 °F)—the first evidence of high-temperature vents along the Galapagos Rift. The “Rose Garden” oasis with its spectacular tube worms, discovered in 1979, had provided the foundation for understanding the biological communities associated with vents, but the expedition found that this site had been covered by recent lava flows. These submarine oases of life in total darkness represent a most remarkable interplay between geology, geochemistry, and biology.

Peter J. Wyllie

       Earthquakes occur mainly because of the constant movement of Earth's lithospheric plates, which include the crust. For instance, most seismic activity in Alaska results from the interaction of the northwestwardly moving Pacific Plate with the corner of the North American Plate that comprises Alaska. On November 3 one of the largest earthquakes ever recorded in North America struck central Alaska. The epicentre of this Mw (moment magnitude) 7.9 earthquake was 120 km (75 mi) south of Fairbanks. The main shock ruptured a 300-km (190-mi) segment of the Denali Fault, east of the Parks Highway and the community of Cantwell, and was preceded by a Mw 6.7 foreshock on October 23. Although some support structures of the Trans-Alaska Pipeline were displaced, their earthquake-resistant features allowed the pipeline itself to remain intact. No casualties were recorded for either Alaskan earthquake. The Denali Fault, a bow-shaped strike-slip fault transecting Alaska, is perhaps the most significant crustal fault in the state and is seismically active. It experiences infrequent large earthquakes similar to those recorded along the northern and southern segments of the San Andreas Fault in California.
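      The Mw values quoted here derive from the seismic moment M0 (rigidity × fault area × slip), via the standard relation Mw = (2/3)(log10 M0 − 9.1) with M0 in newton-metres. The sketch below uses that conversion only to illustrate scale; the moment values are computed from the magnitudes, not reported measurements:

```python
import math

def moment_magnitude(m0_newton_metres):
    """Mw from seismic moment M0 in newton-metres (standard definition)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

def seismic_moment(mw):
    """Invert the definition: seismic moment (N*m) for a given Mw."""
    return 10.0 ** (1.5 * mw + 9.1)

m0_mainshock = seismic_moment(7.9)   # ~8.9e20 N*m for the Denali main shock
m0_foreshock = seismic_moment(6.7)   # ~1.4e19 N*m for the October foreshock

# The logarithmic scale compresses a huge range: 1.2 magnitude units
# corresponds to a ~63-fold difference in seismic moment.
ratio = m0_mainshock / m0_foreshock
```

This is why a Mw 7.9 event can rupture hundreds of kilometres of fault while its Mw 6.7 foreshock does comparatively local damage.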

      Earthquakes of 2002 with high human casualties included separate Mw 6.1 and 7.4 shocks in the Hindu Kush region of Afghanistan in March, which together killed more than 1,000 people, and a Mw 6.5 event in northwestern Iran in June, which killed more than 200.

      The most significant volcanic eruption in terms of human impact was that of Mt. Nyiragongo in the Democratic Republic of the Congo, commencing in January. Lava flowed southward at a rate of about 1–2 km (0.6–1.2 mi) per hour and entered the city of Goma. About 400,000 people in Goma were evacuated, and 14 villages were damaged by lava flows. The eruption killed at least 45 people and left about 12,000 families homeless.

      Beginning in late April, Mauna Loa on the island of Hawaii showed signs of renewed activity after an 18-year period of repose. Global Positioning System (GPS) stations and tiltmeters positioned around the volcano recorded deformation of as much as 5–6 cm (2–2.4 in) per year, interpreted as reinflation of Mauna Loa's magma chamber caused by the injection of additional material at a depth of 5 km (3 mi) beneath the summit.

      Among persistent active volcanoes, Sicily's Mt. Etna resumed its pattern of frequent summit eruptions in October, following the large flank event of July–August 2001. On October 27 Etna spewed a column of volcanic ash, blackening skies over Sicily and as far away as North Africa, 560 km (350 mi) south. Rivers of lava flowed halfway down the mountain's slopes, setting forests afire.

      In November astronomers reported what they described as the most energetic eruption ever seen in the solar system on the highly volcanic moon Io, one of the four Galilean satellites of the planet Jupiter. Working at the Keck Observatory on Mauna Kea, Hawaii, Franck Marchis and Imke de Pater of the University of California, Berkeley, and collaborators captured near-infrared images of the same side of Io two days apart, on Feb. 20 and 22, 2001. (Analysis of the images was not completed until 2002.) The earlier image showed a brightening near Surt volcano, the site of a large eruption in 1979 that had been identified from the flybys of the Voyager 1 and 2 spacecraft. Over the following two days, the hot spot grew "into an extremely bright volcanic outburst," according to the researchers. They estimated that the emitting area of the eruption was larger than the entire base of Mt. Etna. The lower limit of the interpreted temperature of the hot spot—1,400 K (2,000 °F)—was consistent with the temperature of basaltic eruptions on Earth.

      Scientists had monitored changes in Earth's oblateness—a slight bulge around the Equator caused by axial rotation—by means of satellite laser ranging techniques since the 1970s. During the year Christopher Cox of Raytheon Information Technology and Scientific Services and Benjamin Chao of NASA's Goddard Space Flight Center reported that, whereas the oblateness had been slowly decreasing over the past quarter century, it abruptly reversed that trend around 1998. The continually decreasing oblateness had been attributed mainly to rebound in the mantle after the last glacial period, when massive polar caps had covered the high latitudes in the north and south. The exact causes of the trend reversal were uncertain, but a possible reason was a large-scale mass redistribution in Earth's deep interior—specifically, a flow of material driven from high latitudes to the equatorial regions by Earth's dynamo in the liquid outer core and along the core-mantle boundary (located at a depth of 2,900 km [1,800 mi]). This explanation was consistent with a significant geomagnetic jerk (a sudden shift in the trend of the long-term variation of Earth's magnetic field) recorded in 1999, probably caused by the same material flow. A second possible cause examined by Cox and Chao was a large-scale mass redistribution in the oceans. In a subsequent report, Jean O. Dickey of the California Institute of Technology and collaborators made a case for glacial melting as yet another major factor in the trend reversal.

      Seismic tomography (imaging of the structure of Earth's interior by seismic velocity differences), three-dimensional global seismicity, and detailed GPS measurements of the surface were enabling geophysicists to improve their understanding of plate motions. As two plates collide, one is forced beneath the other and sinks into the less-dense upper mantle—a process called subduction. The descent of the subducted portions of the plates, called slabs, was thought to drive the motions of the plates on Earth's surface, but the exact mechanism by which the slabs and plates interact was not yet well understood. Clinton Conrad and Carolina Lithgow-Bertelloni of the University of Michigan showed that the present-day observed plate motions could be best modeled if the slabs that are sinking into the upper mantle are still mechanically attached to their source plates and thus generate a direct pull on the plates. In contrast, by the time the slabs reach the lower mantle (at about a 700-km [430-mi] depth), they are no longer well attached and instead draw plates via a suction force created by their sinking.

      The core-mantle boundary represents the most prominent discontinuity in Earth's interior with respect to chemistry and properties of deformation and flow. There the solid lower mantle, composed of silicates, meets the fluid outer core, composed of molten iron-nickel alloy. Using seismic-wave data from earthquakes in the Tonga-Fiji region in the South Pacific Ocean, Sebastian Rost and Justin Revenaugh of the University of California, Santa Cruz, detected rigid zones lying just within the top boundary of the outer core. Normally, seismic waves called shear waves cannot propagate through a fluid; when they encounter the core-mantle boundary, they reflect sharply from the molten alloy. Within the core-rigidity zones, however, the waves propagated at a very low velocity. The investigators interpreted these zones as thin (0.12–0.18 km [400–600 ft]) patches of molten iron mixed with solid material having a small shear-wave velocity, which enables the shear waves to travel in the outermost core. Such zones at the top of the outer core had previously been detected as topographic highs of the core-mantle boundary.

Murli H. Manghnani

Meteorology and Climate
       On June 24 the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard NASA's Aqua satellite began looking at Earth from about 700 km (435 mi) in space. Aqua, launched May 4, was a complement to NASA's Terra satellite, which had gone into orbit in 1999 carrying a twin MODIS instrument. MODIS viewed Earth's surface in 36 spectral bands ranging from visible to thermal infrared wavelengths. Combining data from the two instruments allowed a comprehensive daily examination of Earth that would help scientists study water evaporation, the movements of water vapour throughout the atmosphere, and cloud formation as well as various characteristics of the land and oceans.

      Also on June 24, NASA and the National Oceanic and Atmospheric Administration (NOAA) launched NOAA-17. The spacecraft was the third in a series of five Polar-orbiting Operational Environmental Satellites (POES) that had improved imaging and sounding capabilities and that would operate over the next 10 years. The satellite was expected to improve weather forecasting and monitor environmental phenomena around the world such as El Niño events, droughts, fires, and floods. The data would be used primarily by NOAA's National Weather Service for its weather and climate forecasts. Longer-term data records from the NOAA satellites would contribute to scientists' understanding of climate change.

      A new three-dimensional weather computer model from NOAA, covering the continental U.S., became operational in April. Called the RUC20 (for Rapid Update Cycle and the model's 20-km [12-mi] horizontal grid increments), it improved the accuracy and timeliness of the most immediate predictive information widely used for aviation, severe-weather forecasting, and general weather forecasting. Combining the latest observations from commercial aircraft, wind profilers, Doppler radar, weather balloons, satellites, and surface stations, the model produced new analyses and short-range forecasts on an hourly basis, with forecasts as far as 12 hours into the future every three hours—the most frequent updating of any NOAA forecast model. Maps and other products from the model were made available on the Internet.

      Late in the year, drought experts from the U.S., Canada, and Mexico neared the end of their preparations to launch a new program of continental-scale drought monitoring for North America. The existing Drought Monitor program, begun in 1999, provided weekly updates in the form of maps and text reports of the status of drought in the 50 U.S. states, available on the Internet. The expanded program, which was to be called the North American Drought Monitor and which would initially issue monthly assessments, was a cooperative arrangement between specialists currently producing the U.S. Drought Monitor and meteorologists from Mexico and Canada.

      A report issued in August by the UN Environment Programme indicated that a vast blanket of pollution stretching across South Asia, dubbed the Asian Brown Cloud, was damaging agriculture and modifying rainfall patterns. Estimated to be about three kilometres (two miles) thick, the constant haze was thought to result from forest fires, the burning of agricultural wastes, emissions from inefficient cookers, and the burning of fossil fuels in vehicles, industries, and power stations. The blanket of pollution reduced the amount of sunlight reaching Earth's surface by as much as 10–15%. The resulting combination of surface cooling and lower-atmosphere heating may be altering rainfall patterns, leading to a reduction in winter rainfall over northwestern India, Pakistan, and Afghanistan.

      Paleoclimatologists reported that they had found century-scale trends for Asia's southwest monsoon, a climate system of vital importance to nearly half the world's population. A climate reconstruction for the past millennium, based on the relative abundance of a certain type of fossil in sediment cores from the Arabian Sea, suggested that monsoon wind strength had increased during the past four centuries as the Northern Hemisphere warmed. The finding supported an observed link between Eurasian snow cover and the southwest monsoon. The researchers predicted that southwest monsoon intensity could increase further during the 21st century if greenhouse gases continued to rise and northern latitudes continued to warm.

      The rapid melting of Alaskan glaciers was contributing to a rise in sea level, according to a team of scientists who used airborne laser altimetry to estimate the volume changes of 67 glaciers in Alaska. They found that the glaciers' thicknesses had diminished at an average annual rate of 0.5 m (1.6 ft) from the mid-1950s to the mid-1990s. Repeat measurements of 28 glaciers from the mid-1990s to 2000–01 showed that the average rate of melting had increased to 1.8 m (5.9 ft) per year. Extrapolating these rates to all Alaskan glaciers yielded an annual loss of volume of 96 cu km (23 cu mi), equivalent to a 0.27-mm (0.01-in) rise in sea level per year during the past decade. These losses were nearly double the estimated annual loss from the entire Greenland Ice Sheet during the same period.
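      The sea-level equivalence quoted above can be checked with simple arithmetic: spread the annual meltwater volume over the area of the world ocean. A quick sketch, assuming an ocean area of about 3.61 × 10⁸ km² and ignoring the small ice-to-water density correction:

```python
# Spread the annual meltwater volume over the world ocean.
OCEAN_AREA_KM2 = 3.61e8   # assumed global ocean area, km^2

def sea_level_rise_mm(volume_km3):
    """Sea-level equivalent (mm) of a given water volume in km^3."""
    return volume_km3 / OCEAN_AREA_KM2 * 1.0e6   # km -> mm

rise = sea_level_rise_mm(96.0)   # ~0.27 mm per year, matching the quoted figure
```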

      In contrast, temperatures over large parts of the interior of Antarctica exhibited a small cooling trend during the past several decades. The cooling could be related to linkages between the troposphere—the lowest layer of the atmosphere—and the stratosphere above it. Researchers presented evidence during the year that ozone losses over the southern polar region, embodied in the formation of the annual Antarctic ozone hole, were leading to a cooling of the lower stratosphere, which in turn was affecting the circulation in the troposphere so as to contribute to the observed temperature trends. Because chemical pollutants affected the formation of the yearly ozone hole, the evidence suggested that pollutants were having an impact on Antarctic climate.

Douglas Le Comte

▪ 2002


Geology and Geochemistry
      Geology and geochemistry were successfully merged at Earth System Processes: A Global Meeting, a novel conference held in Edinburgh on June 24–28, 2001, coconvened by the Geological Society of America and the Geological Society of London and cosponsored by the Edinburgh Geological Society, the University of Edinburgh, and the Geological Surveys of the U.S. and the U.K. The concept was that the plate tectonics revolution of the 1960s was only a first step in understanding the whole Earth system; in order to understand the dynamic whole, interdisciplinary studies of the interactions between its component parts are required. The sessions were designed to emphasize the linkages between geologic, chemical, physical, and biological processes, along with their social and economic implications.

      The linkages between geology, geochemistry, and the biosphere are clearly displayed by the submarine hydrothermal vents that spew hot water and deposit minerals that form chimneys. A report from a project of the University of Washington and the American Museum of Natural History, New York City, to characterize a suite of large sulfide chimneys from the Juan de Fuca Ridge was published during the year by John Delaney and Deborah Kelley of the University of Washington and seven coauthors. Using a centimetre-scale navigation-control system, optical- and sonar-imaging sensors, and real-time navigation techniques, the researchers produced the highest-quality fine-scale map of a complex with more than 13 chimneys ranging in height from 8.5 to 23 m (1 m = 3.28 ft). Water venting from the chimneys reached 300 °C (570 °F). Dense clusters of tubeworms, snails, crabs, and microbial communities covered the structures. In a remarkable feat of remote engineering in water 2,250 m deep, the team sawed off and recovered four samples of chimneys about two metres in length. These were digitally imaged, cored, split, and immediately subjected to geochemical and microbiological examination. The research team discovered complex vertical zones of minerals, networks of flow channels lined with sulfides, and microorganisms distributed within well-defined mineral zones. The proportions of specific bacteria varied from the hot interior to the cooler exterior of the chimneys and were clearly related to, and perhaps even modifiers of, the geochemistry of their local environment.

      The largest submarine hydrothermal towers yet discovered were described and explained in 2001 by Kelley, Donna Blackman (Scripps Institution of Oceanography), and many coauthors from the U.S. and Europe. The submersible research vessel Alvin revealed to its three astonished crew members a large field of white towers at a water depth of 700–800 m. This “Lost City Hydrothermal Field” extends across a terrace on the steep southern wall of the Atlantis Massif, about 15 km (1 km = 0.62 mi) west of the Mid-Atlantic Ridge near 30° N. It consists of about 30 pinnacle-like chimney structures, the largest reaching 60 m in height and more than 10 m in diameter. In contrast to the black, high-temperature, sulfide-rich chimneys associated with the volcanically active oceanic ridge axes, these white towers vent relatively cool water of less than 70 °C (160 °F); the water precipitates carbonates and hydrated minerals, and there is no evidence for recent volcanism. The mineralogy of the carbonate chimneys and the fluid composition are consistent with reactions occurring between percolating seawater and the mantle rock that underlies the Atlantis Massif. The water is heated by the chemical reactions that convert the peridotite rock to serpentinite. Although few mobile creatures were found around these structures, there are abundant, dense microbial communities. These low-temperature carbonate chimneys may be widespread on older oceanic crust, supporting chemosynthetic microbial populations in environments similar to those in early Earth systems when life first evolved.

      There was a new claim for the oldest rock on Earth, arising from the geochemistry of tiny minerals, zircons, in Western Australia. Igneous gneisses there are about 3,750,000,000 years old. The zircons, collected from a series of sedimentary rocks formed in a large delta, had previously been dated at 4,276,000,000 years. Detailed investigations by two teams from Australia, the U.S., and Scotland (Simon Wilde and three coauthors, and Stephen Mojzsis and two coauthors), however, yielded an age of 4,404,000,000 years, closer by 128,000,000 years to the formation of the Earth. Measurements of isotope concentrations were made on the sliced zircons by means of a precise ion microprobe that bombards the mineral in tiny spots, releasing atoms that are then weighed in a mass spectrometer. Both groups also measured the oxygen isotope ratios in the zircons and concluded that the minerals were derived from preexisting igneous rocks that had been involved with water at near-surface conditions. The existence of liquid water within 150,000,000 years of the Earth's formation 4,550,000,000 years ago was unexpected, given the intense bombardment of the Earth by asteroids at the time. Perhaps oceans and primitive life-forms formed early, were periodically destroyed on a global scale, and then re-formed, through an interval of some 400,000,000 years before the time at which life on Earth is currently thought to have begun.

      Insight into the periodic disruptions caused more recently by asteroid impacts was provided by the study of helium in a sequence of limestones deposited in deep seas between 75 million and 40 million years ago. Graduate student Sujoy Mukhopadhyay, working with Kenneth Farley at the California Institute of Technology, and Alessandro Montanari of the Geological Observatory, Apiro, Italy, studied the limestones near Gubbio in Italy. This series of limestones, composed predominantly of the calcite skeletons of plankton, includes a finger-thick clay layer at the boundary between the Cretaceous and Tertiary periods corresponding in time to the extinction 65 million years ago of 75% of all living species, including the dinosaurs. The sharp boundary at the top of the Cretaceous limestones indicates an abrupt reduction in productivity of plankton, which then rather suddenly increased above the clay layer with different plankton species forming more limestones. Analyses of the rare element iridium in the clay layer, reported in the early 1980s, had provided the first evidence that the mass extinction was caused by the impact of an extraterrestrial body, accompanied by a huge explosion and the global distribution of dust through the stratosphere. The new analyses of isotopes of helium confirmed that no comet shower had passed near the Earth at that time, and a single large extraterrestrial body was thus left as the destructive agent. The analyses also permitted calculations of rates of sedimentation, which led to the conclusion that, following the impact and destruction of global life, the food chain became reestablished in only 10,000 years. Repopulation of the ocean was then achieved within a short time interval. This rapid turnover contrasts with the longer time interval recognized by many paleontologists for the progressive extinction of larger land animals, such as the dinosaurs.

      Some significant steps in the parallel development of experiment and theory were achieved during 2001 in defining the framework of phase relationships that control the geology and geochemistry of rock-melt reactions. Two publications by Tim Holland (University of Cambridge), Roger Powell, and R.W. White (both of the University of Melbourne, Australia) presented a comprehensive thermodynamic model for granitic melts in a synthetic system with eight oxide components, including water. The internally consistent dataset, used with the Thermo-Calc software, makes it possible to calculate the melting relationships for many rocks through the entire thickness of the continental crust. Manipulations of the complex phase diagrams permit the evaluation of processes, including the extent of melt loss during high-temperature metamorphism.

      The continuing dependence of thermodynamic databases on new experiments at higher pressures and with additional components was demonstrated by Robert Luth (University of Alberta). The nature of the melting reaction in the Earth's upper mantle under conditions in which carbon dioxide is present is significantly affected by the position of a particular reaction among calcium-magnesium carbonates. Earlier experimental measurements for this reaction at pressures up to 55 kilobars (1 kilobar = 1,000 atmospheres) and a temperature of 600 °C (1,100 °F) had been extrapolated by calculations using Thermo-Calc, yielding the result that dolomite (calcium magnesium carbonate) would not be involved in mantle melting at pressures greater than 60 kilobars, corresponding to a depth of 180 km. When Luth measured the reaction experimentally, however, he determined that dolomite does persist as the carbonate relevant for melting reactions in the upper mantle. The presence of dolomitic carbonate-rich melts in mantle rocks beneath an island off the coast of Brazil was demonstrated in 2001 by Lia Kogarko (Vernadsky Institute, Moscow), Gero Kurat (Natural History Museum, Vienna), and Theodoros Ntaflos (University of Vienna). Textures indicated the formation of immiscible (incapable of being mixed) carbonate, sulfide, and silicate liquids.

      Experiments by Roland Stalder, Peter Ulmer, A.B. Thompson, and Detlef Günther of the Swiss Federal Institute of Technology, Zürich, on the effect of water on the conditions for melting in the Earth's mantle provided convincing evidence for the occurrence of a critical endpoint on the melting reaction, at a temperature of 1,150 °C (2,100 °F) and a pressure (130 kilobars) corresponding to a depth of about 400 km. At that point the melt and the coexisting aqueous fluid phase become identical in composition and properties. Despite the advances in thermodynamic calculations, experimental data were still insufficient to calculate the high-pressure behaviour of aqueous fluids under those conditions.
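      The kilobar-to-depth equivalences quoted in these paragraphs (60 kilobars at about 180 km, 130 kilobars at about 400 km) follow from the lithostatic relation P = ρgh. A minimal check, assuming a mean overburden density of roughly 3,300 kg/m³ (a crude crust-plus-mantle average, not a value from the source):

```python
# Lithostatic pressure: P = rho * g * h  =>  h = P / (rho * g)
RHO = 3300.0   # assumed mean overburden density, kg/m^3
G = 9.8        # gravitational acceleration, m/s^2

def depth_km(pressure_kbar):
    """Depth (km) at which lithostatic pressure reaches the given value."""
    pascals = pressure_kbar * 1.0e8   # 1 kilobar = 1e8 Pa = 1,000 atm (approx.)
    return pascals / (RHO * G) / 1000.0

d_dolomite = depth_km(60.0)    # ~185 km, close to the quoted ~180 km
d_endpoint = depth_km(130.0)   # ~400 km, matching the critical-endpoint depth
```

The quoted depths agree with this rough average-density estimate to within a few percent; published depth scales use a density profile rather than a single average.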

Peter J. Wyllie

      An intraplate earthquake of magnitude 7.7 (moment magnitude) shook the Indian state of Gujarat on the morning of Jan. 26, 2001, India's Republic Day. Called the Bhuj earthquake, it was one of the deadliest ever recorded in the country. At least 20,000 people were killed, 166,000 injured, and 600,000 displaced. More than 350,000 houses were destroyed; property damage and economic losses were estimated in the billions of dollars.

      The Bhuj earthquake occurred on a fault system adjacent to one on which a major shock (moment magnitude 7.8) took place in the Great Rann of Kachchh in 1819. The focus of the Bhuj earthquake was determined to be as deep as 23 km (1 km = about 0.62 mi). In a review of geophysical data from seismology, geology, and tectonics, Roger Bilham and Peter Molnar of the University of Colorado at Boulder and Vinod K. Gaur of the Indian Institute of Astrophysics, Bangalore, demonstrated how this earthquake was triggered by the release of elastic strain energy generated and replenished by the stress resulting from the ongoing collision of the Indian plate with the Asian plate, which began between 40 million and 50 million years ago. In this scenario the top surface (basement rock) of the Indian plate south of the Himalayas flexes and slides under the Himalayas in an uneven, lurching manner, similar to the behaviour observed in rapidly converging lithospheric plates beneath the ocean.

      The researchers also showed, on the basis of Global Positioning System (GPS) satellite measurements, that India and southern Tibet were converging at a rate of 20 mm (about 0.8 in) per year, consistent with the rate deduced from concurrent field observations. Moreover, they pointed out that the convergence region along the Himalayas held an increased hazard for earthquakes and that 60% of the Himalayas were overdue for a great earthquake. The Bhuj earthquake did not occur along the Himalayan arc and so did nothing to relieve the accumulating strain on the arc. An earthquake of magnitude 8 would be catastrophic for the densely populated region in the Ganges Plain to the south.

      In addition to the Bhuj earthquake, major earthquakes (magnitude 7 and greater) with high casualties occurred on January 13 in El Salvador (magnitude 7.7, with more than 800 people killed and 100,000 homes destroyed) and June 23 off coastal Peru (magnitude 8.4, with at least 100 people killed—many by tsunami—and 150,000 homes destroyed).

      Sicily's Mt. Etna, Europe's largest and most active volcano, erupted on July 17 in a dramatic display that continued into August. The flow of molten magma, which emerged from fissures along Etna's southeastern slopes, caused tremendous damage to the tourist complex of Rifugio Sapienza and set fire to a cable-car base station. The July–August event, which was the first flank eruption of the volcano since 1993, aroused wide interest from both the scientific community and emergency managers. It occurred from five short vent segments over a linear distance of six kilometres at an elevation of 2,950–2,100 m (9,680–6,890 ft) and discharged 30 million cu m (1.1 billion cu ft) of new magma. Significant losses were avoided when the lava stopped a few kilometres short of the first major mountain community, Nicolosi. Interest for scientists lay in the simultaneous eruption of two magma types, of contrasting chemistry and residence time in the volcano, and in the wide diversity of eruption intensities observed over short distance and time scales.

      The economically crippling eruption of Soufrière Hills volcano on the Caribbean island of Montserrat continued through the growth and collapse of the lava dome in 2001. This long-lived (since 1995) and complex event prompted the publication of a major analytic memoir by the Geological Society of London. The even more protracted eruption of Kilauea volcano in Hawaii, which began in 1983, also carried on unabated throughout the year.

      Christopher G. Fox of Oregon State University and colleagues reported on the first detailed observation of the eruption of a submarine volcano—Axial volcano on the Juan de Fuca Ridge off the Oregon coast—by a seafloor instrument serendipitously positioned very close to the event. The instrument, a Volcanic System Monitor, carried several sensors, including one for measuring bottom pressure, which served as an indicator for vertical deformation of the seafloor associated with magma movements. Although the instrument was overrun by a lava flow, the scientific data were retrieved.

      The mantle, that part of Earth that lies beneath the crust and above the central core, constitutes 82% of Earth's volume and 65% of its weight. Progress was being made in the use of seismic tomography to infer temperature anomalies associated with thermal convection in the mantle. Analogous to the use of X-rays in medical tomography, seismic tomography yielded accurate maps of variations in the velocities of seismic waves produced by earthquakes. By combining this information with a knowledge of the elastic properties (wave-propagation velocities) of various mantle mineral phases as a function of pressure and temperature, scientists could make accurate estimates of the temperature distribution in Earth's mantle. Such velocity data for a number of mantle mineral phases, such as (Mg, Fe)SiO3 (perovskite) and (Mg, Fe)2SiO4 (ringwoodite), were being obtained in various laboratories.

      Surface geophysical data (e.g., geodetic measurements and observed tectonic plate motions) and global seismic tomographic models were together providing useful information on the flow and thermochemical structure in the deep mantle. In this respect, A.M. Forte of the University of Western Ontario and J.X. Mitrovica of the University of Toronto suggested the existence of a very high effective viscosity near 2,000 km depth, which would suppress flow-induced deformation and convective mixing in the deep mantle.

      Leonid Dubrovinsky of Uppsala (Swed.) University and associates suggested that the observed heterogeneity in composition, density, and thermal state (revealed by seismological data) at Earth's core-mantle boundary and in the inner core could plausibly be explained by chemical interaction. They based their reasoning on experimental data on the chemical interaction of iron and aluminum oxide (Al2O3) with MgSiO3 (perovskite phase) under simulated conditions of pressure and temperature at the core.

      High-resolution images gathered by the Mars Global Surveyor (MGS), which began orbiting the planet in 1997, yielded exciting views of massive layered outcrops of sedimentary rock, as thick as four kilometres (2.5 miles), as reported by Michael C. Malin and Kenneth S. Edgett of Malin Space Science Systems, San Diego, Calif. Although the age relationships of these erosional landforms and the processes of deposition and transport that created them, including the possible role of liquid water, remained to be ascertained, their discovery provided some initial clues to the previously unknown geologic and atmospheric history of Mars.

      Mars currently lacks a global dipole magnetic field like that of Earth, but the detection of strongly magnetized ancient crust on Mars by the MGS spacecraft was indicative of the presence of a liquid core and an active magnetic dynamo early in the planet's history. Building on this information, David J. Stevenson of the California Institute of Technology reported important new interpretations and insights about the Martian interior—the nature and history of the iron-rich Martian core and the influence of the core on the early climate and possible life on Mars. According to Stevenson, heat flow from the Martian core also appeared to have contributed to volcanic activity and feeding of mantle plumes, as in the case of Earth's core.

Murli H. Manghnani

Meteorology and Climate
      A revived La Niña—the condition of below-normal sea-surface temperatures dominating the central and eastern equatorial Pacific—influenced the weather over parts of the Earth early in 2001. By April equatorial sea-surface temperatures had returned to normal, suggesting that La Niña, which had begun in 1998, had finally ended.

      The upward trend in global surface temperatures continued; NASA estimates from land and ocean data for the first ten months of 2001 put the year on track to be the second warmest on record. In contrast, lower tropospheric temperatures as measured by satellite averaged close to the 1979–98 mean, suggesting no significant recent warming trend above the surface. La Niña played a role in aggravating long-term drought over the southeastern United States, particularly Florida, where the 12-month period that ended in April was the third driest in 107 years. Drought also developed over the northwestern U.S. during the 2000–01 winter as blocking high pressure aloft steered storms to the north and south. November–April precipitation in the region was the second lowest since records began in 1895.

      For other parts of the U.S., winter brought abundant snowfall, especially in the Northeast and the Great Plains. Major winter storms struck the Northeast in February and March, with a particularly severe storm burying New England and the northern mid-Atlantic region on March 4–5. A wet and stormy April in the upper Midwest led to serious flooding and considerable property damage along the upper reaches of the Mississippi and other rivers.

      The first tropical storm in the Atlantic basin, Allison, made landfall June 5 on Galveston Island, Texas. Although the storm was relatively weak, its historic two-week odyssey across the South and up the mid-Atlantic coast cost about $5 billion and left 50 dead. The storm, which turned Houston's streets into raging rivers after depositing up to 890 mm (about 35 in) of rain, ended up as the costliest tropical storm in U.S. history.

      In central and western Texas a persistent high-pressure system aloft brought drought to the region for the second consecutive summer. Rainfall totaled well under 50% of normal in both June and July, and temperatures above 37.8 °C (100 °F) worsened the dryness. Rains exceeding 300 mm in late August and early September ended dryness in eastern Texas but triggered flooding.

      Over the central U.S., the high-pressure ridge responsible for the heat and dryness in the southern plains expanded northward in late July and early August, bringing dangerous heat to the upper Midwest. The ridge further broadened, which resulted in a nearly coast-to-coast heat wave August 6–9. Nationwide, widespread heat during June–August resulted in the fifth warmest summer on record for the U.S., while above-normal temperatures during September–November across all but the Southeast caused autumn to rank as the fourth warmest on record. Drought intensified along the Eastern Seaboard in autumn as rainfall totaled under 50% of normal from North Carolina to Massachusetts. In contrast, a series of Pacific storms in November and December eased drought in the West.

      The Atlantic tropical storm season was active, with 15 named storms of which 9 became hurricanes. The bulk of activity occurred in the last three months of the season—September to November—during which 11 of the named storms formed. For the second consecutive year, no hurricanes made U.S. landfall. Two storms, Barry and Gabrielle, brought some flooding to Florida but also relieved its long-term drought.

      In Central America drought in June and July damaged crops from Nicaragua to Guatemala. Hurricane Iris, a category 4 storm packing winds of 233 km (145 mi) per hour, caused severe damage to southern Belize on October 8. The tropical depression that later became Hurricane Michelle brought extremely heavy rains to portions of Nicaragua and Honduras at the end of October. On November 4, Michelle slammed into the coastal islands of Cuba as a category 4 hurricane and into the main island as a category 3 hurricane. Michelle was the strongest hurricane to hit Cuba since 1952.

      Across the Middle East and south-central Asia, another dry winter and spring resulted in countries from Syria to Pakistan enduring a third consecutive year of drought. Much of the region experienced four straight months (January–April) with precipitation below half of normal. The drought slashed crop production and depleted rivers and reservoirs. In Algeria, an intense storm struck the north coast on November 9–11. Up to 260 mm of rain led to catastrophic floods and mud slides in Algiers, leaving more than 700 people dead.

      Crops dependent on rain failed almost totally in Afghanistan again in 2001. Major drought during the first half of the year also affected northern China and North and South Korea. March–May rainfall in Beijing totaled about one-third of normal. The opposite extreme prevailed in southern China, where torrential June rains exceeding 800 mm killed hundreds of people. An active storm season also affected the region, with Taiwan enduring damage from Typhoons Chebi in June, Toraji in July, and Nari and Lekima in September. Other storms hit the Philippines, China, and Japan, with two typhoons, Pabuk and Danas, striking the Tokyo area within one month of each other (August 21 and September 10). In the Philippines, Typhoon Utor left more than 150 dead in July, and Tropical Storm Lingling caused at least 180 deaths in early November. Monsoon flooding hit South and Southeast Asia, although on a smaller scale than in 2000. India suffered severely again as floodwaters affected millions during July and August.

Douglas Le Comte

▪ 2001


Geology and Geochemistry
      In 2000 J.L. Kirschvink (California Institute of Technology) published a novel report (with six coauthors from the U.S. and South Africa) relating the end of the “Snowball Earth” episode of 2.4 billion years ago to global geochemistry and major episodes in the history of life. He had originated the Snowball Earth concept about a decade earlier, and by 2000 he had evidence for two periods when the Earth was completely glaciated, covered with ice like a snowball, at about 2.4 billion years ago and again 600 million to 800 million years ago. The evidence included measurements of the Earth's ancient magnetic field preserved in old rocks, which indicated the near-equatorial latitude of rock formations known to record the presence of ice. A 45-m (147.6-ft)-thick layer of manganese ore in the Kalahari Desert has an age corresponding to the end of the 2.4-billion-year-old Snowball Earth episode, and the report proposed that its deposition was caused by the rapid and massive change in global climate as the snowball melted.

      Most primitive organisms had been wiped out as the freeze developed on a global scale. The ice-covered oceans, cut off from atmospheric oxygen by thick sea ice, became chemically reducing and therefore dissolved more metals. Carbon dioxide from increased volcanic activity is a candidate cause of the eventual global warming, creating a greenhouse effect by trapping heat radiated from the Earth's surface before it could escape into space. As the ice melted, the dissolved metals and most other nutrients essential for photosynthesis became available to the blue-green algae that had escaped extinction, and the resulting algal bloom released enough oxygen to cause a cascade of chemical reactions. The global warming associated with oxidizing conditions led to the precipitation from seawater of iron and carbonates, producing the characteristic rock masses known as banded iron formations and postglacial cap carbonates (limestones deposited above glacial rock deposits). The oxygen spike, in effect, led to a “rusting” of the iron and manganese. The manganese precipitation consumed large quantities of oxygen, and these geochemical changes may have forced the organisms to mutate in such a way that they were protected from the changing chemical environment. Kirschvink suggested that the organisms may have adapted the enzyme known as superoxide dismutase to compensate for the changes. The enzyme and its evolutionary history were well known to biologists, but this was the first time a global climatic change had been suggested as a cause of the enzyme's diversification.

      Much attention had been devoted to tracking the history of continental migration, with evidence for the formation of the supercontinent Pangaea being firmly based on ocean-floor magnetic anomalies. Information about the assembly of the previous supercontinent of Rodinia was more speculative. I.W.D. Dalziel at the University of Texas at Austin and two coauthors in 2000 presented testable evidence for the hypothesis that Rodinia formed by the amalgamation of four separate continental entities along three boundaries, belts of mountain formation dating to between 1.2 billion and 1 billion years ago. C.R. Scotese at the University of Texas at Arlington and his colleagues had devoted 20 years to the PALEOMAP Project, with the goal of illustrating the plate tectonic development of oceans and continents and their changing distribution during the past 1.1 billion years. The project also generated maps showing plate tectonics in the far future, illustrating the formation of the next supercontinent of “Pangea Ultima.” The results were made available on a World Wide Web site in an atlas of full-colour paleogeographic maps showing ancient mountain ranges, active plate boundaries, and the extent of paleoclimatic belts. In addition, the site provided many animations, including animations of how the continental configuration could change over the next 250 million years.

      Development of plate tectonic theory after the 1960s demonstrated with precision how the continental masses drifted across the Earth during the past 250 million years, but understanding the origin and evolution of the continents remained a major objective. Several reports published during 2000 demonstrated the power of geochemical data produced by the measurement of isotope ratios by mass spectrometers to advance the understanding of the structure and evolution of continents. Three examples are outlined below: the continental growth of southern Africa, the major fault systems generated by the ongoing collision between India and Asia, and the huge sedimentary fans accumulated from the erosion products of the Himalayas.

      Evidence about continental origins, involving the birth, death, and erosion of successive mountain ranges, is found in the oldest, stabilized parts of the continents, called cratons. The origin and history of the craton in South Africa were described in considerable detail in a report by R.W. Carlson (Carnegie Institution of Washington) and 16 coauthors from the U.S., Great Britain, and South Africa. This integrated investigation illustrated the necessity of a multidisciplinary approach involving geology, geochemistry, and geophysics for the comprehension of processes in the Earth sciences. The geology of the shallow crust of the craton was very well known. Hundreds of kimberlites (a rare, deep-seated kind of volcanic eruption) had brought rock samples (xenoliths) of the upper mantle and lower crust to the Earth's surface through cylindrical pipes. High-resolution measurements of isotopes of the uranium-lead and rhenium-osmium systems on many samples revealed a long, complex history. Rocks of the upper mantle have ages of 3.5 billion to 3.3 billion years, and the craton was stabilized about 3 billion years ago. Mantle rocks formed during that time interval included subducted materials from plate margins around the continent, and these became attached to the continent through time, creating a stable block of lithosphere. The craton consists of crust and a thick section of the underlying mantle.

      The Indian subcontinent collided with Asia about 50 million years ago, and the continued convergence of these masses at a rate of about five centimetres (two inches) per year has elevated the huge area to an average height of about five kilometres (three miles). This continental collision provided a natural laboratory for the study of the plate tectonic forces that generate continents. An example is a series of huge strike-slip faults in northern Tibet where blocks of the Earth's crust slide past one another. There are two competing models: Do these faults define major discontinuities to depths of 100 km (60 mi), through the crust and into the upper mantle, or are they relatively shallow features playing a secondary role to displacements in a more fluid (but solid) lithosphere? Geophysicists Rick Ryerson, Jerome Van der Woerd, Bob Finkel, and Marc Caffee at Lawrence Livermore National Laboratory, Livermore, Calif., with collaborators from Los Angeles, Paris, and Beijing, made the first-ever measurements of the rates of long-term movement along these large faults in order to characterize their large-scale behaviour. Specific fault breaks (tectonic offsets) were first identified from satellite images with a resolution of 10 m (33 ft). Sensitive accelerator mass spectrometry made it possible to measure very low levels of the cosmogenic nuclides beryllium-10 and aluminum-26, which provided dates for the surfaces exposed by faulting. Slip rates can be calculated from those ages. The first stage of the research suggested that the northern portion of the Tibetan plateau had been uplifted by successive episodes of eastward fault propagation coupled with the uplift of young mountain ranges. The Livermore data indicated that the models representing the lithosphere as fluid might be flawed.
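The slip-rate arithmetic described above is straightforward once an offset and its exposure age are in hand. The offset and age below are hypothetical round numbers used only to show the calculation, not the Livermore group's measurements.

```python
# Minimal sketch of the slip-rate calculation: a measured fault offset
# divided by the cosmogenic exposure age of the offset surface. Both
# numbers in the example are hypothetical, chosen only to show the
# arithmetic.

def slip_rate_mm_per_yr(offset_m, exposure_age_yr):
    """Long-term slip rate in mm/yr from an offset (m) and surface age (yr)."""
    return offset_m * 1000.0 / exposure_age_yr

# A hypothetical 100-m offset of a 10,000-year-old surface averages out
# to 10 mm per year over that interval:
print(slip_rate_mm_per_yr(100.0, 10_000))  # 10.0
```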

      The Himalayan mountains are being eroded rapidly. The products of erosion have been deposited into the huge submarine sedimentary fans on either side of India—the eastern Bengal Fan and the western Indus Fan. The Bengal Fan is fed by the Ganges and Brahmaputra rivers, which deliver sediments derived from the high Himalayas along much of the mountain range. This fan has been swamped by material from the rapid unroofing of the Himalayas during the past 20 million years. The material and structure of the Indus Fan had been investigated by deep-ocean drilling. Its age had been debated for a decade, with one view being that the fan was formed as a response to the high Himalayan uplift and unroofing starting about 20 million years ago. The sequence of sediments deposited on the Indus Fan yields information on the uplift and erosion of the western Himalayas, as described in a 2000 report by Peter D. Clift (Woods Hole [Mass.] Oceanographic Institution) and six coauthors from the U.S., Germany, and Pakistan. The erosion sequence is more readily isolated than for the sediments of the Bengal Fan. Modern microbeam mass spectrometry is capable of measuring the very small amounts of lead occurring in feldspars eroded and transported from the mountains. Clift and his colleagues characterized various parts of the Himalayas in terms of their lead isotope ratios and then measured the lead isotopes in feldspars from sediment cores drilled from the Indus Fan. The significant observation was that the mineral grains were derived from the northwestern regions, and none were derived from the Indian plate.
These results demonstrated that different sedimentary fans may provide quite different images of evolving mountain ranges, which is important when determining the history of ancient deposits that are contemporaneous with mountain-building episodes.
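A provenance comparison of this general kind can be sketched as a nearest-signature match between a grain and candidate source regions. The region names and isotope values below are invented for illustration; real fingerprinting compares several lead isotope ratios together with their analytical uncertainties.

```python
# Hedged sketch of isotopic provenance matching: compare the lead-isotope
# signature of a detrital feldspar grain with characteristic values for
# candidate source regions and report the closest match. All names and
# numbers here are invented for illustration.

SOURCES = {
    "northwestern Himalaya": 18.6,   # hypothetical 206Pb/204Pb signature
    "Indian plate basement": 17.2,   # hypothetical
}

def nearest_source(measured_ratio):
    """Return the candidate region whose signature lies closest to the grain."""
    return min(SOURCES, key=lambda region: abs(SOURCES[region] - measured_ratio))

print(nearest_source(18.5))  # northwestern Himalaya
```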

Peter J. Wyllie

      During 2000 scientists reported on several societally relevant strong earthquakes that took place late in the previous year. On Sept. 21, 1999, a magnitude-7.6 quake occurred on the Chelungpu thrust fault in central Taiwan, killing more than 2,300 people. The earthquake produced tremendous surface slip, offsetting man-made structures vertically as much as 10 m (33 ft). Because the Taiwan Central Weather Bureau had recently completed installation of the most densely instrumented strong-ground-motion network in the world, scientists were able to determine the location and magnitude of the earthquake less than two minutes after it happened. Indeed, the network provided a wealth of digital data on the quake for seismology and earthquake engineering studies.

      On Oct. 16, 1999, an earthquake of magnitude 7.1 occurred within the eastern California shear zone (ECSZ) in a sparsely populated area (Hector Mine) of the Mojave Desert east-southeast of Barstow, rupturing 45 km (28 mi) of faults. Twelve minor foreshocks were recorded in the 12 hours preceding the main shock, and 2,500 aftershocks were recorded in the succeeding two weeks. Although people in Los Angeles felt the earthquake, damage and disruption were minimal.

      In a preliminary report, scientists from the U.S. Geological Survey (USGS), Southern California Earthquake Center, and California Division of Mines and Geology observed that the Hector Mine earthquake involved rupture on two previously studied faults, the Bullion and Lavic Lake faults. Much of the fault zone had been buried by young stream deposits and had not experienced significant offset during the past 10,000 years. As was the case for other parts of the ECSZ, the rate of movement along these faults was slow (less than one millimetre [0.04 in] per year), which explained its long period of inactivity during the Holocene Epoch (the past 10,000 years). By analyzing satellite imagery data of the Mojave Desert before and after the Hector Mine earthquake, scientists from the Scripps Institution of Oceanography and the USGS mapped the surface deformation. They found that the locations of the aftershocks delineated the entire rupture zone and that maximum slip (offset) along the main rupture was as high as 7 m (23 ft), compared with 5.2 m (17 ft) estimated from ground-based observations.

      Two strong earthquakes near Istanbul—one of magnitude 7.4 on Aug. 17, 1999, and the other of magnitude 7.1 on Nov. 12, 1999—together killed 18,000 people, destroyed 15,400 buildings and structures, and resulted in $10 billion–$25 billion in damage. The first event, with an epicentre southwest of the city of Izmit, was the most recent manifestation of a westerly progression of major earthquakes along the North Anatolian Fault that had begun in 1939. The Istanbul region had been struck and heavily damaged by 12 major earthquakes in the past 15 centuries, which attested to the significant earthquake hazard there. Stress-induced triggering and rupturing was considered to be the mechanism for the westerly propagation of these earthquakes. Seismologists at the USGS studied the time-dependent effect of stress transfer to adjacent faults following the Izmit event. From this they estimated that the next large quake or quakes in the region had a 62% (±15%) probability of occurring during the next 30 years and a 32% (±12%) probability during the next decade.

      The Hawaii Scientific Drilling Project (HSDP), involving an international team of scientists from dozens of universities and institutions, was focused on drilling into the buried lava flows constituting Mauna Kea volcano on the island of Hawaii. The first phase of drilling, begun in 1999, reached a depth of 3,109 m (10,201 ft). The goal of the second phase was to reach 5,500–6,100 m (18,000–20,000 ft). Temperature measurements in the borehole revealed that temperature decreases with depth and that variations in temperature are affected by hydrologic factors. From analyses of drill core samples, in conjunction with geophysical well-logging and downhole measurements, HSDP scientists expected to learn more about mantle plumes—upwellings of hot, solid mantle material, perhaps originating from the thermal boundary layer at the mantle-core boundary (3,000 km [1,860 mi] deep)—that accounted for the creation of the Hawaiian Islands volcanic chain. Other objectives of the HSDP were to investigate variations in mantle geochemistry and the intensity and polarity of Earth's magnetic field during the formation of the Hawaiian volcanoes.

      Geodetic measurements making use of the satellite-based Global Positioning System (GPS) continued to aid in geophysical studies of earthquakes, volcanoes, tectonic plate motion, and related dynamic phenomena at the Earth's surface (for example, vertical movements of the crust caused by the growth or shrinkage of large ice sheets) and in its interior (for example, in subduction zones). Using GPS observations made before and after the Izmit earthquake of 1999, scientists from the Massachusetts Institute of Technology and the University of California, Berkeley, and their collaborators from Turkey and France estimated the distribution of coseismic and postseismic slip along the earthquake rupture, which led to a better understanding of the seismogenic zone. Such studies could also help assess the potential for neighbouring faults to generate future earthquakes.

      Volcanic activity, magma transport, and seismic tremors under and around volcanoes are interrelated. Volcanoes often deform prior to eruption. Studies of volcanoes continued to be enhanced by seismological techniques in conjunction with the use of tiltmeters, leveling instruments, and the GPS. Using GPS measurements and seismic data from earthquake swarms, scientists from Stanford University and the University of Tokyo estimated the space-time evolution of a magma-filled crack off the Izu Peninsula, Japan, and provided improved understanding of magma transport through the brittle crust and of the cause of volcanic seismicity.

      Results from continuous GPS monitoring of the eruptive event of Jan. 30, 1997, on the east rift zone of Hawaii's Kilauea volcano by scientists from Stanford University, the USGS, and the University of Hawaii provided unprecedented insight into the spatial and temporal behaviour of a volcanic eruption. Models based on GPS data showed the rift opening eight hours prior to the eruption. Absence of precursory inflation of the summit led the investigators to reject magma storage in favour of pressurization as the cause of the eruption. Other, non-GPS types of studies involving simultaneous measurements of deformation and gravity also can be used to identify magma-chamber processes prior to the onset of the conventional precursors of eruptions.

      Collaborating scientists from France, Spain, and Italy produced detailed internal imagery of Italy's Mt. Etna volcano through the use of a set of arrival times of seismic waves from local earthquakes. The data were collected by a dense array of temporarily emplaced three-component seismographs. The study revealed a body of intrusive material of magmatic origin under the southern part of Valle del Bove, on Etna's eastern flank, above the basement rock 6 km (3.7 mi) below sea level. Velocity changes in the seismic waves passing through the body signified the presence of magmatic melt and partial melt.

      Sandwiched between Earth's crust and its molten outer core is the mantle, which continued to be a major topic of debate in geophysics. The mantle makes up 83% of Earth's volume and consists of solid ferromagnesian silicate rock, heated by the outer core and by its own radioactive decay. Circulation of the mantle is the driving force for the motion of the tectonic plates, which causes mountain building and earthquakes. Several seismic and geochemical-petrologic modeling studies of the mantle indicated that the mantle circulates in two layers rather than in one, as had formerly been thought. On the basis of results from recent seismological studies, researchers at the University of Arizona and the University of California, Berkeley, reported highly anomalous structures—modeled as “fuzzy” patches roughly 5–50 km (3–30 mi) thick—at the base of the mantle (about 2,900 km [1,800 mi] deep). The patches, which appeared to exhibit a wide range of increased density (as much as 60%), were inferred to be contamination of the deep mantle by the outer core. Such patches may represent zones of intense chemical and physical interaction at the mantle-core boundary.

Murli H. Manghnani

Meteorology and Climate
      Many of the unusual climate and weather events during 2000 were influenced by the ongoing La Niña over the Pacific Ocean, characterized by below-normal sea-surface temperatures over the eastern and central equatorial Pacific and somewhat warmer-than-normal temperatures over much of the western Pacific. Although La Niña began to weaken noticeably during the spring and summer, its impact was felt over many areas throughout much of the year into the early fall. Greater-than-usual rainfall occurred over much of the western Pacific and Indian Ocean basins, with enhanced tropical storm activity affecting Australia, southeastern Africa, and the southern Indian Ocean during the first several months of the year. With the advent of summer in the Northern Hemisphere, the area of heavy monsoon rains and tropical storm activity shifted northward, and numerous tropical storms and typhoons produced periods of torrential rains and flooding over southeastern Asia, China, the Korean peninsula, and Japan.

      One of the effects of La Niña on the United States was relatively wet weather over the western part of the country during the first three months of the year as the jet stream repeatedly steered Pacific storms into northern California and Oregon. Except for a brief period of cold and snow over the southern and middle Atlantic states in late January and early February, storms avoided much of the remainder of the country. The winter and early spring period was the warmest on record in many areas. Drought continuing from 1999 affected inland areas of the Northeast and much of the Midwest early in the year, but as the La Niña-influenced circulation steered most storms across southern Canada and the northern U.S., the driest areas shifted southward to the southeast and Gulf Coast regions.

      Later in the summer the extreme drought conditions and record heat had a severe impact on agriculture and water supplies in Texas and the southern Great Plains. Areas to the west of the Continental Divide became progressively drier throughout the summer, and, although the southwestern U.S. monsoon started earlier than usual in June, it yielded little rainfall during July and much of August. Its circulation pattern steered mid-level moisture northward and caused numerous “dry” thunderstorms. These storms produced little rainfall but much lightning over the western part of the country and led to many wildfires that contributed to the worst fire season in 50 years over a large area expanding northward and westward from New Mexico in May to Montana and the West Coast states by the end of the summer.

      The late summer drought and heat set many new all-time records over Texas, Oklahoma, and some adjacent states. Some areas of northern Texas went nearly three months without measurable rain, the longest such period in more than 100 years of records. Maximum temperatures in the triple-digit range were observed nearly every day in August over parts of Texas and Oklahoma, and drought and heat matched or exceeded records set in 1913 and in 1934 and 1936 during the Dust Bowl era. Records were set at several locations in Texas, Oklahoma, and Arkansas in late August and early September, with values exceeding 43.3 °C (110 °F).

      To the north and east of the areas of heat and drought, temperatures were cooler than normal, and rainfall was normal or greater, which produced a good year for crops in parts of the nation's important Midwestern agricultural areas. Nebraska, however, suffered drought-induced economic losses totaling more than $1 billion. Over much of the Northeast, it was one of the coolest summers in many years.

      As in most recent years, the Atlantic hurricane season (June–November) got off to a late start, with the first storm not developing until early August. As had been forecast because of the lingering effects of La Niña, the season became somewhat more active than normal, with 14 named tropical storms, of which eight became hurricanes. Most remained away from the U.S.; three attained major (category 3 or higher) intensity. None caused significant damage to the U.S., and two of the storms brought welcome rains to parts of the southeastern drought area.

      The first several months of the year were stormy and wet over much of western and northern Europe, but abnormally warm and dry weather developed over much of northern Africa, southeastern Europe, and the Middle East in the spring and continued throughout most of the summer. Temperatures soared to well over 40 °C (104 °F) over those areas during the summer months, with severe adverse impacts on agriculture and health. A maximum of 40.8 °C (105 °F) in Jerusalem recorded in late July was the highest there in more than 100 years. Several damaging storms brought strong winds and floods to parts of western and southern Europe in October and November.

      The weather was abnormally wet over southeastern Africa during the first several months of the year, partly from the effects of tropical storms from the Indian Ocean. In February an intense cyclone brought disastrous flooding rains to Mozambique and parts of neighbouring countries, killing hundreds and leaving thousands, including entire villages, homeless. Abnormally wet conditions, some due to tropical cyclones and at times accompanied by unseasonably cool weather, also prevailed over much of Australia during the first half of the year, especially in the northern and western portions of the island continent.

      In South America the first three months of the year were unusually wet over much of Colombia and western Venezuela. Abnormal summer heat developed over central and southeastern parts of the continent during January. Periods of abnormally heavy rainfall occurred over central and southern parts of South America during much of the first seven months of the year, augmented by strong storms from the Pacific affecting Chile during the winter season. Unusually cold weather developed during July and brought subfreezing temperatures to much of the southern part of the continent.

John J. Kelly, Jr.

▪ 2000


Geology and Geochemistry
      The biosphere, an integral part of Earth's geologic and geochemical cycles, exists in a delicate balance with the environment. The intimate relationship between “Geology, Mineralogy and Human Welfare” was summarized by Joseph V. Smith (University of Chicago) in his opening paper of the 1999 Proceedings of a Colloquium of the U.S. National Academy of Sciences. Emerging “chemical microscopes” using neutrons, synchrotron X-rays, and electrons are revolutionizing the study of mineral surfaces, fluids, and microbes, with many applications to agriculture and soils, trace elements and food quality, the hazards of toxic elements and asbestos, and the formation of ore deposits. Papers in the Proceedings also dealt with advances in the recovery of petroleum from geologic reservoirs and the application of geochemical dating of clay minerals to the prediction of oil yields. As human society expands its dominion over Earth, using natural geologic resources, it is increasingly threatened by the destructive power of volcanic eruptions, earthquakes, landslides, floods, and storms; the natural geologic processes become hazards. The following review considers three quite different situations in which such hazards have impinged on the development of life and its structures: present-day threats to human society, the precarious oases of life on the ocean floor, and the conditions under which life began more than 3.8 billion years ago.

      The World Disasters Report 1999, published by the International Federation of Red Cross and Red Crescent Societies, stated that 1998 was the worst year on record for natural disasters, which together resulted in 25 million refugees. The impact of natural disasters was further illustrated in 1999 by the devastating earthquakes in Turkey, Greece, and Taiwan, as well as by the road-clogging evacuation of some two million people from the east coast of the United States as Hurricane Floyd approached.

      The results of a five-year study to evaluate methods for reducing the social and economic costs of natural hazards were published in 1999 by the U.S. National Science Foundation Engineering Directorate. Dennis Mileti (University of Colorado at Boulder), the study's principal investigator, concluded that a basic philosophical change was required for “sustainable hazard mitigation,” which would involve rethinking society's relationship to the physical environment, as well as requiring more interdisciplinary study of hazards. Highlighting this need was the fact that of the 10 most costly natural disasters in the United States, 7 had occurred since 1989.

      A “Decade City” project for 2000–09 was proposed in 1999 by the International Association of Volcanology and Chemistry of the Earth's Interior to enhance the understanding of, prediction of, and methods of coping with natural hazards. The project took an interdisciplinary approach involving the geologic, geophysical, hydrologic, and atmospheric sciences. The proposal, which was made to the International Union of Geodesy and Geophysics, called for each IUGG member nation to nominate an urban centre to be the focus of study; sustainability and vulnerability issues would be jointly examined by geologists, engineers, urban planners, social scientists, and educators. The proposal followed the successful “Decade Volcano Project,” which included Mt. Vesuvius in Italy. The geology, geochemistry, and geophysics of that high-risk volcano had been intensely studied in the hope that its next major eruption could be predicted in time for the orderly evacuation of Naples and the three million people at risk (no easy task), so that the tragedies that befell Pompeii and Herculaneum might not be repeated.

      Volcanic eruptions also threaten the development of life associated with hydrothermal vents on the seafloor. The chemical exchanges between ocean water and oceanic crust provide the heat and nutrients required for the formation of microbial mats, but associated lava flows destroy them, as described by Robert Embley and Edward Baker of the National Oceanic and Atmospheric Administration in their 1999 report of results from the 1998 interdisciplinary expedition to the Axial Volcano on the Juan de Fuca Ridge (west of Oregon and Washington). Dives by the Canadian Remotely Operated Vehicle for Ocean Science (ROPOS) permitted a careful exploration of the new eruption site with scanning sonar for detailed mapping and a variety of tools for in situ temperature and chemical measurements. An intense microbial bloom accompanied the recent eruption. At one location dead tube worms and clams were found partially buried by the lava; elsewhere, older vent communities survived beyond the limit of the new eruption. The investigators intended to continue in situ sampling, high-resolution mapping, and continuous monitoring of the hydrothermal systems in this region over several years. Mapping of the ocean floor was accomplished by remote sensing from ships and from submersible vessels. ROPOS was so successful that unmanned systems of the kind developed during the past decade as an alternative to manned submersibles were identified as the harbingers of future deep-ocean expeditions.

      The early Earth was bombarded by meteorites, and evidence for the existence of life in some of the oldest rocks raised the question of whether the development of life was disrupted by the explosive impacts. Greenland's Isua greenstone belt (IGB) comprises the oldest known rocks of this type. Peter W.U. Appel (Geological Survey of Denmark and Greenland) and Stephen Moorbath (University of Oxford) described in 1999 a revitalized effort to decipher the origin of life on Earth through a geologic and geochemical study of these rocks in the new Isua Multidisciplinary Research Project. The geology indicated an environment of volcanic centres surrounded by shorelines, passing to deeper water. Geochemical analyses provided rock ages of 3.75 to 3.7 gigayears (a gigayear is 1 billion years). Carbon isotope ratios in some minerals suggested (but did not prove) that the carbon is a chemofossil—the chemical remnant of very early life. The oldest known cellular fossils, found in rocks elsewhere, are 3.4–3.5 gigayears old. The Moon—and presumably Earth also—was subjected to major impacts from meteorites until about 3.8 gigayears ago, indicating that early life had 50 million to 100 million years free of meteorite bombardment in which to develop. Some minerals have inner cores with older ages of 3.85–3.87 gigayears, which overlapped with the lunar meteorite impacts and raised the question of whether life developed even earlier, during lunar-style impacts. Establishing the critical age relationships, as well as searching for chemofossils, requires detailed, reliable geologic remapping of the whole area, together with the most advanced geochemical laboratory measurements.

      The need for more detailed maps was indicated above in connection with both young ocean floor and old rocks. A geologic map is the storehouse of information for the interpretation of geologic history and processes. The images of the Earth's surface obtained since the first Landsat satellite was launched in 1972 revolutionized mapping on a global scale, and the successful launch on April 15, 1999, of Landsat 7, with its improved capabilities, was expected to enhance world mapping even further. Maps constructed by individuals at ground level had long been prepared on a variety of scales and could be located within the remotely sensed images. New, rapidly evolving digital technologies were replacing the traditional techniques for high-resolution geologic mapping, and it might soon be possible to complete real-time analysis and three-dimensional visualization using accurate, portable instruments at reasonable cost. Carlos Aiken and colleagues (University of Texas at Dallas) described procedures using a digital camera, a laser gun, a portable computer, a geographic information system (GIS), and the Global Positioning System (GPS). The laser gun could locate points or trace features on the ground, which were then converted into three-dimensional visualizations by the GIS. Standard mapping of features such as the strike and dip of bedding and faults, the thickness of beds, and geologic contacts could be converted into computer images within seconds, and the images could be globally referenced with GPS and integrated with stored digital maps and images.
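The core geometric step in such laser-gun mapping, turning a range, bearing, and inclination measured from a GPS-located instrument into map coordinates, can be sketched in a few lines. The function name, the flat-earth trigonometry, and the sample numbers below are illustrative assumptions, not the actual University of Texas procedures:

```python
import math

def shot_to_xyz(gps_e, gps_n, gps_z, range_m, azimuth_deg, inclination_deg):
    """Convert a reflectorless laser shot into map coordinates:
    easting, northing, and elevation of the target point, given the
    instrument's GPS position. Azimuth is measured clockwise from
    north; flat-earth sketch only (real systems also correct for
    instrument height, refraction, and map projection)."""
    az = math.radians(azimuth_deg)
    inc = math.radians(inclination_deg)
    horiz = range_m * math.cos(inc)          # horizontal distance
    return (gps_e + horiz * math.sin(az),    # easting
            gps_n + horiz * math.cos(az),    # northing
            gps_z + range_m * math.sin(inc)) # elevation

# A 250-m shot due east (azimuth 90 deg), 10 deg above horizontal:
e, n, z = shot_to_xyz(500000.0, 3600000.0, 200.0, 250.0, 90.0, 10.0)
print(round(e, 1), round(n, 1), round(z, 1))  # 500246.2 3600000.0 243.4
```

Points digitized this way can be accumulated into the three-dimensional outcrop models that the GIS then renders.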

      Stephen M. Stanley and Lawrence A. Hardie (Johns Hopkins University, Baltimore, Md.) correlated variations in the mineralogy of oceanic fossils with changes in the chemistry of seawater, which in turn is controlled by rates of divergence of tectonic plates at seafloor-spreading centres. Carbonate mineral cements that precipitate from seawater in marine sediments oscillated on a time scale of 100 million to 200 million years between low-magnesian calcite on the one hand and aragonite together with high-magnesian calcite on the other. These cements are ascribed to “calcite seas” or “aragonite seas,” respectively. In the laboratory, brines precipitating calcite can be made to precipitate aragonite by increasing the ratio of magnesium (Mg) to calcium (Ca) in solution. Minor changes in the hot solutions emerging from hydrothermal vents at seafloor-spreading centres may change the Mg to Ca ratio of ocean water sufficiently to cause the oscillation between calcite and aragonite precipitation; fast spreading rates lower the Mg to Ca ratio of the brines. The new investigation detected the same oscillation in the mineralogy of some marine fossils, in particular the carbonates of the reef-building organisms and the voluminous chalk deposits. The deposition of massive chalk from calcareous nannoplankton during Late Cretaceous time (about 100 to 65 million years ago) had been a puzzle, but it could now be explained because it coincided with an interval when the Mg to Ca ratio in seawater was at its lowest level of the past 500 million years. By this chain of reasoning, even the chalk of the White Cliffs of Dover in England could be traced to an increase in the rate of mantle convection and seafloor spreading.
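The laboratory relationship can be caricatured as a simple threshold rule. The crossover ratio near 2 and the sample Mg/Ca values are round numbers from the general literature, used here only for illustration, not figures from the Stanley-Hardie study:

```python
def sea_type(mg_ca_ratio, threshold=2.0):
    """Classify seawater as favouring 'aragonite' (plus high-magnesian
    calcite) or 'calcite' precipitation, using an illustrative molar
    Mg/Ca threshold of about 2 (an assumption for this sketch)."""
    return "aragonite" if mg_ca_ratio > threshold else "calcite"

# Modern seawater has a molar Mg/Ca ratio near 5 ("aragonite seas");
# Late Cretaceous seawater is estimated to have been near 1
# ("calcite seas", the time of massive chalk deposition).
print(sea_type(5.2))  # aragonite
print(sea_type(1.0))  # calcite
```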

Peter John Wyllie

      The Alpide Belt, one of the three major seismic belts of the Earth, stretches from its western terminus in the Atlantic Ocean through the Iberian Peninsula, the northern Mediterranean Sea, Turkey, Armenia, northern Iran, and the Himalayas and finally down through Myanmar (Burma) to the East Indies. One of its most active segments is the North Anatolian Fault, extending from the Aegean Sea across northern Turkey into Armenia. Cities and villages have clustered in this zone since Neolithic times, and the record of seismic devastation is a long one. On Aug. 17, 1999, a catastrophic earthquake with a magnitude of 7.4 occurred near the Turkish cities of Izmit and Golcuk. The surface rupture was nearly 64 km (40 mi) long, and the maximum permanent horizontal ground displacement was 2.7 m (9 ft). The event caused the total collapse of hundreds of buildings in the provinces of Istanbul, Kocaeli, and Sakarya. Thousands were rescued from the rubble by local and international teams; still, these numbers were small compared with the numbers of dead and missing. In such disasters the final tallies might never be exact, but the official figures stated that there were at least 12,000 dead, 33,000 injured, and many thousands missing. A large aftershock occurred on September 13.

      Although there was only an average level of global seismic activity in late 1998 and throughout 1999, there were an exceptional number of earthquakes that caused fatalities and destruction. A major (magnitude-7.8) quake struck Indonesian islands in the Ceram Sea on Nov. 29, 1998. It left 34 people dead and 89 injured on Mangole and 7 dead and 18 injured at Manado, Sulawesi. At least 512 houses were destroyed, and 760 more were severely damaged. On Sept. 21, 1999, an earthquake of magnitude 7.6 occurred in the county of Nan-t'ou in central Taiwan (about 145 km [90 mi] south of Taipei), leaving thousands dead and causing extensive damage. Hundreds of the deaths occurred in the nearby county of T'ai-chung. Although damage in Taipei was relatively light, the collapse of a 12-story hotel trapped at least 60 people. The official totals overall were more than 2,250 dead and thousands injured. This earthquake was the most destructive to hit the island since 1935.

      Three other earthquakes in 1999 exceeded a magnitude of 7.0, but they occurred in remote areas and caused little damage. More than 1,500 lives were lost in 20 smaller earthquakes, however. Among them was a magnitude-6.2 earthquake on January 25 that rocked the Colombian cities of Armenia, Calarca, and Pereira. This event caused 1,185 deaths, left 700 people missing and presumed dead, and injured 4,750. Some 50–60% of the homes in the region were destroyed, and 250,000 people were left homeless. An earthquake with a magnitude of 6.0 occurred in Afghanistan on February 11, leaving as many as 70 people dead and hundreds injured. Another earthquake occurred at Xizang on the China-India border on March 29. This magnitude-6.6 event caused the death of at least 100 people, injured 394, and destroyed more than 21,000 homes. A magnitude-5.9 earthquake struck Athens on September 7, with a death toll that exceeded 120.
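The gulf between these magnitudes is larger than the numbers suggest. By the standard Gutenberg-Richter energy relation, each whole magnitude step corresponds to roughly a 32-fold increase in radiated energy, so the ratio between two events can be sketched as follows (an illustrative calculation, not part of any of the reports above):

```python
def energy_ratio(m_large, m_small):
    """Approximate ratio of radiated seismic energy between two
    earthquakes, from the Gutenberg-Richter relation
    log10(E) = 1.5*M + const; the constant cancels in the ratio."""
    return 10.0 ** (1.5 * (m_large - m_small))

# The magnitude-7.4 Izmit shock radiated roughly 60 times the energy
# of the magnitude-6.2 Colombian earthquake:
print(round(energy_ratio(7.4, 6.2)))  # 63
```

This is why a magnitude-5.9 event such as the Athens shock, though deadly in a dense urban setting, releases only a small fraction of the energy of the year's great ruptures.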

      Volcanoes also attracted attention. In January 1998 a swarm of earthquakes was detected near the summit of Axial, a submarine volcano on the Juan de Fuca Ridge, a very active seafloor feature some 500 km (300 mi) off the coast of Washington and Oregon. Within a day the volcano erupted and formed a megaplume. A team of scientists from the Pacific Marine Environmental Laboratory, Seattle, Wash., soon arrived on-site to study this phenomenon, which since its discovery in 1986 had been observed by researchers only eight times. Megaplumes are created when superheated water erupts from the upper fissures of an underwater volcano. Rising several thousand metres into the much cooler ocean, the water forms a distinct disk-shaped mass. These features can have a diameter of 20 km (12 mi) and may persist as coherent water parcels through voyages of hundreds of kilometres. The influence of the Coriolis force can cause a megaplume to spin at rates of 2 to 6 m (6 to 20 ft) per minute. Current studies were directed at discovering the generating mechanism of the plumes, their mineral content, the life-forms they carry, and their effect on the ocean through which they travel. Plans were being made to install a long-term monitoring network on the seafloor at the Axial Volcano comprising an array of sensors connected to transmission buoys at the surface and linked to communication satellites. (See also Geology and Geochemistry, above.)

      Recent satellite altimetry maps of the seafloor at the eastern end of the Samoan islands showed a small hill-like rise, and recent seismic activity suggested that volcanic activity was occurring. Examining the site with multibeam sonar, scientists from the Woods Hole (Mass.) Oceanographic Institution discovered a new volcano. Named Fa'afafine, the newcomer was 4,300 m (14,000 ft) high and had a base diameter of 35 km (22 mi). Preliminary analysis of dredged material indicated that an eruption had occurred recently. The investigators concluded that the Samoan islands were a hot-spot chain and that Fa'afafine marked the current location of the hot spot.

      Studies of ancient climates showed that the Earth had been warming for five million years prior to the event known as the Late Paleocene Thermal Maximum (LPTM), which began 55 million years ago. The long warming resulted in a dramatic decrease in oceanic ventilation due to a lack of cold, dense surface waters, which would sink and thereby carry oxygen into deeper waters. Eventually the oxygen supply became inadequate to support many species of foraminifera (one-celled organisms), and they became extinct. These organisms were at the base of the food chain, and their extinction reverberated through the entire marine ecosystem. Effects of the prolonged warming extended to Antarctica, which became ice-free and perhaps even forested. Antarctic sea-surface temperatures were 18° C (32.4° F) higher than at present. A marine geologist at the University of North Carolina suggested that the additional surge of heat during the LPTM was triggered by a gigantic volcanic eruption. Supporting evidence came from sediment cores collected in the western Caribbean Basin as part of the Ocean Drilling Program. This eruption was thought to have been massive enough to alter global atmosphere and produce the 10,000-year temperature spike of the LPTM. Whatever its cause, it was significant that the LPTM coincided with a spectacular increase in the numbers of mammalian fossils, including primates.

      The Continental Scientific Drilling Program completed an exploratory well in the Long Valley Caldera in California. The well, which was drilled to a depth of 2,977 m (9,767 ft), was to be fitted with instruments to monitor seismic activity. In March 1999 a deep hole was started by the Hawaii Scientific Drilling Project near Hilo, Hawaii. It would eventually reach a depth of 4,500 m (about 15,000 ft); one of the project's objectives was to help determine the origin of the Hawaiian Islands.

Rutlage J. Brazee

Meteorology and Climate
 The year 1999 was characterized by abnormally active weather patterns and occasional extreme events, triggered by colder-than-normal sea-surface temperatures across the eastern and central Equatorial Pacific Ocean. This event, known as La Niña, was usually associated with below-normal sea-level pressure and increased storminess over Indonesia and northern Australia, with opposite conditions prevailing across the eastern tropical Pacific. Reduced wind shear favoured an unusually active hurricane season (June–November) for the Caribbean and Atlantic basins.

      Wintry weather dominated the central and eastern United States during the first half of January. More than 127 cm (50 in) of snow buried Buffalo, N.Y., and more than 50 cm (20 in) paralyzed Chicago. Youngstown, Ohio, received a record-breaking 94 cm (37 in) of snow during January, and the 147-cm (58-in) total snowfall in Erie, Pa., was the second highest on record. In late February a powerful northeaster dumped almost 60 cm (2 ft) of snow on Cape Cod, Massachusetts. Meanwhile, record-breaking cold gripped Alaska, setting an all-time low of –48° C (–54° F) at Denali National Park and Preserve in central Alaska on February 5. Record February lows were also established at Galena and Fairbanks.

      In mid-January unusual outbreaks of tornadoes claimed lives in Arkansas and Tennessee, and heavy rains triggered flooding in parts of the Midwest, the Southeast, and the mid-Atlantic states. A deadly outbreak of more than 70 tornadoes ravaged the Great Plains on May 3–4, with intense F5 tornadoes (winds estimated in excess of 418 km/h [260 mph]) striking Bridge Creek and Moore, Okla. On July 8 a torrential downpour drenched Las Vegas, Nev., with 33 mm (1.3 in) of rain falling in just one hour; two persons died as a result. The first killer tornado in Utah history struck downtown Salt Lake City on August 11 and claimed at least one life.

      Abnormally dry weather dominated the eastern United States for much of 1999. For the 12-month period from August 1998 through July 1999, Maryland experienced its driest such period on record, and Virginia, West Virginia, and New York experienced their second driest. The dryness along the East Coast abruptly ended as the remnants of Hurricanes Dennis and Floyd brought heavy rains and strong, gusty winds during September. Unfortunately, rain from Floyd caused significant flooding, particularly in North Carolina. The hurricane triggered the evacuation of more than two million people from coastal areas of Florida, Georgia, and the Carolinas.

      During the first four months of 1999, unusually wet weather prevailed across much of northern South America from the coast of Peru eastward to the eastern tip of Brazil. Subsequently, very dry conditions overspread northern Venezuela and the southern Caribbean. Torrential rains drenched the Córdoba and La Pampa provinces of central Argentina in late April. Between 203 and 508 mm (8 and 20 in) of rain doused east-central South America during June and July.

      Above-normal precipitation dominated the Alpine region during January and February. Heavy February snowfalls in the Alps caused numerous avalanches, closed many roads, and stranded thousands of individuals. Heavy rains in mid-May combined with melting snow to cause severe flooding that triggered landslides, forced numerous evacuations, and killed several people. One month later up to 102 mm (4 in) of rain in a week resulted in major flooding across much of Hungary, Slovakia, and southern Poland. Two brutal storms lashed Western and Southern Europe in the last week of the year.

      Dryness across Kenya and Tanzania persisted into 1999, with less than 20 mm (0.8 in) of rain during January. Dryness returned to Ethiopia and northern Kenya during March and dominated the region through June. In sharp contrast, heavy rain drenched Madagascar, Malawi, Mozambique, and Zimbabwe during the first six weeks of the year, with short-term moisture surpluses persisting through April. Meanwhile, the rainy season was delayed in the sub-Saharan Sahel region. Scattered showers progressed northward from the Gulf of Guinea coast during July.

      Abundant precipitation (up to 762 mm [30 in]) soaked much of east-central Asia during March and April, with the heaviest amounts reported in east-central China and southern Japan. During the middle of May, Tropical Cyclone 02A battered southern Pakistan with heavy rains and strong, gusty winds, killing as many as 1,000 persons and leaving some 50,000 homeless. Frequent heavy thunderstorms drenched much of southeastern Asia during much of the year.

      Frequent thunderstorms, with torrential rains, soaked the Philippines, Indonesia, and northern Australia as 1999 began. Malaysia and Indonesia accumulated precipitation excesses of up to 406 mm (16 in) during January and February. Farther south, heavy thunderstorms (up to 178 mm [7 in] of rain in one week in mid-February) saturated the coasts of eastern New South Wales and southeastern Queensland and caused significant flooding. During March and April, Tropical Storm Vance and Tropical Cyclone Gwenda brought heavy rains, strong winds, and excessive cloudiness to the state of Western Australia, where significant damage was reported at some locations. Heavy rains returned to east-central Australia in June and July, and moisture surpluses of 102–482 mm (4–19 in) along the immediate coast resulted.

John J. Kelly, Jr.

▪ 1999


Geology and Geochemistry
      The interrelatedness of Earth processes was a motif for 1998. The German Geological Society, for example, under the leadership of Peter Neumann-Mahlkau (Geologisches Landesamt Nordrhein-Westfalen, Krefeld), celebrated its 150th anniversary with a symposium on "The System Earth." The role of convection in the Earth's interior (mantle) in affecting geologic processes and products and the geochemistry of lavas was elegantly illustrated in a paper by Michael Gurnis (California Institute of Technology), R. Dietmar Muller (University of Sydney, Australia), and Louis Moresi (Australian Geodynamics Cooperative Research Centre, Nedlands). They developed a physical model that explained problems related to the sedimentary rocks of Australia and to properties of the oceanic spreading ridge between Australia and Antarctica.

      The stratigraphic record of sedimentary rocks revealed that broad regions of Australia underwent vertical motion during the Cretaceous Period. These movements varied from a condition of maximum flooding by seas 120 million-110 million years ago to minimum flooding 80 million-70 million years ago. By the end of the Cretaceous (65 million years ago), Australia was about 250 m (820 ft) higher than it is today. These movements are out of phase with the global sea-level variations, because Australia was high and dry when the sea level throughout the world was at a maximum. The deepest part of the global oceanic ridge system is on the Australia-Antarctica spreading ridge. Its low elevation is believed to be due to an unexplained cold spot, possibly a downwelling. The basalts along this ridge have two distinct isotopic provinces, one to the west of the cold spot, characteristic of the Indian Ocean basalts, and one to the east of the cold spot, characteristic of the Pacific Ocean basalts.

      The investigators developed a three-dimensional model of mantle convection that incorporated the known history of plate tectonics near Australia. Two tectonic plates had been converging near eastern Australia for 100 million years before the Cretaceous. The model explored the consequences of the subduction beneath Australia of the cold lithosphere slab to the west, from 130 million years ago to the present, with the geometrical arrangement of the tectonic plates adjusted in steps of 10 million years. The subducted slab passed beneath Australia during the Cretaceous, stagnated in the mantle near a depth of 670 km (415 mi), and is now rising up to the Southeast Indian Ridge. For a reasonable range of input values, the dynamic models explained the two unusual geologic and geochemical features: the inferred inundation and uplift of Australia and the isotope geochemistry of the Australian-Antarctic ridge basalts. This successful modeling of the consequences of mantle convection, including plate motions, was a significant step forward in connecting the Earth's internal motions with surface geology and geochemistry.

      New discoveries were made during the year concerning the exchanges that occur between the solid earth and seawater. The formation of continents begins, effectively, with the eruption of new basaltic lava from the Earth's mantle at the mid-oceanic ridges. The geology of the ocean floor and the geochemistry of the lavas are coupled with the convective motions occurring within the mantle beneath the ridges. The oceanic ridge system is the largest geologic formation on Earth, and the discovery in 1979 of submarine hydrothermal vents associated with the ridges revealed that they are probably also the most active formations in terms of hydrology. Circulation of ocean water through the rifted basalt, heated by the magma below, causes the exchange of many elements between the ocean and crust, and solutions heated to temperatures of up to 350° C (660° F) precipitate clouds of metallic sulfide minerals, giving them the appearance of "black smokers" as they emerge through fissures into the cold ocean. The chimneys of minerals and rock precipitated by the venting solutions contain geochemical and biological information that is difficult to sample from deep-ocean submersibles. During the summer of 1998, therefore, a team from the University of Washington and the American Museum of Natural History hauled four complete rock chimneys from the Juan de Fuca Ridge to a ship for study in the laboratory. A revisit two weeks later to install instruments at the site of one of the removed chimneys found that a new one had already grown 4.5 m (15 ft) high. The tallest chimney yet observed on the ocean floor was 43 m (140 ft) high.

      The discovery of thriving sunlight-deprived bacterial colonies on these hot, lava-derived chemical precipitates, nourished by the chemosynthesis of sulfur, fostered the idea that life on the Earth and other planets may have begun in similar environments. John Holloway at Arizona State University constructed a large experimental apparatus to simulate the hydrothermal vents. In 1998 his pressurized experiments were producing a tiny black smoker in a tank of cool saltwater, precipitating sulfides and other minerals. The object of the experiment was to find out whether the reactions, originally free of life-forms, would produce organic chemicals, the ingredients of life.

      The oceanic crust, partially hydrated by the circulating ocean water at the mid-ocean ridges, is eventually carried back into the Earth's interior at subduction zones, where the oceanic lithosphere penetrates to depths of at least 670 km (415 mi). The subducted rock is heated as it descends, and the water driven off participates in the generation of the explosive arc volcanoes associated with subduction, such as those in the Ring of Fire encircling the Pacific Ocean. Geotimes in 1998 reviewed some current experiments and ideas related to the experimental formation of hydrated minerals at high pressures and temperatures corresponding to 400 km (250 mi) or deeper within the Earth. Such minerals have the potential to store subducted water if any water escapes the melting process and volcanism and is carried deeper into the Earth. Maarten J. de Wit (University of Cape Town) outlined a process relating water at mid-ocean ridges and subducted slabs to the volume of ocean water. If more water is carried down in subduction than is released in arc volcanism, the sea level will fall. If the mid-ocean ridges are thus exposed, hydration of the ocean crust will be less efficient and less water will be available for subduction, which could later lead to a net flux of water from mantle back to the ocean. Such a mechanism could possibly regulate the volume of the oceans.

      Study of the diversity and extinctions of species requires correlation between the geologic record containing fossils and the geochemical study of minerals that has made it possible to date the ages of rocks. Samuel A. Bowring (Massachusetts Institute of Technology) and Douglas H. Erwin (National Museum of Natural History, Washington, D.C.) reported in 1998 that the integration of detailed paleontology and high-resolution uranium-lead geochronology "has revolutionized our knowledge of several important episodes in geological history." The geologic approach is to find fossiliferous sedimentary rocks interlayered with volcanic rocks, after which geochemists use mass spectrometers to measure the isotopic ratios of uranium and lead in zircons separated from the lavas or volcanic ash beds. The combination of high-precision geochronology and detailed field studies produced remarkable results. Uranium-lead dating of the mineral zircon can now define zircon ages with uncertainties of less than one million years. This precision is available for zircons in the age range of 200 million-600 million years, which includes the beginning of the Cambrian Period and the Cambrian explosion of life represented by the abrupt appearance of a wide range of fossils. On the basis of these studies, the age of the beginning of the Cambrian was determined to be younger than it had been according to the classical time scales. It was considered to be 590 million years in 1982 and 570 million years in 1983, and in 1998 it was reduced to 543 million years.
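The arithmetic behind such uranium-lead ages is the radioactive decay law. A minimal sketch using only the 238U-206Pb pair follows; real zircon work combines it with the 235U-207Pb pair and corrects for common lead, and the sample ratio below is chosen simply to reproduce the 543-million-year Cambrian boundary age:

```python
import math

LAMBDA_U238 = 1.55125e-10  # decay constant of uranium-238, per year

def u_pb_age(pb206_u238):
    """Age in years from a measured radiogenic 206Pb*/238U atomic
    ratio, via the decay law N = N0 * exp(-lambda * t), rearranged
    to t = ln(1 + Pb*/U) / lambda."""
    return math.log(1.0 + pb206_u238) / LAMBDA_U238

# A zircon with 206Pb*/238U of about 0.0879 yields the age assigned
# in 1998 to the base of the Cambrian:
print(round(u_pb_age(0.0879) / 1e6))  # 543 (million years)
```

Because the measured ratio enters through a logarithm, a ratio uncertainty of a fraction of a percent translates into the sub-million-year age precision described above.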

      This precision in dating was also permitting the determination of the rates of evolution of species. It was demonstrated that the Cambrian explosion of life was much faster than previously recognized, lasting no more than 10 million years. Among the several known mass extinctions of life-forms, the disappearance of dinosaurs and many contemporary species from the fossil record 65 million years ago is the most familiar. Most scientists now believe that this extinction was caused by climatic changes associated with the impact of an asteroid, a meteorite, or a comet, about 10 km (6 mi) in diameter, into the ocean and underlying sedimentary rocks near Yucatán in Mexico. There are, however, proponents for the argument that massive volcanic eruptions, as exemplified by the Deccan Traps of India, caused the climatic changes. The most severe mass extinction occurred at the end of the Paleozoic Era, now dated at 251 million years ago. At that time 85% of all marine species, about 70% of land vertebrates, and many plants and insects disappeared. Using high-precision mass spectrometry, researchers were able to show that the extinction occurred in less than one million years, a much shorter time than had previously been assumed. The cause of the extinction remained unresolved, but this discovery placed constraints on the kinds of processes that might have been responsible, such as the aggregation of the supercontinent of Pangaea, glaciation or global warming, volcanic eruption of excessive carbon dioxide into the atmosphere, or impact by an extraterrestrial body.


      On Jan. 10, 1998, a magnitude-6.2 earthquake in northern China killed at least 50 people, injured at least 11,500, and left 44,000 homeless. Resulting fires added to the total destruction, reported to have been 70,000 houses destroyed or badly damaged. There was also some damage to the Great Wall in Hebei province. Two other shocks notable for their severity were one of magnitude 6.1, on February 4 on the Afghanistan-Tajikistan border, and one of magnitude 6.9, which struck the same area on May 30. The first resulted in the deaths of more than 4,000 persons, injured 818, destroyed 8,094 homes, and killed more than 6,700 livestock. The second was even more destructive, killing as many as 5,000 and injuring many thousands. Extensive landslides contributed to the catastrophes.

      These earthquakes were located in almost real time by the U.S. Geological Survey (USGS) in Golden, Colo. This service, which began in 1928, made a major leap forward in 1958 when a rudimentary program was developed to calculate earthquake epicenters by computer, and it made another in the early 1960s when the U.S. government developed and deployed standard seismograph systems to 125 sites around the globe. Although it had been continually upgraded and modernized, the network provided only a portion of the data used in the location process. One of the items tabulated was the number of station reports used in each determination. This number frequently reached 200 and for a very large shock exceeded 500. The USGS routinely located 15,000-20,000 events each year. The depth, seismic moment, several types of magnitude, and other factors were included with each epicenter.
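The computer location procedure described above rests on a simple principle: the epicentre is the point whose predicted travel times best reconcile the arrival times reported by many stations. A deliberately simplified sketch (flat geometry, a single constant P-wave speed, and made-up station coordinates; none of this reflects the USGS's actual algorithm) illustrates the idea with a coarse grid search:

```python
import math

V_P = 6.0  # assumed uniform P-wave speed, km/s

def locate(stations, arrivals):
    """stations: list of (x, y) positions in km; arrivals: observed times (s).
    Returns the best-fitting (x, y, origin_time) on a coarse 1-km grid."""
    best = None
    for x in range(0, 101):
        for y in range(0, 101):
            # Origin time implied by each station for this trial epicentre
            t0s = [t - math.hypot(x - sx, y - sy) / V_P
                   for (sx, sy), t in zip(stations, arrivals)]
            t0 = sum(t0s) / len(t0s)
            misfit = sum((ti - t0) ** 2 for ti in t0s)
            if best is None or misfit < best[0]:
                best = (misfit, x, y, t0)
    return best[1], best[2], best[3]

# Synthetic event at (40, 70) km with origin time 0
stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
true = (40.0, 70.0)
arrivals = [math.hypot(true[0] - sx, true[1] - sy) / V_P for sx, sy in stations]
print(locate(stations, arrivals))  # -> (40, 70, 0.0)
```

Real locations solve the same inverse problem in three dimensions with layered-Earth travel-time tables and hundreds of station reports, which is why report counts above 200 are routine.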

      In spite of the large number of active stations, there were areas of the Earth that were not well covered because its surface is about 70% water. To help alleviate this problem, the Scripps Institution of Oceanography, La Jolla, Calif., and the Woods Hole (Mass.) Oceanographic Institution formed an international group, the Ocean Seismic Network. They planned to install 20 permanent ocean-bottom seismometers in remote locations to augment data from existing stations. In 1998, with funding from the Ocean Drilling Program and the National Science Foundation, scientists successfully installed a pilot station south of Hawaii that included a seismometer in a borehole, a broadband seismometer on the ocean floor, and another in the bottom mud. The stations were designed to include magnetometers, acoustic arrays, climate and ocean current instruments, and tsunami (tidal wave) detectors.

      Studies during the year were aimed at determining the nature of the upwelling of melt materials of the undersea mantle beneath the East Pacific Rise. The Mantle Electromagnetic and Tomography Experiment, funded by the U.S. National Science Foundation, engaged scientists from nine institutions around the world. Fifty-one ocean-bottom seismometers were deployed in the region, where the plates were spreading at a rate of 15 cm (6 in) per year, among the fastest anywhere on the Earth. After researchers gathered seismic data for six months, an array of more than 40 instruments that measured the electromagnetic fields generated in the Earth by particle currents in the ionosphere was installed, and data from the instruments were gathered for another year. The detection of slow seismic velocities across the array indicated the existence and concentration of melt materials and passive, plate-driven flow, and the conductivity measurements revealed whether the melt areas were connected. The melt distribution was found to be asymmetrical, with a concentration to the west of the crest of the East Pacific Rise. This seemed to indicate that the magma forms over a relatively broad area and is then channeled to the surface along the narrow ridge to form crust. Investigators were not sure whether the asymmetry was due to thermal structure or geologic composition.

      The well-defined seismic discontinuity at a depth of 410 km (255 mi) was widely believed to be due to a high-pressure phase change in olivine, but recent studies revealed that the increase in velocity in some areas was too large to be explained by that mechanism. Two scientists from Ehime University, Matsuyama, Japan, postulated that the problem was in the assumption of a fixed composition for olivine. They concluded that olivine must, in varying degrees, exchange its iron and magnesium with other minerals in the mantle, such as majoritic garnet. In this manner the olivine would become denser and sustain a higher velocity.

      Volcanoes had long been recognized as prone to landslides because of the relatively unconsolidated materials that form their slopes, but it was usually assumed that an eruption was required before the slopes would give way. Recently, however, researchers at the Open University in the U.K. discovered that an eruption is not necessary. While studying a long-dormant volcano in Nicaragua, Benjamin van Wyk de Vries found that two conditions make a volcano susceptible to such slides. First, the crevices must be filled with hot acidic gas, which weakens the rocks. Second, the weight of the mountain tends to push the weakened material outward at the base. This spreading usually forms a gradual, evenly distributed ring of material around the base, but if the terrain is such that the force is directed asymmetrically, an avalanche may occur. Since dormant volcanoes were not monitored, van Wyk de Vries feared that many populated areas of the world were in unrecognized danger of landslides.

      The Tsunami Warning System, centred on Oahu in Hawaii, was founded by the U.S. Coast and Geodetic Survey after the devastating wave produced by the magnitude-7.8 Aleutian earthquake on April 1, 1946. The effectiveness of the system depended on the difference between the velocity of the sea wave, up to 965 km/h (600 mph), and the seismic wave velocities, ranging up to 29,000 km/h (18,000 mph). Through timely reporting of seismograph readings from stations of the international circum-Pacific network, large shocks could be located in minutes, and, if the epicentre was in an area where a tsunami might be generated, warnings could be issued to all points. This system worked well many times and saved hundreds of lives. Since only a small percentage of likely large shocks produce tsunamis, however, there was a problem with false alarms. To reduce this problem a network of tide stations was queried to determine whether a wave had actually been generated. This method was time-consuming, however, and its effectiveness was limited by communications difficulties.
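The usable warning time is simply the gap between the arrival of the seismic waves (which allow the shock to be located) and the arrival of the far slower sea wave. A rough sketch (the 15-minute processing delay and the source distance are assumptions chosen for illustration):

```python
def warning_time_hours(distance_km, v_tsunami_kmh=965.0, v_seismic_kmh=29000.0,
                       processing_min=15.0):
    """Lead time between tsunami arrival and earthquake detection at a
    coastal point, minus an assumed delay for locating the shock and
    issuing the warning."""
    t_wave = distance_km / v_tsunami_kmh   # hours for the sea wave
    t_seis = distance_km / v_seismic_kmh   # hours for the seismic waves
    return t_wave - t_seis - processing_min / 60.0

# A hypothetical Aleutian source ~3,600 km from Hawaii leaves
# a bit under three and a half hours of warning
print(round(warning_time_hours(3600.0), 1))  # -> 3.4
```

The arithmetic makes plain why the system works across an ocean basin but offers little protection to coasts near the epicentre, where the two arrival times nearly coincide.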

      The National Oceanic and Atmospheric Administration had by 1998 begun to set up a supporting network of ocean-bottom pressure recorders and seismic detectors in several areas believed likely to generate tsunamis. The data from these instruments were to be used to develop methods of detecting and locating tsunamis in real time and thus allow more warning time and the calculation of more exact arrival times and wave heights.

      The Ocean Drilling Program (ODP) continued its long-term objectives of establishing the history of sea-level change and its influence on sedimentation. ODP Leg 174A began drilling 129 km (80 mi) east of Atlantic City, N.J. Some 800 cores were obtained and then submitted for laboratory studies. The information was then to be combined with the oxygen isotopic record. The coordinated analyses of these data were expected to provide a more accurate history of global sea-level change.


Meteorology and Climate
      The strong El Niño that began in 1997 continued into the first few months of 1998 before abruptly fading. A cold episode, La Niña, developed during the last half of the year. El Niño made an impact on weather over many parts of the world early in 1998, contributing to heavy winter rains in California and Florida, drought in Mexico and Central America, and floods and drought in South America. Widespread above-normal ocean temperatures contributed to the unusual warmth recorded over much of the globe. Preliminary data from land and ocean temperature observations through August indicated that 1998 would be the warmest year on record.

      During winter 1997-98 numerous Pacific storms affected California, causing floods and landslides. Heavy rains and severe weather struck the southeastern U.S., especially Florida, into the spring. A historic outbreak of tornadoes on February 23 took 42 lives in Florida. Another outbreak killed 39 in Georgia and Alabama on April 8-9; most of the deaths were caused by a single F5 twister (winds over 418 km/h [260 mph]) near Birmingham, Ala., which killed 32. Northward displacement of the northern jet stream brought mild weather to the Midwest and Northeast, which resulted in a dearth of snow in low-elevation areas. Washington, D.C.'s 1997-98 snowfall total of 0.25 cm (0.1 in) tied that of 1972-73 as the lowest on record. The relative warmth contributed, however, to one of the worst ice storms of the century in upstate New York, northern New England, and eastern Canada during January. It left more than two million homes and businesses without power and caused tremendous damage to utilities and trees. The weather in the southern U.S. changed markedly in the spring as warm and dry conditions spread northward from Mexico. A severe drought contributed to a record number of wildfires in Florida from late May into early July. Despite scattered heavy rains in July, April-July rainfall was the lowest in more than 100 years in Florida. Texas and Louisiana also recorded the driest April-July ever. Extreme heat aggravated the drought, with June-July temperatures averaging the highest on record in Texas, Louisiana, and Florida. Tropical rains in August and September finally broke the drought in Texas and, to a lesser extent, Oklahoma, with agricultural losses estimated at $4 billion in those two states alone.

      The 1998 Atlantic tropical cyclone season was active, highlighted by the rampage of Hurricanes Georges and Mitch through the Caribbean and eastern Gulf of Mexico. The Caribbean track of Georges on September 20-25 cost more than 400 lives and left more than 100,000 homeless, mainly on Hispaniola, where some mountainous locations recorded over 500 mm (20 in) of rain. Georges crossed the Florida Keys into the Gulf of Mexico on September 25, hitting the Mississippi coast three days later with maximum sustained winds of 170 km/h (106 mph). Storm surges and torrential rains caused flooding from Louisiana to Florida. Southern Mississippi totaled 380-500 mm (15-20 in) of rain, and isolated reports exceeded 600 mm (24 in) in northwestern Florida and southern Alabama.

      A scant month later Georges was dwarfed by Mitch, one of the deadliest hurricanes of the 20th century, which reached Category Five on October 26. Blocked from moving northward by a strong front, Mitch hung off the coast of Honduras for four days, causing torrents of rain (as much as 600 mm [2 ft] a day) that in turn caused catastrophic flooding and mud slides. End-of-year figures listed 9,021 dead in five Central American countries (most in Honduras and Nicaragua), one million homeless, another million persons affected, and the infrastructures of the worst-hit countries devastated.

      Five other storms hit the U.S. earlier in the season, most notably Hurricane Bonnie, which crossed North Carolina on August 26-27. One day after Tropical Storm Charlie moved into Texas on August 22, Del Rio measured a record 301 mm (11.85 in) of rain.

      In Mexico torrential rains, exceeding 400 mm (16 in) during September 6-12, triggered massive flooding in Chiapas. Mud slides and swollen rivers cut off 400,000 people. In contrast, during the first half of the year, fires abetted by drought consumed forests and grasslands on hundreds of thousands of hectares across Mexico and Central America. Drought affected northeastern Brazil during the first half of the year, but storms and floods killed hundreds and caused widespread damage in coastal Ecuador and Peru from November 1997 to May 1998. In February alone more than 700 mm (28 in) of rain inundated northern Peru's coast. Heavy rains from January to April caused major flooding in northern Argentina, Paraguay, Uruguay, and southern Brazil.

      In South Asia the southwest monsoon produced catastrophic floods during the summer, killing more than 2,000 in India and over 1,000 in Bangladesh. In addition, a tropical cyclone packing 185-km/h (115-mph) winds and over 125 mm (5 in) of rain struck northwestern India on June 9, killing more than 600. In China heavy rains emptying into the Chang Jiang (Yangtze River) caused extensive flooding during July and August, resulting in more than 2,000 deaths. Along parts of the Chang Jiang from February 1 to August 18, over 2,000 mm (79 in) of rain fell, more than twice the normal amount. Summer floods struck northeastern China and South Korea; September typhoons battered Japan and flooded the Philippines. El Niño-related heat and dryness affected Indonesia, Malaysia, and the Philippines during the first part of the year, producing widespread smoke and haze. Summer drought hurt crops in Kazakstan and parts of Russia, and July-August heat and dryness led to a rash of fires in the Russian Far East, where August rainfall totaled less than 25% of the normal amount.

      In Africa heavy rains in January caused flooding in Kenya. Drier weather early in the year, however, relieved flooding in Somalia, where torrential rains during October-December 1997 had inundated large parts of the south.


      After 1997 had brought the strongest El Niño since 1982-83, the equatorial Pacific upper ocean by early 1998 had begun to cool from the anomalously warm levels of the previous year. Instead of simply returning to normal conditions, however, equatorial Pacific sea-surface temperatures continued to decline until they were several degrees below the long-term average. El Niño thus was replaced by La Niña, a condition that is in many ways its reverse. As a result, climate-related matters continued to dominate oceanographic research as well as marine and coastal resource management during 1998.

      Under normal circumstances Pacific equatorial trade winds blow from the east and are particularly strong in the eastern Pacific. On account of the Earth's rotation, these strong winds force surface waters both northward and southward away from the Equator. Colder water upwells from depths of many tens of metres to replace the poleward-flowing surface water, so that a tongue of cold surface water extends thousands of kilometres westward of South America along the Equator. The trade winds normally extend well into the western Pacific, but there they are usually weaker than in the east. The upper ocean is much warmer in the western than in the eastern Pacific, and the warm layer is thick, so that upwelling normally does not bring cold water to the surface. The result is that in the western Pacific the warm surface water evaporates into the atmosphere. When the warm and moist air reaches moderate elevations, the moisture condenses as rainfall. The far western Pacific is thus normally a region of widespread and intense rainfall.

      During an El Niño the trade winds weaken or even reverse, and eastern Pacific equatorial upwelling ceases so that the entire equatorial eastern Pacific Ocean is several degrees warmer than the long-term average. The region of rising moist air normally found in the western Pacific migrates eastward into the central tropical Pacific. The normally wet far western Pacific thus becomes a region of low rainfall and even drought, whereas the rainfall at normally dry central tropical Pacific islands increases dramatically. Tropical storms in the Pacific are more frequent and occur over larger areas of the ocean during an El Niño.

      In the La Niña that developed during 1998, the trade winds were strong, and the sea-surface temperature in the eastern equatorial Pacific was several degrees below the long-term mean. In Indonesia, in the far western Pacific, the drought and accompanying forest fires of 1997 were replaced by heavy rains that caused flash floods and mud slides.

      Among the most important oceanic effects of an El Niño are changes in sea level. During much of 1997, for example, the sea level along the coasts of Peru-Ecuador and of southern California was 15-25 cm (6-10 in) above the long-term average. Part of this was attributable to the thermal expansion of the anomalously warm surface waters, but changes in the pattern of ocean currents also played a role. In 1998 researchers carried out a study spanning much of the eastern north Pacific to determine the relative importance of these two effects. The temperature of the water from top to bottom was monitored by measuring the time required for sound waves emitted from an acoustic transmitter located atop a seamount on the seafloor about 100 km (60 mi) west of San Francisco to reach receiving stations located across the Pacific to the west and southwest. Travel times were measured from December 1995 through March 1997. Because the speed of sound in water depends on the water temperature, such times could be used to estimate the heat content of the entire water column over much of the northeastern Pacific during that time. Ocean currents were reconstructed from a combination of traditional measurements at sea and satellite measurements of the deviation of the sea surface from the shape it would assume if there were no currents (the geoid). Such measurements had been carried out routinely since 1992 by the Topex/Poseidon altimetric satellite. In order to make the best use of the physical understanding of the dynamics of ocean currents, all these observations were used as inputs into a numerical model of Pacific Ocean currents, and the model then constructed the current system that was most compatible with both the observations and physical theory. The result was that only about half of the seasonal and year-to-year changes in sea level are due to thermal expansion of the water; the rest result from shifts in the pattern of ocean currents.
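The inversion from travel time to temperature in the seamount experiment rests on the fact that sound travels faster in warmer water, so basin-wide warming shortens the travel time. A simplified sketch (the mean sound speed and its temperature sensitivity are assumed round numbers, and a uniform path is assumed, unlike the depth-resolved analysis actually performed):

```python
C0 = 1500.0   # assumed mean sound speed in seawater (m/s)
DCDT = 4.0    # assumed sensitivity of sound speed to temperature (m/s per deg C)

def mean_warming_from_travel_time(path_m, dt_seconds):
    """Path-averaged temperature change inferred from a travel-time change.
    Since t = L/c, a small speed change gives dt = -L*dc/c**2,
    and dc = DCDT * dT; solving for dT inverts the measurement."""
    return -dt_seconds * C0 ** 2 / (path_m * DCDT)

# An arrival 0.18 s earlier over a 1,000-km path implies about
# a tenth of a degree of path-averaged warming
dT = mean_warming_from_travel_time(1.0e6, -0.18)
print(round(dT, 2))  # -> 0.1
```

The tiny inferred signal shows why acoustic thermometry is attractive: a temperature change far too small to map ship by ship produces a cleanly measurable timing shift over a long enough path.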

      During 1998 researchers continued to study possible oceanic effects on climate patterns over timescales of years to thousands of years. Deep-sea sediment cores revealed that millennial-scale climate shifts as documented in, for example, ice cores from Greenland were accompanied by changes in the rate of sinking of water from the surface in the far north Atlantic. A somewhat similar process may be important in modulating the strength and frequency of El Niño episodes. The temperature of surface waters in the northwestern Pacific and Atlantic is set by wintertime air-sea interactions. These waters sink below the surface and are carried to the Equator by the large-scale circulation, where, years afterward, they may affect the surface temperature and, consequently, the strength of the trade winds. Spurred by this possibility, researchers concentrated on reconstructing the pathways and travel times of such upper-ocean water masses, using numerical models of the circulation constrained by shipboard and satellite observations.


▪ 1998


Geology and Geochemistry
      In 1797 James Hutton died and Charles Lyell was born. Their contributions to geology were recounted and celebrated at the Hutton/Lyell Bicentennial Conference, held in London and Edinburgh in 1997. Hutton conceived the imaginative Theory of the Earth (published 1788 and 1795). His work is encapsulated in the famous quotation "No vestige of a beginning, no prospect of an end," which introduced a sense of time, or timelessness, to geologic processes. In 1830 Lyell published Principles of Geology, which affirmed and consolidated Hutton's ideas. Lyell's work also marked the beginning of a long period in which most geologists concentrated on the mapping and study of rock formations and considered interpretation of the Earth's interior inaccessible and astronomy irrelevant. It was only through the insights provided by the theory of plate tectonics in the 1960s that the relationship of geology to global geophysics and geochemistry became thoroughly appreciated. Don Anderson (California Institute of Technology [Caltech]) presented the paper "A New Theory of the Earth" at the 1997 Edinburgh celebration, in which he demonstrated that essentially all of mantle geochemistry, tectonics, and petrology can be understood in terms of geophysical processes involving the Earth's mantle and crust.

      Calibration of the "no-beginning, endless" time of Hutton with respect to observed rock sequences has been a central theme in geology. A quantitative geologic time scale did not become possible until the 1950s, with the application of isotopic studies of minerals. The discovery during the 1960s that the Earth's magnetic field reversed its polarity at intervals, leaving records in magnetized rock that could be calibrated by radiometric methods, provided techniques for dating magnetized sedimentary rocks back through several million years. Many sedimentary rocks display a cyclicity (in which alternating layers differ in chemical characteristics, sediment properties, and fossil communities) that is generally attributed to oscillations in climate. Considerable effort has been directed toward correlating climatic oscillations with perturbations in the Earth's orbit and rotational axis, which affect the solar energy reaching the Earth's surface. In 1997 F.J. Hilgen (University of Utrecht, Neth.), with colleagues W. Krijgsman, C.G. Langereis, and L.J. Lourens, reported a breakthrough in dating of the recent geologic record. They compared cyclic marine sedimentary sequences with curves showing the computed variations in the precession of the Earth's rotation axis (its gyration so as to describe a cone), the obliquity (the angle between the planes of the Earth's Equator and orbit), and the eccentricity of the Earth's orbit and concluded that the alternations reflected precession-controlled variations in regional climate. The sedimentary cycles, dated by the magnetic polarity reversal time scale, were used to calibrate the astronomical time scale, which by 1997 had been established for the past 12 million years and appeared to be more accurate and have higher resolution than the other time scales. Research during the year was directed toward finding a correlation between marine and continental sedimentary sequences and extending the astronomical time scale to earlier times.
These findings could lead to a better understanding of paleoclimatology and climate modeling.
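Astronomical tuning works by matching the observed sedimentary cycles against a computed target curve built from the orbital periodicities. A toy sketch (the periods are the well-known approximate Milankovitch values; the weights are arbitrary, and real tuning targets use full orbital solutions rather than simple cosines):

```python
import math

# Approximate Milankovitch periods in years: precession ~23 kyr,
# obliquity ~41 kyr, eccentricity ~100 kyr; weights are illustrative only
CYCLES = [(23_000.0, 1.0), (41_000.0, 0.6), (100_000.0, 0.4)]

def tuning_target(age_years):
    """Toy orbital 'tuning curve': a weighted sum of cosines at the
    Milankovitch periods, against which dated sedimentary cycles
    can be matched peak for peak."""
    return sum(a * math.cos(2.0 * math.pi * age_years / p) for p, a in CYCLES)

# At age 0 all components are in phase, so the curve peaks at the sum
# of the weights
print(tuning_target(0.0))  # -> 2.0
```

Once a stretch of section is matched to such a curve, every cycle inherits an astronomical age, which is how the time scale reached back 12 million years with resolution finer than radiometric dating alone.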

      Concern about the prospects for and consequences of global warming gave urgency to research in paleoclimatology. Rocks, as well as cores drilled from ice sheets, contain the record of past climatic changes, and evidence confirmed that during the past several hundred thousand years there were significant swings in temperature. In 1997 Sarah J. Fowell (Lamont-Doherty Earth Observatory, Palisades, N.Y.) and John Peck (University of Rhode Island) reported on results obtained from a 1996 reconnaissance in Mongolia to study the climatic variability recorded in sediment cores drilled in lakes. The location was important because its climate is transitional between the Siberian subarctic region and the Asian monsoon belt, and climatic changes should therefore leave high-resolution records in the lake sediments.

      Studies of the sediment cores for variations in pollen and spores, magnetic properties, and carbon isotopes were to be correlated with temperature estimates from oxygen isotope measurements of shells and fossilized horse teeth. A sequence of fossil soils indicated that the Gobi Desert in Central Asia expanded and contracted dramatically during the last glacial-interglacial cycle, between 24,000 and 35,000 years ago. An ice core drilled from an old glacier on the Tibetan Plateau also supported the idea of an unstable climate.

      The geochemistry of ancient ice layers drilled from the ice sheets of Greenland provided compelling evidence for large temperature increases, many of which appeared to have occurred abruptly. The ice, made up of layers of trapped snow, air, and dust extending back almost 250,000 years, was analyzed for variations in oxygen and hydrogen isotopes (reflecting temperature changes), dust and ash (wind patterns and volcanic eruptions), ammonia (distant forest fires), and several other variable geochemical tracers.

      On the basis of discoveries in the layers, geologists concluded that the end of the last glacial period, 10,000 years ago, did not occur through centuries as previously assumed but probably happened within a few decades—less than a human lifetime. Thus, the evidence suggested that climate change could conceivably occur quite suddenly and be completed within a few years if the current industrial society disturbed the delicate balance of the atmosphere with continued emission of greenhouse gases. The change could involve either global warming or global cooling.

      The distribution of glacial rock deposits produced by the latest ice age confirms that the polar ice sheets left uncovered a wide equatorial belt, extending locally well into middle latitudes. D.A. Evans, N.J. Beukes, and J.L. Kirschvink (Caltech, Rand Afrikaans University) published in 1997 a discovery in Africa that indicated the formation of an ice sheet that approached equatorial regions. The only other unequivocally glacial rock deposits known through the 4 billion years of Precambrian history (older than 540 million years) are aged 600 million-800 million years. Some of these rocks are found in Australia, with measurements indicating that they too were formed near the Equator. An interpretation of these two Precambrian events is that they represented severe, globally inclusive ice ages, a model called the "Snowball Earth." Once such a condition is reached, reflection of sunlight should tend to keep the Earth glaciated, and the fact that the Earth recovered both times indicates a resilience to extreme perturbations in climate. Evans suggested that reheating of a Snowball Earth might be caused by carbon dioxide released by the impact of a comet or asteroid or by large volcanic outpourings.
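The sunlight-reflection feedback invoked for a Snowball Earth can be seen in a zero-dimensional energy balance. A minimal sketch (no greenhouse effect, the present-day solar constant, and illustrative albedo values of 0.3 for the modern Earth versus 0.6 for an ice-covered one):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)
SOLAR = 1361.0    # solar constant (W m^-2)

def effective_temp(albedo):
    """Equilibrium effective temperature of a planet with no greenhouse:
    absorbed flux S*(1 - a)/4 balances emitted flux sigma*T**4."""
    return (SOLAR * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

# Present-day albedo versus an ice-covered planet:
print(round(effective_temp(0.3)))  # -> 255 (K)
print(round(effective_temp(0.6)))  # -> 221 (K)
```

The roughly 33-K drop shows why an ice-covered Earth tends to stay frozen: the ice itself keeps the planet too cold to melt it, so escape requires an external push such as the volcanic carbon dioxide buildup Evans suggested.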

      Detailed studies following the 1991 eruption of Mt. Pinatubo in the Philippines confirmed that dust and sulfuric acid aerosols have measurable effects on global temperatures and other climatic factors. The potential effects of volcanoes were demonstrated in 1997 by the devastation caused by the continuing eruptions of the Soufrière Hills volcano on the island of Montserrat, which began in 1995, and the June 30 eruption of the huge volcano Popocatépetl, near Mexico City, which became active in 1994. According to a report by Simon Young of the British Geological Survey and four coauthors, flows of sulfur dioxide from Soufrière Hills monitored by spectrometer observations of the plume ranged from 50 to 500 tons per day—moderate compared with many other volcanoes—but flows up to 1,000 tons per day associated with periods of enhanced dome growth and emissions of lava and ash were observed. Changes in the volume of the volcanic dome were being measured from a helicopter by an innovative technique, using range-finding binoculars and the Global Positioning System. A comparison of the mineralogy and textures of the lavas with the findings from studies on similar compositions was providing estimates of the water content of the magma, rates of magma ascent, and degassing conditions.

      The ash plume from Popocatépetl, the largest in 72 years, rose higher than 6.4 km (4 mi) and had a diameter of 55 km (34 mi). Rain mixed with the ash covered many of the 30 villages around the base of the volcano and deposited a layer of sludge on Mexico City, 72 km (45 mi) away. Mexican scientists stressed that there was less than a 10% chance of an imminent major eruption.

      Mt. Pinatubo, Soufrière Hills, and Popocatépetl are arc volcanoes, associated with oceanic subduction, the descent of the edge of one oceanic plate beneath another. The explosive eruptions of such volcanoes are caused by the downward transfer of oceanic water and carbon dioxide during subduction and its subsequent transfer back to the surface dissolved in magmas formed at high pressures. The geologic and geochemical processes occurring in this environment constitute a vital link in the recycling of the Earth's crust. An international meeting, State of the Arc, was held in Australia at the University of Adelaide in 1997 to study how new geochemical knowledge (from studies of uranium, thorium, and an isotope of beryllium) and new techniques (laser-based methods for the analysis of small inclusions of lavas in minerals) were shaping the development of models for subduction processes. Despite many advances in analytic techniques and computations, the report of the conference by Simon Turner (the Open University, Milton Keynes, Eng.) acknowledged the complexity of the problem with its final statement: "Thus there is still much to be done."

      This article updates geologic science (geology).

      There were no great earthquakes in 1997, and of the nine shocks with magnitudes of seven or greater, only one exceeded 7.1. With a magnitude of 7.9, it struck on April 21 in the Santa Cruz Islands, a part of the Solomon Islands, where it generated a tsunami that caused minor damage along the coasts of the Solomon and the Vanuatu islands. The two shocks that caused the most fatalities were those of February 28, magnitude 6.1, in the border region of Armenia, Azerbaijan, and Iran, which resulted in 965 deaths; and of May 10, magnitude 7.1, in northeastern Iran, where some 1,560 died. In all, at least 2,855 people lost their lives as a result of earthquakes.

      Though the level of seismic activity was low, this was not necessarily a good thing. Plates continue to move, and stresses continue to grow. It is generally true that the longer the interval since the last quake, the larger the next one is likely to be. One phenomenon, the slow earthquake, may, however, help to reduce this danger in some instances. Slow earthquakes release strain energy very slowly and are difficult to detect. They produce no seismic waves, and their movement is too small to be detected by satellites or other conventional means. They are detected by instruments that measure gradual movement along a fault interface. Research on these quakes has been under way for several years, with the latest work being done on an event recorded in 1992 at the juncture between locked and sliding sections of the San Andreas Fault in central California. Along the sliding sections of a fault, stress is reduced gradually by means of slipping and small earthquakes, whereas stress tends to build over relatively long periods on a locked fault until it is released abruptly by a large earthquake. The 1992 slow event was the slowest ever recorded, having been more than a thousand times slower than an ordinary shock. A series of events with several episodes of varying slip times occurred at depths ranging from 0.1 to 4+ km (1 km = 0.62 mi). The surface area of the fault affected was 30 sq km (11.5 sq mi), and the strain release was equal to a normal earthquake of magnitude 4.8. It had a total displacement of only a few centimetres. Current studies seemed to indicate that the amount of slow redistribution of stress is indicative of the size of the next regular shock. The 9.5-magnitude Chilean earthquake of 1960 was preceded by a slow earthquake very large in extent with a cumulative slip of several metres, whereas a 5.8-magnitude shock in Japan in 1978 was preceded by a slow earthquake that produced a slip of about one metre (3.28 ft). 
Many scientists believe that these slow events are part of the total seismic process and may act as a trigger for the larger shocks.
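The statement that 30 sq km of fault with a few centimetres of slip matches a magnitude-4.8 earthquake can be checked with the standard seismic-moment relation. A sketch (the shear modulus is a typical assumed crustal value, and the 2.5-cm slip is chosen to reproduce the reported figure):

```python
import math

RIGIDITY = 3.0e10  # assumed crustal shear modulus (Pa)

def moment_magnitude(area_m2, slip_m, mu=RIGIDITY):
    """Seismic moment M0 = mu * A * d (newton-metres), converted to
    moment magnitude via Mw = (2/3) * (log10(M0) - 9.1)."""
    m0 = mu * area_m2 * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# 30 sq km of fault slipping ~2.5 cm releases the strain energy
# of a magnitude-4.8 earthquake
mw = moment_magnitude(30.0e6, 0.025)
print(round(mw, 1))  # -> 4.8
```

Because the moment depends on slip and area rather than on how fast the fault moves, a slow event of this size redistributes exactly as much stress as an ordinary magnitude-4.8 shock, just without radiating seismic waves.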

      Russian, Mongolian, and American seismologists were studying a major fault system in Central Asia's Gobi Desert that strongly resembles the San Andreas Fault system in southern California. A point of special interest was the Altai-Gobi earthquake of 1957, during which a strike-slip fault (in which displacement along the fault plane is horizontal) and an adjacent thrust fault (in which one block is driven up and over the other) ruptured simultaneously, producing a shock with a magnitude of about 8.0. The team spent two seasons in the field mapping the displacements and found them similar in size and orientation to those of the Fort Tejon earthquake of 1857, during which approximately 300 km of the San Andreas Fault ruptured with displacements of up to 10 m. There was evidence that some movement had occurred at the same time on the thrust system on the northeastern side of the nearby Elkhorn Hills. The investigators concluded from this evidence that a similar simultaneous rupture could occur along the San Andreas Fault and the large Sierra Madre/Cucamonga thrust fault in the San Gabriel Mountains in the Los Angeles area. Skeptics acknowledged the similarities between the Gobi and California geologic structures but held that, because of the much faster rate of fault movement in California, the differences outweighed the similarities.

      On the basis of their studies of geodetic records from a period surrounding an earthquake of 1868, two geophysicists from Stanford University found evidence that challenged the long-accepted theory that earthquakes are contained within fault segments that limit their spread. Previously known as the San Francisco earthquake, this magnitude-7.0 shock caused damage along 51.5 km of the Hayward Fault in California from south of Fremont north to Berkeley. The ground rupture was thought to have stopped at San Leandro, but the records revealed that it continued 48 km farther to Berkeley and possibly beyond there, though there were no stations to record it farther north. The researchers had to rework the data because the original surveyors did not know that earthquakes distorted the surface. The reworked data showed there had been a maximum relative movement along the fault interface of two metres and that the rupture had broken through the boundary between what had been assumed to be northern and southern sections of the main fault. Their findings were corroborated by investigators from the U.S. Geological Survey, who found evidence of the rupture in an exploratory trench in Oakland.


Meteorology and Climate
      Early in 1997 atmospheric and oceanic patterns across the tropical Pacific were indicative of a rapidly evolving warm episode, commonly known as El Niño. During the next few months, some of the largest El Niño effects of the 20th century developed.

      In late December 1996 and early January 1997, heavy precipitation and unseasonably mild air caused considerable snowpack melting, which resulted in serious river flooding across the western United States, from central California and northern Nevada northward into Washington and Idaho. In early March severe weather involving tornadoes and torrential rains affected the Ohio, Tennessee, and lower Mississippi valleys. In Arkansas, where tornadoes claimed 26 lives, Arkadelphia was hardest hit when an F4 tornado (wind speeds of 333-418 km/h [207-260 mph]) tore through the town. Severe river flooding developed along the central and lower portions of the Ohio River and middle sections of the Mississippi River. In April flooding developed in the Northern Plains after unseasonably mild weather caused rapid melting of the deep snowpack, which led to rapid runoff and ice jams that pushed many streams out of their banks. The Red River at Fargo, N.D., topped the record flood crest observed a century earlier, and at Grand Forks, N.D., the river exceeded its 500-year statistical recurrence level. In early April a change in the jet stream brought unseasonably cool conditions to the eastern two-thirds of the United States that persisted until mid-June. Heavy rains, occasionally accompanied by severe weather, affected the south-central and southeastern U.S. throughout the spring. In May an F4 tornado killed 27 people in Jarrell, Texas. Dryness developed across the mid-Atlantic region in April, and many areas recorded one of their driest April-August periods. In late October the first major snowstorm of 1997-98 buried the central Rockies and High Plains with 30-130 cm (1-4 ft) of snow.

      The 1997 Atlantic hurricane season was rather tranquil, with seven named storms. Only one, Danny, affected the U.S. The eastern Pacific hurricane season, although average in number of storms, was marked by several that were intense. In mid-September Hurricane Linda, packing winds of 300 km/h (185 mph), became the strongest eastern Pacific hurricane on record but never made landfall. At the end of September, however, Hurricane Nora pounded southwestern Baja California, Mex., with 250-km/h (155-mph) wind gusts. As Nora moved northeastward, up to 250 mm (10 in) of rain soaked Mexico's northern Gulf of California coast, and 50-150 mm (2-6 in) of rain deluged much of the U.S. desert Southwest. In early October Hurricane Pauline battered the southwestern coast of Mexico, including the resort town of Acapulco; more than 400 lives were lost.

      Above-normal temperatures dominated South America through late April, particularly along the Pacific coast, where elevated sea-surface temperatures associated with the strong El Niño event had a direct influence. During late July, in the middle of the Southern Hemisphere winter, high temperatures in central Argentina reached 34° C (93° F) as far south as latitude 32° S.

      In Europe the year commenced with bitterly cold weather gripping much of the continent, as temperatures averaged 3° C to 13° C (5° F to 23° F) below normal. Canals in The Netherlands froze over for only the 15th time in 100 years. In late January, however, unusually mild and dry weather developed across the continent and persisted for several weeks. Late in March dryness abruptly abated as copious precipitation fell in western and central Russia, southeastern and north-central Europe, and, especially, central Scandinavia. Farther west, rain soaked much of continental Europe from mid-May through mid-July. Flooding occurred in parts of Poland, the Czech Republic, Slovakia, and Austria. In Poland and the Czech Republic 100 people lost their lives. Unseasonable warmth developed in the Mediterranean basin during late May and overspread much of Europe, especially Scandinavia, throughout the summer.

      Warmth covered much of northern Africa during early January, with highs reaching 38° C (100° F) in parts of southeastern Niger and northwestern Senegal. In southern Africa rainfall was above normal the first four months of the year. Late in January Cyclone Gretelle pushed across southeastern Madagascar, dropping 200-250 mm (8-10 in) of rain. A month later two tropical cyclones, Josie and Lisette, fueled torrential downpours in southeastern Africa. After a dry beginning across east-central Africa, heavy rains developed in late March and early April and spread westward to the Gulf of Guinea coast by mid-June. Rainfall deficiencies, however, developed across the western Sahel by early August, and above-normal temperatures in that region and the Gulf of Guinea area aggravated the dryness during September and October.

      Unseasonably mild weather covered much of Asia during January. By contrast, temperatures averaged well below normal during March, April, and early May across most of the Indian subcontinent. Tropical Cyclone 01B caused widespread damage as it tracked through southeastern Bangladesh during mid-May. Unofficial reports placed the death toll at some 100 people, with more than a million people homeless. Typhoon Peter dumped torrential rains on South Korea and western Japan in late June, and a month later Typhoon Rosie crossed Japan, raising six-week (mid-June through July) precipitation excesses to 530 mm (21 in). Two weeks later a fourth typhoon, Tina, brought more heavy rains to western Japan and South Korea. Meanwhile, a sequence of three typhoons (Victor, Winnie, and Zita) doused southern China with excessive rains. As August ended, another pair of tropical systems (Amber and Cass) brought heavy rains and strong winds to eastern China. To the south, heavy rains affected much of western Indonesia, Malaysia, and extreme southern Thailand during May, but as the summer progressed, intense dryness, regarded as an effect of El Niño, overspread Indonesia. The lack of precipitation abetted numerous wildfires through September and October, with heavy smoke affecting health and transportation throughout much of Southeast Asia.

      Two tropical cyclones (Phil and Rachel) brought heavy rain and strong winds to northern Australia as the year began. Surplus rains persisted across northern Australia during January, and frequent February precipitation ended dryness across New South Wales and southeastern Queensland. At the end of February, the remnants of Tropical Cyclone Gillian caused locally heavy rains in northeastern Queensland. In March Tropical Cyclone Justin brought strong winds and heavy rain to southeastern Cape York Peninsula, but much drier weather prevailed elsewhere. By early May significant dryness covered northeastern Australia after the rainy season ended early.


Oceanography
      The occurrence of a major El Niño dominated oceanographic research as well as planning for marine and coastal resource management during 1997. The term El Niño originally referred to the occurrence of warm southward ocean currents every few years near the coasts of Ecuador and Peru during the Southern Hemisphere summer, when local winds are weakest. Local inhabitants called the phenomenon El Niño ("the Child") in reference to the Christ Child, since it normally occurred around Christmas. It signaled both a shift in local weather and a shift in the biology of the coastal ocean. Occasionally the event is extraordinarily strong, and scientists now recognize that the strong episodes involve climatic anomalies that may begin in the tropics but ultimately extend over the entire Pacific Ocean and even beyond. Such large-scale events are now termed El Niño/Southern Oscillation, or ENSO, events, though El Niño remains the common name. The most extensive El Niño since 1982-83 began in 1997.

      One of the most unusual aspects of this El Niño was the rapidity with which researchers and the public became aware of it. During previous episodes tropical observations had been sparse. They were often not available until moored instruments had been recovered, and even then they were not routinely disseminated rapidly; thus, the onset of an El Niño was recognized only retrospectively. Since late 1994, however, instrumented buoys had spanned the equatorial Pacific, sending observations of surface winds, upper-level ocean currents, and water temperatures via satellite to researchers every day. As a result, governmental agencies had an unprecedented opportunity to plan rationally for the possible effects of the episode.

      Under normal circumstances, winds at the Equator are from the east (the southeast trade winds) and are particularly strong in the eastern Pacific. On account of the Earth's rotation, surface waters are forced both northward and southward away from the Equator by these strong winds. Water upwells from depths of many tens of metres to replace the offshore-flowing water. This upwelled water is several degrees colder than the surface water it replaces, so that a tongue of cold water extends along the Equator several thousand kilometres westward of South America. During an El Niño, however, the trade winds in the eastern Pacific weaken or even reverse, and equatorial upwelling there ceases, so that the entire equatorial eastern Pacific Ocean becomes anomalously warm by several degrees Celsius. The system of trade winds normally extends well into the western Pacific, but there it is usually weaker than in the eastern Pacific, and the layer of warm surface water is much thicker; consequently, upwelling normally does not bring cold water to the surface. As a result, in the western Pacific evaporation normally puts water vapour into the atmosphere, and the ocean heats the atmosphere so that the moist air rises. The far western Pacific is, therefore, normally a region of widespread and intense rainfall. During an El Niño, however, the region of rising moist air migrates far eastward into the central tropical Pacific. The normally wet far western Pacific becomes a region of low rainfall and even drought, whereas rainfall at the normally dry islands of the central tropical Pacific increases dramatically.
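The wind-driven transport described above follows from the classical Ekman relation: an easterly stress at the surface drives water poleward on both sides of the Equator, and the divergence must be fed by upwelling. A minimal sketch, in which the 0.05-Pa wind stress is an assumed, typical trade-wind magnitude rather than a figure from the text:

```python
# Hedged sketch of the Ekman-transport argument: an easterly wind
# stress near the Equator forces surface water poleward on both
# sides, and the resulting divergence is balanced by upwelling.
import math

OMEGA = 7.292e-5   # Earth's rotation rate, rad/s
RHO   = 1025.0     # seawater density, kg/m^3

def meridional_ekman_transport(tau_x, lat_deg):
    """Meridional Ekman volume transport per metre of zonal distance
    (m^2/s): V_E = -tau_x / (rho * f), with f = 2*Omega*sin(lat)."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    return -tau_x / (RHO * f)

tau_easterly = -0.05  # Pa, wind stress directed toward the west (assumed)
north = meridional_ekman_transport(tau_easterly, +5.0)  # > 0: northward
south = meridional_ekman_transport(tau_easterly, -5.0)  # < 0: southward
print(north > 0 and south < 0)  # transports diverge away from the Equator
```

The sign change across the Equator (because the Coriolis parameter f changes sign) is what makes the transports diverge, which is why the weakening or reversal of the trade winds during an El Niño shuts the upwelling off.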

      At one time researchers believed that most of the variability in the atmosphere sprang from the processes that generate storms at middle and high latitudes, but now it is clear that much of the variability of weather and climate has its origins in the tropics. El Niño is simply one of the largest and best-studied tropical phenomena; its effects on the atmosphere and the ocean extend far beyond the tropics. The best-known of these effects are profound changes in the marine populations in the rich fisheries of western coastal North and South America. Less well understood but possibly even more important are El Niño effects on sea level and storminess along those coasts, as well as on climate at latitudes far removed from the tropics.

      The first indication that something was out of the ordinary came in December 1996, when normally westward-blowing trade winds briefly reversed direction in the far western Pacific. Although this change produced little effect at the ocean surface, it generated a deepening of the equatorial warm-water layer and caused the layer to spread eastward to South America, where it arrived by March 1997. Western Pacific trade winds decisively reversed direction in February 1997, generating another eastward-moving deepening of the equatorial warm-water layer. The region of reversed trade winds began to expand eastward across the Pacific and by December extended as far east as the longitude of California. The combination of deepening warm-water pulses and weakening trade winds resulted in a warming of the far eastern tropical Pacific that was first noticeable in May; by the year's end the warming had spread westward, with temperatures several degrees Celsius above normal both at the International Date Line and in the eastern tropical Pacific. The plentiful rainfall that accompanies normally strong evaporation in the far western Pacific gave way to drought there, with an unusual incidence of prolonged forest fires.

      See also Business and Industry Review: Energy; Mining; Disasters; The Environment; Life Sciences.

      This article updates hydrologic sciences.

▪ 1997


Geology and Geochemistry
      More than 5,000 geologists attended the 30th International Geological Congress in Beijing during August 1996. Song Ruixiang, president of the congress, outlined the role of geology in China's five-year plan, emphasizing the search for minerals and petroleum with a view to protection of the environment. Increasing recognition that environmental protection is one aspect of resource exploitation was also apparent at the 1996 annual meeting of the Geological Society of America in Denver, Colo., during October. Of some 200 technical sessions, 25% addressed the ways that Earth science is relevant to environmental problems, ranging from ground-water contamination to the cleanup of radioactive waste. At the General Assembly of the International Council of Scientific Unions in Washington, D.C., in September, much attention was paid to the "sustainable development" of society through the next century. The problems and progress were presented in a booklet, Understanding Planet Earth, which described processes occurring in the outer layers of the Earth during the fairly recent past as a basis for predicting future changes.

      The Earth may be in transition from an ice age to a global greenhouse, with the rate of change probably being enhanced by society's contributions of greenhouse gases such as carbon dioxide to the atmosphere from the combustion of fossil fuels. A recent report by Robert Gastaldo of Auburn (Ala.) University and two colleagues analyzed the changes in vegetation worldwide during the two icehouse-greenhouse transitions that occurred in the late Paleozoic (about 300 million and 275 million years ago). Plant life changed during the geologically short time interval of 1,000 to 10,000 years; the primeval forests were replaced by vegetation dominated by seed plants. Recognizing such patterns of change would, the geologists believed, help them make predictions about future changes.

      Geologists everywhere were concerned that although the need for interdisciplinary science for environmental management is recognized, the central role of geology in both resource acquisition and environmental problems was not appreciated by policy makers and the public in general. There was a scarcity of geologists among scientific advisers to government at all levels. Consequently, many efforts were under way to educate the public and policy makers about the reciprocal relationship between geology and society and the ways in which the world's aggressive agricultural and industrial activities are changing the biosphere and the geologic cycles.

      The geochemical activities of the biosphere (the outer shell of the world where life exists) may help compensate for the degradation of the environment by human activities. For example, J. Craig Venter of the Institute for Genomic Research in Rockville, Md., and his team reported the complete genetic identification of a tiny, single-celled organism collected in 1983 from a hot submarine hydrothermal vent in the Pacific Ocean, 1,600 km (995 mi) from Baja California. Since the DNA and genes of the organism differ from those of organisms in the two major groups of living things, the prokaryotes and eukaryotes, it had been assigned to a third branch of life called archaea. It was proposed that up to 20% of the Earth's biomass may consist of this organism and its relatives, which are associated with the hot vents of the deep oceans. The ability of the organism to recycle methane and digest heavy metals, converting them into other compounds, might one day be exploited by humans.

      Another discovery of previously unknown organisms, reported in August, generated much excitement and debate. David S. McKay (see BIOGRAPHIES) of NASA's Johnson Space Center, with eight coauthors, reported evidence for the occurrence of bacterial microfossils in a 4.5-billion-year-old meteorite from Mars that reached Earth about 13,000 years ago. The meteorite contains cracks filled with carbonate material, presumably deposited from solution at a time when Mars still supported free water. The carbonates contain organic material and structures resembling microfossils, along with iron sulfide and magnetite minerals similar to those produced by bacteria on the Earth. Some scientists believed that inorganic processes could yield the same products. A later investigation by Colin Pillinger and colleagues at the Open University, Milton Keynes, Eng., found carbon isotope ratios in the sample consistent with those formed by microscopic life-forms on Earth. Pillinger also reported similar findings for a second meteorite from Mars that was only 600,000 years old.

      The process of evolution—the history of the biosphere—is recorded both in rocks and in the genes of animals. Recent advances in molecular biology were revealing molecular evidence of evolution that had yet to be reconciled with the fossil evidence. Gregory Wray, Jeffrey Levinton, and Leo Shapiro at the State University of New York at Stony Brook studied the genes of more than 200 species of 16 animal groups. They reported that the huge genetic differences they discovered between the groups, which they calibrated against changes in dated fossils of the many species, indicated that the animals last shared a common ancestor as long ago as 1.2 billion years. In contrast, the evidence from the fossil record was that nearly all known groups of animals appeared during a few million years in the early Cambrian Period, about 540 million years ago.
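The molecular-clock calibration described above can be illustrated with simple arithmetic. All of the numbers below are illustrative assumptions, not values from the study: a substitution rate is first calibrated on lineages whose divergence is dated by fossils and then applied to the larger genetic distances between animal groups.

```python
# Hedged sketch of molecular-clock arithmetic.  Genetic distance
# accumulates along both diverging branches, hence the factor of 2.
def divergence_time_ma(genetic_distance, rate_per_ma):
    """Time since two lineages shared a common ancestor, in millions
    of years, given distance (substitutions per site) and the
    per-branch substitution rate (per site per million years)."""
    return genetic_distance / (2.0 * rate_per_ma)

# Calibration step (assumed numbers): lineages known from dated
# fossils to have split 100 Ma ago show a distance of 0.05.
rate = 0.05 / (2.0 * 100.0)   # substitutions/site per Ma per branch

# Applying that rate to a much larger inter-group distance (assumed):
print(round(divergence_time_ma(0.6, rate)))  # millions of years -> 1200
```

With these assumed inputs the calculation yields 1,200 million years, i.e., on the order of the 1.2-billion-year divergence reported, which shows how a modest per-site distance and a fossil-calibrated rate combine to push the common ancestor far earlier than the Cambrian.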

      There was little evidence to show how life evolved before the Cambrian Period until one of the greatest discoveries about evolution in many years was reported by John Grotzinger at the Massachusetts Institute of Technology and three colleagues at the end of 1995. They explored rocks of Cambrian and older ages in Namibia and found a large selection of fossils in rocks of Vendian age, older by tens of millions of years than the Cambrian. The time interval just before the Cambrian Period was suddenly filled with a great variety of previously unknown, complex life forms.

      Paleontology and evolutionary biology were both challenged by this discovery. The Cambrian fossils were preserved because they contained shells or skeletons. One possibility was that the animals had existed and evolved as soft-bodied creatures through perhaps 500 million years until predators evolved, which led to the development of hard body parts as protection. Further geologic studies in selected older rocks and better precision for the molecular clock were required.

      Many geologic and geochemical processes are intimately involved with the biosphere. The Ocean Drilling Program reported another discovery in September. The research vessel JOIDES Resolution was drilling about 240 km (150 mi) west of Vancouver Island, British Columbia, when two new hot springs were created on the seafloor. One site was inspected by lowering an underwater camera to the seafloor, 2,448 m (8,031 ft) deep. Hot water was rushing out of the hole so fast that it was carrying mud and rock fragments and forming a cloud more than 30 m (100 ft) above the seafloor. These submarine hydrothermal vents are formed when seawater circulates through hot volcanic rocks, often located where new oceanic crust is being formed, and the hot solutions emerging into cold seawater precipitate mineral deposits rich in iron, copper, zinc, and other metals. This was the first opportunity to watch how a new hydrothermal vent and the animal communities that thrive in those environments grow and change with time. One of the biggest mysteries is how the animal communities manage to migrate from one vent to another.

      Hydrothermal vents also occur on submarine volcanoes. Loihi, a growing volcano discovered in 1954 approximately 30 km (20 mi) southeast of the island of Hawaii, rises 3,500 m (11,480 ft) from the seafloor to about 1,000 m (3,280 ft) below sea level. An intense swarm of more than 4,000 earthquakes during July and August was accompanied by the conversion of a cone called Pele's Vents into a crater 260 m (850 ft) wide and 300 m (985 ft) deep, now called Pele's Pit. Alexander Malahoff of the Hawaii Undersea Research Laboratory organized an expedition with a research ship and a submarine to map and sample the reshaped volcano. The researchers found new fractures and hydrothermal vents that were more active than before. The new vents were covered with huge mats of chemosynthetic bacteria, and the water above Loihi was turbid and teeming with a "soup of life."

      Geologists expected that Loihi would grow and eventually merge with the big island of Hawaii to become the successor to the volcanoes Mauna Kea, Mauna Loa, and Kilauea, which would become extinct as they were carried away from the plume of hot rock rising from the Earth's interior. Details of the growth of those massive volcanoes, and of the deep mantle plume from which their lavas were derived, were being determined from deep drilling through the flanks of Mauna Loa and Mauna Kea. The drilling yielded information unavailable from surface reconstructions and had already established that the previous view of the growth stages of Hawaiian volcanoes was incorrect. During the year the National Science Foundation recommended funding of a new drill hole to a depth of approximately 4.5 km.

      The gases emerging from volcanoes play a crucial role on the Earth. The global carbon cycle, connecting the biosphere with rocks, air, and water, may be considered to begin in volcanic gases. Occasional massive eruptions pump such large quantities of carbon dioxide and acid gases into the atmosphere that global climate may be modified for years. It was reported by Peter Francis and colleagues at the Open University that they were able to measure the concentrations of several components of volcanic gases from a distance by using Fourier-transform infrared spectroscopy. (PETER J. WYLLIE)

Geophysics
      Seismic activity was high during recent months. One of the largest earthquakes occurred on Oct. 9, 1995, near the coast of Jalisco, Mex., and left 19 persons dead, more than 100 injured, and at least 1,000 homeless, mostly in Colima. The quake was felt in Mexico City and by persons in high-rise buildings as far away as Houston and Dallas, Texas, and in Oklahoma City, Okla. A tsunami estimated to have reached a maximum height of 5 m (17 ft) was generated. It was registered throughout the Pacific Basin, in the Marquesas Islands, the Hawaiian Islands, French Polynesia, Western Samoa, and even Southport, Australia, where its peak-to-trough amplitude was four centimetres.

      Five shocks occurred with magnitudes of 7.9: on Dec. 3, 1995, in the Kuril Islands; on Feb. 17, 1996, in Indonesia; on June 10 in the Andreanof Islands off the coast of Alaska; on June 11 near the Philippine island of Samar; and on June 17 in the Flores Sea near Indonesia. Although the quake in the Andreanofs caused a tsunami that was registered in Hawaii, Crescent City, Calif., and Port Angeles, Wash., only the earthquake of February 17 caused fatalities and appreciable damage. It left 108 dead, 423 injured, and 58 missing and destroyed or seriously damaged more than 5,000 homes, some owing to a tsunami.

      It is not always the most powerful earthquakes that are the most destructive. The most devastating earthquake of 1996 occurred on February 3, in Yunnan province, China, where at least 251 people were killed and more than 4,000 were injured. It was estimated that 329,000 homes were destroyed throughout northwestern Yunnan and that one million people were left homeless. The magnitude of the shock was 6.6. Another shock on Oct. 6, 1995, in southern Sumatra, magnitude 6.7, killed 84, injured more than 1,800, damaged more than 17,000 homes, and left 65,000 homeless.

      Two smaller earthquakes, of magnitude 5.9 each, caused fatalities. The first, on March 28 in Ecuador, killed at least 19 and injured 58; the second occurred on May 3 in western Nei Mongol (Inner Mongolia), China, and left 18 dead and 300 injured. A total of 15 earthquakes of magnitude 7.0 or greater occurred. February was exceptionally active, with eight shocks of magnitudes between 6.0 and 6.9 and four of magnitude 7.0 or higher.

      The most notable volcanic activity was the continuing series of eruptions of the Soufrière Hills volcano on Montserrat in the West Indies, which began on July 18, 1995. It was the first volcanic activity recorded on the island since Columbus visited it in 1493. The volcano began by producing clouds of ash that slowly increased in duration. New vents opened on July 18 and July 30. Low-level activity continued until an ash explosion formed a third vent on August 20, when some 5,000 people were evacuated. On August 27 there was a magma eruption, producing a lava flow and an ash cloud that coated the nearby city of Plymouth and blotted out the light for 25 minutes. On the next day ejecta were hurled as far as three kilometres (two miles) from the summit. This time 6,000 residents took refuge at the northern end of the island, which is only 13 km (8 mi) in length. Three more times the situation became ominous enough that further evacuations took place, in November 1995, April 1996, and September 1996.

      While seismic activity and other indicators had been effective in predicting eruptions in many instances, additional methods were needed. During the year a geochemist, Tobias Fischer, at Arizona State University found one that appeared to have great promise. Volcanologists had determined that the mechanics of volcanism result in a predictable chain of events. The molten magma gives off a volatile mix of gases containing carbon dioxide, hydrogen chloride, and sulfur dioxide. These escape under great pressure, forcing out rainwater in the form of steam. When the tubes or fissures become clogged, owing to the accretion of minerals or to cooling of the surface rock, pressure builds within them until it is released with explosive force. In June 1992 Fischer began monitoring the content of the gases escaping from the active Galeras volcano in Colombia. He found definite changes in gas temperature and in the percentages of the various chemical species before an eruption. In the week prior to the latest large event, the temperature of the surface rock dropped from 750° C to just over 400° C, and the proportion of the very soluble hydrogen chloride dropped to one-thirtieth of its former value, while that of the insoluble carbon dioxide remained unchanged.

      Fischer reasoned that the water was not expelled because the channels were blocked; instead, it seeped into the rocks, where it dissolved the highly soluble hydrogen chloride out of the rising gases. The resulting drop in hydrogen chloride thus indicated blocked conduits, which in turn indicated a pressure buildup and an imminent eruption.
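The monitoring logic above reduces to tracking the ratio of a soluble gas to an insoluble reference gas. The following is a minimal sketch of that idea; the sample concentrations and the tenfold threshold are illustrative assumptions, not values from Fischer's work.

```python
# Hedged sketch: a sharp fall in the HCl share of the gas mixture,
# while insoluble CO2 holds steady, is read as a sign of blocked
# conduits and rising pressure.  All numbers are assumed.
def conduits_blocked(hcl_before, hcl_after, co2_before, co2_after,
                     drop_factor=10.0):
    """Flag a blockage when the HCl/CO2 ratio falls by at least
    drop_factor (CO2 serves as the stable reference species)."""
    ratio_before = hcl_before / co2_before
    ratio_after = hcl_after / co2_after
    return ratio_before / ratio_after >= drop_factor

# HCl falls to one-thirtieth of its former share; CO2 is unchanged:
print(conduits_blocked(hcl_before=30.0, hcl_after=1.0,
                       co2_before=20.0, co2_after=20.0))  # True
```

Normalizing by CO2 rather than watching HCl alone guards against overall changes in gas flux masquerading as a solubility signal.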

      According to the long-accepted theory of isostasy, mountains float on the denser mantle, much as icebergs float in the ocean. The mass of a mountain range below the surface of the ground, extending downward as much as 60 km (37 mi), is greater than that of the visible portion. This is apparently not so for the Sierra Nevada range, however, according to Stephen Parks and his team, who began working on the Southern Sierra Continental Dynamics project in 1992. Using extensive seismic-refraction surveys, they determined the thickness of the crustal root between the base of the mountains and the mantle to be only about five kilometres (three miles). Furthermore, electrical-resistivity surveys showed that there were areas of partially molten rock beneath the crust. This indicated to the investigators that the mountain roots were being melted and were less than half the size they had been 15 million-20 million years ago. Thus, in theory, the Sierra Nevada chain, which contains Mt. Whitney, the highest peak in the contiguous U.S., should be sinking rather than rising. Further research efforts would be designed to map the magma, date deep cores, and attempt to determine whether the Sierras were higher in the past.
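The iceberg analogy can be made concrete with the Airy buoyancy balance: a mountain of height h is supported by a low-density crustal root of depth r, where crust displaces denser mantle. A minimal worked example, with assumed, typical density values (not figures from the study):

```python
# Hedged sketch of Airy isostasy: rho_c * h = (rho_m - rho_c) * r,
# so the root is several times deeper than the mountain is high.
# Both densities are assumed, typical values.
RHO_CRUST  = 2800.0  # kg/m^3, continental crust (assumed)
RHO_MANTLE = 3300.0  # kg/m^3, upper mantle (assumed)

def airy_root_km(elevation_km):
    """Depth of the compensating crustal root beneath the normal base
    of the crust, from the buoyancy balance above."""
    return elevation_km * RHO_CRUST / (RHO_MANTLE - RHO_CRUST)

# A range standing 4 km above its surroundings would need a root of:
print(round(airy_root_km(4.0)))  # ~22 km
```

With these densities the root is about 5.6 times the surface elevation, which is why a root of only about five kilometres beneath the high Sierra was so surprising.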

      Another study of geodynamics produced rather startling results. The most widely accepted theory of plate tectonics supposes that the plates float on the mantle or at least move independently of it. Recently, however, researchers from the Carnegie Institution of Washington, D.C., and the University of São Paulo, Brazil, found a fossil plume buried deep in the mantle beneath the Paraná flood basalts that had remained stationary with respect to the South American plate. This finding demonstrated that a portion of the mantle was moving with the plate and that the mantle and the continent had been coupled since the opening of the South Atlantic Ocean about 130 million years ago. (RUTLAGE J. BRAZEE)

Meteorology and Climate
      Aided by advanced numerical models, the scientific understanding of the atmosphere—and of the interactions between the ocean and atmosphere—and the ability to forecast large- and small-scale meteorologic and hydrologic phenomena on a variety of time scales have increased dramatically during the past two decades. Rapid technological advances have also increased the capability to collect and process vast amounts of atmospheric data. This knowledge and technology provide meteorologists and hydrologists with many research opportunities that are expected to lead to improved forecasts.

      Long-term outlooks for periods of as much as a year into the future are now possible. While such long-range predictions do not have the precision of tomorrow's forecast, they can provide useful planning information for such industries as utilities, agriculture, and water-supply management. One basis of seasonal prediction is the ocean-atmosphere interaction in the tropical Pacific Ocean. Research on this interaction has enabled the prediction of tropical sea-surface temperature variations as much as a year in advance. With this knowledge forecasters have been able to predict seasonal temperature and rainfall variations over North America. Increasingly sophisticated regional models of the atmosphere are being developed to bring these forecasts down to the regional scale.

      Global-scale climate changes based upon the possible consequences of increases in "greenhouse" gases in the atmosphere are being studied. These gases, which include carbon dioxide, can affect climate and weather by modifying the radiative characteristics of the atmosphere.

      As the accuracy of models of large-scale changes in the atmosphere increases and as computers become faster, research efforts will continue to improve medium-range (three-to-five-day) forecasts. That these efforts have paid dividends was demonstrated when the forecast models developed by the National Weather Service accurately predicted the superstorm of March 1993 five days in advance. Five-day forecasts in 1996 were as good as three-day forecasts had been 15 years earlier.

      Considerable research was also taking place in regard to short-term forecasts. Improved models of the atmosphere have resulted from the incorporation of sophisticated representations of physical processes, such as the effects of ocean temperature and topographic variation at the Earth's surface. Such research has led to rapid progress in "mesoscale" meteorology—the meteorology of severe local storms.

      Because short-term and long-term meteorology is global in scope, it has historically fostered international cooperation. Efforts were expected to continue in such areas as the exchange of real-time data, scientific collaboration, and technology transfer. One example was in the area of river forecasting and water management. The performance of recently implemented forecast systems (using U.S. river-forecasting techniques) during the extensive flooding in the summer of 1996 in China was widely praised.

      In spite of these improvements in forecasting, some of the most deadly meteorological menaces, such as tornadoes, lightning, and flash floods, still could not be forecast with total precision. In an effort to improve such forecasts, the U.S. deployed advanced observing instruments, such as Doppler radar, satellites, and telemetering observation systems, to provide real-time data in order to mitigate the loss of life from rapidly evolving small-scale meteorological events. Doppler radar can detect the speed and direction of wind as well as precipitation within developing storms. This capability allowed early detection of severe thunderstorms and tornadoes and also provided precipitation estimates important to the forecasting of floods. Geostationary satellites provided images of storm systems as frequently as every six minutes during severe weather situations. Automated surface-observing systems provided a significant increase in the number of observing sites, including many airports.
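The Doppler principle behind these radars relates the measured frequency shift of the returned pulse to the wind's radial velocity: v = c·Δf/(2f). A minimal sketch, assuming an illustrative S-band operating frequency (the specific frequency and shift values below are examples, not measurements from any actual radar):

```python
C = 3.0e8  # speed of light, m/s

def radial_velocity(freq_hz: float, doppler_shift_hz: float) -> float:
    """Radial velocity (m/s) of scatterers from the two-way Doppler shift
    of a radar pulse: v = c * df / (2 * f)."""
    return C * doppler_shift_hz / (2.0 * freq_hz)

# An S-band weather radar transmitting at 2.85 GHz that observes a
# 475 Hz frequency shift is seeing air moving radially at 25 m/s:
print(round(radial_velocity(2.85e9, 475.0), 1))  # -> 25.0
```

The factor of two arises because the motion shifts the frequency once on the way out and once again on the reflected return.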

      New forecast capabilities could also benefit the economy. A new field of meteorological application was unfolding as industries learned to apply improved weather products and services to their operations. The future of meteorology thus seemed certain to be an expanding collaborative endeavour between federal and state governments, academia, and the private sector. (ELBERT W. FRIDAY, JR.)

      Two distinctive features of research in oceanography during 1996 were the importance of new technology in carrying out observations and the evident necessity of observational programs extending over many years, not only for long-term monitoring but also for developing a conceptual background that would help researchers formulate new scientific questions.

      The Ocean Drilling Program (ODP) had its inception in an attempt, in 1961, to drill through the ocean floor to the Mohorovicic discontinuity separating the Earth's crust from the mantle beneath. This effort became the Deep Sea Drilling Project in 1968 and was transformed into the present ODP in 1984, when the drilling ship JOIDES Resolution was commissioned. When drilling was begun, the ideas of plate tectonics were in their infancy, and scientists' view of the events that shape the seafloor emphasized processes that occur over geologic time scales. But during the lifetime of the drilling programs, even more evidence has been found supporting the importance of sudden events. Several of these were observed in 1996. The ability to detect them and to put observers above them at sea while they were occurring was possible only because of advances in ocean technology during the past decade.

      In late February the U.S. Navy's Sound Surveillance System (SOSUS) detected seismic events near the northern Gorda Ridge about 350 km west of the coast of northern California (1 km = 0.62 mi). By early March scientists were at sea in the region sampling the water column, where they found a plume of heated water 10 km across that rose 1,500 m off the seafloor (1 m = 3.28 ft). Further studies in April and June found microorganisms that demonstrated the ability to grow at temperatures as high as 90° C (194° F) but could not grow at normal ocean temperatures.

      Loihi Seamount is an underwater volcano about 30 km southeast of the island of Hawaii; its summit, at a depth of about 1,000 m, was previously capped by a hydrothermal vent system. Seismicity was intense there for a month beginning in mid-July. Again, scientists were able to take field observations during the event. The newly changed seafloor was mapped acoustically; volcanic glass fragments were recovered, using submersible vehicles; and plumes of hydrothermally altered water were observed. By the end of the episode the summit vent system had collapsed into a broadened summit crater whose floor was 1,350 m deep. Continued volcanism at Loihi over thousands of years would ultimately build the summit upward to the ocean surface.

      In September, while the JOIDES Resolution was drilling into metal-rich deposits formed by an old and inactive hydrothermal vent system about 240 km west of Vancouver Island (British Columbia) and just a few kilometres from an active vent, two new vents were created. Repeated visits to this site were expected to provide a unique opportunity for scientists to learn how the particular collection of organisms that flourish only in the extreme conditions near the vent colonize a new site.

      The World Ocean Circulation Experiment (WOCE) began a global survey of the circulation of the world ocean in 1990. In the Atlantic, WOCE data based on analyses of the distribution of tritium in the water were beginning to give a consistent picture of the "age" of subsurface waters (the time elapsed since those waters participated in exchanges across the air-sea interface), a picture that would be important in refining estimates of such quantities as oxygen consumption by living organisms at different depths. Tritium is primarily a product of atmospheric nuclear weapons testing, and its distribution thus provides information about water motions since the 1960s.
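The "age" estimate follows from radioactive decay: tritium decays to helium-3 with a half-life of about 12.3 years, so the ratio of decay-produced (tritiogenic) helium-3 to remaining tritium in a water parcel fixes the time elapsed since that parcel last exchanged gases with the atmosphere. A sketch of the arithmetic (the function and its interface are illustrative, not WOCE's actual procedure):

```python
import math

TRITIUM_HALF_LIFE_YR = 12.32  # accepted half-life of tritium, years
LAMBDA = math.log(2) / TRITIUM_HALF_LIFE_YR  # decay constant, per year

def tritium_helium_age(tritium: float, helium3: float) -> float:
    """Years since a water parcel left the surface, from the ratio of
    tritiogenic helium-3 to remaining tritium (same units for both)."""
    return math.log(1.0 + helium3 / tritium) / LAMBDA

# A parcel holding equal amounts of tritium and tritiogenic helium-3
# has been isolated from the atmosphere for exactly one half-life:
print(round(tritium_helium_age(1.0, 1.0), 2))  # -> 12.32
```

Because bomb-produced tritium entered the oceans mainly in the 1960s, ages computed this way trace subsurface water motions over roughly the following three decades.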

      Chlorofluorocarbons have entered the oceans as by-products of industrial activity, primarily refrigeration and air-conditioning. The WOCE measurements of the chlorofluorocarbon distribution in the Pacific were completed in 1996; levels of those compounds were below detectability in the deep waters of the northern Pacific but were well above detectability in the deep waters of the southern Pacific and in deep northward-flowing Pacific currents.

      Most of the WOCE fieldwork was scheduled to be finished by the end of 1997. The project largely attained its goal of providing a basic global picture of the circulation of the ocean over a period of several years. An important part of that picture is the global pattern of heat transport and of water exchange between the air and the sea. One of the most important practical applications for this knowledge is climate prediction. Planning began for a global study of the coupled atmosphere-ocean climate system and its predictability on time scales of seasons to years. This Climate Variability and Predictability Program was scheduled to begin in 1998 and was to last for 15 years so that year-to-year variability could be understood adequately.

      The complexity and variability of seafloor and fluid environments greatly complicate efforts to understand the abundance and variability of marine populations. Even so, during 1996 technological developments originating in physical oceanography made possible an open ocean test of the hypothesis that in regions of the ocean where nutrients and light are available in abundance yet phytoplankton populations are lower than expected, it is a lack of iron that is the limiting factor. In work carried out in 1995 and reported in 1996, an area about 30 km on a side in the equatorial eastern Pacific was initially surveyed to check that temperature and salinity, as well as biological and chemical conditions, were uniform, so that the sinking of colder or saltier water relative to adjacent warmer or fresher water would be minimal. A small part of this region, about eight kilometres on a side, was then seeded with iron (as acidic iron sulfate) mixed with the inert tracer sulfur hexafluoride previously used to study vertical diffusion rates in California coastal waters and in the central Atlantic. A freely drifting buoy that constantly radioed its satellite-derived geographic position to the ship was used to mark the centre of the seeded patch. The ship carried out continuous surveys through the seeded patch around the buoy, measuring dissolved iron, sulfur hexafluoride, nitrate, and chlorophyll over a 19-day period. Chlorophyll levels increased by as much as 27 times several days after the last addition of iron, which indicated phytoplankton growth, and nitrate levels were correspondingly depleted. (MYRL C. HENDERSHOTT)

      See also Energy (Business and Industry Review); Mining (Business and Industry Review); Disasters; The Environment (Environment); Life Sciences.

▪ 1995


      The astronomical display produced in July 1994 by the predicted explosive collisions of a string of fragments from a shattered comet with the atmosphere of Jupiter (see Astronomy ) ranked as the most spectacular planetary event of the year. It also drew attention to the role of asteroid and comet impacts in the Earth's history. The Jovian impacts followed by a few months a major scientific conference in Houston, Texas, devoted to the events associated with the boundary between the Cretaceous (K) and Tertiary (T) geologic periods, 65 million years ago, when dinosaurs and many other species became extinct. It was 15 years earlier that the U.S. physicist Luis Alvarez and his geologist son, Walter Alvarez, had proposed that the extinctions were the result of climatic disruptions caused by the impact of a massive asteroid or comet at the end of the Cretaceous. The initial evidence was an increased concentration of the trace element iridium (rare in the Earth's crust but far more abundant in asteroids) discovered in a thin layer of sediment in rocks delineating the K-T boundary—an anomaly that proved to be global in extent.

      The impact proposal was hotly debated because the idea that a catastrophic event could cause profound changes in the geologic record and the course of evolution is opposed to the venerable geologic doctrine of uniformitarianism—the idea that geologic changes and evolution occurred gradually through a progression of processes similar to those seen to be acting at present. Scientific doctrine is not easily overturned, but many earth scientists were converted following the discovery and investigation since 1992 of a giant 65 million-year-old impact crater, at least 180 km and perhaps 300 km in diameter (1 km is about 0.62 mi), at Chicxulub in Mexico's Yucatán Peninsula. Converts stretched the doctrine of uniformitarianism to include the occurrence of occasional impact events, such as that observed on Jupiter. A crater about 35 km in diameter at Manson, Iowa, had previously been evaluated in connection with the K-T extinction and found to be too small. The observation of multiple impacts on Jupiter strengthened the proposal that the collisions that made the Manson and Chicxulub craters might have been part of a multiple event, although recent dating measurements indicated that the Manson crater may be older than 65 million years.

      In 1994 there were few skeptics who doubted that a major collision with an extraterrestrial body occurred 65 million years ago. Some maintained, nevertheless, that the dinosaurs were already in decline and that the impact merely accelerated the mass extinction that was under way as a result of the climatic disruption caused by an enormous eruption of basalt—the flood basalts known today as the Deccan Traps—in India 65 million years ago. The argument was bolstered by the fact that only the K-T boundary is characterized by an iridium anomaly and that the several other mass extinctions that took place in the past 500 million years, therefore, must have had some other cause. At the Houston meeting Vincent Courtillot of the Institute of Physics of the Earth, Paris, presented evidence of a strong correlation between the ages of mass extinctions and of continental flood basalts, and he concluded that continental flood basalt volcanism is the main candidate for most extinction events. Interpretation of the evidence depends critically on accurate age measurements of both mass extinctions and flood basalt eruptions and their durations. Recent improvements in dating allowed researchers to confirm that most of the large flood basalt events lasted for less than one million years. Some uncertainties about the precise age of the Chicxulub crater could be resolved by a new drilling project, which would permit sampling of rocks in and under the crater.

      Each continental flood basalt province represents a very large transfer of heat and material from within the Earth to its surface within a very short time. Recently a mechanism for the concentrated transfer was proposed that involved a modification of the concept of mantle plumes, cylinders of relatively hot rocks in the mantle (beneath the crust) that are rising slowly from perhaps as deep as the core-mantle boundary, 2,900 km down. Initially solid owing to the high pressure deep in the Earth, the plumes begin to melt as they approach the surface, yielding basaltic lavas. It was argued that "superplumes" sometimes developed and that the head of such a superplume grew in size by entraining rock from the surrounding mantle during its upward flow. When the large mushroom-shaped plume head approached the surface, it generated the enormous volumes of continental flood basalts. Subsequent plume activity from the thinner stem of the plume produced lesser volcanic activity, corresponding, for example, to that which formed the Hawaiian Islands.

      Information about mantle plumes is based on fluid dynamics—i.e., on interpretation of small-scale laboratory experiments with different fluids—and on interpretation of the geochemistry of basalts. During the year a drilling experiment under way on the island of Hawaii was beginning to reveal more about the mantle plume that feeds lava to the volcanoes. The successive lava flows on the island represent samples of successive portions of the rising plume, and the accessible lavas on the volcanoes thus represent only the most recent history. A drill hole near Hilo 1,100 m (3,600 ft) deep first traversed lavas from Mauna Loa and then passed into lavas from Mauna Kea. According to Donald Thomas of the University of Hawaii, Donald DePaolo of the University of California at Berkeley, and Edward Stolper of the California Institute of Technology, the frequency and ages of flows indicate that the volcanoes may be twice as old as previously thought. Detailed geochemical studies of the lava samples taken from the drilling were expected to provide information about variations along the rising mantle plume. The earlier stages of volcanic growth from these plume-derived lavas were being sampled in ocean-drilling studies of Loihi, the youngest Hawaiian volcano, which is growing on the submerged flanks of Kilauea.

      Causal relationships have also been proposed between mantle plumes and the breakup of some continents, those having margins identified as having been created by volcanic rifting. It is widely believed that the northeastern Atlantic Ocean formed from a continental split that developed above a hot mantle plume, the ancestor of today's Iceland plume, and the possibility was explored during Leg 152 of the international Ocean Drilling Program in late 1993. Sites were drilled on the volcanically rifted margin of southeastern Greenland, and the first penetration through the volcanic cover into the underlying continental crust was achieved. Reports of results during 1994 revealed the tectonic and volcanic history of the continental breakup and confirmed the role of hot, buoyant mantle reaching fairly close to the surface in the rifting environment. Voluminous floodlike eruptions of basalt were in evidence. The upper series of lavas was richer in magnesium than normal oceanic basalts, indicating higher melting temperatures, but the trace-element geochemistry of the lavas was similar to that of normal mid-ocean ridge basalts, with no indication of basalts contributed from deep-seated mantle rocks, as would be expected if the lavas had been fed from a deep plume. Thus, a causal link between continental breakup and deep-seated mantle plumes was not yet established.

      Basaltic volcanism causes the major chemical differentiation of the Earth; that is, the extraction of the components of the crust, hydrosphere, and atmosphere from the Earth's interior. But a more extreme differentiation is accomplished by geomorphic, weathering, and sedimentary processes at and near the Earth's surface. Sediments as diverse as limestone (calcium carbonate) and sandstone (silicon dioxide) derive from original basalts and other lavas. The weathering, transportation, and redeposition of rocks and soil form the differentiated sedimentary rocks, with many processes involving biological activity. The result is a modified landscape, the familiar scenery of the Earth's surface. The important effects of biological agents are limited in magnitude and time, but during the year Roger Hooke of the University of Minnesota emphasized that this generalization breaks down when human beings are considered.

      The role of humans in landscape modification, although long recognized, had not been treated in textbooks of geomorphology. Hooke compared the efficacy of various geomorphic agents, humans included, on a global scale. The measure used was the mass of material moved from one location to another by unidirectional processes (the study thus excluded such processes as waves moving beach sand back and forth perpendicular to the shoreline and plows turning soil from furrow to ridge). According to Hooke, the amount of sediment moved by rivers is about 24 billion tons per year (24 Gt/yr), of which 10 Gt/yr is due to agriculture, while glaciers transport about 4.3 Gt/yr of material. Slope processes, wave action, and wind move only about 2.5 Gt/yr. Hooke estimated that the worldwide geomorphic activity of humans in earth moving, such as building excavations, mineral production, and highway construction, is about 30 Gt/yr, not including the 10 Gt/yr of river sediment due to agriculture. Humans were thus the most important geomorphic agent shaping the surface of the Earth.
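Hooke's comparison reduces to simple bookkeeping with the figures quoted above. A sketch of the arithmetic (the grouping into "natural" and "human" categories is an illustrative arrangement of his numbers, not Hooke's own table):

```python
# Approximate global mass-transport budgets, in gigatons per year (Gt/yr),
# from the figures attributed to Hooke in the text.
natural = {
    "rivers (natural share)": 14.0,   # 24 Gt/yr total minus 10 from agriculture
    "glaciers": 4.3,
    "slopes, waves, and wind": 2.5,
}
human = {
    "river sediment due to agriculture": 10.0,
    "earth moving (excavation, mining, roads)": 30.0,
}

natural_total = sum(natural.values())
human_total = sum(human.values())
# Humans move roughly twice as much material as all natural agents combined.
print(natural_total, human_total)  # -> 20.8 40.0
```

Even allowing generous uncertainty in each estimate, the human total comfortably exceeds the natural one, which is the basis for the claim that humans had become the dominant geomorphic agent.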


      On June 9, 1994, seismologist Waverly Person, cataloger and archiver for the U.S. Geological Survey's National Earthquake Information Center, Golden, Colo., was perplexed. The automated earthquake-location system had just located a great earthquake of magnitude 8.2 at latitude 13.2° S, longitude 67.6° W in Bolivia at a depth of 617 km (1 km is about 0.62 mi). Although the area experiences considerable seismic activity, the shock was exceptionally large considering its great depth. The location, depth, and magnitude were later found to be correct and, in any case, were not the cause of Person's concern. Shortly after the shock, people across a wide area of the U.S. began reporting that they had felt an earthquake. The suggestion was made that the reports were connected to the Bolivian earthquake, but Person was not convinced. After checking with many seismologists in areas from which "felt" reports had been received, however, and finding that no corresponding local shocks had been recorded, Person was forced to agree with his colleagues that an unprecedented phenomenon had occurred; people indeed had felt an earthquake whose focus was as much as 6,000 km distant.

      Apparently only five people lost their lives in the earthquake; damage, though widespread, was minor, occurring in Peru and Brazil. As would be expected, the shock was felt in many parts of Bolivia, Brazil, Chile, Ecuador, and Peru; however, it was felt also in Puerto Rico, Dominica, several U.S. states from coast to coast, and Toronto. In the past 70 years many researchers had found evidence of certain layers in the crust that trap seismic energy as so-called channel waves and carry it, almost undiminished, for long distances and at comparatively slow speed, allowing it to escape slowly along its path. Such findings had been based on aberrant or anomalous seismic readings noted on instrumental records. The unique Bolivian shock finally furnished direct, dramatic evidence of such channel waves.

      Seismic activity through much of 1994 was above average. In addition to the Bolivian shock, several other large earthquakes of magnitude 7.0 or greater occurred around the globe, a number of them involving loss of life. One, of magnitude 7.2 (upgraded from 6.5), rocked the island of Sumatra, Indonesia, on February 16, killing at least 215 people. On June 3 a magnitude-7.7 earthquake (followed by another large shock the following day) struck off the south coast of Java, Indonesia, causing destructive tsunamis (seismic sea waves) and killing more than 200 people. On October 4 an undersea earthquake of magnitude 8.2, with an epicentre east of Hokkaido, Japan, and Russia's southernmost Kuril Islands, killed at least 16 people in the Kurils and caused damage and injuries in northern Japan.
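The magnitudes quoted above sit on a logarithmic scale: radiated seismic energy grows by a factor of roughly 32 per whole magnitude unit (E ∝ 10^(1.5M), the Gutenberg-Richter energy relation). A small sketch showing what the Sumatra shock's upgrade from 6.5 to 7.2 implies:

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of radiated seismic energy between two magnitudes,
    using the Gutenberg-Richter scaling E proportional to 10**(1.5 * M)."""
    return 10 ** (1.5 * (m2 - m1))

# Upgrading the Sumatra shock from magnitude 6.5 to 7.2 implies about
# 11 times more radiated energy than first estimated:
print(round(energy_ratio(6.5, 7.2), 1))  # -> 11.2

# And one full magnitude unit corresponds to a factor of about 31.6:
print(round(energy_ratio(7.0, 8.0), 1))  # -> 31.6
```

This scaling is why the magnitude-8.2 events of 1994 (Bolivia, Kurils) dwarf the more numerous magnitude-7 shocks in energy release despite differing by only about one unit.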

      Other earthquakes that resulted in fatalities include those of January 17 in southern California, where 61 deaths were recorded; June 6 in southwestern Colombia, where hundreds died; August 18 in northern Algeria, where at least 171 were killed; and November 15 in the vicinity of the Philippine island of Mindoro, where the shock and resulting tsunamis killed more than 60 people. The January 17 California quake, having a magnitude of 6.8 and an epicentre in the highly urbanized Northridge area of Los Angeles in the San Fernando Valley, followed four major shocks in 1993. It left more than 9,000 injured and an estimated 20,000 homeless and damaged more than 40,000 buildings. Overpasses collapsed in many places, closing several freeways.

      The international Ocean Drilling Program (ODP) continued the exploration of the crust beneath the world's oceans by means of coring, extraction, and study of rock samples from below the seafloor. One of the more notable recent discoveries resulted from the exploration on ODP Leg 149 of the central portion of the Iberian Abyssal Plain. This ocean-continent transition zone, beneath the Atlantic off the Iberian Peninsula, is one of a conjugate pair, its partner being that found off Newfoundland. They were created when the Iberian Peninsula and Newfoundland, once part of a single landmass, rifted and separated. The rifting apparently was nonvolcanic and resulted in crustal thinning. Magnetic and gravitational data agreed with this interpretation, but the six holes drilled on a west-east transect found not only a thinning crust but also a ridge of mantle rocks 19 km wide. The latter discovery indicated that a break exists between the oceanic crust and the continental crust and that the edge of the latter lies a surprising 200 km west of the continental shelf. The findings suggested that the present models of the breakup of continents needed revision.

      Leg 150, called the New Jersey Sea Level Transect, was designed to help earth scientists reliably recognize past worldwide sea-level changes in rocks formed of sediments laid down from the Oligocene to the Holocene epochs (from about 37 million years ago to the present). Studies of past sea-level changes were focusing on three major periods. These were colloquially dubbed the "Icehouse World" of the Oligocene to Holocene epochs, when ice sheets were known to have existed and to have affected sea levels; the ice-free "Greenhouse World" that existed in the Cretaceous Period prior to 66 million years ago; and the "Doubthouse World" of the intermediate Paleocene and Eocene epochs, a time for which the existence of ice sheets was debated. The area off New Jersey was chosen because previous seismic profiles had shown it to be especially suitable for evaluating the effects of sea-level changes on sedimentation at a continental margin. Cores from four holes sampled sediments from both the Icehouse and Doubthouse periods and corroborated the profile data. Especially interesting was the discovery of a layer of microtektites (tiny glassy objects thought to be associated with meteorite impacts) in two of the cores. The finding correlated with one from a much-earlier deep-sea drilling study in the area and suggested the impact of an extraterrestrial body some 50 million years ago.

      The Norwegian and Greenland seas, a relatively small area of the North Atlantic, have an inordinate influence on the weather patterns of the Northern Hemisphere, owing in large part to the interaction there of north-flowing warmer surface water from the North Atlantic and south-flowing water from the Arctic Ocean through the Fram Strait. Leg 151, which extended from a drilling site (Site 907) midway between Iceland and Jan Mayen Island north to the Yermak Plateau northwest of Spitsbergen, had the objective of determining the history of the Norwegian and Greenland seas, especially with respect to glaciation.

      Interesting artifacts that have helped researchers determine glaciation sequences are dropstones. When a glacier scours the land surface and then moves out to sea, it carries stones with it among the gravels and silt that it has picked up. Then, when the ice breaks off into floating rafts that eventually melt, the rafts drop their loads of stones, which become a signature of the glacier's passing. At Site 907 a 16 million-year sequence of glacial sediments was recovered. Dropstones were deposited as early as 6.4 million years ago, but their occurrence was rare from that time to the present. Sites 908 through 912 were concentrated at the northern end of the transect as far north as the 80th parallel, the most northern sites ever drilled by ODP researchers. Site 913 was located to the south on the oldest oceanic crust east of Greenland, where a penetration of 770 m (2,525 ft) brought up sediments dating back to the Eocene, the oldest obtained in this region. This site also produced an abundance of dropstones from about 2.5 million years ago, in agreement with finds throughout the North Atlantic and North Pacific indicating the beginning of major glaciation. (RUTLAGE J. BRAZEE)

      Effects of the "great flood of 1993" that inundated much of the U.S. Midwest during the summer months of that year lingered through at least early 1994 as observers reported a freshwater "river" in the Atlantic Ocean off the coast of Florida. The flow, which measured 24 km (15 mi) wide and 18 m (60 ft) deep, was the result of the outpouring of the flooded Mississippi River into the Gulf of Mexico. Estimates varied for the length of time that the phenomenon would last, but no one believed that it would fade away before the end of 1994.

      A report published during the year attributed a significant part of a decades-long annual rise of 1.5-2 mm (0.06-0.08 in) in sea level to the long-term accelerated drainage of aquifers, wetlands, and inland seas for human use. Researchers at Ohio State University suggested that as much as one-third of the rise could be due to human activities unassociated with global warming. Those activities included not only the increased drainage of water bodies but also the destruction of forests, which released enormous quantities of water from trees and soil, and the expansion of desert areas.

      One of the Earth's rapidly shrinking bodies of water is the Aral Sea, which by the mid-1990s had lost two-thirds of the water volume that it possessed in 1960. Straddling the boundary between two Central Asian republics of the former Soviet Union, Kazakhstan to the north and Uzbekistan to the south, the Aral Sea was once the world's fourth largest inland body of water. Starved in recent decades by the diversion of its major inflowing rivers for purposes of irrigation, the sea was reduced in surface area to half that of three decades earlier; by the 1990s some one-time seaports were more than 50 km (31 mi) from the water. Five Central Asian countries whose activities affect the Aral Sea—the two aforementioned republics and Kyrgyzstan, Turkmenistan, and Tajikistan—agreed in 1994 to restoration and rehabilitation efforts, although they set no specific targets.

      Another, much smaller body of water was given a new lease on life when court orders imposed a requirement on the city of Los Angeles to reduce its diversions from rivers feeding Mono Lake in California. Water-supply diversions for the city had reduced the volume of the lake to such a point that aquatic life was severely threatened.

      Californians, who had hailed above-average rainfall in 1993 as the end of a six-year drought for the state, were disappointed with a light snowpack in the mountains over the winter of 1993-94. Although reservoirs were filled near capacity at the beginning of the water-use season, the light snowpack discouraged water managers from making confident predictions about the state of future water supplies. In spite of sober predictions, most California cities were reluctant to dust off rationing plans that had been developed during the 1987-92 years of shortage.


      A broad upper-level trough of low pressure over the central and eastern U.S. during January and early February 1994 brought bitterly cold conditions to those parts of the country. The mercury plunged to -37.8° C (-36° F) as far south as Indiana, and several locations across the Ohio Valley and central Appalachians established new all-time record-low temperatures. In sharp contrast, abnormally mild and dry weather prevailed across the Far West during the 1993-94 wet season, with some areas receiving less than 50% of normal precipitation. Snowpack, vital for adequate water supplies during the May-September dry season, ranged from 50% to 80% of normal across the region. During July and August hot, dry weather engendered numerous wildfires across the West. Beginning in late October, however, surplus precipitation fell on most of the Far West, easing concerns of a second straight subnormal wet season.

      In early July Tropical Storm Alberto tracked inland over the Florida Panhandle and stalled over Georgia. As much as 615 mm (24 in) of rain generated widespread severe lowland and river flooding. In mid-November Tropical Storm Gordon pursued an erratic path that took it over Jamaica, Haiti, and Cuba; across southern Florida; and then into the Atlantic, where it looped westward, briefly menacing North Carolina's Outer Banks before drifting back toward Florida as it weakened. The storm killed several hundred people in Haiti and cost Florida an estimated $200 million in damage.

      In South America, flooding claimed dozens of lives in Colombia and Peru in February and forced thousands of individuals to flee their homes. In late June and early July, winter temperatures dipped below freezing as far north as Brazil's Paraná state, damaging the coffee crop. In São Paulo state persistent dryness and heat from August to October cut into Brazil's orange production.

      Frequent storms, heavy snows, and bitter cold afflicted much of Europe in January and February. Excessive precipitation plagued northern Europe through April, while unusually heavy rains also drenched the Middle East, where totals during March and early April were 600-850% of normal. Very dry conditions developed across southern Europe during March; by mid-July hot, dry conditions covered the entire continent. In late September and early October, storms battered the Baltics and southern Scandinavia. On September 28 about 900 lives were lost when a ferryboat sank in rough waters of the Baltic Sea off Finland.

      In February Cyclone Geralda slammed into the island of Madagascar. National officials declared it the "cyclone of the century," with 95% of the main commercial port of Toamasina reportedly destroyed. Geralda and Cyclone Daisy, which struck the island in January, combined to wipe out nearly 300,000 metric tons of the rice crop. In late March Cyclone Nadia crossed Madagascar before striking the African mainland. In Mozambique Nadia left almost 1.5 million people homeless and caused considerable damage to crops, including cashew trees, a major source of income for the nation. Across most of southeastern Africa, unusually wet conditions prevailed during January and February, contributing to an excellent fall harvest. Farther north, copious rains fell on most of the Sahel, resulting in the wettest growing season in 30 years.

      Torrential rains spread across much of southeastern Asia during early April and persisted into early May. By contrast, unusually warm and dry weather developed across Korea, Japan, and northeastern China. Although tropical systems battered parts of Japan in late July and early August, prevailing hot and dry conditions severely depleted reservoirs and damaged crops in many parts of the country. Summer dryness in parts of central China was the worst since 1934, causing widespread crop stress. In July and August rains soaked much of south-central and southeastern China and Southeast Asia. Floods took at least 1,800 lives, nearly 1,000 of which, according to press reports, were lost as Typhoon Fred slammed into southeastern China in mid-August.

      Following an exceptional midyear heat wave that claimed more than 400 lives, the 1994 Indian monsoon brought abundant rainfall, causing episodes of flooding in India and Pakistan. From January through early April, surplus rainfall was measured across most of Indonesia and southern Malaysia, generating periodic flooding in Sumatra and Java. By June, however, extremely dry conditions developed across Indonesia; they persisted through November, abetting wildfires and crop damage across much of the archipelago.

      In January, hot and dry conditions dominated Australia, setting the stage for extensive wildfires across New South Wales, but in February Tropical Storm Sadie brought heavy rains to the Cape York Peninsula, eastern Arnhem Land, and parts of Queensland. Widespread subnormal winter rains combined with early spring dryness to produce serious moisture shortages as the nation's primary agricultural season got under way. According to the Australian Bureau of Meteorology, portions of the southeastern quarter of the continent endured one of the driest April-August periods on record. At year's end large moisture deficits had accumulated across eastern Australia.

      In September El Niño conditions (a periodic appearance of abnormally warm surface waters in the tropical Pacific) developed, and by December they had entered the mature phase. The atmospheric and oceanic changes associated with an El Niño strongly influence temperature and precipitation patterns in various parts of the world. Some effects anticipated for 1994-95 included dryness over northern Australia (September-March), wetness in southeastern South America (November-February), warmth over southeastern Africa (October-April), and coolness along the U.S. Gulf Coast (October-March). (ELBERT W. FRIDAY, JR.)

      This updates the article climate.

      In June and October 1994 two major undersea earthquakes occurred, the first near Indonesia and the second near Japan. Both generated tsunamis, or seismic sea waves. In both cases reports of water running up onto land to heights of three to five metres were common (1 m is about 3.3 ft). In Indonesia many villages near river inlets were destroyed, and at least 200 people lost their lives. Tsunamis have been a recurring natural hazard throughout history. The Minoan civilization on Crete in the Mediterranean Sea was shaken by the combined effects of a volcanic eruption and a tsunami in the 2nd millennium BC, and Lisbon was devastated by a tsunami in 1755.

      Tsunamis are particularly prevalent in the Pacific because of the seismic activity associated with the edges of the Pacific Ocean. Since the water wave of a tsunami travels across the ocean at about 200 m per second (450 mph) whereas seismic waves travel through the solid Earth roughly 20 times faster, tsunami warning systems in operation around the Pacific have been able to issue warnings hours before a tsunami's arrival at distant locations. On the other hand, the ability to predict in advance the actual run-up height or the pattern of sea-level fluctuations after the initial arrival has remained poor. Research in 1994 showed that previously puzzling resurgences of sea level, which sometimes occur many hours after the tsunami has arrived, are likely to be due either to the arrival of waves reflected from the coasts or to waves traveling along the coasts. Research also called attention to the importance of distinguishing between slow and rapid earthquakes. Earthquakes in which the seafloor deforms relatively slowly will not excite strong seismic waves, yet their potential for generating tsunamis may be great. Seismological measurements capable of resolving lower-frequency seismic waves were expected to help identify such earthquakes. The most difficult problem remained that of issuing useful tsunami warnings for locations close to the earthquake centre, where arrival times between earthquake and tsunami may be only a few minutes apart.
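The hours-long warning margin described above follows from simple kinematics. A minimal sketch in Python, using the approximate speeds quoted in the text (the seismic speed is taken as 20 times the tsunami's 200 m per second):

```python
def tsunami_lead_time(distance_km, tsunami_speed_ms=200.0, seismic_speed_ms=4000.0):
    """Return (seismic_arrival_s, tsunami_arrival_s, lead_time_h) for a point
    distance_km from the epicentre, assuming constant propagation speeds."""
    d_m = distance_km * 1000.0
    t_seismic = d_m / seismic_speed_ms   # the quake is detected first
    t_tsunami = d_m / tsunami_speed_ms   # the sea wave arrives much later
    return t_seismic, t_tsunami, (t_tsunami - t_seismic) / 3600.0

t_s, t_t, lead_h = tsunami_lead_time(4000)  # a coast 4,000 km from the epicentre
print(f"seismic: {t_s/60:.0f} min, tsunami: {t_t/3600:.1f} h, lead: {lead_h:.1f} h")
# → seismic: 17 min, tsunami: 5.6 h, lead: 5.3 h
```

The same arithmetic shows why warnings for coasts near the epicentre remain so difficult: at 50 km the gap between seismic detection and tsunami arrival shrinks to a few minutes.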

      During the year oceanographers saw the beginning of near-real-time global observation of the circulation of the world's oceans. Meteorologists long had possessed the ability—by means of satellites and a worldwide system of observing stations—to visualize the state of the atmosphere at any time in detail sufficient to resolve major storms anywhere on the globe. By contrast, oceanographers generally had had to make do with partial pictures of the circulation reconstructed only months or years after the observations were made. It had been known that precise satellite-based altimetric measurements of sea level (to an accuracy of centimetres) had the potential to provide real-time pictures of the surface currents of the oceans. During the late 1980s the U.S. Navy's Geosat mission had collected more than four years of satellite altimetry, but in 1994 about two years of data with an accuracy 5-10 times better became available to oceanographers from the Topex/Poseidon satellite, which was launched in mid-1992. Using these data researchers observed major patterns of surface circulation over time in a way never before possible. Coastal winds appeared to generate theoretically predicted wavelike disturbances in both the middle latitudes and the tropics. The ability to observe such phenomena in a timely way was expected to lead to improved forecasting of the onset of El Niño, the appearance every few years of unusually warm water off the western coast of tropical South America.

      The Topex/Poseidon system in effect makes two measurements. One, by radar, is of the instantaneous distance from satellite to sea surface. The other, based on knowledge of the Earth's gravity field gained from many years of satellite tracking, is of the distance from the satellite to the sea surface as it would be if the ocean were motionless. It is the difference between the two measurements that indicates the presence and strength of ocean currents. When the satellite crosses over strong currents such as the Gulf Stream, that difference may be as great as one metre, but for more gentle currents it is measured in centimetres. Consequently, ocean tides must be predicted and removed from the altimetric signal before currents can be recognized. That necessity resulted in 1994 in the formulation of global models of ocean tides that predict the world tide with an overall accuracy of a few centimetres.
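The differencing described above can be made concrete. In this hypothetical sketch the numeric values are invented for illustration, not actual Topex/Poseidon data, but the arithmetic is the one the paragraph describes:

```python
def dynamic_topography(altimeter_range_m, satellite_altitude_m, geoid_height_m):
    """Sea-surface height above the geoid: the residual that reveals currents.

    altimeter_range_m    -- radar distance from the satellite to the sea surface
    satellite_altitude_m -- satellite height above a reference ellipsoid
    geoid_height_m       -- the sea surface as it would be if the ocean
                            were motionless, relative to the same ellipsoid
    """
    sea_surface_height = satellite_altitude_m - altimeter_range_m
    return sea_surface_height - geoid_height_m

# Crossing a strong current such as the Gulf Stream, the residual can
# approach a metre (all figures below are made-up illustration values):
print(round(dynamic_topography(1_336_000.2, 1_336_045.0, 43.9), 2))  # → 0.9
```

For gentler currents the residual is centimetres, which is why tides, themselves tens of centimetres, must be modeled and subtracted first.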

      Whereas tsunamis and ocean-current systems span ocean basins, it is small-scale water motion—currents that change over centimetres and seconds—that is important in the dilution of pollutants in the ocean or in the mixing of cold deep waters with warm surface waters to form water of an intermediate temperature. The effect of such small-scale motion on the diffusion of heat and salt in the ocean had long been studied theoretically and estimated indirectly, but in 1992 researchers began an experiment to look directly at the way in which a thin patch of an inert tracer substance injected in the eastern subtropical North Atlantic subsequently spread vertically and horizontally. By 1994 the patch had expanded vertically from its initial thickness of a few metres to about 80 m and had stretched from its initial horizontal size of a few kilometres to a sinuous streak several hundred kilometres long. Previous theoretical predictions of the rate of vertical diffusion proved to be accurate; further observation and analysis may give insight into what keeps the streak from getting ever narrower as it lengthens. Such studies of ocean diffusion were important for understanding pollutant dispersal and nutrient distribution in the oceans as well as the role of the oceans in global heat transport.
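The quoted vertical spread permits a back-of-the-envelope diffusivity estimate if the patch is modeled as one-dimensional Fickian diffusion, in which the variance grows linearly with time. This sketch rests on stated assumptions (the quoted thicknesses read as roughly twice the standard deviation, the elapsed time taken as about two years); it is not the experiment's published analysis:

```python
def vertical_diffusivity(initial_m, final_m, elapsed_years):
    """Estimate vertical eddy diffusivity kappa (m^2/s) from tracer-patch
    growth, assuming 1-D diffusion: sigma^2 = sigma0^2 + 2*kappa*t."""
    t_s = elapsed_years * 365.25 * 24 * 3600.0
    # Assumption: read each quoted thickness as about two standard deviations.
    sigma0, sigma = initial_m / 2.0, final_m / 2.0
    return (sigma**2 - sigma0**2) / (2.0 * t_s)

# "A few metres" thick at injection, "about 80 m" some two years later:
kappa = vertical_diffusivity(4.0, 80.0, 2.0)
print(f"kappa ~ {kappa:.1e} m^2/s")  # order 10^-5 m^2/s
```

An answer of order 10⁻⁵ m² per second is consistent with the small open-ocean mixing rates that made such a direct tracer measurement valuable.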


▪ 1994


      In 1993 the U.S. National Academy of Sciences published the report Solid-Earth Sciences and Society, which recommended priorities for future research in the field while delineating the scientific challenges facing modern society. In its outlook the report echoed a theme that was recurring more and more often within the Earth sciences at the international level, namely, the reciprocal relationship between the Earth sciences and society concerning, on the one hand, the response of society to hazardous geologic processes and environmental changes and, on the other hand, the role of industrial society in extracting, using, and discarding materials and thereby changing geologic processes. In discussing priorities the report attempted to reduce the head-on conflict between basic science and societal needs by developing a "research framework" matrix with five major scientific topics set against the understanding of scientific processes and three objectives—resources, hazards, and environmental change. Overall, the report recommended studying processes while viewing the Earth as an integrated, dynamic system rather than as a collection of isolated components divided up among different disciplines.

      A top-priority scientific topic continued to be mantle dynamics. Convection within the Earth's mantle, the slow movement of the Earth's hot, solid outer 2,900 km of rock, represents the Earth's engine at work and is the driving force for many near-surface geologic features. (A kilometre is about 0.62 mi.) The process was being investigated by means of geophysical and geochemical methods and computer models.

      One debate was whether convective motions are mantle-wide, causing mixing through the complete mantle down to the core-mantle boundary at a depth of 2,900 km, or whether they are confined within two discrete layers that remain physically separate, one descending to a depth of 670 km and the other from this depth down to 2,900 km. At 670 km there exists a phase transition (where a less dense rock above is compressed into a more dense rock below) that had been investigated by geochemists in high-pressure laboratory experiments. In 1993 several investigators presented models in which massive transfer of material occurs across the 670-km boundary by means of "periodic flushing" of the upper mantle into the lower mantle. The most detailed were those of Paul Tackley and co-workers of the California Institute of Technology. Their calculations in three-dimensional spherical geometry combined with the phase transition at 670 km depth revealed a flow pattern containing cylindrical plumes and flat sheets. The dynamics are dominated by the accumulation of sheets of downwelling cold material (corresponding to subducted lithospheric slabs) just above 670 km, as the material is not dense enough to penetrate more deeply. When the volume of subducted material reaches a critical amount, it initiates a catastrophic flushing event, which drains the material into the lower mantle in broad cylindrical downwellings to the core-mantle boundary. The downwelling then shuts off completely and does not recur in exactly the same place. There are corresponding hot upwellings. Several flushing events are in progress at different places in the model at the same time.

      Several distinctive rock masses involved in mantle convection have been characterized by the isotopic signatures, i.e., the characteristic patterns of isotopes, of mantle rock fragments (xenoliths) brought to the surface in some lavas. One signature, called HIMU, was believed to represent recycled oceanic crust in the convecting mantle, while a component dubbed EMII was believed to represent enrichment by recycled sediments. During the year Erik H. Hauri of the Woods Hole (Mass.) Oceanographic Institution and co-workers reported that the trace-element patterns of four xenoliths from oceanic islands showed that they had reacted with carbonate-rich melts within the mantle. They concluded that a mechanism must exist for the transport of carbon dioxide through subduction zones and into convecting mantle. David H. Green of Australian National University, Canberra, and colleagues commented that these results "may have provided a critical linking piece in the jigsaw of mantle dynamics," adding that minute concentrations of carbon and hydrogen can exert huge geochemical effects on the melting behaviour of the mantle. Diamond samples containing solid carbon dioxide, which must have become trapped in the diamond at depths of 220-270 km—reported during the year by Marcus Schrauder and Oded Navon of Hebrew University, Jerusalem—could also be explained by the subduction of carbon-containing sediments at least to these depths.

      Whereas the biosphere is linked through the carbon cycle to mantle convection, evolution in the biosphere may be linked to objects from space. The case had been advanced for a few years that the extraterrestrial object responsible for the impact crater at Chicxulub in Mexico's Yucatán Peninsula was also responsible for the mass extinction of dinosaurs and many other creatures 65 million years ago at the end of the Cretaceous Period (denoted in rock strata by the K-T boundary). In 1993 the idea gained support from a reexamination of gravity measurements over the basin by Virgil Sharpton of the Lunar and Planetary Institute, Houston, Texas, and co-workers. They placed the scar of the crater edge at 300 km in diameter, nearly twice as wide as the previous estimate. The figure, if correct, would make the Chicxulub crater the largest impact crater known on Earth and imply an extremely devastating effect on Cretaceous life for the impact. In fact, the catastrophic-impact extinction issue was complex and contained many unresolved problems. One persistent problem was explaining how any animals at all managed to survive a catastrophe of such magnitude.

      A new method of satellite radar interferometry was providing researchers with insights into the processes accounting for recent evidence that the Antarctic ice sheets formed and collapsed several times during the past few million years. During the year Richard Goldstein and colleagues of the Jet Propulsion Laboratory, Pasadena, Calif., applied the method to the study of fast-moving ice streams in Antarctica. A pair of radar images taken a few days apart provided a diagram that directly displayed relative surface motions for the time interval between images, with detection limits of 1.5 mm (0.06 in) for vertical motions and 4 mm (0.16 in) for horizontal motions. This information permitted measurements of rates of ice flow and mapping (with a resolution of 0.5 km) of the "grounding line," i.e., the limit of ice lying on bedrock, since ungrounded ice is revealed by vertical motions of about two metres owing to tidal uplift. (A metre is about 3.3 ft.)
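The millimetre-scale detection limits quoted above stem from the radar's wavelength: interferometric phase converts to line-of-sight displacement at half a wavelength per full fringe. A minimal sketch; the ~5.7-cm C-band wavelength is an assumption typical of ERS-1-class radars, not a figure given in the text:

```python
import math

C_BAND_WAVELENGTH_M = 0.057  # assumed ~5.7 cm, typical of ERS-1-class radars

def los_displacement(phase_rad, wavelength_m=C_BAND_WAVELENGTH_M):
    """Line-of-sight displacement from interferometric phase: one 2*pi
    fringe equals half a wavelength of motion toward or away from the radar,
    because the radar path to the target and back changes by twice the motion."""
    return phase_rad * wavelength_m / (4.0 * math.pi)

# One full fringe corresponds to about 28.5 mm of line-of-sight motion:
print(f"{los_displacement(2 * math.pi) * 1000:.1f} mm")  # → 28.5 mm
```

Since a small fraction of a fringe is measurable, millimetre sensitivity follows, which is what makes the two-metre tidal flexing at the grounding line so conspicuous in the interferograms.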

      A possible link between the Antarctic fast ice streams and volcanoes, described during the year by Donald Blankenship of the University of Texas at Austin and Robin Bell of Lamont-Doherty Earth Observatory, Palisades, N.Y., suggested that volcanoes may affect the global climate in more than one way. The familiar atmospheric effect of volcanoes is exemplified by the millions of tons of sulfur dioxide, other gases, and dust lofted into the upper atmosphere by the 1991 eruption of Mt. Pinatubo in the Philippines—emissions that were being monitored and evaluated for their effects on global temperatures and the ozone hole. Blankenship and Bell identified an active volcano having a peak about 1.5 km beneath the ice near the head of one of the five fast-moving ice streams flowing from the centre of the West Antarctic Ice Sheet into the Ross Ice Shelf, which is afloat offshore from the grounding line. Aerial surveys across a circular depression in the ice measured its surface and thickness, and measurements of gravity and magnetic field combined with radar mapping of the ground underneath the ice sheet revealed a cone rising about 650 m above surrounding bedrock. The surface depression, about 50 m deep and 6 km in diameter, represents ice that has been melted. It was inferred that this meltwater softens the glacial sediments beneath the ice (the effect had been detected by seismology a few years earlier and confirmed by direct drilling in 1990). The subglacial layer of water-logged sediment lubricates the ice stream (50 km wide, 1 km deep, 500 km long), which is moving about 100 times as fast—up to two metres per day—as the adjacent ice sheet.

      Five major ice streams make up about 90% of the outflow from the ice sheet, and their behaviour is critical to the stability or catastrophic collapse and melting of the West Antarctic Ice Sheet. If heat from subglacial volcanoes increases the flow rates, causing retreat of the grounding line, then ice presently locked onto bedrock would be freed, perhaps leading to accelerated flow and disintegration of the ice sheet. Such a collapse would raise the sea level by about six metres, flooding many of the world's heavily populated cities.


      Sept. 30, 1993, marked the end of the 10-day festival in India honouring Ganesa, god of good fortune and new beginnings. Thousands of villagers in the southern Deccan Plateau fell into bed exhausted from the revelry; they had only hours to live. Shortly before 4 AM an earthquake of magnitude 6.4 turned thousands of mud-brick dwellings to dust and rubble, burying the inhabitants and killing more than 9,700. The epicentre was located between the major cities of Bombay and Hyderabad, nearly equidistant from the Arabian Sea and the Bay of Bengal. It was the most destructive shock to hit the region in 58 years, almost totally demolishing the villages of Killari, Latur, and Umarga.

      One great earthquake, i.e., an earthquake having a magnitude of 8 or greater, occurred during the year. The shock, of magnitude 8.0, struck south of Guam in the Mariana Islands on August 8, injuring 48 and causing minor damage in the centre of the island. On July 12 an earthquake of magnitude 7.8 rocked northern Japan. The quake and consequent tsunamis (seismic sea waves) killed at least 185 persons; the island of Okushiri, especially hard hit, was virtually destroyed. Residents of Klamath Falls, Ore., were surprised in mid-September by the first tremors ever recorded in the region. The activity consisted of a magnitude-5.8 shock and several large aftershocks, one of magnitude 5.5.

      Several volcanic events resulted in tragedies. On February 2 the Mayon Volcano in the Philippines erupted in a series of explosions, culminating in the largest on February 12. The first blast was unexpected and sent a pyroclastic flow six kilometres down the Bonga Gully, where it spread over the fan deposited in the 1984 eruption, killing 68 persons and prompting the evacuation of tens of thousands. (One kilometre is about 0.62 mi.) Three main explosions produced towering ash clouds, the first and largest rising to 4.5 km.

      The Galeras Volcano, only eight kilometres from Pasto, Colombia, a city of 300,000, has been the most active volcano in South America for the past 500 years. Accordingly it was chosen as the only South American volcano to be included in the UN International Decade of Natural Disaster Reduction program. In January a workshop comprising 50 scientists from Colombia and 40 scientists from 14 other countries was convened. Part of its program included field studies in which lava, gas, and rock samples were to be taken from the crater and temperatures, seismic activity, and other phenomena monitored. On January 14, while several scientists were in the crater and several more were on the rim, the volcano exploded, killing six; three tourists also died from the blast. In Ecuador on March 12 two volcanologists who had ascended the dome of Guagua Pichincha were killed instantly by a strong explosion.

      The international Ocean Drilling Program (ODP) continued to explore the sea bottom and subsurface oceanic structures. On Leg 143 the scientific drilling ship JOIDES Resolution sailed from Honolulu westward above the submerged Mid-Pacific Mountains to a point approximately 18° N latitude, 180° longitude, where it occupied the first of six sites on its itinerary. The purpose of the expedition was to extract core samples of guyots and thereby discover the origin of these underwater mesas. In the 19th century Charles Darwin had outlined what he believed to be the evolutionary sequence of events leading to the formation of guyots. He postulated a progression from a volcanic island, which became surrounded by a coral reef, to the gradual erosion of the central island to leave an atoll encompassing a shallow lagoon. Modern researchers went one step further, theorizing that the lagoon gradually silts up and sinks beneath the surface as a flat-topped guyot.

      The ODP team drilled two deep holes along with several shallow ones at the first site, called Allison Guyot. Cores of seafloor were obtained to a depth of 870 m through overlying limestone to a layer of abundant plant and marine-animal debris, indicating that the layer was once a marsh and reinforcing Darwin's hypothesis. (One metre is about 3.3 ft.) The next site, located about 21° N latitude, 175° E longitude, was a formation named Resolution Guyot after the drilling ship and its crew. There drilling established a single-leg depth record with a hole cored to 1,743.6 m through limestone and volcanic basalts. Two other sites were cored on the perimeter of Resolution Guyot in search of the expected reef, but none was found, suggesting an evolution different from that of Allison Guyot.

      Hess Deep is located at the western extremity of the seafloor spreading centre between the Nazca and Cocos tectonic plates north of the Galápagos Islands in the eastern Pacific. It is notable because at this site the Mohorovicic discontinuity (Moho), the boundary between the Earth's crust and upper mantle, lies only a few hundred metres beneath the ocean bottom. On Leg 147 of the ODP, 13 holes were drilled and cores obtained that traversed the Moho with penetrations of less than 300 m. This core material was especially important because it represented the first direct evidence obtained from a fast-spreading mid-ocean ridge. Leg 147 was the first voyage of several to be made during the 1993-94 season in a coordinated effort to investigate the lower crust and upper mantle.

      A number of organizations around the globe with goals similar to those of the ODP were exploring the subsurface structure of the continents. One, called COCORP (for Consortium for Continental Reflection Profiling) and begun in 1975 at Cornell University, Ithaca, N.Y., was funded principally by the U.S. National Science Foundation and included participants from universities, government agencies, and industry. As of 1993 it had assembled 12,000 km of seismic reflection data from across the U.S., in some areas delineating the Moho and noting its varying depths, elsewhere tracing faults to great depth and even discovering seismic reflectors in the mantle below the Moho. Since its beginnings COCORP had stimulated similar quests in as many as 30 other countries, including Canada, the U.K., France, Australia, Germany, and China. (RUTLAGE J. BRAZEE)

      For the U.S. the biggest news in hydrology during 1993 was the midyear flooding in the Midwest. Stalled weather patterns in the early and middle parts of the year produced long-term heavy rains over much of the Dakotas, Minnesota, Wisconsin, Iowa, Nebraska, Kansas, and parts of Illinois, Missouri, Colorado, and Wyoming. Coming on top of wet soils, the rains resulted in flows on the Mississippi River from April to July that broke records dating to the late 19th century. (See Meteorology, below.)

      The drought that had dogged the western U.S. came to a dramatic end over much of the area during the winter of 1992-93. High rains and a heavy snowpack in the mountains promised relief as the spring progressed. Early rains were as much as twice normal, and the snowpack in the Sierra Nevada range stood at its highest level in 50 years. Salt Lake City, Utah, reported record-high snowfall, and Yuma, Ariz., recorded rain at 840% of normal. Early in the year jubilant water officials in thirsty Los Angeles announced the end of seven years of water rationing. The end of the drought was not an unmixed blessing, however. The high rains early in the winter and melting snow in the southern Rocky Mountains later resulted in high waters and flooding in parts of Baja California and the southwestern U.S.

      Heavy weather caused floods in China and Southeast Asia. Flooding that resulted from torrential summer rains in south-central China cut off road transport in the mountains. In a band from India's Punjab region across Nepal and into Bangladesh, monsoon rains in the latter part of the year brought swollen rivers and floods to low-lying lands in the major river basins. In India the overflowing Ravi and Beas rivers washed away bridges and stretches of highway. Although monsoon rains are a normal part of the mid- and late-year weather pattern, they were reported in some places to have hit with a ferocity not seen in decades.

      Water-management activities during the year ranged from considerations of flood control to hydroelectric power production. After the floods in the U.S. Midwest had amply demonstrated that levees prevent floodplains from performing their natural flood-control function, the federal administration directed the U.S. Army Corps of Engineers to evaluate alternatives to levees for flood control in future planning. In the wake of flooding in India, the national government was excoriated in the press for "continuous neglect of flood prevention projects." China broke ground for what was to be the world's largest dam. Although not holding the largest reservoir or having the greatest height, the Three Gorges Dam on the Chang Jiang (Yangtze River) would be the largest hydropower producer in the world when it was finished in about a decade. Flood control was the major argument posed in favour of the structure, while siltation was expected to be the largest potential problem once the project had been completed.

      Remote-sensing images taken by an Earth-orbiting satellite revealed a dry riverbed 850 km (530 mi) in length buried beneath the sands of Saudi Arabia and Kuwait. Segments of the channel had been noted previously as dry-streambed depressions known as wadis, but dunes cutting across the area had masked their identity as part of a single river system. According to Farouk El-Baz of Boston University, the so-called Kuwait River, which begins in Saudi Arabia's Hijaz Mountains, last flowed 5,000 to 11,000 years ago when the region experienced a wet period. Because the riverbed follows a geologic fault, the underlying rock might still contain water that could be accessed with wells driven hundreds of metres deep.


      A combination of factors appeared to have contributed to the atmospheric circulation pattern responsible for 1993's extreme warm-season weather conditions over parts of the Northern Hemisphere. A prolonged El Niño/Southern Oscillation (ENSO) event (a pattern of anomalous oceanic and atmospheric behaviour in the tropical Pacific that appears every few years), which had begun in 1991, may have combined with natural climatic variability to produce the unusually strong and persistent upper-air pattern that dominated the April-September weather across North America and led, most disastrously, to copious rains and extensive flooding across the U.S. Midwest.

      Prior to these persistent anomalies, the 1992-93 wet season in the far West provided excess precipitation that finally ended the long-term (1986-92) drought in California. (See Hydrology, above.) In the East a mid-March "storm of the century" dumped up to 60 cm (2 ft) of snow on the extreme southern Appalachians northeastward into lower elevations of the mid-Atlantic, where blizzard conditions were widespread but relatively short-lived. Prolonged blizzard conditions ranged from the south-central Appalachians northeastward across most of New England, where 60-150-cm (2-5-ft) snowfalls were common. At least 238 lives were lost in the storm, and an estimated $1 billion in property damage occurred.

      The floods in the U.S. Midwest were preceded over much of the eastern half of the country by months of surplus precipitation, which saturated soils and set up high streamflow levels. That situation, combined with heavy spring and summer rainfall, created severe flooding throughout the northern half of the Mississippi drainage basin. Some locations in Iowa, Kansas, and Missouri measured more rain from April through July than normally fell in a full year. Many reservoirs overflowed; over two-thirds of the region's levees were overtopped or breached; and severe lowland flooding ensued. At some locations the Mississippi expanded to a width of nearly 11 km (7 mi) and the Missouri to 32 km (20 mi), while the confluence of the Mississippi and Missouri rivers shifted 32 km upstream of its previous position. Some 942 km (585 mi) of the Mississippi and the lower 861 km (535 mi) of the Missouri were closed to navigation for several weeks. At least 50 lives were lost owing to the flooding, and damages were estimated to be at least $12 billion.

      In late October and early November, two waves of wildfires, many of them set by arsonists, raced across southern California, fueled by an abundance of dead timber and brush from six years of drought and driven by strong Santa Ana winds gusting as high as 113 km/h (70 mph). All told, the fires scorched at least 61,500 ha (152,000 ac), destroyed more than 1,000 homes, took 3 lives, and injured more than 150 people.

      The 1993 Atlantic and Caribbean hurricane season adversely affected parts of northern South America, the Caribbean, and Central America. During early August resilient Tropical Storm Bret generated severe flooding across Colombia, Venezuela, Nicaragua, Costa Rica, and Honduras, killing hundreds of people and leaving thousands homeless. During mid-September heavy rains once again fell on parts of Central America, this time from Tropical Storm Gert. Thousands were left homeless from flooding, and dozens of lives were lost in Nicaragua and Honduras. As Gert emerged over the Gulf of Mexico, it strengthened into a hurricane and made landfall near Tuxpan, Mexico, with gusts to 200 km/h (120 mph). As much as 400 mm (16 in) of rain inundated the Mexican state of San Luis Potosí, producing severe flooding and mud slides that left over 100,000 individuals homeless and at least 14 dead.

      Heavy precipitation also drenched much of central South America for the first two months of the year, although prolonged dryness continued in northeastern Brazil. By March sizable moisture deficits had spread through Paraguay, Uruguay, and Argentina. In contrast, unusually heavy April rains fell on Ecuador and northern Peru and covered most of central South America during May, alleviating the aforementioned moisture shortages.

      Much of southern Europe and the Mediterranean began the year with very dry conditions, with some areas receiving only 10-30% of normal precipitation during the 1992-93 winter. Heavy April and May rains alleviated dryness in western Europe, but a dry July and August brought renewed moisture deficits to the region. In September and early October copious rains pelted southern Switzerland, southern France, and northern Italy, causing floods that took more than a dozen lives. In Greece, however, inadequate long-term rainfall dropped reservoir levels near Athens to dangerously low levels.

      Through late July a subnormal rainy season dominated large sections of sub-Saharan Africa. In the next two months rainfall across the northern tier of the western Sahel increased significantly, with the greatest improvement across northern Senegal and southwestern Mauritania. Farther south, moisture deficits persisted throughout the rainy season across southern sub-Saharan Africa. A favourably moist rainy season through late July deteriorated during August and September across the eastern Sahel, leaving below-normal seasonal rainfall amounts in most areas. In southern Africa heavy precipitation at the start of the 1992-93 summer rainy season eased many of the drought-related effects from the previous year, but renewed moisture shortages were observed through much of the region late in the season. In late September and early October, however, heavy rains soaked southeastern sections of the region, providing a favourable start to the 1993-94 rainy season.

      Monsoon rains generally began on schedule and in abundance across the Indian subcontinent. Heavy rains in late July and early August caused severe flooding in parts of India, Nepal, and Bangladesh. The floods were one of the worst disasters on record in Nepal, with damage estimates of $20 million and possibly 3,000 deaths. Torrential warm-season rains also inundated China, Korea, and Japan. Fourteen tropical storms, most of which became typhoons, hit Japan; more than 2,500 mm (100 in) of rain inundated parts of Kyushu from June through August. Meanwhile, subnormal rainfall threatened crop production in Taiwan, and low reservoir levels forced the rationing of hydroelectric power.

      Cyclone Kina, the worst storm to strike Fiji in 57 years, caused considerable damage to the South Pacific island nation in early January. In the western Pacific, Typhoon Koryn battered the Philippine island of Luzon in late June, abruptly ending a two-month dry spell and engendering landslides and floods. In early October, Tropical Storm Flo, the 25th storm to hit the Philippines in 1993, dumped copious rains on northern Luzon, damaging up to 10% of the rice crop and taking dozens of lives. In Australia the year commenced with very wet weather across New South Wales and Victoria, while large moisture deficits accumulated across Queensland through most of the 1992-93 austral summer rainy season. Torrential late January rains across northern Australia and southwestern Indonesia created flooding and forced a quarter of a million people to flee their homes. Beginning in September heavy early-season rains covered eastern and southeastern Australia and continued through October.

      This updates the article climate.

      In 1993 the World Ocean Circulation Experiment (WOCE) neared the midpoint of its 1990-97 program of observations intended to span entire ocean basins. Planning for WOCE began in the early 1980s when researchers realized that changes in ocean circulation might hold the key to predicting climate. One example of the new results that were emerging from the experiment related to the Pacific-wide distribution of carbon-14.

      Cosmic rays from space continually convert a very small amount of the nitrogen-14 present in the atmosphere into the radioactive isotope carbon-14 (14C). The half-life of 14C—the time it takes half the atoms in a given sample to decay—is about 5,730 years. A buried or otherwise isolated sample of carbon that has been out of contact with the atmosphere for several thousand years thus will have much less 14C than a sample in contact with the atmosphere. The age of the isolated sample can be determined through measurement of its 14C content. Oceanographers use 14C measurements to determine the time that waters below the surface of the ocean have been away from the atmosphere. Some of the more interesting WOCE results of 1993 concerned such measurements in the Pacific Ocean.
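The dating relation implied here is simple exponential decay: a sample retaining a fraction f of its original 14C has been isolated for roughly 5,730 × log2(1/f) years. The following minimal sketch illustrates the arithmetic; the function names and the assumption of an unchanged atmospheric 14C level are illustrative, not part of the WOCE methodology described above.

```python
import math

HALF_LIFE_C14 = 5730.0  # years, the half-life cited in the text

def radiocarbon_age(fraction_remaining):
    """Years of isolation implied by the fraction of 14C remaining,
    assuming simple exponential decay from an unchanged starting level."""
    return HALF_LIFE_C14 * math.log2(1.0 / fraction_remaining)

def fraction_after(years):
    """Inverse relation: fraction of 14C left after a given isolation time."""
    return 0.5 ** (years / HALF_LIFE_C14)

# Deep north Pacific water, away from the atmosphere for ~1,500 years,
# should retain about 83% of its original 14C:
print(round(fraction_after(1500), 2))   # → 0.83
print(radiocarbon_age(0.834))           # within a few years of 1,500
```

In practice the inference runs in the second direction: a measured 14C deficit in a water sample is converted into an apparent "ventilation age," which is how the 1,500-year figure for deep Pacific water is obtained.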

      On the basis of 14C content, researchers believe that the deep water of the north Pacific has been away from the atmosphere for about 1,500 years. This water is a mixture of waters that were last at the surface around Antarctica or, even farther away, in the far north Atlantic. The traditional view of Pacific deep circulation is that the oldest water (the water below the surface for the longest time) is to be found deep in the northwestern corner of the Pacific, but WOCE 14C measurements during the year surprisingly changed this picture. The oldest Pacific waters were found at depths of thousands of metres (but not at the bottom) in two east-west transpacific bands about 1,000 km (620 mi) wide, one on either side of the equator. The water in the very northern part of the Pacific is not the oldest; its 14C content suggested that it had been in contact with the atmosphere more recently than that in the transpacific bands.

      The term El Niño refers to a recurring event in which the cold, nutrient-rich waters off the west coast of South America are replaced by warmer, relatively nutrient-poor water, with consequent catastrophic failure of coastal fisheries. Researchers gradually realized that El Niño is but one part of a Pacific-wide pattern of oceanic and atmospheric change now called the El Niño/Southern Oscillation (ENSO). Predicting ENSO events is of global economic importance. A number of researchers had successfully predicted the 1986 and 1991 events, but predictions made in the fall of 1993 ranged widely, from another El Niño to an abnormally cold east Pacific.

      One problem in developing predictive El Niño models has been that, because ENSO events typically occur only once or twice a decade, historical meteorologic records cover a fairly small number of events. Typically, ENSO events include abnormally intense rainfall at equatorial Pacific islands. During the year researchers reported that the concentration of the isotope oxygen-18 in a core of coral grown over the previous 96 years at an island in the west Pacific mirrors the index of rainfall over the central Pacific. Condensation of water vapour during atmospheric convection fractionates the oxygen isotopes by mass, and the rain delivered by intense convection is depleted in the heavier 18O; consequently, the 18O content of the ocean surface water, and hence of corals growing in it, is lower during times of abnormally intense rainfall. The coral record may actually be a better measure of rainfall averaged over the tropical Pacific than would be an island rain-gauge record because ocean currents cause the 18O content of the coral to reflect rainfall conditions over a broad region rather than just where the coral grows. Such work was expected to allow researchers to look back over many more ENSO events to see if their frequency and duration have changed over time.

      Relaxation of Cold War tensions provided oceanographers with an unexpected new source of data. They gained access to the U.S. Navy's global acoustic undersea surveillance system, originally designed to detect and track submarines, in order to listen for signals as diverse as whale vocalizations and seafloor volcanoes and earthquakes. The global coverage afforded by this system would provide whale researchers with a basin-scale picture of numbers and locations of whales at any given time. Earth scientists would enjoy greatly increased ability to monitor seismic activity under the ocean, particularly the frequent but relatively low-level activity that is believed to occur along with volcanism at ocean-ridge crests, the sites of seafloor spreading.

      Seafloor earthquakes sometimes generate extremely destructive ocean waves called tsunamis. Because seismic waves travel faster through the Earth's crust than do the water waves of the tsunami, researchers who monitor the world for earthquakes on the seafloor or near the coast often can warn coastal residents of a possible tsunami several hours or more in advance. But they cannot tell with certainty whether a particular earthquake has, in fact, generated a large tsunami. In 1993 researchers suggested that the traditional measure of earthquake magnitude underestimates the size of those earthquakes that release their energy relatively slowly and thus have hidden potential for generating tsunamis. They argued that the Nicaraguan earthquake of Sept. 2, 1992, which generated only mild ground motions at the coast but was followed by large tsunami waves, was one such slow earthquake, and they noted similar historical occurrences around the Pacific. Their work suggested that a change in the way earthquakes are monitored could provide more certain tsunami warnings than are presently available.
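The timing argument can be made concrete. Seismic waves cross the crust at several kilometres per second, whereas a tsunami in the open ocean travels at the shallow-water wave speed √(gh), only about 200 m/s over a 4-km-deep basin. The sketch below illustrates the resulting warning margin; the depth, distance, and wave-speed figures are illustrative assumptions, not values from the research described above.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed sqrt(g*h), valid when the wavelength
    greatly exceeds the ocean depth, as it does for tsunamis."""
    return math.sqrt(G * depth_m)

def warning_margin_s(distance_km, ocean_depth_m=4000.0, seismic_kms=8.0):
    """Seconds between seismic detection and tsunami arrival at a coast
    `distance_km` from the epicentre (idealized straight-line paths)."""
    seismic_t = distance_km * 1000.0 / (seismic_kms * 1000.0)
    tsunami_t = distance_km * 1000.0 / tsunami_speed(ocean_depth_m)
    return tsunami_t - seismic_t

# Over 4,000 m of water a tsunami moves at about 198 m/s (~713 km/h);
# a coast 1,000 km from the epicentre gets roughly 1.4 hours of warning.
print(round(tsunami_speed(4000)))                 # → 198
print(round(warning_margin_s(1000) / 3600, 1))    # → 1.4 (hours)
```

The margin grows with distance, which is why the article speaks of "several hours or more" for remote coastlines; for an earthquake just offshore the margin shrinks toward zero, making rapid discrimination of slow, tsunami-generating earthquakes all the more valuable.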


      This updates the articles atmosphere; dinosaur; Earth; Earth sciences; earthquake; geochronology; hydrosphere; ocean; plate tectonics; river; volcano.

* * *


      the fields of study concerned with the solid Earth, its waters, and the air that envelops it. Included are the geologic, hydrologic, and atmospheric sciences.

      The broad aim of the Earth sciences is to understand the present features and the past evolution of the Earth and to use this knowledge, where appropriate, for the benefit of humankind. Thus the basic concerns of the Earth scientist are to observe, describe, and classify all the features of the Earth, whether characteristic or not, to generate hypotheses with which to explain their presence and their development, and to devise means of checking opposing ideas for their relative validity. In this way the most plausible, acceptable, and long-lasting ideas are developed.

      The physical environment in which humans live includes not only the immediate surface of the solid Earth, but also the ground beneath it and the water and air above it. Early man was more involved with the practicalities of life than with theories, and thus his survival depended on his ability to obtain metals from the ground to produce, for example, alloys, such as bronze from copper and tin, for tools and armour, to find adequate water supplies for establishing dwelling sites, and to forecast the weather, which had a far greater bearing on human life in earlier times than it has today. Such situations represent the foundations of the three principal component disciplines of the modern Earth sciences.

      The rapid development of science as a whole over the past century and a half has given rise to an immense number of specializations and subdisciplines, with the result that the modern Earth scientist, perhaps unfortunately, tends to know a great deal about a very small area of study but only a little about most other aspects of the entire field. It is therefore very important for the lay person and the researcher alike to be aware of the complex interlinking network of disciplines that make up the Earth sciences today, and that is the purpose of this article. Only when one is aware of the marvelous complexity of the Earth sciences and yet can understand the breakdown of the component disciplines is one in a position to select those parts of the subject that are of greatest personal interest.

      It is worth emphasizing two important features that the three divisions of the Earth sciences have in common. First is the inaccessibility of many of the objects of study. Many rocks, as well as water and oil reservoirs, are at great depths in the Earth, while air masses circulate at vast heights above it. Thus the Earth scientist has to have a good three-dimensional perspective. Second, there is the fourth dimension: time. The Earth scientist is responsible for working out how the Earth evolved over millions of years. For example, what were the physical and chemical conditions operating on the Earth and the Moon 3,500,000,000 years ago? How did the oceans form, and how did their chemical composition change with time? How has the atmosphere developed? And finally, how did life on Earth begin, and from what did man evolve?

      Today the Earth sciences are divided into many disciplines, which are themselves divisible into six groups:
● Those subjects that deal with the water and air at or above the solid surface of the Earth. These include the study of the water on and within the ground (hydrology), the glaciers and ice caps (glaciology), the oceans (oceanography), the atmosphere and its phenomena (meteorology), and the world's climates (climatology). In this article such fields of study are grouped under the hydrologic and atmospheric sciences and are treated separately from the geologic sciences, which focus on the solid Earth.
● Disciplines concerned with the physical-chemical makeup of the solid Earth, which include the study of minerals (mineralogy), the three main groups of rocks (igneous, sedimentary, and metamorphic petrology), the chemistry of rocks (geochemistry), the structures in rocks (structural geology), and the physical properties of rocks at the Earth's surface and in its interior (geophysics).
● The study of landforms (geomorphology), which is concerned with the description of the features of the present terrestrial surface and an analysis of the processes that gave rise to them.
● Disciplines concerned with the geologic history of the Earth, including the study of fossils and the fossil record (paleontology), the development of sedimentary strata deposited typically over millions of years (stratigraphy), and the isotopic chemistry and age dating of rocks (geochronology).
● Applied Earth sciences dealing with current practical applications beneficial to society. These include the study of fossil fuels (oil, natural gas, and coal); oil reservoirs; mineral deposits; geothermal energy for electricity and heating; the structure and composition of bedrock for the location of bridges, nuclear reactors, roads, dams, and skyscrapers and other buildings; hazards involving rock and mud avalanches, volcanic eruptions, earthquakes, and the collapse of tunnels; and coastal, cliff, and soil erosion.
● The study of the rock record on the Moon and the planets and their satellites (astrogeology). This field includes the investigation of relevant terrestrial features—namely, tektites (glassy objects resulting from meteorite impacts) and astroblemes (meteorite craters).

      With such intergradational boundaries between the divisions of the Earth sciences (which, on a broader scale, also intergrade with physics, chemistry, biology, mathematics, and certain branches of engineering), researchers today must be versatile in their approach to problems. Hence, an important aspect of training within the Earth sciences is an appreciation of their multidisciplinary nature.

Brian Frederick Windley

Origins in prehistoric times
      The origins of the Earth sciences lie in the myths and legends of the distant past. The creation story, which can be traced to a Babylonian epic of the 22nd century BC and which is told in the first chapter of Genesis, has proved most influential. The story is cast in the form of Earth history and thus was readily accepted as an embodiment of scientific as well as of theological truth.

      Earth scientists later made innumerable observations of natural phenomena and interpreted them in an increasingly multidisciplinary manner. The Earth sciences, however, were slow to develop largely because the progress of science was constrained by whatever society would tolerate or support at any one time.


Geologic sciences (geology)
Knowledge of Earth composition and structure
      The oldest known treatise on rocks and minerals is the De lapidibus (“On Stones”) of the Greek philosopher Theophrastus (c. 372–c. 287 BC). Written probably in the early years of the 3rd century BC, this work remained the best study of mineral substances for almost 2,000 years. Although reference is made to some 70 different materials, the work is more an effort at classification than systematic description.

      In early Chinese writings on mineralogy, stones and rocks were distinguished from metals and alloys, and further distinctions were made on the basis of colour and other physical properties. The speculations of Cheng Ssu-hsiao (died AD 1332) on the origin of ore deposits were more advanced than those of his contemporaries in Europe. In brief, his theory was that ore is deposited from groundwater circulating in subsurface fissures.

      Ancient accounts of earthquakes and volcanic eruptions are sometimes valuable as historical records but tell little about the causes of these events. Aristotle (384–322 BC) and Strabo (64 BC–c. AD 21) held that volcanic explosions and earthquakes alike are caused by spasmodic motions of hot winds that move underground and occasionally burst forth in volcanic activity attended by Earth tremors. Classical and medieval ideas on earthquakes and volcanoes were brought together in William Caxton's Mirrour of the World (1480). Earthquakes are here again related to movements of subterranean fluids. Streams of water in the Earth compress the air in hidden caverns. If the roofs of the caverns are weak, they rupture, causing cities and castles to fall into the chasms; if strong, they merely tremble and shake from the heaving by the wind below. Volcanic action follows if the outburst of wind and water from the depths is accompanied by fire and brimstone from hell.

      The Chinese have the distinction of keeping the most faithful records of earthquakes and of inventing the first instrument capable of detecting them. Records of the dates on which major quakes rocked China date to 780 BC. To detect quakes at a distance, the mathematician, astronomer, and geographer Chang Heng (AD 78–139) invented what has been called the first seismograph.

Knowledge of Earth history
      The occurrence of seashells embedded in the hard rocks of high mountains aroused the curiosity of early naturalists and eventually set off a controversy on the origin of fossils that continued through the 17th century. Xenophanes of Colophon (flourished c. 560 BC) was credited by later writers with observing that seashells occur “in the midst of earth and in mountains.” He is said to have believed that these relics originated during a catastrophic event that caused the Earth to be mixed with the sea and then to settle, burying organisms in the drying mud. For these views Xenophanes is sometimes called the father of paleontology.

      Petrified wood was described by Chinese scholars as early as the 9th century AD, and around 1080 Shen Kua (Shen Kuo) cited fossilized plants as evidence for change in climate. Other kinds of fossils that attracted the attention of early Chinese writers include spiriferoid brachiopods (“stone swallows”), cephalopods, crabs, and the bones and teeth of reptiles, birds, and mammals. Although these objects were commonly collected simply as curiosities or for medicinal purposes, Shen Kua recognized marine invertebrate fossils for what they are and for what they imply historically. Observing seashells in strata of the T'ai-hang Shan range, he concluded that this region, though now far from the sea, must once have been a shore.

Knowledge of landforms and of land–sea relations
      Changes in the landscape and in the position of land and sea related to erosion and deposition by streams were recognized by some early writers. The Greek historian Herodotus (c. 484–c. 426 BC) correctly concluded that the northward bulge of Egypt into the Mediterranean is caused by the deposition of mud carried by the Nile.

      The early Chinese writers were not outdone by the Romans and Greeks in their appreciation of changes wrought by erosion. In the Chin shu (“History of the Chin Dynasty”), it is said of Tu Yü (AD 222–284) that when he ordered monumental stelae to be carved with the records of his successes, he had one buried at the foot of a mountain and the other erected on top. He predicted that in time they would likely change their relative positions, because high hills would become valleys and deep valleys would become hills.

      Aristotle guessed that changes in the position of land and sea might be cyclical in character, thus reflecting some sort of natural order. If the rivers of a moist region should build deltas at their mouths, he reasoned, seawater would be displaced and the level of the sea would rise to cover some adjacent dry region. A reversal of climatic conditions might cause the sea to return to the area from which it had previously been displaced and retreat from the area previously inundated. The idea of a cyclical interchange between land and sea was elaborated in the Discourses of the Brothers of Purity, a classic Arabic work written between AD 941 and 982 by an anonymous group of scholars at Basra (Iraq). The rocks of the lands disintegrate and rivers carry their wastage to the sea, where waves and currents spread it over the seafloor. There the layers of sediment accumulate one above the other, harden, and, in the course of time, rise from the bottom of the sea to form new continents. Then the process of disintegration and leveling begins again.

Hydrologic and atmospheric sciences
      The only substance known to the ancient philosophers in its solid, liquid, and gaseous states, water is prominently featured in early theories about the origin and operations of the Earth. Thales of Miletus (c. 624–c. 545 BC) is credited with a belief that water is the essential substance of the Earth, and Anaximander of Miletus (c. 610–545 BC) held that water was probably the source of life. In the system proposed by Empedocles of Agrigentum (c. 490–430 BC), water shared the primacy Thales had given it with three other elements: fire, air, and earth. The doctrine of the four earthly elements was later embodied in the universal system of Aristotle and thereby influenced Western scientific thought until late in the 17th century.

Knowledge of the hydrologic cycle
      The idea that the waters of the Earth undergo cyclical motions, changing from seawater to vapour to precipitation and then flowing back to the ocean, is probably older than any of the surviving texts that hint at or frame it explicitly.

      The idea of the hydrological cycle developed independently in China as early as the 4th century BC and was explicitly stated in the Lü-shih Ch'un Ch'iu (“The Spring and Autumn [Annals] of Mr. Lü”), written in the 3rd century BC. A circulatory system of a different kind, involving movements of water on a large scale within the Earth, was envisioned by Plato (c. 428–348/347 BC). In one of his two explanations for the origin of rivers and springs, he described the Earth as perforated by passages connecting with Tartarus, a vast subterranean reservoir.

      A coherent theory of precipitation is found in the writings of Aristotle. Moisture on the Earth is changed to airy vapour by heat from above. Because it is the nature of heat to rise, the heat in the vapour carries it aloft. When the heat begins to leave the vapour, the vapour turns to water. The formation of water from air produces clouds. Heat remaining in the clouds is further opposed by the cold inherent in the water and is driven away. The cold presses the particles of the cloud closer together, restoring in them the true nature of the element water. Water naturally moves downward, and so it falls from the cloud as raindrops. Snow falls from clouds that have frozen.

      In Aristotle's system the four earthly elements were not stable but could change into one another. If air can change to water in the sky, it should also be able to change into water underground.

The origin of the Nile
      Of all the rivers known to the ancients, the Nile was most puzzling with regard to its sources of water. Not only does this river maintain its course up the length of Egypt through a virtually rainless desert, but it rises regularly in flood once each year.

      Speculations on the strange behaviour of the Nile were many, varied, and mostly wrong. Thales suggested that the strong winds that blow southward over the delta in summertime hold back the flow of the river and cause the waters to rise upstream in flood. Oenopides of Chios (flourished c. 475 BC) thought that heat stored in the ground during the winter dries up the underground veins of water so that the river shrinks. In the summer the heat disappears, and water flows up into the river, causing floods. In the view of Diogenes of Apollonia (flourished c. 435 BC), the Sun controls the regimen of the stream. The idea that the Nile waters connect with the sea is an old one, tracing back to the geographic concepts of Hecataeus of Miletus (c. 520 BC). Reasonable explanations related the discharge of the Nile to precipitation in the headwater regions, as snow (Anaxagoras of Clazomenae, c. 500–428 BC) or as rain that filled lakes supposed to have fed the river (Democritus of Abdera, c. 460–c. 357 BC). Eratosthenes (c. 276–194 BC), who had prepared a map of the Nile Valley southward to the latitude of modern Khartoum, anticipated the correct answer when he reported that heavy rains had been observed to fall in the upper reaches of the river and that these were sufficient to account for the flooding.

Knowledge of the tides
      The tides of the Mediterranean, being inconspicuous in most places, attracted little notice from Greek and Roman naturalists. Poseidonius (135–50 BC) first correlated variations in the tides with phases of the Moon.

      By contrast, the tides along the eastern shores of Asia generally have a considerable range and were the subject of close observation and much speculation among the Chinese. In particular, the tidal bore on the Ch'ien-t'ang Chiang (Fuchun River) near Hang-chou attracted early attention; with its front ranging up to 3.7 metres in height, this bore is one of the largest in the world. As early as the 2nd century BC, the Chinese had recognized a connection between tides and tidal bores and the lunar cycle.

Prospecting for groundwater
      Although the origin of the water in the Earth that seeps or springs from the ground was long the subject of much fanciful speculation, the arts of finding and managing groundwater were already highly developed in the 8th century BC. The construction of long, hand-dug underground aqueducts (qanats) in Armenia and Persia represents one of the great hydrologic achievements of the ancient world. After some 3,000 years qanats are still a major source of water in Iran.

      In the 1st century BC, Vitruvius (Marcus Vitruvius Pollio), a Roman architect and engineer, described methods of prospecting for groundwater in his De architectura libri decem (The Architecture of Marcus Vitruvius Pollio, in Ten Books). To locate places where wells should be dug, he recommended looking for spots where mist rises in early morning. More significantly, Vitruvius had learned to associate different quantities and qualities of groundwater with different kinds of rocks and topographic situations.

      After the inspired beginnings of the ancient Greeks, Romans, Chinese, and Arabs, little or no new information was collected, and no new ideas were produced throughout the Middle Ages, appropriately called the Dark Ages. It was not until the Renaissance in the early 16th century that the Earth sciences began to develop again.

The 16th–18th centuries

Geologic sciences
Ore deposits and mineralogy
      A common belief among alchemists of the 16th and 17th centuries held that metalliferous deposits were generated by heat emanating from the centre of the Earth but activated by the heavenly bodies.

      The German scientist Georgius Agricola has with much justification been called the father of mineralogy. Of his seven geologic books, De natura fossilium (1546; “On Natural Fossils”) contains his major contributions to mineralogy and, in fact, has been called the first textbook on that subject. In Agricola's time and well into the 19th century, “fossil” was a term that could be applied to any object dug from the Earth. Thus Agricola's classification of fossils provided pigeonholes for organic remains, such as ammonites, and for rocks of various kinds in addition to minerals. Individual kinds of minerals, their associations and manners of occurrence, are described in detail, many for the first time.

      With the birth of analytical chemistry toward the latter part of the 18th century, the classification of minerals on the basis of their composition at last became possible. The German geologist Abraham Gottlob Werner was one of those who favoured a chemical classification in preference to a “natural history” classification based on external appearances. His final classification, published posthumously, recognized 317 different substances ordered in four classes.

      During the 17th century the guiding principles of paleontology and historical geology began to emerge in the work of a few individuals. Nicolaus Steno, a Danish scientist and theologian, presented carefully reasoned arguments favouring the organic origin of what are now called fossils. Also, he elucidated three principles that made possible the reconstruction of certain kinds of geologic events in a chronological order. In his Canis carchariae dissectum caput (1667; “Dissected Head of a Dog Shark”), he concluded that large tongue-shaped objects found in the strata of Malta were the teeth of sharks, whose remains were buried beneath the seafloor and later raised out of the water to their present sites. This excursion into paleontology led Steno to confront a broader question. How can one solid body, such as a shark's tooth, become embedded in another solid body, such as a layer of rock? He published his answers in 1669 in a paper entitled “De solido intra solidum naturaliter contento dissertationis prodromus” (“A Preliminary Discourse Concerning a Solid Body Enclosed by Processes of Nature Within a Solid”). Steno cited evidence to show that when the hard parts of an organism are covered with sediment, it is they and not the aggregates of sediment that are firm. Consolidation of the sediment into rock may come later, and, if so, the original solid fossil becomes encased in solid rock. He recognized that sediments settle from fluids layer by layer to form strata that are originally continuous and nearly horizontal. His principle of superposition of strata states that in a sequence of strata, as originally laid down, any stratum is younger than the one on which it rests and older than the one that rests upon it.

      In 1667 and 1668 the English physicist Robert Hooke (Hooke, Robert) read papers before the Royal Society in which he expressed many of the ideas contained in Steno's works. Hooke argued for the organic nature of fossils. Elevation of beds containing marine fossils to mountainous heights he attributed to the work of earthquakes. Streams attacking these elevated tracts wear down the hills, fill depressions with sediment, and thus level out irregularities of the landscape.

Earth history according to Werner and James Hutton
      The two major theories of the 18th century were the Neptunian and the Plutonian. The Neptunists, led by Werner and his students, maintained that the Earth was originally covered by a turbid ocean. The first sediments deposited over the irregular floor of this universal ocean formed the granite and other crystalline rocks. Then as the ocean began to subside, “Stratified” rocks were laid down in succession. The “Volcanic” rocks were the youngest; Neptunists took small account of volcanism and thought that lava was formed by the burning of coal deposits underground.

      The Scottish scientist James Hutton (Hutton, James), leader of the Plutonists, viewed the Earth as a dynamic body that functions as a heat machine. Streams wear down the continents and deposit their waste in the sea. Subterranean heat causes the outer part of the Earth to expand in places, uplifting the compacted marine sediments to form new continents. Hutton recognized that granite is an intrusive igneous rock and not a primitive sediment as the Neptunists claimed. Intrusive sills and dikes of igneous rock provide evidence for the driving force of subterranean heat. Hutton viewed great angular unconformities separating sedimentary sequences as evidence for past cycles of sedimentation, uplift, and erosion. His Theory of the Earth, published as an essay in 1788, was expanded to a two-volume work in 1795. John Playfair (Playfair, John), a professor of natural philosophy, defended Hutton against the counterattacks of the Neptunists, and his Illustrations of the Huttonian Theory (1802) is the clearest contemporary account of Plutonist theory.

Hydrologic sciences
      The idea that there is a circulatory system within the Earth, by which seawater is conveyed to mountaintops and there discharged, persisted until early in the 18th century. Two questions left unresolved by this theory were acknowledged even by its advocates. How is seawater forced uphill? How is the salt lost in the process?

The rise of subterranean water
      René Descartes (Descartes, René) supposed that the seawater diffused through subterranean channels into large caverns below the tops of mountains. The Jesuit philosopher Athanasius Kircher (Kircher, Athanasius) in his Mundus subterraneus (1664; “Subterranean World”) suggested that the tides pump seawater through hidden channels to points of outlet at springs. To explain the rise of subterranean water beneath mountains, the chemist Robert Plot appealed to the pressure of air, which forces water up the insides of mountains. The idea of a great subterranean sea connecting with the ocean and supplying it with water together with all springs and rivers was resurrected in 1695 in John Woodward's Essay towards a Natural History of the Earth and Terrestrial Bodies.

      The French Huguenot Bernard Palissy (Palissy, Bernard) maintained, to the contrary, that rainfall is the sole source of rivers and springs. In his Discours admirables (1580; Admirable Discourses) he described how rainwater falling on mountains enters cracks in the ground and flows down along these until, diverted by some obstruction, it flows out on the surface as springs. Palissy scorned the idea that seawater courses in veins to the tops of mountains. For this to be true, sea level would have to be higher than mountaintops—an impossibility. In his Discours Palissy suggested that water would rise above the level at which it was first encountered in a well provided the source of the groundwater came from a place higher than the bottom of the well. This is an early reference to conditions essential to the occurrence of artesian water, a popular subject among Italian hydrologists of the 17th and 18th centuries.

      In the latter part of the 17th century, Pierre Perrault (Perrault, Pierre) and Edmé Mariotte (Mariotte, Edme) conducted hydrologic investigations in the basin of the Seine River that established that the local annual precipitation was more than ample to account for the annual runoff.

Evaporation (vaporization) from the sea
      The question remained as to whether the amount of water evaporated from the sea is sufficient to account for the precipitation that feeds the streams. The English astronomer-mathematician Edmond Halley (Halley, Edmond) measured the rate of evaporation from pans of water exposed to the air during hot summer days. Assuming that this same rate would obtain for the Mediterranean, Halley calculated that some 5,280,000,000 tons of water are evaporated from this sea during a summer day. Assuming further that each of the nine major rivers flowing into the Mediterranean has a daily discharge 10 times that of the Thames, he calculated that the daily inflow of fresh water back into that sea would be 1,827,000,000 tons, only slightly more than a third of the amount lost by evaporation. Halley went on to explain what happens to the remainder. A part falls back into the sea as rain before it reaches land. Another part is taken up by plants.
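Halley's water budget can be checked against itself using only the figures quoted above; the implied Thames discharge is not stated in the text and is simply back-calculated here from his river assumption.

```python
# Arithmetic check of Halley's Mediterranean water budget, using only
# the figures quoted in the text (tons of water per summer day).
evaporated = 5_280_000_000      # tons lost to evaporation per day
river_inflow = 1_827_000_000    # tons returned per day by the rivers

# Nine rivers, each with 10 times the Thames's discharge, implies a
# Thames discharge of river_inflow / 90 (a back-calculated figure):
thames = river_inflow / (9 * 10)

fraction_returned = river_inflow / evaporated
print(f"Implied Thames discharge: {thames:,.0f} tons/day")
print(f"Rivers return {fraction_returned:.1%} of the evaporated water")
# "only slightly more than a third", as the text says
```

The remaining two-thirds is what Halley assigned to rain falling back on the sea and to uptake by plants.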

      In the course of the hydrologic cycle, Halley reasoned, the rivers constantly bring salt into the sea in solution, but the salt is left behind when seawater evaporates to replenish the streams with rainwater. Thus the sea must be growing steadily saltier.

Atmospheric sciences
Water vapour in the atmosphere
      After 1760 the analytical chemists at last demonstrated that water and air are not the same substance in different guises. Long before this development, however, investigators had begun to draw a distinction between water vapour and air. Otto von Guericke (Guericke, Otto von), a German physicist and engineer, produced artificial clouds by releasing air from one flask into another one from which the air had been evacuated. A fog then formed in the unevacuated flask. Guericke concluded that air cannot be turned into water, though moisture can enter the air and later be condensed into water. Guericke's experiments, however, did not answer the question as to how water enters the atmosphere as vapour. In “Les Météores” (“Meteorology,” an essay published in the book Discours de la méthode in 1637), Descartes envisioned water as composed of minute particles that were elongate, smooth, and separated by a highly rarefied “subtle matter.”

      The same uncertainty as to how water gets into the air surrounded the question as to how it remains suspended as clouds. A popular view in the 18th century was that clouds are made of countless tiny bubbles that float in air. Guericke had suggested that the fine particles in his artificial clouds were bubbles. Other observers professed to have seen bubble-shaped particles of water vapour rising from warm water or hot coffee.

      If clouds are essentially multicompartmented balloons, their motions could be explained by the movements of winds blowing on them. Descartes suggested that the winds might blow upward as well as laterally, causing the clouds to rise or at least preventing them from descending. In 1749 Benjamin Franklin (Franklin, Benjamin) explained updrafts of air as due to local heating of the atmosphere by the Sun. Sixteen years later the Swiss-German mathematical physicist Johann Heinrich Lambert (Lambert, Johann Heinrich) described the conditions necessary for the initiation of convection currents in the atmosphere. He reasoned that rising warm air flows into bordering areas of cooler air, increasing their downward pressure and causing their lower layers to flow into ascending currents, thus producing circulation.

      The fact that Lambert could appeal to changes in air pressure to explain circulation reflects an important change from the view still current in the late 16th century that air is weightless. This misconception was corrected after 1643 with the invention of the mercury barometer. It was soon discovered that the height of the barometer varied with the weather, usually standing at its highest during clear weather and falling to the lowest on rainy days.

      Toward the end of the 18th century it was beginning to be understood that variations in the barometer must be related to the general motion and circulation of the atmosphere. That these variations could not be due solely to changes in humidity was the conclusion of the Swiss scientist Horace Bénédict de Saussure (Saussure, Horace Bénédict de) in his Essais sur l'hygrométrie (1783; “Essay on Hygrometry”). From experiments with changes of water vapour and pressure in air enclosed in a glass globe, Saussure concluded that changes in temperature must be immediately responsible for variations of the barometer and that these in turn must be related to the movement of air from one place to another.

The 19th century

Geologic sciences
Crystallography and the classification of minerals and rocks
      The French scientist René-Just Haüy (Haüy, René-Just), whose treatises on mineralogy and crystallography appeared in 1801 and 1822, respectively, has been credited with advancing mineralogy to the status of a science and with establishing the science of crystallography. From his studies of the geometric relationships between planes of cleavage, he concluded that the ultimate particles forming a given species of mineral have the same shape and that variations in crystal habit reflect differences in the ways identical molecules are put together. In 1814 Jöns Jacob Berzelius (Berzelius, Jöns Jacob) of Sweden published a system of mineralogy offering a comprehensive classification of minerals based on their chemistry. Berzelius recognized silica as an acid and introduced into mineralogy the group known as silicates. At mid-century the American geologist James Dwight Dana (Dana, James D.)'s System of Mineralogy, in its third edition, was reorganized around a chemical classification, which thereafter became standard for handbooks.

      The development of the polarizing microscope and the technique for grinding sections of rocks so thin as to be virtually transparent came in 1827 from studies of fossilized wood by William Nicol. In 1849 Henry Clifton Sorby (Sorby, Henry Clifton) showed that minerals viewed in thin section could be identified by their optical properties, and soon afterward improved classifications of rocks were made on the basis of their mineralogic composition. The German geologist Ferdinand Zirkel (Zirkel, Ferdinand)'s Die mikroskopische Beschaffenheit der Mineralien und Gesteine (1873; “The Microscopic Nature of Minerals and Rocks”) contains one of the first mineralogic classifications of rocks and marks the emergence of microscopic petrography as an established branch of science.

William Smith and faunal succession
      In 1683 the zoologist Martin Lister proposed to the Royal Society that a new sort of map be drawn showing the areal distribution of the different kinds of British “soiles” (vegetable soils and underlying bedrock). The work proposed by Lister was not accomplished until 132 years later, when William Smith (Smith, William) published his Geologic Map of England and Wales with Part of Scotland (1815). A self-educated surveyor and engineer, Smith had the habit of collecting fossils and making careful note of the strata that contained them. He discovered that the different stratified formations in England contain distinctive assemblages of fossils. His map, reproduced on a scale of five miles to the inch, showed 20 different rock units, to which Smith applied local names in common use—e.g., London Clay and Purbeck Beds. In 1816 Smith published a companion work, Strata Identified by Organized Fossils, in which the organic remains characteristic of each of his rock units were illustrated. His generalization that each formation is “possessed of properties peculiar to itself [and] has the same organized fossils throughout its course” is the first clear statement of the principle of faunal sequence, which is the basis for worldwide correlation of fossiliferous strata into a coherent system. Smith thus demonstrated two kinds of order in nature: order in the spatial arrangement of rock units and order in the succession of ancient forms of life.

      Smith's principle of faunal sequence was another way of saying that there are discontinuities in the sequences of fossilized plants and animals. These discontinuities were interpreted in two ways: as indicators of episodic destruction of life or as evidence for the incompleteness of the fossil record. Baron Georges Cuvier (Cuvier, Georges, Baron) of France was one of the more distinguished members of a large group of naturalists who believed that paleontological discontinuities bore witness to sudden and widespread catastrophes (catastrophism). Cuvier's skill at comparative anatomy enabled him to reconstruct from fragmentary remains the skeletons of large vertebrate animals found at different levels in the Cenozoic sequence of northern France. From these studies he discovered that the fossils in all but the youngest deposits belong to species now extinct. Moreover, these extinct species have definite ranges up and down in the stratigraphic column. Cuvier inferred that the successive extinctions were the result of convulsions that caused the strata of the continents to be dislocated and folded and the seas to sweep across the continents and just as suddenly subside.

      In opposition to the catastrophist school of thought, the British geologist Charles Lyell proposed a uniformitarian interpretation of geologic history in his Principles of Geology (3 vols., 1830–33). His system was based on two propositions: the causes of geologic change now operating include all the causes that have acted from the earliest time; and these causes have always operated at the same average levels of energy. These two propositions add up to a “steady-state” theory of the Earth. Changes in climate have fluctuated around a mean, reflecting changes in the position of land and sea. Progress through time in the organic world is likewise an illusion, the effect of an imperfect paleontological record. The main part of the Principles was devoted less to theory than to procedures for inferring events from rocks; and for Lyell's clear exposition of methodology his work was highly regarded throughout its many editions, long after the author himself had abandoned antiprogressivist views on the development of life.

Louis Agassiz (Agassiz, Louis) and the ice age (glaciology)
      Huge boulders of granite resting upon limestone of the Jura Mountains were subjects of controversy during the 18th and early 19th centuries. Saussure described these in 1779 and called them erratics. He concluded that they had been swept to their present positions by torrents of water. Saussure's interpretation was in accord with the tenets of diluvial geologists, who interpreted erratics and sheets of unstratified sediment (till or drift) spread over the northern parts of Europe and North America as the work of the “Deluge.”

      In 1837 the Swiss zoologist and paleontologist Louis Agassiz delivered a startling address before the Helvetian Society, proposing that, during a geologically recent stage of refrigeration, glacial ice had covered Eurasia from the North Pole to the shores of the Mediterranean and Caspian. Wherever erratics, till, and striated pavements of rock occur, sure evidence of this recent catastrophe exists. The reception accorded this address was glacial, too, and Alexander von Humboldt advised Agassiz to return to his fossil fishes. Instead, he began intensive field studies and in 1840 published his Études sur les glaciers (“Studies of Glaciers”), demonstrating that Alpine glaciers had been far more extensive in the past. That same year he visited the British Isles in the company of the English geologist William Buckland and extended the glacial doctrine to Scotland, northern England, and Ireland. In 1846 he carried his campaign to North America and there found additional evidence for an ice age.

Geologic time (dating) and the age of the Earth
      By mid-century the fossiliferous strata of Europe had been grouped into systems arrayed in chronological order. The stratigraphic column, a composite of these systems, was pieced together from exposures in different regions by application of the principles of superposition and faunal sequence. Time elapsed during the formation of a system became known as a period, and the periods were grouped into eras: the Paleozoic (Cambrian through Permian periods), Mesozoic (Triassic, Jurassic, and Cretaceous periods), and Cenozoic (Tertiary and Quaternary periods).

      Charles Darwin (Darwin, Charles)'s Origin of Species (1859) offered a theoretical explanation for the empirical principle of faunal sequence. The fossils of the successive systems are different not only because parts of the stratigraphic record are missing but also because most species have lost in their struggles for survival and because those that do survive evolve into new forms over time. Darwin borrowed two ideas from Lyell and the uniformitarians: the idea that geologic time is virtually without limit and the idea that a sequence of minute changes integrated over long periods of time produces remarkable changes in natural entities.

      The evolutionists and the historical geologists were embarrassed when, beginning in 1864, William Thomson (later Lord Kelvin (Kelvin, William Thomson, Baron)) attacked the steady-state theory of the Earth and placed numerical strictures on the length of geologic time. The Earth might function as a heat machine, but it could not also be a perpetual motion machine. Assuming that the Earth was originally molten, Thomson calculated that not less than 20,000,000 and not more than 400,000,000 years could have passed since the Earth first became a solid body. Other physicists of note put even narrower limits on the Earth's age ranging down to 15,000,000 or 20,000,000 years. All these calculations, however, were based on the common assumption, not always explicitly stated, that the Earth's substance is inert and hence incapable of generating new heat. Shortly before the end of the century this assumption was negated by the discovery of radioactive elements that disintegrate spontaneously and release heat to the Earth in the process.
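The style of Thomson's reasoning can be sketched with the classical conductive-cooling result for an initially molten half-space, in which the surface temperature gradient G decays as T0/√(πκt), giving an age t = T0²/(πκG²). The numerical inputs below (initial temperature, geothermal gradient, and diffusivity) are illustrative modern-style values, not Thomson's own figures.

```python
import math

# Kelvin-style age of the Earth from conductive cooling of an initially
# molten half-space:  G = T0 / sqrt(pi * kappa * t)  =>  t = T0^2 / (pi * kappa * G^2)
# All three inputs are illustrative assumptions, not Thomson's exact data.
T0 = 3900.0        # assumed initial (molten-rock) temperature, deg C
G = 1.0 / 27.4     # assumed near-surface geothermal gradient, deg C per metre
kappa = 1.2e-6     # assumed thermal diffusivity of rock, m^2/s

t_seconds = T0**2 / (math.pi * kappa * G**2)
t_years = t_seconds / (365.25 * 24 * 3600)
print(f"Conductive-cooling age: {t_years / 1e6:.0f} million years")
# falls inside Thomson's 20,000,000-400,000,000-year bracket
```

The calculation's fatal hidden premise, as the text notes, is that no new heat is generated within the Earth; radioactive heating invalidates the whole scheme.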

Concepts of landform evolution
      The scientific exploration of the American West following the end of the Civil War yielded much new information on the sculpture of the landscape by streams. John Wesley Powell (Powell, John Wesley)in his reports on the Colorado River and Uinta Mountains (1875, 1876) explained how streams may come to flow across mountain ranges rather than detour around them. The Green River does not follow some structural crack in its gorge across the Uinta Mountains; instead it has cut its canyon as the mountain range was slowly bowed up. Given enough time, streams will erode their drainage basins to plains approaching sea level as a base. Grove Karl Gilbert (Gilbert, Grove Karl)'s Report on the Geology of the Henry Mountains (1877) offered a detailed analysis of fluvial processes. According to Gilbert all streams work toward a graded condition, a state of dynamic equilibrium that is attained when the net effect of the flowing water is neither erosion of the bed nor deposition of sediment, when the landscape reflects a balance between the resistance of the rocks to erosion and the processes that are operative upon them. After 1884 William Morris Davis (Davis, William Morris) developed the concept of the geographical cycle, during which elevated regions pass through successive stages of dissection and denudation characterized as youthful, mature, and old. Youthful landscapes have broad divides and narrow valleys. With further denudation the original surface on which the streams began their work is reduced to ridgetops. Finally in the stage of old age, the region is reduced to a nearly featureless plain near sea level or its inland projection. Uplift of the region in any stage of this evolution will activate a new cycle. Davis' views dominated geomorphic thought until well into the 20th century, when quantitative approaches resulted in the rediscovery of Gilbert's ideas.

Gravity, isostasy, and the Earth's figure
      Discoveries of regional anomalies in the Earth's gravity led to the realization that high mountain ranges have underlying deficiencies in mass about equal to the apparent surface loads represented by the mountains themselves. In the 18th century the French scientist Pierre Bouguer (Bouguer, Pierre) had observed that the deflections of the pendulum in Peru are much less than they should be if the Andes represent a load perched on top of the Earth's crust. Similar anomalies were later found to obtain along the Himalayan front. To explain these anomalies it was necessary to assume that beneath some depth within the Earth pressures are hydrostatic (equal on all sides). If excess loads are placed upon the crust, as by addition of a continental ice cap, the crust will sink to compensate for the additional mass and will rise again when the load is removed. The tendency toward general equilibrium maintained through vertical movements of the Earth's outer layers was called isostasy in 1899 by Clarence Edward Dutton (Dutton, Clarence Edward) of the United States.
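The mass deficiency beneath mountains is often illustrated with the Airy model, a later formalization of the floating-crust idea rather than Dutton's own calculation: columns above the depth of hydrostatic pressure must weigh the same, so a topographic load of height h is balanced by a low-density crustal "root" of depth r = h·ρc/(ρm − ρc). The densities below are illustrative assumptions.

```python
# Airy-type isostatic compensation (a later formalization of the idea in
# the text).  Balancing columns above the depth of hydrostatic pressure:
#   rho_c * (h + H + r) = rho_c * H + rho_m * r
#   =>  r = h * rho_c / (rho_m - rho_c)
rho_c = 2800.0   # assumed crustal density, kg/m^3
rho_m = 3300.0   # assumed density of the denser substratum, kg/m^3

def root_depth(h_m: float) -> float:
    """Depth of the compensating crustal root beneath a load of height h_m."""
    return h_m * rho_c / (rho_m - rho_c)

print(root_depth(5000.0))   # a 5,000 m range requires a ~28,000 m root
```

Because the root is several times deeper than the mountains are high, a plumb bob near the range feels far less excess attraction than the visible topography alone would predict, which is what Bouguer observed in Peru.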

      Evidence for substantial vertical movements of the crust was supplied by studies of regional stratigraphy. In 1883 another American geologist, James Hall, had demonstrated that Paleozoic rocks of the folded Appalachians were several times as thick as sequences of the same age in the plateaus and plains to the west. It was his conclusion that the folded strata in the mountains must have accumulated in a linear submarine trough that filled with sediment as it subsided. Downward crustal flexures of this magnitude came to be called geosynclines (geosyncline).

Hydrologic sciences
Darcy's law
      Quantitative studies of the movement of water in streams and aquifers led to the formulation of mathematical statements relating discharge to other factors. Henri-Philibert-Gaspard Darcy (Darcy, Henri-Philibert-Gaspard), a French hydraulic engineer, was the first to state clearly a law describing the flow of groundwater. Darcy's experiments, reported in 1856, were based on the ideas that an aquifer is analogous to a main line connecting two reservoirs at different levels and that an artesian well is like a pipe drawing water from a main line under pressure. His investigations of flow through stratified beds of sand led him to conclude that the rate of flow is directly proportional to the energy loss and inversely proportional to the length of the path of flow. Another French engineer, Arsène-Jules-Étienne-Juvénal Dupuit (Dupuit, Arsène-Jules-Étienne-Juvénal), extended Darcy's work and developed equations for underground flow toward a well, for the recharge of aquifers, and for the discharge of artesian wells. Philip Forchheimer (Forchheimer, Philipp), an Austrian hydrologist, introduced the theory of functions of a complex variable to analyze the flow by gravity of underground water toward wells and developed equations for determining the critical distance between a river and a well beyond which water from the river will not move into the well.
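Darcy's conclusion is usually written today as Q = K·A·Δh/L: discharge proportional to the head loss and inversely proportional to the length of the flow path. The hydraulic-conductivity value below is an illustrative figure for clean sand, not one of Darcy's measurements.

```python
# Darcy's law in its modern form:  Q = K * A * (dh / L)
# K is an assumed, illustrative conductivity for clean sand.
K = 1e-4     # hydraulic conductivity, m/s (assumed)
A = 2.0      # cross-sectional area of the sand bed, m^2
dh = 0.5     # head loss across the bed, m
L = 1.5      # length of the flow path, m

Q = K * A * dh / L   # discharge, m^3/s
print(f"Q = {Q:.2e} m^3/s")

# Doubling the path length halves the discharge, as Darcy found:
assert Q / (K * A * dh / (2 * L)) == 2.0
```

Dupuit's and Forchheimer's well equations are elaborations of this same proportionality applied to radial flow toward a well.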

Surface water discharge
      A complicated empirical formula for the discharge of streams resulted from the studies of Andrew Atkinson Humphreys and Henry Larcom Abbot in the course of the Mississippi Delta Survey of 1851–60. Their formula contained no term for roughness of channel and on this and other grounds was later found to be inapplicable to the rapidly flowing streams of mountainous regions. In 1869 Emile-Oscar Ganguillet and Rudolph Kutter developed a more generally applicable discharge equation following their studies of flow in Swiss mountain streams. Toward the end of the century, systematic studies of the discharge of streams had become common. In the United States the Geological Survey, following its establishment in 1879, became the principal agency for collecting and publishing data on discharge, and by 1906 stream gauging had become nationwide.

Foundations of oceanography
      In 1807 Thomas Jefferson ordered the establishment of the U.S. Coast Survey (later Coast and Geodetic Survey and now the National Ocean Survey). Modeled after British and French agencies that had grown up in the 1700s, the agency was charged with the responsibilities of hydrographic and geodetic surveying, studies of tides, and preparation of charts. Beginning in 1842 the U.S. Navy undertook expansive oceanographic operations (oceanography) through its office of charts and instruments. Lieut. Matthew Fontaine Maury (Maury, Matthew Fontaine) promoted international cooperation in gathering meteorologic and hydrologic data at sea. In 1847 Maury compiled the first wind and current charts for the North Atlantic and in 1854 issued the first depth map to 4,000 fathoms (7,300 metres). His Physical Geography of the Sea (1855) is generally considered the first oceanographic textbook.

      The voyage of the Beagle (1831–36) is remembered for Darwin's biological and geologic contributions. From his observations in the South Pacific, Darwin formulated a theory for the origin of coral reefs, which with minor changes has stood the test of time. He viewed the fringing reefs, barrier reefs, and atolls as successive stages in a developmental sequence. The volcanic islands around which the reef-building organisms are attached slowly sink, but at the same time the shallow-water organisms that form the reefs build their colonies upward so as to remain in the sunlit layers of water. With submergence of the island, what began as a fringing reef girdling a landmass at last becomes an atoll enclosing a lagoon.

      Laying telegraphic cables across the Atlantic called for investigations of the configuration of the ocean floor, of the currents that sweep the bottom, and of the benthonic animals that might damage the cables. The explorations of the British ships Lightning and Porcupine in 1868 and 1869 turned up surprising oceanographic information. Following closely upon these voyages, the Challenger was authorized to determine “the conditions of the Deep Sea throughout the Great Ocean Basins.”

      The Challenger left port in December of 1872 and returned in May 1876, after logging 127,600 kilometres (68,890 nautical miles). Under the direction of Wyville Thomson (Thomson, Sir C Wyville), Scottish professor of natural history, it occupied 350 stations scattered over all oceans except the Arctic. The work involved in analyzing the information gathered during the expedition was completed by Thomson's shipmate Sir John Murray (Murray, Sir John), and the results filled 50 large volumes. Hundreds of new species of marine organisms were described, including new forms of life from deep waters. The temperature of water at the bottom of the oceans was found to be nearly constant below the 2,000-fathom level, averaging about 2.5° C (36.5° F) in the North Atlantic and 2° C (35° F) in the North Pacific. Soundings showed wide variations in depths of water, and from the dredgings of the bottom came new types of sediment—red clay as well as oozes made predominantly of the minute skeletons of foraminifera, radiolarians, or diatoms. Improved charts of the principal surface currents were produced, and the precise location of many oceanic islands was determined for the first time. Seventy-seven samples of seawater were taken at different stations from depths ranging downward to about 1.5 kilometres. The German-born chemist Wilhelm Dittmar conducted quantitative determinations of the seven major constituents (other than the hydrogen and oxygen of the water itself)—namely, sodium, calcium, magnesium, potassium, chloride, bromide, and sulfate. Surprisingly, the percentages of these components turned out to be nearly the same in all samples.
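Dittmar's finding of near-constant proportions means that the ratio of each major ion to chloride is effectively the same in every sample, however much the total salt content varies. The concentrations below are modern standard-seawater values (grams per kilogram), used here for illustration; they are not Dittmar's own numbers.

```python
# Near-constancy of seawater composition: the ratio of each major ion to
# chloride is roughly the same everywhere.  Concentrations are modern
# standard-seawater values in g/kg (illustrative, not Dittmar's data).
major_ions = {"Na": 10.78, "Mg": 1.28, "Ca": 0.41, "K": 0.40,
              "SO4": 2.71, "Br": 0.067}
Cl = 19.35   # chloride, g/kg

ratios = {ion: conc / Cl for ion, conc in major_ions.items()}
for ion, r in ratios.items():
    print(f"{ion}/Cl = {r:.4f}")
# Diluting or concentrating a sample scales every ion equally, so these
# ratios stay fixed -- which is why a single chloride titration later
# sufficed to estimate total salinity.
```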

      Efforts to analyze the rise and fall of the tides in mathematical terms reflecting the relative and constantly changing positions of Earth, Moon, and Sun, and thus to predict the tides at particular localities, have never been entirely successful because of local variations in configuration of shore and seafloor. Nevertheless, harmonic tidal analysis gives first approximations that are essential to tidal prediction. In 1884 a mechanical analog tidal prediction device was invented by William Ferrel (Ferrel, William) of the U.S. Coast and Geodetic Survey, and improved models were used until 1965, when the work of the analog machines was taken over by electronic computers.
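The harmonic method that Ferrel's machine mechanized represents the tide at a port as a sum of cosine constituents, each with an amplitude, angular speed, and phase fitted from local observations. The constituent speeds below are the standard astronomical ones; the amplitudes and phases are invented for illustration and belong to no real station.

```python
import math

# Harmonic tidal synthesis: height = mean level + sum of cosine
# constituents.  Speeds are the standard astronomical values; amplitudes
# and phases are illustrative, not those of any actual port.
constituents = [
    # (name, amplitude m, speed deg/hour, phase lag deg)
    ("M2", 1.20, 28.984, 30.0),   # principal lunar semidiurnal
    ("S2", 0.40, 30.000, 60.0),   # principal solar semidiurnal
    ("K1", 0.15, 15.041, 45.0),   # lunisolar diurnal
]

def tide_height(t_hours: float, mean_level: float = 0.0) -> float:
    """Predicted height above datum at time t (hours from the epoch)."""
    h = mean_level
    for _name, amp, speed, phase in constituents:
        h += amp * math.cos(math.radians(speed * t_hours - phase))
    return h

print(f"height at t = 0 h: {tide_height(0.0):+.3f} m")
```

Ferrel's analog machine evaluated exactly this kind of sum with geared pulleys, one per constituent; the electronic computers of 1965 simply took over the same arithmetic.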

Atmospheric sciences
Composition of the atmosphere
      Studies of barometric pressure by the British chemist and physicist John Dalton (Dalton, John) led him to conclude that evaporation and condensation of vapour do not involve chemical transformations. The introduction of vapour into the air by evaporation must change the average specific gravity of the air column and, without altering the height of that column, will change the reading of the barometer. In 1857 Rudolf Clausius (Clausius, Rudolf), a German physicist, clarified the mechanics of evaporation in his kinetic theory of gases. Evaporation occurs when more molecules of a liquid are leaving its surface than returning to it, and the higher the temperature the more of these escaped molecules will be in space at any one time.

      Following the invention of the hot-air balloon by the Montgolfier brothers in 1783, balloonists produced some useful information on the composition and movements of the atmosphere. In 1804 the celebrated French chemist Joseph-Louis Gay-Lussac (Gay-Lussac, Joseph-Louis) ascended to about 7,000 metres, took samples of air, and later determined that the rarefied air at that altitude contained the same percentage of oxygen (21.49 percent) as the air on the ground. Austrian meteorologist Julius von Hann, working with data from balloon ascents and climbing in the Alps and Himalayas, concluded in 1874 that about 90 percent of all the water vapour in the atmosphere is concentrated below 6,000 metres—from which it follows that high mountains can be barriers against the transport of water vapour.
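Hann's 90 percent figure is consistent with water vapour decreasing roughly exponentially with altitude: with a vapour scale height H, the fraction below height z is 1 − e^(−z/H). The scale height of 2.5 km used below is an assumed, typical modern value, not one of Hann's numbers.

```python
import math

# Fraction of atmospheric water vapour below altitude z, assuming an
# exponential vertical profile with scale height H.  H = 2.5 km is an
# assumed modern-typical value, not a figure from Hann.
H = 2.5    # water-vapour scale height, km (assumed)
z = 6.0    # Hann's altitude, km

fraction_below = 1.0 - math.exp(-z / H)
print(f"fraction of vapour below {z:.0f} km: {fraction_below:.0%}")
# close to Hann's "about 90 percent"
```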

Understanding of clouds, fog, and dew
      Most of the names given to clouds (cirrus, cumulus, stratus, nimbus, and their combinations) were coined in 1803 by the English meteorologist Luke Howard. Howard's effort was not simply taxonomic; he recognized that clouds reflect in their shapes and changing forms “the general causes which effect all the variations of the atmosphere.”

      After Guericke's experiments it was widely believed that water vapour condenses into cloud as soon as the air containing it cools to the dew point. That this is not necessarily so was proved by Paul-Jean Coulier of France from experiments reported in 1875. Coulier found that the sudden expansion of air in glass flasks failed to produce an artificial cloud if the air in the system was filtered through cotton wool. He concluded that dust in the air was essential to the formation of cloud in the flask.

      From about the mid-1820s, efforts were made to classify precipitation in terms of the causes behind the lowering of temperature. In 1841 the American astronomer-meteorologist Elias Loomis recognized the following causes: warm air coming into contact with cold earth or water, responsible for fog; mixing of warm and cold currents, which commonly results in light rains; and sudden transport of air into high regions, as by flow up a mountain slope or by warm currents riding over an opposing current of cold air, which may produce heavy rains.

Observation and study of storms
      Storms, particularly tropical revolving storms, were subjects of much interest. As early as 1697 some of the more spectacular features of revolving storms were recorded in William Dampier's New Voyage Round the World. On July 4, 1687, Dampier's ship survived the passage of what he called a “tuffoon” off the coast of China. The captain's vivid account of this experience clearly describes the calm central eye of the storm and the passage of winds from opposite directions as the storm moved past. In 1828 Heinrich Wilhelm Dove, a Prussian meteorologist, recognized that tropical revolving storms are traveling systems with strong winds moving counterclockwise in the Northern Hemisphere and clockwise in the Southern Hemisphere. The whirlwind character of these storms was independently established by the American meteorologist William C. Redfield in the case of the September hurricane that struck New England in 1821. He noted that in central Connecticut the trees had been toppled toward the northwest, whereas some 80 kilometres westward they had fallen in the opposite direction. Redfield identified the belt between the Equator and the tropics as the region in which hurricanes are generated, and he recognized how the tracks of these storms tend to veer eastward when they enter the belt of westerly winds at about latitude 30° N. In 1849 Sir William Reid, a British meteorologist and military engineer, studied the revolving storms that occur south of the Equator in the Indian Ocean and confirmed that they have reversed rotations and curvatures of path compared with those of the Northern Hemisphere. Capt. Henry Piddington subsequently investigated revolving storms affecting the Bay of Bengal and Arabian Sea, and in 1855 he named these cyclones in his Sailor's Horn-book for the Laws of Storms in all Parts of the World.

      Beginning in 1835, the American meteorologist James Pollard Espy made extensive studies of storms, from which he developed a theory to explain their sources of energy. Radially convergent winds, he believed, cause the air to rise in their area of collision. Upward movement of moist air is attended by condensation and precipitation. Latent heat released through the change of vapour to cloud or water causes further expansion and rising of the air. The higher the moist air rises, the more the equilibrium of the system is disturbed, and this equilibrium cannot be restored until moist air at the surface ceases to flow toward the ascending column.

      That radially convergent winds are not necessary to the rising of large air masses was demonstrated by Loomis in the case of a great storm that passed across the northeastern United States in December 1836. From his studies of wind patterns, changes of temperature, and changes in barometric pressure, he concluded that a cold northwest wind had displaced a wind blowing from the southeast by flowing under it. The southeast wind made its escape by ascending from the Earth's surface. Loomis had recognized what today would be called a frontal surface.

Weather and climate
      Modern meteorology began when the daily weather map was developed as a device for analysis and forecasting, and the instrument that made this kind of map possible was the electromagnetic telegraph. In the United States the first telegraph line was strung in 1844 between Washington, D.C., and Baltimore. Concurrently with the expansion of telegraphic networks, the physicist Joseph Henry arranged for telegraph companies to receive meteorological instruments in exchange for telegraphing current weather data to the Smithsonian Institution. Some 500 stations had joined this cooperative effort by 1860. The Civil War temporarily prevented further expansion, but, meanwhile, a disaster of a different order had accelerated the development of synoptic meteorology in Europe. On Nov. 14, 1854, an unexpected storm wrecked British and French warships off Balaklava on the Crimean peninsula. Had word of the approaching storm been telegraphed to this port in the Black Sea, the ships might have been saved. This mischance led in 1856 to the establishment of a national storm-warning service in France. In 1863 the Paris Observatory began publishing the first weather maps in modern format.

      The first national weather service in the United States began operations in 1871 as an agency of the Department of War. The initial objective was to provide storm warnings for the Gulf and Atlantic coasts and the Great Lakes. In 1877 forecasts of temperature changes and precipitation averaged 74 percent in accuracy, as compared with 79 percent for cold-wave warnings. After 1878 daily weather maps were published.

      Synoptic meteorology made possible the tracking of storm systems over wide areas. In 1868 the British meteorologist Alexander Buchan published a map showing the travels of a cyclonic depression across North America, the Atlantic, and into northern Europe. In the judgment of Sir Napier Shaw, Buchan's study marks the entry of modern meteorology, with “the weather map as its main feature and forecasting its avowed object.”

      In addition to weather maps, a variety of other kinds of maps showing regional variations in the components of weather and climate were produced. In 1817 Alexander von Humboldt published a map showing the distribution of mean annual temperatures over the greater part of the Northern Hemisphere. Humboldt was the first to use isothermal lines in mapping temperature. Buchan drew the first maps of mean monthly and annual pressure for the entire world. Published in 1869, these maps added much to knowledge of the general circulation of the atmosphere. In 1886 Léon-Philippe Teisserenc de Bort of France published maps showing mean annual cloudiness over the Earth for each month and the year. The first world map of precipitation showing mean annual precipitation by isohyets was the work of Loomis in 1882. This work was further refined in 1899 by the maps of the British cartographer Andrew John Herbertson, which showed precipitation for each month of the year.

      Although the 19th century was still in the age of meteorologic and climatological exploration, broad syntheses of older information kept pace fairly well with the acquisition of new data. For example, Julius Hann's massive Handbuch der Klimatologie (“Handbook of Climatology”), first issued in 1883, is mainly a compendium of works published in the Meteorologische Zeitschrift (“Journal of Meteorology”). The Handbuch was kept current in revised editions until 1911, and this work is still sometimes called the most skillfully written account of world climate.

The 20th century: modern trends and developments

Geologic sciences
      The development of the geologic sciences in the 20th century has been influenced by two major “revolutions.” The first involves dramatic technological advances that have resulted in vastly improved instrumentation, the prime examples being the many types of highly sophisticated computerized devices. The second is centred on the development of the plate tectonics theory, which is the most profound and influential conceptual advance the Earth sciences have ever known.

      Modern technological developments have affected all the different geologic disciplines. Their impact has been particularly notable in such activities as radiometric dating, experimental petrology, crystallography, chemical analysis of rocks and minerals, micropaleontology, and seismological exploration of the Earth's deep interior.

      In 1905, shortly after the discovery of radioactivity, the American chemist Bertram Boltwood suggested that lead is one of the disintegration products of uranium, in which case the older a uranium-bearing mineral the greater should be its proportional part of lead. Analyzing specimens whose relative geologic ages were known, Boltwood found that the ratio of lead to uranium did indeed increase with age. After estimating the rate of this radioactive change, he calculated that the absolute ages of his specimens ranged from 410,000,000 to 2,200,000,000 years. Though his figures were too high by about 20 percent, their order of magnitude was enough to dispose of the short scale of geologic time proposed by Lord Kelvin.

      Versions of the modern mass spectrometer were invented in the early 1920s and 1930s, and during World War II the device was improved substantially to help in the development of the atomic bomb. Soon after the war, Harold C. Urey and G.J. Wasserburg applied the mass spectrometer to the study of geochronology. This device separates the different isotopes of the same element and can measure the variations in these isotopic abundances to within one part in 10,000. By determining the amount of the parent and daughter isotopes present in a sample and by knowing their rate of radioactive decay (each radioisotope has its own decay constant), the isotopic age of the sample can be calculated. For dating minerals and rocks, investigators commonly use the following couplets of parent and daughter isotopes: thorium-232–lead-208, uranium-235–lead-207, samarium-147–neodymium-143, rubidium-87–strontium-87, potassium-40–argon-40, and argon-40–argon-39. The SHRIMP (Sensitive High Resolution Ion Microprobe) enables the accurate determination of the uranium-lead age of the mineral zircon, and this has revolutionized the understanding of the isotopic age of formation of zircon-bearing igneous granitic rocks. Another technological development is the ICP-MS (Inductively Coupled Plasma Mass Spectrometer), which is able to provide the isotopic age of the minerals zircon, titanite, rutile, and monazite. These minerals are common to many igneous and metamorphic rocks.
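The age calculation described here follows directly from the radioactive decay law: with a measured daughter-to-parent ratio D/P and a known decay constant, the age is t = ln(1 + D/P)/λ. A minimal sketch, where the half-life is the published value for uranium-238 and the sample ratio is hypothetical:

```python
import math

def isotopic_age(daughter_to_parent_ratio, half_life_years):
    """Isotopic age from the daughter/parent ratio:
    t = ln(1 + D/P) / lambda, where lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1.0 + daughter_to_parent_ratio) / decay_constant

# Hypothetical sample with equal amounts of parent and daughter (D/P = 1):
# exactly one half-life has elapsed.
U238_HALF_LIFE = 4.468e9  # years (published value for uranium-238)
age = isotopic_age(1.0, U238_HALF_LIFE)
print(f"{age:.3e} years")
```

The same couplet logic applies to any of the parent–daughter pairs listed above; only the decay constant changes.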

      Such techniques have had an enormous impact on scientific knowledge of Earth history because precise dates can now be obtained on rocks in all orogenic (mountain) belts ranging in age from the early Archean (about 3,800,000,000 years old) to the late Tertiary (roughly 20,000,000 years old). The oldest known rocks on Earth, estimated at 4,280,000,000 years old, are the faux amphibolite volcanic deposits of the Nuvvuagittuq greenstone belt in Quebec, Canada. A radiometric dating technique that measures the ratio of the rare-earth elements neodymium and samarium present in a rock sample was used to produce the estimate. Also, by extrapolating backward in time to a point when no lead had yet been produced by radiogenic processes, a figure of about 4,600,000,000 years is obtained for the minimum age of the Earth. This figure is of the same order as ages obtained for certain meteorites and lunar rocks.

Experimental study of rocks
      Experimental petrology began with the work of Jacobus Henricus van't Hoff, one of the founders of physical chemistry. Between 1896 and 1908 he elucidated the complex sequence of chemical reactions attending the precipitation of salts (evaporites) from the evaporation of seawater. Van't Hoff's aim was to explain the succession of mineral salts present in Permian rocks of Germany. His success at producing from aqueous solutions artificial minerals and rocks like those found in natural salt deposits stimulated studies of minerals crystallizing from silicate melts simulating the magmas from which igneous rocks have formed. Working at the Geophysical Laboratory of the Carnegie Institution of Washington, D.C., Norman L. Bowen conducted extensive phase-equilibrium studies of silicate systems, brought together in his Evolution of the Igneous Rocks (1928). Experimental petrology, both at the low-temperature range explored by van't Hoff and in the high ranges of temperature investigated by Bowen, continues to provide laboratory evidence for interpreting the chemical history of sedimentary and igneous rocks. Experimental petrology also provides valuable data on the stability limits of individual metamorphic minerals and of the reactions between different minerals in a wide variety of chemical systems. These experiments are carried out at elevated temperatures and pressures that simulate those operating in different levels of the Earth's crust. Thus the metamorphic petrologist today can compare the minerals and mineral assemblages found in natural rocks with comparable examples produced in the laboratory, the pressure–temperature limits of which have been well defined by experimental petrology.

      Another branch of experimental science relates to the deformation of rocks. In 1906 the American physicist P.W. Bridgman developed a technique for subjecting rock samples to high pressures similar to those deep in the Earth. Studies of the behaviour of rocks in the laboratory have shown that their strength increases with confining pressure but decreases with rise in temperature. Down to depths of a few kilometres the strength of rocks would be expected to increase. At greater depths the temperature effect should become dominant, and response to stress should result in flow rather than fracture of rocks. In 1959 two American geologists, Marion King Hubbert and William W. Rubey, demonstrated that fluids in the pores of rock may reduce internal friction and permit gliding over nearly horizontal planes of the large overthrust blocks associated with folded mountains. More recently the Norwegian petrologist Hans Ramberg performed many experiments with a large centrifuge that produced a negative gravity effect and thus was able to create structures simulating salt domes, which rise because of the relatively low density of the salt in comparison with that of surrounding rocks. With all these deformation experiments, it is necessary to scale down as precisely as possible variables such as the time and velocity of the experiment and the viscosity and temperature of the material from the natural to the laboratory conditions.

      In the 19th century crystallographers were able to study only the external form of minerals, and it was not until 1895, when the German physicist Wilhelm Conrad Röntgen discovered X-rays, that it became possible to consider their internal structure. In 1912 another German physicist, Max von Laue, realized that X-rays were scattered and deflected at regular angles when they passed through a copper sulfate crystal, and so he produced the first X-ray diffraction pattern on a photographic film. A year later William Bragg of Britain and his son Lawrence perceived that such a pattern reflects the layers of atoms in the crystal structure, and they succeeded in determining for the first time the atomic crystal structure of the mineral halite (sodium chloride). These discoveries had a long-lasting influence on crystallography because they led to the development of the X-ray powder diffractometer, which is now widely used to identify minerals and to ascertain their crystal structure.

The chemical analysis of rocks and minerals
      Advanced analytic chemical equipment has revolutionized the understanding of the composition of rocks and minerals. For example, the XRF (X-Ray Fluorescence) spectrometer can quantify the abundances of many major and trace elements in a rock sample down to parts-per-million concentrations. This geochemical method has been used to differentiate successive stages of igneous rocks in the plate-tectonic cycle. The metamorphic petrologist can use the bulk composition of a recrystallized rock to define the composition of the original rock, assuming that no chemical change has occurred during the metamorphic process. The electron microprobe, which bombards a thin microscopic slice of a mineral in a sample with a beam of electrons, can determine the chemical composition of the mineral almost instantly. This method has wide applications in, for example, the fields of industrial mineralogy, materials science, igneous geochemistry, and metamorphic petrology.

      Microscopic fossils, such as ostracods, foraminifera, and pollen grains, are common in sediments of the Mesozoic and Cenozoic eras (from about 250,000,000 years ago to the present). Because the rock chips brought up in oil wells are so small, a high-resolution instrument known as a scanning electron microscope had to be developed to study the microfossils. The classification of microfossils of organisms that lived within relatively short time spans has enabled Mesozoic-Cenozoic sediments to be subdivided in remarkable detail. This technique also has had a major impact on the study of Precambrian life (i.e., organisms that existed more than 542,000,000 years ago). Carbonaceous spheroids and filaments about 7–10 millimetres (0.3–0.4 inch) long are recorded in 3,500,000,000-year-old sediments in the Pilbara region of northwestern Western Australia and in the lower Onverwacht Series of the Barberton belt in South Africa; these are the oldest reliable records of life on Earth.

Seismology and the structure of the Earth
      Earthquake study was institutionalized in 1880 with the formation of the Seismological Society of Japan under the leadership of the English geologist John Milne. Milne and his associates invented the first accurate seismographs, including the instrument later known as the Milne seismograph. Seismology has revealed much about the structure of the Earth's core, mantle, and crust. The English seismologist Richard Dixon Oldham's studies of earthquake records in 1906 led to the discovery of the Earth's core. From studies of the Croatian quake of Oct. 8, 1909, the geophysicist Andrija Mohorovičić discovered the discontinuity (often called the Moho) that separates the crust from the underlying mantle.

      Today there are more than 1,000 seismograph stations around the world, and their data are used to compile seismicity maps. These maps show that earthquake epicentres are aligned in narrow, continuous belts along the boundaries of lithospheric plates (see below). The earthquake foci outline the mid-oceanic ridges in the Atlantic, Pacific, and Indian oceans where the plates separate, while around the margins of the Pacific where the plates converge, they lie in a dipping plane, or Benioff zone, that defines the position of the subducting plate boundary to depths of about 700 kilometres.

      Since 1950, additional information on the crust has been obtained from the analysis of artificial tremors produced by chemical explosions. These studies have shown that the Moho is present under all continents at an average depth of 35 kilometres and that the crust above it thickens under young mountain ranges to depths of 70 kilometres in the Andes and the Himalayas. In such investigations the reflections of the seismic waves generated from a series of “shot” points are also recorded, and this makes it possible to construct a profile of the subsurface structure. This is seismic reflection profiling, the main method of exploration used by the petroleum industry. During the late 1970s, a new technique for generating seismic waves was invented: thumping and vibrating the surface of the ground with a gas-propelled piston from a large truck.
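The reflection method rests on a simple relation: because the seismic wave travels down to the interface and back, the reflector's depth is half the product of the two-way travel time and the average wave velocity. A minimal sketch, using hypothetical survey values chosen so the result lands near the average Moho depth quoted above:

```python
def reflector_depth(two_way_time_s, velocity_m_s):
    """Depth of a reflecting interface from two-way travel time.
    The wave traverses the path twice, so depth = v * t / 2."""
    return velocity_m_s * two_way_time_s / 2.0

# Hypothetical crustal shot: 10 s two-way time at an assumed average
# crustal velocity of 7,000 m/s gives a reflector at 35 km,
# the average continental Moho depth.
print(reflector_depth(10.0, 7000.0))  # 35000.0 (metres)
```

Real profiles stack many such shot points side by side to build the subsurface cross sections used in petroleum exploration.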

The theory of plate tectonics
      Plate tectonics has revolutionized virtually every discipline of the Earth sciences since the late 1960s and early 1970s. It has served as a unifying model or paradigm for explaining geologic phenomena that were formerly considered in unrelated fashion. Plate tectonics describes seismic activity, volcanism, mountain building, and various other Earth processes in terms of the structure and mechanical behaviour of a small number of enormous rigid plates thought to constitute the outer part of the planet (i.e., the lithosphere). This all-encompassing theory grew out of observations and ideas about continental drift and seafloor spreading.

      In 1912 the German meteorologist Alfred Wegener proposed that throughout most of geologic time there was only one continental mass, which he named Pangea. At some time during the Mesozoic Era, Pangea fragmented and the parts began to drift apart. Westward drift of the Americas opened the Atlantic Ocean, and the Indian block drifted across the Equator to join with Asia. In 1937 the South African geologist Alexander Du Toit modified Wegener's hypothesis by suggesting the existence of two primordial continents: Laurasia in the north and Gondwanaland in the south. Aside from the congruency of continental shelf margins across the Atlantic, proponents of continental drift amassed impressive geologic evidence to support their views. Similarities in fossil terrestrial organisms in pre-Cretaceous (older than about 145,000,000 years) strata of Africa and South America and in pre-Jurassic rocks (older than about 200,000,000 years) of Australia, India, Madagascar, and Africa are explained if these continents were formerly connected but are difficult to account for otherwise. Fitting the Americas with the continents across the Atlantic brings together similar kinds of rocks and structures. Evidence of widespread glaciation during the late Paleozoic is found in Antarctica, southern South America, southern Africa, India, and Australia. If these continents were formerly united around the south polar region, this glaciation becomes explicable as a unified sequence of events in time and space.

      Interest in continental drift heightened during the 1950s as knowledge of the Earth's magnetic field during the geologic past developed from the studies of Stanley K. Runcorn, Patrick M.S. Blackett, and others. Ferromagnetic minerals such as magnetite acquire a permanent magnetization when they crystallize as components of igneous rock. The direction of their magnetization is the same as the direction of the Earth's magnetic field at the place and time of crystallization. Particles of magnetized minerals released from their parent igneous rocks by weathering may later realign themselves with the existing magnetic field at the time these particles are incorporated into sedimentary deposits. Studies of the remanent magnetism in suitable rocks of different ages from over the world indicate that the magnetic poles were in different places at different times. The polar wandering curves are different for the several continents, but in important instances these differences are reconciled on the assumption that continents now separated were formerly joined. The curves for Europe and North America, for example, are reconciled by the assumption that America has drifted about 30° westward relative to Europe since the Triassic Period (approximately 200,000,000 to 250,000,000 years ago).

      In the early 1960s a major breakthrough in understanding the way the modern Earth works came from two studies of the ocean floor. First, the American geophysicists Harry H. Hess and Robert S. Dietz suggested that new ocean crust was formed along mid-oceanic ridges between separating continents; and second, Drummond H. Matthews and Frederick J. Vine of Britain proposed that the new oceanic crust acted like a magnetic tape recorder insofar as magnetic anomaly strips parallel to the ridge had been magnetized alternately in normal and reversed order, reflecting the changes in polarity of the Earth's magnetic field. This theory of seafloor spreading then needed testing, and the opportunity arose from major advances in deep-water drilling technology. The Joint Oceanographic Institutions Deep Earth Sampling (JOIDES) project began in 1969, continued with the Deep Sea Drilling Project (DSDP), and, since 1976, with the International Phase of Ocean Drilling (IPOD) project. These projects have produced more than 500 boreholes in the floor of the world's oceans, and the results have been as outstanding as the plate-tectonic theory itself. They confirm that the oceanic crust is everywhere younger than about 200,000,000 years and that the stratigraphic age determined by micropaleontology of the overlying oceanic sediments is close to the age of the oceanic crust calculated from the magnetic anomalies.
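The dated magnetic anomalies permit a simple estimate of how fast the seafloor spreads: divide an anomaly's distance from the ridge axis by its age. A sketch with hypothetical figures (the distance and age below are illustrative, not measurements from any particular ridge):

```python
def half_spreading_rate_cm_per_yr(distance_km, anomaly_age_myr):
    """Half-spreading rate of one ridge flank: the distance a dated
    magnetic anomaly now lies from the ridge axis, divided by its age."""
    distance_cm = distance_km * 1e5      # 1 km = 100,000 cm
    age_yr = anomaly_age_myr * 1e6
    return distance_cm / age_yr

# Hypothetical anomaly 1,000 km from the ridge axis, dated at 50 Myr:
rate = half_spreading_rate_cm_per_yr(1000.0, 50.0)
print(rate)  # 2.0 cm per year for this flank
```

Rates of a few centimetres per year, integrated over 200,000,000 years, are consistent with the observation that no oceanic crust older than that survives.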

      The plate-tectonic theory, which embraces both continental drift and seafloor spreading, was formulated in the mid-1960s by the Canadian geologist J. Tuzo Wilson, who described the network of mid-oceanic ridges, transform faults, and subduction zones as boundaries separating an evolving mosaic of enormous plates, and who proposed the idea of the opening and closing of oceans and eventual production of an orogenic belt by the collision of two continents.

      Up to this point, no one had considered in any detail the implications of the plate-tectonic theory for the evolution of continental orogenic belts; most thought had been devoted to the oceans. In 1969 John Dewey of the University of Cambridge outlined an analysis of the Caledonian-Appalachian orogenic belts in terms of a complete plate-tectonic cycle of events, and this provided a model for the interpretation of other pre-Mesozoic (Paleozoic and Precambrian) belts. Even the oldest volcano-sedimentary rocks on Earth, in the 3,800,000,000-year-old Isua belt in West Greenland, have been shown by geologists from the Tokyo Institute of Technology to have formed in a plate tectonic setting, i.e., in a trench or mouth of a subduction zone. For a detailed discussion of plate-tectonic theory and its far-reaching effects, see plate tectonics.

water resources and seawater chemistry
      Quantitative studies of the distribution of water have revealed that an astonishingly small part of the Earth's water is contained in lakes and rivers. Ninety-seven percent of all the water is in the oceans; and, of the fresh water constituting the remainder, three-fourths is locked up in glacial ice and most of the rest is in the ground. Approximate figures are also now available for the amounts of water involved in the different stages of the hydrologic cycle. Of the 859 millimetres of annual global precipitation, 23 percent falls on the lands; but only about a third of the precipitation on the lands runs directly back to the sea, the remainder being recycled through the atmosphere by evaporation and transpiration. Subsurface groundwater accumulates by infiltration of rainwater into soil and bedrock. Some may run off into rivers and lakes, and some may reemerge as springs or aquifers. Advanced techniques are used extensively in groundwater studies nowadays. The rate of groundwater flow, for example, can be estimated from the decay of radioactive carbon-14, which indicates the time rainwater has taken to pass through the ground, while numerical modeling is used to study heat and mass transfer in groundwater. High-precision equipment is used for measuring down-hole temperature, pressure, flow rate, and water level. Groundwater hydrology is important in studies of fractured reservoirs, subsidence resulting from fluid withdrawal, geothermal resource exploration, radioactive waste disposal, and aquifer thermal-energy storage.
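The hydrologic-cycle figures quoted above can be checked arithmetically; this sketch simply recomputes the land-precipitation budget from the stated totals:

```python
# Global water-budget figures as quoted in the text
global_precip_mm = 859.0            # mean annual global precipitation
land_fraction_of_precip = 0.23      # 23 percent falls on the lands
runoff_fraction = 1.0 / 3.0         # roughly one-third runs directly back to sea

land_precip = global_precip_mm * land_fraction_of_precip
direct_runoff = land_precip * runoff_fraction
recycled = land_precip - direct_runoff  # returned by evaporation and transpiration

print(f"falls on land:        {land_precip:.0f} mm")
print(f"direct runoff to sea: {direct_runoff:.0f} mm")
print(f"recycled to air:      {recycled:.0f} mm")
```

The arithmetic makes the point of the passage concrete: roughly two-thirds of the rain that falls on land never reaches the sea as direct runoff.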

      Chemical analyses of trace elements and isotopes of seawater are conducted as part of the Geochemical Ocean Sections (Geosecs) program. Of the 92 naturally occurring elements, nearly 80 have been detected in seawater or in the organisms that inhabit it, and it is thought to be only a matter of time until traces of the others are detected. Contrary to the idea, widely circulated in the older literature of oceanography, that the relative proportions of the oceans' dissolved constituents are constant, investigations since 1962 have revealed statistically significant variations in the ratios of calcium and strontium to chlorinity. The role of organisms as influences on the composition of seawater has become better understood with advances in marine biology. It is now known that plants and animals may collect certain elements to concentrations as much as 100,000 times their normal amounts in seawater. Abnormally high concentrations of beryllium, scandium, chromium, and iodine have been found in algae; of copper and arsenic in both the soft and skeletal parts of invertebrate animals; and of zirconium and cerium in plankton.

Desalinization, tidal power, and minerals from the sea
      For ages a source of food and common salt, the sea is increasingly becoming a source of water, chemicals, and energy. In 1967 Key West, Fla., became the first U.S. city to be supplied solely by water from the sea, drawing its supplies from a plant that produces more than 2,000,000 gallons of refined water daily. Magnesia was extracted from the Mediterranean in the late 19th century; at present nearly all the magnesium metal used in the United States is mined from the sea at Freeport, Texas. Many ambitious schemes for using tidal power have been devised, but the first major hydrographic project of this kind was not completed until 1967, when a dam and electrical generating equipment were installed across the Rance River in Brittany. The seafloor and the strata below the continental shelves are also sources of mineral wealth. Concretions of manganese oxide, evidently formed in the process of subaqueous weathering of volcanic rocks, have been found in dense concentrations with a total abundance of some 10¹¹ tons. In addition to manganese, these concretions contain copper, nickel, cobalt, zinc, and molybdenum. To date, oil and gas have been the most valuable products recovered from beneath the sea.

      Modern bathymetric charts show that about 20 percent of the surfaces of the continents are submerged to form continental shelves. Altogether the shelves form an area about the size of Africa. Continental slopes, which slant down from the outer edges of the shelves to the abyssal plains of the seafloor, are nearly everywhere furrowed by submarine canyons. The depths to which these canyons have been cut below sea level seem to rule out the possibility that they are drowned valleys cut by ordinary streams. More likely the canyons were eroded by turbidity currents, dense mixtures of mud and water that originate as mud slides in the heads of the canyons and pour down their bottoms.

      Profiling of the Pacific Basin prior to and during World War II resulted in the discovery of hundreds of isolated eminences rising 1,000 or more metres above the floor. Of particular interest were seamounts in the shape of truncated cones, whose flat tops rise to between 1.6 kilometres and a few hundred metres below the surface. Harry H. Hess interpreted the flat-topped seamounts (guyots) as volcanic mountains planed off by action of waves before they subsided to their present depths. Subsequent drilling in guyots west of Hawaii confirmed this view; samples of rocks from the tops contained fossils of Cretaceous age representing reef-building organisms of the kind that inhabit shallow water.

Ocean circulation, currents, and waves
      Early in the century Vilhelm Bjerknes, a Norwegian meteorologist, and V. Walfrid Ekman, a Swedish physical oceanographer, investigated the dynamics of ocean circulation and developed theoretical principles that have influenced subsequent studies of currents in the sea. Bjerknes showed that very small forces resulting from pressure differences caused by nonuniform density of seawater can initiate and maintain fluid motion. Ekman analyzed the influence of winds and the Earth's rotation on currents. He theorized that in a homogeneous medium the frictional effects of winds blowing across the surface would set successively lower layers of water in motion; the deeper the current so produced, the lower its velocity and the greater its deflection by the Coriolis effect (an apparent force due to the Earth's rotation that deflects a moving body to the right in the Northern Hemisphere and to the left in the Southern Hemisphere), until at some critical depth the induced current moves in a direction opposite to that of the wind.
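Ekman's idealized solution can be sketched numerically: current speed decays exponentially with depth while the direction rotates through 180 degrees over the Ekman depth, so the deepest current opposes the surface current. The surface speed V0 and Ekman depth D below are arbitrary illustrative values, not measurements:

```python
import math

def ekman_current(z, V0=0.1, D=100.0):
    """Ekman's idealized wind-driven current (Northern Hemisphere).
    z is depth in metres (negative downward); V0 is the surface current
    speed (m/s) and D the Ekman depth (m), both hypothetical here.
    The surface current is deflected 45 degrees from the wind; with depth
    the direction rotates a further 180 degrees over one Ekman depth while
    the speed decays by the factor e**(-pi), about 4 percent."""
    speed = V0 * math.exp(math.pi * z / D)
    direction_deg = 45.0 + 180.0 * (z / D)  # degrees relative to the wind
    return speed, direction_deg

surface_speed, surface_dir = ekman_current(0.0)    # deflected 45 deg at surface
deep_speed, deep_dir = ekman_current(-100.0)       # opposes the surface current
```

At z = -D the direction is 45 - 180 = -135 degrees, exactly opposite the surface value of +45 degrees, which is the reversed current Ekman predicted at the critical depth.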

      Results of many investigations suggest that the forces that drive the ocean currents originate at the interface between water and air. The direct transfer of momentum from the atmosphere to the sea is doubtless the most important driving force for currents in the upper parts of the ocean. Next in importance are differential heating, evaporation, and precipitation across the air–sea boundary, altering the density of seawater and thus initiating movement of water masses with different densities. Studies of the properties and motion of water at depth have shown that strong currents also exist in the deep sea and that distinct types of water travel far from their geographic sources. For example, the highly saline water of the Mediterranean that flows through the Strait of Gibraltar has been traced over a large part of the Atlantic, where it forms a deep-water stratum that is circulated far beyond that ocean in currents around Antarctica.

      Improvements in devices for determining the motion of seawater in three dimensions have led to the discovery of new currents and to the disclosure of unexpected complexities in the circulation of the oceans generally. In 1951 a huge countercurrent moving eastward across the Pacific was found below depths as shallow as 20 metres, and in the following year an analogous equatorial undercurrent was discovered in the Atlantic. In 1957 a deep countercurrent was detected beneath the Gulf Stream with the aid of subsurface floats emitting acoustic signals.

      Since the 1970s Earth-orbiting satellites have yielded much information on the temperature distribution and thermal energy of ocean currents such as the Gulf Stream. Chemical analyses from the GEOSECS (Geochemical Ocean Sections Study) program have made it possible to determine the circulation paths, speeds, and mixing rates of ocean currents.

      Surface waves of the ocean are also exceedingly complex, at most places and times reflecting the coexistence and interference of several independent wave systems. During World War II, interest in forecasting wave characteristics was stimulated by the need for this critical information in the planning of amphibious operations. The oceanographers H.U. Sverdrup and Walter Heinrich Munk combined theory and empirical relationships to develop a method of forecasting "significant wave height," the average height of the highest third of the waves in a wave train. Subsequently, this method was improved to permit wave forecasters to predict optimal routes for mariners. Forecasting tsunamis (popularly, "tidal waves"), the most destructive of all waves, which are caused by submarine earthquakes and volcanic eruptions, is another recent development. Soon after 159 persons were killed in Hawaii by the tsunami of 1946, the U.S. Coast and Geodetic Survey established a seismic sea-wave warning system. Using a seismic network to locate the epicentres of submarine earthquakes, the installation predicts the arrival of tsunamis at points around the Pacific Basin, often hours before the waves themselves arrive.
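The Sverdrup-Munk statistic is simple to compute. The sketch below, with made-up wave heights, takes the mean of the highest one-third of observed waves:

```python
# Significant wave height: the average of the highest one-third of
# wave heights in a record (the Sverdrup-Munk definition).
def significant_wave_height(heights):
    """Return the mean of the highest third of a list of wave heights."""
    ranked = sorted(heights, reverse=True)
    top_third = ranked[: max(1, len(ranked) // 3)]
    return sum(top_third) / len(top_third)

waves = [0.5, 1.2, 0.8, 2.1, 1.7, 0.9, 1.4, 2.5, 1.1]  # metres (illustrative)
hs = significant_wave_height(waves)  # mean of 2.5, 2.1, and 1.7 metres
```

The statistic correlates well with the wave height an experienced observer reports by eye, which is why it became the standard forecasting quantity.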

Glacier motion and the high-latitude ice sheets
      Beginning around 1948, principles and techniques in metallurgy and solid-state physics were brought to bear on the mechanics of glacial movements. Laboratory studies showed that glacial ice deforms like other crystalline solids (such as metals) at temperatures near the melting point. Continued stress produces permanent deformation. In addition to plastic deformation within a moving glacier, the glacier itself may slide over its bed by mechanisms involving pressure melting and refreezing and accelerated plastic flow around obstacles. The causes underlying changes in rate of glacial movement, in particular spectacular accelerations called surges, require further study. Surges involve massive transfer of ice from the upper to the lower parts of glaciers at rates of as much as 20 metres a day, in comparison with normal advances of a few metres a year.

      As a result of numerous scientific expeditions into Greenland and Antarctica, the dimensions of the remaining great ice sheets are fairly well known from gravimetric and seismic surveys. In parts of both continents it has been determined that the base of the ice is below sea level, probably due at least in part to subsidence of the crust under the weight of the caps. In 1966 a borehole was drilled 1,390 metres to bedrock on the north Greenland ice sheet, and two years later a similar boring of 2,162 metres was cut through the Antarctic ice at Byrd Station. From the study of annual incremental layers and analyses of oxygen isotopes, the bottom layers of ice cored in Greenland were estimated to be more than 150,000 years old, compared with 100,000 years for the Antarctic core. With the advent of geochemical dating of rocks it has become evident that the Ice Age, which in the earlier part of the century was thought to have occurred entirely within the Quaternary Period, actually began much earlier. In Antarctica, for example, potassium–argon age determinations of lava overlying glaciated surfaces and sedimentary deposits of glacial origin show that glaciers existed on this continent at least 10,000,000 years ago.
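A potassium-argon age follows from the radioactive decay law. The sketch below uses the standard decay constants for potassium-40; the argon/potassium ratio is an illustrative value, not a measurement from the Antarctic lavas:

```python
import math

# Potassium-argon dating from the decay law. The decay constants are
# the conventional values for 40K; the sample ratio is hypothetical.
LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant, per year
LAMBDA_EC    = 0.581e-10   # branch producing 40Ar (electron capture), per year

def k_ar_age(ar40_per_k40):
    """Age in years from a measured radiogenic 40Ar/40K ratio.

    t = (1/lambda) * ln(1 + (lambda/lambda_EC) * 40Ar/40K)
    """
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_per_k40
    )

# A ratio near 5.8e-4 yields an age on the order of 10 million years,
# the order of magnitude of the glacial evidence cited above.
```

Because so little argon accumulates in young rocks, such measurements demand careful extraction of minute gas quantities.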

      The study of ice sheets has benefited much from data produced by advanced instruments, computers, and orbiting satellites. The shape of ice sheets can be determined by numerical modeling, their heat budget from thermodynamic calculations, and their thickness with radar techniques. Colour images from satellites show the temperature distribution across the polar regions, which can be compared with the distribution of land and sea ice.

Atmospheric sciences
Probes, satellites, and data transmission
      Kites equipped with meteorographs were used as atmospheric probes in the late 1890s, and in 1907 the U.S. Weather Bureau recorded the ascent of a kite to 7,044 metres above Mount Weather, Virginia.

      In the 1920s the radio replaced the telegraph and telephone as the principal instrument for transmitting weather data. By 1936 the radio meteorograph (radiosonde) had been developed, capable of sending signals on relative humidity, temperature, and barometric pressure from unmanned balloons. Experimentation with balloons up to altitudes of about 31 kilometres showed that columns of warm air may rise more than 1.6 kilometres above the Earth's surface and that the lower atmosphere is often stratified, with winds in the different layers blowing in different directions. During the 1930s airplanes began to be used for observations of the weather, and the years since 1945 have seen the development of rockets and weather satellites. TIROS (Television Infrared Observation Satellite), the world's first all-weather satellite, was launched in 1960, and in 1964 the Nimbus satellite of the United States National Aeronautics and Space Administration (NASA) was rocketed into near-polar orbit.

      There are two types of weather satellites: polar and geostationary. Polar satellites, like Nimbus, orbit the Earth at low altitudes of a few hundred kilometres, and, because each successive orbit drifts westward over the rotating Earth, they produce photographic coverage of the entire Earth every 24 hours. Geostationary satellites, first sent up in 1966, are stationed over the Equator at altitudes of about 35,000 kilometres and transmit data at regular intervals. Much information can be derived from the data collected by satellites. For example, wind speed and direction are measured from cloud trajectories, while temperature and moisture profiles of the atmosphere are calculated from infrared data.
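The geostationary altitude is not arbitrary: a satellite whose orbital period matches one sidereal day must, by Kepler's third law, sit at a fixed radius. A short check with standard constants:

```python
import math

# Why geostationary satellites orbit near 35,800 km: Kepler's third
# law fixes the radius r = (GM * T^2 / (4 * pi^2))^(1/3) for a period
# T equal to one sidereal day. Constants are standard reference values.
GM = 3.986004418e14       # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86164.1      # sidereal day, s
R_EARTH = 6.378137e6      # Earth's equatorial radius, m

r = (GM * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - R_EARTH) / 1000.0   # about 35,786 km above the Equator
```

This is why the article's round figure of "about 35,000 kilometres" corresponds to a unique orbit over the Equator.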

      Efforts at incorporating numerical data on weather into mathematical formulas that could then be used for forecasting were initiated early in the century at the Norwegian Geophysical Institute. Vilhelm Bjerknes and his associates at Bergen succeeded in devising equations relating the measurable components of weather, but their complexity precluded the rapid solutions needed for forecasting. Out of their efforts, however, came the polar front theory for the origin of cyclones and the now-familiar names of cold front, warm front, and stationary front for the leading edges of air masses (see climate: Atmospheric pressure and wind).

      In 1922 the British mathematician Lewis Fry Richardson demonstrated that the complex equations of the Norwegian school could be reduced to long series of simple arithmetic operations. With no more than the desk calculators and slide rules then available, however, solving the problem of procedure only raised a new one of manpower. In 1946 the mathematician John von Neumann and his fellow workers at the Institute for Advanced Study, in Princeton, N.J., began work on an electronic device to do the computation faster than the weather developed. Four years later the von Neumann group could claim that, given adequate data, their computer could forecast the weather as well as a weatherman. Present-day numerical weather forecasting is achieved with the help of advanced computer analysis (see weather forecasting).
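Richardson's reduction of differential equations to arithmetic can be shown in miniature. The toy model below is purely illustrative (not any historical scheme): a quantity carried along by a constant wind is stepped forward with an upwind finite-difference rule, each step being nothing but subtractions and multiplications:

```python
import numpy as np

# A toy one-dimensional advection equation, dq/dt = -wind * dq/dx,
# stepped forward with an upwind finite-difference scheme. Grid size,
# wind, and time step are arbitrary illustrative values.
def step(q, wind, dx, dt):
    """One upwind finite-difference time step of dq/dt = -wind * dq/dx."""
    dq = np.diff(q, prepend=q[0])   # backward difference in x
    return q - wind * dt / dx * dq

# A bump of, say, moisture drifts downwind one grid cell per ten steps
# (the Courant number wind*dt/dx is 0.1, so the scheme is stable).
q = np.zeros(50)
q[10] = 1.0
for _ in range(100):
    q = step(q, wind=10.0, dx=1000.0, dt=10.0)
```

After 100 steps the bump's centre of mass has moved 10 cells downwind, spread somewhat by the scheme's numerical diffusion, which is exactly the kind of repetitive arithmetic that made hand computation hopeless and electronic computation decisive.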

      Studies of cloud physics have shown that the nuclei around which water condenses vary widely in concentration and areal distribution, ranging from six per cubic centimetre over the oceans to more than 4,000,000 per cubic centimetre in the polluted air of some cities. The droplets that condense on these foreign particles may be as small as 0.001 centimetre in diameter. Raindrops apparently may form directly from the coalescence of these droplets, as in the case of tropical rains, or, in the temperate zones, through the intermediary of ice crystals. According to the theory of Tor Bergeron and Walter Findeisen, vapour freezing on ice crystals in the clouds enlarges the crystals until they fall. What finally hits the ground depends on the temperature of the air below the cloud—if below freezing, snow; if above, rain.

Properties and structure of the atmosphere
      Less than a year after the space age began with the launching of the Soviet Sputnik I in 1957, the U.S. satellite Explorer I was sent into orbit with a Geiger counter for measuring the intensity of cosmic radiation at different levels above the ground. At altitudes around 1,000 kilometres this instrument ceased to function, saturated by charged particles. This and subsequent investigations showed that zones of radiation encircle the world between about latitude 75° N and 75° S, with maximum intensities at 5,000 and 16,000 kilometres. Named for the American physicist James Van Allen, a leading investigator of this portion of the Earth's magnetosphere, these zones are responsive to events taking place on the Sun. The solar wind, a stream of atomic particles emanating from the Sun in all directions, seems to be responsible for the electrons entrapped in the Van Allen region as well as for the teardrop shape of the magnetosphere as a whole, with its tail pointing always away from the Sun.

      In 1898 Léon Teisserenc de Bort, studying variations of temperature at high altitudes with the aid of balloons, discovered that the steady decrease of temperature with height (about 5.5° C per 1,000 metres of ascent) ceased at elevations of about 11 kilometres, above which the temperature remained nearly constant at around -55° C. He named the atmospheric zones below and above this boundary the troposphere and the stratosphere.
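Teisserenc de Bort's two-zone picture amounts to a piecewise-linear temperature profile. The sketch below uses the lapse rate and tropopause height quoted above; the surface temperature is an assumed value chosen so that the quoted figures are mutually consistent:

```python
# A sketch of the troposphere/stratosphere temperature profile.
# LAPSE_RATE and TROPOPAUSE follow the figures in the text; T_SURFACE
# is an assumption picked so the tropopause comes out at about -55 C.
LAPSE_RATE = 5.5 / 1000.0   # degrees C lost per metre of ascent
T_SURFACE = 5.5             # assumed surface temperature, degrees C
TROPOPAUSE = 11000.0        # tropopause height, metres

def temperature_at(altitude_m):
    """Approximate air temperature (degrees C) at a given altitude."""
    if altitude_m <= TROPOPAUSE:
        return T_SURFACE - LAPSE_RATE * altitude_m        # troposphere
    return T_SURFACE - LAPSE_RATE * TROPOPAUSE            # near-isothermal stratosphere
```

The kink in this profile at 11 kilometres is the tropopause, the boundary Teisserenc de Bort's balloon soundings revealed.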

      Toward the end of World War II the B-29 Superfortress came into use as the first large aircraft to cruise regularly at 10,000 metres. Heading westward from bases in the Pacific, these planes sometimes encountered unexpected head winds that slowed their flight by as much as 300 kilometres per hour. The jet streams, as these high-altitude winds were named, have been found to encircle the Earth, following wavy courses and moving from west to east at velocities ranging up to 500 kilometres per hour. Aircraft have also proved useful in studies of the structure and dynamics of tropical hurricanes. Following the destruction wrought on the Atlantic Coast of the United States in 1955 by hurricanes Connie and Diane, a national centre was established in Florida with the mission of locating and tracking hurricanes and, it is hoped, of learning how to predict their paths and dissipate their energy.

      As late as the 1890s experiments were conducted in the United States in the hope of producing rain by setting off charges of dynamite lofted by balloons or kites. No positive results were reported, however. More promising were the cloud-seeding experiments of the 1940s, in which silver iodide smoke or pellets of solid carbon dioxide (dry ice) were released into clouds from airplanes. Whether seeding increases precipitation remains uncertain. The lessons learned from cloud seeding, however, have had other successful applications, such as the dispersal of low-level supercooled fog at airports (the first system designed for this purpose, the Turboclair fog-dissipation system, was set up in 1970 at Orly airport in Paris).

      The inadvertent weather modification that has accompanied industrialization and the building of large cities has already produced measurable changes in local climate and may someday produce more widespread effects. The introduction of some 12,000,000,000 tons of carbon dioxide into the atmosphere each year from the burning of fuels may in time raise the Earth's average temperature. Cities affect the flow of wind, warm the atmosphere over them, and send pollutants into the sky. Updrafts and an abundance of condensation nuclei may increase rainfall and winter fog and reduce sunshine and daylight.

Claude C. Albritton; Brian Frederick Windley

Additional Reading
The history of the Earth sciences is recounted in Frank Dawson Adams, The Birth and Development of the Geological Sciences (1938, reprinted 1954), the best general account for the years prior to 1830; Asit K. Biswas, History of Hydrology (1970), a factual chronicle of developments since the earliest times; Henry Faul and Carol Faul, It Began with a Stone: A History of Geology from the Stone Age to the Age of Plate Tectonics (1983); Mott T. Greene, Geology in the Nineteenth Century: Changing Views on a Changing World (1982), a history of tectonic thinking concerned with the formation of mountains and earth evolution; A. Hallam, A Revolution in the Earth Sciences (1973), a summary of the historical development of ideas from seafloor spreading to plate tectonics, and Great Geological Controversies (1983), an evaluation of celebrated controversies from Neptunism to continental drift; Robert Muir Wood, The Dark Side of the Earth: The Battle for the Earth Sciences, 1800–1980 (1985), a history of important controversies; Richard J. Chorley, Antony J. Dunn, and Robert P. Beckinsale, The History of the Study of Landforms; or, The Development of Geomorphology, vol. 1, Geomorphology Before Davis (1964), an expansive account covering developments to the end of the 19th century; Charles C. Gillispie, Genesis and Geology: A Study in the Relations of Scientific Thought, Natural Theology, and Social Opinion in Great Britain, 1790–1850 (1951, reprinted 1969), an analysis of the impact of developments in geology upon Christian beliefs in the decades before Darwin (extensive bibliography); C.P. Idyll (ed.), Exploring the Ocean World: A History of Oceanography, rev. ed. 
(1972), a symposium treating each of the several branches of oceanography in historical format; Rachel Laudan, From Mineralogy to Geology: The Foundations of a Science, 1650–1830 (1987), which traces the intellectual roots of geology to mineralogy and chemical cosmogony; Joseph Needham, Science and Civilisation in China, vol. 3, Mathematics and the Sciences of the Heavens and the Earth (1959), containing a comprehensive and elaborately illustrated account of the history of Earth science in China to around AD 1500; Cecil J. Schneer, “The Rise of Historical Geology in the 17th Century,” Isis, vol. 45, part 3, no. 141, pp. 256–268 (September 1954), an analysis of the points at issue in the fossil controversy; Cecil J. Schneer (ed.), Toward a History of Geology (1969), 25 essays on the history of geologic thought, mainly of the 18th and 19th centuries; Napier Shaw, Manual of Meteorology, vol. 1, Meteorology in History (1926, reprinted 1932), a rambling but literate and entertaining history of meteorology from the earliest to modern times; Evelyn Stokes, “Fifteenth Century Earth Science,” Earth Sciences Journal, 1(2):130–148 (1967), an analysis of classical and medieval views of nature, especially those reflected in Caxton's Mirrour of the World; Philip D. Thompson et al., Weather, rev. ed. (1980), an introduction to meteorology with much historical material, well illustrated; Stephen Toulmin and June Goodfield, The Discovery of Time (1965, reprinted 1983), which traces the history of the idea of geologic time; William Whewell, History of the Inductive Sciences from the Earliest to the Present Time, 3rd ed., 3 vol. (1857, reissued 1976), vol. 2 containing an analysis of uniformitarian and catastrophist views of Earth history; and Karl Alfred von Zittel, History of Geology and Palaeontology to the End of the Nineteenth Century (1901, reissued 1962; originally published in German, 1899, reprinted 1965), best for its history of paleontology.
Brian Frederick Windley

* * *

Universalium. 2010.
