
Geophysics and Petroleum Exploration in North America: A Time for Reflection

We take a look at the advent of ‘remote’ sensing and the adoption of seismology in petroleum exploration in North America.

Prospecting for oil in the early 20th century had come a long way from the days of riding around the prairie on a horse and drilling wherever your hat fell off. Even so, however you looked at it, the extent to which a geologist could predict the subsurface structure was limited. Anticlines might have been the classic indicator of an oil-bearing structure, but only drilling could prove the presence of oil, and then often by chance. The advent of ‘remote’ sensing fundamentally changed this.

Salt domes were often associated with oil fields. In 1924 Everette DeGolyer discovered the Nash Salt Dome in Texas using an Eötvös torsion balance, a device for measuring anomalies in local gravity – the first time that a significant oil field was located by geophysical means in the United States. Gravimeters and magnetometers were subsequently developed, and the Schlumberger brothers pioneered electrical surveying in Europe, but none of these methods could supplant seismology, which arrived on the scene in the 1920s.

Seismology originated from the study of earthquakes many years before it was applied to petroleum exploration. It was understood that sound waves passed through the ground at different velocities according to the density of the rock they encountered and, by recording them, the subsurface structure could be delineated. During World War I, German mining engineer Ludger Mintrop developed a portable seismic device for locating enemy gun emplacements. In 1921 he formed a company named Seismos, and three years later a Seismos crew was hired by Gulf Oil to search for shallow salt domes in Texas using the refraction method.

Seismology – the Turning Point

However, experiments in reflection seismology marked a turning point. While the refraction method sent sound waves down and then laterally through the Earth, the more powerful but expensive reflection method measured waves reflected directly from below. The original research was carried out by Canadian radio pioneer Reginald Fessenden. In September 1917 he patented his ‘Method and Apparatus for Locating Ore Bodies’, a design for sound-generating equipment to be used in geophysical prospecting.

American physicist John C. Karcher, who worked for the Sound Section of the US Bureau of Standards on artillery detection, was aware of Fessenden’s work on reflection seismology. After the war, he researched its possible use in petroleum exploration. In December 1928, based on his earlier findings, the Amerada Petroleum Corporation drilled into the Viola limestone formation in Oklahoma and discovered oil – the first well to be drilled in a structure discovered by the reflection method. Karcher’s work earned him the sobriquet ‘Father of Reflection Seismology’.

By now DeGolyer, vice president and general manager of Amerada, was closely involved with Karcher through a subsidiary, the Geophysical Research Corporation. In 1930, with Eugene McDermott and the backing of DeGolyer, Karcher launched Geophysical Service Incorporated (GSI), which would later become Texas Instruments, with GSI as a subsidiary. The reflection method, more suited to greater depths and marine environments than refraction, became a primary means of petroleum exploration, although the latter still had a role to play.

  • A GSI crew in South Louisiana in the early 1930s. Credit: DeGolyer Library, SMU.

Out in the Field

Deep in the southern tip of Canada’s vast coniferous belt, dotted with innumerable muskeg, haunted by wails of coyote and wolf, Western F-53 maintains the company colors in this wild and rugged no man’s land 300 miles northwest of the provincial capital city of Edmonton.

By the 1950s, seismic crews were operating across the continent, albeit with differences in working conditions between prairie, mountain, desert, or sea; it was not all camping in the wilderness, but in some cases a question of finding accommodation in a nearby town. We can imagine the scene when a survey party set off in convoy for a new location. At times they would leave town in multiple convoys to throw oil scouts off the scent and keep the new survey locations secret.

  • A Seismos convoy in Louisiana in 1926. Credit: Gerhard Keppner.

Research included published literature and work on theoretical mathematics. Dr Milton Dobrin’s Introduction to Geophysical Prospecting was the most influential textbook of its day. But essentially it was field work that kept the wheels of the industry turning. By the 1960s, the cost of geophysical and geological surveys varied considerably and could exceed the drilling costs of a well: a survey might run from $20,000 to $50,000 a month ($425,000 today) over several months or years, with only a few prospects in return. Nevertheless, it often made sense to drill, since the additional costs were nominal compared with those that had gone before.

Exploration was a lengthy business: there was the time needed to plan, contract a seismic crew and a company to acquire and process the data, perhaps within a weather window, and then finally to interpret it, which often took years before there were results to drill upon. What was acquired, processed, and reprocessed depended on the cost of drilling a well, the requirements of governments, and the type and duration of a concession.

Seismic is King

After World War II, innovations such as the airborne magnetometer, computers and magnetic tape were increasingly used. The mid-1950s saw the introduction of sonic logs, which brought greater precision to measuring the depth of structures identified by seismology. Since seismic data is usually measured in time, while geophysicists, geologists, and drillers work in depth, the process of depth conversion, needed to produce maps and allow drilling to proceed, was an important part of exploration and development. If a depth conversion was incorrect, expected reservoir levels could be hundreds of metres too deep or too shallow, bringing increased exploration costs and risk.
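To make the depth-conversion arithmetic concrete, here is a minimal sketch of a layer-by-layer time-to-depth calculation. The picked two-way times and interval velocities are invented purely for illustration; a real workflow would use velocities calibrated to well data.

```python
import numpy as np

# Hypothetical two-way travel times (s) picked at the base of each layer,
# and assumed interval velocities (m/s) for those layers (illustrative values only).
twt_base = np.array([0.50, 1.20, 1.80])          # two-way time to base of layers 1..3
v_interval = np.array([2000.0, 2800.0, 3500.0])  # interval velocity of layers 1..3

# Convert the two-way time spent in each layer to a thickness, then accumulate depth.
twt_top = np.concatenate(([0.0], twt_base[:-1]))
thickness = v_interval * (twt_base - twt_top) / 2.0   # depth = velocity * one-way time
depth_to_base = np.cumsum(thickness)

for t, z in zip(twt_base, depth_to_base):
    print(f"two-way time {t:.2f} s  ->  depth {z:.0f} m")

# A 10% error in the deepest interval velocity moves the target by roughly 100 m,
# which is how a poor depth conversion puts a reservoir too deep or too shallow.
```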

Magnetic tape allowed relatively large amounts of data to be recorded and analysed, and seismic reflections to be manipulated to cut out extraneous ‘noise’ and enhance the essential signal. In an Alaskan summer, for example, the sound of ice thawing, vegetation moving and the wind blowing could interfere with seismic surveys, while in the winter the freezing of the ice had a similar effect. But that was in the days of old-style single-fold shooting, which was replaced by common depth point (CDP) shooting, a technique invented by Harry Mayne of the Petty Geophysical Engineering Company that cancelled much of the ‘noise’ around seismic readings. With magnetic tape, companies could process the extra data in a cost-effective way.
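The principle Mayne exploited can be sketched in a few lines: traces that share a common depth point are summed after alignment, so random noise tends to cancel while the reflection reinforces. In this toy example the wavelet, fold and noise level are invented, and the traces are assumed to be already moveout-corrected.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, fold = 500, 12        # samples per trace, traces sharing a common depth point
dt = 0.004                       # 4 ms sampling
t = np.arange(n_samples) * dt

def wavelet(t, t0=1.0, f=25.0):
    """A Ricker-like reflection wavelet centred at t0 seconds."""
    arg = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * arg) * np.exp(-arg)

signal = wavelet(t)

# Each trace is the same (already moveout-corrected) reflection plus independent noise.
traces = np.array([signal + rng.normal(0.0, 0.8, n_samples) for _ in range(fold)])

stack = traces.mean(axis=0)      # the CDP stack

def snr(trace):
    peak = np.abs(trace[240:260]).max()   # around the reflection at ~1.0 s
    return peak / trace[:200].std()       # early samples are noise only

print(f"single-fold SNR ~ {snr(traces[0]):.1f}")
print(f"{fold}-fold stack SNR ~ {snr(stack):.1f}")   # roughly sqrt(fold) better
```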

Computing Power

By 1972 the power of computers and the range of data acquisition were such that 3D imaging was technically possible. After a brainstorming session at GSI headquarters in Dallas, and with the support of six major oil companies, Bell Lake field in New Mexico was chosen as the test site. The results were remarkable: now seismic sections of the subsurface could be displayed in any orientation. However, it took time to catch on. Until the arrival of workstations in the 1980s, seismic was still interpreted on paper with coloured crayons, stratigraphic horizons and faults were picked by an interpreter and digitised, and maps were contoured by hand.

  • Texas Instruments computers for digital seismic processing, 1964.

The innovations went on. It became possible to measure relative wave amplitudes and identify the likely presence of hydrocarbons directly from amplitude anomalies, the so-called ‘bright spots’. With the addition of time, 3D surveys became 4D and could show how reservoirs changed over a period. Pre-stack depth migration (PSDM) was used for imaging complex features adjacent to or beneath salt in the Gulf of Mexico. In post-acquisition processing, much could be done to present the data by emphasising or filtering features to clarify the image of subsurface prospects, the connectivity of faults and the like. An application named the coherence cube focused on discontinuities in 3D seismic data and revealed stratigraphic features and faults that were not immediately visible in conventional displays.
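The idea behind a coherence attribute can be illustrated with a toy calculation: a low cross-correlation between neighbouring traces flags a discontinuity such as a fault. The synthetic section, fault throw and window below are made up for the purpose and only sketch the cross-correlation flavour of the method.

```python
import numpy as np

rng = np.random.default_rng(1)

n_traces, n_samples = 60, 200
section = np.zeros((n_traces, n_samples))

# Synthetic section: one flat reflector, offset by a "fault" halfway along the line.
for i in range(n_traces):
    shift = 8 if i >= n_traces // 2 else 0   # fault throw of 8 samples
    section[i, 100 + shift] = 1.0
section += 0.05 * rng.normal(size=section.shape)

def coherence(a, b):
    """Zero-lag normalised cross-correlation between two windowed traces."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Coherence between each trace and its neighbour, in a window around the reflector.
win = slice(90, 120)
coh = np.array([coherence(section[i, win], section[i + 1, win])
                for i in range(n_traces - 1)])

print("lowest coherence at trace pair", int(coh.argmin()))   # the fault, between traces 29 and 30
```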

With the shale revolution there was a rush to reprocess older seismic surveys, and many operators sought partners and funding for new 3D surveys that took advantage of the latest upgrades in recording and processing technologies. Microseismic technology was used to review and guide the effectiveness of fracking operations and to assist in locating adjacent wells. These days only a few seismic companies operate onshore, although the new technologies have given rise to a small market for reshooting areas where the legacy seismic was considered poor quality.

Making Waves

In 1936 geophysicist Maurice Ewing, whose ground-breaking work on ocean basins opened a new field of marine seismology, approached an executive of a large oil company for support and was told they were not interested in searching for oil at sea. It has since transpired that the sedimentary basins of the kind investigated by Ewing are the source of vast deposits of oil. Nearly all the marine discoveries have been made with seismic devices.

  • A worker placing explosives for seismic testing, 1940. Credit: DeGolyer Library, SMU.

Dynamite, the seismic source of choice in the past, has been replaced by more environmentally friendly methods. ‘Thumper’ trucks, which drop a weight on the ground, were introduced in 1953 as an alternative to dynamite. However, since Conoco introduced hydraulic vibrators in the mid-1950s, vibroseis trucks have become the most common seismic source on land, sweeping out a long, continuous signal rather than a single impulse and causing less damage than other methods. The seismic air-gun was invented in the 1960s and is often used as the seismic source at sea. Towed behind seismic survey vessels, arrays of air-guns release high-pressure pulses of air that penetrate the ocean floor.
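Why a long, continuous vibroseis sweep still yields a sharp reflection is worth a small illustration: the recorded data is cross-correlated with the known sweep, which collapses each sweep into a compact wavelet at the reflector’s travel time. The sweep parameters and the single reflector below are invented for the sketch.

```python
import numpy as np

dt = 0.002                                  # 2 ms sampling
sweep_len, record_len = 8.0, 12.0           # seconds
t_sweep = np.arange(0, sweep_len, dt)

# A linear upsweep from 8 Hz to 80 Hz (illustrative parameters).
f0, f1 = 8.0, 80.0
phase = 2 * np.pi * (f0 * t_sweep + 0.5 * (f1 - f0) / sweep_len * t_sweep ** 2)
sweep = np.sin(phase)

# The earth's response: a single reflector at 1.5 s two-way time, amplitude 0.5.
n_rec = int(record_len / dt)
reflectivity = np.zeros(n_rec)
reflectivity[int(1.5 / dt)] = 0.5

# What the geophone records: the sweep convolved with the reflectivity.
recorded = np.convolve(reflectivity, sweep)[:n_rec]

# Cross-correlating with the known sweep compresses it into a spike-like wavelet.
correlated = np.correlate(recorded, sweep, mode="full")[len(sweep) - 1:]

print(f"correlated peak at {correlated.argmax() * dt:.3f} s (reflector at 1.500 s)")
```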

  • Vibroseis trucks deployed in Nevada in 2012. Credit: Lesli J. Ellis, USGS.

Whatever their source, seismic signals create compression waves (P waves) and shear waves (SH and SV). Typically, geophones record these modes using three components (3C recording, or 4C at sea, where an extra hydrophone is used). Accelerometers, which measure acceleration as opposed to velocity, have a greater range than geophones and are often used in 3C and 4C recording. While P waves propagate through water, collecting data with a marine cable containing geophones (a ‘streamer’) towed behind a vessel is fraught with difficulty.

  • A seismic survey ship trailing cables (streamers) in its wake. Source: Getty Images.

The slightest motion generates ‘noise’, so the offshore industry uses hydrophones, detectors that respond to pressure changes in the surrounding water rather than to movement. Ocean-bottom recording is used to capture shear-wave data, since shear waves cannot travel through water. Broadband seismic allows deeper imaging, and towing streamers at different depths, or recording sampling pairs such as the pressure and velocity fields, reduces reflections from the sea surface (‘ghosting’).
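The pressure-plus-velocity trick can be shown with an idealised toy example: the sea-surface ghost arrives polarity-reversed on the hydrophone (pressure) record but with the same polarity on the vertical-velocity record, so summing the two suppresses it. The wavelet, ghost delay and perfect scaling are simplifying assumptions rather than a real processing flow.

```python
import numpy as np

dt = 0.002
t = np.arange(0, 2.0, dt)

def ricker(t, t0, f=30.0):
    """A simple Ricker wavelet centred at t0 seconds."""
    arg = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * arg) * np.exp(-arg)

primary_time = 1.0     # arrival time of the upgoing reflection (s)
ghost_delay = 0.02     # extra travel time up to the sea surface and back down (s)

up = ricker(t, primary_time)                       # the upgoing wavefield we want

# Hydrophone: the ghost is polarity-reversed. Vertical-velocity sensor (scaled to
# pressure units): the ghost has the same polarity. Idealised and noise-free.
pressure = up - ricker(t, primary_time + ghost_delay)
velocity = up + ricker(t, primary_time + ghost_delay)

deghosted = 0.5 * (pressure + velocity)            # summation recovers the upgoing field

print("maximum ghost residual:", np.abs(deghosted - up).max())   # ~0 here
```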

Emerging Technologies

Today, the purpose of seismology in petroleum exploration remains essentially the same: to identify oil and gas prospects and to assist the development of discoveries. But the technologies, and the acquisition, processing and interpretation of seismic data, have evolved with the development of computing power and workstations, making it possible to produce deeper and clearer images of the subsurface, including beneath salt and basalts.

The geophysicist can now identify hydrocarbons from Direct Hydrocarbon Indicators (DHIs) and track fluid contacts and movement via time-lapse surveys. Fibre optics are in vogue instead of geophones in boreholes, where information is derived from the deformation of a fibre-optic cable caused by passing seismic waves – a recent development is to make the fibre optics biodegradable.

Good-quality offshore data has also yielded derivatives worldwide, such as Exxon’s model for oil exploration based on seismic stratigraphy and global sea-level change. This was of particular interest to the oil industry because it enabled sequences to be predicted and correlated on a regional scale, and it represented a paradigm shift in geology. Technology is improving all the time, and it is common to reshoot time-lapse surveys every few years, particularly over producing assets.

The upshot of these innovations has been to lower the element of risk in petroleum exploration. Since it is possible to predict the presence of hydrocarbons more accurately than by simply delineating structures and traps, the uncertainty and cost of drilling for oil are greatly reduced. This in turn has allowed companies to invest in new technologies, such as larger deepwater platforms that take petroleum exploration to new frontiers.

Acknowledgements

Thanks to Peter Morton and Alan Heward for their kind assistance.
