Can we someday predict earthquakes?
The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.
Millions of earthquakes shake the globe every year, the ground suddenly lurching in response to movements of the tectonic plates that form Earth’s crust. These plates jostle over, under and against each other as they shift. All that shoving and grinding builds up stress along faults — fractures or breaks in the rock of the crust — until something has to give: an earthquake.
The science of seismology seeks to understand what causes earthquakes by tracking their occurrence, measuring their force and using sophisticated imaging technology to probe the subsurface geology where they happen. Today we can locate faults, characterize them and explain many of the stresses building toward their failure.
We still don’t fully understand the details inside faults or how those details might control the location and timing of earthquakes, but geophysicists and computer scientists at Los Alamos National Laboratory and their colleagues are wielding an array of new tools to study the interactions among earthquakes, precursor quakes (often very small earth movements) and faults.
These tools include aquarium-sized experiments in the laboratory that replicate quakes; more sensitive, more densely deployed seismic instruments worldwide producing vast data streams; and supercomputers that can make sense of this massive data set.
Because it’s so hard to observe geologic-scale interactions underground, the Los Alamos team, with collaborators at Penn State, the U.S. Geological Survey, ETH in Zurich and the Institute of Physics of the Globe and the Ecole Normale in Paris, France, has developed laboratory experiments to figure out when faults might fail.
Using an “earthquake machine” built by Chris Marone at Penn State, the team is investigating the role that “fault gouge” — the loose material created by the constant grinding at a fault — may play in triggering and influencing the size of quakes. The lab machine creates conditions similar to faults with gouge, then submits them to sound waves as surrogate seismic waves.
These experiments have produced strange and startling effects. The team observed that as stress built up on a fault and it approached failure as a miniature earthquake, a series of small precursor quakes rippled through at rates that followed specific patterns. Comparing these results with actual seismic data reveals similar precursor-rate patterns before real earthquakes.
They also found that the applied sound waves played a key role in triggering laboratory-scale earthquakes by making the gouge more fluid. Amazingly, the seismic waves from one earthquake can trigger major earthquakes thousands of miles from their origin, often months later.
Frequently no precursors are observed before a quake, but that might be because extremely small precursors elude detection. To test that hypothesis, Los Alamos is bringing its supercomputing horsepower to bear on the subject, combing through historical data to see whether smaller-magnitude events served as precursors to temblors in the past. Starting with the lab data on simulated precursor quakes and using a technique called machine learning, Los Alamos is “training” a computer program to sift through this data set and spot precursors.
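The idea behind that training can be sketched in a few lines of code. In the hedged toy model below (not the team’s actual program or data), we assume that the lab signal grows noisier as a simulated fault nears failure, compute one simple statistic (the variance) over short windows of the signal, and let a basic nearest-neighbor learner predict how many windows remain before the slip. The signal model, window size, and learning method are all illustrative assumptions.

```python
import random
import statistics

random.seed(0)

def make_cycle(n_windows=50, noise=0.2):
    """Synthetic stand-in for one lab slip cycle.

    Assumption for illustration only: the acoustic signal's amplitude
    (and hence its variance) grows as the simulated fault nears failure.
    Returns (window_variance, windows_until_failure) pairs.
    """
    cycle = []
    for i in range(n_windows):
        time_to_failure = n_windows - i          # label: windows until the slip
        amplitude = 1.0 / time_to_failure        # signal grows louder near failure
        window = [random.gauss(0.0, amplitude + noise) for _ in range(200)]
        cycle.append((statistics.variance(window), time_to_failure))
    return cycle

# "Train" on several simulated cycles: one feature (window variance) -> label.
train = [pair for _ in range(5) for pair in make_cycle()]

def predict(variance, k=5):
    """k-nearest-neighbor regression on the single variance feature."""
    nearest = sorted(train, key=lambda p: abs(p[0] - variance))[:k]
    return statistics.mean(t for _, t in nearest)

# Evaluate on a fresh, unseen cycle, as the team would on held-out data.
test = make_cycle()
errors = [abs(predict(v) - t) for v, t in test]
print(f"mean |error| in windows-to-failure: {statistics.mean(errors):.1f}")
```

Even this crude sketch predicts imminent failure well, because the signal is most distinctive just before the slip; the real work lies in finding features that behave that way in noisy field data.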
After the computer program has taught itself to recognize precursors, the team will run the program against actual seismic data. The team will then compare the accuracy of those results to more traditional interpretations of the same data.
Other data sets from actual seismic monitoring will be added to the experiment in a process called “ground truthing,” intended to verify the computer program’s predictive accuracy. The goal is to develop a computer program that reviews new data in almost real time and spots precursors heralding an upcoming major earthquake.
Within the next year or so, the team plans to use the newest computers at Los Alamos, some of the most powerful in the world, to crunch the numbers from larger and larger data sets — first from mining areas, then tectonic regions like the San Andreas fault and finally worldwide — to reveal previously hidden patterns of seismic signals.
Dreaming big, the team dares to pursue the Holy Grail of seismology: forecasting major earthquakes. That won’t happen any time soon. The first level of forecasting will be estimating the likelihood that an earthquake will happen within some time span. But as supercomputing power continues to grow, it will drive us ever closer to accurately forecasting massive earthquakes.
While Los Alamos maintains technical expertise in seismology and the geodynamics of Earth’s crust as a means of monitoring underground nuclear testing worldwide, that expertise could one day alleviate suffering from unexpected earthquakes on a global scale.
Paul Johnson is a geophysicist in the Laboratory’s Geophysics group, a Los Alamos National Laboratory Fellow, a Fellow of the Acoustical Society of America and a Fellow of the American Geophysical Union.
This story originally appeared in The Santa Fe New Mexican.