Texas scientists found that injecting wastewater into an aquifer more than 7,000 feet beneath the town of Azle, TX, reactivated ancient faults (thick black lines) and caused earthquakes.
Source: Nature Communications/SMU
As seismologists, we are often asked whether fracking, the brute-force technology for the enhanced recovery of hydrocarbons and geothermal energy from underground reservoirs, causes earthquakes. The answer is a clear yes because, simply put, fracking is the very definition of an earthquake. The sole purpose of fracking is to break the rock underground so that fluids and gases can flow more easily. And it is this breaking of rock that, in Earth science, is commonly defined as an earthquake. There are, however, major differences between a frack quake and the kind of temblors we experience here in California all the time.
Given the tectonic situation in this state, most of our quakes are of a sort called shear fractures. In such a quake, the two flanks of a fault rumble past each other, but they do not move away from each other. In contrast, in a frack quake the two flanks of a fault are pushed apart and show only minimal sliding or shear movement. Such quakes, which occur not only during fracking but also when magma rises in a volcano, are called tensional fractures. Given these differences in the physical movement along a fault, tensional fractures constitute a separate class of earthquakes from shear fractures.
The other major difference between frack quakes and tectonic quakes is the forces involved. While it looks impressive and sounds deafening when huge pumps and compressors are moved to a drill site and begin pumping water into a borehole, the pressures they generate are tiny compared to what the Earth itself is capable of when the tectonic plates shift. It is for good reason that the power of tectonic earthquakes is often compared to the energy released in the detonation of a nuclear weapon. You could bring all the pump trucks of the western hemisphere to a drill site, and even if they could access the borehole simultaneously, they would not be able to generate enough force to cause even a medium-sized shaker. Bill Ellsworth, a seismologist at the USGS office in Menlo Park, summarized it in the journal "Science" in 2013: "More than 100,000 wells have been subjected to fracking in recent years, and the largest induced earthquake was magnitude 3.6, which is too small to pose a serious risk." The bottom line: Frack quakes are small, and only very few of them are even felt by humans.
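To put some rough numbers on this comparison, here is a minimal back-of-the-envelope sketch in Python. It relies on the standard Gutenberg-Richter energy-magnitude relation (log10 E = 1.5 M + 4.8, with E in joules) - a textbook approximation, not a figure from Ellsworth's paper - and converts the result into tons of TNT:

```python
def seismic_energy_joules(magnitude):
    """Radiated seismic energy from the Gutenberg-Richter
    energy-magnitude relation: log10(E) = 1.5 * M + 4.8 (E in joules)."""
    return 10 ** (1.5 * magnitude + 4.8)

TON_OF_TNT = 4.184e9  # joules released by one ton of TNT

# Largest frack quake (M3.6), a strong tectonic quake, and Alaska 1964:
for m in (3.6, 7.0, 9.2):
    e = seismic_energy_joules(m)
    print(f"M{m}: {e:9.2e} J  ~ {e / TON_OF_TNT:,.0f} tons of TNT")

# Every whole step in magnitude corresponds to roughly 32 times more
# energy, so the largest induced quake (M3.6) releases only about one
# hundred-thousandth of the energy of a magnitude 7 tectonic quake.
```

Run the sketch and the M3.6 quake comes out at a few tons of TNT, while the great tectonic quakes land in the kiloton-to-gigaton range of nuclear weapons - which is the whole point of the comparison.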
Why is it, then, that fracking - or hydraulic fracturing, as it is technically called - has such a bad reputation? While the blogger can't comment on how fracking could affect groundwater quality, one thing is clear: There won't be any man-made frack quakes that rock the whole planet like the megaquakes we have had during the past few years. Nevertheless, there is an important aftermath of fracking which can cause sizable earthquakes. It has to do with the enormous volumes of water used during the fracturing process. This water contains additives that aid the fracking, and while underground it also dissolves minerals from the deep rock formations - in short, this water is contaminated. Hence it cannot simply be discharged into a river, used as irrigation water, or even sent to a sewage treatment plant. Instead, this water is pumped back into the ground, deep below the groundwater horizons. This process is called injection, and it has been a regulator-approved standard operating procedure in the oil and gas industry for decades. According to some sources, there are more than 500,000 such injection wells in the US.
Most of them cause no problems at all, but a few dozen are associated with significant earthquakes. Here are two examples: The largest quake ever measured in Oklahoma, a magnitude 5.6 shaker which struck 40 miles east of Oklahoma City in 2011, was caused by wastewater injection.
The second example was published just yesterday: A group of researchers led by Matthew Hornbach from the Earth Science Department of Southern Methodist University (SMU) in Dallas, TX, reported on the seismicity in their own neighborhood. They write in the journal "Nature Communications" that a series of small quakes near the town of Azle, northwest of Dallas-Fort Worth, was clearly caused by wastewater injection.
If you want to learn more about fracking, injection wells, and the risk of earthquakes associated with both, come to the Berkeley campus next Wednesday, April 29. Greg Beroza, an Earth scientist from Stanford, will give BSL's annual Lawson Lecture under the title "Induced Earthquakes in the 21st Century". The talk will start at 5:30 pm in the Banatao Auditorium in Sutardja Dai Hall (CITRIS Building). (hra101)
UCERF Bay Area map
Where is the most hazardous place to live in the Bay Area? That's a difficult question to answer, because it all depends on the type of risk one is taking into account. A map showing the likelihood of becoming a victim of violent crime will look very different from a map of all fatal traffic accidents or a plot of the geographical distribution of heart attacks that happen while the victim is jogging. Thanks to the efforts of dozens of Earth scientists from many institutions, we now have a much better map of another danger lurking beneath the Bay Area - the hazard associated with earthquake ruptures. This new map clearly shows a distinct distribution of hot spots and low risk areas.
As described in several previous blogs, these regional details are part of a much larger effort, the "Uniform California Earthquake Rupture Forecast" (UCERF-3), which was published last month. For this forecast, researchers computed the probabilities that potentially damaging earthquakes - those with magnitudes larger than 5 - will strike various parts of California within the next 30 years. For the Bay Area, scientists divided the known faults of our region into approximately 100 smaller sections and looked at the likelihood that each of them might rupture in a quake.
The accompanying map - compiled by Jack Boatwright from the USGS office in Menlo Park based on the UCERF-3 modelling - shows the major faults in the Bay Area in different colors. All faults in blue and green have a very low hazard of rupture, below 3 percent. Yellow faults have a risk that is up to twice as high - but still, a 6 percent chance translates into a rather unlikely event. However, once the colors reach orange and red, the probabilities become significant. Three fault segments with the reddest colors - hence the greatest likelihoods of rupture - stand out. In the far north it is the Maacama Fault, which runs parallel to the San Andreas Fault far inland in Mendocino County. It has an almost 25 percent chance of generating a large quake in the coming three decades. The segment of the San Andreas Fault south of Hollister has a slightly higher likelihood of rupture. Fortunately, both of these fault segments cross areas with relatively sparse population. Since "risk" is defined as the product of "hazard" times "vulnerability", the risk is lower in rural areas.
In contrast, the third fault with a similarly high probability cuts through the heart of a very densely inhabited part of the Bay Area. The southern Hayward Fault, which extends along the East Bay foothills from south of Fremont to San Leandro, has the highest likelihood of giving way. One of the reasons for its high probability is the fact that this segment has not ruptured in a significant quake since 1868. Since then, tectonic stress has accumulated, and according to the latest UCERF model there is a 26 percent chance it will give way in the next 30 years. A detailed scenario of what is in store for us when we experience a repeat of the 1868 earthquake is described in a previous blog - and the news is not good.
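To make the "risk equals hazard times vulnerability" rule of thumb concrete, here is a toy Python sketch. The rupture probabilities are the UCERF-3 numbers quoted above, but the vulnerability scores are invented purely for illustration - they stand in for population density and building exposure:

```python
# Toy illustration of: risk = hazard x vulnerability
# Hazard values are the 30-year rupture probabilities quoted above;
# the vulnerability scores are made-up relative numbers for illustration.
segments = {
    "Maacama Fault (rural Mendocino)":   (0.25, 1.0),
    "San Andreas south of Hollister":    (0.25, 1.5),
    "Southern Hayward Fault (East Bay)": (0.26, 10.0),
}

for name, (hazard, vulnerability) in segments.items():
    print(f"{name:35s} relative risk = {hazard * vulnerability:5.2f}")

# All three segments have nearly the same hazard, yet the densely
# populated East Bay dominates the ranking by a wide margin.
```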
However, there are some pieces of positive information in the UCERF map. The San Andreas Fault, the mother of all earthquake faults in California, shows a much lower probability of rupturing than the more hazardous faults farther inland. Particularly along the Peninsula and farther south, the chances of a large quake are only 9 percent. The reason lies in the not-so-distant past: In 1989, the Loma Prieta Quake relieved the fault of a good amount of tectonic stress. Even though that quake occurred more than 25 years ago, its relaxing effect on the Earth's crust is still being felt. (hra100)
Alternating layers of tidal mud (grey) and soil (brown), here pointed out in the Girdwood Marsh, show the repeated cycle of subsidence and uplift caused by megathrust earthquakes in Alaska. (Photo: Horst Rademacher)
Although seismologists have made enormous progress over the last few decades in understanding the causes and effects of earthquakes, many mysteries still remain. Now compare our knowledge today to the state of seismology in the year 1964: Coordinated worldwide seismic networks were still in their infancy. Plate tectonics, today's standard framework for explaining the restless surface of the Earth with its volcanoes and earthquakes, was still a very controversial hypothesis. Computers that could be programmed to model the processes inside our planet were few and far between, and they were excruciatingly slow. In short: Five decades ago, almost every major earthquake posed a great new challenge to researchers, forcing them to throw out old notions and raise completely new questions.
This was especially true for the giant quakes with magnitudes of 8.5 and above, which we now call megathrust earthquakes. This lack of understanding of the really big ones was about to change when, in the late afternoon 51 years ago today, Alaska was rocked and rolled by a truly enormous quake. With a magnitude of 9.2, the Great Alaska Earthquake of 1964 remains the second largest earthquake ever recorded. The largest, with a magnitude of 9.5, had occurred four years earlier in Chile.
This entry into the "Giant Temblor Hall of Fame" is not the only reason why the blogger thinks that this quake, which wreaked havoc in Anchorage and whose tsunami cost hundreds of lives along the southern coast of the northernmost state, deserves a look back on its anniversary. The main reason is that by studying this quake, scientists were for the first time able to convincingly connect plate tectonics and earthquakes.
This connection was mainly the work of one Earth scientist, who wasn't even an expert in seismology. A day after the quake, the USGS sent George Plafker, then a young geologist, to Alaska. His mission was to tally the devastation the quake had caused and to note how the effects of the temblor had changed the Alaskan landscape. After a few days in the field, he noticed a curious pattern of subsidence and uplift over a large area of Alaska. Along the southern coast the land was uplifted by 30 feet or more; farther inland, in a sweeping arc encompassing the Kenai Peninsula, Kodiak Island and the area around Anchorage, the land had subsided by up to 7 feet.
At first, this pattern didn't make sense to Plafker, but a few weeks after the quake he found a convincing explanation. Plate tectonics predicted that the Pacific Plate moves to the northwest and dives under the North American continent along the Alaska coast. Because this process is neither smooth nor nicely lubricated, the movement causes the two plates to stick to one another. During its dive, the Pacific Plate grabs pieces of the North American Plate and drags them along into the abyss of the Earth's mantle. As a consequence, over the years the land along the immediate coast subsides at a substantial rate, while at the same time the countryside farther inland bulges upward. An earthquake finally releases the two interlocked plates, and the segments of the Earth's crust jump back toward their original positions: hence the observed uplift along the coast and subsidence farther inland.
It is this jumping up of the land which scientists now call a thrust - and if the jump is big enough, it gets the name megathrust. As the pattern measured by Plafker showed, the Alaska earthquake of 51 years ago today was indeed the first megathrust ever measured. The devastating quakes of the past decade, like the Sumatra earthquake of 2004 or the Tohoku quake of 2011, also belong to this category. (hra099)
If we Californians didn't know that we are living in earthquake country, and that the question is not "if" but "when" the next Big One will occur, one could become really frightened by the details which went into the latest earthquake forecast model, published last week. The researchers involved in setting up the model used more than 250,000 earthquake scenarios on a total of 350 fault segments to compute where significant shakers are likely to occur in the state during the next 30 years. The computations are so complex that it would have taken a regular desktop computer roughly eight years of continuous number crunching to yield a result. In order not to be outrun by the quakes they were forecasting, the scientists used a supercomputer instead, which computed each model in less than an hour.
Figure 1 (adapted from WGCEP, 1988)
Figure 2 (adapted from UCERF-3)
But despite its enormous detail, the latest "Uniform California Earthquake Rupture Forecast" (UCERF-3) is still a forecast and by no means an earthquake prediction. When talking about the weather, most people in casual conversation no longer distinguish between the two terms, forecast and prediction. But when you look closely at the bulletins about future weather issued by the National Weather Service (NWS), you will see the difference. The NWS folks never say "it will rain exactly 2 inches at Fisherman's Wharf starting at 3 pm tomorrow". That would be a prediction. Instead, their bulletins read that there will be a 90 percent chance of significant rain at the northern end of the Peninsula in the next 24 hours - which is a forecast.
Such distinctions must also be made in seismology. A meaningful earthquake prediction has to fulfill three criteria: One has to state exactly where, when and with what magnitude an earthquake will strike. Despite considerable research efforts in many countries in recent decades, nobody anywhere has succeeded in getting all three of those criteria consistently right. Hence: The prediction of earthquakes is still an elusive goal. But forecasts like UCERF-3 are the next best thing. The main reason for that is precisely the large amount of detail mentioned earlier: Over the years these rupture forecast models have become much more reliable and should therefore be taken quite seriously.
One example of the improvement is the number of California faults which form the basis of the models. Figure 1 shows the handful of faults used for the first such calculations in 1988 - one year before the magnitude 6.9 Loma Prieta Earthquake ruptured part of the northern San Andreas Fault. The faults are colored to show the different rates at which their flanks move past each other; yellow marks the highest speed, about 1.5 inches per year. Compare this with Figure 2, which shows all the faults and subfaults on which the latest forecast is based. It is much more detailed than the model from almost 30 years ago.
Another reason for the improvement in the forecasts is that seismologists have learned much more about the actual processes taking place during an earthquake. Until recently, nobody could imagine that a rupture during a temblor could jump from one fault to a neighboring one. But modern techniques, which use seismic waves to illuminate the actual breaking of the rock, have shown that such jumps happen regularly - resulting in stronger and more extended earthquakes than previously thought possible. In the next blog, we will look in more detail at the forecast for the faults in our local Bay Area. (hra098)
A new long-term earthquake forecast for California.
The seismoblogger was amazed at how some mass media reacted to the latest announcement of earthquake probabilities in California. "Risk of 8.0 earthquake in California rises" read the headline in the Los Angeles Times. Fox News beamed "Risk of mega-earthquake increases" over the TV channels, and LA's KABC even went so far as to say that the "USGS predicts massive earthquake in California within 30 years". While none of these statements is completely untrue, they certainly leave the impression that the Big One is imminent and that the USGS can predict giant temblors. What gave rise to these alarmist headlines during the last few days?
On Monday, the United States Geological Survey (USGS) released the latest findings of a group of nearly 50 Earth scientists from more than a dozen institutions. The goal of this "Working Group on California Earthquake Probabilities" was to estimate and compute just what its name suggests: What are the chances that potentially damaging earthquakes - those with magnitudes larger than 5 - will strike California within the next 30 years? Such efforts to determine the probabilities are not new. The latest release of the findings is the third in a series of similar calculations which began in 1995. Such computations are technically called "probabilistic risk analysis". They enable seismologists to calculate the chances that an earthquake of a certain size will strike a certain segment of a fault during a specified time window (older blog: Earthquake Probabilities in the Bay Area). These calculations are based on our knowledge of the past seismicity along the various active faults in California and also on precise measurements of how fast the plates move in our state. As research progresses and our understanding of fault behavior improves over time, the model calculations of the probabilities need to be refined every few years.
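In the simplest textbook case, such a probability follows from a time-independent Poisson model: quakes on a fault segment are assumed to occur randomly at a constant average rate. The Python sketch below shows only that bare-bones approach; UCERF-3 itself uses far more elaborate, partly time-dependent models, and the 140-year recurrence interval is an assumed value for illustration:

```python
import math

def chance_of_quake(mean_recurrence_years, window_years):
    """Probability of at least one quake in the window, assuming events
    occur randomly at a constant average rate (Poisson model)."""
    rate = 1.0 / mean_recurrence_years
    return 1.0 - math.exp(-rate * window_years)

# Hypothetical fault segment with one large quake every 140 years on average:
p30 = chance_of_quake(mean_recurrence_years=140, window_years=30)
print(f"30-year probability: {p30:.1%}")  # about 19%
```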
Exactly such a refinement happened on Monday, when the very complex and detailed new study of the chances of earthquake occurrence on all 350 known fault segments in California was published. But unfortunately, many in the media focused on just one small aspect of the report: The likelihood that a truly big quake of magnitude 8 or greater will occur in California over the next 30 years increased from 4.7%, as calculated in the last report, to 7%. Is this jump really significant? Shall we lose sleep over this reevaluation of the chances of a mega-quake? When we look at the other numbers in the report, the answer to both questions is simply "no". The chances that during the coming three decades some part of California will get hit by a magnitude 7 earthquake are a whopping 93 percent - which translates to "almost certain". And one step further: We can be all but sure (greater than 99 percent) that we will have to live through a magnitude 6.7 quake in the next 30 years.
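The near-certain statewide numbers are less surprising once one sees how the per-segment probabilities combine. Treating the segments as independent (a simplification the real model refines), the chance that at least one of them ruptures is one minus the product of the chances that none of them does. The per-segment values below are invented for illustration:

```python
# Combining per-segment probabilities into a statewide probability.
# The individual 30-year probabilities are invented; the point is that
# many modest chances stack up to near certainty.
segment_probs = [0.02] * 100 + [0.05] * 20  # hypothetical per-segment chances

p_no_quake = 1.0
for p in segment_probs:
    p_no_quake *= (1.0 - p)  # probability that this segment stays quiet

print(f"Chance of at least one quake: {1.0 - p_no_quake:.1%}")
# prints about 95% - the same ballpark as the 93% quoted above
```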
The only way not to lose sleep over these numbers is to be prepared - and the preparations for an "almost certain" magnitude 7 quake are exactly the same as those for a "very unlikely" (7%) magnitude 8 event. Hints and examples on how to get ready for and survive the shaking can be found on numerous websites; for example, the Red Cross provides earthquake preparedness information. In the next blog, we will discuss in more detail how the experts computed the most recent probabilities. (hra097)
USGS press release