Alternating layers of tidal mud (grey) and soil (brown), here pointed out in the Girdwood Marsh, show the repeated cycle of subsidence and uplift caused by megathrust earthquakes in Alaska. (Photo: Horst Rademacher) (Click to view larger.)
Although seismologists have made enormous progress in the last few decades in understanding the causes and effects of earthquakes, many mysteries still remain. Now compare our knowledge today to the state of seismology in the year 1964: Coordinated worldwide seismic networks were still in their infancy. Plate tectonics, today's standard framework for explaining the restless surface of the Earth with its volcanoes and earthquakes, was still a very controversial hypothesis. Computers, which could be programmed to model the processes inside our planet, were few and far between, and in addition they were excruciatingly slow. In short: Five decades ago, almost every major earthquake posed a great new challenge to researchers, forcing them to throw out old notions and raise completely new questions.
This was especially true for the giant quakes with magnitudes of 8.5 and above, which we now call megathrust earthquakes. Our lack of understanding of these truly giant quakes was about to change when, in the late afternoon 51 years ago today, Alaska was rocked and rolled by an enormous quake. With a magnitude of 9.2, this Great Alaska Earthquake of 1964 remains the second largest earthquake ever recorded. The largest, with a magnitude of 9.5, had occurred four years earlier in Chile.
This entry into the "Giant Temblor Hall of Fame" is not the only reason why the blogger thinks that this quake, which wreaked havoc in Anchorage and whose tsunami cost hundreds of lives along the southern coast of the northernmost state, deserves a look back on its anniversary. The main reason is that by studying this quake, scientists were for the first time able to convincingly connect plate tectonics and earthquakes.
This connection was mainly the work of one Earth scientist, who wasn't even an expert in seismology. A day after the quake, the USGS sent George Plafker, then a young geologist, to Alaska. His mission was to tally the devastation the quake had caused and to note how the effects of the temblor had changed the Alaskan landscape. After a few days in the field, he noticed a curious pattern of subsidence and uplift over a large area of Alaska. Along the southern coast the land was uplifted by 30 feet or more; further inland, in a sweeping arc encompassing the Kenai Peninsula, Kodiak Island and the area around Anchorage, the land had subsided by up to 7 feet.
At first this pattern didn't make sense to Plafker, but a few weeks after the quake he found a convincing explanation. Plate tectonics predicted that the Pacific Plate moves to the northwest and dives under the North American continent along the Alaska coast. Because this process is neither smooth nor nicely lubricated, the movement causes the two plates to stick to one another. During its dive the Pacific Plate grabs pieces of the North American Plate and drags them along into the abyss of the Earth's mantle. As a consequence, over the years the land along the immediate coast subsides at a substantial rate, while at the same time the countryside further inland bulges upward. The earthquake finally releases the two interlocked plates, and the segments of the Earth's crust jump back into their original positions: hence the observed uplift along the coast and subsidence further inland.
It is this jumping up of the land which scientists now call a thrust - and if the jump is big enough, it gets the name megathrust. As the pattern measured by Plafker showed, the Alaska earthquake of 51 years ago today was indeed the first megathrust ever measured. The devastating quakes of the past decade, like the Sumatra earthquake of 2004 or the Tohoku quake of 2011, also belong to the same category. (hra099)
If we Californians didn't already know that we are living in earthquake country, and that the question is not "if" but "when" the next Big One will occur, we could become really frightened by the details that went into the latest earthquake forecast model, published last week. The researchers involved in setting up the model used more than 250,000 earthquake scenarios on a total of 350 fault segments to compute where significant shakers are likely to occur in the state during the next 30 years. The computations are so complex that a regular desktop computer would have needed roughly eight years of continuous number crunching to yield a result. In order not to be outrun by the forecast quake, the scientists used a supercomputer instead, which computed each model in less than an hour.
|Figure 1 (Adapted from WGCEP, 1988)|
|Figure 2 (Adapted from UCERF-3)|
But despite its enormous detail, the latest "Uniform California Earthquake Rupture Forecast" (UCERF-3) is still a forecast and by no means an earthquake prediction. When talking about the weather, most people in casual conversation no longer distinguish between the two terms, forecast and prediction. But when you look closely at the bulletins about future weather issued by the National Weather Service (NWS), you will see the difference. The NWS folks never say "it will rain exactly 2 inches at Fisherman's Wharf starting at 3 pm tomorrow". That would be a prediction. Instead, their bulletins read that there will be a 90 percent chance of significant rain at the northern end of the Peninsula in the next 24 hours - which is a forecast.
The same distinction must be made in seismology. A meaningful earthquake prediction has to fulfill three criteria: It must state exactly where, when and with what magnitude an earthquake will strike. Despite considerable research efforts in many countries in recent decades, nobody anywhere has succeeded in getting all three of those criteria consistently right. Hence: The prediction of earthquakes is still an elusive goal. But forecasts like UCERF-3 are the next best thing. The main reason for that is precisely the large amount of detail mentioned earlier: Over the years these rupture forecast models have become much more reliable and should therefore be taken quite seriously.
One example of the improvement is the number of California faults on which the models are based. Figure 1 shows the handful of faults used for the first such calculation in 1988 - one year before the magnitude 6.9 Loma Prieta Earthquake ruptured part of the northern San Andreas Fault. The faults are colored to show the different rates at which their flanks move past each other; yellow marks the highest slip rate, about 1.5 inches per year. Compare this with Figure 2, which shows all the faults and subfaults on which the latest forecast is based. It is much more detailed than the model from almost 30 years ago.
Another reason for the improvement in the forecasts is that seismologists have learned much more about the actual processes taking place during an earthquake. Until recently, nobody could imagine that a rupture during a temblor could jump from one fault to a neighboring one. But modern techniques, which use seismic waves to illuminate the actual breaking of the rock, have shown that such jumps happen regularly - resulting in stronger and more extended earthquakes than previously thought possible. In the next blog, we will look in more detail at the forecast for the faults in our local Bay Area. (hra098)
A new long-term earthquake forecast for California. (Click to view larger.)
The seismoblogger was amazed at how some mass media reacted to the latest announcement of earthquake probabilities in California. "Risk of 8.0 earthquake in California rises" read the headline in the Los Angeles Times. Fox News beamed over the TV channels "Risk of mega-earthquake increases" and LA's KABC even went so far as to say the "USGS predicts massive earthquake in California within 30 years". While none of these statements is completely untrue, they certainly leave the impression that the Big One is imminent and that the USGS can predict giant temblors. What gave rise to these alarmist headlines during the last few days?
On Monday the United States Geological Survey (USGS) released the latest findings of a group of nearly 50 Earth scientists from more than a dozen institutions. The goal of this "Working Group on California Earthquake Probabilities" was to estimate and compute just what its name suggests: What are the chances that potentially damaging earthquakes - those with magnitudes larger than 5 - will strike California within the next 30 years? Such efforts to determine these probabilities are not new. The latest release is the third in a series of similar calculations which began in 1995. Such computations are technically called "probabilistic risk analysis". They enable seismologists to calculate the chances that an earthquake of a certain size will strike a certain segment of a fault during a specified time window (older blog: Earthquake Probabilities in the Bay Area). These calculations are based on our knowledge of the past seismicity along the various active faults in California and also on precise measurements of how fast the plates move in our state. As research progresses and our understanding of fault behavior improves, the model calculations of the probabilities need to be refined every few years.
That is exactly what happened on Monday, when a very complex and detailed new study of the chances of earthquake occurrence on all 350 known fault segments in California was published. But unfortunately, many in the media focused on just one small aspect of the report: The likelihood that a truly big quake of magnitude 8 or greater will occur in California over the next 30 years increased from 4.7 percent, as calculated in the previous report, to 7 percent. Is this jump really significant? Should we lose sleep over this reevaluation of the chances of a mega-quake? When we look at the other numbers in the report, the answer to both questions is simply "no". The chances that during the coming three decades some part of California will get hit by a magnitude 7 earthquake are a whopping 93 percent - which translates to "almost certain". And one step further: We can be sure (greater than 99 percent) that we will have to live through a magnitude 6.7 quake in the next 30 years.
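Under a simple time-independent (Poisson) model - a deliberate simplification here, since UCERF-3 itself uses more elaborate statistics - such 30-year probabilities can be converted into average recurrence intervals, which makes them easier to grasp. A minimal sketch in Python:

```python
import math

def poisson_prob(annual_rate, years=30):
    """Probability of at least one event in `years`, assuming
    earthquakes arrive as a time-independent Poisson process."""
    return 1.0 - math.exp(-annual_rate * years)

def implied_recurrence(prob_30yr, years=30):
    """Average recurrence interval (in years) implied by a 30-year
    probability under the same Poisson assumption."""
    return years / -math.log(1.0 - prob_30yr)

for label, p in [("M >= 8.0", 0.07), ("M >= 7.0", 0.93), ("M >= 6.7", 0.99)]:
    print(f"{label}: 30-yr probability {p:.0%} -> "
          f"one event every ~{implied_recurrence(p):.0f} years on average")
```

Under this simplified model, the 7 percent chance of a magnitude 8 corresponds to one such event roughly every 400 years on average, while the 93 percent chance of a magnitude 7 corresponds to one every decade or so - which is why the second number, not the first, deserves our attention.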
The only way not to lose sleep over these numbers is to be prepared - and the preparations for an "almost certain" magnitude 7 quake are exactly the same as those for a "very unlikely" (7 percent) magnitude 8 event. Hints and examples on how to get ready for and survive the shaking can be found on numerous websites. For example, the Red Cross provides earthquake preparedness information. In the next blog, we will discuss in more detail how the experts computed the most recent probabilities. (hra097)
USGS press release
In the seismoblogger's California-centric view of the world, tectonic earthquakes in the US occur mainly along the West Coast and in Alaska, some big ones are generated by the volcanoes in Hawaii, and on very rare occasions the vast area east of the Rockies is shaken by a significant temblor - like the one on August 23, 2011, when the major population centers in the East rocked to the beat of the waves of the magnitude 5.8 Virginia earthquake.
|Figure 1: A magnitude 6.9 Earthquake in Idaho in 1983 caused a fault scarp up to 8 feet tall (between white arrows). Borah Peak, Idaho's highest mountain, is in the background.|
However, a quake on Saturday morning served as a reminder that the seismogenic zones in the US are much more widespread than is commonly thought. Shortly after 10:44 am MST, most of Idaho was shaken by a magnitude 4.9 temblor, which originated 5 miles under the small town of Challis in Custer County in the center of the state. None of Challis' nearly 1000 inhabitants was hurt, but some houses suffered damage, rockslides blocked a road here and there, and the power was out for hours. The quake was felt 200 miles away in Idaho's capital Boise as well as in Montana's Bitterroot Valley.
According to the Idaho Geological Survey, Saturday's quake was the strongest so far in a series of shakers which began in March 2014. Since then the area around Challis has endured hundreds of slight to moderate temblors, among them a magnitude 3.7 shortly before Christmas. So far, it is not clear to geologists whether the quakes indicate new life on a dormant fault or if they are lined up on a hitherto unknown fault line.
A look into Idaho's earthquake history reveals that the recent series of quakes is by no means abnormal. For instance, on May 12, 1916, Boise was hit by a shock which wrecked chimneys and caused people to rush into the streets. Reclamation ditches were damaged and the flow of natural gas altered. It was felt at Loon Creek, 120 miles northeast, and in eastern Oregon - an area of 50,000 square miles. An intensity VII earthquake occurred within the state on July 12, 1944. The Seafoam Ranger Station building shook so hard the occupants thought it was coming apart. Several people reported that the shaking was so violent they were unable to walk. Another observer reported that rocks rose at least a foot in the air and looked like a series of explosions up the hill. Part of a canyon wall collapsed near Lime Creek. Cracks opened 100 yards long in Duffield Canyon and cracks one to three inches across and several hundred yards long opened on the road below Seafoam. Two chimneys fell at Cascade. This shock was felt over 70,000 square miles, including all of central Idaho, and parts of Washington, Oregon, and Montana.
In fact, the strongest earthquake ever to hit Idaho in historic times occurred not far from the current epicenters near Challis, when on October 28, 1983, a magnitude 6.9 quake (the same strength as the Bay Area's 1989 Loma Prieta quake) occurred under Borah Peak, Idaho's highest mountain. The shaking was so strong that two people were killed in Challis; 11 commercial buildings and 39 private houses sustained major damage, and more than 200 houses sustained minor to moderate damage. Geologically, the most dramatic consequence of this quake was a 20 mile long zone of fault scarps and ground breakage. In some sections the ground was offset by more than 8 feet (see figure 1).
The mountainous area of Idaho where the earthquakes occur is part of the Central Idaho Seismic Zone (also called the Centennial Tectonic Belt). The zone is approximately 200 miles long by 50 to 100 miles wide and is characterized by rugged basin and range topography and the highest elevations in Idaho. The zone contains high levels of earthquake activity and at least 6 major active faults crisscross the region (see figure 2). (hra096)
|Figure 2: Saturday's earthquake near Challis as well as the Borah Peak quake from 1983 occurred in Idaho's Central Seismic Zone. (Map: Idaho Geological Survey)|
Josh Bloom's prototype EEW device. (Click to view larger.)
More often than not earthquakes are associated with destruction, losses and general malaise rather than with new and exciting business opportunities. Sure, there are the makers of equipment to measure ground motion who make money, and seismic retrofitting of older structures can be a costly effort. Over the last decade UC Berkeley alone spent more than two billion dollars to make its campus more resilient against the inevitable earthquake shaking. But neither the retrofit nor the production of seismometers has the glamorous image of today's high tech and smart technology. This, however, is about to change as early warning before earthquake shaking is slowly rolled out in California, along the entire US West Coast and elsewhere.
Last week, leading California politicians, among them Lieutenant Governor Gavin Newsom, shared the stage at the Third International Conference on Earthquake Early Warning (EEW) held on the UC Berkeley campus. Under the glaring spotlights of TV crews, those politicians offered their strong support for the EEW demonstration system put together over the last few years under the name "Shake Alert" by a consortium of government agencies and universities, among them the Berkeley Seismological Laboratory (BSL). (see blog from October 2, 2012). While the elected officials and Shake Alert scientists discussed possible sources for the 120 million dollars necessary to fully build and run the West Coast system over the next five years, engineers at established companies, entrepreneurs at start-ups and tinkerers in their garages or living rooms were already busy designing high tech gadgets and apps which can take the EEW alerts to new levels.
Take for instance AtHoc, a 15 year old company in San Mateo, Calif., which specializes in crisis communication. Based on their experience with the EEW system in Mexico (see blog April 19, 2014), AtHoc engineers developed a technique, by which warning messages from Shake Alert are routed through the transmitters of the NOAA weather radio system. With broadcast receivers designed by AtHoc, early warning alerts can be received anywhere, independent of internet and cell phone communication.
Another example is Joshua Bloom, a professor in the Astronomy Department of UC Berkeley. As a member of the advisory board of the BSL, he became a beta tester, receiving real-time Shake Alert messages on his laptop through the internet. But as a Berkeley resident living close to the Hayward Fault, Bloom took the idea further. Instead of relying on his laptop to show the message, he built a wireless receiver and connected it to speakers loud enough to alert all rooms in his house. He envisions that his current prototype home alert could one day become as ubiquitous - and mandated - as smoke detectors. Bloom describes his device in detail on his blog.
Ingrid Johanson, a researcher at the BSL, is working along similar lines. Based on an Arduino platform, she designed a demonstration system which automatically responds to a Shake Alert message by turning off appliances or machinery so that they are not damaged during an earthquake. She also showed how flashing LED lights or other indicators that don't rely on being in front of a laptop could extend the reach of early warning messages.
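The control logic behind such a system can be sketched in a few lines. Everything below is an illustrative assumption - the message fields, the threshold, and the function names are invented for this sketch and are not the actual Shake Alert format - but it shows the key point: a machine can parse an alert and act in the seconds before the shaking arrives.

```python
import json

# Invented alert format for illustration -- the real Shake Alert
# message differs.
ALERT = json.dumps({
    "magnitude": 6.1,          # estimated magnitude at the source
    "s_wave_arrival": 12.0,    # seconds until strong shaking arrives here
})

SHUTOFF_MAGNITUDE = 5.0  # act only on potentially damaging quakes

def handle_alert(raw_message, shutoff_action):
    """Parse an early-warning message and trigger a protective
    action if strong shaking is expected."""
    alert = json.loads(raw_message)
    if alert["magnitude"] >= SHUTOFF_MAGNITUDE:
        seconds_left = alert["s_wave_arrival"]
        shutoff_action()
        return f"Shutoff triggered with {seconds_left:.0f} s warning"
    return "Alert ignored: below damage threshold"

# Here the "appliance" is just a flag; in a hardware demonstration
# the action would instead drive a relay or an LED.
appliance_on = True
def turn_off():
    global appliance_on
    appliance_on = False

print(handle_alert(ALERT, turn_off))   # Shutoff triggered with 12 s warning
print(appliance_on)                    # False
```

In Johanson's demonstration the equivalent of `turn_off` drives the Arduino hardware; the same pattern could just as well flash an indicator light or sound a speaker, as in Bloom's prototype.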
If you are tinkering with or developing an add-on to Shake Alert or other EEW systems, let the blogger know. We are planning to create a developer forum at the BSL to help you integrate your ideas with the science of Earthquake Early Warning. (hra095)