Earthquake FAQ

Current Earthquake Information

Maybe. A lot of things can cause local shaking - earthquakes, trucks driving by, thunder, etc. Check out our earthquake map. It displays recent seismic activity in California and is updated every time we detect and locate an earthquake. If you felt an earthquake, help us out and report it at "Did You Feel It". Earthquakes will feel different depending on the size of the earthquake and your distance from the epicenter. If it is a large earthquake, and you are close to it, you will feel a sudden bump followed by intense shaking. It will be difficult to stand. Conversely, if there is a small earthquake and you are not as close to it, you may feel a small bump, followed by a few seconds of sharp shakes.

Earthquakes first appear on the USGS realtime earthquake map in response to a computer program's handling of triggers at seismic stations throughout California. Then they are reviewed by the seismologist on call at the USGS who confirms whether the event is real. For example, there can be false triggers generated when the signal for one event gets mixed up in the signal for another earthquake. Incorrect quake markers can also show up when the system triggers on the P-wave from a large, more distant earthquake. Other signals that aren't earthquakes can also result in false alarms. Our stations also sometimes detect sonic booms, space shuttle re-entry, or even very loud thunder in areas where the network is very dense. When the rate of seismicity in a region is extremely high (such as a region experiencing many aftershocks following a large earthquake), the automated system often has problems distinguishing different earthquakes. The computer system may pick up one quake triggering on many stations and list it as two. Or one apparent large quake can resolve into many smaller earthquakes upon inspection.

Earthquakes and the Bay Area

Actually, many earthquakes happen in the Bay Area every day. Most of them are too small to feel. On average, there is an earthquake that a few people will feel every 2-3 weeks and one that will be felt by many people every year. We cannot predict exactly when the next big earthquake will happen in the Bay Area, or anywhere else; we can only give a probability. Within the next 30 years, the probability is 62% that we will experience an earthquake with magnitude 6.7 or greater on one of our faults. Southern California is in a similar boat: within the next 30 years, there is a 60% chance that people there will experience an earthquake with magnitude 6.7 or greater.

There are two reasons for this. One is the time between quakes on these two faults. The 1868 earthquake on the Hayward Fault was considered the "Great San Francisco earthquake" until the San Andreas Fault ruptured in 1906. Before then, we did not know about the hazard of either fault. Our newest scientific studies show that earthquakes on the San Andreas Fault happen about every 200 years, while they happen on average every 140 years on the Hayward Fault. Do the math: 1868 + 140 = 2008, while 1906 + 200 = 2106. The San Andreas Fault appears not to be "due" as soon as the Hayward. The second reason is vulnerability: The San Andreas Fault runs along the hills of the Peninsula and is offshore from Pacifica to Stinson Beach. The Hayward Fault passes along the foot of the East Bay Hills, through Fremont, Hayward, Oakland, etc. Many more people live directly on the Hayward Fault than on the San Andreas Fault. In addition, much of the infrastructure that supports the people in the Bay Area crosses the Hayward Fault: water for San Francisco and EBMUD, power lines, freeways, BART ...

The Golden Gate Bridge is currently undergoing a sequence of retrofits, some of which have already been completed. The process has reached the point that the Golden Gate Bridge no longer faces the potential for collapse. Until all the retrofit steps have been completed, the Main Suspension Bridge may be damaged significantly in a major earthquake. Check for updates at the website of the Golden Gate Bridge. The new eastern section of the Bay Bridge, which is expected to open in 2013, is being built to withstand a major earthquake. The retrofit of the western segment has been completed. The Bay Bridge is considered an emergency "lifeline" route to be used in disaster response activities. Thus, when completed, it will be able to reopen quickly following an earthquake. For more information, visit http://www.baybridgeinfo.org/faqs.

Earthquake Engineering is a very important field contributing to public safety in the event of an earthquake. Research is always being done on how to make new and old buildings safer. Scientists and engineers at universities around the world are studying the impact of earthquakes on the built environment, with the goal of designing safer structures. For example, engineers at the UC Berkeley Earthquake Engineering Research Center test designs on their shake table. A sampling of those organizations is given below:

To see an example of earthquake engineering coming to fruition, take a look at Caltrans' Seismic Safety Retrofit Program.

Although there are many places in the East Bay that are prone to liquefaction in a large earthquake, this phenomenon can't cause entire cities to disappear. In liquefaction, water-saturated soil loses strength during shaking as the water pressure between the soil particles rises. In a magnitude 6.5 East Bay earthquake, sewers and other underground pipes would break, and buildings might be damaged or collapse, or even tip over and sink partly into the soil. Our airport runways could be ruined, and area roads and freeways would also be damaged. Much of the damage from the magnitude 6.1 Christchurch, New Zealand earthquake of February 22, 2011 was the result of liquefaction.

The Association of Bay Area Governments Earthquake and Hazards Program has published a guide called "The Real Dirt on Liquefaction" that covers the hows and whys of liquefaction, and what could happen to our infrastructure.

Seismic hazard maps show the expected level of shaking in a particular area due to earthquakes that may occur in the region. They can be used by local governments when planning new construction or retrofits of older buildings, and by citizens concerned about seismic dangers. There are many places to find these publicly available resources, and a quick Google search will yield many useful results. Here are a couple of good sites:

The Alquist-Priolo (AP) Fault Zoning Act was passed as a response to the February 9, 1971 Mw 6.6 San Fernando earthquake. The act creates special regulatory zones, or earthquake fault zones, on and around active faults throughout California. Additionally, maps for these areas are created and distributed to relevant cities and counties. This will ensure that any construction on or near a fault can take the necessary measures to ensure public safety. The maps are also made available to the public, which can be very useful information for various purposes, such as knowing where a fault lies before buying a house. For more detailed information on the AP Act, and for maps and other resources, visit the State of California Department of Conservation website.

Most modern single-family wood-frame homes in the Bay Area are safe to be in during an earthquake. Building safety depends more on building design, construction type, building material quality, and soil conditions at the property site than on the number of stories. The exception to this rule is multi-story buildings with a "soft story", a ground floor with large openings to accommodate windows or garage doors. If not properly braced, these openings can reduce the shear strength of the structure, making it vulnerable to strong ground motions. For a detailed description of how different building types should perform during an earthquake, please visit the Association of Bay Area Governments. Or, click here to see a full-scale shake table test of a seven-story wood-frame condominium building outfitted with Simpson Strong-Tie products. This test was conducted at Japan's E-Defense shake table located just north of Kobe, Japan.

Since 1997, UC Berkeley has completed or initiated approximately $500 million worth of seismic and related improvements in buildings across campus. This effort started with the student housing facilities, followed by all occupied buildings on the central campus that had a "very poor" seismic rating.

Seismic retrofitting is still needed for smaller, rarely occupied facilities such as the Old Art Gallery and backstage at the Greek Theater. Work that will further improve the seismic safety of the UC Berkeley campus is in progress, and the SAFER program continues to provide a framework that guides campus planning and investment.

For more information on the SAFER program and a complete list of all campus facilities' seismic rating please visit: http://berkeley.edu/administration/facilities/safer/rating.html

This is a very good question, so we put it in our FAQ! Although plate tectonics wasn't understood to be the mechanism driving faulting and earthquakes until the late 1950s and early 1960s, when technology enabled scientists to collect evidence of seafloor spreading at mid-ocean ridges, people did understand that earthquakes happened on faults. The 1868 Hayward Fault earthquake flattened the then-farming community of Hayward, and shaking descriptions from many different towns in the Bay Area are documented in the 1908 Lawson Report. The Lawson Report also included the first description of Reid's elastic rebound theory: essentially, the theory that stress builds up along a fault as the ground deforms around it, and an earthquake is a release of that stress, which causes faulting.

The Hayward Fault is also a "creeping" fault - some of the fault motion happens very slowly, without causing an earthquake. You can see this in curbs that have been offset in cities all along the fault. So there may have been visual evidence of the fault’s presence as well, when I-580 and Hwy 13 (which is also more or less on the Hayward fault) were being built.

The Alquist-Priolo Earthquake Fault Zoning Act, which was passed in 1971 and prevents new housing and workplaces from being constructed on or very near an active fault, does not apply to roads. But it’s important to understand that many Bay Area roads will have closures after a major Bay Area earthquake, and not just from faulting. The "geonarrative" put out by the USGS in conjunction with the recently released Haywired scenario explores the probabilities of landslides and liquefaction — two earthquake-related hazards that can have a highly destructive impact on roads — in map form.

Further reading:

Common Myths and Misconceptions

If you look at earthquake statistics in most regions of the world, including California, you will find that for every magnitude 5 earthquake, there are about 10 that have a magnitude of 4, and for each magnitude 4, there are 10 with magnitude 3. Unfortunately, this means there are not enough small earthquakes to relieve enough stress to prevent the large events. In fact, it would take 32 magnitude 5's, 1000 magnitude 4's, or 32,000 magnitude 3's to equal the energy produced in one magnitude 6 event.
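Where do numbers like 32 and 32,000 come from? Radiated seismic energy grows by a factor of about 32 for each step up in magnitude. Here is a minimal Python sketch, assuming the standard Gutenberg-Richter energy relation log10(E) = 1.5*M + 4.8 (E in joules), which is not stated in the text above:

    def seismic_energy_joules(magnitude):
        # Standard Gutenberg-Richter energy relation: log10(E) = 1.5*M + 4.8 (E in joules)
        return 10 ** (1.5 * magnitude + 4.8)

    energy_m6 = seismic_energy_joules(6.0)
    for m in (5.0, 4.0, 3.0):
        n = energy_m6 / seismic_energy_joules(m)
        print(f"~{n:,.0f} magnitude {m:.0f} events release the energy of one magnitude 6")

Running this reproduces the figures quoted above: roughly 32 magnitude 5's, 1,000 magnitude 4's, and 32,000 magnitude 3's per magnitude 6 event.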

Only if you live in an old, unreinforced adobe house. In modern homes, doorways are not stronger than any other parts of the house, and the doors in them may swing and injure you. You are safer practicing "DROP, COVER, AND HOLD ON" under a sturdy piece of furniture. Every year, institutions and agencies that deal with earthquake hazard encourage everyone to participate in a preparedness drill. Find out more and sign up at http://www.shakeout.org. Another tidbit of information: if the shaking stops before you have had a chance to decide what to do, then you didn't need to do anything. If you need to "drop, cover, and hold on", you'll have plenty of time to decide.

No. The San Andreas Fault System, which crosses California from the Salton Sea in the southeast to Cape Mendocino in the north, is the boundary between the Pacific Plate and North American Plate. The Pacific Plate is moving northwest with respect to the North American Plate at approximately 46 millimeters, or about 2 inches, per year (the rate your fingernails grow). The strike-slip earthquakes on the San Andreas Fault are a result of this plate motion. The plates are moving horizontally past one another, so California is not going to fall into the ocean. However, in the very distant future, Los Angeles and San Francisco may one day be adjacent to one another!

The question often arises as to whether astronomical events, such as planetary alignments, can significantly influence the occurrence of earthquakes. The moon, sun, and other planetary bodies in our solar system influence the earth in the form of perturbations to the gravitational field. The relative amount of influence is proportional to the object's mass and inversely proportional to the cube of its distance from the earth. * The general idea is that strains in the earth's crust caused by perturbations in the gravitational field may influence when an impending earthquake will occur (like the straw that broke the camel's back). If this were indeed the case, we would expect to see a correlation between rate at which earthquakes occur and the perturbations to the gravitational field. The dominant perturbation in the earth's gravitational field generates the semi-diurnal (12 hour) ocean and solid earth tides which are primarily caused by the moon (due to its proximity) and the sun (due to its large mass). No significant correlations have been identified between the rate of earthquake occurrence and the semi-diurnal tides when using large earthquake catalogs. There have, however, been some small but significant correlations reported between the semi-diurnal tides and the rate of occurrence of aftershocks in some volcanic regions, such as Mammoth Lakes.

The relative influences of objects in the solar system, in decreasing order, are:

Object     Mass        Distance       Relative Influence
           (Earth=1)   (Million km)   (Moon=1)
Moon       0.01228       0.38         1.00
Sun        329390        149          0.45
Venus      0.8073        41           0.000052
Jupiter    314.5         629          0.0000056
Mars       0.1065        79           0.00000096
Mercury    0.0549        91           0.00000033
Saturn     94.07         1277         0.00000020
Uranus     14.40         2720         0.000000004

The combined influence of the remaining objects in the solar system (those not listed above) is less than 10 billionths of the influence of the moon. The combined influence of all objects in the solar system other than the moon and the sun is at most 0.000059, or only 1/24500 of the combined influence of the moon and the sun. Thus, even when all the planets are lined up, their combined influence is relatively small.

Besides the dominant semi-diurnal periodicity, there are other significant periods. Most notably, there is the synodic month (~29.53 days) periodicity due to the moon's orbit around the earth (relative to the sun) and the 18.6-year periodicity due to the 5-degree inclination of the moon's orbit. No significant correlations between these periods and the rate of occurrence of earthquakes have been found.

Given the relative influence of a planetary alignment and the lack of correlation of earthquakes with the dominant gravitational effects, we would not expect planetary alignments to significantly influence either the rate of occurrence of earthquakes or the relative motion of the tectonic plates. No significant correlations of earthquakes with planetary alignments have been found.

The gravitational influence of the other bodies in the solar system is largest in the vicinity of the earth's equator and smallest near the poles.

Physics note: The stresses induced in the earth by an extraterrestrial mass M are proportional to the gradient of its gravitational field, dg(r)/dr, and NOT to the gravitational field g(r) itself.
g(r) = GM / r^2
thus:
dg(r)/dr = -2 * g(r) / r = -2GM / r^3
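As an illustration, the "Relative Influence" column in the table above can be reproduced directly from this mass / distance^3 scaling. A minimal Python sketch using the (approximate) masses and distances listed in the table:

    # Tidal influence scales as mass / distance**3 (see the physics note above).
    # Masses are in Earth masses, distances in millions of km, as in the table.
    bodies = {
        "Moon":    (0.01228, 0.38),
        "Sun":     (329390,  149),
        "Venus":   (0.8073,  41),
        "Jupiter": (314.5,   629),
        "Mars":    (0.1065,  79),
        "Mercury": (0.0549,  91),
        "Saturn":  (94.07,   1277),
        "Uranus":  (14.40,   2720),
    }

    moon_mass, moon_dist = bodies["Moon"]
    moon_influence = moon_mass / moon_dist ** 3

    for name, (mass, dist) in bodies.items():
        relative = (mass / dist ** 3) / moon_influence
        print(f"{name:8s} {relative:.2g}")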

There are two types of weather that we experience on earth. One type is space weather. This is weather that comes to us from beyond our own planet, such as solar flares and magnetic storms. These phenomena result in things like the Northern Lights. While this type of weather can influence our communications and other systems, it has never been shown to affect earthquakes. If there were a connection, we would see an increase in earthquakes about every 11 years, which is when solar flares are the most intense, yet we see an even distribution of earthquakes regardless of what the Sun is doing.

The other type of weather is what we are all familiar with: rain, wind, heat, etc. Again, statistically, there is an even distribution of earthquake events throughout all types of weather. However, very large low-pressure systems, such as hurricanes, have been known to cause episodes of fault slip (slow earthquakes), which are not very damaging.

Shallow crevasses can form during earthquake-induced landslides, lateral spreads, or other types of ground failures. Faults, however, do not open up during an earthquake. Movement occurs along the plane of a fault, not perpendicular to it. If faults opened up, no earthquake would occur because there would be no friction to lock them together. (source: USGS)

There is no way for us to prevent earthquakes. After all, we can't stop the earth's tectonic plates from moving. And, although mining and reservoir activities can, and do, cause earthquakes, we can't realistically cause enough tiny earthquakes to prevent a large one. This is because for each increase in magnitude, the amount of energy released increases 32 times. So, for example, it would take about 33,000 magnitude 3 earthquakes to equal the amount of energy released in a magnitude 6.0 earthquake. (In nature, we only get about 1000 magnitude 3 quakes for every quake of magnitude 6 - not nearly enough to release all that energy.)

What we can prevent are deaths, injuries, and property damage caused by earthquakes. Look around your home for heavy objects that could fall on top of you in an earthquake, and find a better spot for them. Practice Drop, Cover, and Hold On! in different rooms of your home. Make an emergency kit with first aid supplies, food, water, money, and other things you would need after a quake. Make a family plan for contacting each other. Find out whether your home needs to be retrofitted. Click here for good earthquake preparedness links.

So far, there has been no conclusive evidence that animals can predict earthquakes or sense that they are about to occur. Animals frequently exhibit behavior that we call "strange," so it is likely that at any given time, someone will be witnessing odd animal behavior, whether or not an earthquake is imminent. But animals are more sensitive than people in many ways, so they may start to feel the shaking from an earthquake before their human friends notice it. This video shows a dog reacting to an earthquake before his human companion is aware of it. This heightened sensitivity is sometimes misinterpreted as a pet predicting the earthquake.

Earthquakes, Faults, and Plate Tectonics

The term earthquake describes both the sudden slip on a fault and the radiated seismic energy and ground shaking caused by the slip. It also covers ground shaking caused by volcanic or magmatic activity and other sudden movement due to stress changes in the earth.

A fault is a fracture or zone of fractures between two blocks of rock. Faults allow the blocks to move relative to each other. This movement may occur rapidly, in the form of an earthquake - or may occur slowly, in the form of creep. Faults may range in length from a few millimeters to thousands of kilometers. Most faults produce repeated displacements over geologic time.

During an earthquake, the rock on one side of the fault suddenly slips with respect to the other. The fault surface can be horizontal or vertical or some arbitrary angle in between. Earth scientists use the angle of the fault with respect to the surface (known as the dip) and the direction of slip along the fault to classify faults. Faults which move along the direction of the dip plane are dip-slip faults and described as either normal or reverse, depending on their motion. Faults which move horizontally are known as strike-slip faults and are classified as either right-lateral or left-lateral. Faults which show both dip-slip and strike-slip motion are known as oblique-slip faults.

The following definitions are adapted from The Earth by Press and Siever. This book, and many other good references, are listed in our Suggested Reading index.

  • Normal Fault
    A dip-slip fault in which the block above the fault has moved downward relative to the block below. This type of faulting occurs in response to extension and is often observed in the Western United States Basin and Range Province and along oceanic ridge systems.

  • Reverse Fault
    A dip-slip fault in which the upper block, above the fault plane, moves up and over the lower block. This type of faulting is common in areas of compression, such as regions where one plate is being subducted under another as in Japan. When the dip angle is shallow, a reverse fault is often described as a thrust fault.

  • Left-Lateral Fault
    A strike-slip fault on which the displacement of the far block is to the left when viewed from either side.

  • Right-Lateral Fault
    A strike-slip fault on which the displacement of the far block is to the right when viewed from either side. The San Andreas Fault is an example of a right-lateral fault.

Additional information on the types of faults and earthquakes can be found in our FAQ on "What are those beach ball figures?"

The inside of the Earth is hot because of radioactive rocks inside the Earth and because of heat left over from the immense pressures of the Earth's formation. So the Earth is cooling off. As the Earth cools, hot rock within the mantle is very slowly coming up to the surface, while cold rock is very slowly sinking down towards the core. This is called convection.

Meanwhile, the plates on the Earth's surface are moving around in different ways. In some places, like right under Chile, one plate is being pushed (subducted) under another. The part of the plate that has gone under the other plate is sinking into the mantle, and it's very heavy. The sinking plate is part of the downward flow of hot rock in this convection of the mantle, and we know the force of all that sinking rock is an important part of what causes the plates to move. We have used seismic waves to make pictures of the mantle, so we also know that there are areas where hot rock is coming up to the surface called mantle plumes. Hot rock in the mantle is also (again, very slowly) moving horizontally between places where it goes up and places where it goes down. Does it drag the plates along with it? Scientists don't know. So our picture of the forces driving the plates is not complete.

Aftershocks and foreshocks are related terms describing earthquakes that happen before or after a "mainshock", the largest earthquake in a sequence. Foreshocks are earthquakes that occur in the same location as a mainshock, but before the mainshock happens, whereas aftershocks are smaller earthquakes that occur after the mainshock in the same area, but not necessarily at the exact same location. Aftershocks generally decrease with time after the main event and are much less common in deep earthquakes.

Volcanoes and earthquakes are almost never related; they very rarely have anything to do with one another. The USGS has a nice write-up about their relationship here. On the other hand, tsunamis are directly caused by earthquakes that are located in the ocean. The vertical deformation resulting from an earthquake rupture gives rise to the tsunami, which can be very small and unnoticeable, or very large and capable of causing much destruction and loss of life.

The main problem with crowdsourcing GPS is that it works really badly indoors, where most people spend most of their time, because of the poor (sometimes non-existent) sky view. But even outdoors, the 30m accuracy provided by most mobile phones and laptops means that we would need prohibitively high numbers of data points* to get mm level accuracy, which is what we need for measuring plate rates. (For geodetic grade GPS stations, we use the signal from the GPS satellites in a fundamentally different way from the way handheld devices use it, in order to get mm level accuracy at each station.) Moreover, the rates we are measuring are so slow that they could not be measured over the course of a single day (one would need sub-mm accuracy for that.) So people would need to put their laptops back in the same place, within a mm, every day in order to track velocity over a longer time.

Secondly, there is a geophysical problem with using widely distributed GPS signals to track plate motions. Strain accumulation around faults causes the areas near them to move at different rates than the total plate rate. If you think of it like pulling on a rubber band, then your hands are two tectonic plates moving at constant rates away from each other, but various points on the rubber band itself are moving more slowly, and, in fact, the center of the rubber band isn't moving at all. Until the rubber band breaks (earthquake), at which point those parts of the rubber band that were moving more slowly suddenly move very quickly and catch up with your hands (the tectonic plates). In geodetic studies of plate motion, data from GPS sites near faults are usually excluded in order to prevent these altered rates from affecting the estimate of total plate motion (of course these data are very important for measuring the strain accumulation itself). In the US this means that data from much of the western US is excluded; in Europe much of Mediterranean region would have to be excluded. And in areas like Asia with lots of microplates, finding enough data away from plate boundaries would be even harder.

*Averaging reduces uncertainty by a factor of 1/square root (N), where N is the number of values being averaged together. So we would need on the order of 900 million data points to get mm level average uncertainty from 30m uncertainty data. This could mean that we would need 900 million people for each tectonic plate.
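A quick Python check of that arithmetic, assuming the individual position errors are uncorrelated so that the averaged uncertainty scales as 1/sqrt(N):

    phone_uncertainty_m = 30.0    # typical handheld GPS accuracy (from the text)
    target_uncertainty_m = 0.001  # mm-level accuracy needed for plate rates

    # For uncorrelated errors, averaging N measurements reduces the uncertainty
    # by a factor of 1/sqrt(N), so N = (initial / target) ** 2.
    n_needed = (phone_uncertainty_m / target_uncertainty_m) ** 2
    print(f"{n_needed:,.0f} measurements")  # 900,000,000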

Intensity describes how much the ground shakes during an earthquake. For a given earthquake, intensity is different from one place to another depending on the shaking, but the quake has only one magnitude. Factors that determine intensity at a specific spot include the rock and soil under it and how far it is from the epicenter of a quake. The intensity is usually given on the Modified Mercalli Intensity (MMI) Scale, a qualitative measure based on groups of standard observations, such as "Difficult to stand" or "Shutters, pictures move". To distinguish it from magnitude, intensity is assigned Roman numerals I-XII, with XII being the highest. If you recently experienced an earthquake and want to contribute to an intensity study, fill out a felt report.

Imagine you and a friend sliding a large wooden dresser to a new location across a wood or tile floor. After it's in place, you may hear small popping or squeaking noises coming from it as it settles. In the same way that moving your large wooden dresser disrupts the pressure each piece is putting on other pieces at the joints, and the pressure that the legs are putting on the floor, an earthquake causes changes in the details of the stress on the fault where the slip occurred and nearby.

Before an earthquake, stress builds up on a fault, and in a major earthquake, stress is released, but it's more correct to think about a more complex "stress field" of varying stress all around the fault. When a big earthquake happens, the stress field around the earthquake changes in response. In some areas, the stress increases, and this can set off the smaller earthquakes we know as aftershocks. (If the next earthquake is larger, it becomes the mainshock and the previous earthquake becomes a foreshock!)

Earthquakes, large and very small, are happening all over the world, all the time. For an illustration of this, click here to see earthquakes that have been located in California and Nevada in the past 7 days. We don't know exactly how small an earthquake has to be before it can't trigger any aftershocks, but it is well below the magnitude someone could feel (roughly magnitude 2).

Aftershocks and all other earthquakes follow an empirical rule called the Gutenberg-Richter law, which describes the number of earthquakes of different sizes. This law is a logarithmic relationship (1) that basically tells us that each time we go down a unit in magnitude, we should expect to see 10 times as many earthquakes. So for each earthquake we see of magnitude 5, we should expect to see, very roughly, 10 4's, 100 3's, 1000 2's, etc. The Gutenberg-Richter relationship may break down for the tiniest of earthquakes (2) (so small that they have negative magnitude), but even so, the number of teeny-tiny aftershocks generated by all of the tiny earthquakes that occur - most of which are so small and remote that nobody even feels them - has to be absolutely staggering.

So let's talk instead about how many aftershocks we could expect to see in a day following one really large earthquake. Another law, Omori's law (3), describes the number of aftershocks we expect to see over time following an earthquake of a given magnitude. Omori's law tells us that there will be lots of aftershocks immediately following an earthquake, and that as time passes, the rate of aftershocks will decay roughly as a power of time. So the first day after a major earthquake is when we would expect to see by far the largest number of aftershocks. Although the basic mathematical form of Omori's law doesn't change, the constants vary from region to region, and even from sequence to sequence. But in 1989, two scientists (4) hammered out appropriate constants that seemed to work for California and combined Omori's law and the Gutenberg-Richter relationship into a function for the expected number of aftershocks over a given time period after a California earthquake of some magnitude. As statistical measures, Omori's law and the more detailed aftershock production rate rule constructed by Reasenberg and Jones are not meant to predict exactly how many aftershocks of a given magnitude we will see over a given time interval, any more than knowing heart attack statistics could tell you exactly how many people will show up at a given hospital with one on a particular day.

The Reasenberg and Jones relationship is a mouthful: rate(t,M) = 10^(-1.67+0.91(Mm-M)) * (t+0.05)^(-1.08), where t is time in days, Mm is the magnitude of the mainshock, and M is the magnitude of aftershocks that we are looking at. So, for a magnitude 8 event, if we want to look only at magnitude 2 plus events, Mm-M would be 6. Plugging in the numbers, for a magnitude 8 earthquake, about the biggest that we could experience in California, we get an expected 5849 M2+ aftershocks, 720 M3+ aftershocks, 89 M4+ aftershocks, 11 M5+ aftershocks, and 1 M6+ aftershock. The vast majority of these thousands of aftershocks are between magnitude 2 and 3, just at the level that someone in the vicinity might start to feel them and certainly not big enough to cause any damage. However, though each of these aftershocks could potentially be felt by someone standing nearby, you certainly wouldn't feel all, or even most, of them unless you spent the day roaming the entire aftershock zone, which for a large earthquake is considerable. The 1857 M 7.9 Fort Tejon earthquake, in the running for California's largest earthquake in written history, ruptured over 350 km, which translates to an area for aftershocks spanning about 12000 square miles!
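For readers who want to reproduce those counts: evaluating the Reasenberg and Jones rate at t = 1 day gives the daily numbers quoted above. A minimal Python sketch (the constants are the California values from the formula; this is an illustration, not an official implementation):

    def aftershock_rate(t_days, mainshock_mag, aftershock_mag):
        # Reasenberg & Jones (1989) aftershock rate for California: events per day
        # of magnitude >= aftershock_mag at time t (days) after the mainshock.
        dm = mainshock_mag - aftershock_mag
        return 10 ** (-1.67 + 0.91 * dm) * (t_days + 0.05) ** -1.08

    # Expected counts one day after a magnitude 8 mainshock, matching the text above.
    for m in (2, 3, 4, 5, 6):
        print(f"M{m}+ aftershocks per day: {aftershock_rate(1.0, 8.0, m):.0f}")

Plugging in a magnitude 9.5 mainshock instead, with the caveat that the California constants don't strictly apply elsewhere, gives well over 100,000 expected M2+ aftershocks on the first day, consistent with the Chile estimate below.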

The largest earthquake ever recorded in the world was not of magnitude 8, however, but magnitude 9.5, in Chile in 1960. We can't use Reasenberg and Jones' California numbers for Chile, but an event of this size would probably generate more than 100,000 aftershocks of magnitude 2 or above on the first day after the quake.

(1) log N = a-bM, where N is number, M is magnitude, and a and b are constants that vary by region.

(2) Breakdown (?) of the Gutenberg-Richter Frequency-Magnitude Relation for Earthquakes in the SAFOD Target Zone, Ellsworth, W. L.; Imanishi, K., American Geophysical Union, Fall Meeting 2010, abstract #T41A-2089

(3) n=C/(K+t)^P, where n is the rate of aftershocks at time t, and C, K, and P are constants that vary.

(4) Reasenberg, Paul A. and L.M. Jones (1989). Earthquake Hazard After a Mainshock in California, Science, 243, 1173 - 1176.

We thought that this was such an important topic that we wrote this blog about it in September 2008.

Measuring Earthquakes

There are 4 major elements to a seismic station: the seismic sensors, a data collection and storage unit, the power system, and telemetry to get the data back to the data centers in real time. The latter two are generally made up of off-the-shelf components: batteries, chargers, and solar panels for the power system, and internet service hardware, cell modems, radios and/or microwave links for the telemetry.

The data collection and storage units, mostly called data loggers, include digitizers, the capability to provide a time stamp, and software to provide the data through the telemetry system in a standardized format.

The seismic sensors make up the most specialized part of the station. Imagine, the sensor is attached to the ground, so how can you tell if the ground is moving? Seismometers and accelerometers, the two names for instruments that measure slightly different things, make use of the physical property of all masses: inertia. In principle, a mass is attached to a spring or pendulum. When the ground moves, the mass lags behind and only starts to move a short while later, once the spring or pendulum has been stretched enough. If we somehow attach a writing tool to the mass and a paper to the ground, the instrument will write a record of the difference between the movement of the mass and the ground. In the olden days, that was a seismograph (graph because it writes).
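The principle in the preceding paragraph can be illustrated with a toy numerical model: a damped mass on a spring whose frame moves with the ground, where the recorded signal is the relative displacement between the mass and the ground. A minimal Python sketch with made-up, illustrative parameters (not those of any real instrument):

    import math

    # Relative displacement x = x_mass - x_ground of a damped mass-on-a-spring
    # sensor obeys: x'' + 2*zeta*w0*x' + w0**2 * x = -a_ground(t)
    f0, zeta = 1.0, 0.7            # natural frequency (Hz) and damping ratio (illustrative)
    w0 = 2 * math.pi * f0
    dt, duration = 0.001, 5.0      # time step and length of the simulation (s)

    def ground_accel(t):
        # a short burst of 5 Hz shaking standing in for an earthquake arrival
        return math.sin(2 * math.pi * 5.0 * t) if 1.0 <= t <= 2.0 else 0.0

    x, v, t = 0.0, 0.0, 0.0        # relative displacement, velocity, time
    peak = 0.0
    while t < duration:
        a = -ground_accel(t) - 2 * zeta * w0 * v - w0 ** 2 * x
        v += a * dt                # simple semi-implicit Euler integration
        x += v * dt
        peak = max(peak, abs(x))
        t += dt

    print(f"peak 'trace' amplitude: {peak:.4f}")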

Nowadays, seismometers and accelerometers are more sophisticated. They use feedback loops to ensure that the mass doesn't move relative to the ground. They use magnets and capacitors to create voltages and currents that can easily be measured by digitizers, and then stored and telemetered. And, because the mechanical system of the seismometer or accelerometer never really leaves its "rest position", we can use physics to precisely relate the measured values to what the ground really did. The instruments now have "meter" in their name because they are measuring devices that provide values over time, rather than writing on a piece of paper.

Advantages of the new sensors over the historical (but really cool looking) instruments are (a) the ability to relate the measured values to real ground motion (described above) in a generally linear relationship; (b) modern instruments usually measure three orthogonal components of ground movement in a relatively small unit; (c) sensitivity to a very broad range of frequencies; and (d) high dynamic range -- this means that both really tiny and fairly large ground movements can be measured with a single instrument. Imagine, in the old days, the paper for recording was about 1 ft (or 30 cm) tall. If we wanted to be able to record all the details from the tiniest to the largest ground motion from a single modern sensor on a piece of paper, it would have to be about 3 miles (5 km) tall. Not very practical.

The data logger also provides an important improvement: the data are available digitally, can be sent by modern telemetry options (not the US mail like 70 years ago), and can be immediately processed in computers to provide earthquake information rapidly, and even earthquake early warnings.

These instruments sound fantastic, and they are. But seismometers and accelerometers each have one shortcoming. While the ratio of the smallest to largest signal each of them can measure is on the order of 10,000,000 (10 million), the difference between a tiny earthquake (M~0) and the biggest quakes we have ever recorded (M9.5) is on the order of 3,000,000,000 (3 billion). Thus, we install both at a single site. The seismometer is much more sensitive and can record nearby earthquakes on-scale if they are M~-1 to M~4. Accelerometers record a different range of quakes, from M~3 to M~8.5. Both sensors record more distant earthquakes, too, depending on the size and distance from the station. So, where we can, we install both a seismometer and an accelerometer to get the best view of what is going on nearby.
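As a rough check on those two numbers, magnitude scales are built so that ground-motion amplitude grows by roughly a factor of ten per magnitude unit, so the span from an M0 microquake to the largest recorded quake (M9.5) is about 10^9.5. A two-line Python sketch under that assumption:

    # Amplitude spans roughly a factor of 10 per magnitude unit (assumption stated above).
    print(f"{10 ** 9.5:,.0f}")   # ~3.2 billion, versus ~10 million for a single sensor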

In the United States, large-scale seismological networks are generally run by federal agencies (such as the USGS, the Bureau of Reclamation, and the Department of Energy), by state agencies, and by public and private universities. These networks are designed to monitor earthquake activity and to provide data for research into earth science problems. The Advanced National Seismic System is an organization of institutions involved in seismic monitoring with the goal of coordinating efforts to record and analyze seismic data.

For example, UC Berkeley operates a seismic network in northern and central California for the purposes of monitoring seismic activity and furthering earthquake research. The seismographic instrumentation used in the Berkeley Digital Seismic Network (BDSN) includes broadband seismometers to sense weak ground motions and accelerometers to sense strong ground motions. Both types of sensors utilize force-feedback circuitry to determine the overall response, linearity, stability, and dynamic range of the sensors. UC Berkeley operates several different broadband seismometers and accelerometers in order to cover the widest range of frequencies and ground motions:

  • Typical broadband seismometers are sensitive to weak ground motions ranging in frequency from the semi-diurnal gravitational tides at ~23 microHz (see the conversion sketch after this list) to ~5-40 Hz (depending on sensor bandwidth).
  • Typical accelerometers are responsive to strong ground motions ranging in frequency from 0 Hz to 400+ Hz.
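The ~23 microHz figure in the first bullet above is just the semi-diurnal tidal period (about 12.4 hours) expressed as a frequency, as this short Python sketch shows:

    period_seconds = 12.42 * 3600        # semi-diurnal tide period, ~12.4 hours
    frequency_hz = 1.0 / period_seconds
    print(f"{frequency_hz * 1e6:.1f} microHz")   # ~22.4 microHz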

The primary limitations at the low-frequency end of the spectrum are the seismic background noise level, the instrumental noise level, and the thermal stability of the sensor and the data logger. At the best BDSN sites, the semi-diurnal gravitational tide signal is readily apparent in the raw data while at the noisier sites it is not resolvable.

The primary limitations at the high-frequency end of the spectrum are the digital sampling rate, the instrumental noise level, and the attenuation and scattering of the surface weathered layer (the upper ~100 meters of the crust). The best BDSN stations, with sensors installed in boreholes, typically see 200+ Hz signals generated by local earthquakes. The best BDSN stations, with sensors installed on the surface, typically do not register significant energy at frequencies above ~30 Hz.

Seismic frequency bands of interest:

	Gravitational tides        ~0 Hz to ~70 microHz (periods of 4+ hours)
	Earth's eigenvibrations    ~0.3 mHz to ~0.1 Hz
	Surface wave analysis      ~2 mHz to ~2 Hz
	Regional earthquakes       ~10 mHz to ~10 Hz
	Local earthquakes          ~10 mHz to ~400+ Hz
	Strong motion              ~0.05 Hz to ~10 Hz (the band which usually causes
	                           structural damage during strong ground shaking)

The capabilities of the modern generation of seismic instrumentation were driven by the needs of pure research and made possible by the advent of large-scale integrated circuit technology. As our knowledge of earth structure and our ability to model the earth at higher frequencies improve, accurate recordings at yet higher frequencies will become useful. At lower frequencies, primarily associated with secular deformation of the earth's crust, data are provided by continuously operating Global Positioning System (GPS) receivers. The UC Berkeley Seismographic Station is collaborating with a number of agencies in northern California to form the Bay Area Regional Deformation (BARD) Network to monitor crustal deformation as well as seismic activity. Many of the BARD GPS receivers are co-located with BDSN instrumentation.

Information on the installation of the seismic instrumentation is available from the BDSN Installation Guide.

[Figure: textbook image and caption showing the three types of beach ball figures. Courtesy of David Oppenheimer of the USGS.]

In addition to determining the location and magnitude of earthquakes, seismologists now routinely determine the "fault plane" solutions or "focal mechanisms" of events. A fault plane solution illustrates the direction of slip and the orientation of the fault during the earthquake. These solutions, which are displayed in lower-hemisphere projections frequently described as "beach balls", can be determined from the first motion of P-waves and from the inversion of seismic waveforms. These figures help identify the type of earthquake rupture: strike-slip, normal, or thrust.

Strike-slip earthquakes are typical of the San Andreas fault zone, which forms part of the boundary between the North American and Pacific plates.

Normal earthquakes are associated with extension, particularly with formation of plates at mid-ocean ridges.

Thrust or reverse earthquakes are associated with compression, particularly with the subduction of one plate under another as in Japan.

Charles Ammon, now at Penn State, gives an illustrated explanation of focal mechanisms and fault plane solutions on a page about faults and faulting that he created for his introductory earthquake class at St. Louis University.

Examples of fault plane solutions for Northern California, using data from the Berkeley Digital Seismic Network, can be found on the regional moment tensor page. Examples for global earthquakes can be found in the Global Centroid Moment Tensor Catalog.

In order to study earthquakes, scientists deploy seismometers to measure ground motion. Seismograms are recordings of ground motion as a function of time and are the basic data which seismologists use to study the waves generated by earthquakes. These data are used to study the earthquakes themselves and to learn more about the structure of the Earth.

We have gathered several examples of earthquake recordings to illustrate the wide variety of motion. These data are derived from the Berkeley Digital Seismic Network, an array of broadband, high-dynamic range instruments in northern and central California. This network is operated by the UC Berkeley Seismological Laboratory for earthquake monitoring and research.

Seismologists generally describe earthquakes as local, regional, or teleseismic. These terms refer to distance from the earthquake to the recording instrument. For example, when the Berkeley Seismographic Station refers to a local earthquake, we mean one which has occurred within Northern California. An example of a regional earthquake might be an event in Southern California, Nevada, Utah, Oregon, or Washington. Teleseismic events are those which occur at great distances, such as earthquakes in Japan, Tonga, or Iceland.

Local and regional earthquakes are dominated by crustal waves, i.e., by waves which propagate through the crust. At greater distances, the seismic wavefield is dominated by waves which sample the body of the earth: the upper mantle, the lower mantle, and the core.

The seismic recording instruments of the BDSN are capable of "seeing" earthquakes around the globe. In this example, we will illustrate the variations in waveforms among these types of earthquakes.

Earthquake Examples

A detailed explanation of waves, seismic instrumentation, and the records they produce was prepared by Charles Ammon at St. Louis University for an introductory earthquake class.

Seismologists have several different methods for determining the size of an earthquake - some based on body waves (which travel deep within the structure of the earth), some based on surface waves (which primarily travel along the uppermost layers of the earth), and some based on completely different methodologies.

Here is a brief description of five of the most common methodologies.

ML - "Local Magnitude" determined for local earthquakes (usually 600 km, or less from the recording station), originally developed by Charles Richter circa 1935 for classification of earthquakes in southern California. ML is defined as

ML = log(a) - log(ao)

where a is the maximum trace amplitude recorded by a standard instrument (the Wood-Anderson torsion seismometer) at a given distance and ao is the amplitude for an earthquake of zero magnitude at the same distance. ML has been used most successfully in California, although it is in use in some other regions as well.
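As an illustration of the definition (not part of the text above): Richter's convention fixes the zero-magnitude amplitude ao at a distance of 100 km to 0.001 mm on a Wood-Anderson instrument, so -log(ao) = 3 at that distance. A minimal Python sketch under that assumption:

    import math

    # -log10(ao) varies with distance; 3.0 is Richter's value at 100 km,
    # i.e. a 0.001 mm Wood-Anderson trace at 100 km corresponds to ML = 0.
    def local_magnitude(trace_amplitude_mm, minus_log_a0=3.0):
        return math.log10(trace_amplitude_mm) + minus_log_a0

    print(local_magnitude(1.0))    # 3.0 for a 1 mm trace at 100 km
    print(local_magnitude(10.0))   # 4.0: ten times the amplitude adds one magnitude unit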

Md - "Duration Magnitude" is based on the length of time (starting from the initial P-wave arrival) the seismic wavetrain takes to diminish to 10% or less of its maximum recorded value. Md is mostly used for assigning magnitudes to small earthquakes. In Northern California, it is the preferred type of magnitude for earthquakes of about magnitude 3.0 or less.

Ms - "Surface Wave Magnitude" used for shallow (depth < 70km) earthquakes at teleseismic distances (20-180 degrees) using the 20-second Rayleigh wave for the determination. Ms is defined:

Ms = log(A/T) + s(distance,depth)

where A is maximum displacement, T is the period of the displacement, and s is a correction term for the distance of the station and the depth of the earthquake. Ms was developed by Gutenberg and Richter in 1936 as an extension to local magnitude at greater distances.

mb - "Body Wave Magnitude" which uses the amplitude of the P-wave train, the first arriving body wave, in the magnitude calculation. It is used at teleseismic distances from 16 to about 100 degrees, where this waveform starts to graze and then enter the core of the earth, changing its character. mb is defined in analogous fashion to Ms, with different correction factors.

Each of these magnitudes uses different parts of the seismogram over a different range of frequencies. While effort has been made to calibrate these scales so that they agree with one another, their definitions were limited by the type of instrumentation which existed during their development. For example, ML begins to "saturate" around magnitude 6.5. That is, ML does not properly estimate the size of larger events. In response to this, a new magnitude scale has been developed:

Mw - "Moment Magnitude" is the latest concept in magnitude determination. Unlike the other methods above, which are all based on the maximum amplitude of ground movement at the station, Mw is based on the seismic moment at the source, or hypocenter, of the earthquake. It may be calculated for local earthquakes all the way out to events occurring half way around the world. These are typically determined for local events of about 3.5 and larger magnitude, and teleseismic events of about magnitude 5.5. Smaller events typically don't generate enough energy to provide a sufficiently strong signal to perform the determination.

From the USGS: 'It isn't that simple. There is not one magnitude above which damage will occur. It also depends on other variables, such as the distance from the earthquake, what type of soil you are on, etc. That being said, damage does not usually occur until the earthquake magnitude reaches somewhere above 4 or 5.'

Members of the California Integrated Seismic Network (CISN) are in the testing stage of an end-to-end early warning system similar to the one in Japan. Dr. Richard Allen of the Berkeley Seismological Laboratory has pioneered several methods that make earthquake early warning possible and is one of the project's principal contributors. Information on the recent earthquake early warning summit that was held here at UC Berkeley can be found here and other information related to earthquake early warning can be found on Dr. Allen's homepage.

Chang Heng invented the world's first seismoscope around 132 AD. A seismoscope is a device that indicates the direction toward an earthquake's epicenter but does not provide measurements of shaking intensity or duration. For a detailed description and picture of Chang Heng's Dragon Jar, please visit: https://www.usgs.gov/faqs/when-was-first-instrument-actually-recorded-earthquake?qt-news_science_products=7#qt-news_science_products.

In 1880, Sir James Alfred Ewing, Thomas Gray and John Milne, all British scientists working in Japan, began to study earthquakes. They founded the Seismological Society of Japan, and the society funded the invention of seismographs. One of their designs, the Ewing Duplex Pendulum, was installed by Edward S. Holden, later the first Director of the Lick Observatory, in 1887 at both the Lick and Student Observatories. These installations, among others, would later record the Great San Francisco Earthquake in 1906.

Both earthquakes and nuclear tests can rapidly release a large amount of energy. The energy source for small yield (typically less than 50 kilotons) thermonuclear devices is the splitting of heavy radioactive isotopes, whereas the energy source for an earthquake is tectonic strain accumulated by the relative motion of Earth's tectonic plates which is driven by mantle heat flow in the presence of the earth's gravitational field.

In a nuclear test, all of the energy is suddenly (within milliseconds) released in the form of heat from a relatively small volume surrounding the thermonuclear device. The tremendous heat causes rapid expansion of a spherical cavity, which in turn generates seismic waves. The heat gradually conducts away from the cavity into the surrounding rock. However, rock is a poor conductor of heat, so it can take many years for the thermal signature of the thermonuclear explosion to subside, and the increase in the surface temperature above the explosion is insignificant. Nuclear tests are also very shallow sources, with the depth of burial generally less than a few hundred meters (the depth of burial is typically proportional to the cube root of the expected yield). The estimated yields of the larger Indian and Pakistani tests are approximately 2-40 kilotons.

In a large earthquake, the elastic strain energy stored in the Earth's crust is released, within a few seconds to a few tens of seconds, by rupture along a fault, and the strain energy is released from a relatively large volume of rock surrounding the fault rupture. For example, the magnitude 6.5 earthquake in Afghanistan (37.4 N, 70.0 E) on May 30, 1998 at 06:22:28 UT had a source duration of about 5 seconds and an estimated source volume of order 4000 cubic kilometers. This earthquake also had a focal depth of 18 km. The energy release is equivalent to a 2000 kiloton nuclear explosion.

On January 19, 1968, a thermonuclear test, codenamed Faultless, took place in the Central Nevada Supplemental Test Area. The codename turned out to be a poor choice of words because a fresh fault rupture some 1200 meters long was produced. Seismographic records showed that the seismic waves produced by the fault movement were much less energetic than those produced directly by the nuclear explosion.

The possibility of large Nevada Test Site nuclear explosions triggering damaging earthquakes in California was publicly raised in 1969. As a test of this possibility, the rate of earthquake occurrence in northern California (magnitude 3.5 and larger) and the known times of the six largest thermonuclear tests (1965-1969) were plotted, and it was obvious that no peaks in seismicity occurred at the times of the explosions. This is in agreement with theoretical calculations that transient strain from underground thermonuclear explosions is not sufficiently large to trigger fault rupture at distances beyond a few tens of kilometers from the shot point.

The Indian and Pakistani nuclear test sites are approximately 1000 km from the May 30, 1998 Afghanistan earthquake epicenter. The question that has been asked is whether or not the occurrence of these nuclear tests influenced the occurrence of the large earthquake in Afghanistan. The most direct possible cause-and-effect relationship would be that the passage of the seismic waves generated by a thermonuclear explosion through the epicentral region in Afghanistan somehow triggered the earthquake. For example, following the occurrence of the magnitude 7.3 Landers earthquake in southern California on June 28, 1992, the rate of seismicity in several seismically active regions in the western US, as far as 1250 km from the epicenter, abruptly increased coincident with the passage of the earthquake-generated seismic wavefield through each site. The abrupt increases in seismicity occurred primarily in regions of geothermal activity and recent volcanism. The mechanism by which this occurred remains unknown.

The Afghanistan earthquake occurred at 06:22:28 UT on May 30, 1998, and the thermonuclear test most closely associated in time occurred at 06:55 UT, after the occurrence of the earthquake. The other nuclear tests occurred 2-20 days before the earthquake. The elastic strains induced in the epicentral region by the passage of the seismic wavefield generated by the largest of the nuclear tests, the May 11 Indian test with an estimated yield of 40 kilotons, are about 100 times smaller than the strains induced by the Earth's semi-diurnal (12 hour) tides, which are produced by the gravitational fields of the Moon and the Sun. If small nuclear tests could trigger an earthquake at a distance of 1000 km, equivalent-sized earthquakes, which occur globally at a rate of several per day, would also be expected to trigger earthquakes. No such triggering has been observed. Thus there is no evidence of a causal connection between the nuclear testing and the large earthquake in Afghanistan, and it is pure coincidence that they occurred close in time and location.

There are many ways to create your own seismometer that will allow you to view and record seismic waves from your very own home! It is relatively easy to acquire all of the necessary materials required and you can be looking at earthquakes in no time. Scientific American has published two articles on this topic:

  • "Seismograph Plans: How to build a simple seismograph to record earthquake waves at home", 241, July 1979
  • "The New Backyard Seismology", 100, April 1996

In addition, there are many local groups of amateur seismologists that participate in public seismic networks. One such network is the Redwood City public seismic network which has a website that has lots of information about how to create your own seismometer.

For educators looking to teach students about seismology, here is a great website to help you get started and with lots of great info! http://www.iris.edu/hq/sis

Historic Earthquakes, and Earthquake Statistics

The best collection of damage photographs has been compiled by the UC Berkeley Earthquake Engineering Research Center (EERC). Karl Steinbrugge donated his collection of earthquake slides and photographs to the EERC in 1992. Other slides and photographs from EERC faculty and staff are collected in these archives as well. The EERC library provides copies of the slides and photographs for use in teaching and research for a minimal cost and online access to the digitized slide collection is available. This collection includes specialized sets of slides for the 1994 Northridge and 1995 Kobe earthquakes.

The USGS has a collection of photographs of earthquakes, volcanoes, and other geologic hazards. A description of these data is available on-line.

The National Geophysical Data Center of NOAA maintains a collection of photographs for a variety of geologic hazards, including earthquakes. Most of their collection is available as slide sets, although a sampler of their earthquake photographs is available online and includes images of the 1906 San Francisco and 1989 Loma Prieta earthquakes.

The Museum of the City of San Francisco has gathered photographs for a number of Bay Area earthquakes, particularly for the 1906 and 1989 events. These may be found at their Web site.

The Library of Congress is creating a National Digital Library, composed of prints and photos, documents, motion pictures, and sound recordings. This online collection is searchable and contains a number of images from the 1906 San Francisco earthquake.

This question was submitted to us by a 4th grade class in Southern California. We thought it was a great question and have made it into a FAQ!

Using the ANSS earthquake catalog, we searched California for all earthquakes from 1990-2011 having a magnitude greater than 1.0. We found 558,434 earthquakes! Thus, in an average year, approximately 25,383 earthquakes are recorded and analyzed. The rate of earthquake occurrence in California is therefore:

	Time     Number of earthquakes

	Year     25,383
	Month     2,115
	Week        488
	Day          70

Thus we record and analyze about 70 earthquakes per day on average. Most of the analysis is now done by automated computer algorithms so that seismologists no longer need to manually determine the location and magnitude of each earthquake. In California, earthquake monitoring is the shared responsibility of UC Berkeley, Caltech, and the United States Geological Survey.
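A quick Python check of the rates above, assuming the 558,434 events are spread evenly over the 22 years from 1990 through 2011:

    total_quakes = 558_434
    years = 22

    per_year = total_quakes / years
    print(f"per year:  {per_year:,.0f}")    # ~25,383
    print(f"per month: {per_year / 12:,.0f}")
    print(f"per week:  {per_year / 52:,.0f}")
    print(f"per day:   {per_year / 365:,.0f}")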

As a function of magnitude, the number of events analyzed per year is:


	Magnitude     Rate
	              (eq/yr)   (eq/mo)   (eq/wk)   (eq/day)

	>= 1          29545     2462      568       81
	>= 1.5        16038     1336      308       44
	>= 2           6076      506      117       17
	>= 2.5         1950      163       38        5.3
	>= 3            604       50       12        1.7
	>= 3.5          200       17        3.8      0.55
	>= 4             65        5.4      1.3      0.18
	>= 4.5           20        1.7      0.38     0.055
	>= 5              6.8      0.57     0.13     0.019
	>= 5.5            2.2      0.18     0.042    0.0060
	>= 6              1.2      0.10     0.023    0.0033
	>= 6.5            0.6      0.05     0.012    0.0016

Note that the above table is not reliable at magnitudes above 6 because the seismicity sample is not long enough to include many magnitude 6+ earthquakes. In generating the above table, we assumed that the rate at which earthquakes occur does not vary with time. Large aftershock sequences violate this assumption.

The threshold at which people report feeling an earthquake is approximately magnitude 2 (under ideal conditions, ie, not moving and in the immediate vicinity of the epicenter). The threshold at which some damage is reported (such as broken windows and objects knocked off shelves) is approximately magnitude 4. The threshold at which damage to weak structures (unreinforced masonry) occurs is approximately magnitude 5.5.

The 1906 San Francisco earthquake occurred on the San Andreas Fault, a place where two plates are sliding past each other in a motion known as "strike-slip." The rock where the city of San Francisco is now was down in Southern California 20 million years ago and will continue to move relative to the rest of North America. (Click here to see a video showing the plate tectonic history of Southern California, with San Francisco moving north over the past 20 million years.) The 1906 earthquake measured between magnitude 7.7 and 7.9, making it a major earthquake, but not the largest ever recorded. This honor goes to the 1960 magnitude 9.5 earthquake in Chile. The world's largest earthquakes occur in subduction zones, plate boundaries where one plate is pushing under another.

According to the USGS, the earthquake that occurred on January 23, 1556 near Huaxian, Shaanxi (formerly Shensi), China could be considered the worst ever because it caused more casualties than any other earthquake in recorded history: an estimated 830,000. The table below compares this earthquake to other, more recent, large events.

DATE         REGION                      CASUALTIES        MAG
1556/01/23   Shaanxi (Shensi), China        830,000        8.0 (est.)
1906/04/18   San Francisco, CA               >3,000*       7.9
                                             *mostly from resulting fires 
2011/03/11   Japan                           28,050        9.0
2010/01/12   Haiti region                   222,570        7.0
2004/12/26   Sumatra                        227,898**      9.1
                                             **many from the tsunami
1960/05/22   Chile                            1,655        9.5

Looking at this table, we see a well-defined trend illustrating how earthquake damage is worse in poorer regions than in more developed countries like Japan and Chile. (Most of the casualties of the Great Tohoku Earthquake in Japan were due to the tsunami.)

The USGS has a neat page that presents a historical earthquake that occurred on today's date. You can even view historical earthquakes on a day of your choosing! http://earthquake.usgs.gov/learn/today/

For more information on historic earthquakes please visit https://www.usgs.gov/natural-hazards/earthquake-hazards/special-earthquakes-earthquake-sequences-and-fault-zones

Questions and Answers About Earthquakes in Other Regions

Other Resources