Report from TriNet North ShakeMap Workshop of June 28, 1999
By TriNet North Shakemap Working Group (D. Dreger
(UCB), C. Scrivner (CDMG) and J. Boatwright (USGS))
This workshop was held at
the request of the TriNet North Program Management Committee (PMC) to begin the
process of identifying tasks that need to be accomplished to develop shakemap
capability in northern California as new strong motion stations become
available, and existing stations receive telemetry. A region-wide capability will be required that uses methodologies
suitable to a seismic network with an uneven distribution of stations, which in
most areas is likely to be quite sparse. The PMC specifically asked that we
attempt to define how to merge different methods of constructing shakemaps to
meet an OES request for information and maps that are uniform statewide. The current sparse coverage of strong motion
stations and the large region of coverage require the development and
integration of a hybrid methodology that allows for estimation of earthquake
strong ground motions given varied levels of data and source parametric input
as well as knowledge about site variability. The PMC provided several focus
items to the workshop participants:
- We want
to meet public service needs by getting multiple organizations to work together
to produce a hybrid product that is coherent across the state, and will reflect
a broad-based consensus within the seismic community.
- We must process both urban and rural earthquakes, and allow
processing to change over various timescales after an earthquake.
- As a result of the workshop, a task list of assignments should
be developed, and a short report prepared.
The report will have internal, and possibly external, distribution.
The
workshop was structured into a morning informational session and an afternoon
general discussion session. In the
morning, three shakemap methodologies were presented. In addition, developments in the modularization of TriNet
ShakeMap software were discussed. In the
following we summarize the method presentations and the afternoon discussions. Minutes of the meeting were compiled by C.
Scrivner and are available at http://ftp.gps.caltech.edu/pub/scrivner/minutes.
TriNet ShakeMap: David Wald presented work being done to
improve the southern California TriNet ShakeMap. The methodology is data driven, but to deal with varied station
density an attenuation model is used to aid in interpolation between data
points. The first step of the method is
the determination of a strong motion centroid (a point source description). Once a centroid is determined, ground motion
values are estimated on a coarse grid of points using standard attenuation
curves (Joyner and Boore, 1981) and a fitted magnitude.
The
ground motion estimates are modified based on a set of site corrections using a
regional geologic model developed by Park and Elrick (1998). This model identifies areas in southern
California with Quaternary, Tertiary or Mesozoic surface rocks. This classification scheme is then used to
determine site corrections following Borcherdt (1994). The corrections are
applied to the predicted shakemap values (data values are never modified). Grid points within 30 km of a recording
seismic station are dropped. The coarse
grid of both data and predicted values is then more finely interpolated and the
fine mesh is contoured. Details of this
method may be found at http://www.trinet.org/shake.html.
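As a sketch of this pipeline, the following Python combines the Joyner and Boore (1981) PGA regression with a coarse prediction grid. The regression coefficients are from the 1981 paper; the planar (x, y) distance convention, the grid representation, and the `site_factor` hook are illustrative assumptions, not part of the TriNet codes:

```python
import math

def jb81_pga(mag, dist_km):
    """Median peak horizontal acceleration (g) from the Joyner and Boore
    (1981) regression; the 7.3 km term is their fictitious-depth constant."""
    r = math.sqrt(dist_km ** 2 + 7.3 ** 2)
    return 10 ** (-1.02 + 0.249 * mag - math.log10(r) - 0.00255 * r)

def dist_km(a, b):
    """Planar (x, y) separation in km -- a simplification for this sketch."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def predicted_grid(centroid, mag, grid, stations, site_factor, drop_km=30.0):
    """Estimate PGA at coarse grid points, apply a multiplicative site
    correction to predictions only (observed data are never modified), and
    drop any grid point within drop_km of a recording station."""
    out = {}
    for pt in grid:
        if stations and min(dist_km(pt, s) for s in stations) < drop_km:
            continue  # near a station: the observation controls there
        out[pt] = jb81_pga(mag, dist_km(pt, centroid)) * site_factor(pt)
    return out
```

A finer mesh would then be interpolated through both these predictions and the station observations before contouring.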
Deterministic ShakeMap: Douglas Dreger described work at UCB to
develop shakemap capability in areas that lack either dense strong motion
instrumentation or telemetered instruments.
This method focuses on determining the finite source parameters of
earthquakes from continuously recorded and near-real-time telemetered regional
distance broadband stations. The finite
source parameters are then used in a tiered system to generate near-source
strong shaking. The first stage is to
invert the broadband data for the causative fault plane, rupture velocity and
dislocation rise time using the two possible nodal planes obtained from
automated seismic moment tensor analysis.
The results of the finite source inversion are used to derive
directivity parameters as they are needed by the empirical regression equations
of Somerville et al. (1997). These equations
are used to determine peak ground acceleration and spectral acceleration at
periods of interest. The third tier
utilizes the finite source parameters to integrate the fault slip both
spatially and temporally to determine near-source seismograms from which strong
ground motion parameters are obtained.
The merit of this approach is that it provides source-specific
directivity sensitive estimates of near-source strong ground motions. It is
noted that this method is not strictly model based, and that it is possible to
integrate direct observations in the same manner as is done by the TriNet
shakemap. The primary difference
between the two approaches in this context is in the details and complexity of
the underlying model used to predict ground motions where direct observations
are not available. A more detailed
description of the method may be found at http://www.seismo.berkeley.edu/~dreger/webrpt.htm.
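The tiered fallback can be sketched generically; the tier names and the `TierFailed` failure mode below are illustrative, not the actual UCB processing codes:

```python
class TierFailed(Exception):
    """A processing tier could not complete (e.g. poor fit, missing data)."""

def tiered_shakemap(tiers):
    """Run (name, function) tiers in order -- e.g. point-source 'bull's-eye',
    finite-source regression, forward-simulated seismograms -- and keep the
    result of the last tier that succeeds. Because later tiers depend on the
    source parameters determined by earlier ones, processing stops at the
    first failure, so some map is always produced as long as the first tier
    succeeds."""
    result = None
    for name, compute in tiers:
        try:
            result = (name, compute())
        except TierFailed:
            break
    return result
```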
Estimated Intensity Maps:
Jack Boatwright described the method he developed for estimating Modified
Mercalli Intensity (MMI) using seismic moment tensor input and a detailed
geologic map of the greater SF Bay Area. This method chooses a fault
plane from the seismic moment tensor as the nodal plane closest to the average
strain across the Bay Area. The extent of faulting is determined from the
hypocentral depth and the earthquake magnitude and mechanism-type. Ground
motions and intensities are estimated for the Bay Area from an attenuation model
that incorporates source directivity both along-strike and updip of the
fault. Digital geology, mapped at the 1:125,000 scale and binned into
NEHRP site categories B-E, is used to provide a dense mapping of site
corrections for the ground motions and the estimated MMIs. A more detailed description of the method
may be found at
http://ncweb-menlo.wr.usgs.gov/study/effects/intensity.html.
Data Availability: The
primary obstacle to implementing the southern California TriNet shakemap
algorithm in northern California is the lack of available data. Examination of station density in the
greater San Francisco Bay Area shows that the number of stations is likely to
be adequate for a southern California-style implementation; however, many of
these stations, although digital, do not presently have adequate near-real-time
telemetry. Outside of the greater San Francisco Bay Area the density of TriNet
North stations is much less. See http://perry.geo.berkeley.edu/seismo/trinet/committees/existing_stat.html
for figures comparing station densities. How well will data driven shakemaps in
regions outside the Bay Area perform?
Considering the extended and distributed nature of the supporting
infrastructure for California, the workshop group reached consensus that all
areas of California should be covered by accurate maps. Given the possibility of local station
and/or telemetry failure, as well as the sparse coverage outside urban areas,
it was deemed important to understand how shakemaps respond to deteriorating
station coverage, and to develop an appropriate backup capability. In order to
assess performance under varied conditions a testbed experiment was
proposed. The experiment will compare
the different proposed shakemap methodologies in varied station coverage for
two well recorded southern California earthquakes. The testbed experiment is defined later in this report.
Geologic Information: Another
important factor in the production of shakemaps is the geologic map used to
calculate site corrections for the predicted ground motions. In southern
California the QTM site classification and 30m depth shear wave velocity
extrapolation of Park and Elrick (1998) is used in conjunction with Borcherdt
(1994) to obtain site specific corrections.
A continuous map of corrections is needed because corrections are applied
wherever predicted ground motion estimates are used in contouring the shakemap.
The Park and Elrick (1998) map has a scale of 1:750,000 for greater southern
California and finer scale in the Los Angeles region. To implement a shakemap
in northern California, it is necessary to have a reference site geology
model. Maps of poor soils are
important, and their compilation is essential to seismic hazard assessment. As
described below, a digital map at 1:250,000 scale is being compiled for the
entire state and should be released by the CDMG in the fall of 1999. This map should be adequate for identifying
site type at each point of ground motion prediction for interpolation and
contouring purposes.
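The Borcherdt (1994) corrections take the form of a velocity-ratio power law. In the sketch below, the reference velocity and the weak-motion exponents are approximate values recalled from that paper and should be checked against it; the function only illustrates the form:

```python
def borcherdt_factors(v30, v_ref=1000.0, ma=0.35, mv=0.65):
    """Short-period (Fa) and mid-period (Fv) amplification factors of the
    form F = (v_ref / v30) ** m used by Borcherdt (1994). v30 is the mean
    shear-wave velocity in the upper 30 m (m/s); v_ref is a firm-rock
    reference velocity (roughly 1 km/s; exact value per the paper). The
    default exponents approximate the weak-motion (~0.1 g input) values;
    both decrease at higher input levels as soils respond nonlinearly."""
    fa = (v_ref / v30) ** ma
    fv = (v_ref / v30) ** mv
    return fa, fv
```

Predicted short-period values would be multiplied by Fa and longer-period values by Fv; observed data are left untouched.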
The
following is excerpted from Jack Boatwright’s report on the (nationwide) availability
of a geology database:
The state geology is already digitized at
1:750,000. This digitization contains a
relatively crude separation among surficial deposits (unconsolidated sediments,
alluvium… and rock). Similar maps are
available for all the states in the Western Region. The 1:250,000 digital geologic maps that the CDMG is now
compiling are “materials maps.” They
are mostly newly digitized, not newly mapped. The geology is subdivided into
six categories: B, B/C, C, C/D, D, and D/E.
Chris Wills and Walt Silva have been working together to devise these
categories. The digital map will be
released in early September, and Jim Davis is committed to having this geologic
map used for making “ShakeMaps” within the state. Wentworth’s 1:125,000 map covers 10 “Bay Area” counties (Sonoma,
Napa, Solano, Marin, San Francisco, Contra Costa, San Mateo, Alameda, Santa
Clara, and Santa Cruz). This will be
improved to 1:100,000 for the “urban core” of the Bay Area, including 1:24,000
maps for the 9 Bay Area counties (all of the above except Santa Cruz).
A Critical Question: What is the impact of seismic
hazard and geologic mapping in the rapidly expanding urban fringe of the Bay
Area, that is, the various suburban areas of Hollister, Stockton, Tracy, and
Sacramento? Don Gautier (USGS-Chief of
Western Geologic Mapping) is now “redefining” the Bay Area as extending from
Monterey to Sacramento. Ed Halley,
using oil well and borehole data, made Quaternary maps of Sacramento years ago.
Len Gaydos (USGS-Project Chief for Urban Dynamics Project) has compiled a
series of maps showing the History of California Urban Growth that details the
urban/suburban incursion into the Valley since 1954. There are some areas
covered by past mapping that may be used, but in the rapidly developing
‘fringes’ of the greater San Francisco Bay Area there appears to be a lack of
detailed information.
Attenuation Relationships: The southern California group remarked that ground motion
estimates merged from a variety of sources, as has been suggested for northern
California, may yield inconsistent results. The consistency of SoCal ShakeMaps is due to the use of the same
attenuation relationship to determine both the centroid parameters (location
and magnitude) and the estimated values used in the interpolation. Thus if multiple methodologies are used it
would be advantageous to settle on a common attenuation relationship to
estimate ground motions. It is
recommended that a more recent attenuation relationship be used to take
advantage of recorded ground motions for recent large earthquakes. Additionally, the attenuation relationship
should also have the flexibility to incorporate source directivity information
as it becomes available (e.g. Somerville et al., 1997). The attenuation relationship must provide
estimates of PGA, PGV, PGD, and spectral acceleration at 0.3, 1 and 3 seconds
period. These parameters, except PGD,
are presently reported by the southern California ShakeMap project and PGA and
PGV are used in the calculation of instrumental intensity. The attenuation relationship of Somerville
et al. (1997) meets these requirements with the exception of regression formulas
for PGV and PGD. It is recommended that
such regressions be performed in a consistent manner to provide all of the
needed strong ground motion parameters.
For uniformity of product throughout California it is advised that the
same earthquake ground motion data sets and regression equations be used in both
northern and southern California.
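For reference, the instrumental intensity calculation used with the southern California ShakeMap is a pair of log-linear regressions on PGA and PGV. The coefficients below are the published Wald and coworkers (1999) values as best recalled here, and should be verified against the regressions themselves:

```python
import math

def mmi_from_pga(pga_cm_s2):
    """Estimated MMI from peak ground acceleration (cm/s^2); used at lower
    intensities, where short-period shaking controls felt effects."""
    return 3.66 * math.log10(pga_cm_s2) - 1.66

def mmi_from_pgv(pgv_cm_s):
    """Estimated MMI from peak ground velocity (cm/s); preferred at higher
    intensities, where damage correlates with longer-period motion."""
    return 3.47 * math.log10(pgv_cm_s) + 2.35
```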
In
addition, the attenuation relationships currently used in TriNet
ShakeMaps were developed with data from earthquakes of magnitude 5 and greater.
Attenuation relationships extended to lower magnitudes would be useful. Generating maps for smaller earthquakes
increases the number of test events for the system and develops ongoing user
awareness that the maps are available.
In southern California, maps are made for earthquakes down to M 4 or
even 3.5 in urban areas (based on user demand), but the development group has
had to extrapolate the existing attenuation relationships to apply them to the
smaller events.
Update Stages & Versioning: The concept of stage
processing or versioning was discussed.
Our comfort level in providing information at various
stages is an important question that requires further thought and
analysis. Should there always be a map
generated? Even with no reporting
stations? The response of the workshop group was yes to both
questions. A system must be developed
that has sufficient redundancy to provide some information even if it is the
most basic ‘bull’s-eye’ map. Subsequent
versioning then depends upon the numbers of reporting stations, and whether
additional more sophisticated modeling approaches are used to calculate ground
motions for interpolation purposes. Can
a consistent versioning system be developed and applied to both southern and
northern California? This question is
much more difficult to answer. In southern
California, ShakeMap has evolved out of the TriNet plan to provide denser
strong motion coverage, and therefore focus has been on a data driven product
minimally modified by ground motion predictions (except for the smaller
events). In northern California with
the larger region of interest and fewer available stations, versioning is
likely to depend upon different methodologies.
The Berkeley model for a shakemap uses a multi-tier approach in which
the seismic moment tensor processing provides the initial estimate of magnitude
and fault orientation. Given this
magnitude it would be possible to produce a ‘bull’s-eye’ map using the
attenuation relationship. Second tier processing
involves the determination of the causative fault plane and source dimension. A second tier shakemap is obtained utilizing
regression formulas that take source finiteness into account. Third tier processing involves the forward
prediction of strong motion seismograms from which strong motion parameters are
determined. A composite map would then
be compiled from the larger of the stage two and stage three values. As noted before these ground motion values
would then be used for interpolation and contouring. As in southern California, if recording stations are located reasonably
close to a predicted value, the data would take precedence and the predicted
values would be ignored.
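A minimal sketch of this composite step, assuming (as illustrative conventions) that predictions and observations are dictionaries keyed by planar (x, y) grid points in km:

```python
import math

def dist_km(a, b):
    """Planar (x, y) separation in km -- a simplification for this sketch."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def composite_map(tier2, tier3, observations, radius_km=30.0):
    """Take the larger of the stage-two and stage-three predictions at each
    grid point, drop predictions near recording stations, then add the
    observed values so data take precedence in interpolation/contouring."""
    merged = {pt: max(v, tier3.get(pt, v)) for pt, v in tier2.items()}
    merged = {pt: v for pt, v in merged.items()
              if all(dist_km(pt, s) >= radius_km for s in observations)}
    merged.update(observations)  # observed values always enter the grid
    return merged
```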
Concerns
about consistency of results between versions should be addressed during
testing. The test-bed should be used to
address these issues and work toward a hybrid methodology, probably “versioned”
and/or “tiered”, to strive for smoothly improved and self-consistent results as
more data and source information becomes available.
Use of Modules: Bruce Worden described the development of
modules that would allow the incorporation of different model-based approaches
into the general shakemap framework.
This effort is deemed to be very important in providing the flexibility
that is required when dealing with large regions of coverage where capabilities
are expected to vary. The northern and
southern California shakemap working groups need to coordinate on the
definition of input formats to provide the flexibility that is required in
northern California. This can be
accomplished during the test bed experiment described later.
Testbed Experiment: The
working group reached consensus that an evaluation of the different
methodologies is needed. The purpose of
this test is severalfold. First, it is important to understand how well a
predicted map based on the relatively few near-real-time stations performs in
predicting ground motions where observations are not available. This test can be accomplished for the 1992
Landers and 1994 Northridge earthquakes using a reduced station set consistent
with the numbers and geometry of TriNet stations that would be available for
these earthquakes if they occurred today.
The remaining stations may then be used to compare the shakemap contours
to direct observations to assess the level of uncertainty in the shakemaps. Second, it is necessary to evaluate the
performance of the methodologies under diminishing station coverage. Reduced
coverage is possible following a large earthquake if there are earthquake induced
telemetry problems, and is certainly a problem in the vast regions of
California where dense strong motion coverage is unlikely to be installed. These tests can be accomplished using the
1992 Landers and 1994 Northridge earthquakes as test cases in which random
station distributions with interstation spacings of 20 km, 30 km, and 40 km are
tested. Further, in these tests it will
be necessary to evaluate performance in cases without near-fault recordings
such as LUC for Landers or stations in the forward directivity region for
Northridge. In all tests the combined
data/model shakemaps will be compared to observations from the larger strong
motion data set that has been compiled for each of these earthquakes. To
facilitate the comparisons the various methods should produce maps of PGA, PGV
and Sa at periods of 0.3, 1.0 and 3.0 seconds, and use the contouring algorithm
employed by the TriNet ShakeMap. This
experiment will also serve the purpose of testing the modular programming being
developed for the TriNet shakemap. The
results of the experiment will be posted to a restricted website for use by the
southern and northern California shakemap working groups. The following is a suggested timetable:
1. January 1, 2000 – Completion of TriNet station test for both the Landers and Northridge earthquakes, including
   a. Comparisons of data/model shakemaps using different techniques for the estimation of model input
   b. Comparison of the data/model shakemaps with the complete strong motion data set for both earthquakes
2. March 2000 – Completion of sparse station tests
   a. 20 km random configuration*
   b. 30 km random configuration*
   c. 40 km random configuration*
   d. 40 km without near-fault or forward directivity observations
* Each of these tests should be performed for a number of randomly generated station configurations in order to evaluate the sensitivity.
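One simple way to draw the random configurations is greedy thinning, sketched here under the assumption that stations carry planar (x, y) coordinates in km:

```python
import math
import random

def dist_km(a, b):
    """Planar (x, y) separation in km -- a simplification for this sketch."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def random_configuration(stations, spacing_km, seed=None):
    """Greedy thinning to a target interstation spacing: visit stations in
    random order and keep one only if it lies at least spacing_km from every
    station already kept. Rerunning with different seeds yields the multiple
    independent realizations the footnote above calls for."""
    pool = list(stations)
    random.Random(seed).shuffle(pool)
    kept = []
    for s in pool:
        if all(dist_km(s, k) >= spacing_km for k in kept):
            kept.append(s)
    return kept
```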
Recommendations:
1. Improve the availability of data from
stations that are in the ground.
2. Closer coordination with the TriNet
North Instrumentation Committee to design the future network to provide adequate
coverage of varied site conditions and source regions.
3. Conduct an evaluation of
shakemap performance at varied observing station densities.
4. Geology and site response are of
considerable importance. The workshop
group recommends that USGS and CDMG administrators ‘fast track’ efforts to
characterize geology at a scale needed for shakemap.
5. Version 2 of the TriNet ShakeMap
modules needs to be flexible enough to allow for incorporation of other model-based
approaches (an essential ingredient in the testbed experiment proposed in
3).
6. The southern and northern California
shakemap working groups need to reach a consensus on the underlying attenuation
relationships. HAZUS employs
‘consensus’ relations, however for the northern California shakemap the
relations need to account for directivity effects and the relationships should
include the most complete strong motion data set available. This raises an interesting question: should
we develop guidelines for future updates of attenuation relationships as new
data becomes available?
Borcherdt, R.
D. (1994) Estimates of site-dependent response spectra for design (methodology
and justification), Earthquake Spectra, 10, 617-653.
Joyner, W. B., and D. M. Boore (1981) Peak horizontal acceleration and velocity
from strong-motion records including records from the 1979 Imperial Valley,
California, earthquake, Bull. Seism. Soc. Am., 71, 2011-2038.
Park, S., and
S. Elrick (1998) Predictions of shear-wave velocities in southern California using
surface geology, Bull. Seism. Soc. Am., 88, 677-685.
Somerville, P. G., N. F. Smith, R. W. Graves, and N. A. Abrahamson (1997)
Modification of empirical strong ground motion attenuation relations to include
the amplitude and duration effects of rupture directivity, Seism. Res. Lett.,
68, 199-222.
Wald, D. J.,
V. Quitoriano, T. H. Heaton, H.
Kanamori, C. W. Scrivner, and C.
B. Worden (1999) TriNet "ShakeMaps": Rapid generation of instrumental ground
motion and intensity maps for earthquakes in southern California, Earthquake
Spectra, in press.