Parkfield-Hollister Electromagnetic Monitoring Array


The primary objective of the UC Berkeley electromagnetic (EM) monitoring array is to identify EM fields that might be associated with earthquakes. Since 1995, the array has consisted of up to three sites (SAO, PKD, and PKD1), each of which measures three orthogonal components of the magnetic field and two orthogonal components of the electric field. Such an array is necessary to separate the fields of a local source (e.g., an earthquake signal) from the natural EM fields of the Earth. Our approach has been to determine the transfer function between fields at different sites for periods of normal background EM variations and then use this transfer function to predict fields between sites. Differences between the observed and predicted fields are used to search for anomalous local fields.

Analysis of the UCB array has shown that cultural noise from the San Francisco Bay Area (in particular BART) extends over surprisingly large areas, and that natural ionospheric sources may exhibit significant spatial complexity (Egbert et al., 2000). The fundamental MT assumption of spatially uniform sources is thus frequently violated in this area. These source complications are highly variable in time, reducing the effectiveness of a single remote site for EM noise cancellation. Multiple remote sites would allow significantly better cancellation of these more spatially complex EM noise fields, and would also reduce bias errors in the inter-station transfer function estimates. It was always the goal of the project to have three stations, but in 1999 the use permit at Haliburton Ranch was lost and PKD1 was removed just one month after PKD was installed. Analysis of data from this one month clearly demonstrates the value of three sites for improving the residual analysis (see the 2000 BSL Annual Report).

MT Overview

In 1995, we installed well-characterized electric and magnetic field measuring systems at two sites along the San Andreas Fault that are part of the Berkeley Digital Seismic Network. Since then, magnetotelluric (MT) data have been continuously recorded at 40 Hz and 1 Hz and archived at the NCEDC (Tables 6.1 and 6.2). At least one set of orthogonal electric dipoles measures the vector horizontal electric field, E, and three orthogonal magnetic sensors measure the vector magnetic field, B. These reference sites, now referred to as electromagnetic (EM) observatories, are co-located with seismographic sites so that the field data share the same time base, data acquisition, telemetry and archiving system as the seismometer outputs.

The MT observatories are located at Parkfield (PKD1, PKD) 300 km south of the San Francisco Bay Area and Hollister (SAO), halfway between San Francisco and Parkfield (Figure 6.1). In 1995, initial sites were established at PKD1 and SAO, separated by a distance of 150 km, and equipped with three induction coils and two 100 m electric dipoles. PKD1 was established as a temporary seismic site, and when a permanent site (PKD) was found, a third MT observatory was installed in 1999 with three induction coils, two 100 m electric dipoles, and two 200 m electric dipoles. PKD and PKD1 ran in parallel for one month in 1999, and then the MT observatory at PKD1 was closed.

Data at the MT sites are fed to Quanterra data loggers shared with the co-located BDSN stations, synchronized in time by GPS, and sent to the BSL via dedicated communication links.

Figure 6.1: Map illustrating the location of operational (filled squares) and closed (grey squares) MT sites in central California.

2002-2003 Activities

Over the past year new electrodes have been installed at the PKD site, and automated quality control software has been implemented. Any failures of the UCB MT array are now immediately detected so that corrective action can be taken in a timely fashion. With these improvements in the system, nearly continuous high quality data have been collected.

This year time domain processing codes have been developed and tested on short segments of data. Karl Kappler is using a least squares Wiener filter, while Gary Egbert employs multivariate array transfer functions.

Station Maintenance


In January of this year the Q4120 datalogger was replaced. In February, the Hx coil was replaced.


Last September, lead-lead chloride electrodes from John Booker were installed in the 200 m dipoles. They require less maintenance than the copper-copper sulfate electrodes used in the 100 m dipoles. The addition of bentonite has significantly improved water retention in the electrode holes, increasing electrode longevity. In December the vertical coil was replaced, and in May and June 2003, the batteries powering the electric field pre-amplifiers were replaced.

Instrument Responses

Sierra Boyd ensures the transfer function information at the NCEDC is correct and current.

Data Quality Control

During this year, BSL staff worked in collaboration with Gary Egbert to install his software for automated data processing. The software provides the capability of identifying problems and alerting staff, including a daily printout of the signal-to-noise ratio (SNR) in dB for each channel of the array. Currently, SNRs below 10 dB are flagged for inspection or repair by the array operators.
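The flagging rule is simple to express in code. The sketch below is illustrative only (the function names and dictionary convention are ours, not the actual processing software): it computes each channel's SNR in dB from RMS amplitudes and returns the channels falling below the 10 dB threshold.

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels, from RMS amplitudes."""
    rms_signal = np.sqrt(np.mean(np.square(signal)))
    rms_noise = np.sqrt(np.mean(np.square(noise)))
    return 20.0 * np.log10(rms_signal / rms_noise)

def flag_channels(channels, threshold_db=10.0):
    """Return names of channels whose SNR falls below threshold_db.

    channels: dict mapping channel name -> (signal, noise) arrays.
    """
    return [name for name, (sig, noise) in channels.items()
            if snr_db(sig, noise) < threshold_db]
```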

Data Processing

An effort has been made to perform the residual analysis purely in the time domain. The data used are the MT measurements on all five channels sampled at 1 Hz. Since the stations mostly record fields originating from large sheet currents in the ionosphere, and the distance between sites is only a few hundred kilometers, the input EM signal at each station should be roughly the same. Thus, a transfer function (TF) between two sites should be approximately constant. The relationship between the two sites is determined at a time when no significant seismic activity (SSA) is occurring near the arrays. On a day when SSA is present at one site, we can then examine the residuals for anomalous activity.


In this section we use an impulse response operator (IRO) rather than a TF, as we are working in the time domain. The current IRO is a Wiener filter computed using least squares. The operator is computed using a day's worth of data (86,400 observations). Before computing the operator, the data must be despiked, and an automated despiking algorithm has been employed for this. Time series data are scanned for anomalies which lie more than a user-specified number of standard deviations from the sample mean (the default is 10). When an outlier is observed, the corresponding channel at the other station is examined within a two-minute window about the time of the outlier. If a similar event took place there, the anomaly is considered signal; otherwise it is considered anomalous noise and is replaced. Currently a two-minute window about the spike is replaced with a linear fit. Substituting an ARMA (AutoRegressive Moving Average) model prediction has also been tried, but is not yet the standard.
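A minimal sketch of this despiking scheme, assuming 1 Hz sampling so the two-minute window is about 120 samples, might look like the following. The function name and array conventions are ours, and only the linear-fit replacement is implemented (the ARMA variant is omitted):

```python
import numpy as np

def despike(x, y, n_sigma=10.0, half_window=60):
    """Despike channel x (1 Hz samples) using remote channel y as reference.

    A sample more than n_sigma standard deviations from the mean of x is a
    candidate spike. If y shows no comparable excursion within a two-minute
    window about the spike, the window is replaced with a linear fit
    between its endpoints; otherwise the excursion is kept as signal.
    """
    x = x.copy()
    zx = np.abs(x - x.mean()) / x.std()
    zy = np.abs(y - y.mean()) / y.std()
    for i in np.flatnonzero(zx > n_sigma):
        lo = max(0, i - half_window)
        hi = min(len(x) - 1, i + half_window)
        if np.max(zy[lo:hi + 1]) > n_sigma:
            continue  # seen at the remote site too: treat as signal
        # replace the window with a straight line between its endpoints
        x[lo:hi + 1] = np.linspace(x[lo], x[hi], hi - lo + 1)
    return x
```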

After despiking, the data are detrended using a first order polynomial. Then the IRO is computed. To predict a given channel we use all five channels at the other site. Denoting the channel to be predicted as the time series $\{X_t\}$ we obtain the formula

\begin{displaymath}
X_t = \sum_{ch=1}^{5} \Psi_{ch} * T_{ch}
\end{displaymath} (6.1)

where the sum runs over the five channels ch, each with its own convolution operator $\Psi_{ch}$, and * denotes the convolution between the filter and the time series $T_{ch}$.
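Equation 6.1 can be estimated by ordinary least squares: lagged copies of the five remote channels are stacked into a design matrix and the filter coefficients are solved for in one shot. The sketch below is our own illustration, not the project's code; the function names and lag convention are assumptions:

```python
import numpy as np

def fit_iro(remote, target, length=11):
    """Least-squares Wiener filter (impulse response operator).

    remote: (5, n) array of the five channels at the remote site.
    target: (n,) channel to predict. length is the (odd) IRO length.
    Returns psi of shape (5, length): one convolution operator per channel.
    """
    assert length % 2 == 1
    half = length // 2
    n = target.size
    # Design matrix: lagged copies of every remote channel.
    cols = [np.roll(ch, lag) for ch in remote for lag in range(-half, half + 1)]
    A = np.stack(cols, axis=1)[half:n - half]  # drop wrapped-around edges
    b = target[half:n - half]
    psi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return psi.reshape(5, length)

def predict(remote, psi):
    """Apply Eq. 6.1: the sum of per-channel convolutions."""
    return sum(np.convolve(ch, p, mode="same") for ch, p in zip(remote, psi))
```

For an 11-point IRO this solves for 55 coefficients per predicted channel.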

The length of the IRO can be any odd number, and predictions have been observed to improve with longer IROs. With a long enough operator the least squares fit can be made arbitrarily fine (RMS(signal - fit) -> 0), but such a fine fit is also fitting noise unique to the data segment used to compute the IRO. We choose an IRO length which gives a fit roughly as good as its prediction of future signal. Figure 6.2 shows the ratio of RMS signal to residuals as a function of IRO length.

Figure 6.2: Ratio of RMS signal to residuals as a function of IRO length; the fit improves as the filter gets longer.

This approach is computationally demanding: computing the IRO requires inverting a matrix of dimension 5 times the filter length and multiplying two matrices with dimensions [number of seconds of data] x [5 times the filter length], so the number of calculations rises quadratically with filter length. For day-long time segments (86,400 rows) it is difficult to compute an IRO much longer than 35 points. An optimization can be performed for a given time segment length to determine the best IRO length. Alternative approaches may include a constrained least squares inversion using support vector machines, or ARMA methods. For simply reducing one channel to residuals through modeling, invertible ARMA methods yield reductions as good or better. The disadvantage of this type of modeling, however, is that it is difficult to use in predicting one station from another. Furthermore, the method is computationally expensive and can only model short segments, of order one hour, with the current computing system and software.
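A back-of-envelope cost model makes the quadratic scaling explicit. Assuming a normal-equations solve (our assumption; the actual solver is not specified here), forming the normal equations for n samples and p = 5 x (filter length) unknowns costs on the order of n*p^2 operations, plus p^3 for the inversion:

```python
def iro_cost(n_samples, filter_length, n_channels=5):
    """Rough operation count for the least-squares IRO solve.

    p = n_channels * filter_length unknowns; forming the normal
    equations costs ~n*p^2 operations and inverting them ~p^3.
    """
    p = n_channels * filter_length
    return n_samples * p**2 + p**3
```

For a day of 1 Hz data (86,400 samples), doubling the filter length roughly quadruples the work, consistent with the practical ceiling of about 35 points noted above.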

Figure 6.3a shows the result of five 11-point Wiener filters applied to the five channels of Parkfield on day 228 of 1996. This day was chosen for its good signal-to-noise ratios in the raw data, and because an M 4.0 earthquake occurred one month later near the Parkfield array. The IRO was computed using this day's data, so this is essentially a least squares fit. The edges are imperfect, but the fit is generally excellent. The RMS ratio is around 12.2; if we neglect the edges (5000 s on either side), it rises to 14.

Figure 6.3: a): A least squares fit (green) to the signal (blue), and the residuals (red), using an 11-point filter on the PKD data of day 228, 1996. b): Signal (blue), prediction (green), and residual (red) from using the day 228 filters to predict day 230. Note the change in scale from panel a.

In Figure 6.3b, the day 228 filters are used for prediction of day 230. We can see that the shape of the fit is again excellent, but there are some long period effects, which leave the prediction higher than the signal in some places and lower than the signal in others.

In the raw data there are some long-period, instrument-related diurnal effects in the magnetic channels, and a high-pass filter has been designed to remove them. Care is required in filtering out long-period signals, as the MT precursors we are looking for could themselves be low frequency.
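The report does not give the filter design. As an illustration only, subtracting a centered running mean acts as a crude high-pass that removes slow drifts; the window length sets the cutoff and must be chosen long enough that low-frequency precursor candidates survive:

```python
import numpy as np

def highpass_running_mean(x, window=3600):
    """Crude high-pass: subtract a centered running mean (window in samples).

    At 1 Hz, window=3600 removes variations slower than roughly an hour
    (diurnal drifts included) while passing shorter-period signals.
    """
    kernel = np.ones(window) / window
    baseline = np.convolve(x, kernel, mode="same")
    return x - baseline
```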

The Future

Residual analysis in the time domain is free of errors inherent in frequency-domain methods: the Gibbs phenomenon and effects due to discrete modeling do not arise. The time-domain residuals can be computed and scanned for anomalous activity. Bandpass filtering of the raw data will likely remove some of the prediction misfit. Cutting the data into smaller parcels (one to three hours) and detrending each segment individually may also reduce some of the long-period noise. High-frequency noise likewise leads to misfits; low-pass filters will be employed to decimate the signal to about 0.03 Hz, as we are looking for signals with durations greater than about half a minute. Cleaning out this high-frequency noise will likely improve predictions. The code is in place to begin the filtering this fall. Experiments with other prediction methods (such as the constrained least squares and ARMA approaches mentioned earlier) will continue as well. The plan is to have in place, within the next four months, an automated system that reads in data from the array, despikes it, computes residuals, and scans the residuals for RMS anomalies.
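The final scanning step of such a system could be sketched as follows. This is our own illustration; the one-hour window and 3-sigma threshold are placeholder choices, not values from the project:

```python
import numpy as np

def rms(x):
    """Root-mean-square amplitude."""
    return np.sqrt(np.mean(np.square(x)))

def scan_residuals(residual, window=3600, n_sigma=3.0):
    """Flag windows of the residual series with anomalously high RMS.

    Splits the residual into non-overlapping windows (seconds, at 1 Hz)
    and returns the start indices of windows whose RMS exceeds the mean
    window RMS by more than n_sigma standard deviations.
    """
    n = residual.size // window
    window_rms = np.array([rms(residual[i * window:(i + 1) * window])
                           for i in range(n)])
    threshold = window_rms.mean() + n_sigma * window_rms.std()
    return [i * window for i in range(n) if window_rms[i] > threshold]
```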

Table 6.1: Sites of MT observatories
Site Net Latitude Longitude Elev (m) Date Location  
PKD BK 35.945171 -120.541603 583 1999/02/05 - Bear Valley Ranch, Parkfield  
PKD1 BK 35.8894 -120.426109 431.6 1995/06/06 - 1999/03/08 Haliburton House, Parkfield  
SAO BK 36.76403 -121.44722 317.2 1995/08/15 - San Andreas Obs., Hollister  

Table 6.2: Typical data streams acquired at each MT site, with channel name, sampling rate, sampling mode, and FIR filter type. C indicates continuous; T triggered; Ac acausal.
Sensor Channel Rate (sps) Mode FIR
Magnetic VT? 0.1 C Ac
Magnetic LT? 1.0 C Ac
Magnetic BT? 40.0 C Ac
Electric VQ? 0.1 C Ac
Electric LQ? 1.0 C Ac
Electric BQ? 40.0 C Ac


Frank Morrison directs the MT program, and collaborates closely with Gary Egbert of Oregon State University and Steve Park of UC Riverside. Sierra Boyd, John Friday, Lind Gee, and Doug Neuhauser also contribute to the operation of the MT observatories. Karl Kappler, Sierra Boyd, Gary Egbert, and Lind Gee contributed to the preparation of this chapter.

Support for the MT array is provided by the USGS through the NEHRP external grants program and by the Plato Malozemoff Chair in Mineral Engineering held by Frank Morrison.


References

Egbert, G.D., M. Eisel, O.S. Boyd and H.F. Morrison, DC trains and Pc3s: Source effects in mid-latitude geomagnetic transfer functions, Geophys. Res. Lett., 27, 25-28, 2000.

Egbert, G.D., Robust Multiple-Station Magnetotelluric Data Processing, Geoph. J. Int., 130, 475-496, 1997.

Eisel, M., and G.D. Egbert, On the stability of magnetotelluric transfer function estimates and the reliability of their variances, Geophys. J. Int., 144, 65-82, 2001.

Berkeley Seismological Laboratory
215 McCone Hall, UC Berkeley, Berkeley, CA 94720-4760
© 2004, The Regents of the University of California