Andrew Michael
I love that seismology lets me study a fascinating natural process and use that knowledge to help people understand earthquakes and live safely with them. I combine observations of earthquakes with statistical models to assess hazards, evaluate earthquake predictions, understand how geologic structures and tectonic forces cause earthquakes, and communicate that information to the public.
Andy Michael has been a geophysicist with the U.S. Geological Survey’s Earthquake Science Center since 1986. He combines observations of earthquake processes with statistical models to determine long-term and short-term earthquake probabilities, to evaluate proposed earthquake prediction methods, and to better understand how stress and structure function as part of the seismogenic process. A graduate of MIT (B.S., 1981) and Stanford University (M.S., 1983; Ph.D., 1986), he has authored over 100 papers and reports. He was Editor-in-Chief of the Bulletin of the Seismological Society of America from 2004 to 2010 and also served the Society as President and on its Board of Directors.
His outreach efforts include founding the Earthquake Science Center web site, which became part of earthquake.usgs.gov and facilitates the rapid dissemination of earthquake information, and creating a lecture and performance titled “The Music of Earthquakes.” That lecture combines music and seismology and features “Earthquake Quartet #1,” his composition for voice, cello, trombone, and sonified seismograms. He is also a founder of an online educational resource, the Community Online Resource for Statistical Seismicity Analysis (CORSSA).
He currently works on USGS aftershock forecasts under the Earthquake Processes, Probabilities, and Occurrence Project and on long-term hazard assessments as part of the National Seismic Hazard Model Project, and he is a member of the National Earthquake Prediction Evaluation Council.
For his service to the Seismological Society of America, he received its Distinguished Service Award in 2011; for his career contributions, he received the Department of the Interior’s Distinguished Service Award in 2019.
Science and Products
Size distribution of Parkfield’s microearthquakes reflects changes in surface creep rate
The nucleation area of the series of M6 events in Parkfield has been shown to be characterized by low b-values throughout the seismic cycle. Since low b-values represent high differential stresses, the asperity structure seems to be always stably stressed and even unaffected by the latest main shock in 2004. However, because fault loading rates and applied shear stress vary with time, some degree…
Authors: Theresa Tormann, Stefan Wiemer, Sabrina Metzger, Andrew J. Michael, Jeanne L. Hardebeck
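For readers unfamiliar with b-values: they are the slope of the Gutenberg–Richter relation, log10 N = a − bM, and the standard maximum-likelihood estimator is short enough to sketch. The following is a minimal illustration on a synthetic catalog, not the Parkfield data or the authors’ code; the completeness magnitude and bin width are assumed inputs.

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965); dm > 0 applies Utsu's
    correction for catalogs binned to the nearest dm magnitude units."""
    m = np.asarray(mags)
    m = m[m >= m_c]  # keep only events at or above the completeness magnitude
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter magnitudes with a true b-value of 1.0:
# above m_c, magnitudes are exponentially distributed with scale log10(e) / b.
rng = np.random.default_rng(42)
true_b, m_c = 1.0, 1.5
mags = m_c + rng.exponential(np.log10(np.e) / true_b, size=5000)

print(f"estimated b = {b_value_mle(mags, m_c):.2f}")  # close to 1.0
```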
Do aftershock probabilities decay with time?
So, do aftershock probabilities decay with time? Consider a thought experiment in which we are at the time of the mainshock and ask how many aftershocks will occur a day, week, month, year, or even a century from now. First we must decide how large a window to use around each point in time. Let's assume that, as we go further into the future, we are asking a less precise question. Perhaps a day…
Authors: Andrew J. Michael
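The decay in question is conventionally modeled with the modified Omori law, in which the aftershock rate t days after the mainshock is K/(t + c)^p. A minimal sketch of the thought experiment, with illustrative parameter values and the window around each future time taken, as an assumption, to be plus or minus half the horizon:

```python
def omori_expected(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected aftershock count between t1 and t2 days after the
    mainshock, integrating the modified Omori law K / (t + c)^p (p != 1)."""
    return K * ((t1 + c) ** (1 - p) - (t2 + c) ** (1 - p)) / (p - 1)

# Day, week, month, year, century: the window widens with the horizon,
# so the expected count decays far more slowly than the rate itself.
for t in [1.0, 7.0, 30.0, 365.0, 36525.0]:
    n = omori_expected(0.5 * t, 1.5 * t)
    print(f"t = {t:>8.0f} d: expected aftershocks in +/- t/2 window = {n:.1f}")
```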
Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks
Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large…
Authors: Andrew J. Michael
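The lower of those two numbers comes from the Reasenberg and Jones (1989) rate model, λ(t) = 10^(a + b(Mm − M)) (t + c)^(−p), with the probability of at least one M ≥ 7 event obtained by integrating the rate and treating the sequence as a Poisson process. A sketch using the generic California parameters commonly quoted for that model (treated here as assumptions); with these defaults the three-day M ≥ 7 probability comes out near the 0.0009 quoted above:

```python
import math

def rj_probability(m_main, m_target, t1, t2,
                   a=-1.67, b=0.91, c=0.05, p=1.08):
    """Probability of >= 1 event of magnitude >= m_target in the window
    [t1, t2] days after a magnitude m_main shock, under the Reasenberg &
    Jones (1989) model (defaults: generic California parameters)."""
    rate_scale = 10.0 ** (a + b * (m_main - m_target))
    # Integral of (t + c)^-p over [t1, t2], valid for p != 1
    integral = ((t1 + c) ** (1.0 - p) - (t2 + c) ** (1.0 - p)) / (p - 1.0)
    return 1.0 - math.exp(-rate_scale * integral)

# Chance that an M 4.8 shock is followed by an M >= 7 event within 3 days
print(f"P = {rj_probability(4.8, 7.0, 0.0, 3.0):.4f}")
```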
Random variability explains apparent global clustering of large earthquakes
The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock…
Authors: A.J. Michael
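One of the simplest tests in that spirit asks whether inter-event times are more variable than a constant-rate (Poisson) null hypothesis allows, using Monte Carlo simulation. A generic sketch on synthetic data, not the actual M ≥ 7 catalog or the paper’s specific tests:

```python
import numpy as np

rng = np.random.default_rng(0)

def clustering_stat(times):
    """Coefficient of variation of inter-event times: near 1 for a Poisson
    process, substantially above 1 for temporally clustered catalogs."""
    gaps = np.diff(np.sort(times))
    return gaps.std() / gaps.mean()

# "Observed" catalog: stand-in uniform-random event times (Poisson-like)
n_events, span_years = 80, 113.0
observed = rng.uniform(0.0, span_years, n_events)
obs_stat = clustering_stat(observed)

# Null distribution: many synthetic constant-rate catalogs of the same size
sims = np.array([clustering_stat(rng.uniform(0.0, span_years, n_events))
                 for _ in range(10_000)])
p_value = (sims >= obs_stat).mean()
print(f"CV = {obs_stat:.2f}, one-sided p = {p_value:.3f}")
```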
Community online resource for statistical seismicity analysis
[No abstract available]
Authors: J.D. Zechar, J.L. Hardebeck, A.J. Michael, M. Naylor, S. Steacy, S. Wiemer, J. Zhuang
CORSSA: The Community Online Resource for Statistical Seismicity Analysis
Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early…
Authors: Andrew J. Michael, Stefan Wiemer
BSSA: Worth thinking about
The Bulletin of the Seismological Society of America (BSSA) is a powerful community project that has helped us share the information necessary to keep our field moving forward since 1911. In some ways, BSSA is much like it has always been, and each issue provides us with a collection of research that has been improved by the peer review process and copyedited, typeset, and printed to make it…
Authors: Andrew J. Michael
Improved tests reveal that the accelerating moment release hypothesis is statistically insignificant
We test the hypothesis that accelerating moment release (AMR) is a precursor to large earthquakes, using data from California, Nevada, and Sumatra. Spurious cases of AMR can arise from data fitting because the time period, area, and sometimes magnitude range analyzed before each main shock are often optimized to produce the strongest AMR signal. Optimizing the search criteria can identify apparent…
Authors: J.L. Hardebeck, K.R. Felzer, A.J. Michael
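AMR is typically quantified by fitting cumulative Benioff strain with a power-law time-to-failure model, ε(t) = A + B(tf − t)^m, and comparing its misfit to a straight-line fit (a ratio often called the curvature parameter C, with C well below 1 signaling apparent acceleration). A rough sketch on synthetic, acceleration-free data; the parameterization, bounds, and starting values are illustrative, not the paper’s:

```python
import numpy as np
from scipy.optimize import curve_fit

def amr_model(t, A, B, tf, m):
    """Power-law time-to-failure model for cumulative Benioff strain."""
    return A + B * (tf - t) ** m

# Synthetic, acceleration-free "catalog": linear strain release plus noise
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 10.0, 60))
strain = 5.0 * t + rng.normal(0.0, 1.0, t.size)

# Power-law fit; bounds keep tf beyond the data so (tf - t) stays positive
p0 = [0.0, -5.0, 12.0, 0.9]
bounds = ([-np.inf, -np.inf, 10.5, 0.01], [np.inf, np.inf, 1000.0, 1.0])
popt, _ = curve_fit(amr_model, t, strain, p0=p0, bounds=bounds)
rms_power = np.sqrt(np.mean((strain - amr_model(t, *popt)) ** 2))

# Straight-line fit for comparison
line = np.polyval(np.polyfit(t, strain, 1), t)
rms_line = np.sqrt(np.mean((strain - line) ** 2))

print(f"C = {rms_power / rms_line:.2f}")  # ~1 here: no real acceleration
```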
Seismic velocity structure and seismotectonics of the eastern San Francisco Bay region, California
The Hayward Fault System is considered the most likely fault system in the San Francisco Bay Area, California, to produce a major earthquake in the next 30 years. To better understand this fault system, we use microseismicity to study its structure and kinematics. We present a new 3D seismic-velocity model for the eastern San Francisco Bay region, using microseismicity and controlled sources…
Authors: J.L. Hardebeck, A.J. Michael, T. M. Brocher
Damped regional-scale stress inversions: Methodology and examples for southern California and the Coalinga aftershock sequence
We present a new focal mechanism stress inversion technique to produce regional-scale models of stress orientation containing the minimum complexity necessary to fit the data. Current practice is to divide a region into small subareas and to independently fit a stress tensor to the focal mechanisms of each subarea. This procedure may lead to apparent spatial variability that is actually an…
Authors: J.L. Hardebeck, A.J. Michael
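The damping idea can be illustrated separately from the focal mechanism machinery: stack a data-misfit system with a scaled first-difference penalty between adjacent subareas and solve the combined system by least squares. A toy sketch with scalar parameters standing in for stress tensors; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Estimate one parameter per subarea from noisy data, penalizing
# differences between adjacent subareas to suppress artificial variability.
n = 20                                          # number of subareas
truth = np.where(np.arange(n) < 10, 1.0, 2.0)   # one real jump in the region
d = truth + rng.normal(0.0, 0.5, n)             # noisy per-subarea data

G = np.eye(n)                       # each datum constrains one subarea
D = np.diff(np.eye(n), axis=0)      # first differences between neighbors

def damped_lstsq(e):
    """Minimize ||G m - d||^2 + e^2 ||D m||^2 by stacking the two systems."""
    A = np.vstack([G, e * D])
    b = np.concatenate([d, np.zeros(n - 1)])
    return np.linalg.lstsq(A, b, rcond=None)[0]

for e in [0.0, 1.0, 10.0]:          # stronger damping -> smoother model
    m = damped_lstsq(e)
    print(f"e = {e:>4}: model roughness = {np.abs(np.diff(m)).sum():.2f}")
```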
Implications for prediction and hazard assessment from the 2004 Parkfield earthquake
Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal…
Authors: W. H. Bakun, Brad T. Aagaard, B. Dost, William L. Ellsworth, Jeanne L. Hardebeck, Ruth A. Harris, C. Ji, Malcolm J. S. Johnston, John O. Langbein, James J. Lienkaemper, Andrew J. Michael, Jessica R. Murray, R.M. Nadeau, P.A. Reasenberg, M.S. Reichle, Evelyn A. Roeloffs, A. Shakal, Robert W. Simpson, F. Waldhauser