Publications

Results 1–25 of 89

LocOO3D User's Manual

Davenport, Kathy D.; Conley, Andrea C.; Downey, Nathan J.; Ballard, Sanford B.; Hipp, James R.; Begnaud, Mike B.

LocOO3D is a software tool that computes geographical locations for seismic events at regional to global scales. This software has a rich set of features, including the ability to use custom 3D velocity models, correlated observations, and master-event locations. LocOO3D is especially useful for research related to seismic monitoring applications, since it allows users to easily explore a variety of location methods and scenarios and is compatible with the CSS3.0 data format used in monitoring applications. The LocOO3D software, user's manual, and examples are available at https://github.com/sandialabs/LocOO3D. For additional information on GeoTess, SALSA3D, RSTT, and other related software, see https://github.com/sandialabs/GeoTessJava, www.sandia.gov/geotess, www.sandia.gov/salsa3d, and www.sandia.gov/rstt.

PCalc User's Manual

Conley, Andrea C.; Downey, Nathan J.; Ballard, Sanford B.; Hipp, James R.; Hammond, Patrick H.; Davenport, Kathy D.; Begnaud, Michael L.

PCalc is a software tool that computes travel-time predictions, ray-path geometry, and model queries. This software has a rich set of features, including the ability to use custom 3D velocity models to compute predictions for a variety of geometries. PCalc is especially useful for research related to seismic monitoring applications.

3D Crustal Tomography Model of Utah

Conley, Andrea C.; Hammond, Patrick H.; Ballard, Sanford B.; Begnaud, Michael L.

The ability to accurately locate seismic events is necessary for treaty monitoring. When using techniques that rely on comparing observed and predicted travel times to obtain these locations, it is important that the estimated travel times and their estimated uncertainties are also accurate. The methodology of Ballard et al. (2016a) has been used in the past to generate an accurate 3D tomographic global model of compressional-wave slowness (the SAndia LoS Alamos 3D tomography model, i.e., SALSA3D). To re-establish functionality and to broaden the capabilities of the method to local distances, we applied the methodology of Ballard et al. (2016a) to local data in Utah. This report details the results of the initial model generated, including relocations performed using analyst-picked mining events at West Ridge Mine and three ground-truth events at Bingham Mine. We successfully generated a feasible tomography model that yielded reasonable relocations of the mining events.
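The location step the abstract relies on, comparing observed arrival times against predicted travel times, can be illustrated with a minimal grid search. This is a generic sketch, not the report's actual algorithm: the station geometry, constant-velocity predictor, and all names are invented for illustration.

```python
import numpy as np

def locate_by_grid_search(stations, observed_times, predict_tt, grid_points):
    """Return the candidate source point whose predicted travel times best
    fit the observed arrival times (least-squares residual misfit)."""
    best_point, best_misfit = None, np.inf
    for point in grid_points:
        predicted = np.array([predict_tt(point, s) for s in stations])
        # Demean the residuals so the unknown origin time drops out.
        residuals = observed_times - predicted
        residuals -= residuals.mean()
        misfit = np.sum(residuals ** 2)
        if misfit < best_misfit:
            best_point, best_misfit = point, misfit
    return best_point, best_misfit

# Toy 1D example: stations on a line, constant velocity of 6 km/s.
stations = np.array([0.0, 10.0, 20.0, 40.0])      # station positions, km
true_x, v = 12.0, 6.0
observed_times = np.abs(stations - true_x) / v    # origin time taken as 0
grid = [np.array([x]) for x in np.arange(0.0, 40.0, 0.5)]
loc, misfit = locate_by_grid_search(
    stations, observed_times,
    lambda p, s: abs(s - p[0]) / v,               # toy travel-time predictor
    grid)
print(loc[0])  # → 12.0
```

Real locators replace the toy predictor with travel times computed through a 3D velocity model, which is where an accurate tomography model directly improves the locations.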

The iterative processing framework: A new paradigm for automatic event building

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Encarnacao, Andre V.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald; Sundermier, Amy S.

In a traditional data-processing pipeline, waveforms are acquired, a detector makes the signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to fitting event hypotheses to generate an event bulletin. Most of the time, this traditional pipeline requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Center (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automatic analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, through an iterative process, the AA takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detections dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds additional signal detections to existing events, which saves analyst time, even if the event locations are not significantly affected.
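The iterative loop described in the abstract (associate, refine events with an automated-analyst pass, rescan leftovers for missed events) can be sketched as a skeleton. This is a hypothetical outline, not the AA's actual logic; `associate`, `evaluate_event`, and the toy grouping rule below are stand-ins invented for illustration.

```python
def iterative_event_building(detections, associate, evaluate_event, max_iters=5):
    """Skeleton of the iterative framework: form an initial bulletin, then
    repeatedly (1) refine or reject events and (2) rescan unassociated
    detections for events missed by earlier association runs."""
    bulletin, unassociated = associate(detections)
    for _ in range(max_iters):
        # Automated-analyst pass: refine each event; None means "rejected".
        bulletin = [ev for ev in (evaluate_event(e) for e in bulletin)
                    if ev is not None]
        # Scan leftover detections for events missed previously.
        new_events, unassociated = associate(unassociated)
        if not new_events:
            break  # nothing more can be built from the leftovers
        bulletin.extend(new_events)
    return bulletin

# Toy stand-in associator: detections sharing a tens digit belong together;
# two or more form an event, singletons stay unassociated.
def toy_associate(dets):
    groups = {}
    for d in dets:
        groups.setdefault(d // 10, []).append(d)
    events, leftovers = [], []
    for g in groups.values():
        if len(g) >= 2:
            events.append(g)
        else:
            leftovers.extend(g)
    return events, leftovers

bulletin = iterative_event_building([11, 12, 25, 26, 33], toy_associate,
                                    lambda e: e)
print(bulletin)  # → [[11, 12], [25, 26]]
```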

Rapid and robust cross-correlation-based seismic signal identification using an approximate nearest neighbor method

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Young, Christopher J.; Gonzales, Antonio G.; Ballard, Sanford B.; Encarnacao, Andre V.

The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ∼2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
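The projection idea behind the ANN approach can be illustrated with a toy sketch: each waveform is mapped into a low-dimensional space given by its correlations with a small subset of the template library, and the nearest-neighbor search then runs in that reduced space instead of against the full archive. This is not the paper's iterative neighbors-of-neighbors search; the data sizes, fixed anchor choice, and all names are invented for illustration.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation at zero lag (waveforms pre-aligned)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.dot(a, b) / len(a))

def project(waveform, anchors):
    """Reduced-dimension coordinates: correlations against a small subset
    of the template library (a randomized subset in the paper's method)."""
    return np.array([ncc(waveform, a) for a in anchors])

# Toy library: noisy copies of five base signals.
rng = np.random.default_rng(0)
n, n_templates, n_anchors = 256, 500, 20
bases = [rng.standard_normal(n) for _ in range(5)]
templates = [bases[i % 5] + 0.2 * rng.standard_normal(n)
             for i in range(n_templates)]
anchors = templates[:n_anchors]          # first 20 templates cover all bases
coords = np.array([project(t, anchors) for t in templates])

# Query: a fresh noisy copy of base 0; search runs in the reduced space only.
query = bases[0] + 0.2 * rng.standard_normal(n)
q = project(query, anchors)
nearest = int(np.argmin(np.linalg.norm(coords - q, axis=1)))
print(nearest % 5, round(ncc(query, templates[nearest]), 2))
```

The speedup comes from comparing 20-dimensional coordinate vectors instead of full-length waveforms; only the final candidate needs a full correlation check.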

Pickless event detection and location: The waveform correlation event-detection system (WCEDS) revisited

Bulletin of the Seismological Society of America

Arrowsmith, Stephen J.; Young, Christopher J.; Ballard, Sanford B.; Slinkard, Megan E.; Pankow, Kristine

The standard seismic explosion-monitoring paradigm is based on a sparse, spatially aliased network of stations to monitor either the whole Earth or a region of interest. Under this paradigm, state-of-the-art event-detection methods are based on seismic phase picks, which are associated at multiple stations and located using 3D Earth models. Here, we revisit a concept for event detection that does not require phase picks or 3D models and fuses detection and association into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. We apply our detector to seismic data from Utah and evaluate our results by comparing them with the earthquake catalog published by the University of Utah Seismograph Stations. The results demonstrate that our pickless detector is a viable alternative technique for detecting events that likely requires less analyst overhead than do the existing methods.
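Network-level correlation detection of this kind can be sketched in a few lines: correlate each station's continuous data with an empirical template, stack the correlation traces across the network, and threshold the stack. This is a simplified stand-in for the paper's empirical wavefield stack; the single shared template, two-station network, and all names are invented for illustration.

```python
import numpy as np

def sliding_ncc(data, template):
    """Normalized cross-correlation of a template slid along a longer trace."""
    m = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    out = np.empty(len(data) - m + 1)
    for i in range(len(out)):
        w = data[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-12)
        out[i] = np.dot(w, t) / m
    return out

def network_detector(streams, templates, threshold=0.6):
    """Pickless detection sketch: correlate each station's continuous data
    with its empirical template, stack the correlation traces across the
    network, and declare detections where the stack exceeds the threshold."""
    stack = np.mean([sliding_ncc(d, t) for d, t in zip(streams, templates)],
                    axis=0)
    return np.flatnonzero(stack > threshold), stack

# Toy example: two stations, an "event" waveform embedded at sample 300.
rng = np.random.default_rng(1)
event = rng.standard_normal(100)
streams = []
for _ in range(2):
    trace = 0.3 * rng.standard_normal(1000)
    trace[300:400] += event
    streams.append(trace)
picks, stack = network_detector(streams, [event, event])
print(int(picks[np.argmax(stack[picks])]))  # → 300
```

Because detection happens on the stacked network trace rather than per-station picks, weak arrivals that would fail a single-station detector can still contribute to a network-level detection.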
