Publications

Results 1–50 of 239

Applying Waveform Correlation to Reduce Seismic Analyst Workload Due to Repeating Mining Blasts

Bulletin of the Seismological Society of America

Sundermier, Amy S.; Tibi, Rigobert T.; Brogan, Ronald A.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment to detect and identify mining blasts for two regions, Wyoming (U.S.A.) and Scandinavia, using waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO PrepCom) for up to 10 yr prior to the time of interest. We discuss approaches for template selection, threshold setting, and event detection that are specialized for characterizing mining blasts using a sparse, global network. We apply the approaches to one week of data for each of the two regions to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be generally applied to operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to assess potential reduction in analyst workload.
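The core operation behind this approach is sliding normalized cross-correlation of an archived template against continuous station data, with a per-station threshold on the correlation coefficient. A minimal sketch (not the authors' implementation; the synthetic waveform, window sizes, and 0.8 threshold are arbitrary choices for illustration):

```python
import numpy as np

def correlate_template(trace, template):
    """Sliding normalized cross-correlation of a template against a trace.

    Returns one Pearson correlation coefficient (in [-1, 1]) per candidate
    alignment of the template within the trace.
    """
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        win = trace[i:i + n]
        std = win.std()
        cc[i] = 0.0 if std == 0 else np.dot(t, (win - win.mean()) / std)
    return cc

# Synthetic example: embed a "repeating blast" waveform in noise, detect it.
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))
trace = 0.1 * rng.standard_normal(1000)
trace[400:500] += template                 # the repeating event
cc = correlate_template(trace, template)
detections = np.flatnonzero(cc > 0.8)      # per-station CC threshold
print(detections)
```

Because the coefficient is normalized, the detector is insensitive to absolute amplitude, which is what makes it effective for repeating sources such as mining blasts recorded at the same sparse stations.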

More Details

Applying Waveform Correlation and Waveform Template Metadata to Aftershocks in the Middle East to Reduce Analyst Workload

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Organizations that monitor for underground nuclear explosive tests are interested in techniques that automatically characterize recurring events such as aftershocks to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is a technique that is effective in finding similar waveforms from repeating seismic events. In this study, we apply waveform correlation in combination with template event metadata to two aftershock sequences in the Middle East to seek corroborating detections from multiple stations in the International Monitoring System of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization. We use waveform templates from stations that are within regional distance of aftershock sequences to detect subsequent events, then use template event metadata to discover what stations are likely to record corroborating arrival waveforms for recurring aftershock events at the same location, and develop additional waveform templates to seek corroborating detections. We evaluate the results with the goal of determining whether applying the method to aftershock events will improve the choice of waveform correlation detections that lead to bulletin-worthy events and reduction of analyst effort.

More Details

Applying Waveform Correlation and Waveform Template Metadata to Mining Blasts to Reduce Analyst Workload

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Organizations that monitor for underground nuclear explosive tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. In this study, we use waveform template event metadata to seek corroborating detections from multiple stations in the International Monitoring System of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization. We build upon events detected in a prior waveform correlation study of mining blasts in two geographic regions, Wyoming and Scandinavia. Using a set of expert analyst-reviewed waveform correlation events that were declared to be true positive detections, we explore criteria for choosing the waveform correlation detections that are most likely to lead to bulletin-worthy events and reduction of analyst effort.

More Details

Evaluation of the PhaseNet Model Applied to the IMS Seismic Network

Garcia, Jorge A.; Heck, Stephen H.; Young, Christopher J.; Brogan, Ronald B.

Producing a complete and accurate set of signal detections is essential for automatically building and characterizing seismic events of interest for nuclear explosion monitoring. Signal detection algorithms have been an area of research for decades, but still produce large quantities of false detections and misidentify real signals that must be detected to produce a complete global catalog of events of interest. Deep learning methods have shown promising capabilities in effectively characterizing seismic signals for complex tasks such as identifying phase arrival times. We use the PhaseNet model, a UNet-based neural network trained on local distance data from northern California, to predict seismic arrivals on data from the International Monitoring System (IMS) global network. We use an analyst-curated bulletin generated from this data set to compare the performance of PhaseNet to that of the Short-Term Average/Long-Term Average (STA/LTA) algorithm. We find that PhaseNet has the potential to outperform traditional processing methods and recommend the training of a new model with the IMS data to achieve optimal performance.

More Details

Generating uncertainty distributions for seismic signal onset times

Bulletin of the Seismological Society of America

Peterson, Matthew G.; Vollmer, Charles V.; Brogan, Ronald; Stracuzzi, David J.; Young, Christopher J.

Signal arrival-time estimation plays a critical role in a variety of downstream seismic analyses, including location estimation and source characterization. Any arrival-time errors propagate through subsequent data-processing results. In this article, we detail a general framework for refining estimated seismic signal arrival times along with full estimation of their associated uncertainty. Using the standard short-term average/long-term average threshold algorithm to identify a search window, we demonstrate how to refine the pick estimate through two different approaches. In both cases, new waveform realizations are generated through bootstrap algorithms to produce full a posteriori estimates of the uncertainty of the seismic signal onset arrival time. The onset arrival uncertainty estimates provide additional data-derived information from the signal and have the potential to influence seismic analysis along several fronts.
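The bootstrap idea can be sketched simply: perturb the waveform with resampled pre-arrival noise, repick each realization, and treat the spread of the picks as the onset-time uncertainty. Everything below (the threshold picker, the synthetic trace, the resampling scale) is a simplified stand-in for the estimators developed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def threshold_pick(x, noise_level, k=5.0):
    """Return the first index where |x| exceeds k times the noise level."""
    idx = np.flatnonzero(np.abs(x) > k * noise_level)
    return int(idx[0]) if idx.size else -1

# Synthetic trace: noise, then an emergent arrival starting at sample 500.
n = 1000
noise = 0.1 * rng.standard_normal(n)
signal = np.zeros(n)
t = np.arange(500)
signal[500:] = np.sin(2 * np.pi * 0.05 * t) * (1 - np.exp(-t / 30.0))
trace = noise + signal

# Bootstrap: perturb with resampled pre-arrival noise and repick each time.
pre = trace[:400]                                  # noise-only segment
picks = []
for _ in range(200):
    perturbed = trace + 0.5 * rng.choice(pre, size=n)  # resampled noise
    picks.append(threshold_pick(perturbed, pre.std()))
picks = np.array(picks)
print(np.median(picks), picks.std())   # onset estimate and its spread
```

The distribution of `picks` (not just its standard deviation) is the a posteriori uncertainty object the abstract refers to; for an emergent onset like this one it is typically asymmetric, biased late of the true first break.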

More Details

Deep learning denoising applied to regional distance seismic data in Utah

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Hammond, Patrick H.; Brogan, Ronald; Young, Christopher J.; Koper, Keith

Seismic waveform data are generally contaminated by noise from various sources. Suppressing this noise effectively so that the remaining signal of interest can be successfully exploited remains a fundamental problem for the seismological community. To date, the most common noise suppression methods have been based on frequency filtering. These methods, however, are less effective when the signal of interest and noise share similar frequency bands. Inspired by source separation studies in the field of music information retrieval (Jansson et al., 2017) and a recent study in seismology (Zhu et al., 2019), we implemented a seismic denoising method that uses a trained deep convolutional neural network (CNN) model to decompose an input waveform into a signal of interest and noise. In our approach, the CNN provides a signal mask and a noise mask for an input signal. The short-time Fourier transform (STFT) of the estimated signal is obtained by multiplying the signal mask with the STFT of the input signal. To build and test the denoiser, we used carefully compiled signal and noise datasets of seismograms recorded by the University of Utah Seismograph Stations network. Results of test runs involving more than 9000 constructed waveforms suggest that on average the denoiser improves the signal-to-noise ratios (SNRs) by ∼5 dB, and that most of the recovered signal waveforms have high similarity with respect to the target waveforms (average correlation coefficient of ∼0.80) and suffer little distortion. Application to real data suggests that our denoiser achieves on average a factor of up to ∼2–5 improvement in SNR over band-pass filtering and can suppress many types of noise that band-pass filtering cannot. For individual waveforms, the improvement can be as high as ∼15 dB.
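The masking step itself is straightforward once a mask is available. In the sketch below, an oracle soft mask computed from a known synthetic signal stands in for the CNN prediction; in the actual method, the network's entire job is to estimate such a mask from the noisy input alone:

```python
import numpy as np
from scipy.signal import stft, istft

rng = np.random.default_rng(2)
fs = 100.0                              # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 3 * t)      # "signal of interest" near 3 Hz
noise = 0.5 * rng.standard_normal(t.size)
x = signal + noise

f, tt, Z = stft(x, fs=fs, nperseg=128)

# Oracle stand-in for the CNN-predicted signal mask, built from the known
# clean signal and noise (unavailable in practice; the CNN estimates this).
_, _, Zs = stft(signal, fs=fs, nperseg=128)
_, _, Zn = stft(noise, fs=fs, nperseg=128)
mask = np.abs(Zs) / (np.abs(Zs) + np.abs(Zn) + 1e-12)

# Estimated-signal STFT = signal mask times the input STFT, then invert.
_, denoised = istft(mask * Z, fs=fs, nperseg=128)
denoised = denoised[: x.size]

def snr_db(clean, est):
    resid = est - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(resid ** 2))

print(snr_db(signal, x), snr_db(signal, denoised))
```

Because the mask attenuates time-frequency bins where noise dominates rather than whole frequency bands, it can suppress noise that overlaps the signal band, which band-pass filtering cannot.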

More Details

Applying Waveform Correlation to Mining Blasts Using a Global Sparse Network

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment that uses waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization for up to 10 years prior to the time period of interest to detect and identify mining blasts that occur during single weeks of study. We discuss approaches for template selection, threshold setting, and event detection that are specialized for mining blasts and a sparse, global network. We apply the approaches to two different weeks of study for each of two geographic regions, Wyoming and Scandinavia, to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be effective for operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to develop an intuition about potential reduction in analyst workload.

More Details

Applying Waveform Correlation to Mining Blasts Using a Global Sparse Network

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment that uses waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization for up to 10 years prior to the time period of interest to detect and identify mining blasts that occur during single weeks of study. We discuss approaches for template selection, threshold setting, and event detection that are specialized for mining blasts and a sparse, global network. We apply the approaches to two different weeks of study for each of two geographic regions, Wyoming and Scandinavia, to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be effective for operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to develop an intuition about potential reduction in analyst workload.

More Details

The iterative processing framework: A new paradigm for automatic event building

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Encarnacao, Andre V.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald; Sundermier, Amy S.

In a traditional data-processing pipeline, waveforms are acquired, a detector makes the signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to the fitting-event hypotheses to generate an event bulletin. Most of the time, this traditional pipeline requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Centre (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automatic analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, through an iterative process, the AA takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detections dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds additional signal detections to existing events, which saves analyst time, even if the event locations are not significantly affected.

More Details

Classification of local seismic events in the Utah region: A comparison of amplitude ratio methods with a spectrogram-based machine learning approach

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Linville, Lisa L.; Young, Christopher J.; Brogan, Ronald

The capability to discriminate low-magnitude earthquakes from low-yield anthropogenic sources, both detectable only at local distances, is of increasing interest to the event monitoring community. We used a dataset of seismic events in Utah recorded during a 14-day period (1–14 January 2011) by the University of Utah Seismograph Stations network to perform a comparative study of event classification at local scale using amplitude ratio (AR) methods and a machine learning (ML) approach. The event catalog consists of 7377 events with magnitudes (MC) ranging from below −2 up to 5.8. Events were subdivided into six populations based on location and source type: tectonic earthquakes (TEs), mining-induced events (MIEs), and mining blasts from four known mines (WMB, SMB, LMB, and CQB). The AR approach jointly exploits Pg-to-Sg phase ARs and Rg-to-Sg spectral ARs in multivariate quadratic discriminant functions and was able to classify 370 events with high signal quality from the three groups with sufficient size (TE, MIE, and SMB). For that subset of the events, the method achieved success rates between about 80% and 90%. The ML approach used trained convolutional neural network (CNN) models to classify the populations. The CNN approach was able to classify the subset of events with accuracies between about 91% and 98%. Because the neural network approach does not have a minimum signal quality requirement, we applied it to the entire event catalog, including the abundant extremely low-magnitude events, and achieved accuracies of about 94%–100%. We compare the AR and ML methodologies using a broad set of criteria and conclude that a major advantage to ML methods is their robustness to low signal-to-noise ratio data, allowing them to classify significantly smaller events.
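The amplitude-ratio branch amounts to fitting a Gaussian per class over the ratio features and classifying by likelihood, which is what a quadratic discriminant function does. A stripped-down sketch with fabricated two-dimensional features (the study itself used Pg/Sg phase ratios and Rg/Sg spectral ratios across several bands):

```python
import numpy as np

rng = np.random.default_rng(3)

def qda_fit(X):
    """Gaussian class model: mean vector and covariance of the features."""
    return X.mean(axis=0), np.cov(X.T)

def qda_loglik(x, mu, cov):
    """Class log-likelihood up to a shared constant (quadratic in x)."""
    d = x - mu
    return -0.5 * (d @ np.linalg.inv(cov) @ d + np.log(np.linalg.det(cov)))

# Hypothetical training features: log P/S amplitude ratios in two bands.
# Earthquakes tend toward lower high-frequency P/S ratios than explosions.
eq = rng.multivariate_normal([-0.4, -0.6], 0.05 * np.eye(2), size=200)
ex = rng.multivariate_normal([0.3, 0.5], 0.05 * np.eye(2), size=200)

models = {"earthquake": qda_fit(eq), "explosion": qda_fit(ex)}

def classify(x):
    return max(models, key=lambda k: qda_loglik(x, *models[k]))

print(classify(np.array([-0.5, -0.5])), classify(np.array([0.4, 0.6])))
```

The abstract's key contrast follows directly: these ratio features require cleanly measurable Pg, Sg, and Rg amplitudes (hence the 370-event high-quality subset), whereas a CNN operating on spectrograms imposes no such minimum signal quality.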

More Details

Global- and local-scale high-resolution event catalogs for algorithm testing

Seismological Research Letters

Linville, Lisa L.; Brogan, Ronald C.; Young, Christopher J.; Aur, Katherine A.

During the development of new seismic data processing methods, the verification of potential events and associated signals can present a nontrivial obstacle to the assessment of algorithm performance, especially as detection thresholds are lowered, resulting in the inclusion of significantly more anthropogenic signals. Here, we present two 14 day seismic event catalogs, a local-scale catalog developed using data from the University of Utah Seismograph Stations network, and a global-scale catalog developed using data from the International Monitoring System. Each catalog was built manually to comprehensively identify events from all sources that were locatable using phase arrival timing and directional information from seismic network stations, resulting in significantly more identified events than existing catalogs contain. The new catalogs additionally contain challenging event sequences (prolific aftershocks and small events at the detection and location threshold) and novel event types and sources (e.g., infrasound only events and long-wall mining events) that make them useful for algorithm testing and development, as well as valuable for the unique tectonic and anthropogenic event sequences they contain.

More Details