Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment that uses waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization for up to 10 years prior to the time period of interest to detect and identify mining blasts that occur during single weeks of study. We discuss approaches for template selection, threshold setting, and event detection that are specialized for mining blasts and a sparse, global network. We apply the approaches to two different weeks of study for each of two geographic regions, Wyoming and Scandinavia, to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be effective for operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to develop an intuition about potential reduction in analyst workload.
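The core of this kind of template matching is a sliding, normalized cross-correlation of a stored master waveform against continuous station data, with detections declared where the correlation trace exceeds a per-template threshold. The following minimal numpy sketch illustrates that step only; the MAD-based threshold rule and the parameter n_mad are illustrative assumptions, not the thresholding procedure used in the study.

    import numpy as np

    def normalized_xcorr(template, data):
        """Slide a template over continuous data and return one
        normalized correlation coefficient (in [-1, 1]) per lag."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        cc = np.empty(len(data) - n + 1)
        for i in range(cc.size):
            win = data[i:i + n]
            sd = win.std()
            cc[i] = 0.0 if sd == 0.0 else np.dot(t, win - win.mean()) / sd
        return cc

    def declare_detections(cc, n_mad=8.0):
        """Illustrative threshold: flag lags whose correlation exceeds
        n_mad median absolute deviations above the median."""
        mad = np.median(np.abs(cc - np.median(cc)))
        return np.flatnonzero(cc > np.median(cc) + n_mad * mad)

In a network setting such as the one described above, per-station correlation traces would be combined across stations before an event is declared, so that a single-channel coincidence does not by itself trigger a detection.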
In a traditional data-processing pipeline, waveforms are acquired, a detector makes signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to fitting event hypotheses to generate an event bulletin. This traditional pipeline typically requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Centre (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automated analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, the AA iteratively takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detection dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (the IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds signal detections to existing events, which saves analyst time even when the event locations are not significantly affected.
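A schematic view of the iterative loop may help: events are associated from the detection pool, an automated-analyst pass refines weak events, and the remaining unassociated arrivals are rescanned until no new events form. In the Python skeleton below, associate and refine_small_event are hypothetical callables standing in for the framework's associator and AA modules; the loop structure is a plausible reading of the abstract, not the authors' implementation.

    from typing import Callable, Dict, List, Set

    def iterative_processing(
        detections: Set[str],
        associate: Callable[[Set[str]], List[Dict]],
        refine_small_event: Callable[[Dict, Set[str]], Dict],
        max_iters: int = 5,
    ) -> List[Dict]:
        """Associate detections into events, refine weak events with an
        automated-analyst step, and rescan leftover arrivals until no
        new events appear."""
        events: List[Dict] = []
        for _ in range(max_iters):
            used = {d for e in events for d in e["arrivals"]}
            unassociated = detections - used
            new_events = associate(unassociated)  # hypothetical associator
            if not new_events:
                break
            # AA pass: improve small events that have few
            # location-defining phases using leftover detections.
            events.extend(refine_small_event(e, unassociated)
                          for e in new_events)
        return events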
The capability to discriminate low-magnitude earthquakes from low-yield anthropogenic sources, both detectable only at local distances, is of increasing interest to the event monitoring community. We used a dataset of seismic events in Utah recorded during a 14-day period (1–14 January 2011) by the University of Utah Seismograph Stations network to perform a comparative study of event classification at local scale using amplitude ratio (AR) methods and a machine learning (ML) approach. The event catalog consists of 7377 events with magnitudes (MC) ranging from below −2 up to 5.8. Events were subdivided into six populations based on location and source type: tectonic earthquakes (TEs), mining-induced events (MIEs), and mining blasts from four known mines (WMB, SMB, LMB, and CQB). The AR approach jointly exploits Pg-to-Sg phase ARs and Rg-to-Sg spectral ARs in multivariate quadratic discriminant functions and was able to classify 370 events with high signal quality from the three groups of sufficient size (TE, MIE, and SMB). For that subset of events, the method achieved success rates between about 80% and 90%. The ML approach used trained convolutional neural network (CNN) models to classify the populations, and was able to classify the same subset of events with accuracies between about 91% and 98%. Because the neural network approach has no minimum signal quality requirement, we also applied it to the entire event catalog, including the abundant extremely low-magnitude events, and achieved accuracies of about 94%–100%. We compare the AR and ML methodologies using a broad set of criteria and conclude that a major advantage of the ML methods is their robustness to low signal-to-noise ratio data, which allows them to classify significantly smaller events.
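For the AR approach, the classification step amounts to fitting quadratic discriminant functions to a small feature vector per event. A minimal sketch using scikit-learn's QuadraticDiscriminantAnalysis follows; the synthetic features (log Pg-to-Sg amplitude ratio and log Rg-to-Sg spectral ratio) and labels are placeholders, since the study's measured ratios are not reproduced here.

    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Placeholder features: one row per event, columns such as
    # log10(Pg/Sg amplitude ratio) and log10(Rg/Sg spectral ratio).
    X = rng.normal(size=(300, 2))
    # Placeholder labels for the three populations of sufficient size:
    # 0 = TE, 1 = MIE, 2 = SMB.
    y = rng.integers(0, 3, size=300)

    qda = QuadraticDiscriminantAnalysis().fit(X, y)
    print(qda.predict(X[:5]))        # hard class assignments
    print(qda.predict_proba(X[:5]))  # class membership probabilities

A quadratic (rather than linear) discriminant allows each population to keep its own feature covariance, which matters when, for example, blast populations cluster much more tightly in ratio space than tectonic earthquakes.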
During the development of new seismic data-processing methods, the verification of potential events and associated signals can present a nontrivial obstacle to the assessment of algorithm performance, especially as detection thresholds are lowered and significantly more anthropogenic signals are included. Here, we present two 14-day seismic event catalogs: a local-scale catalog developed using data from the University of Utah Seismograph Stations network, and a global-scale catalog developed using data from the International Monitoring System. Each catalog was built manually to comprehensively identify events from all sources that were locatable using phase arrival timing and directional information from seismic network stations, yielding significantly more events than existing catalogs. The new catalogs also contain challenging event sequences (prolific aftershocks and small events at the detection and location threshold) and novel event types and sources (e.g., infrasound-only events and longwall mining events) that make them useful for algorithm testing and development, as well as valuable for the unique tectonic and anthropogenic event sequences they contain.