This document contains the glossary of terms used for the IDC Re-Engineering Phase 2 project. This version was created for Iteration E3. The IDC applies automatic processing methods to produce, archive, and distribute standard IDC products on behalf of all States Parties.
The standard seismic explosion-monitoring paradigm is based on a sparse, spatially aliased network of stations to monitor either the whole Earth or a region of interest. Under this paradigm, state-of-the-art event-detection methods are based on seismic phase picks, which are associated at multiple stations and located using 3D Earth models. Here, we revisit a concept for event detection that does not require phase picks or 3D models and that fuses detection and association into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. We apply our detector to seismic data from Utah and evaluate our results by comparing them with the earthquake catalog published by the University of Utah Seismograph Stations. The results demonstrate that our pickless detector is a viable alternative technique for detecting events that likely requires less analyst overhead than existing methods do.
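The core idea above, correlating station envelopes against node-specific empirical templates and summing over the network, can be illustrated with a minimal sketch. All names, the two-station geometry, and the synthetic data below are illustrative assumptions, not the published implementation; the real detector builds its empirical stacks from catalog waveforms rather than inserting known signals.

```python
import numpy as np

def sliding_corr(x, t):
    """Normalized cross-correlation of template t at every offset of x."""
    n = len(t)
    t = (t - t.mean()) / (np.linalg.norm(t - t.mean()) + 1e-12)
    out = np.zeros(len(x) - n + 1)
    for i in range(len(out)):
        w = x[i:i + n] - x[i:i + n].mean()
        out[i] = np.dot(w, t) / (np.linalg.norm(w) + 1e-12)
    return out

def pickless_detect(envelopes, stacks, threshold):
    """For each candidate grid node, correlate each station envelope with
    that node's empirical template and sum the correlation traces over the
    network; the best (node, origin sample, score) above threshold is a
    detection, so detection, association, and location happen at once."""
    best = None
    for node, templates in stacks.items():
        net = None
        for sta, tmpl in templates.items():
            c = sliding_corr(envelopes[sta], tmpl)
            net = c if net is None else net + c
        i = int(np.argmax(net))
        if best is None or net[i] > best[2]:
            best = (node, i, float(net[i]))
    return best if best is not None and best[2] >= threshold else None

# Synthetic usage: one event whose 15-sample moveout matches "nodeA".
rng = np.random.default_rng(1)
sig = np.hanning(40)
env = {s: 0.01 * np.abs(rng.standard_normal(300)) for s in ("STA1", "STA2")}
env["STA1"][100:140] += sig            # arrival at STA1
env["STA2"][115:155] += sig            # later arrival at STA2 (moveout)
stacks = {
    "nodeA": {"STA1": np.r_[sig, np.zeros(15)],    # correct moveout
              "STA2": np.r_[np.zeros(15), sig]},
    "nodeB": {"STA1": np.r_[np.zeros(15), sig],    # wrong moveout
              "STA2": np.r_[sig, np.zeros(15)]},
}
det = pickless_detect(env, stacks, threshold=1.0)
```

Because both stations' correlations peak at the same origin sample only for the node with the correct moveout, the summed network score separates the true node from the alternative.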
Using template waveforms from aftershocks of the Wenchuan earthquake (12 May 2008, Ms 7.9) listed in a global bulletin and continuous data from eight regional stations, we detected more than 6000 additional events in the mainshock source region from 1 May to 12 August 2008. These new detections obey Omori's law, extend the magnitude of completeness downward by 1.1 magnitude units, and lead to a more than fivefold increase in the number of known aftershocks compared with the global bulletins published by the International Data Centre and the International Seismological Centre. Moreover, we detected more M > 2 events than were listed by the Sichuan Seismograph Network. Several clusters of these detections were then relocated using the double-difference method, yielding locations that reduced travel-time residuals by a factor of 32 compared with the initial bulletin locations. Our results suggest that waveform correlation on a few regional stations can detect aftershocks very effectively and locate them with precision.
GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. The software is available in Java and C++, with a C interface to the C++ library. The software has been tested on Linux, Mac, Sun, and PC platforms. It is open source and is available online (see Data and Resources).
Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with the highest fitness value is accepted as a hypothetical event location, provided it exceeds a minimum fitness threshold, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. Once associated with an event, an arrival is removed from further consideration. The search is then repeated on the remaining unassociated arrivals until no further events are identified. Results are presented in comparison with analyst-reviewed bulletins for three datasets: a two-week ground-truth period, the Tohoku aftershock sequence, and the entire year of 2010. The probabilistic event detection, association, and location algorithm missed fewer events and generated fewer false events on all datasets compared to the associator used at the International Data Center (51% fewer missed and 52% fewer false events on the ground-truth dataset when using the same predictions).
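The grid-search-and-remove loop described above can be sketched compactly. This is a toy illustration under stated assumptions: a Gaussian in travel-time residual stands in for the station-conditional fitness terms, only one phase is considered, trial origin times are seeded from each arrival, and all names (`associate`, `tt`, `min_fitness`) are hypothetical.

```python
import numpy as np

def associate(arrivals, tt, grid, sigma=1.0, min_fitness=1.5):
    """Greedy grid association sketch: for each node and trial origin
    time, sum per-station fitness values (Gaussian in travel-time
    residual); accept the best node if it clears min_fitness, associate
    the consistent arrivals, remove them, and repeat."""
    events = []
    remaining = list(arrivals)
    while remaining:
        best = None
        for node in grid:
            for (sta, t) in remaining:           # trial origin from each arrival
                t0 = t - tt[(sta, node)]
                fit, members = 0.0, []
                for (s2, t2) in remaining:
                    r = t2 - (t0 + tt[(s2, node)])
                    f = np.exp(-0.5 * (r / sigma) ** 2)
                    if f > 0.1:                  # only arrivals consistent with the node
                        fit += f
                        members.append((s2, t2))
                if best is None or fit > best[0]:
                    best = (fit, node, t0, members)
        if best is None or best[0] < min_fitness:
            break                                # no remaining hypothesis is credible
        events.append((best[1], best[2]))
        remaining = [a for a in remaining if a not in best[3]]
    return events

# Usage: two arrivals consistent with a single event at node "A", t0 = 10.
tt = {("S1", "A"): 5.0, ("S2", "A"): 8.0,
      ("S1", "B"): 7.0, ("S2", "B"): 4.0}
arrivals = [("S1", 15.0), ("S2", 18.0)]
events = associate(arrivals, tt, ["A", "B"])
```

Removing associated arrivals before re-searching is what keeps one large event from being explained twice, at the cost of the greedy ordering the abstract's probabilistic formulation is designed to make robust.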
This report is a progress update on the USNDC Modernization Service Oriented Architecture (SOA) study describing results from Inception Iteration 1, which occurred between October 2012 and March 2013. The goals during this phase were 1) discovering components of the system that have potential service implementations, 2) identifying applicable SOA patterns for data access, service interfaces, and service orchestration/choreography, and 3) understanding performance tradeoffs for various SOA patterns.
This report is a progress update on the US NDC Modernization Service Oriented Architecture (SOA) study describing results from a proof of concept project completed from May through September 2013. The goals for this proof of concept were 1) to gain experience configuring, using, and running an Enterprise Service Bus (ESB), 2) to understand the implications of wrapping existing software in standardized interfaces for use as web services, and 3) to gather performance metrics for a notional seismic event monitoring pipeline implemented using services with various data access and communication patterns. The proof of concept is a follow-on to a previous SOA performance study. Work was performed by four undergraduate summer student interns under the guidance of Sandia staff.
In support of the International Data Center (IDC) Reengineering Phase 2 project, a list of proposed use cases with brief descriptions is provided for review.