Publications

Applying Waveform Correlation to Reduce Seismic Analyst Workload Due to Repeating Mining Blasts

Bulletin of the Seismological Society of America

Sundermier, Amy S.; Tibi, Rigobert T.; Brogan, Ronald A.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment to detect and identify mining blasts for two regions, Wyoming (U.S.A.) and Scandinavia, using waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO PrepCom) for up to 10 yr prior to the time of interest. We discuss approaches for template selection, threshold setting, and event detection that are specialized for characterizing mining blasts using a sparse, global network. We apply the approaches to one week of data for each of the two regions to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be generally applied to operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to assess potential reduction in analyst workload.
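
The core matched-filter operation behind studies like this one can be illustrated with a short sketch. The snippet below is a minimal normalized cross-correlation detector written against NumPy; the function names, window handling, and the 0.7 threshold are assumptions for illustration, not values or code from the paper.

    import numpy as np

    def normalized_cross_correlation(template, data):
        """Sliding normalized cross-correlation of `template` against `data`."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        cc = np.empty(len(data) - n + 1)
        for i in range(len(cc)):
            win = data[i:i + n]
            std = win.std()
            cc[i] = 0.0 if std == 0 else np.dot(t, (win - win.mean()) / std)
        return cc                          # correlation coefficients in [-1, 1]

    def detect(template, data, threshold=0.7):
        """Return sample offsets where the template matches above the threshold."""
        cc = normalized_cross_correlation(template, data)
        return np.flatnonzero(cc >= threshold)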

Applying Waveform Correlation and Waveform Template Metadata to Aftershocks in the Middle East to Reduce Analyst Workload

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Organizations that monitor for underground nuclear explosive tests are interested in techniques that automatically characterize recurring events such as aftershocks to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is a technique that is effective in finding similar waveforms from repeating seismic events. In this study, we apply waveform correlation in combination with template event metadata to two aftershock sequences in the Middle East to seek corroborating detections from multiple stations in the International Monitoring System of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization. We use waveform templates from stations that are within regional distance of aftershock sequences to detect subsequent events, then use template event metadata to discover which stations are likely to record corroborating arrival waveforms for recurring aftershock events at the same location, and develop additional waveform templates to seek corroborating detections. We evaluate the results with the goal of determining whether applying the method to aftershock events will improve the choice of waveform correlation detections that lead to bulletin-worthy events and a reduction of analyst effort.
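
As a rough illustration of using template metadata to corroborate single-station detections, the sketch below groups detections whose implied origin times (arrival time minus a predicted travel time taken from template metadata) agree across at least two stations. The data structures, the 5 s tolerance, and the grouping rule are hypothetical and not drawn from the paper.

    def corroborate(detections, travel_times, tol=5.0):
        """detections: list of (station, arrival_time) pairs from correlation detectors.
        travel_times: {station: predicted travel time in seconds} from template metadata.
        Returns groups whose implied origin times agree across >= 2 stations."""
        implied = [(arrival - travel_times[sta], sta, arrival)
                   for sta, arrival in detections if sta in travel_times]
        implied.sort()
        groups, current = [], []
        for origin, sta, arrival in implied:
            if current and origin - current[0][0] > tol:
                groups.append(current)
                current = []
            current.append((origin, sta, arrival))
        if current:
            groups.append(current)
        return [g for g in groups if len({sta for _, sta, _ in g}) >= 2]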

Applying Waveform Correlation and Waveform Template Metadata to Mining Blasts to Reduce Analyst Workload

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Organizations that monitor for underground nuclear explosive tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. In this study, we use waveform template event metadata to seek corroborating detections from multiple stations in the International Monitoring System of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization. We build upon events detected in a prior waveform correlation study of mining blasts in two geographic regions, Wyoming and Scandinavia. Using a set of expert analyst-reviewed waveform correlation events that were declared to be true positive detections, we explore criteria for choosing the waveform correlation detections that are most likely to lead to bulletin-worthy events and a reduction of analyst effort.

Evaluation of the PhaseNet Model Applied to the IMS Seismic Network

Garcia, Jorge A.; Heck, Stephen H.; Young, Christopher J.; Brogan, Ronald B.

Producing a complete and accurate set of signal detections is essential for automatically building and characterizing seismic events of interest for nuclear explosion monitoring. Signal detection algorithms have been an area of research for decades, but they still produce large quantities of false detections and misidentify real signals that must be detected to produce a complete global catalog of events of interest. Deep learning methods have shown promising capabilities in effectively characterizing seismic signals for complex tasks such as identifying phase arrival times. We use the PhaseNet model, a U-Net-based neural network trained on local-distance data from northern California, to predict seismic arrivals on data from the International Monitoring System (IMS) global network. We use an analyst-curated bulletin generated from this data set to compare the performance of PhaseNet to that of the Short-Term Average/Long-Term Average (STA/LTA) algorithm. We find that PhaseNet has the potential to outperform traditional processing methods and recommend training a new model with the IMS data to achieve optimal performance.
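
For context on the baseline being compared against, the following is a minimal STA/LTA sketch in NumPy. The window lengths and trigger threshold are illustrative defaults, not the configuration used in this study, and the detector here is deliberately simplified.

    import numpy as np

    def sta_lta(data, fs, sta_win=1.0, lta_win=30.0):
        """STA/LTA characteristic function on squared amplitudes (illustrative)."""
        nsta, nlta = int(sta_win * fs), int(lta_win * fs)
        energy = np.asarray(data, dtype=float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[nsta:] - csum[:-nsta]) / nsta   # short-term running mean
        lta = (csum[nlta:] - csum[:-nlta]) / nlta   # long-term running mean
        n = min(len(sta), len(lta))
        # align so both averages end on the same sample before taking the ratio
        return sta[-n:] / np.maximum(lta[:n], 1e-20)

    def triggers(ratio, threshold=4.0):
        return np.flatnonzero(ratio >= threshold)   # candidate trigger indices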

Generating uncertainty distributions for seismic signal onset times

Bulletin of the Seismological Society of America

Peterson, Matthew G.; Vollmer, Charles V.; Brogan, Ronald; Stracuzzi, David J.; Young, Christopher J.

Signal arrival-time estimation plays a critical role in a variety of downstream seismic analyses, including location estimation and source characterization. Any arrival-time errors propagate through subsequent data-processing results. In this article, we detail a general framework for refining estimated seismic signal arrival times along with full estimation of their associated uncertainty. Using the standard short-term average/long-term average threshold algorithm to identify a search window, we demonstrate how to refine the pick estimate through two different approaches. In both cases, new waveform realizations are generated through bootstrap algorithms to produce full a posteriori estimates of the uncertainty of the seismic signal onset time. The onset arrival uncertainty estimates provide additional data-derived information from the signal and have the potential to influence seismic analysis along several fronts.
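
A heavily simplified version of the bootstrap idea might look like the sketch below: perturb the search window with resampled pre-signal noise, re-pick each realization with a toy picker, and take the spread of picks as an empirical onset-time distribution. Both the picker and the resampling scheme are placeholders, not the authors' estimators.

    import numpy as np

    def simple_pick(window, fs):
        """Toy picker: time of maximum slope of the smoothed envelope (seconds)."""
        k = max(int(0.1 * fs), 1)
        smooth = np.convolve(np.abs(window), np.ones(k) / k, mode="same")
        return np.argmax(np.diff(smooth)) / fs

    def bootstrap_onsets(window, noise, fs, n_boot=500, seed=None):
        """Re-pick noise-perturbed copies of `window`; return the pick distribution."""
        rng = np.random.default_rng(seed)
        picks = [simple_pick(window + rng.choice(noise, size=len(window)), fs)
                 for _ in range(n_boot)]
        return np.asarray(picks)

    # e.g. np.percentile(bootstrap_onsets(win, pre_signal_noise, fs), [2.5, 50, 97.5])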

Deep learning denoising applied to regional distance seismic data in Utah

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Hammond, Patrick H.; Brogan, Ronald; Young, Christopher J.; Koper, Keith

Seismic waveform data are generally contaminated by noise from various sources. Suppressing this noise effectively so that the remaining signal of interest can be successfully exploited remains a fundamental problem for the seismological community. To date, the most common noise suppression methods have been based on frequency filtering. These methods, however, are less effective when the signal of interest and noise share similar frequency bands. Inspired by source separation studies in the field of music information retrieval (Jansson et al., 2017) and a recent study in seismology (Zhu et al., 2019), we implemented a seismic denoising method that uses a trained deep convolutional neural network (CNN) model to decompose an input waveform into a signal of interest and noise. In our approach, the CNN provides a signal mask and a noise mask for an input signal. The short-time Fourier transform (STFT) of the estimated signal is obtained by multiplying the signal mask with the STFT of the input signal. To build and test the denoiser, we used carefully compiled signal and noise datasets of seismograms recorded by the University of Utah Seismograph Stations network. Results of test runs involving more than 9000 constructed waveforms suggest that on average the denoiser improves the signal-to-noise ratios (SNRs) by ∼5 dB, and that most of the recovered signal waveforms have high similarity with respect to the target waveforms (average correlation coefficient of ∼0.80) and suffer little distortion. Application to real data suggests that our denoiser achieves on average a factor of up to ∼2–5 improvement in SNR over band-pass filtering and can suppress many types of noise that band-pass filtering cannot. For individual waveforms, the improvement can be as high as ∼15 dB.
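
The mask-application step described above (multiplying a CNN-derived signal mask with the STFT of the input and inverting) can be sketched with SciPy as follows; the CNN that produces the mask is not shown, and the transform parameters are assumptions.

    from scipy.signal import stft, istft

    def apply_signal_mask(waveform, signal_mask, fs, nperseg=256):
        """Multiply the input STFT by a [0, 1] signal mask and invert to the time domain."""
        _, _, Z = stft(waveform, fs=fs, nperseg=nperseg)
        assert signal_mask.shape == Z.shape, "mask must match the STFT dimensions"
        _, denoised = istft(Z * signal_mask, fs=fs, nperseg=nperseg)
        return denoised[:len(waveform)]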

Applying Waveform Correlation to Mining Blasts Using a Global Sparse Network

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment that uses waveform templates recorded by multiple International Monitoring System stations of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization for up to 10 years prior to the time period of interest to detect and identify mining blasts that occur during single weeks of study. We discuss approaches for template selection, threshold setting, and event detection that are specialized for mining blasts and a sparse, global network. We apply the approaches to two different weeks of study for each of two geographic regions, Wyoming and Scandinavia, to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be effective for operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to develop an intuition about potential reduction in analyst workload.

Applying Waveform Correlation to Mining Blasts Using a Global Sparse Network

Sundermier, Amy S.; Tibi, Rigobert T.; Young, Christopher J.

Agencies that monitor for underground nuclear tests are interested in techniques that automatically characterize mining blasts to reduce the human analyst effort required to produce high-quality event bulletins. Waveform correlation is effective in finding similar waveforms from repeating seismic events, including mining blasts. We report the results of an experiment that uses waveform templates recorded by multiple International Monitoring System stations of the Comprehensive Nuclear-Test-Ban Treaty Organization for up to 10 years prior to the time period of interest to detect and identify mining blasts that occur during single weeks of study. We discuss approaches for template selection, threshold setting, and event detection that are specialized for mining blasts and a sparse, global network. We apply the approaches to two different weeks of study for each of two geographic regions, Wyoming and Scandinavia, to evaluate the potential for establishing a set of standards for waveform correlation processing of mining blasts that can be effective for operational monitoring systems with a sparse network. We compare candidate events detected with our processing methods to the Reviewed Event Bulletin of the International Data Centre to develop an intuition about potential reduction in analyst workload.

The iterative processing framework: A new paradigm for automatic event building

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Encarnacao, Andre V.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald; Sundermier, Amy S.

In a traditional data-processing pipeline, waveforms are acquired, a detector makes the signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to fitting event hypotheses to generate an event bulletin. Most of the time, this traditional pipeline requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Centre (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automatic analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, through an iterative process, the AA takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detections dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds additional signal detections to existing events, which saves analyst time, even if the event locations are not significantly affected.

Classification of local seismic events in the Utah region: A comparison of amplitude ratio methods with a spectrogram-based machine learning approach

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Linville, Lisa L.; Young, Christopher J.; Brogan, Ronald

The capability to discriminate low-magnitude earthquakes from low-yield anthropogenic sources, both detectable only at local distances, is of increasing interest to the event monitoring community. We used a dataset of seismic events in Utah recorded during a 14-day period (1–14 January 2011) by the University of Utah Seismograph Stations network to perform a comparative study of event classification at local scale using amplitude ratio (AR) methods and a machine learning (ML) approach. The event catalog consists of 7377 events with magnitudes Mc ranging from below −2 up to 5.8. Events were subdivided into six populations based on location and source type: tectonic earthquakes (TEs), mining-induced events (MIEs), and mining blasts from four known mines (WMB, SMB, LMB, and CQB). The AR approach jointly exploits Pg-to-Sg phase ARs and Rg-to-Sg spectral ARs in multivariate quadratic discriminant functions and was able to classify 370 events with high signal quality from the three groups with sufficient size (TE, MIE, and SMB). For that subset of the events, the method achieved success rates between about 80% and 90%. The ML approach used trained convolutional neural network (CNN) models to classify the populations. The CNN approach was able to classify the subset of events with accuracies between about 91% and 98%. Because the neural network approach does not have a minimum signal quality requirement, we applied it to the entire event catalog, including the abundant extremely low-magnitude events, and achieved accuracies of about 94%–100%. We compare the AR and ML methodologies using a broad set of criteria and conclude that a major advantage of ML methods is their robustness to low signal-to-noise ratio data, allowing them to classify significantly smaller events.

Global- and local-scale high-resolution event catalogs for algorithm testing

Seismological Research Letters

Linville, Lisa L.; Brogan, Ronald C.; Young, Christopher J.; Aur, Katherine A.

During the development of new seismic data processing methods, the verification of potential events and associated signals can present a nontrivial obstacle to the assessment of algorithm performance, especially as detection thresholds are lowered, resulting in the inclusion of significantly more anthropogenic signals. Here, we present two 14-day seismic event catalogs: a local-scale catalog developed using data from the University of Utah Seismograph Stations network, and a global-scale catalog developed using data from the International Monitoring System. Each catalog was built manually to comprehensively identify events from all sources that were locatable using phase arrival timing and directional information from seismic network stations, resulting in significantly more events than the existing catalogs contain. The new catalogs additionally contain challenging event sequences (prolific aftershocks and small events at the detection and location threshold) and novel event types and sources (e.g., infrasound-only events and long-wall mining events) that make them useful for algorithm testing and development, as well as valuable for the unique tectonic and anthropogenic event sequences they contain.

Implementation of the waveform correlation event detection system (WCEDS) method for regional seismic event detection in Utah

Bulletin of the Seismological Society of America

Arrowsmith, Stephen J.; Young, Christopher J.; Pankow, Kristine

Backprojection techniques are a class of methods for detecting and locating events that have been successfully implemented at local scales for dense networks. This article develops the framework for applying a backprojection method to detect and locate a range of event sizes across a heterogeneous regional network. This article extends previous work on the development of a backprojection method for local and regional seismic event detection, the Waveform Correlation Event Detection System (WCEDS). The improvements outlined here make the technique much more flexible for regional earthquake or explosion monitoring. We first explore how the backprojection operator can be formulated using either a travel-time model or a stack of full waveforms, showing that the former approach is much more flexible and can lead to the detection of smaller events, and to significant improvements in the resolution of event parameters. Second, we discuss the factors that influence the grid of event hypotheses used for backprojection, and develop an algorithm for generating suitable grids for networks with variable density. Third, we explore the effect of including different phases in the backprojection operator, showing that the best results for the study region can be obtained using only the Pg phase, and by including terms for penalizing early arrivals when evaluating the fit for a given event hypothesis. Fourth, we incorporate two parallel backprojection computations with different distance thresholds to enable the robust detection of both network-wide and small (sub-network-only) events. The set of improvements is demonstrated by applying WCEDS to four example events recorded on the University of Utah Seismograph Stations (UUSS) network.
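
A brute-force version of the travel-time-model backprojection described above might look like the sketch below: stack a per-station characteristic function at the predicted arrival times for each candidate grid node and origin time, and keep the best-scoring hypothesis. The data structures and the simple summation rule are illustrative; phase weighting, early-arrival penalties, and the dual distance thresholds discussed in the article are omitted.

    import numpy as np

    def backproject(char_funcs, travel_times, fs, origin_times):
        """char_funcs: {station: 1-D characteristic function sampled at fs}.
        travel_times: list indexed by grid node, each a {station: seconds} dict.
        Returns (best_node, best_origin_time, stacked_value)."""
        best = (None, None, -np.inf)
        for node, tt in enumerate(travel_times):
            for t0 in origin_times:
                stack = 0.0
                for sta, cf in char_funcs.items():
                    idx = int(round((t0 + tt[sta]) * fs))
                    if 0 <= idx < len(cf):
                        stack += cf[idx]
                if stack > best[2]:
                    best = (node, t0, stack)
        return best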

Discrimination of Anthropogenic Events and Tectonic Earthquakes in Utah Using a Quadratic Discriminant Function Approach with Local Distance Amplitude Ratios

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Koper, Keith D.; Pankow, Kristine L.; Young, Christopher J.

Most of the commonly used seismic discrimination approaches are designed for teleseismic and regional data. To monitor for the smallest events, some of these discriminants have been adapted for local distances (< 200 km), with mixed levels of success. Here, we take advantage of the variety of seismic sources in the Utah region, including nontraditionally studied anthropogenic sources, and of the existence of a dense regional seismic network to evaluate amplitude-ratio seismic discrimination at local distances. First, we explored Pg-to-Sg phase-amplitude ratios for multiple frequency bands to classify events in a dataset that comprises populations of single-shot surface explosions, shallow and deep ripple-fired mining blasts, mining-induced events (MIEs), and tectonic earthquakes. We achieved success rates of about 59%–83%. Then, for the same dataset, we combined the Pg-to-Sg phase-amplitude ratios with Sg-to-Rg spectral amplitude ratios in a multivariate quadratic discriminant function (QDF) approach. For two-category pairwise classification, seven of ten population pairs show misclassification rates of about 20% or less, with five pairs showing rates of about 10% or less. The approach performs best for the pair involving the populations of single-shot explosions and MIEs. By combining both Pg-to-Sg and Rg-to-Sg ratios in the multivariate QDFs, we are able to achieve an average improvement of about 4%–14% in misclassification rates compared with Pg-to-Sg ratios alone. When all five event populations are considered simultaneously, as expected, the potential for misclassification increases, and our QDF approach using both Pg-to-Sg and Rg-to-Sg ratios achieves an average success rate of about 74%, compared with the rate of about 86% for two-category pairwise classification.
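
As an illustration of the multivariate QDF step, the sketch below fits a quadratic discriminant to log amplitude-ratio features using scikit-learn as a stand-in for the paper's formulation; the feature layout and labels are placeholders.

    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    # X: one row per event; columns hold log10 Pg/Sg amplitude ratios per frequency
    # band plus log10 Rg/Sg spectral ratios. y: class labels, e.g. "TE", "MIE", "SMB".
    def fit_qdf(X, y):
        qdf = QuadraticDiscriminantAnalysis(store_covariance=True)
        qdf.fit(np.asarray(X), np.asarray(y))
        return qdf

    # misclassification rate on held-out events:
    # 1.0 - fit_qdf(X_train, y_train).score(X_test, y_test)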

Dynamic tuning of seismic signal detector trigger levels for local networks

Bulletin of the Seismological Society of America

Draelos, Timothy J.; Peterson, Matthew G.; Knox, Hunter A.; Lawry, Benjamin J.; Phillips-Alonge, Kristin E.; Ziegler, Abra E.; Chael, Eric P.; Young, Christopher J.; Faust, Aleksandra

The quality of automatic signal detections from sensor networks depends on individual detector trigger levels (TLs) from each sensor. The largely manual process of identifying effective TLs is painstaking and does not guarantee optimal configuration settings, yet achieving superior automatic detection of signals and ultimately, events, is closely related to these parameters. We present a Dynamic Detector Tuning (DDT) system that automatically adjusts effective TL settings for signal detectors to the current state of the environment by leveraging cooperation within a local neighborhood of network sensors. After a stabilization period, the DDT algorithm can adapt in near-real time to changing conditions and automatically tune a signal detector to identify (detect) signals from only events of interest. Our current work focuses on reducing false signal detections early in the seismic signal processing pipeline, which leads to fewer false events and has a significant impact on reducing analyst time and effort. This system provides an important new method to automatically tune detector TLs for a network of sensors and is applicable to both existing sensor performance boosting and new sensor deployment. With ground truth on detections from a local neighborhood of seismic sensors within a network monitoring the Mount Erebus volcano in Antarctica, we show that DDT reduces the number of false detections by 18% and the number of missed detections by 11% when compared with optimal fixed TLs for all sensors.
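
Purely as an illustration of the feedback idea (and not the published DDT algorithm), a trigger level could be nudged up when a sensor's detections lack neighborhood corroboration and down when corroborated events are missed, as in the sketch below; the step size and bounds are arbitrary.

    def update_trigger_level(tl, n_uncorroborated, n_missed, step=0.05,
                             tl_min=2.0, tl_max=20.0):
        """Return an adjusted STA/LTA trigger level for one sensor."""
        tl += step * n_uncorroborated   # likely false triggers: raise the threshold
        tl -= step * n_missed           # missed corroborated events: lower the threshold
        return min(max(tl, tl_min), tl_max)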

Lg-wave cross correlation and epicentral double-difference location in and near China

Bulletin of the Seismological Society of America

Schaff, David P.; Richards, Paul G.; Slinkard, Megan E.; Heck, Stephen H.; Young, Christopher J.

We perform epicentral relocations for a broad area using cross-correlation measurements made on Lg waves recorded at regional distances on a sparse station network. Using a two-step procedure (pairwise locations and cluster locations), we obtain final locations for 5623 events: 3689 for all of China from 1985 to 2005 and 1934 for the Wenchuan area from May to August 2008. These high-quality locations comprise 20% of a starting catalog for all of China and 25% of a catalog for Wenchuan. Of the 1934 events located for Wenchuan, 1662 (86%) were newly detected. The final locations explain the residuals 89 times better than the catalog locations for all of China (reduced from 3.7302 to 0.0417 s) and 32 times better than the catalog locations for Wenchuan (reduced from 0.8413 to 0.0267 s). The average semimajor axes of the 95% confidence ellipses are 420 m for all of China and 370 m for Wenchuan. The average azimuthal gaps are 205° for all of China and 266° for Wenchuan. For all of China, 98% of the station distances are over 200 km, and the mean and maximum station distances are 898 and 2174 km. The robustness of our location estimates and various trade-offs and sensitivities are explored with different inversion parameters for the location, such as starting locations for iterative solutions and which singular values to include. Our results provide order-of-magnitude improvements in locations for event clusters, using waveforms from a very sparse far-regional network for which data are openly available.

Depth discrimination using Rg-to-Sg spectral amplitude ratios for seismic events in Utah recorded at local distances

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Koper, Keith D.; Pankow, Kristine L.; Young, Christopher J.

Short-period fundamental-mode Rayleigh waves (Rg) are commonly observed on seismograms of anthropogenic seismic events and shallow, naturally occurring tectonic earthquakes (TEs) recorded at local distances. In the Utah region, strong Rg waves traveling with an average group velocity of about 1.8 km/s are observed at ∼1 Hz on waveforms from shallow events (depth < 10 km) recorded at distances up to about 150 km. At these distances, Sg waves, which are direct shear waves traveling in the upper crust, are generally the dominant signals for TEs. In this study, we leverage the well-known notion that Rg amplitude decreases dramatically with increasing event depth to propose a new depth discriminant based on Rg-to-Sg spectral amplitude ratios. The approach is successfully used to discriminate shallow events (both earthquakes and anthropogenic events) from deeper TEs in the Utah region recorded at local distances (< 150 km) by the University of Utah Seismograph Stations (UUSS) regional seismic network. Using Mood’s median test, we obtained probabilities of nearly zero that the median Rg-to-Sg spectral amplitude ratios are the same between shallow events (including both shallow TEs and anthropogenic events) on the one hand and deeper earthquakes on the other, suggesting that there is a statistically significant difference in the estimated Rg-to-Sg ratios between the two populations. We also observed consistent disparities between the different types of shallow events (e.g., mining blasts vs. mining-induced earthquakes), implying that it may be possible to separate the subpopulations that make up this group. This suggests that, using local-distance Rg-to-Sg spectral amplitude ratios, one can not only discriminate shallow events from deeper events but may also be able to discriminate among different populations of shallow events.
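
The basic measurement behind the proposed discriminant, an Rg-to-Sg spectral amplitude ratio, can be sketched as follows; the 0.5–2 Hz band and the band-averaging choice are illustrative assumptions, and the phase windowing is assumed to have been done elsewhere.

    import numpy as np

    def band_spectral_amplitude(window, fs, fmin=0.5, fmax=2.0):
        """Mean spectral amplitude of a phase window in the [fmin, fmax] band."""
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        spec = np.abs(np.fft.rfft(window))
        return spec[(freqs >= fmin) & (freqs <= fmax)].mean()

    def rg_sg_ratio(rg_window, sg_window, fs):
        """log10 Rg-to-Sg spectral amplitude ratio for pre-cut phase windows."""
        return np.log10(band_spectral_amplitude(rg_window, fs) /
                        band_spectral_amplitude(sg_window, fs))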

Rapid and robust cross-correlation-based seismic signal identification using an approximate nearest neighbor method

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Young, Christopher J.; Gonzales, Antonio G.; Ballard, Sanford B.; Encarnacao, Andre V.

The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ∼2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
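
The dimensionality-reduction step described above can be sketched as follows: represent each waveform by its normalized correlations with a random reference subset of templates and compare query and archive in that reduced space. The iterative neighbor-of-neighbor search that makes the full method fast is omitted here, and all names are illustrative.

    import numpy as np

    def correlation_features(waveform, references):
        """Project a waveform onto normalized correlations with reference templates.
        references: 2-D array (n_refs, n_samples), same length as `waveform`."""
        w = (waveform - waveform.mean()) / (waveform.std() * len(waveform))
        r = references - references.mean(axis=1, keepdims=True)
        r = r / r.std(axis=1, keepdims=True)
        return r @ w                      # one correlation value per reference

    def nearest_in_reduced_space(query_feat, archive_feats, k=10):
        """Return indices of the k archive templates closest in the reduced space."""
        d = np.linalg.norm(archive_feats - query_feat, axis=1)
        return np.argsort(d)[:k]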

Pickless event detection and location: The waveform correlation event-detection system (WCEDS) revisited

Bulletin of the Seismological Society of America

Arrowsmith, Stephen J.; Young, Christopher J.; Ballard, Sanford B.; Slinkard, Megan E.; Pankow, Kristine

The standard seismic explosion-monitoring paradigm is based on a sparse, spatially aliased network of stations to monitor either the whole Earth or a region of interest. Under this paradigm, state-of-the-art event-detection methods are based on seismic phase picks, which are associated at multiple stations and located using 3D Earth models. Here, we revisit a concept for event detection that does not require phase picks or 3D models and fuses detection and association into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. We apply our detector to seismic data from Utah and evaluate our results by comparing them with the earthquake catalog published by the University of Utah Seismograph Stations. The results demonstrate that our pickless detector is a viable alternative technique for detecting events that likely requires less analyst overhead than do the existing methods.

Detection of the Wenchuan aftershock sequence using waveform correlation with a composite regional network

Bulletin of the Seismological Society of America

Slinkard, Megan E.; Heck, Stephen H.; Schaff, David; Bonal, Nedra B.; Daily, David M.; Young, Christopher J.; Richards, Paul

Using template waveforms from aftershocks of the Wenchuan earthquake (12 May 2008, Ms 7.9) listed in a global bulletin and continuous data from eight regional stations, we detected more than 6000 additional events in the mainshock source region from 1 May to 12 August 2008. These new detections obey Omori’s law, extend the magnitude of completeness downward by 1.1 magnitude units, and lead to a more than fivefold increase in number of known aftershocks compared with the global bulletins published by the International Data Centre and the International Seismological Centre. Moreover, we detected more M > 2 events than were listed by the Sichuan Seismograph Network. Several clusters of these detections were then relocated using the double-difference method, yielding locations that reduced travel-time residuals by a factor of 32 compared with the initial bulletin locations. Our results suggest that using waveform correlation on a few regional stations can find aftershock events very effectively and locate them with precision.

GeoTess: A generalized Earth model software utility

Seismological Research Letters

Ballard, Sanford B.; Hipp, James R.; Kraus, Brian; Encarnacao, Andre V.; Young, Christopher J.

GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. The software is available in Java and C++, with a C interface to the C++ library. The software has been tested on Linux, Mac, Sun, and PC platforms. It is open source and is available online (see Data and Resources).

A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

Bulletin of the Seismological Society of America

Draelos, Timothy J.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald

Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with the highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. Once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified. Results are presented in comparison with analyst-reviewed bulletins for three datasets: a two-week ground-truth period, the Tohoku aftershock sequence, and the entire year of 2010. The probabilistic event detection, association, and location algorithm missed fewer events and generated fewer false events on all datasets compared to the associator used at the International Data Centre (51% fewer missed and 52% fewer false events on the ground-truth dataset when using the same predictions).
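
The grid-search core of the association step can be sketched as below, under the simplifying assumption that a per-station conditional fitness function is already available; the fitness model, phase handling, time-window logic, and the iteration over unassociated arrivals are all omitted.

    import numpy as np

    def best_node(arrivals, station_fitness, n_nodes):
        """arrivals: list of (station, arrival_time) pairs in the search window.
        station_fitness(node, station, time) -> conditional fitness of that arrival
        for a hypothetical event at grid node `node`."""
        totals = np.array([sum(station_fitness(node, sta, t) for sta, t in arrivals)
                           for node in range(n_nodes)])
        node = int(np.argmax(totals))
        return node, totals[node]         # candidate event node and its fitness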

US NDC Modernization: Service Oriented Architecture Study Status

Hamlet, Benjamin R.; Encarnacao, Andre V.; Harris, James M.; Young, Christopher J.

This report is a progress update on the US NDC Modernization Service Oriented Architecture (SOA) study, describing results from Inception Iteration 1, which occurred between October 2012 and March 2013. The goals during this phase are 1) discovering components of the system that have potential service implementations, 2) identifying applicable SOA patterns for data access, service interfaces, and service orchestration/choreography, and 3) understanding performance tradeoffs for various SOA patterns.

US NDC Modernization: Service Oriented Architecture Proof of Concept

Hamlet, Benjamin R.; Encarnacao, Andre V.; Jackson, Keilan R.; Hays, Ian A.; Barron, Nathan E.; Simon, Luke B.; Harris, James M.; Young, Christopher J.

This report is a progress update on the US NDC Modernization Service Oriented Architecture (SOA) study describing results from a proof of concept project completed from May through September 2013. The goals for this proof of concept are to 1) gain experience configuring, using, and running an Enterprise Service Bus (ESB), 2) understand the implications of wrapping existing software in standardized interfaces for use as web services, and 3) gather performance metrics for a notional seismic event monitoring pipeline implemented using services with various data access and communication patterns. The proof of concept is a follow-on to a previous SOA performance study. Work was performed by four undergraduate summer student interns under the guidance of Sandia staff.
