PEDAL: A New Event Detection and Signal Association Algorithm
The Network Capability Assessment Program (NetCAP) is a software tool under development at Sandia National Laboratories for studying the capabilities of nuclear explosion monitoring networks. Fiscal year 2010 (FY10) is the second full year of NetCAP development and the first full year devoted largely to new feature development rather than to reimplementing existing capabilities found in NetSim (Sereno et al., 1990). Major tasks completed this year include: (1) addition of hydroacoustic simulation; (2) addition of event identification simulation; and (3) initial design and preparation for infrasound simulation. This report discusses the motivation and objectives for the NetCAP project, lists work performed prior to FY10, and describes the FY10 accomplishments in detail.
To test the hypothesis that high-quality 3D Earth models will produce seismic event locations that are more accurate and more precise, we are developing a global 3D P-wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D version 1.5, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. The reduction in the total number of ray paths is approximately 50%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two-layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last, we limit the number of iterations to prevent convergence, thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both the ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. The resolution of our model is assessed using a variation of the standard checkerboard method. We compare the travel-time prediction and location capabilities of SALSA3D to those of standard 1D models via location tests on a global event set with GT of 5 km or better. These events generally possess hundreds of Pn and P picks, from which we generate different realizations of station distributions, yielding a range of azimuthal coverage and ratios of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation relative to the standard 1D AK135 model regardless of the Pn-to-P ratio, with the improvement being most pronounced at higher azimuthal gaps.
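The ray-clustering step mentioned in the abstract above lends itself to a short illustration. The Python sketch below bins ray endpoints into latitude/longitude cells and replaces each bin with a single representative ray carrying a weight; the function name `cluster_rays`, the cell size, and the bin-mean representative are illustrative assumptions, not the actual SALSA3D algorithm.

```python
# A minimal sketch of ray clustering to curb path redundancy, assuming
# each ray can be summarized by its source and receiver coordinates.
import numpy as np

def cluster_rays(src, rcv, cell_deg=1.0):
    """Group rays whose endpoints share the same lat/lon cells.

    src, rcv : (n, 2) arrays of (lat, lon) in degrees.
    Returns representative endpoints and a weight (member count) per cluster.
    """
    endpoints = np.hstack([src, rcv])                  # (n, 4)
    keys = np.floor(endpoints / cell_deg).astype(int)  # 4D cell index per ray
    _, inverse, counts = np.unique(keys, axis=0,
                                   return_inverse=True, return_counts=True)
    rep = np.zeros((counts.size, 4))
    np.add.at(rep, inverse, endpoints)                 # sum endpoints per cell
    rep /= counts[:, None]                             # bin means as representatives
    return rep[:, :2], rep[:, 2:], counts

# Example: many picks from a few stations and a compact source region
# collapse strongly (random global rays would not).
rng = np.random.default_rng(0)
stations = np.array([[35.0, 50.0], [62.0, 10.0], [-20.0, 135.0]])
src = np.array([30.0, 70.0]) + rng.normal(scale=1.5, size=(30000, 2))
rcv = stations[rng.integers(0, 3, size=30000)]
rep_src, rep_rcv, weights = cluster_rays(src, rcv, cell_deg=1.0)
print(f"{len(src)} rays -> {len(weights)} representative rays")
```

In a tomography context, the weight would multiply each representative ray's contribution to the normal equations so that densely sampled paths are not over-counted.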
Swarms of earthquakes and/or aftershock sequences can dramatically increase the level of seismicity in a region for a period lasting from days to months, depending on the swarm or sequence. Such occurrences can provide a large amount of useful information to seismologists. For those who monitor seismic events for possible nuclear explosions, however, these swarms/sequences are a nuisance. In an explosion monitoring system, each event must be treated as a possible nuclear test until it can be proven, to a high degree of confidence, not to be one. Seismic events recorded at the same station with highly correlated waveforms almost certainly have a similar location and source type, so clusters of events within a swarm can quickly be identified as earthquakes. We have developed a number of tools that exploit the high degree of waveform similarity expected to be associated with swarms/sequences. Dendro Tool measures correlations between known events. The Waveform Correlation Detector is intended to act as a detector, finding events in raw data that correlate with known events. The Self Scanner finds all correlated segments within a raw data stream and does not require an event library. Together, the three techniques provide an opportunity to study the similarities of events in an aftershock sequence in different ways. To comprehensively characterize the benefits and limits of waveform correlation techniques, we studied three aftershock sequences, using all three tools, at multiple stations. We explored the effects of station distance and event magnitudes on correlation results. Lastly, we show the reduction in detection threshold and analyst workload offered by waveform correlation techniques compared to STA/LTA-based detection. We analyzed four days of data from each aftershock sequence using all three methods. Most known events clustered in a similar manner across the toolsets. Up to 25% of catalogued events were found to be members of a cluster. In addition, the Waveform Correlation Detector and the Self Scanner identified significant numbers of new events that appeared in neither the EDR nor the regional catalogs, demonstrating a lowering of the detection threshold. We extended our analysis to study the effect of distance on correlation results by applying the analysis tools to multiple stations along a transect of nearly constant azimuth when possible. We expected the number of events found via correlation to drop off as roughly 1/r^2, where r is the distance from mainshock to station. However, we found that regional geological conditions influenced the performance of a given station more than distance. For example, for one sequence we clustered 25% of events at the station nearest the mainshock (34 km), while our performance dropped to 2% at a station 550 km distant. Yet we matched our best performance (25% clustering) at a station 198 km distant.
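Since the abstract above compares correlation detection with STA/LTA, a compact sketch of the core operation may help. The Python below slides a master-event template over a continuous trace and flags windows whose normalized cross-correlation exceeds a threshold; the 0.7 threshold and the synthetic data are illustrative assumptions, not the actual Waveform Correlation Detector code.

```python
# Minimal sketch of a waveform correlation detector: normalized
# cross-correlation of one template against continuous data.
import numpy as np

def correlation_detector(data, template, threshold=0.7):
    n = len(template)
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    # Because sum(t) = 0, the window mean cancels in the numerator.
    num = np.correlate(data, t, mode="valid")
    # Rolling window energy via cumulative sums for the denominator.
    c1 = np.cumsum(np.insert(data, 0, 0.0))
    c2 = np.cumsum(np.insert(data ** 2, 0, 0.0))
    wsum = c1[n:] - c1[:-n]
    wsum2 = c2[n:] - c2[:-n]
    x_norm = np.sqrt(np.maximum(wsum2 - wsum ** 2 / n, 1e-20))
    cc = num / (t_norm * x_norm)
    return np.flatnonzero(cc > threshold), cc

# Example: bury a scaled copy of the template in noise and recover it.
rng = np.random.default_rng(1)
tmpl = rng.standard_normal(200)
stream = rng.standard_normal(10000)
stream[4000:4200] += 3.0 * tmpl
hits, cc = correlation_detector(stream, tmpl)
print(hits)  # should include indices near 4000
```

The Self Scanner idea corresponds to running this same measurement with every data window in turn serving as the template, which is why it needs no event library.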
To test the hypothesis that high-quality 3D Earth models will produce seismic event locations that are more accurate and more precise, we are developing a global 3D P-wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D (SAndia LoS Alamos) version 1.4, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. The reduction in the total number of ray paths is greater than 55%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two-layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last, we limit the number of iterations to prevent convergence, thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both the ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. The resolution of our model is assessed using a variation of the standard checkerboard method, as well as by directly estimating the diagonal of the model resolution matrix using the technique developed by Bekas et al. We compare the travel-time prediction and location capabilities of this model to those of standard 1D models. We perform location tests on a global, geographically distributed event set with ground truth levels of 5 km or better. These events generally possess hundreds of Pn and P phases, from which we can generate different realizations of station distributions, yielding a range of azimuthal coverage and proportions of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation relative to the standard 1D AK135 model, especially with increasing azimuthal gap. The 3D model appears to perform better for locations based solely or dominantly on regional arrivals, which is not unexpected given that AK135 represents a global average and therefore cannot capture local and regional variations.
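The abstract above mentions directly estimating the diagonal of the model resolution matrix with the technique of Bekas et al.; that estimator needs only matrix-vector products, never the matrix itself. Below is a minimal Python sketch under that assumption, in which `matvec` stands in for applying the resolution matrix (or any large implicit operator).

```python
# Stochastic diagonal estimation (Bekas et al. style):
# diag(R) ~ sum_k v_k * (R v_k) / sum_k v_k * v_k, with random probes v_k.
import numpy as np

def estimate_diagonal(matvec, n, n_probes=100, rng=None):
    """Estimate diag of an n-by-n operator from matvec products alone."""
    if rng is None:
        rng = np.random.default_rng()
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        num += v * matvec(v)
        den += v * v
    return num / den

# Self-check on a small random symmetric matrix; the error shrinks as
# the number of probes grows.
n = 500
A = np.random.default_rng(2).standard_normal((n, n))
R = A @ A.T / n
est = estimate_diagonal(lambda v: R @ v, n, n_probes=200)
print(np.max(np.abs(est - np.diag(R))))
```

For a tomography problem, each `matvec` call would involve solving the damped normal equations once, so the probe count directly controls the cost of the resolution assessment.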
Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method analysts use to gauge the validity of an event involves examining the set of stations involved in its detection. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by stations X, Y, or Z. For some well-understood parts of the world, the absence of one or more 'expected' stations, or the presence of one or more 'unexpected' stations, is correlated with a hypothesized event's legitimacy and with its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event and the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analyst's determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of 'false' events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during the review process.
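One simple way to make the Station Set Residual idea concrete is to weight missing and unexpected stations by historical detection probabilities, as sketched below in Python; the scoring formula and the station names are illustrative assumptions, not the formulation proposed in the paper.

```python
# Minimal sketch of a station-set mismatch score: penalize expected
# stations that are absent and observed stations that are historically
# unlikely, normalized to [0, 1].
def station_set_residual(observed, expected_prob):
    """observed : set of station codes that detected the event.
    expected_prob : dict mapping station code -> historical detection
    probability for similar nearby events of similar magnitude.
    Returns a score in [0, 1]; higher means a larger mismatch."""
    missing = sum(p for s, p in expected_prob.items() if s not in observed)
    unexpected = sum(1.0 - expected_prob.get(s, 0.0) for s in observed)
    total = sum(expected_prob.values()) + len(observed)
    return (missing + unexpected) / total if total else 0.0

# Example: stations A, B, C usually detect events here; X rarely does.
hist = {"A": 0.95, "B": 0.90, "C": 0.80, "X": 0.05}
print(station_set_residual({"A", "B", "C"}, hist))  # low residual (~0.07)
print(station_set_residual({"X"}, hist))            # high residual (~0.97)
```

A screening rule would then flag automatic events whose residual exceeds a threshold tuned against analyst dispositions for that region.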
From 2002 through 2006, the Ground Based Nuclear Explosion Monitoring Research & Engineering (GNEMRE) program at Sandia National Laboratories defined and modified a process for merging different types of integrated research products (IRPs) from various researchers into a cohesive, well-organized collection known as the NNSA Knowledge Base (KB), which supports operational treaty monitoring. This process includes defining the KB structure, systematically and logically aggregating IRPs into a complete set, and verifying and validating that the integrated Knowledge Base works as expected.
This guide is intended to enable researchers who work with seismic data but lack backgrounds in computer science and programming to develop seismic algorithms using the MATLAB-based MatSeis software. Specifically, it presents a series of step-by-step instructions for writing four specific functions of increasing complexity, while simultaneously explaining the notation, syntax, and general program design of the functions being written. The ultimate goal is for users to be able to employ this guide as a jumping-off point from which they can write new functions that are compatible with, and expand the capabilities of, the current MatSeis software, which has been developed as part of the Ground-based Nuclear Explosion Monitoring Research and Engineering (GNEMRE) program at Sandia National Laboratories.
To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The MATLAB-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of Knowledge Base products for which existing applications were either unavailable or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year.

MatSeis development began in 1996, and the package is now fairly mature. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g., filtering, three-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of MATLAB coding and the tremendous number of available functions make MatSeis/MATLAB an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F-statistic stream).

Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide the necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.
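As one concrete illustration of the phase-ratio measurement behind a tool like EventID, the Python sketch below bandpass filters a trace and takes the log amplitude ratio between two phase windows; the 6-8 Hz band, fourth-order filter, window conventions, and synthetic data are illustrative assumptions rather than the tool's actual parameters.

```python
# Minimal sketch of a regional phase amplitude-ratio measurement:
# bandpass both phase windows and compare log10(P/S) peak amplitudes.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def phase_ratio(trace, fs, p_window, s_window, band=(6.0, 8.0)):
    """log10 ratio of peak bandpassed amplitude in two time windows.

    p_window, s_window : (start_s, end_s) times of e.g. Pn and Lg windows.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filt = sosfiltfilt(sos, trace)

    def peak(win):
        i0, i1 = int(win[0] * fs), int(win[1] * fs)
        return np.abs(filt[i0:i1]).max()

    return np.log10(peak(p_window) / peak(s_window))

# Example call on a synthetic trace (illustrative windows).
fs = 40.0
trace = np.random.default_rng(3).standard_normal(int(120 * fs))
r = phase_ratio(trace, fs, p_window=(20, 30), s_window=(60, 80))
print(r)
```

In the tool itself, the measured ratio for a new event is compared against populations of ratios from known earthquakes and explosions, optionally using the correction surfaces noted above.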
In order to exploit the information on surface wave propagation that is stored in large seismic event datasets, Sandia and Lawrence Livermore National Laboratories have developed a MatSeis interface for performing phase-matched filtering of Rayleigh arrivals. MatSeis is a MATLAB-based seismic processing toolkit that provides graphical tools for analyzing seismic data from a network of stations. Tools are available for spectral and polarization measurements, as well as beamforming and f-k analysis with array data, to name just a few. Additionally, one has full access to the MATLAB environment and any functions available there. Previously, the authors reported the development of new MatSeis tools for calculating regional discrimination measurements. The first of these performs Lg coda analysis as developed by Mayeda and coworkers at Lawrence Livermore National Laboratory. A second tool measures regional phase amplitude ratios for an event and compares the results to ratios from known earthquakes and explosions. Release 1.5 of MatSeis includes the new interface for the analysis of surface wave arrivals. This effort involves the use of regionalized dispersion models from a repository of surface wave data and the construction of phase-matched filters to improve surface wave identification, detection, and magnitude calculation. The tool works as follows. First, a ray is traced from source to receiver through a user-defined grid of group velocity versus period values to determine the composite group velocity curve for the path. This curve is shown along with the upper and lower group velocity bounds for reference. Next, the curve is used to create a phase-matched filter, apply the filter, and show the resultant waveform. Applying the filter allows obscured Rayleigh arrivals to be more easily identified. Finally, after screening out information outside the range of the phase-matched filter, an inverse version of the filter is applied to obtain a cleaned raw waveform that can be used for amplitude measurements. Because all the MatSeis tools are written as MATLAB functions, they can easily be modified to experiment with different processing details. The performance of the propagation models can be evaluated using any event available in the repository of surface wave events.
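The filtering step described above can be sketched compactly: given a composite group-velocity curve for the path, the implied frequency-dependent delay is unwound in the spectral domain, compressing the dispersed Rayleigh arrival into a pulse. The toy dispersion curve, distance, and sign conventions below are illustrative assumptions in Python, not the MatSeis implementation.

```python
# Minimal sketch of phase-matched filtering for a dispersed arrival.
import numpy as np

def phase_matched_filter(trace, fs, dist_km, group_vel):
    """Compress dispersion given a group velocity curve.

    group_vel : callable U(f) in km/s for frequency f in Hz.
    The compressed arrival appears near time zero (circularly shifted).
    """
    n = len(trace)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    tg = np.zeros_like(freqs)
    tg[1:] = dist_km / group_vel(freqs[1:])   # group delay per frequency
    # Spectral phase is the integral of group delay over frequency
    # (trapezoidal cumulative integration).
    phi = 2.0 * np.pi * np.concatenate(
        [[0.0], np.cumsum(0.5 * (tg[1:] + tg[:-1]) * np.diff(freqs))])
    spec = np.fft.rfft(trace)
    return np.fft.irfft(spec * np.exp(1j * phi), n=n)

# Illustrative call with a toy dispersion curve (km/s), not a real model:
# cleaned = phase_matched_filter(trace, fs=1.0, dist_km=3000.0,
#     group_vel=lambda f: np.clip(4.0 - 8.0 * f, 2.8, 4.0))
```

Applying the conjugate phase (exp(-1j * phi)) to the compressed, windowed pulse reverses the operation, which corresponds to the "inverse version of the filter" used to recover a cleaned raw waveform for amplitude measurement.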
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) a set of content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database.

The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important for bridging gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focused on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program web site.

As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. The content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets, and they could be used to review reference event datasets as part of a dataset delivery verification process.

There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools give the researcher unprecedented power without requiring mastery of the intricacies and complexities of relational database systems.
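For reference, the dendrogram-style clustering at the heart of the Event Search Engine can be approximated in a few lines with standard libraries; the `1 - |correlation|` distance, complete linkage, and pre-aligned single-station waveforms below are illustrative assumptions in Python, whereas the actual tool supports windowing, enveloping, multi-station distances, and several linkage methods.

```python
# Minimal sketch of dendrogram-based event clustering from waveform
# correlations, assuming waveforms are already aligned and equal length.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_events(waveforms, max_dist=0.3, method="complete"):
    """waveforms : (n_events, n_samples) array, one row per event."""
    w = waveforms - waveforms.mean(axis=1, keepdims=True)
    w /= np.linalg.norm(w, axis=1, keepdims=True)
    cc = w @ w.T                                   # normalized correlations
    dist = squareform(1.0 - np.abs(cc), checks=False)
    tree = linkage(dist, method=method)            # the dendrogram
    return fcluster(tree, t=max_dist, criterion="distance")

# Example: three noisy copies of one event plus two of another.
rng = np.random.default_rng(4)
base1, base2 = rng.standard_normal((2, 500))
w = np.vstack([base1 + 0.2 * rng.standard_normal(500) for _ in range(3)]
              + [base2 + 0.2 * rng.standard_normal(500) for _ in range(2)])
print(cluster_events(w))  # e.g. [1 1 1 2 2]
```

In practice the correlation would be taken as the maximum over a range of lags rather than at zero lag, and the cut threshold `max_dist` plays the role of the similarity criterion the analyst tunes when winnowing clusters.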