Publications

LocOO3D User's Manual

Davenport, Kathy D.; Conley, Andrea C.; Downey, Nathan J.; Ballard, Sanford B.; Hipp, James R.; Begnaud, Michael L.

LocOO3D is a software tool that computes geographical locations for seismic events at regional to global scales. The software has a rich set of features, including the ability to use custom 3D velocity models, correlated observations, and master event locations. LocOO3D is especially useful for research related to seismic monitoring applications, since it allows users to easily explore a variety of location methods and scenarios and is compatible with the CSS3.0 data format used in monitoring applications. The LocOO3D software, User's Manual, and Examples are available on the web at https://github.com/sandialabs/LocOO3D. For additional information on GeoTess, SALSA3D, RSTT, and other related software, please see https://github.com/sandialabs/GeoTessJava, www.sandia.gov/geotess, www.sandia.gov/salsa3d, and www.sandia.gov/rstt.

PCalc User's Manual

Conley, Andrea C.; Downey, Nathan J.; Ballard, Sanford B.; Hipp, James R.; Hammond, Patrick H.; Davenport, Kathy D.; Begnaud, Michael L.

PCalc is a software tool that computes travel-time predictions and ray path geometry and performs model queries. This software has a rich set of features, including the ability to use custom 3D velocity models to compute predictions using a variety of geometries. The PCalc software is especially useful for research related to seismic monitoring applications.

3D Crustal Tomography Model of Utah

Conley, Andrea C.; Hammond, Patrick H.; Ballard, Sanford B.; Begnaud, Michael L.

The ability to accurately locate seismic events is necessary for treaty monitoring. When using techniques that rely on the comparison of observed and predicted travel times to obtain these locations, it is important that the estimated travel times and their estimated uncertainties are also accurate. The methodology of Ballard et al. (2016a) has been used in the past to generate an accurate 3D tomographic global model of compressional wave slowness (the SAndia LoS Alamos 3D tomography model, i.e., SALSA3D). To re-establish functionality and to broaden the capabilities of the method to local distances, we have applied the methodology of Ballard et al. (2016a) to local data in Utah. This report details the results of the initial model generated, including relocations performed using analyst-picked mining events at West Ridge Mine and three ground-truth events at Bingham Mine. We successfully generated a feasible tomography model that resulted in reasonable relocations of the mining events.

The iterative processing framework: A new paradigm for automatic event building

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Encarnacao, Andre V.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald; Sundermier, Amy S.

In a traditional data-processing pipeline, waveforms are acquired, a detector makes the signal detections (i.e., arrival times, slownesses, and azimuths) and passes them to an associator. The associator then links the detections to the fitting-event hypotheses to generate an event bulletin. Most of the time, this traditional pipeline requires substantial human-analyst involvement to improve the quality of the resulting event bulletin. For the year 2017, for example, International Data Center (IDC) analysts rejected about 40% of the events in the automatic bulletin and manually built 30% of the legitimate events. We propose an iterative processing framework (IPF) that includes a new data-processing module that incorporates automatic analyst behaviors (auto analyst [AA]) into the event-building pipeline. In the proposed framework, through an iterative process, the AA takes over many of the tasks traditionally performed by human analysts. These tasks can be grouped into two major processes: (1) evaluating small events with a low number of location-defining arrival phases to improve their formation; and (2) scanning for and exploiting unassociated arrivals to form potential events missed by previous association runs. To test the proposed framework, we processed a two-week period (15–28 May 2010) of the signal-detections dataset from the IDC. Comparison with an expert analyst-reviewed bulletin for the same time period suggests that IPF performs better than the traditional pipelines (IDC and baseline pipelines). Most of the additional events built by the AA are low-magnitude events that were missed by these traditional pipelines. The AA also adds additional signal detections to existing events, which saves analyst time, even if the event locations are not significantly affected.
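
The framework described above reduces to a simple control loop. The Python sketch below is a hypothetical, heavily simplified rendering of that loop; associate, evaluate_small_events, and scan_unassociated are stand-ins for the associator and the two auto-analyst (AA) processes and are not part of the published software.

def iterative_processing(detections, associate, evaluate_small_events,
                         scan_unassociated, max_iterations=5):
    # associate() returns {"events": [...], "associated": set of detection ids};
    # the two AA callables mirror processes (1) and (2) from the abstract.
    bulletin = associate(detections)
    for _ in range(max_iterations):
        bulletin = evaluate_small_events(bulletin)            # AA process (1)
        leftover = [d for d in detections if d["id"] not in bulletin["associated"]]
        new_events = scan_unassociated(leftover)              # AA process (2)
        if not new_events:
            break
        bulletin["events"].extend(new_events)
        for ev in new_events:
            bulletin["associated"].update(ev["arrival_ids"])
    return bulletin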

Rapid and robust cross-correlation-based seismic signal identification using an approximate nearest neighbor method

Bulletin of the Seismological Society of America

Tibi, Rigobert T.; Young, Christopher J.; Gonzales, Antonio G.; Ballard, Sanford B.; Encarnacao, Andre V.

The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ∼2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
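
As a rough illustration of the two stages described above, the following Python sketch first projects each waveform into a low-dimensional space of correlation coefficients against a randomized reference subset of the template library, then performs a greedy graph walk over precomputed neighbor lists. All names are illustrative, and Euclidean distance in the projected space stands in for whatever similarity measure the authors use; this is not the published implementation.

import numpy as np

def normalized_xcorr(a, b):
    # Zero-lag normalized cross-correlation of two equal-length waveforms.
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.dot(a, b))

def project(waveform, reference_templates):
    # One coordinate per correlation with a randomized reference subset.
    return np.array([normalized_xcorr(waveform, t) for t in reference_templates])

def ann_search(query_vec, vectors, neighbor_lists, start, k=10, iters=5):
    # Greedy walk: repeatedly compare the query with the neighbors of its
    # current best matches, keeping the k closest candidates found so far.
    best = {start: np.linalg.norm(vectors[start] - query_vec)}
    for _ in range(iters):
        for idx in list(best):
            for nbr in neighbor_lists[idx]:
                if nbr not in best:
                    best[nbr] = np.linalg.norm(vectors[nbr] - query_vec)
        best = dict(sorted(best.items(), key=lambda kv: kv[1])[:k])
    return sorted(best.items(), key=lambda kv: kv[1])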

Pickless event detection and location: The waveform correlation event-detection system (WCEDS) revisited

Bulletin of the Seismological Society of America

Arrowsmith, Stephen J.; Young, Christopher J.; Ballard, Sanford B.; Slinkard, Megan E.; Pankow, Kristine

The standard seismic explosion-monitoring paradigm is based on a sparse, spatially aliased network of stations to monitor either the whole Earth or a region of interest. Under this paradigm, state-of-the-art event-detection methods are based on seismic phase picks, which are associated at multiple stations and located using 3D Earth models. Here, we revisit a concept for event-detection that does not require phase picks or 3D models and fuses detection and association into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. We apply our detector to seismic data from Utah and evaluate our results by comparing them with the earthquake catalog published by the University of Utah Seismograph Stations. The results demonstrate that our pickless detector is a viable alternative technique for detecting events that likely requires less analyst overhead than do the existing methods.
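
A network-level stack-and-detect step of the kind described above can be caricatured as follows. This Python sketch assumes a hypothetical dictionary empirical_stack mapping each grid node to per-station wavefield templates of equal length; the real WCEDS processing and its normalization are considerably more involved.

import numpy as np

def detect_events(continuous_data, empirical_stack, threshold):
    # continuous_data: {station: 1-D array}, all the same length.
    # empirical_stack: {node: {station: template array}}, templates the same length.
    detections = []
    for node, templates in empirical_stack.items():
        cc_sum = None
        for station, template in templates.items():
            data = continuous_data[station]
            cc = np.correlate(data - data.mean(), template - template.mean(), mode="valid")
            # Crude normalization; a real detector normalizes per sliding window.
            cc /= (len(template) * data.std() * template.std())
            cc_sum = cc if cc_sum is None else cc_sum + cc
        cc_sum /= len(templates)
        peak = int(np.argmax(cc_sum))
        if cc_sum[peak] >= threshold:
            detections.append({"node": node, "sample": peak, "score": float(cc_sum[peak])})
    return detections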

GeoTess: A generalized Earth model software utility

Seismological Research Letters

Ballard, Sanford B.; Hipp, James R.; Kraus, Brian; Encarnacao, Andre V.; Young, Christopher J.

GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. The software is available in Java and C++, with a C interface to the C++ library. The software has been tested on Linux, Mac, Sun, and PC platforms. It is open source and is available online (see Data and Resources).

A new method for producing automated seismic bulletins: Probabilistic event detection, association, and location

Bulletin of the Seismological Society of America

Draelos, Timothy J.; Ballard, Sanford B.; Young, Christopher J.; Brogan, Ronald

Given a set of observations within a specified time window, a fitness value is calculated at each grid node by summing station-specific conditional fitness values. Assuming each observation was generated by a refracted P wave, these values are proportional to the conditional probabilities that each observation was generated by a seismic event at the grid node. The node with highest fitness value is accepted as a hypothetical event location, subject to some minimal fitness value, and all arrivals within a longer time window consistent with that event are associated with it. During the association step, a variety of different phases are considered. Once associated with an event, an arrival is removed from further consideration. While unassociated arrivals remain, the search for other events is repeated until none are identified. Results are presented in comparison with analyst-reviewed bulletins for three datasets: a two-week ground-truth period, the Tohoku aftershock sequence, and the entire year of 2010. The probabilistic event detection, association, and location algorithm missed fewer events and generated fewer false events on all datasets compared to the associator used at the International Data Center (51% fewer missed and 52% fewer false events on the ground-truth dataset when using the same predictions).
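
The detection/association loop described above can be summarized in a short Python sketch. Here station_fitness(arrival, node) is a hypothetical callable returning the station-specific conditional fitness of an arrival for a source at a grid node (zero when inconsistent), and the threshold is a placeholder rather than a value from the paper.

def build_bulletin(arrivals, grid_nodes, station_fitness, min_fitness=1.0):
    # Greedy loop: find the best-fitting node, associate consistent arrivals,
    # remove them, and repeat until no acceptable hypothesis remains.
    events = []
    remaining = list(arrivals)
    while remaining:
        totals = [sum(station_fitness(a, n) for a in remaining) for n in grid_nodes]
        best = max(range(len(grid_nodes)), key=totals.__getitem__)
        if totals[best] < min_fitness:
            break
        node = grid_nodes[best]
        associated = [a for a in remaining if station_fitness(a, node) > 0.0]
        if not associated:
            break
        events.append({"node": node, "arrivals": associated})
        remaining = [a for a in remaining if a not in associated]
    return events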

A global 3D P-velocity model of the Earth's crust and mantle for improved event location: SALSA3D

Ballard, Sanford B.; Young, Christopher J.; Hipp, James R.; Chang, Marcus C.; Encarnacao, Andre V.

To test the hypothesis that high quality 3D Earth models will produce seismic event locations which are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D version 1.5, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. Reduction in the total number of ray paths is approximately 50%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one we limit the number of iterations to prevent convergence thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method. We compare the travel-time prediction and location capabilities of SALSA3D to standard 1D models via location tests on a global event set with GT of 5 km or better. These events generally possess hundreds of Pn and P picks from which we generate different realizations of station distributions, yielding a range of azimuthal coverage and ratios of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation over standard 1D ak135 regardless of Pn to P ratio, with the improvement being most pronounced at higher azimuthal gaps.

A global 3D P-velocity model of the Earth's crust and mantle for improved event location

Ballard, Sanford B.; Young, Christopher J.; Hipp, James R.; Chang, Marcus C.; Encarnacao, Andre V.; Lewis, Jennifer E.

To test the hypothesis that high quality 3D Earth models will produce seismic event locations which are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D (SAndia LoS Alamos) version 1.4, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. Reduction in the total number of ray paths is > 55%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one we limit the number of iterations to prevent convergence thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method, as well as by directly estimating the diagonal of the model resolution matrix based on the technique developed by Bekas et al. We compare the travel-time prediction and location capabilities of this model over standard 1D models. We perform location tests on a global, geographically-distributed event set with ground truth levels of 5 km or better. These events generally possess hundreds of Pn and P phases from which we can generate different realizations of station distributions, yielding a range of azimuthal coverage and proportions of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation over standard 1D ak135, especially with increasing azimuthal gap. The 3D model appears to perform better for locations based solely or dominantly on regional arrivals, which is not unexpected given that ak135 represents a global average and cannot therefore capture local and regional variations.

A global 3D P-velocity model of the Earth's crust and mantle for improved event location

Young, Christopher J.; Ballard, Sanford B.; Hipp, James R.; Chang, Marcus C.; Encarnacao, Andre V.; Lewis, Jennifer E.

To test the hypothesis that high quality 3D Earth models will produce seismic event locations which are more accurate and more precise, we are developing a global 3D P wave velocity model of the Earth's crust and mantle using seismic tomography. In this paper, we present the most recent version of our model, SALSA3D (SAndia LoS Alamos) version 1.4, and demonstrate its ability to reduce mislocations for a large set of realizations derived from a carefully chosen set of globally-distributed ground truth events. Our model is derived from the latest version of the Ground Truth (GT) catalog of P and Pn travel time picks assembled by Los Alamos National Laboratory. To prevent over-weighting due to ray path redundancy and to reduce the computational burden, we cluster rays to produce representative rays. Reduction in the total number of ray paths is > 55%. The model is represented using the triangular tessellation system described by Ballard et al. (2009), which incorporates variable resolution in both the geographic and radial dimensions. For our starting model, we use a simplified two layer crustal model derived from the Crust 2.0 model over a uniform AK135 mantle. Sufficient damping is used to reduce velocity adjustments so that ray path changes between iterations are small. We obtain proper model smoothness by using progressive grid refinement, refining the grid only around areas with significant velocity changes from the starting model. At each grid refinement level except the last one we limit the number of iterations to prevent convergence thereby preserving aspects of broad features resolved at coarser resolutions. Our approach produces a smooth, multi-resolution model with node density appropriate to both ray coverage and the velocity gradients required by the data. This scheme is computationally expensive, so we use a distributed computing framework based on the Java Parallel Processing Framework, providing us with approximately 400 processors. Resolution of our model is assessed using a variation of the standard checkerboard method, as well as by directly estimating the diagonal of the model resolution matrix based on the technique developed by Bekas et al. We compare the travel-time prediction and location capabilities of this model over standard 1D models. We perform location tests on a global, geographically-distributed event set with ground truth levels of 5 km or better. These events generally possess hundreds of Pn and P phases from which we can generate different realizations of station distributions, yielding a range of azimuthal coverage and proportions of teleseismic to regional arrivals, with which we test the robustness and quality of relocation. The SALSA3D model reduces mislocation over standard 1D ak135, especially with increasing azimuthal gap. The 3D model appears to perform better for locations based solely or dominantly on regional arrivals, which is not unexpected given that ak135 represents a global average and cannot therefore capture local and regional variations.

Analytic solutions for seismic travel time and ray path geometry through simple velocity models

Ballard, Sanford B.

The geometry of ray paths through realistic Earth models can be extremely complex due to the vertical and lateral heterogeneity of the velocity distribution within the models. Calculation of high fidelity ray paths and travel times through these models generally involves sophisticated algorithms that require significant assumptions and approximations. To test such algorithms it is desirable to have available analytic solutions for the geometry and travel time of rays through simpler velocity distributions against which the more complex algorithms can be compared. Also, in situations where computational performance requirements prohibit implementation of full 3D algorithms, it may be necessary to accept the accuracy limitations of analytic solutions in order to compute solutions that satisfy those requirements. Analytic solutions are described for the geometry and travel time of infinite frequency rays through radially symmetric 1D Earth models characterized by an inner sphere where the velocity distribution is given by the function V(r) = A - Br², optionally surrounded by some number of spherical shells of constant velocity. The mathematical basis of the calculations is described, sample calculations are presented, and results are compared to the TauP Toolkit of Crotwell et al. (1999). These solutions are useful for evaluating the fidelity of sophisticated 3D travel time calculators and in situations where performance requirements preclude the use of more computationally intensive calculators. It should be noted that most of the solutions presented are only quasi-analytic. Exact, closed form equations are derived, but computation of solutions to specific problems generally requires application of numerical integration or root finding techniques, which, while approximations, can be calculated to very high accuracy. Tolerances are set in the numerical algorithms such that computed travel time accuracies are better than 1 microsecond.
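
For readers who want a quick numerical cross-check of such solutions, the standard spherically symmetric distance and travel-time integrals (e.g., Aki and Richards) can be evaluated directly for V(r) = A - Br². The Python sketch below does exactly that; the constants A, B, and R are illustrative placeholders rather than values from the report, and the turning radius follows from solving r/V(r) = p for this particular velocity law.

import numpy as np
from scipy.integrate import quad

A, B, R = 12.0, 5.0e-8, 6371.0        # km/s, 1/(km*s), km; illustrative values only

def v(r):
    return A - B * r**2

def u(r):
    # Slowness-radius product r / v(r), in s/rad.
    return r / v(r)

def turning_radius(p):
    # Solve u(r) = p for this v(r): B*p*r^2 + r - A*p = 0, positive root.
    return (-1.0 + np.sqrt(1.0 + 4.0 * A * B * p**2)) / (2.0 * B * p)

def delta_and_time(p):
    # Surface-to-surface ray with ray parameter p (s/rad); the integrands have
    # an integrable 1/sqrt singularity at the turning radius.
    rt = turning_radius(p)
    f_delta = lambda r: p / (r * np.sqrt(u(r)**2 - p**2))
    f_time = lambda r: u(r)**2 / (r * np.sqrt(u(r)**2 - p**2))
    delta = 2.0 * quad(f_delta, rt, R, limit=200)[0]       # radians
    ttime = 2.0 * quad(f_time, rt, R, limit=200)[0]        # seconds
    return np.degrees(delta), ttime

print(delta_and_time(300.0))

With tight tolerances, these integrals provide an independent check against which quasi-analytic travel times of the kind described in the report can be compared.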

GNEMRE DBTools: a suite of tools for access, maintenance, and manipulation of seismic data

Lewis, Jennifer E.; Ballard, Sanford B.

DBTools comprises a suite of applications for manipulating data in a database. While loading data into a database is a relatively simple operation, loading data intelligently is deceptively difficult. Loading data intelligently means: not duplicating information already in the database, associating new information with related information already in the database, and maintaining a mapping of identification numbers in the input data to existing or new identification numbers in the database to prevent conflicts between the input data and the existing data. Most DBTools applications utilize DBUtilLib, a Java library with functionality supporting database, flat-file, and XML data formats. DBUtilLib is written in a completely generic manner. No schema-specific information is embedded within the code; all such information comes from external sources. This approach makes the DBTools applications immune to most schema changes, such as addition/deletion of columns from a table or changes to the size of a particular data element.
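
The identifier-mapping behavior described above is the subtle part of loading data intelligently. The fragment below illustrates the idea in Python with hypothetical names; DBUtilLib itself is a Java library, and its actual interfaces are not shown here.

def remap_ids(input_rows, id_column, existing_ids):
    # Give every distinct incoming identifier a value that cannot collide with
    # identifiers already in the database, and return the mapping so that other
    # tables in the same load can have their references updated consistently.
    next_id = max(existing_ids, default=0) + 1
    mapping = {}
    remapped = []
    for row in input_rows:
        old = row[id_column]
        if old not in mapping:
            mapping[old] = next_id
            next_id += 1
        remapped.append({**row, id_column: mapping[old]})
    return remapped, mapping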

The 2004 knowledge base parametric grid data software suite

Ballard, Sanford B.; Chang, Marcus C.; Hipp, James R.; Jensen, Lee A.; Simons, Randall W.; Wilkening, Lisa K.

One of the most important types of data in the National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Knowledge Base (KB) is parametric grid (PG) data. PG data can be used to improve signal detection, signal association, and event discrimination, but so far their greatest use has been for improving event location by providing ground-truth-based corrections to travel-time base models. In this presentation we discuss the latest versions of the complete suite of Knowledge Base PG tools developed by NNSA to create, access, manage, and view PG data.

The primary PG population tool is the Knowledge Base calibration integration tool (KBCIT). KBCIT is an interactive computer application to produce interpolated calibration-based information that can be used to improve monitoring performance by improving precision of model predictions and by providing proper characterizations of uncertainty. It is used to analyze raw data and produce kriged correction surfaces that can be included in the Knowledge Base. KBCIT not only produces the surfaces but also records all steps in the analysis for later review and possible revision. New features in KBCIT include a new variogram autofit algorithm; the storage of database identifiers with a surface; the ability to merge surfaces; and improved surface-smoothing algorithms.

The Parametric Grid Library (PGL) provides the interface to access the data and models stored in a PGL file database. The PGL represents the core software library used by all the GNEM R&E tools that read or write PGL data (e.g., KBCIT and LocOO). The library provides data representations and software models to support accurate and efficient seismic phase association and event location. Recent improvements include conversion of the flat-file database (FDB) to an Oracle database representation; automatic access of station/phase tagged models from the FDB during location; modification of the core geometric data representations; a new multimodel representation for combining separate seismic data models that partially overlap; and a port of PGL to the Microsoft Windows platform.

The Data Manager (DM) tool provides access to PG data for purposes of managing the organization of the generated PGL file database, or for perusing the data for visualization and informational purposes. It is written as a graphical user interface (GUI) that can directly access objects stored in any PGL file database and display them in an easily interpreted textual or visual format. New features include enhanced station object processing; low-level conversion to a new core graphics visualization library, the Visualization Toolkit (VTK); additional visualization support for most of the PGL geometric objects; and support for the Environmental Systems Research Institute (ESRI) shape files (which are used to enhance the geographical context during visualization).

The Location Object-Oriented (LocOO) tool computes seismic event locations and associated uncertainty based on travel time, azimuth, and slowness observations. It uses a linearized least-squares inversion algorithm (the Geiger method), enhanced with Levenberg-Marquardt damping to improve performance in highly nonlinear regions of model space. LocOO relies on PGL for all predicted quantities and is designed to fully exploit all the capabilities of PGL that are relevant to seismic event location. New features in LocOO include a redesigned internal architecture implemented to enhance flexibility and to support simultaneous multiple event location. Database communication has been rewritten using new object-relational features available in Oracle 9i.

Seismic event location: dealing with multi-dimensional uncertainty, model non-linearity and local minima

Ballard, Sanford B.

Seismic event location is made challenging by the difficulty of describing event location uncertainty in multidimensions, by the non-linearity of the Earth models used as input to the location algorithm, and by the presence of local minima which can prevent a location code from finding the global minimum. Techniques to deal with these issues will be described. Since some of these techniques are computationally expensive or require more analysis by human analysts, users need a flexible location code that allows them to select from a variety of solutions that span a range of computational efficiency and simplicity of interpretation. A new location code, LocOO, has been developed to deal with these issues. A seismic event location is comprised of a point in 4-dimensional (4D) space-time, surrounded by a 4D uncertainty boundary. The point location is useless without the uncertainty that accompanies it. While it is mathematically straightforward to reduce the dimensionality of the 4D uncertainty limits, the number of dimensions that should be retained depends on the dimensionality of the location to which the calculated event location is to be compared. In nuclear explosion monitoring, when an event is to be compared to a known or suspected test site location, the three spatial components of the test site and event location are to be compared and 3 dimensional uncertainty boundaries should be considered. With LocOO, users can specify a location to which the calculated seismic event location is to be compared and the dimensionality of the uncertainty is tailored to that of the location specified by the user. The code also calculates the probability that the two locations in fact coincide. The non-linear travel time curves that constrain calculated event locations present two basic difficulties. The first is that the non-linearity can cause least squares inversion techniques to fail to converge. LocOO implements a nonlinear Levenberg-Marquardt least squares inversion technique that is guaranteed to converge in a finite number of iterations for tractable problems. The second difficulty is that a high degree of non-linearity causes the uncertainty boundaries around the event location to deviate significantly from elliptical shapes. LocOO can optionally calculate and display non-elliptical uncertainty boundaries at the cost of a minimal increase in computation time and complexity of interpretation. All location codes are plagued by the possibility of having local minima obscuring the single global minimum. No code can guarantee that it will find the global minimum in a finite number of computations. Grid search algorithms have been developed to deal with this problem, but have a high computational cost. In order to improve the likelihood of finding the global minimum in a timely manner, LocOO implements a hybrid least squares-grid search algorithm. Essentially, many least squares solutions are computed starting from a user-specified number of initial locations; and the solution with the smallest sum squared weighted residual is assumed to be the optimal location. For events of particular interest, analysts can display contour plots of gridded residuals in a selected region around the best-fit location, improving the probability that the global minimum will not be missed and also providing much greater insight into the character and quality of the calculated solution.

Seismic Event Location Using Levenberg-Marquardt Least Squares Inversion

Ballard, Sanford B.

The most widely used algorithm for estimating seismic event hypocenters and origin times is iterative linear least squares inversion. In this paper we review the mathematical basis of the algorithm and discuss the major assumptions made during its derivation. We go on to explore the utility of using Levenberg-Marquardt damping to improve the performance of the algorithm in cases where some of these assumptions are violated. We also describe how location parameter uncertainties are calculated. A technique to estimate an initial seismic event location is described in an appendix.
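
For concreteness, one damped Gauss-Newton (Levenberg-Marquardt) update of the four location parameters can be written in a few lines of numpy. This is a generic textbook sketch consistent with the algorithm reviewed in the paper, not an excerpt from any particular location code; G holds travel-time partial derivatives with respect to (latitude, longitude, depth, origin time), r the vector of residuals, and weights their weights.

import numpy as np

def lm_step(G, r, weights, lam):
    # Solve (G^T W G + lam * diag(G^T W G)) dm = G^T W r for the update dm.
    # Larger lam shortens the step and steers it toward steepest descent,
    # which helps when the linearization of the travel-time model is poor.
    W = np.diag(weights)
    A = G.T @ W @ G
    rhs = G.T @ W @ r
    dm = np.linalg.solve(A + lam * np.diag(np.diag(A)), rhs)
    return dm

A driver would typically accept the step when the sum of squared weighted residuals decreases (then reduce lam) and otherwise increase lam and retry.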

CaveMan Version 3.0: A Software System for SPR Cavern Pressure Analysis

Ballard, Sanford B.; Ehgartner, Brian L.

The U. S. Department of Energy Strategic Petroleum Reserve currently has approximately 500 million barrels of crude oil stored in 62 caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. One of the challenges of operating these caverns is ensuring that none of the fluids in the caverns are leaking into the environment. The current approach is to test the mechanical integrity of all the wells entering each cavern approximately once every five years. An alternative approach to detecting cavern leaks is to monitor the cavern pressure, since leaking fluid would act to reduce cavern pressure. Leak detection by pressure monitoring is complicated by other factors that influence cavern pressure, the most important of which are thermal expansion and contraction of the fluids in the cavern as they come into thermal equilibrium with the host salt, and cavern volume reduction due to salt creep. Cavern pressure is also influenced by cavern enlargement resulting from salt dissolution following introduction of raw water or unsaturated brine into the cavern. However, this effect only lasts for a month or two following a fluid injection. In order to implement a cavern pressure monitoring program, a software program called CaveMan has been developed. It includes thermal, creep and salt dissolution models and is able to predict the cavern pressurization rate based on the operational history of the cavern. Many of the numerous thermal and mechanical parameters in the model have been optimized to produce the best match between the historical data and the model predictions. Future measurements of cavern pressure are compared to the model predictions, and significant differences in cavern pressure set program flags that notify cavern operators of a potential problem. Measured cavern pressures that are significantly less than those predicted by the model may indicate the existence of a leak.
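
The monitoring logic described above amounts to comparing measured pressures with model predictions and flagging persistent negative departures. The fragment below is an illustrative Python rendering of that comparison only; the threshold and window are placeholders, and the thermal, creep, and dissolution models themselves are not reproduced.

def flag_pressure_history(measured, predicted, threshold, window=3):
    # measured, predicted: sequences of cavern pressures at matching times.
    # Raise a flag when the measurement falls below the prediction by more
    # than `threshold` for `window` consecutive readings, since a persistent
    # pressure deficit is the signature of a possible leak.
    deficit_run = 0
    flags = []
    for m, p in zip(measured, predicted):
        deficit_run = deficit_run + 1 if (p - m) > threshold else 0
        flags.append(deficit_run >= window)
    return flags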
