Publications

Results 76–89 of 89

Analytic solutions for seismic travel time and ray path geometry through simple velocity models

Ballard, Sanford B.

The geometry of ray paths through realistic Earth models can be extremely complex due to the vertical and lateral heterogeneity of the velocity distribution within the models. Calculation of high-fidelity ray paths and travel times through these models generally involves sophisticated algorithms that require significant assumptions and approximations. To test such algorithms, it is desirable to have analytic solutions for the geometry and travel time of rays through simpler velocity distributions against which the more complex algorithms can be compared. Also, in situations where computational performance requirements prohibit implementation of full 3D algorithms, it may be necessary to accept the accuracy limitations of analytic solutions in order to compute solutions that satisfy those requirements. Analytic solutions are described for the geometry and travel time of infinite-frequency rays through radially symmetric 1D Earth models characterized by an inner sphere where the velocity distribution is given by the function V(r) = A - Br², optionally surrounded by some number of spherical shells of constant velocity. The mathematical basis of the calculations is described, sample calculations are presented, and results are compared to the TauP Toolkit of Crotwell et al. (1999). These solutions are useful for evaluating the fidelity of sophisticated 3D travel-time calculators and in situations where performance requirements preclude the use of more computationally intensive calculators. It should be noted that most of the solutions presented are only quasi-analytic. Exact, closed-form equations are derived, but computation of solutions to specific problems generally requires application of numerical integration or root-finding techniques, which, while approximate, can be carried out to very high accuracy. Tolerances are set in the numerical algorithms such that computed travel-time accuracies are better than 1 microsecond.
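
As a point of reference for what such a calculation involves, the following is a minimal sketch, not the paper's code, of the standard spherically symmetric ray integrals evaluated by numerical quadrature for a velocity distribution of the form V(r) = A - Br². The coefficient values and the ray parameter are assumed purely for illustration.

```python
# Illustrative sketch (not the paper's code): epicentral distance and travel time
# for a ray with ray parameter p through a radially symmetric model v(r) = A - B*r**2,
# using the standard spherical-Earth ray integrals and scipy numerical quadrature.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

A, B = 11.0, 1.0e-7        # assumed velocity coefficients (km/s and km/s per km^2)
R0 = 6371.0                # source/receiver radius (Earth's surface), km

def v(r):
    return A - B * r**2

def xi(r):                 # xi(r) = r / v(r); the ray turns where xi(r) = p
    return r / v(r)

def turning_radius(p):
    # find r_t such that xi(r_t) = p (xi is monotonic for this model over the bracket)
    return brentq(lambda r: xi(r) - p, 1.0, R0)

def distance_and_time(p):
    rt = turning_radius(p)
    # integrands are singular (integrably) at the turning radius, so the adaptive
    # quadrature may converge slowly there; fine for an illustrative check
    delta = 2.0 * quad(lambda r: p / (r * np.sqrt(xi(r)**2 - p**2)), rt, R0)[0]
    time  = 2.0 * quad(lambda r: xi(r)**2 / (r * np.sqrt(xi(r)**2 - p**2)), rt, R0)[0]
    return np.degrees(delta), time

print(distance_and_time(p=400.0))   # ray parameter p in s/rad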


GNEMRE DBTools: a suite of tools for access, maintenance, and manipulation of seismic data

Lewis, Jennifer E.; Ballard, Sanford B.

DBTools comprises a suite of applications for manipulating data in a database. While loading data into a database is a relatively simple operation, loading data intelligently is deceptively difficult. Loading data intelligently means not duplicating information already in the database, associating new information with related information already in the database, and maintaining a mapping from identification numbers in the input data to existing or new identification numbers in the database to prevent conflicts between the input data and the existing data. Most DBTools applications utilize DBUtilLib, a Java library with functionality supporting database, flat-file, and XML data formats. DBUtilLib is written in a completely generic manner: no schema-specific information is embedded within the code; all such information comes from external sources. This approach makes the DBTools applications immune to most schema changes, such as the addition or deletion of columns from a table or changes to the size of a particular data element.
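
As an illustration of the identification-number mapping described above, here is a minimal conceptual sketch in Python (hypothetical; DBUtilLib itself is a Java library and its actual API is not shown here): incoming ids that collide with ids already in the database are reassigned, and the old-to-new mapping is retained so that rows referencing those ids can be updated consistently.

```python
# Hypothetical sketch of id remapping during an intelligent data load.
# Assumes next_free_id is larger than every existing and incoming id.
def remap_ids(incoming_rows, existing_ids, next_free_id, id_col="orid"):
    id_map, out = {}, []
    for row in incoming_rows:
        old = row[id_col]
        if old not in id_map:
            if old in existing_ids:
                id_map[old] = next_free_id   # conflict: assign a fresh id
                next_free_id += 1
            else:
                id_map[old] = old            # no conflict: keep the original id
        out.append({**row, id_col: id_map[old]})
    return out, id_map

rows = [{"orid": 5, "lat": 35.1}, {"orid": 7, "lat": 36.2}]
print(remap_ids(rows, existing_ids={5, 6}, next_free_id=100))
```

The same mapping would then be applied to any child rows that reference the remapped column, which is what keeps the loaded data associated with the correct existing records.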


The 2004 knowledge base parametric grid data software suite

Ballard, Sanford B.; Chang, Marcus C.; Hipp, James R.; Jensen, Lee A.; Simons, Randall W.; Wilkening, Lisa K.

One of the most important types of data in the National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Knowledge Base (KB) is parametric grid (PG) data. PG data can be used to improve signal detection, signal association, and event discrimination, but so far their greatest use has been for improving event location by providing ground-truth-based corrections to travel-time base models. In this presentation we discuss the latest versions of the complete suite of Knowledge Base PG tools developed by NNSA to create, access, manage, and view PG data.

The primary PG population tool is the Knowledge Base calibration integration tool (KBCIT). KBCIT is an interactive computer application used to produce interpolated calibration-based information that can improve monitoring performance by improving the precision of model predictions and by providing proper characterizations of uncertainty. It is used to analyze raw data and produce kriged correction surfaces that can be included in the Knowledge Base. KBCIT not only produces the surfaces but also records all steps in the analysis for later review and possible revision. New features in KBCIT include a new variogram autofit algorithm; the storage of database identifiers with a surface; the ability to merge surfaces; and improved surface-smoothing algorithms.

The Parametric Grid Library (PGL) provides the interface to access the data and models stored in a PGL file database. PGL is the core software library used by all the GNEM R&E tools that read or write PGL data (e.g., KBCIT and LocOO). The library provides data representations and software models to support accurate and efficient seismic phase association and event location. Recent improvements include conversion of the flat-file database (FDB) to an Oracle database representation; automatic access of station/phase-tagged models from the FDB during location; modification of the core geometric data representations; a new multimodel representation for combining separate seismic data models that partially overlap; and a port of PGL to the Microsoft Windows platform.

The Data Manager (DM) tool provides access to PG data for managing the organization of the generated PGL file database, or for perusing the data for visualization and informational purposes. It is written as a graphical user interface (GUI) that can directly access objects stored in any PGL file database and display them in an easily interpreted textual or visual format. New features include enhanced station object processing; low-level conversion to a new core graphics visualization library, the Visualization Toolkit (VTK); additional visualization support for most of the PGL geometric objects; and support for Environmental Systems Research Institute (ESRI) shapefiles, which are used to enhance the geographical context during visualization.

The Location Object-Oriented (LocOO) tool computes seismic event locations and associated uncertainty based on travel time, azimuth, and slowness observations. It uses a linearized least-squares inversion algorithm (the Geiger method), enhanced with Levenberg-Marquardt damping to improve performance in highly nonlinear regions of model space. LocOO relies on PGL for all predicted quantities and is designed to fully exploit all the capabilities of PGL that are relevant to seismic event location. New features in LocOO include a redesigned internal architecture implemented to enhance flexibility and to support simultaneous multiple-event location. Database communication has been rewritten using new object-relational features available in Oracle 9i.
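
For context on the kriged correction surfaces KBCIT produces, the following is a generic sketch of ordinary kriging with an exponential variogram. It is not KBCIT's implementation, and the variogram parameters and sample values are assumed for illustration.

```python
# Illustrative sketch of ordinary kriging with an exponential variogram, the
# generic technique behind kriged correction surfaces; not KBCIT's algorithm.
import numpy as np

def exp_variogram(h, nugget=0.01, sill=1.0, rng=500.0):
    # assumed variogram model; in practice the parameters would be fit to data
    return np.where(h > 0, nugget + sill * (1.0 - np.exp(-h / rng)), 0.0)

def ordinary_krige(xy, z, xy0, **vparams):
    """xy: (n,2) data locations, z: (n,) values, xy0: (2,) prediction point."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = exp_variogram(d, **vparams)
    K[n, :n] = K[:n, n] = 1.0      # Lagrange row/column enforcing unbiasedness
    K[n, n] = 0.0
    k = np.append(exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vparams), 1.0)
    w = np.linalg.solve(K, k)
    estimate = w[:n] @ z
    variance = w @ k               # kriging variance at the prediction point
    return estimate, variance

pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 150.0], [120.0, 130.0]])
vals = np.array([0.4, 0.9, 0.1, 0.7])   # e.g., travel-time corrections (s)
print(ordinary_krige(pts, vals, np.array([60.0, 60.0])))
```

The quality of such a surface depends heavily on the fitted variogram, which is why a variogram autofit algorithm is a notable KBCIT feature.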


Seismic event location: dealing with multi-dimensional uncertainty, model non-linearity, and local minima

Ballard, Sanford B.

Seismic event location is made challenging by the difficulty of describing event location uncertainty in multiple dimensions, by the non-linearity of the Earth models used as input to the location algorithm, and by the presence of local minima that can prevent a location code from finding the global minimum. Techniques to deal with these issues are described. Since some of these techniques are computationally expensive or require more analysis by human analysts, users need a flexible location code that allows them to select from a variety of solutions that span a range of computational efficiency and simplicity of interpretation. A new location code, LocOO, has been developed to deal with these issues.

A seismic event location comprises a point in 4-dimensional (4D) space-time surrounded by a 4D uncertainty boundary. The point location is useless without the uncertainty that accompanies it. While it is mathematically straightforward to reduce the dimensionality of the 4D uncertainty limits, the number of dimensions that should be retained depends on the dimensionality of the location to which the calculated event location is to be compared. In nuclear explosion monitoring, when an event is to be compared to a known or suspected test site location, the three spatial components of the test site and event location are compared and 3-dimensional uncertainty boundaries should be considered. With LocOO, users can specify a location to which the calculated seismic event location is to be compared, and the dimensionality of the uncertainty is tailored to that of the location specified by the user. The code also calculates the probability that the two locations in fact coincide.

The non-linear travel time curves that constrain calculated event locations present two basic difficulties. The first is that the non-linearity can cause least squares inversion techniques to fail to converge. LocOO implements a nonlinear Levenberg-Marquardt least squares inversion technique that is guaranteed to converge in a finite number of iterations for tractable problems. The second difficulty is that a high degree of non-linearity causes the uncertainty boundaries around the event location to deviate significantly from elliptical shapes. LocOO can optionally calculate and display non-elliptical uncertainty boundaries at the cost of a minimal increase in computation time and complexity of interpretation.

All location codes are plagued by the possibility of local minima obscuring the single global minimum, and no code can guarantee that it will find the global minimum in a finite number of computations. Grid search algorithms have been developed to deal with this problem, but they have a high computational cost. To improve the likelihood of finding the global minimum in a timely manner, LocOO implements a hybrid least-squares/grid-search algorithm: many least squares solutions are computed starting from a user-specified number of initial locations, and the solution with the smallest sum of squared weighted residuals is taken to be the optimal location. For events of particular interest, analysts can display contour plots of gridded residuals in a selected region around the best-fit location, improving the probability that the global minimum will not be missed and also providing much greater insight into the character and quality of the calculated solution.
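
The hybrid least-squares/grid-search strategy can be illustrated with a small sketch: run a damped least-squares location from several trial starting points and keep the solution with the smallest sum of squared weighted residuals. The constant-velocity travel-time predictor, station geometry, and uncertainties below are assumptions for illustration only; LocOO obtains its predictions from Earth models rather than this toy predictor.

```python
# Sketch of multi-start damped least-squares event location (illustrative only).
import numpy as np
from scipy.optimize import least_squares

V = 6.0                                   # assumed constant velocity, km/s
stations = np.array([[0, 0], [50, 5], [10, 60], [70, 40], [30, 90]], float)

def predict(m, sta):                      # m = (x, y, origin_time); depth ignored
    return m[2] + np.linalg.norm(sta - m[:2], axis=1) / V

true_m = np.array([40.0, 35.0, 0.0])
obs = predict(true_m, stations) + np.random.default_rng(0).normal(0, 0.05, len(stations))
sigma = 0.05 * np.ones(len(stations))     # assumed observation uncertainties, s

def weighted_residuals(m):
    return (obs - predict(m, stations)) / sigma

# grid of trial starting locations; each start gets a Levenberg-Marquardt solve
starts = [np.array([x, y, 0.0]) for x in (0, 40, 80) for y in (0, 45, 90)]
fits = [least_squares(weighted_residuals, m0, method="lm") for m0 in starts]
best = min(fits, key=lambda f: f.cost)    # smallest sum of squared weighted residuals
print("best-fit (x, y, t0):", best.x)
```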


Seismic Event Location Using Levenberg-Marquardt Least Squares Inversion

Ballard, Sanford B.

The most widely used algorithm for estimating seismic event hypocenters and origin times is iterative linear least squares inversion. In this paper we review the mathematical basis of the algorithm and discuss the major assumptions made during its derivation. We go on to explore the utility of using Levenberg-Marquardt damping to improve the performance of the algorithm in cases where some of these assumptions are violated. We also describe how location parameter uncertainties are calculated. A technique to estimate an initial seismic event location is described in an appendix.
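
The damped update at the heart of the algorithm can be sketched as follows: at each iteration solve (GᵀWG + λI)Δm = GᵀWr and adjust the damping factor λ according to whether the step reduced the weighted misfit. The constant-velocity predictor and finite-difference Jacobian in this sketch are assumptions for illustration only.

```python
# Minimal sketch of Levenberg-Marquardt damped Gauss-Newton iteration for an
# epicenter/origin-time inversion (illustrative, not the paper's code).
import numpy as np

V = 6.0                                          # assumed constant velocity, km/s
stations = np.array([[0, 0], [50, 5], [10, 60], [70, 40]], float)

def predict(m):                                  # m = (x, y, t0)
    return m[2] + np.linalg.norm(stations - m[:2], axis=1) / V

obs = predict(np.array([40.0, 35.0, 0.0]))       # synthetic noise-free observations
W = np.eye(len(obs)) / 0.05**2                   # weights = 1 / sigma^2

def jacobian(m, eps=1e-4):                       # finite-difference d(pred)/dm
    J = np.empty((len(obs), len(m)))
    for k in range(len(m)):
        dm = np.zeros_like(m); dm[k] = eps
        J[:, k] = (predict(m + dm) - predict(m - dm)) / (2 * eps)
    return J

m, lam = np.array([0.0, 0.0, 5.0]), 1e-2
for _ in range(50):
    r = obs - predict(m)
    G = jacobian(m)
    step = np.linalg.solve(G.T @ W @ G + lam * np.eye(3), G.T @ W @ r)
    r_new = obs - predict(m + step)
    if r_new @ W @ r_new < r @ W @ r:
        m, lam = m + step, lam * 0.5             # good step: accept, relax damping
    else:
        lam *= 10.0                              # bad step: increase damping, retry
print("estimated (x, y, t0):", m)
```

In the undamped limit the inverse of GᵀWG is the quantity from which location parameter uncertainties are typically derived.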


CaveMan Version 3.0: A Software System for SPR Cavern Pressure Analysis

Ballard, Sanford B.; Ehgartner, Brian L.

The U.S. Department of Energy Strategic Petroleum Reserve currently has approximately 500 million barrels of crude oil stored in 62 caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. One of the challenges of operating these caverns is ensuring that none of the fluids in the caverns are leaking into the environment. The current approach is to test the mechanical integrity of all the wells entering each cavern approximately once every five years. An alternative approach to detecting cavern leaks is to monitor the cavern pressure, since leaking fluid would act to reduce cavern pressure. Leak detection by pressure monitoring is complicated by other factors that influence cavern pressure, the most important of which are thermal expansion and contraction of the fluids in the cavern as they come into thermal equilibrium with the host salt, and cavern volume reduction due to salt creep. Cavern pressure is also influenced by cavern enlargement resulting from salt dissolution following introduction of raw water or unsaturated brine into the cavern; however, this effect only lasts for a month or two following a fluid injection. To implement a cavern pressure monitoring program, a software system called CaveMan has been developed. It includes thermal, creep, and salt-dissolution models and is able to predict the cavern pressurization rate based on the operational history of the cavern. Many of the thermal and mechanical parameters in the model have been optimized to produce the best match between the historical data and the model predictions. Future measurements of cavern pressure are compared to the model predictions, and significant differences in cavern pressure set program flags that notify cavern operators of a potential problem. Measured cavern pressures that are significantly less than those predicted by the model may indicate the existence of a leak.
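
As a loose illustration of the pressure-monitoring idea, and not the CaveMan model itself, the sketch below predicts a pressurization rate as the sum of a decaying thermal-expansion term and a steady salt-creep term, then flags measurements that fall well below the prediction; every coefficient here is assumed.

```python
# Illustrative sketch only (not the CaveMan model): compare measured cavern
# pressurization rates against a simple predicted rate and flag large shortfalls,
# since a persistent shortfall may indicate a leak.
import numpy as np

def predicted_rate(t_days, thermal0=2.0, tau=300.0, creep=0.3):
    """Pressurization rate (psi/day) at t_days since the last fluid movement."""
    return thermal0 * np.exp(-t_days / tau) + creep

def leak_flag(t_days, measured_rate, threshold=0.5):
    """Flag if the measured rate is more than `threshold` psi/day below prediction."""
    return (predicted_rate(t_days) - measured_rate) > threshold

days = np.array([30.0, 120.0, 400.0])
measured = np.array([1.9, 1.4, 0.1])         # hypothetical pressure-survey rates
print(predicted_rate(days), leak_flag(days, measured))
```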
