Sandia is part of a multi-lab Defense Nuclear Nonproliferation Research and Development project designed to improve U.S. capabilities to detect and characterize low-yield underground nuclear explosions. Monitoring at low detection thresholds with dynamic monitoring networks requires innovative approaches that merge new and legacy physics-based knowledge with the data quantities, qualities, and types now available. Sandia is addressing these challenges by developing new predictive capabilities, network designs, and advanced algorithms for the detection, identification, location, and characterization of underground nuclear explosions. Seismic monitoring provides an important capability for detecting nuclear tests around the world, especially tests conducted deeper underground in an effort to conceal an explosion. To contribute to this effort, Sandia is developing new automated tools for analyzing seismic data that increase the ability to detect and characterize evasive explosions, quantify the inherent uncertainty, and still maintain the credibility that human scrutiny provides.
Seismic monitoring faces significant challenges because there are many background sources of seismicity, such as natural events like earthquakes and anthropogenic sources like mining. Analysis must differentiate these background events from events that could represent nuclear tests. Automated preprocessing of sensor data is the first step in helping analysts identify potential events of interest: when data are abundant across many sensor phenomenologies, automation serves as a filter that separates unique signatures from background noise. Scientists at Sandia are exploring tools capable of bridging the gap between current automated seismic monitoring systems and human analysis. These tools rely on advances in Bayesian statistical inference, computer simulation, and machine learning. Researchers at multiple labs, including Sandia, are adapting datasets generated by complex, interdependent physical systems to interact with these emerging algorithms in ways that inform high-consequence decisions. For this reason, physics-based knowledge, data-driven observation, simulation, and novel learning objectives all play important roles in advancing seismic monitoring algorithms.
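The first automated filtering stage is often built on classic signal-processing detectors. As a minimal illustration (the article does not specify Sandia's pipeline, so the algorithm choice, window lengths, and threshold below are assumptions), a short-term-average/long-term-average (STA/LTA) trigger flags samples where signal energy rises sharply above the background:

```python
import numpy as np

def sta_lta_trigger(trace, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Flag samples where short-term energy exceeds long-term background.

    trace     : 1-D array of seismogram samples
    fs        : sampling rate in Hz
    sta_win   : short-term averaging window (s), illustrative value
    lta_win   : long-term averaging window (s), illustrative value
    threshold : STA/LTA ratio above which a detection is declared
    """
    energy = trace.astype(float) ** 2
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    # Running means of signal energy over short and long windows
    # (edge effects from zero-padding are ignored in this sketch).
    kernel = lambda n: np.ones(n) / n
    sta = np.convolve(energy, kernel(sta_n), mode="same")
    lta = np.convolve(energy, kernel(lta_n), mode="same")
    ratio = sta / np.maximum(lta, 1e-12)  # guard against divide-by-zero
    return ratio > threshold              # boolean detection mask

# Example: a burst of signal buried in noise triggers detections.
rng = np.random.default_rng(0)
fs = 100.0
trace = rng.normal(0, 1, int(120 * fs))
trace[6000:6200] += 10 * np.sin(np.linspace(0, 40 * np.pi, 200))
mask = sta_lta_trigger(trace, fs)
print("samples flagged:", mask.sum())
```

Detections from a filter like this become the candidate events that downstream attribution and location stages, and ultimately human analysts, examine.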
One of the tools Sandia is using to improve automated processing is deep learning. After automated seismic processing is completed, additional analysis determines attributes such as an event's source, type, and size. Deep learning builds predictive models that can assign such attributes, for example whether an event arises from natural or anthropogenic seismicity, with accuracies significantly exceeding those of existing methods. These models synthesize years' worth of curated data from monitoring networks to extract predictive features. Researchers developing them must be concerned not only with high predictive accuracy but also with prediction credibility, because safe and actionable decision support in high-consequence situations requires capturing the uncertainty in deep learning predictions. By providing predictions that are both accurate and credible, deep learning will increase the amount of seismic monitoring that can be automated and reduce the burden on human expertise.
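One standard way to attach an uncertainty estimate to a deep classifier, offered here as a hedged sketch rather than as Sandia's actual method, is Monte Carlo dropout: dropout layers stay active at inference time, so repeated stochastic forward passes yield a distribution over predictions. The architecture, feature dimension, and class labels below are illustrative:

```python
import torch
import torch.nn as nn

# Hypothetical binary classifier: natural vs. anthropogenic seismicity.
# The layer sizes and 64-dimensional input features are placeholders.
model = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(128, 128), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(128, 2),
)

def predict_with_uncertainty(model, features, n_samples=100):
    """Monte Carlo dropout: keep dropout active at inference time and
    average over stochastic forward passes, yielding both a class
    probability and its spread (a simple credibility proxy)."""
    model.train()  # leaves dropout layers active
    with torch.no_grad():
        probs = torch.stack([
            torch.softmax(model(features), dim=-1)
            for _ in range(n_samples)
        ])
    return probs.mean(dim=0), probs.std(dim=0)

features = torch.randn(1, 64)  # placeholder waveform features
mean_p, std_p = predict_with_uncertainty(model, features)
print(f"P(anthropogenic) = {mean_p[0, 1].item():.2f}"
      f" +/- {std_p[0, 1].item():.2f}")
```

A prediction with a large spread can be routed to a human analyst rather than acted on automatically, which is the kind of accurate-and-credible behavior the paragraph above describes.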
Sandia is also advancing seismic monitoring capabilities by developing novel methods to precisely locate seismic events. Precise locations are critical not only for seismic monitoring but also for providing data that can improve the computer simulations used to model the propagation of seismic waves through the Earth.
To achieve this goal, researchers at Sandia are developing a novel Bayesian framework for event location that allows scientists to assess the fidelity of inferred locations with higher confidence. A statistical model of seismic waveform features, learned offline from synthetic simulations of seismic events, lets scientists integrate meaningful physical properties from the simulations into the location algorithm. Building the statistical model offline avoids the costly simulations during online monitoring that previously made this approach impractical. Leveraging physics-based waveform features will lower monitoring thresholds and increase location reliability and accuracy.
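A minimal sketch of how an offline-learned feature model can drive online Bayesian location. The surrogate's Gaussian form, the toy travel-time feature, and the two-dimensional search grid are all assumptions made for illustration:

```python
import numpy as np

def surrogate(lat, lon):
    """Hypothetical offline-trained model: predicted feature mean and
    standard deviation (e.g., a travel-time feature) at a candidate
    source location. In practice this would be fit to physics-based
    waveform simulations before monitoring begins."""
    mu = 8.0 * np.hypot(lat - 35.0, lon - 60.0)  # toy travel-time proxy
    sigma = 0.5                                   # learned feature scatter
    return mu, sigma

def log_posterior(observed, lat, lon):
    """Gaussian log-likelihood of the observed feature plus a flat prior."""
    mu, sigma = surrogate(lat, lon)
    return -0.5 * ((observed - mu) / sigma) ** 2 - np.log(sigma)

# Evaluate the posterior on a coarse grid of candidate epicenters.
# A single feature constrains only a ring around the true source;
# real systems combine many features and stations to pin it down.
lats = np.linspace(34.0, 36.0, 101)
lons = np.linspace(59.0, 61.0, 101)
LAT, LON = np.meshgrid(lats, lons, indexing="ij")
logp = log_posterior(observed=4.0, lat=LAT, lon=LON)
post = np.exp(logp - logp.max())
post /= post.sum()

best = np.unravel_index(post.argmax(), post.shape)
print("MAP epicenter:", lats[best[0]], lons[best[1]])
```

Because the expensive physics lives entirely inside the offline-trained surrogate, each online posterior evaluation is cheap, which is what makes the approach practical during monitoring.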
Locating seismic events with Bayesian methods and learning the feature model from seismic waveform simulations are both computationally intensive, but both are highly parallelizable on high-performance computing (HPC) resources. The waveform synthetics are expensive to compute, demanding detailed physics models to attain the resolution and realism needed to replicate real seismic events. Researchers therefore rely on GPUs in combination with HPC to build predictive models efficiently and at lower cost. Likewise, data-driven models require GPU and HPC resources for efficient and practical development. Widespread access to these resources has opened new doors for seismic monitoring algorithm development.
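The parallelism is straightforward because each synthetic run or candidate-location evaluation is independent, so the work maps naturally onto many CPU cores or, with a GPU array library, onto large batches. A minimal CPU-parallel sketch, with a toy stand-in for the expensive physics or likelihood computation:

```python
from multiprocessing import Pool
import numpy as np

def expensive_score(params):
    """Stand-in for a costly likelihood or simulation evaluation."""
    lat, lon = params
    return -0.5 * (np.hypot(lat - 35.0, lon - 60.0) / 0.5) ** 2

if __name__ == "__main__":
    # Each candidate is independent: an embarrassingly parallel workload.
    candidates = [(lat, lon)
                  for lat in np.linspace(34.0, 36.0, 200)
                  for lon in np.linspace(59.0, 61.0, 200)]
    with Pool(processes=8) as pool:  # scale workers to the node
        scores = pool.map(expensive_score, candidates)
    print("best candidate:", candidates[int(np.argmax(scores))])
```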
Beyond improving automated seismic pipelines, Sandia is also using HPC to design better sensor networks through Bayesian optimal experimental design. Within this framework, the positions and types of seismic sensors are optimized to maximize the expected information gained about possible events of interest. The performance of each potential network configuration is therefore tested against many hypothetical events using Monte Carlo sampling, and HPC is crucial for accelerating that sampling. By optimizing the sensor configuration, seismic monitoring networks can be designed to improve detection accuracy and reduce location uncertainty, particularly for small seismic events.
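A minimal sketch of the nested Monte Carlo estimator commonly used for expected information gain (EIG), with an assumed uniform event prior, a constant-velocity travel-time forward model, and Gaussian pick noise, none of which are taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
VEL, NOISE = 6.0, 0.2  # wave speed (km/s) and pick noise (s), assumed

def forward(events, sensors):
    """Predicted arrival times: source-sensor distance over velocity."""
    d = np.linalg.norm(events[:, None, :] - sensors[None, :, :], axis=-1)
    return d / VEL

def log_lik(obs, pred):
    """Gaussian log-likelihood of observed picks, summed over sensors."""
    return -0.5 * np.sum(((obs - pred) / NOISE) ** 2, axis=-1)

def expected_information_gain(sensors, n_outer=500, n_inner=500):
    prior = lambda n: rng.uniform(0.0, 100.0, size=(n, 2))  # epicenters, km
    theta = prior(n_outer)
    obs = forward(theta, sensors) + rng.normal(0, NOISE,
                                               (n_outer, len(sensors)))
    # log p(y | theta, d) for the generating event ...
    ll = log_lik(obs, forward(theta, sensors))
    # ... minus log p(y | d), marginalized over fresh prior samples.
    pred = forward(prior(n_inner), sensors)                  # (n_inner, m)
    ll_inner = log_lik(obs[:, None, :], pred[None, :, :])    # (n_outer, n_inner)
    log_evidence = np.logaddexp.reduce(ll_inner, axis=1) - np.log(n_inner)
    return np.mean(ll - log_evidence)

# Compare two candidate three-sensor networks by their EIG.
net_a = np.array([[10.0, 10.0], [50.0, 90.0], [90.0, 10.0]])
net_b = np.array([[48.0, 50.0], [50.0, 50.0], [52.0, 50.0]])
print("spread-out net EIG:", expected_information_gain(net_a))
print("clustered  net EIG:", expected_information_gain(net_b))
```

Comparing a spread-out layout against a tightly clustered one shows why geometry matters: the configuration whose hypothetical data most sharpen the posterior over event locations earns the higher score, and in a real design study the optimizer searches over many such configurations, which is what makes HPC essential.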
Sandia researchers hope to continue developing ideas that contribute directly to the ability to detect and characterize low-yield underground nuclear explosions. The advances they make to U.S. monitoring capabilities may also provide far-reaching solutions to challenges shared across other mission spaces.