New Sandia software reduces false, missed detections of seismic activity
An active volcano in Antarctica helped researchers at Sandia improve sensor data readings to better detect earthquakes and explosions while tuning out everyday vibrations such as traffic and footsteps.
Finding the ideal settings for each sensor in a network to detect vibrations in the ground, or seismic activity, can be a painstaking, manual process. Sandia researchers are working to change that with software that automatically adjusts the seismic activity detection levels for each sensor.
Sandia tested the new software on seismic data from the Mt. Erebus volcano in Antarctica and cut false detections by 18 percent and missed detections by 11 percent compared with the original performance of the sensors on Mt. Erebus.
Until now, the main way to ensure sensors were picking up unusual seismic activity and not reporting regular activity was to manually adjust the settings of each sensor to its specific surroundings. Unfortunately, getting those settings exactly right is difficult, especially because those ideal settings change with the seasons and weather patterns.
During a three-year project funded by Laboratory Directed Research and Development, researchers developed software that uses a “majority rules” approach to automatically adjust the detection settings for the data coming from each sensor in a network, yielding fewer false detections of seismic activity and fewer missed detections of actual events. The work was recently published in the Bulletin of the Seismological Society of America paper “Dynamic Tuning of Seismic Signal Detector Trigger Levels for Local Networks,” and the open-source, Python-based software is available for download.
‘Polling the neighborhood’
The research team, led by Tim Draelos, a machine learning and signal processing researcher, developed an algorithm that reads the data from a neighborhood of sensors and compares the detections made by each sensor. If a majority of sensors in a similar location detects seismic activity at the same time, then the program marks the event as legitimate. If most of the sensors do not detect seismic activity, then the program doesn’t mark the event and the detection levels for the sensors that falsely reported an event are adjusted.
“A neighborhood is a small subset of sensors in a network that all have a similar view of the world or a similar sensing footprint,” Tim said. “They should agree on everything they see. If they don’t, we’re able to determine which sensor needs to be tuned so that we get better agreement in the future, which leads to better overall network detection quality. We don’t ever want to miss an event like a nuclear explosion, for example.”
This “majority rules” approach to processing seismic sensor data runs automatically, continuously adjusting the trigger levels that flag a seismic event, so the network's readings are more accurate than those from static sensors with fixed settings.
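The mechanics are spelled out in the paper and the released Python package; the sketch below only illustrates the idea. The Sensor class, the fixed tuning step and the strict-majority vote are assumptions made for brevity, not the published implementation.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    """One station in a neighborhood, with a tunable trigger level."""
    name: str
    trigger_level: float  # detection threshold on the signal measure
    step: float = 0.05    # illustrative retuning increment

    def detects(self, amplitude: float) -> bool:
        return amplitude >= self.trigger_level

def majority_rules(sensors: list[Sensor], amplitudes: dict[str, float]) -> bool:
    """Vote on one time window and retune disagreeing sensors.

    amplitudes maps sensor name -> signal level in the window.
    Returns True if the neighborhood declares a legitimate event.
    """
    votes = {s.name: s.detects(amplitudes[s.name]) for s in sensors}
    is_event = sum(votes.values()) > len(sensors) / 2  # strict majority

    for s in sensors:
        if is_event and not votes[s.name]:
            s.trigger_level -= s.step  # missed a real event: be more sensitive
        elif not is_event and votes[s.name]:
            s.trigger_level += s.step  # false alarm: be less sensitive
    return is_event
```

Lowering a threshold after a missed event and raising it after a false alarm is what lets the network keep itself tuned as conditions shift with the seasons and the weather.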
Tim and the team, including Hunter Knox, Matt Peterson and Chris Young, tested the algorithm on the Mt. Erebus seismic sensor network. They created a database of seismic events on the volcano by manually reviewing all the sensor activity recorded over 24 hours and marking the seismic events. To be classified as an event, three or more sensors in the same neighborhood had to detect the seismic activity.
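As a rough Python sketch of that labeling rule: the three-sensor minimum comes from the team's criterion above, while the grouping window and the (sensor, time) input format are assumptions for illustration.

```python
def label_events(detections, min_sensors=3, window_s=5.0):
    """Group raw detections into ground-truth events.

    detections: iterable of (sensor_name, time_in_seconds) pairs.
    A group counts as an event only if at least `min_sensors`
    distinct sensors in the neighborhood fired within `window_s`.
    """
    ordered = sorted(detections, key=lambda d: d[1])
    events, current = [], []
    for name, t in ordered:
        if current and t - current[0][1] > window_s:
            if len({n for n, _ in current}) >= min_sensors:
                events.append(current)
            current = []
        current.append((name, t))
    if len({n for n, _ in current}) >= min_sensors:
        events.append(current)
    return events
```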
The team then ran the raw sensor data through the new majority-rules algorithm and compared its output against both the database of legitimate detections and the results from the same sensors operating without the algorithm's dynamic tuning.
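That comparison boils down to matching detection times against the hand-reviewed catalog. Here is a minimal scoring sketch, assuming a simple time tolerance for declaring a match (the two-second value is invented for illustration):

```python
def score(detected_times, truth_times, tol_s=2.0):
    """Tally hits, missed detections and false detections.

    detected_times: event times reported by the detector.
    truth_times: event times from the hand-reviewed catalog.
    A detection within `tol_s` seconds of an unmatched catalog
    event counts as a hit; anything else is a false detection.
    """
    truth = sorted(truth_times)
    matched, false_alarms = set(), 0
    for t in sorted(detected_times):
        hit = next((i for i, g in enumerate(truth)
                    if i not in matched and abs(t - g) <= tol_s), None)
        if hit is None:
            false_alarms += 1
        else:
            matched.add(hit)
    return {"hits": len(matched),
            "missed": len(truth) - len(matched),
            "false": false_alarms}
```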
The improvements in detection accuracy matter because sensor networks generate enormous amounts of data. For example, the International Data Center's analyst-reviewed bulletin for 2014 included only 8 percent of the more than 5.5 million seismic detections originally registered by International Monitoring System sensors. This worldwide network helps verify compliance with the Comprehensive Nuclear-Test-Ban Treaty, which the United States has signed but not ratified, by detecting events that might show the treaty has been violated.
“A large portion, but not all, of the remaining 92 percent of detections were likely false positives, which leads to extraneous data storage and processing,” Tim said. “Additionally, 39 percent of the detections included in the bulletin were found or modified by a human analyst, which indicates a large percentage of missed detections and wrongly measured detections by the sensors, which takes time and effort to amend.”
Some dynamic signal detectors exist, but until now none has used a network of sensors to optimize detection of seismic events. The new tuning approach could also be applied to environmental monitoring, camera-based motion sensing, chemical monitoring, infrasound monitoring and more.
“This is a general-purpose idea,” Tim said. “It doesn’t have to be seismic data. This algorithm can potentially be used anywhere you have a network or collection of sensors to detect events.”