Sandia researcher Pete Bosler aims to improve the fidelity of complex computer simulations by guiding them with very fine-scale examinations of real-world data.
His proposal’s information-packed title, “High performance adaptive multiscale simulation with data-driven scale-selective subgrid parameterizations,” refers to multiscale simulations that could integrate everything from individual raindrops to supercell thunderstorms to the entire global atmosphere, guided by data currently thought too fine to be used: data too small to be seen on a simulation’s grid, or, in other words, currently subgrid.
The proposal earned Pete a 2022 DOE Early Career Research Award.
“Peter’s research plan lays out a highly innovative approach to achieving more accuracy from our simulations of complex domains like climate and plasmas,” Sandia manager Andy Salinger said. “As our computing resources for these mission application areas increase, there is a shift in the boundary between those phenomena we can resolve with our highest-fidelity models and those too fine-scaled, which need to be represented with heuristic models derived from experience. Peter will develop new algorithms that will intelligently choose which heuristic models are most appropriate to capture the unresolved processes.”
After Pete graduated from the U.S. Naval Academy in 2002 with a bachelor’s degree in oceanography, he served as a combat line officer in anti-submarine warfare at sea and as an anti-terrorism officer protecting ships during port visits. He then spent two years of shore duty as a meteorology and oceanography staff corps officer.
He loved the sea, and the work was patriotic, but what fascinated him throughout his service was the Navy’s widespread use of computational models. “I was looking at end-product use, of course,” he said. “We would tune the sensors on our ship to the environment predicted by our computational ocean models, for example, and for our weather forecast we relied heavily on atmospheric models.”
He found the models “fantastic, but not perfect.” Even after the considerable evolution of the last several decades, “nature can still throw surprises at you where the models do not work well. Usually, these situations are outliers, rare occurrences, but they may be extreme events where you’d need the models most,” Pete said. “I wanted to learn more about the methods behind these models and improve their performance, so I chose to study applied and interdisciplinary mathematics (at the University of Michigan) when I left the Navy.”
His work, which to date has led to nine published papers, 12 invited presentations and many student interactions, shares a common technique: representing fluid flow with equations that move with the fluid as it evolves, rather than observing it from a fixed point. “Consider a drifting buoy or free-floating balloon that samples the portion of the ocean or atmosphere as it moves along with it,” he said.
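For readers who want a concrete picture, the short Python sketch below advects a few such Lagrangian particles by letting each one move with the local flow. The velocity field is an invented toy, not anything from Pete’s actual codes; the point is only the contrast with a fixed observation point.

    import numpy as np

    def velocity(points, t):
        # Hypothetical 2-D velocity field standing in for an ocean or atmosphere flow.
        u = -points[:, 1] + 0.3 * np.sin(t)   # x-component of the flow
        v = points[:, 0]                      # y-component of the flow
        return np.column_stack([u, v])

    def advect(particles, t, dt):
        # Move each Lagrangian particle with the local flow (simple forward Euler step).
        # Unlike a fixed Eulerian grid point, each particle follows the fluid and samples
        # whatever portion of the flow it is carried into, like a drifting buoy or balloon.
        return particles + dt * velocity(particles, t)

    # Ten particles seeded on a circle, carried along for 100 small time steps.
    theta = np.linspace(0.0, 2.0 * np.pi, 10, endpoint=False)
    particles = np.column_stack([np.cos(theta), np.sin(theta)])
    t, dt = 0.0, 0.01
    for _ in range(100):
        particles = advect(particles, t, dt)
        t += dt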
In this complex flow, the computational elements, or particles, develop features both larger and smaller than their original configuration as the flow carries them closer together or farther apart. This can leave some regions of a simulation relatively devoid of particles and others with too many. “The accuracy of the simulation will rapidly degrade when this occurs,” said Pete. “So, we use a technique called adaptivity to fill in sparse regions of the flow with particles and remove them from regions where they may accumulate.”
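The adaptivity criterion itself is part of the research; purely as an illustration, a one-dimensional toy that assumes a simple target spacing (rather than whatever criteria Pete’s algorithms will use) might insert and remove particles based on how far neighbors have drifted apart:

    import numpy as np

    def adapt(positions, values, h_target):
        # Toy 1-D adaptivity: keep the spacing between neighboring particles near h_target.
        # Where neighbors have spread more than twice the target apart, insert a midpoint
        # particle (filling a sparse region); where they have bunched closer than half the
        # target, drop a particle (thinning a region where particles accumulate).
        new_pos, new_val = [positions[0]], [values[0]]
        i = 1
        while i < len(positions):
            gap = positions[i] - new_pos[-1]
            if gap > 2.0 * h_target:
                new_pos.append(new_pos[-1] + 0.5 * gap)          # insert a midpoint particle
                new_val.append(0.5 * (new_val[-1] + values[i]))  # interpolate its value
            elif gap < 0.5 * h_target and i < len(positions) - 1:
                i += 1                                           # skip a crowded particle
                continue
            new_pos.append(positions[i])
            new_val.append(values[i])
            i += 1
        return np.array(new_pos), np.array(new_val)

    # Irregular particle positions sampling sin(2*pi*x), rebalanced toward a spacing of 0.05.
    x = np.sort(np.random.default_rng(0).uniform(0.0, 1.0, 20))
    x_new, f_new = adapt(x, np.sin(2.0 * np.pi * x), h_target=0.05)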
To do this without sacrificing accuracy, Pete proposes to combine these purely mathematical efforts with data sets developed from real-world observation. These data help the mathematical methods remain on target, until the data themselves go subgrid: too small to see or, more technically, finer than the simulation can resolve.
Pete intends to circumvent this limit by linking adaptive algorithms to a data analysis technique called dynamic mode decomposition to remap data more finely.
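Dynamic mode decomposition is a standard, published technique, and in its usual SVD-based form it fits in a few lines. The Python sketch below is that textbook version; it says nothing about how Pete will actually couple it to his adaptive algorithms, which is the open research question of the proposal.

    import numpy as np

    def dmd(snapshots, rank):
        # Standard ("exact") dynamic mode decomposition.
        # snapshots: array of shape (n_space, n_time) whose columns are successive
        # observations of the system, e.g. a field sampled at many points over time.
        X, Y = snapshots[:, :-1], snapshots[:, 1:]            # pairs of consecutive snapshots
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        U, s, V = U[:, :rank], s[:rank], Vh[:rank].conj().T   # truncate to a low rank
        # Best-fit low-rank linear operator mapping each snapshot to the next one.
        A_tilde = U.conj().T @ Y @ V @ np.diag(1.0 / s)
        eigvals, W = np.linalg.eig(A_tilde)                   # per-mode time dynamics
        modes = Y @ V @ np.diag(1.0 / s) @ W                  # spatial DMD modes
        return eigvals, modes

Even in this bare-bones form, the appeal Pete describes is visible: the decomposition needs only the data snapshots, not the governing equations, and the heavy lifting is a single singular value decomposition and a small eigenvalue problem.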
The adaptive algorithms use partial differential equations tied to physical laws, an approach that is attractive but expensive when many calculations are needed.
“PDEs are based on physical laws, so they’re attractive because they are solidly rooted in theory, but computing their solution can be very expensive,” Pete said. “Data-driven methods like dynamic mode decomposition don’t necessarily need a connection to physics, but with new graphics processing unit computing and machine learning infrastructure, they can be very fast and inexpensive.
“Lots of people are trying to figure out how to blend the strengths of these two ideas, using physics to inform machine learning algorithms,” Pete said. “It’s hard. Maybe this connection between adaptive criteria and the dynamic mode decomposition algorithm will open new pathways.”