Publications

60 Results

A Summary of Validation Studies for the Integrated TIGER Series Performed on ACORN Plus-up 218468/99

Davis, Rowdy D.; Kensek, Ronald P.; Olson, Aaron J.; Perfetti, Christopher

The Integrated TIGER Series (ITS) transport code is a valuable tool for photon-electron transport. A seven-problem validation suite exists to verify that the ITS transport code works as intended. It is important to ensure that data from benchmark problems are correctly compared to simulated data. Additionally, the validation suite did not previously use a consistent quantitative metric for comparing experimental and simulated datasets. To this end, the goal of this long-term project was to expand the validation suite both in problem type and in the quality of the error assessment. To accomplish that, the seven validation problems in the suite were examined for potential drawbacks. When a drawback was identified, the problems were ranked based on the severity of the drawback and the approachability of a solution. We determined that meaningful improvements could be made to the validation suite by improving the analysis for the Lockwood Albedo problem and by introducing the Ross dataset as an eighth problem in the suite. The Lockwood error analysis has been completed and will be integrated in the future. Analysis of the Ross data is unfinished, but significant progress has been made.
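As an illustration of the kind of consistent quantitative metric referred to above, a reduced chi-square figure of merit is one common way to compare experimental and simulated datasets that both carry uncertainties. The abstract does not state which metric was adopted, so the sketch below is purely illustrative and all names are invented.

```python
import numpy as np

def reduced_chi_square(experiment, exp_sigma, simulation, sim_sigma):
    """Reduced chi-square between experimental and simulated datasets.

    Values near 1 indicate agreement consistent with the combined
    (experimental + simulated) uncertainties.
    """
    experiment = np.asarray(experiment, dtype=float)
    simulation = np.asarray(simulation, dtype=float)
    combined_var = np.asarray(exp_sigma, dtype=float) ** 2 + np.asarray(sim_sigma, dtype=float) ** 2
    chi2 = np.sum((experiment - simulation) ** 2 / combined_var)
    return chi2 / experiment.size  # assumes no fitted parameters
```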


An X-ray Intensity Operations Monitor (AXIOM) (Final LDRD Project Report)

Ulmen, Benjamin A.; Webb, Timothy J.; Radtke, Gregg A.; Olson, Aaron J.; Depriest, Kendall D.; Coffey, Sean K.; Looker, Quinn M.; Gao, Xujiao G.; Nicholas, Ryder N.; Edwards, Jarrod D.; McCourt, Andrew L.; Bell, Kate S.

The Saturn accelerator has historically lacked the capability to measure time-resolved spectra for its 3-ring bremsstrahlung x-ray source. This project aimed to create a spectrometer called AXIOM to provide this capability. The project had three major development pillars: hardware, simulation, and unfold code. The hardware consists of a ring of 24 detectors around an existing x-ray pinhole camera. The diagnostic was fielded on two shots at Saturn and over 100 shots at the TriMeV accelerator at the Idaho Accelerator Center. A new Saturn x-ray environment simulation was created and validated against measured data. This simulation allows time-resolved spectra to be computed for comparison with the experimental results. The AXIOM-Unfold code is a new parametric unfold code using modern global optimizers and uncertainty quantification. The code was written in Python, uses GitLab version control and issue tracking, and has been developed with long-term code support and maintenance in mind.
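The abstract does not detail AXIOM-Unfold's internals beyond "parametric unfold with modern global optimizers." As a rough sketch of what such an approach can look like in Python, the example below fits a toy two-parameter spectral shape to detector signals with SciPy's differential evolution; the response matrix, spectral shape, and parameter bounds are all invented for illustration and are not taken from the AXIOM project.

```python
import numpy as np
from scipy.optimize import differential_evolution

def parametric_unfold(response, signals, energies):
    """Fit a toy two-parameter spectrum so that response @ spectrum matches signals.

    response : (n_detectors, n_energy_bins) response matrix (hypothetical)
    signals  : measured detector signals
    energies : energy-bin centers
    """

    def spectrum(params):
        amplitude, kt = params
        return amplitude * np.exp(-energies / kt)  # toy spectral shape, not AXIOM's

    def misfit(params):
        predicted = response @ spectrum(params)
        return np.sum((predicted - signals) ** 2)

    bounds = [(0.0, 1e3), (10.0, 2000.0)]  # illustrative amplitude and "temperature" (keV)
    result = differential_evolution(misfit, bounds, seed=0, polish=True)
    return result.x, spectrum(result.x)
```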


Saturn Radiation Dose Environment Characterization

Ulmen, Benjamin A.; Depriest, Kendall D.; Olson, Aaron J.; Webb, Timothy J.; Edwards, Jarrod D.

To understand the environment where a time-resolved hard x-ray spectrometer (AXIOM) might be fielded, experiments and simulations were performed to analyze the radiation dose environment underneath the Saturn vacuum dome. Knowledge of this environment is critical to the design and placement of the spectrometer. Experiments demonstrated that the machine performance, at least in terms of on-axis dose, has not significantly changed over the decades. Simulations of the off-axis dose were performed to identify possible spectrometer locations of interest. The effects of the source and dome hardware, as well as of source distributions and angles of incidence, on the radiation environment were also investigated. Finally, a unified radiation transport model was developed for two widely used radiation transport codes to investigate the off-axis dose profiles and the time-dependent x-ray energy spectrum. The demonstrated equivalence of the unified model between the two codes allows the team to tie future time-dependent x-ray environment calculations to previous integral simulations for the Saturn facility.


Fast three-dimensional rules-based simulation of thermal-sprayed microstructures

Computational Materials Science

Rodgers, Theron R.; Mitchell, John A.; Olson, Aaron J.; Bolintineanu, Dan S.; Vackel, Andrew V.; Moore, Nathan W.

Thermal spray processes involve the repeated impact of millions of discrete particles, whose melting, deformation, and coating-formation dynamics occur at microsecond timescales. The accumulated coating that evolves over minutes comprises complex, multiphase microstructures, and the timescale difference between individual particle solidification and overall coating formation represents a significant challenge for analysts attempting to simulate microstructure evolution. To overcome this computational burden, researchers have created rules-based models (similar to cellular automata methods) that do not directly simulate the physics of the process. Instead, the simulation is governed by a set of predefined rules, which do not capture the fine details of the evolution but do provide a useful approximation for the simulation of coating microstructures. Here, we introduce a new rules-based process model for microstructure formation during thermal spray processes. The model is 3D, allows for an arbitrary number of material types, and includes multiple porosity-generation mechanisms. Example results of the model for tantalum coatings are presented along with sensitivity analyses of model parameters and validation against 3D experimental data. The model's computational efficiency allows for investigations into the stochastic variation of coating microstructures, in addition to the typical process-to-structure relationships.
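To make the rules-based idea concrete, the toy sketch below deposits splats onto a 3D voxel grid using a few simple rules: each splat lands on the local surface and occasionally traps a pore voxel. It is a minimal illustration of the general approach, not the model described in the paper; all parameters and rules are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def deposit_splats(nx=64, ny=64, nz=128, n_splats=2000,
                   splat_radius=4, pore_prob=0.02):
    """Toy rules-based deposition: 0 = empty, 1 = coating, 2 = pore."""
    grid = np.zeros((nx, ny, nz), dtype=np.uint8)
    height = np.zeros((nx, ny), dtype=int)          # current surface height per column

    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    for _ in range(n_splats):
        cx, cy = rng.integers(0, nx), rng.integers(0, ny)
        mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= splat_radius ** 2
        z = height[mask].max()                       # rule: splat lands on the local surface
        if z >= nz:
            break
        # rule: each covered column gains one voxel, occasionally a trapped pore
        pore = rng.random(mask.sum()) < pore_prob
        grid[xs[mask], ys[mask], z] = np.where(pore, 2, 1)
        height[mask] = z + 1
    return grid
```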


An extension of conditional point sampling to quantify uncertainty due to material mixing randomness

International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering, M and C 2019

Vu, Emily V.; Olson, Aaron J.

Radiation transport in stochastic media is a problem found in a multitude of applications, and the need for tools capable of thoroughly modeling this type of problem remains. A collection of approximate methods has been developed to produce accurate mean results, but methods capable of quantifying the spread of results caused by the randomness of material mixing are still needed. In this work, the new stochastic-media transport algorithm Conditional Point Sampling is extended using Embedded Variance Deconvolution so that it can compute the variance caused by material mixing. The accuracy of this approach is assessed for 1D, binary, Markovian-mixed media by comparing results to published benchmark values, and the behavior of the method is numerically studied as a function of user parameters. We demonstrate that this extension of Conditional Point Sampling is able to compute the variance caused by material mixing with an accuracy that depends on the accuracy of the conditional probability function used.
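The general idea of using variance deconvolution to isolate material-mixing variance can be sketched as follows: the observed spread of per-realization tally means contains both mixing variance and Monte Carlo noise, and subtracting the average statistical variance leaves an estimate of the mixing term. The snippet below illustrates that decomposition in generic form; it is not the paper's Embedded Variance Deconvolution implementation, and all names are invented.

```python
import numpy as np

def mixing_variance(realization_means, realization_stat_vars):
    """Estimate variance due to material mixing by deconvolving statistical noise.

    realization_means     : per-realization tally means Q_r
    realization_stat_vars : per-realization variances of those means (MC noise)
    """
    q = np.asarray(realization_means, dtype=float)
    s2 = np.asarray(realization_stat_vars, dtype=float)
    total_var = q.var(ddof=1)        # spread across realizations (mixing + noise)
    return total_var - s2.mean()     # material-mixing contribution (noisy estimate)
```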


An extension of conditional point sampling to multi-dimensional transport

International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering, M and C 2019

Olson, Aaron J.; Vu, Emily V.

Radiation transport in stochastic media is a challenging problem type relevant to applications such as meteorological modeling, heterogeneous radiation shields, BWR coolant, and pebble-bed reactor fuel. A commonly cited challenge for methods performing transport in stochastic media is to be simultaneously accurate and efficient. Conditional Point Sampling (CoPS), a new method for transport in stochastic media, was recently shown to have accuracy comparable to the most accurate approximate methods for a common 1D benchmark set. In this paper, we use a pseudo-interface-based approach to extend CoPS to multi-D Markovian-mixed media, compare its accuracy with published results for other approximate methods, and examine its accuracy and efficiency as a function of user options. CoPS is found to be the most accurate of the compared methods on the examined benchmark suite for transmittance and comparable in accuracy to the most accurate methods for reflectance and internal flux. Numerical studies examine accuracy and efficiency as a function of user parameters, providing insight for effective parameter selection and further method development. Since the authors did not implement any of the other approximate methods, a valid efficiency comparison with those methods is not yet possible.
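As background for how conditional sampling works in the simplest setting, the sketch below samples the material at a new point conditioned on a single known point in a 1D binary Markovian mixture, using the standard two-point result P(same material) = p_i + (1 - p_i) exp(-d / lambda_c). The paper's multi-D pseudo-interface treatment is considerably more involved; this example and its parameter names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_material(known_material, distance, chord_lengths=(1.0, 0.5)):
    """Sample the material at a point a given distance from one known point.

    Uses 1D binary Markovian-mixture statistics: volume fraction
    p_i = lam_i / (lam_0 + lam_1) and correlation length
    1/lambda_c = 1/lam_0 + 1/lam_1.
    """
    lam0, lam1 = chord_lengths
    lam_c = 1.0 / (1.0 / lam0 + 1.0 / lam1)
    p_known = (lam0 if known_material == 0 else lam1) / (lam0 + lam1)
    p_same = p_known + (1.0 - p_known) * np.exp(-distance / lam_c)
    return known_material if rng.random() < p_same else 1 - known_material
```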


Radiation Transport in Random Media with Large Fluctuations

EPJ Web of Conferences

Olson, Aaron J.; Prinja, Anil; Franke, Brian C.

Neutral particle transport in media exhibiting large and complex spatial variation of material properties is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process whose covariance is uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux, as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that the use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
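A minimal sketch of the cross-section generation step described above (Gaussian process with exponential covariance, truncated Karhunen-Loève expansion, lognormal transform) is given below, using a discrete eigendecomposition of the covariance matrix on a spatial grid. Function and parameter names are invented, and the study's actual covariance model and discretization may differ.

```python
import numpy as np

def lognormal_cross_section_realization(x, mean_sigma=1.0, cv=0.5,
                                        corr_length=0.2, n_modes=20, rng=None):
    """Generate one realization of a lognormal random cross section on grid x."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x, dtype=float)
    n_modes = min(n_modes, x.size)

    # Underlying Gaussian parameters giving the requested lognormal mean and CV.
    var_g = np.log(1.0 + cv ** 2)
    mu_g = np.log(mean_sigma) - 0.5 * var_g

    # Exponential covariance of the Gaussian process and its spectral decomposition.
    cov = var_g * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)
    eigvals, eigvecs = np.linalg.eigh(cov)
    idx = np.argsort(eigvals)[::-1][:n_modes]            # keep the largest modes
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]

    xi = rng.standard_normal(n_modes)                    # independent standard normals
    gaussian_field = mu_g + eigvecs @ (np.sqrt(np.maximum(eigvals, 0.0)) * xi)
    return np.exp(gaussian_field)                        # lognormal cross section
```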
