We report that the method-of-moments implementation of the electric-field integral equation (EFIE) yields many code-verification challenges due to the various sources of numerical error and their possible interactions. Matters are further complicated by singular integrals, which arise from the presence of a Green's function. To address these singular integrals, an approach has been presented wherein both the solution and the Green's function are manufactured. Because the arising equations are poorly conditioned, they are reformulated as a set of constraints for an optimization problem that selects the solution closest to the manufactured solution. In this paper, we demonstrate how, for such practically singular systems of equations, computing the truncation error by inserting the exact solution into the discretized equations cannot detect certain orders of coding errors. On the other hand, the discretization error from the optimal solution is a more sensitive metric that can detect coding errors of orders lower than the expected convergence rate.
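As a rough sketch of that constrained reformulation (a toy stand-in, not the paper's implementation), the optimal solution can be obtained from the KKT system of the least-distance problem, and the discretization error then follows directly:

```python
import numpy as np

# Minimal sketch: given a nearly singular discretized system Z x = v and a
# manufactured solution x_ms, select the solution closest to x_ms by solving
#   min ||x - x_ms||^2  subject to  Z x = v
# through its KKT system [I, Z^T; Z, 0] [x; lam] = [x_ms; v].
def closest_constrained_solution(Z, v, x_ms):
    m, n = Z.shape
    K = np.block([[np.eye(n), Z.T],
                  [Z, np.zeros((m, m))]])
    rhs = np.concatenate([x_ms, v])
    sol, *_ = np.linalg.lstsq(K, rhs, rcond=None)  # lstsq tolerates near-singularity
    return sol[:n]

# The discretization error ||x - x_ms|| is the convergence metric:
# x = closest_constrained_solution(Z, v, x_ms)
# disc_err = np.linalg.norm(x - x_ms)
```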
The Multi-Fidelity Toolkit (MFTK) is a simulation tool being developed at Sandia National Laboratories for aerodynamic predictions of compressible flows over a range of physics fidelities and computational speeds. These models include the Reynolds-averaged Navier–Stokes (RANS) equations, the Euler equations, and the modified Newtonian aerodynamics (MNA) equations, and they can be invoked independently or coupled with hierarchical Kriging to interpolate between high-fidelity simulations using lower-fidelity data. However, as with any new simulation capability, verification and validation are necessary to gather credibility evidence. This work describes formal code- and solution-verification activities as well as model validation with uncertainty considerations. Code verification is performed on the MNA model by comparing with an analytical solution for flat-plate and inclined-plate geometries. Solution-verification activities include grid-refinement studies, for all model fidelities, of simulations of the HIFiRE-1 wind-tunnel experiments, whose measurements are used for validation. A thorough treatment of the validation comparison with prediction error and validation uncertainty is also presented.
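The flat- and inclined-plate comparison has a closed-form target: modified Newtonian theory gives Cp = Cp_max sin²θ for local surface inclination θ, which for a plate is simply the plate angle. The sketch below evaluates this standard formula (the specific Mach number and angle are illustrative, not MFTK values):

```python
import numpy as np

# Cp_max is the stagnation pressure coefficient behind a normal shock,
# from the Rayleigh pitot formula (standard compressible-flow theory).
def cp_max(M, gamma=1.4):
    A = ((gamma + 1)**2 * M**2 / (4*gamma*M**2 - 2*(gamma - 1)))**(gamma/(gamma - 1))
    B = (1 - gamma + 2*gamma*M**2) / (gamma + 1)
    return 2.0 / (gamma * M**2) * (A * B - 1.0)

# Modified Newtonian pressure coefficient for local inclination theta
def cp_mna(theta, M, gamma=1.4):
    return cp_max(M, gamma) * np.sin(theta)**2

print(cp_mna(np.deg2rad(10.0), M=7.0))  # windward Cp on a 10-degree plate
```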
The Multi-Fidelity Toolkit (MFTK) is a simulation tool being developed at Sandia National Laboratories for aerodynamic predictions of compressible flows over a range of physics fidelities and computational speeds. These models include the Reynolds-averaged Navier–Stokes (RANS) equations, the Euler equations, and modified Newtonian aerodynamics (MNA) equations, and they can be invoked independently or coupled with hierarchical Kriging to interpolate between high-fidelity simulations using lower-fidelity data. However, as with any new simulation capability, verification and validation are necessary to gather credibility evidence. This work describes formal model validation with uncertainty considerations that leverages experimental data from the HIFiRE-1 wind tunnel tests. The geometry is a multi-conic shape that produces complex flow phenomena under hypersonic conditions. A thorough treatment of the validation comparison with prediction error and validation uncertainty is also presented.
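As a hedged illustration of this kind of validation comparison (a generic V&V 20-style combination, not necessarily the report's exact formulation), the prediction error and validation uncertainty can be assembled as follows; all numbers are placeholders:

```python
import math

# Comparison error E = S - D between simulation S and experimental data D,
# with validation uncertainty combining numerical, input, and experimental
# contributions in quadrature.
def validation_comparison(S, D, u_num, u_input, u_D):
    E = S - D
    u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)
    return E, u_val

E, u_val = validation_comparison(S=1.05, D=1.00, u_num=0.02, u_input=0.03, u_D=0.04)
# If |E| is large relative to u_val, the discrepancy points to model-form error.
print(f"E = {E:.3f}, u_val = {u_val:.3f}")
```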
Though the method-of-moments implementation of the electric-field integral equation plays an important role in computational electromagnetics, it provides many code-verification challenges due to the different sources of numerical error. In this paper, we provide an approach through which we can apply the method of manufactured solutions to isolate and verify the solution-discretization error. We accomplish this by manufacturing both the surface current and the Green's function. Because the arising equations are poorly conditioned, we reformulate them as a set of constraints for an optimization problem that selects the solution closest to the manufactured solution. We demonstrate the effectiveness of this approach for cases with and without coding errors.
Though the method-of-moments implementation of the electric-field integral equation plays an important role in computational electromagnetics, it provides many code-verification challenges due to the different sources of numerical error and their possible interactions. Matters are further complicated by singular integrals, which arise from the presence of a Green's function. In this report, we document our research to address these issues, as well as its implementation and testing in Gemma.
In this paper, we characterize the logarithmic singularities arising in the method of moments from the Green's function in integrals over the test domain, and we use two approaches for designing geometrically symmetric quadrature rules to integrate these singular integrands. These rules exhibit better convergence properties than quadrature rules for polynomials and, in general, lead to better accuracy with a lower number of quadrature points. We demonstrate their effectiveness for several examples encountered in both the scalar and vector potentials of the electric-field integral equation (singular, near-singular, and far interactions) as compared to the commonly employed polynomial scheme and the double Ma–Rokhlin–Wandzura (DMRW) rules, whose sample points are located asymmetrically within triangles.
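A one-dimensional toy version (not the paper's symmetric triangle rules) illustrates why rules built for logarithmic integrands outperform polynomial rules: fixing the nodes and solving for weights that integrate both monomials and log-weighted monomials exactly sharply reduces the error for a log-singular integrand:

```python
import numpy as np
from scipy.integrate import quad

# Rule on [0, 1] exact for {1, x, x^2, ln x, x ln x, x^2 ln x}, using the
# moments int x^k dx = 1/(k+1) and int x^k ln(x) dx = -1/(k+1)^2.
n = 6
x_gl, w_gl = np.polynomial.legendre.leggauss(n)
x = 0.5 * (x_gl + 1.0)          # Gauss-Legendre nodes mapped to [0, 1]
w_poly = 0.5 * w_gl             # standard polynomial rule on [0, 1]

V = np.vstack([x**k for k in range(3)] + [x**k * np.log(x) for k in range(3)])
m = np.array([1/(k+1) for k in range(3)] + [-1/(k+1)**2 for k in range(3)])
w_log = np.linalg.solve(V, m)   # same nodes, log-aware weights

f = lambda t: np.exp(t) * np.log(t)   # log-singular test integrand
ref, _ = quad(f, 0.0, 1.0)            # adaptive reference value
print("Gauss-Legendre error:", abs(w_poly @ f(x) - ref))
print("log-aware rule error:", abs(w_log @ f(x) - ref))
```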
The study of hypersonic flows and their underlying aerothermochemical reactions is particularly important in the design and analysis of vehicles exiting and reentering Earth's atmosphere. Computational physics codes can be employed to simulate these phenomena; however, verification of these codes is necessary to certify their credibility. To date, few approaches have been presented for verifying codes that simulate hypersonic flows, especially flows reacting in thermochemical nonequilibrium. In this paper, we present our code-verification techniques for verifying the spatial accuracy and thermochemical source term in hypersonic reacting flows in thermochemical nonequilibrium. We demonstrate the effectiveness of these techniques on the Sandia Parallel Aerodynamics and Reentry Code (SPARC).
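For flavor, a heavily simplified sketch of deriving a manufactured-solution forcing term for a reacting model problem (a toy 1-D advection equation with an Arrhenius-type source, not SPARC's governing equations) looks like:

```python
import sympy as sp

# Toy problem: u_t + a u_x = A exp(-Ta / T(x,t)) + f(x,t).
# Substitute manufactured u and T, then solve for the forcing term f.
x, t = sp.symbols("x t")
a, A, Ta = 1.5, 2.0, 3.0
u_ms = sp.sin(x - t) + 2           # manufactured solution (arbitrary smooth choice)
T_ms = 2 + sp.cos(x) * sp.exp(-t)  # manufactured temperature field
source = A * sp.exp(-Ta / T_ms)
f = sp.simplify(sp.diff(u_ms, t) + a * sp.diff(u_ms, x) - source)
print(f)  # forcing term to add to the code's right-hand side
```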
The study of heat transfer and ablation plays an important role in many problems of scientific and engineering interest. As with the computational simulation of any physical phenomenon, the first step toward establishing credibility in ablation simulations involves code verification. Code verification is typically performed using exact and manufactured solutions. However, manufactured solutions generally require the invasive introduction of an artificial forcing term within the source code such that the code solves a modified problem for which the solution is known. In this paper, we present a nonintrusive method for manufacturing solutions for a non-decomposing ablation code, which does not require the addition of a source term.
Gemma verification activities for FY20 can be divided into three categories: the development of specialized quadrature rules, initial progress toward the development of manufactured solutions for code verification, and automated code-verification testing. In the method-of-moments implementation of the electric-field integral equation, the presence of a Green's function in the four-dimensional integrals yields singularities in the integrand when two elements are nearby. To address these challenges, we have developed quadrature rules to integrate the functions through which the singularities can be characterized. Code verification is necessary to develop confidence in the implementation of the numerical methods in Gemma. Therefore, we have begun investigating the use of manufactured solutions to more thoroughly verify Gemma. Manufactured solutions provide greater flexibility for testing aspects of the code; however, the aforementioned singularities pose challenges, and existing work is limited in rigor and quantity. Finally, we have implemented automated code-verification testing using the VVTest framework, which manages the mesh refinement and execution of Gemma simulations to generate mesh-convergence data. This infrastructure computes the observed order of accuracy from these data and compares it with the theoretical order of accuracy to either develop confidence in the implementation of the numerical methods or detect coding errors.
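The order-of-accuracy comparison at the end of that workflow reduces to a one-line computation (the error values below are hypothetical):

```python
import math

# Given discretization errors from two meshes related by refinement ratio r,
# the observed order is p_obs = ln(e_coarse / e_fine) / ln(r).
def observed_order(e_coarse, e_fine, r=2.0):
    return math.log(e_coarse / e_fine) / math.log(r)

p_obs = observed_order(e_coarse=4.1e-3, e_fine=1.0e-3, r=2.0)
print(f"observed order: {p_obs:.2f}")  # ~2.04; a mismatch with the theoretical
                                       # order suggests a coding error
```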
We propose herein a probabilistic framework for assessing the consistency of an experimental dataset, i.e., whether the stated experimental conditions are consistent with the measurements provided. If the dataset is inconsistent, our framework allows one to hypothesize and test sources of the inconsistencies. This is crucial in model-validation efforts. The framework relies on Bayesian inference to estimate experimental settings deemed uncertain from measurements deemed accurate. The quality of the inferred variables is gauged by their ability to reproduce held-out experimental measurements. We test the correctness of the framework on three double-cone experiments conducted in CUBRC Inc.'s LENS-I shock tunnel, which have also been numerically simulated successfully. Thereafter, we use the framework to investigate two double-cone experiments (executed in the LENS-XX shock tunnel) that have encountered difficulties when used in model-validation exercises. We detect an inconsistency with one of the LENS-XX experiments. In addition, we hypothesize two causes for our inability to simulate the LENS-XX experiments accurately and test them using our framework. We find that there is no single cause that explains all the discrepancies between model predictions and experimental data; rather, different causes explain different discrepancies, to a larger or smaller extent. We end by proposing that uncertainty quantification methods be used more widely to understand experiments and characterize facilities, and we cite three different methods to do so, the third of which we present in this paper.
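A heavily hedged toy version of the framework's core loop (the models, prior, and numbers below are illustrative stand-ins, not the shock-tunnel simulations) infers an uncertain setting from a measurement deemed accurate and then checks a held-out measurement against the posterior predictive:

```python
import numpy as np

rng = np.random.default_rng(0)
g_fit = lambda th: 2.0 * th      # maps setting -> fitted measurement (toy model)
g_holdout = lambda th: th**2     # maps setting -> held-out measurement (toy model)
y_fit, sigma = 4.1, 0.1          # "accurate" measurement and its noise level
y_holdout = 4.2                  # held-out measurement

def log_post(th):                # Gaussian likelihood, flat prior on (0, 10)
    if not 0.0 < th < 10.0:
        return -np.inf
    return -0.5 * ((y_fit - g_fit(th)) / sigma) ** 2

# Random-walk Metropolis sampling of the posterior over the setting
th, samples = 2.0, []
for _ in range(20000):
    prop = th + 0.1 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(th):
        th = prop
    samples.append(th)
post = np.array(samples[5000:])  # discard burn-in

# Posterior predictive check on the held-out measurement
pred = g_holdout(post)
print("held-out measurement:", y_holdout)
print("posterior predictive 95% interval:", np.percentile(pred, [2.5, 97.5]))
# If y_holdout falls far outside this interval, the dataset is flagged
# as inconsistent.
```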
The study of hypersonic flows and their underlying aerothermochemical reactions is particularly important in the design and analysis of vehicles exiting and reentering Earth's atmosphere. Computational physics codes can be employed to simulate these phenomena; however, code verification is necessary to certify their credibility. To date, few approaches have been presented for verifying codes that simulate hypersonic flows, especially flows reacting in thermochemical nonequilibrium. In this paper, we present our code-verification techniques for hypersonic reacting flows in thermochemical nonequilibrium, as well as their deployment in the Sandia Parallel Aerodynamics and Reentry Code (SPARC).
This project will enable high-fidelity aerothermal simulations of hypersonic vehicles to be employed (1) to generate large databases with quantified uncertainties and (2) to perform rapid interactive simulation. The databases will increase the volume and quality of A4H data; rapid interactive simulation can enable arbitrary conditions and designs to be simulated on demand. We will achieve this by applying reduced-order-modeling techniques to aerothermal simulations.
SPARC (the Sandia Parallel Aerodynamics and Reentry Code) will provide nuclear weapon qualification evidence for the random-vibration and thermal environments created by reentry of a warhead into the Earth's atmosphere. SPARC incorporates the innovative approaches of ATDM projects on several fronts, including: effective harnessing of heterogeneous compute nodes using Kokkos, exascale-ready parallel scalability through asynchronous multi-tasking, uncertainty quantification through Sacado integration, implementation of state-of-the-art reentry physics and multiscale models, use of advanced verification and validation methods, and enabling of improved workflows for users. SPARC is being developed primarily for the Department of Energy nuclear weapon program, with additional development and use of the code being supported by the Department of Defense for conventional weapons programs.
Reddy, Sohail R.; Freno, Brian A.; Cizmas, Paul G.A.; Gokaltun, Seckin; McDaniel, Dwayne; Dulikravich, George S., Computer Methods in Applied Mechanics and Engineering
A novel approach is presented to constrain reduced-order models (ROMs) based on proper orthogonal decomposition (POD). The Karush–Kuhn–Tucker (KKT) conditions were applied to the traditional reduced-order model to constrain the solution to user-defined bounds. The constrained reduced-order model (C-ROM) was applied and validated against the analytical solution to the first-order wave equation. The C-ROM was also applied to the analysis of fluidized beds. It was shown that both the ROM and the C-ROM produced accurate results and that the C-ROM was less sensitive to error propagation through time than the ROM.
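A toy sketch of the C-ROM idea (assumed structure, not the paper's fluidized-bed formulation): build a POD basis from snapshots, then choose modal coefficients that minimize the reconstruction error while inequality constraints, enforced through their KKT conditions, keep the reconstructed solution within user-defined bounds:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))                     # snapshot matrix (toy data)
U = np.linalg.svd(X, full_matrices=False)[0][:, :3]   # first 3 POD modes

u_target = X[:, 0]     # state to represent in the POD basis
lb, ub = -2.0, 2.0     # user-defined solution bounds

obj = lambda a: 0.5 * np.sum((U @ a - u_target) ** 2)
cons = [{"type": "ineq", "fun": lambda a: (U @ a) - lb},   # U a >= lb
        {"type": "ineq", "fun": lambda a: ub - (U @ a)}]   # U a <= ub
res = minimize(obj, np.zeros(3), constraints=cons)         # SLSQP satisfies the KKT
                                                           # conditions at the optimum
u_crom = U @ res.x
print("bounds respected:", lb <= u_crom.min() and u_crom.max() <= ub)
```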