Publications

Results 9651–9675 of 9,998

Geomechanics of penetration : experimental and computational approaches : final report for LDRD project 38718

Holcomb, David J.; Fossum, Arlo F.; Gettemy, Glen L.; Hardy, Robert D.; Bronowski, David R.; Rivas, Raul R.; Preece, Dale S.

The purpose of the present work is to increase our understanding of which properties of geomaterials most influence the penetration process, with the goal of improving our predictive ability. Two primary approaches were followed: developing a realistic constitutive model for geomaterials and designing an experimental approach to study penetration from the target's point of view. A realistic constitutive model, with parameters based on measurable properties, can be used for sensitivity analysis to determine the properties that are most important in influencing the penetration process. An immense literature is devoted to the problem of predicting penetration into geomaterials or similar man-made materials such as concrete. Various formulations have been developed that use an analytic or, more commonly, numerical solution for the spherical or cylindrical cavity expansion as a sort of Green's function to establish the forces acting on a penetrator. This approach has had considerable success in modeling the behavior of penetrators, both as to path and depth of penetration. However, the approach is not well adapted to the problem of understanding what is happening to the material being penetrated. Without a picture of the stress and strain state imposed on the highly deformed target material, it is not easy to determine which properties of the target are important in influencing the penetration process. We developed an experimental arrangement that allows greater control of the deformation than is possible in actual penetrator tests, yet approximates the deformation processes imposed by a penetrator. Using explosive line charges placed in a central borehole, we loaded cylindrical specimens in a manner equivalent to an increment of penetration, allowing the measurement of the associated strains and accelerations and the retrieval of specimens from the more-or-less intact cylinder.
Results show clearly that the deformation zone is highly concentrated near the borehole, with almost no damage occurring beyond half a borehole diameter. This implies that penetration is not strongly influenced by anything but the material within a diameter or so of the penetration. For penetrator tests, target size should not matter strongly once target diameters exceed some small multiple of the penetrator diameter. Penetration into jointed rock should not be much affected unless a discontinuity is within a similar range. Accelerations measured at several points along a radius from the borehole are consistent with highly concentrated damage and energy absorption: at the borehole wall, accelerations were an order of magnitude higher than at half a diameter, but at the outer surface, 8 diameters away, accelerations were as expected for propagation through an elastic medium. Accelerations measured at the outer surface of the cylinders increased significantly with cure time for the concrete. As strength increased, less damage was observed near the explosively driven borehole wall, consistent with the lower energy absorption expected and observed for stronger concrete. As it is the energy-absorbing properties of a target that ultimately stop a penetrator, we believe this may point the way to a more readily determined equivalent of the S number.

More Details

Optimal neuronal tuning for finite stimulus spaces

Proposed for publication in Neural Computation.

Brown, William M.; Backer, Alejandro B.

The efficiency of neuronal encoding in sensory and motor systems has been proposed as a first principle governing response properties within the central nervous system. We present a continuation of a theoretical study by Zhang and Sejnowski, in which the influence of neuronal tuning properties on encoding accuracy is analyzed using information theory. When a finite stimulus space is considered, we show that encoding accuracy improves with narrow tuning for one- and two-dimensional stimuli. For three dimensions and higher, there is an optimal tuning width.
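The standard tool in such analyses is the Fisher information of the population code. A minimal numeric sketch of the one-dimensional case (not the paper's actual derivation; the Gaussian tuning curves, Poisson noise assumption, and all parameter values here are illustrative):

```python
import numpy as np

def fisher_info(x, centers, sigma, amplitude=10.0):
    """Fisher information at stimulus x for independent Poisson neurons
    with Gaussian tuning curves f_i:  J(x) = sum_i f_i'(x)^2 / f_i(x).
    The ratio f'^2 / f is expanded analytically so that neurons tuned
    far from x contribute 0 instead of underflowing to 0/0."""
    f = amplitude * np.exp(-(x - centers) ** 2 / (2.0 * sigma ** 2))
    return np.sum(((x - centers) / sigma ** 2) ** 2 * f)

centers = np.arange(-20.0, 20.0, 0.1)  # dense, uniform preferred stimuli
J_narrow = fisher_info(0.0, centers, sigma=0.5)
J_wide = fisher_info(0.0, centers, sigma=2.0)
# For a 1-D stimulus, J scales like 1/sigma, so the narrower population
# encodes more accurately, consistent with the 1-D result quoted above.
```

The interesting regime the abstract identifies is what happens to this quantity when the stimulus space is finite and the dimension exceeds two.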

More Details

Dynamic context discrimination : psychological evidence for the Sandia Cognitive Framework

Speed, Ann S.

Human behavior is a function of an iterative interaction between the stimulus environment and past experience. It is not simply a matter of the current stimulus environment activating the appropriate experience or rule from memory (e.g., if it is dark and I hear a strange noise outside, then I turn on the outside lights and investigate). Rather, it is a dynamic process that takes into account not only things one would generally do in a given situation, but things that have recently become known (e.g., there have recently been coyotes seen in the area and one is known to be rabid), as well as other immediate environmental characteristics (e.g., it is snowing outside, I know my dog is outside, I know the police are already outside, etc.). All of these factors combine to inform me of the most appropriate behavior for the situation. If it were the case that humans had a rule for every possible contingency, the amount of storage that would be required to enable us to fluidly deal with most situations we encounter would rapidly become biologically untenable. We can all deal with contingencies like the one above with fairly little effort, but if it isn't based on rules, what is it based on? The assertion of the Cognitive Systems program at Sandia for the past 5 years is that at the heart of this ability to effectively navigate the world is an ability to discriminate between different contexts (i.e., Dynamic Context Discrimination, or DCD). While this assertion in and of itself might not seem earthshaking, it is compelling that this ability and its components show up in a wide variety of paradigms across different subdisciplines in psychology. We begin by outlining, at a high functional level, the basic ideas of DCD. We then provide evidence from several different literatures and paradigms that support our assertion that DCD is a core aspect of cognitive functioning. 
Finally, we discuss DCD and the computational model that we have developed as an instantiation of DCD in more detail. Before commencing with our overview of DCD, we should note that DCD is not necessarily a theory in the classic sense. Rather, it is a description of cognitive functioning that seeks to unify highly similar findings across a wide variety of literatures. Further, we believe that such convergence warrants a central place in efforts to computationally emulate human cognition. That is, DCD is a general principle of cognition. It is also important to note that while we are drawing parallels across many literatures, these are functional parallels and are not necessarily structural ones. That is, we are not saying that the same neural pathways are involved in these phenomena. We are only saying that the different neural pathways that are responsible for the appearance of these various phenomena follow the same functional rules - the mechanisms are the same even if the physical parts are distinct. Furthermore, DCD is not a causal mechanism - it is an emergent property of the way the brain is constructed. DCD is the result of neurophysiology (cf. John, 2002, 2003). Finally, it is important to note that we are not proposing a generic learning mechanism such that one biological algorithm can account for all situation interpretation. Rather, we are pointing out that there are strikingly similar empirical results across a wide variety of disciplines that can be understood, in part, by similar cognitive processes. It is entirely possible, even assumed in some cases (e.g., primary language acquisition), that these more generic cognitive processes are complemented and constrained by various limits which may or may not be biological in nature (cf. Bates & Elman, 1996; Elman, in press).

More Details

Unified parallel C and the computing needs of Sandia National Laboratories

Wen, Zhaofang W.

As Sandia looks toward petaflops computing and other advanced architectures, it is necessary to provide a programming environment that can exploit this additional computing power while supporting reasonable development time for applications. Thus, the authors evaluate the Partitioned Global Address Space (PGAS) programming model as implemented in Unified Parallel C (UPC) for its applicability. They report on their experiences in implementing sorting and minimum spanning tree algorithms on a test system, a Cray T3E, with UPC support. They describe several macros that could serve as language extensions and several building-block operations that could serve as a foundation for a PGAS programming library. They analyze the limitations of the UPC implementation available on the test system, and suggest improvements necessary before UPC can be used in a production environment.

More Details

Analysis and control of distributed cooperative systems

Feddema, John T.; Schoenwald, David A.; Parker, Eric P.; Wagner, John S.

As part of the DARPA Information Processing Technology Office (IPTO) Software for Distributed Robotics (SDR) program, Sandia National Laboratories has developed analysis and control software for coordinating tens to thousands of autonomous, cooperative robotic agents (primarily unmanned ground vehicles) performing military operations such as reconnaissance, surveillance and target acquisition; countermine and explosive ordnance disposal; force protection and physical security; and logistics support. Due to the nature of these applications, the control techniques must be distributed, and they must not rely on high-bandwidth communication between agents. At the same time, a single soldier must be able to direct these large-scale systems easily. Finally, the control techniques must be provably convergent so as not to cause undue harm to civilians. In this project, provably convergent, moderate-communication-bandwidth, distributed control algorithms were developed that can be regulated by a single soldier. We have simulated in great detail the control of small numbers of vehicles (up to 20) navigating throughout a building, and we have simulated in lesser detail the control of larger numbers of vehicles (up to 1,000) trying to locate several targets in a large outdoor facility. Finally, we have experimentally validated the resulting control algorithms on smaller numbers of autonomous vehicles.
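The textbook example of a provably convergent, low-bandwidth distributed control law is the linear consensus protocol, in which each agent updates its state using only its neighbors' states. A generic sketch of that idea (not Sandia's actual algorithm; the graph, gain, and initial states are invented for illustration):

```python
import numpy as np

def consensus_step(x, neighbors, eps=0.1):
    """One synchronous update of the linear consensus protocol:
    x_i <- x_i + eps * sum over neighbors j of (x_j - x_i).
    For a connected graph and eps below 1/max_degree, all states
    provably converge to the average of the initial states."""
    x_new = x.copy()
    for i, nbrs in neighbors.items():
        x_new[i] = x[i] + eps * sum(x[j] - x[i] for j in nbrs)
    return x_new

# Four agents on a line graph 0-1-2-3; each talks only to adjacent agents.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
x = np.array([0.0, 1.0, 5.0, 10.0])
for _ in range(500):
    x = consensus_step(x, neighbors)
# All agents approach the average of the initial states (4.0).
```

The appeal for the applications above is that each update uses only local, low-bandwidth neighbor exchanges, yet convergence of the whole swarm can be proven from the graph's connectivity.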

More Details

Stability of biological networks as represented in Random Boolean Nets

Slepoy, Alexander S.; Thompson, Marshall A.

We explore the stability of Random Boolean Networks as a model of biological interaction networks. We introduce the surface-to-volume ratio as a measure of the stability of a network. The surface is defined as the set of states within a basin of attraction that map outside the basin under a single bit-flip operation; the volume is defined as the total number of states in the basin. We report the development of an object-oriented Boolean network analysis code (Attract) to investigate the structure of stable versus unstable networks. We find two distinct types of stable networks. The first type is the nearly trivial stable network with a few basins of attraction. The second type contains many basins. We conclude that stable networks of the second type are extremely rare.
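For small networks the basin structure can be enumerated exhaustively. A minimal sketch of the surface-to-volume measure defined above (the 3-node AND network here is a toy example, not one from the study, and this is not the Attract code):

```python
from itertools import product

def step(state, rules):
    """Synchronous update: apply every node's Boolean rule to the state."""
    return tuple(rule(state) for rule in rules)

def attractor_label(state, rules):
    """Follow the trajectory until it cycles; label the basin by the
    lexicographically smallest state on the attractor cycle."""
    seen = {}
    cur = state
    while cur not in seen:
        seen[cur] = len(seen)
        cur = step(cur, rules)
    start = seen[cur]
    return min(s for s, i in seen.items() if i >= start)

def flip(state, k):
    return state[:k] + (1 - state[k],) + state[k + 1:]

def surface_to_volume(n, rules):
    """Per-basin ratio of surface states (some single bit-flip exits the
    basin) to total basin size; lower ratio means a more stable basin."""
    label = {s: attractor_label(s, rules) for s in product((0, 1), repeat=n)}
    ratios = {}
    for basin in set(label.values()):
        states = [s for s, b in label.items() if b == basin]
        surface = sum(
            any(label[flip(s, k)] != basin for k in range(n)) for s in states
        )
        ratios[basin] = surface / len(states)
    return ratios

# Toy 3-node network: each node is the AND of the other two nodes.
rules = [lambda s, i=i: s[(i + 1) % 3] & s[(i + 2) % 3] for i in range(3)]
ratios = surface_to_volume(3, rules)
# The all-ones fixed point has a one-state basin (ratio 1.0); the
# all-zeros basin holds the other 7 states with a lower ratio (3/7).
```

The exhaustive enumeration is only feasible for small n, since the state space has 2^n states; this is where a dedicated analysis code becomes necessary.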

More Details

Sensor placement in municipal water networks

Proposed for publication in the Journal of Water Resources Planning and Management.

Hart, William E.; Phillips, Cynthia A.; Berry, Jonathan W.; Watson, Jean-Paul W.

We present a model for optimizing the placement of sensors in municipal water networks to detect maliciously injected contaminants. An optimal sensor configuration minimizes the expected fraction of the population at risk. We formulate this problem as a mixed-integer program, which can be solved with generally available solvers. We find optimal sensor placements for three test networks with synthetic risk and population data. Our experiments illustrate that this formulation can be solved relatively quickly and that the predicted sensor configuration is relatively insensitive to uncertainties in the data used for prediction.
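The underlying combinatorial problem can be stated compactly: choose p sensor locations to minimize the expected population at risk over a set of contamination scenarios. A brute-force sketch on toy data (the paper formulates this as a mixed-integer program, which scales far better; all node names, impacts, and probabilities here are invented):

```python
from itertools import combinations

# impact[s][n] = population at risk when the contaminant injected in
# scenario s is first detected by a sensor at node n.
impact = {
    "s1": {"A": 100, "B": 40, "C": 10},
    "s2": {"A": 15, "B": 80, "C": 50},
    "s3": {"A": 60, "B": 20, "C": 90},
}
prob = {"s1": 0.5, "s2": 0.3, "s3": 0.2}  # scenario probabilities

def expected_risk(placement):
    """Each scenario is credited to whichever placed sensor limits it most."""
    return sum(prob[s] * min(impact[s][n] for n in placement) for s in impact)

# Exhaustive search over all 2-sensor placements.
nodes = ["A", "B", "C"]
best = min(combinations(nodes, 2), key=expected_risk)
```

The MIP formulation replaces this enumeration with binary placement variables and assignment constraints, which is what lets generally available solvers handle realistic network sizes.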

More Details

Peridynamic modeling of membranes and fibers

Proposed for publication in Peridynamic Modeling of Membranes and Fibers.

Silling, Stewart A.

The peridynamic theory of continuum mechanics allows damage, fracture, and long-range forces to be treated as natural components of the deformation of a material. In this paper, the peridynamic approach is applied to thin two- and one-dimensional structures. For membranes, a constitutive model appropriate for rubbery sheets that can form cracks is described. This model is used to perform numerical simulations of the stretching and dynamic tearing of membranes. A similar approach is applied to one-dimensional, string-like structures that undergo stretching, bending, and failure. Long-range forces similar to van der Waals interactions at the nanoscale influence the equilibrium configurations of these structures, how they deform, and possibly their self-assembly.
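In the bond-based form of the theory, the force on a material point is a sum of pairwise bond forces over all points within a horizon, and damage enters naturally by irreversibly breaking bonds whose stretch exceeds a critical value. A minimal 1-D sketch of that bookkeeping (not the paper's membrane or fiber model; the micromodulus, horizon, and critical stretch are illustrative):

```python
import numpy as np

def bond_forces(x, x0, horizon, c, s_crit, broken):
    """Pairwise peridynamic bond forces in 1-D.
    Bond stretch s = (|x_j - x_i| - |x0_j - x0_i|) / |x0_j - x0_i|;
    the force density is c * s along the deformed bond direction.
    Bonds stretched past s_crit are marked broken and carry no force."""
    n = len(x)
    f = np.zeros(n)
    for i in range(n):
        for j in range(n):
            xi = abs(x0[j] - x0[i])  # reference bond length
            if i == j or xi > horizon or broken[i, j]:
                continue
            s = (abs(x[j] - x[i]) - xi) / xi  # bond stretch
            if s > s_crit:
                broken[i, j] = True  # irreversible damage
                continue
            f[i] += c * s * np.sign(x[j] - x[i])
    return f

x0 = np.linspace(0.0, 1.0, 11)  # reference configuration of a bar
x = 1.05 * x0                   # uniform 5% stretch
broken = np.zeros((11, 11), dtype=bool)
f = bond_forces(x, x0, horizon=0.25, c=1.0, s_crit=0.10, broken=broken)
# Under uniform stretch below s_crit, no bonds break and interior points
# are in equilibrium: left and right bond forces cancel by symmetry.
```

Because force comes from a sum over a finite horizon rather than a stress divergence, cracks need no special treatment: a crack is simply a surface of broken bonds.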

More Details

The Sandia GeoModel : theory and user's guide

Fossum, Arlo F.; Brannon, Rebecca M.

The mathematical and physical foundations and domain of applicability of Sandia's GeoModel are presented along with descriptions of the source code and user instructions. The model is designed to be used in conventional finite element architectures, and (to date) it has been installed in five host codes without requiring customizing the model subroutines for any of these different installations. Although developed for application to geological materials, the GeoModel actually applies to a much broader class of materials, including rock-like engineered materials (such as concretes and ceramics) and even to metals when simplified parameters are used. Nonlinear elasticity is supported through an empirically fitted function that has been found to be well-suited to a wide variety of materials. Fundamentally, the GeoModel is a generalized plasticity model. As such, it includes a yield surface, but the term 'yield' is generalized to include any form of inelastic material response including microcrack growth and pore collapse. The GeoModel supports deformation-induced anisotropy in a limited capacity through kinematic hardening (in which the initially isotropic yield surface is permitted to translate in deviatoric stress space to model Bauschinger effects). Aside from kinematic hardening, however, the governing equations are otherwise isotropic. The GeoModel is a genuine unification and generalization of simpler models. The GeoModel can employ up to 40 material input and control parameters in the rare case when all features are used. Simpler idealizations (such as linear elasticity, or von Mises yield, or Mohr-Coulomb failure) can be replicated by simply using fewer parameters. For high-strain-rate applications, the GeoModel supports rate dependence through an overstress model.
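As an illustration of how the simpler idealizations fall out with fewer parameters, the von Mises limit reduces the entire yield surface to a single strength parameter. A sketch of that limiting yield check (this is not the GeoModel source code, and the yield strength is an arbitrary example value):

```python
import numpy as np

def von_mises_yield(stress, Y):
    """Yield function f = sqrt(3*J2) - Y for a 3x3 Cauchy stress tensor.
    f < 0: elastic; f = 0: on the yield surface; f > 0: inadmissible
    (to be returned to the surface by the plasticity algorithm)."""
    dev = stress - np.trace(stress) / 3.0 * np.eye(3)  # deviatoric stress
    J2 = 0.5 * np.tensordot(dev, dev)                  # second invariant
    return np.sqrt(3.0 * J2) - Y

# Uniaxial tension at exactly the yield strength lies on the surface.
sigma = np.diag([250.0, 0.0, 0.0])  # MPa
f = von_mises_yield(sigma, Y=250.0)
```

A generalized model like the GeoModel replaces the single constant Y with pressure dependence, hardening, and a cap, which is where the remaining parameters come in.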

More Details

Approach and development strategy for an agent-based model of economic confidence

Sprigg, James A.; Jorgensen, Craig R.; Pryor, Richard J.

We are extending the existing features of Aspen, a powerful economic modeling tool, and introducing new features to simulate the role of confidence in economic activity. The new model is built from a collection of autonomous agents that represent households, firms, and other relevant entities like financial exchanges and governmental authorities. We simultaneously model several interrelated markets, including those for labor, products, stocks, and bonds. We also model economic tradeoffs, such as decisions of households and firms regarding spending, savings, and investment. In this paper, we review some of the basic principles and model components and describe our approach and development strategy for emulating consumer, investor, and business confidence. The model of confidence is explored within the context of economic disruptions, such as those resulting from disasters or terrorist events.
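A heavily simplified sketch of the kind of confidence-mediated spending rule such an agent-based model might contain (this is not Aspen's actual implementation; the functional form and all parameter values are invented for illustration):

```python
class Household:
    """Toy agent: spends a fraction of income scaled by a confidence
    level in [0, 1] and saves the remainder."""
    def __init__(self, income, confidence=0.8, base_propensity=0.9):
        self.income = income
        self.confidence = confidence
        self.base_propensity = base_propensity
        self.savings = 0.0

    def step(self):
        spending = self.base_propensity * self.confidence * self.income
        self.savings += self.income - spending
        return spending

households = [Household(income=100.0) for _ in range(1000)]
normal_spend = sum(h.step() for h in households)
for h in households:
    h.confidence = 0.5  # a disruption shakes confidence
shocked_spend = sum(h.step() for h in households)
# Aggregate spending falls with confidence; in a full model this would
# feed back into firm revenues, hiring, and the financial markets.
```

The modeling question the paper addresses is precisely how such a confidence variable should be formed and updated from agents' experiences rather than imposed exogenously as done here.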

More Details