An inexpensive rapid-sampling device for liquid cell cultures
Nature Methods
This paper builds upon previous work [Sprigg and Ehlen, 2004] by introducing a bond market into a model of production and employment. The previous paper described an economy in which households choose whether to enter the labor and product markets based on wages and prices, while firms experiment with prices and employment levels to maximize their profits. We developed agent-based simulations using Aspen, an economic modeling tool developed at Sandia, to demonstrate that multiple-firm economies converge toward the competitive equilibria typified by lower prices and higher output and employment, but also suffer from market noise stemming from consumer churn. In this paper we introduce a bond market as a mechanism for household savings. We simulate an economy of continuous overlapping generations in which each household grows older over the course of the simulation and continually revises its target level of savings according to a life-cycle hypothesis. Households can seek employment, earn income, purchase goods, and contribute to savings until they reach the mandatory retirement age; upon retirement, households must draw from savings in order to purchase goods. This paper demonstrates the simultaneous convergence of the product, labor, and savings markets to their calculated equilibria, and simulates how a disruption to a productive sector creates cascading effects in all markets. Subsequent work will use similar models to simulate how disruptions, such as terrorist attacks, interact with consumer confidence to affect financial markets and the broader economy.
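A minimal sketch of how a household agent might revise its savings target under a simple life-cycle rule and draw down savings after retirement is given below. This is an illustration only, not the Aspen implementation; all class names, parameter names, and numerical values (retirement age, saving and spending rates) are assumptions chosen for the example.

```python
# Sketch (not the Aspen implementation) of a household agent that revises a
# life-cycle savings target while working and draws down savings once retired.
# All parameters and rates below are illustrative assumptions.

class Household:
    def __init__(self, age, retirement_age=65, death_age=85, wage=1.0):
        self.age = age
        self.retirement_age = retirement_age
        self.death_age = death_age
        self.wage = wage
        self.savings = 0.0

    def target_savings(self):
        """Savings target that grows with age toward a retirement nest egg."""
        years_retired = max(self.death_age - self.retirement_age, 0)
        consumption = 0.7 * self.wage               # assumed retirement spending rate
        return consumption * years_retired * min(1.0, self.age / self.retirement_age)

    def step(self):
        """Advance one simulated year."""
        self.age += 1
        if self.age < self.retirement_age:
            income = self.wage
            gap = max(self.target_savings() - self.savings, 0.0)
            self.savings += min(gap, 0.3 * income)  # assumed maximum saving rate
        else:
            # Retired: purchases must be financed from savings.
            draw = min(0.7 * self.wage, self.savings)
            self.savings -= draw
```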
Biosecurity must be implemented without impeding biomedical and bioscience research. Existing security literature and regulatory requirements do not present a comprehensive approach or clear model for biosecurity, nor do they fully recognize the operational issues within laboratory environments. To help address these issues, we propose that the concept of Biosecurity Levels be developed. Biosecurity Levels would impose increasingly stringent security protections depending on the attractiveness of the pathogens to adversaries. Pathogens and toxins would be assigned to a Biosecurity Level based on their security risk; specifically, the security risk would be a function of an agent's weaponization potential and the consequences of its use. To demonstrate the concept, examples of security risk assessments for several human, animal, and plant pathogens will be presented. Security higher than that currently mandated by federal regulations would be applied to the very few agents that represent true weapons threats, and lower levels to the remainder.
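One way to express the proposed idea, that an agent's security risk is a function of its weaponization potential and the consequences of its use, and that the risk score maps to a Biosecurity Level, is sketched below. The scoring scale, the multiplicative combination, and the level thresholds are invented purely for illustration and are not taken from the paper or from any regulation.

```python
# Illustrative sketch only: a notional risk score and Biosecurity Level mapping.
# Scale, weights, and thresholds are assumptions, not the paper's proposal.

def security_risk(weaponization_potential, consequences):
    """Both inputs scored 0-1; returns a combined risk score in [0, 1]."""
    return weaponization_potential * consequences   # assumed multiplicative model

def biosecurity_level(risk):
    """Map a risk score to a notional Biosecurity Level (1 = lowest)."""
    if risk >= 0.75:
        return 4
    if risk >= 0.50:
        return 3
    if risk >= 0.25:
        return 2
    return 1

# Example: a hypothetical agent with high weaponization potential and
# moderate consequences of use.
print(biosecurity_level(security_risk(0.9, 0.5)))   # -> 3
```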
By means of coupled-cluster theory, molecular properties can be computed with an accuracy often exceeding that of experiment. The high-degree polynomial scaling of the coupled-cluster method, however, remains a major obstacle to the accurate theoretical treatment of mainstream chemical problems, despite tremendous progress in computer architectures. Although it has long been recognized that this super-linear scaling is non-physical, efficient reduced-scaling algorithms for massively parallel computers have yet to be realized. We here present a locally correlated, reduced-scaling, massively parallel coupled-cluster algorithm. A sparse data representation for handling distributed, sparse multidimensional arrays has been implemented, along with a set of generalized contraction routines capable of handling such arrays. The parallel implementation entails a coarse-grained parallelization that reduces interprocessor communication and distributes the largest data arrays while replicating as many arrays as possible without introducing memory bottlenecks. The performance of the algorithm is illustrated by several series of runs on glycine chains using a Linux cluster with an InfiniBand interconnect.
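The sketch below illustrates the general flavor of a block-sparse array representation and a contraction that skips zero blocks, in the spirit of the sparse data structures mentioned above. It is not the paper's actual implementation; the dictionary-of-blocks layout, class names, and the simple matrix-matrix contraction pattern are assumptions chosen for brevity.

```python
# Block-sparse matrix stored as a dict of dense blocks, plus a contraction
# that only touches stored (nonzero) blocks. Illustration only.

import numpy as np

class BlockSparse2D:
    """A matrix stored as a dict of dense blocks, keyed by block indices."""
    def __init__(self, nblocks, blocksize):
        self.nblocks = nblocks
        self.blocksize = blocksize
        self.blocks = {}            # (I, J) -> dense blocksize x blocksize array

    def set_block(self, I, J, data):
        self.blocks[(I, J)] = np.asarray(data, dtype=float)

def contract(A, B):
    """C[I,K] = sum_J A[I,J] @ B[J,K], skipping blocks that are not stored."""
    C = BlockSparse2D(A.nblocks, A.blocksize)
    for (I, J), a in A.blocks.items():
        for (J2, K), b in B.blocks.items():
            if J == J2:
                if (I, K) in C.blocks:
                    C.blocks[(I, K)] += a @ b
                else:
                    C.blocks[(I, K)] = a @ b
    return C
```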
Genetic programming (GP) has proved to be a highly versatile and useful tool for identifying relationships in data for which a more precise theoretical construct is unavailable. In this project, we use a GP search to develop trading strategies for agent-based economic models. These strategies use stock prices and technical indicators, such as the moving average convergence/divergence (MACD) and various exponentially weighted moving averages, to generate buy and sell signals. We analyze the effect of complexity constraints on the strategies as well as the relative performance of the various indicators. We also present innovations in the classical genetic programming algorithm that appear to improve convergence for this problem. Technical strategies developed by our GP algorithm can be used to control the behavior of agents in economic simulation packages such as ASPEN-D, adding variety to the current market-fundamentals approach. The exploitation of arbitrage opportunities by technical analysts may help increase the efficiency of the simulated stock market, as it does in the real world. By improving the behavior of simulated stock markets, we can better estimate the effects of shocks to the economy due to terrorism or natural disasters.
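For reference, the sketch below shows the two indicators named above, an exponentially weighted moving average (EWMA) and the MACD, together with a simple crossover buy/sell signal of the kind a GP-evolved strategy might combine. The span parameters (12, 26, 9) are the conventional textbook defaults, used only for illustration; they are not taken from the paper.

```python
# EWMA and MACD indicators with a crossover signal. Illustration only;
# parameters are conventional defaults, not values from the paper.

def ewma(prices, span):
    """Exponentially weighted moving average with smoothing 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def macd_signal(prices, fast=12, slow=26, signal=9):
    """Return +1 (buy), -1 (sell), or 0 for the latest price point."""
    macd = [f - s for f, s in zip(ewma(prices, fast), ewma(prices, slow))]
    trigger = ewma(macd, signal)
    prev = macd[-2] - trigger[-2]
    curr = macd[-1] - trigger[-1]
    if prev <= 0 < curr:
        return +1          # MACD crossed above its signal line
    if prev >= 0 > curr:
        return -1          # MACD crossed below its signal line
    return 0
```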
In this paper we present an analysis of a new configuration for achieving spin-stabilized magnetic levitation. In the classical configuration, the rotor spins about a vertical axis, and the spin stabilizes the top against its lateral instability in the magnetic field. In the new configuration, the rotor spins about a horizontal axis, and the spin stabilizes the top against its axial instability in the magnetic field.
ML is a multigrid preconditioning package intended to solve linear systems of equations Ax = b, where A is a user-supplied n x n sparse matrix, b is a user-supplied vector of length n, and x is a vector of length n to be computed. ML should be used on large sparse linear systems arising from partial differential equation (PDE) discretizations. While technically any linear system can be considered, ML should be used on linear systems for which multigrid methods are well suited (e.g., elliptic PDEs). ML can be used as a stand-alone package or to generate preconditioners for a traditional iterative solver package (e.g., Krylov methods). We have supplied support for working with the Aztec 2.1 and AztecOO iterative packages [16]; however, other solvers can be used by supplying a few functions. This document describes one specific algebraic multigrid approach: smoothed aggregation. This approach is used within several specialized multigrid methods: one for the eddy current formulation of Maxwell's equations, and a multilevel and domain decomposition method for symmetric and nonsymmetric systems of equations (such as elliptic equations, or compressible and incompressible fluid dynamics problems). Other methods exist within ML but are not described in this document. Examples are given illustrating problem definition and exercising the multigrid options.
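To give a self-contained picture of the smoothed-aggregation idea, the sketch below builds a two-level smoothed-aggregation preconditioner for conjugate gradients on a 1-D Poisson problem using NumPy/SciPy. It deliberately does not reproduce ML's actual interface; the aggregate size, Jacobi damping factor, and test problem are assumptions chosen for illustration.

```python
# Two-level smoothed-aggregation preconditioner (illustration, not ML's API).
# Aggregates of 3 unknowns form a tentative prolongator, which is smoothed by
# one damped-Jacobi step; the resulting V-cycle preconditions CG.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 400
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")  # 1-D Poisson
b = np.ones(n)

# Tentative prolongator: aggregate every 3 consecutive unknowns.
agg = np.arange(n) // 3
ncoarse = agg.max() + 1
P_tent = sp.csr_matrix((np.ones(n), (np.arange(n), agg)), shape=(n, ncoarse))

# Smooth the prolongator: P = (I - omega * D^-1 A) P_tent.
Dinv = sp.diags(1.0 / A.diagonal())
omega = 2.0 / 3.0
P = (sp.identity(n) - omega * Dinv @ A) @ P_tent
R = P.T
Ac_solve = spla.factorized((R @ A @ P).tocsc())    # direct solve on coarse level

def vcycle(r):
    """One two-level V-cycle applied to a residual vector."""
    x = omega * Dinv @ r                           # pre-smooth (damped Jacobi)
    rc = R @ (r - A @ x)                           # restrict the residual
    x = x + P @ Ac_solve(rc)                       # coarse-grid correction
    x = x + omega * Dinv @ (r - A @ x)             # post-smooth
    return x

M = spla.LinearOperator((n, n), matvec=vcycle)
x, info = spla.cg(A, b, M=M)
print("CG converged:", info == 0)
```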
Proposed for publication in Computer Methods in Applied Mechanics and Engineering Journal.
This paper presents solution verification studies applicable to a class of problems involving wave propagation, frictional contact, geometrical complexity, and localized incompressibility. The studies are in support of a validation exercise of a phenomenological screw failure model. The numerical simulations are performed using a fully explicit transient dynamics finite element code, employing both standard four-node tetrahedral and eight-node mean quadrature hexahedral elements. It is demonstrated that verifying the accuracy of the simulation involves not only consideration of the mesh discretization error, but also the effect of the hourglass control and the contact enforcement. In particular, the proper amount of hourglass control and the behavior of the contact search and enforcement algorithms depend greatly on the mesh resolution. We carry out the solution verification exercise using mesh refinement studies and describe our systematic approach to handling the complicating issues. It is shown that hourglassing and contact must both be carefully monitored as the mesh is refined, and it is often necessary to make adjustments to the hourglass and contact user input parameters to accommodate finer meshes. We introduce in this paper the hourglass energy, which is used as an 'error indicator' for the hourglass control. If the hourglass energy does not tend to zero with mesh refinement, then an hourglass control parameter is changed and the calculation is repeated.
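A minimal sketch of the monitoring logic described above is given below: across a mesh refinement series, check whether the hourglass energy (relative to total internal energy) tends toward zero, and if it does not, flag the run so the hourglass control parameter can be adjusted and the calculation repeated. The tolerance, function name, and data layout are assumptions for illustration only.

```python
# Hourglass-energy "error indicator" check over a mesh refinement series.
# Tolerance and data layout are illustrative assumptions.

def check_hourglass_convergence(runs, tol=0.05):
    """runs: list of (mesh_size_h, hourglass_energy, internal_energy),
    ordered from coarsest to finest mesh."""
    ratios = [hg / max(internal, 1e-30) for _, hg, internal in runs]
    decreasing = all(r1 >= r2 for r1, r2 in zip(ratios, ratios[1:]))
    acceptable = ratios[-1] < tol
    if decreasing and acceptable:
        return "hourglass energy tends to zero with refinement"
    return "adjust the hourglass control parameter and repeat the calculation"

# Example with made-up numbers for three successively refined meshes.
print(check_hourglass_convergence([(0.4, 12.0, 100.0),
                                   (0.2, 5.0, 100.0),
                                   (0.1, 1.5, 100.0)]))
```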
An important challenge encountered during post-processing of finite element analyses is the visualization of three-dimensional fields of real-valued second-order tensors. In particular, as finite element meshes become more complex and detailed, evaluation and presentation of the principal stresses become correspondingly problematic. In this paper, we describe techniques used to visualize simulations of perturbed in-situ stress fields associated with hypothetical salt bodies in the Gulf of Mexico. We present an adaptation of the Mohr diagram, a graphical paper-and-pencil method used by the material mechanics community for estimating coordinate transformations for stress tensors, as a new tensor glyph for dynamically exploring tensor variables within three-dimensional finite element models. This interactive glyph can be used as either a probe or a filter through brushing and linking.
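For context, the sketch below computes the quantities a Mohr-diagram glyph encodes: the principal stresses of a symmetric 3x3 stress tensor and the centers and radii of the three Mohr circles. It illustrates only the underlying computation; it is not the visualization code described in the paper, and the example stress state is made up.

```python
# Principal stresses and Mohr-circle parameters for a symmetric stress tensor.
# Illustration of the underlying computation only.

import numpy as np

def mohr_parameters(stress):
    """stress: symmetric 3x3 array. Returns principal stresses (descending)
    and the (center, radius) of each of the three Mohr circles."""
    sigma = np.linalg.eigvalsh(stress)[::-1]      # principal stresses s1 >= s2 >= s3
    circles = [((sigma[i] + sigma[j]) / 2.0,      # center on the normal-stress axis
                abs(sigma[i] - sigma[j]) / 2.0)   # radius = max shear for that pair
               for i, j in ((0, 2), (0, 1), (1, 2))]
    return sigma, circles

# Example: a hypothetical stress state (units arbitrary).
s = np.array([[50.0, 10.0,  0.0],
              [10.0, 20.0,  5.0],
              [ 0.0,  5.0, -5.0]])
principal, circles = mohr_parameters(s)
print(principal)
print(circles)
```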