Temporal Algorithms for Physical Security
Abstract not provided.
Springer Proceedings in Complexity
Anomaly detection is an important problem in various fields of complex systems research, including image processing, data analysis, physical security, and cybersecurity. In image processing, it is used to remove noise while preserving image quality; in data analysis, physical security, and cybersecurity, it is used to find interesting data points, objects, or events in a vast sea of information. Anomaly detection will continue to be an important problem in domains intersecting with “Big Data”. In this paper, we present a novel algorithm for anomaly detection that uses phase-coded spiking neurons as basic computational elements.
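The abstract does not specify the algorithm itself; as a rough illustration of the underlying idea, one might encode each sample as a spike phase within an oscillation cycle and flag samples whose phase deviates sharply from the ensemble. The names `phase_encode`, `phase_anomalies`, and the threshold value are hypothetical, not taken from the paper:

```python
import math
import statistics

def phase_encode(x, x_min, x_max):
    # Map a scalar onto a spike phase in [0, 2*pi) within one oscillation cycle.
    return 2 * math.pi * (x - x_min) / (x_max - x_min)

def phase_anomalies(values, threshold=1.0):
    # Flag samples whose phase deviates from the median phase by more than
    # `threshold` radians; the median keeps outliers from dragging the center.
    lo, hi = min(values), max(values)
    phases = [phase_encode(v, lo, hi) for v in values]
    center = statistics.median(phases)
    return [i for i, p in enumerate(phases) if abs(p - center) > threshold]
```

A single extreme reading then stands out as the one sample whose phase lies far from the cluster of typical phases.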
2017 IEEE International Conference on Rebooting Computing, ICRC 2017 - Proceedings
Unlike general-purpose computer architectures, which are built from complex processor cores and rely on sequential computation, the brain is innately parallel and contains highly complex connections between its computational units (neurons). Key to the brain's architecture is functionality enabled by the combined effect of spiking communication and sparse connectivity, with connections having unique, variable efficacies and temporal latencies. Utilizing these neuroscience principles, we have developed the Spiking Temporal Processing Unit (STPU) architecture, which is well suited to areas such as pattern recognition and natural language processing. In this paper, we formally describe the STPU, implement it on a field programmable gate array, and show measured performance data.
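The paper's formal STPU description is not reproduced here, but one key ingredient, per-synapse temporal latencies, can be sketched: a neuron responds maximally when input spikes, shifted by their individual delays, arrive in coincidence. The function name, tolerance, and pattern below are illustrative assumptions:

```python
def coincidence_response(spike_times, delays, weights, tol=0.5):
    # Shift each input spike by its synapse's delay; the response is the
    # largest weight sum among near-coincident arrivals, so the neuron is
    # selective for the temporal pattern its delays "undo".
    arrivals = sorted((t + d, w) for t, d, w in zip(spike_times, delays, weights))
    best = 0.0
    for center, _ in arrivals:
        best = max(best, sum(w for a, w in arrivals if abs(a - center) <= tol))
    return best
```

With delays chosen to invert a specific spike pattern, that pattern yields the full weight sum while other timings yield much weaker responses.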
As high-performance computing architectures pursue greater computational power, they need increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types, each with different characteristics, as distinct levels of the same architecture. How to efficiently utilize this memory infrastructure remains an open challenge, and in this research we investigated whether neural-inspired approaches can meaningfully help with memory management. In particular, we explored neurogenesis-inspired resource allocation and showed that a neural-inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.
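The abstract leaves the mixed controller policy unspecified; as a minimal sketch of the general idea, one could imagine a policy that blends a frequency heuristic with a recency heuristic to decide which pages occupy the fast memory level. All names, fields, and weights here are hypothetical:

```python
def mixed_policy(pages, fast_capacity, weight=0.5):
    # pages: list of (page_id, access_count, last_access_time) tuples.
    # Blend a normalized frequency score with a normalized recency score
    # and keep the top-ranked pages in the fast memory level.
    max_count = max(p[1] for p in pages) or 1
    max_time = max(p[2] for p in pages) or 1
    def score(p):
        return weight * p[1] / max_count + (1 - weight) * p[2] / max_time
    ranked = sorted(pages, key=score, reverse=True)
    return {p[0] for p in ranked[:fast_capacity]}
```

Sweeping `weight` from 1.0 to 0.0 moves the controller from a pure most-frequently-used policy to a pure most-recently-used policy, with mixed behavior in between.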
Abstract not provided.
Proceedings of the International Joint Conference on Neural Networks
Considerable effort is currently being spent designing neuromorphic hardware for addressing challenging problems in a variety of pattern-matching applications. These neuromorphic systems offer low-power architectures with intrinsically parallel and simple spiking neuron processing elements. Unfortunately, these new hardware architectures have been largely developed without a clear justification for using spiking neurons to compute quantities for problems of interest. Specifically, the use of spiking to encode information in time has not been explored theoretically, with complexity analysis, to examine the operating conditions under which neuromorphic computing provides a computational advantage (time, space, power, etc.). In this paper, we present and formally analyze the use of temporal coding in a neural-inspired algorithm for optimization-based computation in neural spiking architectures.
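One classic illustration of the kind of advantage temporal coding can offer (a sketch of the general idea, not the paper's algorithm) is that spike timing can make runtime scale with the answer rather than the comparison count: in a race of neurons that each fire at a time proportional to their input value, the first spike identifies the minimum.

```python
def first_spike_argmin(values, dt=1.0):
    # Each "neuron" fires once simulated time reaches its input value; the
    # race ends at the first spike, so the time to an answer scales with
    # the minimum value rather than with pairwise comparisons.
    t = 0.0
    fired = []
    while not fired:
        t += dt
        fired = [i for i, v in enumerate(values) if v <= t]
    return min(fired, key=lambda i: values[i])
```

The sketch assumes positive inputs; the complexity question the abstract raises is exactly when this time-domain cost beats a conventional scan.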
Abstract not provided.
Improved validation for models of complex systems has been a primary focus over the past year for the Resilience in Complex Systems Research Challenge. This document describes a set of research directions that are the result of distilling those ideas into three categories of research -- epistemic uncertainty, strong tests, and value of information. The content of this document can be used to transmit valuable information to future research activities, update the Resilience in Complex Systems Research Challenge's roadmap, inform the upcoming FY18 Laboratory Directed Research and Development (LDRD) call and research proposals, and facilitate collaborations between Sandia and external organizations. The recommended research directions can provide topics for collaborative research, development of proposals, workshops, and other opportunities.
Abstract not provided.
Proceedings of the International Joint Conference on Neural Networks
Through various means of structural and synaptic plasticity enabling online learning, neural networks constantly reconfigure their computational functionality. Neural information content is embodied within the configurations, representations, and computations of neural networks. To explore it, we have developed metrics and computational paradigms that quantify neural information content. We have observed that conventional compression methods may help overcome some of the limiting factors of standard information-theoretic techniques employed in neuroscience and allow us to approximate the information in neural data. To do so, we use compressibility as a measure of complexity in order to estimate entropy and quantitatively assess the information content of neural ensembles. Using Lempel-Ziv compression, we assess the rate at which new patterns are generated across a neural ensemble's firing activity over time, approximating the information content encoded by a neural circuit. As a specific case study, we have been investigating the effect of neural mixed coding schemes due to hippocampal adult neurogenesis.
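The compressibility-based estimate described above can be sketched concretely: count the phrases in a Lempel-Ziv (1976) style parsing of a binarized spike train, then normalize to approximate the entropy rate. The sketch assumes the ensemble's firing activity has already been flattened into a symbol string:

```python
import math

def lz76_complexity(s):
    # Number of phrases in a Lempel-Ziv (1976) parsing: each phrase is
    # grown while it can still be copied from the history before it, so
    # repetitive strings yield few phrases and novel ones yield many.
    phrases, i, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases

def entropy_rate_estimate(s):
    # Normalized LZ complexity approximates the entropy rate for long strings.
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n
```

A silent or perfectly regular spike train compresses to a handful of phrases, while a train that keeps producing new firing patterns yields a proportionally higher estimate.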
Electricity Journal
Abstract not provided.
The transformation of the distribution grid from a centralized to a decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While these changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.
Abstract not provided.
This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?
Abstract not provided.
Proceedings - Winter Simulation Conference
We created a cognition-focused system dynamics model to simulate the dynamics of smoking tendencies based on media influences and communication of opinions. We based this model on the premise that the dynamics of attitudes about smoking can be more deeply understood by combining opinion dynamics with more in-depth psychological models that explicitly explore the root causes of behaviors of interest. Results of the model show the relative effectiveness of two different policies as compared to a baseline: A decrease in advertising spending, and an increase in educational spending. The initial results presented here indicate the utility of this type of simulation for analyzing various policies meant to influence the dynamics of opinions in a population.
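A stripped-down stock-and-flow version of such a model shows how the two policy levers enter: advertising drives initiation of smoking, while education (together with social influence from nonsmokers) drives quitting. All parameter values and initial stocks below are hypothetical:

```python
def simulate_smoking(steps=100, dt=0.1, advertising=0.02, education=0.01):
    # Two-stock sketch: advertising moves nonsmokers toward smoking;
    # education plus word-of-mouth from nonsmokers moves smokers toward
    # quitting. Returns the final smoking prevalence after Euler stepping.
    nonsmokers, smokers = 0.8, 0.2
    for _ in range(steps):
        start = advertising * nonsmokers
        quit_rate = (education + 0.05 * nonsmokers) * smokers
        nonsmokers += dt * (quit_rate - start)
        smokers += dt * (start - quit_rate)
    return smokers
```

Comparing runs with reduced `advertising` or increased `education` against the baseline mirrors the policy comparison the abstract describes.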
Abstract not provided.
Proceedings of the International Joint Conference on Neural Networks
The field of machine learning strives to develop algorithms that, through learning, lead to generalization; that is, the ability of a machine to perform a task that it was not explicitly trained for. An added challenge arises when the problem domain is dynamic or non-stationary, with the data distributions or categorizations changing over time. This phenomenon is known as concept drift. Game-theoretic algorithms are often iterative by nature, consisting of repeated game play rather than a single interaction. Effectively, rather than requiring extensive retraining to update a learning model, a game-theoretic approach can adjust strategies through repeated play, offering a novel way to address concept drift. In this paper we present a variant of our Support Vector Machine (SVM) Game classifier which may be used in an adaptive manner with repeated play to address concept drift, and show results of applying this algorithm to synthetic as well as real data.
This project evaluates the effectiveness of moving target defense (MTD) techniques using a new game we have designed, called PLADD, inspired by the game FlipIt [28]. PLADD extends FlipIt by incorporating what we believe are key MTD concepts. We have analyzed PLADD and proven the existence of a defender strategy that pushes a rational attacker out of the game, demonstrated how limited the strategies available to an attacker are in PLADD, and derived analytic expressions for the expected utility of the game’s players in multiple game variants. We have created an algorithm for finding a defender’s optimal PLADD strategy. We show that in the special case of achieving deterrence in PLADD, MTD is not always cost effective and that its optimal deployment may shift abruptly from not using MTD at all to using it as aggressively as possible. We believe our effort provides basic, fundamental insights into the use of MTD, but conclude that a truly practical analysis requires model selection and calibration based on real scenarios and empirical data. We propose several avenues for further inquiry, including (1) agents with adaptive capabilities more reflective of real world adversaries, (2) the presence of multiple, heterogeneous adversaries, (3) computational game theory-based approaches such as coevolution to allow scaling to the real world beyond the limitations of analytical analysis and classical game theory, (4) mapping the game to real-world scenarios, (5) taking player risk into account when designing a strategy (in addition to expected payoff), (6) improving our understanding of the dynamic nature of MTD-inspired games by using a martingale representation, defensive forecasting, and techniques from signal processing, and (7) using adversarial games to develop inherently resilient cyber systems.
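PLADD's analysis is not reproduced here, but the flavor of the underlying FlipIt-style game can be sketched by simulation: a defender retakes the resource on a fixed period, the attacker regains control after a random delay, and the payoff-relevant quantity is the fraction of time the attacker holds the resource. All parameter values below are illustrative, not drawn from the report:

```python
import random

def attacker_control_fraction(defense_period, mean_attack_time,
                              horizon=10000.0, seed=0):
    # FlipIt-style sketch: the defender retakes the resource every
    # defense_period time units; after each takeover the attacker succeeds
    # after an exponentially distributed delay. Returns the fraction of
    # time the attacker holds the resource over the horizon.
    rng = random.Random(seed)
    t = attacker_time = 0.0
    while t < horizon:
        delay = rng.expovariate(1.0 / mean_attack_time)
        if delay < defense_period:
            attacker_time += defense_period - delay
        t += defense_period
    return attacker_time / horizon
```

Shortening the defense period squeezes the attacker's share toward zero, which is the simulation analogue of the deterrence result the report proves analytically, but it also raises the defender's cost, reflecting the cost-effectiveness tradeoff noted above.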
Abstract not provided.
PLoS ONE
Background: Recent declines in US cigarette smoking prevalence have coincided with increases in use of other tobacco products. Multiple-product tobacco models can help assess the population health impacts associated with use of a wide range of tobacco products. Methods and Findings: We present a multi-state, dynamical systems population structure model that can be used to assess the effects of tobacco product use behaviors on population health. The model incorporates transition behaviors, such as initiation, cessation, switching, and dual use, related to the use of multiple products. The model tracks product use prevalence and mortality attributable to tobacco use for the overall population and by sex and age group. The model can also be used to estimate differences in these outcomes between scenarios by varying input parameter values. We demonstrate model capabilities by projecting future cigarette smoking prevalence and smoking-attributable mortality and then simulating the effects of introduction of a hypothetical new lower-risk tobacco product under a variety of assumptions about product use. Sensitivity analyses were conducted to examine the range of population impacts that could occur due to differences in input values for product use and risk. We demonstrate that potential benefits from cigarette smokers switching to the lower-risk product can be offset over time through increased initiation of this product. Model results show that population health benefits are particularly sensitive to product risks and initiation, switching, and dual use behaviors. Conclusion: Our model incorporates the variety of tobacco use behaviors and risks that occur with multiple products. As such, it can evaluate the population health impacts associated with the introduction of new tobacco products or policies that may result in product switching or dual use. Further model development will include refinement of data inputs for non-cigarette tobacco products and inclusion of health outcomes such as morbidity and disability.
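A heavily simplified three-state version of such a model (never, current, and former users, with hypothetical annual transition probabilities and initial prevalences) illustrates the basic mechanics of projecting prevalence under different cessation assumptions:

```python
def project_prevalence(years=20, initiation=0.02, cessation=0.04, relapse=0.01):
    # Three compartments with annual transition probabilities; returns the
    # current-use prevalence trajectory (year 0 plus one value per year).
    never, current, former = 0.6, 0.2, 0.2
    history = [current]
    for _ in range(years):
        start = initiation * never
        quit_flow = cessation * current
        back = relapse * former
        never -= start
        current += start - quit_flow + back
        former += quit_flow - back
        history.append(current)
    return history
```

The full model adds sex and age structure, multiple products with switching and dual use, and attributable mortality, but scenario comparison works the same way: rerun with altered transition rates and compare trajectories.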
Abstract not provided.
Prehospital and Disaster Medicine
Hospital evacuations that occur during, or as a result of, infrastructure outages are complicated and demanding. Loss of infrastructure services can initiate a chain of events with corresponding management challenges. This report describes a modeling case study of the 2001 evacuation of the Memorial Hermann Hospital in Houston, Texas (USA). The study uses a model designed to track such cascading events following loss of infrastructure services and to identify the staff, resources, and operational adaptations required to sustain patient care and/or conduct an evacuation. The model is based on the assumption that a hospital's primary mission is to provide necessary medical care to all of its patients, even when critical infrastructure services to the hospital and surrounding areas are disrupted. Model logic evaluates the hospital's ability to provide an adequate level of care for all of its patients throughout a period of disruption. If hospital resources are insufficient to provide such care, the model recommends an evacuation. Model features also provide information to support evacuation and resource allocation decisions for optimizing care over the entire population of patients. This report documents the application of the model to a scenario designed to resemble the 2001 evacuation of the Memorial Hermann Hospital, demonstrating the model's ability to recreate the timeline of an actual evacuation. The model is also applied to scenarios demonstrating how its output can inform evacuation planning activities and timing.
This report presents a mathematical model of the way in which a hospital uses a variety of resources, utilities and consumables to provide care to a set of in-patients, and how that hospital might adapt to provide treatment to a few patients with a serious infectious disease, like the Ebola virus. The intended purpose of the model is to support requirements planning studies, so that hospitals may be better prepared for situations that are likely to strain their available resources. The current model is a prototype designed to present the basic structural elements of a requirements planning analysis. Some simple illustrative experiments establish the model's general capabilities. With additional investment in model enhancement and calibration, this prototype could be developed into a useful planning tool for hospital administrators and health care policy makers.
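The core adequacy check behind such requirements planning can be sketched as follows, with hypothetical resource names: total the resources required to care for every patient, compare against the hospital's stocks, and flag an evacuation recommendation if any resource falls short.

```python
def recommend_evacuation(patients, stocks):
    # patients: list of per-patient resource-requirement dicts.
    # stocks: resources the hospital has on hand.
    # Returns (evacuate?, per-resource shortfalls).
    required = {}
    for needs in patients:
        for resource, amount in needs.items():
            required[resource] = required.get(resource, 0) + amount
    shortfalls = {r: amt - stocks.get(r, 0)
                  for r, amt in required.items() if amt > stocks.get(r, 0)}
    return (bool(shortfalls), shortfalls)
```

The actual model layers timelines, infrastructure outages, and operational adaptations on top of this comparison, but the recommendation logic reduces to whether projected demand exceeds available supply.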
Abstract not provided.
Adaptation is believed to be a source of resilience in systems. It has been difficult to measure the contribution of adaptation to resilience, unlike other resilience mechanisms such as restoration and recovery. One difficulty comes from treating adaptation as a deus ex machina that is interjected after a disruption. This provides no basis for bounding possible adaptive responses. We can bracket the possible effects of adaptation when we recognize that it occurs continuously, and is in part responsible for the current system’s properties. In this way the dynamics of the system’s pre-disruption structure provide information about post-disruption adaptive reaction. Seen as an ongoing process, adaptation has been argued to produce “robust-yet-fragile” systems. Such systems perform well under historical stresses but become committed to specific features of those stresses in a way that makes them vulnerable to system-level collapse when those features change. In effect, adaptation lessens the cost of disruptions within a certain historical range, at the expense of increased cost from disruptions outside that range. Historical adaptive responses leave a signature in the structure of the system. Studies of ecological networks have suggested structural metrics that pick out systemic resilience in the underlying ecosystems. If these metrics are generally reliable indicators of resilience, they provide another strategy for gauging adaptive resilience. To make progress in understanding how the process of adaptation and the property of resilience interrelate in infrastructure systems, we pose some specific questions: Does adaptation confer resilience? Does it confer resilience to novel shocks as well, or does it tune the system to fragility? Can structural features predict resilience to novel shocks? Are there policies or constraints on the adaptive process that improve resilience?
Abstract not provided.
Adult neurogenesis in the hippocampus region of the brain is a neurobiological process that is believed to contribute to the brain's advanced abilities in complex pattern recognition and cognition. Here, we describe how realistic scale simulations of the neurogenesis process can offer both a unique perspective on the biological relevance of this process and confer computational insights that are suggestive of novel machine learning techniques. First, supercomputer based scaling studies of the neurogenesis process demonstrate how a small fraction of adult-born neurons have a uniquely larger impact in biologically realistic scaled networks. Second, we describe a novel technical approach by which the information content of ensembles of neurons can be estimated. Finally, we illustrate several examples of broader algorithmic impact of neurogenesis, including both extending existing machine learning approaches and novel approaches for intelligent sensing.
Abstract not provided.