Publications

Results 76–100 of 184

Neural computing for scientific computing applications

ACM International Conference Proceeding Series

Aimone, James B.; Parekh, Ojas D.; Severa, William M.

Neural computing has been identified as a computing alternative in the post-Moore's Law era; however, much of the attention it has received has been directed at specialized applications such as machine learning. For scientific computing applications, particularly those that often depend on supercomputing, it is not clear that neural machine learning is the only contribution to be made by neuromorphic platforms. In our presentation, we will discuss ways that looking to the brain as a whole, and to neurons specifically, can provide new sources of inspiration for computing beyond current machine learning applications. Particularly for scientific computing, where approximate methods of computation introduce additional challenges, the development of non-approximate methods for neural computation is potentially quite valuable. In addition, the brain's dramatic ability to utilize context at many different scales and to incorporate information from many different modalities is a capability currently poorly realized by neural machine learning approaches, yet one that offers considerable potential impact on scientific applications.

Neuromorphic data microscope

ACM International Conference Proceeding Series

Follett, David R.; Karpman, Gabe D.; Townsend, Duncan; Naegle, John H.; Follett, Pamela L.; Suppona, Roger A.; Aimone, James B.; James, Conrad D.

In 2016, Lewis Rhodes Labs (LRL) shipped the first commercially viable Neuromorphic Processing Unit (NPU), branded as the Neuromorphic Data Microscope (NDM). This product leverages architectural mechanisms derived from the sensory cortex of the human brain to efficiently implement pattern matching. LRL and Sandia National Labs have optimized the product for streaming analytics and demonstrated a 1,000x reduction in power per operation in an FPGA format. When reduced to an ASIC, the efficiency will improve to 1,000,000x. Additionally, the neuromorphic nature of the device gives it powerful computational attributes that are counterintuitive to those schooled in traditional von Neumann architectures. The Neuromorphic Data Microscope is the first of a broad class of brain-inspired, time-domain processors that will profoundly alter the functionality and economics of data processing.

Neurogenesis deep learning: Extending deep networks to accommodate new classes

Proceedings of the International Joint Conference on Neural Networks

Draelos, Timothy J.; Miner, Nadine E.; Lamb, Christopher L.; Cox, Jonathan A.; Vineyard, Craig M.; Carlson, Kristofor D.; Severa, William M.; James, Conrad D.; Aimone, James B.

Neural machine learning methods, such as deep neural networks (DNN), have achieved remarkable success in a number of complex data processing tasks. These methods have arguably had their strongest impact on tasks such as image and audio processing: data processing domains in which humans have long held clear advantages over conventional algorithms. In contrast to biological neural systems, which are capable of learning continuously, deep artificial networks have a limited ability to incorporate new information into an already trained network. As a result, methods for continuous learning are potentially highly impactful in enabling the application of deep networks to dynamic data sets. Here, inspired by the process of adult neurogenesis in the hippocampus, we explore the potential for adding new neurons to deep layers of artificial neural networks in order to facilitate their acquisition of novel information while preserving previously trained data representations. Our results on the MNIST handwritten digit dataset and the NIST SD 19 dataset, which includes lower- and upper-case letters as well as digits, demonstrate that neurogenesis is well suited for addressing the stability-plasticity dilemma that has long challenged adaptive machine learning algorithms.
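
The core mechanism, growing a trained layer with randomly initialized "newborn" units while leaving existing weights untouched so earlier representations are preserved, can be sketched in a few lines. The NumPy snippet below is an illustrative toy under that assumption, not the paper's neurogenesis deep learning algorithm; the layer sizes and function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_neurons(W, b, n_new, init_scale=0.01):
    """Grow a fully connected layer by n_new output units.

    Trained rows of W and entries of b are kept verbatim; the new
    units start near zero so subsequent fine-tuning can shape them
    without disturbing previously learned representations.
    """
    n_out, n_in = W.shape
    W_grown = np.vstack([W, rng.normal(0.0, init_scale, size=(n_new, n_in))])
    b_grown = np.concatenate([b, np.zeros(n_new)])
    return W_grown, b_grown

# Hypothetical layer: 784 inputs (e.g., MNIST pixels) -> 100 hidden units,
# grown by 10 "newborn" units when novel classes arrive.
W = rng.normal(0.0, 0.1, size=(100, 784))
b = np.zeros(100)
W2, b2 = add_neurons(W, b, n_new=10)
assert W2.shape == (110, 784) and np.allclose(W2[:100], W)
```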

A novel digital neuromorphic architecture efficiently facilitating complex synaptic response functions applied to liquid state machines

Proceedings of the International Joint Conference on Neural Networks

Smith, Michael R.; Hill, Aaron J.; Carlson, Kristofor D.; Vineyard, Craig M.; Donaldson, Jonathon W.; Follett, David R.; Follett, Pamela L.; Naegle, John H.; James, Conrad D.; Aimone, James B.

Information in neural networks is represented as weighted connections, or synapses, between neurons. This poses a problem because the primary computational bottleneck for neural networks is the vector-matrix multiply, in which inputs are multiplied by the network weights. Conventional processing architectures are not well suited for simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections but exhibit a nonlinear response function as neurotransmitters are emitted and diffuse between neurons. Inspired by neuroscience principles, we present a digital neuromorphic architecture, the Spiking Temporal Processing Unit (STPU), capable of modeling arbitrarily complex synaptic response functions without requiring additional hardware components. We consider the paradigm of spiking neurons with temporally coded information, as opposed to the non-spiking, rate-coded neurons used in most neural networks. In this paradigm we examine liquid state machines applied to speech recognition and show how a liquid state machine with temporal dynamics maps onto the STPU, demonstrating the flexibility and efficiency of the STPU for instantiating neural algorithms.
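
To make the idea of a non-binary synaptic response concrete, the sketch below convolves a spike train with an alpha-function kernel, a common textbook model of neurotransmitter rise and decay. This is a generic software illustration of a complex synaptic response, not the STPU hardware implementation; the kernel choice, weights, and time constants are assumptions.

```python
import numpy as np

def alpha_kernel(t, tau=5.0):
    """Alpha-function response: zero before the spike, peaking at t = tau (ms)."""
    return np.where(t >= 0.0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def synaptic_current(spike_times_ms, weight, t_grid, tau=5.0):
    """Superpose one kernel per presynaptic spike, scaled by the synapse weight.

    Unlike a binary connection, the postsynaptic effect is a smooth,
    nonlinear function of spike timing rather than a fixed on/off value.
    """
    current = np.zeros_like(t_grid)
    for t_s in spike_times_ms:
        current += weight * alpha_kernel(t_grid - t_s, tau)
    return current

t = np.arange(0.0, 100.0, 0.1)                        # 100 ms at 0.1 ms resolution
i_syn = synaptic_current([10.0, 12.0, 40.0], 0.8, t)  # three input spikes
```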

Optimization-based computation with spiking neurons

Proceedings of the International Joint Conference on Neural Networks

Verzi, Stephen J.; Vineyard, Craig M.; Vugrin, Eric D.; Galiardi, Meghan; James, Conrad D.; Aimone, James B.

Considerable effort is currently being spent designing neuromorphic hardware for addressing challenging problems in a variety of pattern-matching applications. These neuromorphic systems offer low-power architectures with intrinsically parallel, simple spiking neuron processing elements. Unfortunately, these new hardware architectures have been developed largely without a clear justification for using spiking neurons to compute quantities for problems of interest. Specifically, the use of spiking to encode information in time has not been explored theoretically, with complexity analysis, to examine the operating conditions under which neuromorphic computing provides a computational advantage (time, space, power, etc.). In this paper, we present and formally analyze the use of temporal coding in a neural-inspired algorithm for optimization-based computation in spiking neural architectures.
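
As a flavor of what temporal coding can buy, consider computing a minimum: if each neuron fires at a time equal to its encoded value, the first spike to arrive is the answer, so a parallel implementation finishes in time proportional to the smallest value rather than performing pairwise comparisons. The toy below is an event-driven software analogy under that assumption, not the paper's formal spiking construction.

```python
def temporal_min(values):
    """Time-to-first-spike minimum.

    Encode: neuron i fires at time values[i].
    Decode: a readout latches the first spike it sees; with one neuron
    per value running in parallel, the race finishes after min(values)
    time units, independent of how many values there are.
    """
    # Software stand-in for the parallel race: order events by spike time.
    spike_events = sorted((t, i) for i, t in enumerate(values))
    first_time, winner = spike_events[0]
    return first_time, winner

print(temporal_min([7.0, 3.0, 9.5, 4.2]))  # -> (3.0, 1)
```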

A combinatorial model for dentate gyrus sparse coding

Neural Computation

Severa, William M.; Parekh, Ojas D.; James, Conrad D.; Aimone, James B.

The dentate gyrus forms a critical link between the entorhinal cortex and CA3 by providing a sparse version of the signal. Concurrent with this increase in sparsity, a widely accepted theory suggests that the dentate gyrus performs pattern separation: similar inputs yield decorrelated outputs. Although the dentate gyrus (DG) is an active region of study and theory, few logically rigorous arguments detail its coding. We suggest a theoretically tractable, combinatorial model for this action. The model provides formal methods for a highly redundant, arbitrarily sparse, and decorrelated output signal. To explore the value of this model framework, we assess how suitable it is for two notable aspects of DG coding: how it can handle the highly structured grid cell representation in the input entorhinal cortex region, and the presence of adult neurogenesis, which has been proposed to produce a heterogeneous code in the DG. We find that tailoring the model to grid cell input yields expansion parameters consistent with the literature. In addition, the heterogeneous coding reflects activity gradation observed experimentally. Finally, we connect this approach with more conventional binary threshold neural circuit models via a formal embedding.
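
A conventional way to see the expansion-plus-sparsification idea such a model formalizes is a random projection into a larger space followed by a k-winners-take-all threshold: similar inputs then share only a few active units. The snippet below is that standard sketch with illustrative sizes; it is not the paper's combinatorial construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def kwta_code(x, W, k):
    """Project x into a larger space and keep only the k strongest units.

    Mimics EC -> DG expansion with sparse activity: the output dimension
    here is 10x the input, but only k units (2%) are ever active.
    """
    h = W @ x
    code = np.zeros_like(h)
    code[np.argsort(h)[-k:]] = 1.0
    return code

n_ec, n_dg, k = 100, 1000, 20            # illustrative sizes, not from the paper
W = rng.normal(size=(n_dg, n_ec))
x1 = rng.normal(size=n_ec)
x2 = x1 + 0.2 * rng.normal(size=n_ec)    # a similar but noisy input
c1, c2 = kwta_code(x1, W, k), kwta_code(x2, W, k)
print("active-unit overlap:", (c1 * c2).sum() / k)  # < 1: pattern separation
```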

Computational perspectives on adult neurogenesis

The Rewiring Brain: A Computational Approach to Structural Plasticity in the Adult Brain

Carlson, Kristofor D.; Rothganger, Fredrick R.; Aimone, James B.

The continuous integration of young neurons into the adult brain represents a novel form of structural plasticity and has inspired the creation of numerous computational models to understand the functional role of adult neurogenesis. These computational models consist of abstract models that focus on the utility of new neurons in simple neural networks and biologically based models constrained by anatomical data that explore the role of new neurons in specific neural circuits such as the hippocampus. Simulation results from both classes of models have suggested a number of theoretical roles for neurogenesis such as increasing the capacity to learn novel information, promoting temporal context encoding, and influencing pattern separation. In this review, we discuss strategies and findings of past computational modeling efforts, current challenges and limitations, and new computational approaches pertinent to modeling adult neurogenesis.

A historical survey of algorithms and hardware architectures for neural-inspired and neuromorphic computing applications

Biologically Inspired Cognitive Architectures

James, Conrad D.; Aimone, James B.; Miner, Nadine E.; Vineyard, Craig M.; Rothganger, Fredrick R.; Carlson, Kristofor D.; Mulder, Samuel A.; Draelos, Timothy J.; Faust, Aleksandra; Marinella, Matthew J.; Naegle, John H.; Plimpton, Steven J.

Biological neural networks continue to inspire new developments in algorithms and microelectronic hardware to solve challenging data processing and classification problems. Here, we survey the history of neural-inspired and neuromorphic computing in order to examine the complex and intertwined trajectories of the mathematical theory and hardware developed in this field. Early research focused on adapting existing hardware to emulate the pattern recognition capabilities of living organisms. Contributions from psychologists, mathematicians, engineers, neuroscientists, and other professionals were crucial to maturing the field from narrowly tailored demonstrations to more generalizable systems capable of addressing difficult problem classes such as object detection and speech recognition. Algorithms that leverage fundamental principles found in neuroscience, such as hierarchical structure, temporal integration, and robustness to error, have been developed, and some of these approaches are achieving world-leading performance on particular data classification tasks. In addition, novel microelectronic hardware is being developed to perform logic and to serve as memory in neuromorphic computing systems with optimized system integration and improved energy efficiency. Key to such advancements was the incorporation of new discoveries in neuroscience research, the transition away from strict structural replication and toward the functional replication of neural systems, and the use of mathematical theory frameworks to guide algorithm and hardware development.
