Publications


SPARR: Spiking/Processing Array for Wide Dynamic Range and High Resolution Photonic Sensing

Hays, Park H.; Kagie, Matthew J.; Karelitz, David B.; Kay, Randolph; Mincey, John S.; Woods, Mark C.

The Spiking/Processing Array (SPARR) is a novel photonic focal plane that uses pixels which generate electronic spikes autonomously and without a clock. These spikes feed into a network of digital asynchronous processing elements (DAPEs). By building a useful assemblage of DAPEs and connecting them together in the correct way, sophisticated signal processing can be accomplished within the focal plane. Autonomous self-resetting pixels (ASPs) enable SPARR to generate an electronic response from very small signals, ranging from a single photon in the case of Geiger-mode avalanche photodiodes to as few as several hundred photons for in-CMOS photodetectors. These spiking pixels enable fast detector response but do not draw as much continuous power as synchronous clocked designs. The spikes emitted by the pixels all have the same magnitude; the information from the scene is effectively encoded in the rate of spikes and the time at which each spike is emitted. The spiking pixels, having converted incident light into electronic spikes, supply the spikes to a network of digital asynchronous processors. These are small state machines that respond to the spikes arriving at their input ports by either remaining unchanged or updating their internal state and possibly emitting a spike on one or more output ports. We show a design that accomplishes the sophisticated signal processing of a Haar spatial wavelet transform with spatial-spectral whitening. We furthermore show how this design results in data streams that support imaging and transient optical source detection. Two simulators support this analysis: SPICE and Sparrow. The Cadence SPICE simulator provides accurate CMOS design, accounting for the effects of circuit parasitics throughout the layout, accurate timing, and accurate energy-consumption estimates. To assess larger networks with more pixels more rapidly, Sparrow, a custom discrete-event simulator, supports the non-homogeneous Poisson processes that underlie photoelectric interaction. Sparrow is a photon-exact simulator that nevertheless performs SPARR system simulation at large scale.
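As an illustration of the pixel-plus-DAPE idea described above, the following is a minimal sketch of a non-homogeneous Poisson spike source feeding a tiny asynchronous state machine. The intensity function and the divide-by-four element are illustrative assumptions, not details taken from the publication or from the Sparrow simulator.

```python
# Minimal sketch (not the SPARR/Sparrow code): a pixel modeled as a
# non-homogeneous Poisson spike source via thinning, feeding a tiny
# asynchronous processing element that forwards every 4th spike.
import math
import random

def nhpp_spike_times(intensity, t_end, lam_max, rng=random.Random(0)):
    """Sample spike times on [0, t_end) by Poisson thinning."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(lam_max)        # candidate inter-arrival time
        if t >= t_end:
            return spikes
        if rng.random() < intensity(t) / lam_max:
            spikes.append(t)                 # accept with probability lambda(t)/lam_max

class DivideByN:
    """Toy asynchronous element: emit one output spike per N input spikes."""
    def __init__(self, n=4):
        self.n, self.count = n, 0
    def on_spike(self, t):
        self.count += 1
        if self.count == self.n:
            self.count = 0
            return t                         # emit an output spike
        return None                          # state updated, no output

# Slowly varying scene intensity (photo-events per unit time), purely illustrative.
intensity = lambda t: 50.0 + 30.0 * math.sin(2 * math.pi * t)
pixel_spikes = nhpp_spike_times(intensity, t_end=1.0, lam_max=80.0)
dape = DivideByN(4)
out_spikes = [t for t in (dape.on_spike(s) for s in pixel_spikes) if t is not None]
print(len(pixel_spikes), "pixel spikes ->", len(out_spikes), "processed spikes")
```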

Evaluation of urban vehicle tracking algorithms

IEEE Aerospace Conference Proceedings

Love, Joshua A.; Hansen, Ross L.; Melgaard, David K.; Karelitz, David B.; Pitts, Todd A.; Byrne, Raymond H.

Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, blob tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment. The algorithms considered are: random sample consensus (RANSAC), Markov chain Monte Carlo data association (MCMCDA), tracklet inference from factor graphs, and a proximity tracker. Each algorithm was tested on a combination of real and simulated data and evaluated against a common set of metrics.
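Of the four algorithm families listed, the proximity tracker is the simplest to illustrate. Below is a minimal gated nearest-neighbor sketch operating on point detections; the gate radius and track bookkeeping are illustrative assumptions, not the implementation evaluated in the paper.

```python
# Minimal sketch of a gated nearest-neighbor ("proximity") tracker for point
# detections. Gate size and track seeding are illustrative assumptions.
import math

def proximity_track(frames, gate=5.0):
    """frames: list of lists of (x, y) detections, one list per time step."""
    tracks = []                                   # each track: list of (x, y)
    for detections in frames:
        unused = list(detections)
        for track in tracks:
            if not unused:
                break
            last = track[-1]
            # nearest remaining detection to the track's last position
            best = min(unused, key=lambda d: math.dist(d, last))
            if math.dist(best, last) <= gate:     # associate only inside the gate
                track.append(best)
                unused.remove(best)
        tracks.extend([d] for d in unused)        # unassociated detections seed new tracks
    return tracks

frames = [[(0, 0), (10, 10)], [(1, 0.5), (10.5, 11)], [(2, 1), (11, 12)]]
for t in proximity_track(frames):
    print(t)
```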

Large scale tracking algorithms

Byrne, Raymond H.; Hansen, Ross L.; Love, Joshua A.; Melgaard, David K.; Pitts, Todd A.; Karelitz, David B.; Zollweg, Joshua D.; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.

Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
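To illustrate the RANSAC idea in a tracking context, the sketch below fits a constant-velocity model to noisy (time, position) detections and separates inliers from clutter. The one-dimensional motion model, residual threshold, and iteration count are assumptions for illustration only, not the evaluated algorithm.

```python
# Minimal RANSAC sketch: fit a constant-velocity model x(t) = x0 + v*t to
# detections contaminated by clutter.
import random

def ransac_const_velocity(points, n_iter=200, tol=1.0, rng=random.Random(1)):
    """points: list of (t, x). Returns (x0, v, inliers) of the best model found."""
    best = (None, None, [])
    for _ in range(n_iter):
        (t1, x1), (t2, x2) = rng.sample(points, 2)
        if t1 == t2:
            continue
        v = (x2 - x1) / (t2 - t1)                 # velocity from the 2-point sample
        x0 = x1 - v * t1
        inliers = [(t, x) for t, x in points if abs(x0 + v * t - x) <= tol]
        if len(inliers) > len(best[2]):
            best = (x0, v, inliers)
    return best

# A target moving at v = 2 plus a few clutter detections.
detections = [(t, 2.0 * t + 0.1 * ((-1) ** t)) for t in range(10)]
detections += [(3, 17.0), (5, -4.0), (7, 30.0)]
x0, v, inliers = ransac_const_velocity(detections)
print(f"x0={x0:.2f}, v={v:.2f}, inliers={len(inliers)}")
```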

Derivation of an applied nonlinear Schroedinger equation

Pitts, Todd A.; Laine, Mark R.; Schwarz, Jens S.; Rambo, Patrick K.; Karelitz, David B.

We derive from first principles a mathematical physics model useful for understanding nonlinear optical propagation (including filamentation). All assumptions necessary for the development are clearly explained. We include the Kerr effect, Raman scattering, and ionization (as well as linear and nonlinear shock, diffraction and dispersion). We explain the phenomenological sub-models and each assumption required to arrive at a complete and consistent theoretical description. The development includes the relationship between shock and ionization and demonstrates why inclusion of Drude model impedance effects alters the nature of the shock operator.
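For orientation, the generic textbook form of a nonlinear Schroedinger-type envelope equation with diffraction, group-velocity dispersion, and Kerr self-focusing is shown below. This schematic form is illustrative only; it omits the Raman, ionization, and shock terms treated in the report and is not the applied equation derived there.

```latex
% Generic (illustrative) nonlinear Schroedinger-type envelope equation;
% NOT the full applied equation derived in the report.
\[
  \frac{\partial A}{\partial z}
  = \frac{i}{2k_0}\,\nabla_\perp^2 A
  \;-\; \frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial \tau^2}
  \;+\; i\,k_0 n_2\,|A|^2 A ,
\]
% where $A$ is the field envelope, $k_0$ the carrier wavenumber, $\beta_2$ the
% group-velocity dispersion coefficient, $n_2$ the Kerr (nonlinear) index, and
% $\tau$ the retarded time.
```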

Fundamental studies on initiation and evolution of multi-channel discharges and their application to next generation pulsed power machines

Schwarz, Jens S.; Savage, Mark E.; Lucero, Diego J.; Jaramillo, Deanna M.; Seals, Kelly G.; Pitts, Todd A.; Hautzenroeder, Brenna M.; Laine, Mark R.; Karelitz, David B.; Porter, John L.

Future pulsed power systems may rely on linear transformer driver (LTD) technology. The LTDs will be the building blocks for a driver that can deliver higher current than the Z-Machine. The LTDs would require tens of thousands of low-inductance (< 85 nH), high-voltage (200 kV DC) switches with high reliability and long lifetime (10^4 shots). Sandia's Z-Machine employs 36 megavolt-class switches that are laser triggered by a single channel discharge. This is feasible for tens of switches, but the high inductance and short switch lifetime associated with the single channel discharge are undesirable for future machines. Thus the fundamental problem is how to lower inductance and losses while increasing switch lifetime and reliability. These goals can be achieved by increasing the number of current-carrying channels. The rail gap switch is ideal for this purpose. Although those switches have been extensively studied during the past decades, each effort has only characterized a particular switch. There is no comprehensive understanding of the underlying physics that would allow predictive capability for arbitrary switch geometry. We have studied rail gap switches via an extensive suite of advanced diagnostics in synergy with theoretical physics and advanced modeling capability. Design and topology of multichannel switches as they relate to discharge dynamics are investigated. This involves electrically and optically triggered rail gaps, as well as discrete multi-site switch concepts.
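To see why adding channels lowers inductance, a back-of-the-envelope estimate (not from the report, and neglecting mutual coupling between channels): N similar arc channels in parallel, each with effective inductance L_ch, give a switch inductance of roughly

```latex
% Illustrative parallel-channel estimate; neglects mutual inductance between channels.
\[
  L_{\mathrm{switch}} \;\approx\; \frac{L_{\mathrm{ch}}}{N} .
\]
```

So a hypothetical 200 nH single-channel discharge would need roughly three parallel channels to come in under an 85 nH budget; mutual coupling between closely spaced channels raises this estimate.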

Scalable analysis tools for sensitivity analysis and UQ (3160) results

Ice, Lisa I.; Fabian, Nathan D.; Moreland, Kenneth D.; Bennett, Janine C.; Karelitz, David B.

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capabilities required by the user community for certain verification and validation tasks focused on sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met, including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.
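The general idea behind crater and fragment identification can be illustrated with connected-component labeling of a thresholded field. The sketch below uses a synthetic array, an assumed threshold, and scipy.ndimage; it is not the CTH feature recognition code referenced in the milestone.

```python
# Minimal sketch of feature (fragment/crater-like region) identification by
# connected-component labeling of a thresholded field. Synthetic data and
# threshold are illustrative assumptions.
import numpy as np
from scipy import ndimage

# Synthetic "material volume fraction" field with two separated blobs.
field = np.zeros((8, 8))
field[1:3, 1:3] = 0.9
field[5:7, 4:7] = 0.7

mask = field > 0.5                        # cells considered "material"
labels, n_features = ndimage.label(mask)  # connected-component labeling

for i in range(1, n_features + 1):
    cells = np.argwhere(labels == i)
    size = len(cells)                     # cell count as a simple size measure
    centroid = cells.mean(axis=0)
    print(f"feature {i}: {size} cells, centroid {centroid}")
```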

Post-processing V&V Level II ASC Milestone (2843) results

Moreland, Kenneth D.; Wilke, Jason W.; Attaway, Stephen W.; Karelitz, David B.

The 9/30/2008 ASC Level 2 Post-Processing V&V Milestone (Milestone 2843) contains functionality required by the user community for certain verification and validation tasks. These capabilities include fragment detection from CTH simulation data, fragment characterization and analysis, and fragment sorting and display operations. The capabilities were tested extensively on both sample and actual simulations. In addition, a number of stretch criteria were met, including a comparison between simulated and test data and the ability to output each fragment as an individual geometric file.
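As a sketch of the fragment sorting and per-fragment output steps, the snippet below sorts toy fragment records by mass and writes each to its own file. The Fragment record, the mass-based sort key, and the plain-text output format are illustrative assumptions, not the geometric file format used in the milestone.

```python
# Minimal sketch of fragment sorting and per-fragment output using a toy
# Fragment record; not the milestone's CTH/geometry pipeline.
from dataclasses import dataclass

@dataclass
class Fragment:
    fid: int
    mass: float
    points: list          # list of (x, y, z) vertices

fragments = [
    Fragment(1, 2.5, [(0, 0, 0), (1, 0, 0)]),
    Fragment(2, 7.1, [(4, 4, 0), (5, 4, 1), (5, 5, 1)]),
]

# Sort largest-first so the most significant fragments are written first.
for rank, frag in enumerate(sorted(fragments, key=lambda f: f.mass, reverse=True)):
    with open(f"fragment_{rank:03d}.xyz", "w") as fh:
        fh.write(f"# id={frag.fid} mass={frag.mass}\n")
        for x, y, z in frag.points:
            fh.write(f"{x} {y} {z}\n")
```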

Post-processing V&V level II ASC milestone (2360) results

Moreland, Kenneth D.; Chavez, Elmer A.; Weirs, Vincent G.; Brunner, Thomas A.; Trucano, Timothy G.; Karelitz, David B.

The 9/30/2007 ASC Level 2 Post-Processing V&V Milestone (Milestone 2360) contains functionality required by the user community for certain verification and validation tasks. These capabilities include loading of edge and face data on an Exodus mesh, run-time computation of an exact solution to a verification problem, delivery of results data from the server to the client, computation of an integral-based error metric, simultaneous loading of simulation and test data, and comparison of that data using visual and quantitative methods. The capabilities were tested extensively by performing a typical ALEGRA HEDP verification task. In addition, a number of stretch criteria were met including completion of a verification task on a 13 million element mesh.
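An integral-based error metric of the kind mentioned above can be sketched as a volume-weighted L2 norm of the difference between a simulated field and the exact solution on a mesh. The arrays, cell volumes, and toy 1-D problem below are illustrative; this is not the server/client implementation used for the milestone.

```python
# Minimal sketch of an integral-based error metric: a volume-weighted L2 norm
# of (simulated - exact) over mesh cells. Data are synthetic and illustrative.
import numpy as np

def l2_error(sim, exact, cell_volumes):
    """Approximate sqrt( integral (sim - exact)^2 dV ) cell-by-cell."""
    diff2 = (np.asarray(sim) - np.asarray(exact)) ** 2
    return float(np.sqrt(np.sum(diff2 * np.asarray(cell_volumes))))

# Toy 1-D example: exact solution u(x) = x^2 sampled at cell centers.
x = np.linspace(0.05, 0.95, 10)          # cell centers
vol = np.full_like(x, 0.1)               # uniform cell "volumes"
exact = x ** 2
sim = exact + 0.01 * np.sin(10 * x)      # simulated field with a small error
print(f"L2 error = {l2_error(sim, exact, vol):.4e}")
```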
