Publications

Experimental Wargaming with SIGNAL

Military Operations Research

Letchford, Joshua L.; Epifanovskaya, Laura E.; Lakkaraju, Kiran L.; Armenta, Mika; Reddie, Andrew W.; Whetzel, Jonathan H.; Reinhardt, Jason C.; Chen, Andrew C.; Fabian, Nathan D.; Hingorani, Sheryl H.; Iyer, Roshani I.; Krishman, Roshan K.; Laderman, Sarah L.; Lee, Mansook L.; Mohan, Jahani M.; Nacht, Michael; Prakkamakul, Soravis P.; Sumner, Mathew S.; Tibbets, Jake T.; Valdez, Allie V.; Zhang, Charlie Z.

Abstract not provided.

Information Design for XR Immersive Environments: Challenges and Opportunities

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Raybourn, Elaine M.; Stubblefield, William A.; Trumbo, Michael; Jones, Aaron P.; Whetzel, Jonathan H.; Fabian, Nathan D.

Cross Reality (XR) immersive environments offer challenges and opportunities in designing for cognitive aspects (e.g., learning, memory, and attention) of information design and interactions. Information design is a multidisciplinary endeavor involving data science, communication science, cognitive science, media, and technology. In the present paper, the holodeck metaphor is extended to illustrate how information design practices and some of the qualities of this imaginary computationally augmented environment (a.k.a. the holodeck) may be achieved in XR environments to support information-rich storytelling and real-life, face-to-face, and virtual collaborative interactions. The Simulation Experience Design Framework & Method is introduced to organize challenges and opportunities in the design of information for XR. The notion of carefully blending both real and virtual spaces to achieve total immersion is discussed as the reader moves through the elements of the cyclical framework. A solution space leveraging cognitive science, information design, and transmedia learning highlights key challenges facing contemporary XR designers. Challenges include, but are not limited to, interleaving information, technology, and media into the human storytelling process, and supporting narratives in a way that is memorable, robust, and extendable.

Grandmaster: Interactive Text-Based Analytics of Social Media

Proceedings - 15th IEEE International Conference on Data Mining Workshop, ICDMW 2015

Fabian, Nathan D.; Davis, Warren L.; Raybourn, Elaine M.; Lakkaraju, Kiran L.; Whetzel, Jonathan H.

People use social media resources like Twitter, Facebook, forums, etc. to share and discuss various activities or topics. By aggregating topic trends across many individuals using these services, we seek to construct a richer profile of a person's activities and interests as well as provide a broader context of those activities. This profile may then be used in a variety of ways to understand groups as a collection of interests and affinities and an individual's participation in those groups. Our approach considers that much of these data will be unstructured, free-form text. By analyzing free-form text directly, we may be able to gain an implicit grouping of individuals with shared interests based on shared conversation, and not on explicit social software linking them. In this paper, we discuss a proof-of-concept application called Grandmaster built to pull short sections of text, a person's comments or Twitter posts, together by analysis and visualization to allow a gestalt understanding of the full collection of all individuals: how groups are similar and how they differ, based on their text inputs.
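
As a rough illustration of the approach the abstract describes, the sketch below groups users by shared vocabulary in their free-form text rather than by explicit social links. It is not the authors' implementation; it assumes scikit-learn, and the users and posts are invented.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# One "document" per user: in practice, concatenate each user's posts/comments.
posts = {
    "user_a": "training exercise after action review and debrief notes",
    "user_b": "debrief notes and after action review from the field exercise",
    "user_c": "gpu rendering benchmarks for the new visualization cluster",
    "user_d": "visualization cluster gpu benchmark results and rendering times",
}

users = list(posts)
vectors = TfidfVectorizer().fit_transform([posts[u] for u in users])

# Cluster users by shared vocabulary, not by explicit social links.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
for user, label in zip(users, labels):
    print(user, "-> group", label)
```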

Data privacy and security considerations for personal assistants for learning (PAL)

International Conference on Intelligent User Interfaces, Proceedings IUI

Raybourn, Elaine M.; Fabian, Nathan D.; Davis, Warren L.; Parks, Raymond C.; McClain, Jonathan T.; Trumbo, Derek T.; Regan, Damon; Durlach, Paula J.

A hypothetical scenario is utilized to explore privacy and security considerations for intelligent systems, such as a Personal Assistant for Learning (PAL). Two categories of potential concerns are addressed: factors facilitated by user models, and factors facilitated by systems. Among the strategies presented for risk mitigation is a call for ongoing, iterative dialog among privacy, security, and personalization researchers during all stages of development, testing, and deployment.

Canaries in a coal mine: Using application-level checkpoints to detect memory failures

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Widener, Patrick W.; Ferreira, Kurt B.; Levy, Scott; Fabian, Nathan D.

Memory failures in future extreme scale applications are a significant concern in the high-performance computing community and have attracted much research attention. We contend in this paper that using application checkpoint data to detect memory failures has potential benefits and is preferable to examining application memory. To support this contention, we describe the application of machine learning techniques to evaluate the veracity of checkpoint data. Our preliminary results indicate that supervised decision tree machine learning approaches can effectively detect corruption in restart files, suggesting that future extreme-scale applications and systems may benefit from incorporating such approaches in order to cope with memory failures.
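
The following sketch illustrates the general idea of training a supervised decision tree to flag corrupted restart data. It is a toy reconstruction, not the paper's code: the summary-statistic features and the injected fault are assumptions, and it relies on NumPy and scikit-learn.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def features(block):
    # Summary statistics stand in for whatever a real checkpoint format exposes.
    return [block.mean(), block.std(), np.abs(np.diff(block)).max()]

clean = [rng.normal(0.0, 1.0, 1024) for _ in range(200)]
corrupt = []
for block in clean:
    bad = block.copy()
    bad[rng.integers(bad.size)] = 1e30  # injected bit-flip-like fault
    corrupt.append(bad)

X = np.array([features(b) for b in clean + corrupt])
y = np.array([0] * len(clean) + [1] * len(corrupt))  # 0 = clean, 1 = corrupt

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print("training accuracy:", clf.score(X, y))
```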

Data co-processing for extreme scale analysis level II ASC milestone (4745)

Rogers, David R.; Moreland, Kenneth D.; Oldfield, Ron A.; Fabian, Nathan D.

Exascale supercomputing will embody many revolutionary changes in the hardware and software of high-performance computing. A particularly pressing issue is gaining insight into the science behind the exascale computations. Power and I/O speed constraints will fundamentally change current visualization and analysis workflows. A traditional post-processing workflow involves storing simulation results to disk and later retrieving them for visualization and data analysis. However, at exascale, scientists and analysts will need a range of options for moving data to persistent storage, as the current offline or post-processing pipelines will not be able to capture the data necessary for data analysis of these extreme scale simulations. This Milestone explores two alternate workflows, characterized as in situ and in transit, and compares them. We find each to have its own merits and faults, and we provide information to help pick the best option for a particular use.
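
As a toy illustration of the two workflows the milestone compares, the sketch below contrasts synchronous in situ analysis inside the simulation loop with an asynchronous in transit hand-off to a separate consumer. It uses a Python thread as a stand-in for staging resources; no real I/O or staging library is involved.

```python
import queue
import threading

def analyze(step, data):
    print(f"step {step}: mean = {sum(data) / len(data):.3f}")

def simulate(n_steps):
    for step in range(n_steps):
        yield step, [float(step + i) for i in range(4)]  # fake field data

# In situ: analysis runs synchronously inside the solver's loop,
# sharing its memory and stealing its cycles.
for step, data in simulate(3):
    analyze(step, data)

# In transit: the solver hands data off and keeps computing while a
# separate consumer (a thread here, staging nodes in reality) analyzes it.
staged = queue.Queue()

def consumer():
    for item in iter(staged.get, None):  # drain until the None sentinel
        analyze(*item)

worker = threading.Thread(target=consumer)
worker.start()
for step, data in simulate(3):
    staged.put((step, data))  # asynchronous hand-off; the solver moves on
staged.put(None)
worker.join()
```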

Demonstration of a Legacy Application's Path to Exascale - ASC L2 Milestone 4467

Barrett, Brian B.; Kelly, Suzanne M.; Klundt, Ruth A.; Laros, James H.; Leung, Vitus J.; Levenhagen, Michael J.; Lofstead, Gerald F.; Moreland, Kenneth D.; Oldfield, Ron A.; Pedretti, Kevin P.; Rodrigues, Arun; Barrett, Richard F.; Ward, Harry L.; Vandyke, John P.; Vaughan, Courtenay T.; Wheeler, Kyle B.; Brandt, James M.; Brightwell, Ronald B.; Curry, Matthew L.; Fabian, Nathan D.; Ferreira, Kurt; Gentile, Ann C.; Hemmert, Karl S.

Abstract not provided.

Report of experiments and evidence for ASC L2 milestone 4467 : demonstration of a legacy application's path to exascale

Barrett, Brian B.; Kelly, Suzanne M.; Klundt, Ruth A.; Laros, James H.; Leung, Vitus J.; Levenhagen, Michael J.; Lofstead, Gerald F.; Moreland, Kenneth D.; Oldfield, Ron A.; Pedretti, Kevin P.; Rodrigues, Arun; Barrett, Richard F.; Ward, Harry L.; Vandyke, John P.; Vaughan, Courtenay T.; Wheeler, Kyle B.; Brandt, James M.; Brightwell, Ronald B.; Curry, Matthew L.; Fabian, Nathan D.; Ferreira, Kurt; Gentile, Ann C.; Hemmert, Karl S.

This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467, Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance will be needed to identify the source of an issue, in strong combination with knowledge of system software and application source code.

Real-time individualized training vectors for experiential learning

Fabian, Nathan D.; Glickman, Matthew R.

Military training utilizing serious games or virtual worlds potentially generates data that can be mined to better understand how trainees learn in experiential exercises. Few data mining approaches for deployed military training games exist. Opportunities exist to collect and analyze these data, as well as to construct a full-history learner model. Outcomes discussed in the present document include results from a quasi-experimental research study on military game-based experiential learning, the deployment of an online game for training evidence collection, and results from a proof-of-concept pilot study on the development of individualized training vectors. This Lab Directed Research & Development (LDRD) project leveraged products within projects such as Titan (Network Grand Challenge), the Real-Time Feedback and Evaluation System (America's Army Adaptive Thinking and Leadership, DARWARS Ambush! NK), and Dynamic Bayesian Networks to investigate whether machine learning capabilities could compute real-time, in-game similarity vectors of learner performance, toward adaptation of content delivery and quantitative measurement of experiential learning.
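
One way to read "similarity vectors of learner performance" is as pairwise similarity between learners' performance feature vectors; the sketch below shows that interpretation with cosine similarity. The feature columns are invented for the example and are not taken from the project.

```python
import numpy as np

# Rows: learners; columns: invented metrics (score, time on task, errors, retries).
performance = np.array([
    [0.9, 120.0, 2.0, 1.0],
    [0.8, 130.0, 3.0, 1.0],
    [0.3, 300.0, 9.0, 5.0],
])

# Normalize each row; cosine similarity is then a single matrix product.
norms = np.linalg.norm(performance, axis=1, keepdims=True)
unit = performance / norms
similarity = unit @ unit.T
print(np.round(similarity, 3))  # learners 0 and 1 come out most alike
```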

Visualization on supercomputing platform level II ASC milestone (3537-1B) results from Sandia

Moreland, Kenneth D.; Fabian, Nathan D.

This report provides documentation for the completion of the Sandia portion of the ASC Level II Visualization on the platform milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratories and Los Alamos National Laboratory. This milestone contains functionality required for performing visualization directly on a supercomputing platform, which is necessary for petascale visualization. Sandia's contribution concerns in-situ visualization, running a visualization in tandem with a solver. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk, we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. Scientific simulation on parallel supercomputers is traditionally performed in four sequential steps: meshing, partitioning, solver, and visualization. Not all of these components are necessarily run on the supercomputer. In particular, the meshing and visualization typically happen on smaller but more interactive computing resources. However, the previous decade has seen a growth in both the need and ability to perform scalable parallel analysis, and this gives motivation for coupling the solver and visualization.

Beyond game effectiveness. Part II, a qualitative study of multi-role experiential learning

Fabian, Nathan D.; Tucker, Eilish M.

The present paper is the second in a series published at I/ITSEC that seeks to explain the efficacy of multi-role experiential learning employed to create engaging game-based training methods transitioned to the U.S. Army, U.S. Army Special Forces, Civil Affairs, and Psychological Operations teams. The first publication (I/ITSEC 2009) summarized findings from a quantitative study that investigated experiential learning in the multi-player, PC-based game module transitioned to PEO-STRI, DARWARS Ambush! NK (non-kinetic). The 2009 publication reported that participants of multi-role (Player and Reflective Observer/Evaluator) game-based training reported statistically significant learning and engagement. Additionally, when the means of the two groups (Player and Reflective Observer/Evaluator) were compared, they were not statistically significantly different from each other. That is to say, both playing and observing/evaluating were engaging learning modalities. The Observer/Evaluator role was designed to provide an opportunity for real-time reflection and meta-cognitive learning during game play. Results indicated that this role was an engaging way to learn about communication, that participants learned something about cultural awareness, and that the skills they learned were helpful in problem solving and decision-making.

The present paper seeks to continue to understand what and how users of non-kinetic game-based missions learn by revisiting the 2009 quantitative study with further investigation such as stochastic player performance analysis using latent semantic analyses and graph visualizations. The results are applicable to First-Person game-based learning systems designed to enhance trainee intercultural communication, interpersonal skills, and adaptive thinking. In the full paper, we discuss results obtained from data collected from 78 research participants of diverse backgrounds who trained by engaging in tasks directly, as well as observing and evaluating peer performance in real-time. The goal is two-fold. One is to quantify and visualize detailed player performance data coming from game play transcription to give further understanding to the results in the 2009 I/ITSEC paper. The second is to develop a set of technologies from this quantification and visualization approach into a generalized application tool to be used to aid in future games’ development of player/learner models and game adaptation algorithms.

Specifically, this paper addresses questions such as, “Are there significant differences in one's experience when an experiential learning task is observed first, and then performed by the same individual?” “Are there significant differences among groups participating in different roles in non-kinetic engagement training, especially when one role requires more active participation than the other?” “What is the impact of behavior modeling on learning in games?” In answering these questions, the present paper reinforces the 2009 empirical study conclusion that, contrary to current trends in military game development, experiential learning is enhanced by innovative training approaches designed to facilitate trainee mastery of reflective observation and abstract conceptualization as much as performance-based skills.
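
As a hedged sketch of the latent semantic analysis step the abstract mentions, the example below embeds a few invented transcript lines with TF-IDF followed by truncated SVD (a standard LSA pipeline, assuming scikit-learn); it is illustrative, not the study's actual analysis.

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

transcript = [
    "player greets the village elder and asks about the water supply",
    "observer notes a respectful greeting and open-ended questions",
    "player demands information and the elder ends the meeting",
    "observer notes a confrontational tone and loss of rapport",
]

tfidf = TfidfVectorizer().fit_transform(transcript)
# Truncated SVD on the TF-IDF matrix is the classic LSA dimensionality reduction.
embedded = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)
print(embedded.round(3))  # each line as a point in a 2-D latent semantic space
```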

Scalable analysis tools for sensitivity analysis and UQ (3160) results

Ice, Lisa I.; Fabian, Nathan D.; Moreland, Kenneth D.; Bennett, Janine C.; Karelitz, David B.

The ASC Level 2 milestone Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160), completed 9/30/2009, delivers feature recognition capability required by the user community for certain verification and validation tasks focused on sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met, including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.
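
The fragment identification capability could, in spirit, resemble connected-component labeling on a thresholded material field. The sketch below shows that generic idea with SciPy on synthetic data; it is an assumption-laden stand-in, not the CTH-coupled milestone code.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
density = rng.random((64, 64))  # synthetic stand-in for a material field

solid = density > 0.8  # threshold separating material from void
labels, n_fragments = ndimage.label(solid)  # connected components = fragments
sizes = ndimage.sum(solid, labels, index=range(1, n_fragments + 1))
print(n_fragments, "fragments; largest spans", int(sizes.max()), "cells")
```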

Detecting Combustion and Flow Features In Situ Using Principal Component Analysis

Grout, Ray G.; Bennett, Janine C.; Fabian, Nathan D.

This report presents progress on identifying and classifying features involving combustion in turbulent flow using principal component analysis (PCA) and k-means clustering within an in situ analysis framework. We describe a process for extracting temporally- and spatially-varying information from the simulation, classifying the information, and then applying the classification algorithm either to other portions of the simulation not used for training the classifier or to further simulations. Because the regions classified as being of interest take up a small portion of the overall simulation domain, it will consume fewer resources to perform further analysis or save these regions at a higher fidelity than previously possible. The implementation of this process is partially complete, and results obtained from PCA of test data are presented that indicate the process may have merit: the basis vectors that PCA provides are significantly different in regions where combustion is occurring, and even when all 21 species of a lifted flame simulation are correlated, the computational cost of PCA is minimal. What remains to be determined is whether k-means (or other) clustering techniques will be able to identify combined combustion and flow features with an accuracy that makes further characterization of these regions feasible and meaningful.
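
A minimal sketch of the two-stage process the report describes, on synthetic data: project per-cell state vectors with PCA, then cluster in the reduced space to flag a small region of interest. The shapes and the shifted subpopulation are assumptions for illustration; this is not the report's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 1000 "cells" with 21 variables each (e.g., species mass fractions); a small
# shifted subpopulation mimics cells where combustion is occurring.
cells = rng.normal(0.0, 1.0, (1000, 21))
cells[:50] += 4.0

reduced = PCA(n_components=3).fit_transform(cells)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)

# Treat the smaller cluster as the candidate feature region to save or refine.
minority = 1 if labels.sum() < len(labels) / 2 else 0
flagged = labels == minority
print(flagged.sum(), "of", len(cells), "cells flagged")
```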
