Publications

Results 51–75 of 86

Applying cognitive work analysis to a synthetic aperture radar system

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Cole, Kerstan S.; Adams, Susan S.; McNamara, Laura A.; Ganter, John H.

The purpose of the current study was to analyze the work of imagery analysts associated with Sagebrush, a Synthetic Aperture Radar (SAR) imaging system, using an adapted version of cognitive work analysis (CWA). This was achieved by conducting a work domain analysis (WDA) for the system under consideration. Another purpose of this study was to describe how we adapted the WDA framework to include a sequential component and a means to explicitly represent relationships between components. Lastly, we present a simplified work domain representation that we have found effective in communicating the importance of analysts' adaptive strategies to inform the research strategies of computational science researchers who want to develop useful algorithms, but who have little or no familiarity with sensor data analysis work. © 2014 Springer International Publishing.


Hierarchical task analysis of a synthetic aperture radar analysis process

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Adams, Susan S.; Cole, Kerstan S.; McNamara, Laura A.

Imagery analysts are given the difficult task of determining, post hoc, whether particular events of importance have occurred, using Synthetic Aperture Radar (SAR) images, written reports, and PowerPoint presentations to make their decision. We were asked to evaluate the current system analysis process and make recommendations for a future temporal geospatial analysis prototype that is envisioned to allow analysts to quickly search for temporal and spatial relationships between image-derived features. To that end, we conducted a hierarchical task analysis (HTA; [3], [6]) to understand the analysts' tasks and subtasks. We also implemented a timeline analysis and workload assessment [4] to better understand which tasks were the most time-consuming and perceived as the most effortful. Our results gave the team clear recommendations and requirements for a prototype. © 2014 Springer International Publishing.


Evaluating information visualizations with working memory metrics

Communications in Computer and Information Science

Bandlow, Alisa; Matzen, Laura E.; Cole, Kerstan S.; Dornburg, Courtney S.; Geiseler, Charles J.; Greenfield, John A.; McNamara, Laura A.; Adams, Susan S.

Information visualization tools are being promoted to aid decision support. These tools assist in the analysis and comprehension of ambiguous and conflicting data sets. Formal evaluations are necessary to demonstrate the effectiveness of visualization tools, yet conducting these studies is difficult. Objective metrics that allow designers to compare the amount of work required for users to operate a particular interface are lacking. This in turn makes it difficult to compare workload across different interfaces, which is problematic for complicated information visualization and visual analytics packages. We believe that measures of working memory load can provide a more objective and consistent way of assessing visualizations and user interfaces across a range of applications. We present initial findings from a study that used measures of working memory load to compare the usability of two graph representations. © 2011 Springer-Verlag.


Why Models Don't Forecast

McNamara, Laura A.

The title of this paper, Why Models Don't Forecast, has a deceptively simple answer: models don't forecast because people forecast. Yet this statement has significant implications for computational social modeling and simulation in national security decision making. Specifically, it points to the need for robust approaches to the problem of how people and organizations develop, deploy, and use computational modeling and simulation technologies. In the pages that follow, I argue that the challenge of evaluating computational social modeling and simulation technologies extends far beyond verification and validation, and should include the relationship between a simulation technology and the people and organizations using it. This challenge of evaluation is not just one of usability and usefulness for technologies, but extends to the assessment of how new modeling and simulation technologies shape human and organizational judgment. The robust and systematic evaluation of organizational decision making processes, and of the role of computational modeling and simulation technologies therein, is a critical problem for the organizations that promote, fund, develop, and seek to use computational social science tools, methods, and techniques in high-consequence decision making.
