Network segmentation of a power grid's communication system can make the grid more resilient to cyberattacks. Here we develop a novel trilevel programming model to optimally segment a grid communication system, taking into account the actions of an information technology (IT) administrator, attacker, and grid operator. The IT administrator is allowed to segment existing networks, and the attacker is given a budget to inflict damage on the grid by attacking the segmented communication system. Finally, the grid operator can redispatch the grid after the attack to minimize damage. The resulting problem is a trilevel interdiction problem that we solve using a branch-and-bound algorithm for bilevel problems. We demonstrate the benefits of optimal network segmentation through case studies on the 9-bus Western System Coordinating Council (WSCC) system and the 30-bus IEEE system. These examples illustrate that network segmentation can significantly reduce the threat posed by a cyberattacker.
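The defender-attacker-defender structure described above can be written schematically as a trilevel min-max-min problem (the notation here is generic, not taken from the paper):

```latex
\min_{s \in S} \; \max_{a \in A(s)} \; \min_{d \in D(s,a)} \; c(s, a, d)
```

where $s$ is the IT administrator's segmentation decision, $a$ is the attacker's budget-constrained attack on the segmented communication system, $d$ is the operator's post-attack redispatch, and $c(s,a,d)$ measures the resulting grid damage (e.g., load shed). The inner max-min pair is the bilevel problem the branch-and-bound algorithm targets.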
Widespread integration of social media into daily life has fundamentally changed the way society communicates, and, as a result, how individuals develop attitudes, personal philosophies, and worldviews. The spread of disinformation and misinformation enabled by this increased connectedness and streamlined communication has been extensively studied, simulated, and modeled. Less studied is the interaction of many pieces of misinformation, and the resulting formation of attitudes. We develop a framework for the simulation of attitude formation based on exposure to multiple cognitions. We allow a set of cognitions with some implicit relational topology to spread on a social network, which is defined with separate layers to specify online and offline relationships. An individual’s opinion on each cognition is determined by a process inspired by the Ising model for ferromagnetism. We conduct experiments with this framework to test the effect of topology, connectedness, and social media adoption on the ultimate prevalence of and exposure to certain attitudes.
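An Ising-inspired opinion update of the kind described above can be sketched as follows. This is a minimal illustration, not the paper's actual model: the data layout, the Glauber (heat-bath) update rule, and the way cognition couplings enter the local field are all our assumptions.

```python
import math
import random


def ising_opinion_step(opinions, neighbors, coupling, beta=1.0, rng=random):
    """One Glauber-dynamics update: a random individual reconsiders one cognition.

    opinions[i][c] in {-1, +1} is individual i's stance on cognition c;
    neighbors[i] lists individuals socially connected to i; coupling[c][d]
    is the relational strength between cognitions c and d. All names and
    structures here are illustrative, not taken from the paper.
    """
    i = rng.randrange(len(opinions))
    c = rng.randrange(len(opinions[i]))
    # Local "field": social pressure from neighbors' stances on the same
    # cognition, plus internal consistency with this individual's stances
    # on related cognitions.
    field = sum(opinions[j][c] for j in neighbors[i])
    field += sum(coupling[c][d] * opinions[i][d]
                 for d in range(len(coupling)) if d != c)
    # Heat-bath rule: probability of adopting +1 rises with the field;
    # beta plays the role of inverse temperature (higher = less noise).
    p_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
    opinions[i][c] = 1 if rng.random() < p_plus else -1
    return opinions
```

At high beta a unanimous population is a fixed point, which matches the ferromagnetic intuition: consensus is stable, and contested cognitions flip stochastically.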
This report summarizes the activities performed as part of the Science and Engineering of Cybersecurity by Uncertainty quantification and Rigorous Experimentation (SECURE) Grand Challenge LDRD project. We provide an overview of the research done in this project, including work on cyber emulation, uncertainty quantification, and optimization. We present examples of integrated analyses performed on two case studies: a network scanning/detection study and a malware command and control study. We highlight the importance of experimental workflows and list references of papers and presentations developed under this project. We outline lessons learned and suggestions for future work.
This document provides guidance for implementing personnel group FTE costs by JCA Tier 1 or 2 categories in the Contingency Contractor Optimization Tool – Engineering Prototype (CCOT-P). CCOT-P currently only allows FTE costs by personnel group to differ by mission. Changes will need to be made to the user interface input pages and the database.
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
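The chaining of complexity estimates described above can be illustrated with a toy computation: if each link in a string of reference systems gives a scope element's complexity relative to the next system in the chain, multiplying the links yields the factor relative to the base system. This is our illustrative reading of the chaining, not code from the SCORE model.

```python
from math import prod


def chained_factor(ratios):
    """Multiply a chain of relative-complexity ratios down to the base system.

    Each entry is the complexity of a scope element relative to the next
    reference system in the chain, with the last link ending at the base
    system. The multiplicative composition is an illustrative assumption.
    """
    return prod(ratios)
```

For example, a scope element estimated at 1.2x system A, with A's element at 0.5x system B and B's element at 2.0x the base, yields a factor of 1.2 relative to the base.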
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well understood reference system. The uncertainty associated with estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
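The weighted summation described above, with uncertainty carried alongside each estimate, can be sketched as follows. The field layout and the independence assumption used to combine uncertainties are ours, not taken from the SCORE documentation.

```python
import math


def total_complexity(elements):
    """Aggregate scope-element estimates into a total relative complexity.

    `elements` maps scope-element name -> (weight, mean, std), where mean is
    the SME-estimated complexity relative to the reference system and std
    captures the estimate's uncertainty. Assuming independent estimates,
    the weighted means add and the weighted variances add; both assumptions
    are illustrative simplifications.
    """
    mean = sum(w * m for w, m, _ in elements.values())
    var = sum((w * s) ** 2 for w, _, s in elements.values())
    return mean, math.sqrt(var)
```

A partial sum over a subset of elements gives the complexity of a portion of the option, mirroring the abstract's note about National Work Breakdown Structure codes.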
This report summarizes the work performed as part of a Laboratory Directed Research and Development project focused on evaluating and mitigating risk associated with biological dual use research of concern. The academic and scientific community has identified the funding stage as the appropriate place to intervene and mitigate risk, so the framework developed here uses a portfolio-level approach and balances biosafety and biosecurity risks, anticipated project benefits, and available mitigations to identify the best available investment strategies subject to cost constraints. The modeling toolkit was designed for decision analysis for dual use research of concern, but is flexible enough to support a wide variety of portfolio-level funding decisions where risk/benefit tradeoffs are involved. Two mathematical optimization models with two solution methods are included to accommodate stakeholders with varying levels of certainty about priorities between metrics. An example case study is presented.
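A portfolio-level selection of the kind described above can be illustrated with a toy model: choose the subset of projects that maximizes benefit net of risk subject to a cost constraint. The additive objective, the project fields, and the brute-force solution method are illustrative stand-ins for the report's two optimization models, not a description of them.

```python
from itertools import combinations


def best_portfolio(projects, budget):
    """Exhaustively pick the project subset maximizing benefit minus risk
    within a budget. Each project is a dict with illustrative fields
    "name", "cost", "benefit", and "risk"; the scalarized objective is a
    simplifying assumption (the report accommodates stakeholders with
    varying certainty about priorities between metrics).
    """
    best, best_score = (), float("-inf")
    for r in range(len(projects) + 1):
        for combo in combinations(projects, r):
            cost = sum(p["cost"] for p in combo)
            if cost > budget:
                continue  # infeasible under the cost constraint
            score = sum(p["benefit"] - p["risk"] for p in combo)
            if score > best_score:
                best, best_score = combo, score
    return list(best), best_score
```

Real instances would replace the exhaustive search with an integer program, but the feasibility/objective structure is the same.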
Sandia National Laboratories (Sandia) is in Phase 3 Sustainment of development of a prototype tool, currently referred to as the Contingency Contractor Optimization Tool - Prototype (CCOT-P), under the direction of OSD Program Support. CCOT-P is intended to help provide senior Department of Defense (DoD) leaders with comprehensive insight into the global availability, readiness and capabilities of the Total Force Mix. The CCOT-P will allow senior decision makers to quickly and accurately assess the impacts, risks and mitigating strategies for proposed changes to force/capabilities assignments, apportionments and allocations options, focusing specifically on contingency contractor planning. During Phase 2 of the program, conducted during fiscal year 2012, Sandia developed an electronic storyboard prototype of the Contingency Contractor Optimization Tool that can be used for communication with senior decision makers and other Operational Contract Support (OCS) stakeholders. Phase 3 used feedback from demonstrations of the electronic storyboard prototype to develop an engineering prototype for planners to evaluate. Sandia worked with the DoD and Joint Chiefs of Staff strategic planning community to get feedback and input to ensure that the engineering prototype was developed to closely align with future planning needs. The intended deployment environment was also a key consideration as this prototype was developed. Initial release of the engineering prototype was done on servers at Sandia in the middle of Phase 3. In 2013, the tool was installed on a production pilot server managed by the OUSD(AT&L) eBusiness Center. The purpose of this document is to specify the CCOT-P engineering prototype platform requirements as of May 2016. Sandia developed the CCOT-P engineering prototype using common technologies to minimize the likelihood of deployment issues.
The CCOT-P engineering prototype was architected and designed to be as independent as possible of the major deployment components such as the server hardware, the server operating system, the database, and the web server. This document describes the platform requirements, the architecture, and the implementation details of the CCOT-P engineering prototype.
The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve this data as well as updates and insertions of new input data.
This requirements document serves as an addendum to the Contingency Contractor Optimization Phase 2, Requirements Document [1] and Phase 3 Requirements Document [2]. The Phase 2 Requirements document focused on the high-level requirements for the tool. The Phase 3 Requirements document provided more detailed requirements to which the engineering prototype was built in Phase 3. This document will provide detailed requirements for features and enhancements being added to the production pilot in the Phase 3 Sustainment.
The reports and test plans contained within this document serve as supporting materials to the activities listed within the “Contingency Contractor Optimization Tool – Prototype (CCOT-P) Verification & Validation Plan” [1]. The activities included test development, testing, peer reviews, and expert reviews. The engineering prototype reviews were done for both the software and the mathematical model used in CCOT-P. Section 2 includes the peer and expert review reports, which summarize the findings from each of the reviews and document the resolution of any issues. Section 3 details the test plans that were followed for functional testing of the application through the interface. Section 4 describes the unit tests that were run on the code.
This paper presents a probabilistic origin-destination table for waterborne containerized imports. The analysis makes use of 2012 Port Import/Export Reporting Service data, 2012 Surface Transportation Board waybill data, a gravity model, and information on the landside transportation mode split associated with specific ports. This analysis suggests that about 70% of the origin-destination table entries have a coefficient of variation of less than 20%. This 70% of entries is associated with about 78% of the total volume. This analysis also makes evident the importance of rail interchange points in Chicago, Illinois; Memphis, Tennessee; Dallas, Texas; and Kansas City, Missouri, in supporting the transportation of containerized goods from Asia through West Coast ports to the eastern United States.
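The gravity-model step mentioned above can be sketched in its generic textbook form: each origin's volume is allocated across destinations in proportion to destination size and a distance-decay term. This is a singly constrained model with an exponential decay function, chosen for illustration; it is not necessarily the exact specification used in the paper.

```python
import math


def gravity_od(origin_volumes, dest_volumes, cost, beta=0.1):
    """Singly constrained gravity model for an origin-destination table.

    origin_volumes[o] is the container volume produced at origin o;
    dest_volumes[d] is a measure of destination d's attraction; cost[o][d]
    is an impedance (e.g., distance or travel cost). The exponential decay
    exp(-beta * cost) is an illustrative choice of friction function.
    """
    table = {}
    for o, volume in origin_volumes.items():
        # Attraction of each destination, discounted by impedance.
        weights = {d: attr * math.exp(-beta * cost[o][d])
                   for d, attr in dest_volumes.items()}
        total = sum(weights.values())
        # Split the origin's volume proportionally, so each row of the
        # table sums to that origin's total volume.
        table[o] = {d: volume * w / total for d, w in weights.items()}
    return table
```

The row-sum constraint is what makes the model "singly constrained"; a doubly constrained variant would also force column sums to match destination totals via iterative proportional fitting.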
This project evaluates the effectiveness of moving target defense (MTD) techniques using a new game we have designed, called PLADD, inspired by the game FlipIt [28]. PLADD extends FlipIt by incorporating what we believe are key MTD concepts. We have analyzed PLADD and proven the existence of a defender strategy that pushes a rational attacker out of the game, demonstrated how limited the strategies available to an attacker are in PLADD, and derived analytic expressions for the expected utility of the game’s players in multiple game variants. We have created an algorithm for finding a defender’s optimal PLADD strategy. We show that in the special case of achieving deterrence in PLADD, MTD is not always cost effective and that its optimal deployment may shift abruptly from not using MTD at all to using it as aggressively as possible. We believe our effort provides basic, fundamental insights into the use of MTD, but conclude that a truly practical analysis requires model selection and calibration based on real scenarios and empirical data. We propose several avenues for further inquiry, including (1) agents with adaptive capabilities more reflective of real world adversaries, (2) the presence of multiple, heterogeneous adversaries, (3) computational game theory-based approaches such as coevolution to allow scaling to the real world beyond the limitations of analytical analysis and classical game theory, (4) mapping the game to real-world scenarios, (5) taking player risk into account when designing a strategy (in addition to expected payoff), (6) improving our understanding of the dynamic nature of MTD-inspired games by using a martingale representation, defensive forecasting, and techniques from signal processing, and (7) using adversarial games to develop inherently resilient cyber systems.
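The FlipIt-style setting underlying PLADD can be illustrated with a small simulation: a defender retakes a resource periodically while an attacker moves at random times, and the payoff-relevant quantity is the fraction of time each player controls the resource. This is a generic FlipIt illustration under our own simplifying assumptions (periodic defender, Poisson attacker, instantaneous takeovers); it does not implement the PLADD rules or the MTD extensions described above.

```python
import random


def attacker_control_fraction(defender_period, attack_rate, horizon, rng):
    """Simulate a FlipIt-style takeover game in continuous time.

    The defender retakes the resource every `defender_period` time units;
    the attacker moves at exponential inter-arrival times with rate
    `attack_rate`. Returns the fraction of [0, horizon] during which the
    attacker controls the resource.
    """
    t = 0.0
    attacker_time = 0.0
    next_defense = defender_period
    next_attack = rng.expovariate(attack_rate)
    attacker_controls = False
    while t < horizon:
        nxt = min(next_defense, next_attack, horizon)
        if attacker_controls:
            attacker_time += nxt - t  # accrue attacker control time
        t = nxt
        if t == next_defense:
            attacker_controls = False  # defender retakes the resource
            next_defense += defender_period
        if t == next_attack:
            attacker_controls = True  # attacker takes over
            next_attack = t + rng.expovariate(attack_rate)
    return attacker_time / horizon
```

Shortening the defender's period (more aggressive moving-target defense) drives the attacker's control fraction toward zero at the cost of more defender moves, which is the cost-effectiveness tension the abstract's deterrence result speaks to.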