The authors examine the problem of how to provide a time code for staff to use in pursuit of innovation. Four potential options are explored, ranging from not funding this activity at all to charging such efforts against existing or expanded program management and program development funds. One solution that provides funded time without raising laboratory overhead rates is identified and referred to as Innovation Flex Time: hours worked in excess of the standard work week but not charged to customers would be captured and made available to fund time for exploring new ideas. A brief examination of labor relations law and the laws regulating laboratory directed research and development suggests that Innovation Flex Time is a viable option for the laboratory. However, implementing Innovation Flex Time would require NNSA approval and modification of the existing management and operations contract.
A common problem in developing high-reliability systems is estimating the reliability for a population of components that cannot be 100% tested. The radiation survivability of a population of components is often estimated by testing a very small sample to some multiple of the required specification level, known as an overtest. Given a successful test with a sufficient overtest margin, the population of components is assumed to have the required survivability or radiation reliability. However, no mathematical justification for such claims has been crafted without making aggressive assumptions regarding the statistics of the unknown distribution. Here we illustrate a new approach that leverages geometric bounding arguments founded on relatively modest distribution assumptions to produce conservative estimates of component reliability.
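As background for why small samples alone cannot carry such claims (standard binomial reasoning, not the geometric bounding method introduced here), consider the classical zero-failure confidence bound: with n successes and no failures, the lower confidence bound on reliability at confidence 1 - alpha is alpha^(1/n). A minimal sketch:

```python
# Illustrative background only: the classical zero-failure binomial bound,
# not the geometric bounding method described in the report. It shows why
# small samples alone cannot demonstrate high reliability.
def zero_failure_reliability_bound(n_tests: int, confidence: float = 0.90) -> float:
    """Lower confidence bound on reliability after n_tests successes, 0 failures.

    From the Clopper-Pearson interval: R_lower = (1 - confidence)**(1 / n_tests).
    """
    alpha = 1.0 - confidence
    return alpha ** (1.0 / n_tests)

if __name__ == "__main__":
    for n in (3, 5, 10, 100):
        print(f"n = {n:3d} tests, 0 failures -> 90% lower bound R >= "
              f"{zero_failure_reliability_bound(n):.3f}")
    # Even 100 flawless tests only demonstrate R >= 0.977 at 90% confidence,
    # which is why overtest margin, rather than sample size, must carry the
    # argument for high-reliability claims from small samples.
```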
Innovation is a highly overused buzzword in government offices and corporate America. Like leadership, innovation is something organizations realize they need but are often frustrated in their efforts to achieve. Making matters worse, the definition of innovation changes with the user and the context. Simply put, innovation is change, and change is difficult for any organization. This study examines the topic of innovation, paying special attention to what works, what does not, and the basic principles governing how one might go about innovating. Because leadership is critical to any successful innovation effort, special attention is given to leading change and leading innovation. "The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore, all progress depends on the unreasonable man." - George Bernard Shaw
Continuous surveillance of the night sky with ground-based optical sensors requires a number of sites distributed around the globe. Because of variable cloud cover, the number of sites required to guarantee nightly observation of all Geosynchronous Earth Orbit slots is greater than the number required to provide partial coverage. Combined with the requirements for dark sky sites and adequate supporting infrastructure, these considerations further limit where ground-based telescopes can be located. The authors examine this problem and present results of an optimization approach that can recommend both sites and networks of sites, as well as provide insight into the utility of any individual geographic location.
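The abstract does not detail the optimization algorithm; one standard approach to site-network problems of this shape is greedy maximum coverage. The sketch below uses hypothetical candidate sites, clear-sky probabilities, and GEO-slot visibility sets, not data from the study:

```python
# Minimal sketch of greedy maximum-coverage site selection, a standard
# heuristic for problems of this shape. Site names, clear-sky
# probabilities, and slot-visibility sets are hypothetical placeholders.

# GEO slot IDs each candidate site can view, with its mean clear-sky probability
CANDIDATES = {
    "site_A": ({0, 1, 2, 3}, 0.80),
    "site_B": ({2, 3, 4, 5}, 0.60),
    "site_C": ({4, 5, 6, 7}, 0.90),
    "site_D": ({0, 1, 6, 7}, 0.50),
}

def expected_new_coverage(slots, p_clear, covered):
    """Expected number of newly covered GEO slots if this site is added."""
    return p_clear * len(slots - covered)

def greedy_network(candidates, max_sites):
    covered, network = set(), []
    for _ in range(max_sites):
        if not candidates:
            break
        name, (slots, p_clear) = max(
            candidates.items(),
            key=lambda kv: expected_new_coverage(kv[1][0], kv[1][1], covered))
        if expected_new_coverage(slots, p_clear, covered) == 0:
            break  # no remaining site adds coverage
        network.append(name)
        # Simplification: treats a selected site's slots as covered; a fuller
        # model would track joint cloud statistics across sites.
        covered |= slots
        del candidates[name]
    return network, covered

if __name__ == "__main__":
    net, cov = greedy_network(dict(CANDIDATES), max_sites=3)
    print("selected sites:", net, "| slots covered:", sorted(cov))
```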
As technical systems and social problems in modern society become ever more complex, many organizations are turning to what is commonly termed complexity science to find solutions. The problem many organizations face is that they frequently have no clear idea what they are trying to accomplish, no in-depth understanding of the nature, size, and dimensions of their problem, and only a limited understanding of which theoretical approaches and off-the-shelf analysis tools exist or are applicable to their particular problem. This paper examines the larger topic of complexity science, providing insight and helping to place its promises in perspective.
Most national policy decisions are complex, with a variety of stakeholders, disparate interests, and the potential for unintended consequences. While a number of analytical tools exist to help decision makers sort through mountains of data and myriad options, decision support teams are increasingly turning to complexity science for improved analysis and better insight into the potential impact of policy decisions. While complexity science has great potential, it has proven useful only in limited cases and when properly applied. In advance of more widespread use, a national-level effort to refine complexity science and more rigorously establish its technical underpinnings is recommended.
Information is one of the most powerful tools available today. All advances in technology may be used, as David Sarnoff said, for the benefit or harm of society. Information can be used by free people to shape the future, or used by less than benevolent governments to control people, as has been demonstrated since the mid-1930s and with growing frequency over the past 50 years. What once promised to set people free and fuel an industrial revolution that might improve the standard of living over most of the world has also been used to manipulate and enslave entire populations. The future of information is tied to the future of the technologies that support the collection of data, the processing of those data into information and knowledge, and their distribution. Technologies supporting the future of information must include those that help protect the integrity of data and information and help guarantee its discoverability and appropriate availability, often to the whole of society.
Observing geosynchronous satellites has numerous applications. Lighting conditions near the equinoxes, the so-called solar exclusion, routinely cause problems for traditional observations by sensors near the equator. We investigate using sensors on satellites in polar and high-altitude orbits to observe satellites in geosynchronous orbit, in the hope that these configurations will alleviate many of these problems. The orbit-insertion and station-keeping requirements of such sensor satellites are important to understand. We summarize the literature on the relevant perturbing forces and assess the delta-v requirements.
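As background for such delta-v assessments (a textbook vis-viva calculation, not the report's own analysis), the sketch below estimates the two burns of a coplanar Hohmann transfer from an assumed 300 km parking orbit to geosynchronous altitude:

```python
# Background sketch of a coplanar Hohmann transfer delta-v estimate using the
# vis-viva equation; the 300 km parking orbit is an assumed placeholder, and
# plane changes and perturbations (which the report surveys) are ignored.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH  = 6.378137e6       # equatorial radius, m
R_GEO    = 4.2164e7         # geosynchronous orbit radius, m

def vis_viva(r: float, a: float) -> float:
    """Orbital speed at radius r on an orbit of semi-major axis a."""
    return math.sqrt(MU_EARTH * (2.0 / r - 1.0 / a))

def hohmann_delta_v(r1: float, r2: float) -> tuple[float, float]:
    """Both burns of a coplanar Hohmann transfer between circular orbits r1, r2."""
    a_transfer = 0.5 * (r1 + r2)
    dv1 = vis_viva(r1, a_transfer) - vis_viva(r1, r1)  # raise apogee
    dv2 = vis_viva(r2, r2) - vis_viva(r2, a_transfer)  # circularize at r2
    return dv1, dv2

if __name__ == "__main__":
    r_park = R_EARTH + 300e3  # assumed 300 km circular parking orbit
    dv1, dv2 = hohmann_delta_v(r_park, R_GEO)
    print(f"burn 1: {dv1:.0f} m/s, burn 2: {dv2:.0f} m/s, "
          f"total: {dv1 + dv2:.0f} m/s")  # roughly 3.9 km/s total
```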
In recent years, a number of sky survey projects have chosen to use arrays of commercial cameras coupled with commercial photographic lenses to enable low-cost, wide-area observation. Projects such as SuperWASP, FAVOR, Raptor, Lotis, PANOPTES, and Dragonfly rely on multiple cameras with commercial lenses to image wide areas of the sky each night. The sensors are usually commercial astronomical charge-coupled devices (CCDs) or digital single-lens reflex (DSLR) cameras, while the lenses are large-aperture, high-end consumer items intended for general photography. While much of this equipment is very capable and relatively inexpensive, this approach comes with a number of significant limitations that reduce the sensitivity and overall utility of the image data. The most frequently encountered limitations include lens vignetting, narrow spectral bandpass, and a relatively large point spread function. Understanding these limits helps to assess the utility of the data and identify areas where advanced optical designs could significantly improve survey performance.
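To put a number on the first of these limitations (an illustrative calculation, not one from the survey), the sketch below evaluates the natural cos^4 illumination falloff with field angle, a floor on vignetting even before mechanical clipping by the lens barrel is counted:

```python
# Illustrative sketch, not from the report: the cos^4(theta) "natural
# vignetting" law gives the relative illumination at field angle theta
# even for an ideal lens; real fast photographic lenses vignette further
# due to mechanical clipping of off-axis ray bundles.
import math

def natural_falloff(theta_deg: float) -> float:
    """Relative illumination (1.0 on axis) from the cos^4 law."""
    return math.cos(math.radians(theta_deg)) ** 4

if __name__ == "__main__":
    # Field angles typical of a wide-area survey lens (assumed values)
    for theta in (0, 5, 10, 15, 20):
        print(f"{theta:2d} deg off axis: {natural_falloff(theta):.2f} "
              f"of on-axis illumination")
```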
We report the results of an LDRD effort to investigate new technologies for the identification of small (mm- to cm-scale) debris in low Earth orbit. This small yet energetic debris presents a threat to the integrity of space assets worldwide and represents a significant security challenge to the international community. We present a non-exhaustive review of recent US and Russian efforts to meet the challenges of debris identification and removal, and then provide a detailed description of joint US-Russian plans for sensitive, laser-based imaging of small debris at distances of hundreds of kilometers and relative velocities of several kilometers per second. Plans for the upcoming experimental testing of these imaging schemes are presented, and a preliminary path toward system integration is identified.
The last decade has seen significant interest in wide field of view (FOV) telescopes for sky survey and space surveillance applications. Prompted by this interest, a multitude of wide-field designs have emerged. While all designs result from the optimization of competing constraints, one of the more controversial design choices is whether such telescopes require flat or curved focal planes. For imaging applications, curved focal planes are not an obvious choice. Thirty years ago, with mostly analytic design tools, the solution to wide-field image quality appeared to be curved focal planes. Today, however, with computer-aided optimization, high image quality can be achieved over flat focal surfaces. For most designs, the small gains in performance offered by curved focal planes are more than offset by the complexities and cost of curved CCDs. Modern design techniques incorporating reflective and refractive correctors appear to make a curved focal surface an unnecessary complication. Examination of seven current wide-FOV projects (SDSS, MMT, DCT, LSST, Pan-STARRS, HyperSuprime, and DARPA SST) suggests there is little to be gained from a curved focal plane. The one exception might be the HyperSuprime instrument, where performance goals severely stress the capabilities of refractive prime-focus correctors.
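As a worked illustration of the underlying trade (with assumed, representative numbers rather than figures from the cited projects), the sketch below compares the sag of a curved focal surface at the edge of a wide field with the diffraction depth of focus of a fast beam; when the sag dwarfs the depth of focus, the designer must either curve the detector or flatten the field with corrector optics:

```python
# Representative sketch with assumed numbers, not data from the cited
# projects: compare focal-surface sag across the field against the
# diffraction depth of focus of a fast beam.

def surface_sag(field_radius_mm: float, curvature_radius_mm: float) -> float:
    """Sag of a spherical focal surface at the edge of the field: r^2 / (2R)."""
    return field_radius_mm**2 / (2.0 * curvature_radius_mm)

def depth_of_focus_mm(f_number: float, wavelength_um: float = 0.55) -> float:
    """Diffraction depth of focus, +/- 2 * lambda * N^2, returned in mm."""
    return 2.0 * (wavelength_um * 1e-3) * f_number**2

if __name__ == "__main__":
    sag = surface_sag(field_radius_mm=160.0, curvature_radius_mm=3000.0)  # assumed
    dof = depth_of_focus_mm(f_number=1.2)                                 # assumed fast beam
    print(f"edge-of-field sag: {sag*1000:.0f} um, depth of focus: +/-{dof*1000:.1f} um")
    # Here the sag (~4300 um) dwarfs the depth of focus (~1.6 um), so the
    # field must be flattened optically or the detector curved.
```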