This volume contains extended versions of 28 carefully selected and reviewed papers presented at the Fourth International Conference on Mathematical Methods in Reliability (Santa Fe, New Mexico, June 21-25, 2004), the leading conference in reliability research. The meeting serves as a forum for discussing fundamental issues in mathematical methods in reliability theory and its applications. The volume provides a broad overview of current research activity in reliability theory and its applications, covering reliability modeling, network and system reliability, Bayesian methods, survival analysis, degradation and maintenance modeling, and software reliability. The contributors are all leading experts in the field and include the plenary session speakers, Tim Bedford, Thierry Duchesne, Henry Wynn, Vicki Bier, Edsel Pena, Michael Hamada, and Todd Graves.
Massive data streams, large quantities of data that arrive continuously, are becoming increasingly commonplace in many areas of science and technology. Consequently, the development of analytical methods for such streams is of growing importance. To address this issue, the National Security Agency asked the NRC to hold a workshop exploring methods for the analysis of streams of data, so as to stimulate progress in the field. This report presents the results of that workshop. It includes presentations focused on five research areas in which massive data streams arise: atmospheric and meteorological data; high-energy physics; integrated data systems; network traffic; and mining commercial data streams. The goals of the report are to improve communication among researchers in the field and to increase relevant statistical science activity.
This assessment of the technical quality and relevance of the programs of the Measurement and Standards Laboratories of the National Institute of Standards and Technology is the work of the 165 members of the National Research Council's (NRC's) Board on Assessment of NIST Programs and its panels. These individuals were chosen by the NRC for their technical expertise, their practical experience in running research programs, and their knowledge of industry's needs in basic measurements and standards. This assessment addresses the following: the technical merit of the laboratory programs relative to the state of the art worldwide; the effectiveness with which the laboratory programs are carried out and the results disseminated to their customers; the relevance of the laboratory programs to the needs of their customers; and the ability of the laboratories' facilities, equipment, and human resources to enable the laboratories to fulfill their mission and meet their customers' needs.
This book explains how computer software is designed to perform the tasks required for sophisticated statistical analysis. For statisticians, it examines the nitty-gritty computational problems behind statistical methods. For mathematicians and computer scientists, it looks at the application of mathematical tools to statistical problems. The first half of the book offers a basic background in numerical analysis that emphasizes issues important to statisticians. The next several chapters cover a broad array of statistical tools, such as maximum likelihood and nonlinear regression. The author also treats the application of numerical tools; numerical integration and random number generation are explained in a unified manner reflecting complementary views of Monte Carlo methods. Each chapter contains exercises that range from simple questions to research problems. Most of the examples are accompanied by demonstration and source code available from the author's website. New in this second edition are demonstrations coded in R, as well as new sections on linear programming and the Nelder–Mead search algorithm.
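As a hedged illustration of the Monte Carlo view of numerical integration that the blurb mentions (not drawn from the book itself, whose demonstrations are coded in R), a minimal Python sketch is given below; the function name mc_integrate, the sample size, and the test integrand are illustrative assumptions, not anything specified by the book.

    import math
    import random

    def mc_integrate(f, a, b, n=100_000, seed=0):
        """Estimate the integral of f over [a, b] by averaging f at n
        uniform random points and scaling by the interval length.
        (Illustrative sketch only; not code from the book.)"""
        rng = random.Random(seed)
        total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
        return (b - a) * total / n

    # Example: integral of exp(-x^2) over [0, 1]; true value ~ 0.746824
    print(mc_integrate(lambda x: math.exp(-x * x), 0.0, 1.0))

The same uniform-sampling estimator, with the random points reused, doubles as a basic random-number-generation demonstration, which is the kind of unified treatment the description alludes to.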
Volume II covers a number of measurement and analytical issues in greater technical detail, including: range restriction adjustments, methods for evaluating multiple sources of error in measurement, comparing alternative measures of performance, and strategies for clustering military occupations.
Modeling, simulation, and analysis (MS&A) is a crucial tool for military affairs and one of the announced pillars of the strategy for transforming the U.S. military. Yet the MS&A enterprise has not kept pace with the new demands arising from rapid changes in DOD processes and missions, or with the rapid changes in the technology available to meet those demands. To help address these concerns, DOD asked the NRC to identify shortcomings in the current practice of MS&A and to suggest where and how they should be resolved. This report provides an assessment of DOD's changing mission and the environment in which it must operate; identifies high-level opportunities for MS&A research to address the expanded mission; suggests approaches for improving the interface between MS&A practitioners and decision makers; discusses training and continuing education of MS&A practitioners; and examines the need for coordinated military science research to support MS&A.
A biological warfare agent (BWA) is a microorganism, or a toxin derived from a living organism, that causes disease in humans, plants, or animals, or that causes the deterioration of material. The effectiveness of a BWA is greatly reduced if the attack is detected in time for the target population to take appropriate defensive measures. Therefore, the ability to detect a BWA, in particular before the target population is exposed, would be a valuable asset in defending against biological attacks. The ideal detection system would respond quickly and detect a threat plume at a distance from the target population. The development of reliable biological standoff detection systems is therefore a key goal. However, testing biological standoff detection systems is difficult, because open-air field tests with BWAs are not permitted under international conventions and because the wide variety of environments in which detectors might be used may affect their performance. This book explores how to determine whether a biological standoff detection system fulfills its mission reliably when open-air field tests with live BWAs cannot be conducted.
Ver 1.0 was a three-day workshop on public database verification for journalists and social scientists, held in Santa Fe, New Mexico, USA, in April 2006. Ten journalists and ten statisticians, social scientists, public administrators, and computer scientists met to discuss mutual concerns and work toward solutions. This book contains most of the papers presented and the work product of three breakout groups, each investigating a different aspect of the problem.
At the request of the U.S. Census Bureau, the National Research Council's Committee on National Statistics established the Panel on Research on Future Census Methods to review the early planning process for the 2010 census. This new report documents the panel's strong support for the major aims of the Census Bureau's emerging plan for 2010. At the same time, it notes the considerable challenges that must be overcome if the bureau's innovations are to be successful. The panel agrees with the Census Bureau that implementation of the American Community Survey and, with it, the separation of the long form from the census process are excellent concepts. Moreover, it concurs that the critically im...