This book identifies challenges and opportunities in the development and implementation of software that contains significant statistical content. While emphasizing the relevance of rigorous statistical and probabilistic techniques in software engineering contexts, it presents opportunities for further research in the statistical sciences and their applications to software engineering. It is intended to motivate and attract new researchers from statistics and the mathematical sciences to attack relevant and pressing problems in the software engineering setting. It describes the "big picture," since that broader view provides the context in which statistical methods must be developed. The book's survey nature is directed at the mathematical sciences audience, but software engineers should also find the statistical emphasis refreshing and stimulating. It is hoped that the book will seed the field of statistical software engineering by pointing to opportunities where statistical thinking can help increase understanding, productivity, and the quality of software and software production.
Recent rough estimates are that the U.S. Department of Defense (DoD) spends at least $38 billion a year on the research, development, testing, and evaluation of new defense systems; approximately 40 percent of that cost, at least $16 billion, is spent on software development and testing. There is widespread understanding within DoD that the effectiveness of software-intensive defense systems is often hampered by low-quality software as well as increased costs and late delivery of software components. Given the costs involved, even relatively incremental improvements to the software development process for defense systems could represent a large savings in funds. And given the importance of pro...
Empirical verification of knowledge is one of the foundations for developing any discipline. As far as software construction is concerned, empirically verified knowledge is not only sparse but also not widely disseminated among developers and researchers. This book aims to spread the idea of the importance of empirical knowledge in software development from a highly practical viewpoint. It has two goals: (1) define the body of empirically validated knowledge in software development, so as to advise practitioners on which methods and techniques have been empirically analysed and what the results were; and (2) since empirical tests have traditionally been carried out by universities or research centres, propose techniques that industry can apply to evaluate the software development technologies it uses.
Since 1992, the Committee on National Statistics (CNSTAT) has produced a book on principles and practices for a federal statistical agency, updating the document every 4 years to provide a current edition to newly appointed cabinet secretaries at the beginning of each presidential administration. This fourth edition presents and comments on four basic principles that statistical agencies must embody in order to carry out their mission fully: (1) They must produce objective data that are relevant to policy issues, (2) they must achieve and maintain credibility among data users, (3) they must achieve and maintain trust among data providers, and (4) they must achieve and maintain a strong posit...
This book assesses the achievements of the software engineering discipline as represented by IT vendors in Japan, in order to deepen understanding of how software engineering capabilities relate to IT vendors' business performance and business environment from the perspective of innovation and engineering management. Based on the concepts of service science and science for society, the volume suggests how to simultaneously improve the sophistication of services between the demand side, i.e., IT user companies, and the supply side, i.e., IT vendors. The author and his colleagues developed a structural model including innovation paths, such as service innovation, product ...
For the past 50 years, the Census Bureau has conducted experiments and evaluations with every decennial census involving field data collection, during which alternatives to current census processes are assessed for a subset of the population. An "evaluation" is usually a post hoc analysis of data collected as part of the decennial census processing to determine whether individual steps in the census operated as expected. The 2010 Program for Evaluations and Experiments, known as CPEX, has enormous potential to reduce costs and increase the effectiveness of the 2020 census; the initial list of potential research topics has been narrowed from 52 to 6. The panel identified three priority experiments for in...
Planning for the 2020 census is already beginning. This book from the National Research Council examines several aspects of census planning, including questionnaire design, address updating, non-response follow-up, coverage follow-up, de-duplication of housing units and residents, editing and imputation procedures, and several other census operations. This book recommends that the Census Bureau overhaul its approach to research and development. The report urges the Bureau to set cost and quality goals for the 2020 and future censuses, improving efficiency by taking advantage of new technologies.