What gives statistics its unity as a science? Stephen Stigler sets forth the seven foundational ideas of statistics—a scientific discipline related to but distinct from mathematics and computer science. Even the most basic idea—aggregation, exemplified by averaging—is counterintuitive. It allows one to gain information by discarding information, namely, the individuality of the observations. Stigler’s second pillar, information measurement, challenges the importance of “big data” by noting that observations are not all equally important: the amount of information in a data set is often proportional to only the square root of the number of observations, not the absolute number. Th...
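The square-root claim above can be made concrete with a minimal sketch. Under the textbook assumption of independent observations with common spread sigma, the standard error of the mean is sigma divided by the square root of n, so quadrupling the data only halves the uncertainty; the numbers below are illustrative, not from Stigler's book.

```python
import math

# Standard error of the mean for n i.i.d. observations with spread sigma:
#   se = sigma / sqrt(n)
# Going from n = 100 to n = 400 (4x the data) only halves the standard error.
sigma = 1.0
for n in (100, 400, 1600):
    se = sigma / math.sqrt(n)
    print(f"n = {n:5d}  standard error = {se:.4f}")
```

The loop prints a standard error that shrinks by a factor of 2 each time n grows by a factor of 4, which is the sense in which information grows like the square root of the number of observations rather than their absolute count.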
This magnificent book is the first comprehensive history of statistics from its beginnings around 1700 to its emergence as a distinct and mature discipline around 1900. Stephen M. Stigler shows how statistics arose from the interplay of mathematical concepts and the needs of several applied sciences including astronomy, geodesy, experimental psychology, genetics, and sociology. He addresses many intriguing questions: How did scientists learn to combine measurements made under different conditions? And how were they led to use probability theory to measure the accuracy of the result? Why were statistical methods used successfully in astronomy long before they began to play a significant role ...
This lively collection of essays examines in witty detail the history of some of the concepts involved in bringing statistical argument "to the table," and some of the pitfalls that have been encountered. The topics range from seventeenth-century medicine and the circulation of blood, to the cause of the Great Depression and the effect of the California gold discoveries of 1848 upon price levels, to the determinations of the shape of the Earth and the speed of light, to the meter of Virgil's poetry and the prediction of the Second Coming of Christ. The title essay tells how the statistician Karl Pearson came to issue the challenge to put "statistics on the table" to the economists Marshall, ...
Introduction -- Casanova -- The Genoese lottery -- The establishment of the Loterie -- Problems and adjustments of the early drawings -- Antoine Blanquet and the great expansion of 1776 -- The introduction of bonus numbers: les primes gratuites -- The spread of the Loterie in Europe -- Data security: the design of the tickets -- The Loterie and the Revolution -- Was the Loterie fair? -- Dreams and astrology: the bettors and the Loterie -- The number 45 and the maturity of chances -- How much did they bet, and where? -- Muskets, fine-tuned risk, and Voltaire -- The Loterie in textbooks and manuals -- The suppression of the Loterie in 1836 -- Conclusion.
"In the Winter Quarter of the academic year 1984-1985, Raj Bahadur gave a series of lectures on estimation theory at the University of Chicago"--Page i.
In 1865, Gregor Mendel presented "Experiments in Plant-Hybridization," the results of his eight-year study of the principles of inheritance through experimentation with pea plants. Overlooked in its day, Mendel's work would later become the foundation of modern genetics. Did his pioneering research follow the rigors of real scientific inquiry, or was Mendel's data too good to be true—the product of doctored statistics? In Ending the Mendel-Fisher Controversy, leading experts present their conclusions on the legendary controversy surrounding the challenge to Mendel's findings by British statistician and biologist R. A. Fisher. In his 1936 paper "Has Mendel's Work Been Rediscovered?" Fisher ...
A traditional approach to developing multivariate statistical theory is algebraic. Sets of observations are represented by matrices, linear combinations are formed from these matrices by multiplying them by coefficient matrices, and useful statistics are found by imposing various criteria of optimization on these combinations. Matrix algebra is the vehicle for these calculations. A second approach is computational. Since many users find that they do not need to know the mathematical basis of the techniques as long as they have a way to transform data into results, the computation can be done by a package of computer programs that somebody else has written. An approach from this perspective e...
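The algebraic recipe described above, observations arranged in a matrix and statistics formed as linear combinations via coefficient vectors, can be sketched in a few lines. The data and weights here are invented purely for illustration.

```python
# Observations as an n-by-p matrix (rows = cases, columns = variables).
# Multiplying by a coefficient vector yields one linear combination per case,
# the basic building block of the algebraic approach to multivariate statistics.
def linear_combination(X, w):
    return [sum(x * c for x, c in zip(row, w)) for row in X]

X = [[1.0, 2.0],
     [3.0, 4.0],
     [5.0, 6.0]]
w = [0.5, 0.5]  # equal weights: each combination is a within-row average

print(linear_combination(X, w))  # -> [1.5, 3.5, 5.5]
```

Choosing the coefficient vector `w` to optimize some criterion (maximal variance, maximal correlation, and so on) is what turns this mechanical multiplication into techniques such as principal components or canonical correlation.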
Classical statistical theory—hypothesis testing, estimation, and the design of experiments and sample surveys—is mainly the creation of two men: Ronald A. Fisher (1890-1962) and Jerzy Neyman (1894-1981). Their contributions sometimes complemented each other, sometimes occurred in parallel, and, particularly at later stages, often were in strong opposition. The two men would not be pleased to see their names linked in this way, since throughout most of their working lives they detested each other. Nevertheless, they worked on the same problems, and through their combined efforts created a new discipline. This new book by E.L. Lehmann, himself a student of Neyman’s, explores the relationship between Neyman and Fisher, as well as their interactions with other influential statisticians, and the statistical history they helped create together. Lehmann uses direct correspondence and original papers to recreate an historical account of the creation of the Neyman-Pearson Theory as well as Fisher’s dissent, and other important statistical theories.