Jerzy Neyman received the National Medal of Science "for laying the foundations of modern statistics and devising tests and procedures that have become essential parts of the knowledge of every statistician." Until his death in 1981 at the age of 87, Neyman was vigorously involved in the concerns and controversies of the day, a scientist whose personality and activity were integral parts of his contribution to science. His career is thus particularly well-suited for the non-technical life-story which Constance Reid has made her own in such well-received biographies of Hilbert and Courant. She was able to talk extensively with Neyman and have access to his personal and professional letters and papers. Her book will thus appeal to professional statisticians as well as amateurs wanting to learn about a subject which permeates almost every aspect of modern life.
Summarizing developments and techniques in the field, this reference covers sample surveys, nonparametric analysis, hypothesis testing, time series analysis, Bayesian inference, and distribution theory for applications in statistics, economics, medicine, biology, engineering, sociology, psychology, and information technology. It supplies a geometric proof of an extended Gauss-Markov theorem, approaches for the design and implementation of sample surveys, advances in the theory of Neyman's smooth test, and methods for pre-test and biased estimation. It includes discussions of sample size requirements for estimation in SUR models, innovative developments in nonparametric models, and more.
This volume contains 30 of David Brillinger's most influential papers. He is an eminent statistical scientist, having published broadly in time series and point process analysis, seismology, neurophysiology, and population biology. Each of these areas is well represented in the book. The volume has been divided into four parts, each with comments by one of Dr. Brillinger's former PhD students. His more theoretical papers have comments by Victor Panaretos from Switzerland. The area of time series has commentary by Pedro Morettin from Brazil. The biologically oriented papers are commented on by Tore Schweder from Norway and Haiganoush Preisler from the USA, while the point process papers have comments by Peter Guttorp from the USA. In addition, the volume contains a Statistical Science interview with Dr. Brillinger, and his bibliography.
Classical statistical theory—hypothesis testing, estimation, and the design of experiments and sample surveys—is mainly the creation of two men: Ronald A. Fisher (1890-1962) and Jerzy Neyman (1894-1981). Their contributions sometimes complemented each other, sometimes occurred in parallel, and, particularly at later stages, often were in strong opposition. The two men would not be pleased to see their names linked in this way, since throughout most of their working lives they detested each other. Nevertheless, they worked on the same problems, and through their combined efforts created a new discipline. This new book by E.L. Lehmann, himself a student of Neyman’s, explores the relationship between Neyman and Fisher, as well as their interactions with other influential statisticians, and the statistical history they helped create together. Lehmann uses direct correspondence and original papers to recreate an historical account of the creation of the Neyman-Pearson Theory as well as Fisher’s dissent, and other important statistical theories.
In the last 25 years, the concept of information has played a crucial role in communication theory, so much so that the terms information theory and communication theory are sometimes used almost interchangeably. It seems to us, however, that the notion of information is also destined to render valuable services to the student of induction and probability, of learning and reinforcement, of semantic meaning and deductive inference, as well as of scientific method in general. The present volume is an attempt to illustrate some of these uses of information concepts. In 'On Semantic Information' Hintikka summarizes some of his and his associates' recent work on information and induction, and com...
It emphasizes J. Neyman and Egon Pearson's mathematical foundations of hypothesis testing, one of the finest methodologies for reaching conclusions about population parameters. Following the approach of Wald and Ferguson, the book presents Neyman-Pearson theory under the broader premises of decision theory, resulting in simplification and generalization of results. To support the smooth mathematical development of this theory, the book outlines the main results of Lebesgue theory in abstract spaces prior to the rigorous theoretical development of most powerful (MP), uniformly most powerful (UMP), and UMP unbiased tests for different types of testing problems. Likelihood ratio tests, their large samp...
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic approaches. This of course raises the problem of how the abstract calculus of probability sh...