This book grew out of lectures delivered at the University of California, Berkeley, over many years. The subject is a part of asymptotics in statistics, organized around a few central ideas. The presentation proceeds from the general to the particular, since this seemed the best way to emphasize the basic concepts. The reader is expected to have been exposed to statistical thinking and methodology, as expounded for instance in the book by H. Cramér [1946] or the more recent text by P. Bickel and K. Doksum [1977]. Another possibility, closer in spirit to the present work, is Ferguson [1967]. Otherwise the reader is expected to possess some mathematical maturity, but not really a great deal of deta...
Contributed in honour of Lucien Le Cam on the occasion of his 70th birthday, the papers reflect the immense influence that his work has had on modern statistics. They include discussions of his seminal ideas, historical perspectives, and contributions to current research, spanning two centuries with a new translation of a paper by Daniel Bernoulli. The volume begins with a paper by Aalen, which describes Le Cam's role in the founding of the martingale analysis of point processes, and ends with one by Yu, exploring the position of just one of Le Cam's ideas in modern semiparametric theory. The other 27 papers touch on areas such as local asymptotic normality, contiguity, efficiency, admissibility, minimaxity, empirical process theory, and biological, medical, and meteorological applications, where Le Cam's insights have laid the foundations for new theories.
This is the second edition of a coherent introduction to the subject of asymptotic statistics as it has developed over the past 50 years. It differs from the first edition in that it is now more 'reader-friendly' and also includes a new chapter on Gaussian and Poisson experiments, reflecting their growing role in the field. Most of the subsequent chapters have been entirely rewritten, and the nonparametrics of Chapter 7 have been amplified. The volume is not intended to replace monographs on specialized subjects, but will help to place them in a coherent perspective. It thus represents a link between traditional material, such as maximum likelihood and Wald's theory of statistical decision functions, and more recent topics such as the comparison of experiments and distances between them. Much of the material has been taught in a second-year graduate course at Berkeley for 30 years.
First multi-year cumulation covers six years: 1965-70.
There are a number of important questions associated with statistical experiments: when does one given experiment yield more information than another; how can we measure the difference in information; how fast does information accumulate by repeating the experiment? The means of answering such questions has emerged from the work of Wald, Blackwell, Le Cam and others and is based on the ideas of risk and deficiency. The present work, which is devoted to the various methods of comparing statistical experiments, is essentially self-contained, requiring only some background in measure theory and functional analysis. Chapters introducing statistical experiments and the necessary convex analysis begin the book and are followed by others on game theory, decision theory and vector lattices. The notion of deficiency, which measures the difference in information between two experiments, is then introduced. Its relations to other concepts, such as sufficiency, randomisation, distance, ordering, equivalence, completeness and convergence, are explored. This is a comprehensive treatment of the subject and will be an essential reference for mathematical statisticians.
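For readers meeting these ideas for the first time, the central quantity can be stated compactly. The following is a standard formulation of the deficiency, given here in LaTeX notation as a sketch rather than as the book's exact definition. Given two experiments \mathcal{E} = (P_\theta)_{\theta \in \Theta} and \mathcal{F} = (Q_\theta)_{\theta \in \Theta} with a common parameter set \Theta,

\[
  \delta(\mathcal{E}, \mathcal{F}) \;=\; \inf_{K} \, \sup_{\theta \in \Theta} \bigl\| K P_\theta - Q_\theta \bigr\|_{\mathrm{TV}},
\]

where the infimum runs over all randomisations (Markov kernels) K and \| \cdot \|_{\mathrm{TV}} denotes total variation distance. The symmetrised quantity \Delta(\mathcal{E}, \mathcal{F}) = \max\{\delta(\mathcal{E}, \mathcal{F}), \delta(\mathcal{F}, \mathcal{E})\} is the Le Cam distance, and \Delta(\mathcal{E}, \mathcal{F}) = 0 characterises the equivalence of the two experiments.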