In nonparametric and high-dimensional statistical models, the classical Gauss–Fisher–Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
A friendly and systematic introduction to the theory and applications. The book begins with the sums of independent random variables and vectors, with maximal inequalities and sharp estimates on moments, which are later used to develop and interpret decoupling inequalities. Decoupling is first introduced as it applies to randomly stopped processes and unbiased estimation. The authors then proceed with the theory of decoupling in full generality, paying special attention to comparison and interplay between martingale and decoupling theory, and to applications. These include limit theorems, moment and exponential inequalities for martingales and more general dependence structures, biostatistical implications, and moment convergence in Anscombe's theorem and Wald's equation for U-statistics. Addressed to researchers in probability and statistics and to graduate students, the exposition is at the level of a second graduate probability course, with a good portion of the material fit for use in a first-year course.
This is a masterly introduction to the modern, and rigorous, theory of probability. The author emphasises martingales and develops all the necessary measure theory.
High dimensional probability, in the sense that encompasses the topics represented in this volume, began about thirty years ago with research in two related areas: limit theorems for sums of independent Banach space valued random vectors and general Gaussian processes. An important feature in these past research studies has been the fact that they highlighted the essential probabilistic nature of the problems considered. In part, this was because, by working on a general Banach space, one had to discard the extra, and often extraneous, structure imposed by random variables taking values in a Euclidean space, or by processes being indexed by sets in R or R^d. Doing this led to striking advan...
This treatise by an acknowledged expert includes several topics not found in any previous book.
What is high dimensional probability? Under this broad name we collect topics with a common philosophy, where the idea of high dimension plays a key role, either in the problem or in the methods by which it is approached. Let us give a specific example that can be immediately understood, that of Gaussian processes. Roughly speaking, before 1970, the Gaussian processes that were studied were indexed by a subset of Euclidean space, mostly with dimension at most three. Assuming some regularity on the covariance, one tried to take advantage of the structure of the index set. Around 1970 it was understood, in particular by Dudley, Feldman, Gross, and Segal, that a more abstract and intrinsic point of view was much more fruitful. The index set was no longer considered as a subset of Euclidean space, but simply as a metric space with the metric canonically induced by the process. This shift in perspective subsequently led to a considerable clarification of many aspects of Gaussian process theory, and also to its applications in other settings.
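The "metric canonically induced by the process" mentioned above is d(s, t) = sqrt(E[(X_s − X_t)²]). As a minimal illustration (not taken from the book itself, and with all function names hypothetical), the sketch below estimates this metric empirically for standard Brownian motion, where theory gives d(s, t) = sqrt(|t − s|):

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_paths(n_paths, n_steps, T=1.0):
    """Simulate n_paths standard Brownian motion paths on [0, T]."""
    dt = T / n_steps
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)
    # Prepend X_0 = 0 so column k corresponds to time k * dt.
    return np.concatenate([np.zeros((n_paths, 1)), paths], axis=1)

def canonical_metric(paths, i, j):
    """Empirical d(t_i, t_j) = sqrt(mean of (X_{t_i} - X_{t_j})^2)."""
    return np.sqrt(np.mean((paths[:, i] - paths[:, j]) ** 2))

paths = brownian_paths(n_paths=50_000, n_steps=100)
# Times t_i = 0.25 and t_j = 0.64; theory predicts sqrt(0.39) ~ 0.6245.
d_emp = canonical_metric(paths, 25, 64)
d_theory = np.sqrt(0.64 - 0.25)
print(d_emp, d_theory)
```

The point of the abstract viewpoint described above is that nothing in `canonical_metric` uses the Euclidean structure of the time axis: the same distance makes sense for a process indexed by any set.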
For almost fifty years, Richard M. Dudley has been extremely influential in the development of several areas of Probability. His work on Gaussian processes led to the understanding of the basic fact that their sample boundedness and continuity should be characterized in terms of proper measures of complexity of their parameter spaces equipped with the intrinsic covariance metric. His sufficient condition for sample continuity in terms of metric entropy is widely used and was proved by X. Fernique to be necessary for stationary Gaussian processes, whereas its more subtle versions (majorizing measures) were proved by M. Talagrand to be necessary in general. Together with V. N. Vapnik and A. Y....