This book covers modern statistical inference based on likelihood with applications in medicine, epidemiology and biology. Two introductory chapters discuss the importance of statistical models in applied quantitative research and the central role of the likelihood function. The rest of the book is divided into three parts. The first describes likelihood-based inference from a frequentist viewpoint. Properties of the maximum likelihood estimate, the score function, the likelihood ratio and the Wald statistic are discussed in detail. In the second part, likelihood is combined with prior information to perform Bayesian inference. Topics include Bayesian updating, conjugate and reference priors, Bayesian point and interval estimates, Bayesian asymptotics and empirical Bayes methods. Modern numerical techniques for Bayesian inference are described in a separate chapter. Finally, two more advanced topics, model choice and prediction, are discussed from both a frequentist and a Bayesian perspective. A comprehensive appendix covers the necessary prerequisites in probability theory, matrix algebra, mathematical calculus, and numerical analysis.
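As a rough illustration of the likelihood quantities listed above (a minimal sketch in Python, not taken from the book; the binomial data of x = 7 successes in n = 20 trials and the null value pi0 = 0.5 are hypothetical):

    # Likelihood quantities for a binomial model: x successes in n trials, testing H0: pi = pi0.
    import math

    x, n, pi0 = 7, 20, 0.5                           # hypothetical data and null value

    def log_lik(pi):
        return x * math.log(pi) + (n - x) * math.log(1 - pi)

    pi_hat = x / n                                   # maximum likelihood estimate
    score_at_pi0 = x / pi0 - (n - x) / (1 - pi0)     # score function evaluated at pi0
    fisher_info = n / (pi_hat * (1 - pi_hat))        # expected information at the MLE
    wald = (pi_hat - pi0) ** 2 * fisher_info         # Wald statistic
    lr = 2 * (log_lik(pi_hat) - log_lik(pi0))        # likelihood ratio statistic

    print(f"MLE = {pi_hat:.3f}, Wald = {wald:.3f}, LR = {lr:.3f}")

In large samples both the Wald and likelihood ratio statistics are compared with a chi-squared distribution with one degree of freedom.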
Gaussian Markov Random Field (GMRF) models are widely used in spatial statistics, a very active area of research in which few up-to-date reference works are available. This is the first book on the subject that provides a unified framework for GMRFs, with particular emphasis on the computational aspects. The book includes extensive case studies...
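To make the computational emphasis concrete, here is a minimal sketch (not from the book) of why GMRFs are attractive numerically: a first-order random-walk model has a sparse, banded precision matrix Q, and a sample from N(0, Q^-1) can be drawn via a Cholesky factor of Q. A small ridge is added because the intrinsic RW1 precision is singular, and dense numpy arrays stand in for the sparse matrices one would use in practice.

    import numpy as np

    n = 100
    Q = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # RW1 precision structure
    Q[0, 0] = Q[-1, -1] = 1
    Q_prop = Q + 1e-6 * np.eye(n)                           # ridge: make the precision proper

    L = np.linalg.cholesky(Q_prop)                          # Q = L L^T
    z = np.random.default_rng(0).standard_normal(n)
    x = np.linalg.solve(L.T, z)                             # x ~ N(0, Q^-1)
    print(x[:5])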
Simultaneous confidence bands enable more intuitive and detailed inference in regression analysis than the standard inferential methods of parameter estimation and hypothesis testing. Simultaneous Inference in Regression provides a thorough overview of the construction methods and applications of simultaneous confidence bands for various inferential...
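As a small illustration of the contrast drawn above (a sketch under simple assumptions, not the book's own construction), the code below compares a pointwise t-based half-width with a simultaneous Scheffe-type (Working-Hotelling) half-width for the fitted line in simple linear regression, using simulated data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = np.linspace(0, 10, 30)
    y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)     # hypothetical data

    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - 2)                           # residual variance
    XtX_inv = np.linalg.inv(X.T @ X)
    se_fit = np.sqrt(s2 * np.einsum("ij,jk,ik->i", X, XtX_inv, X))

    t_crit = stats.t.ppf(0.975, df=len(y) - 2)                  # pointwise critical value
    w_crit = np.sqrt(2 * stats.f.ppf(0.95, 2, len(y) - 2))      # simultaneous (whole line)
    print("half-widths near x = 5:", t_crit * se_fit[15], w_crit * se_fit[15])

The simultaneous half-width is wider at every point, which is the price of covering the entire regression line with the stated confidence level.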
This richly illustrated textbook covers modern statistical methods with applications in medicine, epidemiology and biology. It first discusses the importance of statistical models in applied quantitative research and the central role of the likelihood function, then describes likelihood-based inference from a frequentist viewpoint and explores the properties of the maximum likelihood estimate, the score function, the likelihood ratio and the Wald statistic. In the second part of the book, likelihood is combined with prior information to perform Bayesian inference. Topics include Bayesian updating, conjugate and reference priors, Bayesian point and interval estimates, Bayesian asymptotics and empirical Bayes methods...
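A minimal sketch of the conjugate Bayesian updating mentioned in this description (not from the book; the prior parameters and data are hypothetical): a Beta(a, b) prior for a binomial proportion updates to Beta(a + x, b + n - x) after observing x successes in n trials, and point and interval estimates come directly from the posterior.

    from scipy import stats

    a, b = 1.0, 1.0          # hypothetical uniform prior
    x, n = 7, 20             # hypothetical data
    post = stats.beta(a + x, b + n - x)   # conjugate posterior

    print("posterior mean:", post.mean())
    print("95% equal-tailed credible interval:", post.interval(0.95))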
The contributions collected in this book have been written by well-known statisticians to acknowledge Ludwig Fahrmeir's far-reaching impact on Statistics as a science, while celebrating his 65th birthday. The contributions cover broad areas of contemporary statistical model building, including semiparametric and geoadditive regression, Bayesian inference in complex regression models, time series modelling, statistical regularization, graphical models and stochastic volatility models.
In time series modeling, the behavior of a phenomenon is expressed in relation to its own past values and to other covariates. Since many important phenomena in statistical analysis are actually time series, and the identification of the conditional distribution of the phenomenon is an essential part of statistical modeling, it is very important...
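The conditional-distribution idea can be illustrated with the simplest case (a sketch, not an example from the book): an AR(1) model in which y_t is explained by its own previous value, fitted by conditional least squares on simulated data.

    import numpy as np

    rng = np.random.default_rng(2)
    n, phi_true = 500, 0.7
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.3 + phi_true * y[t - 1] + rng.normal(scale=1.0)   # simulate AR(1)

    X = np.column_stack([np.ones(n - 1), y[:-1]])   # regress y_t on y_{t-1}
    c_hat, phi_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    print(f"estimated intercept {c_hat:.2f}, AR coefficient {phi_hat:.2f}")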
Sequential Analysis: Hypothesis Testing and Changepoint Detection systematically develops the theory of sequential hypothesis testing and quickest changepoint detection. It also describes important applications in which theoretical results can be used efficiently. The book reviews recent accomplishments in hypothesis testing and changepoint detection both in decision-theoretic (Bayesian) and non-decision-theoretic (non-Bayesian) contexts. The authors emphasize not only traditional binary hypotheses but also substantially more difficult multiple decision problems. They address scenarios with simple hypotheses and more realistic cases of two and finitely many composite hypotheses. The book pri...
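For a concrete flavour of quickest changepoint detection (an illustrative sketch, not the book's notation or examples), the code below runs a basic CUSUM detector for an upward mean shift in a Gaussian stream; the threshold h = 5.0 is a hypothetical tuning choice.

    import numpy as np

    rng = np.random.default_rng(3)
    data = np.concatenate([rng.normal(0, 1, 100), rng.normal(1.0, 1, 100)])  # change at index 100

    mu0, mu1, h = 0.0, 1.0, 5.0
    llr = (mu1 - mu0) * (data - (mu0 + mu1) / 2)    # per-observation log-likelihood ratio
    cusum, alarm = 0.0, None
    for t, inc in enumerate(llr):
        cusum = max(0.0, cusum + inc)               # CUSUM recursion
        if cusum > h:
            alarm = t
            break
    print("alarm raised at index:", alarm)          # true change point is index 100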
The First Detailed Account of Statistical Analysis That Treats Models as Approximations. The idea of truth plays a role in both Bayesian and frequentist statistics. The Bayesian concept of coherence is based on the fact that two different models or parameter values cannot both be true. Frequentist statistics is formulated as the problem of estimating the "true but unknown" parameter value that generated the data. Forgoing any concept of truth, Data Analysis and Approximate Models: Model Choice, Location-Scale, Analysis of Variance, Nonparametric Regression and Image Analysis presents statistical analysis/inference based on approximate models. Developed by the author, this approach consistently...
Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to difficult model comparison problems and offers direct...
In many ways, estimation by an appropriate minimum distance method is one of the most natural ideas in statistics. However, there are many different ways of constructing an appropriate distance between the data and the model: the scope of the topic referred to as "Minimum Distance Estimation" is extremely broad. Filling a statistical resource gap, Stati...
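One concrete instance of the idea (an illustrative sketch, not material from the book): estimate a normal mean by minimizing a Cramer-von Mises-type distance between the model CDF and the empirical CDF of the data.

    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(4)
    data = np.sort(rng.normal(loc=2.0, scale=1.0, size=200))    # hypothetical data
    n = data.size
    plotting_pos = (np.arange(1, n + 1) - 0.5) / n              # empirical CDF levels

    def cvm_distance(mu):
        # squared discrepancy between model CDF and empirical CDF
        return np.sum((stats.norm.cdf(data, loc=mu, scale=1.0) - plotting_pos) ** 2)

    res = optimize.minimize_scalar(cvm_distance, bounds=(0, 4), method="bounded")
    print("minimum distance estimate of the mean:", res.x)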