Forecasting and time series prediction have enjoyed considerable attention over the last few decades, fostered by impressive advances in observational capabilities and measurement procedures. On June 5-7, 2013, an international Workshop on Industry Practices for Forecasting was held in Paris, France, organized and supported by the OSIRIS Department of the Electricité de France Research and Development Division. The chapters in this volume stress the need for advances in theoretical understanding to go hand-in-hand with the widespread practical application of forecasting in industry. In keeping with tradition, both theoretical statistical results and practical contributions on this active field ...
The political economy literature has put forward a multitude of hypotheses regarding the drivers of structural reforms, but few, if any, empirically robust findings have emerged thus far. To make progress, we draw a parallel with model uncertainty in the growth literature and provide a new version of the Bayesian averaging of maximum likelihood estimates (BAMLE) technique tailored to binary logit models. Relying on a new database of major past labor and product market reforms in advanced countries, we test a large set of variables for robust correlation with reform in each area. We find widespread support for the crisis-induces-reform hypothesis. Outside pressure increases the likelihood of reform in certain areas: reforms are more likely when other countries also undertake them and when there is formal pressure to implement them. Other robust correlates are more specific to certain areas—for example, international pressure and political factors are most relevant for product market and job protection reforms, respectively.
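To give a feel for the technique, here is a toy sketch in the spirit of Bayesian averaging of maximum likelihood estimates for a binary logit: fit each candidate specification by maximum likelihood, weight it via the common BIC approximation to the marginal likelihood, and average the coefficients. This is an illustration under simplifying assumptions, not the paper's exact procedure, and all data below are simulated.

```python
# Toy BAMLE-style averaging for a binary logit: BIC-weighted model
# averaging over all non-empty subsets of candidate regressors.
import numpy as np
import statsmodels.api as sm
from itertools import combinations

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p))                 # hypothetical reform correlates
eta = 0.8 * X[:, 0] - 0.5 * X[:, 1]         # true model uses two of them
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

bics, coefs = [], []
for k in range(1, p + 1):
    for s in combinations(range(p), k):
        Xs = sm.add_constant(X[:, s])
        fit = sm.Logit(y, Xs).fit(disp=0)   # maximum likelihood fit
        bics.append(fit.bic)
        beta = np.zeros(p)                  # embed in full coefficient vector
        beta[list(s)] = fit.params[1:]
        coefs.append(beta)

# exp(-BIC/2) approximates the marginal likelihood up to a constant
w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()
beta_bma = w @ np.array(coefs)              # model-averaged estimates
print(beta_bma.round(3))
```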
In nonparametric and high-dimensional statistical models, the classical Gauss–Fisher–Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
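As a small taste of the kernel methods the book treats, here is a minimal convolution kernel density estimator with a Gaussian kernel; the bandwidth is fixed ad hoc for illustration, whereas the adaptive theory in the book is precisely about choosing it from the data.

```python
# Minimal convolution kernel density estimator with a Gaussian kernel:
# f_hat(x) = (1/(n*h)) * sum_i K((x - X_i)/h), K = standard normal pdf.
import numpy as np

def kde(x_grid, data, h):
    u = (x_grid[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return K.mean(axis=1) / h

rng = np.random.default_rng(0)
data = rng.normal(size=300)
grid = np.linspace(-4, 4, 9)
print(kde(grid, data, h=0.4).round(3))      # density estimates on the grid
```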
Survival analysis deals broadly with data arising from clinical trials. Censoring, truncation, and missing data create analytical challenges, and the statistical methods and inference require novel approaches. Statistical properties of the estimators and tests, essentially asymptotic ones, are aptly handled in the counting process framework, which is drawn from the broader field of stochastic calculus. With the explosion of data generation over the past two decades, survival data too has grown to a gigantic size. Most statistical methods developed before the millennium were based on a linear approach even in the face of the complex nature of survival d...
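To make the setting concrete, here is a minimal Kaplan-Meier (product-limit) estimator for right-censored data; this is a standard construction for illustration, not code from the book, and the example data are made up.

```python
# Minimal Kaplan-Meier estimator for right-censored survival data.
# times: observed times; events: 1 = event occurred, 0 = censored.
import numpy as np

def kaplan_meier(times, events):
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    uniq = np.unique(times[events == 1])     # distinct event times
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)         # size of the risk set at t-
        d = np.sum((times == t) & (events == 1))
        s *= 1.0 - d / at_risk               # product-limit update
        surv.append((t, s))
    return surv

print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 1, 0, 1, 0, 1]))
```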
Bayesian nonparametrics works - theoretically, computationally. The theory provides highly flexible models whose complexity grows appropriately with the amount of data. Computational issues, though challenging, are no longer intractable. All that is needed is an entry point: this intelligent book is the perfect guide to what can seem a forbidding landscape. Tutorial chapters by Ghosal, Lijoi and Prünster, Teh and Jordan, and Dunson advance from theory, to basic models and hierarchical modeling, to applications and implementation, particularly in computer science and biostatistics. These are complemented by companion chapters by the editors and Griffin and Quintana, providing additional models, examining computational issues, identifying future growth areas, and giving links to related topics. This coherent text gives ready access both to underlying principles and to state-of-the-art practice. Specific examples are drawn from information retrieval, NLP, machine vision, computational biology, biostatistics, and bioinformatics.
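One of the basic models the tutorial chapters build on is the Dirichlet process, and its stick-breaking construction fits in a few lines. A minimal sketch, truncated at a finite number of atoms and using a standard normal base measure for illustration:

```python
# Truncated stick-breaking construction of a Dirichlet process
# DP(alpha, G0), with G0 taken as the standard normal for illustration.
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    v = rng.beta(1.0, alpha, size=n_atoms)   # stick-breaking fractions
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    atoms = rng.normal(size=n_atoms)         # atom locations drawn from G0
    return w, atoms

rng = np.random.default_rng(0)
w, atoms = stick_breaking(alpha=2.0, n_atoms=20, rng=rng)
print(w.sum())    # close to 1 once the truncation level is large enough
```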
This modern approach integrates classical and contemporary methods, fusing theory and practice and bridging the gap to statistical learning.
A brand new, fully updated edition of a popular classic on matrix differential calculus with applications in statistics and econometrics. This exhaustive, self-contained book on matrix theory and matrix differential calculus provides a treatment of matrix calculus based on differentials and shows how easy it is to use this theory once you have mastered the technique. Jan Magnus, who, along with the late Heinz Neudecker, pioneered the theory, develops it further in this new edition and provides many examples along the way to support it. Matrix calculus has become an essential tool for quantitative methods in a large number of applications, ranging from social and behavioral sciences to econome...
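The flavor of the differential approach can be shown on a standard identity: for f(x) = x'Ax, the differential is df = x'(A + A')dx, hence the gradient is (A + A')x. A quick numerical check of this identity, purely illustrative:

```python
# Numerical check of the matrix-calculus identity:
# for f(x) = x'Ax, df = x'(A + A')dx, hence grad f = (A + A')x.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
x = rng.normal(size=4)

grad_analytic = (A + A.T) @ x

eps = 1e-6
grad_numeric = np.array([
    ((x + eps * e) @ A @ (x + eps * e) - (x - eps * e) @ A @ (x - eps * e))
    / (2 * eps)
    for e in np.eye(4)                       # central differences, coordinatewise
])
print(np.allclose(grad_analytic, grad_numeric, atol=1e-5))  # True
```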
Bayesian nonparametrics comes of age with this landmark text synthesizing theory, methodology and computation.
Second Generation Wavelets and Applications introduces "second generation wavelets" and the lifting transform, which extend the traditional benefits of wavelets to a wide range of new areas in signal processing, data processing and computer graphics. This book details the mathematical fundamentals of the lifting transform and illustrates the latest applications of the transform in signal and image processing, numerical analysis, scattered data smoothing and rendering of computer images.
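The lifting scheme itself is short enough to sketch: a Haar step splits the signal into even and odd samples, predicts the odd from the even (yielding details), and updates the even to preserve the running average. A minimal sketch, assuming an even-length signal:

```python
# One level of the Haar wavelet transform via lifting:
# split -> predict (detail coefficients) -> update (coarse averages).
import numpy as np

def haar_lifting(signal):
    even = signal[0::2].astype(float)
    odd = signal[1::2].astype(float)
    detail = odd - even              # predict odd samples from even ones
    coarse = even + detail / 2.0     # update: coarse = pairwise average
    return coarse, detail

def haar_inverse(coarse, detail):
    even = coarse - detail / 2.0     # undo the update step
    odd = even + detail              # undo the predict step
    out = np.empty(even.size + odd.size)
    out[0::2], out[1::2] = even, odd
    return out

x = np.array([2.0, 4.0, 6.0, 8.0])
c, d = haar_lifting(x)
print(c, d, haar_inverse(c, d))      # [3. 7.] [2. 2.] [2. 4. 6. 8.]
```

Because each lifting step is trivially invertible, perfect reconstruction holds by construction, which is what makes the scheme attractive beyond the classical Fourier-based setting.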
New technologies allow us to handle increasingly large datasets, while monitoring devices are becoming ever more sophisticated. This high-tech progress produces statistical units sampled over finer and finer grids. As the measurement points become closer, the data can be considered as observations varying over a continuum. This intrinsic continuous data (called functional data) can be found in various fields of science, including biomechanics, chemometrics, econometrics, environmetrics, geophysics, medicine, etc. The failure of standard multivariate statistics to analyze such functional data has led the statistical community to develop appropriate statistical methodologies, called Functional...
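A first step in such an analysis is to treat each curve, observed on a common grid, as a single statistical unit and compute functional summaries such as the mean function and the covariance surface. A minimal sketch with simulated curves; all names and data here are illustrative:

```python
# Minimal functional-data sketch: curves on a common grid treated as
# single statistical units; compute mean function and covariance surface.
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 50)                 # common observation grid
n_curves = 30
amp = rng.normal(1.0, 0.2, n_curves)         # random amplitudes
phase = rng.normal(0.0, 0.3, n_curves)       # random phases
curves = amp[:, None] * np.sin(2 * np.pi * (grid[None, :] + phase[:, None]))
curves += rng.normal(scale=0.1, size=curves.shape)   # measurement noise

mean_fn = curves.mean(axis=0)                # pointwise mean function
cov_surface = np.cov(curves, rowvar=False)   # covariance over the grid
print(mean_fn.shape, cov_surface.shape)      # (50,) (50, 50)
```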