Computational inference is based on an approach to statistical methods that uses modern computational power to simulate distributional properties of estimators and test statistics. This book describes computationally intensive statistical methods in a unified presentation, emphasizing techniques, such as the PDF decomposition, that arise in a wide range of methods.
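As a rough illustration of that idea (a sketch, not an excerpt from the book), the following Python snippet uses simulation to approximate the sampling distribution of an estimator, here the sample median under a standard normal model; the sample size and replication count are arbitrary choices.

```python
# Minimal sketch (not from the book): approximate the sampling distribution
# of the sample median under a standard normal model by simulation.
import numpy as np

rng = np.random.default_rng(42)
n, n_rep = 25, 10_000

# Simulate n_rep datasets of size n and compute the median of each.
medians = np.array([np.median(rng.standard_normal(n)) for _ in range(n_rep)])

# Monte Carlo estimates of distributional properties of the estimator.
print("estimated bias:    ", medians.mean())           # true median is 0
print("estimated std err: ", medians.std(ddof=1))
print("95% quantile range:", np.quantile(medians, [0.025, 0.975]))
```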
Monte Carlo simulation has become one of the most important tools in all fields of science. This book surveys the basic techniques and principles of the subject, as well as general techniques useful in more complicated models and in novel settings. The emphasis throughout is on practical methods that work well in current computing environments.
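For a sense of the basic technique (an illustrative sketch, not material from the book), the plain Monte Carlo estimator of an expectation, together with its standard error, looks like this; the integrand and sample size are arbitrary.

```python
# Minimal sketch: the basic Monte Carlo estimator E[g(X)] ~ (1/N) * sum g(X_i)
# with its standard error.  Here g(x) = exp(-x**2) and X is uniform on (0, 1),
# so the target is the integral of exp(-x**2) over (0, 1).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.uniform(0.0, 1.0, size=N)
g = np.exp(-x**2)

estimate = g.mean()
std_error = g.std(ddof=1) / np.sqrt(N)      # Monte Carlo standard error
print(f"integral ≈ {estimate:.5f} ± {1.96 * std_error:.5f}")   # exact value ≈ 0.74682
```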
Matrix algebra is one of the most important areas of mathematics for data analysis and for statistical theory. This much-needed work presents the relevant aspects of the theory of matrix algebra for applications in statistics. It moves on to consider the various types of matrices encountered in statistics, such as projection matrices and positive definite matrices, and describes the special properties of those matrices. Finally, it covers numerical linear algebra, beginning with a discussion of the basics of numerical computations, and following up with accurate and efficient algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors.
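A minimal sketch of two of the matrix types mentioned above, using synthetic data rather than anything from the book: a positive definite Gram matrix factored by Cholesky to solve least-squares normal equations, and the associated projection ("hat") matrix.

```python
# Minimal sketch (synthetic data): a positive definite Gram matrix, its
# Cholesky factorization, and the projection matrix onto the column space of X.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(50)

XtX = X.T @ X                        # positive definite Gram matrix
L = np.linalg.cholesky(XtX)          # XtX = L @ L.T
beta = np.linalg.solve(L.T, np.linalg.solve(L, X.T @ y))   # normal equations

H = X @ np.linalg.solve(XtX, X.T)    # projection ("hat") matrix onto col(X)
print("beta:", beta)
print("H is idempotent:", np.allclose(H @ H, H))
```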
Provides a more elementary introduction to these topics than other available books; Gentle is the author of two other Springer books.
This book covers accurate and efficient computer algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors. Regardless of the software system used, the book describes and gives examples of the use of modern computer software for numerical linear algebra. It begins with a discussion of the basics of numerical computations, and then describes the relevant properties of matrix inverses, factorisations, matrix and vector norms, and other topics in linear algebra. The book is essentially self-contained, with the topics addressed constituting the essential material for an introductory course in statistical computing. Numerous exercises allow the text to be used for a first course in statistical computing or as a supplementary text for various courses that emphasise computations.
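To illustrate the "basics of numerical computations" theme (a sketch with arbitrary choices, not an example from the text): the condition number of a matrix, defined through matrix norms, bounds how much accuracy can be lost when a linear system is solved in floating point. The Hilbert matrix is a standard ill-conditioned case.

```python
# Minimal sketch: matrix norms, condition number, and the accuracy of a
# floating-point solve, using the ill-conditioned Hilbert matrix.
import numpy as np

n = 8
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # Hilbert matrix
x_true = np.ones(n)
b = H @ x_true

x_hat = np.linalg.solve(H, b)
print("condition number (2-norm):", np.linalg.cond(H))
print("relative error of solve:  ",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```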
The Handbook of Computational Statistics - Concepts and Methods (second edition) is a revision of the first edition published in 2004, and contains additional comments and updated information on the existing chapters, as well as three new chapters addressing recent work in the field of computational statistics. This new edition is divided into four parts in the same way as the first edition. It begins with "How Computational Statistics became the backbone of modern data science" (Ch. 1): an overview of the field of Computational Statistics, how it emerged as a separate discipline, and how its own development mirrored that of hardware and software, including a discussion of current active research...
In this book the authors have assembled the "best techniques from a great variety of sources, establishing a benchmark for the field of statistical computing." ---Mathematics of Computation. "The text is highly readable and well illustrated with examples. The reader who intends to take a hand in designing his own regression and multivariate packages will find a storehouse of information and a valuable resource in the field of statistical computing."
Any financial asset that is openly traded has a market price. Except in extreme market conditions, the market price may be more or less than a “fair” value. Fair value is likely to be some complicated function of the current intrinsic value of tangible or intangible assets underlying the claim and our assessment of the characteristics of the underlying assets with respect to the expected rate of growth, future dividends, volatility, and other relevant market factors. Some of the factors that affect the price can be measured at the time of a transaction with reasonably high accuracy. Most factors, however, relate to expectations about the future and to subjective issues, such as current management, corporate policies, and market environment, that could affect the future financial performance of the underlying assets. Models are thus needed to describe the stochastic factors and environment, and their implementations inevitably require computational finance tools.
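As one concrete illustration of such a stochastic model (chosen here for simplicity, not because the text prescribes it), geometric Brownian motion for an asset price can be simulated in a few lines; the drift, volatility, and horizon below are arbitrary example values.

```python
# Minimal sketch: simulate asset-price paths under geometric Brownian motion
# using the exact log-price discretization.  All parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(7)
s0, mu, sigma = 100.0, 0.05, 0.20      # initial price, drift, volatility (annualized)
T, n_steps, n_paths = 1.0, 252, 10_000
dt = T / n_steps

z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
s_T = s0 * np.exp(log_paths[:, -1])    # terminal prices

print("mean terminal price:  ", s_T.mean())        # ≈ s0 * exp(mu * T)
print("std of terminal price:", s_T.std(ddof=1))
```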
Statistical Analysis of Financial Data covers the use of statistical analysis and the methods of data science to model and analyze financial data. The first chapter is an overview of financial markets, describing market operations and using exploratory data analysis to illustrate the nature of financial data. R is the software used to obtain the data for the examples in the first chapter, to perform all computations, and to produce the graphs. However, discussion of R is deferred to an appendix to the first chapter, where the basics of R, especially those most relevant in financial applications, are presented and illustrated. The appendix also describes how to use R to obtain current financial...
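The book's examples use R; purely as a sketch of the kind of exploratory computation described (log returns, volatility, excess kurtosis), here is an analogous calculation in Python on synthetic prices, since no data source from the text is assumed here.

```python
# Minimal sketch: exploratory summaries of daily log returns computed from
# a synthetic price series (no real market data is used).
import numpy as np

rng = np.random.default_rng(3)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0003, 0.01, size=1000)))  # synthetic daily prices

log_returns = np.diff(np.log(prices))
print("mean daily log return:", log_returns.mean())
print("daily volatility:     ", log_returns.std(ddof=1))
print("annualized volatility:", log_returns.std(ddof=1) * np.sqrt(252))
print("excess kurtosis:      ",
      ((log_returns - log_returns.mean())**4).mean() / log_returns.var(ddof=0)**2 - 3)
```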
This book, now in a thoroughly revised second edition, provides a comprehensive and accessible introduction to modern set theory. Following an overview of basic notions in combinatorics and first-order logic, the author outlines the main topics of classical set theory in the second part, including Ramsey theory and the axiom of choice. The revised edition contains new permutation models and recent results in set theory without the axiom of choice. The third part explains the sophisticated technique of forcing in great detail, now including a separate chapter on Suslin’s problem. The technique is used to show that certain statements are neither provable nor disprovable from the axioms of se...