Welcome to our book review site go-pdf.online!


Computer Age Statistical Inference, Student Edition
  • Language: en
  • Pages: 513

Now in paperback and fortified with exercises, this brilliant, enjoyable text demystifies data science, statistics and machine learning.

Computer Age Statistical Inference
  • Language: en
  • Pages: 496

Take an exhilarating journey through the modern revolution in statistics with two of the ringleaders.

An Introduction to the Bootstrap
  • Language: en
  • Pages: 456

  • Type: Book
  • Published: 1994-05-15
  • Publisher: CRC Press

Statistics is a subject of many uses and surprisingly few effective practitioners. The traditional road to statistical knowledge is blocked, for most, by a formidable wall of mathematics. The approach in An Introduction to the Bootstrap avoids that wall. It arms scientists and engineers, as well as statisticians, with the computational techniques they need to analyze and understand complicated data sets.
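The computational idea at the book's core can be sketched in a few lines: estimate the standard error of any statistic by recomputing it on samples drawn with replacement from the data. This is an illustrative sketch, not code from the book; the function and variable names are invented here.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_se(data, statistic, n_boot=2000):
    """Bootstrap standard error: recompute `statistic` on many
    resamples drawn with replacement, then take the spread of
    those replications."""
    n = len(data)
    reps = np.array([statistic(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    return reps.std(ddof=1)

# Example: standard error of the median, a statistic with no
# simple textbook formula -- exactly where resampling shines.
sample = np.array([3.1, 2.4, 5.6, 1.9, 4.4, 3.3, 2.8, 6.0])
se = bootstrap_se(sample, np.median)
```

The same function works unchanged for correlations, trimmed means, or any other statistic, which is the "arbitrarily complicated situations" point made above.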

The Science of Bradley Efron
  • Language: en

  • Type: Book
  • Published: 2008-05-15
  • Publisher: Springer

Nature didn’t design human beings to be statisticians, and in fact our minds are more naturally attuned to spotting the saber-toothed tiger than seeing the jungle he springs from. Yet scientific discovery in practice is often more jungle than tiger. Those of us who devote our scientific lives to the deep and satisfying subject of statistical inference usually do so in the face of a certain under-appreciation from the public, and also (though less so these days) from the wider scientific world. With this in mind, it feels very nice to be over-appreciated for a while, even at the expense of weathering a 70th birthday. (Are we certain that some terrible chronological error hasn’t been made?) Carl Morris and Rob Tibshirani, the two colleagues I’ve worked most closely with, both fit my ideal profile of the statistician as a mathematical scientist working seamlessly across wide areas of theory and application. They seem to have chosen the papers here in the same catholic spirit, and then cajoled an all-star cast of statistical savants to comment on them.

Large-Scale Inference
  • Language: en

We live in a new age for statistical inference, where modern scientific technology such as microarrays and fMRI machines routinely produce thousands and sometimes millions of parallel data sets, each with its own estimation or testing problem. Doing thousands of problems at once is more than repeated application of classical methods. Taking an empirical Bayes approach, Bradley Efron, inventor of the bootstrap, shows how information accrues across problems in a way that combines Bayesian and frequentist ideas. Estimation, testing and prediction blend in this framework, producing opportunities for new methodologies of increased power. New difficulties also arise, easily leading to flawed inferences. This book takes a careful look at both the promise and pitfalls of large-scale statistical inference, with particular attention to false discovery rates, the most successful of the new statistical techniques. Emphasis is on the inferential ideas underlying technical developments, illustrated using a large number of real examples.
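The false discovery rate machinery the book singles out can be illustrated with the Benjamini–Hochberg step-up procedure applied to a simulated screen. The data, names, and the 0.10 level below are illustrative choices, not taken from the book.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.10):
    """Benjamini-Hochberg step-up procedure: find the largest k with
    p_(k) <= alpha * k / m and reject the k smallest p-values.
    Controls the expected false discovery rate at level alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    passed = p[order] <= alpha * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()   # largest passing rank (0-based)
        reject[order[:k + 1]] = True
    return reject

# Simulated large-scale screen: 950 null p-values plus 50 strong
# signals, mimicking "thousands of problems at once".
rng = np.random.default_rng(1)
p = np.concatenate([rng.uniform(size=950), rng.uniform(0, 0.001, size=50)])
discoveries = int(benjamini_hochberg(p, alpha=0.10).sum())
```

Unlike a fixed per-test cutoff, the rejection threshold here adapts to how much signal the whole ensemble of tests shows, which is the "information accrues across problems" idea in the blurb.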

The Science of Bradley Efron
  • Language: en

  • Type: Book
  • Published: 2010-11-19
  • Publisher: Springer


The Jackknife, the Bootstrap, and Other Resampling Plans
  • Language: en
  • Pages: 99

  • Type: Book
  • Published: 1982-01-01
  • Publisher: SIAM

The jackknife and the bootstrap are nonparametric methods for assessing the errors in a statistical estimation problem. They provide several advantages over the traditional parametric approach: the methods are easy to describe and they apply to arbitrarily complicated situations; distribution assumptions, such as normality, are never made. This monograph connects the jackknife, the bootstrap, and many other related ideas such as cross-validation, random subsampling, and balanced repeated replications into a unified exposition. The theoretical development is at an easy mathematical level and is supplemented by a large number of numerical examples. The methods described in this monograph form a useful set of tools for the applied statistician. They are particularly useful in problem areas where complicated data structures are common, for example, in censoring, missing data, and highly multivariate situations.
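The jackknife half of that story is simple enough to sketch directly: leave each observation out in turn, recompute the statistic, and measure the spread of the leave-one-out values. This is a generic illustration, not code from the monograph; the names are invented here.

```python
import numpy as np

def jackknife_se(data, statistic):
    """Jackknife standard error: recompute the statistic with each
    observation deleted in turn, then scale the spread of the
    leave-one-out values by (n - 1) / n."""
    data = np.asarray(data)
    n = len(data)
    loo = np.array([statistic(np.delete(data, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

# For the sample mean the jackknife reproduces the classical
# formula s / sqrt(n) exactly -- a standard sanity check.
x = np.array([4.2, 1.7, 3.3, 5.8, 2.6, 4.9])
se_jack = jackknife_se(x, np.mean)
se_classic = x.std(ddof=1) / np.sqrt(len(x))
```

Note that, as the blurb says, no normality assumption enters anywhere: the recipe is purely computational and applies to any statistic you can recompute.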

Computer Age Statistical Inference, Student Edition
  • Language: en
  • Pages: 514

The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and influence. 'Data science' and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? How does it all fit together? Now in paperback and fortified with exercises, this book delivers a concentrated course in modern statistical thinking. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov Chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. Each chapter ends with class-tested exercises, and the book concludes with speculation on the future direction of statistics and data science.

Breakthroughs in Statistics
  • Language: en
  • Pages: 576

Volume III includes further selections of articles that initiated fundamental changes in statistical methodology. It contains articles published before 1980 that were overlooked in the previous two volumes, plus articles from the 1980s, all chosen after consulting many of today's leading statisticians.

A Mixture Model Approach to Empirical Bayes Testing and Estimation
  • Language: en
  • Pages: 89

Many modern statistical problems require making similar decisions or estimates for many different entities. For example, we may ask whether each of 10,000 genes is associated with some disease, or try to measure the degree to which each is associated with the disease. As in this example, the entities can often be divided into a vast majority of "null" objects and a small minority of interesting ones. Empirical Bayes is a useful technique for such situations, but finding the right empirical Bayes method for each problem can be difficult. Mixture models, however, provide an easy and effective way to apply empirical Bayes. This thesis motivates mixture models by analyzing a simple high-dimensional problem, and shows their practical use by applying them to detecting single nucleotide polymorphisms.
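The two-groups mixture setting described above can be sketched concretely: model the observed scores as a mixture of a null majority and an interesting minority, and let an EM fit estimate the null proportion. This is an illustrative sketch under strong simplifying assumptions (a known N(3, 1) alternative and simulated data), not the thesis's actual method or data.

```python
import numpy as np

def normal_pdf(z, mu=0.0):
    return np.exp(-0.5 * (z - mu) ** 2) / np.sqrt(2 * np.pi)

def fit_two_groups(z, mu1=3.0, n_iter=200):
    """EM fit of the two-groups mixture f(z) = p0*N(0,1) + (1-p0)*N(mu1,1),
    estimating only the null proportion p0.  The posterior null
    probability p0*f0(z)/f(z) is the empirical Bayes 'local false
    discovery rate' for each entity."""
    f0, f1 = normal_pdf(z), normal_pdf(z, mu=mu1)
    p0 = 0.5
    for _ in range(n_iter):
        local_fdr = p0 * f0 / (p0 * f0 + (1 - p0) * f1)  # E-step
        p0 = local_fdr.mean()                            # M-step
    return p0, local_fdr

# 9,000 null z-scores and 1,000 shifted ones: a "vast majority of
# null objects, small minority of interesting ones" (simulated).
rng = np.random.default_rng(2)
z = np.concatenate([rng.normal(0.0, 1.0, 9000), rng.normal(3.0, 1.0, 1000)])
p0_hat, local_fdr = fit_two_groups(z)
```

Entities with a low fitted `local_fdr` are the candidates worth following up, and the mixture fit borrows strength across all 10,000 problems at once rather than treating each in isolation.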