Markov Chain Monte Carlo (MCMC) methods are now an indispensable tool in scientific computing. This book discusses recent developments of MCMC methods, with an emphasis on those making use of past sample information during simulations. The application examples are drawn from diverse fields such as bioinformatics, machine learning, social science, combinatorial optimization, and computational physics. Key Features: Expanded coverage of the stochastic approximation Monte Carlo and dynamic weighting algorithms, which are essentially immune to local trap problems. A detailed discussion of the Monte Carlo Metropolis-Hastings algorithm, which can be used for sampling from distributions with intractable normalizing constants. Up-to-date accounts of recent developments of the Gibbs sampler. Comprehensive overviews of the population-based MCMC algorithms and the MCMC algorithms with adaptive proposals. This book can be used as a textbook or a reference for a one-semester graduate course in statistics, computational biology, engineering, and computer science. Applied or theoretical researchers will also find this book beneficial.
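To give a flavor of the samplers this blurb refers to, here is a minimal random-walk Metropolis-Hastings sketch. The target density, proposal width, and sample count are illustrative assumptions, not examples taken from the book.

```python
import math
import random


def metropolis_hastings(log_target, n_samples=10000, step=1.0, x0=0.0):
    """Minimal random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_target: log of an unnormalized target density.
    step: standard deviation of the Gaussian random-walk proposal.
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)       # symmetric proposal
        log_ratio = log_target(proposal) - log_target(x)
        if math.log(random.random()) < log_ratio:    # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)
    return samples


# Example: sample a standard normal known only up to its normalizing constant.
draws = metropolis_hastings(lambda x: -0.5 * x * x)
print(sum(draws) / len(draws))  # sample mean should be near 0
```

Because only the ratio of target densities is used, the normalizing constant never needs to be computed, which is the basic reason MCMC applies to unnormalized distributions.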
The theory of belief functions, also known as evidence theory or Dempster-Shafer theory, was first introduced by Arthur P. Dempster in the context of statistical inference, and was later developed by Glenn Shafer as a general framework for modeling epistemic uncertainty. These early contributions have been the starting points of many important developments, including the Transferable Belief Model and the Theory of Hints. The theory of belief functions is now well established as a general framework for reasoning with uncertainty, and has well-understood connections to other frameworks such as probability, possibility, and imprecise probability theories. This volume contains the proceedings of the 2nd International Conference on Belief Functions, held in Compiègne, France, on 9-11 May 2012. It gathers 51 contributions describing recent developments both on theoretical issues (including approximation methods, combination rules, continuous belief functions, graphical models and independence concepts) and applications in various areas including classification, image processing, statistics and intelligent vehicles.
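As a concrete illustration of the combination rules mentioned above, the sketch below applies Dempster's rule to two mass functions over a small frame of discernment; the frame and the mass values are hypothetical and chosen only for illustration.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass that would fall on the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; combination undefined.")
    # Renormalize to discard the conflicting mass.
    return {s: m / (1.0 - conflict) for s, m in combined.items()}


# Hypothetical frame {rain, sun} with two independent pieces of evidence.
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"sun"}): 0.3, frozenset({"rain", "sun"}): 0.7}
print(dempster_combine(m1, m2))
```

The normalization step is where the well-known sensitivity of Dempster's rule to highly conflicting evidence arises, one of the theoretical issues the proceedings discuss.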
Analytics for Leaders provides a concise, readable account of a complete system of performance measurement for an enterprise. Based on over twenty years of research and development, the system is designed to provide people at all levels with the quantitative information they need to do their jobs: board members to exercise due diligence about all facets of the business, leaders to decide where to focus attention next, and people to carry out their work well. For senior officers, chapter openers provide quick overviews about the overall approach to a particular stakeholder group and how to connect overall performance measures to business impact. For MBA students, extensive supporting notes and references provide in-depth understanding. For researchers and practitioners, a generic statistical approach is described to encourage new ways of tackling performance measurement issues. The book is relevant to all types of enterprise, large or small, public or private, academic or governmental.
This book encompasses a wide range of important topics. The articles cover the following areas: asymptotic theory and inference, biostatistics, economics and finance, statistical computing and Bayesian statistics, and statistical genetics. Specifically, the issues that are studied include large deviation, deviation inequalities, local sensitivity of model misspecification in likelihood inference, empirical likelihood confidence intervals, uniform convergence rates in density estimation, randomized designs in clinical trials, MCMC and EM algorithms, approximation of p-values in multipoint linkage analysis, use of mixture models in genetic studies, and design and analysis of quantitative traits.
The emergence of data science, in recent decades, has magnified the need for efficient methodology for analyzing data and highlighted the importance of statistical inference. Despite the tremendous progress that has been made, statistical science is still a young discipline and continues to have several different and competing paths in its approaches and its foundations. While the emergence of competing approaches is a natural progression of any scientific discipline, differences in the foundations of statistical inference can sometimes lead to different interpretations and conclusions from the same dataset. The increased interest in the foundations of statistical inference has led to many p...
Canvasses three different perspectives on "stop and frisk" (S&F) police activity in New York City. Provides the legal definition of, and constitutional parameters for, S&F encounters. Considers S&F from the perspectives of both the New York City Police Department (NYPD) and the minority communities that believe they have been most affected by its use. S&F is also examined as part of the NYPD's training regimen and from the point of view of officers who have used the technique. Provides an assessment of the S&F tactic from the perspective of persons who have been "stopped," along with commentary from persons who have observed the tactic's secondary effects.
Knowledge and expertise, especially of the kind that can shape public opinion, have been traditionally the domain of individuals holding degrees awarded by higher learning institutions or occupying formal positions in notable organizations. Expertise is validated by reputations established in an institutionalized marketplace of ideas with a limited number of “available seats” and a stringent process of selection and retention of names, ideas, topics and facts of interest. However, the social media revolution, which has enabled over two billion Internet users not only to consume, but also to produce information and knowledge, has created a secondary and very active informal marketplace of...
A venture into the art and science of measuring religion in everyday life. In an era of rapid technological advances, the measures and methods used to generate data about religion have undergone remarkably little change. Faithful Measures pushes the study of religion into the 21st century by evaluating new and existing measures of religion and introducing new methods for tapping into religious behaviors and beliefs. This book offers a global and innovative approach, with chapters on the intersection of religion and new technology, such as smartphone apps, Google Ngrams, crowdsourcing data, and Amazon buying networks. It also shows how old methods can be improved by using new technology to create online surveys with experimental designs and by developing new ways of mining data from existing information. Chapter contributors thoroughly explain how to employ these new techniques, and offer fresh insights into understanding the complex topic of religion in modern life. Beyond its quantitative contributions, Faithful Measures will be an invaluable resource for inspiring a new wave of creativity and exploration in our connected world.
The volume presents, in a synergistic manner, significant theoretical and practical contributions in the area of social media reputation and authorship measurement, visualization, and modeling. The book justifies and proposes contributions to a future agenda for understanding the requirements for making social media authorship more transparent. Building on work presented in a previous volume of this series, Roles, Trust, and Reputation in Social Media Knowledge Markets, this book discusses new tools, applications, services, and algorithms that are needed for authoring content in a real-time publishing world. These insights may help people who interact and create content through social media ...
This book explores community dynamics within social media. Using Wikipedia as an example, the volume examines communities that rely upon commons-based peer production. Fundamental theoretical principles spanning such domains as organizational configurations, leadership roles, and social evolutionary theory are developed. In the context of Wikipedia, these theories explain how a functional elite of highly productive editors has emerged and why they are responsible for a majority of the content. It explains how the elite shapes the project and how this group tends to become stable and increasingly influential over time. Wikipedia has developed a new and resilient social hierarchy, an adhocracy...