This book provides extensive coverage of the methodology of survival analysis, ranging from introductory-level material to deeper, more advanced topics. The framework is that of proportional and non-proportional hazards models; a structure broad enough to enable the recovery of a large number of established results as well as to open the way to many new developments. The emphasis is on concepts and guiding principles, both logical and graphical. Formal proofs of theorems, propositions and lemmas are gathered together at the end of each chapter, separate from the main presentation. The intended audience includes academic statisticians, biostatisticians, epidemiologists and also researchers in these fields whose focus may be more on the applications than on the theory. The text could provide the basis for a two-semester course on survival analysis and, with this goal in mind, each chapter includes a section with a range of exercises as a teaching aid for instructors.
The place in survival analysis now occupied by proportional hazards models and their generalizations is so large that it is no longer conceivable to offer a course on the subject without devoting at least half of the content to this topic alone. This book focuses on the theory and applications of a very broad class of models – proportional hazards and non-proportional hazards models, the former being viewed as a special case of the latter – which underlie modern survival analysis. Researchers and students alike will find that this text differs from most recent works in that it is mostly concerned with methodological issues rather than the analysis itself.
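As a brief illustration of the framework these descriptions refer to (the notation below is generic rather than taken from the book itself), the proportional hazards model relates the hazard of a subject with covariate vector Z to a baseline hazard through a time-constant coefficient, while the non-proportional version allows that coefficient to change over time:

    \lambda(t \mid Z) = \lambda_0(t)\,\exp\{\beta^\top Z\}        % proportional hazards
    \lambda(t \mid Z) = \lambda_0(t)\,\exp\{\beta(t)^\top Z\}     % non-proportional hazards

Setting \beta(t) \equiv \beta recovers the first model from the second, which is the sense in which proportional hazards can be viewed as a special case of non-proportional hazards.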
This book is about elicitation: facilitating the quantitative expression of subjective judgement, whether about matters of fact (in interaction with subject experts) or about matters of value (in interaction with decision makers or stakeholders). It offers an integrated presentation of procedures and processes that allow analysts and experts to think clearly about numbers, particularly the inputs for decision support systems and models. This presentation draws together research originating in the communities of structured probability elicitation/calibration and of multi-criteria decision analysis, which have often been unaware of each other's developments. Chapters 2 through 9 focus on processes to elicit uncertainty fr...
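As a small illustration of the kind of aggregation rule studied in the structured-elicitation literature (a generic textbook device, not necessarily the procedure this book advocates), the linear opinion pool combines the distributions p_1, ..., p_n supplied by n experts using non-negative weights that sum to one:

    p(\theta) = \sum_{i=1}^{n} w_i\, p_i(\theta), \qquad w_i \ge 0, \quad \sum_{i=1}^{n} w_i = 1

Equal-weight schemes take w_i = 1/n, while performance-weighted schemes derive the w_i from the experts' answers to calibration questions with known values.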
The aim of this book is to bridge the gap between standard textbook models and a range of models where the dynamic structure of the data manifests itself fully. The common denominator of such models is stochastic processes. The authors show how counting processes, martingales, and stochastic integrals fit very nicely with censored data. Beginning with standard analyses such as Kaplan-Meier plots and Cox regression, the presentation progresses to the additive hazard model and recurrent event data. Stochastic processes are also used as natural models for individual frailty; they allow sensible interpretations of a number of surprising artifacts seen in population data. The stochastic process f...
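For readers who want to see where such an analysis begins in practice, here is a minimal sketch using the third-party Python package lifelines and its bundled Rossi recidivism data set (both are illustrative choices and are not tied to this book):

    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.datasets import load_rossi

    # Rossi recidivism data: 'week' is follow-up time, 'arrest' the event indicator
    df = load_rossi()

    # Non-parametric Kaplan-Meier estimate of the survival function
    kmf = KaplanMeierFitter()
    kmf.fit(durations=df["week"], event_observed=df["arrest"])
    print(kmf.survival_function_.head())

    # Semi-parametric Cox proportional hazards regression on all covariates
    cph = CoxPHFitter()
    cph.fit(df, duration_col="week", event_col="arrest")
    cph.print_summary()

Extensions such as the additive hazards model, recurrent events and frailty terms build on exactly this censored-data machinery.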
Includes the decisions of the Supreme Courts of Missouri, Arkansas, Tennessee, and Texas, and Court of Appeals of Kentucky; Aug./Dec. 1886-May/Aug. 1892, Court of Appeals of Texas; Aug. 1892/Feb. 1893-Jan./Feb. 1928, Courts of Civil and Criminal Appeals of Texas; Apr./June 1896-Aug./Nov. 1907, Court of Appeals of Indian Territory; May/June 1927-Jan./Feb. 1928, Courts of Appeals of Missouri and Commission of Appeals of Texas.
Bayesian variable selection has experienced substantial developments over the past 30 years with the proliferation of large data sets. Identifying relevant variables to include in a model allows simpler interpretation, avoids overfitting and multicollinearity, and can provide insights into the mechanisms underlying an observed phenomenon. Variable selection is especially important when the number of potential predictors is substantially larger than the sample size and sparsity can reasonably be assumed. The Handbook of Bayesian Variable Selection provides a comprehensive review of theoretical, methodological and computational aspects of Bayesian methods for variable selection. The topics cov...
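As one concrete instance of the priors reviewed in this literature (generic notation, not the handbook's own), a discrete spike-and-slab prior attaches an inclusion indicator \gamma_j to each regression coefficient \beta_j:

    \beta_j \mid \gamma_j \sim \gamma_j\, \mathcal{N}(0, \tau^2) + (1 - \gamma_j)\, \delta_0,
    \qquad \gamma_j \sim \mathrm{Bernoulli}(\theta), \quad j = 1, \dots, p

The posterior inclusion probabilities P(\gamma_j = 1 \mid \text{data}) then quantify how strongly the data support keeping predictor j, which is what makes such priors attractive when the number of candidate predictors far exceeds the sample size.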
In today's healthcare landscape, there is a pressing need for quantitative methodologies that include the patients' perspective in any treatment decision. Handbook of Generalized Pairwise Comparisons: Methods for Patient-Centric Analysis provides a comprehensive overview of an innovative and powerful statistical methodology that generalizes the traditional Wilcoxon-Mann-Whitney test by extending it to any number of outcomes of any type and by incorporating thresholds of clinical relevance, yielding a single, multidimensional evaluation. The book covers the statistical foundations of generalized pairwise comparisons (GPC), applications in various disease areas, implications for regulatory approvals and benefit-risk analyses, and considerations for patient-centricity in clinical research. With contributions from leading experts in the field, this book stands as an essential resource for a more holistic and patient-centric assessment of treatment effects.
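To make the pairwise idea concrete, here is a deliberately simplified sketch for a single continuous outcome with no censoring (the function name and the example threshold are illustrative, not taken from the book): the net benefit is the proportion of treated/control pairs favouring treatment minus the proportion favouring control, with differences inside the clinical-relevance threshold counted as ties.

    import numpy as np

    def net_benefit(treated, control, threshold=0.0):
        """Generalized pairwise comparison for one continuous outcome.

        Each treated observation is compared with each control observation;
        a pair is favorable if the difference exceeds `threshold`, unfavorable
        if it falls below `-threshold`, and neutral otherwise.
        """
        diffs = np.subtract.outer(np.asarray(treated, float), np.asarray(control, float))
        favorable = np.count_nonzero(diffs > threshold)
        unfavorable = np.count_nonzero(diffs < -threshold)
        return (favorable - unfavorable) / diffs.size

    # Example: only differences larger than 2 points count as clinically relevant
    print(net_benefit([5.0, 7.0, 9.0], [4.0, 6.0, 5.0], threshold=2.0))

The methodology covered in the book extends this favorable-minus-unfavorable logic to several prioritized outcomes of mixed types, censored event times, and the corresponding inference.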
In The Shamrock and the Cross: Irish American Novelists Shape American Catholicism, Eileen P. Sullivan traces changes in nineteenth-century American Catholic culture through a study of Catholic popular literature. Analyzing more than thirty novels spanning the period from the 1830s to the 1870s, Sullivan elucidates the ways in which Irish immigration, which transformed the American Catholic population and its institutions, also changed what it meant to be a Catholic in America. In the 1830s and 1840s, most Catholic fiction was written by American-born converts from Protestant denominations; after 1850, most was written by Irish immigrants or their children, who created characters and plots t...