This paper proposes a new cumulative sum (CUSUM) X chart under the assumption of uncertainty using the neutrosophic statistic (NS). The performance of the new chart is investigated in terms of the neutrosophic run length properties using the Monte Carlo simulation approach.
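As a minimal sketch of the Monte Carlo run-length approach the abstract mentions, the snippet below estimates the average run length (ARL) of a classical two-sided CUSUM chart for standard-normal data; it is not the paper's neutrosophic implementation, and the reference values k = 0.5 and h = 4 are a common textbook parameterization chosen here for illustration.

```python
import numpy as np

def cusum_run_length(k, h, mu=0.0, sigma=1.0, shift=0.0, max_n=10000, rng=None):
    """Run one two-sided CUSUM sequence on N(mu + shift*sigma, sigma) data;
    return the first time C+ or C- exceeds the decision limit h."""
    if rng is None:
        rng = np.random.default_rng()
    c_plus = c_minus = 0.0
    for t in range(1, max_n + 1):
        z = (rng.normal(mu + shift * sigma, sigma) - mu) / sigma
        c_plus = max(0.0, c_plus + z - k)
        c_minus = max(0.0, c_minus - z - k)
        if c_plus > h or c_minus > h:
            return t
    return max_n  # censored: no signal within max_n observations

def average_run_length(k, h, shift=0.0, reps=2000, seed=1):
    """Monte Carlo estimate of the ARL over `reps` independent sequences."""
    rng = np.random.default_rng(seed)
    return float(np.mean([cusum_run_length(k, h, shift=shift, rng=rng)
                          for _ in range(reps)]))

# In-control ARL for k=0.5, h=4 (textbook value is roughly 168);
# a one-sigma shift should drive the ARL down sharply.
print(average_run_length(0.5, 4.0))
print(average_run_length(0.5, 4.0, shift=1.0))
```

A neutrosophic variant would evaluate such a simulation at the endpoints of interval-valued parameters, yielding an interval of run lengths rather than a single number.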
This thirteenth volume of Collected Papers is an eclectic tome of 88 papers in various fields of sciences, such as astronomy, biology, calculus, economics, education and administration, game theory, geometry, graph theory, information fusion, decision making, instantaneous physics, quantum physics, neutrosophic logic and set, non-Euclidean geometry, number theory, paradoxes, philosophy of science, scientific research methods, statistics, and others, structured in 17 chapters (Neutrosophic Theory and Applications; Neutrosophic Algebra; Fuzzy Soft Sets; Neutrosophic Sets; Hypersoft Sets; Neutrosophic Semigroups; Neutrosophic Graphs; Superhypergraphs; Plithogeny; Information Fusion; Statistics;...
In this paper we show that Neutrosophic Statistics is an extension of Interval Statistics, since it deals with all kinds of indeterminacy (with respect to data, inferential procedures, probability distributions, graphical representations, etc.), allows for indeterminacy reduction, and uses neutrosophic probability, which is more general than imprecise and classical probabilities and has more detailed corresponding probability density functions, whereas Interval Statistics deals only with indeterminacy that can be represented by intervals. We also respond to the arguments of Woodall et al. [1]. We show that not all indeterminacies (uncertainties) can be represented by intervals. Moreover, in some applications, we should use hesitant sets (which have less indeterminacy) instead of intervals. We redirect the authors to Plithogenic Probability and Plithogenic Statistics, which are the most general forms of Multivariate Probability and Multivariate Statistics respectively (including, of course, Imprecise Probability and Interval Statistics as subclasses).
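The distinction between interval-valued and hesitant-set data can be sketched concretely. Below, each interval datum admits a whole continuum of values, so an interval mean is itself an interval, while each hesitant datum admits only a finite set of candidates, so the set of attainable means is finite; the data are hypothetical and chosen only to illustrate the reduced indeterminacy of hesitant sets.

```python
from itertools import product

# An interval-valued datum is a whole range (lo, hi); a hesitant datum is a
# finite set of candidate values, carrying less indeterminacy than an interval.
interval_data = [(1.0, 2.0), (3.0, 3.5), (2.0, 4.0)]
hesitant_data = [{1.0, 2.0}, {3.0}, {2.0, 4.0}]

# Interval mean: average the endpoints componentwise -> an interval.
lo = sum(a for a, _ in interval_data) / len(interval_data)
hi = sum(b for _, b in interval_data) / len(interval_data)
print((lo, hi))

# Hesitant mean: the finite set of all means attainable by picking one
# candidate value from each datum.
means = sorted({sum(c) / len(c) for c in product(*hesitant_data)})
print(means)
```

The interval mean here spans a continuum, while the hesitant mean collapses to just four discrete possibilities, which is the sense in which hesitant sets carry less indeterminacy.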
Research in the statistical analysis of extreme values has flourished over the past decade: new probability models, inference and data analysis techniques have been introduced; and new application areas have been explored. Statistics of Extremes comprehensively covers a wide range of models and application areas, including risk and insurance: a major area of interest and relevance to extreme value theory. Case studies are introduced providing a good balance of theory and application of each model discussed, incorporating many illustrated examples and plots of data. The last part of the book covers some interesting advanced topics, including time series, regression, multivariate and Bayesian modelling of extremes, the use of which has huge potential.
A unique approach to understanding the foundations of statistical quality control with a focus on the latest developments in nonparametric control charting methodologies. Statistical Process Control (SPC) methods have a long and successful history and have revolutionized many facets of industrial production around the world. This book addresses recent developments in statistical process control, bringing the modern use of computers and simulation, along with theory, within the reach of both researchers and practitioners. The emphasis is on the burgeoning field of nonparametric SPC (NSPC) and the many new methodologies developed by researchers worldwide that are revolutionizing SPC. Over the...
A major tool for quality control and management, statistical process control (SPC) monitors sequential processes, such as production lines and Internet traffic, to ensure that they work stably and satisfactorily. Along with covering traditional methods, Introduction to Statistical Process Control describes many recent SPC methods that improve upon
Some of the key mathematical results are stated without proof in order to make the underlying theory accessible to a wider audience. The book assumes a knowledge only of basic calculus, matrix algebra, and elementary statistics. The emphasis is on methods and the analysis of data sets. The logic and tools of model-building for stationary and non-stationary time series are developed in detail, and numerous exercises, many of which make use of the included computer package, provide the reader with ample opportunity to develop skills in this area. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space models, with an optional chapter...
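The kind of model-building the blurb describes can be sketched in a few lines for the simplest ARMA member, an AR(1) process: simulate x_t = phi * x_{t-1} + e_t, then recover phi via the lag-1 Yule-Walker estimate. This is an illustrative sketch, not the book's accompanying computer package.

```python
import numpy as np

def simulate_ar1(phi, n=2000, sigma=1.0, seed=0):
    """Simulate the AR(1) process x_t = phi * x_{t-1} + e_t, e_t ~ N(0, sigma)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    e = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def yule_walker_ar1(x):
    """Estimate phi as the lag-1 sample autocorrelation (Yule-Walker for AR(1))."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

x = simulate_ar1(0.7)
print(yule_walker_ar1(x))  # typically close to the true phi = 0.7
```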
Devoted to the problem of fitting parametric probability distributions to data, this treatment uniquely unifies loss modeling in one book. The data sets used are drawn from the insurance industry, but the methods can be applied in other contexts. Emphasis is on the distribution of single losses related to claims made against various types of insurance policies. Includes five sets of insurance data as examples.
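Fitting a parametric distribution to loss data often reduces to maximum likelihood. As a minimal sketch, the lognormal MLE has a closed form: mu and sigma are the mean and (population) standard deviation of the log-claims. The claim amounts below are synthetic and for illustration only, not from the book's data sets.

```python
import math

# Hypothetical single-loss claim amounts (synthetic, for illustration only)
claims = [1200.0, 450.0, 3100.0, 780.0, 9500.0, 2100.0, 640.0, 1500.0]

# Closed-form lognormal MLE: mean and population std. dev. of the log-claims
logs = [math.log(c) for c in claims]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
print(mu, sigma)
```

The fitted mu and sigma can then be plugged into the lognormal density to price or reserve against future single losses.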