Over the last forty years there has been growing interest in extending probability theory and statistics to allow for more flexible modelling of imprecision, uncertainty, vagueness and ignorance. The fact that in many real-life situations data uncertainty is present not only in the form of randomness (stochastic uncertainty) but also in the form of imprecision or fuzziness is but one point underlining the need for a broader set of statistical tools. Most such extensions originate in a "softening" of classical methods, allowing, in particular, work with imprecise or vague data, imprecise or generalized probabilities, fuzzy events, etc. About ten years ago the idea of establishi...
Probability theory was for a long time the only well-founded theory of uncertainty. It was viewed either as a powerful tool for modelling random phenomena or as a rational approach to the notion of degree of belief. During the last thirty years, in areas centered around decision theory, artificial intelligence and information processing, numerous approaches extending, or orthogonal to, the existing theory of probability and mathematical statistics have come to the fore. The common feature of these attempts is to allow for softer or wider frameworks that take into account the incompleteness or imprecision of information. Many of these approaches come down to blending interval or fuzzy i...
Man-Machine Interaction is an interdisciplinary field of research covering many aspects of science focused on humans and machines in conjunction. The basic goal of the study is to improve and invent new ways of communication between users and computers, and many different disciplines are involved in reaching the long-term research objective of an intuitive, natural and multimodal way of interacting with machines. The methods by which humans interact with computers are evolving rapidly, and new approaches allow computing technologies to support people on a daily basis, making computers more usable and receptive to users' needs. This monograph is the third edition in...
First published: IMO, 1991.
A thorough treatment of the statistical methods used to analyze doubly truncated data. In The Statistical Analysis of Doubly Truncated Data, an expert team of statisticians delivers an up-to-date review of existing methods used to deal with randomly truncated data, with a focus on the challenging problem of random double truncation. The authors comprehensively introduce doubly truncated data before moving on to discussions of the latest developments in the field. The book offers readers examples with R code along with real data from astronomy, engineering, and the biomedical sciences to illustrate and highlight the methods described within. Linear regression models for doubly truncated respon...
Presents an illustrated biography of the Jewish heroine, Luba Tryszynska, who saved the lives of more than fifty Jewish children in the Bergen-Belsen concentration camp during the winter of 1944/45.