Bayesian networks are a general and powerful tool that can be applied to a large number of problems involving uncertainty: reasoning, learning, planning and perception. They provide a language that supports efficient algorithms for the automatic construction of expert systems in several different contexts. The range of applications of Bayesian networks currently extends over almost all fields, including engineering, biology and medicine, information and communication technologies, and finance. This book is a collection of original contributions to the methodology and applications of Bayesian networks. It contains recent developments in the field and illustrates, on a sample of applications,...
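To make the idea concrete, here is a minimal sketch of exact inference in a two-node Bayesian network. It is not taken from the book; the variables, names, and probability values are invented purely for illustration.

```python
# Minimal Bayesian-network sketch: a two-node network Rain -> WetGrass.
# All names and numbers below are invented for illustration.

# Prior and conditional probability tables (CPTs).
p_rain = 0.2                          # P(Rain = true)
p_wet_given_rain = {True: 0.9,        # P(WetGrass = true | Rain = true)
                    False: 0.1}       # P(WetGrass = true | Rain = false)

def joint(rain: bool) -> float:
    """Joint probability P(Rain = rain, WetGrass = true)."""
    prior = p_rain if rain else 1.0 - p_rain
    return prior * p_wet_given_rain[rain]

# Sum out Rain to get the evidence probability P(WetGrass = true),
# then apply Bayes' rule to obtain the posterior on Rain.
p_wet = joint(True) + joint(False)
posterior = joint(True) / p_wet
print(f"P(Rain | WetGrass) = {posterior:.3f}")   # ~0.692
```

Even this tiny example shows the pattern that scales up to full networks: local conditional tables are combined into a joint distribution, and queries are answered by summing out the variables not of interest.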
Ten years ago Bill Gale of AT&T Bell Laboratories was the primary organizer of the first Workshop on Artificial Intelligence and Statistics. In the early days of the Workshop series it seemed clear that researchers in AI and statistics had common interests, though with different emphases, goals, and vocabularies. In learning and model selection, for example, the historical AI goal of building autonomous agents probably contributed to a focus on parameter-free learning systems, which relied little on an external analyst's assumptions about the data. This seemed at odds with statistical strategy, which stemmed from the view that model selection methods were tools to augment, not replace, the abilities...
Uncertainty Proceedings 1994
This book presents a unique approach to stream data mining. Unlike the vast majority of previous approaches, which are largely based on heuristics, it highlights methods and algorithms that are mathematically justified. First, it describes how to adapt static decision trees to accommodate data streams; in this regard, new splitting criteria are developed to guarantee that they are asymptotically equivalent to the classical batch tree. Moreover, new decision trees are designed, leading to the original concept of hybrid trees. In turn, nonparametric techniques based on Parzen kernels and orthogonal series are employed to address concept drift in the problems of non-stationary regression and classification in a time-varying environment. Lastly, the extremely challenging problem of designing ensembles and automatically choosing their sizes is described and solved. Given its scope, the book is intended for a professional audience of researchers and practitioners who deal with stream data, e.g. in telecommunications, banking, and sensor networks.
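As an illustration of the kind of split decision such trees make on a stream, the sketch below shows the classic Hoeffding-bound test used in streaming decision trees such as VFDT. The book develops its own, different criteria with stronger guarantees, so this is only a stand-in for the general idea: commit to a split once enough examples have been seen that the best attribute is unlikely to be overtaken.

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """Deviation eps such that, after n samples, the true mean lies
    within eps of the observed mean with probability at least 1 - delta."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(gain_best: float, gain_second: float,
                 n: int, value_range: float = 1.0,
                 delta: float = 1e-6) -> bool:
    """Split when the best attribute's gain beats the runner-up by more
    than the bound, so the choice agrees with the batch tree with
    probability at least 1 - delta."""
    eps = hoeffding_bound(value_range, delta, n)
    return gain_best - gain_second > eps

# After 5000 stream examples, with observed gains 0.30 vs 0.25:
print(should_split(0.30, 0.25, n=5000))  # True: 0.05 > eps ~= 0.037
```

The appeal of such criteria is that the tree never needs to store the raw stream, only sufficient statistics per leaf, while still matching the batch-trained tree with high probability.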
This volume presents the results of biological and medical research together with the statistical methods used to obtain them. Nowadays the fields of biology and experimental medicine rely on techniques for processing experimental data and for evaluating hypotheses. It is increasingly necessary to stimulate awareness of the importance of statistical techniques (and of the possible traps they can hide) by using real data in concrete situations drawn from research activity.
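As a rough illustration of the sort of hypothesis evaluation meant here, a basic two-sample t-test follows. It is not drawn from the volume, and the measurements are invented.

```python
from scipy import stats

# Invented measurements for two groups, e.g. a baseline and a treated group.
control   = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8]
treatment = [5.6, 5.8, 5.5, 6.0, 5.7, 5.9]

# Two-sample t-test of the null hypothesis that the group means are equal.
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value suggests the means differ; the "trap" the text warns
# about is reading significance into p-values without checking the test's
# assumptions (normality, comparable variances, independent samples).
```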
There has been a surge of interest in methods of analysing data that typically arise from surveys of various kinds, or from experiments, in which the number of people, animals, places or objects occupying various categories is counted. In this textbook, first published in 1984, Dr Fingleton describes some techniques centred on the log-linear model from the perspective of the social, behavioural and environmental scientist.
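As a sketch of the kind of model the book centres on, the following fits an independence log-linear model to a 2x2 contingency table by treating the cell counts as a Poisson GLM. The counts and the use of statsmodels are illustrative assumptions, not the book's own examples.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented 2x2 contingency table of counts, one row per cell.
df = pd.DataFrame({
    "region":  ["urban", "urban", "rural", "rural"],
    "outcome": ["yes", "no", "yes", "no"],
    "count":   [30, 70, 55, 45],
})

# Main-effects-only (independence) model: log(mu) = intercept + region + outcome.
# A large residual deviance relative to its degrees of freedom is evidence
# of association between the two classifying factors.
model = smf.glm("count ~ C(region) + C(outcome)",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
print("Residual deviance:", round(model.deviance, 2))
```

The same machinery extends directly to higher-dimensional tables, where interaction terms in the formula encode which associations the analyst is willing to entertain.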
In recent years it has become apparent that an important part of the theory of artificial intelligence is concerned with reasoning on the basis of uncertain, incomplete, or inconsistent information. A variety of formalisms have been developed, including nonmonotonic logic, fuzzy sets, possibility theory, belief functions, and dynamic models of reasoning such as belief revision and Bayesian networks. Several European research projects have been formed in the area, and the first European conference was held in 1991. This volume contains the papers accepted for presentation at ECSQARU-93, the European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty, held at the University of Granada, Spain, November 8-10, 1993.
Provides a more cohesive and sharply focused treatment of fundamental concepts and theoretical background material, with particular attention given to better delineating connections to varying applications. The exposition is driven by additional examples and exercises.
This book is a detailed description of the basics of three-dimensional digital image processing. A 3D digital image (abbreviated as “3D image” below) is a digitized representation of a 3D object or an entire 3D space, stored in a computer as a 3D array. Whereas ordinary digital image processing deals with an image plane divided into square elements called “pixels”, each carrying a density level, the “image plane” in three dimensions is divided into cubical elements (called “voxels”) that carry the corresponding density levels. In the context of image processing, in many cases 3D image processing will refer to the input of multiple 2D images ...
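A minimal sketch of this voxel representation follows; the dimensions, the random stand-in data, and the threshold are invented for illustration and are not from the book.

```python
import numpy as np

# A 3D image as described above: a 3D array of density values whose
# elements are voxels. A volume is often built by stacking 2D slices,
# e.g. np.stack(slices, axis=0).
depth, height, width = 64, 128, 128
volume = np.random.rand(depth, height, width)   # stand-in for scan data

# A single voxel is addressed by (z, y, x), just as a pixel is by (y, x).
print("voxel (10, 20, 30):", volume[10, 20, 30])

# Simple voxel-wise operation: binarize the volume at a density threshold,
# the 3D analogue of thresholding a 2D image.
binary = volume > 0.5
print("object voxels:", int(binary.sum()), "of", volume.size)
```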