An overview of theoretical and computational approaches to neuroimaging.
Advances in training models with log-linear structures, with topics including variable selection, the geometry of neural nets, and applications. Log-linear models play a key role in modern big data and machine learning applications. From simple binary classification models through partition functions, conditional random fields, and neural nets, log-linear structure is closely related to performance in certain applications and influences the fitting techniques used to train models. This volume covers recent advances in training such models, including the underlying geometry, optimization techniques, and multiple applications. The first chapter shows readers the inner workings...
An overview of recent efforts in the machine learning community to deal with dataset and covariate shift, which occur when test and training inputs and outputs have different distributions. Dataset shift is a common problem in predictive modeling that occurs when the joint distribution of inputs and outputs differs between training and test stages. Covariate shift, a particular case of dataset shift, occurs when only the input distribution changes. Dataset shift is present in most practical applications, for reasons ranging from the bias introduced by experimental design to the irreproducibility of the testing conditions at training time. (An example is e-mail spam filtering, which may fail...
Connectionist Models of Learning, Development and Evolution comprises a selection of papers presented at the Sixth Neural Computation and Psychology Workshop - the only international workshop devoted to connectionist models of psychological phenomena. With a main theme of neural network modelling in the areas of evolution, learning, and development, the papers are organized into six sections: the neural basis of cognition; development and category learning; implicit learning; social cognition; evolution; and semantics. Covering artificial intelligence, mathematics, psychology, neurobiology, and philosophy, it will be an invaluable reference work for researchers and students working on connectionist modelling in computer science and psychology, or in any area related to cognitive science.
The first ICANNGA conference, devoted to biologically inspired computational paradigms, Neural Networks and Genetic Algorithms, was held in Innsbruck, Austria, in 1993. The meeting attracted researchers from all over Europe and further afield, who decided that this particular blend of topics should form a theme for a series of biennial conferences. The second meeting, held in Ales, France, in 1995, carried on the tradition set in Innsbruck of a relaxed and stimulating environment for the exchange of ideas. The series has continued in Norwich, UK, in 1997, and Portoroz, Slovenia, in 1999. The Institute of Computer Science, Czech Academy of Sciences, is pleased to host the fifth conference in Prague. We have chosen the Liechtenstein palace under the Prague Castle as the conference site to enhance the traditionally good atmosphere of the meeting. There is an inspirational genius loci in the historical center of the city, where four hundred years ago a fruitful combination of theoretical and empirical method, through the collaboration of Johannes Kepler and Tycho Brahe, led to the discovery of the laws of planetary orbits.
A description of perturbation-based methods developed in machine learning to augment novel optimization methods with strong statistical guarantees. In nearly all machine learning, decisions must be made given current knowledge. Surprisingly, making what is believed to be the best decision is not always the best strategy, even in a supervised learning setting. An emerging body of work on learning under different rules applies perturbations to decision and learning procedures. These methods provide simple and highly efficient learning rules with improved theoretical guarantees. This book describes perturbation-based methods developed in machine learning to augment novel optimization...
Introduction / Eddy J. Davelaar -- An ecology-based approach to perceptual modelling / E.L. Byrne, D.P.A. Corney and R.B. Lotto -- Early development of visual abilities / Alessio Plebe -- A dynamical neural simulation of feature-based attention and binding in a recurrent model of the ventral stream / D.G. Harrison and M. De Kamps -- Model selection for eye movements : assessing the role of attentional cues in infant learning / Daniel Yurovsky [et al.] -- The importance of low spatial frequencies for categorization of emotional facial expressions / L. Lopez [et al.] -- Modeling speech perception with restricted Boltzmann machines / Michael Klein, Louis ten Bosch and Lou Boves -- Earl...
"Sparse modeling is a rapidly developing area at the intersection of statistical learning and signal processing, motivated by the age-old statistical problem of selecting a small number of predictive variables in high-dimensional data sets. This collection describes key approaches in sparse modeling, focusing on its applications in such fields as neuroscience, computational biology, and computer vision. Sparse modeling methods can improve the interpretability of predictive models and aid efficient recovery of high-dimensional unobserved signals from a limited number of measurements. Yet despite significant advances in the field, a number of open issues remain when sparse modeling meets real-life applications. The book discusses a range of practical applications and state-of-the-art approaches for tackling the challenges presented by these applications. Topics considered include the choice of method in genomics applications; analysis of protein mass-spectrometry data; the stability of sparse models in brain imaging applications; sequential testing approaches; algorithmic aspects of sparse recovery; and learning sparse latent models"--Jacket.
This volume presents a timely overview of the latest BCI research, with contributions from many of the important research groups in the field.
How machine learning can improve machine translation: enabling technologies and new statistical techniques.