Ten years ago Bill Gale of AT&T Bell Laboratories was the primary organizer of the first Workshop on Artificial Intelligence and Statistics. In the early days of the Workshop series it seemed clear that researchers in AI and statistics had common interests, though with different emphases, goals, and vocabularies. In learning and model selection, for example, AI's historical goal of building autonomous agents probably contributed to a focus on parameter-free learning systems, which relied little on an external analyst's assumptions about the data. This seemed at odds with statistical strategy, which stemmed from the view that model selection methods were tools to augment, not replace, the abilities...
"Brain-Mind presents a unified, brain-based theory of cognition and emotion, with applications to the most complex kinds of thinking, right up to consciousness and creativity. Unification comes from systematic application of Chris Eliasmith's powerful new Semantic Pointer Architecture, a highly original synthesis of neural network and symbolic ideas about how the mind works. Thagard will show the relevance of semantic pointers to a full range of important kinds of mental representations, from sensations and imagery to concepts, rules, analogies, and emotions. Neural mechanisms can then be used to explain many phenomena concerning consciousness, action, intention, language, creativity, and the self. Because of their broad importance, Thagard has tried to make Eliasmith's ideas accessible to a broad audience with no special background in neuroscience or mathematics. The value of a unified theory of thinking goes well beyond psychology, neuroscience, and the other cognitive sciences"--
Novice programming comes of age / David Canfield Smith, Allen Cypher, Larry Tesler -- Generalizing by removing detail : how any program can be created by working with examples / Ken Kahn -- Demonstrational interfaces : sometimes you need a little intelligence, sometimes you need a lot / Brad A. Myers, Richard McDaniel -- Web browsing by example / Atsushi Sugiura -- Trainable information agents for the Web / Mathias Bauer, Dietmar Dengler, Gabriele Paul -- End users and GIS : a demonstration is worth a thousand words / Carol Traynor, Marian G. Williams -- Bringing programming by demonstration to CAD users / Patrick Girard -- Demonstrating the hidden features that make an application work / Richard McDaniel -- A reporting tool using programming by example for format designation / Tetsuya Masuishi, Nobuo Takahashi -- Composition by example / Toshiyuki Masui -- Learning repetitive text-editing procedures with SMARTedit / Tessa Lau ... [et al.] -- Training agents to recognize text by example ...
With the convergence of Nanotechnology, Biotechnology, Information technology and Cognitive science (NBIC) fields promising to change our competitive, operational, and employment landscape in fundamental ways, we find ourselves on the brink of a new technological and science-driven business revolution. The already emerging reality of convergence is to be found in genomics, robotics, bio-information and artificial intelligence applications, such as:
• Self-assembled, self-cleaning and self-healing manufactured materials and textiles, and much stronger, lighter and more customizable structural materials,
• Miniature sensors allowing unobtrusive real-time health monitoring and dramatically ...
This book constitutes the refereed proceedings of the Second International Symposium on Intelligent Data Analysis, IDA-97, held in London, UK, in August 1997. The volume presents 50 revised full papers selected from a total of 107 submissions. Also included is a keynote, Intelligent Data Analysis: Issues and Opportunities, by David J. Hand. The papers are organized in sections on exploratory data analysis, preprocessing and tools; classification and feature selection; medical applications; soft computing; knowledge discovery and data mining; estimation and clustering; data quality; qualitative models.
Lauren Ipsum is a whimsical journey through a land where logic and computer science come to life. Meet Lauren, an adventurer lost in Userland who needs to find her way home by solving a series of puzzles. As she visits places like the Push & Pop Café and makes friends with people like Hugh Rustic and the Wandering Salesman, Lauren learns about computer science without even realizing it—and so do you! Read Lauren Ipsum yourself or with someone littler than you, then flip to the notes at the back of the book to learn more about logic and computer science in the real world. Suggested for ages 10+
The International Conference on Cognitive Modeling brings together researchers who develop computational models that explain and predict cognitive data. The 2004 conference encompassed an integration of diverse data through models of coherent phenomena ...
Presents a collection of articles on human-computer interaction, covering such topics as applications, methods, hardware, and computers and society.
This volume is a selection of papers presented at the Fourth International Workshop on Artificial Intelligence and Statistics held in January 1993. These biennial workshops have succeeded in bringing together researchers from Artificial Intelligence and from Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. The theme of the 1993 AI and Statistics workshop was: "Selecting Models from Data". The papers in this volume attest to the diversity of approaches to model selection and to the ubiquity of the problem. Both statistics and artificial intelligence have independently developed approaches to model selection and the corresponding algorithms to implement them. But as these papers make clear, there is a high degree of overlap between the different approaches. In particular, there is agreement that the fundamental problem is the avoidance of "overfitting", i.e., where a model fits the given data very closely, but is a poor predictor for new data; in other words, the model has partly fitted the "noise" in the original data.
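To make the overfitting problem concrete, here is a minimal sketch in Python; the polynomial degrees, sample size, and noise level are illustrative assumptions, not taken from the volume. A high-degree polynomial can drive training error toward zero by fitting the noise, yet predict new data worse than a simpler model.

# Minimal overfitting sketch: compare a low-degree and a high-degree
# polynomial fit to noisy samples of a simple underlying function.
# All settings (degrees 3 and 12, 15 samples, noise sd 0.3) are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of sin(2*pi*x).
x_train = np.linspace(0, 1, 15)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, x_train.size)

# Noise-free test grid for judging how well each fit generalizes.
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

for degree in (3, 12):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE = {train_mse:.3f}, "
          f"test MSE = {test_mse:.3f}")

The degree-12 fit achieves a much lower training error by chasing the noise but a higher test error than the degree-3 fit; model selection methods of the kind the workshop papers discuss are precisely tools for preferring the simpler fit in such situations.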