A mechanistic theory of the representation and use of semantic knowledge that uses distributed connectionist networks as a starting point for a psychological theory of semantic cognition.
What makes people smarter than computers? These volumes by a pioneering neurocomputing group suggest that the answer lies in the massively parallel architecture of the human mind. They describe a new theory of cognition called connectionism that is challenging the idea of symbolic computation that has traditionally been at the center of debate in theoretical discussions about the mind. The authors' theory assumes the mind is composed of a great number of elementary units connected in a neural network. Mental processes are interactions between these units, which excite and inhibit one another in parallel rather than through sequential operations. In this context, knowledge can no longer be thought of as stored in localized structures; instead, it consists of the connections between pairs of units that are distributed throughout the network. Volume 1 lays the foundations of this exciting theory of parallel distributed processing, while Volume 2 applies it to a number of specific issues in cognitive science and neuroscience, with chapters describing models of aspects of perception, memory, language, and thought.
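The core idea described above can be sketched in a few lines: units pass activation through weighted excitatory (positive) and inhibitory (negative) connections, and all units update in parallel. The particular network, weights, and activations below are invented for illustration and are not taken from the volumes themselves.

```python
import math

def sigmoid(x):
    """Squash a net input into an activation between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

# Three units with symmetric connections: units 0 and 1 excite each
# other (+0.8), and both inhibit unit 2 (-0.6). Weights are illustrative.
weights = [
    [0.0,  0.8, -0.6],
    [0.8,  0.0, -0.6],
    [-0.6, -0.6, 0.0],
]
act = [0.9, 0.1, 0.5]  # initial activations; unit 0 starts strongly active

for step in range(20):
    # Every unit computes its net input from all the others...
    net = [sum(weights[i][j] * act[j] for j in range(3)) for i in range(3)]
    # ...and all units update simultaneously (in parallel, not sequentially).
    act = [sigmoid(n) for n in net]

print([round(a, 2) for a in act])
```

After a few parallel update steps, the mutually excitatory units settle at higher activation than the inhibited unit; the "knowledge" in the network lives entirely in the weight matrix, not in any single location.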
The goal of computational cognitive neuroscience is to understand how the brain embodies the mind by using biologically based computational models comprising networks of neuron-like units. This text, based on a course taught by Randall O'Reilly and Yuko Munakata over the past several years, provides an in-depth introduction to the main ideas in the field. The neural units in the simulations use equations based directly on the ion channels that govern the behavior of real neurons, and the neural n...
A study of mechanisms of cognitive development. It is part of the "Carnegie Mellon Symposia on Cognition Series" and focuses on behavioural and neural perspectives of cognitive development.
A cutting-edge reference source for the interdisciplinary field of computational cognitive modeling.
Words, Thoughts, and Theories articulates and defends the "theory theory" of cognitive and semantic development, the idea that infants and young children, like scientists, learn about the world by forming and revising theories, a view of the origins of knowledge and meaning that has broad implications for cognitive science. Gopnik and Meltzoff interweave philosophical arguments and empirical data from their own and others' research. Both the philosophy and the psychology, the arguments and the data, address the same fundamental epistemological question: How do we come to understand the world around us? Recently, the theory theory has led to much interesting research. However, this is the fir...
The basic questions addressed in this book are: What is the computational nature of cognition, and what role does it play in language and other mental processes? What are the main characteristics of contemporary computational paradigms for describing cognition, and how do they differ from each other? And what are the prospects for building an artificial intelligence?
Readings in Cognitive Science: A Perspective from Psychology and Artificial Intelligence brings together important studies that fall at the intersection of artificial intelligence and cognitive psychology. This book is composed of six chapters, and begins with the complex anatomy and physiology of the human brain. The next chapters deal with the components of cognitive science, such as semantic memory, similarity and analogy, and learning. These chapters also consider the application of mental models, which represent the domain-specific knowledge needed to understand a dynamic system or natural physical phenomena. The remaining chapters discuss the concepts of reasoning, problem solving, planning, vision, and imagery. This book is of value to psychologists, psychiatrists, neurologists, and researchers who are interested in cognition.
Composed of three sections, this book presents the most popular training algorithm for neural networks: backpropagation. The first section presents the theory and principles behind backpropagation as seen from different perspectives such as statistics, machine learning, and dynamical systems. The second presents a number of network architectures that may be designed to match the general concepts of Parallel Distributed Processing with backpropagation learning. Finally, the third section shows how these principles can be applied to a number of different fields related to the cognitive sciences, including control, speech recognition, robotics, image processing, and cognitive psychology. The volume is designed to provide both a solid theoretical foundation and a set of examples that show the versatility of the concepts. Useful to experts in the field, it should also be most helpful to students seeking to understand the basic principles of connectionist learning and to engineers wanting to add neural networks in general -- and backpropagation in particular -- to their set of problem-solving methods.
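As a companion to the description above, here is a minimal sketch of backpropagation itself: a small two-layer network trained on XOR by propagating the error derivative backward through the weights. The network size, learning rate, and training setup are illustrative choices, not taken from the book.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A tiny 2-2-1 network: two inputs, two hidden units, one output.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
lr = 0.5  # learning rate

def forward(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    return h, y

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

loss_before = total_loss()
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Backward pass: error derivative at the output...
        dy = (y - t) * y * (1 - y)
        for j in range(2):
            # ...propagated back through each hidden unit,
            dh = dy * W2[j] * h[j] * (1 - h[j])
            # then used to adjust weights by gradient descent.
            W2[j] -= lr * dy * h[j]
            W1[j][0] -= lr * dh * x[0]
            W1[j][1] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * dy
loss_after = total_loss()
print(loss_before, loss_after)
```

Running this drives the squared error down over training, showing the essential loop the book's first section analyzes from statistical, machine-learning, and dynamical-systems perspectives: forward pass, backward error propagation, weight update.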