This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.
What happens in our brain when we make a decision? What triggers a neuron to send out a signal? What is the neural code? This textbook for advanced undergraduate and beginning graduate students provides a thorough and up-to-date introduction to the fields of computational and theoretical neuroscience. It covers classical topics, including the Hodgkin–Huxley equations and Hopfield model, as well as modern developments in the field such as generalized linear models and decision theory. Concepts are introduced using clear step-by-step explanations suitable for readers with only a basic knowledge of differential equations and probabilities, and are richly illustrated by figures and worked-out examples. End-of-chapter summaries and classroom-tested exercises make the book ideal for courses or for self-study. The authors also give pointers to the literature and an extensive bibliography, which will prove invaluable to readers interested in further study.
Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
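The short electrical pulses described above are usually introduced through the leaky integrate-and-fire model, the simplest spiking neuron. As a minimal sketch (all parameter values below are illustrative, not taken from any particular text), the membrane potential integrates its input, and a spike is emitted and the potential reset whenever a threshold is crossed:

```python
# Minimal leaky integrate-and-fire neuron (illustrative parameters).
# Euler integration of: tau_m * dV/dt = -(V - v_rest) + R * I(t)
def simulate_lif(input_current, dt=0.1, tau_m=10.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Return the membrane trace (mV) and spike times (ms) for a
    list of input currents sampled every dt milliseconds."""
    v = v_rest
    trace, spikes = [], []
    for step, i_ext in enumerate(input_current):
        v += dt / tau_m * (-(v - v_rest) + r_m * i_ext)
        if v >= v_thresh:            # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset              # reset after the spike
        trace.append(v)
    return trace, spikes

# Constant 2 nA drive for 100 ms produces regular firing.
trace, spikes = simulate_lif([2.0] * 1000)
```

With this constant drive the steady-state potential (−45 mV) lies above threshold, so the neuron fires periodically; weaker input that keeps the potential below threshold produces no spikes at all, which is the threshold behaviour the questions above revolve around.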
Hebb's postulate provided a crucial framework for understanding the synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which provided the logical framework for the strengthening of synapses. Weakening of synapses, however, was addressed only implicitly, as the absence of strengthening; the active decrease of synaptic strength was introduced later, with the discovery of long-term depression induced by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determines not only the magnitude but also the direction of synaptic alterations.
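The timing dependence described above is commonly summarized by a pair-based learning window: a presynaptic spike that precedes the postsynaptic spike potentiates the synapse, while the reverse order depresses it. A minimal sketch, with amplitudes and time constants that are purely illustrative rather than drawn from any specific study:

```python
import math

# Illustrative spike-timing-dependent plasticity (STDP) parameters.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation/depression amplitudes (hypothetical)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # decay time constants in ms (hypothetical)

def stdp_weight_change(dt_ms):
    """Weight change for a single pre/post spike pair.

    dt_ms = t_post - t_pre: positive when the presynaptic spike
    comes first (the causal, "fire together, wire together" order).
    """
    if dt_ms > 0:       # pre before post -> potentiation
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    elif dt_ms < 0:     # post before pre -> depression
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)
    return 0.0
```

The sign of the change flips with the order of the two spikes, and its magnitude decays as the spikes move further apart in time, which is exactly the dependence on relative timing noted above.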
Explains the relationship of electrophysiology, nonlinear dynamics, and the computational properties of neurons, with each concept presented in terms of both neuroscience and mathematics and illustrated using geometrical intuition. In order to model neuronal behavior or to interpret the results of modeling studies, neuroscientists must call upon methods of nonlinear dynamics. This book offers an introduction to nonlinear dynamical systems theory for researchers and graduate students in neuroscience. It also provides an overview of neuroscience for mathematicians who want to learn the basic facts of electrophysiology. Dynamical Systems in Neuroscience presents a systematic study of the relationship between electrophysiology, nonlinear dynamics, and the computational properties of neurons.
Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation. This book presents the complete spectrum of current research in pulsed neural networks.
A comprehensive, integrated, and accessible textbook presenting core neuroscientific topics from a computational perspective, tracing a path from cells and circuits to behavior and cognition. This textbook presents a wide range of subjects in neuroscience from a computational perspective. It offers a comprehensive, integrated introduction to core topics, using computational tools to trace a path from neurons and circuits to behavior and cognition. Moreover, the chapters show how computational neuroscience, with its methods for modeling the causal interactions underlying neural systems, complements empirical research in advancing the understanding of brain and behavior. The chapters are written by leaders in the field.
This volume explores the most recent findings on the cellular mechanisms of inhibitory plasticity and its functional role in shaping neuronal circuits, in their rewiring in response to experience, in drug addiction, and in neuropathology. Inhibitory Synaptic Plasticity will be of particular interest to neuroscientists and neurophysiologists.
Papers presented at NIPS, the flagship meeting on neural computation, held in December 2004 in Vancouver. The annual Neural Information Processing Systems (NIPS) conference draws a diverse group of attendees: physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high.
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.