Models of Neural Networks

One of the great intellectual challenges for the next few decades is the question of brain organization. What is the basic mechanism for storage of memory? What are the processes that serve as the interphase between the basically chemical processes of the body and the very specific and nonstatistica...


Bibliographic Details
Other Authors: Domany, Eytan (Editor), Hemmen, J. Leo van (Editor), Schulten, Klaus (Editor)
Format: eBook
Language: English
Published: Berlin, Heidelberg : Springer Berlin Heidelberg, 1991
Edition: 1st ed. 1991
Series: Physics of Neural Networks
Collection: Springer Book Archives (-2004). Collection details: see MPG.ReNa.
Table of Contents:
  • 1. Collective Phenomena in Neural Networks
  • 1.1 Introduction and Overview
  • 1.2 Prerequisites
  • 1.3 The Hopfield Model
  • 1.4 Nonlinear Neural Networks
  • 1.5 Learning, Unlearning, and Forgetting
  • 1.6 Hierarchically Structured Information
  • 1.7 Outlook
  • References
  • 2. Information from Structure: A Sketch of Neuroanatomy
  • 2.1 Development of the Brain
  • 2.2 Neuroanatomy Related to Information Handling in the Brain
  • 2.3 The Idea of Electronic Circuitry
  • 2.4 The Projection from the Compound Eye onto the First Ganglion (Lamina) of the Fly
  • 2.5 Statistical Wiring
  • 2.6 Symmetry of Neural Nets
  • 2.7 The Cerebellum
  • 2.8 Variations in Size of the Elements
  • 2.9 The Cerebral Cortex
  • 2.10 Inborn Knowledge
  • References
  • 3. Storage Capacity and Learning in Ising-Spin Neural Networks
  • 3.1 Introduction
  • 3.2 Content-addressability: A Dynamics Problem
  • 3.3 Learning
  • 3.4 Discussion
  • References
  • 4. Dynamics of Learning
  • 4.1 Introduction
  • Elizabeth Gardner — An Appreciation
  • 4.2 Definition of Supervised Learning
  • 4.3 Adaline Learning
  • 4.4 Perceptron Learning
  • 4.5 Binary Synapses
  • 4.6 Basins of Attraction
  • 4.7 Forgetting
  • 4.8 Outlook
  • References
  • 5. Hierarchical Organization of Memory
  • 5.1 Introduction
  • 5.2 Models: The Problem
  • 5.3 A Toy Problem: Patterns with Low Activity
  • 5.4 Models with Hierarchically Structured Information
  • 5.5 Extensions
  • 5.6 The Enhancement of Storage Capacity: Multineuron Interactions
  • 5.7 Conclusion
  • References
  • 6. Asymmetrically Diluted Neural Networks
  • 6.1 Introduction
  • 6.2 Solvability and Retrieval Properties
  • 6.3 Exact Solution with Dynamic Functionals
  • 6.4 Extensions and Related Work
  • Appendix A
  • Appendix B
  • Appendix C
  • References
  • 7. Temporal Association
  • 7.1 Introduction
  • 7.2 Fast Synaptic Plasticity
  • 7.3 Noise-Driven Sequences of Biased Patterns
  • 7.4 Stabilizing Sequences by Delays
  • 7.5 Applications: Sequence Recognition, Counting, and the Generation of Complex Sequences
  • 7.6 Hebbian Learning with Delays
  • 7.7 Epilogue
  • Note Added in Proof
  • References
  • 8. Self-organizing Maps and Adaptive Filters
  • 8.1 Introduction
  • 8.2 Self-organizing Maps and Optimal Representation of Data
  • 8.3 Learning Dynamics in the Vicinity of a Stationary State
  • 8.4 Relation to Brain Modeling
  • 8.5 Formation of a “Somatotopic Map”
  • 8.6 Adaptive Orientation and Spatial Frequency Filters
  • 8.7 Conclusion
  • References
  • 9. Layered Neural Networks
  • 9.1 Introduction
  • 9.2 Dynamics of Feed-Forward Networks
  • 9.3 Unsupervised Learning in Layered Networks
  • 9.4 Supervised Learning in Layered Networks
  • 9.5 Summary and Discussion
  • References