LEADER |
02720nmm a2200349 u 4500 |
001 |
EB000625371 |
003 |
EBX01000000000000000478453 |
005 |
00000000000000.0 |
007 |
cr||||||||||||||||||||| |
008 |
140122 ||| eng |
020 |
|
|
|a 9781461540083
|
100 |
1 |
|
|a Touretzky, David
|e [editor]
|
245 |
0 |
0 |
|a Connectionist Approaches to Language Learning
|h Elektronische Ressource
|c edited by David Touretzky
|
250 |
|
|
|a 1st ed. 1991
|
260 |
|
|
|a New York, NY
|b Springer US
|c 1991
|
300 |
|
|
|a IV, 149 p
|b online resource
|
505 |
0 |
|
|a Learning Automata from Ordered Examples -- SLUG: A Connectionist Architecture for Inferring the Structure of Finite-State Environments -- Graded State Machines: The Representation of Temporal Contingencies in Simple Recurrent Networks -- Distributed Representations, Simple Recurrent Networks, and Grammatical Structure -- The Induction of Dynamical Recognizers
|
653 |
|
|
|a Complex Systems
|
653 |
|
|
|a Computer science
|
653 |
|
|
|a Computer Science
|
653 |
|
|
|a Artificial Intelligence
|
653 |
|
|
|a System theory
|
653 |
|
|
|a Artificial intelligence
|
653 |
|
|
|a Mathematical physics
|
653 |
|
|
|a Theoretical, Mathematical and Computational Physics
|
041 |
0 |
7 |
|a eng
|2 ISO 639-2
|
989 |
|
|
|b SBA
|a Springer Book Archives -2004
|
490 |
0 |
|
|a The Springer International Series in Engineering and Computer Science
|
028 |
5 |
0 |
|a 10.1007/978-1-4615-4008-3
|
856 |
4 |
0 |
|u https://doi.org/10.1007/978-1-4615-4008-3?nosfx=y
|x Verlag
|3 Volltext
|
082 |
0 |
|
|a 006.3
|
520 |
|
|
|a arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure. The development and shape of a recurrent network's state space are the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines
|