LEADER |
07162nmm a2200373 u 4500 |
001 |
EB000617334 |
003 |
EBX01000000000000000470416 |
005 |
00000000000000.0 |
007 |
cr||||||||||||||||||||| |
008 |
140122 ||| eng |
020 |
|
|
|a 9781447115236
|
100 |
1 |
|
|a Karny, Mirek
|e [editor]
|
245 |
0 |
0 |
|a Dealing with Complexity
|h Elektronische Ressource
|b A Neural Networks Approach
|c edited by Mirek Karny, Kevin Warwick, Vera Kurkova
|
250 |
|
|
|a 1st ed. 1998
|
260 |
|
|
|a London
|b Springer London
|c 1998
|
300 |
|
|
|a XV, 308 p. 5 illus
|b online resource
|
505 |
0 |
 |
|a 1 Recurrent Neural Networks: Some Systems-Theoretic Aspects -- 1 Introduction -- 2 System-Theory Results: Statements -- 3 System-Theory Results: Discussion -- 4 Computational Power -- 5 Some Remarks -- 2 The Use of State Space Control Theory for Analysing Feedforward Neural Networks -- 1 Introduction -- 2 State Space Theory -- 3 State Space Representation of Feedforward Neural Networks -- 4 Observability of Feedforward Neural Networks -- 5 Controllability -- 6 Stability -- 7 Discussion -- 8 Appendix: Linear Systems of Equations [7] -- 3 Statistical Decision Making and Neural Networks -- 1 Introduction -- 2 Statistical Decision Making -- 3 Bayesian Learning -- 4 On Ingredients of Bayesian Learning -- 5 Interlude on Gaussian Linear Regression Model -- 6 Approximate On-Line Estimation -- 7 Conclusions -- 4 A Tutorial on the EM Algorithm and its Applications to Neural Network Learning -- 1 Introduction -- 2 The EM Algorithm -- 3 Practical Applications -- 4 Convergence Properties --
|
505 |
0 |
 |
|a 5 Concluding Remarks -- 5 On the Effectiveness of Memory-Based Methods in Machine Learning -- 1 Introduction -- 2 Background -- 3 The Curse of Dimensionality -- 4 The Barron-Jones Theory -- 5 Experimental Results -- 6 Analysis of Memory-Based Methods -- 7 Discussion -- 6 A Study of Non Mean Square Error Criteria for the Training of Neural Networks -- 1 Introduction -- 2 Statement of the Problem -- 3 Cost Function Minimisation for ? = E(y/x) -- 4 Cost Function Minimisation for the Median of p(y/x) -- 5 Simulation Results -- 6 Conclusion -- 7 A Priori Information in Network Design -- 1 Introduction -- 2 Preliminaries -- 3 Recurrent Networks and Relative Order -- 4 Simulations -- 5 Conclusions -- 8 Neurofuzzy Systems Modelling: A Transparent Approach -- 1 Empirical Data Modelling -- 2 Neurofuzzy Construction Algorithms -- 3 Modelling Case Studies -- 4 Conclusions -- 9 Feature Selection and Classification by a Modified Model with Latent Structure -- 1 Introduction --
|
505 |
0 |
 |
|a 2 Modified Model with Latent Structure -- 3 Optimizing Model Parameters -- 4 Approach to Feature Selection -- 5 Pseudo-Bayes Decision Rule -- 6 Experiments -- 7 Summary and Conclusion -- 10 Geometric Algebra Based Neural Networks -- 1 Introduction -- 2 Complex-Valued Neural Networks -- 3 Comments on the Applicability of CVNNs to n-Dimensional Signals -- 4 Generalisations of CVNNs Within a GA Framework -- 5 Summary -- 11 Discrete Event Complex Systems: Scheduling with Neural Networks -- 1 Introduction -- 2 The DNN Architecture -- 3 Continuous Time Control Law -- 4 Real-Time Scheduling -- 5 Simulation Results -- 6 Summary -- 12 Incremental Approximation by Neural Networks -- 1 Introduction -- 2 Approximation of Functions by One-Hidden-Layer Networks -- 3 Rates of Approximation of Incremental Approximants -- 4 Variation with Respect to a Set of Functions -- 5 Incremental Approximation by Perceptron and RBF Networks -- 6 Discussion --
|
505 |
0 |
 |
|a 13 Approximation of Smooth Functions by Neural Networks -- 1 Introduction -- 2 Preliminaries -- 3 Complexity Theorems -- 4 Local Approximation -- 5 Some Open Problems -- 14 Rates of Approximation in a Feedforward Network Depend on the Type of Computational Unit -- 1 Introduction -- 2 Feedforward Networks with Various Computational Units -- 3 Discussion -- 15 Recent Results and Mathematical Methods for Functional Approximation by Neural Networks -- 1 Introduction -- 2 Individual vs Variable Context -- 3 Nonlinear Approximation -- 4 Feedforward Architectures -- 5 Lower Bounds on Rate of Approximation -- 6 Uniqueness of Approximation by Neural Networks -- 7 Other Approaches -- 16 Differential Neurocontrol of Multidimensional Systems -- 1 Introduction -- 2 Neurophysiological Basis -- 3 Scheme of the Differential Neurocontroller -- 4 Multiplicative Units -- 5 Feedback Block -- 6 Feedforward Block -- 7 Convergence of Learning -- 8 Computer Simulations -- 9 Conclusions --
|
505 |
0 |
 |
|a 17 The Psychological Limits of Neural Computation -- 1 Neural Networks and Turing Machines -- 2 Function Approximation -- 3 Representation of Logical Functions Using Neural Networks -- 4 The Complexity of Learning in Neural Networks -- 5 Learning Logical Functions -- 6 The Optimization of Circuits -- 7 Final Remarks -- 18 A Brain-Like Design to Learn Optimal Decision Strategies in Complex Environments -- 1 Introduction -- 2 Time-Chunked Approximate Dynamic Programming -- 3 Temporal Chunking with Neural Networks -- 4 Spatial Chunking and Critical Subsystems -- 5 Adding the Third Brain -- Research Acknowledgements
|
653 |
|
|
|a Electronic digital computers / Evaluation
|
653 |
|
|
|a System Performance and Evaluation
|
653 |
|
|
|a Artificial Intelligence
|
700 |
1 |
|
|a Warwick, Kevin
|e [editor]
|
700 |
1 |
|
|a Kurkova, Vera
|e [editor]
|
041 |
0 |
7 |
|a eng
|2 ISO 639-2
|
989 |
|
|
|b SBA
|a Springer Book Archives -2004
|
490 |
0 |
|
|a Perspectives in Neural Computing
|
028 |
5 |
0 |
|a 10.1007/978-1-4471-1523-6
|
856 |
4 |
0 |
|u https://doi.org/10.1007/978-1-4471-1523-6?nosfx=y
|x Verlag
|3 Volltext
|
082 |
0 |
|
|a 006.3
|
520 |
|
|
|a In almost all areas of science and engineering, the use of computers and microcomputers has, in recent years, transformed entire subject areas. What was not even considered possible a decade or two ago is now not only possible but part of everyday practice. As a result, a new approach is usually needed to get the best out of a situation: what is required is a computer's eye view of the world. However, all is not rosy in this new world. Humans tend to think in two or three dimensions at most, whereas computers can, without complaint, work in n dimensions, where n, in practice, grows larger each year. Consequently, ever more complex problem solutions are being attempted, whether or not the problems themselves are inherently complex. If information is available, it might as well be used, but what can be done with it? Straightforward, traditional computational solutions to this new problem of complexity can, and usually do, produce very unsatisfactory, unreliable and even unworkable results. Recently, however, artificial neural networks, which have proved versatile and powerful in dealing with difficulties such as nonlinearities, multivariate systems and high data content, have shown their strengths in dealing with complex problems. This volume brings together a collection of top researchers from around the world in the field of artificial neural networks
|