LEADER 05044nmm a2200469 u 4500
001    EB001939858
003    EBX01000000000000001102760
005    00000000000000.0
007    cr|||||||||||||||||||||
008    210123 ||| eng
020    |a 9780080514338
050  4 |a QA76.87
100 1  |a Masters, Timothy
245 00 |a Practical neural network recipes in C++ |c Timothy Masters
260    |a Boston |b Morgan Kaufmann |c 1993
300    |a 1 disc (3 1/2 in.)
300    |a xviii, 493 pages |b illustrations
505 0  |a Chapter 10. Designing Feedforward Network Architectures; How Many Hidden Layers?; How Many Hidden Neurons?; How Long Do I Train This Thing???; Chapter 11. Interpreting Weights: How Does This Thing Work?; Features Used by Networks in General; Features Used by a Particular Network; Chapter 12. Probabilistic Neural Networks; Overview; Computational Aspects; Optimizing Sigma; A Sample Program; Bayesian Confidence Measures; Autoassociative Versions; When to Use a Probabilistic Neural Network; Chapter 13. Functional Link Networks; Application to Nonlinear Approximation
505 0  |a Chapter 4. Time-Series Prediction; The Basic Model; Input Data; Multiple Prediction; Multiple Predictors; Measuring Prediction Error; Chapter 5. Function Approximation; Univariate Function Approximation; Inverse Modeling; Multiple Regression; Chapter 6. Multilayer Feedforward Networks; Basic Architecture; Theoretical Discussion; Algorithms for Executing the Network; Training the Network; Training by Backpropagation of Errors; Training by Conjugate Gradients; Eluding Local Minima in Learning; When to Use a Multiple-Layer Feedforward Network; Chapter 7. Eluding Local Minima I: Simulated Annealing
505 0  |a Mathematics of the Functional Link Network; When to Use a Functional Link Network; Chapter 14. Hybrid Networks; Functional Link Net as a Hidden Layer; Fast Bayesian Confidences; Attention-based Processing; Factorable Problems; Chapter 15. Designing the Training Set; Number of Samples; Borderline Cases; Hidden Bias; Balancing the Classes; Fudging Cases; Chapter 16. Preparing Input Data; General Considerations; Types of Measurements; Is Scaling Always Necessary?; Transformations; Circular Discontinuity; Outliers; Missing Data; Chapter 17. Fuzzy Data and Processing
505 0  |a Overview; Choosing the Annealing Parameters; Implementation in Feedforward Network Learning; A Sample Program; A Sample Function; Random Number Generation; Going on from Here; Chapter 8. Eluding Local Minima II: Genetic Optimization; Overview; Designing the Genetic Structure; Evaluation; Parent Selection; Reproduction; Mutation; A Genetic Minimization Subroutine; Some Functions for Genetic Optimization; Advanced Topics in Genetic Optimization; Chapter 9. Regression and Neural Networks; Overview; Singular-Value Decomposition; Regression in Neural Networks
505 0  |a Front Cover; Practical Neural Network Recipes in C++; Copyright Page; Dedication; Table of Contents; Preface; Chapter 1. Foundations; Motivation; New Life for Old Techniques; Perceptrons and Linear Separability; Neural Network Capabilities; Basic Structure of a Neural Network; Training; Validation; Chapter 2. Classification; Binary Decisions; Multiple Classes; Supervised versus Unsupervised Training; Chapter 3. Autoassociation; Autoassociative Filtering; Noise Reduction; Learning a Prototype from Exemplars; Exposing Isolated Events; Pattern Completion; Error Correction; Data Compression
504    |a Includes bibliographical references (pages 479-490)
653    |a Réseaux neuronaux (Informatique)
653    |a Neural networks (Computer science) / fast
653    |a Neural networks (Computer science) / http://id.loc.gov/authorities/subjects/sh90001937
653    |a C++ (Langage de programmation)
653    |a C++ (Computer program language) / http://id.loc.gov/authorities/subjects/sh87007505
653    |a C++ (Computer program language) / fast
653    |a COMPUTERS / General / bisacsh
041 07 |a eng |2 ISO 639-2
989    |b OREILLY |a O'Reilly
500    |a Accompanied by Diskette (702000135)
776    |z 0124790402
776    |z 9780124790407
776    |z 0080514332
776    |z 9780080514338
856 40 |u https://learning.oreilly.com/library/view/~/9780080514338/?ar |x Verlag |3 Volltext
082 0  |a 331
082 0  |a 500
082 0  |a 006.3
520    |a This text serves as a cookbook for neural network solutions to practical problems using C++. It will enable those with moderate programming experience to select a neural network model appropriate to solving a particular problem, and to produce a working program implementing that network. The book provides guidance along the entire problem-solving path, including designing the training set, preprocessing variables, training and validating the network, and evaluating its performance. Though the book is not intended as a general course in neural networks, no background in neural networks is assumed and all models are presented from the ground up.