Statistical and Neural Classifiers: An Integrated Approach to Design
Automatic (machine) recognition, description, classification, and grouping of patterns are important problems in a variety of engineering and scientific disciplines such as biology, psychology, medicine, marketing, computer vision, artificial intelligence, and remote sensing. Given a pattern, its r...
Main Author: |
Format: | eBook |
Language: | English |
Published: | London: Springer London, 2001 |
Edition: | 1st ed. 2001 |
Series: | Advances in Computer Vision and Pattern Recognition |
Collection: | Springer Book Archives -2004 - Collection details see MPG.ReNa |
Table of Contents:
- 1. Quick Overview
- 1.1 The Classifier Design Problem
- 1.2 Single Layer and Multilayer Perceptrons
- 1.3 The SLP as the Euclidean Distance and the Fisher Linear Classifiers
- 1.4 The Generalisation Error of the EDC and the Fisher DF
- 1.5 Optimal Complexity — The Scissors Effect
- 1.6 Overtraining in Neural Networks
- 1.7 Bibliographical and Historical Remarks
- 2. Taxonomy of Pattern Classification Algorithms
- 2.1 Principles of Statistical Decision Theory
- 2.2 Four Parametric Statistical Classifiers
- 2.3 Structures of the Covariance Matrices
- 2.4 The Bayes Predictive Approach to Design Optimal Classification Rules
- 2.5 Modifications of the Standard Linear and Quadratic DF
- 2.6 Nonparametric Local Statistical Classifiers
- 2.7 Minimum Empirical Error and Maximal Margin Linear Classifiers
- 2.8 Piecewise-Linear Classifiers
- 2.9 Classifiers for Categorical Data
- 2.10 Bibliographical and Historical Remarks
- 3. Performance and the Generalisation Error
- 3.1 Bayes, Conditional, Expected, and Asymptotic Probabilities of Misclassification
- 3.2 Generalisation Error of the Euclidean Distance Classifier
- 3.3 Most Favourable and Least Favourable Distributions of the Data
- 3.4 Generalisation Errors for Modifications of the Standard Linear Classifier
- 3.5 Common Parameters in Different Competing Pattern Classes
- 3.6 Minimum Empirical Error and Maximal Margin Classifiers
- 3.7 Parzen Window Classifier
- 3.8 Multinomial Classifier
- 3.9 Bibliographical and Historical Remarks
- 4. Neural Network Classifiers
- 4.1 Training Dynamics of the Single Layer Perceptron
- 4.2 Non-linear Decision Boundaries
- 4.3 Training Peculiarities of the Perceptrons
- 4.4 Generalisation of the Perceptrons
- 4.5 Overtraining and Initialisation
- 4.6 Tools to Control Complexity
- 4.7 The Co-operation of the Neural Networks
- 4.8 Bibliographical and Historical Remarks
- 5. Integration of Statistical and Neural Approaches
- 5.1 Statistical Methods or Neural Nets?
- 5.2 Positive and Negative Attributes of Statistical Pattern Recognition
- 5.3 Positive and Negative Attributes of Artificial Neural Networks
- 5.4 Merging Statistical Classifiers and Neural Networks
- 5.5 Data Transformations for the Integrated Approach
- 5.6 The Statistical Approach in Multilayer Feed-forward Networks
- 5.7 Concluding and Bibliographical Remarks
- 6. Model Selection
- 6.1 Classification Errors and their Estimation Methods
- 6.2 Simplified Performance Measures
- 6.3 Accuracy of Performance Estimates
- 6.4 Feature Ranking and the Optimal Number of Features
- 6.5 The Accuracy of the Model Selection
- 6.6 Additional Bibliographical Remarks
- Appendices
- A.1 Elements of Matrix Algebra
- A.2 The First Order Tree Type Dependence Model
- A.3 Temporal Dependence Models
- A.4 Pikelis Algorithm for Evaluating Means and Variances of the True, Apparent and Ideal Errors in Model Selection
- A.5 Matlab Codes (the Non-Linear SLP Training, the First Order Tree Dependence Model, and Data Whitening Transformation)
- References