Artificial Neural Networks and Statistical Pattern Recognition (eBook)
271 pages
Elsevier Science (publisher)
978-1-4832-9787-3 (ISBN)
With the growing complexity of pattern-recognition problems being solved using Artificial Neural Networks, many ANN researchers are grappling with design issues such as the size of the network, the number of training patterns, and performance assessment and bounds. These researchers are continually rediscovering that many learning procedures lack the scaling property; the procedures simply fail, or yield unsatisfactory results, when applied to larger problems. Phenomena like these are very familiar to researchers in statistical pattern recognition (SPR), where the curse of dimensionality is a well-known dilemma. Issues related to training and test sample sizes, feature space dimensionality, and the discriminatory power of different classifier types have all been extensively studied in the SPR literature. It appears, however, that many ANN researchers looking at pattern recognition problems are not aware of the ties between their field and SPR, and are therefore unable to exploit the work that has already been done in SPR. Similarly, many pattern recognition and computer vision researchers do not realize the potential of the ANN approach to solve problems such as feature extraction, segmentation, and object recognition. The present volume is designed as a contribution to greater interaction between the ANN and SPR research communities.
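The finite-sample and dimensionality issues mentioned above can be made concrete with a minimal sketch (not taken from the book; the classifier, feature model, and all parameters are illustrative assumptions in the spirit of Trunk's classic example): a nearest-mean classifier is trained on a small sample while features of decreasing discriminative value are added, and its test error first improves and then degrades.

```python
# Minimal sketch (not from the book; all parameters illustrative) of the
# "peaking" effect behind the curse of dimensionality: a nearest-mean
# classifier trained on a small sample, where the i-th feature separates
# the two Gaussian classes by 2/sqrt(i).
import numpy as np

rng = np.random.default_rng(0)

def test_error(dim, n_train=10, n_test=5000):
    """Test error of a nearest-mean classifier with n_train samples per class."""
    mu = 1.0 / np.sqrt(np.arange(1, dim + 1))      # class +1 mean; class -1 uses -mu
    draw = lambda n, s: s * mu + rng.standard_normal((n, dim))
    # Estimate the class means from the small training set.
    m_pos = draw(n_train, +1).mean(axis=0)
    m_neg = draw(n_train, -1).mean(axis=0)
    # Classify a large test set by assigning each point to the nearer estimated mean.
    x = np.vstack([draw(n_test, +1), draw(n_test, -1)])
    y = np.r_[np.ones(n_test), -np.ones(n_test)]
    pred = np.where(((x - m_pos) ** 2).sum(1) < ((x - m_neg) ** 2).sum(1), 1, -1)
    return np.mean(pred != y)

for d in (1, 5, 20, 100, 500):
    print(f"dim = {d:4d}   test error ~ {test_error(d):.3f}")
```

Although the Bayes error keeps shrinking as features are added, with only ten training samples per class the measured test error typically falls and then rises again, which is the kind of finite-sample behaviour analysed in the SPR literature and in Chapter 3 below.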
Front Cover 1
Artificial Neural Networks and Statistical Pattern Recognition: Old and New Connections 4
Copyright Page 5
FOREWORD 6
PREFACE 10
Table of Contents 14
PART 1: ANN AND SPR RELATIONSHIP 16
CHAPTER 1. EVALUATION OF A CLASS OF PATTERN-RECOGNITION NETWORKS 16
INTRODUCTION 16
1. A CLASS OF PATTERN-RECOGNITION NETWORKS 16
2. A REPRESENTATION OF THE JOINT DISTRIBUTION 18
3. A CLASS OF CLASSIFICATION FUNCTIONS 19
4. DETERMINATION OF COEFFICIENTS FROM SAMPLES 23
5. SOME COMMENTS ON COMPARING DESIGN PROCEDURES 23
6. SOME COMMENTS ON THE CHOICE OF OBSERVABLES, AND ON INVARIANCE PROPERTIES 24
ACKNOWLEDGMENT 24
REFERENCES 25
CHAPTER 2. LINKS BETWEEN ARTIFICIAL NEURAL NETWORKS (ANN) AND STATISTICAL PATTERN RECOGNITION 26
1. Overview 26
2. Neural Networks and Pattern Recognition – Generalities 26
3. Some Examples of ANN Paradigms 30
4. Dynamic Systems and Control 41
5. Conclusions 42
REFERENCES 43
CHAPTER 3. Small sample size problems in designing artificial neural networks 48
Abstract 48
1. INTRODUCTION 48
2. FINITE SAMPLE PROBLEMS IN STATISTICAL PATTERN RECOGNITION 50
3. THE CLASSIFICATION ACCURACY AND TRAINING TIME OF ARTIFICIAL NEURAL NETWORKS 55
4. ESTIMATION OF THE CLASSIFICATION ERROR 58
5. PEAKING IN THE CLASSIFICATION PERFORMANCE WITH INCREASE IN DIMENSIONALITY 59
6. EFFECT OF THE NUMBER OF NEURONS IN THE HIDDEN LAYER ON THE PERFORMANCE OF ANN CLASSIFIERS 61
7. DISCUSSION 61
References 62
CHAPTER 4. On Tree Structured Classifiers 66
Abstract 66
1. INTRODUCTION 66
2. DECISION RULES AND CLASSIFICATION TREES 72
3. CLASSIFICATION TREE CONSTRUCTION AND ERROR RATE ESTIMATION 74
4. TREE PRUNING ALGORITHMS 79
5. EXPERIMENTAL RESULTS 81
6. CONCLUSION 84
REFERENCES 84
CHAPTER 5. Decision tree performance enhancement using an artificial neural network implementation 86
Abstract 86
1. INTRODUCTION 86
2. DECISION TREE CLASSIFIER ISSUES 87
3. MULTILAYER PERCEPTRON NETWORKS 92
4. AN MLP IMPLEMENTATION OF TREE CLASSIFIERS 94
5. TRAINING THE TREE MAPPED NETWORK 95
6. PERFORMANCE EVALUATION 96
7. CONCLUSIONS 100
REFERENCES 101
PART 2: APPLICATIONS 104
CHAPTER 6. Bayesian and neural network pattern recognition: a theoretical connection and empirical results with handwritten characters 104
Abstract 104
1 Introduction 104
2 Bayes Classifier 105
3 Artificial Neural Networks and Back Propagation 109
4 Relationship 110
5 Experimental Results 114
6 Discussion 116
7 Conclusion 118
8 Acknowledgements 118
References 118
CHAPTER 7. Shape and Texture Recognition by a Neural Network 124
1. INTRODUCTION 124
2. ZERNIKE MOMENT FEATURES FOR SHAPE RECOGNITION 126
3. RANDOM FIELD FEATURES FOR TEXTURE RECOGNITION 129
4. MULTI-LAYER PERCEPTRON CLASSIFIER 132
5. CONVENTIONAL STATISTICAL CLASSIFIERS 134
6. EXPERIMENTAL STUDY ON SHAPE CLASSIFICATION 135
7. EXPERIMENTAL STUDY ON TEXTURE CLASSIFICATION 141
8. DISCUSSIONS AND CONCLUSIONS 143
9. REFERENCES 145
CHAPTER 8. Neural Networks for Textured Image Processing 148
Abstract 148
1. INTRODUCTION 148
2. DETECTION OF EDGES IN COMPUTER AND HUMAN VISION 151
3. TEXTURE ANALYSIS USING MULTIPLE CHANNEL FILTERS 153
4. NEURAL NETWORK APPROACHES 158
5. CONCLUDING REMARKS 167
6. References 167
CHAPTER 9. Markov Random Fields and Neural Networks with Applications to Early Vision Problems 170
Abstract 170
1 INTRODUCTION 170
2 RELATIONSHIP BETWEEN THE TWO FIELDS 172
3 APPLICATIONS 175
4 CONCLUSIONS 187
References 188
CHAPTER 10. Connectionist Models and their Application to Automatic Speech Recognition 190
Abstract 190
1. INTRODUCTION 190
2. USE OF A-PRIORI KNOWLEDGE 191
3. RECURRENT BACK-PROPAGATION NETWORKS 193
4. FAST IMPLEMENTATION OF BP 197
5. RADIAL BASIS FUNCTIONS MODELS 199
6. COMBINING LOCAL AND DISTRIBUTED REPRESENTATIONS 205
7. CONCLUSION 206
REFERENCES 207
PART 3: IMPLEMENTATION ASPECTS 210
CHAPTER 11. DYNAMIC ASSOCIATIVE MEMORIES 210
1. INTRODUCTION 210
2. DAM ARCHITECTURES AND GENERAL MEMORY DYNAMICS 211
3. CHARACTERISTICS OF A HIGH-PERFORMANCE DAM 215
4. ASSOCIATIVE LEARNING IN A DAM 217
5. RECORDING STRATEGIES 225
6. DAM CAPACITY AND PERFORMANCE 226
REFERENCES 231
CHAPTER 12. Optical Associative Memories 234
Abstract 234
1 INTRODUCTION 234
2 BASICS OF ASSOCIATIVE MEMORIES 235
3 FOUR ASSOCIATIVE MEMORY MODELS 236
4 OUTER PRODUCT ASSOCIATIVE MEMORIES 239
5 INNER PRODUCT ASSOCIATIVE MEMORIES 241
6 HOLOGRAPHIC ASSOCIATIVE MEMORIES 242
7 A COMPARISON OF FOUR ASSOCIATIVE MEMORY MODELS 244
8 CONCLUSIONS 251
Acknowledgements 251
REFERENCES 251
CHAPTER 13. ARTIFICIAL NEURAL NETS IN MOS SILICON 258
1. INTRODUCTION 258
2. A DIGITAL CMOS VLSI IMPLEMENTATION OF A 4-NEURON CIRCUIT 261
3. ANNs USING ANALOG VECTOR MULTIPLIER BASIC CELLS 264
4. ANALOG VLSI IMPLEMENTATION OF SYNAPTIC WEIGHTS VIA SIMPLE MOSFETs 276
5. CONCLUSIONS 280
REFERENCES 282
AUTHOR INDEX 286
| Publication date (per publisher) | 28.6.2014 |
|---|---|
| Language | English |
| Subject area | Mathematics / Computer Science ► Computer Science ► Networks |
| | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| ISBN-10 | 1-4832-9787-X / 148329787X |
| ISBN-13 | 978-1-4832-9787-3 / 9781483297873 |
Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to protect the eBook against misuse. During download, the eBook is authorized to your personal Adobe ID; you can then read it only on devices that are also registered to that Adobe ID.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but it is only of limited use on small screens (smartphones, e-readers).
System requirements:
PC/Mac: You can read this eBook on a PC or a Mac. You will need a
eReader: This eBook can be read with (almost) all eBook readers. It is, however, not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a
Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Unfortunately, we cannot fulfill eBook orders from other countries.