Sensitivity Analysis for Neural Networks (eBook)
VIII, 86 pages
Springer Berlin (publisher)
978-3-642-02532-7 (ISBN)
Artificial neural networks are used to model systems that receive inputs and produce outputs. The relationships between a network's inputs, outputs, and representation parameters are critical issues in the design of related engineering systems, and sensitivity analysis concerns methods for analyzing these relationships. Perturbations of neural networks arise from machine imprecision; they can be simulated by embedding disturbances in the original inputs or connection weights, allowing us to study how a function behaves under small perturbations of its parameters.
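The input-perturbation idea described above can be sketched as a minimal Monte Carlo estimate. The toy 2-2-1 MLP below, its weights, and the `sensitivity` helper are purely illustrative assumptions, not taken from the book; the measure shown is simply the mean absolute output deviation under small random input disturbances, in the spirit of the statistical approach the book covers.

```python
import math
import random

# Hypothetical 2-2-1 MLP with fixed, illustrative weights (not from the book).
W1 = [[0.5, -0.3], [0.8, 0.2]]   # hidden-layer weights
b1 = [0.1, -0.1]                 # hidden-layer biases
W2 = [0.7, -0.6]                 # output-layer weights
b2 = 0.05                        # output bias

def mlp(x):
    """Forward pass with tanh (antisymmetric squashing) activations."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return math.tanh(sum(w * hi for w, hi in zip(W2, h)) + b2)

def sensitivity(x, delta=0.01, trials=10000, seed=0):
    """Monte Carlo estimate of the mean absolute output deviation when
    each input is disturbed uniformly within [-delta, +delta]."""
    rng = random.Random(seed)
    y0 = mlp(x)
    dev = 0.0
    for _ in range(trials):
        xp = [xi + rng.uniform(-delta, delta) for xi in x]
        dev += abs(mlp(xp) - y0)
    return dev / trials

# Larger disturbances yield a larger sensitivity estimate, since the
# network is approximately linear in a small neighborhood of x.
print(sensitivity([0.4, -0.2]))
```

The same scheme applies to weight perturbations: disturb entries of `W1`/`W2` instead of `x` and average the resulting output deviations.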
This is the first book to present a systematic description of sensitivity analysis methods for artificial neural networks. It covers sensitivity analysis of multilayer perceptron neural networks and radial basis function neural networks, two widely used models in the machine learning field. The authors examine the applications of such analysis in tasks such as feature selection, sample reduction, and network optimization. The book will be useful for engineers applying neural network sensitivity analysis to solve practical problems, and for researchers interested in foundational problems in neural networks.
Preface 5
Contents 7
1 Introduction to Neural Networks 9
1.1 Properties of Neural Networks 11
1.2 Neural Network Learning 12
1.2.1 Supervised Learning 13
1.2.2 Unsupervised Learning 13
1.3 Perceptron 14
1.4 Adaline and Least Mean Square Algorithm 16
1.5 Multilayer Perceptron and Backpropagation Algorithm 17
1.5.1 Output Layer Learning 19
1.5.2 Hidden Layer Learning 19
1.6 Radial Basis Function Networks 20
1.7 Support Vector Machines 21
2 Principles of Sensitivity Analysis 24
2.1 Perturbations in Neural Networks 24
2.2 Neural Network Sensitivity Analysis 25
2.3 Fundamental Methods of Sensitivity Analysis 28
2.3.1 Geometrical Approach 28
2.3.2 Statistical Approach 30
2.4 Summary 31
3 Hyper-Rectangle Model 32
3.1 Hyper-Rectangle Model for Input Space of MLP 32
3.2 Sensitivity Measure of MLP 33
3.3 Discussion 34
4 Sensitivity Analysis with Parameterized Activation Function 35
4.1 Parameterized Antisymmetric Squashing Function 35
4.2 Sensitivity Measure 36
4.3 Summary 37
5 Localized Generalization Error Model 38
5.1 Introduction 38
5.2 The Localized Generalization Error Model 40
5.2.1 The Q-Neighborhood and Q-Union 41
5.2.2 The Localized Generalization Error Bound 41
5.2.3 Stochastic Sensitivity Measure for RBFNN 43
5.2.4 Characteristics of the Error Bound 45
5.2.5 Comparing Two Classifiers Using the Error Bound 47
5.3 Architecture Selection Using the Error Bound 47
5.3.1 Parameters for MC2SG 49
5.3.2 RBFNN Architecture Selection Algorithm for MC2SG 49
5.3.3 A Heuristic Method to Reduce the Computational Time for MC2SG 50
5.4 Summary 50
6 Critical Vector Learning for RBF Networks 52
6.1 Related Work 52
6.2 Construction of RBF Networks with Sensitivity Analysis 53
6.2.1 RBF Classifiers' Sensitivity to the Kernel Function Centers 54
6.2.2 Orthogonal Least Square Transform 56
6.2.3 Critical Vector Selection 57
6.3 Summary 57
7 Sensitivity Analysis of Prior Knowledge 59
7.1 KBANNs 59
7.2 Inductive Bias 60
7.3 Sensitivity Analysis and Measures 63
7.3.1 Output-Pattern Sensitivity 63
7.3.2 Output-Weight Sensitivity 64
7.3.3 Output-H Sensitivity 65
7.3.4 Euclidean Distance 65
7.4 Promoter Recognition 65
7.4.1 Data and Initial Domain Theory 66
7.4.2 Experimental Methodology 67
7.5 Discussion and Conclusion 68
8 Applications 72
8.1 Input Dimension Reduction 72
8.1.1 Sensitivity Matrix 73
8.1.2 Criteria for Pruning Inputs 73
8.2 Network Optimization 74
8.3 Selective Learning 77
8.4 Hardware Robustness 78
8.5 Measure of Nonlinearity 80
8.6 Parameter Tuning for Neocognitron 81
8.6.1 Receptive Field 82
8.6.2 Selectivity 83
8.6.3 Sensitivity Analysis of the Neocognitron 83
Bibliography 86
| Publication date (per publisher) | 9.11.2009 |
|---|---|
| Series | Natural Computing Series |
| Additional information | VIII, 86 p., 24 illus. |
| Place of publication | Berlin |
| Language | English |
| Subject area | Mathematics / Computer Science ► Computer Science ► Networks |
| | Natural Sciences ► Physics / Astronomy |
| | Engineering |
| Keywords | Adaline • Backpropagation algorithm • Computational Intelligence • Connectionism • Feature Selection • Hyperrectangle model • Knowledge-based artificial neural networks (KBANNs) • learning • Localized generalized error model • machine learning • Multilayer • Multilayer perceptron (MLP) • Neural networks • Optimization • perception • Perceptron • perturbations • Sensitivity Analysis • supervised learning • Unsupervised Learning • Vector learning |
| ISBN-10 | 3-642-02532-3 / 3642025323 |
| ISBN-13 | 978-3-642-02532-7 / 9783642025327 |
DRM: Digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only of limited use on small displays (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.
Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.