Principal Component Analysis Networks and Algorithms - Zhansheng Duan, Changhua Hu, Xiangyu Kong

Principal Component Analysis Networks and Algorithms (eBook)

eBook Download: PDF
2017 | 1st ed. 2017
XXII, 323 pages
Springer Singapore (publisher)
978-981-10-2915-8 (ISBN)
€ 149.79 incl. VAT
(CHF 146.30)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros including VAT.
  • Download available immediately
This book not only provides a comprehensive introduction to neural-based PCA methods in control science, but also presents many novel PCA algorithms and their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, generalized eigen decomposition (GED), and neural-based SVD algorithms. It also discusses in detail various methods for analyzing the convergence, stability, and self-stabilizing properties of these algorithms, and introduces the deterministic discrete-time (DDT) system method for analyzing the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although the book focuses on neural networks, it only presents their learning laws, which are simply iterative algorithms; therefore, no prior knowledge of neural networks is required. This book will be of interest, and serve as a reference source, to researchers and students in applied mathematics, statistics, engineering, and other related fields.
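To make the notion of a neural PCA "learning law" concrete, the following is a minimal Python sketch of Oja's classic single-neuron PCA rule, one of the iterative learning laws reviewed in Chapter 3. It is an illustration only, not code from the book; the function name oja_pca and all parameter values are assumptions chosen for the example.

import numpy as np

def oja_pca(X, n_epochs=50, eta=0.01, seed=0):
    """Estimate the first principal component of X (samples x features)
    using Oja's rule: w <- w + eta * y * (x - y * w), with y = w^T x."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in X:
            y = w @ x                   # neuron output
            w += eta * y * (x - y * w)  # Hebbian update with Oja's normalization
    return w / np.linalg.norm(w)

# The learned direction should align with the top eigenvector of the sample covariance.
X = np.random.default_rng(1).multivariate_normal([0.0, 0.0], [[3.0, 1.0], [1.0, 1.0]], size=2000)
w = oja_pca(X)
top_eigvec = np.linalg.eigh(np.cov(X.T))[1][:, -1]
print(abs(w @ top_eigvec))  # expected to be close to 1.0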

Xiangyu Kong received the B.S. degree in optical engineering from Beijing Institute of Technology, China, in 1990, and the Ph.D. degree in control engineering from Xi'an Jiaotong University, China, in 2005. He is currently an associate professor in the Department of Control Engineering at Xi'an Institute of Hi-Tech. His research interests include adaptive signal processing, neural networks, and feature extraction. He has published two monographs (both as first author) and more than 60 papers, of which nearly 20 appeared in premier journals including IEEE Transactions on Signal Processing, IEEE Transactions on Neural Networks and Learning Systems, IEEE Signal Processing Letters, and Neural Networks. He has presided over two projects funded by the National Natural Science Foundation of China.

Changhua Hu is currently a professor in the Department of Control Engineering at Xi'an Institute of Hi-Tech. His research interests include fault diagnosis in control systems, fault prognostics, and predictive maintenance. He has published three monographs and more than 200 papers in premier journals, including IEEE Transactions and EJOR. In 2010, he received the National Outstanding Youth Science Foundation award. In 2012, he was selected as a national-level candidate of the 'New Century BaiQianWan Talents Program' and was named a National Middle-aged and Young Expert with Outstanding Contributions. In 2013, he was appointed a Cheung Kong Scholar.


Zhansheng Duan received the B.S. and Ph.D. degrees from Xi'an Jiaotong University, China, in 1999 and 2005, respectively, both in electrical engineering. He also received a Ph.D. degree in electrical engineering from the University of New Orleans in 2010. From January 2010 to April 2010, he worked as an assistant research professor in the Department of Computer Science, University of New Orleans. In July 2010, he joined the Center for Information Engineering Science Research, Xi'an Jiaotong University, as an associate professor. His research interests include estimation and detection theory, target tracking, information fusion, nonlinear filtering, and performance evaluation. Dr. Duan has co-authored one monograph, Multisource Information Fusion (Tsinghua University Press, 2006), and 50 journal and conference papers. He is a member of ISIF (International Society of Information Fusion) and the Honor Society of Eta Kappa Nu, and is listed in Who's Who in America 2015.


Preface 6
Aim of This book 6
Novel Algorithms and Extensions 7
Prerequisites 8
Outline of the Book 8
Suggested Sequence of Reading 10
Acknowledgments 11
Contents 12
About the Authors 18
1 Introduction 20
1.1 Feature Extraction 20
1.1.1 PCA and Subspace Tracking 20
1.1.2 PCA Neural Networks 21
1.1.3 Extension or Generalization of PCA 23
1.2 Basis for Subspace Tracking 24
1.2.1 Concept of Subspace 24
1.2.2 Subspace Tracking Method 27
1.3 Main Features of This Book 29
1.4 Organization of This Book 30
References 31
2 Matrix Analysis Basics 36
2.1 Introduction 36
2.2 Singular Value Decomposition 37
2.2.1 Theorem and Uniqueness of SVD 37
2.2.2 Properties of SVD 39
2.3 Eigenvalue Decomposition 41
2.3.1 Eigenvalue Problem and Eigen Equation 41
2.3.2 Eigenvalue and Eigenvector 42
2.3.3 Eigenvalue Decomposition of Hermitian Matrix 47
2.3.4 Generalized Eigenvalue Decomposition 49
2.4 Rayleigh Quotient and Its Characteristics 53
2.4.1 Rayleigh Quotient 53
2.4.2 Gradient and Conjugate Gradient Algorithm for RQ 54
2.4.3 Generalized Rayleigh Quotient 55
2.5 Matrix Analysis 57
2.5.1 Differential and Integral of Matrix with Respect to Scalar 57
2.5.2 Gradient of Real Function with Respect to Real Vector 58
2.5.3 Gradient Matrix of Real Function 59
2.5.4 Gradient Matrix of Trace Function 61
2.5.5 Gradient Matrix of Determinant 62
2.5.6 Hessian Matrix 64
2.6 Summary 64
References 65
3 Neural Networks for Principal Component Analysis 66
3.1 Introduction 66
3.2 Review of Neural-Based PCA Algorithms 67
3.3 Neural-Based PCA Algorithms Foundation 67
3.3.1 Hebbian Learning Rule 67
3.3.2 Oja’s Learning Rule 69
3.4 Hebbian/Anti-Hebbian Rule-Based Principal Component Analysis 70
3.4.1 Subspace Learning Algorithms 71
3.4.1.1 Symmetrical Subspace Learning Algorithm 71
3.4.1.2 Weighted Subspace Learning Algorithm 72
3.4.2 Generalized Hebbian Algorithm 72
3.4.3 Learning Machine for Adaptive Feature Extraction via PCA 73
3.4.4 The Dot-Product-Decorrelation Algorithm (DPD) 74
3.4.5 Anti-Hebbian Rule-Based Principal Component Analysis 74
3.4.5.1 Rubner-Tavan PCA Algorithm 75
3.4.5.2 APEX Algorithm 75
3.5 Least Mean Squared Error-Based Principal Component Analysis 76
3.5.1 Least Mean Square Error Reconstruction Algorithm (LMSER) 77
3.5.2 Projection Approximation Subspace Tracking Algorithm (PAST) 78
3.5.3 Robust RLS Algorithm (RRLSA) 79
3.6 Optimization-Based Principal Component Analysis 80
3.6.1 Novel Information Criterion (NIC) Algorithm 80
3.6.2 Coupled Principal Component Analysis 81
3.7 Nonlinear Principal Component Analysis 82
3.7.1 Kernel Principal Component Analysis 83
3.7.2 Robust/Nonlinear Principal Component Analysis 85
3.7.3 Autoassociative Network-Based Nonlinear PCA 87
3.8 Other PCA or Extensions of PCA 87
3.9 Summary 89
References 90
4 Neural Networks for Minor Component Analysis 93
4.1 Introduction 93
4.2 Review of Neural Network-Based MCA Algorithms 94
4.2.1 Extracting the First Minor Component 95
4.2.2 Oja’s Minor Subspace Analysis 97
4.2.3 Self-stabilizing MCA 97
4.2.4 Orthogonal Oja Algorithm 98
4.2.5 Other MCA Algorithm 98
4.3 MCA EXIN Linear Neuron 99
4.3.1 The Sudden Divergence 99
4.3.2 The Instability Divergence 101
4.3.3 The Numerical Divergence 101
4.4 A Novel Self-stabilizing MCA Linear Neurons 102
4.4.1 A Self-stabilizing Algorithm for Tracking One MC 102
4.4.2 MS Tracking Algorithm 108
4.4.3 Computer Simulations 112
4.5 Total Least Squares Problem Application 116
4.5.1 A Novel Neural Algorithm for Total Least Squares Filtering 116
4.5.2 Computer Simulations 123
4.6 Summary 124
References 125
5 Dual Purpose for Principal and Minor Component Analysis 129
5.1 Introduction 129
5.2 Review of Neural Network-Based Dual-Purpose Methods 131
5.2.1 Chen’s Unified Stabilization Approach 131
5.2.2 Hasan’s Self-normalizing Dual Systems 132
5.2.3 Peng’s Unified Learning Algorithm to Extract Principal and Minor Components 135
5.2.4 Manton’s Dual-Purpose Principal and Minor Component Flow 136
5.3 A Novel Dual-Purpose Method for Principal and Minor Subspace Tracking 137
5.3.1 Preliminaries 137
5.3.1.1 Definitions and Properties 137
5.3.1.2 Conventional Formulation for PSA or MSA 138
5.3.2 A Novel Information Criterion and Its Landscape 140
5.3.2.1 A Novel Criterion for PSA and MSA 140
5.3.2.2 Landscape of Nonquadratic Criteria 140
5.3.3 Dual-Purpose Subspace Gradient Flow 145
5.3.3.1 Dual Purpose Gradient Flow 145
5.3.3.2 Convergence Analysis 146
5.3.3.3 Self-stability Property Analysis 148
5.3.4 Global Convergence Analysis 149
5.3.5 Numerical Simulations 150
5.3.5.1 Self-stabilizing Property and Convergence 150
5.3.5.2 The Contrasts with Other Algorithms 152
5.3.5.3 Examples from Practical Applications of Our Unified Algorithm 153
5.4 Another Novel Dual-Purpose Algorithm for Principal and Minor Subspace Analysis 156
5.4.1 The Criterion for PSA and MSA and Its Landscape 157
5.4.2 Dual-Purpose Algorithm for PSA and MSA 159
5.4.3 Experimental Results 160
5.4.3.1 Simulation Experiment 160
5.4.3.2 Real Application Experiment 161
5.5 Summary 164
References 164
6 Deterministic Discrete-Time System for the Analysis of Iterative Algorithms 167
6.1 Introduction 167
6.2 Review of Performance Analysis Methods for Neural Network-Based PCA Algorithms 168
6.2.1 Deterministic Continuous-Time System Method 168
6.2.2 Stochastic Discrete-Time System Method 169
6.2.3 Lyapunov Function Approach 172
6.2.4 Deterministic Discrete-Time System Method 173
6.3 DDT System of a Novel MCA Algorithm 173
6.3.1 Self-stabilizing MCA Extraction Algorithms 173
6.3.2 Convergence Analysis via DDT System 174
6.3.3 Computer Simulations 183
6.4 DDT System of a Unified PCA and MCA Algorithm 185
6.4.1 Introduction 186
6.4.2 A Unified Self-stabilizing Algorithm for PCA and MCA 186
6.4.3 Convergence Analysis 187
6.4.4 Computer Simulations 198
6.5 Summary 200
References 201
7 Generalized Principal Component Analysis 203
7.1 Introduction 203
7.2 Review of Generalized Feature Extraction Algorithm 205
7.2.1 Mathew’s Quasi-Newton Algorithm for Generalized Symmetric Eigenvalue Problem 205
7.2.2 Self-organizing Algorithms for Generalized Eigen Decomposition 207
7.2.3 Fast RLS-like Algorithm for Generalized Eigen Decomposition 208
7.2.4 Generalized Eigenvector Extraction Algorithm Based on RLS Method 209
7.2.5 Fast Adaptive Algorithm for the Generalized Symmetric Eigenvalue Problem 212
7.2.6 Fast Generalized Eigenvector Tracking Based on the Power Method 214
7.2.7 Generalized Eigenvector Extraction Algorithm Based on Newton Method 216
7.2.8 Online Algorithms for Extracting Minor Generalized Eigenvector 218
7.3 A Novel Minor Generalized Eigenvector Extraction Algorithm 220
7.3.1 Algorithm Description 221
7.3.2 Self-stabilizing Analysis 222
7.3.3 Convergence Analysis 223
7.3.4 Computer Simulations 231
7.4 Novel Multiple GMC Extraction Algorithm 235
7.4.1 An Inflation Algorithm for Multiple GMC Extraction 235
7.4.2 A Weighted Information Criterion and Corresponding Multiple GMC Extraction 238
7.4.3 Simulations and Application Experiments 246
7.5 Summary 248
References 249
8 Coupled Principal Component Analysis 252
8.1 Introduction 252
8.2 Review of Coupled Principal Component Analysis 254
8.2.1 Moller’s Coupled PCA Algorithm 254
8.2.2 Nguyen's Coupled Generalized Eigen Pairs Extraction Algorithm 255
8.2.3 Coupled Singular Value Decomposition of a Cross-Covariance Matrix 260
8.3 Unified and Coupled Algorithm for Minor and Principal Eigen Pair Extraction 260
8.3.1 Coupled Dynamical System 261
8.3.2 The Unified and Coupled Learning Algorithms 263
8.3.2.1 Coupled MCA Algorithms 263
8.3.2.2 Coupled PCA Algorithms 264
8.3.2.3 Multiple Eigen Pairs Estimation 265
8.3.3 Analysis of Convergence and Self-stabilizing Property 267
8.3.4 Simulation Experiments 269
8.4 Adaptive Coupled Generalized Eigen Pairs Extraction Algorithms 274
8.4.1 A Coupled Generalized System for GMCA and GPCA 274
8.4.2 Adaptive Implementation of Coupled Generalized Systems 279
8.4.3 Convergence Analysis 282
8.4.4 Numerical Examples 288
8.5 Summary 295
References 295
9 Singular Feature Extraction and Its Neural Networks 297
9.1 Introduction 297
9.2 Review of Cross-Correlation Feature Method 299
9.2.1 Cross-Correlation Neural Networks Model and Deflation Method 299
9.2.2 Parallel SVD Learning Algorithms on Double Stiefel Manifold 302
9.2.3 Double Generalized Hebbian Algorithm (DGHA) for SVD 304
9.2.4 Cross-Associative Neural Network for SVD (CANN) 305
9.2.5 Coupled SVD of a Cross-Covariance Matrix 307
9.2.5.1 Single-Component Learning Rules 308
9.2.5.2 Multiple Component Learning Rules 309
9.3 An Effective Neural Learning Algorithm for Extracting Cross-Correlation Feature 310
9.3.1 Preliminaries 311
9.3.1.1 Definitions and Properties 311
9.3.1.2 Some Formulations Relative to PSS 311
9.3.2 Novel Information Criterion Formulation for PSS 313
9.3.2.1 Novel Information Criterion Formulation for PSS 313
9.3.2.2 Landscape of Nonquadratic Criterion 314
9.3.2.3 Remarks and Comparisons 320
9.3.3 Adaptive Learning Algorithm and Performance Analysis 321
9.3.3.1 Adaptive Learning Algorithm 321
9.3.3.2 Convergence Analysis 322
9.3.3.3 Self-Stability Property Analysis 323
9.3.4 Computer Simulations 324
9.4 Coupled Cross-Correlation Neural Network Algorithm for Principal Singular Triplet Extraction of a Cross-Covariance Matrix 327
9.4.1 A Novel Information Criterion and a Coupled System 328
9.4.2 Online Implementation and Stability Analysis 331
9.4.3 Simulation Experiments 332
9.4.3.1 Experiment 1 332
9.4.3.2 Experiment 2 334
9.5 Summary 336

Publication date (per publisher) 9.1.2017
Additional information XXII, 323 p., 86 illus., 41 illus. in color
Place of publication Singapore
Language English
Subject areas Mathematics / Computer Science → Computer Science → Programming Languages / Tools
Computer Science → Theory / Studies → Algorithms
Computer Science → Theory / Studies → Artificial Intelligence / Robotics
Mathematics / Computer Science → Mathematics → Applied Mathematics
Mathematics / Computer Science → Mathematics → Statistics
Engineering → Electrical Engineering / Power Engineering
Keywords Algorithm analysis and problem complexity • feature extraction • Generalized Feature Extraction • Neural networks • PCA Algorithms • Principal Component Analysis • Singular Component Analysis
ISBN-10 981-10-2915-6 / 9811029156
ISBN-13 978-981-10-2915-8 / 9789811029158
PDF (watermarked)
Size: 6.9 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small screens (smartphone, e-reader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: online reading
In addition to downloading it, you can also read this eBook online in your web browser.

Buying eBooks from abroad
For tax law reasons we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
