Artificial Neural Networks (eBook)
836 pages
Elsevier Science (publisher)
978-1-4832-9800-9 (ISBN)
This two-volume proceedings compiles a selection of research papers presented at ICANN-91, the 1991 International Conference on Artificial Neural Networks. The scope of the volumes is interdisciplinary, ranging from mathematics and engineering to cognitive sciences and biology. European research is well represented. Volume 1 contains all the orally presented papers, including both invited talks and submitted papers. Volume 2 contains the plenary talks and the poster presentations.
Front Cover 1
Artificial Neural Networks 4
Copyright Page 5
Table of Contents 8
Part 1: Plenary Talks 22
CHAPTER 1. SELF-ORGANIZING MAPS: OPTIMIZATION APPROACHES 24
1. Introduction 24
2. Relation of the Self-Organizing Map Algorithm to Stochastic Approximation 26
3. Special Case: Optimal Recursive Expression for the Classical Vector Quantization 27
4. An Attempt to Derive the Self-Organizing Map Algorithm from an Error Functional 27
5. Numerical Simulations 29
6. Conclusions 30
References 31
Appendix 32
Acknowledgement 33
CHAPTER 2. CONNECTIONISM OR WEIGHTLESS NEUROCOMPUTING? 34
1. INTRODUCTION 34
2. CONNECTIONIST AND WEIGHTLESS FRAMEWORKS 34
3. THE GENERALISING RAM MODEL (G-RAM) 37
4. THE GENERAL NEURAL UNIT (GNU) 38
5. WEIGHTLESS NEUROCOMPUTING FUTURE RESEARCH ISSUES 40
6. SUMMARY 42
REFERENCES 43
Part 2: Mathematical Theories of Networks and Dynamical Systems 44
CHAPTER 3. PROBABILISTIC APPROACH FOR MULTICLASS CLASSIFICATION WITH NEURAL NETWORKS 46
1 Introduction 46
2 Proposed approach 47
3 Practical Reconstruction 48
4 Conclusions 49
References 49
CHAPTER 4. ON THE FEEDBACK PERCEPTRON 50
1. INTRODUCTION 50
2. STABILITY 51
3. SIMULATION EXAMPLE AND CONCLUSIONS 52
REFERENCES 53
CHAPTER 5. MATHEMATICAL ASPECTS OF NEURO-DYNAMICS FOR COMBINATORIAL OPTIMIZATION 54
1. INTRODUCTION 54
2. OBJECTIVE FUNCTION OF QUADRATIC FORM 54
3. DYNAMICAL SYSTEM FOR OPTIMIZATION PROBLEMS 55
4. INITIAL STATE SELECTION PROBLEM 56
5. CONCLUDING REMARKS 57
References 57
CHAPTER 6. THE QUANTITATIVE DESCRIPTION OF PLN NETWORK'S BEHAVIOR 58
1. Introduction 58
2. PLN Network 58
3. The Convergence Theorem of PLN Networks 59
References 61
CHAPTER 7. PREDICTING THE ANNEALING RANGE BY COMPUTING CRITICAL TEMPERATURE IN MEAN FIELD ANNEALING FOR THE TRAVELING SALESMAN PROBLEM 62
Abstract 62
Introduction 62
Experimental Results and Conclusions 65
References 67
CHAPTER 8. ORDERS OF APPROXIMATION OF NEURAL NETWORKS TO BRAIN STRUCTURE: LEVELS, MODULES AND COMPUTING POWER 68
1. INTRODUCTION 68
2. ELEMENTS AND STRUCTURES 68
3. COMPUTATIONAL POWER 70
4. CONCLUSION 71
5. REFERENCES 71
CHAPTER 9. A Provably Convergent Perceptron-Like Algorithm for Learning Hypercubic Decision Regions 72
1 Introduction 72
2 Notation 72
3 Algorithm Statement 73
4 Stability Proof 73
5 Implications of the stability 74
6 Conclusions 75
Acknowledgement 75
7 References 75
CHAPTER 10. Perceptron Learning with Reasonable Distributions of Examples 76
1 Introduction 76
2 PAC-Learnability 76
3 Learning with reasonable distributions 77
4 Perceptron learning 77
5 Directed search perceptron algorithms 78
Acknowledgements 79
References 79
CHAPTER 11. GLOBAL OPTIMIZATION BY REACTION-DIFFUSION SYSTEMS 80
CHAPTER 12. SELF-ORGANIZING MAP TRAINING USING DYNAMIC K-D TREES 84
1 Introduction 84
2 Training of Self-Organizing Feature Maps 84
3 The Dynamic K-d Tree 85
4 Experimental Results 85
5 Conclusions 87
References 87
CHAPTER 13. NETWORK CONFIGURATION AND INITIALIZATION USING MATHEMATICAL MORPHOLOGY: THEORETICAL STUDY OF MEASUREMENT FUNCTIONS 88
1 Introduction 88
2 Network equivalence 88
3 Dimension of the space of transformed images 89
4 Optimal size of neighborhood 90
5 Results on morphological measurements 90
6 Conclusion 91
7 Notations 91
References 91
CHAPTER 14. A GLOBAL APPROACH TO CLASSIFICATION: PROBABILISTIC ASPECTS 92
1 CLASSIFICATION 92
2 PROBABILISTIC APPROACH 92
3 PROPOSED METHODOLOGY 94
References 95
CHAPTER 15. A NETWORK FOR DISCRIMINANT ANALYSIS 96
ABSTRACT 96
1. Introduction 96
2. Network Model and Training 97
3. Discussion 99
ACKNOWLEDGEMENTS 99
REFERENCES 99
CHAPTER 16. ON THE NEURAL NETS DISCRETENESS ROLE BY THE TREATMENT OF IMAGES 100
REFERENCES 102
CHAPTER 17. AN INFORMATION THEORETICAL INTERPRETATION OF NEURONAL ACTIVITIES 104
1. Introduction 104
2. The definitions of Eó and Et 104
References 107
Part 3: Pattern Recognition and Signal Processing I 108
CHAPTER 18. CLASSIFICATION OF REMOTELY-SENSED SATELLITE IMAGES USING MULTI-LAYER PERCEPTRON NETWORKS 110
1. INTRODUCTION 110
2. SATELLITE IMAGERY AND GROUND TRUTH TRAINING DATA 110
3. IMPLEMENTATION OF NEURAL NETWORK CLASSIFIER 111
4. NETWORK ARCHITECTURES AND PERFORMANCE 111
5. DISCUSSION 113
REFERENCES 113
CHAPTER 19. NEURAL NETWORK FOR DIGITAL IMAGE ENHANCEMENT 114
1. THEORETICAL ASPECTS 114
2. REALTIME HISTOGRAM MODIFICATION 115
3. CONCLUSION 117
REFERENCES 117
CHAPTER 20. IMAGE PROCESSING BY NEURAL NETWORK IMPLEMENTATION OF 2-D 118
1. Introduction 118
2. Problem statement 118
3. Problem solution 119
4. Neural network solving Lyapunov equation 119
5. Conclusions 121
References 121
CHAPTER 21. GAP-REMOVAL IMAGE TRANSFORMATION FOR ALIAS 122
1. INTRODUCTION TO ALIAS 122
2. IMAGE TRANSFORMATION 123
3. PRELIMINARY RESULTS AND CONCLUSIONS 124
REFERENCES 125
CHAPTER 22. IMAGE SEGMENTATION USING 4 DIRECTION LINE-PROCESSES AND WINNER-TAKE-ALL 126
1 Introduction 126
2 Mean field techniques and winner-take-all 126
3 Four line-process model 127
4 Simulation 128
References 128
CHAPTER 23. CONSTRAINT SATISFACTION NEURAL NETWORKS FOR IMAGE SEGMENTATION 130
1. INTRODUCTION 130
2. CONSTRAINT SATISFACTION NEURAL NETWORKS 130
3. EXPERIMENTAL RESULTS 132
4. CONCLUDING REMARKS 132
REFERENCES 132
CHAPTER 24. A CUT-POINT RECOGNITION ALGORITHM USING PLN NODE 134
1. INTRODUCTION 134
2. THE CUT-POINT RECOGNITION METHOD 135
3. DISCUSSIONS AND CONCLUSIONS 136
REFERENCES 137
CHAPTER 25. A NOVEL CHARACTER RECOGNITION SYSTEM USING A CONTEXTUAL FEEDBACK CONNECTIONIST MODULE TO ENHANCE SYSTEM PERFORMANCE 138
1. INTRODUCTION 138
2. SYSTEM DESIGN 138
3. RESULTS AND DISCUSSION 140
4. CONCLUSION 141
Acknowledgements 141
References 141
CHAPTER 26. NN AND HEURISTIC APPROACH TO CHARACTER RECOGNITION 142
1. INTRODUCTION 142
2. THE NEURAL NETWORK 142
3. THE COOPERATION BETWEEN THE NN AND A HEURISTIC APPROACH 143
4. CONCLUSION 145
REFERENCES 145
CHAPTER 27. SIMILARITY-INVARIANT RECOGNITION OF VISUAL IMAGES WITH HELP OF KOHONEN'S MAPPING FORMATION ALGORITHM 146
1. INTRODUCTION 146
2. THE MAIN THEOREM 147
3. RECOGNITION CONSIDERATIONS 148
4. GENERALIZATIONS 148
REFERENCES 149
CHAPTER 28. ACQUIRED STRUCTURE, ADAPTED PARAMETERS: MODIFICATIONS OF THE NEOCOGNITRON 150
1. INTRODUCTION 150
2. OUTLINE OF THE NEOCOGNITRON 150
3. OPTIMIZATION 151
4. SELF ORGANISATION AND CONSTRUCTION PROCESS 151
5. SIMULATIONS 152
6. THRESHOLD CONTROL 152
7. CONCLUSION 153
CHAPTER 29. MODEL-BASED OBJECT RECOGNITION USING ARTIFICIAL NEURAL NETWORKS 154
1. INTRODUCTION 154
2. INVARIANT BOUNDARY REPRESENTATION 154
3. THE PREDICT BACK-PROPAGATION ALGORITHM 155
4. THE KOHONEN ALGORITHM 155
5. IMPLEMENTATION ISSUES AND SIMULATION RESULTS 155
6. COMPARISONS WITH CLASSICAL METHODS 157
7. CONCLUSIONS 157
REFERENCES 158
CHAPTER 30. AUTOMATIC CLASSIFICATION OF VISUAL EVOKED POTENTIALS BY FEEDFORWARD NEURAL NETWORKS 160
INTRODUCTION 160
SIGNAL RECORDING 161
VEP CLASSIFICATION BY HUMAN EXPERTS 161
VEP CLASSIFICATION BY NEURAL NETWORKS 162
DISCUSSION 163
ACKNOWLEDGMENTS 163
REFERENCES 163
CHAPTER 31. A NEURAL NETWORK MODEL FOR CONTROL AND STABILIZATION OF REVERBERATING PATTERN SEQUENCES 164
Introduction 164
The model approach of the dynamic recurrent filter 164
Weak context-dependent on-line modulation of the processing structure of the recurrent dynamic filter 165
Simulation results 166
CHAPTER 32. RESULTS OBTAINED WITH THE AUTOGENERATIVE NODAL MEMORY (ANM) MODEL NEURAL NETWORK 168
1. ABSTRACT 168
2. ILLUSORY CONTOURS (THE EFFECT) 168
3. THE IMPLEMENTATION WITH ANM 169
4. ANM EXPERIMENT RESULT 170
5. CONCLUSIONS 171
6. REFERENCES 171
Part 4: Physics Connection 172
CHAPTER 33. LEARNING TOPOLOGICAL MAPPINGS FOR SKELETAL REPRESENTATION 174
1. INTRODUCTION: TOPOLOGICAL MAPPINGS 174
2. LEARNING TOPOLOGICAL MAPPINGS FOR SKELETAL REPRESENTATION 175
3. PATH PLANNING 175
4. SIMULATION RESULTS 175
5. CONCLUSION 175
ACKNOWLEDGEMENTS 177
REFERENCES 177
CHAPTER 34. Basins of Attraction in Neural Network Models Trained with External Fields 178
Introduction 178
The Model 178
Fixed-Points of the Dynamics 179
Training Field Only 180
Retrieval Field Only 180
Equal Training and Retrieval Fields 180
Concluding Remarks 181
Acknowledgements 181
References 181
Part 5: Neural Network Architectures and Algorithms I 182
CHAPTER 35. FAST LEARNING ALGORITHMS FOR NEURAL NETWORKS 184
1. Introduction 184
2. A Generalized Criterion for the Training of Neural Networks 184
3. Fast Learning Algorithms for Single-Layered Neural Networks 184
4. Fast Learning Algorithms for Multi-layered Neural Networks 185
5. Experimental Results 186
6. Conclusions 187
REFERENCES 187
CHAPTER 36. A NEURAL NETWORK ALGORITHM FOR GRAPH MATCHING 188
References 191
CHAPTER 37. Recurrence with Delayed Links in Multilayer Networks for Processing Sequential Data 192
1 Introduction 192
2 Approaches to sequential data processing 193
3 Networks described with delayed links 193
4 The learning algorithm 194
5 Simulation experiments 194
6 Discussion 195
7 References 195
CHAPTER 38. DYNAMICALLY CAPACITY ALLOCATING NETWORK MODELS FOR CONTINUOUS LEARNING 196
1. INTRODUCTION 196
2. LEARNING ALGORITHM 197
3. EXAMPLE 199
4. CONCLUSIONS 199
REFERENCES 199
CHAPTER 39. TOWARDS OPTIMAL ARCHITECTURES FOR LOGICAL NEURAL NETS 200
1. Introduction 200
2. The PLN Model 201
3. Training and Construction 201
4. Computer Simulations 202
5. Conclusions 203
Acknowledgements 203
References 203
CHAPTER 40. PREDICTION AND GENERALIZATION IN LOGICAL NEURAL NETS 204
I. Introduction 204
II. Probability Transfer Model of G-RAM 204
III. Prediction and Generalization 205
IV. Computer Simulations 206
V. Conclusions 207
Acknowledgements 207
References 207
CHAPTER 41. FUSION-TECHNOLOGY AND THE DESIGN OF EVOLUTIONARY MACHINES FOR NEURAL NETWORKS 208
Motivation and Background 208
Central Features of an Evolutionary Machine 209
Recombination 210
Summary 211
References 211
CHAPTER 42. Synaptic Growth As A Learning Model 212
1. INTRODUCTION 212
2. SGN NETWORK BASICS 213
3. SGN MODELS I, II AND III 213
4. CONCLUSION 215
REFERENCES 215
CHAPTER 43. LEARNING AND GENERALIZATION IN ADAPTIVE LOGIC NETWORKS 216
1 Introduction 216
2 Learning and generalization in logic networks 216
3 Dealing with continuous inputs 218
4 What advantage is there to using logic networks? 218
5 Conclusions 219
6 References 219
CHAPTER 44. A COMPARISON BETWEEN REAL AND COMPLEX VALUED NEURAL NETWORKS IN COMMUNICATION APPLICATIONS 220
Abstract 220
1. Introduction 220
2. Complex-Valued Multi-Layer Perceptron 221
3. Experimental Results 222
4. References 222
CHAPTER 45. SDNN: AN O(1) PARALLEL PROCESSING WITH STRICTLY DIGITAL NEURAL NETWORKS FOR COMBINATORIAL OPTIMIZATION IN LARGE SCALE N-QUEEN PROBLEM 224
1. Introduction 224
2. The Systematic Design of Hopfield Neural Networks using the "k-out-of-n" Design Rule 224
3. Computation Model of Strictly Digital Neural Network 225
4. Performance evaluation of SDNN in Large-Scale Problems 226
References 227
CHAPTER 46. TOLERANCE OF A BINARY ASSOCIATIVE MEMORY TOWARDS STUCK-AT-FAULTS 238
1. Introduction 238
2. The Associative Matrix Concept 238
3. Stuck-at-1 239
4. Stuck-at-0 240
5. Conclusion 241
Acknowledgements 241
Literature 241
CHAPTER 47. MULTI-FONT CHINESE CHARACTER RECOGNITION WITH ASSOCIATIVE MEMORY NETWORK 242
Abstract 242
1 INTRODUCTION 242
2 THE MECHANISM OF ASSOCIATIVE MEMORY 242
3 INNER CODE SELECTION 243
4 EXPERIMENTS AND RESULTS 244
5 CONCLUSIONS 245
ACKNOWLEDGEMENTS 245
REFERENCES 245
CHAPTER 48. IMAGE RECOGNITION IN HYPERCOLUMNAR SCALE SPACE BY SPARSELY CODED ASSOCIATIVE MEMORY 246
1 Introduction 246
2 A Dynamic Approach to Hypercolumnar Interactions 246
3 Recognition by Sparsely Coded Associative Memory 247
4 Scale Space Searching for Translational Invariance 248
5 Implementation and Results 249
References 249
CHAPTER 49. DESIGN IMPROVEMENTS IN ASSOCIATIVE MEMORIES FOR CEREBELLAR MODEL ARTICULATION CONTROLLERS (CMAC) 250
Abstract 250
1. INTRODUCTION 250
2. THE RECEPTIVE CENTER PLACEMENT PROBLEM 251
3. EXPERIMENTAL EVALUATION OF RECEPTIVE FIELD SHAPES 252
4. USE OF SUPERSPHERES 252
5. SPEEDING UP CONVERGENCE OF THE LEARNING ALGORITHM 253
6. REFERENCES 253
CHAPTER 50. Implementing a "Sense of Time" via Entropy in Associative Memories 254
Abstract 254
Introduction 254
Acknowledgements 255
The Temporal Model 255
Obtaining the Desired Properties 256
Implementations 257
Further Work 257
Summary 257
Bibliography 257
CHAPTER 51. Paging Associative Memories 258
1 Introduction 258
2 The "Paging" viewpoint 258
3 The "Serial Processing" viewpoint 259
4 The "Merged" viewpoint 259
5 The Context-Sensitive Paradigm 260
6 Conclusion 261
CHAPTER 52. On Finding Approximate Solutions to Hard Problems by Neural Networks 262
Abstract 262
1 Introduction 262
2 The Neural Network Model 262
3 Preliminaries 263
4 Finding Approximate Solutions by Neural Networks 264
5 Concluding Remarks 265
References 265
Part 6: Artificial Associative Memories 228
CHAPTER 53. STABILITY RESULTS OF A CLASS OF CONTINUOUS ASSOCIATIVE MEMORIES WITH HIGH-CAPACITY 230
1. Introduction 230
2. Continuous Recurrent Correlation Associative Memories 230
3. Stability Properties of the CRCAM 232
References 233
CHAPTER 54. ON THE RETRIEVAL IN HOPFIELD NETS WITH A FINITE RANGE OF CONNECTIONS 234
1. INTRODUCTION 234
2. NUMERICAL TREATMENT AND RESULTS 235
REFERENCES 237
Part 7: Robotics and Control 266
CHAPTER 55. MULTI-LAYER PERCEPTRON LEARNING FOR DESIGN PROBLEM SOLVING 268
1. A TWO-LAYER NEURAL NETWORK MODEL 268
2. APPLICATION 270
ACKNOWLEDGEMENT 271
REFERENCE 271
CHAPTER 56. PROCESS ERROR DETECTION USING SELF-ORGANIZING FEATURE MAPS 272
1 Introduction 272
2 Experimental Set-up 272
3 Monitoring the Process with Self-organizing Feature Maps 273
4 Results 274
5 Conclusion and Further Work 275
References 275
CHAPTER 57. Using Inverse Perspective Mapping as a Basis for two Concurrent Obstacle Avoidance Schemes 276
1 INTRODUCTION 276
2 MONOCULAR APPROACH: OBSTACLE DETECTION BY CORRELATION-BASED OPTICAL FLOW ANALYSIS 277
3 STEREOPTICAL APPROACH: COMPARING AN IMAGE PAIR WITHIN A COMMON VIEW BY SIMPLE CORRELATION 278
4 CONCLUSIONS 279
References 279
CHAPTER 58. A BIOLOGICALLY MOTIVATED SYSTEM TO TRACK MOVING OBJECTS BY ACTIVE CAMERA CONTROL 280
1 Introduction and Motivation 280
2 Biological Model 280
3 The Behavioural Module "Tracking" 281
4 Experimental Results 282
5 Discussion and Future Work 282
References 283
CHAPTER 59. NEURALLY INSPIRED ASSOCIATIVE MEMORIES FOR LEARNING CONTROL. A COMPARISON 284
1 Introduction 284
2 Classical Methods 284
3 Neurally Inspired Methods 285
4 Comparison of AMS and a Backpropagation Network 286
5 Conclusion 287
Acknowledgements 287
References 287
CHAPTER 60. 288
1. INTRODUCTION 288
2. FUZZY COGNITIVE MODEL CONFIGURATION 288
3. APPLICATION 290
4. CONCLUSION 291
REFERENCES 291
CHAPTER 61. FUZZY ASSOCIATIVE MEMORY APPLICATIONS TO CONTROL 292
1. INTRODUCTION 292
2. FUZZY ASSOCIATIVE MEMORY SYSTEM CONSTRUCTION 292
3. APPLICATIONS 294
4. CONCLUSION 295
REFERENCES 295
CHAPTER 62. A FAST EDGE DETECTION METHOD FOR NAVIGATION 296
1. Introduction 296
2. Detection and classification of road edges 296
3. Results 298
References 299
CHAPTER 63. QUADRUPEDAL WALKING USING TRAINED AND UNTRAINED NEURAL MODELS 300
1 Introduction 300
3 The hybrid search / learning system 301
5 Extension of the trials 303
References 303
CHAPTER 65. THE BLIND NEURAL NETWORK MAKER: CAN WE USE CONSTRAINED EMBRYOLOGIES TO DESIGN ANIMAT NERVOUS SYSTEMS? 304
1. INTRODUCTION 304
2. SYMMETRY 305
3. SEGMENTATION 306
4. RECURSION 307
ACKNOWLEDGMENTS 307
REFERENCES 307
CHAPTER 66. Path Finding with Nonlinear Waves 308
Introduction 308
Wave propagation in a network of oscillators 309
Path finding with nonlinear waves 309
Discussion 311
References 311
CHAPTER 67. LIZZY: The Genetic Programming of an Artificial Nervous System (Hugo de Garis) 312
Abstract 312
1. Introduction 312
2. Genetic Programming of GenNets 313
3. Building an Artificial Nervous System: The LIZZY Project 313
4. The LIZZY Circuit 314
CHAPTER 68. GENETICALLY PROGRAMMED NEURAL NETWORK FOR SOLVING POLE-BALANCING PROBLEM 316
1. INTRODUCTION 316
2. PROBLEM 316
3. METHOD 317
4. RESULTS 318
5. COMMENT 319
REFERENCES 319
CHAPTER 69. NEURAL NET BASED CONTROL OF THE HEATING PROCESS 320
1. Introduction 320
2. One-step-ahead Controller 320
3. Long Range Predictive Controller (LRPC) 321
4. Experimental Setup 322
5. Experiments 322
7. Conclusion 323
REFERENCES 323
CHAPTER 70. A NEURAL IMPLEMENTATION OF ANALOGIC PLANNING METHODS 324
1. Analogic Planning 324
2. Neural implementation 325
References 326
CHAPTER 71. USE OF CMAC NEURAL NETWORKS IN REINFORCEMENT SELF-LEARNING CONTROL 328
ABSTRACT 328
I. INTRODUCTION 328
II. BOX-BASED ADAPTIVE CRITIC LEARNING 328
III. CMAC-BASED ADAPTIVE CRITIC LEARNING 329
IV. SIMULATION RESULTS AND CONCLUSION 330
REFERENCES 331
Part 8: Self-Organization and Vector Quantization 332
CHAPTER 72. A SELF-ORGANIZING ALGORITHM FOR THE COMPUTATIONAL LOAD BALANCE OF A CONCURRENT COMPUTER 334
1. INTRODUCTION 334
2. THE MULTIPROCESSOR LOAD BALANCING PROBLEM 334
3. THE MULTIPROCESSOR LOAD BALANCING ALGORITHM 335
4. THE TASK PRESENTATION ORDER 336
5. SIMULATION RESULTS 337
6. CONCLUSIONS 337
7. REFERENCES 337
CHAPTER 73. AN UNSUPERVISED HYPERSPHERIC MULTI-LAYER FEEDFORWARD NEURAL NETWORK CLASSIFIER 338
ABSTRACT 338
I. INTRODUCTION 338
II. THE CASE OF STRONGLY SEPARABLE CATEGORIES 339
III. THE CASE OF NON-SEPARABLE CATEGORIES 341
IV. SIMULATION RESULTS 341
V. SUMMARY 342
Acknowledgements 342
References 342
CHAPTER 74. A NOVEL FEATURE MAP ARCHITECTURE FOR THE REPRESENTATION OF CLASSIFIER CONDITION SETS 344
Abstract 344
1. Introduction 344
2. Basic Architecture 344
3. HLS Feature Maps 344
4. Test Regime 345
5. Conclusions 346
6. References 346
CHAPTER 75. SELF ORGANIZING FEATURE MAPS FOR CONTOUR DETECTION IN VIDEOPHONE IMAGES 348
1. INTRODUCTION 348
2. PROBLEM DESCRIPTION 348
3. IMAGE PRE-PROCESSING 349
4. PREVIOUS APPROACHES TO CONTOUR DETECTION 350
5. CONTOUR DETECTION WITH SELF-ORGANIZING MAPS 350
6. EXPERIMENTAL ACTIVITY 350
7. CONCLUSIONS 351
REFERENCES 351
CHAPTER 76. SELF-ORGANIZING FEATURE MAPS FOR APPRAISAL OF LAND VALUE OF SHORE PARCELS 352
1. Introduction 352
2. Self-organizing feature maps 352
3. Land parcels on the shores of lakes 353
4. Conceptual structure of the shore parcels 354
5. Conclusions 355
References 355
CHAPTER 77. FINITE ELEMENT MESHING USING KOHONEN'S SELF-ORGANIZING MAPS 356
1. INTRODUCTION 356
2. PRESENTATION OF FINITE ELEMENT MESHING 356
3. NEURAL MESHING 357
4. CONCLUSION 360
REFERENCES 360
CHAPTER 78. HENAMnet: HOMOGENEOUS ENCODING FOR APPROXIMATION OF MAPPINGS 362
1 Introduction 362
2 Encoding Procedure 362
3 Decoding Procedure 363
4 Connecting different subnets 364
5 Learning 364
6 Simulations 365
7 Conclusions 365
References 365
CHAPTER 79. SELF-ORGANISING FEATURE MAPS FOR CURSIVE SCRIPT RECOGNITION 366
1. Introduction 366
2. Self-Organising Allographic Networks 366
3. Recognition 367
References 367
CHAPTER 80. ENHANCED MAPPING: AN EXTENSION OF TOPOLOGICAL MAPPING TO FORM INTERNAL REPRESENTATIONS AND SPATIAL MAPPINGS 370
1/ Introduction 370
2/ Dynamics (Quantitative). Projection Artifacts and Distribution Artifacts 371
3/ Topological Mapping 373
4/ A cascade of two Enhanced Maps 373
5/ Conclusions 375
Acknowledgement 375
References 375
CHAPTER 81. DVQ: DYNAMIC VECTOR QUANTIZATION - AN INCREMENTAL LVQ 376
1- INTRODUCTION 376
2- SYNTHETIC DATA 376
3- SPEECH DATABASE AND FEATURE EXTRACTION 377
4- CLASSIFIERS 377
5- EXPERIMENTS AND RESULTS 378
6- CONCLUSION 379
References 379
CHAPTER 82. MONITORING OF INPUT SIGNALS SUBSPACE LOCATION IN SENSORY SPACE BY NEURONET INNER LAYER NEURONS THRESHOLD VALUE ADAPTATION 380
1. SINGLE CORRESPONDENCE REQUIREMENT 380
2. THRESHOLD NEURON MODEL 380
3. ANSWER AREAS IN SENSORY SPACE 381
4. PARALLEL SHIFT OF BOUNDARIES 381
5. INPUT SIGNAL SUBSPACE 382
6. THRESHOLDS ADAPTATION PROCESS 382
7. CONCLUSIONS 383
References 383
CHAPTER 83. UNSUPERVISED CLUSTERING OF PROTEINS 384
REFERENCES 387
CHAPTER 84. An Approach to the Application of Dedicated Neural Network Hardware for Real Time Image Compression 388
1 Introduction 388
2 The binary associative memory 389
3 The Kohonen Feature Map 389
4 The Simulation System 390
5 Simulation Results 390
6 Conclusion and Future Work 391
Acknowledgement 391
References 391
CHAPTER 85. A SELF-ORGANIZING UPDATING NETWORK 392
1. Introduction 392
2. Scheduling by Edge Reversal (SER) 392
3. The Theory of Neuronal Group Selection (TNGS) 393
4. An Updating Network 394
5. Conclusions 395
References 395
CHAPTER 86. A LEARNING ALGORITHM WITH MULTIPLE CRITERIA FOR SELF-ORGANIZING FEATURE MAPS 396
1. INTRODUCTION 396
2. ALGORITHM DEVELOPMENT 396
3. EXPERIMENTS AND RESULTS 398
4. CONCLUSION 398
ACKNOWLEDGEMENT 398
REFERENCES 398
CHAPTER 87. THE HYPERMAP ARCHITECTURE 400
1. Introduction 400
2. The two-phase recognition algorithm 401
3. Application example: recognition of phonemes 403
4. Conclusions 403
References 403
Part 9: Neural Knowledge Data Bases and Non-Rule-Based Decision Making 404
CHAPTER 88. PERFORMANCE EVALUATION OF EXTENDED BACKPROPAGATION RULE FOR GENERATING NETWORKS OF CONNECTIONIST EXPERT SYSTEMS 406
1. INTRODUCTION 406
2. EXTENDED BACKPROPAGATION RULE 406
3. PERFORMANCE EVALUATION OF EXTENDED BACKPROPAGATION WITH DIFFERENT PARAMETERS 407
4. DISCUSSION AND ANALYSIS OF RESULTS 408
REFERENCES 409
CHAPTER 89. Concept Randomness and Neural Networks 410
1 Introduction 410
2 Sparse, continuous random problems 411
3 Probabilistic approximations for recognition of SRC concepts 412
4 Random neural networks (RNN) 412
References 413
CHAPTER 90. THE EFFECT OF LOW-LEVEL AIR POLLUTION AND WEATHER ON ASTHMA AND CHRONIC BRONCHITIS PATIENTS, STUDIED BY NEURAL NETWORK METHODS 414
1. PROBLEM AND APPROACH 414
TRAINING OF THE NETWORK 415
RESULTS AND DISCUSSION 416
REFERENCES 417
CHAPTER 91. A NEURAL NETWORK THAT LEARNS TO DO HYPHENATION 418
Introduction 418
1. The Network Architecture 418
2. Choosing Word Bases 419
3. Learning to Hyphenate the Training Words 419
4. Testing the Network with Unknown Words 420
5. Effects of the Hidden Layer Size 421
6. Conclusions 421
References 421
CHAPTER 92. PRIME NUMBERS: A WAY TO DISTRIBUTE SYMBOLIC KNOWLEDGE OVER NEURAL NETS 422
1. INTRODUCTION 422
2. PRIME NUMBERS, PROPOSITIONAL LOGIC AND CONNECTIONISM 422
3. MULTIPLE SITE NEURONS FOR DISTRIBUTING SYMBOLIC KNOWLEDGE 424
4. A DISTRIBUTED STRUCTURE FOR A CONNECTIONIST PRODUCTION SYSTEM 424
5. CONCLUSION 425
REFERENCES 425
Part 10: Biological and Physiological Connection 426
CHAPTER 93. PATTERN RECOGNITION OF HOARSE AND HEALTHY VOICES BY THE SELF-ORGANIZING MAP 428
1. INTRODUCTION 428
2. METHODS 428
3. RESULTS 429
4. DISCUSSION 431
ACKNOWLEDGEMENT 431
REFERENCES 431
CHAPTER 94. LAYERED SELF-ADAPTIVE NEURAL NETWORK APPROACH TO EARLY VISUAL INFORMATION PROCESSING 432
1. INTRODUCTION 432
2. FRAMEWORK 432
3. RESULTS 433
4. DISCUSSION 435
ACKNOWLEDGEMENTS 435
REFERENCES 435
CHAPTER 95. A NEURAL NETWORK FOR VISUAL MOTION DETECTION THAT CAN EXPLAIN PSYCHOPHYSICAL AND PHYSIOLOGICAL PHENOMENA 436
INTRODUCTION 436
STRUCTURE OF THE MODEL 436
BEHAVIOR OF THE MODEL 437
SIMULATION 438
REFERENCES 439
CHAPTER 96. A HIGH DEGREE OF NOISE TOLERANCE IN HUMAN VISUAL FLOW DISCRIMINATION 440
1. INTRODUCTION 440
2. METHODS 441
3. RESULTS AND DISCUSSIONS 442
CHAPTER 97. RESPONSE OF DIRECTIONALLY SELECTIVE CELLS OF THE MACAQUE DORSAL MST AREA TO VISUAL FLOW WITH DIRECTIONAL NOISE AND ITS RELATION TO THE NOISE TOLERANCE IN HUMAN VISUAL FLOW DISCRIMINATION 444
1. INTRODUCTION 444
2. METHODS 445
3. RESULTS AND DISCUSSIONS 445
REFERENCES 446
CHAPTER 98. A DYNAMIC INDUCTION PROCESS FOR LONG-TERM POTENTIATION IN HIPPOCAMPUS STUDIED BY TEMPORAL PATTERN STIMULATION 448
Methods 448
Results and Discussion 449
CHAPTER 99. INFORMATION PROCESSING AND LEARNING IN THE OLFACTORY BULB 452
1. Introduction 452
2. Neurobiological background 453
3. The model 453
4. Simulation results 454
5. Discussions 455
Acknowledgement 455
References 455
CHAPTER 100. PATCH-CLAMP STUDIES OF CULTURED OLFACTORY BULB NEURONS 458
1. INTRODUCTION 458
2. MATERIALS AND METHODS 459
3. RESULTS 459
4. CONCLUSIONS 461
ACKNOWLEDGEMENTS 461
REFERENCES 461
CHAPTER 101. STIMULUS-INDUCED NEURAL SYNCHRONIZATIONS 462
References 465
A FORMALIZATION OF NEISSER'S MODEL 466
1. INTRODUCTION 466
2. NEISSER'S MODEL 466
3. THE MODEL 467
ACKNOWLEDGEMENTS 469
REFERENCES 469
CHAPTER 102. MODEL EQUATIONS FOR PASSIVE DENDRITIC INTEGRATION IN SPATIALLY COMPLEX NEURONS 470
1. INTRODUCTION 470
2. THE NONUNIFORM EQUIVALENT CABLE MODEL 470
3. ANALYSIS OF THE NONUNIFORM CABLE EQUATION 472
4. BRANCHING CONDITION FOR DENDRITIC TREES 473
REFERENCES 473
CHAPTER 103. EXAMPLES OF REALISTIC NEURAL NETWORK SIMULATIONS 474
Abstract 474
Introduction 474
Simulations 474
Conclusions and Future Research 476
References 477
CHAPTER 104. CRITICAL DEPENDENCE OF NEURAL NETWORKS PROCESSING ON BETWEEN NEURON DELAYS 478
Abstract 478
1. INTRODUCTION 478
2. THEORETICAL CONSIDERATIONS 478
3. METHODS 479
4. RESULTS 479
5. DISCUSSION 480
6. REFERENCES 481
CHAPTER 105. TOOLS FOR BIOLOGICALLY REALISTIC SIMULATIONS OF NEURAL NETWORKS 482
Abstract 482
Introduction 482
The Model Neuron 483
The Specification Language 484
Conclusions and Future Work 484
Acknowledgements 485
References 485
Chapter 106. 486
1 Introduction 486
2 Network models of classical conditioning 486
3 The extended drive-reinforcement model 487
4 The network description 488
5 Learning the OR, AND, and XOR functions 489
6 Conclusion 489
References 489
CHAPTER 107. FUNCTIONAL POINTS OF CONTROL IN BIOLOGICAL NEURAL NETWORKS 490
Abstract 490
0 Introduction 490
1 Node Factors 490
2 Node Factor Functions 491
3 Node State 492
4 Network Composition 493
Conclusion and Future Work 493
References 493
CHAPTER 108. USING PARAMETRIC CONTROLLED STRUCTURED NETWORKS TO APPROACH NEURAL NETWORKS TO NEUROSCIENCE 494
Abstract 494
Neural Networks and Neurosciences 494
PARAMETRIC CONTROLLED STRUCTURED NETWORKS 495
References 497
Part 11: Software Development 498
CHAPTER 109. THE HEBB RULE IMPLEMENTATION IN A UNIVERSAL NEURAL NET 500
1. INTRODUCTION 500
2. THE NEURAL NETWORK 500
3. THE HEBBIAN RULE 501
4. THE CONDITIONS TO IMPLEMENT THE HEBBIAN RULE 502
REFERENCES 503
CHAPTER 110. NEMESYS - NEURAL MODELLING SYSTEM FOR WEIGHTLESS NETS 504
Abstract 504
1. Introduction 504
2. Design methodology 504
3. Conclusion 506
References 506
CHAPTER 111. EXPERIMENTS WITH PARALLEL BACKPROPAGATION ON A HYPERCUBE PARALLEL PROCESSOR SYSTEM 508
1 INTRODUCTION 508
2 PARALLEL IMPLEMENTATION OF BACKPROPAGATION 508
3 EXPERIMENTAL RESULTS 509
4 CONCLUSIONS 511
5 ACKNOWLEDGEMENTS 511
Chapter 112. Amalgamating the Neural and Logic Computing Paradigms 512
1. INTRODUCTION 512
2. A LOGICAL MODEL OR INTERPRETATION FOR NEURAL NETS 513
3. EMULATING NEURAL NETS IN CONCURRENT PROLOG 514
5. SUMMARY AND DISCUSSION 515
REFERENCES 515
CHAPTER 113. Implementation of the SANS algorithm on the CM2 516
Abstract 516
Introduction 516
Description of the SANS algorithm 516
The test problem 517
The actual implementations 517
Conclusion 519
Acknowledgements 519
References 519
CHAPTER 114. BIOSIM—A PROGRAM FOR BIOLOGICALLY REALISTIC NEURAL NETWORK SIMULATIONS ON THE CONNECTION MACHINE 520
Abstract 520
Introduction 520
Implementation 521
Size and speed 522
Discussion 523
Acknowledgments 523
References 523
CHAPTER 115. 524
Abstract 524
1. Neural nets 524
2. PLATO 525
3. A piecewise linear model of a Hopfield neuron 525
4. A new neural net simulator 526
5. Results 526
6. Conclusions 527
7. References 527
CHAPTER 116. FUNCTIONAL SPECIFICATION OF A NEURAL NETWORK 528
Abstract 528
Introduction 528
Functional Specification of a Back-Propagation Network 528
Network Structure 529
Network Behaviour 529
Discussion 530
Future Work 531
References 531
CHAPTER 117. FULLY CONNECTED NEURAL NETWORKS: SIMULATION ON MASSIVELY PARALLEL COMPUTERS 532
1. INTRODUCTION 532
2. PARALLEL SOLUTION AND PARALLEL SIMULATORS 533
4. CONCLUSIONS AND FUTURE WORK 535
REFERENCES 535
CHAPTER 118. PARALLEL LEARNING STRATEGIES ON THE EMMA-2 MULTIPROCESSOR 536
ABSTRACT 536
1. INTRODUCTION 536
2. MAPPING STRATEGIES ON EMMA-2 537
3. EFFICIENCY OF THE MLP IMPLEMENTATION 538
4. EXPERIMENTAL RESULTS AND DISCUSSION 539
References 539
CHAPTER 119. EFFECTIVE NEURAL NETWORK MODELING IN C 540
1. INTRODUCTION 540
2. SOFTWARE DESIGN 540
3. OBJECTS AND METHODS 541
4. TABLE DISPATCH 543
5. CONCLUSION 543
Part 12: Neural Network Architectures and Algorithms II 544
CHAPTER 120. SPEEDING-UP BACKPROPAGATION BY DATA ORTHONORMALIZATION 546
1. Introduction 546
2. A Distributed Decorrelation Algorithm 547
3. Performance Evaluation 547
4. Conclusions 549
References 549
CHAPTER 121. DETERMINING WEIGHTS OF THE HOPFIELD NEURAL NETWORKS 550
1. INTRODUCTION 550
2. THE HOPFIELD MODEL 550
3. DETERMINATION OF WEIGHTS 551
4. TRAVELING SALESMAN PROBLEM 551
5. NUMERICAL CALCULATIONS 552
6. DISCUSSIONS OF RESULTS 552
REFERENCES 553
CHAPTER 122. RECURRENT AND FEEDFORWARD BACKPROPAGATION: PERFORMANCE STUDIES 554
1 Introduction 554
2 The Underlying Theory 554
3 Architecture and Topology of the unified description 555
4 Performance Comparisons 555
5 Acknowledgements 557
References 557
CHAPTER 123. THE NORMALIZED BACKPROPAGATION AND SOME EXPERIMENTS ON SPEECH RECOGNITION 558
ABSTRACT 558
1. INTRODUCTION 558
2. DESCRIPTION OF THE NORMALIZED BACKPROPAGATION ALGORITHM 559
3. DESCRIPTION OF THE EXPERIMENT 560
4. BRIEF DESCRIPTION OF THE ALGORITHMS 560
5. RESULTS OF THE EXPERIMENTS 560
6. CONCLUSIONS 561
REFERENCES 561
Chapter 124. An optimum weights initialization for improving scaling relationships in BP learning 562
1 Brief description of the AMBP algorithm 562
2 Computer simulation 563
References 565
Part 13: Hardware Implementations 566
CHAPTER 125. LIQUID CRYSTAL OPTICAL RECTIFICATION FOR SIMULATION OF VISUAL EXCITATION AND POTENTIAL APPLICATION TO PATTERN RECOGNITION 568
1. INTRODUCTION 568
2. PHYSICAL MECHANISM OF LC-SVE 568
3. APPLICATION TO PATTERN RECOGNITION 570
4. CONCLUSION 571
ACKNOWLEDGMENT 571
REFERENCES 571
CHAPTER 126. OPTICAL TAG NEURAL NETWORKS FOR LARGE-SCALE IMPLEMENTATION 572
1. INTRODUCTION 572
2. TAG NEURAL NETWORK MODEL 572
3. OPTICAL IMPLEMENTATION 573
4. CONCLUSION 574
Acknowledgement 574
References 575
CHAPTER 127. PHOTOREFRACTIVE CRYSTAL WAVEGUIDES FOR OPTICAL NEURAL NETWORKS 576
I. INTRODUCTION 576
II. APPLICATION OF PCW ARRAY TO OPTICAL NEURAL NETWORKS 576
III. HOLOGRAPHIC STORAGE AND RECONSTRUCTION IN PCW 577
IV. WAVEGUIDE PHASE CONJUGATE MIRROR 579
V. CONCLUSION 579
REFERENCES 579
CHAPTER 128. TRANSPUTER IMPLEMENTATIONS OF NEURAL NETWORKS: AN ANALYSIS 580
1. INTRODUCTION 580
2. PERFORMANCE ANALYSIS 580
3. CONCLUSION 583
References 583
CHAPTER 128. SELF-ORGANIZING LOGIC USING SIMULATED ANNEALING 584
1. Introduction 584
2. Architecture 585
3. Teaching 586
4. Simulated Annealing 586
5. Realization 587
6. Conclusions 587
References 587
CHAPTER 129. A CLASSIFIER CIRCUIT BASED ON AN EXPONENTIAL-CAPACITY ASSOCIATIVE MEMORY 588
1. Introduction to the ECAM 588
2. The New Classifier Chip 589
3. Conclusions 591
Acknowledgement 591
References 591
CHAPTER 130. A METHOD FOR DESIGNING SYSTOLIC ARCHITECTURES FOR MODELLING SPATIOTEMPORAL PROPERTIES OF NEURONS USING DOMAIN DECOMPOSITION 592
1 INTRODUCTION 592
2 MODELING SPATIO-TEMPORAL PROPERTIES OF NEURONS 592
3 A SYSTOLIC METHOD FOR SOLVING A FINITE DIFFERENCE APPROXIMATION OF THE CABLE EQUATION 593
4 DESIGNING ARRAY ARCHITECTURES 593
5 SIMULATION EXPERIMENTS 594
6 DISCUSSION 595
REFERENCES 595
Chapter 131. A Coded Block Adaptive Neural Network Structure for pattern recognition VLSI 596
Abstract 596
Summary 596
Acknowledgments 597
References 597
CHAPTER 132. FAULT TOLERANCE IN ANALOG NEURAL NETWORKS 600
1. INTRODUCTION 600
2. LOGIC AND ANALOGIC MODELS OF FAILURES 600
3. REDUNDANCY AND FAULT TOLERANCE 601
4. CONCLUSION 602
5. REFERENCES 602
CHAPTER 133. DIGITAL VLSI ARCHITECTURE OF BACKPROPAGATION ALGORITHM WITH ON-CHIP LEARNING 604
1. INTRODUCTION 604
2. BASIC STRUCTURES 605
3. ARCHITECTURE AND IMPLEMENTATION 606
4. CONCLUSIONS 607
REFERENCES 607
CHAPTER 134. A DIGITAL SIGNAL PROCESSOR FOR SIMULATING BACK-PROPAGATION NETWORKS 608
INTRODUCTION 608
DATA PRECISION 609
THE ARCHITECTURE OF THE BP-DSP 610
THE PIPELINED EXECUTION OF ALGORITHMS 610
RESULTS 611
REFERENCES 611
CHAPTER 135. VLSI-IMPLEMENTATION OF A PULSE-DENSITY-MODULATED NEURAL NETWORK FOR PC-CONTROLLED COMPUTING ENVIRONMENT 612
1. INTRODUCTION 612
2. CONCEPT OF PULSE-DENSITY-MODULATION 612
3. VLSI IMPLEMENTATION 613
4. THE COMPLETE NEURAL PROCESSING SYSTEM 614
5. CONCLUSIONS 615
REFERENCES 615
CHAPTER 136. A SERIAL-UPDATE VLSI ARCHITECTURE FOR THE LEARNING PROBABILISTIC RAM NEURON 616
1. Introduction 616
2. The pRAM model 616
3. Supervised Learning 617
4. Reinforcement Training 617
5. Hardware Learning 618
6. Serial Architecture 618
7. Expansion beyond 128 neurons 619
8. Conclusion 619
ACKNOWLEDGEMENTS 619
REFERENCES 619
CHAPTER 137. 620
I. INTRODUCTION 620
II. FULLY PROGRAMMABLE ANALOGUE SYNAPSES 620
III. CONCLUSION/DISCUSSION 621
REFERENCES 622
CHAPTER 138. VLSI-IMPLEMENTATION OF A PROGRAMMABLE DUAL COMPUTING CELLULAR NEURAL NETWORK PROCESSOR 624
1. INTRODUCTION 624
2. A PROGRAMMABLE ANALOG CNN CORE PROCESSOR 624
3. A CONTROL-CHIP FOR ANALOG ARRAY PROCESSORS AND PROGRAMMABLE CELLULAR NEURAL NETWORKS 625
4. CONCLUSION 626
REFERENCES 627
Part 14: Pattern Recognition and Signal Processing II 628
CHAPTER 139. FREQUENCY DIFFERENCE SPECTRA AND THEIR USE IN ACOUSTIC PROCESSING 630
Abstract 630
1. INTRODUCTION 630
2. THEORETICAL FRAME 630
CONCLUSIONS 633
BIBLIOGRAPHY 633
CHAPTER 140. TIME-DEPENDENT SELF-ORGANIZING MAPS FOR SPEECH RECOGNITION 634
1. INTRODUCTION 634
2. TIME-DEPENDENT SELF-ORGANIZING MAPS 634
3. SPEECH RECOGNITION SYSTEM SETUP 635
4. SPEECH RECOGNITION ACCURACY 636
5. CONCLUSIONS 636
References 637
CHAPTER 141. RECURRENT NEURAL NETWORKS AS PHONEME SPOTTERS 638
1. INTRODUCTION 638
2. GOALS OF THE EXPERIMENT 638
3. PREPROCESSING AND NETWORK STRUCTURE 639
4. EXPERIMENTS AND RESULTS 640
5. CONCLUSIONS 641
References 641
CHAPTER 142. PHONEME CLASSIFICATION BASED ON THE 2-DIMENSIONAL FORMANT DETECTION 642
1. INTRODUCTION 642
2. GAUSSIAN KERNEL 642
3. PHONEME CLASSIFICATION PROCESSOR 643
4. EXPERIMENTS 645
5. REMARKS 645
REFERENCES 646
CHAPTER 143. A SEQUENTIAL NEURAL NETWORK MODEL FOR LEXICAL DECODING IN CONTINUOUS SPEECH RECOGNITION 648
1. INTRODUCTION 648
2. ELMAN'S MODEL 649
3. ADAPTATION OF ELMAN'S MODEL TO THE LEXICAL DECODING PROBLEM 649
4. EXPERIMENTAL RESULTS ON THREE SIMILAR PATTERNS 649
5. EXPERIMENTAL RESULTS ON A SIMPLE SENTENCE 650
6. CONCLUSION 651
REFERENCES 651
CHAPTER 144. RECOGNITION OF CORRUPTED LEXICAL PATTERNS 652
1. INTRODUCTION 652
2. VOCABULARY AND KINDS OF ERROR 652
3. ARCHITECTURE AND TRAINING OF THE NETWORK 653
4. INPUT AND OUTPUT REPRESENTATIONS 653
5. LEARNING PARAMETERS 654
6. EXPERIMENTS AND RESULTS 654
7. CONCLUSIONS 655
REFERENCES 655
CHAPTER 145. PHONEME RECOGNITION USING ARTIFICIAL NEURAL NETWORKS 656
1 INTRODUCTION 656
2 SPEECH MATERIAL AND DESCRIPTION OF THE USED NETWORKS 656
3 RESULTS 657
4 CONCLUSIONS 659
ACKNOWLEDGEMENTS 659
REFERENCES 659
CHAPTER 146. CLASSIFICATION OF FLOW PATTERNS IN TWO PHASE FLOW BY NEURAL NETWORK 660
1. INTRODUCTION 660
2. DEFINITION OF THE PROBLEM 660
3. NEURAL NETWORK 661
4. HYBRID ALGORITHM 662
5. IDENTIFICATION OF FLOWS 662
REFERENCES 663
CHAPTER 147. EMG DIAGNOSIS USING THE CONJUGATE GRADIENT BACKPROPAGATION NEURAL NETWORK LEARNING ALGORITHM 664
ABSTRACT 664
INTRODUCTION 664
THE CONJUGATE GRADIENT LEARNING ALGORITHM 664
RESULTS AND DISCUSSION 666
REFERENCES 667
CHAPTER 148. GENETICS-BASED-MACHINE-LEARNING IN CLINICAL ELECTROMYOGRAPHY 668
ABSTRACT 668
INTRODUCTION 668
METHODS 668
RESULTS AND DISCUSSION 670
REFERENCES 671
CHAPTER 149. SELECTION OF CHARACTERISTIC PATTERNS IN MAGNETIC AND ELECTRIC ENCEPHALOGRAMS USING A NEURAL NETWORK 672
1. INTRODUCTION 672
2. EXPERIMENTAL PROCEDURES 672
3. CLASSIFICATION SCHEME 673
4. RESULTS 675
CHAPTER 150. A METHOD OF SPATIAL SPECTRUM ESTIMATION USING NEURAL NETWORKS 676
1 INTRODUCTION 676
2 MODEL OF ARRAY SIGNALS 676
3 COMPUTE THE EIGENVALUE/EIGENVECTOR BY APEX 677
4 DIRECTION FINDING 677
5 DETERMINE THE NUMBER OF SOURCES 678
6 CONCLUSION 679
Reference 679
CHAPTER 151. FREQUENCY ESTIMATION BY A HEBBIAN SUBSPACE LEARNING ALGORITHM 680
Abstract 680
1 Introduction 680
2 The learning algorithm 680
3 Stability considerations 681
4 Sinusoidal frequency estimation 682
5 Concluding remarks 683
References 683
CHAPTER 152. NEURAL NETWORKS FOR PROSODY CONTROL IN SPEECH SYNTHESIS 684
1. INTRODUCTION 684
2. TRADITIONAL SPEECH SYNTHESIS BY RULE 684
3. NEURAL NETS FOR THE GENERATION OF CONTROL PARAMETERS 685
4. PHONEME DURATIONS BY NEURAL NETS 685
5. INTONATION AND STRESS CONTROL BY NEURAL NETS 687
6. SUMMARY AND CONCLUSIONS 687
REFERENCES 687
CHAPTER 153. A SIMPLE LOOK-UP PROCEDURE SUPERIOR TO NETTALK? 688
1 INTRODUCTION 688
2 A SIMPLE LOOK-UP PROCEDURE 689
3 THE PERFORMANCE OF THE LOOK-UP PROCEDURE 690
4 THE PERFORMANCE OF NETTALK 690
5 EVALUATION AND CONCLUSION 691
ACKNOWLEDGEMENTS 691
REFERENCES 691
CHAPTER 154. COMPARISON AND COOPERATION OF SEVERAL CLASSIFIERS 692
1 INTRODUCTION 692
2 A MULTI-SPEAKER ISOLATED WORD RECOGNITION PROBLEM 692
3 COMPARISON OF SEVERAL METHODS 693
4 A SIMPLE SUB-OPTIMAL COOPERATION METHOD: FEATURE EXTRACTION 694
5 OPTIMAL COOPERATION WITH DYNAMIC PROGRAMMING 695
6 CONCLUSION 696
ACKNOWLEDGEMENT 696
REFERENCES 696
CHAPTER 155. NEURAL NETS AND TASK DECOMPOSITION 698
1. INTRODUCTION 698
2. DECOMPOSITION OF THE TASK AND INTRODUCTION OF MODULARITY IN NEURAL NETS 698
3. RESULTS 700
4. CONCLUSION 701
REFERENCES 701
Part 15: Commercial and Industrial Applications 702
CHAPTER 156. CONSTRUCTION OF CONTIGUOUS DNA SEQUENCES FROM N-TUPLE CONTENT USING GENERALIZING RAM NEURON MODEL 704
Abstract 704
1. Introduction 704
2. The G-RAM 704
4. Simulation and Results 707
5. Conclusion 707
REFERENCES 707
CHAPTER 157. A HYBRID NEURAL NET/KNOWLEDGE BASED APPROACH TO EEG ANALYSIS 708
1. INTRODUCTION 708
2. DESCRIPTION OF THE HYBRID SYSTEM 708
3. RESULTS 710
4. CONCLUSION 711
REFERENCES 711
CHAPTER 158. TRAFFIC MONITORING WITH WISARD AND PROBABILISTIC LOGIC NODES 712
1. INTRODUCTION AND PREVIOUS WORK 712
2. SIMULATION DETAILS 713
3. RESULTS 714
4. DISCUSSION 715
5. REFERENCES 715
CHAPTER 159. REALTIME ECG DATA COMPRESSION USING DUAL THREE LAYERED NEURAL NETWORKS FOR DIGITAL HOLTER MONITOR 716
ABSTRACT 716
1. Introduction 716
2. Data Compression by Neural Networks 716
3. Evaluation of Performance 718
4. Conclusion 719
REFERENCES 719
CHAPTER 160. PERFORMANCE EVALUATION OF SELF-ORGANIZING MAP BASED NEURAL EQUALIZERS IN DYNAMIC DISCRETE-SIGNAL DETECTION 720
1. INTRODUCTION 720
2. ADAPTIVE EQUALIZATION BASED ON SELF-ORGANIZING MAPS 720
3. COMBINING LINEAR EQUALIZATION AND SELF-ORGANIZING ADAPTATION 721
4. PERFORMANCE EVALUATION OF THE NEURAL EQUALIZERS 721
5. DISCUSSION 722
ACKNOWLEDGEMENT 723
REFERENCES 723
Chapter 161. A Hopfield-like structure to extract road boundaries from a road image 724
Abstract 724
1 Introduction 724
2 Extracting Road Boundaries 724
3 Conclusions 727
References 727
Chapter 162. The Impact of the Learning-Set Size in Handwritten-Digit Recognition 728
1. Introduction 728
2. Statistical Classifiers 729
3. Results 730
4. Conclusion 732
References 732
CHAPTER 163. A KIND OF GENERALIZED HOPFIELD CONTINUOUS MODEL AND ITS APPLICATION TO THE OPTIMAL DISTRIBUTION OF REACTIVE POWER SOURCES IN POWER SYSTEMS 734
1. INTRODUCTION 734
2. A BRIEF INTRODUCTION TO HCM 734
3. GENERALIZED HOPFIELD CONTINUOUS MODEL (GHCM) 735
3. THE OPTIMAL DISTRIBUTION OF REACTIVE POWER SOURCES (ODRPS) IN POWER SYSTEMS 736
4. RESULTS 737
5. CONCLUSIONS 737
REFERENCES 737
CHAPTER 164. SIGNIFICANT VARIABLE PAIRS WHICH AFFECT NEURAL NETWORK DERIVED PERSONNEL SCREENING SCORES 738
1. INTRODUCTION 738
2. Sensitivity Studies of Paired Variables 738
3. Basic Ideas About Layered Neural Network Models 739
4. Neural Network Architecture 739
5. Data Preparation and Training Procedure 739
6. Computer Experiments with the Trained Network 739
7. Significance Test Used for Identifying Strongly Interacting Pairs of Variables 739
8. Pairs Identified by Linear Classifier with Values in Proxy Variables 740
9. Variables Interacting in Neural Nets Trained on Individual Sales Performance 741
10. Conclusions 741
11. References 741
CHAPTER 165. APPLICATION OF THE SELF-ORGANISING FEATURE MAP AND LEARNING VECTOR QUANTISATION TO RADAR CLUTTER CLASSIFICATION 742
1. INTRODUCTION 742
2. CLASSIFICATION EXPERIMENTS 742
3. TOPOLOGICAL STRUCTURE OF DATA 743
4. SUMMARY 745
REFERENCES 745
CHAPTER 166. APPLICATIONS OF NEURAL NETWORKS IN CONTROL TECHNOLOGY 746
1. INTRODUCTION 746
2. WELD CONTROL 746
3. INTELLIGENT NDT 748
4. CONCLUSIONS 749
REFERENCES 749
CHAPTER 167. FINANCIAL DATA RECOGNITION AND PREDICTION USING NEURAL NETWORKS 752
1 FINANCIAL MARKETS 752
2 THE M.L.P. SOLUTION 753
3 KOHONEN SELF ORGANISING FEATURE MAPS 754
4 KOHONEN AND M.L.P. 755
5 CONCLUSIONS 755
Part 16: Neural Models for Cognitive Science and High-Level Brain Functions 756
Chapter 168. Learning to Classify Natural Language Titles in a Recurrent Connectionist Model 758
Abstract 758
1 Introduction 758
2 Learning Library Titles in a Recurrent Connectionist Network 758
3 Analysis of the Learned Internal Representation 760
4 Conclusion 761
Acknowledgements 761
References 761
CHAPTER 170. THE REPRESENTATION OF ABSTRACT ENTITIES IN ARTIFICIAL NEURAL NETWORKS 762
1. INTRODUCTION 762
2. A FRAME FOR CONCEPTUAL MODELLING 763
3. THE CAUSAL THEORY OF REFERENCE 763
4. FUNCTANTS AND THEIR ROLE IN CONCEPTUAL PROCESSES 764
References 765
CHAPTER 171. A NETWORK-MODEL FOR BINOCULAR INTERACTION IN THE THALAMO-CORTICAL FEEDBACK-SYSTEM 766
1. INTRODUCTION 766
2. MODEL 766
3. RESULTS 767
ACKNOWLEDGEMENT 769
References 769
CHAPTER 172. A NEURAL NET MODEL OF SPELLING DEVELOPMENT 770
Introduction 770
A connectionist model of spelling development 771
Testing the model's predictions: Normal spelling development 772
References 773
CHAPTER 173. NEURAL NETS FOR INTELLIGENT TUTORING SYSTEMS 774
Abstract 774
1. Introduction: intelligent tutoring systems 774
2. Neural networks for ITS 774
3. Modelling a student who is solving physics problems 775
4. Conclusions 777
Acknowledgments 777
References 777
CHAPTER 174. AUTONOMOUS CONTROL OF SELECTIVE ATTENTION: SCAN ARCHITECTURES 778
1. Introduction 778
2. Shifter-circuit architecture 778
3. Competitively coupled layers 779
4. Autonomous control of selective attention in SCAN 780
5. Discussion 781
Acknowledgments 781
References 781
CHAPTER 175. SPATIOTEMPORAL CORRELATION IN THE CEREBELLUM 782
Introduction 782
Temporally organised information 782
Simulations of the mechanism 784
Conclusions 785
Acknowledgments 785
References 785
CHAPTER 176. HARMONY THEORY NETWORKS FOR SCENE ANALYSIS 786
1. INTRODUCTION - TRADITIONAL A.I. SCENE ANALYSIS 786
2. THE P.D.P. APPROACH TO A.I. CONSTRAINT PROPAGATION 786
3. RESULTS 788
4. CONCLUSIONS 789
REFERENCES 789
CHAPTER 177. MODELING HEBBIAN CELL ASSEMBLIES COMPRISED OF CORTICAL NEURONS 790
Introduction 790
Cell model and network architecture 791
Simulation results 791
Discussion 792
Conclusions 793
Acknowledgement 793
References 793
Chapter 178. Recurrent Kohonen Self-Organization in Natural Language Processing 794
1. Introduction 794
2. Formal Description of the Model 795
3. Simulation Results 796
4. Discussion and Conclusions 797
Acknowledgements 797
References 797
Chapter 179. A Multimodal Model of Cognition - Neural Networks and Cognitive Modeling 798
1 Introduction & Some Methodological Remarks
2 The Structure of the Simulation 798
Conclusion 801
References 801
CHAPTER 180. LABOUR, CONSUMPTION AND FAMILY ASSETS: A NEURAL NETWORK LEARNING FROM ITS OWN CROSS-TARGETS 802
1. INTRODUCTION 802
2. THE ECONOMIC SUBJECT MODEL 802
3. THE LEARNING AND ACTING ALGORITHM 803
4. A CONSISTENT BEHAVIOUR 804
5. DISCOVERING ECONOMIC REGULARITIES 804
6. FUTURE IMPROVEMENTS 805
REFERENCES 805
CHAPTER 181. NEURAL NETWORKS, GENETIC ALGORITHMS AND STOCK TRADING 806
1. INTRODUCTION 806
2. THE STOCK TRADER NETWORK 806
3. THE ENVIRONMENT 806
4. THE GENETIC ALGORITHM 807
5. THE HIDDEN NEURONS PROBLEM 808
6. FUTURE IMPROVEMENTS AND CONCLUDING REMARKS 809
REFERENCES 809
CHAPTER 182. ACOUSTIC ILLUSIONS: EXPECTATION DIRECTED FILTERING IN THE HUMAN AUDITORY SYSTEM 810
1. INTRODUCTION 810
2. EXPECTATION DRIVEN PREPROCESSING 811
3. EXPERIMENTAL RESULTS 811
4. SPEECH PROCESSING 812
5. BIOLOGICAL SPEECH RECOGNITION 812
6. CONCLUSION 813
References 813
Chapter 183. Synchronization of Spikes in Populations of Laterally Coupled Model Neurons 814
1. INTRODUCTION 814
2. THE MODEL NEURON 814
3. SYNCHRONIZATION OF AXONAL IMPULSE GENERATION 815
4. RESULTS 816
5. DISCUSSION 817
REFERENCES 817
Part 17: Neural Network Architectures and Algorithms III 818
CHAPTER 184. CHARACTER RECOGNITION BY A NEURAL NETWORK WITH FUZZY PARTITIONING UNITS 820
ABSTRACT 820
1. INTRODUCTION 820
2. FUZZY PARTITIONING UNIT (FPU) 821
3. LAYERED NEURAL NETWORKS AND ERROR FUNCTIONS 821
4. COMPARISONS OF FOUR LEARNING ALGORITHMS 822
5. CONCLUSION 823
REFERENCES 823
Chapter 185. A Global Minimum Convergence Acceleration Technique Using a k-out-of-n Network Design Rule 824
1 Introduction 824
2 Performance of the Preliminary Model 824
3 Structural Analysis of the Model 824
4 Dual Phase Simulated Annealing Method 825
5 Results 826
References 826
CHAPTER 186. APPROXIMATION CAPABILITIES OF NEURAL NETWORKS USING SAMPLING FUNCTIONS 828
1 Introduction 828
2 The architecture and learning algorithm 828
3 Summary 831
CHAPTER 187. ADDITION AND SUBTRACTION IN NEURAL NETS AS RESULTS OF A LEARNING PROCESS 832
1. INTRODUCTION 832
2. THE MODEL 832
3. THE BEHAVIOUR OF AN ADDER NET 833
4. SUBTRACTOR 835
5. CONCLUDING REMARKS 835
ACKNOWLEDGEMENTS 835
REFERENCES 835
CHAPTER 188. GEOMETRICAL LEARNING IN A NETWORK OF AUTOMATA 836
1. INTRODUCTION 836
2. VORONOI TESSELLATION IN AN N-DIMENSIONAL SPACE 836
3. BUILDING THE NETWORK FROM THE DELAUNAY STRUCTURE 837
4. LEARNING ALGORITHM : BUILDING THE DELAUNAY STRUCTURE 838
5. DISCUSSION 839
REFERENCES 839
Chapter 189. A Neural Network for Solving Hamiltonian Cycle Problems 840
1 Introduction 840
2 Problem Representation and Transformation Rules 840
3 The minimization of the Energy Function 841
4 Experimental Results 842
References 843
Part 18: Late Papers 844
CHAPTER 189. "SHIFT-TOLERANT" LVQ2-BASED DIGITS RECOGNITION 846
1. THE VOCAL DATABASES 846
2. SYSTEM ARCHITECTURE 847
3. RESULTS 849
ACKNOWLEDGMENTS 850
BIBLIOGRAPHY 850
CHAPTER 190. ON THE MATHEMATICAL TREATMENT OF SELF-ORGANIZATION: EXTENSION OF SOME CLASSICAL RESULTS 852
1. Introduction: self-organizing basics 852
2. The topologically extended justification of the ordering of the weights 852
BIBLIOGRAPHY 855
Author Index 856
| Published (per publisher) | 28.6.2014 |
|---|---|
| Language | English |
| Subject area | Mathematics / Computer Science ► Computer Science ► Networks |
| | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| ISBN-10 | 1-4832-9800-0 / 1483298000 |
| ISBN-13 | 978-1-4832-9800-9 / 9781483298009 |