Advanced Neural Computers (eBook)
464 pages
Elsevier Science (publisher)
978-1-4832-9427-8 (ISBN)
This book is the outcome of the International Symposium on Neural Networks for Sensory and Motor Systems (NSMS) held in March 1990 in the FRG. The NSMS symposium assembled 45 invited experts from Europe, America and Japan representing the fields of Neuroinformatics, Computer Science, Computational Neuroscience, and Neuroscience. As a rapidly published report on the state of the art in Neural Computing, it forms a reference book for future research in this highly interdisciplinary field and should prove useful in the endeavor to transfer concepts of brain function and structure to novel neural computers with adaptive, dynamical neural net topologies.

A feature of the book is the completeness of the references provided: an alphabetical list of all references quoted in the papers is given, as well as a separate list of general references to help newcomers to the field. A subject index and author index also facilitate access to various details.
Front Cover 1
Advanced Neural Computers 4
Copyright Page 5
PREFACE 6
ACKNOWLEDGEMENT OF SPONSORSHIP 7
Table of Contents 8
Section 1: General Introduction 12
Chapter 1. Prolegomena to an Analysis of Form and Structure 14
I. INTRODUCTION 14
II. 16
III. 17
References 20
Chapter 2. The Truck Backer-Upper: An Example of Self-Learning in Neural Networks 22
1 Introduction 22
2 Training 23
3 Summary and Results 26
References 27
Chapter 3. DIE LERNMATRIX - THE BEGINNING OF ASSOCIATIVE MEMORIES 32
1. INTRODUCTION 32
2. BINARY LEARNING MATRIX 33
3. NON-BINARY LEARNING MATRIX 35
4. CIRCUITS WITH SEVERAL LEARNING MATRICES 37
5. APPLICATIONS 39
References 40
Section 2: Biological Sensory and Motor Systems 42
Chapter 4. MODEL OF VISUO-MOTOR TRANSFORMATIONS PERFORMED BY THE CEREBRAL CORTEX TO COMMAND ARM MOVEMENTS AT VISUAL TARGETS IN THE 3-D SPACE 44
I. BIOLOGICAL CONSTRAINTS 44
II. INVARIANT PROPERTIES OF MOTOR CORTICAL NEURONS 45
III. SENSORIMOTOR COMBINATIONS IN CORTICAL AREAS 46
IV. PROCESSING UNITS: MODEL OF THE CORTICAL COLUMN 47
V. CORTICAL REPRESENTATIONS OF MUSCLES AND 3-D SPACE 48
VI. MOVEMENT DEPENDENT MATCHING OF SENSORY INPUTS 49
VII. GENERALIZATION OF LEARNING 50
VIII. PREDICTIONS OF THE MODEL : INVARIANT PROPERTIES 51
Chapter 5. 54
1. INTRODUCTION: PHYSIOLOGICAL NETWORKS 54
2. DYNAMIC NETWORK MODELS 55
3. MANIPULATION OF HIDDEN UNITS 59
4. CONCLUDING COMMENTS 60
ACKNOWLEDGEMENTS 60
REFERENCES 60
Chapter 6. Neural Representation of Motor Synergies 62
1. Introduction 62
2. Motor Relaxation and Hopfield Networks 62
3. Behaviour of the Network 64
4. Simulation Results 66
References 67
Acknowledgments 67
Appendix: M-net of the finger 70
Chapter 7. Vestibular Head-Eye Coordination: a Geometrical Sensorimotor Neurocomputer Paradigm 72
1. Introduction 72
2. Vestibulo-Cerebellar Coordination as a Sensorimotor Neurocomputing Paradigm 73
3. Skeletomuscular Systems' Modeling: Biomechanics of Intrinsic Coordinates 77
4. Development of Neural Network Theory & Experimentation
5. Advanced Experimental Paradigms: Linear Vestibulo-Ocular Reflex Gaussian Testing
6. References 78
Chapter 8. THE REPRESENTATION OF INFORMATION IN THE TEMPORAL LOBE VISUAL CORTICAL AREAS OF MACAQUES 80
Ensemble encoding of object identity 81
The development of specificity of the neuronal responses 81
A Neuronal Representation Showing Invariance 82
An Object-Centered Representation of Visual Information 83
The functions of backprojections in neuronal networks in the neocortex in the storage and recall of visual memories 83
REFERENCES 87
Chapter 9. Exploration of a natural environment 90
1 Introduction 90
2 The system architecture 91
3 Modules of the exploration system 92
4 Conclusion 97
References 97
Chapter 10. MODELING VISUAL CORTEX: HIDDEN ANISOTROPIES IN AN ISOTROPIC INHIBITORY CONNECTION SCHEME 98
1. Introduction 98
2.1. A Detailed Model of the Visual Cortex - Methods 98
2.2. A Detailed Model of the Visual Cortex - Results 100
3.2. Circular inhibition in a real cortex 102
4. Directional bias 103
5. Discussion 104
References 104
Section 3: Theory of 106
Chapter 11. THE LOGIC OF NEURAL COGNITION 108
1. INTRODUCTION 108
2. A COGNITIVE ARCHITECTURE 108
3. THE SENSORY NEURAL LEVEL 109
4. THE COGNITIVE PROCESSING LEVEL 110
5. IMPLEMENTATION ISSUES AND CONCLUSIONS 113
REFERENCES 113
Chapter 12. 114
ABSTRACT 114
1. Introduction 114
2. Unpredictable Neural Nets 116
3. Application: Modeling Brain Activity 120
Acknowledgement 123
References 123
Chapter 13. 124
1. INTRODUCTION 124
2. CLASSICAL CONDITIONING 125
3. THE NEW TYPE OF NETWORK 126
4. SPONTANEOUS RECOVERY OF DISCRIMINATION 128
5. CONCLUSIONS 130
REFERENCES 130
Chapter 14. ON THE HEBB RULE AND UNLIMITED PRECISION ARITHMETIC IN A McCULLOCH AND PITTS NETWORK 132
1. INTRODUCTION 132
2. THE FAULT TOLERANT NETWORK AND ITS ASSOCIATED LANGUAGE 133
3. ON THE HEBB RULE AND THE INHIBITORY CONNECTIONS 135
4. ON UNLIMITED PRECISION ARITHMETIC 136
ACKNOWLEDGEMENTS 139
REFERENCES 139
Chapter 15. ON THE ALGEBRAIC STRUCTURE OF FEEDFORWARD NETWORK WEIGHT SPACES 140
Abstract 140
1 Introduction 140
2 The Algebra of Weight Equivalence 141
3 Weight Space Symmetries 143
4 Discussion 145
References 146
Chapter 16. STATISTICAL PATTERN RECOGNITION REVISITED 148
1. Introduction 148
2. Initialization of the Codebook Vectors 149
3. The First Version of Learning Vector Quantization (LVQ1) 150
4. The LVQ2 151
5. Instabilities in the Basic LVQ2 151
6. The LVQ3 152
7. Taking More "Runners-up" into Account 153
8. Experiments with Speech Data 153
9. Conclusions 154
References 155
Chapter 17. LOCAL LEARNING RULES AND SPARSE CODING IN NEURAL NETWORKS 156
1. INTRODUCTION 156
2. HETERO-ASSOCIATION 157
3. HIGH FIDELITY 158
4. AUTO-ASSOCIATION 160
5. CONCLUSION: SPARSE CODING IS NECESSARY 160
ACKNOWLEDGEMENTS 161
REFERENCES 161
Chapter 18. 162
ABSTRACT 162
1. INTRODUCTION 162
2. RESCALING AND RAVINES 163
3. STEP SIZE ADAPTATION 164
4. SOME IMPLEMENTATION DETAILS 165
5. EXPERIMENTAL RESULTS 166
6. DISCUSSION AND CONCLUSIONS 169
REFERENCES 169
Section 4: 170
Chapter 19. VLSI IMPLEMENTATION OF SENSORY PROCESSING SYSTEMS 172
1. INTRODUCTION 172
2. A LOCAL TRAINING RULE FOR DYNAMIC SCENE ANALYSIS 172
3. IGORS - A VLSI SMART SENSOR 174
4. CONCLUSION 175
ACKNOWLEDGEMENTS 175
REFERENCES 175
Chapter 20. THE PYGMALION NEURAL NETWORK PROGRAMMING ENVIRONMENT 178
1. INTRODUCTION 178
2. THE NEURAL NETWORK PROGRAMMING SYSTEM (NNPS) 178
3. HARDWARE INTEGRATION 184
4. CONCLUSION 186
Chapter 21. SIMULATORS FOR NEURAL NETWORKS 188
1. INTRODUCTION 188
2. NEURAL NETWORK REQUIREMENTS 188
3. ARCHITECTURE OF A NEURAL NETWORK SIMULATOR 189
4. THE NEXT STAGE 190
5. OUTLINE OF AVAILABLE SYSTEMS 193
6. CASE STUDY - LOCATION OF EYES IN A FACIAL IMAGE 193
7. CONCLUSION 193
REFERENCES 194
Chapter 22. Dynamics and VLSI Implementation of Self–Organizing Networks 196
1 Introduction 196
2 Scaling and Feature Discovery 197
3 Self-Organizing Biological Networks and VLSI 200
References 203
Chapter 23. 204
1. INTRODUCTION 204
2. THE CHARACTER RECOGNITION TASK 205
3. FEATURE EXTRACTION 205
4. A PRE-PROGRAMMED FEATURE EXTRACTION SYSTEM 206
5. A DIGIT RECOGNIZER FOR LEARNED FEATURE EXTRACTION 207
6. COMMON CHARACTERISTICS OF THE RECOGNIZER NETWORKS 209
7. AN ADVANCED NEURAL-NET CHIP FOR MACHINE VISION 209
8. CONCLUSIONS 211
ACKNOWLEDGEMENT 211
REFERENCES 211
Chapter 24. THE ROLE OF TIME IN NATURAL INTELLIGENCE: IMPLICATIONS FOR NEURAL NETWORK AND ARTIFICIAL INTELLIGENCE RESEARCH 212
1. INTRODUCTION 212
2. REAL-TIME LEARNING MECHANISM MODELS 213
3. EXPERIMENTAL TESTS 215
4. CONCLUSIONS 216
REFERENCES 216
Chapter 25. HARDWARE CONCEPTS FOR NEURAL NETWORKS 220
1. 220
2. SINGLE-CHIP INTEGRATION OF NEURAL NETS 221
3. 223
4. 227
ACKNOWLEDGEMENTS 228
REFERENCES 228
Chapter 26. Rapid Prototyping for Neural Networks 230
Abstract 230
1 Introduction 230
2 TInMANN Algorithm and Architecture 231
3 Rapid Prototyping of TInMANN 234
References 236
Section 5: Pattern Recognition with Neural Networks 238
Chapter 27. ANIMATE VISION USES OBJECT-CENTERED REFERENCE FRAMES 240
1. VISION AS BEHAVIOR 240
2. USING THE FIXATION FRAME 241
3. RELATIVE VISION 242
4. LEARNING COORDINATED BEHAVIORS 243
5. CONCLUSIONS 245
Acknowledgements 246
References 246
Chapter 28. A Performance Evaluation of ALIAS for the Detection of Geometric Anomalies on Fractal Images 248
1. PROJECT ALIAS 248
2. EXPERIMENT PROCEDURE AND PERFORMANCE MEASURES 250
3. GEOMETRIC ANOMALIES EXPERIMENT 252
4. CONCLUSIONS 256
REFERENCES 257
Chapter 29. HOW CONNECTIONIST MODELS COULD IMPROVE MARKOV MODELS FOR SPEECH RECOGNITION 258
Abstract 258
1. Introduction 258
2. Hidden Markov Models 259
3. Connectionist models and time sequential inputs 261
4. Hybrid approaches: HMM + MLP 262
5. Another hybrid approach 263
6. Conclusions 264
References 265
Chapter 30. AN ALGEBRAIC APPROACH TO BOOLEAN FEEDFORWARD NETWORKS 266
ABSTRACT 266
1 INTRODUCTION 266
2 AN ALGEBRAIC FRAMEWORK FOR BOOLEAN FEEDFORWARD NETWORKS 268
3 A GEOMETRICAL SEMANTICS OF LAYERED NETWORK ACTIVITY 270
4 LEARNING IN BOOLEAN FEEDFORWARD NETWORKS 271
5 CONCLUSIONS 273
6 REFERENCES 273
7 ACKNOWLEDGEMENTS 273
Chapter 31. ALPHANUMERIC CHARACTER RECOGNITION BY THE NEOCOGNITRON 274
1. INTRODUCTION 274
2. THE STRUCTURE OF THE NETWORK 275
3. TRAINING THE NETWORK TO RECOGNIZE HANDWRITTEN ALPHANUMERIC CHARACTERS 276
4. DISCUSSION 281
REFERENCES 281
Chapter 32. STORING AND PROCESSING INFORMATION IN CONNECTIONIST SYSTEMS 282
1.0 Introduction 282
2.0 The model 284
3.0 Conclusion 288
REFERENCES 288
Chapter 33. THE CLOSED LOOP ANTAGONISTIC NETWORK (CLAN) 290
1. INTRODUCTION 290
2. A QUALITY MEASURE FOR PATTERN MATCHING 290
3. EVALUATION OF THE QUALITY MEASURE BY A NEURON 291
4. THE CLOSED LOOP ANTAGONISTIC NETWORK (CLAN) 293
5. LEARNING AND ASSOCIATIVE RECALL IN A CLAN 295
6. CONCLUSION 296
REFERENCES 296
Chapter 34. Adaptive Resonance Structures in Hierarchical Receptive Field Pattern Recognition Machines 298
Abstract 298
Three ART Architectures 298
Hierarchical Receptive Field Architectures 300
ART and Receptive Fields 301
Selective Attention Neocognitron (SAN) 301
The SAN and Hierarchical ART 302
Conclusions 305
References 305
Chapter 35. 306
1 INTRODUCTION 306
2 BASIC ASSUMPTIONS 306
3 RECEPTIVE FIELD STRUCTURE 307
4 SPECIFIC FORMS 308
5 TRANSFORMATIONS AND COMBINATIONS 311
6 BRAIN CIRCUITRY 311
Acknowledgements 311
References 312
Chapter 36. CONSIDERATIONS FOR A VISUAL ARCHITECTURE 314
THE VISUAL PROCESS 314
THE ARCHITECTURE APPROACH 317
DATA STRUCTURE 317
ORGANIZATION 321
References 322
Chapter 37. NEURAL MODELLING OF VISION AND OLFACTION 324
1. INTRODUCTION 324
2. THE RETINA 325
3. THE PRIMARY VISUAL CORTEX 327
4. OLFACTION 328
5. HIPPOCAMPUS 330
6. CONCLUSIONS 332
7. ACKNOWLEDGEMENTS 332
8. REFERENCES 332
Chapter 38. Neural Computers for Foveating Vision Systems 334
1. Introduction 334
2. Foveating Vision Systems 334
3. A Foveating Sensor 336
4. A Foveating Image Acquisition System 336
5. Example of a Foveated Image 337
6. Image Processing of Foveated Images 338
7. Viewing Foveated Images 339
8. Temporal Foveation Control Method 340
References 341
Chapter 39. 342
1. The Dilemma of Curve Detection 342
2. Two Stages of Curve Detection 342
3. The Model of Curve Detection 343
4. References 347
Section 6: Motor Control with Neural Networks 350
Chapter 40. 352
Abstract 352
1 Introduction 352
2 Hormonal Control 353
3 A Robot Implementation 354
4 Critique 357
5 Grounding Out in Motor Control 357
6 References 358
Chapter 41. SPINAL NETWORK COMPUTATIONS ENABLE INDEPENDENT CONTROL OF MUSCLE LENGTH AND JOINT COMPLIANCE 360
1. AN OPPONENT NEUROMUSCULAR DESIGN FOR INDEPENDENT CONTROL OF MUSCLE LENGTH AND MUSCLE TENSION 360
2. WIDE FORCE RANGE AT EACH MUSCLE LENGTH REQUIRES SIZE PRINCIPLE 360
3. SIZE PRINCIPLE WITH CO-CONTRACTION THREATENS POSITION-CODE INVARIANCE 362
4. AUTOMATIC COMPENSATION BY THE RENSHAW-IaIN PATHWAY FOR UNEQUAL AMPLIFICATIONS OF CO-CONTRACTIVE SIGNALS 362
5. LEARNED AND REACTIVE COMPENSATIONS FOR VARIABLE MOMENT-ARMS USING SPINDLE ORGAN ERROR SIGNALS 364
6. CO-CONTRACTIVE AND STRETCH FEEDBACK CONTROL OF LOAD COMPENSATION 366
REFERENCES 367
Chapter 42. Neural Computers for Motor Control 368
Biological Motor Control 368
Neural Networks for Control of a Redundant Robot Arm 371
References 374
Chapter 43. Feedback-Error-Learning Neural Network for Supervised Motor Learning 376
Abstract 376
1 Introduction 376
2 Computational schemes to convert trajectory error into motor command error 377
3 Feedback-error-learning as a Newton-like method in functional space 379
4 Stability of feedback-error-learning scheme 380
5 Feedback-error-learning neural network as models of different parts of cerebellum 382
Acknowledgment 382
References 382
Chapter 44. DAS LERNFAHRZEUG: NEURAL NETWORK APPLICATION FOR AUTONOMOUS MOBILE ROBOTS 384
SUMMARY 384
TARGET 384
SYSTEM ARCHITECTURE 385
PROCESS OF NEURAL APPLICATION SIMULATION 386
LERNFAHRZEUG I 387
LERNFAHRZEUG II 388
ADVANCED CONCEPTS FOR NEURAL NETWORKS 389
Chapter 45. Motor Learning by "Charge" Placement with Self-organizing Maps 392
1 Introduction 392
2 The "Method of Charges" 393
3 Learning to Recognize Stable Grasp Points 394
4 Conclusion 398
References 398
List of General References 400
REFERENCES FROM ALL CONTRIBUTIONS 402
List of Contributors 446
AUTHOR INDEX 454
SUBJECT INDEX 462
| Publication date (per publisher) | 28.6.2014 |
|---|---|
| Language | English |
| Subject area | Mathematics / Computer Science ► Computer Science ► Networks |
| | Computer Science ► Theory / Study ► Artificial Intelligence / Robotics |
| ISBN-10 | 1-4832-9427-7 / 1483294277 |
| ISBN-13 | 978-1-4832-9427-8 / 9781483294278 |
Copy protection: Adobe DRM
File format: PDF (Portable Document Format)