
COLT '91 (eBook)

Proceedings of the Fourth Annual Workshop, UC Santa Cruz, California, August 5-7, 1991

Colt (Editor)

eBook Download: PDF
2014 | 1st edition
371 pages
Elsevier Science (publisher)
9781483299143 (ISBN)
System requirements
€53.69 incl. VAT
(CHF 52.45)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately
COLT

Front Cover 1
Computational Learning Theory 2
Copyright Page 3
Table of Contents 4
Foreword 10
Invited Talks 12
Learning and Generalization 14
Chapter 1. The Role of Learning in Autonomous Robots 16
Abstract 16
1 INTRODUCTION 16
2 AUTONOMOUS ROBOTS 16
3 LEARNING 18
4 EXAMPLES 19
5 CONCLUSION 20
Acknowledgements 20
References 20
Session I 22
Chapter 2. Tracking Drifting Concepts Using Random Examples 24
Abstract 24
1 Introduction 24
2 Notation and Mathematical Preliminaries 25
3 The Exception Tracking Strategy 26
4 Upper bounds on tolerable amount of drift 28
5 Increasingly unreliable evidence and hypothesis evaluation 29
6 Conclusions 33
7 Acknowledgements 34
References 34
Chapter 3. Investigating the Distribution Assumptions in the PAC Learning Model 35
Abstract 35
1 INTRODUCTION 35
2 LEARNING WITH REASONABLE DISTRIBUTIONS 36
3 LEARNING WHEN THE DISTRIBUTION OF EXAMPLES CHANGES 39
4 CONCLUSIONS 42
Acknowledgement 42
References 43
Chapter 4. Simultaneous Learning of Concepts and Simultaneous Estimation of Probabilities 44
Abstract 44
1 INTRODUCTION 44
2 SIMULTANEOUS LEARNING 45
3 SIMULTANEOUS ESTIMATION 46
4 EMPIRICAL COVERS 47
5 RELATING SIMULTANEOUS LEARNING AND SIMULTANEOUS ESTIMATION 47
6 RELATING SIMULTANEOUS LEARNING AND ESTIMATION TO EMPIRICAL COVERINGS 48
7 A SUFFICIENT CONDITION FOR SIMULTANEOUS LEARNING 50
8 THE DISTRIBUTION-FREE CASE 50
9 CONCLUSION AND OPEN PROBLEMS 51
Acknowledgements 52
APPENDIX 52
REFERENCES 53
Chapter 5. Learning by Smoothing: a morphological approach 54
Abstract 54
1 INTRODUCTION 54
2 FUNDAMENTALS 55
3 PROPERTIES 58
4 COMMENTS 68
Acknowledgements 68
References 68
Session II 70
Chapter 6. Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension 72
Abstract 72
1 Introduction 72
2 Summary of results 74
3 Notational conventions 74
4 Instantaneous information gain and mistake probabilities 75
5 Bounding the mistake probabilities by the information gain 76
6 Bounding the cumulative mistakes by the partition entropy 78
7 Handling incorrect priors 78
8 The average instantaneous information gain is decreasing 79
9 Bayesian learning and the VC dimension: correct priors 80
10 Bayesian learning and the VC dimension: incorrect priors 82
11 Conclusions and future research 83
Acknowledgements 84
References 84
Chapter 7. Calculation of the Learning Curve of Bayes Optimal Classification Algorithm for Learning a Perceptron With Noise 86
Abstract 86
1 Introduction 86
2 Results 87
3 Conclusion 94
Acknowledgements 94
References 94
Chapter 8. Probably Almost Bayes Decisions 99
Abstract 99
1 INTRODUCTION 99
2 BAYES AND PROBABLY ALMOST BAYES DECISIONS 99
3 PAC-ESTIMABLE DISTRIBUTION CLASSES 100
4 INDEPENDENT BOOLEAN FEATURES 103
5 DEPENDENT BOOLEAN FEATURES 103
6 Conclusions 104
References 105
Session IV 106
Chapter 9. A Geometric Approach to Threshold Circuit Complexity 108
Abstract 108
1 Introduction 108
2 Uniqueness 111
3 Generalized Spectrum Techniques 113
4 On The Method of Correlations 116
5 Miscellaneous 119
6 Concluding Remarks 120
Acknowledgments 120
Appendix 120
References 121
Chapter 10. Learning Curves in Large Neural Networks 123
Abstract 123
1 Introduction 123
2 General Theory 125
3 The Annealed Approximation 128
4 Learning of a Perceptron Rule 129
5 Discussion 134
Acknowledgments 136
References 136
Appendix 137
Chapter 11. On the Learnability of Infinitary Regular Sets 139
Abstract 139
1 Introduction 139
2 Finite Automata and ω-Regular Sets 140
3 Difficulties in Learning Infinitary Regular Sets 142
4 The Algorithm Lω 143
5 An Example Run of Lω 146
Acknowledgments 146
References 147
Session V 148
Chapter 12. Learning Monotone DNF with an Incomplete Membership Oracle 150
Abstract 150
1 INTRODUCTION 150
2 PRELIMINARIES 151
3 USING INCOMPLETE MEMBERSHIP QUERIES 152
4 THE MONOTONE DNF LEARNING ALGORITHM 154
5 HANDLING SOME ERRORS 155
6 FUTURE DIRECTIONS 156
Acknowledgements 156
References 156
Chapter 13. Redundant Noisy Attributes, Attribute Errors, and Linear-threshold Learning Using Winnow 158
Abstract 158
1 Introduction 158
2 Winnow 160
3 Adversary Bounds with Attribute Errors 161
4 Noise 162
5 Winnow and Conditionally Independent Attributes 165
Acknowledgements 166
References 167
Chapter 14. Learning in the Presence of Finitely or Infinitely Many Irrelevant Attributes 168
Abstract 168
1 Introduction 168
2 Definitions 170
3 The Models 170
4 Transformation of an attribute-efficient MB algorithm 171
5 Learning Attribute Efficiently in the MBQ and IMBQ models 173
6 Learning Attribute Efficiently with Membership Queries 174
7 Lower Bound 176
Acknowledgements 177
References 177
Chapter 15. On-line Learning with an Oblivious Environment and the Power of Randomization 178
Abstract 178
1 INTRODUCTION 178
2 ERROR-BOUNDS FOR RANDOMIZED ON-LINE LEARNING ALGORITHMS WITH AN OBLIVIOUS ENVIRONMENT 180
3 THE POWER OF RANDOMIZATION FOR ON-LINE LEARNING WITH AN ADAPTIVE ENVIRONMENT 183
4 COMPARISONS WITH OTHER PREDICTION MODELS AND ALGORITHMS 184
Acknowledgements 186
References 186
Session VI 188
Chapter 16. Learning Monotone kµ DNF Formulas on Product Distributions 190
Abstract 190
1 Introduction 190
2 Learning Model 191
3 Influence and Blocking Sets 191
4 Testing for blocking sets 192
5 The algorithm 193
6 Product distributions 193
7 Generalizing this approach 194
References 194
Chapter 17. Learning Probabilistic Read-once Formulas on Product Distributions 195
Abstract 195
1 INTRODUCTION 195
2 PRELIMINARIES 196
3 THE LEARNING ALGORITHM 197
4 CONCLUSION AND OPEN PROBLEMS 208
Acknowledgements 209
References 209
Chapter 18. Learning 2µ DNF Formulas and kµ Decision Trees 210
Abstract 210
1 INTRODUCTION 210
2 DEFINITIONS AND TERMINOLOGY 211
3 REDUCING DNF FORMULAS TO 3µ DNF FORMULAS 212
4 LEARNING 2µ DNF FORMULAS 212
5 LEARNING kµ DECISION TREES 217
6 CONCLUSIONS 220
Acknowledgements 220
References 220
Session VIII 222
Chapter 19. Polynomial-Time Learning of Very Simple Grammars from Positive Data 224
Abstract 224
1 Introduction 224
2 Definitions 225
3 Very Simple Grammars and Languages 225
4 Identifying Very Simple Grammars 227
5 An Example Run 235
6 Conclusions 237
Acknowledgements 237
References 238
Chapter 20. Relations Between Probabilistic and Team One-Shot Learners 239
Abstract 239
1 Introduction 239
2 Consequences 241
3 Proof sketches 242
4 Open questions 246
Acknowledgements 246
References 250
Session IX 252
Chapter 21. Approximation and Estimation Bounds for Artificial Neural Networks 254
Abstract 254
1 INTRODUCTION 254
2 TECHNICAL SUMMARY 255
Acknowledgement 260
References 260
Chapter 22. The VC-Dimension vs. the Statistical Capacity for Two Layer Networks with Binary Weights 261
Abstract 261
1 Introduction 261
2 A Relationship between the VC-Dimension and the Statistical Capacity 261
3 A Lower Bound of the VC-Dimension of Two Layer Networks with Binary Weights 263
4 Probability of Error for All Samples Stored 265
5 An Upper Bound 265
6 Conclusion 265
Acknowledgement 266
References 266
Chapter 23. On Learning Binary Weights for Majority Functions 268
Abstract 268
1 INTRODUCTION 268
2 THE SETTING 269
3 HARMONIC UPDATE 271
4 MAJORITY RULE 272
5 DIRECTED DRIFT 273
6 CONCLUSIONS 275
Acknowledgements 276
References 276
Chapter 24. Evaluating the Performance of a Simple Inductive Procedure in the Presence of Overfitting Error 278
Abstract 278
1 Introduction and an Example 278
2 Definitions and Preliminaries 279
3 Empirical Risk Minimization and a Measure of Asymptotic Overfit 280
References 285
Session X 286
Chapter 25. Polynomial Learnability of Probabilistic Concepts with Respect to the Kullback-Leibler Divergence 288
Abstract 288
1 Introduction 288
2 Preliminaries 290
3 Equivalence of Models for Polynomial Learnability of Probabilistic Concepts 291
4 Sample Complexity Bounds 292
5 Learning Convex Combinations of Probabilistic Concepts 294
6 Concluding Remarks 299
Acknowledgements 299
References 299
Chapter 26. A Loss Bound Model for On-Line Stochastic Prediction Strategies 301
Abstract 301
1 Introduction 301
2 A Loss Bound Model 303
3 A New On-Line Stochastic Prediction Strategy Apc 305
4 Upper Bounds on the Cumulative Loss for Apc 305
5 Upper Bounds on the Sample Complexity of ε-Prediction by Apc 309
6 Applications of Apc to Some Target Classes 311
Acknowledgements 312
References 313
Chapter 27. On the Complexity of Teaching 314
Abstract 314
1 INTRODUCTION 314
2 THE TEACHING DIMENSION 315
3 RELATED WORK 315
4 COMPARISON TO OTHER DIMENSION MEASURES 316
5 COMPUTING THE TEACHING DIMENSION 318
6 CONCLUSIONS AND OPEN PROBLEMS 323
ACKNOWLEDGEMENTS 323
References 325
Session XI 326
Chapter 28. Improved Learning of AC0 Functions 328
Abstract 328
1 INTRODUCTION 328
2 DEFINITIONS AND NOTATION 328
3 AN ORTHONORMAL BASIS FOR BOOLEAN FUNCTIONS SAMPLED UNDER MUTUALLY INDEPENDENT DISTRIBUTIONS 329
4 THE DROPOFF LEMMA 330
5 DIRECT LEARNING 333
6 INDIRECT LEARNING 334
7 COMPARISON OF APPROACHES 335
8 OPEN QUESTIONS 336
Acknowledgements 336
References 336
Chapter 29. Learning Read-Once Formulas over Fields and Extended Bases 337
Abstract 337
1 INTRODUCTION 337
2 DEFINITIONS 338
3 ALGORITHM TO LEARN READ-ONCE FORMULAS OVER 339
4 LEARNING ARITHMETIC READ-ONCE FORMULAS 345
Acknowledgements 347
References 347
Chapter 30. Fast Identification of Geometric Objects with Membership Queries 348
Abstract 348
1 INTRODUCTION 348
2 LEARNING OF A HALFPLANE WITH MEMBERSHIP QUERIES 350
3 FAST AND EXACT IDENTIFICATION OF POLYGONS WITH MEMBERSHIP QUERIES 356
4 LEARNING BOXES AND OTHER "NATURAL" CLASSES OF POLYGONS 362
Acknowledgements 363
References 363
Chapter 31. Bounded degree graph inference from walks 365
Abstract 365
1 INTRODUCTION 365
2 PRELIMINARIES 365
3 INFERENCE OF DEGREE 2 BOUNDED GRAPHS 366
4 INFERENCE OF DEGREE K ≥ 3 BOUNDED GRAPHS
5 CONCLUSIONS 377
References 377
Chapter 32. On the Complexity of Learning Strings and Sequences 378
Abstract 378
1 Introduction 378
2 Preliminaries 378
3 Some Simple Facts about DFA's and Pattern Matching 379
4 The Hardness of Learning a String or Sequence 379
5 Concluding Remarks 381
Acknowledgements 382
References 382
Errata 384
Chapter 33. The Correct Definition of Finite Elasticity: Corrigendum to Identification of Unions 386
Abstract 386
1 CORRIGENDUM 386
References 386
Addendum 388
Chapter 34. When Oracles Do Not Help 390
Abstract 390
1 Introduction 390
2 Notation, Conventions, and Definitions 390
3 Necessary Lemmas 391
4 Overview of the Proof (I) 391
5 Coding A into G 392
6 Satisfying Si (easy part) 392
7 Satisfying Si (hard part) 393
8 Overview of the Proof(II) 393
9 Open Problems 394
10 Acknowledgements 394
References 394
Author Index 396

Publication date (per publisher): 23.5.2014
Language: English
Subject areas: Computer Science › Theory / Studies › Artificial Intelligence / Robotics; Economics
ISBN-13: 9781483299143
PDF (Adobe DRM)

Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to protect the eBook against misuse. At download time, the eBook is authorized to your personal Adobe ID; it can then only be read on devices that are also registered to that Adobe ID.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost any device, but it is only of limited use on small screens (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, as it frequently causes problems with Adobe DRM.
eReader: This eBook can be read on (almost) all eBook readers, but it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfil eBook orders from other countries.
