
Support Vector Machines and Perceptrons (eBook)

Learning, Optimization, Classification, and Application to Social Networks
eBook Download: PDF
2016 | 1st ed. 2016
XIII, 95 pages
Springer International Publishing (publisher)
978-3-319-41063-0 (ISBN)

Support Vector Machines and Perceptrons - M.N. Murty, Rashmi Raghava
€53.49 incl. VAT
(CHF 52.25)
eBook sales are handled by Lehmanns Media GmbH (Berlin) at the price in euros incl. VAT.
  • Download available immediately

This work reviews the state of the art in SVM and perceptron classifiers. The Support Vector Machine (SVM) is easily the most popular tool for dealing with a variety of machine-learning tasks, including classification. SVMs are associated with maximizing the margin between two classes. The underlying optimization problem is convex, guaranteeing a globally optimal solution. The weight vector associated with an SVM is obtained as a linear combination of some of the boundary and noisy vectors. Further, when the data are not linearly separable, tuning the coefficient of the regularization term becomes crucial. Even though SVMs have popularized the kernel trick, linear SVMs remain the popular choice in most practical high-dimensional applications. The text examines applications to social and information networks. The work also discusses another popular linear classifier, the perceptron, and compares its performance with that of the SVM in different application areas.
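
The comparison the blurb draws between the linear SVM and the perceptron can be made concrete with a small experiment. The Python sketch below is not taken from the book; scikit-learn, the synthetic dataset, and all parameter values are illustrative assumptions. The soft-margin linear SVM solves the convex problem minimize (1/2)||w||^2 + C * sum_i xi_i subject to y_i(w . x_i + b) >= 1 - xi_i, xi_i >= 0, where C is the regularization coefficient mentioned above; the perceptron simply corrects mistakes until the training data are separated.

# Minimal sketch (assumptions: scikit-learn, synthetic data, illustrative
# parameter values) comparing a linear SVM with a perceptron.
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Toy two-class data; class_sep controls how close to linearly separable it is.
X, y = make_classification(n_samples=200, n_features=2, n_informative=2,
                           n_redundant=0, class_sep=2.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear SVM: C is the regularization coefficient; smaller C tolerates
# more margin violations (a softer margin).
svm = LinearSVC(C=1.0).fit(X_train, y_train)

# Perceptron: mistake-driven updates with no margin maximization, so the
# learned weight vector is not unique (cf. Sect. 3.2.2, "W Is Not Unique").
perceptron = Perceptron(max_iter=1000).fit(X_train, y_train)

print("Linear SVM accuracy:", svm.score(X_test, y_test))
print("Perceptron accuracy:", perceptron.score(X_test, y_test))

On well-separated data the two classifiers usually score similarly; as class overlap grows, the SVM's margin maximization tends to pay off, which is the kind of comparison reported in the book's experimental sections (Sects. 3.5, 4.5, and 5.6).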

Preface 6
Overview 6
Audience 7
Organization 7
Contents 9
Acronyms 13
1 Introduction 14
1.1 Terminology 14
1.1.1 What Is a Pattern? 14
1.1.2 Why Pattern Representation? 15
1.1.3 What Is Pattern Representation? 15
1.1.4 How to Represent Patterns? 15
1.1.5 Why Represent Patterns as Vectors? 15
1.1.6 Notation 16
1.2 Proximity Function [1--4] 16
1.2.1 Distance Function 16
1.2.2 Similarity Function 17
1.2.3 Relation Between Dot Product and Cosine Similarity 18
1.3 Classification [2--4] 19
1.3.1 Class 19
1.3.2 Representation of a Class 19
1.3.3 Choice of G(X) 20
1.4 Classifiers 20
1.4.1 Nearest Neighbor Classifier (NNC) 20
1.4.2 K-Nearest Neighbor Classifier (KNNC) 20
1.4.3 Minimum-Distance Classifier (MDC) 21
1.4.4 Minimum Mahalanobis Distance Classifier 22
1.4.5 Decision Tree Classifier (DTC) 23
1.4.6 Classification Based on a Linear Discriminant Function 25
1.4.7 Nonlinear Discriminant Function 25
1.4.8 Naïve Bayes Classifier (NBC) 26
1.5 Summary 27
References 27
2 Linear Discriminant Function 28
2.1 Introduction 28
2.1.1 Associated Terms [1--3] 28
2.2 Linear Classifier [2--4] 30
2.3 Linear Discriminant Function [2] 32
2.3.1 Decision Boundary 32
2.3.2 Negative Half Space 32
2.3.3 Positive Half Space 32
2.3.4 Linear Separability 33
2.3.5 Linear Classification Based on a Linear Discriminant Function 33
2.4 Example Linear Classifiers [2] 36
2.4.1 Minimum-Distance Classifier (MDC) 36
2.4.2 Naïve Bayes Classifier (NBC) 36
2.4.3 Nonlinear Discriminant Function 37
References 38
3 Perceptron 39
3.1 Introduction 39
3.2 Perceptron Learning Algorithm [1] 40
3.2.1 Learning Boolean Functions 40
3.2.2 W Is Not Unique 42
3.2.3 Why Should the Learning Algorithm Work? 42
3.2.4 Convergence of the Algorithm 43
3.3 Perceptron Optimization 44
3.3.1 Incremental Rule 45
3.3.2 Nonlinearly Separable Case 45
3.4 Classification Based on Perceptrons [2] 46
3.4.1 Order of the Perceptron 47
3.4.2 Permutation Invariance 49
3.4.3 Incremental Computation 49
3.5 Experimental Results 50
3.6 Summary 51
References 52
4 Linear Support Vector Machines 53
4.1 Introduction 53
4.1.1 Similarity with Perceptron 53
4.1.2 Differences Between Perceptron and SVM 54
4.1.3 Important Properties of SVM [1--5] 54
4.2 Linear SVM [1, 5] 55
4.2.1 Linear Separability 55
4.2.2 Margin 56
4.2.3 Maximum Margin 58
4.2.4 An Example 59
4.3 Dual Problem 61
4.3.1 An Example 62
4.4 Multiclass Problems [2] 63
4.5 Experimental Results 64
4.5.1 Results on Multiclass Classification 64
4.6 Summary 66
References 68
5 Kernel-Based SVM 69
5.1 Introduction 69
5.1.1 What Happens if the Data Is Not Linearly Separable? [2--4, 6] 69
5.1.2 Error in Classification 70
5.2 Soft Margin Formulation [2] 71
5.2.1 The Solution 71
5.2.2 Computing b 72
5.2.3 Difference Between the Soft and Hard Margin Formulations 72
5.3 Similarity Between SVM and Perceptron 72
5.4 Nonlinear Decision Boundary [1--6] 74
5.4.1 Why Transformed Space? 75
5.4.2 Kernel Trick 75
5.4.3 An Example 76
5.4.4 Example Kernel Functions 76
5.5 Success of SVM [2--5] 76
5.6 Experimental Results 77
5.6.1 Iris Versicolour and Iris Virginica 77
5.6.2 Handwritten Digit Classification 78
5.6.3 Multiclass Classification with Varying Values of the Parameter C 78
5.7 Summary 79
References 79
6 Application to Social Networks 80
6.1 Introduction 80
6.1.1 What Is a Network? 80
6.1.2 How Do We Represent It? 80
6.2 What Is a Social Network? [1--4] 83
6.2.1 Citation Networks 84
6.2.2 Coauthor Networks 84
6.2.3 Customer Networks 84
6.2.4 Homogeneous and Heterogeneous Networks 84
6.3 Important Properties of Social Networks [4] 85
6.4 Characterization of Communities [2--3] 86
6.4.1 What Is a Community? 86
6.4.2 Clustering Coefficient of a Subgraph 87
6.5 Link Prediction [1--4] 88
6.5.1 Similarity Between a Pair of Nodes 89
6.6 Similarity Functions [1--4] 90
6.6.1 Example 91
6.6.2 Global Similarity 92
6.6.3 Link Prediction Based on Supervised Learning 93
6.7 Summary 94
References 94
7 Conclusion 95
Glossary 98
Index 99

Publication date (per publisher) 16.8.2016
Series SpringerBriefs in Computer Science
Additional info XIII, 95 p. 25 illus.
Place of publication Cham
Language English
Subject areas Mathematics / Computer Science > Computer Science > Databases
Mathematics / Computer Science > Computer Science > Programming Languages / Tools
Keywords Algorithm analysis and problem complexity • classification • Linear discriminant function • machine learning • Network optimisation • Support Vector Machine
ISBN-10 3-319-41063-6 / 3319410636
ISBN-13 978-3-319-41063-0 / 9783319410630
PDF (watermark)

DRM: Digital watermark
This eBook contains a digital watermark and is thus personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only suitable to a limited extent for small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. It is, however, not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.

Buying eBooks from abroad
For tax reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
