Generalized Low Rank Models
2016
now publishers Inc (publisher)
9781680831405 (ISBN)
This title is unfortunately out of print; no new edition is planned.
Principal components analysis (PCA) is a well-known technique for approximating a tabular data set by a low rank matrix. In this volume, the authors extend the idea of PCA to handle arbitrary data sets consisting of numerical, Boolean, categorical, ordinal, and other data types.
This framework encompasses many well-known techniques in data analysis, such as non-negative matrix factorization, matrix completion, sparse and robust PCA, k-means, k-SVD, and maximum margin matrix factorization. The method handles heterogeneous data sets, and leads to coherent schemes for compressing, denoising, and imputing missing entries across all data types simultaneously. It also admits a number of interesting interpretations of the low rank factors, which allow clustering of examples or of features. The authors propose several parallel algorithms for fitting generalized low rank models, and describe implementations and numerical results.
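To make the fitting idea concrete, the sketch below runs alternating minimization on the quadratically regularized PCA problem, the simplest special case of a generalized low rank model, where each update reduces to a ridge regression. The NumPy implementation, the function name quad_reg_pca, and the parameter choices are illustrative assumptions for this page, not the authors' reference implementations.

```python
# Minimal sketch: alternating minimization for quadratically regularized PCA,
#   minimize ||A - X Y||_F^2 + gamma ||X||_F^2 + gamma ||Y||_F^2
# with X (m x k) and Y (k x n). Each subproblem is a ridge regression
# with a closed-form solution.
import numpy as np

def quad_reg_pca(A, k, gamma=1.0, iters=100, seed=0):
    m, n = A.shape
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((k, n))  # random initialization of the right factor
    X = np.zeros((m, k))
    for _ in range(iters):
        # Update X with Y fixed (ridge regression across the rows of A).
        X = A @ Y.T @ np.linalg.inv(Y @ Y.T + gamma * np.eye(k))
        # Update Y with X fixed (ridge regression across the columns of A).
        Y = np.linalg.inv(X.T @ X + gamma * np.eye(k)) @ X.T @ A
    return X, Y

# Example: rank-2 approximation of a small random matrix.
A = np.random.default_rng(1).standard_normal((20, 10))
X, Y = quad_reg_pca(A, k=2, gamma=0.1)
print(np.linalg.norm(A - X @ Y))
```

The generalized models described in the book replace the quadratic loss and regularizers above with arbitrary (possibly nonsmooth) losses and regularizers per entry, but the alternating structure of the fitting algorithm stays the same.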
1: Introduction
2: PCA and quadratically regularized PCA
3: Generalized regularization
4: Generalized loss functions
5: Loss functions for abstract data types
6: Multi-dimensional loss functions
7: Fitting low rank models
8: Choosing low rank models
9: Implementations
Acknowledgements
Appendices
References
| Publication date | 11 July 2016 |
|---|---|
| Series | Foundations and Trends® in Machine Learning |
| Place of publication | Hanover |
| Language | English |
| Dimensions | 156 x 234 mm |
| Weight | 210 g |
| Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| ISBN-13 | 9781680831405 |
| Condition | New |