Information Theory, Probability and Statistical Learning
Springer International Publishing (publisher)
978-3-032-13991-7 (ISBN)
- Not yet published; forthcoming 12 May 2026
In 2024, Andrew Barron turned 65 and retired. This is a Festschrift volume honoring his career and contributions. Andrew R. Barron, a professor of Statistics and Data Science at Yale University, has been one of the most influential figures in information theory research over the past 40 years. He has made profound, broad, and consistent contributions to information theory, as well as to its interactions with probability theory, statistical learning, and neural networks. From his Ph.D. thesis work in 1985 until today, Barron has been recognized as a leader in both information theory and statistics, especially in the area where the two fields intersect and fertilize each other. There is a powerful tradition of important work on this interface, and it has had a strong impact on both fields. Through the introduction of novel ideas and techniques, and through his outstanding scholarship, Barron has clarified some of the foundations of the mathematical and statistical side of Shannon theory, and he has helped solidify our understanding of the connection between information theory and statistics. This volume consists of invited papers by prominent researchers who, either personally or through the topics of their work, have a connection with Barron. The papers in this volume are written by people working in all three areas where Barron has made major contributions: information theory, probability, and statistical learning. These topics are very timely, as there is major current activity in all three areas, especially in connection with the explosive ongoing advances in machine learning theory and its applications.
Ioannis Kontoyiannis is with the Department of Pure Mathematics and Mathematical Statistics at the University of Cambridge, where he is the Churchill Professor of Mathematics of Information. He has been awarded the Manning endowed assistant professorship; a Sloan Foundation Research Fellowship; an honorary Master of Arts Degree Ad Eundem by Brown University; and a Marie Curie Fellowship. He is a Fellow of the IEEE, the IMS, the AAIA, and the AIAA. He has published over 60 journal articles in leading international journals and over 120 conference papers in the top international conferences. He also holds two U.S. patents, and he has authored a textbook in probability theory. He has served on the editorial board of the American Mathematical Society's Quarterly of Applied Mathematics, the IEEE Transactions on Information Theory, Springer-Verlag's Acta Applicandae Mathematicae, the book series Lecture Notes in Mathematics by Springer-Verlag, and the online journal Entropy.
Jason M. Klusowski is an Assistant Professor in the Department of Operations Research and Financial Engineering (ORFE) at Princeton University. Prior to joining Princeton, he was an Assistant Professor in the Department of Statistics at Rutgers University New Brunswick. He received a Ph.D. in Statistics and Data Science from Yale University. His research explores the tradeoffs among interpretability, statistical accuracy, and computational feasibility in large-scale, data-driven systems. He is a recipient of the Alfred P. Sloan Research Fellowship in Mathematics, the National Science Foundation (NSF) CAREER Award, and the Howard B. Wentz, Jr., Junior Faculty Award from Princeton's School of Engineering and Applied Science (SEAS). He currently serves as an Associate Editor for the probability and statistics journal Bernoulli.
Cynthia Rush is an Associate Professor of Statistics at Columbia University. She received a Ph.D. and M.A. in Statistics from Yale University in 2016 and 2011, respectively, and she completed her undergraduate coursework at the University of North Carolina at Chapel Hill where she obtained a B.S. in Mathematics in 2010. Her research focuses on high-dimensional statistics, message passing algorithms, statistical robustness, and information theory. Cynthia currently serves as an Associate Editor for Bernoulli and the IEEE Transactions on Information Theory.
Contents: Information Theory.- Probability Theory.- Statistical Learning.
| Publication date (per publisher) | 12 May 2026 |
|---|---|
| Additional info | Approx. 400 pp., 30 illus. |
| Place of publication | Cham |
| Language | English |
| Dimensions | 155 x 235 mm |
| Subject area | Mathematics / Computer Science ► Mathematics |
| Keywords | Information Theory • Neural networks • Probability Theory • Shannon theory • Statistical Learning |
| ISBN-10 | 3-032-13991-0 / 3032139910 |
| ISBN-13 | 978-3-032-13991-7 / 9783032139917 |
| Condition | New |