
Calculus for Machine Learning LiveLessons

Jon Krohn (Author)

Software / Digital Media
2022 | Video Training (OASIS)
Pearson Education (US) (Manufacturer)
978-0-13-739815-7 (ISBN)
Price on request
  • Title not in our range
6+ Hours of Video Instruction

An introduction to the calculus behind machine learning models

Overview

Calculus for Machine Learning LiveLessons introduces the mathematical field of calculus -- the study of rates of change -- from the ground up. It is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including those used in deep learning such as backpropagation and stochastic gradient descent. Through the measured exposition of theory paired with interactive examples, you'll develop a working understanding of how calculus is used to compute limits and differentiate functions. You'll also learn how to apply automatic differentiation within the popular TensorFlow 2 and PyTorch machine learning libraries. Later lessons build on single-variable derivative calculus to detail gradients of learning (which are facilitated by partial-derivative calculus) and integral calculus (which determines the area under a curve and comes in handy for myriad tasks associated with machine learning).

Skill Level


Intermediate


Learn How To

Develop an understanding of what's going on beneath the hood of machine learning algorithms, including those used for deep learning.
Compute the derivatives of functions, including by using AutoDiff in the popular TensorFlow 2 and PyTorch libraries.
Grasp the details of the partial-derivative, multivariate calculus that is common in machine learning papers and in many of the other subjects that underlie ML, including information theory and optimization algorithms.
Use integral calculus to determine the area under any given curve, a recurring task in ML that is applied, for example, to evaluating model performance by calculating the ROC AUC metric.


Who Should Take This Course


People who use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms and would like to understand the fundamentals underlying the abstractions, enabling them to expand their capabilities
Software developers who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
Data scientists who would like to reinforce their understanding of the subjects at the core of their professional discipline
Data analysts or AI enthusiasts who would like to become data scientists or data/ML engineers, and so are keen to deeply understand the field they're entering from the ground up (a very wise choice!)


Course Requirements


Mathematics: Familiarity with secondary school-level mathematics will make the class easier to follow. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, you should be well prepared to follow along with all the mathematics.
Programming: All code demos are in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.


Lesson Descriptions

Lesson 1: Orientation to Calculus
In Lesson 1, Jon defines calculus by distinguishing between differential and integral calculus. This is followed by a brief history of calculus that runs all the way through to its modern applications, with particular emphasis on machine learning.

Lesson 2: Limits
Lesson 2 begins with a discussion of continuous versus discontinuous functions. Then Jon covers evaluating limits by both factoring and approaching methods. Next, he discusses what happens to limits when approaching infinity. The lesson concludes with comprehension exercises.
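As a minimal taste of the approaching method in Python (the function below is an illustrative choice, not necessarily one used in the lesson), consider the limit of (x^2 - 4)/(x - 2) as x approaches 2; factoring predicts the answer 4:

    # f is undefined at x = 2 (division by zero), but its limit there exists.
    # Factoring: (x**2 - 4) / (x - 2) = x + 2, so the limit should be 4.
    def f(x):
        return (x**2 - 4) / (x - 2)

    for x in [1.9, 1.99, 1.999, 2.001, 2.01, 2.1]:
        print(x, f(x))  # the outputs approach 4 from both sides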

Lesson 3: Differentiation
In Lesson 3, Jon focuses on differential calculus. He covers the delta method for finding the slope of a curve and uses it to derive the most common representation of the derivative. After a quick look at derivative notation, he introduces the most common differentiation rules: the constant rule, the power rule, the constant product rule, and the sum rule. Exercises wrap up the lesson.
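For a rough sketch of the delta method (the curve y = x^2 here is an illustrative stand-in for the lesson's own examples), shrink the step delta in the difference quotient and watch the slope settle on the value the power rule predicts:

    # Slope of y = x**2 at x = 2 via the delta method (difference quotient).
    # The power rule gives dy/dx = 2x, i.e. a slope of 4 at x = 2.
    def f(x):
        return x**2

    x = 2.0
    for delta in [1.0, 0.1, 0.01, 0.001, 1e-6]:
        slope = (f(x + delta) - f(x)) / delta
        print(delta, slope)  # converges toward 4 as delta shrinks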

Lesson 4: Advanced Differentiation Rules
Lesson 4 continues with differentiation, covering its advanced rules: the product rule, the quotient rule, and the chain rule. After some exercises, Jon unleashes the might of the power rule in situations where you have a series of functions chained together.
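A small sketch of the chain rule, using an illustrative function chain rather than the lesson's own exercises: for g(u) = u^2 composed with h(x) = 3x + 1, the analytic derivative 2(3x + 1) * 3 agrees with a finite-difference check:

    # Chain rule: d/dx g(h(x)) = g'(h(x)) * h'(x)
    # with g(u) = u**2 and h(x) = 3*x + 1, so d/dx (3*x + 1)**2 = 2*(3*x + 1)*3.
    def g_of_h(x):
        return (3*x + 1)**2

    x = 2.0
    analytic = 2 * (3*x + 1) * 3                      # chain-rule result: 42 at x = 2
    numeric = (g_of_h(x + 1e-6) - g_of_h(x)) / 1e-6   # finite-difference estimate
    print(analytic, numeric)                          # the two agree closely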

Lesson 5: Automatic Differentiation
Lesson 5 enables you to move beyond differentiation by hand and scale it up through automatic differentiation, accomplished with the PyTorch and TensorFlow libraries. After representing a line equation as a graph, you will apply automatic differentiation to fit that line to data points with machine learning.
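The sketch below shows the general idea with PyTorch only; it is a minimal, illustrative version (synthetic data, hard-coded learning rate), not the lesson's notebook, which also covers TensorFlow:

    import torch

    # Synthetic points scattered around a known line y = -0.5*x + 2.
    x = torch.linspace(0, 7, 8)
    y = -0.5 * x + 2 + torch.randn(8) * 0.1

    # Parameters of the line we want to fit, flagged for autodiff.
    m = torch.tensor(0.9, requires_grad=True)
    b = torch.tensor(0.1, requires_grad=True)

    optimizer = torch.optim.SGD([m, b], lr=0.01)
    for epoch in range(1000):
        optimizer.zero_grad()
        loss = torch.mean((m * x + b - y) ** 2)  # quadratic (MSE) cost
        loss.backward()                          # automatic differentiation
        optimizer.step()                         # gradient descent step
    print(m.item(), b.item())                    # roughly -0.5 and 2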

Lesson 6: Partial Derivatives
Lesson 6 delves into partial derivatives. Jon begins with simple derivatives of multivariate functions, followed by more advanced geometrical examples, partial derivative notation, and the partial derivative chain rule.
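A minimal sketch of partial derivatives, assuming an illustrative function z = x^2 - y^2 (by hand, dz/dx = 2x and dz/dy = -2y); autodiff from the previous lesson confirms both at once:

    import torch

    x = torch.tensor(3.0, requires_grad=True)
    y = torch.tensor(2.0, requires_grad=True)
    z = x**2 - y**2
    z.backward()              # fills in the partial derivatives of z
    print(x.grad, y.grad)     # tensor(6.) and tensor(-4.), i.e. 2x and -2y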

Lesson 7: Gradients
Lesson 7 covers the gradient, which captures the partial derivatives of cost with respect to all the parameters of the machine learning model from the previous lessons. To build up to this, Jon performs a regression on an individual data point and derives the partial derivatives of its quadratic cost. From there, he discusses what it means to descend the gradient of cost and walks through the derivation of the partial derivatives of mean squared error, which enables you to learn from batches of data instead of individual points.
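A hedged sketch of single-point regression with hand-derived partial derivatives (the numbers are illustrative, not the lesson's data): with y_hat = m*x + b and quadratic cost C = (y_hat - y)^2, the chain rule gives dC/dm = 2(y_hat - y)x and dC/db = 2(y_hat - y), and descending that gradient drives the cost toward zero:

    # One data point (x, y) and a line y_hat = m*x + b to fit to it.
    x, y = 2.0, 5.0
    m, b = 0.9, 0.1
    lr = 0.01
    for step in range(200):
        y_hat = m * x + b
        dC_dm = 2 * (y_hat - y) * x   # partial derivative of cost w.r.t. m
        dC_db = 2 * (y_hat - y)       # partial derivative of cost w.r.t. b
        m -= lr * dC_dm               # descend the gradient of cost
        b -= lr * dC_db
    print(m, b, (m * x + b - y) ** 2)  # the cost is now close to zero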

Lesson 8: Integrals
Lesson 8 switches to integral calculus. To set up a machine learning problem that requires integration to solve, Jon starts off with binary classification problems, the confusion matrix, and the ROC curve. With that problem in mind, he then covers the rules of indefinite and definite integral calculus needed to solve it. Next, Jon shows you how to carry out integration computationally: you learn how to use Python to find the area under the ROC curve. Finally, he ends the lesson with some resources for further study.
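As a rough sketch of numeric integration with Python (the curve and the ROC points below are made up for illustration; they are not the lesson's classifier output):

    import numpy as np

    # Definite integral of y = x**2 from 0 to 1; calculus gives exactly 1/3.
    x = np.linspace(0, 1, 1001)
    print(np.trapz(x**2, x))        # trapezoidal rule: about 0.3333

    # The same idea yields ROC AUC: integrate the true-positive rate
    # over the false-positive rate. (Hypothetical ROC points.)
    fpr = np.array([0.0, 0.25, 0.5, 1.0])
    tpr = np.array([0.0, 0.7, 0.9, 1.0])
    print(np.trapz(tpr, fpr))       # area under this ROC curve: about 0.76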

Notebooks are available at github.com/jonkrohn/ML-foundations

About Pearson Video Training
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at informit.com/video.

Jon Krohn is chief data scientist at the machine learning company untapt. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into six languages. Jon is renowned for his compelling lectures, which he offers in-person at Columbia University and New York University, as well as online via O'Reilly, YouTube, and the Super Data Science Podcast. Jon holds a PhD from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited more than 1,000 times.

Introduction to Calculus for Machine Learning LiveLessons

Lesson 1: Orientation to Calculus
1.1 Differential versus Integral Calculus
1.2 A Brief History
1.3 Calculus of the Infinitesimals
1.4 Modern Applications

Lesson 2: Limits
2.1 Continuous versus Discontinuous Functions
2.2 Solving via Factoring
2.3 Solving via Approaching
2.4 Approaching Infinity
2.5 Exercises

Lesson 3: Differentiation
3.1 Delta Method
3.2 The Most Common Representation
3.3 Derivative Notation
3.4 Constants
3.5 Power Rule
3.6 Constant Product Rule
3.7 Sum Rule
3.8 Exercises

Lesson 4: Advanced Differentiation Rules
4.1 Product Rule
4.2 Quotient Rule
4.3 Chain Rule
4.4 Exercises
4.5 Power Rule on a Function Chain

Lesson 5: Automatic Differentiation
5.1 Introduction
5.2 Autodiff with PyTorch
5.3 Autodiff with TensorFlow
5.4 Directed Acyclic Graph of a Line Equation
5.5 Fitting a Line with Machine Learning

Lesson 6: Partial Derivatives
6.1 Derivatives of Multivariate Functions
6.2 Partial Derivative Exercises
6.3 Geometrical Examples
6.4 Geometrical Exercises
6.5 Notation
6.6 Chain Rule
6.7 Chain Rule Exercises

Lesson 7: Gradients
7.1 Single-Point Regression
7.2 Partial Derivatives of Quadratic Cost
7.3 Descending the Gradient of Cost
7.4 Gradient of Mean Squared Error
7.5 Backpropagation
7.6 Higher-Order Partial Derivatives
7.7 Exercise

Lesson 8: Integrals
8.1 Binary Classification
8.2 The Confusion Matrix and ROC Curve
8.3 Indefinite Integrals
8.4 Definite Integrals
8.5 Numeric Integration with Python
8.6 Exercises
8.7 Finding the Area Under the ROC Curve
8.8 Resources for Further Study of Calculus

Summary of Calculus for Machine Learning LiveLessons

Published (per publisher): 7 April 2022
Series: LiveLessons
Place of publication: Upper Saddle River
Language: English
Subject areas: Computer Science > Databases > Data Warehouse / Data Mining
Mathematics / Computer Science > Mathematics > Analysis
ISBN-10: 0-13-739815-8 / 0137398158
ISBN-13: 978-0-13-739815-7 / 9780137398157
Condition: New