Calculus for Machine Learning (Video Training)


Description

  • Copyright 2021
  • Edition: 1st
  • Online Video
  • ISBN-10: 0-13-739812-3
  • ISBN-13: 978-0-13-739812-6

6+ Hours of Video Instruction

An introduction to the calculus behind machine learning models

Overview

Calculus for Machine Learning LiveLessons introduces the mathematical field of calculus -- the study of rates of change -- from the ground up. It is essential because computing derivatives via differentiation is the basis of optimizing most machine learning algorithms, including the backpropagation and stochastic gradient descent algorithms used in deep learning. Through the measured exposition of theory paired with interactive examples, you'll develop a working understanding of how calculus is used to compute limits and differentiate functions. You'll also learn how to apply automatic differentiation within the popular TensorFlow 2 and PyTorch machine learning libraries. Later lessons build on single-variable derivative calculus to detail gradients of learning (which are facilitated by partial-derivative calculus) and integral calculus (which determines the area under a curve and comes in handy for myriad tasks associated with machine learning).

Skill Level

  • Intermediate

Learn How To
  • Develop an understanding of what's going on beneath the hood of machine learning algorithms, including those used for deep learning.
  • Compute the derivatives of functions, including by using AutoDiff in the popular TensorFlow 2 and PyTorch libraries.
  • Grasp the details of the partial-derivative, multivariate calculus that is common in machine learning papers and in many other subjects that underlie ML, including information theory and optimization algorithms.
  • Use integral calculus to determine the area under any given curve, a recurring task in ML, applied, for example, to evaluating model performance by calculating the ROC AUC metric.

Who Should Take This Course
  • People who use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms and would like to understand the fundamentals underlying the abstractions, enabling them to expand their capabilities
  • Software developers who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
  • Data scientists who would like to reinforce their understanding of the subjects at the core of their professional discipline
  • Data analysts or AI enthusiasts who would like to become data scientists or data/ML engineers, and so are keen to deeply understand the field they're entering from the ground up (a very wise choice!)

Course Requirements
  • Mathematics: Familiarity with secondary school-level mathematics will make the class easier to follow. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, you should be well prepared to follow along with all the mathematics.
  • Programming: All code demos are in Python, so experience with it or another object-oriented programming language would be helpful for following along with the hands-on examples.

Lesson Descriptions

Lesson 1: Orientation to Calculus
In Lesson 1, Jon defines calculus by distinguishing between differential and integral calculus. This is followed by a brief history of calculus that runs from its origins all the way through to modern applications, with a particular emphasis on its application to machine learning.

Lesson 2: Limits
Lesson 2 begins with a discussion of continuous versus discontinuous functions. Then Jon covers evaluating limits by both factoring and approaching methods. Next, he discusses what happens to limits when approaching infinity. The lesson concludes with comprehension exercises.
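The "approaching" method from this lesson can be sketched in a few lines of Python: evaluate the function at points ever closer to the target from both sides. The helper below is illustrative, not from the course notebooks.

```python
import math

def limit_by_approaching(f, a, steps=6):
    """Approximate lim_{x -> a} f(x) by evaluating f at points
    approaching a from both sides."""
    approximations = []
    for k in range(1, steps + 1):
        h = 10 ** -k  # shrink the step each iteration
        approximations.append((f(a + h) + f(a - h)) / 2)
    return approximations[-1]  # the closest (best) approximation

# lim_{x -> 0} sin(x)/x = 1, even though the function is undefined at x = 0
est = limit_by_approaching(lambda x: math.sin(x) / x, 0)
print(round(est, 6))  # ≈ 1.0
```

This is also the intuition for limits at infinity: substitute ever-larger inputs and watch where the outputs settle.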

Lesson 3: Differentiation
In Lesson 3, Jon focuses on differential calculus. He covers the delta method for finding the slope of a curve and uses it to derive the most common representation of a derivative. After Jon takes a quick look at derivative notation, he introduces the most common differentiation rules: the constant rule, the power rule, the constant product rule, and the sum rule. Exercises wind up the lesson.
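The delta method can be checked numerically: the slope of a secant line over a shrinking interval Δ approaches the true derivative. A minimal sketch (the function and point are hypothetical, chosen so the power rule gives an easy exact answer):

```python
def delta_method_slope(f, x, delta=1e-6):
    # slope ≈ (f(x + Δ) - f(x)) / Δ, which approaches f'(x) as Δ -> 0
    return (f(x + delta) - f(x)) / delta

f = lambda x: x ** 3        # f(x) = x^3
# power rule: f'(x) = 3x^2, so f'(2) = 12
numeric = delta_method_slope(f, 2.0)
print(numeric)  # ≈ 12.000006, approaching 12 as Δ shrinks
```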

Lesson 4: Advanced Differentiation Rules
Lesson 4 continues differentiation, covering its advanced rules. These include the product rule, the quotient rule, and the chain rule. After some exercises, Jon unleashes the might of the power rule in situations where you have a series of functions chained together.
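The chain rule can be verified the same way: differentiate the outer function, multiply by the derivative of the inner function, and compare against a numerical slope. A small sketch with a hypothetical composite function:

```python
import math

# chain rule: d/dx sin(x**2) = cos(x**2) * 2x
def f(x):
    return math.sin(x ** 2)

def f_prime(x):
    return math.cos(x ** 2) * 2 * x   # outer derivative times inner derivative

x, h = 1.5, 1e-6
finite_diff = (f(x + h) - f(x - h)) / (2 * h)  # central-difference check
print(f_prime(x), finite_diff)  # the two values agree closely
```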

Lesson 5: Automatic Differentiation
Lesson 5 enables you to move beyond differentiation by hand by scaling it up through automatic differentiation. This is accomplished through the PyTorch and TensorFlow libraries. After representing a line as a graph, you will apply automatic differentiation to fit that line to data points with machine learning.
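PyTorch and TensorFlow implement reverse-mode automatic differentiation. As a dependency-free illustration of the underlying idea, here is a minimal forward-mode sketch using dual numbers, which carry a value and its derivative through each operation; this is an assumption-laden toy, not the course's PyTorch/TensorFlow code.

```python
class Dual:
    """A number paired with its derivative, propagated automatically."""
    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # the product rule, applied automatically
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

# Differentiate y = 3x^2 + 2x at x = 4; by hand, dy/dx = 6x + 2 = 26
x = Dual(4.0, 1.0)        # seed dx/dx = 1
y = 3 * x * x + 2 * x
print(y.value, y.deriv)   # 56.0 26.0
```

The libraries covered in the lesson do the same bookkeeping, but in reverse through a directed acyclic graph of operations, which is far more efficient when one output depends on many parameters.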

Lesson 6: Partial Derivatives
Lesson 6 delves into partial derivatives. Jon begins with simple derivatives of multivariate functions, followed by more advanced geometrical examples, partial derivative notation, and the partial derivative chain rule.
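A partial derivative treats every other variable as a constant. The sketch below (with a hypothetical two-variable function) computes both partials analytically and checks them numerically by holding the other variable fixed:

```python
def f(x, y):
    return x**2 * y + y**3

# analytic partials: differentiate with respect to one variable,
# treating the other as a constant
df_dx = lambda x, y: 2 * x * y
df_dy = lambda x, y: x**2 + 3 * y**2

x, y, h = 2.0, 3.0, 1e-6
num_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)  # vary x, hold y fixed
num_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)  # vary y, hold x fixed
print(df_dx(x, y), num_dx)  # 12.0 ≈ 12.0
print(df_dy(x, y), num_dy)  # 31.0 ≈ 31.0
```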

Lesson 7: Gradients
Lesson 7 covers the gradient, which captures the partial derivatives of cost with respect to all the parameters of a machine learning model. To build up to this, Jon performs a regression on an individual data point and derives the partial derivatives of its quadratic cost. From there, he discusses what it means to descend the gradient of cost and derives the partial derivatives of mean squared error, which enables you to learn from batches of data instead of individual points.
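The single-point regression setup can be sketched directly: a line ŷ = mx + b, a quadratic cost C = (ŷ − y)², and repeated steps down the gradient of that cost. The training point, initialization, and learning rate below are hypothetical values chosen for illustration.

```python
# Model: y_hat = m*x + b; quadratic cost C = (y_hat - y)**2
x_i, y_i = 2.0, 7.0        # one training point (hypothetical)
m, b = 0.0, 0.0            # parameters initialized to zero
lr = 0.01                  # learning rate

for _ in range(1000):
    y_hat = m * x_i + b
    error = y_hat - y_i
    grad_m = 2 * error * x_i   # ∂C/∂m, by the chain rule
    grad_b = 2 * error         # ∂C/∂b
    m -= lr * grad_m           # step down the gradient of cost
    b -= lr * grad_b

print(round(m * x_i + b, 4))  # ≈ 7.0: the fitted line passes through the point
```

Averaging these per-point gradients over a batch yields the gradient of mean squared error covered later in the lesson.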

Lesson 8: Integrals
Lesson 8 switches to integral calculus. To set up a machine learning problem that requires integration to solve it, Jon starts off with binary classification problems, the confusion matrix, and the ROC curve. With that problem in mind, Jon then covers the rules of indefinite and definite integral calculus needed to solve it. Next, Jon shows you how to do integration computationally. You learn how to use Python to find the area under the ROC curve. Finally, he ends the lesson with some resources for further study.
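Numeric integration of an ROC curve amounts to summing trapezoids under the (FPR, TPR) points. A minimal sketch with a tiny hypothetical curve (the points below are invented for illustration):

```python
def trapezoid_area(xs, ys):
    """Area under a piecewise-linear curve via the trapezoidal rule."""
    area = 0.0
    for i in range(1, len(xs)):
        # each strip is a trapezoid: width times average height
        area += (xs[i] - xs[i - 1]) * (ys[i] + ys[i - 1]) / 2
    return area

fpr = [0.0, 0.25, 0.5, 1.0]   # false positive rates (hypothetical)
tpr = [0.0, 0.75, 0.9, 1.0]   # true positive rates (hypothetical)
print(trapezoid_area(fpr, tpr))  # 0.775; 0.5 would be a chance-level classifier
```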

Notebooks are available at github.com/jonkrohn/ML-foundations.

About Pearson Video Training
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at informit.com/video.

Downloads

Download Notebooks

Sample Content

Table of Contents

Introduction to Calculus for Machine Learning LiveLessons

Lesson 1: Orientation to Calculus
  1.1 Differential versus Integral Calculus
  1.2 A Brief History
  1.3 Calculus of the Infinitesimals
  1.4 Modern Applications

Lesson 2: Limits
  2.1 Continuous versus Discontinuous Functions
  2.2 Solving via Factoring
  2.3 Solving via Approaching
  2.4 Approaching Infinity
  2.5 Exercises

Lesson 3: Differentiation
  3.1 Delta Method
  3.2 The Most Common Representation
  3.3 Derivative Notation
  3.4 Constants
  3.5 Power Rule
  3.6 Constant Product Rule
  3.7 Sum Rule
  3.8 Exercises

Lesson 4: Advanced Differentiation Rules
  4.1 Product Rule
  4.2 Quotient Rule
  4.3 Chain Rule
  4.4 Exercises
  4.5 Power Rule on a Function Chain

Lesson 5: Automatic Differentiation
  5.1 Introduction
  5.2 Autodiff with PyTorch
  5.3 Autodiff with TensorFlow
  5.4 Directed Acyclic Graph of a Line Equation
  5.5 Fitting a Line with Machine Learning

Lesson 6: Partial Derivatives
  6.1 Derivatives of Multivariate Functions
  6.2 Partial Derivative Exercises
  6.3 Geometrical Examples
  6.4 Geometrical Exercises
  6.5 Notation
  6.6 Chain Rule
  6.7 Chain Rule Exercises

Lesson 7: Gradients
  7.1 Single-Point Regression
  7.2 Partial Derivatives of Quadratic Cost
  7.3 Descending the Gradient of Cost
  7.4 Gradient of Mean Squared Error
  7.5 Backpropagation
  7.6 Higher-Order Partial Derivatives
  7.7 Exercise

Lesson 8: Integrals
  8.1 Binary Classification
  8.2 The Confusion Matrix and ROC Curve
  8.3 Indefinite Integrals
  8.4 Definite Integrals
  8.5 Numeric Integration with Python
  8.6 Exercises
  8.7 Finding the Area Under the ROC Curve
  8.8 Resources for Further Study of Calculus

Summary of Calculus for Machine Learning LiveLessons
