Video Description
9 Hours of Video Instruction
Hands-on approach to learning the probability and statistics underlying machine learning
Overview
Probability and Statistics for Machine Learning (Machine Learning Foundations) LiveLessons provides you with a functional, hands-on understanding of probability theory and statistical modeling, with a focus on machine learning applications.
About the Instructor
Jon Krohn is Chief Data Scientist at the machine learning company untapt. He authored the book Deep Learning Illustrated, an instant #1 bestseller that has been translated into six languages. Jon is renowned for his compelling lectures, which he offers in person at Columbia University and New York University, as well as online via O'Reilly, YouTube, and the SuperDataScience podcast. Jon holds a PhD from Oxford and has been publishing on machine learning in leading academic journals since 2010; his papers have been cited over a thousand times.
Skill Level
Learn How To
- Understand the appropriate variable type and probability distribution for representing a given class of data
- Calculate all of the standard summary metrics for describing probability distributions, as well as the standard techniques for assessing the relationships between distributions
- Apply information theory to quantify the proportion of valuable signal that's present among the noise of a given probability distribution
- Hypothesize about and critically evaluate the inputs and outputs of machine learning algorithms using essential statistical tools such as the t-test, ANOVA, and R-squared
- Understand the fundamentals of both frequentist and Bayesian statistics, and appreciate when each of these approaches is appropriate for the problem you're solving
- Use historical data to predict the future with regression models that take advantage of frequentist statistical theory (for smaller data sets) and modern machine learning theory (for larger data sets), including an appreciation of when deep learning may be worth applying to a given problem
- Develop a deep understanding of what's going on under the hood of predictive statistical models and machine learning algorithms
Who Should Take This Course
- You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities
- You're a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems
- You're a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline
- You're a data analyst or AI enthusiast who would like to become a data scientist or data/ML engineer, and so you're keen to deeply understand the field you're entering from the ground up (very wise of you!)
Course Requirements
- Mathematics: Familiarity with secondary school-level mathematics will make it easier for you to follow along with the class. If you are comfortable dealing with quantitative information, such as understanding charts and rearranging simple equations, then you should be well prepared to follow along with all of the mathematics.
- Programming: All code demos will be in Python, so experience with it or another object-oriented programming language will be helpful for following along with the hands-on examples.
Lesson Descriptions
Lesson 1: Introduction to Probability
In Lesson 1, Jon starts by orienting you to the Machine Learning Foundations series and covering what probability theory is. He then begins covering the most essential probability concepts, reinforcing them with comprehension exercises. The lesson ends with a comparison of Bayesian and frequentist statistics, as well as a discussion of applications of probability to machine learning.
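To give a taste of the hands-on style, below is a minimal sketch, not taken from the course notebooks, that uses NumPy to demonstrate the law of large numbers from this lesson: as the number of simulated fair-coin flips grows, the observed proportion of heads converges on the true probability of 0.5.

```python
# Simulate fair-coin flips and watch the observed proportion of heads
# approach the true probability (0.5) as the sample size grows.
import numpy as np

rng = np.random.default_rng(42)  # seeded so results are reproducible

for n_flips in [10, 100, 1_000, 10_000, 100_000]:
    flips = rng.integers(0, 2, size=n_flips)  # 0 = tails, 1 = heads
    print(f"{n_flips:>7,} flips: proportion of heads = {flips.mean():.4f}")
```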
Lesson 2: Random Variables
Lesson 2 focuses on random variables, a fundamental probability concept that is a prerequisite for understanding the later lessons. Jon starts off with an exploration of discrete and continuous variables as well as the probability distributions to which they correspond. The lesson wraps up with calculation of the expected value of random variables.
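As a preview of the kind of calculation this lesson builds toward, here is a minimal sketch, not from the course notebooks, of the expected value of a discrete random variable: the probability-weighted sum of its possible outcomes, illustrated with a fair six-sided die.

```python
# Expected value of a discrete random variable X:
# E[X] = sum over all outcomes x of x * P(X = x).
# For a fair six-sided die, each outcome has probability 1/6.
outcomes = [1, 2, 3, 4, 5, 6]
probabilities = [1/6] * 6

expected_value = sum(x * p for x, p in zip(outcomes, probabilities))
print(expected_value)  # 3.5
```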
Lesson 3: Describing Distributions
Lesson 3 is all about metrics for describing probability distributions. Jon covers measures of central tendency, quantiles, box-and-whisker plots, measures of dispersion, and measures of relatedness.
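For a sense of how these metrics look in code, here is a minimal sketch, not from the course notebooks, that computes each of them over a small, made-up sample using NumPy.

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([1.0, 3.0, 5.0, 4.0, 8.0, 10.0])

print("mean:     ", x.mean())                      # central tendency
print("median:   ", np.median(x))
print("quartiles:", np.quantile(x, [0.25, 0.5, 0.75]))
print("variance: ", x.var(ddof=1))                 # sample variance (dispersion)
print("std dev:  ", x.std(ddof=1))
print("std error:", x.std(ddof=1) / np.sqrt(len(x)))
print("covariance: ", np.cov(x, y)[0, 1])          # relatedness of x and y
print("correlation:", np.corrcoef(x, y)[0, 1])
```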
Lesson 4: Relationships Between Probabilities
In Lesson 4, Jon explores the core relationships between probabilities, including joint distributions, marginal and conditional probabilities, the chain rule, and independence.
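To make these relationships concrete, here is a minimal sketch, not from the course notebooks, that derives marginal and conditional probabilities from a small, made-up joint distribution over two binary variables, then checks for independence.

```python
import numpy as np

# Joint distribution P(X, Y) for binary X (rows) and Y (columns); entries sum to 1.
joint = np.array([[0.1, 0.3],
                  [0.2, 0.4]])

p_x = joint.sum(axis=1)  # marginal P(X): sum the joint over y
p_y = joint.sum(axis=0)  # marginal P(Y): sum the joint over x

# Conditional probability: P(Y=1 | X=0) = P(X=0, Y=1) / P(X=0)
print("P(Y=1 | X=0):", joint[0, 1] / p_x[0])  # 0.75

# X and Y are independent iff P(X=x, Y=y) = P(X=x) * P(Y=y) for all x, y
print("independent?", np.allclose(joint, np.outer(p_x, p_y)))  # False
```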
Lesson 5: Distributions in Machine Learning
Having led you through probability theory in general, in Lesson 5 Jon details the most important probability distributions in machine learning, including the uniform and normal distributions, as well as the critical concept of the central limit theorem. He also covers the log-normal, exponential, Laplace, binomial, multinomial, and Poisson distributions, as well as mixture distributions and how to preprocess data for input into a machine learning model.
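As a hint of what the central limit theorem looks like in practice, here is a minimal sketch, not from the course notebooks, showing that the means of samples drawn from a heavily skewed exponential distribution are themselves distributed approximately normally.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(42)

# 10,000 samples of size 40 from a right-skewed exponential distribution
sample_means = rng.exponential(scale=1.0, size=(10_000, 40)).mean(axis=1)

print("mean of sample means:", sample_means.mean())  # ~1.0, the true mean
print("std of sample means: ", sample_means.std())   # ~1/sqrt(40), i.e. ~0.16
print("skew of sample means:", skew(sample_means))   # far below the source's skew of 2
```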
Lesson 6: Information Theory
In Lesson 6, Jon provides you with an introduction to information theory, a field of study related to probability theory that includes some key concepts that are ubiquitous in machine learning. Specifically, he defines self-information, Shannon entropy, KL divergence, and cross-entropy.
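Here is a minimal sketch, not from the course notebooks, computing each of those quantities (in nats) for a pair of made-up discrete distributions and confirming that cross-entropy decomposes into entropy plus KL divergence.

```python
import numpy as np

p = np.array([0.7, 0.2, 0.1])  # "true" distribution
q = np.array([0.5, 0.3, 0.2])  # approximating distribution

entropy_p = -np.sum(p * np.log(p))         # Shannon entropy H(p)
cross_entropy = -np.sum(p * np.log(q))     # cross-entropy H(p, q)
kl_divergence = np.sum(p * np.log(p / q))  # KL divergence D_KL(p || q)

print(f"H(p)       = {entropy_p:.4f} nats")
print(f"H(p, q)    = {cross_entropy:.4f} nats")
print(f"KL(p || q) = {kl_divergence:.4f} nats")
print(np.isclose(cross_entropy, entropy_p + kl_divergence))  # True
```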
Lesson 7: Introduction to Statistics
From Lesson 7 onward, Jon shifts gears from general probability theory to the statistical models that probability theory facilitates. He starts by explaining how statistics are applied to machine learning and reviewing the most essential probability theory you absolutely must know to move forward. He then introduces new statistics concepts, specifically z-scores and p-values.
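To preview those two new concepts, here is a minimal sketch, not from the course notebooks, that converts an observation into a z-score and then into a two-tailed p-value under the standard normal distribution, using SciPy and made-up population parameters.

```python
from scipy.stats import norm

x, mu, sigma = 85.0, 60.0, 10.0  # an observation; population mean and std dev
z = (x - mu) / sigma             # z-score: how many std devs x is from the mean

# Two-tailed p-value: probability of a value at least this extreme by chance
p_value = 2 * norm.sf(abs(z))    # sf is the survival function, 1 - cdf

print(f"z = {z:.2f}, p = {p_value:.6f}")  # z = 2.50, p ≈ 0.012
```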
Lesson 8: Comparing Means
In Lesson 8, Jon teaches you to use probability and statistics to compare distributions with t-tests. He covers all the critical varieties, including single-sample, independent, and paired t-tests. Jon provides specific applications of t-tests to machine learning and then wraps up the lesson with a discussion of related concepts, namely confidence intervals and analysis of variance (ANOVA).
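Here is a minimal sketch, not from the course notebooks, running all three t-test varieties on small simulated samples with SciPy; the "model scores" are made up for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(loc=5.0, scale=1.0, size=30)  # e.g., model A's error scores
b = rng.normal(loc=5.5, scale=1.0, size=30)  # e.g., model B's error scores

print(stats.ttest_1samp(a, popmean=5.0))  # single-sample: mean of a vs. 5.0
print(stats.ttest_ind(a, b))              # independent: a vs. b
print(stats.ttest_rel(a, b))              # paired: a vs. b as matched pairs
```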
Lesson 9: Correlation
Lesson 9 builds on the introduction to correlation in Lesson 3. You are now armed with enough statistical knowledge to calculate p-values for correlations and to compute the coefficient of determination. Jon finishes off the lesson with important discussions of inferring causation and correcting for multiple comparisons.
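For a quick preview, here is a minimal sketch, not from the course notebooks, that computes the Pearson correlation coefficient and its p-value on simulated data with SciPy, then squares r to obtain the coefficient of determination (valid for this simple two-variable case).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)  # y depends on x, plus noise

r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, p = {p:.2e}, R^2 = {r**2:.3f}")
```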
Lesson 10: Regression
You're in for a treat with Lesson 10, which brings together the preceding lessons in practical, real-world demonstrations of regression, a powerful and highly extensible approach to making predictions. Jon distinguishes independent from dependent variables and uses linear regression to predict continuous variables, first with a single model feature and then with many, including discrete features. The lesson concludes with logistic regression for predicting discrete outcomes.
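Here is a minimal sketch, not from the course notebooks, of both regression families on simulated data with scikit-learn: linear regression for a continuous outcome and logistic regression for a binary one.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 1))  # a single model feature

# Linear regression: predict a continuous dependent variable
y_continuous = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.5, size=200)
linear = LinearRegression().fit(X, y_continuous)
print("slope:", linear.coef_[0], "intercept:", linear.intercept_)  # ~3.0, ~1.0

# Logistic regression: predict a binary dependent variable
y_binary = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)
logistic = LogisticRegression().fit(X, y_binary)
print("P(class 1 | x = 1.0):", logistic.predict_proba([[1.0]])[0, 1])
```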
Lesson 11: Bayesian Statistics
Lesson 11 is on Bayesian statistics. Jon provides a guide to when frequentist statistics or Bayesian statistics is the appropriate option for the problem you're solving, then introduces the most essential Bayesian concepts. Finally, he leaves you with resources for studying probability and statistics beyond what there was time for in these LiveLessons.
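To preview Bayes' theorem in action, here is a minimal sketch, not from the course notebooks, applying it to the classic diagnostic-test example; all of the probabilities are illustrative.

```python
p_disease = 0.01            # prior: P(disease)
p_pos_given_disease = 0.95  # likelihood: P(positive test | disease)
p_pos_given_healthy = 0.05  # false-positive rate: P(positive test | no disease)

# Marginal probability of a positive test, via the law of total probability
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
posterior = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive test) = {posterior:.3f}")  # ≈ 0.161
```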
Notebooks are available at github.com/jonkrohn/ML-foundations
About Pearson Video Training
Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Sams, and Que. Topics include IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.
Table of Contents
Introduction
Probability and Statistics for Machine Learning: Introduction
Lesson 1: Introduction to Probability
Topics
1.1 Orientation to the Machine Learning Foundations Series
1.2 What Probability Theory Is
1.3 Events and Sample Spaces
1.4 Multiple Observations
1.5 Factorials and Combinatorics
1.6 Exercises
1.7 The Law of Large Numbers and the Gambler’s Fallacy
1.8 Probability Distributions in Statistics
1.9 Bayesian versus Frequentist Statistics
1.10 Applications of Probability to Machine Learning
Lesson 2: Random Variables
Topics
2.1 Discrete and Continuous Variables
2.2 Probability Mass Functions
2.3 Probability Density Functions
2.4 Exercises on Probability Functions
2.5 Expected Value
2.6 Exercises on Expected Value
Lesson 3: Describing Distributions
Topics
3.1 The Mean, a Measure of Central Tendency
3.2 Medians
3.3 Modes
3.4 Quantiles: Percentiles, Quartiles, and Deciles
3.5 Box-and-Whisker Plots
3.6 Variance, a Measure of Dispersion
3.7 Standard Deviation
3.8 Standard Error
3.9 Covariance, a Measure of Relatedness
3.10 Correlation
Lesson 4: Relationships Between Probabilities
Topics
4.1 Joint Probability Distribution
4.2 Marginal Probability
4.3 Conditional Probability
4.4 Exercises
4.5 Chain Rule of Probabilities
4.6 Independent Random Variables
4.7 Conditional Independence
Lesson 5: Distributions in Machine Learning
Topics
5.1 Uniform
5.2 Gaussian: Normal and Standard Normal
5.3 The Central Limit Theorem
5.4 Log-Normal
5.5 Exponential and Laplace
5.6 Binomial and Multinomial
5.7 Poisson
5.8 Mixture Distributions
5.9 Preprocessing Data for Model Input
5.10 Exercises
Lesson 6: Information Theory
Topics
6.1 What Information Theory Is
6.2 Self-Information, Nats, and Bits
6.3 Shannon and Differential Entropy
6.4 Kullback-Leibler Divergence and Cross-Entropy
Lesson 7: Introduction to Statistics
Topics
7.1 Applications of Statistics to Machine Learning
7.2 Review of Essential Probability Theory
7.3 z-scores and Outliers
7.4 Exercises on z-scores
7.5 p-values
7.6 Exercises on p-values
Lesson 8: Comparing Means
Topics
8.1 Single-Sample t-tests and Degrees of Freedom
8.2 Independent t-tests
8.3 Paired t-tests
8.4 Applications to Machine Learning
8.5 Exercises
8.6 Confidence Intervals
8.7 ANOVA: Analysis of Variance
Lesson 9: Correlation
Topics
9.1 The Pearson Correlation Coefficient
9.2 R-squared Coefficient of Determination
9.3 Correlation versus Causation
9.4 Correcting for Multiple Comparisons
Lesson 10: Regression
Topics
10.1 Independent versus Dependent Variables
10.2 Linear Regression to Predict Continuous Values
10.3 Fitting a Line to Points on a Cartesian Plane
10.4 Linear Least Squares Exercise
10.5 Ordinary Least Squares
10.6 Categorical “Dummy” Features
10.7 Logistic Regression to Predict Categories
10.8 Open-Ended Exercises
Lesson 11: Bayesian Statistics
Topics
11.1 Machine Learning versus Frequentist Statistics
11.2 When to Use Bayesian Statistics
11.3 Prior Probabilities
11.4 Bayes’ Theorem
11.5 Resources for Further Study of Probability and Statistics
Summary
Probability and Statistics for Machine Learning: Summary