Video description
4+ Hours of Video Instruction
Code-along sessions move you from introductory machine learning concepts to concrete code.
Machine learning is moving from futuristic AI projects to data analysis on your desk. You need to go beyond nodding along in discussions to coding machine learning tasks. These videos skew away from heavy mathematics and focus on showing you how to turn introductory machine learning concepts into concrete code using Python, scikit-learn, and friends. Our focus is on stories, graphics, and code that build your understanding of machine learning; we minimize pure mathematics.
You learn how to load and explore simple datasets; build, train, and perform basic evaluation of a few learning models; compare the resource usage of different models in code snippets and scripts; and briefly explore some of the software and mathematics behind these techniques.
Skill Level
Learn How To
- Build and apply simple classification and regression models
- Evaluate learning performance with train-test splits
- Evaluate learning performance with metrics tailored to classification and regression
- Evaluate the resource usage of your learning models
Who Should Take This Course
If you are becoming familiar with the basic concepts of machine learning and want an experienced hand to help you turn those concepts into running code, this course is for you. If you have some coding knowledge but want to see how Python can drive basic machine learning models in practice, this course is for you.
Course Requirements
- A basic understanding of programming in Python (variables, basic control flow, simple scripts)
Lesson Descriptions
Lesson 1: Software Background
In Lesson 1, Mark discusses the environment used to run the code and several of the fundamental software packages used throughout the lessons. He covers scikit-learn, seaborn, and pandas, high-level packages with many powerful features, and also introduces numpy and matplotlib, two more foundational packages.
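As a rough sketch of the environment this lesson sets up, the following imports pull in the packages named above; the aliases are common community conventions, not requirements of the course.

```python
# Minimal sketch of the packages used throughout the lessons.
import numpy as np                # foundational numerical arrays
import matplotlib.pyplot as plt   # foundational plotting
import pandas as pd               # tabular data handling
import seaborn as sns             # higher-level statistical graphics
import sklearn                    # scikit-learn: models and evaluation tools

# Quick sanity check that the environment is wired up.
print(np.__version__, pd.__version__, sklearn.__version__)
```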
Lesson 2: Mathematical Background
In Lesson 2, Mark continues the discussion of background and foundations. He covers several important mathematical ideas: probability, linear combinations, and geometry, approaching each from a practical and computational viewpoint and introducing it without diving into theory. He also spends a few minutes on technical issues that affect how mathematics behaves on a computer.
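A minimal sketch of that computational viewpoint: a linear combination expressed with numpy, plus one classic example of how computer arithmetic differs from textbook arithmetic. The specific numbers are illustrative only.

```python
import numpy as np

# A linear combination: weights applied to values, then summed.
values = np.array([10.0, 20.0, 30.0])
weights = np.array([0.5, 0.25, 0.25])
print(np.dot(weights, values))        # 17.5

# One "computers meet math" issue: floating-point arithmetic is approximate.
print(0.1 + 0.2 == 0.3)               # False
print(np.isclose(0.1 + 0.2, 0.3))     # True
```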
Lesson 3: Beginning Classification (Part I)
In Lesson 3, Mark turns squarely to building, training, and evaluating simple classification models. He starts by introducing you to a practice dataset. Along the way, he covers train-test splits, accuracy, and two models: k-nearest neighbors and naive Bayes.
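A minimal sketch of this workflow, using scikit-learn's iris loader, a train-test split, and the two classifiers named above. The split ratio and neighbor count here are illustrative choices, not necessarily the ones used in the videos.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Load the practice dataset and hold out a test set.
iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.25, random_state=42)

# Fit both classifiers and compare their test-set accuracy.
for model in [KNeighborsClassifier(n_neighbors=3), GaussianNB()]:
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    print(type(model).__name__, accuracy_score(y_test, preds))
```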
Lesson 4: Beginning Classification (Part II)
In Lesson 4, Mark continues the discussion of classification and focuses on two ways to evaluate classifiers. He shows you how to evaluate learning performance with accuracy and how to evaluate resource utilization in terms of memory and time, both within Jupyter notebooks and in standalone Python scripts.
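A rough sketch of script-style resource checks in the spirit of this lesson, using the standard library's timeit and tracemalloc modules; the videos may use different tooling, and inside a notebook magics such as %timeit play a similar role.

```python
import timeit
import tracemalloc
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()

def fit_knn():
    KNeighborsClassifier().fit(iris.data, iris.target)

# Wall-clock time for repeated fits.
print(timeit.timeit(fit_knn, number=100), "seconds for 100 fits")

# Peak Python-level memory allocated while the function runs.
tracemalloc.start()
fit_knn()
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(peak / 1024, "KiB peak traced memory")
```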
Lesson 5: Beginning Regression (Part I)
In Lesson 5, Mark discusses and demonstrates building, training, and basic evaluation of simple regression models. He starts with a practice dataset. Along the way, he discusses different ways of measuring the center of numerical data and then introduces two models: k-nearest neighbors and linear regression.
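A minimal sketch of this regression workflow, with scikit-learn's diabetes loader and the two regressors named above; the split ratio and neighbor count are illustrative assumptions, and score() reports the default R-squared.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import LinearRegression

# Load the practice dataset and hold out a test set.
diabetes = load_diabetes()
X_train, X_test, y_train, y_test = train_test_split(
    diabetes.data, diabetes.target, test_size=0.25, random_state=42)

# Two simple measures of center for the training targets.
print("mean:", np.mean(y_train), "median:", np.median(y_train))

# Fit and score the two regressors discussed in the lesson.
for model in [KNeighborsRegressor(n_neighbors=10), LinearRegression()]:
    model.fit(X_train, y_train)
    print(type(model).__name__, model.score(X_test, y_test))
```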
Lesson 6: Beginning Regression (Part II)
Lesson 6 continues the discussion of regression. Mark explains how to pick good models from a basket of possible models, and then covers how to evaluate the learning performance and resource consumption of regressors in both notebook and standalone-script scenarios.
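One way to picture "picking a good model from a basket" is a toy optimization over constant predictors: score every candidate with a loss and keep the best. This sketch is illustrative only and is not taken from the lesson itself; the candidates and data are made up.

```python
import numpy as np

# A toy basket of models: constant predictors with different values.
# Choosing the one with the lowest squared error is a tiny optimization,
# the same idea that underlies fitting a regression line.
targets = np.array([3.0, 5.0, 8.0, 8.0])
candidates = np.linspace(0, 12, 121)          # candidate constant predictions

losses = [np.sum((targets - c) ** 2) for c in candidates]
best = candidates[np.argmin(losses)]
print("best constant:", best, "(the mean of the targets is", targets.mean(), ")")
```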
Table of Contents
Introduction
Machine Learning with Python for Everyone: Introduction
Lesson 1: Software Background
Topics
1.1 What Is Machine Learning?
1.2 Building Learning Systems
1.3 Environment Installation
1.4 Three Things You Can Do with NumPy and matplotlib
1.5 Three Things You Can Do with Pandas
1.6 Three Things You Can Do with scikit-learn and Friends
1.7 Getting Help
Lesson 2: Mathematical Background
Topics
2.1 Probability
2.2 Distributions
2.3 Linear Combinations
2.4 Geometry, Part 1
2.5 Geometry, Part 2
2.6 Geometry, Part 3
2.7 When Computers and Math Meet
Lesson 3: Beginning Classification (Part I)
Topics
3.1 Setup and the Iris Dataset
3.2 Classification, Accuracy, and Splitting
3.3 Accuracy
3.4 Introduction to Nearest Neighbors and Naive Bayes
3.5 k-Nearest Neighbors
3.6 Train-Test Split and Nearest Neighbors (Part 1)
3.7 Train-Test Split and Nearest Neighbors (Part 2)
3.8 Naive Bayes
Lesson 4: Beginning Classification (Part II)
Topics
4.1 Learning Evaluation, Part 1
4.2 Learning Evaluation, Part 2
4.3 Resource Evaluation: Time
4.4 Resource Evaluation: Memory
4.5 Scripts
Lesson 5: Beginning Regression (Part I)
Topics
5.1 Setup and the Diabetes Dataset
5.2 Introducing Regression
5.3 Measures of Center
5.4 Introducing Linear Regression and NN Regression
5.5 k-Nearest Neighbors for Regression
5.6 Linear Regression, Part 1
5.7 Linear Regression, Part 2
Lesson 6: Beginning Regression (Part II)
Topics
6.1 Optimization, Part 1
6.2 Optimization, Part 2
6.3 Optimization, Part 3
6.4 Learning Performance
6.5 Resource Evaluation
Summary
Machine Learning with Python for Everyone: Summary