Video description
Beau Carnes does it again, breaking down one of the most complex fields of computer science and distilling it into repeatable, practical lessons to enhance a developer's skill set.
Derek Hampton
Deep learning is one of the biggest technical leaps in AI in decades, but building an understanding of it doesn't require a math degree. All it takes is the right intuitive approach, and you'll be writing your own neural networks in pure Python in no time!
Grokking Deep Learning in Motion is a new course that takes you on a journey into the world of deep learning. Rather than just learning how to use a single library or framework, you'll discover how to build deep learning algorithms completely from scratch!
Professional instructor Beau Carnes breaks deep learning wide open, combining his expertise in video instruction with Andrew Trask's unique, intuitive approach from Grokking Deep Learning. As you move through this course, you'll learn the fundamentals of deep learning from a fresh perspective. Using Python and Jupyter Notebooks, you'll dive straight into the basics of neural prediction and learning, and visualize things like the weights your network learns. Throughout, you'll train your neural network to be smarter, faster, and better at its job in a variety of ways, ready for the real world!
Packed with great animations and explanations that bring the world of deep learning to life in a way that just makes sense, Grokking Deep Learning in Motion is exactly what anyone needs to build an intuitive understanding of one of the hottest techniques in machine learning.
This course also works perfectly alongside the original book Grokking Deep Learning by Andrew Trask, bringing his unique way of teaching to life.
Machine learning has made remarkable progress in recent years. Deep-learning systems now power smart applications that were previously impossible, revolutionizing image recognition and natural-language processing and uncovering complex patterns in data. To really get the most out of deep learning, you need to understand it inside and out, but where do you start? This course is the perfect jumping-off point!
Inside:
- The differences between deep learning and machine learning
- An introduction to neural prediction
- Building your first deep neural network
- The importance of visualization tools
- Memorization vs. generalization
- Modeling probabilities and non-linearities
This course is perfect for anyone with high-school-level math and basic programming skills in a language like Python. Experience with calculus is helpful but NOT required.
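To give a sense of the level of Python involved, here is a minimal illustrative sketch (not taken from the course) of the kind of from-scratch code you'll write: a single-weight neural prediction trained with a simple gradient-descent update.

    # Illustrative sketch only (not course code): a single-weight neural
    # prediction trained with a basic gradient-descent update, in pure Python.
    weight = 0.5        # the network's single learnable parameter
    input_value = 0.8   # one input observation
    goal = 0.3          # the target the network should predict
    alpha = 0.1         # learning rate

    for iteration in range(20):
        prediction = input_value * weight        # predict: weighted input
        delta = prediction - goal                # how far off the prediction is
        error = delta ** 2                       # squared error (always positive)
        weight -= alpha * (delta * input_value)  # nudge the weight to reduce error
        print(f"Error: {error:.5f}  Prediction: {prediction:.5f}")

If you can follow a loop and a couple of multiplications like these, you already have all the Python you need to get started.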
Beau Carnes is a software developer and a recognized authority in software instruction. Besides teaching in-person workshops and classes, Beau has recently joined the team at freeCodeCamp as their lead video instructor, helping to teach over 2 million people around the world to code. Beau also teaches Manning's best-selling video course, Algorithms in Motion.
Excellent bottom-up introduction to neural networks and deep learning.
Ursin Stauss
Using small snippets of easily memorized code introduced through the various chapters, the video shows a relatively easy way of building a deep learning neural network.
Thomas Heiman
Beau's approach is refreshingly beautiful.
Markus Breuer
Table of Contents
INTRODUCING DEEP LEARNING
Introduction
00:08:54
What you need to get started
00:05:28
FUNDAMENTAL CONCEPTS
What is Deep Learning and Machine Learning?
00:05:02
Supervised vs. unsupervised learning
00:05:23
Parametric vs. non-parametric learning
00:12:56
INTRODUCTION TO NEURAL PREDICTION
Making a prediction
00:07:56
What does a Neural Network do?
00:04:05
Multiple inputs
00:13:25
Multiple outputs and stacking predictions
00:09:15
Primer on NumPy
00:11:27
INTRODUCTION TO NEURAL LEARNING
Compare and learn
00:06:23
Why measure error?
00:03:53
Hot and cold learning
00:09:17
Gradient descent
00:09:21
Learning with gradient descent
00:09:06
The secret to learning
00:07:13
How to use a derivative to learn
00:11:41
Alpha
00:06:13
LEARNING MULTIPLE WEIGHTS AT A TIME
Gradient descent learning with multiple inputs
00:07:16
Several steps of learning
00:06:04
Gradient descent with multiple outputs
00:06:17
Visualizing weight values
00:09:32
BUILDING YOUR FIRST “DEEP” NEURAL NETWORK
The streetlight problem
00:10:32
Building our neural network
00:09:37
Up and down pressure
00:14:41
Correlation and backpropagation
00:08:04
Linear vs. non-linear
00:08:06
Our first “deep” neural network
00:10:14
HOW TO PICTURE NEURAL NETWORKS
Simplifying
00:06:35
Simplified visualization
00:07:16
Seeing the network predict
00:08:04
LEARNING SIGNAL AND IGNORING NOISE
3-layer network on MNIST
00:10:59
Overfitting in Neural Networks
00:06:06
Regularization: Early Stopping and Dropout
00:16:45
MODELING PROBABILITIES AND NON-LINEARITIES
Activation Function Constraints
00:09:31
Standard Activation Functions
00:12:22
Softmax and implementation in code
00:16:35
CONCLUSION
Where to go from here
00:07:17