Use this thread for asking/answering questions about lesson 1.
Note that this is a forum wiki thread, so you all can edit this post to add/change/organize info to help make it better! To edit, click on the little pencil icon at the bottom of this post.
- How to Ask for Help
- Where can I put my Jupyter Notebook?
- Platforms for using fastai / bash setup
- Lesson video
- Video timelines
- If you need to manually clone the fastai repo:
- Introductions thread - please tell us a bit about yourself and what you’re hoping to achieve here!
- Kaggle Kernel for lesson 1
- Pre-release videos of fast.ai Intro to Machine Learning for Coders (finalized version of course will be released soon)
- To check your AWS limits for p2 and p3 instances, go to the EC2 section of your AWS console and click “Limits”.
Blogs mentioned in the lesson
- Image Kernels
- Convolution → Non-Linear → Convolution animation
- Linear algebra cheatsheet for deep learning
- CNNs from different viewpoints
- What transformations mean
Other links from the lesson
- Set up your environment on Crestle/AWS/Paperspace
- Get comfortable with Jupyter and all the other tools
- Run the week 1 code and play with it until you understand it
- Try different learning rates and numbers of epochs while running the code
- Feel free to explore the week 2 notebook
Video timelines for Lesson 1
00:00:01 Welcome to Part 1, Version 2 of “Practical Deep Learning for Coders”.
Check the fast.ai community at “forums.fast.ai” for help with setting up your system.
00:02:11 The “Top-Down” approach to study vs. the “Bottom-Up”.
Why you want an NVIDIA GPU (Graphics Processing Unit, i.e. a video card) for Deep Learning.
00:12:30 Start with Jupyter Notebook lesson1.ipynb ‘Dogs vs Cats’
00:20:20 Our first model: quick start.
Running our first Deep Learning model with the ‘resnet34’ architecture: epochs, and accuracy on the validation set.
00:24:11 “Analyzing results: looking at pictures” in lesson1.ipynb
00:30:45 Revisiting Jeremy & Rachel’s “Top-Down vs Bottom-Up” teaching philosophy, in detail.
00:33:45 Explaining the “Course Structure” of Fastai, with a slide showing its 8 steps.
Looking at Computer Vision, then Structured Data (or Time Series) with the Kaggle Rossmann Grocery Sales competition, then NLP (Natural Language Processing), then Collaborative Filtering for Recommendation Systems, then Computer Vision again with ResNet.
00:44:11 What is Deep Learning? A kind of Machine Learning.
00:49:11 The Universal Approximation Theorem, and examples of Deep Learning in use at Google.
00:58:11 More examples using Deep Learning, as shown in the PowerPoint from Jeremy’s ML1 (Machine Learning 1) course.
What is actually going on in a Deep Learning model, with convolutional network.
01:02:11 Adding a Non-Linear Layer to our model: sigmoid or ReLU (rectified linear unit), and SGD (Stochastic Gradient Descent).
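As a toy illustration of the two ideas in this part of the lesson (plain NumPy, not the fastai library; all the numbers are made up for the example), here is a ReLU non-linearity and a single SGD update on a one-parameter model:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: replace negative values with zero."""
    return np.maximum(0, x)

# One SGD step on a 1-parameter model: minimize (w*x - y)**2
# for a single made-up data point.
x, y = 2.0, 6.0            # input and target (the true w would be 3)
w = 0.0                    # initial weight
lr = 0.1                   # learning rate

pred = w * x               # forward pass
grad = 2 * (pred - y) * x  # d(loss)/dw by the chain rule
w -= lr * grad             # step against the gradient; w moves from 0 toward 3
```

An epoch in the lesson is just many such updates, one per mini-batch, over the whole training set.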
01:08:20 The paper “Visualizing and Understanding Convolutional Networks”, its use in ‘lesson1.ipynb’, and ‘cyclical learning rates’ in the fastai library via “lr_find”, the learning rate finder.
Why lr_find starts training a model but stops before reaching 100%: the learning rate finder halts as soon as the loss starts to diverge.
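The idea behind the learning rate finder can be sketched in a few lines of plain Python. This is a toy illustration on a 1-D loss, not fastai’s actual implementation; the function name and the stopping rule are made up for the sketch:

```python
def toy_lr_find(start_lr=1e-5, end_lr=10.0, steps=100):
    """Toy learning-rate finder on the 1-D loss (w - 3)**2.

    The learning rate grows exponentially each step; we stop as soon
    as the loss starts to blow up, which is why a finder like this
    halts before running through all of its steps.
    """
    w, lr = 0.0, start_lr
    mult = (end_lr / start_lr) ** (1.0 / steps)  # per-step growth factor
    prev_loss = float("inf")
    history = []                    # (learning rate, loss) pairs, for plotting
    for _ in range(steps):
        grad = 2 * (w - 3)          # gradient of the loss at w
        w -= lr * grad              # one SGD step at this learning rate
        loss = (w - 3) ** 2
        history.append((lr, loss))
        if loss > 4 * prev_loss:    # loss is diverging: stop early
            break
        prev_loss = loss
        lr *= mult
    return history

hist = toy_lr_find()  # stops before using all 100 steps
```

Plotting loss against learning rate from such a history is how you pick a rate just below the point where the loss starts climbing.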
01:21:30 Why you need the Numpy and Pandas libraries with Jupyter Notebook: hit ‘Tab’ for autocompletion, or ‘Shift-Tab’ once, twice, or three times to bring up the documentation for the code.
Enter ‘?’ before a function to see its documentation, or ‘??’ to look at its source code.
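Under the hood these Jupyter conveniences come from Python’s own introspection machinery. A small plain-Python sketch (the `add_one` function here is made up for illustration):

```python
import inspect

def add_one(x):
    """Return x + 1."""
    return x + 1

# In a notebook, `add_one?` shows roughly the signature and docstring,
# and `add_one??` additionally shows the source (via inspect.getsource).
print(inspect.signature(add_one))  # (x)
print(inspect.getdoc(add_one))     # Return x + 1.
```

The same calls work in a plain Python REPL, which is handy when you are debugging outside Jupyter.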
01:24:40 Using the ‘H’ shortcut in Jupyter Notebook, to see the Keyboard Shortcuts.
01:25:40 Don’t forget to turn off your session in Crestle or Paperspace, or you will keep being charged.
- Build a Taylor Swift detector
- Starting deep learning hands-on: image classification on CIFAR-10
- Using Google Images you can download lots of training data. Here is a simple Python script for automatically generating training data and arranging it in the directory structure required for lesson 1. Run `./training-data-generator.py -h` to get the options.