Lesson resources
- Lesson Video
- Video Timelines for Lesson 7
- Lesson notes from @hiromi
- Lesson notes from @timlee
- CIFAR-10 notebook annotations
Other links
- WildML RNN Tutorial - http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
- Chris Olah on LSTMs - http://colah.github.io/posts/2015-08-Understanding-LSTMs/
- More from Olah and others - https://distill.pub/
- BatchNorm paper - "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" (Ioffe & Szegedy, 2015) - https://arxiv.org/abs/1502.03167
- Laptop recommendation: Surface Book 2, 15 inch
Video timeline
- 00:03:04 Review of last week's lesson on RNNs (Part 1), and what to expect in Part 2 (start date: 19/03/2018)
- 00:08:48 Building the RNN model with "self.init_hidden(bs)" and "self.h"; the "backprop through time (BPTT)" approach (see the BPTT sketch after this timeline)
- 00:17:50 Creating mini-batches: "split into 64 equal-size chunks", not "split into chunks of size 64"; questions on data augmentation and choosing a BPTT size; PyTorch QRNN (see the batching sketch after this timeline)
- 00:23:41 Using the data formats your API expects: changing your data format vs. creating a new dataset class; "data.Field()"
- 00:24:45 How to create Nietzsche training/validation data
- 00:35:43 Dealing with PyTorch not accepting a "rank 3 tensor", only rank 2 or 4; "F.log_softmax()" (see the flattening sketch after this timeline)
- 00:44:05 Question on "F.tanh()", the tanh activation function; replacing "RNNCell" with "GRUCell"
- 00:47:15 Intro to the GRU cell (RNNCell has a gradient-explosion problem, i.e. you need to use a low learning rate and a small BPTT)
- 00:53:40 Long Short-Term Memory (LSTM), "LayerOptimizer()", cosine annealing with "CosAnneal()"
- 01:01:47 Pause
- 01:01:57 Back to computer vision with CIFAR-10 and the "lesson7-cifar10.ipynb" notebook; why study research on CIFAR-10 vs. ImageNet vs. MNIST?
- 01:08:54 Looking at a fully connected model, based on a notebook from student Kerem Turgutlu, then a CNN model (with Excel demo)
- 01:21:54 Refactoring the model with a new class "ConvLayer()" and "padding"
- 01:25:40 Using batch normalization (BatchNorm) to make the model more resilient: "BnLayer()" and "ConvBnNet()" (see the BatchNorm-layer sketch after this timeline)
- 01:36:02 A previous bug in "Mini net" in "lesson5-movielens.ipynb", and many questions on BatchNorm; Lesson 7 CIFAR-10; AI/DL researchers vs. practitioners; Yann LeCun and the "Ali Rahimi talk at NIPS 2017" on rigor/theory vs. experiment
- 01:50:51 "Deep BatchNorm"
- 01:52:43 Replacing the model with ResNet: class "ResnetLayer()", using "boosting" (see the residual-layer sketch after this timeline)
- 01:58:38 "Bottleneck" layer with "BnLayer()"; "ResNet 2" with "Resnet2()"; skip connections
- 02:02:01 "lesson7-CAM.ipynb" notebook: an intro to Part 2 using "Dogs v Cats"
- 02:08:55 Class Activation Maps (CAM) of "Dogs v Cats" (see the CAM sketch after this timeline)
- 02:14:27 Questions to Jeremy: "Your journey into Deep Learning" and "How to keep up with important research for practitioners"; "If you intend to come to Part 2, you are expected to master all the techniques in Part 1"; Jeremy's advice to master Part 1 and help new students in the upcoming MOOC version to be released in January 2018
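Code sketches
BPTT sketch. The "self.init_hidden(bs)" / "self.h" pattern keeps the hidden state across mini-batches but detaches it at each batch boundary, so backprop through time is truncated. A minimal sketch written for these notes (not the notebook's exact class), assuming an nn.RNN-based character model:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CharSeqRnn(nn.Module):
    def __init__(self, vocab_size, n_fac=42, n_hidden=256, bs=64):
        super().__init__()
        self.n_hidden = n_hidden
        self.e = nn.Embedding(vocab_size, n_fac)
        self.rnn = nn.RNN(n_fac, n_hidden)
        self.l_out = nn.Linear(n_hidden, vocab_size)
        self.init_hidden(bs)

    def init_hidden(self, bs):
        # One hidden state per sequence in the batch, zeroed at the start.
        self.h = torch.zeros(1, bs, self.n_hidden)

    def forward(self, cs):
        bs = cs.size(1)              # cs: (seq_len, bs) of character indices
        if self.h.size(1) != bs:     # the last batch of an epoch may be smaller
            self.init_hidden(bs)
        outp, h = self.rnn(self.e(cs), self.h)
        self.h = h.detach()          # truncate BPTT: keep the values, drop the graph
        return F.log_softmax(self.l_out(outp), dim=-1)
```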
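Batching sketch. "Split into 64 equal-size chunks" means the corpus is cut into bs=64 long contiguous streams (not into pieces of length 64); each mini-batch then reads the next bptt steps of all 64 streams in parallel. A minimal sketch of that layout (function names are my own):

```python
import torch

def batchify(ids, bs=64):
    """ids: 1-D LongTensor of token ids for the whole corpus."""
    n = ids.size(0) // bs                    # length of each of the bs chunks
    ids = ids[: n * bs]                      # drop the remainder
    return ids.view(bs, n).t().contiguous()  # shape (n, bs): one stream per column

def get_batch(data, i, bptt=8):
    seq_len = min(bptt, data.size(0) - 1 - i)
    x = data[i : i + seq_len]                # (seq_len, bs)
    y = data[i + 1 : i + 1 + seq_len]        # next-token targets, shifted by one
    return x, y

corpus = torch.arange(1000)                  # stand-in for an encoded text
data = batchify(corpus, bs=64)
x, y = get_batch(data, 0)
print(data.shape, x.shape, y.shape)          # (15, 64), (8, 64), (8, 64)
```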
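Flattening sketch. PyTorch's NLL loss wants rank-2 predictions (N, classes) (or rank 4 for images), while an RNN emits a rank-3 (seq_len, bs, vocab) tensor; flattening the first two axes before "F.log_softmax()" and the loss fixes this. A minimal sketch:

```python
import torch
import torch.nn.functional as F

seq_len, bs, vocab = 8, 64, 85
out = torch.randn(seq_len, bs, vocab)        # rank-3 RNN output
targ = torch.randint(vocab, (seq_len, bs))   # matching targets

preds = F.log_softmax(out.view(-1, vocab), dim=-1)  # rank 2: (seq_len*bs, vocab)
loss = F.nll_loss(preds, targ.view(-1))             # targets flattened to match
print(preds.shape, loss.item())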
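BatchNorm-layer sketch. A stride-2 convolution followed by BatchNorm and ReLU, stacked into a small CIFAR-10 classifier, in the spirit of "BnLayer()" / "ConvBnNet()" (the notebook hand-rolls the BatchNorm arithmetic; nn.BatchNorm2d stands in for it here):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BnLayer(nn.Module):
    def __init__(self, ni, nf, stride=2, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv2d(ni, nf, kernel_size, stride=stride,
                              padding=kernel_size // 2)
        self.bn = nn.BatchNorm2d(nf)   # normalizes activations per channel

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

class ConvBnNet(nn.Module):
    def __init__(self, layers, c):
        super().__init__()
        self.conv1 = nn.Conv2d(3, layers[0], 5, stride=1, padding=2)
        self.layers = nn.ModuleList(
            [BnLayer(layers[i], layers[i + 1]) for i in range(len(layers) - 1)])
        self.out = nn.Linear(layers[-1], c)

    def forward(self, x):
        x = self.conv1(x)
        for l in self.layers:
            x = l(x)                               # each layer halves H and W
        x = F.adaptive_max_pool2d(x, 1).view(x.size(0), -1)
        return F.log_softmax(self.out(x), dim=-1)

net = ConvBnNet([10, 20, 40, 80], c=10)
print(net(torch.randn(2, 3, 32, 32)).shape)        # torch.Size([2, 10])
```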
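Residual-layer sketch. The "ResnetLayer()" idea is y = x + f(x): the layer only has to learn a residual on top of the identity skip connection, which is the "boosting"-like behavior mentioned in the timeline. A minimal sketch with a single conv-BN as f (not the notebook's exact layer):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResnetLayer(nn.Module):
    def __init__(self, ni, nf, stride=1):
        super().__init__()
        # ni must equal nf (and stride must be 1) for x + f(x) to be well-shaped.
        self.conv = nn.Conv2d(ni, nf, 3, stride=stride, padding=1)
        self.bn = nn.BatchNorm2d(nf)

    def forward(self, x):
        # y = x + f(x): only the residual has to be learned.
        return x + F.relu(self.bn(self.conv(x)))

layer = ResnetLayer(16, 16)
print(layer(torch.randn(2, 16, 8, 8)).shape)   # torch.Size([2, 16, 8, 8])
```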
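CAM sketch. A class activation map weights the final convolutional feature maps by the classifier weights for one class, showing where the network "looked" in the image. A minimal sketch, assuming a model that ends in conv features, global average pooling, and one linear layer; "model.final_conv" and "model.classifier" are hypothetical attribute names to be replaced with your own model's modules:

```python
import torch

features = None
def save_features(module, inp, outp):
    global features
    features = outp.detach()          # (1, C, H, W) final conv feature maps

def class_activation_map(model, x, cls):
    # Hook the last conv block; a forward pass fills `features`.
    h = model.final_conv.register_forward_hook(save_features)  # hypothetical name
    model(x)
    h.remove()
    w = model.classifier.weight[cls]  # (C,) linear weights for the chosen class
    cam = torch.einsum('c,chw->hw', w, features[0])
    cam = cam - cam.min()
    return cam / (cam.max() + 1e-6)   # normalize to [0, 1] for overlaying
```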