I can’t edit the post “Another treat! Early access to Intro To Machine Learning videos” to update the Collection of Video Timelines with Lesson 10; it seems there’s a countdown timer on author edit privileges. cc @jeremy
So here’s the timeline for Lesson 10.
ML1 lesson 10 timeline

00:00:01 Fast.ai is now available on pip!
And more USF student publications: class-wise processing in NLP, class-wise regex functions
. Porto Seguro’s Safe Driver Prediction (Kaggle): 1st-place solution with zero feature engineering!
Dealing with semi-supervised learning (i.e. labeled and unlabeled data)
Data augmentation: create new examples by making slightly different versions of data you already have.
In this case, he used data augmentation by creating new rows where 15% of the values are randomly changed.
Also used an “autoencoder”: the independent variable is the same as the dependent variable, as in “try to predict your input”!
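A minimal PyTorch sketch of these two ideas together (noise on roughly 15% of the values, then an autoencoder that reconstructs its own input); the sizes and layer choices are illustrative, not the actual winning solution:

```python
import torch
import torch.nn as nn

x = torch.randn(64, 20)  # a fake batch of 64 rows with 20 features

# "Data augmentation": replace ~15% of entries with values taken from
# other rows (a sketch of the noise scheme described above)
mask = torch.rand(64, 20) < 0.15
x_noisy = torch.where(mask, x[torch.randperm(64)], x)

# Autoencoder: the target is the input itself ("predict your input")
enc = nn.Linear(20, 5)   # compress 20 features down to 5
dec = nn.Linear(5, 20)   # reconstruct the original 20
recon = dec(torch.relu(enc(x_noisy)))
loss = nn.functional.mse_loss(recon, x)  # compare output to clean input
loss.backward()
```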
00:08:30 Back to a simple logistic regression with MNIST (summary)
‘lesson4-mnist_sgd.ipynb’ notebook
00:11:30 PyTorch tutorial on Autograd

00:15:30 “Stream processing” and Python generators
. “l.backward()”
. “net2 = LogReg().cuda()”
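A toy illustration of what autograd and `l.backward()` do (not the notebook’s code): calling `.backward()` on a scalar fills in `.grad` for every tensor created with `requires_grad=True`.

```python
import torch

w = torch.tensor([2.0, 3.0], requires_grad=True)  # parameters to learn
x = torch.tensor([1.0, 4.0])                      # fixed input
l = ((w * x).sum() - 10.0) ** 2                   # scalar loss: (14 - 10)^2 = 16
l.backward()                                      # autograd computes dl/dw
print(w.grad)                                     # tensor([ 8., 32.])
```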
00:32:30 Building a complete Neural Net, from scratch, for Logistic Regression in PyTorch, with “nn.Sequential()”
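A sketch of that idea: logistic regression expressed as a one-layer net in `nn.Sequential()`, assuming flattened 28x28 MNIST inputs and 10 classes (the sizes are the only assumption here):

```python
import torch
import torch.nn as nn

# Logistic regression as a one-layer neural net
net = nn.Sequential(
    nn.Linear(28 * 28, 10),   # one linear layer: 784 pixels -> 10 scores
    nn.LogSoftmax(dim=-1),    # turn scores into log-probabilities
)
loss_fn = nn.NLLLoss()        # negative log-likelihood on log-probs

x = torch.randn(64, 28 * 28)      # fake batch of flattened images
y = torch.randint(0, 10, (64,))   # fake labels
loss = loss_fn(net(x), y)
loss.backward()
```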

00:58:00 Fitting the model in the ‘lesson4-mnist_sgd.ipynb’ notebook
The secret of modern ML (as covered in the Deep Learning course): massively over-parameterize the solution to your problem, then use regularization.
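In PyTorch, the simplest form of that regularization is an L2 penalty via the optimizer’s `weight_decay` parameter (just one option among several; a sketch, not necessarily what the course uses at this point):

```python
import torch
import torch.nn as nn

# A deliberately over-parameterized model for a 10-feature problem...
net = nn.Sequential(nn.Linear(10, 1000), nn.ReLU(), nn.Linear(1000, 1))

# ...kept in check with L2 regularization (weight decay)
opt = torch.optim.SGD(net.parameters(), lr=0.01, weight_decay=1e-4)
```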
01:02:10 Starting NLP with IMDB dataset and the sentiment classification task
NLP = Natural Language Processing 
01:03:10 Tokenizing, and creating the ‘term-document matrix’ & ‘bag-of-words’ representation
“trn, trn_y = texts_from_folders(f'{PATH}train', names)” from the fastai library to build arrays of reviews and labels
Throwing away the order of words with bag-of-words!
01:08:50 sklearn “CountVectorizer()”
“fit_transform(trn)” to learn the vocabulary from the training set and build a term-document matrix.
“transform(val)” to apply the same transformation to the validation set. 
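The fit/transform split matters: the validation set must be mapped through the vocabulary learned on the training set. A toy sklearn example (the strings here are made up, not the IMDB data):

```python
from sklearn.feature_extraction.text import CountVectorizer

trn = ["this movie is good", "this movie is bad", "good good film"]
val = ["bad film"]

veczr = CountVectorizer()
trn_term_doc = veczr.fit_transform(trn)  # learn the vocab, build the matrix
val_term_doc = veczr.transform(val)      # reuse the training vocab
print(trn_term_doc.shape, val_term_doc.shape)  # same number of columns
```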
01:12:30 What is a ‘sparse matrix’? Storing only the non-zero entries to save memory.
More details in Rachel’s “Computational Linear Algebra” course from fast.ai
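A quick illustration of why sparse storage helps, using scipy’s CSR format (the sizes are arbitrary):

```python
import numpy as np
from scipy import sparse

dense = np.zeros((1000, 1000))   # a term-document-style matrix: mostly zeros
dense[0, 3] = 2.0
dense[500, 10] = 1.0

csr = sparse.csr_matrix(dense)   # store only the non-zero entries
print(dense.size, csr.nnz)       # 1000000 cells vs. 2 stored values
```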
01:16:40 Using “Naive Bayes” for bag-of-words approaches.
Transforming words into features, and dealing with the bias/risk of “zero probabilities” in the data.
Some demo/discussion about calculating the probabilities of classes. 
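A sketch of the Naive Bayes log-count-ratio calculation on a tiny made-up term-document matrix, with +1 smoothing to avoid the zero-probability problem mentioned above (the numbers are illustrative, not from the lesson):

```python
import numpy as np

# Rows: documents, columns: token counts; y: class labels
x = np.array([[3, 0, 0],
              [2, 1, 0],
              [0, 3, 1],
              [0, 2, 0]])
y = np.array([1, 1, 0, 0])

p = x[y == 1].sum(0) + 1   # token counts in positive docs (+1 smoothing)
q = x[y == 0].sum(0) + 1   # token counts in negative docs (+1 smoothing)
r = np.log((p / p.sum()) / (q / q.sum()))      # log-count ratio per token
b = np.log((y == 1).mean() / (y == 0).mean())  # log of the class prior ratio
pred = (x @ r + b) > 0     # Naive Bayes decision per document
```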
01:25:00 Why is it called “Naive Bayes”?

01:30:00 The difference between theory and practice for “Naive Bayes”
Using logistic regression where the features are the unigrams
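That combination in sklearn terms, on a toy corpus (made-up reviews; `C` controls regularization strength, with smaller values meaning more regularization):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

trn = ["good great fun", "boring bad awful", "great movie", "bad plot"]
trn_y = [1, 0, 1, 0]

veczr = CountVectorizer()                  # unigram features
trn_term_doc = veczr.fit_transform(trn)

m = LogisticRegression(C=0.1)              # regularized logistic regression
m.fit(trn_term_doc, trn_y)
print(m.predict(veczr.transform(["great fun movie"])))
```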
01:35:40 Using bigrams & trigrams with Naive Bayes (NB) features
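With sklearn this is just `ngram_range`; bigrams and trigrams recover some of the word order that plain bag-of-words throws away (a toy example):

```python
from sklearn.feature_extraction.text import CountVectorizer

veczr = CountVectorizer(ngram_range=(1, 3))  # unigrams, bigrams, trigrams
veczr.fit(["this movie is good"])

# Vocabulary now includes "this movie", "movie is good", etc.
print(sorted(veczr.vocabulary_))
```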