Lesson 3 official topic

Well, 1D is a vector, 2D is a matrix, and then we want to stop inventing new words for all the other numbers of dimensions, so we use 'n-dimensional tensor'. The actual history of the word comes from wanting to represent stress on an object as a field that varies at different locations, so it was a tensor (tension) field.
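A quick sketch of the idea in plain Python (no PyTorch required): each extra level of nesting adds one dimension, and the "rank" of the tensor is just how many levels there are. The `shape` helper here is a hypothetical illustration, not a library function.

```python
vector = [1.0, 2.0, 3.0]                      # 1D: rank-1 tensor, shape (3,)
matrix = [[1.0, 2.0], [3.0, 4.0]]             # 2D: rank-2 tensor, shape (2, 2)
tensor3d = [[[1.0], [2.0]], [[3.0], [4.0]]]   # 3D: rank-3 tensor, shape (2, 2, 1)

def shape(x):
    """Return the shape of a regularly nested list, outermost dimension first."""
    dims = []
    while isinstance(x, list):
        dims.append(len(x))
        x = x[0]
    return tuple(dims)

print(shape(vector))    # (3,)
print(shape(matrix))    # (2, 2)
print(shape(tensor3d))  # (2, 2, 1)
```

In PyTorch, `torch.tensor(matrix).shape` would report the same `(2, 2)`; the word "tensor" just generalizes vector/matrix to any rank.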

5 Likes

Lesson 3 makes me think it's time to learn PyTorch. Any good resources for learning the PyTorch concepts covered in chapter 4 of the book?

2 Likes

Technically you can do it, but it is going to be very expensive computationally.

Adjust the learning rate (the number 0.01) and the number of epochs (the number in the range), just referencing the video right now. We have three sets: train, valid, and test. Just adjusting those will take you far. We will learn cool tricks in later lessons. As a start, review learn.fit_one_cycle in the docs.
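To see concretely what those two knobs control, here is a minimal training loop in plain Python (a sketch, not fastai's actual `fit_one_cycle`): the learning rate sets the size of each update step, and the number of epochs sets how many passes we make over the data.

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # true relationship: y = 2x

w = 0.0          # the parameter we want to learn
lr = 0.01        # learning rate: size of each update step
epochs = 100     # epochs: number of passes over the data

for _ in range(epochs):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad   # step against the gradient

print(round(w, 3))  # close to 2.0
```

Too large a learning rate makes the updates overshoot and diverge; too few epochs stops training before `w` has converged. fastai's `fit_one_cycle` schedules the learning rate over the run rather than keeping it fixed, but the two parameters play the same roles.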

1 Like

PyTorch tutorials are really good. You can check here.

7 Likes

21 posts were split to a new topic: Why do we use ReLUs

What are good resources to practice and learn practical object-oriented programming for deep learning / machine learning?

Nice, thanks Arun! It was bothering me to keep hearing terms whose semantic meaning I didn't understand.

1 Like

I would stick with the fastai library, as it deals with a lot of the kinks of low-level PyTorch as you get started. However, "Deep Learning with PyTorch" (Manning) is very good. The official docs are also very good and have great tutorials.

https://pytorch.org/tutorials/

6 Likes

Can we have an example of semi-supervised learning, where there is one class, and there is unlabeled data (no second class)?

2 Likes

I recommend watching Jeremy's Machine Learning course, where he implemented Decision Trees / Random Forests from scratch using OOP. It's highly practical and was released in 2018.
Second part of the video: Intro to Machine Learning: Lesson 5 - YouTube

6 Likes

The PyTorch official documentation: Resources | PyTorch
Here are some other good resources listed too: Top 7 Free Resources To Learn Deep Learning With PyTorch
(And fast.ai is mentioned there as well!)

2 Likes

The best resources are working example notebooks that you can play with. Examine the classes & functions and understand their purpose. Examine what other parameters you could use with those functions, but then also the other functions that those modules support. You can query these details directly within the notebooks, and also try using them.

Semi-supervised learning - where not all of the training data is annotated

Example application of semi-supervised learning

A common example of an application of semi-supervised learning is a text document classifier. This is the type of situation where semi-supervised learning is ideal, because it would be nearly impossible to find a large number of labeled text documents. It is simply not time-efficient to have a person read through entire text documents just to assign them a simple classification.

So, semi-supervised learning allows the algorithm to learn from a small number of labeled text documents while still classifying a large number of unlabeled text documents in the training data.
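One common semi-supervised recipe is pseudo-labeling: train on the labeled set, predict on the unlabeled set, and fold high-confidence predictions back into the training data before retraining. Here is a toy sketch in plain Python; the word-overlap "model" and the 0.5 confidence threshold are hypothetical stand-ins for a real text classifier.

```python
labeled = [("refund invoice payment", "finance"),
           ("touchdown season coach", "sports")]
unlabeled = ["invoice overdue payment", "coach traded before season"]

def train(data):
    """'Train' by collecting the words seen for each class."""
    vocab = {}
    for text, label in data:
        vocab.setdefault(label, set()).update(text.split())
    return vocab

def predict(vocab, text):
    """Score each class by word overlap; return (label, confidence)."""
    words = set(text.split())
    scores = {label: len(words & seen) / len(words)
              for label, seen in vocab.items()}
    label = max(scores, key=scores.get)
    return label, scores[label]

model = train(labeled)
for text in unlabeled:
    label, conf = predict(model, text)
    if conf >= 0.5:                     # only trust confident pseudo-labels
        labeled.append((text, label))   # fold them back into the training set

model = train(labeled)                  # retrain on the enlarged set
```

The key design choice is the confidence threshold: set it too low and wrong pseudo-labels pollute the training set; set it too high and you gain little from the unlabeled data.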

1 Like

I liked the “tutorial for beginners” in PyTorch’s documentation, I think this is it: Learn the Basics — PyTorch Tutorials 1.11.0+cu102 documentation

After that, I loved “What is torch.nn really?”, by Jeremy and Rachel, I think.

I also like this book if you want to use PyTorch for your own projects, but in my opinion it’s much easier to start with fastai and check stuff as you need it.

5 Likes

I found some interesting books that can shed some light on this. Try Deep Learning from Scratch, Deep Learning for Coders, and finally Grokking Deep Learning. Switching back and forth to see how something you are reading about is implemented has helped me with this combination. As a bonus, Grokking Machine Learning goes in depth on classic ML algorithms, metrics, model complexity, and ensembling.

3 Likes

This would be great to know more about. I presume it might be covered later in the course.

Conformal Prediction seems a promising way to deal with this.

http://people.eecs.berkeley.edu/~angelopoulos/blog/posts/gentle-intro/

1 Like

In the Excel exercise, when Jeremy is doing some feature engineering, he comes up with two new columns: Pclass_1 and Pclass_2. My question is why there is no Pclass_3 column as well? … Is it because Pclass_1 = 0 and Pclass_2 = 0 implies Pclass_3 = 1, so in a way, two columns are enough to encode the info of the original column?

6 Likes

Exactly. With two boolean columns, you can encode up to four combinations of binary digits, right? So up to four different classes. Here, (0, 0) already identifies Pclass_3, so a third column would be redundant.
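A tiny sketch of the idea (the `encode_pclass` helper is hypothetical, not from the spreadsheet): with three classes, two dummy columns suffice because the third class is implied when both are 0. This is the usual "drop one column" convention for one-hot encoding.

```python
def encode_pclass(pclass):
    """Map Pclass in {1, 2, 3} to the pair (Pclass_1, Pclass_2)."""
    return (1 if pclass == 1 else 0,
            1 if pclass == 2 else 0)

print(encode_pclass(1))  # (1, 0)
print(encode_pclass(2))  # (0, 1)
print(encode_pclass(3))  # (0, 0)  -> Pclass_3 is implied
```

Dropping the redundant column also avoids the "dummy variable trap" in linear models, where a full set of k columns is perfectly collinear with the intercept.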

2 Likes