Wiki: Lesson 2

@jeremy

Question: in the lesson 1 notebook, it says “we simply keep increasing the learning rate from a very small value, until the loss starts decreasing”. Shouldn’t this be until the loss starts INCREASING? Since we are trying to minimize loss?

Does anybody know what the parameter ps is in this method: learn = ConvLearner.pretrained(arch, data, precompute=True, ps=0.5)?

I am also just generally having a hard time learning about the fastai methods when I want to do something different, since the library is new and seemingly not well documented yet. Is there a place to look other than pressing Shift+Tab for more detail?

It’s for dropout.

To look for more detail, try ??ConvLearner.pretrained in a cell…
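For example, in a Jupyter cell (a single ? shows the signature and docstring, a double ?? shows the full source):

?ConvLearner.pretrained    # signature and docstring
??ConvLearner.pretrained   # full source code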

@Dhruv, apologies, I must be blind because I still don’t see it :frowning:

What is the name of the python notebook that contains the dogbreed code?

1 Like

@corey
I’ve been keeping a list of the fastai terms. Whenever I can’t remember what something stands for, I go to the top of my repo fastai_deeplearn_part1 and do a search.

In this case, ps means p’s (plural, I believe) to represent the probability of dropout.
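As a hedged sketch of how ps is typically passed with the fastai 0.7-era API (the xtra_fc keyword and the list form are from memory, so double-check with Shift+Tab):

# one dropout probability applied to the added classifier head
learn = ConvLearner.pretrained(arch, data, precompute=True, ps=0.5)

# or, if I recall correctly, one probability per added fully connected layer
learn = ConvLearner.pretrained(arch, data, precompute=True, ps=[0.25, 0.5], xtra_fc=[512])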

6 Likes

Hi,

I’ve been trying to recreate Jeremy’s code from the dog breeds example in lesson 2. I have managed to get up to 1:31:08 of the YouTube video but get an error when I run:

learn = ConvLearner.pretrained(arch, data, precompute=True)

error: FileNotFoundError: [Errno 2] No such file or directory: ‘/home/ian/fastai/courses/dl1/fastai/weights/resnext_101_64x4d.pth’

Has anyone got a similar error before, and how did you resolve it?

Thanks !

EDIT: I found a similar issue someone else in the forum encountered here and downloading the weights solved the issue.
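For anyone hitting the same error, here is a rough sketch of what the workaround amounts to (the archive URL is an assumption based on what was circulating at the time, and the destination path just matches my error message above, so adjust both to your own setup):

import urllib.request, tarfile

url = 'http://files.fast.ai/models/weights.tgz'         # assumed location of the pretrained weights archive
dest = '/home/ian/fastai/courses/dl1/fastai/'           # directory containing the fastai package
urllib.request.urlretrieve(url, dest + 'weights.tgz')   # download the archive
with tarfile.open(dest + 'weights.tgz') as tar:
    tar.extractall(dest)                                # should create weights/resnext_101_64x4d.pth; adjust if the layout differs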
Ian

4 Likes

Hi
May I ask whether the tip mentioned in “01:32:45 Undocumented Pro-Tip from Jeremy: train on a small size, then use ‘learn.set_data()’ with a larger data set (like 299 over 224 pixels)” is a kind of data augmentation or not?
And why can the neural network adapt to a different image size dynamically/automatically?
Thank you.

I recreated Jeremy’s Dogbreeds notebook and got into the top 16% of the competition, so I think changing the image size works perfectly.
As I understand it, the bigger images (299) are like new images for the model, so it learns again without overfitting.
Jeremy talks about size-changing in detail in Lesson 3 (short answer: the model resizes the original images to 224 or 299 every time they are loaded), so if the original size of your images is very large it is better to resize them beforehand.
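A rough sketch of that trick with the 0.7-era API (get_data is assumed to be the small helper from the notebook that builds an ImageClassifierData at a given size; learning rates and epoch counts are just illustrative):

data = get_data(224, bs)                          # start training on smaller images
learn = ConvLearner.pretrained(arch, data, precompute=True)
learn.fit(1e-2, 2)

learn.precompute = False
learn.set_data(get_data(299, bs))                 # same model, larger images
learn.freeze()                                    # keep the pretrained conv layers frozen at first
learn.fit(1e-2, 3, cycle_len=1)

The convolutional layers do not depend on the input size (and, as far as I remember, the fastai head pools adaptively), which is why the same weights can simply keep training at 299.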

3 Likes

Hey @GregFet

I managed to get up to the last step where we explored using log_preds and got the error shown below. Did you get something similar as well?

Thanks
Ian

Sure. Guys helped.

1 Like

Thank you! I am slightly embarrassed that I did not come across this earlier. Will search better next time!

:slight_smile:

#1 is definitely misleading. Turning data augmentation on in step 1 has no effect while precompute=True.
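A hedged sketch of why (0.7-era API; PATH, sz, bs, and arch are assumed to come from earlier in the notebook): with precompute=True the learner trains on activations cached from the un-augmented images, so the transforms only start to matter once precompute is switched off:

tfms = tfms_from_model(arch, sz, aug_tfms=transforms_side_on, max_zoom=1.1)
data = ImageClassifierData.from_paths(PATH, tfms=tfms, bs=bs)
learn = ConvLearner.pretrained(arch, data, precompute=True)
learn.fit(1e-2, 1)         # augmentation has no effect here: the cached activations are reused every epoch

learn.precompute = False   # from now on each epoch sees freshly augmented images
learn.fit(1e-2, 3, cycle_len=1)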

Hey guys! So I just finished Lesson 2 and I have a few doubts.

1: What are the precomputed activations Jeremy talks about in the lesson? For example, some activations fire when there are eyeballs in the picture, some fire when there are dogs, and so on. I just want to understand them fundamentally.

2: I think this is related, but what do you mean by freezing and unfreezing layers?

2 Likes
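Not a complete answer, but here is a rough sketch of what freezing looks like in practice (0.7-era API; the learning rates are just illustrative, and np is assumed to be numpy from the notebook imports). Freezing keeps the pretrained convolutional layers’ weights fixed so only the new head trains; unfreezing makes everything trainable again, usually with smaller learning rates for the earlier layers:

learn.freeze()                        # only the newly added head layers are updated
learn.fit(1e-2, 2)

learn.unfreeze()                      # all layers are trainable again
lrs = np.array([1e-4, 1e-3, 1e-2])    # differential learning rates: earliest layers change the least
learn.fit(lrs, 3, cycle_len=1)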

For anyone else with the same issue, this was my new code that worked for me:

log_preds, y = learn.TTA()                      # test-time augmentation: log-probabilities per augmented copy, plus targets
probs = np.mean(np.exp(log_preds), axis=0)      # convert to probabilities and average across the augmentations
accuracy(probs, y), metrics.log_loss(y, probs)  # assumes numpy (np) and sklearn's metrics are already imported in the notebook

Cheers

Ian

5 Likes

Can you point out the exact repo? I’ve looked at the repos on the internet but can’t seem to find it. Thanks!

I was redoing Lesson 2 and am currently facing a challenge with learn.sched.plot(). My learn.lr_find() runs fine, but when I plot the result I am not able to infer anything from it. What should I do to make the learning rate plot readable enough to pick a value?

[screenshot of the learn.sched.plot() output]
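A hedged sketch of things to try (0.7-era API; the keyword names are from memory, so check them with Shift+Tab): sweep a narrower range, or trim the noisy first and last batches when plotting:

learn.lr_find(start_lr=1e-5, end_lr=1e-1)   # narrower sweep than the default end_lr of 10
learn.sched.plot(n_skip=5, n_skip_end=5)    # skip a few batches at each end, which are usually noise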

Hi @GregFet, your response does answer my question :slight_smile:
Since you mentioned “like … new images”, I would consider this tip a kind of “data augmentation”.
Many thanks.

Hi everybody!
I cannot find the tmp_lesson1-breeds.ipynb notebook in the repo.
Could someone provide the link, please?

1 Like

You’re meant to create the lesson breeds notebook yourself :slight_smile:

5 Likes

OK @jeremy,
Indeed, since all the parts of the implementation are already in the lesson 2 lecture, I thought the notebook had been made available.
So I will recreate it from the lecture, thank you.