Things Jeremy says to do


(Aditya Anantharaman) #21

I did, @init_27, but no replies from there yet. Also, I suspect the reason I’m getting such high accuracy with the elephant classifier is that both African and Indian elephants are classes in the original ImageNet dataset. I’ll have to try it with another example tomorrow. Thanks for the suggestion.


(Aditya Anantharaman) #22

Hey @raimanu-ds, I tried downloading a dataset from Kaggle following the steps on your blog, but I don’t actually understand where to place the kaggle.json file on my Google Drive.
By default, when I upload a file it goes to ‘/content/gdrive/My Drive/’, but in your blog you copied it to /root/.kaggle/kaggle.json. How do I do that? This also isn’t specified in the Medium article you linked to. Need help!

EDIT


Here’s a great resource to solve the problem.


#23

Hi @ady_anr,

I see you got it to work, well done :+1:

To answer your question, after downloading kaggle.json I moved it into the ~/.kaggle folder, because that is where the Kaggle API expects the credentials to be located.

I used Google Drive’s UI to create the kaggle folder and move the json file.
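For anyone else who gets stuck on this step, here is a minimal sketch of the Colab commands, assuming kaggle.json was uploaded to a kaggle folder inside My Drive (the source path is an assumption; adjust it to wherever you put the file):

```python
# Mount Google Drive in Colab (prompts for authorization)
from google.colab import drive
drive.mount('/content/gdrive')

import os, shutil

# Assumed upload location of kaggle.json inside My Drive
src = '/content/gdrive/My Drive/kaggle/kaggle.json'

# The Kaggle API looks for credentials in ~/.kaggle (i.e. /root/.kaggle on Colab)
dst_dir = os.path.expanduser('~/.kaggle')
os.makedirs(dst_dir, exist_ok=True)
shutil.copy(src, os.path.join(dst_dir, 'kaggle.json'))

# Restrict permissions so the Kaggle CLI doesn't warn about a world-readable key
os.chmod(os.path.join(dst_dir, 'kaggle.json'), 0o600)
```

After that, the kaggle download commands from the blog should pick up the credentials automatically.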


(Robert Bracco) #24

Lesson 3

  1. If you use a dataset, it would be very nice of you to cite the creator and thank them for their dataset.
  2. This week, see if you can come up with a problem that you would like to solve that is either multi-label classification or image regression or image segmentation or something like that and see if you can solve that problem. Context: Fast.ai Lesson 3 Homework
  3. Always use the same stats that the model was trained with (see the sketch after this list). Context: Lesson 3: Normalized data and ImageNet
  4. In response to “Is there a reason you shouldn’t deliberately make lots of smaller datasets to step up from in tuning, let’s say 64x64 to 128x128 to 256x256?”: Yes you should totally do that, it works great, try it! Context: Lesson 3: 64x64 vs 128x128 vs 256x256
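A rough sketch of points 3 and 4 in fastai v1-style code (the folder path, image sizes, and architecture are placeholder choices, not taken from the lesson):

```python
from fastai.vision import *

path = Path('data/my_dataset')  # hypothetical folder with train/ and valid/ subfolders

# Point 3: a model pretrained on ImageNet should see data normalized with ImageNet stats
data_small = ImageDataBunch.from_folder(path, ds_tfms=get_transforms(), size=64).normalize(imagenet_stats)
learn = cnn_learner(data_small, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)

# Point 4: step up to a larger image size and keep training, still with the same stats
data_big = ImageDataBunch.from_folder(path, ds_tfms=get_transforms(), size=128).normalize(imagenet_stats)
learn.data = data_big
learn.fit_one_cycle(4)
```

If I remember correctly, swapping `learn.data` to a larger-size databunch is roughly the pattern the lesson 3 notebooks use to step images up in size.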

#25

Hey, great idea ;] Could you edit your first post and add the tips from the next lessons? It would be much easier to find them. Thanks!


(Robert Bracco) #26

Thanks for the input. Done.


(Alexandre Cadrin-Chênevert) #27

Be careful: if you follow what @jeremy says to do too rigorously, you run the risk of being 2-3 years ahead of everyone else. That is probably the story of his life.


(Robert Bracco) #28

As the lessons go on, the “do this” type advice is becoming more specific and specialized. I’ll keep updating this thread as I complete lessons, but the meat is in the advice from lessons 1 and 2. Most of what I post from lesson 3 onward can also be found in the (really awesome) lesson notes available in the forum.

Lesson 4

  1. If you’re doing NLP stuff, make sure you use all of the text you have (including the unlabeled validation set) to train your model, because there’s no reason not to (see the sketch after this list). Lesson 4: A little NLP trick

  2. In response to “What are the 10% of cases where you would not use neural nets?”: You may as well try both. Try a random forest and try a neural net. Lesson 4: How to know when to use neural nets

  3. Use these terms (parameters, layers, activations, etc.) and use them accurately. Lesson 4: Important vocabulary for talking about ML
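For point 1, here is a hedged fastai v1-style sketch of the trick (the file names and the 'text' column are made up; the point is just that the language model fine-tuning data includes the unlabeled text too):

```python
from fastai.text import *
import pandas as pd

# Hypothetical data: train.csv has text + labels, test.csv has text only
train_df = pd.read_csv('train.csv')
test_df = pd.read_csv('test.csv')

# Language-model fine-tuning needs no labels, so feed it ALL the text you have
all_text = pd.concat([train_df[['text']], test_df[['text']]], ignore_index=True)

data_lm = (TextList.from_df(all_text, cols='text')
           .split_by_rand_pct(0.1)
           .label_for_lm()
           .databunch())

learn_lm = language_model_learner(data_lm, AWD_LSTM)
learn_lm.fit_one_cycle(1)
learn_lm.save_encoder('fine_tuned_enc')  # reuse this encoder in the downstream classifier
```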


(Robert Bracco) #29

Lesson 5

  1. The answer to the question “Should I try blah?” is to try blah and see, that’s how you become a good practitioner. Lesson 5: Should I try blah?