Things Jeremy says to do

Jeremy, thanks for the reply and advice!

@ady_anr About overfitting, Jeremy says in lesson 2 that it’s quite hard to do using the fast.ai library. He tried changing a bunch of variables to get a model to overfit just so he could talk about it in class, and he couldn’t manage it. If you overfit, you’ll generally get worse results on your validation set because your model doesn’t generalize. Maybe you can download some more images and test whether your current model classifies them correctly; if it has a high failure rate, I’d get more data and retrain the model.
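
For anyone wanting to run that check quickly, here’s a minimal sketch assuming the fastai v1 API from the course; `learn` is your trained Learner and `new_images/` is a hypothetical folder of freshly downloaded images:

```python
from pathlib import Path
from fastai.vision import open_image  # fastai v1, as used in the course

# Run the trained model over each new image and eyeball the failures
for fname in Path('new_images').iterdir():
    img = open_image(fname)
    pred_class, pred_idx, probs = learn.predict(img)
    print(fname.name, pred_class, probs[pred_idx])
```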

@init_27 Thanks for your blog. Your post “How not to do fast.ai” was one of the inspirations for this thread! Still waiting on the ‘How to do fast.ai’ thread :grinning:

3 Likes

Thanks for reading! :slight_smile:
I’m trying a few more ideas and plan on sharing them in my second pass through the course (I intend to do three passes; I’m currently about to complete my first).

Thanks for sharing your approach; it’s a great way to distill Jeremy’s advice as well as to leave points for others to pursue.

Regards,
Sanyam.

1 Like

I did, @init_27, but no replies there yet. Also, I think the reason I’m getting such high accuracy with the elephant classifier is that both African and Indian elephants are classes in the original ImageNet dataset. I’ll have to try another example tomorrow. Thanks for the suggestion.

1 Like

Hey @raimanu-ds, I tried downloading a dataset from Kaggle following the steps on your blog, but I don’t actually understand where to place the kaggle.json file on my Google Drive.
By default, when I upload a file it goes to ‘/content/gdrive/My Drive/’, but in your blog you’ve copied it to /root/.kaggle/kaggle.json. How do I do that? This also isn’t specified in the Medium article you linked to. Need help!

EDIT: Here’s a great resource to solve the problem.

3 Likes

Hi @ady_anr,

I see you got it to work, well done :+1:t3:

To answer your question: after downloading kaggle.json, I moved it into the ~/.kaggle folder because this is where the Kaggle API expects the credentials to be located.

I used Google Drive’s UI to create the kaggle folder and move the json file.
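
For anyone else stuck on this step in Colab, here’s a minimal sketch, assuming kaggle.json was uploaded to the top level of your Drive (the paths are the standard Colab defaults):

```python
from google.colab import drive

# Mount Google Drive inside the Colab VM
drive.mount('/content/gdrive')

# Copy the credentials to where the Kaggle API expects them
!mkdir -p ~/.kaggle
!cp "/content/gdrive/My Drive/kaggle.json" ~/.kaggle/kaggle.json
!chmod 600 ~/.kaggle/kaggle.json  # the API warns if the file is world-readable
```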

1 Like

Lesson 3

  1. If you use a dataset, it would be very nice of you to cite the creator and thank them for their dataset.
  2. This week, see if you can come up with a problem that you would like to solve that is either multi-label classification or image regression or image segmentation or something like that and see if you can solve that problem. Context: Fast.ai Lesson 3 Homework
  3. Always use the same stats that the model was trained with. Context: Lesson 3: Normalized data and ImageNet
  4. In response to “Is there a reason you shouldn’t deliberately make lots of smaller datasets to step up from in tuning, let’s say 64x64 to 128x128 to 256x256?”: Yes, you should totally do that, it works great, try it! (see the sketch below) Context: Lesson 3: 64x64 vs 128x128 vs 256x256
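
A minimal sketch of points 3 and 4 together, assuming the fastai v1 API used in the course and a hypothetical ImageNet-style folder at `path`: every DataBunch gets normalized with the same `imagenet_stats` the pretrained model saw, and the image size steps up between training runs while the learned weights are kept.

```python
from fastai.vision import *

path = Path('data/my_images')  # hypothetical ImageNet-style folder

# Start small: 64x64 images, normalized with the ImageNet training stats
data = ImageDataBunch.from_folder(path, size=64, bs=64).normalize(imagenet_stats)
learn = cnn_learner(data, models.resnet34, metrics=accuracy)
learn.fit_one_cycle(4)

# Step up to 128x128, then 256x256: same stats, smaller batches, same weights
for size, bs in [(128, 32), (256, 16)]:
    learn.data = ImageDataBunch.from_folder(path, size=size, bs=bs).normalize(imagenet_stats)
    learn.fit_one_cycle(4)
```
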
8 Likes

Hey, great idea ;] Could you edit your first post and add the tips from the next lessons? It would be much easier to find them. Thanks!

1 Like

Thanks for the input. Done.

1 Like

Be careful if you follow too rigorously what @jeremy says to do: you risk being 2-3 years ahead of everyone else. That is probably the story of his life.

7 Likes

As the lessons go on, the “do this” type advice becomes more specific and specialized. I’ll keep updating this thread as I complete lessons, but the meat is in the advice from lessons 1 and 2. Most of what I post from lesson 3 forward can be found in the (really awesome) lesson notes that have been made available on the forum.

Lesson 4

  1. If you’re doing NLP stuff, make sure you use all of the text you have (including the unlabeled validation set) to train your model, because there’s no reason not to (see the sketch after this list). Lesson 4: A little NLP trick

  2. In response to “What are the 10% of cases where you would not use neural nets?”: You may as well try both. Try a random forest and try a neural net. Lesson 4: How to know when to use neural nets

  3. Use these terms (parameters, layers, activations, etc.) and use them accurately. Lesson 4: Important vocabulary for talking about ML
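
A minimal sketch of the trick in point 1, assuming the fastai v1 text API and an IMDB-style layout with hypothetical `train`, `test`, and `unsup` subfolders: the language model trains on all of the text, labeled or not, because language modeling needs no labels.

```python
from fastai.text import *

path = Path('data/imdb')  # hypothetical folder

# Build language-model data from every piece of text we have,
# including the unlabeled folder: the "labels" are just the next words
data_lm = (TextList.from_folder(path)
           .filter_by_folder(include=['train', 'test', 'unsup'])
           .split_by_rand_pct(0.1)
           .label_for_lm()
           .databunch(bs=48))

learn = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.3)
learn.fit_one_cycle(1)
```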

4 Likes

Lesson 5

  1. The answer to the question “Should I try blah?” is to try blah and see; that’s how you become a good practitioner. Lesson 5: Should I try blah?

  2. If you want to play around, try to create your own nn.Linear class. You could create something called My_Linear, and it will take you, depending on your PyTorch experience, an hour or two. We don’t want any of this to be magic, and you know everything necessary to create this now. These are the things you should be doing for assignments this week: not so much new applications, but trying to write more of these things from scratch and getting them to work. Learn how to debug them and check what’s going in and coming out (a sketch follows this list). Lesson 5 Assignment: Create your own version of nn.linear

  3. A great assignment would be to take the Lesson 2 SGD notebook and try to add momentum to it. Or, in the new notebook we have for MNIST, get rid of optim.SGD and write your own update function with momentum (see the momentum sketch after this list). Lesson 5: Another suggested assignment
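
A minimal sketch of assignment 2 in plain PyTorch (the class name My_Linear comes from the lesson; the Kaiming-style init is one reasonable choice, not necessarily Jeremy’s):

```python
import math
import torch
import torch.nn as nn

class My_Linear(nn.Module):
    """A from-scratch replacement for nn.Linear."""
    def __init__(self, n_in, n_out):
        super().__init__()
        # Kaiming-style init keeps activations at a sensible scale
        self.weight = nn.Parameter(torch.randn(n_out, n_in) * math.sqrt(2 / n_in))
        self.bias = nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        return x @ self.weight.t() + self.bias

lin = My_Linear(784, 10)
print(lin(torch.randn(64, 784)).shape)  # torch.Size([64, 10])
```

And one way to hand-roll the momentum update from assignment 3 (`params` is a hypothetical list of tensors with populated `.grad`; `vels` starts as zeros):

```python
import torch

def update(params, vels, lr=1e-2, mom=0.9):
    # One velocity buffer per parameter, e.g. [torch.zeros_like(p) for p in params]
    with torch.no_grad():
        for p, v in zip(params, vels):
            v.mul_(mom).add_(p.grad)  # v = mom * v + grad
            p.sub_(lr * v)            # p = p - lr * v
            p.grad.zero_()
```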

3 Likes

Lesson 6

  1. Not an explicit “do this” but it feels like it fits here. “One of the big opportunities for research is to figure out how to do data augmentation for different domains. Almost nobody is looking at that and to me it is one of the biggest opportunities that could let you decrease data requirements by 5-10x.” Lesson 6: Data augmentation on inputs that aren’t images

  2. Take your time going through the convolution kernel section and the heatmap section of this notebook, running those lines of code and changing them around a bit. The most important thing to remember is shape (the rank and dimensions of a tensor); a quick shape check is sketched below. Try to think “why?”. Try going back to the printout of the summary, the list of the actual layers, and the picture we drew, and think about what’s going on. Lesson 6: Go through the convolution kernel and heatmap notebook
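
A minimal example of that shape habit in plain PyTorch (the layer sizes are illustrative, not taken from the notebook):

```python
import torch
import torch.nn as nn

x = torch.randn(64, 3, 224, 224)  # rank-4 tensor: (batch, channels, height, width)
conv = nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1)

# Stride 2 halves the spatial dims; 16 kernels give 16 output channels
print(conv(x).shape)  # torch.Size([64, 16, 112, 112])
```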

2 Likes

Hey! It would be great if you could tell us how to load our own dataset using the DataBunch factory methods.

Lesson 7

  1. Don’t let this lesson intimidate you. It’s meant to be intense in order to give you ideas to keep you busy before part two comes out.

Parts 2-5 come from a great speech towards the end of the lesson. I’d highly recommend revisiting it here: Lesson 7: What to do once you’ve completed Part 1

  2. Go back and watch the videos again. There will be bits where you now understand stuff you didn’t before.

  3. Write code and put it on GitHub. It doesn’t matter if it’s great code or not; writing it and sharing it is enough. You’ll get feedback from your peers that will help you improve.

  4. It’s a good time to start reading some of the papers introduced in the course. Feel free to skip all the parts that say derivations/theorems/lemmas; they will add nothing to your understanding of practical deep learning. Read the parts where they talk about why they are solving the problem and what the results are. Write summaries that would explain the papers to the you of 6 months ago.

  5. Perhaps the most important: get together with others. Learning works a lot better if you have that social experience. Start a book club, form a study group, get involved in meetups, and build things. It doesn’t have to be amazing. Build something that will make the world slightly better, or that will be slightly delightful for your two-year-old to see. Just finish something, and then try to make it a bit better. Or get involved with fast.ai and help develop the code and documentation. Check the Dev Projects Index on the forums.

  6. In response to “What would you recommend doing/learning/practicing until the part 2 course starts?”: “Just code. Just code all the time. Look at the shape of your inputs and outputs and make sure you know how to grab a mini-batch (a quick sketch follows this list). There’s so much material that we’ve covered; if you can get to a point where you can rebuild those notebooks from scratch without cheating too much, you’ll be in the top echelon of practitioners and you’ll be able to do all of these things yourself, and that’s really, really rare.” Lesson 7: What to do/learn/practice between now and Part 2 Bonus: This is lesson 7 and the clip starts at t=7777!
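
Grabbing a mini-batch and checking shapes, as a minimal sketch assuming the fastai v1 API (`data` is any DataBunch, e.g. one built as in the lesson 3 sketch above):

```python
# Pull one mini-batch from the training set and inspect its shapes
x, y = data.one_batch()
print(x.shape, y.shape)  # e.g. torch.Size([64, 3, 224, 224]) torch.Size([64])
```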

5 Likes

@MadeUpMasters A suggestion: I learned about this cool summary tool on this platform. You could edit the top post with it to make it look better organised.

If you open with [details="Title displayed"] and then close it with [/details] (note the slash before “details”), Discourse creates a collapsible section.

Ex:

[details="Lesson 1"]
Ideas here
[/details]

gives:

Lesson 1

Ideas here

It creates a drop-down menu for the details.

You can even nest it:

[details="Lesson 1"]
foo
[details="Point 1"]
bar
[/details]
[/details]

gives:

Lesson 1

foo

Point 1

bar

6 Likes

Done! Thanks for the suggestion. One weird thing is I can’t get the titles to be bold. Not a big deal but if you have a quick fix let me know.

Thanks for sharing, this is super wise! I am translating your post into Chinese here.

1 Like
main point
sub point

Thanks! This sub-nest is very cool!

2 Likes

@init_27, I took the liberty of editing your post to show what’s behind the rendered output by simply adding ``` ``` around it; it’s much easier to see how to do it that way.

I’m not sure the nested one works well, as it doesn’t indent the nesting, so it’s just confusing. It doesn’t look like it’s meant to be nested: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/details

1 Like

Many thanks!

I think it might be a personal preference: keeping everything else collapsed allows me to focus on just the one open section, hence my suggestion.

I didn’t realise it doesn’t indent/display the nesting; thanks for pointing that out.

1 Like