A walk with fastai2 - Vision - Study Group and Online Lectures Megathread

What is the default lr_max for the learner when we call fit_one_cycle?
I see self.lr if lr_max is None in the source, but I'm not sure where self.lr is being set.
At some point during a session, could you show us the best way to jump around the source code? :slight_smile:

Yes, if we have time next week I will do so (which we should!). If not, then around lesson 6 we'll get really heavy into source code navigation. The default is 1e-3.
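For reference, here is a minimal sketch of how that default kicks in and how you would override it (dbunch is assumed to be the DataBunch built earlier in the lesson; the architecture and metric are just placeholders):

    from fastai2.vision.all import *

    # Learner is created with lr=defaults.lr (1e-3) unless you pass your own
    learn = cnn_learner(dbunch, resnet34, metrics=error_rate)

    learn.fit_one_cycle(1)               # lr_max is None -> falls back to learn.lr (1e-3)
    learn.fit_one_cycle(1, lr_max=3e-3)  # explicit peak learning rate for the 1cycle schedule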


Just FYI - at least one issue in your "try with resnets" code for dbunch:
I got an assertion error using that code. On closer inspection it has

    pets = DataBlock(types=(PILImage, Category), ...

vs the code early on in the nb:

    pets = DataBlock(blocks=(PILImage, Category), ...

The regex is also slightly different - the orig code matches '.*' at the end while the code below uses '.jpg$' - but I don't think the regex being different should cause this.

When I use the orig code it works…
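For what it's worth, here is a quick sanity check of the two pattern endings on a made-up filename (both pull out the same label, which is why I don't think the regex is the culprit):

    import re

    fname = 'Abyssinian_1.jpg'  # made-up example filename
    print(re.findall(r'([^/]+)_\d+.*', fname))     # ['Abyssinian'] - pattern ending in .*
    print(re.findall(r'([^/]+)_\d+.jpg$', fname))  # ['Abyssinian'] - pattern ending in .jpg$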

Not an issue but just FYI


thank you

Just a suggestion, mueller…
If we could have a walkthrough of some deployment steps - a complete end-to-end image classification app using something like Flask or Render - that would round out the image classification task.
It would also help to enhance our knowledge and ideas if we could share updates on the work we are doing this week.

@muellerzr In the Lesson 1 part 2 video you say that you overfit the data because the validation loss is lower than the training loss - isn't it the other way around, i.e. you actually underfit the training set? I remember Jeremy saying in one of his lectures that a well-trained model will generally have a validation loss higher than the training loss, and that it doesn't count as overfitting as long as the validation accuracy (the metric that you care about) is improving.


Yes, that is correct. Thank you :slight_smile:

On deployment: I'm afraid I do not have that. We'll do some deployment work in week 6, however! That being said, here is the lesson 2 notebook of course-v3 in fastai2, which may assist you: https://github.com/fastai/fastai2/blob/master/nbs/course/lesson2-download.ipynb
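In the meantime, here is a minimal sketch of the export/reload flow that most deployment setups build on (learn is assumed to be a trained Learner, the file and image names are just placeholders, and the exact return values of predict may vary slightly between fastai2 versions):

    from fastai2.vision.all import *

    # `learn` is assumed to be a trained Learner from the lesson notebook
    learn.export('export.pkl')               # serialize the model together with its transforms

    learn_inf = load_learner('export.pkl')   # reload it in the serving process (e.g. a web app)
    pred, pred_idx, probs = learn_inf.predict('my_pet.jpg')  # placeholder image path
    print(pred, probs[pred_idx])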


Nice catch, I'll fix this later today so they match.

I have the same error. I changed types to blocks and got another error, then switched PILImage to ImageBlock and it works. Is PILImage a valid argument?

pets = DataBlock(blocks=(ImageBlock, Category),
                 get_items=get_image_files,
                 splitter=RandomSplitter(),
                 get_y=RegexLabeller(pat = r'/([^/]+)_\d+.*'))

PILImage was the old syntax :sweat_smile: (now it is just ImageBlock). The blocks should be exactly the same as the ones we used earlier. We'll see what PILImage is when we do the lower-level API next week.


Ok great!

I'd like some feedback - you can "like" this comment to vote. If we get over 5 then I will make the adjustment: I can extend next week's lecture a bit and go (briefly) over some techniques for deploying our models on Render. Let me know if this would be valuable :slight_smile: (more so than the course's version I posted earlier).

Also: the notebooks have been adjusted for the issues mentioned earlier, along with the install directions too :slight_smile:

6 Likes

I've added two folders to the repository for now, Tabular and Text. These have some examples and may have issues (they are my old notebooks). When the tabular API gets closer, I'll make the course examples; for now take a look at these, @Meditation :slight_smile:

Hi @muellerzr, excellent lesson last night! Based on my own experience deploying models, I'd like to suggest using Heroku instead of Render. Render is great, but after a while they start charging your card. Since we are just helping others to study and learn how to deploy some toy models, I believe Heroku is the best choice. I even have a tutorial created for our Fastai study group Machine Learning Brasília that you could easily translate and share with your students, if you want: https://github.com/weltonrodrigo/fastai-v3/blob/master/tutorial/deploy-do-classificador-fastai.md . Hope it helps.


I would also like to mention that I'm watching the video again and, so far, I have noticed two spots where it gets truncated, e.g. at 38:42. Not a big deal, but I just wanted to let you know.


Hi @NandoBr! Thank you very much :slight_smile: I will certainly look into it! The goal was to show how to first test locally, and then follow up with how we deploy the app and set it all up. I will look at your tutorial as well, thank you!

On truncation: yes, if there are any other places where it gets too bad, let me know and I can cover those topics again next week. :slight_smile:

I've pinned fastai2 to torchvision<=0.4.2 for the time being; I'll remove the constraint once we have sorted that bug (or they have).


For ImageDataBunch, if we use the factory methods from_csv or from_folder there is a seed argument, which if I'm not wrong is for reproducibility. In the notebook we set np.random.seed(2), but I don't see where we are passing it in. Where is this seed being used?

It will do the same (more or less) as np.random.seed does. That's why we just pass seed=2 to ImageDataBunch.

Take a peek at the splitter in the source code here: https://github.com/fastai/fastai2/blob/master/fastai2/vision/data.py#L26 - it may provide some answers.
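As a quick toy illustration (not the library source itself) of what that seed buys you - as far as I can tell the factory methods hand it to a seeded random splitter under the hood, so the same seed gives the same train/valid split every run, independently of np.random.seed:

    from fastai2.data.transforms import RandomSplitter

    items = list(range(10))                          # stand-in for a list of image files
    splitter = RandomSplitter(valid_pct=0.2, seed=2) # same seed -> same split on every run
    train_idx, valid_idx = splitter(items)
    print(train_idx, valid_idx)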

And next class I will go over how to swim and navigate it all easily :slight_smile:

Thanks for that, will do