Lesson 3 - Official Topic

Just to add to the others’ replies.

Path is a class from Python's standard-library pathlib module. One solution could be

from pathlib import Path

(essentially @mrfabulous1’s suggestion).

However, fastai2 (and in fact fastai as well) adds methods to this class, e.g.

path.ls()

which you can only be sure to load if you use

from fastai2.vision.all import *

as recommended by pierreg.
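
To make the difference concrete, here is a small sketch (the path is just a placeholder):

```
from fastai2.vision.all import *   # re-exports Path and patches extra methods onto it

path = Path('.')
print(path.ls())        # .ls() is added by fastai; plain pathlib has no such method

# with only the standard library you would write something like:
# from pathlib import Path
# print(list(path.iterdir()))
```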


Thanks so much, it worked for me when I used ```
from fastai2.vision.all import *

but I didn't know I could also import it with
from pathlib import Path

Side remark: the format of your text is a bit off. The triple backticks ``` at the end of your first sentence should be at the beginning of the next line.

I have just noticed that now. I didn’t write it; it might have been added because I copied your line. Sorry for the confusion,
cheers!

No worries!

Note that you can edit your post: click on the edit icon (a pencil, a little to the left of “Reply”, next to the three dots).


I posted my notes on chapter 3 from the book here: https://raw.githubusercontent.com/kldarek/fastbook-notes/master/chapter%203.JPG
Not sure if anyone other than me can read it, but I am trying to maintain a weekly drumbeat and posting these on the forum is a good motivation :slight_smile: Enjoy!


Hi, yes, I started from scratch following the setup instructions. But I had already created the fast_template so I thought that may have caused an issue. Thanks for the links – I will take a look at them.

In chapter 2 of fastbook,

We often talk to people who overestimate both the constraints, and the capabilities of deep learning. Both of these can be problems: underestimating the capabilities means that you might not even try things which could be very beneficial; underestimating the constraints might mean that you fail to consider and react to important issues.

In a very humble tone, looking at the continuity of the text: should the opening line be “We often talk to people who underestimate both the constraints…” instead of “We often talk to people who overestimate both the constraints…”?

Thank you @Salazar! Spent too much time before seeing your reply!


There is no difference between a*(t**2) + b*t + c and a*(t-b)**2 + c.

a*(t-b)^2 + c
= a*(t^2 - 2*b*t + b^2) + c
= a*t^2 - 2*a*b*t + a*b^2 + c

This expression is equivalent to A*t^2 + B*t + C, where
A = a
B = -2*a*b
C = a*b^2 + c

Gradient descent will find the correct values in both cases. :slightly_smiling_face:
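
If you want to check this numerically, here is a minimal sketch (made-up data, learning rate and step count) that fits both parameterisations with plain gradient descent in PyTorch:

```
import torch

# synthetic data from a known quadratic, 3*t^2 - 2*t + 1, plus a little noise
t = torch.linspace(-2, 2, 50)
y = 3*t**2 - 2*t + 1 + 0.05*torch.randn(50)

def fit(f, params, lr=0.01, steps=3000):
    "Fit f(t, *params) to y with plain gradient descent; returns the last training loss."
    for _ in range(steps):
        loss = ((f(t, *params) - y)**2).mean()
        loss.backward()
        with torch.no_grad():
            for p in params:
                p -= lr * p.grad
                p.grad.zero_()
    return loss.item()

# parameterisation 1: a*t^2 + b*t + c
p1 = [torch.zeros(1, requires_grad=True) for _ in range(3)]
loss1 = fit(lambda t, a, b, c: a*t**2 + b*t + c, p1)

# parameterisation 2: a*(t - b)^2 + c
p2 = [torch.zeros(1, requires_grad=True) for _ in range(3)]
loss2 = fit(lambda t, a, b, c: a*(t - b)**2 + c, p2)

print(loss1, loss2)   # both should end up small and nearly equal (close to the noise level)
```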

I really like your COVID plots within the app!

Since I didn’t have a shoe image at hand, I just fed the model my own profile picture. Predicted brand: nike, with probability: 0.9657 :wink:


You can look at the requirements.txt I used to deploy a fastai2 model to Azure Functions (which is like Lambda); you can remove the one Azure-specific package when you deploy to Heroku. I do use the CPU versions of PyTorch and Torchvision, but for some reason I also needed ipykernel. I'm not sure whether I need to narrow my imports: currently I do a “from fastai2.vision.all import *” in the deployed model service, as I wasn't sure of the minimal set.

My zipped package came to about 300 MB, even with ipykernel and the host of dependencies it pulled in, so it would be good if fastai2 decoupled the Jupyter-related dependencies for deployment. Also, you can pass --no-cache-dir to pip install to reduce the size, since pip's download cache can otherwise end up in the package.
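
For context, the service itself is tiny; a minimal sketch of the kind of handler I mean looks like this (the export.pkl name is just a placeholder, and I still use the wildcard import because I haven't worked out the minimal set):

```
from fastai2.vision.all import *   # haven't narrowed this down to a minimal set yet

# load the exported Learner once at cold start; the CPU builds of torch/torchvision are enough
learn = load_learner('export.pkl')

def predict_bytes(img_bytes):
    "Run one prediction on raw image bytes and return the label and its probability."
    img = PILImage.create(img_bytes)
    pred, pred_idx, probs = learn.predict(img)
    return {'label': str(pred), 'probability': float(probs[pred_idx])}
```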

Thanks.

Ha ha, I guess your smile looks like a tick!

Here is the question from the study group :slightly_smiling_face::
When creating a new DataBlock using batch_tfms, are the transformations applied at random to each batch as it is passed to our CNN?

And when using the item_tfms argument, is the selected transform applied only once to each item in the whole dataset, or does it get re-applied each epoch?


It gets re-applied each time you need to get that item. So it’s applied each epoch.

Depends on the transformation: Normalize will be applied identically to every batch, while a rotation will be applied randomly.

Being an item_tfm or a batch_tfm does not determine whether the transformation is random or not; you can have a random or a deterministic item_tfm or batch_tfm.
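
A quick sketch to make that concrete (the path and labelling function are placeholders): Resize here is a deterministic item_tfm, aug_transforms() adds random batch_tfms, and Normalize is a deterministic batch_tfm.

```
from fastai2.vision.all import *

dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    get_y=parent_label,
    item_tfms=Resize(224),                                 # deterministic, applied per item
    batch_tfms=[*aug_transforms(),                         # random, re-drawn for every batch
                Normalize.from_stats(*imagenet_stats)],    # deterministic, same for every batch
)
# dls = dblock.dataloaders(path_to_images)  # path_to_images is a placeholder folder
```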


Did you happen to get Voila working in Paperspace?

I'm replacing notebooks with voila/render and get a 404.


Finally got around to cleaning it up and pushing it to GitHub :grinning:


There is even a trick Jeremy describes to train your network on the same images but at different sizes, as a form of data augmentation (a rough fastai2 sketch follows the steps)…
Pseudo steps:

1. build data_224 (size 224)
2. create a network with data_224
3. train the network
4. save the network ("some_name_224")
5. build data_299 (size 299)
6. create a network with data_299
7. load the saved network "some_name_224"
8. train the network
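
Here is that rough sketch in fastai2 (dataset, architecture and epoch counts are just placeholders, not the exact recipe from the lesson):

```
from fastai2.vision.all import *

path = untar_data(URLs.PETS)/'images'

def dls_at(size):
    "Build DataLoaders for the pet images at a given image size."
    return ImageDataLoaders.from_name_re(
        path, get_image_files(path), pat=r'(.+)_\d+.jpg$',
        item_tfms=Resize(size), bs=64)

# train at 224 first, then continue with the same weights at 299
learn = cnn_learner(dls_at(224), resnet34, metrics=error_rate)
learn.fine_tune(3)
learn.save('some_name_224')

learn.dls = dls_at(299)        # swap in the larger-size DataLoaders
learn.load('some_name_224')    # reload the weights saved above
learn.fine_tune(3)
```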

Hi darek.kleczek, hope you are having a marvelous day!

I really enjoyed your notes, I thought they were a very clear summary.

We have similar handwriting :grinning: :grinning: :grinning: so I found them easy to read. :grinning:

Thanks for a great short and concise summary.

Cheers mrfabulous1 :grinning: :grinning:


Thank you @mrfabulous1 - reading your comments on this forum always brings joy :smiley: :smiley: :smiley: Have a wonderful day/night, depending on your time zone :smile:
