Lesson 3 - Official Topic

@christophercao1 if you’re interested in going deeper on the subject, I gave a talk last year called “CUDA in Your Python” that goes into different ways of integrating with the GPU from Python code; it may be useful to you: https://www.youtube.com/watch?v=c9Ezk6d3IuY

2 Likes

Yeah, on Heroku I am only interested in deploying the app, similar to what Jeremy showed with Binder. So I have a trained model, exported to a .pkl file. Now I want to deploy the app to Heroku using Streamlit. The issue I have is that Heroku limits the build size to 500MB. When I put fastai2 in requirements.txt, Heroku builds with pip install fastai2, which installs everything fastai2 has (nlp, tabular, collab, vision) plus all the dependencies, and that is more than 500MB. I found a way to install torch for CPU only, which was just below the limit, but when I uploaded the export.pkl file it went above the limit again :frowning:

What I want to find out is whether it’s possible to do something like pip install --no-deps on Heroku, but so far I haven’t managed to find that…
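For reference, here is a sketch of the CPU-only requirements.txt approach described above; the exact version pins and the PyTorch wheel index are assumptions and will depend on the fastai2 release you trained with:

requirements.txt:
-f https://download.pytorch.org/whl/torch_stable.html
torch==1.4.0+cpu
torchvision==0.5.0+cpu
fastai2
streamlit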

2 Likes

I created a tree classifier comparing two similar types of trees. They were a little too similar and the model didn’t work well. I changed one of the trees, updated the code throughout, and restarted the kernel. But when I run:
fns = get_image_files(path)
fns
the output is still the old JPGs of trees. Does anybody know why the model’s not downloading the new tree_type?
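One hedged guess rather than a confirmed answer: get_image_files only lists what is already on disk, so if the previously downloaded JPGs are still sitting under path, they will keep showing up regardless of the new search term. A minimal sketch of clearing them before re-running the download step, assuming path is the image directory from the notebook:

import shutil

# remove the previously downloaded images so only the new tree type is fetched
shutil.rmtree(path, ignore_errors=True)
path.mkdir(exist_ok=True)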

1 Like

While running the 04_mnist_basics clean notebook on Colab I ran into some weird behavior. I can run the whole notebook without an issue. However, when I try to run just the cells before the Jargon recap heading (simple_net, learn = Learner…, and learn.fit), I get the following error:

Is it correct that augmentation will be applied each time the data loader is asked for data? Can we define the probability for each transformation?
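If it helps, here is a minimal sketch (my own example, not from the lecture) of how fastai2 exposes per-transform probabilities such as p_affine and p_lighting through aug_transforms; path is assumed to point at a folder of labelled images:

from fastai2.vision.all import *

# batch transforms are re-applied every time the DataLoader yields a batch,
# so each epoch (and each request for a batch) sees a different random augmentation
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_image_files,
    splitter=RandomSplitter(seed=42),
    get_y=parent_label,
    item_tfms=Resize(224),
    batch_tfms=aug_transforms(p_affine=0.75, p_lighting=0.75))  # per-group probabilities
dls = dblock.dataloaders(path)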

I have a problem in Google Colab: when I write path = Path('images'), for example, it gives me an error saying path is not defined. What did I miss?

Is it saying that Path is not defined?

Maybe you are missing

from fastai2.vision.all import *

The Darkest Hour is Just Before Dawn

In a rough visual model of epidemic evolution (see the figure below) the cumulative distribution function (CDF) of deaths (blue curve) follows a logistic function. On a given day the CDF is the total number of deaths since the beginning of the epidemic, normalized by the total number of deaths at the end of the epidemic. So toward the end of the epidemic, the CDF approaches 1.

The United States is currently moving through the steeply rising part of the CDF (blue curve).

The derivative of the logistic function is the distribution of deaths, which has a peaked, symmetric shape. Its value on a given day is the number of deaths on that day, normalized by the total number of deaths in the epidemic.

“Bending the curve” refers to the falling away of the slope of the CDF (blue curve) after the peak in daily deaths (red curve) at T = 0.0.

Note that the peak in daily deaths (red curve) happens exactly when the CDF is rising most rapidly, i.e. when the blue curve reaches its steepest slope!
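To make that concrete, here is a small sketch of the math, assuming a logistic CDF with growth rate k and midpoint at t = 0 (the parametrization is my assumption, not taken from the figure):

\[
F(t) = \frac{1}{1 + e^{-kt}}, \qquad
f(t) = F'(t) = \frac{k\,e^{-kt}}{\left(1 + e^{-kt}\right)^{2}} = k\,F(t)\bigl(1 - F(t)\bigr)
\]

Since f(t) = k F(t)(1 − F(t)) is largest when F(t) = 1/2, the daily-deaths peak lands on the day the CDF crosses half of the final toll, which is exactly where its slope is steepest.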

That’s why the epidemiologists tell us that “social distancing is working” even though we can’t see it in the numbers yet.

(See the attached figure.)

5 Likes

I think we are saying effectively the same thing. I am saying the function for the daily fraction of total deaths is a logistic distribution function, and was clarifying that it isn’t an exponential like the OP originally said.

For my web deployment application I used a Kaggle bird dataset, found here.
You can find my web app here: https://birds.smbtraining.com/bird_name

Here is a URL to try it out: https://bdn-data.s3.amazonaws.com/blogs.dir/112/files/2017/07/Loon-by-Nick-Leadley.jpg
Just paste it into the URL box.

10 Likes

Nice job, it looks great!

1 Like

Awesome @sylvaint, thanks for sharing. Is it on your GitHub as well?

Not yet… Will do.

1 Like

Although it would need some caution, or awareness that you might create a feedback loop?

Absolutely.
That’s part of using the whole process in a wise way, putting in place guardrails and solid monitoring.

Very nice, but I’m getting an error when I try to get the bird name, after verifying the image:

1 Like

When we use our model to predict the class of a new image, are the item and batch transforms applied to that image first?

@PDiTO When we call .predict(), the same transforms that were applied to the validation set are applied to the new image.

1 Like
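As a concrete illustration (the file names are placeholders, not from the post above), the exported learner applies those validation-set transforms automatically when you call .predict():

from fastai2.vision.all import *

# load the exported model and classify a single image;
# the item/batch transforms used for the validation set are applied under the hood
learn = load_learner('export.pkl')
pred_class, pred_idx, probs = learn.predict('some_image.jpg')
print(pred_class, probs[pred_idx])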

There is some kind of invalid input that crashed the Python code… Looking into it now. I restarted the server. Thank you for reporting that!

2 Likes

Does the image augmentation change between calls? For training, the augmentation presents a different image each epoch. Is that also the case when doing multiple predicts on one image?