Lesson 2 - Official Topic

I saw this error while running ImageClassifierCleaner. Filed it here:
https://github.com/fastai/fastbook/issues/73
Anyone else seen this? @sgugger

Hi,
I had the same problem :frowning_face: on Paperspace. I had to do Kernel -> Restart & Clear Output and then run again.

1 Like

BTW, I used shutil.move(str(cleaner.fns[idx]), str(path/cat)) and it worked.
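
For context, the cleanup loop in the notebook looks roughly like this (assuming cleaner is the ImageClassifierCleaner instance and path is the dataset folder; the exact code may differ between notebook versions):

import shutil

# delete the images marked for removal, then move re-labelled ones to their new category folder
for idx in cleaner.delete():
    cleaner.fns[idx].unlink()
for idx, cat in cleaner.change():
    shutil.move(str(cleaner.fns[idx]), str(path/cat))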

2 Likes

This is fixed now.

1 Like

Thanks @mrfabulous1 for the detailed post and for explaining the issues with the models.

I would also request help with the error below:

/usr/local/lib/python3.6/dist-packages/PIL/Image.py:932: UserWarning: Palette images with Transparency expressed in bytes should be converted to RGBA images
"Palette images with Transparency expressed in bytes should be "

Good morning!

I haven’t experienced this particular error, but it looks like PIL doesn’t recognise the transparency in the image.

Maybe converting it with something like the code below could help.

from PIL import Image

im = Image.open("image1.png")
im.show()
print(im.mode)  # "P" means a palette image
im.convert("RGBA").save("image2.png")
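
If the warning comes up for lots of downloaded images, the same conversion could be applied over a whole folder, roughly like this (the folder name is a placeholder and converting in place is just one option):

from pathlib import Path
from PIL import Image

path = Path("bears")  # placeholder for your download folder
for fn in path.rglob("*.png"):
    im = Image.open(fn)
    if im.mode == "P":  # palette image, possibly with transparency bytes
        im.convert("RGBA").save(fn)  # overwrite in place with an RGBA version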

Also check this post https://stackoverflow.com/questions/1233772/pil-does-not-save-transparency

Hope this helps

Cheers mrfabulous1 :smiley: :smiley:

1 Like

On Colab, the following code block

!pip install voila
!jupyter serverextension enable voila --sys-prefix

from 02_production.ipynb throws this error:

Enabling: voila
Writing config: /usr/etc/jupyter
Validating...
Error loading server extension voila
X is voila importable?

Has anyone seen this, and/or does anyone know how to fix it?

2 Likes

I adapted the bears classifier to build a bird classifier that distinguished between cardinals, crows, falcons, orioles and ravens. As I expected, the classifier did very well, except that it was not good at differentiating between crows and ravens. I doubt that I would be able to tell the difference, either. This confusion matrix was the best result before over-fitting kicked in:
[confusion matrix image]

On the plus side, it was not fooled by a red-headed falcon:

Select your bird!
[Upload (1)] [Classify]
Prediction: falcon; Probability: 0.9937

2 Likes

Is there an easy way to work on just a subset of the data? I am using get_items=get_files and want to use just one or two batches of data to debug the transforms etc …

With all the data, show_batch takes a few minutes and I’d like to speed that up.

There’s a RandomSubsetSplitter which takes a train percentage and a valid percentage.

https://dev.fast.ai/data.transforms#RandomSubsetSplitter
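
A rough sketch of plugging it into a DataBlock, just to show where the splitter goes (the blocks, labelling and transforms here are placeholders, not your actual setup, and depending on your install the import may be fastai or fastai2):

from fastai.vision.all import *

dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),
    get_items=get_files,
    get_y=parent_label,
    # use ~5% of the items for training and ~2% for validation while debugging
    splitter=RandomSubsetSplitter(train_sz=0.05, valid_sz=0.02, seed=42),
    item_tfms=Resize(128))

dls = dblock.dataloaders(path)
dls.show_batch(max_n=9)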

2 Likes

I am unable to run the Lesson 2 notebook on Paperspace. Any suggestions?

Were you able to install the Bing search package and get the API key?

Problem deploying bear_classifier notebook on binder.

On clicking the binder launch link, instead of getting the bear classifier app with the widgets, I got the docker image of my notebook. Executing the code block threw a ModuleNotFoundError: No module named 'fastai2'

Anyone have a clue what went wrong, how to fix?

1 Like

Thanks @DanielLam

Make sure to have the requirements.txt file inside your repo.
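
For anyone hitting the same ModuleNotFoundError, a minimal sketch of what that requirements.txt might contain for a voila + fastai2 Binder deployment (the exact package names and pins depend on which library version your notebook actually imports):

fastai2
voila
ipywidgets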

3 Likes

Hi,
Are you accessing this through the Colab UI or the JupyterLab workaround posted?
Can you share a screenshot?

1 Like

Thanks for the reply, Imran (@imrandude).
I did not try the JupyterLab Colab workaround. I was using Paperspace.

1 Like

Hmm, I’m still getting the 404 error. Can you take a look and let me know what I may be doing differently than you?

I’ve set up a separate repo for the app here: https://github.com/megano/snowpeopleApp with the abbreviated .ipynb file, export.pkl, and requirements.txt. This is what I’m putting in for binder, inserting “/voila/render/” into the path URL.

1 Like

Whoops, I meant to reply to this post. Copy/pasting:

Hi, I tried running your Jupyter notebook. I think you might be missing a button widget creation.

Regards - Daniel

EDITED: Some people got binder working with git lfs. Not sure what the difference is here, but their “.pkl” files uploaded correctly to mybinder. Their mybinder “.pkl” file sizes roughly matched their GitHub repo sizes (e.g. 180 MB).

All right, this problem was bothering me. I think I found out the reason.

binder does not support files uploaded through git lfs

Your .pkl file was uploaded through git lfs. When I got my bearApp working, I used regular git to upload my .pkl file.

https://github.com/jupyterhub/binderhub/issues/324
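
If you want to move an lfs-tracked .pkl back to regular git, one common approach looks roughly like this (run from inside the repo, and adjust the pattern to whatever your .gitattributes actually tracks; git lfs migrate export is an alternative):

git lfs ls-files                 # check whether export.pkl is tracked by lfs
git lfs untrack "*.pkl"          # stop routing .pkl files through lfs (updates .gitattributes)
git rm --cached export.pkl
git add export.pkl .gitattributes
git commit -m "store export.pkl with regular git"
git push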

1 Like