I saw this error while running ImageClassifierCleaner. Filed it here:
https://github.com/fastai/fastbook/issues/73
Anyone else seen this? @sgugger
Hi,
I had the same problem on Paperspace. I had to do Kernel -> Restart & Clear Output
and then run again.
By the way, I used shutil.move(str(cleaner.fns[idx]), str(path/cat))
and it worked.
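For reference, a minimal sketch of that move step. The `cleaner.change()` / `cleaner.fns` usage follows the fastbook pattern; here the cleaner is simulated with plain lists so the snippet runs standalone (the bear categories and file names are just for the demo):

```python
import shutil
import tempfile
from pathlib import Path

# Set up a fake dataset directory with two category folders.
path = Path(tempfile.mkdtemp())
(path/"grizzly").mkdir()
(path/"black").mkdir()
fn = path/"grizzly"/"bear1.jpg"
fn.write_bytes(b"fake image data")

fns = [fn]                 # stand-in for cleaner.fns
changes = [(0, "black")]   # stand-in for cleaner.change(): (index, new category)

# Move each re-labelled file into its new category folder.
for idx, cat in changes:
    shutil.move(str(fns[idx]), str(path/cat))

print((path/"black"/"bear1.jpg").exists())
```

Passing `str(...)` to `shutil.move` sidesteps the older-Python issue where it did not accept `Path` objects directly.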
This is fixed now.
Thanks @mrfabulous1 for the detailed post and explaining the issues with the models.
I would also like to request help with the error below:
/usr/local/lib/python3.6/dist-packages/PIL/Image.py:932: UserWarning: Palette images with Transparency expressed in bytes should be converted to RGBA images
"Palette images with Transparency expressed in bytes should be "
Good morning!
I haven’t experienced this particular error, but it looks like PIL doesn’t recognise the transparency in the image.
Maybe converting it to RGBA, something like the below, could help.
from PIL import Image
im = Image.open("image1.png")
im.show()
print(im.mode)
im.convert("RGBA").save("image2.png")
Also check this post https://stackoverflow.com/questions/1233772/pil-does-not-save-transparency
Hope this helps
Cheers mrfabulous1
On Colab, the following code block
!pip install voila
!jupyter serverextension enable voila --sys-prefix
from 02_production.ipynb
throws this error:
Enabling: voila
Writing config: /usr/etc/jupyter
Validating...
    Error loading server extension voila
      X is voila importable?
Has anyone seen this, and/or knows how to fix it?
I adapted the bears classifier to build a bird classifier that distinguishes between cardinals, crows, falcons, orioles, and ravens. As I expected, the classifier did very well, except that it was not good at differentiating between crows and ravens. I doubt that I would be able to tell the difference, either. This confusion matrix was the best result before over-fitting kicked in:
On the plus side, it was not fooled by a red-headed falcon:
[Screenshot of the classifier widget: "Select your bird!" with Upload and Classify buttons]
Prediction: falcon; Probability: 0.9937
Is there an easy way to work on just a subset of the data? I am using get_items=get_files and want to use just one or two batches of data to debug the transforms etc …
With all the data, show_batch takes several minutes, and I’d like to speed that up.
There’s a RandomSubsetSplitter which takes a train pct and valid pct.
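To illustrate the idea, here is a small pure-Python sketch of what such a subset splitter does: shuffle the indices, keep only a fraction for train and valid, and discard the rest. The function name and parameter names here are illustrative, not fastai's exact API:

```python
import random

def random_subset_splitter(train_pct, valid_pct, seed=42):
    """Return a splitter function: shuffles item indices, keeps train_pct of
    them for training and valid_pct for validation, and drops the rest --
    the idea behind fastai's RandomSubsetSplitter."""
    def _split(items):
        rng = random.Random(seed)
        idxs = list(range(len(items)))
        rng.shuffle(idxs)
        n_train = int(len(items) * train_pct)
        n_valid = int(len(items) * valid_pct)
        return idxs[:n_train], idxs[n_train:n_train + n_valid]
    return _split

# Use 5% of 1000 items for train and 1% for valid -- enough to debug
# transforms and show_batch quickly.
train, valid = random_subset_splitter(0.05, 0.01)(list(range(1000)))
print(len(train), len(valid))
```

Because the train and valid indices come from disjoint slices of one shuffled list, the two sets never overlap.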
Were you able to install bing search and get the key?
Problem deploying the bear_classifier notebook on Binder.
On clicking the Binder launch link, instead of getting the bear classifier app with the widgets, I got the docker image of my notebook. Executing the code block threw a ModuleNotFoundError: No module named 'fastai2'.
Anyone have a clue what went wrong, or how to fix it?
Hi,
Are you accessing this through the Colab UI or the JupyterLab work around posted?
Can you share a screenshot?
Thanks for the reply @imrandude Imran,
I did not try the JupyterLab/Colab workaround. I was using Paperspace.
Hmm, I’m still getting the 404 error. Can you take a look and let me know what I may be doing differently from you?
I’ve set up a separate repo for the app here: https://github.com/megano/snowpeopleApp with the abbreviated .ipynb file, export.pkl, and requirements.txt. This is what I’m putting in for Binder, inserting “/voila/render/” into the path URL.
Whoops, I meant to reply to this post. Copy/pasting:
Hi, I tried running your jupyter notebook. I think you might be missing a button widget creation.
Regards - Daniel
EDITED: Some people got Binder working with git lfs. Not sure what the difference is here, but their “.pkl” files uploaded correctly to mybinder; their mybinder “.pkl” file sizes roughly matched their GitHub repo sizes (e.g. 180 MB).
All right, this problem was bothering me. I think I found the reason:
Binder does not support files uploaded through git lfs.
Your .pkl file was uploaded through git lfs. When I got my bearApp working, I used regular git to upload my .pkl file.
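One quick way to check whether a repo is serving an LFS pointer instead of the real model: a Git LFS pointer is a tiny text file whose first line starts with the LFS spec URL, rather than the binary pickle payload. A small sketch (the file names and contents are just for the demo):

```python
import tempfile
from pathlib import Path

def is_lfs_pointer(path):
    """Heuristic check: Git LFS pointer files begin with the line
    'version https://git-lfs.github.com/spec/v1' instead of binary data."""
    try:
        with open(path, "rb") as f:
            head = f.read(100)
    except OSError:
        return False
    return head.startswith(b"version https://git-lfs.github.com/spec/v1")

# Demo: a fake LFS pointer vs. a fake real pickle file.
d = Path(tempfile.mkdtemp())
ptr = d/"export.pkl"
ptr.write_text(
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:abc123\n"
    "size 180000000\n"
)
real = d/"real.pkl"
real.write_bytes(b"\x80\x04K\x01.")  # arbitrary binary bytes

print(is_lfs_pointer(ptr), is_lfs_pointer(real))
```

If the downloaded “.pkl” turns out to be a pointer like this (a few hundred bytes instead of ~180 MB), the model never actually made it into the Binder image.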