Yes, Javi. This is normal if you want internet connectivity and GPU usage.
Thanks Tanishq and Nick, I’ll press Kaggle to see why it might not be working for me
What’s the process of being verified on Kaggle? I can’t tell if I am or not
I think if you go to
https://kaggle.com/[USERNAME]/account there should be an option for verification.
You can find it on your Account page in Kaggle (click your profile image in the top right) under Phone Verification.
You can also just check if you can turn on internet in a Kaggle notebook:
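For a quick programmatic check, a cell like the following (plain Python, nothing Kaggle-specific — the URL and function name are just my own choices for illustration) tells you whether the notebook has outbound network access:

```python
# Probe for internet access from a notebook cell; if this reports "off",
# the notebook's Internet toggle is likely disabled (or you're not phone-verified).
import urllib.request

def has_internet(url="https://pypi.org", timeout=5):
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except Exception:
        return False

print("internet on" if has_internet() else "internet off")
```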
I strongly recommend using colab for the book, not kaggle. It’s all set up and ready to go on the course website. The chapters are too big to even save correctly on kaggle!
Thanks Sarada, I do remember this from the 2020 course now that you mention it. I am considering something with full-body pose estimation, so I should probably look for some well-annotated datasets and see what I can find that could be trained similarly using fastai.
I am hoping to look at some estimates of joint angles, such as knees and legs/back for sports analysis.
So my post was worth it to learn this at least. It’s not mentioned in fastbook/README.md at master · fastai/fastbook · GitHub
Confirming figures show out-of-the-box with Colab, per the following example opening Chapter 2…
After signing in… File > Open notebook
Select Github tab, search fastai/fastbook, select chapter 2…
In case of domestic blindness (like mine, missing that the `pip install` line was commented out), running the first cell produces the following error. Easily fixed by deleting the red-boxed characters.
So this worked…
5 different datasets are available here.
Sorry I should have been more specific.
The best way to open the book in colab is to go to course.fast.ai, click Notebook Servers on the left, click Colab, then click on the chapter you want to read.
Everything should “just work”.
Customising the Default Root and Project Folders for Google Colab
So I think I’ve figured out how to specify your Google Drive folder as the location for your root directory before loading data and creating models with fastai. I’m very happy to share my notebook describing how I am doing this here.
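For anyone who doesn’t want to open the notebook, the core of the idea is just mounting Drive and pointing a `pathlib` root at a folder inside it. A minimal sketch (the folder name `fastbook_work` is my own placeholder, and the fallback branch is only so the snippet runs outside Colab too):

```python
from pathlib import Path

# On Colab, mounting Drive exposes it under /content/gdrive.
# 'fastbook_work' is a hypothetical project folder -- rename to taste.
try:
    from google.colab import drive  # only importable inside Colab
    drive.mount('/content/gdrive')
    root = Path('/content/gdrive/MyDrive/fastbook_work')
except ImportError:
    root = Path.home() / 'fastbook_work'  # fallback when not on Colab

root.mkdir(parents=True, exist_ok=True)
print(root)
```

With `root` defined like this, you can pass `path=root` (or subfolders of it) to fastai’s data loaders and `learn.export`, so your downloads and models persist in Drive between sessions.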
I have been looking at how to use the collapsible headings and navigation feature which Jeremy mentioned at the start of the lesson, in JupyterLab and in Colab/Kaggle.
On running the command Jeremy pointed out, it looks like it doesn’t work in JupyterLab.
I found an answer on Stack Overflow which mentions that Jupyter notebook extensions are not supported there.
Although JupyterLab seems to have its own navigation menu (when I used it on Jarvislabs), I couldn’t find how to use collapsible headings there.
I want to use wget from WSL on my home laptop to download this model.pkl file from Kaggle. Normally I would expect to be able to right-click on a button or link and choose Copy link address, but although Kaggle has a very pretty download screen, I can’t find the URL of the file. Please advise how to determine the URL for this file…
I used this method:
```python
# make sure you're in the right directory -- should be /kaggle/working
!pwd
# zip it up
!zip -r kaggle_working_050122.zip .
# see if the file is there
!ls -lrt | tail
# get a link to it
from IPython.display import FileLink
FileLink(r'kaggle_working_050122.zip')
```
If you have an exported pickle file in the kaggle home directory, you can get a downloadable link to it in your kaggle notebook using the above method as well.
At this point, you should see an HTML link in the output of the last command; if you click on it, you can download the file to your local filesystem.
You can install the kaggle Python package.
Then you can download the kernel output as in the screenshot below.
@mike.moloch, thanks, this looks like a useful approach, looking into it.
@kurianbenoy, that’s awesomesauce! Even better than what I was looking for.
For my own later reference, and maybe help others unfamiliar with git,
here is my action log…
- Created a new space at Hugging Face Spaces.
- Cloned that space to my local machine.
```
$ git clone https://huggingface.co/spaces/bencoman/cats-v-dogs
$ cd cats-v-dogs
```
- Added Git Large File Storage.
```
# This needs to be done once per machine
$ curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
$ sudo apt-get install git-lfs

# This needs to be done once per repo
$ git lfs install
$ git lfs track "*.pkl"
$ git add .gitattributes
$ git commit -m "added git large file storage"
```
- Copied random test pics dog.jpg and cat.jpg into this folder.
- Edited the application code app.py as follows…
```python
from fastai.vision.all import *
import gradio as gr

# This must match the function the learner was exported with,
# or load_learner can't unpickle model.pkl.
def is_cat(x): return x[0].isupper()

def classify_image(img):
    pred,idx,probs = learn.predict(img)
    return dict(zip(categories, map(float, probs)))

learn = load_learner('model.pkl')
categories = ('Dog', 'Cat')

image = gr.inputs.Image(shape=(192,192))
label = gr.outputs.Label()
examples = ['dog.jpg', 'cat.jpg']

iface = gr.Interface(fn=classify_image, inputs=image, outputs=label, examples=examples)
iface.launch(inline=False)
```
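An aside on the `dict(zip(...))` line: Gradio’s Label output wants a mapping from class name to probability, and that one-liner builds it from the probabilities `learn.predict` returns. A toy illustration with made-up numbers (not real model output):

```python
# Build the {label: probability} dict that gr.outputs.Label expects.
# The probabilities here are invented for illustration only.
categories = ('Dog', 'Cat')
probs = [0.83, 0.17]          # stands in for the tensor from learn.predict
result = dict(zip(categories, map(float, probs)))
print(result)                 # {'Dog': 0.83, 'Cat': 0.17}
```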
- Confirmed the trained model.pkl file was available here…
https://www.kaggle.com/code/bencoman/cats-v-dogs-saving-a-basic-fastai-model/data. Note: it took me a while to realise this file was disappearing because I was doing a Quick Save rather than Save And Run All.
- Downloaded and installed the API key in a kaggle.json file, created per the API Credentials section at https://github.com/Kaggle/kaggle-api#api-credentials
```
$ pip install kaggle
# Downloaded API key to ~/.kaggle/kaggle.json
$ kaggle kernels list --mine | grep -i cat
code/bencoman/cats-v-dogs-inference-gradio             Cats v Dogs - inference gradio             Ben Coman  2022-05-08 13:39:35  0
code/bencoman/cats-v-dogs-saving-a-basic-fastai-model  Cats v Dogs - saving a basic fastai model  Ben Coman  2022-05-08 16:02:11  0
$ ls
README.md  app.py  cat.jpg  dog.jpg
$ kaggle kernels output bencoman/cats-v-dogs-saving-a-basic-fastai-model --wp
$ ls
README.md  app.py  cat.jpg  dog.jpg  cats-v-dogs-saving-a-basic-fastai-model.log  model.pkl
```
- Pushed to hugging-space
```
$ git add -A
$ git status
	new file:   app.py
	new file:   cat.jpg
	new file:   cats-v-dogs-saving-a-basic-fastai-model.log
	new file:   dog.jpg
	new file:   model.pkl
$ git commit -m "First app & model"
$ git push
Username for 'https://huggingface.co': bencoman
Password for 'https://firstname.lastname@example.org': ******
Uploading LFS objects:   0% (0/1), 22 MB | 113 KB/s
Uploading LFS objects: 100% (1/1), 47 MB | 114 KB/s, done.
Enumerating objects: 12, done.
Counting objects: 100% (12/12), done.
Delta compression using up to 12 threads
Compressing objects: 100% (10/10), done.
Writing objects: 100% (10/10), 65.79 KiB | 21.93 MiB/s, done.
Total 10 (delta 2), reused 0 (delta 0)
remote: Enforcing permissions...
remote: Allowed refs: all
To https://huggingface.co/spaces/bencoman/cats-v-dogs
   ae86db9..813dfce  main -> main
```
- Tested application by clicking…
Note: After completing this local jupyter from scratch setup,
the app can be run locally by creating a new notebook applocal.ipynb
next to app.py and copying the contents of app.py into the notebook.
I went through the same exercise as you did.
In JupyterLab, use the Extension Manager. I searched for collapse, found Collapsible Headings, and selected Install.
I was able to use collapsible headings in JupyterLab.
19 Best JupyterLab Extensions for Machine Learning - neptune.ai (See extension 17)
Thanks for sharing it.
hi @bencoman, thank you for these helpful tips.
However, I don’t know if you have tried the ImageClassifierCleaner on Kaggle. I am currently trying to get the widgets to work, but it is not producing any result. The pictures are shown, and then at the end it produces:
Error displaying widget: model not found