Beginner: Creating a dataset, and using Gradio / Spaces ✅

Hi! I am stuck with the bear notebook… Jeremy’s notebook on GitHub uses Bing and I want to use DDG, but even though I copied the code from today’s lecture, I can’t get it to work. It just doesn’t seem to download the images like it did previously. Can someone help? Thanks!

I’m guessing the path already exists. Try removing that first “if” statement that checks whether the path exists.


For mine to work I had to replace path=Path('bears') with path = Path('images/bears'). You can check whether the files were downloaded by switching to the Files tab in Jupyter and looking at what’s in the directories.


Have you checked whether the bears directory already exists? If it does then the rest of the code won’t run - I had this earlier today when I ran this cell without fully changing over the Bing search code to DDG. Deleting the bears directory and starting over worked for me.
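For reference, the guard pattern being discussed looks roughly like this (a minimal sketch; `bear_types` and the commented-out download call are stand-ins for the notebook’s actual search/download code):

```python
from pathlib import Path

# Minimal sketch of the guard in the notebook's download cell: the whole
# download loop is skipped when the target directory already exists, so a
# stale directory left over from an earlier run means nothing new gets
# downloaded - which looks like "it just isn't downloading".
bear_types = ["grizzly", "black", "teddy"]
path = Path("bears")

if not path.exists():  # the "if" that can silently skip everything
    path.mkdir()
    for o in bear_types:
        dest = path / o
        dest.mkdir(exist_ok=True)
        # the notebook would search and download here, e.g.:
        # download_images(dest, urls=search_images_ddg(f"{o} bear"))
```

So either delete the stale directory or remove the guard before re-running the cell.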

Thanks, it worked for me once I removed the if statement at the top.

Hello all! I’ve been trying and failing to deploy the model trained in the “Saving a basic fastai model” notebook to Hugging Face.

  • I have exported the model.pkl file, then followed along as shown in the video.
  • Saved the changes in Visual Studio Code
  • Created a Hugging Face Space, committed my files and pushed them using git commit and git push
  • This is the error that I get in my terminal

Please let me know what I am missing/doing wrong. Thank you.

You need to use Git LFS. Check out the tutorial from @ilovescience linked in the top post for step-by-step instructions.
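For anyone following along, the usual sequence is `git lfs install` followed by `git lfs track "*.pkl"` before committing, which writes a line like this into .gitattributes (the `*.pkl` pattern here assumes the model is exported as a .pkl file, as in the lesson):

```
*.pkl filter=lfs diff=lfs merge=lfs -text
```

Make sure .gitattributes itself is committed and pushed along with the model.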


Thanks Jeremy! I totally missed that, will check it out. If I succeed I’ll be posting on Share your work, if not then I’ll be back with more questions. Hopefully the former. :grin:


Questions are always welcome!


I need your guidance once again, Jeremy. I made some progress and was able to get LFS working, but I ran into an error when I deployed to my Hugging Face Space.

Here’s the error I’m getting. I thought the requirements.txt and the would provide this.

Thank you.

I dug around and for some reason my requirements file didn’t show as .txt, so I made a new one and voila! My first ever Hugging Face Space in production! I’m very happy and excited to see what else I can do with it now :grin:
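In case anyone else hits the same error: a minimal requirements.txt for a fastai Gradio Space can be as short as this (package names only; pinning versions is optional, and on a Gradio-SDK Space the gradio package is usually preinstalled, so listing it is just a belt-and-braces habit):

```
fastai
gradio
```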


This looks great! I had some issues using git-lfs as well; I had to upload the pickle file directly to the Spaces repo. I haven’t done it myself, but I think there is a way to upload your model to Hugging Face (instead of pickling it) and then just refer to the model that exists on Hugging Face, instead of uploading it from your machine to HF via git-lfs… maybe I’ll do that for my next creation on HF Spaces.

Thanks! I’m just glad that I finally got it to work.
Is there a way to place a button that opens up your phone camera to take a photo, which then gets used for classification?


Actually, AFAIK there isn’t such functionality in Gradio, but if you look at Jeremy’s JavaScript interface thread, it has a submission where you can use the camera to take a photo. It’s a JS app which talks to the REST API exposed by Gradio. Jeremy talked about it in the 2nd lecture.
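The same call the JS app makes can be sketched from Python too. This is a hedged, stdlib-only sketch: the `/api/predict/` path and the `{"data": [...]}` payload shape follow the convention older Gradio apps exposed, and aren’t guaranteed for every Space:

```python
import base64
import json
from urllib import request

# Assumed request shape for an older-style Gradio REST endpoint: a JSON
# body whose "data" list holds a base64 data-URI of the image. Both the
# endpoint path and payload layout are conventions, not a documented
# contract for any particular Space.

def make_payload(image_bytes: bytes) -> bytes:
    """Encode raw image bytes into the JSON body Gradio typically expects."""
    data_uri = "data:image/jpeg;base64," + base64.b64encode(image_bytes).decode()
    return json.dumps({"data": [data_uri]}).encode()

def classify(space_url: str, image_bytes: bytes) -> dict:
    """POST an image to a (hypothetical) Gradio predict endpoint."""
    req = request.Request(
        space_url.rstrip("/") + "/api/predict/",
        data=make_payload(image_bytes),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

A phone or webcam photo is just bytes by the time it reaches this function, which is why the JS app can feed camera captures straight into the same endpoint.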

I’m looking at using the Hugging Face API via a custom connector to Microsoft’s low-code solution, PowerApps - which makes it easy to get camera or saved images from the phone (or PC) - then sending the images over to the API and getting the prediction back. I’m still learning PowerApps though - I’m getting the records back now and working out how to get the data out of the table to display. Any other PowerApps pros on the course who have tried this?


I’ll re-watch Lesson 2 in case I missed it. Thanks for letting me know :smiling_face:

I’ll be interested in following your progress with this!


This is the thread/JS app I was talking about which Jeremy talked about in the 2nd lecture:


This is so sick! Thanks for sharing this. I’ve been playing around with it.


I’m getting somewhere - pulling the label was easy, now I need to work out how to get the values from the confidences and maybe chart them.

Edit - more details and code in Share your work
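In case it helps anyone else pulling values out of the response: Gradio’s Label component typically serialises its output as a dict with "label" and "confidences" keys. This is a sketch under that assumption (the shape follows Gradio’s usual JSON, not verified against this exact app):

```python
# Assumed response shape from a Gradio Label output; the "label" and
# "confidences" keys follow Gradio's usual serialisation convention.
def top_confidences(result: dict) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs, highest confidence first."""
    pairs = [(c["label"], c["confidence"]) for c in result.get("confidences", [])]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

sample = {
    "label": "grizzly",
    "confidences": [
        {"label": "grizzly", "confidence": 0.91},
        {"label": "black", "confidence": 0.07},
        {"label": "teddy", "confidence": 0.02},
    ],
}
print(top_confidences(sample))  # [('grizzly', 0.91), ('black', 0.07), ('teddy', 0.02)]
```

A sorted list of pairs like this is straightforward to feed into whatever charting the PowerApps side offers.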
