Class1 Video Path of URLs.PETS

In the class 1 video, a path comes up for URLs.PETS.

Where in the code was this path assigned to URLs.PETS?


I think this is a function built into the fastai library. When you call this function, the fastai API downloads a copy of the data you specify, in this case PETS, from an S3 bucket on Amazon S3. So I don't think you have to explicitly write out the AWS location of the data; the fastai library downloads the dataset from where it is stored for you when you tell it which one to download.

URLs is a global constant built into fastai. You can check out the documentation for URLs and the source code.

So I guess we imported those global constants when we ran: from fastai.vision import *

As a side note, there are a few ways to find out more information about something in the code:

  • Run the line: doc(URLs)
  • Run the line: help(URLs)
  • Hover over it until a text box appears, then ctrl-click to jump to the source code (at least this works for me in Colab).
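To make the idea concrete, here is a minimal sketch of how a class like URLs can expose dataset locations as plain class attributes. This is not fastai's actual source, and the base URL here is my assumption; check the real URLs class via doc(URLs) for the exact values.

```python
# Sketch only: a namespace of dataset locations, in the spirit of
# fastai's URLs class (attribute names and base URL are assumptions).
class URLs:
    S3 = "https://s3.amazonaws.com/fast-ai-"
    S3_IMAGE = S3 + "imageclas/"
    PETS = S3_IMAGE + "oxford-iiit-pet"

# Accessing the constant just reads a class attribute; no download happens
# until you pass it to a function like untar_data.
print(URLs.PETS)
```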


Thanks, rhowell!

I just completed lesson 1 and am trying to do this exercise with another dataset. Before getting my own from Google Images, I was thinking of using something readily available (like the Iris set from UCI).

I am unsure of how to substitute the PETs example with this new data set. Can anyone help me out?

I am in the same boat. I have been through the first video a few times and through the first notebook more than that: the first time just running the cells, then trying some different things, mostly changing the number of iterations and watching the results improve with a few more, and with a lot more watching them overshoot and start to drift back toward the best result so far. The terms used are not super well explained, and this being my first time out, it is all new. After a few trips through, I tried uploading a different dataset and things got interesting. I have been unable to get untar_data to swallow anything I have tried. Oddly enough, when I paste URLs.PETS into my browser it fails, but if I add a .tgz extension it will try to download a .tgz file. IMHO this function and way of grabbing data make it very hard for us to play with this notebook.
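For what it's worth, the behaviour described above is consistent with the downloader appending a .tgz extension to the bare URL before fetching, which would explain why the bare URL fails in a browser but the .tgz version works. Here is a sketch of that guess; download_url_for is a made-up name, not a real fastai function.

```python
def download_url_for(base_url: str) -> str:
    # Guess at the behaviour: append .tgz unless it is already there.
    ext = ".tgz"
    return base_url if base_url.endswith(ext) else base_url + ext

# A bare dataset URL (like URLs.PETS) becomes an archive URL.
print(download_url_for("https://example.com/datasets/oxford-iiit-pet"))
```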

The other thing is that you get to the end of the notebook and see your numbers improve, but you have no real way to actually play with it. How about a piece at the end that lets you upload your own image(s) and see how well it classifies them? To me, that is the part I really want to see work.

I am thankful for this; you guys put a ton of work into it. But I think you are working at such a high level you may have forgotten what it is like just getting started.

FWIW, I have managed to get the pets demo to download my own data set. URLs.PETS is buried deep in the guts of one of the things that is imported, and the rest of the demo of course depends on things having the same layout once it is all extracted. Given that I am new to these notebooks and to Python, it seemed (sadly) easier to combine my new data with the contents of the original file. This required a ton of copying and renaming, plus changing URLs.PETS to a URL that points to my server. Uploading is a lot slower, so I trimmed everything out of the file except the images. From what I understood from the demo, it builds the labels out of the image filenames, so I maintained the same filename format. It will be interesting.
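The filename convention matters because the notebook pulls the label out of each filename with a regular expression. Here is a self-contained sketch of that idea; the exact pattern used in the notebook may differ, but pets-style names put the breed before a trailing "_&lt;number&gt;.jpg".

```python
import re

# A pets-style filename encodes the label before the trailing "_<number>.jpg".
pat = re.compile(r'([^/]+)_\d+\.jpg$')

def label_from(fname: str) -> str:
    # Extract the breed label from a filename like "images/pug_142.jpg".
    return pat.search(fname).group(1)

print(label_from('images/pug_142.jpg'))   # -> pug
```

This is why a merged dataset has to keep the same naming format: any mismatch in the part before the underscore shows up as a brand-new "breed".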

Hello? Is anybody home at all?

Anyway, I learned something interesting. The naming on the merged data set versus the original one (I kept all the original files except the cat breeds) was not quite right, so the model assumed the new images were different breeds. I had a lot of pictures of, say, pugs, but it thought I had pictures of several different breeds. What was odd is that it did not confuse the pug "breeds" with each other as often as I felt it should have; I expected it to confuse the variously mis-named pugs nearly 100% of the time, and it did not. It makes me wonder.