Part 1 lesson 2 Colab

Hello, can anyone help me figure out why I cannot seem to download the images after scraping them from Google Images? I keep getting this error:

FileNotFoundError                         Traceback (most recent call last)

<ipython-input-13-e85756baeaa4> in <module>()
----> 1 download_images(path/file, dest, max_pics=200)

/usr/local/lib/python3.6/dist-packages/fastai/vision/data.py in download_images(urls, dest, max_pics, max_workers, timeout)
    192 def download_images(urls:Collection[str], dest:PathOrStr, max_pics:int=1000, max_workers:int=8, timeout=4):
    193     "Download images listed in text file urls to path dest, at most max_pics"
--> 194     urls = open(urls).read().strip().split("\n")[:max_pics]
    195     dest = Path(dest)
    196     dest.mkdir(exist_ok=True)

FileNotFoundError: [Errno 2] No such file or directory: '/content/gdrive/My Drive/fastai-v3/data/urls_grizzly.csv'

I've seen a few blog posts suggesting the following code:

from google.colab import drive
drive.mount('/content/gdrive', force_remount=True)
root_dir = "/content/gdrive/My Drive/"
base_dir = root_dir + 'fastai-v3/data/'

However, this does not work; I keep getting the same error. I've moved the files around to several locations, but I still get the error. Can someone please help? I'm using Google Colab, by the way.
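
In case it helps, here is a minimal sketch of the full sequence I'm running. The file name and paths come from the error above; the dest folder and the import are my assumptions:

from pathlib import Path
from google.colab import drive
from fastai.vision import download_images

# Mount Google Drive; force_remount re-mounts even if already mounted
drive.mount('/content/gdrive', force_remount=True)

base_dir = Path('/content/gdrive/My Drive/fastai-v3/data')
file = 'urls_grizzly.csv'    # name taken from the traceback
dest = base_dir/'grizzly'    # assumed destination folder
dest.mkdir(parents=True, exist_ok=True)

# Check the CSV actually exists before calling download_images
print((base_dir/file).exists())

download_images(base_dir/file, dest, max_pics=200)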

Thank You

Let's try this: make a Path() object containing your base dir, i.e. myPath = Path(base_dir).

If you then do myPath.ls(), what shows up?
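
Something like this (a minimal sketch; base_dir is the variable from your post, and .ls() assumes fastai has been imported, since fastai patches that method onto Path):

from pathlib import Path
from fastai.vision import *   # fastai adds a .ls() method to Path

base_dir = '/content/gdrive/My Drive/fastai-v3/data/'
myPath = Path(base_dir)

# With plain pathlib, list(myPath.iterdir()) gives the same listing
print(myPath.ls())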

I get [PosixPath('/content/gdrive/My Drive/fastai-v3/data/grizzly')]
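
That listing shows only the grizzly folder under data/, so urls_grizzly.csv isn't there, which matches the error. A quick sketch to confirm where the CSV actually ended up (same paths as above):

from pathlib import Path

data_path = Path('/content/gdrive/My Drive/fastai-v3/data')

# Walk everything under data/ to see where the CSV actually is
for p in data_path.rglob('*'):
    print(p)

# Direct check for the file the traceback is looking for
print((data_path/'urls_grizzly.csv').exists())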