/usr/local/lib/python3.6/dist-packages/fastai/vision/data.py in download_images(urls, dest, max_pics, max_workers, timeout)
    194 def download_images(urls:Collection[str], dest:PathOrStr, max_pics:int=1000, max_workers:int=8, timeout=4):
    195     "Download images listed in text file `urls` to path `dest`, at most `max_pics`"
--> 196     urls = open(urls).read().strip().split("\n")[:max_pics]
    197     dest = Path(dest)
    198     dest.mkdir(exist_ok=True)

FileNotFoundError: [Errno 2] No such file or directory: 'data/bears/urls_black.txt'
How do I upload the urls_black.txt file to the Colab environment (ideally directly from my PC)?
mario_jorge
(Mario jorge lopes chagas de almeida)
Hi sheatran, I had to deal with the same problem, and I found the following in the forum, which worked for me:
1 - Create a new code cell after the line `folder = 'grizzly'…` and put the following code:
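(The exact code cell from that forum post isn't reproduced above. A minimal sketch of one common approach, assuming you want to upload urls_black.txt from your PC using Colab's `google.colab.files.upload()` API, which returns a dict mapping filenames to their bytes, might look like this; the `save_uploads` helper and the `data/bears` destination are my own illustration, not from the original post:)

```python
from pathlib import Path

def save_uploads(uploaded, dest):
    """Write the dict returned by google.colab.files.upload()
    ({filename: bytes}) into `dest`, creating the folder if needed.
    Returns the sorted filenames now present in `dest`."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    for name, content in uploaded.items():
        (dest / name).write_bytes(content)
    return sorted(p.name for p in dest.iterdir())

# Inside Colab (google.colab only exists there) you would run:
#   from google.colab import files
#   save_uploads(files.upload(), 'data/bears')
# and pick urls_black.txt in the file dialog that opens.
```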
I am not able to generate the text file using the JavaScript from the Google Images search. Can someone share the txt file with me, or tell me what I could be doing wrong? I am using Google Chrome, and this is the script that I am running.
For the JavaScript that scrapes the URLs: are the URLs supposed to be separated by newlines? When I use the provided script, all my URLs end up joined together.
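Yes, `download_images` expects one URL per line (the traceback above shows it splitting the file on `"\n"`). If your scraped URLs ran together into one string, a small sketch to split them back apart, assuming every URL starts with `http://` or `https://` (the function name is my own), could be:

```python
import re

def split_joined_urls(text):
    """Split a blob of concatenated URLs into a list, one URL each.
    Uses a lookahead so the 'http' prefix stays attached to each URL."""
    parts = re.split(r'(?=https?://)', text)
    return [p.strip() for p in parts if p.strip()]

joined = 'https://example.com/a.jpghttps://example.com/b.jpg'
print(split_joined_urls(joined))
# → ['https://example.com/a.jpg', 'https://example.com/b.jpg']
```

You can then write the list back out with `'\n'.join(...)` before passing the file to `download_images`.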
Hi, I am doing lesson 2 and I have a question about the part on opening images in production.
img = open_image(path/'blackbears.txt'/'00000021.jpg')
In the above code, how do I know there is a file with the name "00000021.jpg", and what are the other names? @smigula
From my understanding, that was simply example code for deploying a model; the file could have any name, depending on what it was saved as when it was uploaded to your server. I would bypass that section of the lesson and follow these instructions on deploying your model to Render.
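To see which filenames actually exist after `download_images` has run, you can simply list the folder on disk. A small sketch with `pathlib` (the `data/bears/black` path is my assumption about where the images were saved):

```python
from pathlib import Path

def list_images(folder):
    """Return the sorted names of image files in `folder`."""
    exts = {'.jpg', '.jpeg', '.png'}
    return sorted(p.name for p in Path(folder).iterdir()
                  if p.suffix.lower() in exts)

# e.g. list_images('data/bears/black') might show names such as
# '00000000.jpg', '00000001.jpg', ... which you can pass to open_image.
```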
/usr/local/lib/python3.6/dist-packages/fastai/vision/data.py in download_images(urls, dest, max_pics, max_workers, timeout)
    192 def download_images(urls:Collection[str], dest:PathOrStr, max_pics:int=1000, max_workers:int=8, timeout=4):
    193     "Download images listed in text file `urls` to path `dest`, at most `max_pics`"
--> 194     urls = open(urls).read().strip().split("\n")[:max_pics]
    195     dest = Path(dest)
    196     dest.mkdir(exist_ok=True)

FileNotFoundError: [Errno 2] No such file or directory: '/content/gdrive/My Drive/fastai-v3/data/urls_grizzly.csv'
I've seen a few blog posts suggesting the following code:
from google.colab import drive
drive.mount('/content/gdrive', force_remount=True)
root_dir = '/content/gdrive/My Drive/'
base_dir = root_dir + 'fastai-v3/data/'
However, this does not work; I keep getting the same error. I've moved the files around to several locations, but the error persists. Can someone please help? I'm using Google Colab, by the way.
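When a mounted-Drive path raises FileNotFoundError, it usually means one component of the path is misspelled (for example 'My Drive' vs 'MyDrive') or the file was saved somewhere else. A small diagnostic sketch (the helper name is my own) that walks down a path and reports the first component that is missing on disk:

```python
from pathlib import Path

def first_missing_part(path):
    """Walk down `path` and return the first component that does not
    exist on disk, or None if the whole path exists. Handy for finding
    exactly where a long Drive path goes wrong."""
    p = Path(path)
    cur = Path(p.anchor or '.')
    parts = p.parts[1:] if p.anchor else p.parts
    for part in parts:
        cur = cur / part
        if not cur.exists():
            return str(cur)
    return None

# In Colab, after drive.mount, you might run:
#   first_missing_part('/content/gdrive/My Drive/fastai-v3/data/urls_grizzly.csv')
# and it would print the first folder in the chain that doesn't exist.
```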
Hey, it would be a big help if you could run me through what I have to do. I did exactly as you said… but there doesn't seem to be a folder named fastai-v3/data.