and received this error:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-8-41891b907d8d> in <module>()
1 path = Path(base_dir + 'data/pets')
----> 2 dest = path/folder
3 dest.mkdir(parents=True, exist_ok=True)
4 path = untar_data(URLs.PETS); path
NameError: name 'folder' is not defined
I made a folder in my Drive named fastai-v3, but I don't really understand where untar_data is saving the files.
How can I make it save to Google Drive?
Or to a folder on my PC?
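For reference, a minimal sketch of what the traceback above is complaining about and where untar_data puts files by default, assuming fastai v1 on Colab with Drive mounted at /content/gdrive; the folder value 'dogs' and the fastai-v3 path are just example assumptions, not from the original notebook:

```python
from fastai.vision import *

# The NameError happens because `folder` is used before it is defined;
# it has to be set first (example value only):
folder = 'dogs'

base_dir = '/content/gdrive/My Drive/fastai-v3/'   # example Drive path, adjust to your own
path = Path(base_dir + 'data/pets')
dest = path/folder
dest.mkdir(parents=True, exist_ok=True)

# By default, untar_data downloads and extracts under ~/.fastai/data on the Colab
# instance and returns that path; a dest= argument can point it somewhere else,
# such as the mounted Drive folder above.
path = untar_data(URLs.PETS, dest=dest); path
```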
@salvatore.r @vikbehal hope @gamo's fix worked for you. I found that the overloaded learn.save/load code I posted a few posts above helped me split up my model saves, whilst retaining the free fastai Colab functionality.
If you are using standard datasets, the notebooks and saved weights are usually enough to learn and progress, and you can keep a gdrive copy.
Saved weights can be in the 250MB range for image recognition.
If you are creating your own datasets, then having the whole folder (images, weights, notebooks) on your gdrive is a good option, though I am not sure whether you need to shift your image sets to the Colab instance for performance. Anyone care to comment?
As you get more sophisticated you will see that practitioners begin using paid cloud services like AWS EC2/S3 for storing models, or they build their own DL computer and keep the datasets on it.
I am not sure of your tech experience (high | low), so I hope this advice hits the right level for you.
I did try, but the above will just create the directory structure. How do I tell fastai that the mentioned path is the one where data and models should be saved?
I followed this post to change the configuration. It works if I run the notebook as-is, but as soon as I change the runtime to GPU, it goes back to the default fastai path.
Gabriel, thank you. I did that. Now how do I tell fastai to download the data to that path? Also, what changes should I make so that the model data is saved to Google Drive, i.e. to that path?
Having fastai work directly with gdrive is probably not a good idea. You have to treat data on gdrive as you would data on a NAS or other remote storage: if you try to run training directly off gdrive, it has to move that data over the network, and it will be slow.
Keep all data and models local on Colab while you are working, and then use !cp or a Python library to copy data and/or models to gdrive. It is best to create a function (def) for this; that way you can include it in your learner and have it save the model to gdrive during training, so a long run gives you running backups of your model. A sketch of such a helper is below.
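A minimal sketch of that idea, assuming fastai v1 on Colab with Drive mounted at /content/gdrive; the fastai-v3 folder and the backup_model name are just example choices:

```python
import shutil
from pathlib import Path

# Example backup location on the mounted Drive (adjust to your own folder).
GDRIVE_MODELS = Path('/content/gdrive/My Drive/fastai-v3/models')

def backup_model(learn, name='stage-1'):
    "Save the model locally on the Colab instance, then copy the .pth file to Drive."
    learn.save(name)                                    # writes <learn.path>/<learn.model_dir>/<name>.pth locally
    src = Path(learn.path)/learn.model_dir/f'{name}.pth'
    GDRIVE_MODELS.mkdir(parents=True, exist_ok=True)
    shutil.copy(src, GDRIVE_MODELS/src.name)            # the slow network copy happens once, after the fast local save

# e.g. call it after (or between) training stages:
# learn.fit_one_cycle(4)
# backup_model(learn, 'stage-1')
```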
However, when I do this, what actually happens is that it creates a folder in my Google Drive, but for some reason I am missing, it does not download anything into it. So it just creates an empty folder inside the fastai folder on my Drive.
In my opinion, what is happening is that it creates the new path but does not download the data into it.
As I said in a previous post, if you are just saving the data to gdrive then that is OK, but if you then want to use the data on Colab, Colab will have to fetch that data back from gdrive over the network before it can be used, and that is slow.
Instead, download all data and models locally to your Colab instance and use them there; then, when you want to save or back up your work, copy the whole project folder to gdrive, as in the sketch below.
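A minimal sketch of that backup step, assuming Drive is mounted at /content/gdrive and the local project lives under /content/project (both paths are just examples):

```python
import shutil
from pathlib import Path

local_project = Path('/content/project')                             # example local working folder on Colab
gdrive_backup = Path('/content/gdrive/My Drive/fastai-v3/project')   # example destination on Drive

# Copy the whole project folder (data, models, notebooks) to Drive in one go.
# dirs_exist_ok needs Python 3.8+; on older runtimes, remove any existing copy first.
shutil.copytree(local_project, gdrive_backup, dirs_exist_ok=True)
```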
As of 30 minutes ago, I am no longer able to use the fastai library. Despite following the setup guide, I get the error below when I run: from fastai.vision import *