The only thing you might need to be aware of is that if you make changes on your Google Drive itself (for instance, if you remove or add images in your dataset), it can take some time for those changes to sync to the Colab file system.
I will just add a note: for training, split the run into parts (e.g. 50 epochs into 25/25) so you can save the model in between and reload it in case of a disconnect.
Even as a Pro user, I have been getting disconnects.
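The save-in-parts pattern above can be sketched in a framework-agnostic way. This is a minimal illustration using pickle with a dummy state dict; the checkpoint path and the `state` contents are hypothetical, and in fastai you would use `learn.save(...)` / `learn.load(...)` (pointing the path at your mounted Drive so it survives a disconnect):

```python
import os
import pickle

CKPT = "checkpoint.pkl"  # hypothetical path; on Colab, point this at your Drive

def load_checkpoint():
    """Resume from a saved checkpoint if one exists, else start fresh."""
    if os.path.exists(CKPT):
        with open(CKPT, "rb") as f:
            return pickle.load(f)
    return {"epoch": 0}

def save_checkpoint(state):
    with open(CKPT, "wb") as f:
        pickle.dump(state, f)

state = load_checkpoint()
for epoch in range(state["epoch"], 50):
    # ... one epoch of training would go here ...
    state["epoch"] = epoch + 1
    if state["epoch"] % 25 == 0:  # save every 25 epochs (the 25/25 split)
        save_checkpoint(state)
```

If the runtime disconnects mid-run, re-running the same cell picks up from the last saved epoch instead of starting over.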
Are you running the notebooks from the clean folder? I noticed the same issue on my local machine. I simply copied utils.py into the clean folder and it worked, but I suspect there may be a cleaner solution to make this work. Python needs to know where to look for this file: it checks PYTHONPATH and the current directory, and since utils.py is not in the current directory it throws an error. That's my understanding at least.
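A possibly cleaner alternative to copying the file is to add the folder that contains utils.py to Python's module search path at runtime. A minimal sketch, assuming utils.py lives in the fastbook folder on your Drive (the path below is a placeholder; adjust it to your own layout):

```python
import sys
from pathlib import Path

# Hypothetical location of the folder containing utils.py
fastbook_dir = Path("/content/drive/My Drive/fastbook")

# Prepend it to the module search path so `import utils` can find the file
if str(fastbook_dir) not in sys.path:
    sys.path.insert(0, str(fastbook_dir))

# import utils  # would now succeed if utils.py is in that folder
```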
You can add callback like “SaveModel” and “CSVHistory”, so you save every step of the way and keep a history of epoch results…
Should it be %cd '/content/drive/My Drive/fastbook' ?
Can you point me to how?
@muellerzr are you having any issues with nbdev?
!pip install nbdev
solves that, but then it complains about azure.
I ran pip install -r requirements.txt
as per the instructions here: http://book.fast.ai/
I import everything, but I get this error:
Unsupported Cell Type. Double-Click to inspect/edit the content.
yup, that worked. thanks @miwojc
this worked for me:

After navigating in Google Drive and opening the notebook:
!pip install fastai2

Mount the drive:
from google.colab import drive
drive.mount('/content/drive')

Navigate to fastbook:
%cd '/content/drive/My Drive/fastbook'

Then install the requirements:
!pip install -r requirements.txt
Side note: I just realized that on Colab you don't have to put ! before pip. So both work: !pip install fastai2 or pip install fastai2
Just making sure, but I don't see !pip install -r requirements.txt in your screenshot.
!pip install -r requirements.txt
This seems to work perfectly without needing any additional commands.
It works
Thanks @barnacl, all good now.
@jeremy Thank you so much again for the great work.
- Is there a way we can use fastpages with Colab and update our blogs?
- Should we import our repositories into Google Drive?
- And then how can we push the commits to GitHub/fastpages?
@Albertotono I made an nbdev tutorial that describes updating and working with GitHub in Colab. My best advice: just download the notebooks and run your scripts to build etc. from a Linux subsystem. Much less headache.
In the Final Example image, is the pip install nbdev step required?
Change of plans - we’re using this repo: https://github.com/fastai/course-v4
requirements.txt is not in course-v4; is it required for utils.py, @jeremy?
I can select GPU as the runtime type. With this command, you can check the details of the GPU that Colab will use at runtime:
!nvidia-smi
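The same check can be done from Python, which degrades gracefully when no GPU is attached. A minimal sketch (it assumes the nvidia-smi binary is on PATH, which it is on Colab GPU runtimes; elsewhere it just reports that no GPU is visible):

```python
import shutil
import subprocess

def gpu_info():
    """Return nvidia-smi output if a GPU driver is visible, else a fallback message."""
    if shutil.which("nvidia-smi") is None:
        return "nvidia-smi not found: no NVIDIA GPU visible in this runtime"
    result = subprocess.run(["nvidia-smi"], capture_output=True, text=True)
    return result.stdout

print(gpu_info())
```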