I must be missing something obvious, because I’ve read all of the documentation and the forum articles.
Every time I step away from my work in Colab and the runtime disconnects, I lose all of my progress and have to rebuild the data loaders and learners.
Is there some way to save that progress?
On Colab with a GPU, the only thing that persists is your notebook. Every time it disconnects, you are assigned a new machine with a different disk, a different GPU, and none of the previous machine's state, so unfortunately you have to rerun your notebook to get back to where you were.
If you are working on something that requires more than a couple of hours of training, you should probably get an actual instance from a cloud provider; Colab has its limitations there.
You can link your Google Drive to store files:
# Mount Google Drive
from google.colab import drive
mount_location = "/gdrive"
drive.mount(mount_location)
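Once Drive is mounted, anything you write under the mount point survives a disconnect. A minimal sketch of building the save path — the folder name `fastai_models` and the file name `export.pkl` are just illustrative choices, not anything Colab requires:

```python
from pathlib import Path

# Drive is mounted at /gdrive (per the snippet above); "My Drive" is the
# root folder Google Drive exposes inside the mount.
drive_root = Path("/gdrive") / "My Drive"
model_dir = drive_root / "fastai_models"     # hypothetical folder for checkpoints
export_path = model_dir / "export.pkl"
print(export_path.as_posix())                # /gdrive/My Drive/fastai_models/export.pkl

# In a Colab session you would then run something like:
#   model_dir.mkdir(parents=True, exist_ok=True)
#   learn.export(export_path)      # persist the trained Learner to Drive
#   learn = load_learner(model_dir)  # restore it after a reconnect
```

You still have to rerun the cells that build the data loaders, but the trained weights themselves no longer vanish with the VM.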
Then, when training your model, you can use callbacks to save it after every epoch, or to keep only the best model so far:
callbacks=[SaveModelCallback(learn, every='improvement', name='best_so_far')]
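The `every='improvement'` behaviour boils down to: track the best value of the monitored metric and overwrite a single checkpoint only when it improves. A minimal sketch of that logic in plain Python (file writes stand in for `learn.save(...)`; the loss values are made up for illustration):

```python
import os
import tempfile

def maybe_save(epoch_loss, best_loss, path, state):
    """Overwrite the checkpoint only when the monitored loss improves."""
    if best_loss is None or epoch_loss < best_loss:
        with open(path, "w") as f:   # stand-in for learn.save(name)
            f.write(state)
        return epoch_loss
    return best_loss

checkpoint = os.path.join(tempfile.mkdtemp(), "best_so_far.txt")
best = None
for epoch, loss in enumerate([0.9, 0.7, 0.8, 0.5]):
    best = maybe_save(loss, best, checkpoint, f"weights@epoch{epoch}")

print(best)                               # 0.5
print(open(checkpoint).read())            # weights@epoch3
```

Epoch 2 (loss 0.8) is skipped because it did not beat the running best of 0.7, so after training the checkpoint holds the epoch-3 weights — exactly the model you want to reload after a Colab disconnect.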
Additionally, some debug code for when the link between Google Drive and Google Colab gives you a hard time:
## Troubleshooting helpers
# Uncomment and run the line below if you are having trouble (re)mounting Google Drive:
# drive.mount(mount_location, force_remount=True)
# Uncomment and run the line below if you want to flush all pending writes to Google Drive and unmount the folder:
# drive.flush_and_unmount()