I don’t have a machine with an Nvidia GPU, so I rely a lot on Google Colab and Kaggle Kernels for my experiments. Would it be at all possible to use nbdev with Google Colab to create a code base from its Jupyter notebooks?
Absolutely! I’ll be showing it in my study group. Generally you write the notebook, then download it and re-upload it to your instance, and from there you can run the export etc. We need this extra step because the notebooks themselves are static instances on Google’s servers, so we can’t “directly” modify them from a storage space (and use the CLI).
A little tutorial/run-through of this would be great! Maybe even as a PR to the nbdev docs?
Sure I’ll have some time this weekend, I’ll whip something up! (with a video too)
Thanks a ton! I really want to use nbdev to create a code base for things I use repeatedly, instead of copy-pasting them from one notebook to another. This will be a life saver.
Hi muellerzr, hi mate!
You said it in your Chai Time video.
I still don’t believe you sleep!
I had a go today and managed to get through the entire nbdev tutorial using just Colab notebooks and Google Drive. It all worked just as intended, including the automatically generated documentation on GitHub Pages and even a working PyPI release of a test library. It was fiddly, with a few workarounds here and there to get nbdev, GitHub, Google Drive and Colab working together, but certainly possible.
Yes. I made it a bit more streamlined with some functions, and I’m going to release the tutorial today (so you never leave your Colab instance). The easiest option by far, though, is to clone the repo into your notebook, then download the notebook, move it to your cloned repository on your machine, clean the notebook, convert it to a script, and push up to your repo.
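For anyone following along, that loop might look roughly like the sketch below. The repo URL and notebook name are placeholders, and I’m assuming the nbdev v1 commands `nbdev_clean_nbs` and `nbdev_build_lib` are what’s meant by “clean” and “convert”:

```
git clone https://github.com/<user>/<repo>.git   # your nbdev project
cd <repo>
# ...download the notebook from Colab and move it in here, e.g. 00_core.ipynb...
nbdev_clean_nbs      # strip notebook metadata so diffs stay clean
nbdev_build_lib      # convert the notebooks into the Python library
git add -A && git commit -m "update from Colab" && git push
```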
Ok, great. I’ll take a look at your tutorial, as it sounds like you may have a more straightforward way of doing this, and if I have anything else to add on top I’ll post it.
BTW. Good job with the ‘A walk with fastai2’ lesson. Looking forward to the others.
Awesome, I’m hoping so. It was a bit frustrating to get it all running. You’ll notice the flaws immediately, but I can’t seem to find a workaround for them. I’m hoping you (and others) can improve upon this second method.
And thank you!
@muellerzr Thanks for doing this. I just had a quick look at the video. Your basic approach to working with nbdev exclusively from within Colab is very similar to mine, though I’m not facing any of the bugs you mention, i.e. index.py updates as expected and all tests pass OK (albeit after a bit of fiddling around). I’ll take a look at your code in more detail tomorrow and see what might be making the difference.
One thing to be careful of, though: when you clone the git repo to Google Drive with `!git clone https://username:email@example.com…`, a copy of the username AND password gets stored in the local repo on Google Drive (in the `.git/config` file). Users must therefore be careful NOT to share the local repo (on Google Drive) with anyone, as it risks exposing their GitHub password.
@mathew100 thank you for the feedback! Ideally you shouldn’t need to clone that way; you should be cloning via just https://github.com/muellerzr/nbdev_colab.git (or whatever your forked repo is), so I’m unsure if that’s still an issue? I never used the user/pass method you described, only `add_origin`. Does this still do the same? (I’m still learning git in a few places.) If so, is there a way to get around this?
@muellerzr Cloning with just https://github.com/muellerzr/nbdev_colab.git doesn’t store username/password details, but on its own it won’t allow you to push to GitHub from Colab. You need either `!git remote add origin https://username:firstname.lastname@example.org...` (as you do in the git_setup function) or `!git clone https://username:email@example.com…` (which has been my approach) to allow pushing to GitHub from Colab, and in both cases the username/password gets stored in the local repo. I don’t think there’s any way round this, so it’s worth just warning people about it.
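To see this concretely, here is a small offline demonstration of how credentials embedded in a remote URL end up in plain text in `.git/config`. The path, username and password are all made up:

```shell
# Everything here is illustrative: /tmp/cred_demo, myuser and mypassword
# are made-up values, not a real repo or account.
rm -rf /tmp/cred_demo
mkdir -p /tmp/cred_demo && cd /tmp/cred_demo
git init -q .
# Adding a remote with user:password in the URL, as a credentialed clone does:
git remote add origin https://myuser:mypassword@github.com/myuser/myrepo.git
# The password now sits in plain text in the repo's config file:
grep url .git/config
```

Anyone who can read that Drive folder can read that file, which is exactly the risk described above.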
Got it! Thank you for the clear explanation That helped a lot! I’ll add this to the post
The nbdev project is great! I’ve been using it on Gradient, which is a persistent environment, so you don’t need to do anything fancy. I just ran:
```
$ pip install jupyterlab-nvdashboard
$ jupyter labextension install jupyterlab-nvdashboard
```
Every time you start your notebook, it will still be there. In any case, I was thinking about installing nbdev by default on all the templates. Is there any reason why that’s a bad idea?
No, that would be great I think!
PS: @dkobran I sent you an email with some important questions and requests on Dec 13th - please check your spam folder in case you missed it…
Can we use `nbdev_update_lib` within Google Colab?
Yes, you can use any bash command, so long as you are in your nbdev project directory, by adding a `!` in front of it.
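One Colab-specific gotcha worth noting: `!cd` runs in a throwaway subshell, so it doesn’t actually change the working directory for later commands. Use the IPython `%cd` magic first. The Drive path below is a placeholder for your own project:

```
%cd /content/drive/MyDrive/<your-nbdev-repo>
!nbdev_update_lib
```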
I found your PyPI package nbd-colab, but I’m having trouble using it.
I followed your documentation and did the below, but ran into an error right at the beginning. I guess I missed some steps…
I created a Colab notebook and ran the below:
```
!pip install nbd_colab
from nbd_colab import *
from nbdev import *
```
leading to the error below:
```
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-18-53c94a362f4e> in <module>()
----> 1 from nbd_colab import *
      2 from nbdev import *

4 frames
/usr/local/lib/python3.6/dist-packages/nbdev/imports.py in __init__(self, cfg_name)
     40         while cfg_path != cfg_path.parent and not (cfg_path/cfg_name).exists(): cfg_path = cfg_path.parent
     41         self.config_file = cfg_path/cfg_name
---> 42         assert self.config_file.exists(), "Use `create_config` to create settings.ini for the first time"
     43         self.d = read_config_file(self.config_file)['DEFAULT']
     44         add_new_defaults(self.d, self.config_file)

AssertionError: Use `create_config` to create settings.ini for the first time
```
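For what it’s worth, that assertion fires because nbdev walks up from the current working directory looking for a `settings.ini` and never finds one, i.e. the import is happening outside an nbdev project. A minimal offline sketch of the file it is looking for (the directory name and the single key are made up; a real `settings.ini` from the nbdev project template contains many more keys):

```shell
# Hypothetical project directory; a real one comes from the nbdev template.
rm -rf /tmp/my_nbdev_project
mkdir -p /tmp/my_nbdev_project && cd /tmp/my_nbdev_project
# nbdev reads the [DEFAULT] section of settings.ini (see the traceback),
# so the file must exist in (or above) the directory you work from:
printf '[DEFAULT]\nlib_name = my_nbdev_project\n' > settings.ini
test -f settings.ini && echo "settings.ini found"
```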
I tried the below but got:

```
name 'create_config' is not defined
```
Any input would be more than welcome!