So … you have a few notebooks exploring the same kind of problem and you find yourself copying/pasting code between them, but you’re not ready to go full nbdev. What if you could:
- flag cells to be written to a py module as soon as you run them in the notebook
- make that module available to other notebooks?
As a quick example:
- cell 5 in: https://github.com/pete88b/data-science/blob/master/myohddac/notebooks/010_mnist_training.ipynb exports `new_model` (see cell 1 for the nbdev_quick import and init), so I can modify and re-run just cell 5 and the module is updated
- cell 6 in: https://github.com/pete88b/data-science/blob/master/myohddac/notebooks/021_mnist_or_not_training.ipynb calls `new_model`
- you can see all exported code in: https://github.com/pete88b/data-science/blob/master/myohddac/notebooks/quick_module.py
If you’re interested in the implementation, please see: https://github.com/pete88b/data-science/blob/master/myohddac/notebooks/nbdev_quick.py
I realize this is similar to https://github.com/ipython-contrib/IPython-extensions/blob/master/ipyext/writeandexecute.py but …
- we don’t force you to type an ID and file name with every exported cell (too much typing for me :o)
- this won’t update your module if running the cell fails
- notebooks using this could be used in a full nbdev project (just need to change import / default_exp flag)
@sgugger / @jeremy: might this be a good fit for nbdev? If yes, I’d be happy to work on a PR.