Questions on nbdev

You can follow the markdown syntax inside your document and it will be properly displayed in your docs.


I’m a bit confused about nbdev_test_nbs. Are cells that have the flags we pass to nbdev_test_nbs skipped during testing, or are only those cells tested? I’m more interested in achieving the former behavior 🙂

It looks like comments above a function are being removed in the latest version of nbdev (built from source).

#export
# Adapted from
# https://discuss.pytorch.org/t/how-can-i-replace-an-intermediate-layer-in-a-pre-trained-network/3586/7
def convert_MP_to_blurMP(model, layer_type_old):
    "Convert MaxPool/AvgPool to MaxBlurPool"
    ...

How can I make sure they get exported? If I copy them over manually, will they be preserved?

I’ve opened #167 to track this.

@patch
def tsne(x:Tensor,k=2,seed=47):
  "TSNE embeddings using `sklearn`"
  ...

I’ve patched tsne to the Tensor class and it works fine with the code. However, when I try to run nbdev_build_docs, I’m getting “name ‘Tensor’ is not defined” error
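For context, fastcore’s @patch attaches a function to an existing class based on the type annotation of its first argument. Here is a minimal stand-alone sketch of the idea (a simplified illustration, not fastcore’s actual implementation; the Tensor class here is a stand-in for torch.Tensor):

```python
def patch(f):
    # simplified sketch: attach f to the class named in its first annotation
    cls = next(iter(f.__annotations__.values()))
    setattr(cls, f.__name__, f)
    return f

class Tensor:  # stand-in for torch.Tensor
    pass

@patch
def tsne(x: Tensor, k=2, seed=47):
    "TSNE embeddings using `sklearn`"
    return (k, seed)  # placeholder body for illustration

t = Tensor()
print(t.tsne())  # tsne is now a bound method on Tensor
```

This is why the docs build fails: show_doc needs the Tensor name itself in scope to resolve Tensor.tsne.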

converting: nbs/04_utils.ipynb
An error occurred while executing the following cell:
------------------
show_doc(Tensor.tsne, default_cls_level=2)
------------------

---------------------------------------
NameError                                 Traceback (most recent call last)
<ipython-input-11-deee42e37eb3> in <module>
----> 1 show_doc(Tensor.tsne, default_cls_level=2)

NameError: name 'Tensor' is not defined

Traceback (most recent call last):
  File "/home/kshitij/anaconda3/envs/fastai2/bin/nbdev_build_docs", line 8, in <module>
    sys.exit(nbdev_build_docs())
  File "/home/kshitij/anaconda3/envs/fastai2/lib/python3.7/site-packages/fastscript/core.py", line 73, in _f
    func(**args.__dict__)
  File "/home/kshitij/anaconda3/envs/fastai2/lib/python3.7/site-packages/nbdev/cli.py", line 162, in nbdev_build_docs
    notebook2html(fname=fname, force_all=force_all, n_workers=n_workers)
  File "/home/kshitij/anaconda3/envs/fastai2/lib/python3.7/site-packages/nbdev/export2html.py", line 537, in notebook2html
    raise Exception(msg + '\n'.join([f.name for p,f in zip(passed,files) if not p]))
Exception: Conversion failed on the following:

Can you take a look at https://github.com/pete88b/decision_tree/blob/master/71_tensor_patch.ipynb
hopefully this replicates your issue - and also suggests a work-around.

I’ve raised https://github.com/fastai/nbdev/issues/171 to see if this is an issue or intended behavior


By default, all code cells are run during testing. Putting a test flag (e.g. #slow) in a code cell means the cell will not be run unless you pass that flag: nbdev_test_nbs --flags slow.
nbdev_test_nbs --flags slow will run all code cells without test flags, plus all code cells with the #slow flag.

So, if you want flagged cells to be skipped during testing, don’t pass that flag. e.g.

  • you have tst_flags=slow|gpu|broken in settings.ini
  • you want to run all tests that are not flagged with broken (i.e. skip the broken test steps)

nbdev_test_nbs --flags "slow gpu"
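To make the rule concrete, here’s a tiny hypothetical predicate (not nbdev’s actual code) capturing the semantics above: a cell runs only if every flag on that cell was passed via --flags.

```python
def cell_should_run(cell_flags, passed_flags):
    # a cell runs iff all of its test flags were passed via --flags;
    # unflagged cells (empty cell_flags) therefore always run
    return set(cell_flags) <= set(passed_flags)

passed = {"slow", "gpu"}                       # nbdev_test_nbs --flags "slow gpu"
print(cell_should_run([], passed))             # unflagged cell runs: True
print(cell_should_run(["slow"], passed))       # #slow cell runs: True
print(cell_should_run(["broken"], passed))     # #broken cell is skipped: False
```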

hope this helps (o:


Thanks @pete88b! I’m not sure I understood the workaround correctly, but I tried adding

from fastai2.torch_basics import *
from nbdev.showdoc import *

these lines to the imports cell at the top of the notebook. It didn’t help.

Also, one thing I’m finding difficult is that nbdev_build_docs has become an extreme RAM hog. I tried setting --n_workers according to the number of cores in my system, but it eventually takes almost all of the RAM and freezes my laptop.
I have an Intel i7 7th-gen processor with 8 GB of RAM, and I’m finding that insufficient for generating the docs. With normal applications open, my RAM usage is around 2.5 GB, but when I execute nbdev_build_docs, it almost always exhausts the RAM and eventually freezes the system.

I think you need to try this instead; Tensor is specifically imported in torch_core. This worked for me and did not throw the error when building docs.

from fastai2.torch_core import Tensor

As for the memory issues, I noticed the same and I had to just stop and restart the kernel when my computer started slowing down.

I would also add: to save on RAM, only import the packages you need instead of using import *.

Interesting! I’ll try that. Hope someone builds a tool to convert all import * statements to only the packages actually used 😛 Just like IntelliJ’s Optimize Imports.

I guess Tensor is not defined in fastai2.torch_core; rather, it’s the original Tensor class from PyTorch. So I tried from torch import Tensor, but this didn’t help.

don’t worry about the workaround (o: I’m about to make a PR to fix this issue

i’ll try to take a look at memory use in the next few days - are you able to share the project you’re working on? it’d really help when i’m trying to replicate issues

I see, but it did work for me; Tensor is imported from torch_imports.py, and that is imported into torch_core.


pull the latest when https://github.com/fastai/nbdev/pull/173 gets accepted - this should fix your problem

if you can’t afford to wait, try adding a cell containing:
_all_ = ['Tensor']
run nbdev_build_lib, then I think nbdev_build_docs will work.
please let me know if this doesn’t work - it’s just a quick idea I’ve not had time to try yet


Can you please specify your solution? I have the same problem. Did you mean that you forgot to push your local changes to your remote repository? I’m not sure what you did!

If you’re using an editable install of nbdev, you can git pull the latest nbdev code and this should start working for you. This new nbdev code should be available via the regular pip installer in a couple of weeks.

If you can’t use an editable install, you can add the thing you’re patching to __all__. e.g.:
_all_ = ['Tensor']
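For background, nbdev’s `_all_` convention adds the listed names to the generated module’s `__all__`, so a star import of your module in the docs notebook brings `Tensor` into scope for show_doc. Here’s a minimal Python sketch of why being in `__all__` matters for star imports (the module is built in memory purely for illustration):

```python
import sys, types

# build a throwaway module whose __all__ re-exports Tensor
mod = types.ModuleType("mymod")
exec("class Tensor: pass\n__all__ = ['Tensor']", mod.__dict__)
sys.modules["mymod"] = mod

ns = {}
exec("from mymod import *", ns)
print("Tensor" in ns)  # True: Tensor is visible because it's listed in __all__
```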

There’s an autogenerated file called _nbdev.py inside your lib folder. I pushed the changes to my library but forgot to push the changes in the (auto-generated) _nbdev.py file. After pushing those changes as well, my issue was solved. But lately it’s occurring more frequently, and I don’t know any other cause for it.


I see that if I delete a notebook from my docs, and then perform:

nbdev_clean_nbs
nbdev_build_lib
nbdev_build_docs

The documentation files generated from the deleted notebook are not deleted. Is there a way to delete them automatically using nbdev?

No. nbdev also allows you to write code outside notebooks for your library. Therefore we don’t delete files for you, on the assumption that you may have created them manually.


It makes sense, thanks for replying @hamelsmu !

They are not deleted, but notice they are ignored by git (see .gitignore)