Nbdev discussion

Thanks for releasing this tool! I am checking out the documentation.

nbdev_install_git_hooks ()
I believe these git hooks should be general to most platforms, not limited to GitHub?

This command installs git hooks to make sure notebooks are cleaned before you commit them to GitHub and automatically trusted at each merge. To be more specific, this creates:
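For context, the cleaning part relies on Git's clean-filter mechanism, which is a plain Git feature, not GitHub-specific. A rough sketch of how such a filter is wired up (the exact files and command nbdev writes may differ):

```ini
# .gitattributes: run the "clean-nbs" filter on notebooks before staging
*.ipynb filter=clean-nbs

# .git/config (or a repo-local config file Git is told to include):
# a hypothetical filter definition; the exact clean command may differ
[filter "clean-nbs"]
    clean = nbdev_clean_nbs --read_input_stream True
    required = true
```

Since this all happens on the client side, there is no obvious reason it would behave differently with a GitLab remote.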


We don’t use anything other than GitHub, so I don’t know if they work the same way on other platforms. Let us know if they are more general!

Sure, I would like to test it with GitLab. I tried running nbdev_install_git_hooks in a local repository, but it just hangs and nothing gets printed.

Here is what I have done.

  1. pip install nbdev
  2. make a new directory and create a dummy notebook
  3. git init
  4. nbdev_install_git_hooks
  5. It hangs…

Maybe I am not using the command right?


It’s likely the command will fail unless you have filled in the settings.ini file (at least with lib_name at the very top).
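For anyone hitting the same hang, a minimal settings.ini sketch; the field names beyond lib_name come from the nbdev project template and may vary by version, and the values here are illustrative:

```ini
[DEFAULT]
# lib_name is the minimum nbdev needs; the project template
# pre-fills the rest (user, description, version, ...)
lib_name = my_lib
user = my_github_user
```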

Thanks, I totally missed the part that it needs to start from a template with config files.
The hooks work fine with a local repository. The ReviewNB tool that Jeremy mentioned in the blog does not support GitLab; not sure if there are any alternatives for this.

Do you normally just use the command line for things like git add? When I tried to add files with the VS Code GUI it caused an error, but it works fine if I add them through the command line.


Amazing package, just what I’ve been missing in my professional life.

I’ve been using it for about 10 hours now, and it has already made my professional life better.

qq: what is the best way to give feedback on bugs and functionality?

This forum, or creating a GitHub issue like this one?


Hi, big fan of the fastai courses, so excited by nbdev. However, I am also a big fan of conda / Anaconda, and of JupyterLab (the next step after the original Jupyter Notebook).

Two questions:

  1. Will it be added to the Anaconda fastai channel, so its installation can be managed by conda?
  2. Will nbdev work with notebooks produced by JupyterLab?


Hi Don, JupyterLab and Jupyter Notebooks are just ways for your browser to interact with files and kernels.

The underlying files (.ipynb) do not differ between them.

Hence, nbdev is already fully compatible with both JupyterLab and Jupyter Notebook.
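To illustrate the point: an .ipynb file is just JSON, whichever frontend wrote it. A stripped-down example (cell contents are made up):

```json
{
  "nbformat": 4,
  "nbformat_minor": 4,
  "metadata": {},
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": ["# Title\n", "> Summary"]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": ["print('hello')"]
    }
  ]
}
```

Any tool that reads this format, nbdev included, cannot tell whether JupyterLab or the classic Notebook produced it.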


Hi David

I think that JupyterLab and the original Jupyter have different extension interfaces (e.g. there are two black formatter extensions, one for Lab and one for plain Jupyter). However, I suspect that nbdev doesn’t use these interfaces, so I would agree that nbdev should be blind to the source of the .ipynb.

If nbdev is not hosted on the fastai channel, that makes environment management a little harder for heavy users of conda (although I read that conda installs from a local tar file are possible, so maybe a tar file could be made available?)


Hello, thanks for this awesome tool. I was wondering how this can be used with GitLab: I know that with GitHub one can use ReviewNB to handle merge conflicts, but we use GitLab. Any tips here? This is needed because we do code review. Or do I have to use something like jupytext to create .py files and only commit the .py files, not the .ipynb?


@nok Your error sounds like you don’t have nbdev installed; the command would exist otherwise.
@dzlob Either or is fine, we watch both.
@donrcameron Yes, we will probably add a conda package at some point.

QUESTION: What is the recommended way to fill in the title and summary tags in the head of the generated docs/*.html pages?
Currently, newly added *.ipynb notebooks generate *.html pages with the placeholder values “Title” and “summary”, respectively.

Ah it’s not documented, good catch.
It’s a markdown cell with

# Title
> Summary

This may also be undocumented:
The table of contents in the docs/*.html pages is automatically pulled together from the markdown cells with headers of level 2 and below, and from cells with function definitions.
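Putting both conventions together, the markdown cells of a doc notebook might look like this (all names here are illustrative):

```markdown
# Data loaders
> Helpers for reading and batching datasets

## Reading files
Level-2 headers like the one above end up in the table of contents.

## Batching
So does this one; the `# Data loaders` / `> ...` pair at the top becomes
the page's title and summary tags.
```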

Hi !
Very nice package, I started using it and it looks very convenient for notebook lovers like me. I am stumbling on a problem with nbdev_build_docs though. I used it once at the beginning, and it worked fine, but now it gives me the same error all the time:

An error occurred while executing the following cell:
from nbdev.showdoc import show_doc
from grade_classif.core import *

An exception has occurred, use %tb to see the full traceback.

SystemExit: 2

The thing is, I can’t use %tb as I am in a terminal. So I tried using notebook2html directly in a notebook, but it just runs forever, then times out and writes Kernel didn't respond in 60 seconds (I used the #hide marker as I guessed it should prevent it from running itself indefinitely, but it still doesn’t work). Therefore I am unable to get a traceback and have no idea what causes the problem. Does anyone have any idea how to at least get a traceback of the error?

This is weird. Does the notebook execute normally? I have never seen this error before.

Yes it runs perfectly, and nbdev_build_lib does work perfectly fine as well.

OK, I found where it came from. I use argparse in one of my notebooks, as I intend to make it available in the final source code, but it seems to trigger this error. When I remove the #export tag from the cells using it, everything works fine. I don’t know where the incompatibility comes from though.
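For what it’s worth, SystemExit: 2 is argparse’s usual exit code when it sees command-line arguments it doesn’t recognise, and when a notebook is executed by a kernel, sys.argv holds the kernel’s own flags rather than yours. A common defensive pattern (a sketch, not nbdev-specific; the parser fields are made up) is to accept an explicit argv and use parse_known_args:

```python
import argparse


def parse_args(argv=None):
    # Passing an explicit argv (e.g. []) keeps the parser away from the
    # host tool's sys.argv, and parse_known_args tolerates flags it does
    # not recognise instead of exiting with status 2.
    parser = argparse.ArgumentParser(description="hypothetical training script")
    parser.add_argument("--lr", type=float, default=0.01)
    args, _unknown = parser.parse_known_args(argv)
    return args


if __name__ == "__main__":
    print(parse_args().lr)
```

Here `parse_args(["--some-kernel-flag"])` simply ignores the unknown flag instead of exiting, which is the failure mode a strict `parser.parse_args()` hits when executed outside the command line.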

You should use https://github.com/fastai/fastscript; it works well with nbdev. I haven’t tried argparse, so I don’t know if there are bad side effects.


I’ll try that, thanks. I was hoping to be able to use test_tube’s HyperOptArgumentParser to enable hyperparameter search, but I guess it might not work well with nbdev.
Just as a side note, I tried to completely isolate the argument parser in an independent notebook (in case the problem was that it got called multiple times in parallel from multiple imports), but it doesn’t work any better. I would guess the problem comes from multiprocessing, but it still didn’t work with n_workers=1, so I’m not sure.

EDIT: It probably comes from a compatibility problem between nbconvert and argparse, as notebook2script works fine, and it doesn’t use nbconvert while it does use multiprocessing.