How do I create packages with multiple variants using nbdev?

Hi all,

I'm looking for suggestions/approaches on using nbdev to create Python packages with variants, e.g.:

Let's say I have a main package named foo which combines the functionality of vision, audio, text, medical, and collab modules, plus some core functionality that comes with the foo module itself.

Now I want to offer subsets of the foo package, such as foo[vision] or foo[audio], which only contain the core functionality plus the particular domain (i.e. vision, audio, text, etc.). The goal is to reduce the size of each variant package and the number of dependencies it pulls in (e.g. if I were using foo[vision], I don't want to have to install the dependencies required by the foo[audio] module).

I know that, based on this discussion, each of these subsetted modules (e.g. foo[vision]) is actually a separate package from the others, and it's only a convention that isn't enforced by any tooling (i.e. foo and foo[vision] can be totally unrelated packages).

But I was wondering if there is anything in nbdev (or something related) that could assist me in maintaining the main module, and perhaps in syncing it across different nbdev projects.

The ideal scenario would be that I have a foo.core set of notebooks (with default_exp set to foo.core), another set of notebooks in a separate nbdev project with default_exp set to foo.vision.* modules, another nbdev project for foo.audio, and so on. Then I'd just run some make commands to build them all and release them to PyPI automagically…
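For illustration, the first cell of each notebook might look something like this (the project names and module paths here are just hypothetical):

    # first cell of a notebook in the hypothetical foo-core project
    # (its settings.ini has lib_name = foo, so this exports to foo/core.py)
    # default_exp core

    # first cell of a notebook in the hypothetical foo-vision project
    # (also lib_name = foo, so this exports to foo/vision/augment.py)
    # default_exp vision.augment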

As for the documentation, I could just have the main foo package automagically include the submodules into one nicely packaged doc site…

Any ideas on how this could be done? I'm thinking something like using symbolic links to connect subdirectories across nbdev projects might work, but suggestions are welcome!

Best regards,
Butch

Hey Butch,

First, I meant to get to this much faster than I did, apologies.

The first thing I did was define some “extras” in setup.py. In my case, for fastinference, I wanted four different “versions”: one for onnx-gpu, one for onnx-cpu, one for a specific interpretation module, and an all. Each of these dictates the specific extra requirements it may need. In my setup.py this looked like the following:

requirements = parse_requirements("requirements")   # base requirements, used for install_requires
o_gpu = parse_requirements("onnxgpu")
o_cpu = parse_requirements("onnxcpu")
interp = parse_requirements("interp")
all_req = parse_requirements("all")

# each extras group mirrors one of the requirement groups declared in settings.ini
# (the parsed groups above could equally be used here, e.g. extras["onnx-gpu"] = o_gpu)
extras = {}
extras["onnx-gpu"] = ['onnxruntime-gpu']
extras["onnx-cpu"] = ['onnxruntime-cpu']
extras["interp"] = ['plotly', 'plotnine', 'shap<0.36.0']
extras["all"] = ['fastai', 'onnxruntime-gpu', 'plotly', 'plotnine', 'shap<0.36.0']

You also need to adjust the setuptools.setup call to include extras_require, i.e.:

    packages = setuptools.find_packages(),
    include_package_data = True,
    install_requires = requirements,
    extras_require = extras,

(Look for that section)
And then in settings.ini I had those same requirement groups declared (I'm not 100% sure anymore why I duplicated them rather than keeping them in just one place, but it's been a while):

requirements = fastai>=2.0.0
onnxgpu = onnxruntime-gpu
onnxcpu = onnxruntime-cpu
interp = plotly,plotnine,shap<0.36.0
all = fastai>=2.0.0,onnxruntime-gpu,plotly,plotnine,shap<0.36.0
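For completeness, parse_requirements just reads each of those groups out of settings.ini; a minimal sketch of such a helper (illustrative only, not the exact fastinference code) could be:

    from configparser import ConfigParser

    def parse_requirements(name):
        "Read one requirements group (e.g. 'onnxgpu') from settings.ini as a list"
        config = ConfigParser(delimiters=['='])
        config.read('settings.ini')
        value = config['DEFAULT'].get(name, '')
        # a group can be comma-separated and/or continued onto indented lines
        return [r.strip() for r in value.replace('\n', ',').split(',') if r.strip()]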

Next, I wrote a soft_dependencies module, inspired by what I had seen in IceVision. Basically, it tries to import each soft dependency, and if any error shows up (such as a package not being installed) it flags that group as unavailable. In each module I can then raise a specific error:

You can see an example of the usage here: fastinference/onnx.py at master · muellerzr/fastinference · GitHub

from .soft_dependencies import SoftDependencies
deps = SoftDependencies.check()
if not (deps['onnxcpu'] or deps['onnxgpu']):
    raise ImportError("Neither the onnxcpu nor the onnxgpu requirements are installed.")
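And here is a rough sketch of the kind of thing the soft_dependencies module does (illustrative only, not the exact fastinference implementation; the group names simply match the extras above):

    import importlib

    class SoftDependencies:
        "Try importing each group's modules and report which groups are usable"
        _groups = {
            'onnxcpu': ['onnxruntime'],
            'onnxgpu': ['onnxruntime'],  # the onnxruntime-gpu wheel also imports as `onnxruntime`
            'interp':  ['plotly', 'plotnine', 'shap'],
        }

        @classmethod
        def check(cls):
            available = {}
            for group, modules in cls._groups.items():
                try:
                    for m in modules: importlib.import_module(m)
                    available[group] = True
                except Exception:  # missing package, broken install, etc.
                    available[group] = False
            return available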

That’s the best solution I’ve seen so far. I know in the future fastai wants to do something with segmenting out requirements, so we’ll see what happens.

HTH! :smiley:
