Slow nbdev loading (aka how does FastAI lib keep its imports speedy?)

Hello, I started using nbdev a month or so back for a data engineering piece. I really like the workflow it enables, so well done!

I want to build a CLI tool that would let me automate some basic data engineering tasks, but the imports are taking too long. What I don't quite understand is that, despite being far more data-intensive, the actual fastai library doesn't seem to suffer from slow imports.

So when I “profile” __init__.py I get the following:

Data exp imported in 11.700356483459473s
Files prep imported in 8.344650268554688e-06s
Log urls imported in 8.58306884765625e-06s
Log reader imported in 5.4836273193359375e-06s
Segments imported in 5.7220458984375e-06s
Storage imported in 7.3909759521484375e-06s
Validation imported in 0.0002892017364501953s
Nbdev imported in 0.00013780593872070312s
Devices imported in 1.0967254638671875e-05s
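For context, these timings come from wrapping each import in __init__.py roughly like this (simplified; the submodule names below are placeholders for my actual modules):

```python
# Simplified version of how I'm timing things in __init__.py
# ("data_exp" / "storage" are stand-ins for my actual submodules).
import time

_t = time.time()
from .data_exp import *      # this is the ~11.7 s one
print(f"Data exp imported in {time.time() - _t}s")

_t = time.time()
from .storage import *
print(f"Storage imported in {time.time() - _t}s")
# ...same pattern for the remaining submodules
```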

And 11-12 seconds is obviously too long to wait just for the library to load. What confuses me even more is that some of the modules with the longest wait times contain nothing but function definitions (Data exp and Storage included). When I profile the imports inside those files (the only other thing that could be taking so long), they all take <0.001 sec. Curiously, on that run Storage sped up significantly, importing in ~2 secs.

AFAIK, __init__.py is auto-generated. What is the best way to speed it up, or to avoid importing everything up front?
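Ideally I'd like the heavy submodules to be imported only when they are actually used. Something like a module-level `__getattr__` (PEP 562) is what I have in mind, but I'm not sure how that plays with nbdev's generated __init__.py. A rough sketch of what I mean (submodule names made up):

```python
# Rough sketch of a lazily-importing __init__.py using a module-level
# __getattr__ (PEP 562). "data_exp" and "storage" are stand-ins for my
# actual slow submodules.
import importlib

_LAZY_SUBMODULES = {"data_exp", "storage"}

def __getattr__(name):
    if name in _LAZY_SUBMODULES:
        # Import on first access, then cache it on the package so this
        # hook isn't hit again for the same name.
        module = importlib.import_module(f".{name}", __name__)
        globals()[name] = module
        return module
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```

But since nbdev regenerates the package files, I don't know whether hand-editing __init__.py like this is the intended approach.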

Many thanks!