I have been shifting my full-time work to nbdev, using it to develop some of our microservices and tooling. Because of that, I now have the most detailed documentation for my code on my whole team. It has been awesome, and I swear by nbdev for both work and side projects.
That said, using nbdev for production development has highlighted some areas where its testing support feels limited. Before nbdev, we used pytest to develop and test our APIs, and one thing that makes the rest of my team apprehensive about adopting nbdev is the inability to truly understand "how tested" my code is.
There are two features from pytest that would be great to have in nbdev:
- output the number of tests that have been run
- report code coverage, with a way to pass or fail a run based on a coverage threshold
Outputting the number of tests run seems doable; code coverage, I suspect, would be more complex.
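As a rough illustration of the first feature, here is a minimal sketch that scans a directory of notebooks and counts lines that look like test assertions. The function name, the `nbs` directory layout, and the heuristic of matching lines starting with `assert` or `test_eq` (a common nbdev test idiom) are all my own assumptions, not anything nbdev ships:

```python
import json
from pathlib import Path

def count_test_statements(nb_dir="nbs"):
    """Hypothetical heuristic: count code lines in .ipynb files that
    begin with `assert` or `test_eq`, as a proxy for 'tests run'."""
    total = 0
    for nb_path in Path(nb_dir).glob("*.ipynb"):
        nb = json.loads(nb_path.read_text())  # .ipynb files are plain JSON
        for cell in nb.get("cells", []):
            if cell.get("cell_type") != "code":
                continue
            src = cell.get("source", "")
            if isinstance(src, list):  # source may be a list of lines
                src = "".join(src)
            for line in src.splitlines():
                if line.lstrip().startswith(("assert", "test_eq")):
                    total += 1
    return total
```

A real implementation would presumably hook into nbdev's own test runner rather than re-parse the notebooks, but even a crude count like this would give a team a number to point at.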