Type inference for fastai

I am working through lesson 2, deploying a model to Gradio.

I noticed that my editor (both Neovim and VSCode) shows the return type of `load_learner` as `Any`.

Full code:

from fastai.vision.all import load_learner

learn = load_learner("model.pkl")

The editor type hint for learn is:

(variable) learn: Any

Is there something I am missing with my setup? Or is this the intended type?

I can run the code and deploy to Gradio just fine, but I really prefer to have everything typed when I code so I can use auto-completion and in-editor documentation. As it is, I can't discover the `predict` method on `learn` without looking up the documentation.
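One way to get completions back is to annotate the variable yourself, which narrows `Any` to the type you expect. A minimal sketch, using a hypothetical stand-in loader and class since the editor behaviour is the same (with fastai the real equivalent would be `learn: Learner = load_learner("model.pkl")`):

```python
from typing import Any


class FakeLearner:
    """Hypothetical stand-in for fastai's Learner class."""

    def predict(self, item):
        return item


def load_stand_in(path: str) -> Any:
    # Stand-in for load_learner / torch.load: annotated to return Any.
    return FakeLearner()


# Inferred as Any -- the editor offers no completions here:
learn = load_stand_in("model.pkl")

# Annotating the variable narrows Any to the concrete type, so
# `learn_typed.` now completes `predict` in the editor:
learn_typed: FakeLearner = load_stand_in("model.pkl")
print(learn_typed.predict("cat"))  # -> cat
```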

I do get a type hint for the `load_learner` function itself, though:

(function) load_learner(fname: Unknown, cpu: bool = True, pickle_module: Module("pickle") = pickle) -> Any
──────────────────────────────────────────────────────────────────────────────────────────────────────────
Load a `Learner` object in `fname`, by default putting it on the `cpu`

Can anyone comment on this? Is the load_learner function meant to return Any? Or are the types not fully fleshed out yet?

fastai is mainly designed to be used from a dynamic environment like Jupyter, rather than a static IDE, and you'll see the right types there. However, not all functions and methods have static types provided. Where no type is provided, most IDEs will show it as `Any`.

Understood, thanks for the explanation. I was looking into the source code and see that the `load_learner` function returns the result of `torch.load`, which itself returns `Any`: torch.load docs

Since fastai is built on top of PyTorch, the types are indeed correct :slight_smile:
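The standard library's `pickle.load` is annotated the same way, and for the same reason: the class of the unpickled object can't be known statically. A quick demonstration:

```python
import io
import pickle

# Serialize a dict into an in-memory buffer, then load it back.
buf = io.BytesIO()
pickle.dump({"epochs": 4, "arch": "resnet18"}, buf)
buf.seek(0)

# pickle.load is typed `-> Any` in typeshed, just like torch.load,
# so the editor hover here shows: (variable) obj: Any
obj = pickle.load(buf)
print(type(obj).__name__)  # -> dict
```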