After building my model on my GPU machine with fastai 1.0.46, I am now trying to do inference on another machine that has only a CPU and fastai 1.0.57.dev0.
In my effort to load the data, after my fastai imports and after defining
path = Path('path/to/my/data.pkl file'), I am trying to load the data in the following way:
data = load_data(path=path, fname="data.pkl")
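For completeness, this is roughly the full snippet I am running on the CPU machine (the directory path is just a placeholder for my real one):

from fastai.text import *                        # fastai v1 imports
path = Path('path/to/my')                        # placeholder: folder that contains data.pkl
data = load_data(path=path, fname="data.pkl")    # the call that raises the errors below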
When I try to do that I get the error:
FileNotFoundError: [Errno 2] No such file or directory: 'path/to/my/data_save.pkl'
Going back to my GPU machine, I am able to locate this data_save.pkl file. If I copy that file to my CPU machine and again issue data = load_data(path=path, fname="data.pkl"),
I now receive the following error:
TypeError: intercept_args() got an unexpected keyword argument 'fname' and the error seems to be in
~/fastai/fastai/text/data.py
Why is data_save.pkl needed to load the data? And why, when I copy data_save.pkl over to my CPU machine, do I get the TypeError? After all, do I really need to load the data to do inference, or do I just need the model?
Is there a complete example of what I need to export after the training phase(s) are complete in order to be able to do inference and/or classification?
I have seen in the videos that there are different things you can do (export the encoder when you want to do classification, save a learner after fitting, save a databunch after creating it in order to save time the next time you resume work, etc.), but is there a minimal example somewhere that describes how I can use the models I build? There are also different files created when someone saves a databunch, a learner, and a model. I can see *.pkl files created, but also *.pth files inside the models/ directory. How are these connected, and what is needed each time?
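To make the question more concrete, these are roughly the calls I have in mind from the videos (the file names here are just placeholders I chose):

data.save('data.pkl')                  # save the DataBunch; writes a .pkl in the data path
learn.save('stage-1')                  # save the model weights; writes models/stage-1.pth
learn.save_encoder('fine_tuned_enc')   # save the language-model encoder for the later classifier
learn.export('export.pkl')             # export the Learner for later inference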
Thank you, and sorry for the many questions in one post.