Fastai apparently has a dependency on tk: when I do `import fastai.vision` in a Python environment without tk, I get a "Module not found: Tkinter" error. Is there any way to get around this dependency? I don't mind training and fine-tuning my model in an environment with tk installed, but when I deploy my model I would very much prefer an environment without tk.
Initially I thought one way to accomplish this would be to load the model as a pure PyTorch model (and run it in an environment without fastai installed). But that does not seem feasible, since my input data would then not be preprocessed the same way as during training (I'd be missing the steps performed by `basic_train.load_learner`).
Do you know of another way to do this? Alternatively: would it be possible to drop the dependency on tk, at least for inference?
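If you do end up going the pure-PyTorch route, one option is to re-implement the preprocessing by hand in the deployment script. Here is a minimal sketch, assuming the model was trained with the standard ImageNet normalization that fastai applies to pretrained vision models (resizing and tensor layout are omitted, and `normalize_pixel` is just an illustrative helper, not a fastai function):

```python
# Standard ImageNet stats (the values fastai uses for pretrained vision
# models). Re-applying them manually lets a tk-free, fastai-free script
# feed the model the same inputs it saw during training.
IMAGENET_MEAN = [0.485, 0.456, 0.406]
IMAGENET_STD = [0.229, 0.224, 0.225]

def normalize_pixel(rgb):
    """Scale one 0-255 RGB pixel to the normalized floats the model expects."""
    return [((c / 255.0) - m) / s
            for c, m, s in zip(rgb, IMAGENET_MEAN, IMAGENET_STD)]

# Example: a mid-gray pixel
print(normalize_pixel([128, 128, 128]))
```

This only covers normalization; you would still have to replicate the resize/crop steps from your training transforms to get bit-identical preprocessing.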
I’m not sure, but maybe we can remove some dependencies in setup.py and install a dev version of fastai with pip. Unfortunately, I can’t find Tkinter directly in the requirements, so I think it is a dependency of a dependency inside fastai.
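To track down which package that is, a crude text scan of your site-packages directory could help. A sketch (`find_tkinter_importers` is a hypothetical helper; a plain string search will miss conditional or aliased imports, so treat the result as a starting point):

```python
import pathlib
import sysconfig

def find_tkinter_importers(root):
    """Return top-level package names under `root` whose .py files
    mention a tkinter import (crude text search, not a real parser)."""
    root = pathlib.Path(root)
    hits = set()
    for py in root.rglob("*.py"):
        text = py.read_text(errors="ignore")
        if "import tkinter" in text or "import Tkinter" in text:
            hits.add(py.relative_to(root).parts[0])
    return sorted(hits)

# Point it at site-packages to see which installed dependency drags tk in:
print(find_tkinter_importers(sysconfig.get_paths()["purelib"]))
```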
Tkinter is a GUI toolkit, so I don’t think it interferes with inference. I took a look at the GUI-adjacent packages, like fastprogress and matplotlib, but didn’t find it there either. You could try to look more carefully through fastai’s dependencies to find where it comes from. For reference, these are the dependency groups from fastai’s setup.py:
dep_groups = {
    'core': to_list("""
        bottleneck           # performance-improvement for numpy
        dataclasses ; python_version<'3.7'
        fastprogress>=0.1.19
        beautifulsoup4
        matplotlib
        numexpr              # performance-improvement for numpy
        numpy>=1.15
        nvidia-ml-py3
        pandas
        packaging
        Pillow
        pyyaml
        pynvx>=1.0.0 ; platform_system=="Darwin"  # only pypi at the moment
        requests
        scipy
        torch>=1.0.0
        typing ; python_version<'3.7'
    """),
    'text': to_list("""
        spacy>=2.0.18
    """),
    'vision': to_list("""
        torchvision
    """),
}
This isn’t something I’ve mastered yet, so it might not be the correct way. But anyway, hope that helps!
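One more thought: on many systems the Tkinter import actually comes from matplotlib's default interactive backend, not from fastai itself. If that's the case here (an assumption worth verifying), forcing a non-interactive backend before anything imports matplotlib should sidestep tk entirely. A sketch using matplotlib's documented `MPLBACKEND` environment variable:

```python
import os

# Setting MPLBACKEND before matplotlib is first imported tells it to use
# the non-interactive Agg backend, which renders to memory/files and
# never tries to load Tkinter. This must run before the first
# `import fastai.vision` (or `import matplotlib.pyplot`) in the process.
os.environ["MPLBACKEND"] = "Agg"

# import fastai.vision  # assumption: now safe in an environment without tk
```

You can also set `MPLBACKEND=Agg` in the shell or in your deployment config instead of in code, which avoids ordering issues altogether.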