Binder and Fastai2

So if I update fastai2 in my repo and put it on Binder, it should work?

You need the same fastai2 version in the environment where you export your model and in the one where you deploy it.
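A quick way to confirm this is to print the installed fastai2 version in both the training and the deployment environments and compare (a minimal check, assuming fastai2 is importable in both):

```python
# Run this in both the export (training) environment and the deployment one;
# the two printed versions should be identical.
import fastai2
print(fastai2.__version__)
```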


Got it, thanks - I exported from Paperspace, so I'll update the deployment repo.

@sgugger I cannot easily find a way to update fastai… is there a command that does it?

You can do pip install fastai2 --upgrade
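If you are working inside a Jupyter notebook (for example on Paperspace), the same upgrade can be run from a cell; restart the kernel afterwards so the new version is actually imported (a sketch, not from the original post):

```python
# The leading '!' shells out to pip from the notebook.
# Restart the kernel after this cell so the upgraded fastai2 is picked up.
!pip install fastai2 --upgrade
```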


@sgugger hmm, I can't seem to figure this out… I got a version of the bear_classifier.ipynb notebook running on my local machine, where I have cloned fastai2 and installed voila, and it renders correctly when I run voila bear_classifier.ipynb from the terminal.
However, when I copy this notebook to a public GitHub repo https://github.com/nishanthegde/projects and try to use it with Binder, I get an error.
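For reference, the usual Binder + Voila setup is a requirements.txt next to the notebook listing everything the notebook imports, plus a launch path that tells Binder to open the notebook through Voila. A minimal sketch (the package list below is illustrative, not taken from the linked repo):

```
# requirements.txt in the repo root
fastai2
voila
ipywidgets
pillow
```

On mybinder.org, switching the "Path to a notebook file" field from File to URL and entering voila/render/bear_classifier.ipynb should then render the notebook as an app rather than as an editable notebook.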

It seems that we could use this approach https://ai.facebook.com/blog/training-with-quantization-noise-for-extreme-model-compression?tn=HHH-R to reduce the size of our model and improve performance… Maybe someone was able to try it or knows the researchers there? I am not familiar with Quant-Noise and quantization for model performance, but it seems that it could be implemented on our forward API, am I correct?
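As a point of comparison, PyTorch ships plain post-training dynamic quantization, which shrinks selected layer types to 8-bit weights without any retraining. A rough sketch follows (this is ordinary dynamic quantization, not Quant-Noise, and `learn` is assumed to be an already-loaded fastai2 Learner):

```python
# Ordinary post-training dynamic quantization from PyTorch (not Quant-Noise).
# `learn` is assumed to be a fastai2 Learner that has already been loaded.
import torch

learn.model = torch.quantization.quantize_dynamic(
    learn.model,        # underlying PyTorch model
    {torch.nn.Linear},  # layer types to replace with quantized versions
    dtype=torch.qint8,  # store weights as 8-bit integers instead of float32
)
```

Note that dynamic quantization only touches the layer types listed (here Linear), so for a mostly convolutional image model the size savings are modest; Quant-Noise is a training-time technique aimed at much more aggressive compression.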

Thanks @enr. I followed your instructions and finally got the app published, but after the fastai2 update it seems I have to update something else, because it is not working anymore.