TensorFlow eager? Can fastai now adapt to TensorFlow provided we can code it that way?

Hi @jeremy, what is your take on eager execution in TensorFlow? Would it make TensorFlow's capabilities match or exceed PyTorch's? From my limited experience with both, the lack of it felt like one of TensorFlow's biggest pain points.


TF Eager is certainly a step in the right direction and gives developers more options. I would suggest giving it a try: implement any one of the fast.ai features using TF Eager and share your experience here.
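
For anyone who wants a feel for the define-by-run style before diving in, here is a minimal sketch of eager mode. It assumes TF 1.x, where `tf.enable_eager_execution()` is needed (in TF 2.x eager is on by default); the toy model, data, and learning rate are made up purely for illustration:

```python
import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x only; eager is the default in TF 2.x

# Toy linear model trained with a plain Python loop: no session, no static graph.
W = tf.Variable(3.0)
b = tf.Variable(1.0)

def loss_fn(x, y):
    return tf.reduce_mean(tf.square(W * x + b - y))

xs = tf.constant([1.0, 2.0, 3.0, 4.0])
ys = tf.constant([2.0, 4.0, 6.0, 8.0])  # target relationship: y = 2x

for step in range(200):
    with tf.GradientTape() as tape:
        loss = loss_fn(xs, ys)
    dW, db = tape.gradient(loss, [W, b])
    # Manual SGD update, applied immediately like in PyTorch
    W.assign_sub(0.05 * dW)
    b.assign_sub(0.05 * db)

print(W.numpy(), b.numpy(), loss_fn(xs, ys).numpy())
```

Values are available immediately via `.numpy()`, which is exactly the debugging experience people miss when working with static graphs.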

IMO, Jeremy wants to provide an easy-to-use library for researchers with state-of-the-art (SoTA) results. This blog post might help explain why fast.ai chose PyTorch over Keras/TF for v2: http://www.fast.ai/2017/09/08/introducing-pytorch-for-fastai/

If you need to deploy your models in production, fast.ai might not be the right framework for you. In such cases, I typically take the underlying PyTorch model and deploy it behind a Flask API, or export it to ONNX format and import it into any of the other frameworks (see https://github.com/onnx/tutorials); a rough sketch of the ONNX route is below.
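
A minimal sketch of the ONNX export, assuming a torchvision ResNet as a stand-in for whatever model your fastai Learner wraps; the file name, input/output names, and input shape are just for illustration:

```python
import torch
import torchvision

# Stand-in for learn.model, the underlying PyTorch nn.Module inside a fastai Learner
model = torchvision.models.resnet34(pretrained=True)
model.eval()

# ONNX export traces the model with a dummy input of the expected shape
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, "resnet34.onnx",
                  input_names=["image"], output_names=["logits"])
```

The resulting `.onnx` file can then be loaded by any runtime or framework that supports the ONNX format.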


Our study group, Paris ML, has come to the conclusion that the TensorFlow ecosystem has overwhelming advantages over PyTorch, so we expect a gravity shift toward TensorFlow. I can't find a PyTorch roadmap. Anyone? I no longer see any compelling reason to use PyTorch other than legacy, such as its role as the fast.ai backend. OTOH, I see compelling reasons to use TensorFlow Lite for mobile and TensorFlow.js for the web (and perhaps beyond the web too).
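
For context on the mobile path, this is roughly what the TensorFlow Lite conversion looks like; a minimal sketch assuming TF 2.x and a trained model already exported as a SavedModel in a hypothetical "export/" directory:

```python
import tensorflow as tf

# Convert a SavedModel directory ("export/" is a placeholder path) to a .tflite flatbuffer
converter = tf.lite.TFLiteConverter.from_saved_model("export/")
tflite_model = converter.convert()

# Write the converted model to disk for bundling with a mobile app
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```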