Hi @jeremy What are your takes on eager execution in TensorFlow? Would it make TensorFlow's capabilities match or exceed PyTorch's? From my limited experience with both, I felt that was one of the biggest pain points in TensorFlow.
TF Eager is certainly a step in the right direction and gives developers more options. I would suggest giving it a try: reimplement any one of the features in Fast.AI using TF Eager and share your experience here.
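To give a feel for what changes, here is a minimal sketch (assuming TF 1.x, where eager mode has to be switched on explicitly; in TF 2.x it is the default and the enable call is unnecessary):

```python
import tensorflow as tf

# In TF 1.x, eager mode must be enabled once at program start;
# in TF 2.x this attribute is gone because eager is already the default.
if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

# Ops now execute immediately and return concrete values, PyTorch-style,
# instead of building a graph to run later inside a Session.
x = tf.constant([[1.0, 2.0]])
y = tf.matmul(x, tf.transpose(x))  # 1*1 + 2*2 = 5
print(float(y))
```

That immediacy (inspect tensors with `print`, debug with ordinary Python tools) is exactly the part of the PyTorch workflow that graph-mode TF lacked.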
IMO, Jeremy wants to provide an easy-to-use library for researchers with SoTA (state-of-the-art) results. This blog post might help explain why Fast.AI chose PyTorch over Keras / TF in V2 - http://www.fast.ai/2017/09/08/introducing-pytorch-for-fastai/
If you need to deploy your models to production, Fast.AI might not be the right framework for you. In such cases, I typically take the underlying PyTorch model and serve it behind a Flask API, or you can export the PyTorch model to ONNX format and import it into any of the other frameworks - https://github.com/onnx/tutorials