Tips for developers stuck with company using TensorFlow for now

The fast.ai library is amazing, but unfortunately my company is still using TensorFlow for now, and not even with Keras as the API.

I know there are a lot of best practices built into the fast.ai library. I can see the difference when training on the same dataset with Keras versus fast.ai: fast.ai converges so much faster. Naturally, I want to bring those best practices over to our company's TensorFlow ecosystem, but there are so many of them, and I am still not sure which ones TensorFlow + Keras is most lacking. Due to time constraints I cannot simply implement them all, so I have to choose the most important ones.

If you were to rank by importance the fast.ai features and best practices I should bring over to a TensorFlow ecosystem to boost its training performance, what would the list look like?

Implement the callback system first; it will save a lot of work. Since fastai does most of its work through callbacks, once you have a callback system in place, implementing OneCycleScheduler and the rest is largely a matter of porting the fastai code over to TensorFlow (with some modifications).
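To make that concrete, here is a framework-agnostic sketch of a fastai-style 1cycle schedule as a plain function (the name, defaults, and exact interpolation here are illustrative, not fastai's precise API). Once you have the schedule as a pure function, wiring it into Keras, e.g. via `tf.keras.callbacks.LearningRateScheduler` for per-epoch updates or a custom callback for per-batch updates, is straightforward:

```python
import math

def one_cycle_lr(step, total_steps, lr_max, pct_start=0.25,
                 div=25.0, div_final=1e4):
    """Fastai-style 1cycle schedule (sketch): cosine warm-up from
    lr_max/div to lr_max over the first pct_start of training, then
    cosine annealing down to lr_max/div_final."""
    warm_steps = int(total_steps * pct_start)

    def cos_interp(start, end, frac):
        # Cosine interpolation from start to end as frac goes 0 -> 1.
        return end + (start - end) / 2 * (math.cos(math.pi * frac) + 1)

    if step < warm_steps:
        return cos_interp(lr_max / div, lr_max, step / max(1, warm_steps))
    frac = (step - warm_steps) / max(1, total_steps - warm_steps)
    return cos_interp(lr_max, lr_max / div_final, frac)
```

For example, with `total_steps=100` and `lr_max=1.0`, the learning rate starts at 0.04, peaks at 1.0 at step 25, and decays to 1e-4 by the end.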

Thank you for the direction! Since a callback system is actually available in Keras, this sounds like a very promising direction to go!


Have you seen tf-fit? I've never used it (or TensorFlow), but if I had to, this would be one of the first things I would check out.

Unfortunately, the callback system in Keras only handles a tiny subset of what's needed.


I don't know if this is possible in your case, but I would continue using fastai and convert the model and weights using ONNX, or simply use TorchScript.
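If that route is open to you, the TorchScript side might look like the sketch below. The model here is a hypothetical stand-in for whatever fastai trained (any `nn.Module` works); for ONNX, `torch.onnx.export(model, example, "model.onnx")` follows a similar pattern:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a fastai-trained model.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

example = torch.randn(1, 4)

# Trace the model into TorchScript so it can run without Python/fastai.
scripted = torch.jit.trace(model, example)
scripted.save("model.pt")

# Sanity check: the reloaded traced model matches the original.
reloaded = torch.jit.load("model.pt")
assert torch.allclose(model(example), reloaded(example))
```

The saved `model.pt` can then be loaded for inference from C++ via libtorch, independently of the training stack.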

Ah, what bad news!

Perhaps TensorFlow 2.0 will bring some API changes that make it more convenient to use? At the least, it brings eager execution and deeper integration with Keras, not to mention S4TF. There are lots of models and libraries written in TF 1.x, so it seems a good choice for production, and the TF community and ecosystem are very large. Perhaps someone has already implemented a subset of the functionality available in the fastai library?

I would guess the major tricks are the callbacks-driven training loop, learning rate schedulers, some custom layers, and of course the text processing models. It probably depends on your goals. The fastai library includes lots of tools, while its core still seems pretty compact.
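The callbacks-driven training loop mentioned above can be sketched in a few lines of plain Python. This is a toy version of the pattern fastai uses, not its actual API; the class and hook names are illustrative:

```python
class Callback:
    """Base class: override any subset of the hook points."""
    def on_train_begin(self, state): pass
    def on_batch_begin(self, state): pass
    def on_batch_end(self, state): pass
    def on_train_end(self, state): pass

class Recorder(Callback):
    """Example callback: records the loss after every batch."""
    def __init__(self):
        self.losses = []
    def on_batch_end(self, state):
        self.losses.append(state["loss"])

def fit(batches, step_fn, callbacks):
    """Minimal callbacks-driven loop: step_fn(batch) returns a loss.

    All behavior beyond the bare loop (schedulers, metrics, early
    stopping) attaches through the hooks rather than the loop body.
    """
    state = {"loss": None}
    for cb in callbacks: cb.on_train_begin(state)
    for batch in batches:
        for cb in callbacks: cb.on_batch_begin(state)
        state["loss"] = step_fn(batch)
        for cb in callbacks: cb.on_batch_end(state)
    for cb in callbacks: cb.on_train_end(state)
```

The design point is that a OneCycleScheduler, a metric logger, or early stopping all become small `Callback` subclasses, and the loop itself never changes.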

Though I am not an expert in either framework and could be missing something. Certainly, the fastai library is a very convenient and powerful solution, so I believe it would be difficult to replicate it precisely. And I really like PyTorch as well :smile: It would be great if it became more popular among companies and practitioners.

Also, what do you all think are the major features the Keras callback system lacks? I haven't worked with Keras for a long time, so I'm not really aware of recent changes. But I would certainly like to try TF 2.0, so it would be great to know what to expect there.


It would be an interesting project to replicate lessons 1-7 using TensorFlow 2.0 on a TPU, which Google Colab provides.