JAX is a fresh take on deep learning and a really cool project, but the argument that a hacked-together JIT on top of Python beats a compiled language doesn't hold up. Either you have a general-purpose compiler that produces fast code, or you end up with the same limitations Python already has. Simple example: try to write fast mutating code in JAX.
It follows the same principle stated here: "Any sufficiently complicated machine learning system contains an ad-hoc, informally-specified, bug-ridden, slow implementation of half of a programming language."
Also, if Julia's results count as poor, I have no idea what would be considered good. Go watch the JuliaCon 2020 talks to see how many scientists have migrated their code and simulations there, and how fast the language is growing.
I think building a fastai-like high-level interface on top of JAX is worth it because it explores a different path from the one both PyTorch and TensorFlow 2.0 have converged on, not because of some empty reasoning that discredits the hard work that went into the alternatives.