We'd like to take this opportunity to give a shout-out to some of the awesome projects folks are building on top of JAX, for example:
* Flax, a neural network library for JAX (https://github.com/google/flax)
* Haiku, a neural network library for JAX inspired by Sonnet (https://github.com/deepmind/dm-haiku)
* RLax, a library for building reinforcement learning agents (https://github.com/deepmind/rlax)
* NumPyro, a probabilistic programming library on top of JAX (https://github.com/pyro-ppl/numpyro)
* JAX-MD, a differentiable molecular dynamics package built on top of JAX (https://github.com/google/jax-md)
Interesting that Googlers, who are supposed to use TensorFlow, are now actively developing a new autograd engine and at least three new DL frameworks on top of it. What do you think about this segmentation?
Comparisons are hard in general, and I don't have a good answer for you right now, but keep in mind that most of these libraries come from researchers openly sharing the codebases they develop for their own work. We see the role of JAX as analogous to that of NumPy: a common substrate on which folks can build these sorts of tools.
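To make that "substrate" analogy a bit more concrete, here's a minimal sketch (the model, parameter names, and data are purely illustrative, not taken from any of the libraries above) of the kind of NumPy-style building blocks and function transformations that higher-level libraries compose:

```python
# Minimal sketch: JAX as a NumPy-like substrate.
# All names and shapes here are illustrative.
import jax
import jax.numpy as jnp

def predict(params, x):
    # A tiny linear model written with the familiar NumPy API.
    w, b = params
    return jnp.dot(x, w) + b

def loss(params, x, y):
    # Mean squared error, again just jax.numpy operations.
    return jnp.mean((predict(params, x) - y) ** 2)

# The "substrate" part: composable transformations of plain Python functions,
# which neural-net and probabilistic-programming libraries build on.
grad_loss = jax.grad(loss)                       # automatic differentiation
fast_loss = jax.jit(loss)                        # XLA compilation
batched_predict = jax.vmap(predict, in_axes=(None, 0))  # auto-vectorization

# Toy usage.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (8, 3))
y = jnp.zeros((8,))
params = (jnp.ones((3,)), 0.0)
print(grad_loss(params, x, y))
```

Libraries like Flax and Haiku largely take care of organizing parameters and module state on top of primitives like these, rather than reimplementing the numerics or the transformations themselves.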