
Objax, a new high-level JAX API with a PyTorch-like interface - homarp
https://github.com/google/objax
======
p1esk
OK, so why this and not Pytorch?

~~~
homarp
see this thread explaining why Objax:
[https://mobile.twitter.com/D_Berthelot_ML/status/12992751458...](https://mobile.twitter.com/D_Berthelot_ML/status/1299275145886875650)

~~~
p1esk
Skimmed it, but still not convinced. Not seeing any killer features that would
motivate me to switch from Pytorch. Maybe some advanced autograd machinery,
but that's all provided by Jax, and Jax already has half a dozen higher-level
NN frontends. I'm afraid this one is DOA.

~~~
chillee
If you're not interested in Jax, you're not going to be interested in Objax.
Objax is more for people who are interested in the features of Jax, but find
it awkward to write code in it and prefer the API of PyTorch.
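To make the "awkward" part concrete, here is a minimal sketch of the raw-JAX functional style that libraries like Objax wrap in a stateful, PyTorch-like Module API. This is plain jax only (the Objax API itself isn't shown, and the `init_params`/`linear` helpers are just illustrative names): parameters live in an explicit pytree that you thread through every pure function, rather than being stored on the model object.

```python
import jax
import jax.numpy as jnp

# Parameters are an explicit pytree (here, a dict) threaded through
# pure functions -- the style some find awkward coming from PyTorch.
def init_params(key, nin, nout):
    wkey, _ = jax.random.split(key)
    return {
        "w": jax.random.normal(wkey, (nin, nout)) * 0.01,
        "b": jnp.zeros(nout),
    }

def linear(params, x):
    return x @ params["w"] + params["b"]

def loss(params, x, y):
    pred = linear(params, x)
    return jnp.mean((pred - y) ** 2)

params = init_params(jax.random.PRNGKey(0), 3, 1)
x = jnp.ones((4, 3))
y = jnp.zeros((4, 1))

# Gradients are taken w.r.t. the explicit params argument and come
# back as a pytree with the same structure as params.
grads = jax.grad(loss)(params, x, y)
```

A PyTorch-style frontend hides this threading by keeping the trainable variables inside module objects, which is essentially the ergonomic gap these wrappers exist to close.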

I think the features of Jax are quite appealing - there's a reason PyTorch has
been trying quite hard to copy its features (vmap, forward-mode
autodifferentiation, NumPy compatibility).
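As a quick illustration of two of those features, here is a minimal jax sketch (assuming only that jax is installed): `jax.vmap` vectorizes a scalar function over a batch axis without rewriting it, and `jax.jvp` gives forward-mode autodiff, returning the primal output and a directional derivative in a single pass.

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x

# vmap: lift a scalar function to a batched one without rewriting it.
xs = jnp.arange(4.0)
batched = jax.vmap(f)(xs)

# Forward-mode autodiff: jvp evaluates f(x) and the directional
# derivative along the tangent (here 1.0) in one pass.
primal, tangent = jax.jvp(f, (2.0,), (1.0,))

# By the product rule, d/dx [x * sin x] = sin x + x * cos x.
expected = jnp.sin(2.0) + 2.0 * jnp.cos(2.0)
```

Reverse-mode `jax.grad` covers the usual training loop; forward mode is what you reach for when outputs outnumber inputs, e.g. Jacobian columns.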

~~~
p1esk
I'm obviously interested in Jax (otherwise why would I bother commenting on
this post?), but it's clearly not mature enough yet for regular DL
practitioners. For cutting edge research, especially done at Google, sure.
This is fine, it will get better, and I can see myself switching to it in the
future. But today, looking at Jax-based NN libraries, it's a mess. Stax, Trax,
Flax, Haiku, now Objax. It's like the TF mess all over again: old-style core
TF, Keras, TensorPack, eager mode, estimators, ugh... From what I can see,
Flax is the leader (8 months old, 800 commits, last commit is yesterday,
similar enough to Pytorch to make the switch easy). So if I wanted to switch
to Jax today from Pytorch, why would I choose newborn Objax over Flax? I
looked at the documentation and I have no idea how it is more "object
oriented" than Flax, or why I would care. The main dev being so bad at
marketing his creation leads me to believe it's DOA.

