A strong advantage of TensorFlow is its flexibility in designing highly modular models, which can also be a disadvantage for beginners, since many of the pieces must be considered together when creating a model. This has been eased by the development of high-level APIs such as Keras and Slim, which gather up many of the design puzzle pieces. The interesting thing about TensorFlow is that its traces can be found everywhere these days.
I feel TF is getting left behind compared to PyTorch. Dynamic graphs are a powerful thing. It's also kind of sad that Keras doesn't support PyTorch yet.
Though when compared with the same model written in PyTorch, TF Eager imperative code feels verbose and a bit cluttered, as if it were written in Java.
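To illustrate what "dynamic graphs" means here, a toy sketch (not actual PyTorch or TF code; the `Node` class is made up for this example): the computation graph is rebuilt by ordinary Python control flow on every forward pass, so branches can differ per input.

```python
# Toy autograd node: records parents and local derivatives as the
# forward pass runs, so the graph is defined by whatever Python executes.
class Node:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # forward result
        self.parents = parents    # nodes this one depends on
        self.grad_fns = grad_fns  # local derivative w.r.t. each parent
        self.grad = 0.0

    def __mul__(self, other):
        return Node(self.value * other.value,
                    parents=(self, other),
                    grad_fns=(lambda g: g * other.value,
                              lambda g: g * self.value))

    def backward(self, grad=1.0):
        self.grad += grad
        for parent, fn in zip(self.parents, self.grad_fns):
            parent.backward(fn(grad))

x = Node(3.0)
# A plain Python `if` decides the graph's shape at run time --
# that is the "dynamic" part frameworks like PyTorch build on.
y = x * x if x.value > 0 else x * Node(-1.0)
y.backward()
print(x.grad)  # d(x*x)/dx = 2x = 6.0
```

Static-graph TF 1.x instead required declaring the whole graph (including control flow, via ops like `tf.cond`) before running it, which is part of why the imperative style feels so different.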
(I don't agree or disagree with its usage; I just think of these as different forms of expression.)
This is also more for whatever generation comes after millennials. The youngest millennials are around 20 now.