It is certainly a good defense-in-depth measure to implement the recommendations from JHU, and it's great that Apple has moved aggressively to adopt them. Those mitigations will protect even against a catastrophic future "goto fail"-type bug that allowed TLS to be stripped away.
That leaves Apple cooperating with law enforcement (or an Apple insider, coerced or otherwise) to launch this attack against one of their users. I think this scenario is unlikely given their very public reaction to the iOS 9 backdoor the FBI requested.
Finally, I think their development of the gzip oracle is pretty great and that it has the potential to work against other cryptosystems. This is probably not the last that you'll hear of it. I suspect the JHU team is running down a list of other cryptosystems that it could work against right now...
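To give a flavor of how a compression oracle works (this is a toy sketch of the general technique, not the JHU team's actual attack): when attacker-controlled data is compressed alongside a secret before encryption, a correct guess produces redundancy that the compressor exploits, and with a length-preserving cipher the shrinkage is visible in the ciphertext length. All names here are illustrative.

```python
import zlib

# Toy illustration of a compression oracle. A guess that duplicates the
# secret compresses better than one that shares nothing with it; an
# attacker who can observe ciphertext lengths learns which is which.
SECRET = b"the_secret_auth_token_1234"

def oracle(guess: bytes) -> int:
    # In a real attack the attacker only sees ciphertext; with a stream
    # cipher, ciphertext length equals compressed-plaintext length.
    return len(zlib.compress(SECRET + b";" + guess))

matching = oracle(SECRET)                          # repeat -> backreference
non_matching = oracle(b"abcdefghijklmnopqrstuvwxyz")  # no shared substrings
print(matching < non_matching)
```

This is the same length side channel behind CRIME/BREACH against TLS, which is part of why the technique plausibly generalizes to other cryptosystems.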
The underlying problem here is unauthenticated encryption (or MAC-then-encrypt constructions, which, when it comes to adaptive chosen-ciphertext attacks, are often morally the same thing). Unauthenticated encryption is a game-over flaw, probably no matter what the rest of the protocol does.
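To make the "game-over" point concrete, here's a toy sketch of why unauthenticated stream-cipher encryption is malleable: an attacker who knows (or guesses) the plaintext format can XOR a difference into the ciphertext and the receiver decrypts the forgery without any error. The "cipher" below is a throwaway SHA-256-in-counter-mode keystream just to keep the example self-contained; real AES-CTR is exactly as malleable.

```python
import hashlib

# Toy keystream generator (NOT a real cipher) -- any unauthenticated
# XOR-based stream mode has the same malleability property.
def keystream(key: bytes, n: int) -> bytes:
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key = b"k" * 32
pt = b"transfer $0000100 to alice"
ct = xor(pt, keystream(key, len(pt)))

# Attacker never learns the key; they XOR in the difference between the
# known plaintext and the plaintext they want:
delta = xor(b"transfer $0000100 to alice", b"transfer $9999900 to mallo")
forged = xor(ct, delta)

# The receiver "successfully" decrypts the forgery:
print(xor(forged, keystream(key, len(forged))))  # b'transfer $9999900 to mallo'
```

With an authenticator over the ciphertext, the forgery would be rejected before decryption; without one, every such tampered message is a probe the attacker can use to build an oracle.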
What you (and, no doubt, the Apple engineers) are thinking of as authentication is really a signature. Signatures and authenticators aren't the same thing.
Here, we're referring to message authentication: the mechanism of using a secret key (usually agreed on at the same time as the secret key for the cipher itself) to apply a "secure checksum" to the ciphertext, so that tampering is detected and the message dropped.
Trying to use a signature in lieu of an authenticator is what got Apple in trouble here. In addition to ECDSA signatures, those messages also should have used a MAC, or, better still, an encryption mode with a MAC built in, like OCB or EAX or NORX.
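Here's roughly what the MAC half of that looks like, as an encrypt-then-MAC sketch using only Python's stdlib (function names are mine, and the encryption step is elided; in practice you'd just use an AEAD mode that does all of this internally):

```python
import hmac, hashlib, os

# Encrypt-then-MAC sketch: tag the ciphertext with a second secret key,
# and verify the tag before doing anything else with the message.
def seal(mac_key: bytes, ciphertext: bytes) -> bytes:
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_(mac_key: bytes, sealed: bytes) -> bytes:
    ciphertext, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    # compare_digest is constant-time, so the check itself isn't an oracle
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed; drop the message")
    return ciphertext

mac_key = os.urandom(32)  # independent of the encryption key
sealed = seal(mac_key, b"...ciphertext bytes...")
assert open_(mac_key, sealed) == b"...ciphertext bytes..."

# A single flipped bit anywhere in the message is rejected outright:
tampered = bytes([sealed[0] ^ 1]) + sealed[1:]
```

The point is that tampered ciphertexts die at the MAC check, before the attacker can learn anything from how decryption or decompression behaves downstream.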
I'd still recommend always using an authenticated cipher, since the distinction above can be very fine.
Or a compromise of Apple's push notification infrastructure.