There has never been such a thing as end-to-end encryption on a cell phone. Carrier "debugging" tools such as CarrierIQ hook in at a lower level and can intercept and log everything that any application can see. CarrierIQ was acquired by AT&T and doesn't even officially have a name any more. They would tell you it only runs if the phone is in debug mode, but the dial-home to the carrier can enable it via a simple header.
Perhaps the difference here is that Russia does not have access to this data?
This distorts the usual meaning of end-to-end encryption, which is that data over the wire is encrypted and can't be MITMed, even by the "service provider" (Telegram in this case).
You are bringing up a good but different point, which is that the application environment on a mobile device may not be protecting you from certain privacy violations. I'm no fan of Telegram, but that's not really within their control.
Put another way, if unbeknownst to Telegram someone had installed a keylogger on my device, would you consider that to be broken end-to-end encryption?
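The distinction can be sketched in a few lines: with e2e encryption the relay (the service provider) only ever sees ciphertext, while a keylogger on the endpoint captures the plaintext before it is ever encrypted. This is a toy model using an XOR one-time pad, purely for illustration; it is not any real messenger's protocol, and the `Relay` and `Keylogger` classes are hypothetical stand-ins.

```python
import os

def xor_otp(key: bytes, data: bytes) -> bytes:
    # Toy one-time-pad XOR; XOR is its own inverse, so this both
    # encrypts and decrypts. Illustrative only, not a real protocol.
    assert len(key) == len(data)
    return bytes(k ^ b for k, b in zip(key, data))

class Relay:
    """The 'service provider': forwards messages and logs what it sees."""
    def __init__(self):
        self.log = []
    def forward(self, blob: bytes) -> bytes:
        self.log.append(blob)
        return blob

class Keylogger:
    """Endpoint compromise: captures input before encryption happens."""
    def __init__(self):
        self.captured = []
    def observe(self, text: bytes):
        self.captured.append(text)

message = b"meet at noon"
key = os.urandom(len(message))  # assumed shared out of band in this toy model

keylogger = Keylogger()
keylogger.observe(message)  # the compromised endpoint sees plaintext

relay = Relay()
ciphertext = relay.forward(xor_otp(key, message))  # provider sees only ciphertext

received = xor_otp(key, ciphertext)
assert received == message                 # recipient decrypts fine
assert keylogger.captured == [message]     # but the keylogger already has it
```

So in this model the e2e property holds against the relay, yet tells you nothing about a device that leaks plaintext at the endpoint; the two failure modes are simply different layers.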
Look, I understand what you are talking about, but I think this logic is really misguided. Let's take a step back and face it as it is: I don't care if an app has "bad encryption", is made by "KGB agents", or there is a keylogger installed by Obama and his friends — all I care about is whether someone who is not supposed to read my messages is reading them. This can happen for an infinite variety of reasons, up to and including Obama personally taking my friend's phone out of his hands. We cannot make anything "really secure", ever; all we can do is make a security breach less probable and harder to achieve for a projected adversary. And that matters.
If my conversations are not e2e encrypted, the probability of someone reading my messages is really high, and it is reasonably easy for a state-level actor to do so even without targeting me specifically. That makes Telegram not secure, end of discussion.
If my messenger uses unbreakable e2e encryption without backdoors, but also requires running on a device that is known to have some kind of backdoor that sends all my conversations to a third party some other way — it is not secure either.
Now, I don't know whether the allegation that everything happening on my phone is monitored by the MNO is true. But if it is, then every messenger that requires a phone number is not secure. And I mean "roughly equally insecure", because the probability of a breach is high — this is not the philosophical "nothing is really secure" argument.
WhatsApp, Telegram, Signal, and Viber (anything I forgot?) all REQUIRE you to have a phone number and to use the messenger on a phone. There's no choice. So if the OP's implication is at all true, his remark is really on point, and all the messengers in question are about equally (totally) insecure.
Could you please provide some reading material on the topic? I understand that the notion of a "secure messenger" linked to your phone number and running on a phone is laughable, but this is the first time I've heard the accusation that my mobile operator can retrieve basically any info from my phone — like, actually right now, without breaking a sweat.
Does this apply to all mobile operators or only to American ones?
There used to be a really good walkthrough and demo of the software on YouTube, but it was pulled down. Perhaps someone has uploaded it again; I will see if I can find it.
IIRC, these debugging tools exist on all carriers, and each carrier has its own custom implementation.
I don't understand how Android phones in the US can be backdoor-free, considering it's the carriers that provide the final update to devices and they are the ones who sign it. Would the OEMs know if the carriers included a backdoor on the NSA's behalf? Probably not.