No, the fun part is that one government's spies (India's) have apparently used these backdoors to spy on the agents of another country (the US). So, you see, by mandating these sorts of security compromises, governments have in fact made it easier for foreign governments to steal their own secrets—an irony apparently lost on the author of the Indian memo and on the authors of these policies.
If you want your data to be secure, use your own layer of encryption, preferably on your own physical disk, or at the very least have the encryption executing in an environment you control: not someone else's package of TrueCrypt running on their server, but your own verified package on your machine, with the result uploaded afterward.
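The "verified package" part can be as simple as checking a known-good digest before you run or upload anything. A minimal sketch in Python (the expected digest below is a placeholder, not a real TrueCrypt value; in practice you'd take it from a signed release announcement verified out of band):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Hash a file in chunks so large binaries don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder digest -- substitute the one published by a source you trust.
EXPECTED = "0000000000000000000000000000000000000000000000000000000000000000"

def verify(path, expected=EXPECTED):
    """Only trust (run, upload) the package if its digest matches."""
    return sha256_of(path) == expected
```

This doesn't protect you if the trusted source itself is compromised, but it does stop a silently swapped binary on someone else's server.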
Encryption is a major strategic issue, and few governments will lie down and accept that their citizens or other countries have the ability to securely store and transmit data. I was at an NSA show-and-tell recently and they had an original Enigma machine on display, which speaks volumes.
I think the software you're hoping for is jailbreaking the phone. But some governments have made, or are moving to make, that illegal: https://en.wikipedia.org/wiki/Anti-circumvention
Criminals and revolutionaries are surely just as aware of these simple facts as we are, so these systems will likely mainly be used to spy on the innocent for political or personal gain.
If you make secure crypto a crime, only criminals will have secure crypto.
Related aside: I've always wondered at people who push Tor for citizens of authoritarian regimes - please pin a "subject of interest" label on me, pretty please. IMHO, it is not the "content" that upsets these regimes; it is the desire to question the system that is the issue.
Communication is what happens on the wire.
If you fully controlled both ends, the security of the wire would be irrelevant. It could send copies of every message you send straight to your enemy's inbox and it would still be secure, because only the two fully controlled ends could decrypt the messages they send one another.
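As a toy illustration of that claim: below, a one-time pad with the key exchanged out of band stands in for whatever real scheme the two controlled endpoints use. The wire, and anyone who gets a copy of what crosses it, sees only ciphertext.

```python
import secrets

def otp_xor(key: bytes, data: bytes) -> bytes:
    """XOR against a one-time pad; encryption and decryption are the same op."""
    assert len(key) >= len(data), "a one-time pad must be at least message-length"
    return bytes(k ^ b for k, b in zip(key, data))

# The two fully controlled endpoints share this key out of band;
# it never touches the wire.
key = secrets.token_bytes(64)

plaintext = b"meet at noon"
ciphertext = otp_xor(key, plaintext)   # the only thing that travels the wire
leaked_copy = ciphertext               # "enemy's inbox" gets an exact copy

# Without the key, the leaked copy is information-free noise;
# with it, the receiving end recovers the message trivially.
assert otp_xor(key, leaked_copy) == plaintext
```

(A real system would use an authenticated scheme with key agreement rather than a manually shared pad, but the point stands: security lives at the endpoints, not on the wire.)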
If you don't control the device, you don't control the encryption.
"CALEA's purpose is to enhance the ability of law enforcement and intelligence agencies to conduct electronic surveillance by requiring that telecommunications carriers and manufacturers of telecommunications equipment modify and design their equipment, facilities, and services to ensure that they have built-in surveillance capabilities, allowing federal agencies to monitor all telephone, broadband internet, and VoIP traffic in real-time."
Note that the expert used in the article is Bruce Schneier.
I mean there's pretty compelling evidence that others now have access to what we're doing on our (closed-source) devices. Would it now be totally irrational to go the Richard Stallman route?
Third announcement: http://pastebay.com/272822
Images of document: http://imgur.com/a/8XoGf
Because Android is open source and each device manufacturer has its own build, it isn't really plausible to have a single backdoor within the core operating system. However, the closed-source Android Market application is the perfect candidate for such a mechanism; when you install an app via the market, all of the following happens:
1. You request the application from the Market and agree to any permissions requested.
2. The Market pushes an install instruction to the phone via Google's XMPP-based Android notification service.
3. The phone downloads and installs the .apk as instructed.
Step 3 requires no interaction from the user – this is how, for instance, you can send an app to your phone from the Market web site. But it also means that any government with sufficient leverage over Google can coerce them to use this mechanism to install an arbitrary SpyOnStuff.apk at any time.
But if this is the case, it isn't entirely bad news for Android users; rather, it yields two means of protecting yourself which aren't necessarily options on other platforms:
1. Use an open source Android build such as CyanogenMod, without the closed-source Market app installed.
2. Proxy Google's XMPP notification service and require user confirmation before pushed APKs are installed. I don't know of any Android mods that currently implement this capability, but it would be nice to see in future versions of Cyanogen or similar.
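A sketch of what that confirmation gate could look like. This is purely hypothetical: `InstallGate`, the callbacks, and the package names are all made up for illustration, and a real mod would have to hook the notification service itself rather than sit in plain Python.

```python
class InstallGate:
    """Hold server-pushed install instructions until the user approves them."""

    def __init__(self, confirm, install):
        self.confirm = confirm    # callback: package name -> bool (user prompt)
        self.install = install    # callback that performs the actual install
        self.blocked = []         # pushes the user declined

    def on_push(self, package):
        """Called once per install instruction pushed from the server."""
        if self.confirm(package):
            self.install(package)
        else:
            self.blocked.append(package)

# Example policy: auto-approve only packages the user explicitly requested.
requested = {"com.example.flashlight"}
installed = []
gate = InstallGate(confirm=lambda pkg: pkg in requested,
                   install=installed.append)

gate.on_push("com.example.flashlight")   # user asked for this one: installs
gate.on_push("gov.example.spyonstuff")   # silently pushed: gets held instead
```

The design point is simply that the push channel stops being an unconditional remote-install channel; the user becomes part of the loop for anything they didn't initiate.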
Not necessarily. The baseband firmware on your phone is probably more capable than you think. I played around with hacking my old HTC WinMo phone and with a little work I got the (Qualcomm) firmware to respond over USB. There's a complete file system on there that's nearly identical to the file system structure on my wife's dumb-phone (and accessed the same way). I assume they'd put everything they need at that level and don't have to insert anything into Android itself.
So which would you rather develop: a different backdoor mechanism for every Android baseband out there, or a single mechanism for all phones equipped with the Android Market?
I'm very curious about how these end up working, but I can't find much public work on people trying to reverse-engineer Android basebands.
I found the 28C3 presentation entitled "Reverse-engineering a Qualcomm baseband." 
Although, I have wondered whether Google had any secret weapons to take out DroidDream, if necessary, beyond asking it nicely to delete itself. If not, boy were they lucky. It is pretty ballsy to ship an OS that, for all appearances, just assumes /system can't be compromised. It's a double-edged sword, of course: any API a virus scanner might use is also a vector for attack. Possibly they're relying on being able to push a targeted OTA update? That's the best contingency I could come up with when pondering "what if DroidDream hadn't been so inept". But there are also ways to prevent OTA updates from being applied, so that's probably a one-time trick before the next piece of malware blocks all OTAs (methods to block OTAs from auto-installing are all over places like XDA/rootzwiki).
I doubt they would be asking Google to put in a backdoor. They would be asking the individual device vendors to do so. After all, it is the device vendor, not Google, who has to get the particular phone approved for sale in India.
RMS was right all along.
Plus, it looks like the contents of the intercepts listed were there for gratuitous purposes only.
I could be wrong, though.
From my brief reading it sounds like the person writing the memo may not be able to distinguish between surveillance of the carriers and putting backdoors into the devices themselves. I can believe that the carriers would be compromised, sooner than I'll believe that the operating systems are.
Specifically, one of the things Apple did with iOS 5 for iCloud was build end-to-end encryption into the system. iOS has for even longer contained protections that make it harder for an app to read data belonging to other apps, and the flash storage has been encrypted since at least iOS 4 (I believe).
Yet the iOS 5 binaries are subject to the scrutiny of the jailbreaking community, and if there were a backdoor here, I'd think it would have been found. (Or will be soon, now that its existence has apparently been leaked.)
Further, how could this actually work? (The memo is so full of jargon, and so nonsensical to me, that I stopped trying to interpret it pretty quickly.)
It's tricky enough to get intentional communication (FaceTime, iMessage, etc.) working on devices, let alone backdoors for a specific government. I don't think the Indian government could successfully deliver to Apple a spec to implement that wouldn't intrinsically cause noticeable problems and thus reveal its existence regularly.
In short, I suspect that a backdoor is kinda infeasible from a software reliability viewpoint.
Finally, it's also quite possible that this is a psyops campaign. By announcing that they have a backdoor (through a leaked memo), the Indian government could be attempting to bring pressure to bear from other governments wanting their own backdoors, and thus enable India to ultimately, actually get one. I remember reading months ago that the Indian government wanted more access to mobile devices but was being stymied by manufacturers.
I've been observing Apple fairly closely for over 30 years, and working with their operating systems, often at a fairly low level, for almost 20 years. I've never seen Apple do anything that would betray their customers' trust. People may object to the decisions Apple makes, but I have never seen a decision that, at the end of the day, didn't have at least the intent to do right by the customer (even when they failed to achieve that intent).
Thus I'm highly suspicious of any claim that Apple has put in a backdoor. This kind of claim is extraordinary and out of character, and Apple has far more integrity in my eyes than these documents.
So, I'd like to see some evidence before we accept that this is fact.
I'm not familiar with iCloud. Does it have strong client side encryption?
Any telecommunications device in the US is subject to CALEA so expect backdoors.
However, I believe that without a passcode you can still just use USB to access everything.
End-to-end encryption means nothing if the other end has to hand over your data anyway.
Political problems are NOT solved with technical solutions.
But what I don't necessarily believe is that the government of India is in on it, or that they are intercepting memos written by the United States about confidential matters. I feel this whole thing is a hoax.
Has this been confirmed by anyone? I wouldn't necessarily trust some "hacked memo" posted to Pastebin. Also, the title bothers me. Memos are leaked, not hacked.
This is a pedantic distinction. Both could happen, if you take it to mean something like:
"Leaked": someone with "legal" access to it, made it available.
"Hacked": someone hacked their systems/email, and got the memo.