
Why an unhackable mobile phone is a complete marketing myth - sidcool
https://techcrunch.com/2016/10/08/why-an-unhackable-mobile-phone-is-a-complete-marketing-myth/
======
ufmace
Seems useless and pointlessly breathless to me to even talk about something
being "unhackable". What matters is hackable by whom, and in what way.

If you turn off Android's setting to only allow app installs from the play
store, then install a sketchy third-party store and download a flashlight app
with a huge permission list that scrapes your email address book and sends it
off to Russia, is that really a hack? It's pretty tough to prevent that
without locking the phone down even harder than the iPhone. Doesn't seem worth
worrying much about to me.

Android seems to be pretty safe against more dangerous stuff, like a visited
website or a displayed web ad installing an app while bypassing user approval.
Or worse, doing something with root permissions. At least I haven't heard of
anything like that. If it's out there, presumably whoever has found such
exploits keeps them under wraps for high-value targets.

The Stagefright one sounded pretty dangerous, but I haven't heard of any
large-scale attacks with it in the wild. Presumably carrier-level general
protections against mass MMSing and specifically against payloads for that bug
were effective.

Given the difficulties the Android rooting community seems to be having, it's
hard to believe that a drive-by exploit could gain persistent root.

~~~
fnkym0nky
Android is notoriously broken among the AppSec community. For proof, all you
have to do is look at the last 90 days of Android bug reports[1]. The reason
you're not hearing of exploits is the price war that currently exists for
them. Zerodium recently raised the price for an Android RCE to $200,000, with
iPhone RCEs at $1.5 million[2].

[1]
[https://source.android.com/security/bulletin/2016-10-01.html](https://source.android.com/security/bulletin/2016-10-01.html)
[2]
[https://www.zerodium.com/program.html](https://www.zerodium.com/program.html)

~~~
pjmlp
After watching the presentations about the state of the union with regard to
kernel security in Android, I came to realize that I was wrong in bashing
Google for their constrained NDK.

Actually, they should constrain it even more, given the poor state of kernel
security and how security updates fail to reach most Android devices.

------
kabdib
There's a LOT that the phone OS and hardware manufacturers can do at the
hardware and firmware level to make things better. A good model would be the
security strategies used on game consoles, the recent generations of which
have moved critical pieces of the system to embedded processors (sort of like
the secure enclave on recent iPhones, but with more isolation).

Not putting device drivers written by random external engineers into the
trusted part of your OS would be a good start. So would not letting any page
acquire executable permissions without being signed (not sure what you do with
jitted code -- if you can't effectively sandbox it, you probably don't want to
run it, and yeah, that's a drag). Forbidding external processors (e.g., the
radio module) from becoming physical bus masters is also a great idea.
Encrypted memory (at the cache line level) has been in consoles for over a
decade; it's not that expensive and it's a nice first defense against stuff
like RowHammer and folks mucking with physical wires in a lab.

Consoles take this level of security seriously because it's not just about
customer privacy or security or similar pesky stuff, it's about _the revenue
stream_. (Secondarily, it's also about limiting the actions of bad actors on
hardware under their control -- it's much harder to cheat on a console).

~~~
mSparks
> There's a LOT that the phone OS and hardware manufacturers can do at the hardware and firmware level to make things better.

Yeah, like not issuing phones with intentionally compromised baseband
processors.

In fact, any phone that has a direct connection from the mic to the baseband
processor is not just "hackable" - it's designed to hack your life (hint: all
of them).

~~~
nickpsecurity
Part of this comes from long-standing partnerships between telecoms and
intelligence agencies, esp. MI6. Ever wonder why there were protocol steps
like answering without ringing or notifying the user available in user phones?
Sure, it was just for data collection systems in industry that happen to hack
a solution out of a GSM phone. ;)

Unfortunately, many schemes were baked into the standards you certify against.
Good luck changing that while reaching mass market.

~~~
mSparks
Who now have nothing better to do than downvote comments like this. As if
anyone important doesn't already have access to one of the thousands of
black-market Stingray-type devices that came out a couple of years ago.

[http://motherboard.vice.com/read/the-black-market-dealers-se...](http://motherboard.vice.com/read/the-black-market-dealers-selling-state-surveillance-equipment-online)

~~~
nickpsecurity
For example:

[http://www.zdnet.com/article/invasive-phone-tracking-new-ss7...](http://www.zdnet.com/article/invasive-phone-tracking-new-ss7-research-blows-the-lid-off-personal-security/)

The NSA and GCHQ, among others, were constantly reviewing the phone standards
for use in secure products + SIGINT, obviously. They were putting experienced
breakers and security engineers on it, many with extensive COMSEC backgrounds.
The phones their defense contractors produced had few to none of these
weaknesses, depending on the security-vs-usability tradeoff. Many private
suppliers, who sold to governments, did the same. The internal presentations
whose details leaked in various papers around government programs and defense
products also showed they consistently knew the risks of transmission mediums.
It's actually the only form of security they know through and through, down to
the emissions of analog components. They mastered it decades ago, with the
only real challenge being that people wanted the benefits of insecure phones. :)

Then, they not only allowed these standards but endorsed them for the general
public and businesses. They also made it illegal for us to buy their Type
1-secure phones or TEMPEST-certified devices. That last part is how you go
from "were they just incompetent?" to "they're deliberately keeping us
insecure to enable SIGINT!" If they say it's to stop reverse engineering, then
ask why the recent devices are supposed to be field releasable where enemies
might pick them up but citizens of Five Eyes still can't buy them. Excuses
getting thinner and thinner even if we never get some contractor leaking the
proof we need from somewhere in Hawaii. ;)

------
Zigurd
The question is not "unhackable." The question is whether it is practical
for...

1. A criminal to hack in to steal money

2. A state actor to routinely hack in for surveillance

3. A state actor to penetrate a system with "on the ground" resources: a black
bag job, bribes, blackmail, etc.

4. A state actor to mount a heroic effort, having to invent new technologies,
in order to penetrate a system.

As long as a system can protect you from 1 and 2, it's adequate. As long as
pervasive surveillance is impractical, you're safe from criminals and from
"casual tyranny." Preventing pervasive surveillance doesn't mean securing the
first phone from surveillance. It means making it impractical to break into
and tap tens of thousands of phones undetected.

In practice, it is possible to end technology-enabled pervasive surveillance.
From the East German experience with the Stasi we know the limits to scaling
human surveillance. Preventing that from happening isn't a technology problem.

------
s_q_b
_"The only truly secure system is one that is powered off, cast in a block of
concrete and sealed in a lead-lined room with armed guards... And even then I
have my doubts."_ [0]

For your application to be secure, hundreds of thousands of individuals must
have never made a mistake in millions of LOC within libraries generally
written in low-level C. That isn't very plausible.

If at some point, my input hits your code, your memory, and your processor,
that fact alone is sufficient to know that the system is vulnerable.

[0] "Computer Recreations: Of Worms, Viruses and Core War" by A. K. Dewdney,
Scientific American, March 1989

~~~
wtbob
> … generally written in low-level C …

Maybe we should focus on that? Why are we writing OSes in C when more-secure
alternatives have existed for decades? Why don't we _consider_ writing our
phone OSes (which don't actually require backward-compatibility to 1960s OSes)
in something higher-level and more-secure?

~~~
pjmlp
Because industry adopted an OS whose code was available for free, aka UNIX,
and C came along with it.

Just today I read a letter from the author of Concurrent Pascal to C.A.R.
Hoare, written in the early '90s, about industry's looming doom due to C's
usage starting to take off outside UNIX.

He was right...

~~~
nickpsecurity
Which Hansen paper was that?

~~~
pjmlp
His remark written in 1993 is:

"The 1980s will probably be remembered as the decade in which programmers took
a gigantic step backwards by switching from secure Pascal-like languages to
insecure C-like languages. I have no rational explanation for this trend. But
it seems to me that if computer programmers cannot even agree that security is
an essential requirement of any programming language, then we have not yet
established a discipline of computing based on commonly accepted principles."

It was cited as "Per Brinch Hansen (letter to C.A.R. Hoare, 1993a)"; however,
I got the quote from "Java's Insecure Parallelism" and have so far failed to
find a copy of the letter itself.

I have spent the day going through his papers after you posted them, and now
Solo with Concurrent Pascal, done on a PDP-11/45, is yet another proof I can
refer to of safer alternatives built around the same time C was coming to
life.

EDIT: Typos

~~~
nickpsecurity
I keep posting them since he simply got so much done ahead of others. Invented
monitors, the nucleus concept for OS design, safe[r] concurrency, co-invented
Wirth-style development of OSes (they sort of drew on each other), and so on.
The thing I like about his papers is he comes off like a real engineer who
tried to get at the root of the problem, solve it in a systematic but
practical way, avoid overcomplication, and clearly publish the solution. Yet
hardly anyone knows his name or research. I fix shit like that. :)

Wait till you get to the Edison System, done on the PDP-11, the machine that
gave us C and UNIX. He beat BCPL's author and Wirth at their own game of
over-simplifying systems. The language was simpler than C, safer, easier to
compile, and more efficient for calls between tasks. Then he did a
distributed-system language after that. I for whatever reason haven't gotten
around to reading those in depth, but I'm sure they'll be interesting.

------
danjoc
All they had to say was "baseband processor back doors" and drop the mic.

~~~
JoshTriplett
If you're building a phone, don't give the baseband CPU any access to
_anything_; that's a design flaw, and you can build a phone without it. And
all your data should be encrypted, so exploiting or backdooring the baseband
just gives you yet another way to look at that encrypted data.

~~~
trendia
Many privacy and security concerns stem from the baseband processor -- from
Stingrays to completely unknowable rootkits. The only real fix is to not use a
proprietary baseband processor at all, though this would mean completely
rethinking the mobile phone. (say, switching to long range WiFi)

~~~
JoshTriplett
You wouldn't need to switch technologies to go open; several groups have
projects to develop Open Source modem firmware (and hardware, eventually).

------
nickpsecurity
I've already proven it with a risk analysis from years ago. Some of it has
already happened. Here's a summary:

[https://news.ycombinator.com/item?id=10906999](https://news.ycombinator.com/item?id=10906999)

Just repost that any time someone claims their smartphone is secure. Ask what
they did for each area and proof of its effectiveness.

------
kobayashi
The point of the article is true, but it's also a pretty poor-quality piece.
Few links to referenced events, and even a basic fact like the name of the
company behind the Trident attack is wrong. It's the "NSO Group", not "NOS".

~~~
greglindahl
It's a classic "let's make perfection the enemy of the good" headline...
clickbait and unhelpful at the same time.

------
zby
We need a QubesOS-like system for phones.

~~~
carlesfe
Forgive me if I'm wrong, but isn't that what iOS does, where every app is
sandboxed and can't access anything from others?

Of course, there may be exploits derived from kernel/"supervisor" bugs, but
all apps are sandboxed by default, for better or worse.

~~~
walterbell
Qubes also isolates some devices (e.g. NIC, WiFi, USB) and their
drivers/firmware, using the hardware IOMMU to protect against DMA attacks by a
compromised device. There is a separate threat from the Intel ME, but the
threat from non-Intel devices is reduced.

------
walterbell
A wifi-only device (tablet or iPod Touch or other music player) can connect
over VPN+wifi to a mobile hotspot, which removes the baseband firmware attack
surface. Even if the hotspot is compromised (many have worse security than a
phone), the hotspot can only see VPN traffic.

Such a dual-device setup allows use of most mobile apps, with improved
security vs a mobile phone. Still need to avoid connecting to random servers,
apps, hotspots, devices and phishing emails.

~~~
wolfgke
> which removes the baseband firmware attack surface.

If the baseband processor uses DMA and there exists no IOMMU that you can
trust on the system, lots of evil things are possible if you can control what
is sent to the mobile phone OTA.

~~~
walterbell
A wi-fi only device _has no baseband processor_. In lieu of an IOMMU, the
baseband is delegated to a separate physical device, i.e. mobile hotspot.

A wi-fi only device is not a mobile phone, but can provide most of the
functionality of a mobile phone via OTT services and VPN to a _separate_
mobile hotspot.

~~~
nickpsecurity
That's not necessarily true. Always remember that the hardware industry likes
to consolidate multiple products into one for cost reduction where possible.
This happens especially with embedded or mobile SOC's where a chip might have
features usable in many applications at various price points. So, they make
them pretend to be chips without the extra features via factory switches.

As an example, a SOC for feature phones having both GSM and WiFi might be
reused in a cheaper "wifi adapter" that still has GSM circuitry, just not
visibly. Whether it can be used against you depends on whether they used a
hard or soft switch to disable/hide it, plus whether it's reversible or one-
time (e.g. antifuse).

~~~
walterbell
Wonderful. Now we need a teardown with chip identification before buying a
device.

~~~
nickpsecurity
The proprietary companies already do this looking for hidden stuff. Mostly
patent violations, though, so they can scheme some licensing revenue out of
the product they're inspecting. Here's the company that does much of it:

[http://www.chipworks.com/about-chipworks/overview/blog/apple...](http://www.chipworks.com/about-chipworks/overview/blog/apple-iphone-7-teardown)

In that one, you'll see the other end of the process, where many mobile chips
get extremely specialized. Notice this one has several RF chips instead of an
integrated one, to hit some size/price/energy/performance tradeoff. Plus Apple
sells premium stuff, so they can put in more, nicer parts. There are 10
different chips involved in just the RF front-end part. :)

Here's a nice PDF with illustrations of the RE processes themselves plus what
things look like:

[https://www.iacr.org/archive/ches2009/57470361/57470361.pdf](https://www.iacr.org/archive/ches2009/57470361/57470361.pdf)

I like Fig 3 especially as a lot of people forget there's circuitry inside
their PCB's rather than just on top or bottom. That cell phone has 9 layers of
wiring. :)

~~~
walterbell
Amazing RE paper, thanks.

------
walter_bishop
How about putting a hardware switch on the device such that you can't write to
the non-volatile memory when the switch is in the off state. For day-to-day
use the switch can be left in the off state. For software updates, set the
switch to on, boot the device, update the OS, set the switch to off and reboot
the device.

------
sebastianconcpt
Correct. As long as you have Turing-complete machines, hacking remains
theoretically possible. This was nicely illustrated by the problem Ethereum
had a while ago, which would not have been possible on Turing-incomplete
Bitcoin.

~~~
Phithagoras
Could you post that Ethereum problem as a story and/or comment? Curious to
read it

~~~
hueving
I'm guessing op is just referring to the DAO disaster.

~~~
sebastianconcpt
Yes I was referring to the DAO disaster. Here are some interesting discussions
about it
[https://www.reddit.com/r/btc/comments/4p0gq3/why_turingcompl...](https://www.reddit.com/r/btc/comments/4p0gq3/why_turingcomplete_smart_contracts_are_doomed/)

~~~
schoen
It would be safer in many of these ways to prefer non-Turing-complete smart
contracts -- and non-Turing-complete software environments in general for many
tasks -- but I think formal undecidability is something of a red herring here
because not _all_ programs' behavior is undecidable. My previous comments on
this issue in this context:

[https://news.ycombinator.com/item?id=11942015](https://news.ycombinator.com/item?id=11942015)

