
Black market Blackphones get sent a kill message that bricks them - happy-go-lucky
https://arstechnica.com/information-technology/2017/01/silent-circle-bricks-grey-market-blackphones-with-os-update/
======
squarefoot
Slightly off topic but related to the product: how can a device using Android
be considered safe and trustworthy, privacy-wise? The maker can surely choose
FOSS-only Android and encrypt communications, but the moment they add a
binary-only device driver, it can do whatever it wants, since it runs with
kernel privileges (i.e., sniff the virtual keyboard, listen to conversations,
access messages, contacts, photos, documents, etc.). If I'm not mistaken
(please correct me in case I am), there's still no phone out there with open
device drivers, which IMO translates into giving only a false sense of
security to the user. Moreover, if I buy a security-oriented device and it has
a remote kill switch, I would expect at least to be the one and only entity in
control of it.

~~~
raesene9
Having a FOSS phone would only be considered trustworthy if it had been
reviewed by an entity you trust. I'm not aware of any mainstream OS that has
been fully reviewed for security, so realistically the concept of a
"trustworthy" device in that sense doesn't exist.

Even if such an audit were done perfectly (a very serious and difficult
undertaking), it would be a point-in-time assessment, and so would become less
relevant as new code was committed.

Basically you have to trust some group of people to run a modern computing
device/environment, it's just a question of who and how much...

~~~
mindslight
Proprietary software starts off with a high level of trust - trusting just one
company. But it can never progress past this.

The advantage of FOSS is that the set of who can audit the code is _open_. It
starts small, but grows with popularity and remains _agile_ - e.g. it doesn't
necessitate transitively trusting whatever nation-states a single company is
vulnerable to.

~~~
nickpsecurity
"Proprietary software starts off with a high level of trust - trusting just
one company. But it can never progress past this."

The first trustworthy systems were built by proprietary companies, with
independent evaluation of the product, signatures on source/binaries, and/or
optional generation from source on-site. This became standard practice in
security- and safety-critical development. You can scale review to as many
people as are available, often under NDA for the source itself but not for the
signatures or binaries.

~~~
mindslight
Sure the company can improve their process, but you're trusting the company to
do that and do it properly - a user's relationship is still effectively rooted
with one company. It's generous to call this "high", but forgive me for trying
to make the point palatable to people who believe an autocratic company makes
for something trustable.

A company can do things that are similar to Free software (eg allow customers
to build from source), but in the context of modern discussion I'd say that's
just taking on aspects of Free software.

~~~
nickpsecurity
I'm responding to this claim:

"Proprietary software starts off with a high level of trust - trusting just
one company. But it can never progress past this."

It's totally false. The evaluation side has been done more times than I can
count. It doesn't even have to start with a single source. Many products
started as a collaboration of multiple organizations checking each others work
that share the result. Another model is CompSci inventing something with
details open at the start, patented, and then turned into closed source
product. Finally, there's Shared Source models where you can have, fix, or
extend the source so long as you're paying. Burroughs B5000 (1961), first
system resilient to 0-days, was distributed in source form to customers.

You were oversimplifying proprietary systems then attacking that
simplification.

~~~
mindslight
I appreciate your historical perspectives, but it feels like we're merely
quibbling over terms/framework on this one. I don't think _I'm_
oversimplifying; it's more that current mass-oriented software essentially
falls into two basic camps, Free and proprietary, with various small tweaks to
each model. For example, something like a mass NDA for every user is going to
fail for consumer-oriented software, whereas a company contracting a discrete
number of external auditors decomposes into trust through branding.

I'd _love_ to find a commercial model that could work for consumer-oriented
Free software [0], and I think it could sound quite similar to something you
describe. It's just that those types of multi-party collaborations have been
pulled into the attractor of free-as-in-beer, at least as far as the code
itself is concerned.

[0] I say Free instead of Open, because I can envision software lacking just
FSF freedoms 1b and 3 could get stuck in a bad state as well. Like say
everyone knows there is a bug and how to fix it, but is legally prevented from
doing so.

~~~
nickpsecurity
"but it feels like we're merely quibbling over terms/framework on this one. I
don't think I'm oversimplifying"

You were. You made a blanket statement about the whole proprietary model. In
recent comments, you've changed your statement to talk about how the model is
applied in the general case - as in, popular implementations vs. all
implementations. I'll reply to the new comment anyway.

"For example, something like a mass-NDA for every user is going to fail for
consumer-oriented software, whereas a company contracting a discrete number of
external auditing decomposes into trust through branding."

That's basically what happened. It could go further, where lots of users get
the NDA with cross-checks, though not at mass, high-volume scale. It helps if
the software is designed in such a way that it can be shown not to manipulate
the system. I once proposed memory safety, safe API use, and sandboxing as a
start on that. Automated tools could assess those. One could go further with
information-flow labels tracking confidentiality or integrity, enforced by the
compiler, runtime, or hardware. A few CompSci projects do it, but it's not
mainstream.
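The information-flow idea can be sketched in a few lines of dynamic label
tracking (the two-point lattice, `Labeled` class, and sink check here are
illustrative assumptions, not any shipping system):

```python
# Hypothetical sketch of dynamic information-flow labels. Each value carries
# a confidentiality label; combining values joins labels, and a low-security
# sink refuses anything labeled above its clearance.

LOW, HIGH = 0, 1  # two-point lattice: public < secret

class Labeled:
    def __init__(self, value, label=LOW):
        self.value = value
        self.label = label

    def __add__(self, other):
        # The result is as secret as the most secret input (lattice join).
        return Labeled(self.value + other.value, max(self.label, other.label))

def write_to_network(data: Labeled):
    # The sink enforces the policy: only LOW-labeled data may leave.
    if data.label > LOW:
        raise PermissionError("flow violation: secret data to public sink")
    return data.value

pin = Labeled(1234, HIGH)       # secret
greeting_len = Labeled(5, LOW)  # public

write_to_network(greeting_len)  # allowed: public data to public sink
tainted = greeting_len + pin    # join: the result is labeled HIGH
# write_to_network(tainted) would raise PermissionError
```

Real systems (e.g. compiler-enforced ones) check these flows statically, so
the violation is rejected before the code ever runs.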

"I'd love to find a commercial model that could work for consumer-oriented
Free software"

They pay for it. Then they get it. The source is on an FTP server or something
somewhere. Things like branding, enterprise features, and tie-ins keep them
buying from the original supplier. It's been done in a few ways, although
there's always a risk of clones.

"[0] I say Free instead of Open, because I can envision software lacking just
FSF freedoms 1b and 3 could get stuck in a bad state as well. Like say
everyone knows there is a bug and how to fix it, but is legally prevented from
doing so."

Dual licensing (proprietary + GPL) covers this. A few quick proposals follow
for non-free software trying to approximate free. One could do a shared-source
license that allows bug fixes. One could allow redistribution of software to
other _paying customers_. One could cap what's to be paid, or for how long,
before what's purchased becomes perpetual. One could make it go FOSS if it's
EOL'd or falls under a certain amount of developer time/contributions (a
tricky measure). A recent proposal was a time limit on how long a version or
individual product would be paid for, with it going FOSS after that limit.

So, quite a few options here. One thing that's important to remember is that
FOSS will always have a disadvantage against a benign, paid model. With a paid
model, you can contractually ensure those being paid are doing support, bug
fixes, enhancements, pen testing, etc. They can also pay the pros to do it
right. They have lawyer money for the patent trolling that will come their
way. Combined with shared source like the above, they might also get most or
all of the benefits of FOSS along with the benefits of paid. It's why I'm
highly interested to see companies experiment with hybrid models. A few have
shown up here, but nowhere near enough.

------
tokenizerrr
So does this achieve anything other than fuck over the people who (probably
unknowingly) wound up buying such a Blackphone? I can just imagine waking up,
looking at my phone, which had been working perfectly well up until that
point, and being told it has now been bricked over some licensing issue.

This seems really scummy and would drive me away from their products forever.

~~~
ekidd
> I can just imagine waking up, looking at my phone which has been working
> perfectly well up until that point, and being told it now has been bricked
> for some licensing issue.

If I were to buy a Blackphone, I would do it because I wanted security, and
because I trusted the manufacturer to provide it. The problem with security is
that just because my phone appears to be "working perfectly" doesn't mean that
somebody isn't eavesdropping on everything I say.

A counterfeit Blackphone, in other words, is completely defective and
untrustworthy, no matter how well it appears to work, because my trust in the
manufacturer is broken.

~~~
RubyPinch
But how does that lead to the logic of turning the hardware into a piece of
trash, instead of just informing the user?

~~~
ddalex
If you don't turn the counterfeit phones into trash, there is no incentive for
people to buy the real thing. This hurts the sales of Blackphones and the
revenues and profits of the company which invested cash into developing the
product - and harms their ability to provide security for genuine customers.

~~~
RubyPinch
So, protect the profit first, protect the customers second?

I don't exactly see how it bumps up the profit, though. You have a bunch of
obviously interested customers who want your product, and then the company
comes out with "Actually, we expect you to buy the phone twice, because of a
mistake that may or may not have been your fault."

~~~
EdHominem
The customers didn't buy a real Blackphone in the first place; they bought
something vaguely similar, but counterfeit. They literally bought a backdoor.

Remember the hassle WhatsApp got for failing open?

The Blackphone+SilentOS is an actual crypto device for people who believe they
need crypto. It needs to fail closed. This may cost some people $100, but save
their lives.

This also gives the customers the ability to sue the sellers, or at least push
for refunds via the sales platforms.

~~~
RubyPinch
WhatsApp got that hassle because it failed open and didn't clearly inform the
user that it was doing so.

If the user is informed that they have a non-genuine device that is not safe
or secure (e.g. via something like Windows' nag notification), then they can't
expect it to work like a secure device, but more like an ordinary phone.

Then both normal consumers (who will continue to use the phone, since they
didn't really care too much about safety in the first place) and
security-conscious consumers (who will re-buy ASAP) would be more inclined to
use the same brand in the future.

~~~
EdHominem
I can't imagine that anyone worldwide was going to buy their first and only
smartphone, but decided to spend more to get a special high-security black
phone, and is then without a phone because of this. And even if they were, I'd
rate the safety of even one user who'd accidentally click through the warnings
and make a call that should have been secure but wasn't, as paramount.

If my smoke/heat/etc. detector starts to fail, I don't want it silently
dropping back to a smoke-only detector. I want it to start beeping loudly and
refuse to stop.

> If the user is informed they have a non genuine device that is not safe or
> secure

Having to flash a new OS onto it is an appropriately sized clickthrough for a
warning of that magnitude. Like being woken in the night to change a smoke-
detector battery.

------
joecool1029
These don't sound like counterfeits. This sounds like Blackphone doesn't have
control of its supply chain.

Oh well, more reason to never consider purchase of one of their devices. Shame
because they actually are doing some decent software development.

~~~
phire
It sounds like they have a problem with ghost shifts.

The factory they contracted to assemble their phones is running extra shifts
off the books and selling the extra phones for extra cash on the side. This
happens all the time in China.

------
macspoofing
I stopped buying high-priced electronics on eBay or through Amazon resellers.
The number of counterfeit products is staggering, and their superficial
quality is high (the exterior is usually close to perfect).

~~~
digler999
Seems like this is a perfect case to use eBay's/PayPal's refund policy, which
(apparently) always favors the buyer - the same policy people were using to
steal from sellers by pocketing the product and mailing back junk.

------
tehwebguy
How can someone trust a "real" Blackphone if they source production out to a
factory that they have no control over?

~~~
ddito
Do all sellers of security devices manufacture them in house? Now that you
mentioned this it really does sound like a serious problem for providing any
kind of guarantee..

~~~
Analemma_
Most of them do, yes. IronKey and Yubico both actually extensively document
their manufacturing process, which is in-house.

------
tptacek
I'm honestly surprised that enough people buy Silent Circle phones that these
countermeasures would be worth it to anyone.

~~~
prdonahue
Especially with their public claims of openness/open source vs. actual steps
taken to provide source: [https://www.quora.com/Will-Silent-Circle-silence-
critics-who...](https://www.quora.com/Will-Silent-Circle-silence-critics-who-
demand-they-uphold-their-promise-to-release-their-source-code-If-not-can-they-
be-trusted).

------
criddell
The implication in the story is that the contract manufacturer made extras and
sold them on the black market. How common is that?

Does anybody know who made the phones?

~~~
tqkxzugoaupvwqr
It is a common strategy of manufacturers in China. You, as the designer, get
ripped off by the very people you pay to produce your product. The quality can
match the original because they use the same factory, same tools, and same
processes used to produce the original product. The manufacturers simply work
another shift, e.g. a night shift, or do a few more runs. In some cases,
manufacturers clone the whole factory used to produce the original product so
they can increase output of counterfeit goods.

~~~
kabdib
This is why devices like game consoles and at least a few models of cell
phones have a "geneology" database maintained by the official manufacturer,
with originating key material in the hardware, and additional activating key
material provided to the device at certain well-controlled manufacturing
stations. You simply can't make a working device without going through that
process.
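A minimal sketch of how such a provisioning step might work (the names and
the symmetric HMAC scheme are illustrative assumptions, not how any real
console or phone does it):

```python
# Hypothetical sketch of per-device activation at a controlled factory
# station. The station holds a signing secret that never leaves the factory;
# a device only activates if it presents a ticket bound to its hardware ID.
import hashlib
import hmac

STATION_SECRET = b"factory-station-secret"  # assumed; kept off the main line

def issue_activation_ticket(device_id: bytes) -> bytes:
    # Runs only at the controlled station; binds the ticket to one device.
    return hmac.new(STATION_SECRET, device_id, hashlib.sha256).digest()

def device_boot_check(device_id: bytes, ticket: bytes) -> bool:
    # On-device check. (In practice the verifier would use asymmetric keys
    # so the device holds no secret worth extracting; HMAC keeps this short.)
    expected = hmac.new(STATION_SECRET, device_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, ticket)

genuine_id = b"SN-0001"
ticket = issue_activation_ticket(genuine_id)
assert device_boot_check(genuine_id, ticket)      # genuine unit activates
assert not device_boot_check(b"SN-9999", ticket)  # ghost-shift unit doesn't
```

A ghost-shift phone assembled without ever visiting the station has no valid
ticket, so it can be detected or refused without bricking anything after the
fact.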

[Just bricking Blackphones is a terrible move; letting the users know their
phones were counterfeit and leaving them on would have been a far better move,
IMHO]

~~~
criddell
So if this is a known problem with known solutions, then the fact that
Blackphone didn't do any of this makes them look like amateurs. Maybe one
shouldn't rely on them for security...

------
quickConclusion
If this is not their phone, made by them, how can they brick them? How is that
legal?

~~~
informatimago
It is legal because they don't license their software with a license ensuring
freedoms of the user. The user is not free to use their software.

That's the point of the GPL. If users took care to buy only products including
only GPL'ed software, then they would have the freedoms: 1- to use the
software, 2- to copy it for their friends, 3- to get the source to audit it
and modify it (so they would be free to remove any backdoor - which wouldn't
exist in the first place for this very reason), 4- to compile the source to
binaries and use them to replace the provided binaries (so they may increase
their level of trust in the software they run on their hardware).

Do not buy products that come with freedom-restricted software!

~~~
zaroth
After all that the GPL still doesn't help these users unless they are _also_
able to spin up and maintain the various servers & services the software also
relies on.

------
mastazi
So there is this company that markets an "enhanced privacy" device and the
marketing material shows a home screen with Chrome, Play Store and Google App.

Thank you for the laughs!

~~~
Markoff
plus they can remotely disable your device without your approval, very safe...

------
Tepix
These phones are on sale for less than 100 euros, a very good price given the
hardware specs.

The genuine phones sell for 662€ with a one-year subscription.

------
Markoff
TL;DR: a privacy-oriented phone can be disabled remotely by someone other than
you - or, in other words, how to ruin your reputation in 5 minutes.

------
uber1geek
Someone will make a patch for this, just wait.

------
ComodoHacker
Can't IMEI also be cloned though?

~~~
DanBC
In England, tampering with an IMEI is _serious business_, and carries a
potential six-month prison sentence.

[http://www.legislation.gov.uk/ukpga/2002/31/section/1](http://www.legislation.gov.uk/ukpga/2002/31/section/1)

I'm a bit surprised that isn't the case in the US.

~~~
mschuster91
Good grace, this is an overreaching bill.

Going by the letter of the law, this also affects Bluetooth and WiFi MAC
addresses, even of laptop computers (!). Those are trivially changed by
userspace tools, and IIRC Apple randomizes MAC addresses to prevent people
from tracking other people's devices.

Edit: It gets even worse, in theory anyone distributing software capable of
changing MACs or writing tutorials on how to do this commits an offence. Just
look at
[http://www.legislation.gov.uk/ukpga/2002/31/section/2](http://www.legislation.gov.uk/ukpga/2002/31/section/2),
it's madness.

~~~
DanBC
It's poorly drafted, but it's only ever used against people who change IMEIs -
I think only when they change the IMEI on stolen phones. But yes, it's yet
another bad English law passed as a kneejerk response to something or other.

------
rongway
ebay isn't a black market...

------
timthelion
Seems a bit funny that a device that is meant for privacy allows remote
control by its masters.

~~~
sqldba
What are they meant to do?

They build a secure phone. It gets ripped off by the manufacturer and resold
by anonymous eBay sellers who could have installed any backdoor on the phone.
Then the people using them are being snooped on - exactly what they didn't
want in the first place.

Bricking the phones is the right thing to do.

~~~
mindslight
They're supposed to _not have this capability_, if they actually built a
secure phone!

~~~
StavrosK
What, the capability to install updates? So you'd be fine with a secure phone
that the vendor could never update?

~~~
mindslight
The vendor shouldn't be able to push an update, no. And to the extent that
this update was pulled by users without a diff/sanity check/etc., it
constitutes a de facto _push_. In any case, any update should be completely
revertible (which doesn't protect against data leaks from hostile code, but
does protect against denial of service).

This type of _attack_ is just a _test case_ that any truly secure device has
to defend against. I'm not advocating a specific mechanism for avoiding this
behavior, because there's obviously work to be done here to come up with a
better mechanism than "every user audits every line of source before
installing". It's just that this design work _has to be done_ - "security"
based on trusting a company is easy, and doesn't differ appreciably from what,
say, Apple already provides.
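The "completely revertible" property can be sketched as an A/B slot scheme,
where an update never overwrites the running image (the slot names and this
toy model are illustrative assumptions, not what any phone vendor ships):

```python
# Hypothetical A/B-slot sketch: updates go to the inactive slot, so a bad or
# malicious update can always be reverted. This prevents denial of service,
# even though a hostile image could still leak data while it is active.

class ABDevice:
    def __init__(self, image: str):
        self.slots = {"A": image, "B": None}  # "B" holds the other image
        self.active = "A"

    def apply_update(self, new_image: str):
        # Write to the inactive slot; the previous image stays intact.
        inactive = "B" if self.active == "A" else "A"
        self.slots[inactive] = new_image
        self.active = inactive

    def revert(self):
        # Switch back to the untouched previous slot.
        self.active = "B" if self.active == "A" else "A"

dev = ABDevice("os-v1")
dev.apply_update("os-v2-bricked")
assert dev.slots[dev.active] == "os-v2-bricked"
dev.revert()  # user-initiated rollback restores the old image
assert dev.slots[dev.active] == "os-v1"
```

Under this model, a remote "brick" update would survive only until the user
chose to revert it.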

~~~
sjwright
Your demand is impossible. Any software update mechanism relies upon trusting
the software vendor.^ The only thing you can ever hope for is that the update
mechanism is capable of trusting that the update has not been tampered with in
transit.

^ with the debatable exception of software distributed in source code form in
a language you're able to audit and compile yourself, though you'd still be
vulnerable to cleverly concealed exploits
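The narrow "not tampered with in transit" property can be sketched with a
pinned digest check (the filenames and the out-of-band hash are assumptions;
real update systems use asymmetric signatures, which still root trust in the
vendor's key, which is the point above):

```python
# Sketch: verify a downloaded update against a digest obtained out of band.
# This proves only transit integrity; you still trust whoever published the
# digest in the first place.
import hashlib

def verify_update(payload: bytes, pinned_sha256_hex: str) -> bool:
    return hashlib.sha256(payload).hexdigest() == pinned_sha256_hex

update = b"ota-image-bytes"
# Assumed to be published by the vendor over a separate channel:
pinned = hashlib.sha256(update).hexdigest()

assert verify_update(update, pinned)                      # intact download
assert not verify_update(update + b"\x00tamper", pinned)  # modified in transit
```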

------
wildchild
Blackphone is a scam and snake-oil. Cloud "privacy" with remote control. Need
privacy? Use burners.

~~~
raesene9
I don't think it's fair to call Blackphone "Scam and snake-oil" just because
it uses cloud features and has remote control.

What's fair to say is that you're implicitly trusting Silent Circle to be both
competent and benign. But then running pretty much any modern computing device
requires you to trust various groups of people in that regard, so this isn't
that different really.

~~~
throwaway91111
Trusting third parties is, in general, not a great approach to privacy.

~~~
sjwright
Trust is the only way security works in practice. Have you personally audited
all the software and silicon in your secure device?

~~~
throwaway91111
That's an entirely different type of trust than trusting someone who says "I
swear that I won't look at your data".

~~~
sjwright
It's a bit easier for a cloud provider to look at your data surreptitiously,
but you apply that very same "I swear that I won't look at your data" trust
upon every software and hardware vendor from your USB keyboard to your
internet router.

~~~
throwaway91111
While that is a good point, it's easier to break open your keyboard and verify
it's not emitting RF than it is to audit a site that may be hosted under an
entirely different state's jurisdiction.

