
Yubico: Secure Hardware vs. Open Source - francois2
https://www.yubico.com/2016/05/secure-hardware-vs-open-source/
======
davideous
In discussions like this the phrase "security by obscurity" gets used as an
accusation. We all agree "security by obscurity" does not work. But that's not
what is happening here.

Wikipedia's definition: "the reliance on the secrecy of the design or
implementation as the main method of providing security for a system or
component of a system."

Yubico isn't saying that the security of the device is increased by keeping
the source code secret.

They say they are increasing security through measures like these: disabling
user-loading of new firmware (which could otherwise let a bad actor load
malicious firmware), using hardware with built-in side-channel
countermeasures, and disabling JTAG ports (which could be used for key
extraction).

This isn't obscurity. These are some good engineering arguments. Engineering
is always full of trade-offs.

~~~
colemickens
None of which precludes the implementation from being open source. In fact, it
just means that even if the software were open source, it would be near-
meaningless since I can't verify the code running on the device and can't
reflash it myself.

"Youbico isn't saying that the security of the device is increased by keeping
the source code secret."

Yeah, they're not really saying _anything_ other than _trying_ to provide an
excuse for why they won't release it. "You can't use it anyway" isn't much of
a response (I actually find it rather patronizing and dismissive).

Not to pile on, but regarding "Engineering is always full of trade-offs"...
what exactly is the supposed trade-off here? (Maybe they're using licensed
code that they can't redistribute?)

~~~
tadfisher
If I'm reading the statement correctly, they are unable to release the source
due to an NDA with their hardware provider, which is at least a reason other
than "it's not software under the Free Software definition".

~~~
mtgx
What would be the purpose of an NDA with the hardware provider? Surely not to
hide it from GCHQ/NSA?! I imagine a company like Yubico has all of its
employees on GCHQ/NSA lists and may even have cell tower simulators outside of
its offices.

The NDA makes this even more suspicious. Who's the hardware provider? Huawei?

~~~
yc-kraln
NXP makes you sign an NDA to use their secure stuff.

The purpose is anti-competitive, preventing NXP's competitors from learning
how the devices work. These devices often have advanced hardware and firmware
countermeasures.

The secure modules are considered weapons technology if they're allowed to be
updated after sale; the company is responsible for tracking each one, they're
impossible to ship overseas, etc.

It's not suspicious, it's SOP. Choose between open and secure, or make your
own silicon.

~~~
zepto
Trade secrets are not 'anti-competitive'.

------
fpgaminer
> we, as a product company

The most important thing any security company needs to realize is that their
primary product is their reputation, not the physical or digital goods that
they produce. "We, as a product company" is totally the wrong attitude.
There's really no question about it: every ounce of closed-source
software/hardware in a security offering is something the customer should be
concerned about.

From a product perspective it totally makes sense to be worried about open
sourcing the entire design. "Our competition will make clones!" And that may
be true of every other kind of product. But would you buy a cheap knockoff
Yubikey? I certainly wouldn't. Again, reputation is the key here. That's what
a security company sells to their customers. Confidence that when they buy
from company X they know that company X has put the best engineers to the task
and crafted a device that will protect their valuable digital information.

A company can build up a reputation in the security industry, produce world
class hardware and software, and charge a sharp premium on it, because
security is _so_ important and protects some of our most valuable assets. That
premium is completely derived from the trust that they've garnered. It's
insane for Yubico to squander theirs under some false sense of IP security.

EDIT: And all that said, I totally understand where they're coming from on
some of their points. They have to depend on chip manufacturers, and chip
manufacturers are just the absolute worst when it comes to open source and
security. Sometimes there are hard constraints and compromises have to be
made. Most of cryptography is a trade-off. So don't take my comment to mean
that designs absolutely have to be 100% open source. That's infeasible most of
the time for hardware. But Yubico should be striving for it and pressuring the
market.

~~~
nickpsecurity
"The most important thing any security company needs to realize is that their
primary product is their reputation, not the physical or digital goods that
they produce."

That's semi-true. They're both important. The belief that the product is worth
buying, and the effort put into selling it, are of primary importance. Getting
hacked or sued in public diminishes sales. So, the most important aspect of
security for these kinds of companies is, perversely, minimizing the potential
for their image to be hit by hackers, even if the products have no security.
Not an accusation at Yubico, but a common strategy in this market. So, they
just have to present a good impression to the target market.

" every ounce of closed source software/hardware in a security offering is
something the customer should be concerned about it."

Not really. It might surprise you but many companies have run for _decades_ on
proprietary platforms. They generated ridiculous sums of money in the process.
All kinds of people got jobs, made money, and retired in this time. Nothing to
worry about apparently most of the time. The reasons to worry are there but
smaller than you think. One must balance many needs in a business. For most,
this kind of thing is a checklist item about reducing liability. They're fine
if it looks good on paper.

" But would you buy a cheap knockoff Yubikey? I certainly wouldn't. "

Most would. They want _something_ as an obstacle to hackers while minimizing
cost. They don't know if Yubikey has any real quality underneath, given how
businesses often do things. So, it's a real Yubikey vs a cheaper one. Many,
not all, will choose the cheaper one. See Cisco and mobile manufacturers vs
Huawei to see how big of a market share that can lead to.

"Confidence that when they buy from company X they know that company X has put
the best engineers to the task and crafted a device that will protect their
valuable digital information."

There's a market for that. I used to try to serve it. It's tiny and fickle.
Yet, I question what confidence people have in those engineers to begin with,
as they've never assessed their capabilities in INFOSEC and strong attacks are
rarely publicized. It's not like Googling the rate of car crashes.

" produce world class hardware and software, and charge a sharp premium on it,
because security is _so_ important and protects some of our most valuable
assets. "

Many tried. The market rejected almost all of it. Still does. They want
security-defeating feature X, protocol Y, and fall-back Z. They want it to run
as fast as the competition despite security or safety checks, on insecure,
potentially-backdoored hardware, to get COTS HW benefits. They also want to
pay hardly anything extra for it, despite whole teams of extra people being
put into every other component for rigor, plus the price of external
evaluations. The market for high-assurance guards is so small that they have
to charge over $100,000 per unit to make the money back. Hell, Signal is free
and Threema charges $1-2, but they're barely a fraction of 1% of WhatsApp or
Facebook in market share. The demand side is the problem.

So, Yubico is doing what's good for business. All of them are, and should be,
until the market shows it's willing to make the compromises necessary for
strong security. It won't. So, wasting money on it is foolish outside the
defense sector, academia, and a few niches (e.g. smartcards) where one can
keep a job doing it.

~~~
the_ancient
>>Cisco and mobile manufacturers vs Huwei to see how big of a market share
that can lead to.

Implying the Huawei is the "cheap knock off" and Cisco/Apple/Samsung/etc are
the noble high quality product fighting the good fight....

My Hauwei Nexus 6P has been the best phone I have ever owned, far exceeding
the quality and usability of every Motorola, Samsung, and other phones I have
owned.

As to Cisco, after their fasco with the NSA I would not trust them at all for
security.

~~~
nickpsecurity
That's an accusation and an implication. The Chinese strategy, which isn't
entirely secret, is to use their hackers to get trade secrets out of firms in
all kinds of sectors to hand to their own firms. Each time, their firms
leverage those as a head start on their own products, which combine their own
innovations, labor advantage, and money from the vast market in China. It's a
proven model. As far as Cisco and Samsung go, it's been clear Huawei has been
knocking them off the same way.

Besides, what are you even questioning, given that Huawei admitted they had
and removed Cisco source code? Of course they robbed them. :P

"As to Cisco, after their fasco with the NSA I would not trust them at all for
security."

Which is totally irrelevant to my point that cloners... especially Chinese
cloners... will make knock-offs of a hardware product from _any_ country that
hurt that company's business, if the product is worth it to them. The NSA
collecting secret information to determine if you're a terrorist, felon, or
threat to foreign policy != Chinese intelligence giving your I.P. to your
competition, who then operate in _your_ market with cheaper labor. The NSA is
a hypothetical threat for most companies, whereas the Chinese tech and labor
markets have been doing my country (the U.S.) in for decades, with many
companies achieving parity or dominance in some sector through stolen I.P. It
didn't help that the idiots running our companies put R&D centers over there
to reduce labor costs. (rolls eyes) Such stuff is an existential threat to
small hardware providers worth cloning, given what Shenzhen can pull off.

~~~
grawlinson
The NSA does the same thing for American business.

[http://www.economist.com/node/1842124](http://www.economist.com/node/1842124)
[http://money.cnn.com/2015/04/30/news/airbus-germany-nsa-spying/](http://money.cnn.com/2015/04/30/news/airbus-germany-nsa-spying/)

~~~
nickpsecurity
Name five foreign companies, off the top of your head, whose product lines US
products have cloned... with source and such... and where the clones also
became huge players in the market, taking huge sums from the original. I'm
interested in seeing them, as I blast the NSA for what tiny industrial
espionage I find.

------
davb
It's a shame to see that they used the goodwill of security-conscious
cryptonerds to gain a foothold in the market only to, effectively, say "We're
now targeting enterprise and government, who can afford to pay third-party
security auditors. You can't, so just take our word that it's secure."

Other companies have managed secret distribution for secure devices just fine
- randomise the card manager key and bundle a tamper-proof packet containing
the key along with the product. Provide instructions on how to verify the
integrity of the packet, and confirm a digitally signed affirmation of the key
against Yubico's public key online.
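
Purely as an illustration of that flow, here's a minimal sketch using Ed25519
from the pyca/cryptography library; the message format, key names, and serial
are invented for the example, not anything Yubico actually ships:

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    
    # Vendor side, at manufacture: sign (serial || SHA-256(card manager key)).
    vendor_key = Ed25519PrivateKey.generate()  # stand-in for the vendor's signing key
    card_manager_key = b"\x42" * 16            # randomised per device
    serial = b"YK-000001"
    message = serial + hashlib.sha256(card_manager_key).digest()
    affirmation = vendor_key.sign(message)     # shipped in the tamper-proof packet
    
    # Customer side: recompute the message from the packet contents and verify
    # against the vendor's published public key; raises InvalidSignature on tamper.
    vendor_pub = vendor_key.public_key()       # in reality, pinned/fetched out of band
    vendor_pub.verify(affirmation, message)
    print("card manager key affirmed by vendor")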

That's more than RSA offers for SecurID seed verification and more than my
business bank offers for two-factor device PIN integrity checking.

I'm not sure who they use for their Secure Element (NXP?) but it also sounds
like Yubico has gone along with their request (and NDA) to keep implementation
details secret. We've seen a similar situation in SE implementations in mobile
phones (for contactless payment, primarily).

Again, enterprise customers don't care (mid-sized ones have insurance that
will cover losses if their Common Criteria EAL 5+ vendor's hardware is
compromised; big enterprises can pay for auditing). Governments don't care
(they'll pay for auditing or negotiate it into any significantly high-volume
contract).

End users and the tech community are the only groups who'll really lose out
here.

------
nickpsecurity
I've studied high-assurance security and hardware for a long time. This looks
to be motivated by a few things:

1\. Hardware costs money to develop, has to make it back, and is easy to
clone. They'll keep hardware secret by default for this reason, like everyone
does. It also lowers the odds of patent suits. All kinds of people demand
open, secure hardware, but almost nobody will buy it. Just like software. It's
the number 1 problem in the INFOSEC industry.

2\. There are, IIRC, three companies building the kinds of secure ICs they
need. They NDA the stuff critical to understanding them. Plus, the
implementations are secret, with tamper-resistance mechanisms. It's pointless
to rely on the open-source model to understand or evaluate such a thing. Some
marginal benefits, but major risks would still be there. Whereas open-sourcing
the stuff _adds_ risk in terms of issues with the suppliers. So, no OSS is an
acceptable choice here.

3\. Restricting some of the firmware/software is a tradeoff of the protection
methods they're using. Again, this reduces the value of open-sourcing it, as
you'd have to dump it off the chip to verify it anyway. The kind of people who
can do that don't need Yubico's help.

4\. Yubico might not know how to build secure HW/SW combos. It's a rare skill
whose techniques are a mix of published material and trade secrets. Plus,
attackers are always coming up with new stuff. So, obfuscation... not security
by obscurity... but obfuscation of aspects of the design to increase
attackers' work between product releases is both justified and a proven
method. If no other measures existed, then it would be the garbage known as
security by obscurity. This seems to be the better practice of proven
mechanisms plus obfuscation, which can hamper even nation-state hackers. Who
knows how good _their_ mechanisms are going to be, but there's potential.

So, it seems like a combination of sustaining their business by stopping
clones and lawsuits, with improved branding from the effects of obfuscation &
hardened ICs on the low-skilled attacks that dominate the press. Two very good
reasons to make a decision in this market. It's just economics in action. :)

~~~
uola
1\. The hardware design per se isn't that valuable. It's quite easy to reverse
engineer and is probably more like a reference design than anything. More
likely, NXP (?) don't want open designs and open software because they make it
easier to reverse engineer and clone the chips themselves. For YubiKey
themselves, it's mainly the firmware that is valuable (well, design and access
to chips too, of course), which is why part of their firmware isn't open
source.

~~~
nickpsecurity
"The hardware design per se isn't that valuable"

People that spend considerable effort turning a good idea into hardware that
sells tell me otherwise. ;)

"because it makes it easier to reverse engineer and clone the chips
themselves."

You first said it's easy to reverse engineer and not valuable. Then, you said
they want closed designs to reduce reverse engineering and cloning. Which is
it?

"For YubiKey themselves it's mainly the firmware"

That may be true. I can't speak to that.

~~~
uola
"People that spend considerable effort turning a good idea into hardware that
sells tell me otherwise. ;)"

The execution and the overall ecosystem of course matter. But the hardware
design, how the chips are connected, isn't really a secret as such and is easy
to reverse engineer and recreate. It's just not very complex.

[http://www.hexview.com/~scl/neo/](http://www.hexview.com/~scl/neo/)

"Which is it?"

The hardware design is easy to clone; the chips themselves aren't necessarily.
Chips have a very low marginal cost, and a functionally identical clone could
easily be sold for 1/100th the cost in volume, since all the cost is R&D.
Companies therefore try to protect their IP as much as possible by making
reverse engineering harder and by "owning the ecosystem". There have been
cases where clones were made by emulating chips on much more capable (but
cheaper) hardware and sold for 1/10th the price.

~~~
nickpsecurity
"But the hardware design, how the chips are connected, isn't really a secret
as such and is easy to reverse engineer and recreate. It's just not very
complex."

Hardware design is a combo of how the chips are connected, the firmware, and
getting it to users. Your link supports my assertion that they should put in
whatever obstacles they can.

"Companies therefor try to protect their IP as much as possible by making
reverse engineering harder and by "owning the ecosystem"."

Point 1 in my original comment.

------
viraptor
After thinking through the initial "this is terrible" reaction, I actually
don't mind what they're doing. Though if there were an equivalent solution
based on open source, I'd definitely choose it over the YK 4.

I also don't see anything that would really prevent them from just releasing
the source they're using, even if we can't realistically do anything useful
with it. The whole point of those systems is that they're secure via
algorithms and hardware silos - releasing their sources shouldn't change
anything.

But in practice it doesn't really matter that much - as long as they use
standard interfaces and replace your key for free if someone finds a
vulnerability, I'm (cautiously) fine with their new position. I think a big
part of the issue is that they did something better before; if they had
started with the current design, people wouldn't really complain about it that
much.

------
rcthompson
Couldn't a hardware vendor theoretically provide read-only access to the
firmware and then have an open-source reproducible build process so that
anyone can build their own copy of the firmware and verify that the firmware
on the device is bit-for-bit identical? Wouldn't that satisfy people who want
to be sure of what code is running on their device while still preventing an
attacker from loading custom firmware?
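
As a sketch of the check being proposed, assuming the device's read-only dump
is honest (the objection raised below) and with hypothetical file names:

    import hashlib
    
    def firmware_matches(device_dump: bytes, reproducible_build: bytes) -> bool:
        """Bit-for-bit comparison via digests, so mismatches are easy to log."""
        return (hashlib.sha256(device_dump).hexdigest()
                == hashlib.sha256(reproducible_build).hexdigest())
    
    # Dump the firmware over the device's read-only interface, build the
    # open-source tree reproducibly, then compare the two images.
    with open("device_dump.bin", "rb") as f, open("my_build.bin", "rb") as g:
        print("identical" if firmware_matches(f.read(), g.read()) else "MISMATCH")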

~~~
vbernat
The read-only copy could be different from the running copy.

~~~
rcthompson
If you trust the hardware enough to use it for 2-factor authentication, then I
think you trust it enough to be honest with you about its contents.

~~~
dfox
The problem there is that in the usual case, read-only access to the software
will not be provided directly by the hardware, but by the same software you
are trying to verify.

In theory, this could be solved by verifying the whole memory of the device,
but that still depends on you believing that the device does not have more
memory than it should.

~~~
leni536
> the read-only access to software will not be provided directly by the
> hardware, but by the same software you are trying to verify. Why not?

~~~
dfox
Because in the usual case you want to do such verification through the same
interface as normal operation, both for usability reasons and to limit the
number of interfaces that cross the security boundary.

------
kerkeslager
The argument for disabling the loading of new firmware _on your own device_ is
valid. It prevents an outside actor from loading malicious firmware. But it's
a tradeoff: it means that if a vulnerability is found, the device has to be
replaced, and users can't customize their firmware. That's a good tradeoff;
I'd rather risk paying for a new Yubikey than risk a security compromise, and
most users are unqualified to verify the security of firmware being loaded
onto the device.

The problem is, it's not a tradeoff _Yubico_ have to make. They can allow
users to achieve the same goals by distributing the device un-flashed, with
the source code to the firmware. Upon flashing, the firmware would disable
further flashing. If the user doesn't like this tradeoff, the user can choose
to change the code. As a courtesy to more trusting users they could provide
the service of optionally flashing devices for you. And qualified users can
verify the security of the firmware before loading it.
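
A toy model of that "flash once, then lock" behavior (all names invented; a
real bootloader would enforce the lock with hardware fuse/lock bits):

    # Erased flash typically reads all-ones; programming can only clear bits,
    # which is what makes a lock bit effectively one-way.
    LOCK_UNSET, LOCK_SET = 0xFF, 0x00
    
    class Bootloader:
        def __init__(self):
            self.lock_bit = LOCK_UNSET
            self.firmware = None
    
        def flash(self, image: bytes) -> bool:
            if self.lock_bit == LOCK_SET:
                return False              # device is sealed: refuse new firmware
            self.firmware = image         # write the user-audited image
            self.lock_bit = LOCK_SET      # irreversibly disable further flashing
            return True
    
    bl = Bootloader()
    assert bl.flash(b"audited-firmware")  # the first flash succeeds
    assert not bl.flash(b"malicious")     # every later attempt is refused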

But by flashing the devices themselves, Yubico has chosen the worst of both
worlds. Now an outside actor can once again add malicious firmware: _Yubico is
an outside actor_. AND nobody can verify the security of the firmware. This
isn't even a tradeoff, it's just a loss.

~~~
H3g3m0n
> They can allow users to achieve the same goals by distributing the device
> un-flashed

There is the possibility of the device being intercepted before it reaches
you. Or before you have gotten around to locking it down. Or when you plug it
into your (compromised) system to lock it down.

Since all communication is done over the USB port, the problem is that the
device can be flashed with a backdoored firmware that appears to be
normal/unflashed: one that still appears flashable (by basically having a
virtual machine/emulator run the flashed image) and appears to get locked down
when you go through any lockdown process (since you just end up locking down
the VM), but still has the backdoor in place.

Firmware aside, people can modify the hardware too, unless you crack open the
device and inspect the internals (which many devices are designed to prevent).
And even then, a really sophisticated attack could replace the chips with
identical-looking ones. If you are using off-the-shelf ones, it wouldn't be
that hard. They can also add an extra chip before the real one that intercepts
the communication. Or maybe compromise the 'insecure' USB chip (if it's
programmable).

With locked-down hardware, the manufacturer can bake private keys onto the
chips and ensure that the official stuff checks the hardware by asking it to
digitally sign something with a private key. But if the attacker has added
their own chip between the USB and the legit chip, they can pass through the
requests to the official chip.
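
A sketch of why that pass-through defeats naive challenge-response attestation
(names are illustrative; Ed25519 stands in for whatever the real chip signs
with):

    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    
    genuine_key = Ed25519PrivateKey.generate()  # baked into the legit chip at the factory
    
    def genuine_chip(challenge: bytes) -> bytes:
        return genuine_key.sign(challenge)
    
    def pirate_chip(challenge: bytes) -> bytes:
        # The interposer has no key of its own; it just relays the challenge to
        # the real chip and forwards the valid signature back, logging traffic.
        return genuine_chip(challenge)
    
    challenge = os.urandom(32)
    sig = pirate_chip(challenge)
    genuine_key.public_key().verify(sig, challenge)  # passes: the relay is invisible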

A TPM will do something like keep a running hash of all the instructions that
are sent to the hardware and use the resulting hash as part of the digital
signature verification, but if you mirror the requests, that doesn't help.
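
That running hash is essentially a TPM-style PCR extend; a minimal sketch of
the idea, which, as noted, inherits the same weakness against a mirroring
proxy:

    import hashlib
    
    def extend(pcr: bytes, measurement: bytes) -> bytes:
        # new PCR = H(old PCR || H(measurement)): order-sensitive and append-only
        return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()
    
    pcr = bytes(32)  # PCRs start zeroed at reset
    for cmd in [b"SELECT applet", b"GENERATE key", b"SIGN data"]:
        pcr = extend(pcr, cmd)
    # The final value gets folded into the signed attestation. A proxy that
    # mirrors the same command stream produces the same value, hence the caveat.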

The next stage is to use the keys on the chip to encrypt all communication
with the 'secure' chip, so any 'pirate' chip won't get anything useful.

Users could be allowed to 'bake' their own keys in, but that leaves us with
the intercepted-hardware problem: the attacker gets the hardware and installs
fake firmware that appears to accept your custom key and performs the
encryption.

Personally, I think worrying about security to that level is overkill, even if
you're dealing with quite a bit of money. It would have to be quite an
organised attack. They would have to gain physical access to the device,
compromise it, return it unnoticed, and then gain physical access again later,
requiring both physical and digital security skills.

That's much more work than just stealing it or applying rubber-hose
cryptanalysis. Attackers can also compromise the system being used to access
whatever it's protecting.

------
eggy
I am pleased they took the time to respond at length. It makes a bit more
sense now (NDAs, hardware manufacturers, etc.) vs. the 'security by obscurity'
mantra prevalent in the replies.

I have had my own business, and the one thing I would say to the critics of
Yubico: if you have a way, given existing hardware and software tools and
suppliers, to do a better job, step up and do it. AFAIK, Apple didn't
open-source their hardware related to crypto, or their software.

I think you will find it takes more than wishful thinking; more like, put your
money (or your time) where your mouth is. Engineers, and I don't just mean CI
engineers here, know it is a long way from a math equation or set of equations
to a real-world working object. I would love to see, and would contribute
money to, an open-source solution. I just don't think it is as cookie-cutter
simple as the majority of comments on this forum seem to intimate.

------
drazvan
BTW, the two major manufacturers they're talking about are NXP
([http://www.nxp.com/](http://www.nxp.com/)) and Infineon
([http://www.infineon.com/](http://www.infineon.com/)). STMicroelectronics
([http://www.st.com/](http://www.st.com/)) is also a player here, and Feitian
has also started doing it
([http://www.ftsafe.com/product/epass/eJavaToken](http://www.ftsafe.com/product/epass/eJavaToken)).
NXP and Infineon are notoriously hard to get started with for small companies
and independent developers, but they have some very clever proprietary stuff
in their chips.

------
tptacek
Very long post. Apparently, very simple explanation: they want to use NXP
hardware, and NXP requires NDAs, preventing them from meaningfully
open-sourcing the platform code.

~~~
infinite8s
This is the only comment that has figured out the real reason for not
releasing the code - they can't due to NDA.

------
dmitrygr
This reads more like an excuse than a reason. Nothing he says is a reason that
prevents them from being more open.

All that he says is summarized in "it was too hard to think of a solution, so
we didn't do it."

~~~
rrego
That's a very disingenuous summary. It seems impossible to make the device
open due to the NDAs. Can you explain how they would get around these?

With regard to the applet manager, that seems to be more an issue of customer
friction than of being too hard. While "crypto nerds" would be fine, business
applications could be affected.

~~~
dublinben
Not using hardware components that would require NDAs would be the obvious
alternative.

~~~
sgift
Obvious? As in "I didn't read the post, didn't read that there are only two
suppliers for this kind of hardware, and didn't read that both of them require
these NDAs"-obvious?

Quote for your convenience:

    
    
      So — why not combine the best of two worlds then, i.e.
      using secure hardware in an open-source design? There 
      are a few problems with that:
    
      - There is an inverse relationship between making a 
      chip open and achieving security certifications, such as 
      Common Criteria. In order to achieve these higher levels
      of certifications, certain requirements are put on the
      final products and their use and available modes.
    
      - There are, in practice, only two major players 
      providing secure silicon and none of their 
      products/platforms are available on the open market for 
      developers except in very large volumes.
    
      - Even for large volume orders, there is a highly 
      bureaucratic process to even get started with these 
      suppliers: procedures, non-disclosure agreements, secure 
      access to datasheets, export control, licensing terms, 
      IP, etc.
    
      - Since there is no debug port, embedded development
      becomes a matter of having an expensive emulator and 
      special developer licenses, again available only under 
      NDA.
    
      - Although this does not prevent the source code from 
      being published, without the datasheets, security 
      guidelines, and a platform for performing tests, the 
      outcome is questionable, with little practical value.
    

You can disagree with these arguments, but just ignoring them to provide an
"obvious" answer is a cheap tactic.

------
sigmar
My opinion on this is that physical security is paramount. Your threat model
can't possibly eliminate all threats from an adversary that has physical
access.

No hardware is 100% secure and for Yubico to say this issue is about "Secure
Hardware vs. Open Source" seems like a red herring. Perhaps they are just
trying to protect their business model? After all, there isn't anything
particularly unique about the hardware.

~~~
nickpsecurity
Physical security is a moving target and a spectrum. Basic mechanisms can
protect my computer if I leave it unattended in front of common hackers for a
few minutes while I take a leak at a restaurant. Another level of security is
necessary against people with more access or tooling. At some point, basically
nothing I do will help, given enough resources from pros.

So, it's not so simple. Otherwise, all buildings containing valuables
protected by locks and such would be compromised, because enemies had the
potential of physical access. They aren't. That's telling you something.

------
foxhill
well, it's a shame that poor arguments get recycled like this, but it does
make for easy dismissal - cryptography is based on the idea that the methods
used are totally transparent; the power to decrypt comes from possession of
the appropriate keys. closing a design, hiding it from scrutiny by the
majority of hackers like ourselves, helps no one other than the individuals
who wish to gain unauthorised/unwanted access.

this is a fundamental concept in FOSS and for anyone to try and rationalise
their way out of it - be it out of some corrupted sense of trying to do the
right thing - is absurd.

fortunately i feel that the very people that would be interested in this
device will be aware of this; i hope the folks at yubico reverse this
decision.

~~~
hendzen
So that's why the NSA has a whole suite of confidential ciphers?

[https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography](https://en.wikipedia.org/wiki/NSA_Suite_A_Cryptography)

------
captainmuon
This story made me think a bit about devices like the Yubikey. I'd really like
one to store my keys for signing mail, or for two-factor authentication. But
the main selling point, the tamper-resistant secure-enclave-like chip, is
something I don't need. I'd rather have a tiny microcontroller in USB format
that I can program myself and understand nearly 100%, with no secret code
going on.

My reasoning: I don't need physical tamper-resistance for my threat scenario -
theft by a random thief, a coworker, a "friend", etc.

But if I was attacked by a nation-state-like actor, I cannot trust any
security measure of the device. How do I know the NSA does not have a copy of
every "random" card-manager key? How do I know that generated keys are not
subtly biased so that they can be guessed easily? Or that there is not a
secret function to extract them? Even if Yubico is 100% honest and their
device is clean, I must assume that if e.g. the NSA were after me, they have
the technology to extract the keys from the device, no matter what protection
it has.

------
microcolonel
I understand where they're coming from. Though it would be even braver for
them to get into the IC design game, and make a chip with the properties they
desire. They can then publish whatever they would like about that chip.

------
kriro
tl;dr: code is closed and I can't change it anyway so it shouldn't matter to
me.

I hope the response from consumers will be: we understand your position.
Unfortunately that is unacceptable and we'll look for another vendor. It is
mine. I own a Neo, not getting any of their future products.

Also as a strategic guideline...maybe if you're in the business of
security...don't use hardware that requires NDAs. Yes it'll make it impossible
to do some stuff and more expensive to do some stuff but I'd say there's
really no option to compromise.

------
ansible
While it is good that they are implementing all these hardware security
features, I think that we are, in general, overthinking the whole thing.

Their current industrial design very clearly says "hey, I am an important
security key", which is exactly the wrong thing to do.

It should instead look like a cheap flash drive. And when the thief plugs it
in, he sees exactly that: a low-capacity USB flash drive, unencrypted, with
some random documents on it.

Is the thief at this point going to perform some sophisticated hardware
hacking? No, it will just get thrown away.

------
hlandau
The industry of smartcards and similar devices has annoyed me for a long time,
mainly due to its failure to provide a secure general purpose computing
environment and get out of the way. I wrote about it some time ago:
[https://www.devever.net/~hl/smartcards](https://www.devever.net/~hl/smartcards)

~~~
nickpsecurity
You don't know why they use interpreters? It's for the combo of security and
app development. Just like long ago, the development of a high-assurance
MULTOS or JavaCard system means you certify the interface one time. Apps get
to build on that into an ecosystem. Then, only new implementations of that
have to be certified [in theory]. MULTOS requires it while I think it's
optional with JavaCard. I have less clear answers than most since I don't sign
the NDA's either. At least let me reverse engineer and post _some_ answers. ;)

Regarding DES: the smartcards and HSMs were originally developed for use by
both government and the financial industry. They originally standardized on
DES, then used 3DES to reuse their HW and SW. It was one of the few tradeoffs
that made long-term sense, given that a three-key version of a 1975 algorithm
is _still secure_ in 2016. That's 41 years of security through variants of
that algorithm. Unheard of in our industry. That you call 3DES, itself going
strong almost 20 years, something that should be repellent shows the
difference between the security-critical sector and the mainstream. The former
prefers what's been proven longest, with the latter preferring what's popular
and good in theory. Both AES and 3DES are valid choices given peer review.
That their money-makers came from 3DES customers made the best choice obvious.
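
For reference, three-key 3DES is DES applied encrypt-decrypt-encrypt with a
24-byte (3 x 8-byte) key; a quick sketch with pyca/cryptography, whose
TripleDES support is itself deprecated in recent releases, which rather
underlines the algorithm's age:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
    
    key = os.urandom(24)  # three independent DES keys -> "three-key" 3DES (EDE)
    iv = os.urandom(8)    # DES block size is 8 bytes
    
    enc = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).encryptor()
    ct = enc.update(b"sixteen byte msg") + enc.finalize()  # input already block-aligned
    
    dec = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).decryptor()
    assert dec.update(ct) + dec.finalize() == b"sixteen byte msg"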

Regarding NDAs: a HW guru who taught me what I initially knew on the subject
mentioned patent suits. He said his company refuses to do business in the U.S.
since those companies get sued into the ground. There are so many patents on
HW, esp. microarchitecture, that it's impossible to avoid them all. So, he
said, keeping things as trade secrets was a common strategy for smaller firms
to reduce legal risks and ensure profits. It also reduces copying and attacks
by hobbyists. And of course they didn't say "hides infringement" in the
datasheet. :P

I stopped there since I think these should address your concerns. At the
least, it should start to make sense what those companies are doing, whether
we like it on our end or not. Personally, I'm more a fan of the Caernarvon OS
for smartcards, as one of the inventors of INFOSEC (Paul Karger, grandmaster
of high security) made it. Look it up for interesting lessons on what
smartcard OSes deal with in terms of development and certification
difficulties.

------
jwildeboer
I have been a long-term user/promoter of YubiKeys. But today I ordered a
Nitrokey Pro. They seem to be the better choice now: definitely more open,
with pen tests of hardware and firmware on their website, and all schematics
and firmware on GitHub.

------
xaduha
You can get blank 'java' smart cards and load open-source applets on them; you
don't need Yubico.

Personally I only tried IsoApplet, but openpgp applet should work too.

~~~
floatboth
Previous/other Yubico products _are_ flashable javacards.

~~~
xaduha
You can buy a java smartcard for about $2-$5 without compromising on security.

------
exabrial
Each key's signature is randomized.... brilliant. So buying 1000 of the keys
doesn't give an attacker an advantage. It certainly flies in the face of FOSS,
but this is about security... I'll be watching closely to see Yubico's
actions. I think this is a great response so far, and I look forward to a
non-sensationalist rebuttal.

------
amluto
I thought about this for a while, and here are my thoughts about having the
source code:

With the older YubiKey NEO devices, the applet source was available and I
could freely upload an applet. This was great for a few reasons. I could
modify or upgrade the app (of course, doing so would cause me to lose existing
keys, which makes sense from a security PoV). (I actually did this on my old
YubiKey.) I could also, in principle, audit the app. And, if I trusted Yubico
to get their security right, I would trust that my freshly-arrived-in-the-mail
device was secure. Moreover, if I trusted Yubico not to act maliciously, then
the applet on the device I got in the mail would match the firmware on github,
and I could trust that it did what I thought it did.

There were, of course, problems. The GlobalPlatform platform is awkward to
use, the toolchain is terrible, and the key management is awkward at best.

I could not trust that a key I installed in the OpenPGP applet while my
computer was compromised was secure.

With the new locked-down NEO devices, I can't change out the applets, and the
bad guys would also have trouble doing so. As before, if I trusted Yubico not
to act maliciously, then the applet on the device I got in the mail would
match the firmware on github, and I could trust that it did what I thought it
did. Also, as before, I could not trust that a key I installed in the OpenPGP
applet while my computer was compromised was secure (because an attacker would
simply export it before uploading rather than swapping out the whole applet).

Enter the YubiKey 4. If I use one, I am completely at the mercy of Yubico and
their third-party audits. I cannot audit the code myself. Even if I trust
Yubico not to act maliciously, I have to take them entirely at their word that
they didn't accidentally mess up. And, of course, I cannot trust that a key I
installed in the OpenPGP applet while my computer was compromised is secure.

In other words, there's a big difference between source-available and source-
not-available, even if I can't personally verify that the source I think I'm
running is the source I'm running.

As an aside:

> There is an inverse relationship between making a chip open and achieving
> security certifications, such as Common Criteria. In order to achieve these
> higher levels of certifications, certain requirements are put on the final
> products and their use and available modes.

This may well be true, but, if so, it's a sad statement about Common Criteria
and their misguided rules. Publicly disclosing the source code of an EAL5+
device should not reduce its supposed security level.

With SGX, Intel had the chance to offer a widely available security token
(built in to every new CPU!) that anyone could freely program and use for
their own security purposes. They blew it when they created their "launch
control" policy, which essentially says that developers who don't sign lots of
contracts (which you can't even _read_ without an NDA AFAICT) can write an
applet but can't run it. The Linux community, at least, is pushing back _hard_
, and this just might change in the next generation of CPUs or maybe even
sooner. Fingers crossed.

This inspires a challenge to Yubico: give me a hardware token that runs
applets. Let the token attest to the hash of a running applet, but let it run
any applet whatsoever. If I want to verify that I'm running the bona fide
Yubico OpenPGP applet, I can check the hash myself. If I want to replace it, I
can, but then the hash will change. It'll be hard: you'll have to figure out a
real isolated execution environment. It's definitely doable, though.
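
A sketch of what that attestation could look like; everything here is
hypothetical, with Ed25519 and SHA-256 standing in for whatever a real token
would use:

    import hashlib, os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    
    device_key = Ed25519PrivateKey.generate()  # per-device key, certified by the vendor
    applet = b"...whatever applet is currently loaded..."
    
    # Token side: answer a fresh challenge with a signature over
    # (challenge || hash of the running applet).
    challenge = os.urandom(32)
    applet_hash = hashlib.sha256(applet).digest()
    attestation = device_key.sign(challenge + applet_hash)
    
    # Verifier side: check the signature, then compare the attested hash against
    # the hash of the build you expect. Replacing the applet changes the hash,
    # so substitution is visible even though flashing stays possible.
    device_key.public_key().verify(attestation, challenge + applet_hash)
    expected = hashlib.sha256(applet).digest()  # e.g., from your reproducible build
    assert applet_hash == expected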

~~~
ctz
> With SGX, Intel had the chance to offer a widely available security token
> (built in to every new CPU!) that anyone could freely program and use for
> their own security purposes. They blew it when they created their "launch
> control" policy

Now rescinded.

~~~
stijnhoop
Could you provide a link to this news?

~~~
amluto
The Intel SDM, Volume 3, version 058 has a new set of MSRs called
IA32_SGXLEPUBKEYHASH along with a new feature control bit for them. The
intended policy is not specified anywhere that I can see, nor can I find any
PR announcement or whitepaper. I also don't know what CPU generation will
support that feature.

------
franciscop
Summary:

"If you have to pick only one, is it more important to have the source code
available for review or to have a product that includes serious
countermeasures for attacks against the integrity of your keys?"

------
bb88
Simply put, all things being equal, the device will be reverse engineered and
exploited sooner by well-funded governments than by a hacker collective.

Even worse, you won't know when the device becomes obsolete. So you might be
buying an insecure solution from the start.

------
mindslight
A lot of handwaving, which may throw off those who haven't pondered the design
constraints of hardened hardware. But alas, it essentially boils down to the
same reason every productized solution goes closed: it's the expedient, lazy
option. Age-old antisecure solipsism.

