> Further hostility against the company or our users will
> not be tolerated in this forum, and will be met with
They're looking for a profit, sure, but they're blessed to be a hardware company. It's not like I can just clone their repo and not need to buy their product.
 - https://github.com/Yubico/ykneo-openpgp/issues/2#issuecommen...
Just look at bug trackers for large projects like Chrome. Any time there's a bug report or feature request that attracts lots of attention, without heavy moderation, the technical content is quickly drowned out by me-too-ing and angry rants.
> Everyone that does not have shit for brains knows that security through obscurity doesn't work .... it should not take to long for someone to exploit the dumb asses that buy and use their allegedly "secure" closed source product.
What I thought was "odd" was reacting to community feedback (that open-source involvement was so prized) by closing the issue (and therefore the discussion) without explaining why the decision was made.
Perhaps more concerning is that it took people rooting through the code and issues to discover that this was the case.
Many would say that security companies have something of a duty to be open-source; I think they at least ought to announce that they've deprecated the open-source versions of their code!
Why is 'it's on the internet and they did something I don't like, so I can be an ass' so accepted, no matter what the circumstances are?
Seems like a very reasonable and measured reaction. It's not like they're banning all discussion of the topic, just trying to keep civility.
In the end, it's not difficult to burn an open-source OpenPGP applet onto your own card. But there are two problems:
1. Bulk sales. If you want to do all of this yourself, and you've found an ideal chip (recent NXP SmartMX2 cards have all the fancy stuff you want), almost every reseller only allows bulk purchases, say 100 pcs minimum.
2. Proprietary software. For NXP cards, you need proprietary software to initialize/unlock a card before you can use GlobalPlatform tools to flash your own applets. A reseller told me this can be worked around by sending raw hex code with a Transport Key, but I'm not sure about it.
For the sake of the discussion, I will pretend he is qualified to speak about security.
That's assuming you consider Steve Gibson to know anything at all about security.
Steve Gibson is a talker, even if he doesn't know what he's talking about. He will say whatever he wants and issue corrections later when he's been proven wrong. It's not a cheap shot. It's well documented, and he's been called out on it for years, but he's still calling himself a security expert while simultaneously spewing bullshit, and has been for 15 years now.
And "not current" in security means "useless", or sometimes "actively harmful". There is no way to call yourself a security expert if you're out of date.
Any cool project you're aware of? :)
Here are some fully open Yubikey alternatives.
There seems to be a big plus in using a board made by the author of Gnuk, who is also the person who writes GnuPG's smartcard code.
For too many applications, yubikey is still king
And if I'm reading the linked GitHub issue correctly, this is about a specific plugin that runs in a sandbox on the YubiKey NEO, where the main codebase of the NEO is still proprietary?
I don't understand the advantage of it being open-source then, at least as far as security goes. (For user freedoms in practice, maybe.) What guarantee do you have that the code on the device matches the code on GitHub, or that the code on GitHub isn't subverted by other code on the device?
No, it's the code for the new YubiKey 4, which has been closed off.
> What guarantee do you have that the code on the device matches the code on GitHub
Ideally, you can build and flash the firmware yourself.
yk4 firmware was never released; that particular issue is about the NEO's OpenPGP applet.
>Ideally, you can build and flash the firmware yourself.
You couldn't do that with the NEO (except for early dev versions), as the GlobalPlatform management keys are unknown, which makes it impossible to delete/upload applets. I actually tried to get a dev version, but they redirected me to NXP, who never answered.
What you could do, however, is read the bits off of the firmware and make sure they're identical to the bits you make from source. Reproducible builds, and all that.
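In the simplest case, that comparison is just a byte-for-byte (or hash) check between the image you read off the device and the one your reproducible build produced. A minimal sketch (the file contents below are stand-ins, not real firmware):

```python
import hashlib
import os
import tempfile

def sha256_of(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with two stand-in "firmware images"; in practice these would be
# the bits dumped from the device and the output of your own build.
tmp = tempfile.mkdtemp()
dumped = os.path.join(tmp, "firmware-dumped.bin")
built = os.path.join(tmp, "firmware-built.bin")
for p in (dumped, built):
    with open(p, "wb") as f:
        f.write(b"\x7fFIRMWARE" * 64)

print("match" if sha256_of(dumped) == sha256_of(built) else "MISMATCH")
```

Of course this only works if the build is actually reproducible (same toolchain, no embedded timestamps), which is exactly what the reproducible-builds effort is about.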
Could you have flashed the proprietary part of the NEO codebase yourself, or just the plugin?
Is there a central clearinghouse for security audits of hardware / software? This is something the FOSS community can do much better than msft or even open source promoters like fb/goog, but not if the results are distributed on the experts' blogs and tumblrs.
Edit: Found it. https://stallman.org/stallman-computing.html
"As for microwave ovens and other appliances, if updating software is not a normal part of use of the device, then it is not a computer. In that case, I think the user need not take cognizance of whether the device contains a processor and software, or is built some other way. However, if it has an "update firmware" button, that means installing different software is a normal part of use, so it is a computer."
Straight from the big GNU's mouth.
I also think open hardware would be pretty neat, but software people who get really upset about layers they can't verify implemented on top of layers they can't verify sound pretty silly. Either you verify from the bottom or you can't verify at all.
Bugs and vulnerabilities are discovered every day in the layers that can be independently verified. Your argument extends to every single piece of software everywhere that runs on Intel and AMD microprocessors, for instance. It's foolish to say that an OpenSSL instance shouldn't be examined because the network interface it communicates over uses a proprietary firmware blob, or runs on Windows, for instance.
I agree that the freedom to verify the behavior of (what amounts to) firmware is not one of the Four Freedoms, but there is a non-zero value in being able to find bugs and help the manufacturer improve the product, as well as being able to use that information to inform a purchasing decision. Especially for something which could potentially be considered to provide organizational security.
I see your point that we do need to implicitly trust the company at some level, but what's upsetting is that we want to encourage the widest possible audience to be able to "look in", and with the closed-source change they've stepped backwards by decreasing the number of people who can casually [or pointedly] audit their devices and software.
It is more closed to me now.
Honestly I thought they were a company that cared more about providing the security we need rather than making a profit off intellectual property. I thought they understood the niche community they were providing for.
I can't [implicitly] trust this - not after having had better in the past.
So 'curl ... | sudo -' is just fine now because we can't verify from the bottom, so there's no sense in verifying what we're piping to sudo?
You are kidding, right? Right?
Because, in principle, what you have said amounts to, "Open-source doesn't matter. Closed-source is just as good. Security by obscurity is valid and sensible." Or by real-world analogy, "I'm not a locksmith, so there's no point in having locks on my doors."
So, please tell me you are kidding...
I do agree that we need open hardware, but in practice that usually isn't available.
> there is no need to reject hardware with nonfree designs on principle.
That's not to say that there isn't value in open hardware, and certainly it's not a bad idea to advocate for open designs. It's just not part of the FSF's mission.
And it sounds like you can't update the code, so you don't have a way of verifying what the actual device runs. So people complaining about trust here are somewhat misguided when they say they don't want to trust Yubico. Open-sourcing would only help find non-malicious bugs. Since it sounds non-modular, there's possibly all sorts of stuff mixed in, making it hard to audit, so they decided it wasn't worth it.
Dunno if that's right but that's what it sounds like.
On the other hand, keeping the code closed does garner a lot of distrust.
1. This isn't exactly true; Yubico as a whole might be trustworthy but that doesn't mean individuals within the company can't slip in something that looks benign but breaks security, and slips past the other developers. Such vulnerabilities are documented even in open-source code.
2. Even non-malicious bugs can have security implications.
The original NEO let you choose the key, write to the NEO, etc.
Thus they did not "disable it because of hardware"; it is a choice of theirs, one they had already made a year or two ago anyway.
Edit: They seem to have a pretty even mix of open-source and closed-source enterprise partnerships. https://www.yubico.com/support/partners/
There is one, big, obvious government-shaped reason.
I think this is a lazy broken heuristic that ends up labeling all social criticism and all allegations (even if credible) as "crazy." It's very Soviet-- you are mentally ill if you disagree with the government.
The problem is that we have hard documentation that governments (including but not limited to the USA) have run actual named and funded programs and efforts with the explicit intent of sabotaging crypto available in the public market. It's not even a "theory." It's established historical fact.
Believing the queen of England is a shape shifting reptile or that we didn't go to the Moon is a woo-woo conspiracy theory. Believing in things with a hard paper trail is not, nor is entertaining the possibility that governments might be doing things that we know for a fact they have done in the past.
It's not only a Soviet phenomenon; American politics has utilized a "paranoid style" for a long time.
> sabotaging crypto
It seems like a lot of people are pretending that BULLRUN doesn't exist. Nobody wants to believe that a coworker might be a collaborator; that kind of thinking can easily erode trust and create paranoia even when it isn't warranted. Unfortunately, the program exists, so it's foolish to ignore the probability that it is still working to weaken crypto. As PHK explains in "Operation Orchestra", encouraging weak crypto is much cheaper than breaking real crypto.
Nefarious: government is, I contend, only that group of criminals we have collectively decided we would be better to regulate and pay off. I'm not saying this is a bad thing, we certainly need some regulation and law enforcement, etc.
Diabolically clever: the word means 'characteristic of the devil' - government is responsible for torturing people, killing people in pointless wars, etc.
Cartoonishly inept: for sure! You only have to go outside, pick up a newspaper, or browse a news website to see how incompetent government can be.
But, we tend to speak of 'government' as some cohesive whole, which it most certainly is not. Is any one branch or agency of government all three of those things at all time? I don't think so. Some parts of government do a fine job of administering their responsibilities, some of the time. Probably. Maybe.
> Cartoonishly inept
The simplest scenario isn't that YubiKey 4 went closed source to support a government backdoor. It's that it's entirely for business reasons as they've said. And then after a few years, a few more layers of middle management, a few interesting users, and a little more TLA focus, Yubikey 6 quietly gets subverted.
Tangentially - I was pretty close to buying a Yubikey Neo for its form factor, but it didn't seem like I could modify/reload the OpenPGP applet, and documentation was scant as to how configurable it was. I really want the thing to operate as semi trusted hardware - passphrase, etc. Smartcard tech is nifty, but it seems like a non-hardened chip would be more worthwhile for the ability to iterate features/UI.
tptacek's positions hold outsized influence here, so even after his public concession on Dual_EC_DRBG, it's still very unpopular to posit that nation states would ever backdoor products.
No reason to complain about downvotes, per the rules.
Even then you have to extract the firmware from the device then try to match it with your compiled binary. Seems like you might as well just reverse the binary and look for backdoors directly?
1. Send message to sign over the wire
2. Display message to sign on the device's screen
3. Send a message to continue or reject over the wire
4. Device displays scrambled pinpad
5. User enters PIN on the compromised computer, using a blinded pinpad (like the Trezor)
6. The blinded PIN is sent over the wire to the device
7. The device verifies the PIN, and if correct, signs the message displayed in step 2
8. The signed message is sent over the wire to the compromised device
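Steps 4-7 (the scrambled pinpad) can be simulated in a few lines. This is only a toy sketch of the idea, assuming a 10-digit pad; in reality the "blinding" happens in the user's head by reading the layout off the device's screen:

```python
import secrets

def scrambled_layout():
    """Device side: a random permutation of the digits 0-9.
    layout[position] = digit shown at that position on the device screen."""
    digits = list("0123456789")
    # Fisher-Yates shuffle driven by a CSPRNG
    for i in range(len(digits) - 1, 0, -1):
        j = secrets.randbelow(i + 1)
        digits[i], digits[j] = digits[j], digits[i]
    return digits

def blind_pin(pin, layout):
    """User/host side: the user reads the device screen and types the
    *positions* of their PIN digits, so the host never sees the PIN itself."""
    return [layout.index(d) for d in pin]

def unblind_pin(positions, layout):
    """Device side: map the received positions back to digits for verification."""
    return "".join(layout[p] for p in positions)

layout = scrambled_layout()           # shown only on the device's screen
blinded = blind_pin("4927", layout)   # what the compromised host sees and sends
assert unblind_pin(blinded, layout) == "4927"
```

Since the layout changes per transaction and never leaves the device, a keylogger on the host only captures positions that are meaningless for the next attempt.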
But you have the bulk problem... it's a tough industrial design problem.
Tokens are fine to ensure that credentials cannot be easily compromised and to provide 2FA.
PED security is really critical when the goal is to duplicate the token (e.g. credit cards). If your machine is compromised, then any data protected by the token can also be compromised as soon as you access it.
When in doubt, wipe. While it's nice to have a robust security stance, in this case I don't think it would matter much.
Do you really expect a leading company of security hardware to give the keys of its kingdom away (pun intended)?
As far as I can tell, if you got one of these in the mail, there'd be no meaningful way you could verify that it hadn't been tampered with anyway. So you'd just have to make a leap of faith, and assume it was "secure"? If you were prepared to do that, then fine use the yubikeys. If not, perhaps you should take a deeper look at your usb mouse and keyboard too. Did you verify that your keyboard isn't running some code that might compromise your security?
That said, your point is largely on the money: we're taking it on great faith that our computing devices are secure. But at the same time, I'd put more stock in a device that handles my super-secret key and attempts to make reading and tampering with it impossible/unfeasible from the devices I plug it into.
Thus, it's perfectly reasonable to care more about your yubikey than your mouse, from a security perspective.
You should read all about "BadUSB". What you imagine is not the way that the world, in particular USB, actually works.
With that said, I wouldn't be surprised if you were right either, but that's going to need a different Google search. Thanks for the link nonetheless.
I don't want it to read arbitrary files from my system and then call home, and it's reasonable to assume and desire that the computer does not allow that to happen.
Yes, I prefer open and Free systems. I don't like running on Intel chips, because they come with a back door monitoring chip that's hard to keep track of, especially on systems with an integrated network card. Yes, it's nice to have the PCB, hw design and code of a device whose purpose it is to "do crypto".
But I still don't see how things changed wrt. yubikey here. They have always been upfront about selling magic crypto beans so to speak: either you trust them, or you don't. There's no real transparency. There's not even (AFAIK) an easy way to know you have an actual yubikey device, and not a device that just looks like a yubikey - but in fact contains different, or modified hw that does a little more than you would like. And so is the case with keyboards on which you enter your secret communication (as well as passwords and pass-phrases).
This isn't new, it's been yubikey's business model to be a company you trust to "do crypto". I still think it is much more likely that a yubikey isn't compromised than the rest of your system. And I think it does buy you some security. I'd even go so far as to say I probably trust a small proprietary system by experts, more than the behemoth that's the jvm/jdk/javacard.
I'll also note that it is probably easier to spot a yubikey reading arbitrary files from my system and calling home than it is to spot a yubikey answering to a secret 40-digit number and disclosing all session keys it has generated up to that point, along with any private keys stored on the system. Which is the kind of thing you'd probably not want it to do when handled by the Egyptian secret police, or whomever it is you've pissed off.
Greg K-H, as he's known, is the #2 Linux kernel developer and release manager of stable. I disagree with him on some issues (ironically enough, the requirement for kernel dev submissions to have real-name identities attached, and also his attitudes toward choice generally and systemd specifically), but his information here warrants attention.
Of course, you've just taken a space alien cat's word for this, but you're welcome to independently verify the information.