D-Link publishes code-signing private keys by mistake (translate.google.com)
243 points by svenfaw on Sept 17, 2015 | 121 comments



Here's my translation:

D-Link has accidentally published private keys for certificates which it uses to sign its software. It was possible to extract the keys from the manufacturer's open source firmware packages. That made it possible for criminals to abuse the certificates.

[short explanation of how the certs work for n00bs]

The mistake was found by bartvbl, who informed the reporter of the problem. He had bought a D-Link DCS-5020L security camera and wanted to download the firmware. D-Link makes the source of lots of its firmware available under a GPL license. "After looking through the files it turned out that they contained the private keys which are used to sign the code," reported bartvbl. "It gets worse - some of the batch files contained the commands and passphrases that were needed [to sign the software]".
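[Illustration, mine, not from the article: a signing script of the kind described would look something like the following, with the passphrase sitting right in the file. osslsigncode here is a stand-in for whatever Windows signing tool D-Link actually used, and every name and value is invented.]

    # hypothetical reconstruction of a leaked signing script; the .pfx name,
    # passphrase, and binaries are placeholders, not the real leaked values
    osslsigncode sign -pkcs12 dlink-codesign.pfx -pass "s3cret-passphrase" \
        -t http://timestamp.digicert.com \
        -in SetupWizard.exe -out SetupWizard-signed.exe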

The user was able to confirm that the key could be used, by signing a file that wasn't from D-Link. The certificate expired at the start of September, which means that the trick no longer works. Even now that the expiry date has passed, software which was signed earlier will still be seen as valid; Windows only shows a certificate error after the certificates have been withdrawn. This withdrawal has already happened, so this [bug/failure] can no longer be abused.
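[Aside, mine: the check Windows performs can be approximated on Linux with the osslsigncode tool, assuming its usual flags; the file name is just a placeholder.]

    # verify the Authenticode signature on a PE binary (sketch)
    osslsigncode verify -in dlink-utility.exe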

Software security specialists Fox-IT confirm the findings. Yonathan Klijnsma, a researcher from the company: "The certificate was indeed in one of the firmware packages, version 1.00b03, the source of which was released on 27-feb-2015. The cert was thus published before it expired, which is a big error". He [, Yonathan,] found four other certificates in the same folder.

D-Link have in the meantime released new versions of the firmware which do not contain the certificates. In a statement the company told us that they regularly update the firmware in order to 'conform to the latest security and quality standards'. They emphasized that there is no question of [malicious] premeditation. "D-Link [does everything to prevent development of backdoors, bla bla boilerplate PR speak, etc etc]". D-Link promised Tweakers that they will release a new firmware next week which contains further fixes for the security problems.


TLDR: D-Link accidentally included their certs, plus batch files containing the passphrases, in a beta package of their firmware. The problem has been fixed, so it's no longer an issue.


Has the problem been fixed? I would think the cat's out of the bag; the private keys are already out in the wild (I assume), so all D-Link hardware is now vulnerable to exploitation.


> "It turned out what to look through the files that were in private keys to sign with code", reports bartvbl, "In fact, in some batch files were the commands and pass phrases that were needed."

A company finally holds up their end of the GPL by including all source needed to create a working build, and it's called a mistake!?

For every device requiring images to be signed with a specific key, this is exactly what the manufacturer should be forced to release!


This is absolutely a mistake. The whole point of requiring the firmware to be signed with a specific private key is to prevent third parties from installing their own custom firmware (e.g. to prevent malicious actors from installing malware onto the device). If you want to let everyone install their own build, then just don't require the code signature (or, alternatively, create a program where third parties can request a certificate signed with your master, so they can then sign their own binaries, similar to how iOS development works).

> For every device requiring images to be signed with a specific key, this is exactly what the manufacturer should be forced to release!

What do you mean by "should" here? If the source is covered by GPLv3, then yeah, if they use code-signing they need to have some way to let you get around it. That's the point of the anti-tivoization clause. But if the software is distributed under GPLv2, or any other license, then there's no anti-tivoization restriction, and therefore no "should" about it.


> The whole point of requiring the firmware to be signed with a specific private key is to prevent third parties from installing their own custom firmware

Yes, which is blatantly anti-GPL2. I really wish any Linux copyright holder would step up to the plate and force the argument that since code+signature is the functional unit, adding the signature makes a derivative work and its source must be released in the preferred form for modification. The entire point of the GPL2 is that a developer who creates code and receives a modified version back should be able to modify that instance of code - tivoization is entering into the license in bad faith.

Of course going GPL3 would make for a more solid foundation for doing so, but clarifying this subject directly conflicts with the desires of the proprietary sponsors - Google etc.


The GPL is rarely litigated. What you are talking about is not blatantly anti-gplv2

IAAL (and an open source licensing lawyer at that, for many years). I'm also a contributor to FSF projects :)

Even I would tell you the GPLv2 almost certainly does not cover code signing.

It's actually very explicit about what needs to be there - that's the whole problem with GPLv2: "For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable."

That's it. It very clearly and explicitly says what you must include, and installation keys are not there.

It even says "scripts used to control installation", not "scripts involved in installation" or anything like that.


> What you are talking about is not blatantly anti-gplv2

Someone who buys a router running locked down Linux lacks two out of the four essential freedoms, so yes tivoization is blatantly anti-gpl2. I would say this is a popular (edit: non-legal) opinion, especially given the upvotes I have received.

But the legal interpretation may diverge from the spirit and purpose of the license, so I would love to hear the opinions of an experienced lawyer...

I don't see any real difference between "scripts used to control installation" and "scripts involved in installation", "installation scripts", "scripts used to install" etc, unless the specific wording has been used to create a huge loophole. But as you say, there is not much case law so I don't think this is so.

What exactly would you say that "scripts used to control ... installation" includes?

A script that says "cp -f a.out /bin/" most certainly qualifies, no?

And how about "st-flash write foo.bin 0x8000000" ?

And then "st-flash write foo.bin `cat BASE-ADDR.txt`" ? I would think BASE-ADDR.txt is still considered part of the "scripts" in the legal sense?

Surely you're not arguing that these scripts "perform" the installation rather than "control" it ?


"I would say this is a popular opinion"

Completely irrelevant to the legality question.

The fact that Eben Moglen and the FSF crafted the GPLv3 specifically to avoid future "tivoization" should serve as a pretty good indication that even the FSF's lawyers don't believe there is a (court-case winnable) legal infraction here when it comes to GPLv2 code, though everyone would agree the spirit and the purpose of the license often isn't being respected (which is, again, irrelevant to the actual legality when it comes to copyright law).


The addition of explicit anti-tivoization language to v3 does not imply that they think v2 couldn't have that effect, just that they thought it would be nice to have a stronger case. The GPLv3 makes it an almost-guaranteed win (to the extent that such a thing exists), but that doesn't mean trying with the GPLv2 is a guaranteed loss.

And this conversation is clearly about more than just which legal interpretations hold force. It's about the spirit of the license and how closely it aligns (or not) with the interests of its more prominent users who have the standing and resources to at least try to fight against uses that go against the spirit of the license.


"The addition of explicit anti-tivoization language to v3 does not imply that they think v2 couldn't have that effect, just that they thought it would be nice to have a stronger case."

While true, having been around then and knowing what went on in the drafting committees, it's definitely the case that they were advised they didn't have a winnable case with the GPLv2 language ;)

If they had thought they did, they also would likely have gone after TiVo :)


> But the legal interpretation may diverge from the spirit and purpose of the license, so I would love to hear the opinions of a practicing lawyer...

What are you talking about? You just heard an opinion from a practicing lawyer, and an IP lawyer at that.

> I would say this is a popular opinion, especially given the upvotes I have received.

There's no opinion here -- this is a matter of law, which you clearly have no expertise in.


> There's no opinion here -- this is a matter of law

No, it is a matter of opinion on whether a certain behavior is in line with the spirit of the GPL2 or not. The purpose of GPL2 is to support the four essential freedoms. If it has bugs that fail to prevent certain freedom-destroying behavior, that does not preclude those behaviors from being considered anti-GPL2. I am not saying this as a legal opinion, which is why I go on to contrast.

> You just heard an opinion from a practicing lawyer, and an IP lawyer at that.

Yes, I have read many of DannyBee's other comments. This was meant as an invitation saying I really looked forward to him replying to my specific questions. If it came across wrong then I am sorry. (I edited it to say 'experienced' because I can't tell if he is still professionally practicing or has completely moved to the technical side.)


"I don't see any real difference between "scripts used to control installation" and "scripts involved in installation", "installation scripts", "scripts used to install" etc, unless the specific wording has been used to create a huge loophole. But as you say, there is not much case law so I don't think this is so. "

So, then we have a problem. Courts assume that drafters who write lists mean something by each member of the list. So they are going to read a definition that differs between "control" and "install" when they see "plus the scripts used to control compilation and installation of the executable", i.e. they will read a definition that means controlling compilation and installation are different things.

As for the rest, you want to read a very broad definition of script into it. This is going to go very badly for you.

Scripts are not the same as their data files or external requirements :) The script is the script. The combined script + data files + whatever is clearly a combined work. It's not even, in general, colloquially referred to as a script; it's usually referred to as a package or a distribution or any of a hundred other things.

"What exactly would you say that "scripts used to control ... installation" includes? "

Literally any files that are scripts that are used in the installation. Data files they use, unless covered otherwise, would not be covered.

"And then "st-flash write foo.bin `cat BASE-ADDR.txt`" ? I would think BASE-ADDR.txt is still considered part of the "scripts" in the legal sense? "

Most lawyers take a pretty strict definition of script. The script part is the script. Again, anything else would not need to be shipped unless otherwise covered by some other part of the requirement. In most cases, it happens to be.

For example, it's likely BASE-ADDR.txt is part of the executable's source, or generated by the build system, or whatever else.

Now, pragmatically, in order to try to keep everyone happy, some companies will ship all the non-proprietary utilities/data files necessary to run the script (though not necessarily as source), but that is a nicety, not required by GPLv2.

Your argument also starts to extend very broadly: what if my script requires a special fuse plugin for my company's magic distributed file system that I store all this stuff on, and becomes worthless without it? Do you believe I have to give you the plugin? The source to the plugin? etc.


Thank you for the reply.

I threw out those various phrasings not because I'd expect they'd be treated identically, but because I was asking if the specific phrasing used happened to exclude some kind of major category. I really don't know where my thinking diverges from the accepted legal consensus, and wasn't sure if it hinged upon that exact phrasing somehow.

> As for the rest, you want to read a very broad definition of script into it. ... Data files they [scripts] use, unless covered otherwise, would not be covered. ... Most lawyers take a pretty strict definition of script

Concretely, per your narrow definition of "script" (which I am understanding as anything a programmer would call "a script"; correct me if you have something more formal), a makefile is not a script. Neither is some C with a sequence of system("...") calls, nor something piped through rot13 to sh, nor a sequence of commands in a free-form text file that are copied and pasted into an interactive shell.

As a result, the word "script" is essentially meaningless in the legal context, due to the technical transformations that can be done on any script that makes it not a "script" - computer science is a playground of made up words and endless reinvention.

I had the impression that courts looked at general concepts and frowned upon attempts to circumvent the intent of contracts; is this not so?

> what if my script requires a special fuse plugin for my company's magic distributed file system that I store all this stuff on, and becomes worthless without it. Do you believe I have to give you the plugin?

Doesn't this strongly come down to intent? If the company just happens to use some internal filesystem that differs radically from POSIX, and their engineers really do work on the program using it (rather than some obfuscation game), then I don't see them being compelled to rework the source tree into POSIX conventions. But if the file system is adopted and integrated specifically so the company can avoid its obligations under the license, isn't it likely that a court would call them out on it?

Stepping back, doesn't copyright law by default essentially mean most things added to a build are derivative works, and also that the final executable itself is a derivative work? The GPL2 has explicit exemptions for operating system components and "mere aggregation", plus further exemptions when combining with non-GPL works (e.g. Classpath and OpenSSL). Otherwise, these uses would run afoul of copyright. Presumably the GPL2 says "scripts" as distinct from data, which is already covered since it gets incorporated into the final executable work.

I would think if you played a similar game with an independently-developed binary code blob - prepending it to compiled GPL source - you would be shot down pretty quickly since the entire executable is one derivative work. Don't binary kernel drivers only legally work as modules precisely because the resulting in-memory derivative is never distributed?

How does the scenario of incorporating such a machine code blob differ from creating a signature blob and inserting it into the executable's header?


> It even says 'scripts used to control installation", not "scripts involved in installation" or anything like that.

To be fair, one of the definitions of "control" is to "determine the behavior or supervise the running of."

So I think a fair case could be made that the software signing keys determine the behavior of the installation when their signatures decide whether or not a binary will be accepted by the hardware. You can say that they don't do much actual work, sure, but neither do real-world managers who merely sign off on projects and nobody disputes whether or not they "control" things.

And I know I'm being really pedantic about the word meanings here, but I'm also pretty sure that arguments of the same form are regularly entertained by courts.


> Yes, which is blatantly anti-GPL2.

No it's not. It's anti-GPLv3. GPLv2 absolutely allows tivoization. Stallman thinks this is bad, but not everybody else agrees (I know Linus Torvalds thinks tivoization is fine in the context of GPLv2, and I believe the FSF agrees).

> I really wish any Linux copyright holder […]

Your argument here is not a legal one. Any Linux copyright holder that did this would lose horribly. And in particular, Linus Torvalds explicitly stated that he does not support relicensing the Linux kernel under GPLv3 because of this (even if it were possible, which it isn't because of the very large number of copyright holders). Source: https://en.wikipedia.org/wiki/Tivoization#GPLv3

> The entire point of the GPL2 is that a developer who creates code and receives a modified version back should be able to modify that instance of code

And they can! But the GPLv2 does not require that the modified code must be able to be installed back on the original hardware by the person doing the modification.

> tivoization is entering into the license in bad faith

Not by any legal definition of "bad faith", and even with the informal definition, a lot of people would disagree with you.


> the GPLv2 does not require that the modified code must be able to be installed back on the original hardware by the person doing the modification

Does not explicitly. From GPL2:

> Accompany it [executable code] with the complete corresponding machine-readable source code ... The source code for a work means the preferred form of the work for making modifications to it

So, you're telling me that the preferred form for making modifications to the source code of a router does not include the scripts which install onto the router to test those modifications?

> Not by any legal definition of "bad faith"

The GPL2 and the FSF are very explicit on what the GPL2 is supposed to engender. Accepting the license and trying to circumvent it on a possible technicality (assertion that a signed blob isn't a derivative work, when the only reason anyone is interested in the source code is to run it) looks pretty bad-faith to me.

IANAMOAG (I am not a member of a guild, and you probably aren't either). Even if we were, we could still disagree until it was actually litigated. I see no reason to prematurely concede that Linux devices get to be non-Free.


"So, you're telling me that the preferred form for making modifications to the source code of a router does not include the scripts which install onto the router to test those modifications? "

Yes. 100% yes. It is very explicit in what must be included and you are explicitly ignoring what it says about scripts:

" For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. "

Scripts used to control compilation and installation != signing keys. It means scripts used to control compilation and installation.

You may want it to include something more, but it doesn't. That's why GPLv3 was written.


I have to imagine that the scripts controlling installation must sign the executable. (Assumption: unsigned executables can't be installed.) They can only do that by incorporating the signing keys. They might incorporate the keys by including the keys literally within the script proper, or by reading a file that contains them, but why are those situations different? Suppose the installation script defined a lisp interpreter and then read a data file which, when interpreted by the lisp interpreter defined in the script, installs the executable. Why isn't that data file also part of "the scripts used to control installation of the executable"?


"I have to imagine that the scripts controlling installation must sign the executable. "

This is a pretty big assumption, and in fact, is not actually true in a lot of cases :)

Let's assume it is here.

"Why isn't that data file also part of "the scripts used to control installation of the executable"?"

Because the data file literally isn't a script used to control installation of the executable? It's a data file.

At best, btw, you'll be able to argue that it would have to include the signing key, but not any password for it. So you are still pretty screwed :)

Your argument also swallows the definition. Where does it end? Do I have to include the interpreter for the script, and provide source? Why not, since it's required for the script to function? (Note, this would not be covered by the next exception in a lot of cases.)

The installation script also almost certainly refers to the executable, but that's called out explicitly as something I have to give you source to. Why bother, if it's included in the "scripts to control compilation/installation" part? (In general, courts are going to read definitions into things that make parts of enumerated lists differentiable from each other.) I would say you can go down this path, but you won't get a court to come with you.

Scripts means scripts. Data files means data files.

If the key is literally part of the script, yes, you'd have to ship that. Not the password, mind you.

If they wanted something else, they should have written something else. Which they did, in gplv3 ;)


> Scripts used to control compilation and installation != signing keys.

You're referring to "scripts used to control compilation", but how are signing keys not part of the source code? GPLv2 uses a specific definition of source code, "the preferred form of the work for making modifications to it." Don't you have to be arguing that a form not allowing modifications to be used or tested is the "preferred form of the work for making modifications"?


"" Don't you have to be arguing that a form not allowing modifications to be used or tested is the "preferred form of the work for making modifications"?"

You are trying to take parts out and remove context from them.

I don't have to argue at all about the preferred form, because it very explicitly says what that is: "The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable."

The first two sentences go together. It defines, explicitly, what the preferred form of complete source code for an executable work includes.

So that isn't going to include signing keys unless you want to argue that one of the explicitly enumerated parts includes signing keys.

For years, people have argued back and forth about whether the scripts include data files. The general consensus is "no", because otherwise, it has no differentiable meaning from the other parts.


Do you check your private GitHub keys into VCS? Your wide definition of "scripts used to control compilation" would seem to include them! My preferred form of making modifications to my software includes uploading them to my GitHub account for code reviews, among other things.

I use my GitHub private keys way more often than final code signing keys - be those keys used for as "nefarious" a use as tivoization lock-in, or as benign a use as proving authorship so Windows can tell my executables apart from some asshole's malware better.

During standard development I'm either using temporary, machine-local certificates, or using unsigned code. That may mean I need a devkit, or an emulator, or a dongle, or whatever else is required to iterate without the final "publishing" steps - a well designed system shouldn't require the final publishing certs just to iterate. Because for security's sake, you'd consider going to such lengths as keeping those code signing certs on a tightly controlled, air-gapped machine, to reduce the chance of getting pwned and dealing with cert invalidation. Or worse.

Which makes it a very far cry from being part of your compilation phase in any capacity, never mind a script used to control it.
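To make "temporary, machine-local certificates" concrete, this is roughly what I mean - a throwaway self-signed cert generated with openssl (a sketch; the names are placeholders):

    # short-lived local dev signing cert; never leaves this machine
    openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
        -keyout dev-signing-key.pem -out dev-signing-cert.pem \
        -subj "/CN=local dev signing (not for release)"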


You haven't read what I wrote.

But the "preferred form of the work for making modifications" presumably involves each person using their own github keys (and/or repositories) etc. Anybody can create an account and use github. That is not true in the tivoization case because in that case the only keys that can be used are the original ones.


> You haven't read what I wrote.

Are you sure? Based on what? I've managed to quote you without the use of copy and paste in that previous post - I'm not sure how I accomplished that if I didn't read what you wrote - so please be more specific.

This also begs the question: Did you read what I wrote?

> But the "preferred form of the work for making modifications" presumably involves each person using their own github keys (and/or repositories) etc.

And the "preferred form of the work for making modifications" also presumably involves each person using their own signing keys (and/or hardware that may not require them) etc.

> Anybody can create an account and use github.

Anyone can buy their own hardware, and compile for it, and install on it. Just don't pick tivoized hardware.

> That is not true in the tivoization case because in that case the only keys that can be used are the original ones.

The source code is not tivoized - hardware is. So don't pick tivoized hardware.


> Are you sure? Based on what? I've managed to quote you without the use of copy and paste in that previous post - I'm not sure how I accomplished that if I didn't read what you wrote - so please be more specific.

You're still talking about "scripts used to control compilation" when my point was that it doesn't need to be that, if it falls under the definition of source code proper.

> And the "preferred form of the work for making modifications" also presumably involves each person using their own signing keys (and/or hardware that may not require them) etc.

Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware that does require signing keys and doesn't allow the user/owner to use their own keys. Then they need yours because you prevented them from using their own.

> The source code is not tivoized - hardware is. So don't pick tivoized hardware.

The hardware is being chosen by the person distributing it with GPL software.


> You're still talking about "scripts used to control compilation" when my point was that it doesn't need to be that, if it falls under the definition of source code proper.

Ahh. Expanding the definition of source code enough to include code signing keys would have also put them into the (sub)category of "scripts used to control compilation and installation" in my mind, making the distinction moot.

It's not moot in the context of the specific point you intended, of course, and which you've now provided, which I thank you for.

> Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware that does require signing keys and doesn't allow the user/owner to use their own keys.

Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware (my VCS server) that does require VCS keys and doesn't allow the user/owner to use their own keys.

> The hardware is being chosen by the person distributing it with GPL software.

...and by the person purchasing the product that has the GPL software in it, by choosing to buy that product.


> Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware (my VCS server) that does require VCS keys and doesn't allow the user/owner to use their own keys.

You distribute your software by conveying the VCS server hardware to the customer?

> ...and by the person purchasing the product that has the GPL software in it, by choosing to buy that product.

All GPL licensees have the rights the license grants them, not just the ones who buy or don't buy specific hardware. What good is the license at all if distributors can just say "I'm not doing that part, if you don't like it buy from somebody else"? And what if they all do that, or nobody else makes equivalent hardware?


In actual fact, on many (perhaps most) platforms for which code must be signed, the developers on those platforms don't have to bother with code signing to develop and test their own code; it's stubbed out in the development environment.

So even a reaching definition of "preferred form for modifications" doesn't hit signing keys, just like "preferred form" of a Rails app doesn't include your EC2 credentials.


"modification" applies specifically to the code as is installed and shipped.

Picture the scenario at the manufacturer where one developer is being tasked with creating a new feature using a consumer unit fresh off the line, for a business demo. What resources do they utilize in the course of doing this? This is the company's "preferred form" of the source tree for modifying an as-shipped device, and is what needs to be made public.

The company could supply a firmware update to overwrite the bootloader and stub out signature checking, if that is what makes sense. Or a well-specified description to use JTAG to do the same. Or perhaps they don't actually care about security and 'make dist' creates an installable image.

The thing is, all of these arguments ("preferred form for modification", the key being part of the installation script, code+signature being the executable work) seem like uphill battles when trying to show the manufacturer is violating them in an informal forum like HN.

But the intent and spirit of the license is straightforward and even spelled out in the preamble! "Our General Public Licenses are designed to make sure ... you can change the software". Fulfilling these goals is the entire point of the contract and the only consideration for receiving a license. So the manufacturer is actually the party that must work to show how what they're releasing actually fulfills the specific terms.


> In actual fact, on many (perhaps most) platforms for which code must be signed, the developers on those platforms don't have to bother with code signing to develop and test their own code; it's stubbed out in the development environment.

What is the relevance of that supposed to be? You can put any part of the development process on any number of different computers or distribute the work between different developers however you like. And the relevant "preferred form for modifications" is presumably the form the user uses to make modifications to the device delivered to the user in any event.

> So even a reaching definition of "preferred form for modifications" doesn't hit signing keys, just like "preferred form" of a Rails app doesn't include your EC2 credentials.

Why wouldn't it include your EC2 credentials (or other root access to the instance) if conveying an EC2 instance to the customer is how the customer receives the software from you? But now you're wading into the SaaS vs. software distribution ambiguity which has nothing to do with the case where you've bought a physical device with the software on it.


I'm not sure how this is premature. The reason the practice is called tivoization is because we went through all this years ago when Tivo did it. The FSF acknowledges that releasing the source but locking you out from installing your self-built version of it does comply with the GPL2, which is why there's a GPL3.

See https://www.gnu.org/licenses/gpl-faq.html#Tivoization


IMHO, poor wording in a FAQ by the FSF. The company considers themselves complying with the GPL2 by releasing a scrubbed tree. The FSF looks the other way, having limited resources.

I'm having a hard time digging it up, but I swear I have seen a video where RMS lays out the exact same theory I'm stating here.

It's not a terribly outlandish theory. I'm just getting a lot of flak for bucking the status quo from people who grew up with DRM kool aid and see Free software in terms of the corporations that lock it down.


So, I don't think people have 'drunk the DRM kool aid' so much as everyone has generally agreed for the last decade or so that GPLv2 doesn't prevent tivoization and GPLv3 does.

That everyone includes RMS, by the way. Here's a transcript[0] from 2006 where he's talking about it and says:

>For instance, the Tivo itself is the prototype of tivoisation. The Tivo contains a small GNU/Linux operating system, thus, several programs under the GNU GPL. And, as far as I know, the Tivo company does obey GPL version 2. They provide the users with source code and the users can then modify it and compile it and then install it in the Tivo. That's where the trouble begins because the Tivo will not run modified versions, the Tivo contains hardware designed to detect that the software has been changed and shuts down. So, regardless of the details of your modification, your modified version will not run in your Tivo.

So while you may be correct that it's never been litigated, it would seem that if the strongest software freedom advocates like RMS and the FSF thought that it did violate GPLv2, surely rather than telling people they should upgrade to V3 because it blocks tivoization [1], they would have just litigated against tivo and set the precedent.

[0] http://fsfe.org/campaigns/gplv3/tokyo-rms-transcript#tivoisa...

[1] https://www.gnu.org/licenses/rms-why-gplv3.html


I'm not one to change my mind just because "everybody" believes something.

RMS was likely wrong about at least one thing in that clip - I would think signature checking, especially an early incarnation of it, would be done by the bootloader (i.e. software/firmware) rather than a specific piece of hardware.

This could be minor (thinking of embedded firmware as hardware isn't wrong in the freedom sense), or that framing could have misled his thinking. He didn't use a specific term for the compiled source code, only "it". The GPL2 speaks in terms of "executable", but an unsigned or mis-signed binary is not actually executable. The "executable" would consist of the machine code and header, which would necessarily include a correct signature to be "executable".


It is executable - just not on the device it was supplied with. But you could quite legitimately run it on some other device or an emulator.

For comparison, say you had some GPL software which checked in with a dongle or license server, and refused to run if there was no license/dongle. Then you'd be within your rights to edit it to remove the license check.

But here, the device itself refuses to load the program because it isn't blessed. Consider another implementation: in the factory, the hash of the GPL firmware is burnt into a ROM, and the bootloader checks that before running the code. Clearly you can build a modified executable, and you still can't run it on the device. But crucially, there is no header that you're missing; the device just can't run that exe. Or suppose the modified executable is too big to fit in memory - are they obliged to come upgrade the device? The point is, the license covers the software, not the loading of that software.
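To spell out the ROM-hash variant (a conceptual sketch only - the paths and names are made up, and a real bootloader would do this in C before any OS exists):

    # pseudo-bootloader: compare flash contents against a factory-burnt hash
    expected=$(cat /rom/firmware.sha256)   # immutable, written at the factory
    actual=$(sha256sum /flash/firmware.bin | awk '{print $1}')
    [ "$actual" = "$expected" ] || { echo "hash mismatch, refusing to boot"; exit 1; }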

IANAL, while this may appear against the spirit of the license, it's hard to see how the license prohibits it. The existence of GPLv3 implies the FSF thinks so too.

To put it bluntly: an ought does not imply an is.


The license itself doesn't explicitly "prohibit" your roundabout contraption that's been designed solely to defeat the license. But this is exactly the type of sham that courts tend to see through.

Please find me any real-world device that actually has a symmetric checksum burnt into OTP or mask _ROM_ while the code itself resides in flash.


> So, you're telling me that the preferred form for making modifications to the source code of a router does not include the scripts which install onto the router to test those modifications?

No. Not really. GPLv2 requires that the modifier of source code release the tools to build/compile that source code into an executable. Whether the source code can be run directly (ex. Ruby, Python) or requires a compile/assemble/link process (ex. C, Java) is irrelevant. The GPLv2 also does not entitle or give anyone rights to any method to execute that recontributed and rebuilt code, for example if the code was ported to proprietary hardware or some closed-source ISA. However, IANAL and could very well be wrong, because the GPL is such an unwieldy beast of unknowns.


> So, you're telling me that the preferred form for making modifications to the source code of a router does not include the scripts which install onto the router to test those modifications?

In short: yes.

Being able to access and modify and redistribute source code isn't the same as having the ability to deploy it to a particular piece of hardware.

Likewise, if I open source a .NET project that doesn't mean it has to include Visual Studio even though VS is the preferred way to build it.


> Being able to access and modify and redistribute source code isn't the same as having the ability to deploy it to a particular piece of hardware

But the preferred form for modification includes being able to test your changes. Source without installation scripts is most likely not the preferred form - ie what do engineers at the company use?

> if I open source a .NET project that doesn't mean it has to include Visual Studio, which is the preferred way to build it

Yes, because having been designed in the era of expensive compilers, GPL2 explicitly covers this:

> However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable


Visual Studio is not a "major [component] ... of the operating system on which the executable runs". It's not any component, in fact; it's a completely separate application from the OS, and in no way bundled together.


You're arguing that a compiler is not a major operating system component when the license explicitly enumerates it as a major operating system component. Nobody says you have to get all your operating system components from the same disc.


The compiler is part of VS though, not part of Windows. You don't get VS if you have Windows, and you don't get Windows if you have VS. They're completely separate products. So if it isn't actually included with the OS, and you need to acquire (either through a free or paid license) a completely different product to get it, then I don't see how it could be considered a major component of the OS, regardless of what the license text says. It's not a matter of it being on another disk; it's a matter of it being a completely different product in the Microsoft world.


It doesn't say anything about products. It doesn't say that major components can't be optional components. It does say that a compiler is a major component of the operating system.

You're trying to draw a box around "Windows" and define operating system as everything in that box. But minesweeper is not an operating system component, it's just something that comes with Windows. A hypervisor is an operating system component, even though not all versions of Windows have one.

So is a compiler.

I imagine there exist developer-oriented PC OEMs that offer PCs with Windows and Visual Studio already installed. So then it is included with the OS by the OEM. The fact that that would change the outcome in your interpretation implies that there is something wrong with it.


The Windows SDK comes with the C++ and C# compilers, plus all the headers and libraries needed to build Windows software. It doesn't come with the fancy text editor or tools in Visual Studio, but it definitely comes with the compilers you need.


IIRC the FSF's position on this is that the original author has to include an explicit GPL exemption for things like MFC, VCL or even OpenSSL, if the program depends on them, in order for anyone to be legally able to distribute binaries. I would assume that something similar comes into play when building the program requires a proprietary compiler that is not a "component of the operating system" (although in that case it might very well be implied).

That in practice nobody except FSF and Debian cares is another thing.


If you consider signed blobs derivative works (which means hashing and producing a signature creates a derivative work), then any signed statement referring to a hash of the source code (or including the source itself) is also a derivative work.

This means that someone sending a PGP signed e-mail saying "I audited that the source code of GNU Foo with hash 0xdeadbeef... is free of backdoors" is violating GNU Foo's copyrights unless he releases his private keys.

This is obviously a very undesirable outcome, and so it's reasonable to hope that hashing does not create derivative works, and thus that signing doesn't either.

Including the scripts to install in the "preferred form" also seems very problematic because then you might also be forced to include a full copy of the hard drive of your development machine since that's an even more "preferred form" under that theory.

This is also a very undesirable outcome, so it's again reasonable to hope that "preferred form" does not necessarily include things that were absent in the original work and that aren't essential for reproducing the binary derived work itself.


It's not the hashing of the object code that makes the signature a derivative work. It's the appending the signature and object code that creates one whole work.


Yet in GPLv3, where they could have explicitly made it so that the modified code must be able to be installed back on the original hardware, they declined to do so in the general case. The anti-tivoization clause instead was written narrowly to only apply when the object code was conveyed "as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized)".

Note that this means that I could sell a device that only runs signed code and could separately sell signed apps for that device that the user can buy and install later, and as long as the code that ships with the device is not GPLv3, I could use GPLv3 code in the signed apps and would NOT be required to release signing keys under GPLv3.

(The Apple App store incompatibility with GPLv3 has nothing to do with the app signing requirements. It is due to the terms of service that users are required to agree to in order to access the store. These impose what the GPL considers to be additional, incompatible conditions).

It seems odd, if they felt the way you think they do, that they would have written this so narrowly.


I would like to state quite firmly that releasing the signing keys is absolutely the wrong solution to this problem, because that destroys the security that code signing was designed to create.

The correct solution to this problem is to release a tool for people to change the code signing certificates on their devices to ones which they control, and to arrange for this tool to require a reasonable level of physical access so that it cannot be used remotely. Then device owners can have code signing security on their own terms and be confident that only their own trusted builds can be pushed to their devices via remote updates, which is a super useful feature when you have more than two devices.
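Concretely, I imagine something like this (the key generation is standard openssl; the install tool and its physical-presence check are hypothetical - no such vendor tool exists for these devices, which is the point):

    # the owner generates their own trust root...
    openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
        -keyout owner-key.pem -out owner-cert.pem -subj "/CN=device owner"
    # ...and installs it; 'trustroot-tool' is invented - imagine it refusing
    # to run unless a physical-presence button on the board is held down
    trustroot-tool --install-cert owner-cert.pem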


I was under the impression that while it may be against the spirit in which GPL2 was written, it was an oversight and the necessary changes were made to GPL3.


As far as I know, it has never been litigated - there is enough work making manufacturers release anything called "source" as it is. People writing new software just go with the clarified GPL3.

Based on Linus's views, the Linux kernel being GPL is a happy accident of history. We should be glad it wasn't BSD, so we at least ended up with DD-WRT/AOSP/etc. But we also should keep pushing to preserve the freedoms that fostered it.


The problem is that most people releasing GPLv2 software were not involved at its creation, and are doing it for the bare text contents or completely unrelated reasons. The original goals of the creators are not a very relevant reason.

Yes, the GPLv3 fixed that, and there was plenty of time for projects to migrate already. Linux won't migrate, Linus already explicitly stated so.


Code-signing is a useful and necessary security measure, so long as it is possible for the owner of the device to sign his own code. You need a good way to distinguish between the end-user and someone who has pwned him.


IMHO a well-documented JTAG, USB, or serial interface on the board to change the trust root would suffice.

If you wanted to prevent evil maid attacks, then a time lock would actually be more secure than a fixed trust root with respect to downgrade attacks and leaked keys.


I think it makes sense to require that any automated firmware updates be signed by the device manufacturer, and that this key is never released. This is to protect consumers from malicious third parties who may attempt to install malicious firmware.

Then there can either be an option to disable this check or a large warning that is produced when you try to install unsigned code. There's really no point in code signing if everyone has access to the private keys.


> Then there can either be an option to disable this check or a large warning that is produced when you try to install unsigned code.

If the device allows unsigned or self-signed code to be installed then there would be no need to release the signing keys because they are not necessary to make modifications. Releasing signing keys should only be required when it doesn't.


Exactly my point.


I think it makes more sense to simply forbid automated firmware updates. Put a physical switch on the box; I'll press it when I want to update the firmware. No crypto signature needed.


What about non-technical users who require firmware updates for bug fixes or new features? You can have a physical button, but if the network has been owned in some way, pressing the button could still lead to malicious firmware being installed.

I could see this easily coupled with some social engineering. Control the network and then email users saying that an important update is needed.


Sure, it's no panacea, but it would at least be an improvement over the mess we have now.


How? For the average user (which is who companies really care about), code signing offers an important level of protection. Sure, this would be an improvement for technical users who want something hackable, but that's a small fraction of the market.


Code signing is a fix for problems introduced by automatic firmware updates. Instead of going further down the chain by trying to fix the problems with code signing, I'd rather take a step back and find a better fix for the original problem.

Besides, I don't believe that firmware updates need to happen so frequently that there needs to be a convenient way to install them. It's okay if they are a nuisance, because a majority of devices will be updated zero times in their usable lifetimes, and a majority of the rest will be updated one time.


It's not about convenience, it's about security. Regardless of how many menus you have to go through or buttons you have to press, if your device will just execute any code it gets, with no way to authenticate that code, you're asking for trouble. At the very least, the device should use a secured connection to download updates from the manufacturer.


How about the device does nothing, and I can download updates from the manufacturer if I choose to? It's my device, not theirs, and I will decide when and if I want to update it.


Sure. If that's what you're looking for in a device. Some users prefer their devices to update on their own. Take Google OnHub[0] for example. They make it pretty clear that new updates will provide significantly enhanced functionality. For users who want a device like that, securing updates is still important.

[0] https://on.google.com/hub/


Here is Linus discussing GPLv3 and tivoization etc at DebConf 14 in quite some depth http://www.youtube.com/watch?v=5PmHRSeA2c8&t=47m20s

(Basically, he doesn't consider tivoization to be that much of an issue, and doesn't consider it to be something he wanted when he selected the GPLv2.)


I'll add this to the very long list of reasons to never even consider using a GPL license...


s/malicious actors/hardware owners/

See, I bought this WNDR3800 off eBay, because I wanted to put OpenWRT on it. It's compatible. But I made the mistake of buying a used router that had been managed by an ISP. They built their own firmware and purposefully made the firmware updater require specific headers in the firmware package. Now I can't update this device to OpenWRT without some very specific hacking and compiling my own version of OpenWRT.

Who owns this hardware? I bought it. I paid for it. It's in my house. But apparently this ISP is truly the owner, and I can't do anything that I wish with this device.

The firmware is based on Linux; the ISP of course hasn't released their source. The only way I know about the header bit is because somebody else figured this out years ago and there is a single message on a random web forum detailing it.

Presumably, the "openness" of the original device running Linux is what allowed them to build their own custom firmware in the first place.
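(For the curious, the "very specific hacking" amounts to something like this - the header file and image name here are invented for illustration; the real header format was reverse-engineered on that forum:)

    # graft the ISP's expected header onto a stock OpenWRT image
    cat isp-magic-header.bin openwrt-wndr3800-squashfs-factory.img \
        > firmware-for-isp-updater.bin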


The problem with the argument "I can't do anything that I wish with this device" is that it's also the same thing as "Other people can't do whatever they wish with my device". I sympathize with the desire to customize your router firmware; I think it's important that this ability be available in general (e.g. no laws prohibiting it), and I think it's great that there are devices on the market that allow it. But I don't think that any particular vendor should be forced to allow it. If they want to use GPLv3 then they do have to allow it, but it's perfectly legal for vendors using GPLv2 software to put restrictions on firmware updates, and I have no objection to that. If you want to customize your firmware, you can simply choose not to buy their product (this of course depends on my earlier statement that it's great that there are devices on the market that do support this).

The important point here is that vendors aren't locking down their firmware just because they're control freaks who hate it when customers install modified firmware; an open updater is also a big security hole. If you can replace your firmware with a modified copy, so can anyone else who manages to gain access to your device. There are other angles too, of course - vendors who offer unique hardware may want to require their customers to always be using that business's products, as that's how they make money - but when we're talking about generic equipment like routers, the most important reason is security.


> The problem with the argument "I can't do anything that I wish with this device" is it's also the same thing as "Other people can't do whatever they wish with my device".

It isn't.

If someone has physical access to your device, you have lost. The end. They could just replace the entire device with one with an identical case but different internals.

But if you successfully prevent attackers from gaining physical access to your device then there is no security cost in granting anyone with physical access full control over the firmware.

And for devices you know will be placed in unsecured areas where you would still like some security theater (or to inhibit meddling by unsophisticated malcontents), there is nothing wrong with allowing the owner to lock the firmware using the owner's key or password.


The problem is that when I bought this device there was nothing to say that it had been modified by this ISP. The device looks exactly the same as any other WNDR3800. It wasn't Netgear who limited the hardware in this way, and it also wasn't the eBay seller, though it was their responsibility to disclose that it was 'locked' to this ISP. I could buy a _new_ WNDR3800 from Amazon and easily install OpenWRT on it without any issues.

I realize the security issues; I'm not worried about them. As somebody who wants to learn about embedded Linux development, and who owns several other OpenWRT-compatible devices, 'locking' the firmware is not something I am interested in.


I am perfectly fine with an ISP retaining ownership of a device if it is marketed as such. It should be illegal to use technical means to evade laws that dictate how renting works and what taxes apply to that kind of product.


I made the mistake of buying a used router that was managed by a ISP. ... Who owns this hardware? I bought it. I paid for it. Its in my house.

This argument breaks down when you're not talking about the manufacturer -- at the time the ISP made that modification, they bought it. They owned it. Should they not be permitted to do what they like? Should you not be permitted to write your own firmware which requires headers?


You can write your own firmware, but when you distribute the router you should distribute everything needed to build it with it.


You "should" do what the license requires. Anything more is nice, but charity.


A private sale between two parties is not distribution.


While that might be true, it has no effect on requirements of GPL. If the actual contract explicitly stated that the buyer will not exercise his rights coming from GPL, that might be another thing.


If I sell you a cake, do I need to sell you the recipe with it?


If the recipe is licensed with GPL, then yes.


There is an infinite set of things you can't do with that device even though you own it. You can't turn it into a surfboard, a dishwasher, or a birdcage. You also can't turn it into a customized router, apparently.

Maybe it's just me, but this seems like an easy thing to get over by buying some different hardware. Instead of stressing out about this particular WNDR3800, focus your angst on the general case, which is the FCC's attempt to make it illegal to upload custom firmware to any router.


> There is an infinite set of things you can't do with that device even though you own it. You can't turn it into a surfboard, a dishwasher, or a birdcage. You also can't turn it into a customized router, apparently.

This is sophistry. It's not a question of "can" in the sense of "be able (because of your skills) to", but of "the vendor actively prevents it".

And the vendor is surely not preventing you actively from turning your device into a surfboard, a dishwasher, or a birdcage. But he is actively preventing you from turning it into a customized router.


I agree you own that hardware. I don't think the ISP is the owner. I just think that the piece of hardware you own is only useful if you have the private keys. That configuration is a physical property of the object you bought. Are you suggesting we should force the manufacturer to tell us some private knowledge they have?

Don't try to make a "freedom" argument here. Vote with your values and buy an unlocked piece of hardware.


I highly suspect the word "should" was used not in legal sense, but in ethical one.

You're right about GPLv2 - vendors are not legally forced to provide any way to load modified binaries (that's the primary reason to license software under GPLv3). Yet there is a belief that every hardware manufacturer should do so, to respect the customer's freedom to tinker with the device, especially after support from the vendor is gone.


> If the source is covered by GPLv3, then yeah, if they use code-signing they need to have some way to let you get around it.

Only if it's a consumer product under the specific rules in GPLv3. Business products under GPLv3 don't carry that requirement.


Good point, but since businesses don't really care about being able to install modified versions of firmware, I doubt it's controversial.


If they're going to release the signing keys, why bother with signing?


If you want to believe that the release was deliberate, then it's all part of the act - signing could've been just a point in a requirements document.

https://news.ycombinator.com/item?id=10236827


I think a lot of people here will agree with you in spirit, but this is being received so controversially because it's unclear whether this was a "leak by incompetence" or an actual "we want to follow GPLv2 and therefore we have to do this".


Then what's the purpose of signing in the first place?

The keys for official releases are supposed to be held by trusted senior management or a small group with highly restricted privileges. The software/firmware shipping process must also be well audited if the security requirements are serious (and not just for show). The keys should never be exposed to a random technician or a new hire who could end up destroying your business and your clients' businesses. It looks like they didn't have a very formal shipping process, at least in this case.
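
A minimal sketch of that separation, assuming a generic RSA flow (not D-Link's actual tooling, and using Python's cryptography package as a stand-in): the private key lives only on a restricted signing host, and everything that ships carries just the public half.

    # Sketch of the signing-host / device split -- illustrative only.
    # Requires the 'cryptography' package.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # --- signing host, restricted access: the private key stays here ---
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    firmware = b"firmware image bytes"
    signature = private_key.sign(firmware, padding.PKCS1v15(), hashes.SHA256())

    # --- device / installer: ships only the public key and the signature ---
    public_key = private_key.public_key()
    # verify() raises InvalidSignature if the image was tampered with
    public_key.verify(signature, firmware, padding.PKCS1v15(), hashes.SHA256())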


This is the fundamental disagreement between Linus Torvalds and the FSF over the GPL.


Surely one doesn't have to use the same certificates for the firmware verification of boot images and for the signing of Windows applications.


Indeed, what a world we would be living in if people couldn't sign malware, and even malicious drivers for rootkits, using the keys of one of the largest home networking corporations.


Google Translate doesn't seem to be 100% clear, but the signing key is actually for some Windows software, rather than device firmware. Is that correct?


From what I understood, they explained it in terms of the Windows software, but the key we are talking about is actually used to sign firmware.

Since most routers don't require signed firmware, this just makes the few that do as bad as the rest.

Personally, I would think the likelihood that the key would be used for good (OpenWRT) is far greater than the likelihood it would be used for bad.

On a general note:

Consumer routers are, in my experience, generally very easy to hack into. Stuff like http://192.168.1.1/setting.cgi?setname=`echo 'password' | passwd` is not uncommon, so altering the firmware is seldom needed.
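
To make that bug class concrete, here is a hypothetical CGI handler of the kind that produces exactly that hole (the set_hostname helper is made up for illustration): the query parameter goes through a shell, so backticks in it execute arbitrary commands.

    # Hypothetical router CGI handler -- a sketch of the bug class, not
    # any vendor's actual code.
    import os
    import subprocess
    import urllib.parse

    params = urllib.parse.parse_qs(os.environ.get("QUERY_STRING", ""))
    setname = params.get("setname", [""])[0]

    # VULNERABLE: the parameter is interpolated into a shell command, so
    # ?setname=`echo 'password' | passwd` runs the backticked command as
    # whatever user the CGI runs as (often root on consumer routers).
    os.system("set_hostname " + setname)

    # Safer: invoke the helper directly with no shell; the argument is
    # passed verbatim and backticks are just characters.
    subprocess.run(["set_hostname", setname], check=True)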


This is precisely why you store your signing keys in an HSM attached to your build server, so that it's impossible to leak them (whether through malice or incompetence).
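
A rough sketch of what that looks like through PKCS#11, assuming the python-pkcs11 package; the module path, token label, PIN, and key label below are all placeholders. The build host hands the HSM the bytes to sign and gets back only a signature; the private key is never readable, so it can't end up in a source tarball.

    # Sketch of HSM-backed signing via PKCS#11 -- all identifiers are
    # placeholders. Requires the 'python-pkcs11' package.
    import pkcs11
    from pkcs11 import Mechanism, ObjectClass

    lib = pkcs11.lib("/usr/lib/softhsm/libsofthsm2.so")  # vendor module path
    token = lib.get_token(token_label="build-signing")

    firmware = open("firmware.bin", "rb").read()

    with token.open(user_pin="1234") as session:
        # The private key is addressable by label but marked non-extractable.
        key = session.get_key(object_class=ObjectClass.PRIVATE_KEY,
                              label="fw-signing-key")
        signature = key.sign(firmware, mechanism=Mechanism.SHA256_RSA_PKCS)
    # Only `signature` ever leaves the HSM; the build host never holds
    # the key material at all.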


Yeah, the fact that it was even possible for D-Link to leak keys in this manner just shows how utterly incompetent their engineering is.


https://en.wikipedia.org/wiki/Hardware_security_module - since the parent poster seemed to feel the need to use an extremely uncommon acronym to show his or her superiority.


A: It's not an extremely uncommon acronym, and anyone participating in a discussion on key management probably knows it. B: It's best to assume a commenter is acting in good faith and not trying to show off their "superiority".


Thanks for the link!

> since the parent poster seemed to feel the need to use an extremely uncommon acronym to show his or her superiority.

That was uncalled for, though.


It's really not that uncommon.

I think "If there's an AWS product for it, it's not that obscure" is a reasonable standard. You could pretty clearly google "keys HSM" and it would show up.


I don't think that "can be found on Google" is a very good criterion for commonness...


Given that there's so much out there to know, and that each field has its own commonly-used acronyms, I do think that if you can find the -correct- definition of an acronym in the first ten Google search results, then (IMO) that acronym is common.

As an example: Would you consider RAM to be a common acronym? If you work with computers you're very likely to answer "Yes." or "Hell yes." to that question. But, there are many people out there who have no idea how to expand RAM into its parts, and may not have ever heard the term. Yet, the expansion of RAM is in the top ten Google search results for the term. (It's in my top two.)

Additionally, I know what an HSM is, and I don't even work in a field that requires their use.


Horrible... but the certs expired in early September, so this is not currently relevant except for monitoring history/past infections/malicious activity.


Even if the expiration date is checked, it's easy enough to set the device's clock back before flashing new firmware.
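
A minimal illustration of why that works, assuming the device naively validates against its own clock (the function below is hypothetical, not any real verifier): nothing anchors "now" to reality, so winding the clock back into the validity window defeats the check. This is also why desktop code signing usually pairs signatures with countersigned trusted timestamps rather than relying on the verifier's clock alone.

    # Illustrative only: an expiry check that trusts the local clock.
    from datetime import datetime, timezone

    def cert_time_valid(not_before, not_after):
        now = datetime.now(timezone.utc)  # on the device, the flasher controls this
        return not_before <= now <= not_after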


If this allows anyone to use their D-Link hardware however they see fit, I'm all for this kind of mistake...


And when was code signing any real kind of security anyway?

It has always felt more like a protection racket to me, especially when you can take someone's code-signed stuff and re-sign it. Getting the ability to code sign requires just money, time, and a bare minimum of paperwork...

The trusted authority on these things is completely untrustworthy by the nature of its business, IMO.


> Furthermore, the company promises Tweakers that new firmware will come out early next week, in which the security issues are also resolved.

Very reassuring! I am worried about my other D-Link home surveillance cam products now.


It is NOT a "mistake". D-Link engineers are INCREDIBLY INCOMPETENT.

Let me remind you of this excellent piece: http://www.devttys0.com/2015/04/what-the-ridiculous-fuck-d-l...


Pedant hat on: still a mistake. Driven by shitty engineering, sure, but it wasn't done on purpose.


The best hacks are the ones that look like mistakes. It's the reason plausible deniability is one of the judging criteria in the Underhanded C Contest.


What if they're doing it deliberately, because they personally believe in allowing users to install customised firmware, and feigning ignorance/incompetence? They certainly wouldn't be able to do it "officially", so they work around it. Once everyone else knows, they'll admit their mistake and "fix" the problem, but add another way in?

Of course it could just be incompetence, but it's particularly interesting to consider the alternative. Seeing all the other companies who have leaked keys, datasheets, and other useful but officially proprietary information on their products, really makes me wonder. Also, those vulnerabilities which allow for jailbreaking and rooting of locked-down devices? Are they also there deliberately? For some reason, I really hope so.


The writeup suggests it's their global private key, the same one used for signing Windows apps.


Poor decisions are mistakes, not accidents.


Don't say "incompetent" when you mean "they don't give a fuck" or "this software was subcontracted to the India".


[flagged]


I'm sorry, I don't speak English natively. Any correction is welcome.


What you should be sorry for is your automatic equation of "incompetent" with "subcontracted to 'the' India". That is ridiculous. Why not just say "subcontracted to incompetent devs"?

Does your country, whatever it is, have a monopoly on competent developers?


"subcontracted to 'the' India" is nothing more than a cliché.


"not having or showing the necessary skills to do something successfully"

Demonstration of necessary skills failed. Incompetent.


The fact that D-Link published code-signing private keys by mistake is why we need ethics in code signing.



