D-Link has accidentally published the private keys for certificates it uses to sign its software. The keys could be extracted from the manufacturer's open source firmware packages, which made it possible for criminals to abuse the certificates.
Code-signing certificates let an operating system check that software really comes from the named publisher and has not been altered: the publisher signs a hash of the file with a private key, and anyone can verify that signature against the publisher's certificate. Whoever holds the private key can therefore produce software that appears to come from that publisher.
The mistake was found by bartvbl, who informed the reporter of the problem. He had bought a D-Link DCS-5020L security camera and wanted to download the firmware. D-Link makes the source of much of its firmware available under the GPL. "After looking through the files it turned out that they contained the private keys which are used to sign the code," reported bartvbl. "It gets worse - some of the batch files contained the commands and passphrases that were needed [to sign the software]."
The user was able to confirm that the key worked by signing a file that wasn't from D-Link. The certificate expired at the start of September, so new signatures can no longer be made with it. Software that was signed while the certificate was valid, however, is still seen as valid even after expiry; Windows only shows a certificate error once the certificate has been revoked. That revocation has already happened, so this failure can no longer be abused.
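The mechanics of what bartvbl demonstrated can be sketched with plain openssl (a stand-in: real Authenticode signing uses tools like signtool or osslsigncode, and all key and file names here are made up):

```shell
# Generate a throwaway RSA key pair standing in for the leaked signing key.
openssl genrsa -out leaked.key 2048 2>/dev/null
openssl rsa -in leaked.key -pubout -out leaked.pub 2>/dev/null

# "Sign" an arbitrary file that never came from the vendor...
echo 'not from D-Link' > payload.bin
openssl dgst -sha256 -sign leaked.key -out payload.sig payload.bin

# ...and the signature verifies against the matching public key,
# which is essentially what was demonstrated with the leaked cert.
openssl dgst -sha256 -verify leaked.pub -signature payload.sig payload.bin
```

The point is only that possession of the private key is the whole ballgame: nothing about the payload has to come from the certificate's owner.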
Software security firm Fox-IT confirmed the findings. Yonathan Klijnsma, a researcher at the company: "The certificate was indeed in one of the firmware packages, version 1.00b03, the source of which was released on 27 February 2015. The cert was thus published before it expired, which is a big error." Klijnsma found four other certificates in the same folder.
D-Link has in the meantime released new firmware versions that no longer contain the certificates. In a statement, the company said it regularly updates its firmware in order to 'conform to the latest security and quality standards'. It emphasized that there was no malicious intent. "D-Link [does everything to prevent the development of backdoors; boilerplate PR statement]." D-Link promised Tweakers it will release new firmware next week containing further fixes for the security problems.
A company finally holds up their end of the GPL by including all source needed to create a working build, and it's called a mistake!?
For every device requiring images to be signed with a specific key, this is exactly what the manufacturer should be forced to release!
> For every device requiring images to be signed with a specific key, this is exactly what the manufacturer should be forced to release!
What do you mean by "should" here? If the source is covered by GPLv3, then yeah, if they use code-signing they need to have some way to let you get around it. That's the point of the anti-tivoization clause. But if the software is distributed under GPLv2, or any other license, then there's no anti-tivoization restriction, and therefore no "should" about it.
Yes, which is blatantly anti-GPL2. I really wish any Linux copyright holder would step up to the plate and force the argument that since code+signature is the functional unit, adding the signature makes a derivative work and its source must be released in the preferred form for modification. The entire point of the GPL2 is that a developer who creates code and receives a modified version back should be able to modify that instance of code - tivoization is entering into the license in bad faith.
Of course going GPL3 would make for a more solid foundation for doing so, but clarifying this subject directly conflicts with the desires of the proprietary sponsors - Google etc.
IAAL (and an open source licensing lawyer at that, for many years). I'm also a contributor to FSF projects :)
Even I would tell you the GPLv2 almost certainly does not cover code signing.
It's actually very explicit about what needs to be there:
That's the whole problem with gplv2:
" For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable."
It very clearly and explicitly says what you must include, and installation keys are not there.
It even says "scripts used to control installation", not "scripts involved in installation" or anything like that.
Someone who buys a router running locked down Linux lacks two out of the four essential freedoms, so yes tivoization is blatantly anti-gpl2. I would say this is a popular (edit: non-legal) opinion, especially given the upvotes I have received.
But the legal interpretation may diverge from the spirit and purpose of the license, so I would love to hear the opinions of an experienced lawyer...
I don't see any real difference between "scripts used to control installation" and "scripts involved in installation", "installation scripts", "scripts used to install" etc, unless the specific wording has been used to create a huge loophole. But as you say, there is not much case law so I don't think this is so.
What exactly would you say that "scripts used to control ... installation" includes?
A script that says "cp -f a.out /bin/" most certainly qualifies, no?
And how about "st-flash write foo.bin 0x8000000" ?
And then "st-flash write foo.bin `cat BASE-ADDR.txt`" ? I would think BASE-ADDR.txt is still considered part of the "scripts" in the legal sense?
Surely you're not arguing that these scripts "perform" the installation rather than "control" it ?
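For concreteness, the kind of installer being argued about might look like this; everything here (the file names, the simulated st-flash step) is hypothetical, but it shows where the script/data-file line would be drawn:

```shell
# Hypothetical install script of the sort under discussion. The script
# logic plainly "controls installation"; BASE-ADDR.txt is a data file it
# consumes - which side of the GPLv2 line that falls on is the debate.
set -e

# Stand-ins for the build output and the contested data file.
mkdir -p demo/bin
echo 'binary' > demo/a.out
echo '0x8000000' > demo/BASE-ADDR.txt

# The "cp -f a.out /bin/" step from the thread, aimed at a local dir.
cp -f demo/a.out demo/bin/

# The "st-flash write foo.bin `cat BASE-ADDR.txt`" step, simulated:
# the flash address comes from the data file, not from the script text.
echo "st-flash write a.out $(cat demo/BASE-ADDR.txt)"
```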
Completely irrelevant to the legality question.
The fact that Eben Moglen and the FSF crafted the GPLv3 specifically to avoid future "tivoization" should serve as a pretty good indication that even the FSF's lawyers don't believe there is a (court-case winnable) legal infraction here when it comes to GPLv2 code, though everyone would agree the spirit and the purpose of the license often isn't being respected (which is, again, irrelevant to the actual legality when it comes to copyright law).
And this conversation is clearly about more than just which legal interpretations hold force. It's about the spirit of the license and how closely it aligns (or not) with the interests of its more prominent users who have the standing and resources to at least try to fight against uses that go against the spirit of the license.
While true, having been around then and knowing what went on in the drafting committees, I can say it's definitely the case that they didn't think they had a winnable case with the GPLv2 language ;)
If they had thought they did, they would likely also have gone after TiVo :)
What are you talking about? You just heard an opinion from a practicing lawyer, and an IP lawyer at that.
> I would say this is a popular opinion, especially given the upvotes I have received.
There's no opinion here -- this is a matter of law, which you clearly have no expertise in.
No, it is a matter of opinion on whether a certain behavior is in line with the spirit of the GPL2 or not. The purpose of GPL2 is to support the four essential freedoms. If it has bugs that fail to prevent certain freedom-destroying behavior, that does not preclude those behaviors from being considered anti-GPL2. I am not saying this as a legal opinion, which is why I go on to contrast.
> You just heard an opinion from a practicing lawyer, and an IP lawyer at that.
Yes, I have read many of DannyBee's other comments. This was meant as an invitation saying I really looked forward to him replying to my specific questions. If it came across wrong then I am sorry. (I edited it to say 'experienced' because I can't tell if he is still professionally practicing or has completely moved to the technical side.)
So, then we have a problem. Courts assume that drafters who write lists mean something by each member of the list. So they are going to read a definition that differentiates between "controls" and "installs" when they see:
" plus the scripts used to control compilation and installation of the executable"
I.e., they will read a definition under which controlling compilation and installation are different things.
As for the rest, you want to read a very broad definition of script into it. This is going to go very badly for you.
Scripts are not the same as their data files or external requirements :)
The script is the script. The combined script + datafiles+whatever is clearly a combined work.
It's not even, in general, colloquially referred to as a script, it's usually referred to as a package or a distribution or any of a hundred other things.
> What exactly would you say that "scripts used to control ... installation" includes?
Literally any files that are scripts that are used in the installation.
Data files they use, unless covered otherwise, would not be covered.
> And then "st-flash write foo.bin `cat BASE-ADDR.txt`" ? I would think BASE-ADDR.txt is still considered part of the "scripts" in the legal sense?
Most lawyers take a pretty strict definition of script.
The script part is the script.
Again, anything else would not need to be shipped unless otherwise covered by some other part of the requirement. In most cases, it happens to be.
For example, it's likely BASE-ADDR.txt is part of the executable's source, or generated by the build system, or whatever else.
Now, pragmatically, in order to try to keep everyone happy, some companies will ship all the non-proprietary utilities/data files necessary to run the script (though not necessarily as source), but that is a nicety, not required by GPLv2.
Your argument also starts to extend very broadly: what if my script requires a special FUSE plugin for my company's magic distributed file system that I store all this stuff on, and becomes worthless without it? Do you believe I have to give you the plugin?
The source to the plugin?
I threw out those various phrasings not because I'd expect they'd be treated identically, but because I was asking if the specific phrasing used happened to exclude some kind of major category. I really don't know where my thinking diverges from the accepted legal consensus, and wasn't sure if it hinged upon that exact phrasing somehow.
> As for the rest, you want to read a very broad definition of script into it. ... Data files they [scripts] use, unless covered otherwise, would not be covered. ... Most lawyers take a pretty strict definition of script
Concretely, per your narrow definition of "script" (which I am understanding as anything a programmer would call "a script"; correct me if you have something more formal), a makefile is not a script. Neither is some C with a sequence of system("...") calls, nor something piped through rot13 into sh, nor a sequence of commands in a free-form text file that are copied and pasted into an interactive shell.
As a result, the word "script" is essentially meaningless in the legal context, due to the technical transformations that can be applied to any script to make it not a "script" - computer science is a playground of made-up words and endless reinvention.
I had the impression that courts looked at general concepts and frowned upon attempts to circumvent the intent of contracts; is this not so?
> what if my script requires a special fuse plugin for my company's magic distributed file system that i store all this stuff on, and becomes worthless without it. Do you believe i have to give you the plugin?
Doesn't this strongly come down to intent? If the company just happens to use some internal filesystem that differs radically from POSIX, and their engineers really do work on the program using it (rather than some obfuscation game), then I don't see them being compelled to rework the source tree into POSIX conventions. But if the file system is adopted and integrated specifically so the company can avoid its obligations under the license, isn't it likely that a court would call them out on it?
Stepping back, doesn't copyright law by default essentially mean most things added to a build are derivative works, and also that the final executable itself is a derivative work? The GPL2 has explicit exemptions for operating system components and "mere aggregation", plus further exemptions when combining with non-GPL works (e.g. Classpath and OpenSSL). Otherwise, these uses would run afoul of copyright. Presumably the GPL2 says "scripts" as distinct from data, which is already covered since it gets incorporated into the final executable work.
I would think if you played a similar game with an independently-developed binary code blob - prepending it to compiled GPL source - you would be shot down pretty quickly since the entire executable is one derivative work. Don't binary kernel drivers only legally work as modules precisely because the resulting in-memory derivative is never distributed?
How does the scenario of incorporating such a machine code blob differ from creating a signature blob and inserting it into the executable's header?
To be fair, one of the definitions of "control" is to "determine the behavior or supervise the running of."
So I think a fair case could be made that the software signing keys determine the behavior of the installation when their signatures decide whether or not a binary will be accepted by the hardware. You can say that they don't do much actual work, sure, but neither do real-world managers who merely sign off on projects and nobody disputes whether or not they "control" things.
And I know I'm being really pedantic about the word meanings here, but I'm also pretty sure that arguments of the same form are regularly entertained by courts.
No it's not. It's anti-GPLv3. GPLv2 absolutely allows tivoization. Stallman thinks this is bad, but not everybody else agrees (I know Linus Torvalds thinks tivoization is fine in the context of GPLv2, and I believe the FSF agrees).
> I really wish any Linux copyright holder […]
Your argument here is not a legal one. Any Linux copyright holder that did this would lose horribly. And in particular, Linus Torvalds explicitly stated that he does not support relicensing the Linux kernel under GPLv3 because of this (even if it were possible, which it isn't because of the very large number of copyright holders). Source: https://en.wikipedia.org/wiki/Tivoization#GPLv3
> The entire point of the GPL2 is that a developer who creates code and receives a modified version back should be able to modify that instance of code
And they can! But the GPLv2 does not require that the modified code must be able to be installed back on the original hardware by the person doing the modification.
> tivoization is entering into the license in bad faith
Not by any legal definition of "bad faith", and even with the informal definition, a lot of people would disagree with you.
Not explicitly. From the GPL2:
> Accompany it [executable code] with the complete corresponding machine-readable source code ... The source code for a work means the preferred form of the work for making modifications to it
So, you're telling me that the preferred form for making modifications to the source code of a router does not include the scripts which install onto the router to test those modifications?
> Not by any legal definition of "bad faith"
The GPL2 and the FSF are very explicit on what the GPL2 is supposed to engender. Accepting the license and trying to circumvent it on a possible technicality (assertion that a signed blob isn't a derivative work, when the only reason anyone is interested in the source code is to run it) looks pretty bad-faith to me.
IANAMOAG (I am not a member of a guild, and you probably aren't either). Even if we were, we could still disagree until it was actually litigated. I see no reason to prematurely concede that Linux devices get to be non-Free.
Yes. 100% yes.
It is very explicit in what must be included and you are explicitly ignoring what it says about scripts:
" For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. "
Scripts used to control compilation and installation != signing keys.
It means scripts used to control compilation and installation.
You may want it to include something more, but it doesn't.
That's why GPLv3 was written.
This is a pretty big assumption, and in fact, is not actually true in a lot of cases :)
Let's assume it is here.
> Why isn't that data file also part of "the scripts used to control installation of the executable"?
Because the data file literally isn't a script used to control installation of the executable?
It's a data file.
At best, btw, you'll be able to argue that it would have to include the signing key, but not any password for it. So you are still pretty screwed :)
Your argument also swallows the definition. Where does it end? Do I have to include the interpreter for the script, and provide source?
Why not, since it's required for the script to function?
(Note, this would not be covered by the next exception in a lot of cases.)
If I give you a bash script, do I also have to give you bash?
The installation script also almost certainly refers to the executable, but that's called out explicitly as something I have to give you source to. Why bother if it's included in the "scripts to control compilation/installation" part?
(In general, courts are going to read definitions into things that make parts of enumerated lists differentiable from each other.)
In general, I would say you can go down this path, but you won't get a court to come with you.
Scripts means scripts. Data files means data files.
If the key is literally part of the script, yes, you'd have to ship that. Not the password, mind you.
If they wanted something else, they should have written something else. Which they did, in gplv3 ;)
You're referring to "scripts used to control compilation", but how are signing keys not part of the source code? GPLv2 uses a specific definition of source code, "the preferred form of the work for making modifications to it." Don't you have to be arguing that a form not allowing modifications to be used or tested is the "preferred form of the work for making modifications"?
You are trying to take parts out and remove context from them.
I don't have to argue at all about the preferred form, because it very explicitly says what that is:
"The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable."
The first two sentences go together. It defines, explicitly, what the preferred form of complete source code for an executable work includes.
So that isn't going to include signing keys unless you want to argue that one of the explicitly enumerated parts includes signing keys.
For years, people have argued back and forth about whether the scripts include data files.
The general consensus is "no", because otherwise, it has no differentiable meaning from the other parts.
I use my GitHub private keys way more often than final code-signing keys - whether those keys are put to as "nefarious" a use as tivoization lock-in, or as benign a use as proving authorship so Windows can better tell my executables apart from some asshole's malware.
During standard development I'm either using temporary, machine-local certificates, or unsigned code. If that means I need a devkit, or an emulator, or a dongle, or whatever else lets me iterate without the final "publishing" steps, then a well-designed system shouldn't require the final publishing certs for that. For security's sake, you'd consider going to such lengths as keeping those code-signing certs on a tightly controlled, air-gapped machine, to reduce the chance of getting pwned and dealing with cert invalidation. Or worse.
Which makes it a very far cry from being part of your compilation phase in any capacity, nevermind a script used to control it.
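The "temporary, machine-local certificates" workflow mentioned above can be sketched with plain openssl (a generic stand-in: Windows developers would use signtool or New-SelfSignedCertificate instead, and all names here are made up):

```shell
# One command creates a throwaway key + self-signed cert for local testing.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout dev.key -out dev.crt -days 7 \
    -subj "/CN=local dev signing (not trusted by anyone else)" 2>/dev/null

# Sign a local build with it. Nothing here touches the real publishing
# cert, which can stay on an air-gapped machine.
echo 'test build' > build.bin
openssl dgst -sha256 -sign dev.key -out build.sig build.bin

# Inspect the throwaway cert's subject.
openssl x509 -in dev.crt -noout -subject
```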
But the "preferred form of the work for making modifications" presumably involves each person using their own github keys (and/or repositories) etc. Anybody can create an account and use github. That is not true in the tivoization case because in that case the only keys that can be used are the original ones.
Are you sure? Based on what? I've managed to quote you without the use of copy and paste in that previous post - I'm not sure how I accomplished that if I didn't read what you wrote - so please be more specific.
This also begs the question: Did you read what I wrote?
> But the "preferred form of the work for making modifications" presumably involves each person using their own github keys (and/or repositories) etc.
And the "preferred form of the work for making modifications" also presumably involves each person using their own signing keys (and/or hardware that may not require them) etc.
> Anybody can create an account and use github.
Anyone can buy their own hardware, and compile for it, and install on it. Just don't pick tivoized hardware.
> That is not true in the tivoization case because in that case the only keys that can be used are the original ones.
The source code is not tivoized - hardware is. So don't pick tivoized hardware.
You're still talking about "scripts used to control compilation" when my point was that it doesn't need to be that if it falls under the definition of source code proper.
> And the "preferred form of the work for making modifications" also presumably involves each person using their own signing keys (and/or hardware that may not require them) etc.
Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware that does require signing keys and doesn't allow the user/owner to use their own keys. Then they need yours because you prevented them from using their own.
> The source code is not tivoized - hardware is. So don't pick tivoized hardware.
The hardware is being chosen by the person distributing it with GPL software.
Ahh. Expanding the definition of source code enough to include code signing keys would have also put them into the (sub)category of "scripts used to control compilation and installation" in my mind, making the distinction moot.
It's not moot in the context of the specific point you intended, of course, and which you've now provided, which I thank you for.
> Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware that does require signing keys and doesn't allow the user/owner to use their own keys.
Which works perfectly well up to exactly the point where the person distributing the software delivers it on hardware (my VCS server) that does require VCS keys and doesn't allow the user/owner to use their own keys.
> The hardware is being chosen by the person distributing it with GPL software.
...and by the person purchasing the product that has the GPL software in it, by choosing to buy that product.
You distribute your software by conveying the VCS server hardware to the customer?
> ...and by the person purchasing the product that has the GPL software in it, by choosing to buy that product.
All GPL licensees have the rights the license grants them, not just the ones who buy or don't buy specific hardware. What good is the license at all if distributors can just say "I'm not doing that part, if you don't like it buy from somebody else"? And what if they all do that, or nobody else makes equivalent hardware?
So even a reaching definition of "preferred form for modifications" doesn't hit signing keys, just like "preferred form" of a Rails app doesn't include your EC2 credentials.
Picture the scenario at the manufacturer where one developer is being tasked with creating a new feature using a consumer unit fresh off the line, for a business demo. What resources do they utilize in the course of doing this? This is the company's "preferred form" of the source tree for modifying an as-shipped device, and is what needs to be made public.
The company could supply a firmware update to overwrite the bootloader and stub out signature checking, if that is what makes sense. Or a well-specified description to use JTAG to do the same. Or perhaps they don't actually care about security and 'make dist' creates an installable image.
The thing is, all of these arguments ("preferred form for modification", the key being part of the installation script, code+signature being the executable work) seem like uphill battles when trying to show the manufacturer is violating them in an informal forum like HN.
But the intent and spirit of the license is straightforward and even spelled out in the preamble! "Our General Public Licenses are designed to make sure ... you can change the software". Fulfilling these goals is the entire point of the contract and the only consideration for receiving a license. So the manufacturer is actually the party that must work to show how what they're releasing fulfills the specific terms.
What is the relevance of that supposed to be? You can put any part of the development process on any number of different computers or distribute the work between different developers however you like. And the relevant "preferred form for modifications" is presumably the form the user uses to make modifications to the device delivered to the user in any event.
> So even a reaching definition of "preferred form for modifications" doesn't hit signing keys, just like "preferred form" of a Rails app doesn't include your EC2 credentials.
Why wouldn't it include your EC2 credentials (or other root access to the instance) if conveying an EC2 instance to the customer is how the customer receives the software from you? But now you're wading into the SaaS vs. software distribution ambiguity which has nothing to do with the case where you've bought a physical device with the software on it.
I'm having a hard time digging it up, but I swore I have seen a video where RMS lays out the exact same theory I'm saying here.
It's not a terribly outlandish theory. I'm just getting a lot of flak for bucking the status quo from people who grew up with DRM kool aid and see Free software in terms of the corporations that lock it down.
That everyone includes RMS, by the way. Here's a transcript from 2006 where he's talking about it and says:
>For instance, the Tivo itself is the prototype of tivoisation. The Tivo contains a small GNU/Linux operating system, thus, several programs under the GNU GPL. And, as far as I know, the Tivo company does obey GPL version 2. They provide the users with source code and the users can then modify it and compile it and then install it in the Tivo. That's where the trouble begins because the Tivo will not run modified versions, the Tivo contains hardware designed to detect that the software has been changed and shuts down. So, regardless of the details of your modification, your modified version will not run in your Tivo.
So while you may be correct that it's never been litigated, it would seem that if the strongest software freedom advocates like RMS and the FSF thought that it did violate GPLv2, surely rather than telling people they should upgrade to v3 because it blocks tivoization, they would have just litigated against TiVo and set the precedent.
RMS was likely wrong about at least one thing in that clip - I would think signature checking, especially an early incarnation of it, would be done by the bootloader (ie software/firmware) rather than a specific piece of hardware.
This could be minor (thinking of embedded firmware as hardware isn't wrong in the freedom sense), or that framework could have misguided his thinking. He didn't use a specific term for the compiled source code, only "it". The GPL2 speaks in terms of "executable", but an unsigned or missigned binary is not actually executable. The "executable" would consist of the machine code and header, which would necessarily include a correct signature to be "executable".
For comparison, say you had some GPL software which checked in with a dongle or license server, and refused to run if there was no license/dongle. Then you'd be within your rights to edit it to remove the license check.
But here, the device itself refuses to load the program because it isn't blessed. Consider another implementation: in the factory, the hash of the GPL firmware is burnt into a ROM, and the bootloader checks that before running the code. Clearly you can build a modified executable, and you still can't run it on the device. But crucially, there is no header that you're missing; the device just can't run that exe. Or suppose the modified executable is too big to fit in memory - are they obliged to come upgrade the device? The point is, the license covers the software, not the loading of that software.
IANAL, while this may appear against the spirit of the license, it's hard to see how the license prohibits it. The existence of GPLv3 implies the FSF thinks so too.
To put it bluntly: an ought does not imply an is.
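The factory-ROM scheme described in the parent can be simulated in a few lines (sha256 and the file names are assumptions; a real device would keep the hash in mask ROM and do the check in a boot ROM routine):

```shell
# Burn the hash of the shipped firmware into (simulated) ROM at the factory.
set -e
printf 'original GPL firmware' > firmware.bin
sha256sum firmware.bin | cut -d' ' -f1 > rom-hash.txt

# A user rebuilds from the released source with modifications...
printf 'modified GPL firmware' > firmware.bin

# ...and the bootloader check refuses it: the modified build compiles
# fine, but its hash no longer matches the one fixed in ROM.
if [ "$(sha256sum firmware.bin | cut -d' ' -f1)" = "$(cat rom-hash.txt)" ]; then
    echo "boot: hash OK, running firmware"
else
    echo "boot: hash mismatch, refusing to run"
fi
```

Note that in this scheme there is no signature or header the user is missing; the refusal lives entirely in the device, which is the parent's point about the license covering the software rather than the loading of it.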
Please find me any real-world device that actually has a symmetric checksum burnt into OTP or mask _ROM_ while the code itself resides in flash.
No. Not really. GPLv2 requires that the modifier of source code release the tools to build/compile that source code into an executable. Whether or not the source code can be run directly (ex. Ruby, Python) or requires a compile/assemble/link process (ex. C, Java) is irrelevant. The GPLv2 also does not entitle or give anyone rights to any method to execute that recontributed and rebuilt code, for example if the code was ported to a proprietary hardware or some closed sourced ISA. However, IANAL and could very well be wrong because the GPL is such an unwieldy beast of unknowns.
In short: yes.
Being able to access and modify and redistribute source code isn't the same as having the ability to deploy it to a particular piece of hardware.
Likewise, if I open source a .NET project that doesn't mean it has to include Visual Studio even though VS is the preferred way to build it.
But the preferred form for modification includes being able to test your changes. Source without installation scripts is most likely not the preferred form - ie what do engineers at the company use?
> if I open source a .NET project that doesn't mean it has to include Visual Studio, which is the preferred way to build it
Yes, because having been designed in the era of expensive compilers, GPL2 explicitly covers this:
> However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable
You're trying to draw a box around "Windows" and define operating system as everything in that box. But minesweeper is not an operating system component, it's just something that comes with Windows. A hypervisor is an operating system component, even though not all versions of Windows have one.
So is a compiler.
I imagine there exist developer-oriented PC OEMs that offer PCs with Windows and Visual Studio already installed. So then it is included with the OS by the OEM. The fact that that would change the outcome in your interpretation implies that there is something wrong with it.
That in practice nobody except FSF and Debian cares is another thing.
This means that someone sending a PGP signed e-mail saying "I audited that the source code of GNU Foo with hash 0xdeadbeef... is free of backdoors" is violating GNU Foo's copyrights unless he releases his private keys.
This is obviously a very undesirable outcome, and so it's reasonable to hope that hashing does not create derivative works, and thus that signing doesn't either.
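To make the hashing point concrete, here is a minimal sketch. The "work" and key material are invented, and an HMAC stands in for a real asymmetric/PGP signature: the point is that both the digest and the signature over it are fixed-size values that reproduce none of the work's content.

```python
import hashlib
import hmac

# Invented stand-ins: a tiny "work" and hypothetical key material.
source = b'int main(void) { return 0; }\n'
private_key = b'auditor-private-key'

# A cryptographic hash of the work is a fixed-size digest.
digest = hashlib.sha256(source).hexdigest()

# Stand-in for a real signature: HMAC over the digest. A PGP signature
# would likewise be computed over a digest of the signed data.
signature = hmac.new(private_key, digest.encode(), hashlib.sha256).hexdigest()

# Neither value contains or reproduces any of the source itself.
print(digest)     # 64 hex characters
print(signature)  # 64 hex characters
```

If a fixed-size digest that contains none of the work were a derivative work, every checksum ever published would be one too.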
Including the installation scripts in the "preferred form" also seems very problematic, because then you might also be forced to include a full copy of the hard drive of your development machine, since that's an even more "preferred form" under that theory.
This is also a very undesirable outcome, so it's again reasonable to hope that "preferred form" does not necessarily include things that were absent in the original work and that aren't essential for reproducing the binary derived work itself.
Note that this means that I could sell a device that only runs signed code and could separately sell signed apps for that device that the user can buy and install later, and as long as the code that ships with the device is not GPLv3, I could use GPLv3 code in the signed apps and would NOT be required to release signing keys under GPLv3.
(The Apple App store incompatibility with GPLv3 has nothing to do with the app signing requirements. It is due to the terms of service that users are required to agree to in order to access the store. These impose what the GPL considers to be additional, incompatible conditions).
It seems odd, if they felt the way you think they do, that they would have written this so narrowly.
The correct solution to this problem is to release a tool for people to change the code signing certificates on their devices to ones which they control, and to arrange for this tool to require a reasonable level of physical access so that it cannot be used remotely. Then device owners can have code signing security on their own terms and be confident that only their own trusted builds can be pushed to their devices via remote updates, which is a super useful feature when you have more than two devices.
Based on Linus's views, the Linux kernel being GPL is a happy accident of history. We should be glad it wasn't BSD, so we at least ended up with DD-WRT/AOSP/etc. But we also should keep pushing to preserve the freedoms that fostered it.
Yes, the GPLv3 fixed that, and there was plenty of time for projects to migrate already. Linux won't migrate, Linus already explicitly stated so.
If you wanted to prevent evil maid attacks, then a time lock would actually be more secure than a fixed trust root with respect to downgrade attacks and leaked keys.
Then there can either be an option to disable this check or a large warning that is produced when you try to install unsigned code. There's really no point in code signing if everyone has access to the private keys.
If the device allows unsigned or self-signed code to be installed then there would be no need to release the signing keys because they are not necessary to make modifications. Releasing signing keys should only be required when it doesn't.
I could see this easily coupled with some social engineering. Control the network and then email users saying that an important update is needed.
Besides, I don't believe that firmware updates need to happen so frequently that there needs to be a convenient way to install them. It's okay if they are a nuisance, because a majority of devices will be updated zero times in their usable lifetimes, and a majority of the rest will be updated one time.
(Basically, he doesn't consider tivoization to be that much of an issue, and doesn't consider it to be something he wanted when he selected the GPLv2.)
See, I bought this WNDR3800 off eBay because I wanted to put OpenWRT on it. It's compatible. But I made the mistake of buying a used router that was managed by an ISP. They built their own firmware and purposefully made the firmware updater require specific headers in the firmware package. Now I can't update this device to OpenWRT without some very specific hacking and compiling my own version of OpenWRT.
Who owns this hardware? I bought it. I paid for it. It's in my house. But apparently this ISP is the true owner, and I can't do anything I wish with this device.
The firmware is based on Linux; the ISP of course hasn't released their source. The only way I know about the header bit is because somebody else figured this out years ago and there is a single message on a random web forum detailing it.
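The kind of check described above can be sketched in a few lines; the magic value and header layout here are entirely invented (the uImage magic shown for the stock image is the real U-Boot one, but everything vendor-specific is hypothetical):

```python
# Hypothetical sketch of a vendor-specific header check: the updater
# refuses any image that does not start with the ISP's magic bytes.
MAGIC = b'ISPF'      # invented 4-byte vendor magic
HEADER_LEN = 16      # invented header size

def accepts_image(image: bytes) -> bool:
    """Return True if the image carries the expected vendor header."""
    return len(image) >= HEADER_LEN and image[:4] == MAGIC

# A stock OpenWRT image starts with the U-Boot uImage magic
# (0x27051956), not the vendor header, so the updater rejects it.
stock_openwrt = b'\x27\x05\x19\x56' + b'\x00' * 64
repacked = MAGIC + b'\x00' * (HEADER_LEN - 4) + stock_openwrt

print(accepts_image(stock_openwrt))  # False: plain OpenWRT is rejected
print(accepts_image(repacked))       # True: only a repacked image passes
```

Which is why the workaround requires building a custom OpenWRT image: the firmware itself is fine, it just has to be repacked with the expected header.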
Presumably, the "openness" of the original device running Linux is what allowed them to build their own custom firmware in the first place.
The important point here is that vendors aren't locking down their firmware just because they're control freaks who hate customers installing modified firmware; unsigned firmware is also a big security hole. If you can replace your firmware with a modified copy, so can anyone else who manages to gain access to your device. There are other angles too, of course: vendors who offer unique hardware may want to require their customers to always use that business's products, as that's how they make money. But when we're talking about generic equipment like routers, the most important reason is security.
If someone has physical access to your device, you have lost. The end. They could just replace the entire device with one with an identical case but different internals.
But if you successfully prevent attackers from gaining physical access to your device then there is no security cost in granting anyone with physical access full control over the firmware.
And for devices you know will be placed in unsecured areas where you would still like some security theater (or to inhibit meddling by unsophisticated malcontents), there is nothing wrong with allowing the owner to lock the firmware using the owner's key or password.
I realize the security issues; I'm not worried about them. As somebody who wants to learn about embedded Linux development, and who owns several other OpenWRT-compatible devices, 'locking' the firmware is not something I am interested in.
This argument breaks down when you're not talking about the manufacturer -- at the time the ISP made that modification, they bought it. They owned it. Should they not be permitted to do what they like? Should you not be permitted to write your own firmware which requires headers?
Maybe it's just me, but this seems like an easy thing to get over by buying some different hardware. Instead of stressing out about this particular WNDR3800, focus your angst on the general case, which is the FCC's attempt to make it illegal to upload custom firmware to any router.
This is sophistry. It's not a question of "can" in the sense of "be able (because of your skills) to", but of "the vendor actively prevents it".
And the vendor is surely not preventing you actively from turning your device into a surfboard, a dishwasher, or a birdcage. But he is actively preventing you from turning it into a customized router.
Don't try to make a "freedom" argument here. Vote with your wallet and buy an unlocked piece of hardware.
You're right about GPLv2 — vendors are not legally forced to provide any way to load modified binaries (that's the primary reason to license software under GPLv3). Yet there is a belief that every hardware manufacturer should do so to respect the customer's freedom to tinker with the device, especially after support from the vendor is gone.
Only if it's a consumer product under the specific rules used in the GPLv3. Business products under the GPLv3 don't require that.
The keys for official releases are supposed to be kept by trustworthy senior management or people with highly restricted privileges. The software/firmware shipping process must also be well audited if there are serious (not just pretended) security requirements. The keys should not be exposed to a random technician or a new hire who could end up responsible for destroying your and your clients' businesses. It looks like they didn't have a very "formal" shipping flow, at least in this case.
Since most routers don't require signed firmware, this just makes the few that do as bad as the rest.
Personally I would think the likelihood that the key would be used for good (OpenWRT) is far greater than that it would be used for bad.
On a general note:
Consumer routers are, in my experience, generally very easy to hack into; stuff like http://192.168.1.1/setting.cgi?setname=`echo 'password' | passwd` is not uncommon, so altering the firmware is seldom needed.
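To illustrate why a URL like that works, here is a minimal sketch of the vulnerable pattern; the handler is invented (only the parameter name comes from the example above). Interpolating an unsanitized query value into a shell command means backticks in it are executed by the shell.

```python
import subprocess

def handle_setting(setname: str) -> str:
    """Invented stand-in for a CGI handler that builds a shell
    command from an unsanitized query parameter."""
    # Vulnerable pattern: user input interpolated into a shell string.
    result = subprocess.run(f"echo name={setname}", shell=True,
                            capture_output=True, text=True)
    return result.stdout

print(handle_setting("router1"))       # harmless: prints "name=router1"
print(handle_setting("`echo pwned`"))  # backticks run inside the shell: prints "name=pwned"
```

The fix is equally simple: never build shell strings from user input (pass an argument list without `shell=True`, or validate the parameter against a strict whitelist).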
> since the parent poster seemed to feel the need to use an extremely uncommon acronym to show his or her superiority.
That was uncalled for, though.
I think "If there's an AWS product for it, it's not that obscure" is a reasonable standard. You could pretty clearly google "keys HSM" and it would show up.
As an example: Would you consider RAM to be a common acronym? If you work with computers you're very likely to answer "Yes." or "Hell yes." to that question. But, there are many people out there who have no idea how to expand RAM into its parts, and may not have ever heard the term. Yet, the expansion of RAM is in the top ten Google search results for the term. (It's in my top two.)
Additionally, I know what an HSM is, and I don't even work in a field that requires their use.
it's always felt more like a protection racket to me... especially when you can take someone's code-signed stuff and re-sign it. getting the ability to code-sign requires just money, time, and a bare minimum of paperwork...
the trusted authority on these things is completely untrustworthy by the nature of their business imo.
Very reassuring! I am worried about my other D-Link home surveillance camera products now.
Let me remind you this excellent piece: http://www.devttys0.com/2015/04/what-the-ridiculous-fuck-d-l...
Of course it could just be incompetence, but it's particularly interesting to consider the alternative. Seeing all the other companies who have leaked keys, datasheets, and other useful but officially proprietary information on their products, really makes me wonder. Also, those vulnerabilities which allow for jailbreaking and rooting of locked-down devices? Are they also there deliberately? For some reason, I really hope so.
Does your country, whatever it is, have a monopoly on competent developers?
Demonstration of necessary skills failed. Incompetent.