
Software Freedom Doesn't Kill People, Security Through Obscurity Kills People - ashitlerferad
http://ebb.org/bkuhn/blog/2016/08/13/does-not-kill.html
======
Godel_unicode
The author asks for evidence of GPLv3 software hurting people. This includes
the Linux kernel, for which there are many examples of incidents. The below
was the first I found[1] and it has the benefit of involving multiple open-
source projects under different licenses.

The idea that software is inherently safer because it's released under an
open-source license is simplistic in the extreme. The most obvious reason is
that the opportunity for independent review doesn't guarantee
that review will happen. The wisdom of the crowd doesn't guarantee a third
party review will be better or more thorough than a solid first party system.
Open-source provides the possibility of review, and that's all. Hypothetical
eyes make no bugs shallow.

Edit - I'm not making a claim one way or the other about the relative safety
of open vs proprietary software, I'm merely saying that the comparison is
deeper and more nuanced than the way it is typically portrayed.

1 -
[http://www.bbc.com/news/technology-28867113](http://www.bbc.com/news/technology-28867113)

~~~
ktRolster
With open source software, you have the possibility of knowing how safe it is.
With closed source, you don't (ok, it's a lot harder to read through binaries,
but still vaguely possible).

~~~
tigershark
There's no way you can understand how safe the neural network driving your
car is. And this applies equally to open source autopilot systems and closed
ones. The article is just a rant that misses this fundamental point.

~~~
chmars
I cannot, but others can - I can even pay them to do so if necessary.

~~~
EGreg
Can you pay someone to solve the halting problem?

~~~
chmars
I think we all agree that open source is not the holy grail but it shines with
comparative advantages. It is like democracy: Far from perfect but still the
best political system:

'Many forms of Government have been tried, and will be tried in this world of
sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it
has been said that democracy is the worst form of Government except for all
those other forms that have been tried from time to time.'

[https://richardlangworth.com/worst-form-of-government](https://richardlangworth.com/worst-form-of-government)

(OK, that was slightly OT!)

~~~
EGreg
Well if you must :)

[http://magarshak.com/blog/?p=212](http://magarshak.com/blog/?p=212)

~~~
nickpsecurity
[https://en.wikipedia.org/wiki/Tyranny_of_the_majority](https://en.wikipedia.org/wiki/Tyranny_of_the_majority)

~~~
EGreg
So the plutocrats will protect us from the majority? :)

Obviously we need constitutional rights and bureaucrats to implement the will
of the people as expressed in the polls. But why do we need representatives?
Do you really think eg a Tea Party representative will protect Muslims from
the majority better than the first amendment and courts?

~~~
nickpsecurity
"So the plutocrats will protect us from the majority? :)"

No. The lesson, as usual, is that extremes rarely work out for us. The
plutocrats will screw us if totally in control. So will politicians if they
can get money from them or special interest groups. So will democracy if
something horrible is trending in popularity. As usual, the solution will be a
compromise between various points in the design space with more effective
checks on various interactions and risks each pose to each other. I can't tell
you what that design will be, but looking for it is the best investment with these
things.

"Do you really think eg a Tea Party representative will protect Muslims from
the majority better than the first amendment and courts?"

The First Amendment is subject to interpretation by representatives and/or
courts. The Fourth, Fifth, and Fourteenth have been seriously watered down.
They're among the best examples of the risk. A politician can outlast the
media fervor of the moment, get advice from well-informed people, and have
teams check the side effects of the law. The current situation exists because
apathetic democracy allows them to get away with not doing that. Also, passing
laws in exchange for bribes. That could all be stopped with enough voter
action followed by legislation or alternative models. The benefits of
representational democracy remain.

I'm not saying I believe it's the best system. I'm just saying plutocrats and
mobs cause lots of problems it can prevent with less effort than constantly
outsmarting plutocrats or fighting mobs. So, it's a consideration.

~~~
EGreg
_The current situation exists because apathetic democracy allows them to get
away with not doing that. Also, passing laws in exchange for bribes._

Or it exists because it is an emergent phenomenon that arises eventually. An
increasingly polarized two-party system where each primary race is rigged
against outsiders and voters are scared to vote third-party because each
candidate is super scary to the opposing party. It's bound to happen
eventually. In this case it happened in the US presidential race. And Congress
has long had a 10% approval rating. One can make up reasons as to why it
happened or accept that a representative democracy eventually reaches such
states, and it's not clear how to get out of them with _more_ representative
democracy. Trying the same thing and expecting a different result.

 _I'm not saying I believe it's the best system._

Exactly, and I'm saying it's not. It's easier to fool all of the people some
of the time (eg during elections) or heavily influence some of the people all
of the time (as lobbyists for special interests do), but you can't fool all
the people all the time. That is far more expensive.

Once again ask yourself: are eg Muslim and Mexican US citizens safer if Donald
Trump gets elected and magnifies the desire of those who elected him, or if
the population at large were polled for policy? Are the people of Gaza better
off under Hamas because they were democratically elected once by those who
showed up? Would the Nazis ever get into power if there were no representative
positions to get into? Besides gerrymandering, poor voter turnout and other
major problems, representative democracy eventually gives a giant hammer to
some special interest or other.

The crowd would do a better job at predicting policy outcomes than experts:
[http://www.npr.org/sections/parallels/2014/04/02/297839429/-...](http://www.npr.org/sections/parallels/2014/04/02/297839429/-so-you-think-youre-smarter-than-a-cia-agent)

~~~
nickpsecurity
I'm going to have to think on these points some more. :)

------
nickpsecurity
"Meanwhile, there has been not a single example yet about use of GPLv3
software that has harmed anyone."

What!? That's dishonest. Most people doing safety-critical development use
robust tooling intended for it instead of FOSS, or are stuck with proprietary
tools attached to their platforms. So, FOSS will hardly cause harm, almost by
definition, because it sees so little use. Kind of like how Mac OS 9 and
Amiga are secure because you see no huge botnets. ;)

As far as harm goes, it's done plenty. Most GPL software, just like
proprietary, is of crap quality. I've lost years of data to an open-source
solution. Many people were hacked due to FOSS solutions where the "many
eyeballs" didn't even try to look for problems. Extrapolate that to cars,
planes, and trains. Then think about whether you want to use one afterwards.
I won't. Give me DO-178B Level B/A software over that any day.

"until you can prove that proprietary software assures safety in a way that
FLOSS cannot"

Wait, has this author heard of DO-178B and related standards? They do exactly
what he says. All are proprietary, too, although they could be FOSS'd. How
about FOSS software people put their stuff through a rigorous prevention and
3rd-party evaluation process before expecting us to believe it's high quality
or safe? Cuts both ways. I do know a handful that would make the cut. Need
exceptions for OSS like that. Most won't, though.

------
bkuhn
It seems that most commenting in this thread haven't actually followed the
complex political debate going on in the world of automotive industry
adoption of FLOSS.

Specifically, the automotive industry has made a series of arguments that
proprietary software is ultimately safer than FLOSS, and all their goals are
to lock down FLOSS in various ways to prevent the nefarious from "hacking"
the vehicle.

The vehicles are all still hackable, and many have been modified by people for
both reasonable and nefarious purposes. The FLOSS situation won't change
anything, and there aren't even any examples yet of hacks where FLOSS made the
situation worse. It's an assumption they make without full information because
of their inherent pro-proprietary bias.

------
AsyncAwait
I don't see the author arguing that being open makes software inherently more
secure, but that your software shouldn't rely on being 'secure' because it's
closed source; rather, design the software to be secure even if the source
code were to be open-sourced, which may bring additional eyes (and if not,
it's no worse than if it were closed).

I don't know if anyone has a good argument against that, but I haven't seen
one yet. It's not an argument that open-source makes your software
bulletproof or even safer just by being open-source; rather, if you designed
your software with proper security as a goal, you shouldn't mind releasing
your code.

------
Aloha
Both arguments are hyperbolic.

Both proprietary and open source software have likely killed people, and
likely will do so again. No amount of eyeballs will help if no one is
looking, nor will hiding all the bad code away so no one can see it. Bad
engineering practice is bad engineering practice.

~~~
bkuhn
I say clearly in the blog post that there is danger when you have software
handle any life-critical service. The criticism I'm making in the blog post is
the arguments by the auto-industry and their providers that FLOSS is
inherently _more_ dangerous than proprietary software.

------
mwfunk
I don't think that one unprovable assertion ("GPLv3 software has never killed
anyone") is the best way to combat someone else's unprovable assertion ("our
cars would be less secure if everyone could see our code"). The only way to
argue against security through obscurity is to argue against security through
obscurity, although admittedly that can be tough depending on the audience.
Fighting against the status quo is just going to be an uphill battle
sometimes, it's the nature of the beast.

This sounds more like a rhetorical device, in which debater #1 has attempted
to frame the debate in terms of debater #1 being correct unless debater #2 can
prove that they're wrong. Debater #2 then tries to reframe the terms of the
debate such that debater #2 is right until proven wrong by debater #1.

In both cases it's more a form of sophistry vs. an opponent than it is a way
to make a case to the public. Using a dishonest rhetorical device to
counteract someone else's dishonest rhetorical device rubs me the wrong way,
because it feels like both sides are trying to twist facts in their favor, and
are not in good faith trying convince me that they're right and the other
person is wrong. Debater #2 is responding to sophistry with more sophistry,
rather than attacking debater #1's sophistry head-on.

------
unimpressive
>The time has come that I must speak out against the inappropriate rhetoric
used by those who (ostensibly) advocate for FLOSS usage in automotive
applications.

You mean used _against_?

~~~
Leynos
The author appears to be speaking out against people who, while advocating the
use of libre/open source code in automotive solutions, decry the use of GPLv3
(as opposed to v2).

The author suggests that the people opposed to the use of GPLv3 are making
false arguments on the grounds of safety. The relevant difference being that
GPLv3 requires not only the release of code, but the ability to use modified
versions of code in situ. E.g., by replacing your car's firmware.

------
limaoscarjuliet
I agree open source software has a better chance of being safer, but the
article does not do a good job of arguing it. The anecdotal evidence fallacy
is not good science, e.g. "Meanwhile, there has been not a single example yet
about use of GPLv3 software that has harmed anyone".

------
qwertyuiop924
The author of this post is right, to a degree, but his arguments are really
weak. Here it is, simply:

-With OSS, your bugs are more likely to be caught.

-Therefore, it's harder to maliciously hack the vehicle.

-Therefore, it's less likely the software will fail and kill someone.

-Therefore, not opensourcing your code is a liability, because not doing so makes your product killing someone more likely.

Or the auto companies could just admit the real reason they don't want to OS
their code: they're afraid of losing control of their product. Of people,
even if they've voided the warranty, using their product in ways they didn't
explicitly allow. Of losing their monopoly on support.

If you're going to be evil, at least give it to us straight...

~~~
eridius
Your first claim is not true. What makes bugs more likely to be caught is
having lots of eyes on the source. If you have a popular OSS project there's
probably a bunch of people looking at it. But making something OSS does not
guarantee that anyone will look at it. Meanwhile, proprietary software will
still be examined by everyone who's paid to work on it. So really, the value
of OSS in catching bugs is related to how interesting/popular the software is.
Boring software is rather unlikely to catch many bugs by being OSS.

~~~
qwertyuiop924
But the probability of people outside your org looking at your source if it's
proprietary is zero. Said probability is nonzero for OSS. So yes, even in a
low-popularity OSS project, there's a higher probability of more people
setting eyes on it than proprietary software.

------
arca_vorago
I'm going to skip the main points everyone else is hitting (because I've been
touting gplv3 for a while), and get straight to what I think is the most
important part of the evolution of foss.

I think FOSS is the way to go, _but_ I think the main problem is with the
many eyes theory. To cut to the chase, I think the future of code lies in LOC
reduction and simplification of code in order to combat the problem of too
much complexity.

The Linux kernel is now at 10 million LOC+! I don't care if you are Red Hat,
you don't have enough eyeballs to properly audit that shit!

That's what I think the future of software should be: simplification of code,
maybe some refactoring, so that the barrier for the average user to look at
and understand the code is lowered.

That, and perhaps an AI/machine learning code-reviewing system that can
perform the function for those not able to, but who still want the benefits
of FOSS.

~~~
zdw
> The Linux kernel is now at 10 million LOC+! I don't care if you are Red Hat,
> you don't have enough eyeballs to properly audit that shit!

To be fair, most of that is in device drivers and other optionally compiled or
non-critical pieces of code.

The common code path is extremely well tested and audited.

~~~
archimedespi
Exactly. I see so many people complaining about the Linux kernel SLOC versus
other kernels; it's an unfair comparison because no other modern kernel I know
of has such a wide selection of in-tree drivers.

For instance, when I install Windows on a computer, I will often have to
install additional chipset drivers (odd USB3.0 controller, motherboard chipset
stuff, fan control). When I install Linux, all of that stuff works out of the
box, no drivers necessary: it's rolled right into the kernel.

------
EGreg
Open source is better for some things, but security isn't one of them.

I say this as a developer of an open source platform. When it's just starting
out, it has the same number of holes as anything else -- but everyone can see
the code and find the holes. It takes years, and hundreds of man-years, to
find and fix 99% of the holes. Until then, the source code is all in the open,
making it an easier target for an attacker than a closed-source product. In
the latter, you just have to close the "obvious" holes.

True, with very old or well-funded open-source products, this is not the case.
But MOST open source software isn't like that.

~~~
temac
It might _slightly_ slow down a motivated and competent attacker to not have
the source. But it also greatly slows down discovery of some bugs by non-
hardcore security researchers, most especially security bugs.

------
_ph_
There seem to be many different topics involved: first of all, open vs.
closed source, and GPLv3 vs. other open source licenses.

In general, I would say that open source is preferable to closed source, but
there are situations where this is not always possible for the producing
companies.

Finally, if the owner of the car is free to modify his car's software (which
in the abstract sounds like something he should be able to do), there is the
problem of people modifying the software of their car who should not. One
should not just tinker with the algorithms used to control the engine,
braking and stability control.

~~~
halomru
>there is the problem of people modifying the software of their car who
should not. One should not just tinker with the algorithms used to control
the engine, braking and stability control.

Why is that a problem that has to be solved? In purely mechanical cars,
people already can and do tinker with the engine and brakes. But only a tiny
fraction of people do, and those usually take care not to risk their lives.
In the limited cases where modifications cause problems for other people,
regulations are enforced through scheduled inspections and random stops of
suspicious vehicles. This system has worked quite well for the last hundred
years or so.

If software really is different and shouldn't be modifiable, I think this
needs a little more justification.

~~~
_ph_
First of all, I trust a somewhat skilled mechanic to get his car
modifications reasonably safe more than I trust a random software
modification. Then there is the problem that mechanical parts are easy to
inspect - many improper modifications can be seen at a glance in a traffic
control - while software modifications are pretty impossible to detect. Short
of checksumming the whole car software, it cannot be done.

~~~
wtbob
> I trust a somewhat skilled mechanic to get his car modifications reasonably
> safe more than I trust a random software modification

Those are not comparable things. The correct comparison is between a mechanic
and a programmer (after all, anyone who modifies his software is a programmer,
perhaps an unskilled one), or between an automotive part and a piece of code.
Using the correct comparison, we see that we already permit anyone to work on
his own vehicle, and that he _can_ put anything in it he wishes. Software
should be no different.

> Then there is the problem that mechanical parts are easy to inspect

 _Some_ of them are. Some look just like the correct parts, but were
manufactured to incorrect tolerances. These parts _wouldn't_ be obvious in a
visual inspection.

> software modifications are pretty impossible to detect. Short of
> checksumming the whole car software, it cannot be done.

Software hashes aren't rocket science: your post & mine were both hashed at
least once. Indeed, software hashes make detecting changed software _easier_
than detecting swapped physical parts.
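The hash check is genuinely simple. A minimal sketch in Python (the firmware
path and known-good digest here are made-up examples, and in practice an
inspector would have to read the image out of the ECU's flash directly rather
than trust the possibly-modified software to hash itself):

```python
import hashlib


def firmware_sha256(path: str, chunk_size: int = 65536) -> str:
    """Hash a firmware image in chunks so large files need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def is_unmodified(path: str, known_good_hash: str) -> bool:
    """Compare against a published hash for this exact firmware version."""
    return firmware_sha256(path) == known_good_hash
```

Any single-bit change to the image produces a completely different digest, so
detection reduces to comparing two hex strings.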

~~~
_ph_
You need to have a basic set of mechanic skills to take a car apart and put it
back together at all. Also, most mechanical work is not rocket science.

Software is an entirely different beast. It might be quite trivial to modify
parameters, but verifying that the software still works reliably might require
a whole testing department.

Yes, software can be checked for modifications via hash codes, but how do you
expect a cop to run a hash sum on the vehicle software? You would have to read
out the memory itself, because how could you trust any possible self-test of
the software?

So, a car could only allow signed software to be loaded, but that again seems
to be incompatible with the GPLv3.

~~~
wtbob
> It might be quite trivial to modify parameters, but verifying that the
> software still works reliably might require a whole testing department.

That responsibility (for running full tests, or being responsible for damages
caused by _not_ running full tests) lies squarely on the modifier/owner. We
don't need to erect a whole new set of laws because somebody could make his
engine run poorly.

> Yes, software can be checked for modifications via hash codes, but how do
> you expect a cop to run a hash sum on the vehicle software?

I don't. Why should a cop care what software I'm running, any more than he
cares what brand of brake light I buy?

It's the owner's car. If he modifies it to be unsafe, then he's responsible
for the damages he causes. If he doesn't, he's not. Checksumming may be useful
in a court case to prove modifications (although diffing would work just as
well).

~~~
_ph_
A malfunctioning car might endanger its passengers and other people. For that
reason, car makers have to get certification for any new model they want to
bring to the market, and later modifications have to be done either with
parts certified for that car or with new certification (at least here in
Europe). Finally, it must be ensured that after the modifications the car
still complies with the emission standards.

So there are already popular modifications done to cars by their owners, which
do get checked by cops for their certification, and often enough people do not
have the necessary paperwork. Simple example: some people install custom
wheels which are the wrong size for that car. This poses a danger as the
drivability might suffer.

If you are modifying any software required for the safe operation of the car,
there is quite a potential for causing harm to others, and that is where the
officials have to take notice of it.

------
IncRnd
You are arguing things you do not fully know. If you want open source
examples: goto fail; was in open source. Can you quantify how many MITMs
happened? Stuxnet had open source for PLC infection. That could have wounded
some people. Heartbleed was in open source. Guess how many times that was
exploited by governments, with sensitive information disclosed?

The list of open source failures is long. It is not true in and of itself that
the open source "more eyes" argument is correct. Do you audit the source code
of everything you compile for security issues?

One could convincingly argue, just based upon DROWN, FREAK, Logjam, and POODLE
that code in the open doesn't get inspected and those who do inspect it miss
an extensive number of security issues.

I'm not sure how many people were harmed by this bug in MRI software, but the
bug resulted in false positive rates of up to 70%:
[http://www.theregister.co.uk/2016/07/03/mri_software_bugs_co...](http://www.theregister.co.uk/2016/07/03/mri_software_bugs_could_upend_years_of_research/)

------
slavik81
Do open source developers actually want to take on the liability that comes
from distributing software for use in safety-critical environments? One
mistake could leave someone with a lifetime of medical bills, and they'll sue
anyone with money involved in it.

~~~
aylons
You don't seem to realize that the software license means no warranty, but
that does not preclude anyone from giving support and warranty on the product.

So yes, GPL'ed software may be used on critical applications the same way as
every other piece of hardware and software: via special contracts.

~~~
slavik81
I realise that. Hell, I've done that. As a licensed engineer, I would just be
very concerned about contributing to an open source project that controlled
critical systems in cars. That sounds like all the liability of my actual
employment, but none of the pay.

~~~
aylons
Unauthorized use of your code or product for uncovered applications is a
problem no matter whether the source code is open or not. Companies exclude
themselves from such liability as a matter of policy, demanding special
contracts in these cases.

If you look at any IC datasheet, or even an application-specific brochure
such as (1), you will see a notice explicitly saying that.

(1)[http://www.ti.com/lit/slyb108](http://www.ti.com/lit/slyb108)

------
dagaci
"Security Through Obscurity Kills People" is a very catchy phrase and great
rhetoric. However, it is also one of those statements which lower critical
thinking and allow people to talk without producing stats or facts or
demarcations of when an idea should or should not apply.

For example, we are told repeatedly to use obscure passwords, lock our phones
and tablets with 4-digit numbers, and even swipe-the-screen gestures; bank
account information is protected by numbers and passwords! And all this in
most cases is an obscurity scheme, and that obscurity scheme is our single
point of security failure before complete system access.

"Heartbleed" was present in OpenSSL for over a decade, out in the open, and
nothing was detected. ...

p.s. slightly off-topic, but can you accurately count the number of ball
passes?
[https://www.youtube.com/watch?v=47LCLoidJh4](https://www.youtube.com/watch?v=47LCLoidJh4)

~~~
greenleafjacob
The 4 digit PIN is rate limited. How is that a single point of failure?
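Back-of-the-envelope arithmetic shows why rate limiting matters here (the
attempt rate and the 30-second delay below are illustrative assumptions, not
any vendor's actual policy):

```python
PIN_SPACE = 10 ** 4  # 4 decimal digits -> 10,000 possible PINs

# Unthrottled: an attacker who can try PINs electronically at 100 per second
# exhausts the entire space in under two minutes.
unthrottled_seconds = PIN_SPACE / 100

# Throttled: a hypothetical policy of one attempt per 30 seconds stretches a
# full sweep to days, and most devices hard-lock or wipe long before that.
throttled_days = PIN_SPACE * 30 / 86_400

print(unthrottled_seconds, round(throttled_days, 1))  # 100.0 3.5
```

The obscurity of the PIN does almost no work on its own; the lockout policy
is what turns a 10,000-element search space into a real barrier.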

~~~
phreeza
For example if someone skims the code.

------
yuhong
This ignores the issue of actually modifying the software.

~~~
phire
Proprietary software doesn't stop people modifying software, it only slows
them down.

And it doesn't slow people down anywhere near enough, hence all the talk
about DRM, encryption, warranty voiding and anti-circumvention laws.

There is nothing wrong with allowing people to modify software; the issue is
when people modify software to break laws (environmental law, in this case).

The solution is going to be some form of compliance certification/testing.
The manufacturer provides the default software and obtains a certification to
prove it complies with the laws. If the user wants to modify the software,
they are going to have to prove that the replacement software also complies
with the laws and get a certificate.

During annual vehicle inspections or licensing, they can check that certified
software is installed. Or maybe the ECU's bootloader only allows certified
software to be installed, unless a developer mode is enabled (which would
require the owner to keep logs to ensure compliance).
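The bootloader check in that last paragraph can be sketched in a few lines.
Hedged sketch: a real ECU bootloader would use asymmetric signatures (the ECU
stores only a public key, so possessing the ECU doesn't let you certify
images); the symmetric HMAC below is just a stdlib stand-in for the
verification step, and the key and image values are invented for illustration:

```python
import hashlib
import hmac


def certify(image: bytes, authority_key: bytes) -> bytes:
    """Certification authority: tag an image that passed compliance testing.
    (Simplification: a real authority would produce an asymmetric signature.)"""
    return hmac.new(authority_key, image, hashlib.sha256).digest()


def boot(image: bytes, tag: bytes, authority_key: bytes) -> str:
    """Bootloader: run the image only if its certification tag verifies."""
    if hmac.compare_digest(tag, certify(image, authority_key)):
        return "booting certified image"
    return "refusing uncertified image"
```

A user-modified image is then rejected until it goes back through the
certification process and gets a fresh tag, which is exactly the compliance
gate the comment proposes.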

~~~
nickpsecurity
"Proprietary software doesn't stop people modifying software, it only slows
them down."

See the DMCA and licensing agreements. Modifying your stuff might be a felony
someday. Best to be sure the right to modify or repair is right there in the
license, irrevocably and transferring to the next buyer automatically. Not
happening right now.

~~~
0xcde4c3db
Making hacking of your own property into a felony has already been attempted
under current law, after a fashion. Sony's lawyers claimed that George Hotz's
PS3 exploit was a violation of the CFAA because he hacked "the PS3 system",
and "the PS3 system" belonged to Sony (apparently equivocating between
individual consoles and PS3 as a platform) [1]. I'm not sure whether a judge
ever actually addressed this argument, since Hotz settled. And of course
that's a different situation than actually being indicted under the CFAA, but
you can see the gears turning.

I seem to recall that a similar argument was used against Jon Johansen at some
point, but I don't feel like trying to dig up early-2000s history on the web
today (IME going back to ca. 2003 is okay, but before that _a lot_ of things
have fallen down the memory hole).

[1] [http://volokh.com/2011/01/13/todays-award-for-the-lawyer-who...](http://volokh.com/2011/01/13/todays-award-for-the-lawyer-who-has-advocated-the-silliest-theory-of-the-computer-fraud-and-abuse-act/)

~~~
mwfunk
I am all in favor of safety-critical software being open source, but I dislike
the argument of forcing the issue by claiming the right to tinker, or really
any variation on "I bought it therefore I can do whatever I want to with it".
It's not true when the hardware in question has safety implications. For
example, when I buy a car, I do not have a right to do anything I want to with
it. I can't drive it in ways that would be dangerous to others, and the fact
that I am legally restricted as to where and how I may drive it does not feel
like an inalienable right being taken away from me. That's clearly a strawman
but bear with me.

In a similar vein, if I were to physically modify the engine in such a way as
to make it faster but fail environmental or safety regulations, that could
also have financial or legal repercussions for me (or Volkswagen, to cite a
recent example :). The same argument could be made against modifying
automotive software: vehicles are certified by government agencies around the
world based on the idea that they will perform a certain way WRT safety and
emissions, therefore software behavior has to be just as unchanging and
predictable as hardware behavior. The fact that people can and do hack their
cars all the time doesn't change the regulatory environment for car companies.

I do believe that embedded software should be as open, modifiable, and
testable as possible, but I don't think people claiming the right to tinker as
a basic human right are going to change anyone's minds about this. The biggest
impediments to (for example) car companies making their ECUs open source and
hackable have nothing to do with a dogmatic attachment to proprietary software
and security-through-obscurity (although those concepts have their adherents,
misguided as they may be), and much more to do with compliance with government
regulations and minimizing their potential liability.

There is at least a perception that having open source, hackable ECU software
would be all downside with no upside to the companies selling those vehicles.
The way to change this perception is to show it as being demonstrably wrong,
rather than to simply claim a right to tinker. The best way to get open source
ECUs is to either make the consumer market care about this (not likely) or to
update the regulatory environment in such a way that car companies have a non-
abstract motivation to go open source.

This is not unlike the dreaded binary blobs required to use certain 802.11
chips on open source operating systems: they exist because of FCC
requirements, not because the vendors love binary blobs. They are allowed to
ship software-defined radios, but the only way they can guarantee that their
SDRs don't broadcast on illegal frequencies is to lock down the software that
controls the hardware. It's not an ideal solution, but it's the only solution
they've got aside from spending even more money to ensure safety at the
hardware level, which wouldn't do anyone any good.

~~~
nickpsecurity
"It's not true when the hardware in question has safety implications."

It's a sensible counterpoint. The modification would be, if it's safety-
critical, that you have a right to get modifications done by professionals
who aren't the original seller, or to perform them yourself if you are such a
professional. This covers the biggest use-cases of repairs and enhancements
respectively.

EDIT to add:

" is to lock down the software that controls the hardware. It's not an ideal
solution, but it's the only solution they've got aside from spending even
more money to ensure safety at the hardware level"

Not true. They could do it right at the hardware level with simple bounds
checks burned into logic, with the checks' limits set at manufacture or OEM
configuration using something like anti-fuses. That's the kind of tech that's
in 8-bit and 16-bit MCU's that retail for a few bucks. It costs almost
nothing, and it's also more secure given its simplicity. They aren't doing it
because they don't care, given the demand side of this issue. Plus, lock-in
among oligopolies brings them long-term financial benefits. ;)
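The write-once bounds check described above can be sketched in C. This is a
minimal illustration, not any real ECU's interface: the function names,
throttle values, and the boolean flag standing in for actual anti-fuse
hardware are all hypothetical.

```c
#include <stdbool.h>

/* Hypothetical sketch of a write-once bounds check, mimicking limits
 * burned into anti-fuses at manufacture or OEM configuration. After
 * fusing, every actuator command is clamped against fixed limits that
 * can never be rewritten. All names and values are illustrative. */

static int throttle_min, throttle_max;
static bool limits_fused = false;

/* One-time configuration: emulates blowing the anti-fuses. Returns
 * false if the limits were already set or the range is invalid. */
bool fuse_throttle_limits(int min, int max) {
    if (limits_fused || min > max)
        return false;   /* fuses can only be blown once */
    throttle_min = min;
    throttle_max = max;
    limits_fused = true;
    return true;
}

/* Every requested command passes through the fixed bounds check. */
int clamp_throttle(int requested) {
    if (requested < throttle_min) return throttle_min;
    if (requested > throttle_max) return throttle_max;
    return requested;
}
```

In real silicon the clamp would live in combinational logic rather than
software, which is what makes it immune to any later firmware modification.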

------
nxzero
Arguing over whether code being publicly readable makes the code more secure
is obviously BS.

Case in point: take code which is closed source, make it open source, and it
does not magically become more secure.

Security of code depends on the effort made to make it secure; that is, more
eyeballs actually reviewing it, more security.

------
PinguTS
The author is arguing wrong. Because especially automotive is not about
security, it is a about safety. That said, security can influence safety, but
the goal is completely different.

A system is considered functionally safe, when any unexpected input lead to an
situation where the system safely shut down. So it does not matter if we have
real security or security by obscurity. Because if something unexpected
happens, the systems shut down and becomes unavailable. That is what we have
in general automation. The same applies for a railway. If something happens
then the system shuts down, an emergency brake is issued. That works, because
there is a higher level safety designed in, that the currently occupied rail
is blocked and no other railway can enter.

Then we have highly available systems, or mission-critical systems as we say.
An airplane or a car, for example. The engine of a car can shut down at any
time; that is not considered harmful. But the brakes are mission-critical:
they have to work under all circumstances.

As such, safety-related systems are highly regulated and require a mandatory
third-party review called certification. You need to certify with TÜV or
another such organisation, and part of that mandatory review is disclosing the
source. That makes the review argument for GPLv3 moot, as there is already
mandatory review for safety-related systems. The license of the software does
not matter.

The author then argues that the Tesla incident would not have happened with
GPLv3 software; he argues it happened because of proprietary software. That is
wrong: it would have happened with GPLv3 software as well. The incident
happened because of false claims by Tesla. The system was never designed to
handle this kind of situation; MobilEye, the supplier behind it, said this
kind of situation should be handled by the 2018 version. Tesla gave a false
impression of the completeness and safety features of its Autopilot. BTW, this
is the same Tesla that uses GPLv3 software in its MMI, which they have locked
down.

Then the author argues with VW and their false claims regarding emissions. As
was proven, the ECU came from Bosch with proprietary software. The same ECU is
used by many different OEMs, and Bosch does not disclose its inner workings,
as that is exactly what Bosch sells. Even VW did not have the source, but used
the ECU in a testing mode. So the problem here is also only partly related to
the licensing of the software. It is more that you can use software for good
causes and for bad causes; either way, it is not related to licensing.

The argument that GPLv3 code is reviewed is also unproven and sometimes even
wrong. In the past there have been several examples where review didn't take
place. It was more like "yeah, I think someone has reviewed it because it is
open source and I could too. But why waste time, instead of building on top of
it?" So GPLv3 can lead you to false assumptions, unless you really do the
review yourself. Which raises the question: how many have really done that?

------
aaron695
I have a rule, if you mention Therac 25 then you lose.

It's a cliche. If it were an example of a common problem, then it wouldn't
have to be mentioned every single time.

~~~
Flimm
Hmm, just because it's mentioned frequently doesn't necessarily mean that it
should be dismissed or deserve an automatic "you lose". It doesn't necessarily
mean that other instances of the same problem aren't common or couldn't happen
again, but it could mean that that particular instance is the most famous.
Your so-called "rule" is a faulty mental shortcut.

------
zaidf
Passwords are the ultimate form of security through obscurity.

~~~
__david__
a) You apparently did not read the article.

b) Passwords are not "obscurity", they are "secrets". "Obscurity" in this
context is about cryptographic algorithms/systems.

