
Germany adopts first ethics standards for autonomous driving systems - blahedo
https://360.here.com/autonomous-cars-get-a-lesson-in-ethics
======
Bucephalus355
In the 1960s and 1970s there were many efforts to professionalize software
development and bring it in line with other engineering practices. This would
include things like licensing boards, a code of ethics, etc. Obviously at first
blush this sounds ridiculous, but like Wikipedia, it seems not to work in
theory yet works just fine in the real world (as the other engineering
disciplines have shown). This is actually where the term “software engineer”
dates from; it was partly a marketing / professionalization attempt.

One thing this would lead to (could be good or bad) is people being “stripped”
of their license to practice software engineering. After the 1981 Hyatt
Regency walkway disaster in Kansas City, in which 114 people were killed, this
is exactly what happened to the engineers responsible for the walkway. They
forgot the software equivalent of a line-ending semicolon when they eliminated
a single-support bolt for the catwalk accidentally during design review. It
collapsed during a holiday party, killing those above and those below. Perhaps
as we start seeing more “building computers”, sometimes called “architectural
robotics” (a super fascinating subject if you have a chance to look it up,
there are already prototypes out there), a software engineer causing a similar
amount of destruction might not be so farfetched.

~~~
slavik81
> They forgot the software equivalent of a line-ending semicolon when they
> eliminated a single-support bolt for the catwalk accidentally during design
> review.

Absolutely not. Their original design supported only 60% of the minimum load
specified by the Kansas City building code. However, that's not why they
changed it. A supplier pointed out that the design was impractical to
assemble, and suggested an alternative. The engineers in charge accepted that
alternative without doing any calculations about its effect on the load-
bearing capacity of the walkways. Had they done so, they would have seen that
the new design could support a mere 30% of the minimum load required by the
building code. Their engineering process was flawed from start to finish. If I
had to make an analogy to software, it would be a company that regularly did
development on the production box.
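The 60% and 30% figures above have a simple back-of-the-envelope form, assuming the widely reported mechanism that the as-built change made one box-beam connection carry both walkways' load, doubling it and so halving the margin:

```python
# Back-of-the-envelope sketch of the walkway numbers in the comment above.
# Assumption: the as-built change made the fourth-floor connection carry
# the load of both walkways, roughly doubling its load and halving its
# margin relative to the code minimum.

CODE_MINIMUM = 1.0  # normalized load required by the building code

original_capacity = 0.60 * CODE_MINIMUM    # original design: 60% of code
modified_capacity = original_capacity / 2  # load doubled -> margin halved

print(f"original design: {original_capacity:.0%} of code minimum")
print(f"as-built design: {modified_capacity:.0%} of code minimum")
```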

Engineering is not about never making mistakes. Forgetting a semi-colon is
common, and any software engineering process that depends on people never
forgetting something that trivial is fatally flawed. When you make a mistake
in vehicle control software, there should be many layers of failsafes to catch
it, such as: the compiler, automated unit tests, automated acceptance tests,
code review, manual feature testing (with a simulator), manual release testing
(with a simulator), integration testing on a vehicle that has been
immobilized, integration testing on a closed track, and production environment
trials.
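As a toy illustration of those layers (the stage names are made up; a real process would be far richer), here's a release gate that only promotes a change when every layer signs off:

```python
# Hypothetical release gate: a change ships only if every failsafe layer
# passes in order. Each "stage" is a stand-in for a real check (compiler,
# unit tests, simulator runs, track trials, ...).

STAGES = [
    "compile",
    "unit_tests",
    "acceptance_tests",
    "code_review",
    "simulator_feature_tests",
    "simulator_release_tests",
    "immobilized_vehicle_tests",
    "closed_track_tests",
    "production_trial",
]

def promote(results: dict) -> bool:
    """Return True only when every stage passed; report the first failure."""
    for stage in STAGES:
        if not results.get(stage, False):
            print(f"blocked at: {stage}")
            return False
    return True

# One slipped mistake is caught by a later layer instead of shipping:
results = {s: True for s in STAGES}
results["closed_track_tests"] = False
assert promote(results) is False
```

The point isn't the code; it's that a single forgotten semicolon has to get past every one of these gates before it reaches a vehicle.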

Keep a few mistakes between yourself and disaster. One mistake is a very slim
margin.

~~~
djsumdog
A better analogy would be promoting a service to production with the unit
tests failing (or not having an automated CI process that ran the unit tests
... or not having unit tests).

There are different degrees of software engineering. If you work for a company
that makes pacemakers, there are very strict software testing requirements.
The same goes for the developers of real-time avionics systems for aircraft,
ECUs for cars and autonomous train systems in places like Singapore or London.

If you're writing software that tracks inventory and ships out shoes, mistakes
aren't life threatening and you might have faster processes to get things into
production. If you reuse that software for shipping out medical equipment for
people with diabetes or sleep apnea, you could run into life-threatening problems
if the testing process is not updated to match the seriousness of the problem
you're trying to solve.

~~~
slavik81
> There are different degrees of software engineering.

That's true, but at some point it's no longer engineering. To practice
engineering in my jurisdiction, it's required that you hold a professional
license and work for a company with a permit to practice.

I am a professional software engineer (P.Eng.), but many employers for non-
critical software do not have a permit to practice. They have no need for it.

I need to file my Continuing Professional Development hours every year. One
category of hours is just "practicing engineering". However, if I am working
for a company that does not have a permit to practice, it would be quite a
problem if I claimed I was practicing engineering.

Practically speaking, that means software engineering is just the development
of software which could be dangerous (at least where I live). It's
distinguished from general software development mostly by the need for rigour.

~~~
gpvos
Fascinating! Which country is that?

~~~
slavik81
Canada.

I should also note that anyone can assist an engineer, so it's not like
Computer Scientists are cut out of the field. Many disciplines have
technicians and other specialists who are just as technically capable as the
engineers in their domain (often more so).

The lead engineer signs off on releases, and they're the main person held
responsible if there's a safety incident traced back to sloppy work. That was
the only real difference between the engineers and non-engineers I worked
with. It basically just meant, "the buck stops here."

------
azinman2
“While vehicles may react autonomously in the event of emergency situations,
humans shall regain control during more morally ambiguous events.”

That’s not going to end well. As we saw in the AZ crash, when a situation
arises, it’s hard to quickly transition attention back to a human who hasn’t
been paying attention, precisely because the car had been handling everything
just fine. It’s unrealistic to think that’s possible in anything less than
about 10 seconds.

~~~
Roritharr
Control could mean ahead of time. So basically the driver gets, ahead of time,
a couple of gut-wrenching morality questions like: "Would you rather crash
into two kids on a bicycle, or a woman with a stroller?"

That's what I call an intense first-time-setup session.

~~~
Normal_gaussian
Two months later, two kids cross the road in front of you; the system
misidentifies an overflowing bin as a woman with a stroller and proceeds to
plough right through the children.

As this was a technical fault you suffer no legal repercussions, but carry the
mental despair over whether you answered the setup question correctly to an
early grave.

------
lumberjack
In the US regulation is reactive, but you can sue for damages. In the EU
regulation tries to be proactive, but it is much harder to sue for damages.

So from the perspective of a startup trying to get in this market, this sort
of environment might prove superior, because your business risk is much
reduced. You know what sort of standards you must follow and as long as you
follow those standards you do not have to worry about being blamed for
negligent murder by the family of every accident victim.

~~~
pitaj
Yes but complying with that regulation is a larger up front cost, which raises
the barrier to entry.

------
gsich
best part: "Drivers retain sole ownership over whether or not their vehicle
data is forwarded or used by third parties."

I like how it makes clear that the data your device generates is your data,
and only yours. It's like stating the obvious.

~~~
tomohawk
It doesn't say that there must be means for you to access your data, though.

And, you'll likely have to sign a waiver granting the manufacturer access to
your data or they won't sell you the car.

~~~
tgsovlerkhgsel
European courts may take a rather dim view on that. If consent is required to
be given freely, they might determine that this sort of forced consent isn't
valid. There are also limitations on what you can put in the fine print, and
"fine print" is often defined rather broadly, to the point where putting it on
another form requiring a separate signature may not save you.

------
phyller
I like these, but I think that this one is too evasive: "In any driving
situation, the party responsible, whether human or computer, must be clearly
regulated and apparent." In reality, who is responsible for what is going to
be the main problem that is faced on a day to day basis, and not attempting to
define exactly how that works takes a lot of meaning out of the standard. Other
than that, it's largely stating what I would think most people would have
assumed anyway. Which is a good thing and I'm glad they are doing it.

~~~
CM30
Maybe the cars could actually say specifically who's in control at the moment?
Like a little sign that either says 'human' (when under human control) or 'AI'
(when under computer control). Perhaps in the same place you'd put a taxi sign
on that kind of vehicle.

Maybe then with a built-in logging system that records every time control
changes, and deletes records once they're irrelevant (say, three changes back).
Then just take the data whenever there's a car crash, and hey Bob's your
uncle.
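A minimal sketch of that logging idea; the three-change retention comes from the comment above, while the names and API are invented for illustration:

```python
from collections import deque
from time import time

class HandoverLog:
    """Keeps only the most recent control handovers, dropping older ones."""

    def __init__(self, keep: int = 3):
        self._events = deque(maxlen=keep)  # old entries fall off automatically

    def record(self, controller: str) -> None:
        assert controller in ("human", "AI")
        self._events.append((time(), controller))

    def snapshot(self) -> list:
        """What investigators would pull after a crash."""
        return list(self._events)

log = HandoverLog()
for who in ("AI", "human", "AI", "human"):
    log.record(who)
# Only the last three handovers survive:
print([who for _, who in log.snapshot()])  # -> ['human', 'AI', 'human']
```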

~~~
phyller
I think the more difficult problems are if the car crashes while under
computer control, who is at fault. Also, when a car crashes under human
control, why didn't the software take over to prevent that, are car companies
liable for that?

~~~
CM30
Well then it'd require an investigation after the fact. If the computer has
been modified, it's the driver or garage's fault. If it hasn't been modified,
then it's the company's fault.

So in most cases, the likes of Tesla or Uber or whoever pay the other party's
insurance. If it can be proven that a certain programmer or team screwed up,
then they're personally held liable and sent to prison if necessary.

That'd certainly encourage decent standards here. Do your job poorly and have
it tied to your actions, and you go down for 20 years.

~~~
phyller
That sounds like a really bad idea. Pay the insurance, fine. That's a good
idea actually. Instead of paying your insurance company, you pay the company
that built the software for insurance. Then the software/insurance company
pays for any damages caused by the car. The best software could offer the
cheapest insurance.

But put developers in prison for 20 years because the car crashed, that's
untenable. You wouldn't have decent standards, you just wouldn't have anything
at all. What kind of developer would put themselves in that position? Everyone
makes mistakes. All complicated software has bugs. You would only have
software made by fools, and marketed by fools, that was much more dangerous.

If someone makes software that reduces accidents from 10% of drivers per year
to 0.1%, they still cause 100,000 accidents a year in the USA, maybe a few
thousand fatalities. Give them the electric chair right?
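For what it's worth, the arithmetic above works out if you assume roughly 100 million drivers, a round hypothetical figure:

```python
# Rough check of the figures in the comment above, assuming ~100 million
# drivers (a round hypothetical; actual US licensed-driver counts differ).
drivers = 100_000_000

accidents_before = int(drivers * 0.10)   # 10% of drivers per year
accidents_after  = int(drivers * 0.001)  # 0.1% of drivers per year

print(f"before: {accidents_before:,}")
print(f"after:  {accidents_after:,}")
print(f"accidents prevented per year: {accidents_before - accidents_after:,}")
```

The software prevents two orders of magnitude more accidents than it "causes", which is the commenter's point about punishment versus reward.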

If you are going to punish them for the mistakes, will you reward them
commensurately for success? For every life they save, can you give them a
life?

------
sarabande
This actually happened in August 2017
([http://www.bmvi.de/SharedDocs/EN/PressRelease/2017/128-dobri...](http://www.bmvi.de/SharedDocs/EN/PressRelease/2017/128-dobrindt-
federal-government-action-plan-automated-driving.html)).

I'm curious how this blog post by HERE (9 months later) differs from the
earlier official statements.

~~~
camillomiller
It’s a company blog from HERE Maps; they just ran out of topics, I guess. By
the way, Dobrindt is not even a minister anymore. The new Federal Minister of
Transport and Digital Infrastructure is Christian Schmidt.

~~~
ofrzeta
The Federal Minister of Transport and Digital Infrastructure actually is
Andreas Scheuer. Christian Schmidt was only interim.

~~~
camillomiller
Gee, they go so faaaast :) Thanks for the correction!

------
carapace
I swore I was going to try to _not_ post on HN today (it's Mother's Day), and
this is a little off-topic, a little.

I was watching car accident videos on youtube yesterday (I got sucked in and
couldn't look away. I lost two hours.)

Almost all of them were due to someone doing something obviously stupid or
wrong: turning without checking for oncoming traffic; passing unsafely; over-
correcting and then losing control; mis-judging clearance with stationary
obstacles; tailgating; etc...

It occurred to me, what if we program the robot cars (and can we _please_ call
them _"auto-autos"_!?) just so that they won't engage in dangerous behaviour?
The basic idea is, if the way is not clear, slow down.

What I mean is, it may just be Too Hard to make a robot drive a car like a
car; but it should be already possible to make a _safety_ vehicle that
prevents human error.
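That "if the way is not clear, slow down" rule has a simple physical form: never travel faster than you can stop within the distance you can confirm is clear. A toy sketch using v² = 2ad, where the 6 m/s² braking figure is an assumption, roughly hard braking on dry asphalt:

```python
import math

def max_safe_speed(clear_distance_m: float, decel_mps2: float = 6.0) -> float:
    """Highest speed (m/s) from which the car can stop within the
    confirmed-clear distance, from v^2 = 2*a*d. The 6 m/s^2 deceleration
    is an assumed figure for hard braking on dry asphalt."""
    return math.sqrt(2 * decel_mps2 * max(clear_distance_m, 0.0))

# Less confirmed-clear road -> lower speed cap; no clear way -> stop.
for d in (0, 10, 50, 100):
    print(f"{d:>3} m clear -> cap {max_safe_speed(d) * 3.6:.0f} km/h")
```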

------
tomohawk
> Autonomous driving systems become an ethical imperative if the systems cause
> fewer accidents than human drivers.

This makes a certain kind of sense if viewed totally in a vacuum, apart from
any other considerations. But life is not all about the car. Is human choice
unimportant? What about freedom to travel where I want and when I want? Is
curtailment of those and other freedoms to be sacrificed based on statistics?
Whose statistics? How will these statistics be gathered? Who will interpret
them? Through what lens?

This sounds like the kind of reasoning that made it illegal in many
jurisdictions to let your kids play outside on their own.

~~~
icebraining
I suggest you read the actual report, rather than the soundbites. They put a
lot of importance on the question of personal freedom:
[http://www.bmvi.de/SharedDocs/EN/publications/report-
ethics-...](http://www.bmvi.de/SharedDocs/EN/publications/report-ethics-
commission.pdf?__blob=publicationFile)

------
instaheat
This reminds me of I, Robot:

[https://en.m.wikipedia.org/wiki/Three_Laws_of_Robotics](https://en.m.wikipedia.org/wiki/Three_Laws_of_Robotics)

------
amelius
I want this rule: software updates, however small, shall not be installed
without X hours of testing by a government institution.

~~~
tobyhinloopen
That sounds extremely expensive, why would you want that?

~~~
amelius
I foresee companies pushing quick and dirty updates, sweeping critical
problems, or problems with their internal software development cycle under the
carpet. I want transparency in how companies deal with problems and software
updates.

Expensive is a feature in this case. It ensures that stuff gets tested well
before companies even consider having it double-tested by the govt.

~~~
tgsovlerkhgsel
That's a great way to make sure software doesn't get updated at all unless
it's completely unavoidable.

------
plink
Will Germany be applying the same enforcement to these standards that they
applied to VW and emissions?

~~~
Sylos
Probably, yeah. Our car industry is a driving force of our economy, so
politicians don't punish it as hard as they punish other industries.

~~~
josefx
After a quick google I get two results that disagree:

* wikipedia claims ongoing investigations against 37 people involved

* Politicians are working on the Musterfeststellungsklage, a class action equivalent for the German justice system. Explicitly so that buyers have better odds suing VW.

~~~
Sylos
Which in my perception is still not nearly as hard as other industries would
have been punished for similar crimes. I did not claim that they don't get
punished at all.

------
jasonjei
I wonder what the solution to a trolley problem would be for autonomous
vehicles. Would a trolley problem situation exist if all cars are autonomous?

~~~
supernova87a
The trolley problem has been and continues to be a stupid distraction for
blogs and would-be pundits to insert some kind of false ethical puzzle into
what is a very clear set of rules of the road.

There is no trolley problem. When vehicles are on the road and operating
according to the rules of the road, there is no choice about whether to swerve
to avoid something at the expense of another. Cars are not being commanded to
leave the road to avoid crashing into something on the road. You stay on the
road, stay in your lane, and don't get involved with people on the sidewalk,
etc.

If there's a bicyclist, motorcyclist, etc, they are just as much a vehicle as
you are, occupying their position in another lane, or in front/behind you. You
stop in your lane, change lanes if safe, and that's it.

There's no, "sacrifice 1 to save 3" kind of argument.

~~~
sjwright
Trolley problem situations absolutely can arise; whether the system is
programmed to recognise and consider them is entirely beside the point. All
that matters is how the system actually behaves when tested.

But I do agree that I don't see such scenarios ever occurring in reality
because computer controlled cars are not going to get themselves anywhere near
a trolley situation. They'll put the brakes on long before a human would ever
recognise the situation as catastrophic. Almost all unexpected emergency
situations are straightforward: the correct response will be to brake hard;
the consequences are not ethical but rather dictated by physics.

Swerving onto the footpath is an absurd option for countless reasons, and cars
shouldn't be allowed to do it even if the algorithm is fairly sure the footpath
is clear.

The closest plausible scenario to the _trolley problem_ I can imagine actually
occurring: "An unidentified object has fallen onto the road meters in front of
me, and I am boxed in with cars to my immediate left, right, and rear, so I
can't swerve. Do I brake hard and probably get rear-ended, or do I brake
softly and definitely crash into the unidentified object?"

~~~
andrewaylett
Even then, the chances that the car behind has automatic braking get higher
every year. And if they're so close that they can't brake before hitting you,
that's _their_ fault, not yours. Brake, every time.

~~~
king_phil
"Fault" doesn't matter that much in an accident; saving lives matters. "My
wife and child died. But it doesn't matter, it was someone else's fault," said
no one ever.

~~~
andrewaylett
Not quite the point of what I was trying to convey.

My point was more that your responsibility should be to avoid crashing into
things. In the posted scenario, you're probably better off slowing down and
being rear-ended than smashing into whatever and then _also_ being rear-ended.

On a slightly different point, if someone's following me so closely that I'm
properly worried about their stopping distance, I _will_ slow down and let
them pass. Better that the idiot is in front where I can see them properly and
they can't crash into me, and then I don't have to worry about the car behind
me if I need to brake suddenly.

