
Toyota Wants Cars to React to the Unexpected, and Then Explain Themselves - chewymouse
http://www.technologyreview.com/news/545186/toyota-wants-its-cars-to-expect-the-unexpected/
======
ceejayoz
"I'm sorry we're falling off a mountain right now, but it was us or that
family."

~~~
afarrell
I think that self-driving cars should act in the self-interest of the owners.
It will encourage people to use self-driving cars and decrease road deaths.

Also, if you are at a rail yard and see something hurtling toward 5 workmen
and can push a fat man in front of it to protect them, you should refrain. The
workmen had the opportunity and duty to take actions to prepare for and
prevent rogue train cars. The fat man should not have a duty to be constantly
on guard against assault.

~~~
ceejayoz
> I think that self-driving cars should act in the self interest of the
> owners.

To what extent? At some point, we're going to have to decide as a society what
call a self-driving car has to make in situations like "I can run down this
family in the cross-walk or get rear-ended by that oncoming vehicle".

~~~
zyxley
I'd rather see a focus on smarter and faster braking than on solving contrived
moral dilemmas.
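
To put rough numbers on why reaction time and braking dominate the outcome, here's a back-of-the-envelope sketch. The 1.5 s human reaction time and 7 m/s^2 deceleration are assumed round figures, not anything from the article:

    # Stopping distance = reaction distance + braking distance.
    # Assumed values: ~1.5 s human reaction time, ~7 m/s^2 deceleration on dry road.
    def stopping_distance_m(speed_kmh, reaction_s=1.5, decel_ms2=7.0):
        v = speed_kmh / 3.6                         # km/h -> m/s
        return reaction_s * v + v * v / (2 * decel_ms2)

    print(stopping_distance_m(50))                  # ~34.6 m with human reaction time
    print(stopping_distance_m(50, reaction_s=0.1))  # ~15.2 m if the car reacts in 0.1 s

Cutting reaction time alone roughly halves the stopping distance at city speeds, which matters in far more real situations than any trolley-problem setup.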

~~~
mikeash
I don't understand why this keeps coming up. I have never seen or even heard
of a driver getting into a situation where there's any sort of moral dilemma
in how they react. There's always an obvious best answer or a bunch of
equivalently good answers.

For the one-in-a-trillion cases where this is not true, it's OK if the car
reacts sub-optimally. It will still be _vastly_ better than human drivers, who
frequently react sub-optimally in unambiguous situations.

------
ryanmarsh
Customers Want Cars to Not Kill Them

FTFY

~~~
protomyth
Pedestrians on the sidewalk want the car not to run them over just to avoid
killing the Customer.

~~~
pc86
Fair point, but if the alternative is hitting something that has an equal or
higher likelihood of killing the occupants, I'm not sure what the alternative
is (don't say "avoid the situation in the first place," that's obvious).

Nobody's going to buy an automated vehicle that might intentionally kill them
to save a pedestrian.

~~~
protomyth
I think the alternative is pretty much a no-go. If some poor pedestrian who
was on the sidewalk gets hit by a car to preserve the owner, then the lawsuits
will be huge. Killing bystanders is not going to be acceptable.

Let's look at a picture[1] from an article[2] that made it to HN[3]. If any
program chose A as an acceptable solution, then its maker should be sued into
oblivion. Killing the bystander will kill the driverless car.

1)
[http://www.technologyreview.com/sites/default/files/styles/view_body_embed/public/images/Ethical%20cars.png?itok=A_wye_WL](http://www.technologyreview.com/sites/default/files/styles/view_body_embed/public/images/Ethical%20cars.png?itok=A_wye_WL)

2)
[http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/](http://www.technologyreview.com/view/542626/why-self-driving-cars-must-be-programmed-to-kill/)

3)
[https://news.ycombinator.com/item?id=10439106](https://news.ycombinator.com/item?id=10439106)

~~~
cam_l
People drive SUVs, which have a much higher likelihood than ordinary cars of
killing the occupants of other vehicles or pedestrians. No lawsuits. Just for
perspective, the lawsuit you are talking about is the family of one dead
pedestrian, involved in a freak accident, suing a car company. I mean, how
long could a car company hide the effects of proprietary, closed software
engineered specifically to flout the law?

Also, from your picture and as I have learned from TWD, if you see a herd of
zombies on the road, your self-driving car should probably be programmed to
drive straight through them.

~~~
protomyth
That's not equivalent. We are talking about a vehicle that is programmed by
the manufacturer to do something. The SUV driver is responsible for his/her
actions. When we are talking about a self-driving car, we now have machines
that decide what to do, and killing the bystander tends not to be a
sympathetic item in front of a jury.

As to the second part... I'm not sure how the software will judge the depth of
the horde to see if avoidance is a better answer than just pushing through.
Also, a couple of zombies could really mess up a Tesla versus an SUV or Jeep.
Zombie Apocalypse mode would be a much harder problem.

~~~
cam_l
"The SUV driver is responsible for his/her actions."

Hey, that is what Ford said about the Bronco II.

Anyway, I agree with you in principle that if presented with clear evidence in
court, it would be as you say. My contention is that it would take many years
and a whistle-blower before anything like that would ever come to light. And
there is a much greater likelihood of deaths occurring accidentally due to the
code than by design.

Depth of herd is an issue, to be sure, and this even assumes the car could
tell the difference between zombies and humans... perhaps we need a Waze for
herd avoidance.

~~~
protomyth
> Hey, that is what ford said about the Bronco II.

Programming is treated a bit differently from tire specs.

I get the feeling a lot of programmers are going to be put on witness stands
pretty early in the rollout. Lawyers already like to interview sysadmins.

> Depth of herd is an issue, to be sure, and this even assumes the car could
> tell the difference between zombies and humans... perhaps we need a Waze for
> herd avoidance.

Classically, an IR sensor should tell the car the difference between zombies
and humans. Herd avoidance is probably a better tactic because of resource
depletion.

------
shallowpedantic
Self-driving cars are going to make people even lazier than they already are.
And as for those cars that automatically park themselves - what's going to
happen when 5 cars see only one spot, and they all try to go for that same
spot?

~~~
zyxley
> what's going to happen when 5 cars see only one spot, and they all try to go
> for that same spot?

What happens when 5 human drivers see only one spot, and they all try to go
for that same spot?
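
If anything, automated cars could settle the tie more cleanly than humans do. A purely hypothetical sketch (nothing like this is in the article; the registry and names are made up) of a first-come-first-served claim:

    # Hypothetical shared spot registry: the first claim to arrive wins,
    # every later claim is told to look elsewhere. Made up for illustration.
    class SpotRegistry:
        def __init__(self):
            self.holders = {}                 # spot_id -> car_id

        def claim(self, spot_id, car_id):
            if spot_id in self.holders:       # already taken
                return False
            self.holders[spot_id] = car_id
            return True

    registry = SpotRegistry()
    print([registry.claim("spot-42", f"car-{i}") for i in range(5)])
    # [True, False, False, False, False] -- exactly one car parks there

No standoff, no circling - which is more than can be said for the five human drivers.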

