
Nissan driverless car guilty of “close pass” overtake of UK cyclist - bootload
http://www.bikebiz.com/news/read/nissan-driverless-car-guilty-of-close-pass-overtake-of-uk-cyclist/020846
======
kalleboo
The car clearly identified the bicyclist, so this just seems like poor
training/rules, which is easy enough to fix.

I'm more interested in the second video from the BBC where it couldn't
identify the maintenance vehicle. If the driver hadn't intervened, would it
really have plowed straight into it? It seems like even a current, non-self-
driving car would have intervened with automatic braking (or slowed down
with adaptive cruise control) in that situation.

~~~
tomatsu
> _this just seems like poor training_

Heh. Yeah, passing too closely is what many humans do.

Speaking of potentially murdering cyclists: since these cars already have all
those sensors, they could prevent drivers and passengers from "dooring"
cyclists.

The "dutch reach" (opening the door with your other hand which automatically
makes you look over your shoulder) is unfortunately not commonly taught in
driving schools.

~~~
bootload
_" The "dutch reach" (opening the door with your other hand which
automatically makes you look over your shoulder) is unfortunately not commonly
taught in driving schools."_

Excellent idea, going to use that one.

------
raverbashing
Though the article shows some bias, it's nonetheless something that shouldn't
have happened

More interestingly, this was an LHD car on a UK road, which might have
something to do with it (though the navigation system seemed to be aware of
it).

~~~
m0nty
> shows some bias

The clue is in the name: it's a mag about cycling. It's run by Carlton Reid, a
well-known expert on cycling - his most famous book is "Roads Were Not Made
for Cars", a history of how cars and bicycles intertwined and how motor
transport came to dominate our roads almost to the exclusion of all others. So
I'm not sure it's "bias" so much as "what he thinks about cars and bikes".

> More interestingly

No, that's a minor point. The important thing is that cyclists get killed in
close-pass situations, and those that survive are often put off cycling
permanently. Along with the Uber car jumping a red light in SF, and Uber
basically saying "it doesn't matter", this shows a worrying tendency to pursue
self-driving at any cost, including safety, which many of us were hoping would
be a benefit of such technology.

~~~
raverbashing
Thanks for your comment

You're right that cyclists get hurt or killed in close-pass situations (by
humans).

It seems the navigation system identified the cyclist and avoided them. It
_incorrectly_ failed to leave the legal clearance, but nobody got hurt (the
avoidance itself seems to have worked). It's an unsafe situation, but the car
did not ignore the cyclist.

The chance of someone getting injured, or of an accident happening, seemed
much higher in the Uber case.

------
vesak
This is perhaps the real risk of AI. In the first wave, the industry is
dominated by a wish to make a great product. In the second wave, the bean
counters walk in and start to optimize everything. And then we start getting
the behavior from Fight Club.

I don't see any way out of this loop except penalizing bad business behavior
heavily (on an individual level) while protecting the engineers who fight it.

~~~
na85
Yes, no engineer has ever behaved badly, or been complicit in ethically grey
areas, and all the world's ills are due to business types.

~~~
vesak
Well, sure, there's another point.

Programmers should have a guild/organization similar to the one medical
doctors have, and people who violate its ethics should be barred from
practicing.

~~~
na85
I fully agree. It would solve a lot of problems we currently face in
technology, but there are two issues that I see:

1) Doctors, surgeons, engineers (actual professional engineers, not people
with comp sci degrees who call themselves engineers), lawyers, etc. are called
professionals because they have professional colleges of their peers that
oversee things like ethics and can revoke licenses. They're self-regulating.

This works because for things like medicine and engineering and law, there is
a large barrier to entry. You have to go to a recognized school and pass an
ethics/comprehensive exam. This isn't so with programming. There's almost no
barrier to entry.

Any kid can open up a tutorial on MySQL and call themselves a back-end
engineer. Unless the software industry is willing to require formal
qualifications on all their hires, this will remain a problem.

2) One of the great things about software is its democratic aspect. Software
is great precisely _because_ any random kid can open up a tutorial and
notepad.exe and start writing code. Rectifying problem #1 would, to a certain
extent, kill that democratic nature of learning, which is a large part of why
many people are in the industry today. I also think a large portion of people
who work in software would be highly resistant to this sort of thing, to say
nothing of the difficulty of enforcing ethical behaviour when you can't see
what's going on due to proprietary licensing and obfuscation.

~~~
vesak
You're right. But then again, let's consider how little time has passed, even
if the progress has been very fast.

Anyone could have called themselves a doctor-analogue 100 years after the
invention of "medicine", too.

------
anotheryou
Would a radar reflector on my bike help me? Like this but smaller:
[https://aceboater.com/media/guide/1/radar-reflector-3.jpg](https://aceboater.com/media/guide/1/radar-reflector-3.jpg)

~~~
ptaipale
This was because of a decision made by the software, not because it failed to
detect the cyclist.

~~~
anotheryou
Are you sure? I read that bicycles are really hard to detect. And I thought a
bigger radar signature, bumping up some confidence value, might influence the
behaviour too.

~~~
manmal
You can see the cyclist on the bottom screen in the video, represented as a
yellow speck.

------
discordianfish
Playing devil's advocate here... Couldn't you argue a driverless car simply
needs less space to guarantee the same or better safety for the cyclist?

~~~
tmnvix
A cyclist can swerve very quickly for all sorts of reasons. Drivers (human or
computer) need to allow space for this.

~~~
phicoh
Swerving is a good way to get yourself killed. If you find yourself swerving
toward the center of the road without making sure no traffic is trying to
overtake at that moment, then you have to figure out how to avoid doing that.

Relying on each and every driver to leave enough space just in case you need
to swerve is a losing proposition.

~~~
jeennaa
A cyclist should be treated the same as any other vehicle on the road. They
might need to swerve to avoid an obstacle while the car is overtaking, just as
a car may need to swerve while another car is overtaking it. When a safety
margin is possible, as it was here, it costs nothing to apply one.

~~~
phicoh
I don't know where you live that you have that much space. My personal rule
is: don't swerve. If you don't have time to signal that you are going to move,
or to look and make sure other traffic is not in the way, then either brake
hard or continue straight (with the obvious exception of when not swerving
would cause you to hit a person).

There are just too many roads where you can't swerve without causing a head-on
collision. If you make swerving a reflex, then you will also do it when there
is no space for other traffic to avoid you.

------
dabeeeenster
Amazing how Nissan have managed to get their car to emulate 99% of London
drivers.

~~~
phicoh
The Dutch rule is basically that the driver is liable if an accident happens,
even if it's the cyclist's fault (unless the driver can prove that the cyclist
deliberately caused the accident).

So there are not a lot of rules about what you should do. And this kind of
close passing is required often enough on narrow streets that cars usually
don't swerve out even when there is more space.

That means self-driving cars would have to have a per-country profile for how
to behave. If you have to do that for every country in the EU, developing
self-driving cars becomes quite a bit more expensive than under the current
system, where there is a common market.

~~~
Zak
Self-driving cars should try to avoid collisions regardless of fault and
should not adjust that behavior based on who the local liability laws will
hold responsible.

As far as traffic regulations specifying behavior go, yes, they differ from
country to country and self-driving cars do need to be able to follow them.
That doesn't necessarily mean the cars need to learn to follow a bunch of
weird rules; it may be more of a lobbying problem. "Standardize your rules so
we can offer self-driving cars that are X times safer than human drivers in
your country" isn't a terribly hard sell.

~~~
gens
>Self-driving cars should try to avoid collisions regardless of fault and
should not adjust that behavior based on who the local liability laws will
hold responsible.

I would love to watch a show called "Driverless cars in India".

~~~
na85
The Russia edition of that show would be quite entertaining as well.

------
snackai
Well, remove cyclists from streets. It's as simple as that. Remove humans from
streets. Also remove all cars that need a driver. It will happen sooner or
later. Humans will eventually be the single point of failure.

~~~
Neliquat
Let me simplify that for you.

>Remove humans

