
Uber disabled Volvo SUV's safety system before fatality - molecule
https://www.sfgate.com/business/article/Uber-Disabled-Volvo-SUV-s-Standard-Safety-System-12782878.php
======
kelnos
It seems a little weird to focus on this. The article itself admits it may be
standard practice for makers of self-driving tech to disable any built-in
automated driving features in the car. This makes sense: presumably there's no
good way to talk to the built-in system; having a second system running that
can't coordinate with the self-driving system would likely make the combined
system _less_ safe. Even if communication between the two were possible, I'd
still imagine that just having a single system operating the car would be
much more desirable.

~~~
ilamont
_“We don’t want people to be confused or think it was a failure of the
technology that we supply for Volvo, because that’s not the case,” Zach
Peterson, a spokesman for Aptiv Plc, said by phone. The Volvo XC90’s standard
advanced driver-assistance system “has nothing to do” with the Uber test
vehicle’s autonomous driving system, he said. Aptiv is speaking up for its
technology to avoid being tainted by the fatality involving Uber ..._

No. This is Volvo pressuring Aptiv to make a statement on its behalf, to avoid
being tainted by this PR disaster and to shift investigators' focus away from
both companies.

Also, in all of the discussions about this incident, let's not forget that Elaine
Herzberg, a mother of two children, lost her life due to an engineering
experiment and publicity stunt ("Come to Arizona! We don't have safety regs!")
being conducted on public roads with great risk to members of the public.
Volvo, Aptiv, and other participants in these trials should have never signed
off ... yet they willingly participated, and an innocent member of the public
has paid with her life.

~~~
pythonaut_16
I don't see how Volvo is responsible for Uber's recklessness. Volvo didn't
make them test on public roads with unsafe technology.

~~~
dagw
The question isn't whether Volvo is in any way culpable (they almost certainly
aren't) but whether there's a risk that Volvo might appear in any way culpable to
the general public. The very last thing Volvo wants is the headline "Self-
driving Volvo kills pedestrian".

~~~
chillingeffect
Especially because a Volvo plowed into a pedestrian almost a year ago:

[https://www.youtube.com/watch?v=WbW2UgmjJUA](https://www.youtube.com/watch?v=WbW2UgmjJUA)

IIRC it was discovered this happened because the owner hadn't purchased the
"Pedestrian Avoidance Package".

And shortly before that, a Volvo plowed into a parked truck (in a demo):

[https://youtu.be/aNi17YLnZpg](https://youtu.be/aNi17YLnZpg)

This "was the result of an earlier flat battery that temporarily disabled the
system due to low voltage" [0]

The pattern here points not only to the collision-avoidance systems themselves,
but to the systems built around them.

[0] [https://jalopnik.com/5648126/volvo-pedestrian-avoidance-cras...](https://jalopnik.com/5648126/volvo-pedestrian-avoidance-crash-test-fails-spectacularly)

------
smogcutter
This seems like an obvious thing to do: if you’re trying to test the AI,
remove complicating, possibly contradictory OEM systems.

But compare with the other discussion currently on the front page, where users
point out system subsumption is a basic safety principle. You need a safety
system to fall back on if the AI fails (as it will, at least at this stage).
It seems grossly negligent if Uber didn’t re-implement collision avoidance at
a lower level.

Edit: link to discussion I’m thinking of
[https://news.ycombinator.com/item?id=16681611](https://news.ycombinator.com/item?id=16681611)

~~~
freddie_mercury
Why is "car's built-in system took control" harder to handle than "human
safety driver took control"? Seems like they are exactly the same: "external
system took control".

~~~
buzer
I would imagine it's easier to hook into human-operated controls than into the
car's internal systems. Just put some sensors on the steering wheel, pedals,
handbrake, and a couple of other controls, and you will know when someone uses
them. If they are used, give up control of the car (or of the specific
subsystem that was controlled).

Getting information that some automated system initiated some operation is
almost certainly harder. The system only knows that the car is not behaving as
expected, but it doesn't know what caused it. You could probably figure out
how to determine whether something like collision avoidance was triggered
(large shifts in expected vs. actual outcomes), but in the case of something
like ABS you could end up with two systems fighting for control of the brakes
in an emergency situation.
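The sensor-on-the-controls idea above can be sketched in a few lines. This is a purely illustrative sketch; the sensor names, the subsystem names, and the function are all invented here, not taken from any real vehicle API.

```python
# Hypothetical sketch of the disengage logic described above: poll simple
# sensors on the human-facing controls and relinquish the matching subsystem
# as soon as any of them reports activity. All names are invented.

HUMAN_CONTROLS = {
    "steering": "steering_wheel_torque",
    "braking": "brake_pedal_pressed",
    "throttle": "accelerator_pedal_pressed",
    "parking_brake": "handbrake_engaged",
}

def check_for_human_override(sensor_readings, active_subsystems):
    """Return the set of subsystems the self-driving stack should release."""
    to_release = set()
    for subsystem, sensor in HUMAN_CONTROLS.items():
        if sensor_readings.get(sensor):
            # A human touched this control: give up that subsystem.
            to_release.add(subsystem)
    return to_release & active_subsystems

readings = {"brake_pedal_pressed": True, "steering_wheel_torque": False}
released = check_for_human_override(readings, {"steering", "braking", "throttle"})
# released == {"braking"}
```

The point is only that the detection side is cheap: a handful of switches and torque sensors, and the decision reduces to a set intersection.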

~~~
alistairSH
The auto-brake systems in all cars will activate the same brake-is-on switch
as the human operator. Often, it's just a switch on the brake pedal assembly.
Trivial to tap into it.

If the switch state changes from closed to open, [driver|AEB] have intervened
and Uber's system should relinquish control of that sub-system.

~~~
cesarb
> If the switch state changes from closed to open, [driver|AEB] have
> intervened and Uber's system should relinquish control of that sub-system.

No, both cases are fundamentally different. If a human driver has intervened,
the system should expect the human wants to take over all subsystems: braking,
steering, accelerating. If the AEB has intervened, the system should expect it
will take over only the braking, and only for a limited amount of time; it
shouldn't relinquish control (and doesn't have to, since a properly-designed
AEB will be an override of the braking control), and should be ready to take
over immediately after it releases (and probably bring the car to a safe stop
as soon as it has control back).

That is, the self-driving system must know whether the command came from the
human driver, or from another system.

~~~
alistairSH
Steering input sensed, give up steering control. Brake input sensed, give up
brake control.

I'm not convinced the two should be linked in any way, human driver or AEB or
whatever else. If the human is taking evasive action, the auto-drive is
getting input from all the available sensors.

This shouldn't be that hard (relative to the other challenges being faced).

------
derekp7
I was wondering why it is necessary for self-driving tech to be tested in
"live" mode, instead of having the software passively log all event data, then
analyzing it to see what the software "would have done" compared to a human
driver.

Then a lot more testing data could be gathered by outfitting random vehicles
(taxis, etc.) with the tech, and analyzing (and further refining the software
based on) every event where the software and the human driver differed in
opinion (i.e., whenever the vehicle abruptly changed speed, did the software
detect that it should have hit the brakes at or before the same time?).
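This "shadow mode" comparison can be sketched very simply. The record format, the brake values, and the disagreement threshold below are invented for illustration; real systems would compare far richer signals.

```python
# A minimal sketch of the shadow-mode idea above: log what the software
# *would have* commanded alongside what the human actually did, and flag
# every moment where the two disagree for later analysis.

def find_disagreements(log, brake_threshold=0.2):
    """log: list of (timestamp, software_brake, human_brake) tuples,
    brake values in [0, 1]. Returns timestamps where the two commands
    differ by more than the threshold."""
    return [t for t, sw, hu in log if abs(sw - hu) > brake_threshold]

log = [
    (0.0, 0.0, 0.0),   # both coasting
    (1.0, 0.8, 0.0),   # software would have braked hard, human didn't
    (2.0, 0.1, 0.9),   # human braked abruptly, software barely reacted
]
print(find_disagreements(log))  # -> [1.0, 2.0]
```

Each flagged timestamp is a training or debugging opportunity, without the software ever having touched the controls.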

~~~
lopmotr
Any feedback from the response of the vehicle to the commands would be absent.
So would whatever happens next. If the computer says "steer left a little" but
the video keeps going straight, what's the computer supposed to do next?

~~~
carlmr
You can't really test the control part of the algorithm, but you could test
the detection and object recognition parts.

------
CookWithMe
Volvo is also manufacturing a car with a pedestrian airbag, the V40:
[https://support.volvocars.com/uk/cars/Pages/owners-manual.as...](https://support.volvocars.com/uk/cars/Pages/owners-manual.aspx?mc=Y555&my=2015&sw=14w46&article=7fceb4e7544b4fbbc0a801e800b0ef6b#)

Both Volvo and Uber could have opted to use this car instead. Even choosing an
SUV is questionable, as SUVs are known to be more dangerous for pedestrians
due to their higher bumper heights:
[https://en.wikipedia.org/wiki/Criticism_of_sport_utility_veh...](https://en.wikipedia.org/wiki/Criticism_of_sport_utility_vehicles#Risk_to_other_road_users)

IMO there is at least some negligence on their part for not choosing a car
that is more likely to protect pedestrians.

~~~
autogeek
The pedestrian airbag was introduced before the Automatic Emergency Braking
(AEB) safety function was available. Volvo no longer offers the pedestrian
airbag as they found most pedestrian accidents were avoided by the AEB.

The V40 is also a rather old car and is based on an old platform. The XC90 on
the other hand was their newest car at the time the Uber deal was made and is
based on their latest platform. So it is not unusual that both Uber and Volvo
would prefer the XC90 over the V40. Besides, given the newer technology in the
XC90 it is quite possible it is better in pedestrian accidents than the V40
(with the safety systems enabled on both).

------
nwrk
There you go. I was watching videos of other drivers passing the same road.
Even the Google XL is better quality than the Uber cam. Also, all the lidar
gimmickry looks like it didn't work well there.

As per the Intel statement in the article, I hope some sort of requirements
for this equipment will be defined, or that it will be technically benchmarked
before being allowed on the road.

> Intel Corp.’s Mobileye, which makes chips and sensors used in collision-
> avoidance systems and is a supplier to Aptiv, said Monday that it tested its
> own software after the crash by playing a video of the Uber incident on a
> television monitor. Mobileye said it was able to detect Herzberg one second
> before impact in its internal tests, despite the poor second-hand quality of
> the video relative to a direct connection to cameras equipped to the car.

~~~
kevin_thibedeau
Nice try. Tesla already demonstrated Mobileye's inability to handle impending
collisions.

~~~
DannyBee
??? Mobileye explicitly did not want Tesla using them in the fashion they
were, and discontinued supplying them with chips.

[https://www.google.com/search?q=mobileye+sends+tesla+a+lette...](https://www.google.com/search?q=mobileye+sends+tesla+a+letter&rlz=1C5CHFA_enUS770US770&oq=mobileye+sends+tesla+a+letter&aqs=chrome..69i57.5718j0j7&sourceid=chrome&ie=UTF-8)

This came about because Mobileye was honest about their capabilities:
[https://electrek.co/2016/07/01/tesla-autopilot-mobileye-fata...](https://electrek.co/2016/07/01/tesla-autopilot-mobileye-fatal-crash-comment/)

Tesla, in their immediate response (same article), basically tried to imply
they have magic abilities to do much better, which, based on later crashes, we
know is not true.

Mobileye, in response, chose to end their partnership with Tesla (and yes, it
was Mobileye who did it. The letter they sent ending the contract can be found
if you browse around, after Tesla explicitly asked them to continue and signed
a multi-year contract earlier in 2016)

------
yason
That's quite the opposite of how autonomous systems should be built in the first
place: start with isolated modules that do _not_ depend on each other, and
stack them on top of each other as the abstraction level goes up.

If you have an automatic but brutal, on-off style emergency-braking module
based on, for example, lidar then it's much easier to develop another, maybe
less aggressive but much smarter, auto-braking module based on the regular
camera inputs. You can count on the failsafe system to act if higher-level
modules fail. With braking in particular, maybe even a couple of redundant
low-level emergency brake systems would be a good idea before you even
consider any of the more high-level systems. You really don't want to hit
jaywalking pedestrians or wandering drunks because both kinds do exist in
reality.

Similarly, once the car has a couple of obstacle-detection and auto-braking
systems up and running, it's easier to work on autonomous driving because you
can be confident that your computer won't be able to drive the car into
anything as these isolated and separate systems will take care to stop the
car.

Consequently, when you're developing the navigation and routing system, it has
to be able to rely on the driving functions that deal with the present-time
traffic situations, like keeping up with the flow of traffic, slowing down,
changing lanes, etc. Again, same thing as with emergency brakes: to develop a
smarter higher-level system you need to have lower-level failsafes in place.

In the case of autonomous driving a failure should mean a harsh but
unnecessary full stop or the inability to continue, not a collision or the
inability to stop. They should definitely have left Volvo's own system on as
an additional failsafe mechanism.
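The layered architecture described above resembles a subsumption-style arbiter, where the crudest, most conservative layer always wins. Below is a minimal sketch under that assumption; the layer names and the "strongest braking wins" rule are illustrative, not any real stack's design.

```python
# Hedged sketch of the layered failsafe idea above: every layer proposes
# a brake command, and the arbiter applies the most conservative one, so
# a low-level failsafe can never be overridden by a smarter layer above it.

def arbitrate_brake(layer_commands):
    """layer_commands: dict of layer name -> proposed brake fraction [0, 1].
    The strongest braking request from any layer is the one applied."""
    return max(layer_commands.values(), default=0.0)

commands = {
    "lidar_emergency_brake": 1.0,   # crude on/off failsafe sees an obstacle
    "camera_auto_brake": 0.3,       # smarter module wants gentle braking
    "route_planner": 0.0,           # high-level layer sees no problem
}
print(arbitrate_brake(commands))  # -> 1.0
```

With this structure, a bug in a high-level module degrades to an unnecessary hard stop rather than a collision, which is exactly the failure mode the comment argues for.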

------
Someone
With friends like this, why would Uber need enemies? We now have public
statements from both Velodyne (makes the LiDAR that Uber uses) and Aptiv
(makes the hardware used in the Volvos that Uber uses) saying “it wasn’t us”.

Volvo is the gentleman here, with its “can’t speculate on the cause of the
incident” statement.

~~~
fixermark
The gentleman in public, but we don't know if pressure was put on Aptiv to
make a statement so Volvo didn't have to. ;)

------
msoad
The released video is suspiciously dark. That street has street lighting.
Could it have been edited?

~~~
romwell
It doesn't even need to be edited to be misleading.

Simply taking a crappy dashcam video with low dynamic range suffices; it will
create the illusion that anything not directly in the headlights of the
vehicle is invisible.

I don't know how people can believe a single word of the early official
statements, which claimed that the victim "suddenly" appeared in front of the
vehicle. The problem of "it's too dark to drive" was solved a hundred years
ago with the invention of headlights.

Here's an experiment anyone can try: turn on the headlights on a dark road,
and see if you really see nothing in the lane to your left (driver side).
Whatever you do see (which, under normal circumstances would be at least three
seconds worth of road ahead) is the way in which the released video is
misleading.

~~~
kmonsen
Also, if it is too dark to drive at that speed, you are supposed to slow down,
not use it as an excuse after the fact.

The speed limit is the max limit, conditions permitting.

------
eklavyaa
The autonomous vehicle era is at the same stage that passenger planes were
once in.

------
Zigurd
Maybe disabling Volvo's built in autonomous braking capability is needed for
operating an autonomous driving system. It seems intuitive that multiple
autonomous systems could clash.

But it also seems like "It works better than the stock production autonomous
braking" would be a gating factor in putting these vehicles on the road.

~~~
kwhitefoot
It's not autonomous, it's automatic.

------
m3kw9
You do need to disable those systems to have a proper test; otherwise they may
cause adverse confusion with the AI.

------
ahemphill
Warning: this article contains an auto-playing video of the incident.

------
andrewprock
Why was this being tested at night? It looks like the nighttime visibility was
less than 50 feet.

~~~
romwell
I'd say because nighttime visibility in Arizona on a clear night is way, way
more than 50 feet, and the low-quality dashcam footage you see here is highly
misleading.

Low visibility on a clear night is simply a lack of illumination, and it is a
horrible excuse, because:

1. The car is equipped with headlights, which solve the "limited night time
visibility" problem;

2. The car is equipped with multiple types of high-quality cameras;

3. The car is equipped with a LIDAR, which is an active sensing system.

Finally, if one accepts that the visibility was limited to 50 feet, as the
video misleadingly suggests, THEN one has to concede that the car was going
_too fast for how much it could see_.

People are taught to drive so that at least three seconds worth of road ahead
can be seen. If you believe that either the car or the driver were seeing the
equivalent of what the webcam footage was showing, the car was going unsafely
fast.

Hanlon's razor would suggest, though, that the car simply wasn't able to
process the data available to it. We'll know for sure if the investigation is
successful.
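The three-second rule above can be turned into a quick back-of-the-envelope check: if the usable visibility really were 50 feet, as the video suggests, what speed would keep three seconds of road in view? This is pure arithmetic under that assumption, with no claim about the actual conditions that night; the function is my own illustration.

```python
# If only `visibility_ft` of road is visible and a driver should see at
# least `lookahead_s` seconds ahead, the safe speed is bounded by
# visibility / lookahead, converted from feet per second to miles per hour.

def max_speed_mph(visibility_ft, lookahead_s=3.0):
    ft_per_s = visibility_ft / lookahead_s
    return ft_per_s * 3600 / 5280  # feet/second -> miles/hour

print(round(max_speed_mph(50), 1))  # -> 11.4
```

In other words, taking the 50-foot figure at face value implies a safe speed of roughly 11 mph, far below what the vehicle was doing, which is the contradiction the comment points out.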

------
billysielu
Worst website I've seen in a long time.

