
Car crashes killed 37,133 people in the US in 2017 - pseudolus
https://arstechnica.com/cars/2018/10/car-crashes-killed-37133-people-in-the-us-in-2017/
======
nopinsight
To all self-driving companies testing on public road, even Waymo, which is the
current leader: [1]

Why don’t you develop/deploy a face and eye tracking system to detect if the
test driver is dozing off or otherwise not paying enough attention and act
accordingly? (Warn first, and if needed, take the car to a safe spot and
stop.)

The system would also be useful in real deployment with a passenger/driver on
board, since the self-driving car needs to know whether it should obey a
human's manual command, which should also depend on the person's state of
attention at the moment.
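
A minimal sketch of the escalation logic such a monitor might use. The
per-frame eye-openness score is assumed to come from an upstream face/eye
landmark model; the class name, thresholds, and actions here are all
hypothetical:

```python
# Hypothetical escalation logic for a driver-attention monitor.
# Assumes an upstream vision model supplies a per-frame "eye openness"
# score in [0, 1] (e.g. an eye-aspect ratio from facial landmarks).

class AttentionMonitor:
    def __init__(self, threshold=0.2, warn_frames=30, stop_frames=90):
        self.threshold = threshold      # below this, the eyes count as closed
        self.warn_frames = warn_frames  # consecutive closed frames before warning
        self.stop_frames = stop_frames  # consecutive closed frames before pulling over
        self.closed_streak = 0

    def update(self, eye_openness):
        """Feed one frame's score; return the action the car should take."""
        if eye_openness < self.threshold:
            self.closed_streak += 1
        else:
            self.closed_streak = 0
        if self.closed_streak >= self.stop_frames:
            return "pull_over"  # take the car to a safe spot and stop
        if self.closed_streak >= self.warn_frames:
            return "warn"       # audible/haptic alert first
        return "ok"
```

At 30 fps the defaults would warn after about a second of closed eyes and
pull over after about three; a real system would also have to handle blinks,
glasses, and head pose.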

[1] Waymo’s self-driving car crashed because its human driver fell asleep at
the wheel:
https://qz.com/1410928/waymos-self-driving-car-crashed-because-its-human-driver-fell-asleep/

[2] In addition, we all heard about the case in which the test driver was
watching a video and a pedestrian was killed.

~~~
xoa
> _Why don’t you develop a face and eye tracking system to detect if the test
> driver is dozing off or otherwise not paying enough attention and act
> accordingly?_

At least in Waymo's case, because their explicit goal is to go directly to
Level 5. Unlike some companies who think intermediate steps are desirable, for
Waymo test drivers are exactly that: for testing. The accident (which was just
on HN) if anything supports the hypothesis that there should be either Level
1, maybe a little Level 2, or Level 5, and that's it.

So spending time and effort on developing more complex technology like that
purely for the special, limited-time case of R&D is probably not worth it vs
just plain having better and more professional driver practices. Waymo can and
should be blamed for having shifts of more than, say, 4 hours, insufficiently
rigorous selection/training, and not having a two-person requirement or
real-time remote observation by a control center, etc. But it's still a
one-off situation. Where technology comes into play is scaling and general
usage by non-professionals, but for the specific, low-volume case of testing
it is absolutely possible to handle things well with low-tech or no-tech
measures.

For companies who claim Levels 3/4 are valuable stepping stones rather than a
valley, yes, driver attention is integral to their proposal in general
deployment, and thus requiring that it also be part of their technology makes
sense. But for a Level 5 shop it's not just over-engineering, it's to some
extent making excuses for what is a direct and simple management failure. If
managers overwork and badly train a forklift operator or any other equipment
operator and an accident results, we don't say the forklift needs eyeball
trackers; we say management failed to follow best practices and is at fault.

~~~
nopinsight
Even IF a Level 5 self-driving system becomes practical soon (I doubt it,
since AI still lacks commonsense), many human riders would still want an
option of manual override.

In the article linked in my post above, the manual override went wrong because
the driver fell asleep, and that could happen to any rider as well.

Obviously, one could make it harder to override the autonomous system, but
that would result in slower activation, which could lead to an accident in any
car that still lacks commonsense.

In fact, taking into account the commander/rider’s state of attention should
be part of commonsense needed for any level 5 self-driving system.

~~~
xoa
> _Even IF a Level 5 self-driving system becomes practical soon (I doubt it,
> since AI still lacks commonsense), many human riders would still want an
> option of manual override._

And why exactly should they be accommodated? If they want manual control they
can just buy a standard car with L1/L2.

> _In the linked article of my post above, the manual override went wrong
> because the driver fell asleep and that could occur to any rider as well._

No. Not just Waymo but even GM has explicit plans for final cars not to have
so much as a steering wheel, and is pushing Congress to allow an exemption
from full FMVSS compliance in that case. The manual override and controls are
just for the development process; the whole point of Level 5 is that passenger
involvement is eliminated. Anything less than that isn't Level 5, and gets
back into the problem of _almost_ never needing any attention, except when you
suddenly do. I'm not commenting on how close they are, just on what the goal
is and why developing driver-attention technology purely for testing purposes
makes no sense vs just having better test procedures (and paying for them).

> _In fact, taking into account the commander/rider’s state of attention
> should be part of commonsense needed for any level 5 self-driving system._

Are you sure you actually understand what the "levels" are here? It's
certainly widely regarded as an imperfect classification system, particularly
at 2, 3, and 4, but 1 and 5 are clear enough and 5 is "start to finish without
even a human in the vehicle at all, or everyone sleeping".

~~~
nopinsight
> Are you sure you actually understand what the "levels" are here? It's
> certainly widely regarded as an imperfect classification system,
> particularly at 2, 3, and 4, but 1 and 5 are clear enough and 5 is "start to
> finish without even a human in the vehicle at all, or everyone sleeping".

I understand ‘Level 5’ and I am an active AI researcher. Do you understand
what kind of technology is required to get to Level 5?

Do you think level 5 can be achieved without understanding how humans behave
in a vehicle? Commonsense is needed for Level 5.

For example: What if someone sleepwalks, or rather sleep-kicks, breaks a
window open, and then leaves a leg or an arm hanging out of it? Do you think a
Level 5 car should not monitor how the passengers behave at all?

Do you have real information/references, rather than speculation, that Waymo
and GM plan to launch only after achieving Level 5, rather than the Level 4
launch most observers expect?

Do you have a cost/benefit calculation of why the driver detection system is
“not worth it” since it is better to wait for Level 5 before launching anyway?
By the way, the driver detection system is something a team of competent AI
researchers can do today (perhaps imperfectly but good enough) while no one
even knows for sure how to achieve Level 5.

When do you expect Level 5 can be achieved? If your answer is less than 10
years, you appear to know more than most of the top AI researchers I have
talked with or read their prognostications.

~~~
xoa
> _I understand ‘Level 5’ and I am an active AI researcher._

Then frankly your posts make even less sense, unless you've gotten so deep
into AI research that you've lost sight of what humans can accomplish without
technology.

> _Do you understand what kind of technology is required to get to Level 5?_

Utterly irrelevant to the discussion at hand, beyond that by definition L5
means zero human involvement is required.

> _Do you think level 5 can be achieved without understanding how humans
> behave in a vehicle?_

Yeah, absolutely. If how humans behave in the vehicle mattered, it wouldn't be
L5. Active sabotage is out of scope, since there is no difference there from a
human driver. If you are a passenger and suddenly start hitting your driver in
the head on the interstate, what could possibly happen? Oh right, an accident,
and everyone is badly injured or dies.

> _What if someone sleepwalks, or rather sleep-kicks, breaks a window open,
> and then leaves a leg or an arm hanging out of it?_

Then they'll have shattered glass and bad cuts all over their leg or arm which
would likely wake them up and also hurt a lot. They should probably tell their
L5 car to go to the ER and also _are you fucking serious?_ It's not as if
humans sleeping in vehicles including ones with glass windows is some new
thing and somehow an epidemic of people kicking through windows has never come
to my attention. Feel free to share your stats on that one.

> _Do you think a Level 5 car should not monitor how the passengers behave at
> all?_

No. On the contrary, self driving cars monitoring their passengers at all
times sounds terrifyingly dystopian. Exactly what country/agency do you do AI
research for?

> _Do you have real information/references rather than speculation that Waymo
> and GM have plans to launch only after achieving Level 5 rather than Level 4
> which most observers agree on?_

Amongst others: "GM Says Car With No Steering Wheel Or Pedals Ready For
Streets In 2019",
https://www.npr.org/sections/thetwo-way/2018/01/12/577688125/gm-says-car-with-no-steering-wheel-or-pedals-ready-for-streets-in-2019.
A lot of mass publication stuff doesn't use "level 5" specifically because
it's not terminology everyone will understand, but if it has zero manual
controls and human involvement beyond high-level orders ("Go here", "stop"),
that's what it is.

> _Do you have a cost/benefit calculation of why the driver detection system
> is “not worth it” since it is better to wait for Level 5 before launching
> anyway?_

Why would I need one? You're the one asserting that this is something Waymo
should do, so presumably _you_ have a cost/benefit analysis, right? Waymo
apparently doesn't think it's worth it, and I'm just observing why that would
be the case. There are plenty of actual, for-real jobs being done all the time
worldwide that are of the "99% boredom, 1% HOLY SHIT WE'RE ABOUT TO DIE" sort;
it's a tough but well-known problem that can be worked on in a restricted
setting with training, checklists, scheduling, and support, no brand-new tech
required. You're asserting that they should develop a whole new untested
technology, even if a less complex one, that will immediately become worthless
(or even a negative, for those worried about mass surveillance) in deployment.
Why do you think this is better than just following decent professional best
practices for R&D testing?

> _When do you expect Level 5 can be achieved? If your answer is less than 10
> years, you appear to know more than most of the top AI researchers I have
> talked with or read their prognostications._

Irrelevant, and you should really rethink your logic here if you think
otherwise. Whether it's 1 year, 5 years, 10 years, or 30 years, it's not the
public's problem; it's Waymo's and their investors'. They can't deploy L5
until it's ready, and the timeline for "ready" is their issue alone. If
they're satisfied with how their development is going with just human testers
and training, and the results are what the public wants, then how they get
from A to B is up to them.

~~~
nopinsight
My assertion is that they will deploy Level 4 cars first. In fact, Waymo is
deploying Level 4 cars to the public soon.

If you are so sure that Waymo and GM will deploy Level 5 and skip Level 4,
based on your level of understanding of the technology and the economics
involved, please publish a blog post or an Ask HN and see how many experts
agree/disagree with you.

Otherwise, are you willing to bet with the people who disagree?

By the way, no steering wheel does not mean Level 5.

https://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/

------
walrus01
I think 50 years from now we're going to look at this fatality rate much like
we look at 1930s-1950s era cigarette advertising, or pre-antiseptic era
medieval medical practices like bloodletting.

------
mring33621
So, 10+ 9/11 equivalents (comparing death tolls) in one year.
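
The arithmetic, using the commonly cited 2,977 victims of the 9/11 attacks:

```python
road_deaths_2017 = 37_133
sept11_deaths = 2_977  # victims, excluding the hijackers
print(round(road_deaths_2017 / sept11_deaths, 1))  # ~12.5
```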

~~~
kraftman
I wonder what a $2 trillion war on car crashes would look like.

------
pedalpete
The autonomous driving companies are aiming to change this headline to "People
driving cars killed 37,133 other people." This is similar to the argument that
"guns kill people," isn't it?

~~~
smt88
"Guns kill people" is a straw man. The actual argument is that guns make it
easier to kill people (and yourself) on a whim.

Autonomous vehicles have nothing to do with gun control because no one is
trying to teach handguns to operate themselves.

Autonomous driving companies can and should be judged on how well their
software lowers rates of collisions compared to human drivers.

------
alkonaut
How does this compare to elsewhere in the world, on some comparable metric
(e.g. deaths per driven distance or per hour driven)?
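
For the US in 2017, the usual normalizations work out roughly as follows (the
totals are approximate, from published NHTSA/FHWA figures; treat them as
illustrative):

```python
# Rough US 2017 road-fatality rates under two common normalizations.
deaths = 37_133
vehicle_miles = 3.21e12  # ~3.21 trillion vehicle-miles traveled (approx.)
population = 325.7e6     # ~325.7 million people (approx.)

per_100m_vmt = deaths / vehicle_miles * 1e8
per_100k_pop = deaths / population * 1e5
print(round(per_100m_vmt, 2))  # ~1.16 deaths per 100M vehicle-miles
print(round(per_100k_pop, 1))  # ~11.4 deaths per 100k population
```

By the per-capita measure the US sits well above most Western European
countries, which mostly report under half that rate.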

~~~
another-cuppa
It's very bad. The US is an outlier among developed countries; it's closer to
third-world levels.

~~~
gandhium
You should check the rate per number of motor vehicles.

Because if you're looking at raw crash counts, there's a direct correlation
between the number of cars and the number of crashes.

Otherwise one could conclude that India is a much safer place, when in reality
it's far worse.
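
A quick sketch of the point, using rough public figures for 2017 (approximate,
and India's officially reported death count likely understates the true toll):

```python
# Per-capita vs per-vehicle fatality rates, rough 2017 figures.
us    = {"deaths": 37_133,  "population": 325.7e6, "vehicles": 272e6}
india = {"deaths": 147_913, "population": 1_339e6, "vehicles": 230e6}

for name, c in [("US", us), ("India", india)]:
    per_100k_pop = c["deaths"] / c["population"] * 1e5
    per_100k_veh = c["deaths"] / c["vehicles"] * 1e5
    print(name, round(per_100k_pop, 1), round(per_100k_veh, 1))
```

Per capita the two look similar (about 11 per 100k each), but per registered
vehicle India is several times worse, which is the distortion the parent is
warning about.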

------
aNoob7000
Wait until driverless cars hit the road.

~~~
andy-x
Would you count the car's AI "driver" as a fatality then?

