
Elon Musk says full self-driving Tesla tech 'close' - helsinkiandrew
https://www.bbc.com/news/technology-53349313
======
helsinkiandrew
"I feel like we are very close. "I remain confident that we will have the
basic functionality for level five autonomy complete this year. "There are no
fundamental challenges remaining. "There are many small problems. "And then
there's the challenge of solving all those small problems and putting the
whole system together." Real-world testing was needed to uncover what would be
a "long tail" of problems, he added.

~~~
Rebelgecko
Based on my personal programming experience, if there are no fundamental
problems and they're just fixing minor edge cases, then about 90% of the work
still needs to be done.

------
uniqueid
Why wait till it's ready?

~~~
uniqueid
More seriously, the nice thing about this "prediction" is that it's so close.
So I can go on record and say: Teslas will _not_ be fully self-driving "by the
end of the year" nor even by the end of _next_ year, nor the year _after
that_. I doubt we'll see one in 2030, for that matter.

If a car isn't smarter than a human, it will kill people in ridiculous
circumstances. Reaction time can be an order of magnitude better than human,
but you can't release a car that occasionally drives over a baby for no
reason.

If Musk actually claimed otherwise in earnest, he's either having another
episode, or maneuvering to fix an issue with PR or stock price.

~~~
grecy
> _you can't release a car that occasionally drives over a baby for no
> reason._

I disagree entirely. We have planes that occasionally kill people; we have
x-ray machines, microwaves and gas BBQs too. And of course, 36,000 people are
killed on the roads in the USA each year [1].

A self-driving car doesn't have to be 100% perfect, it only needs to be better
than human drivers, and then it will be saving lives. Yes, the "bad press"
will be painful, but in the long run fewer people will die as a result, so
it's a net win.

[1]
[https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in...](https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in_U.S._by_year#Motor_vehicle_deaths_in_U.S._by_year)
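
As a quick sanity check, the 36,000-deaths figure implies a per-mile fatality
rate close to the NHTSA numbers quoted downthread. A minimal sketch, assuming
roughly 3.2 trillion annual US vehicle-miles traveled (my assumption, based on
pre-2020 FHWA estimates, not a figure from this thread):

```python
# Implied US fatality rate from the "36,000 deaths per year" figure [1].
# The vehicle-miles-traveled (VMT) value below is an assumption
# (~3.2 trillion/year, roughly the pre-2020 FHWA estimate).
deaths_per_year = 36_000
vmt_per_year = 3.2e12              # assumed annual US vehicle miles

rate_per_100m_miles = deaths_per_year / vmt_per_year * 100e6
print(f"{rate_per_100m_miles:.2f} fatalities per 100 million miles")
```

Under that assumption the result lands near the 1.1-1.2 fatalities per 100
million miles that the NHTSA-based replies below cite.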

~~~
jocker12
I am not sure you are looking at the correct numbers. According to NHTSA –
[https://www-fars.nhtsa.dot.gov/Main/index.aspx](https://www-fars.nhtsa.dot.gov/Main/index.aspx)
– there are 1.18 fatalities per 100 million miles driven. That means, if an
individual drives 15,000 miles per year, that individual will face the
possibility of dying in a fatal crash as a driver, passenger or pedestrian
roughly once in 5,650 years, so the cars and road system are extremely safe as
they are today. Most self-driving car developers recognize this.

Chris Urmson, in his Recode Decode interview: "Well, it's not even that they
grab for it, it's that they experience it for a while and it works, right? And
maybe it works perfectly every day for a month. The next day it may not work,
but their experience now is, 'Oh this works,' and so they're not prepared to
take over and so their ability to kind of save it and monitor it decays with
time. So you know in America, somebody dies in a car accident about 1.15 times
per 100 million miles. That's like 10,000 years of an average person's
driving. So, let's say the technology is pretty good but not that good. You
know, someone dies once every 50 million miles. We're going to have twice as
many accidents and fatalities on the roads on average, but for any one
individual they could go a lifetime, many lifetimes before they ever see
that." –
[https://www.recode.net/2017/9/8/16278566/transcript-self-driving-car-engineer-chris-urmson-recode-decode](https://www.recode.net/2017/9/8/16278566/transcript-self-driving-car-engineer-chris-urmson-recode-decode)

Or Ford Motor Co. executive vice president Raj Nair: "Ford Motor Co.
executive vice president Raj Nair says you get to 90 percent automation pretty
quickly once you understand the technology you need. 'It takes a lot, lot
longer to get to 96 or 97,' he says. 'You have a curve, and those last few
percentage points are really difficult.' Almost every time auto executives
talk about the promise of self-driving cars, they cite the National Highway
Traffic Safety Administration statistic that shows human error is the
'critical reason' for all but 6 percent of car crashes. But that's kind of
misleading, says Nair. 'If you look at it in terms of fatal accidents and
miles driven, humans are actually very reliable machines. We need to create an
even more reliable machine.'" –
[https://www.consumerreports.org/autonomous-driving/self-driving-cars-driving-into-the-future/](https://www.consumerreports.org/autonomous-driving/self-driving-cars-driving-into-the-future/)

Or prof. Raj Rajkumar, head of Carnegie Mellon University's leading
self-driving laboratory: "If you do the mileage statistics, one fatality
happens every 80 million miles. That is unfortunate of course, but that is a
tremendously high bar for an automated vehicle to meet." (min. 19:30 of this
podcast interview) –
[http://www.slate.com/articles/podcasts/if_then/2018/05/self_driving_cars_are_not_yet_as_safe_as_human_drivers_says_carnegie_mellon.html](http://www.slate.com/articles/podcasts/if_then/2018/05/self_driving_cars_are_not_yet_as_safe_as_human_drivers_says_carnegie_mellon.html)

Or Elon Musk himself: "On average, a person is killed in a traffic accident in
the United States once every 100 million miles. Elon Musk says Tesla's
Autopilot is half as likely to be involved in a collision as a human driver.
That would suggest that somewhere around the 200 million mile mark someone
will die as a result of an automobile driven by a machine." –
[https://www.teslarati.com/autonomous-cars-make-us-worse-drivers/](https://www.teslarati.com/autonomous-cars-make-us-worse-drivers/)

What you are using is a fallacious, emotional argument made by self-driving
car developers and enthusiasts in order to make people think that by adopting
this technology they will be part of a bigger, better future, by doing
essentially nothing.

~~~
grecy
What I'm saying is that self-driving cars don't have to be perfect, they just
have to be better than human drivers.

Once there are fewer deaths with self-driving cars than with human-driven cars
(however you want to measure that), then it makes sense to use them.

~~~
uniqueid

> _they just have to be better than human drivers._

That's where the difference in our outlook is:

The way I see it, people can stomach the existence of _ridiculous_ accidents
(ridiculous in the sense that the details are "absurd", not just "very grim")
when they can blame a human.

I don't think many people will tolerate even occasional "self-driving"
accidents if the circumstances show a human would have recognized the danger.

For a self-driving car to be fit for purpose, it can't run over kittens at low
speeds on an empty road - _even if they're very small kittens that don't
exceed some motion-sensing threshold_. That requires common-sense reasoning.

~~~
grecy
I think you're absolutely right that there will be push-back if self-driving
cars crash in stupid ways, but again, it's all about the numbers.

Planes still fall out of the sky for stupid reasons (maintenance errors,
mostly), yet the numbers say they're very, very VERY safe, so people use them.

Once the facts show self-driving cars are safer than human-driven cars (even
if they sometimes crash in stupid ways), there will be no holding back.

