
Tesla Model S Autopilot Reliability – Why Americans Should Love Tesla - Osiris30
http://www.roadandtrack.com/car-culture/news/a29944/leave-tesla-alone/
======
simonsarris
> We now live in an era where raising a billion dollars of other suckers'
> money and developing a new "app" to take selfies or find imaginary creatures
> in a porta-potty is considered the apex of human civilization but investing
> your entire fortune in a quest to build a self-driving electric car is
> treated like dangerous, egomaniacal adventurism. It makes you wonder why
> Elon Musk couldn't just be content sitting on his Paypal money and living
> the lifestyle of the famous one percent. It certainly would have been less
> hassle.

I wish this sentiment was more common. Don't we all want the future to win?

~~~
krisoft
I want the future to win. And we are one bug away from a Hindenburg moment. So
if I think what they do is reckless I will say so.

It's ludicrous to expect that an otherwise untrained, average human can
sustain his or her attention while, day after day, the automation just works.
That's just not how human psychology works. Someone here called this
particular kind of operation the "deadly valley", because it's good enough to
lull you into relaxing but not good enough to solve every kind of problem that
can come up.

I absolutely want the future to win, and I'm terribly afraid that Tesla in its
hastiness might hurt the progress.

~~~
JoeAltmaier
But must the standard for safety be higher than existing human drivers? We're
already living the 'Hindenburg' life - every day my freeway overhead sign
shows me the fatality total this year for my state. It's almost exactly one per
day. This is 'normal'.

What are the on-the-road stats for automatic driving? Aren't they better than
human drivers already?

It's not a failing just because auto-driving has different failure modes than
humans, because our human failings are so many and so frequent. There's so much
low-hanging fruit. Auto-driving never gets tired, never argues with the
passenger, never gets distracted by changing the playlist.

We're net ahead already with auto-driving.

~~~
ethbro
_> But must the standard for safety be higher than existing human drivers?_

This is my line of thought as well. Same problem I have with the anti-nuclear
environmental crowd. Are there substantial drawbacks? Absolutely.

That doesn't matter. The important question is not "Does this new technology
solve all previous problems?" but rather "Does this technology solve a net
positive number of problems while offering more opportunities for
improvement?"

With decades of empirical data showing that average humans are pretty bad at
reliably piloting powerful, heavy vehicles by hand, I can't see a good reason
to reject deploying autopilot technology aggressively. And that sadly
means a few deaths. But those should be measured against the number that would
have died were nothing done, not against zero.

~~~
stcredzero
Long haul trucking on routes with less than 2000 feet at each end off of a
limited access highway could be automated very soon. The reduction in sleepy
human drivers on the road would probably increase highway safety while saving
companies money.

~~~
duaneb
How do you figure? I don't believe there's been any interstate testing of
automated driving on freeways with human drivers, let alone a standardized
legal framework that would enable end-to-end automation.

~~~
stcredzero
Have you seen logistics centers in Texas? They are often out in the middle of
nowhere, often with dead-easy access to the highway. Automating driving in
those conditions would be much easier.

If nothing else, one could have "pilots" guide the trucks by remote control
for the last mile.

~~~
duaneb
My point was that it's not technology that will be the hard part, it's
legality and unions.

~~~
stcredzero
If there are no employees, what say will the unions have?

~~~
duaneb
There are far from no employees: there are about 3.5 million truck drivers in
the US. A company cannot operate an automated commercial service today because
there is no licensing system for driverless trucks. The unions aren't going to
allow legalization of automated truck driving with that many jobs on the line.

You're not providing any argument for the ease of legalization here.

~~~
stcredzero
_The unions aren't going to allow legalization of automated truck driving
with that many jobs on the line._

The unions can only have direct leverage with companies that currently employ
truckers and those that work closely with those companies. All it takes is for
one tech company that doesn't fit into the above category to develop the
technology, then the market will make the licensing of such devices
inevitable.

You thought you were giving a clever rejoinder. However, I was specifically
talking about companies that don't directly employ large numbers of truck
drivers. Then it comes down to the economic power of the companies that
operate the trucks versus the economic power of the companies that have to pay
the trucking companies. The latter is going to win the lobbying contest by
sheer size.

------
wibr
"It's possible to walk into a dealership and buy a car that drives itself on
the freeway about as safely as you would if you were driving the thing."

How did the author arrive at this conclusion?

"In any even remotely sane universe, this achievement would be celebrated in
the most hyperbolic fashion possible by every man, woman, and child on the
planet."

So everyone who doesn't do that is not remotely sane?

"Americans would be as proud of the Tesla Model S as we used to be about the
moon landing or about winning the Cold War."

It's a car.

"Note that the harshest critics of Tesla and its products are not affiliated
with Nissan, General Motors, BMW, or Ford. Instead, they are gadflies who
rarely have any industry experience whatsoever."

Not sure why this article ended up on the frontpage; it doesn't really add
anything useful to the discussion and does not address the main criticisms
people have about the autopilot, like the limited number of sensors, wrong
expectations due to the name, and lack of strict enforcement of driver
engagement (hands on wheel).

~~~
studentrob
> Not sure why this article ended up on the frontpage; it doesn't really add
> anything useful to the discussion and does not address the main criticisms
> people have about the autopilot, like the limited number of sensors, wrong
> expectations due to the name, and lack of strict enforcement of driver
> engagement (hands on wheel).

Great point.

------
kctess5
I keep seeing the "130 million miles of Autopilot control per fatality"
statistic, especially in the context of "look, these cars are better than
humans", but I think this is incredibly disingenuous. First of all, you can't
just take a sample size of 1 for an effectively random event, compute
(total miles driven) / 1 = 130,000,000 miles/fatality, and pretend that has
any statistical significance.

Furthermore, during those entire 130 million miles, there was theoretically a
human also sitting behind the wheel ready to take over if the system failed.
If there was no human behind the wheel, the cars would not have been able to
attain 130 million miles of drive time without a very large number of crashes.
When humans drive, there is no intelligent being ready to take over whenever
we make a mistake as is the case for Autopilot.

In my opinion, using this "statistic" is a very lazy way of hiding behind the
numbers in dealing with a very non-obvious issue.
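To put a number on the sample-size problem, here's a rough stdlib-only sketch (my own back-of-the-envelope, assuming the single reported fatality in 130 million Autopilot miles and treating fatalities as a Poisson process) of the exact confidence interval a single event actually supports:

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return math.exp(-mu) * sum(mu ** i / math.factorial(i) for i in range(k + 1))

def bisect_decreasing(f, target, lo=1e-12, hi=1e6, iters=200):
    """Solve f(mu) = target for a function f that decreases in mu."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if f(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def poisson_rate_ci(events, exposure, alpha=0.05):
    """Exact two-sided confidence interval for an event rate, per unit of exposure."""
    mu_lo = 0.0 if events == 0 else bisect_decreasing(
        lambda m: poisson_cdf(events - 1, m), 1 - alpha / 2)
    mu_hi = bisect_decreasing(lambda m: poisson_cdf(events, m), alpha / 2)
    return mu_lo / exposure, mu_hi / exposure

# One fatality over 1.3 units of 100 million miles.
low, high = poisson_rate_ci(events=1, exposure=1.3)
print(f"95% CI: {low:.2f} to {high:.2f} fatalities per 100M miles")
```

With one event the 95% interval spans roughly 0.02 to 4.3 fatalities per 100 million miles, so the human baseline of 1.08 is entirely consistent with the Autopilot data; the single fatality tells you almost nothing either way.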

~~~
fmihaila
> Furthermore, during those entire 130 million miles, there was theoretically
> a human also sitting behind the wheel ready to take over if the system
> failed.

Yes. That is the point, though. When Autopilot is used the way it is supposed
to be used, it doesn't seem to decrease safety, and it may actually increase
it.

I agree with the point in your first paragraph, though.

~~~
kctess5
I would assert that you cannot make that claim, as there is no statistically
significant evidence in support of it.

~~~
fmihaila
I wasn't making any kind of strong claim. I said "doesn't seem" and "may". My
point is, whatever evidence we've got so far, it doesn't seem to indicate a
decrease in safety. If anything, it seems to point (however feebly) towards a
safety increase. IOW, the data so far doesn't support, in my view, the stance
that Autopilot in its current form is unsafe, as long as it's used in
accordance with the instructions.

~~~
noxToken
The issue is that 1 data point does not point to either side. You can make
neither a strong nor a weak claim for either side, because the sample size is
too small.

If I flip a coin twice with one head and one tail as the result, we cannot say
whether or not the coin is fair. Even with both heads or both tails as the
result, we still do not have enough information. It's best to just say that we
need more data.
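The coin analogy can be checked directly; this little sketch (stdlib only) computes the exact two-sided binomial p-value against a fair coin:

```python
from math import comb

def two_sided_pvalue(heads, flips, p=0.5):
    """Exact binomial test: sum the probabilities of all outcomes
    no more likely than the one observed."""
    probs = [comb(flips, k) * p ** k * (1 - p) ** (flips - k)
             for k in range(flips + 1)]
    return sum(pr for pr in probs if pr <= probs[heads] + 1e-12)

print(two_sided_pvalue(2, 2))    # two heads in two flips
print(two_sided_pvalue(18, 20))  # 18 heads in 20 flips
```

Two heads in two flips gives p = 0.5, i.e. no evidence at all against fairness; you need something like 18 heads in 20 flips (p ≈ 0.0004) before the data starts to speak.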

~~~
fmihaila
It's 1 data point if you look at crashes; there are 130 million data points if
you look at miles driven. It's by far the largest amount of data on Autopilot
driving available from any manufacturer.

I agree we have no statistically rigorous argument yet that Autopilot
increases safety. I agree with you and others who say we need more data. My
only point is that there is no reason, IMO, to stop gathering that data in
real-world conditions by banning Autopilot, as some media articles are
suggesting (not anyone in this thread), because we have no evidence so far
that it decreases safety.

~~~
kctess5
As a person who works in robotics, believe me when I say that I am interested
in these technologies being developed and tested. That said, I am aware that a
major PR battle is the last thing the industry needs right now.

I think that Tesla is doing a disservice to the rest of the industry by being
overly zealous with their marketing and naming department. "Autopilot" is a
bit of an exaggeration that needlessly oversells what is effectively a lane-
assist feature. By saying that it's "autopilot" they set an expectation of a
high degree of autonomy, and thereby influence how their technology is
perceived and evaluated both by users and the press. Fatal accidents caused by
incorrect use of a "Lane Assist" is a very different headline than an accident
caused by "Autopilot."

Tesla is not the only manufacturer with some kind of active cruise control,
they just get a lot of attention because they like to pretend they are selling
autonomous cars. I am just crossing my fingers that their recklessness will
not cause a significant setback to autonomous car technology in general. Tesla
hand waving away safety concerns by quoting dodgy apples-to-oranges
"statistics" doesn't exactly make me feel warm and fuzzy.

~~~
ModernMech
Thank you, I'm glad to see a fellow robotics engineer speaking out against
this. Oftentimes I see this debate cast as "pro-driverless-car" vs "anti-
driverless-car", but the debate is more nuanced than that. It's up to people
who understand the technology to point out what is actually wrong here. We're
not trying to hold back progress; we're trying to promote progress in a _safe_
and _responsible_ way. No one needed to die here.

edit:

> Tesla hand waving away safety concerns by quoting dodgy apples-to-oranges
> "statistics" doesn't exactly make me feel warm and fuzzy.

Same here. I've always liked Audi's philosophy and their slogan really spoke
to me: Vorsprung durch Technik - "Advancement through technology". After their
emissions scandal, it seems that their strengths do not extend to ethics. I
was hoping Tesla would replace Audi in my eye, but they are turning out to be
just as morally bankrupt, IMO.

------
Animats
One wonders what Tesla did to get that laudatory article.

Tesla's "autopilot" problems come from bad engineering and bad human-factors
design. I wrote on this in another topic yesterday.[1] It's not that automatic
driving is bad; it's that Tesla's semi-automatic driving isn't very good. It's
way under-sensored. Google and Volvo are ahead in this area.

Volvo is very close to shipping their autopilot. They're testing now. In 2017,
they will release 100 self-driving cars to customers. They do _not_ require
the driver to pay attention. Volvo's CEO says that if a Volvo self-driving car
gets into an accident, it's Volvo's fault, and they will take full
responsibility.

Their "City Safety" anti-collision system has been on all new Volvo cars for a
year now. Unlike Tesla's system, it's always on. Volvo has radar and LIDAR and
cameras, more sensors than Tesla has.

Volvo's self-driving car has even more sensors, for full coverage in all
directions and redundancy. Redundant computers and actuators, too. And if it
gets into a situation it can't handle, and the driver doesn't take over, it
will pull over and stop.

There's a way to do this right, and Volvo is a lot closer than Tesla is.

[1]
[https://news.ycombinator.com/item?id=12082893](https://news.ycombinator.com/item?id=12082893)
[2] [http://www.volvocars.com/intl/about/our-innovation-brands/in...](http://www.volvocars.com/intl/about/our-innovation-brands/intellisafe/intellisafe-autopilot)

~~~
Animats
On a related note, here's Chris Urmson's long talk at SXSW on how Google does
automatic driving.[1] This is worth watching. He shows what Google cars sense
and gives an overview of how they analyze the data. This is the first time
Google has released this level of detail. If you're going to comment on
automatic driving, you need to see this.

Urmson goes into great detail on Google's 2mph collision with a bus. The
sensor data from the Google car is shown. The video from the bus (it had
cameras, too) is shown. Exactly what the software was trying to do is
discussed. The assumptions the software made about what the bus driver would
do are discussed. What they did to prevent this from happening again is
mentioned.

Most of the talk is about the hard cases. In the beginning, Google developed a
highway driving system, but that was years ago. Now they're working hard on
dealing with everything that can happen on a road, including someone in an
electric wheelchair chasing ducks with a broom.

[1] [https://www.youtube.com/watch?v=Uj-rK8V-rik](https://www.youtube.com/watch?v=Uj-rK8V-rik)

~~~
studentrob
Cool! Thank you for sharing this.

Here's the part where he starts talking about the 2mph bus accident [1]. He
says the self-driving car should not have moved. He does not say why. He says
they ran a lot of tests to make sure it will not happen again.

In plain English, I'd say, when the SDC is stopped on the side of the road,
give sufficient time for traffic coming from behind to clear out before the
SDC moves forward. Also, if an accident is imminent, being in a non-moving
position is probably safer and less likely to cause liability for the SDC than
moving. Plus, rolling over some sandbags is probably preferable to being hit
by a bus.

[1] [https://youtu.be/Uj-rK8V-rik?t=21m45s](https://youtu.be/Uj-rK8V-rik?t=21m45s)

------
svalorzen
Since the Tesla autopilot will only work on long, clear stretches of road with
high visibility and very predictable traffic, it's not really hard to see why
the statistic would look better than it really is.

I can build an autopilot to drive a car with zero fatalities ever; it just
goes forward on an infinitely straight road.

Statistics can only be compared under the same conditions, otherwise it's just
pointless self-delusion.

------
astrocat
>In any even remotely sane universe... Americans would be as proud of the
Tesla Model S as we used to be about the moon landing or about winning the
Cold War.

While I generally agree with the author's sentiment, I think part of the
reason why Tesla is getting so much flak instead of praise is that the
pioneers of yore were astronauts, scientists and soldiers who took risks they
understood, and did so willingly.

But with Tesla, the intrepid pioneers of autonomous driving are everyday
people who are engaging in behavior that is probably more risky than they
realize, and we're now seeing some of the consequences of that.

At the same time, I think you could argue that some people, at some point, are
going to have to take on these risks at a large scale, and the one thing Tesla
has going for it is a rabid fan base, and it may be true that without that
kind of cult following, no company would be able to survive this phase of
autonomous driving development.

~~~
studentrob
> But with Tesla, the intrepid pioneers of autonomous driving are everyday
> people who are engaging in behavior that is probably more risky than they
> realize, and we're now seeing some of the consequences of that.

Yeah. The author of this article has the same observation [1]. Tesla seems to
have been caught off-guard by pushback against its driver-assistance system.
Yet, everyone knew there would be pushback if a fatality occurred.

> At the same time, I think you could argue that some people, at some point,
> are going to have to take on these risks at a large scale, and the one thing
> Tesla has going for it is a rabid fan base, and it may be true that without
> that kind of cult following, no company would be able to survive this phase
> of autonomous driving development.

Fan bases are important, particularly ones that include people who can give
constructive criticism. 100% praise or 100% critique is rarely useful in
identifying the path forward. It's a winding road, and not everything Musk
says is gospel.

[1] [http://fortune.com/2016/07/11/elon-musk-tesla-self-driving-c...](http://fortune.com/2016/07/11/elon-musk-tesla-self-driving-cars/)

------
billions
Tesla launched autopilot like a startup in beta mode before big co's and
regulators could slow it down. If you believe computers will eventually be
safer drivers, would you prefer:

A) Autopilot launches gradually over 20 years by a big co, costing hundreds of
thousands of lives due to 10 year delay or

B) Tesla launches fast below the radar, iterates quickly, makes data-driven
incremental updates to autopilot and saves 10 years of human error deaths?

~~~
rhino369
False dilemma.

C) put these tech features into the car but force the driver to keep actively
driving the car getting the best of both worlds. AKA what every other
automaker is doing.

~~~
billions
If you're doing what everybody else is doing then, by definition, you are not
making progress.

------
danso
> _Again, in any sane world nobody would pay any attention to the opinions of
> completely unqualified individuals on any given topic. There 's a reason I
> write for Road & Track and not Men's Health, for example, and it has
> something to do with the fact that I've literally had more racing wins in my
> life than I've eaten salads. If I started pontificating about whether a
> particular protein supplement built more muscle mass and got you more ripped
> than another one, the readers would be entirely right to point out that I am
> not a doctor and that I have never been seen to bench press more than 255
> pounds, not even once._

Reminds me of the Tragedy of Theranos. A company valued at $9 billion, staffed
to the gills with engineers and medical experts who were building a
revolutionary disruptive way to draw and test blood, was taken down based on
the accusations of a _newspaper_ reporter who has never run his own startup,
nevermind having not attained enough medical expertise to even competently
check someone's temperature.

I'm being a little facetious here. There's a difference between a cable news
reporter blabbing about something that happened 7 minutes ago to fill an hour
of airtime with controversy and fear, and a 5-month investigation by a
Pulitzer-winning reporter. But let's not shit on the laypersons who dare to
challenge the clergy.

But in one aspect I agree with the OP. People who aren't experts and who don't
bother to check with other experts should perhaps be mindful before they
propagate uninformed armchair analysis. But let's make that cut both ways.
These uninformed reporters should also hold their tongues when a company touts
_any_ technical feature. Maybe that would reduce the incentive for the Elon
Musks of the world to name things in viral-headline-friendly ways, e.g.
"Autopilot" for advanced driver assist. Tesla has benefited heavily from the
press corps' ignorance about technology -- you think Tesla has received $4
billion in government subsidies based on technical merit alone? And now
they're paying for it. Maybe 10 years from now we'll all agree that that was a
good strategic gamble. But it doesn't mean Tesla should be given kid-glove
treatment _today_, after the effort they've put in to manipulate the press to
their own benefit.

~~~
studentrob
> People who aren't experts and who don't bother to check with other experts
> should perhaps be mindful before they propagate uninformed armchair analysis

I believe some of this controversy began with an article by Fortune [1] in
which they claimed Musk overlooked some business reporting requirements.

Do you agree that a financial news outlet is qualified to judge a company
based on how it reports to the SEC? The world of corporate finance is their
wheelhouse, and that extends to any company, tech or not. The larger the
company's valuation, the more interesting the news is to them.

Fortune's readers are current or would-be Tesla investors.

[1] [http://fortune.com/2016/07/05/elon-musk-tesla-autopilot-stoc...](http://fortune.com/2016/07/05/elon-musk-tesla-autopilot-stock-sale/)

~~~
danso
That definitely falls into the field of Fortune's expertise. But I took OP's
comment to be a broadside against the alarmist headlines about the recent
accidents.

~~~
studentrob
> I took OP's comment to be a broadside against the alarmist headlines about
> the recent accidents.

Yeah. I feel the Fortune article came out, then Musk claimed it was BS, and
other headlines both critical and defensive of Tesla started popping up.
Fortune's legitimate criticism has been overlooked by many who believe Musk
can do no wrong.

At this point, Musk has tweeted in response to the article from Road and Track
titled "Leave Tesla Alone",

> "We don't mind taking the heat for customer safety. It is the right thing to
> do"

Perhaps tensions are easing between Musk and Fortune.

------
topbanana
Why is Tesla risking time, money and reputation on making their cars self-
driving? The primary goal is surely to get everyone into electric powered cars
so we can move to cleaner power, and lower localised pollution.

~~~
drcross
It depends on whether you believe the perspective that Musk doesn't care about
time, money and reputation but wants to spur the rest of the car manufacturers
on to compete, in some sort of altruistic, noble effort.

~~~
pm90
I'm not denying he has that motivation, but let's not kid ourselves here: he's
an entrepreneur first, and he's looking for every way to make his product
stand out in the market. A market which will soon be flooded with electric
cars produced by the major car companies as well. And _they_ are
betting big on autonomous driver technology as well [0].

[0]: [http://fortune.com/self-driving-cars-silicon-valley-detroit/](http://fortune.com/self-driving-cars-silicon-valley-detroit/)

------
JoelSutherland
Fatalities per 100 million miles driven is not a great metric for measuring
the reliability of Autopilot.

That number is influenced not just by Autopilot avoiding accidents, but also
by the overall safety of the car. Teslas are incredibly safe in accidents. The
current 25% lead that Autopilot has over the average is probably not
attributable to Autopilot at all, and instead to the overall safety of the
vehicle.

* Fatality rates vary by make/model quite a bit (up to 8x). Some of this is surely attributable to the relative driving skill of purchasers, but most is probably the safety of the cars themselves. Given the spread seen in this IIHS report I would guess that Autopilot performs worse than the average human. [http://www.iihs.org/iihs/topics/driver-death-rates](http://www.iihs.org/iihs/topics/driver-death-rates)

~~~
studentrob
It's not even a good figure for comparing fatalities. See my comment here [1]

Those 100 million miles include non-divided roads, motorcycle deaths, adverse
weather conditions, and cheaper, older cars.

[1]
[https://news.ycombinator.com/item?id=12087603](https://news.ycombinator.com/item?id=12087603)

------
studentrob
> Here's a statistic for you: 1.08 fatalities per 100 million miles driven.
> According to the National Highway and Traffic Safety Administration, that's
> the death rate on the American road as of 2014, the last year for which
> there is complete data.

This one-per-100-million-miles statistic keeps getting thrown around. I think
we can come up with a better number for comparison (see the numbered list
below).

I also think we can push for more transparency from Tesla and other companies
offering driver-assistance systems. California already requires monthly
reports on all accidents involving completely autonomous vehicles [0]. We
should be getting this for systems offering driver-assistance across all
states. Tesla seems unlikely to report on this until their competitors are
forced to report on it too.

Here are a few reasons why the 100 million figure isn't as good as it could
be,

(1) Those 100 million miles include all roads, not just divided ones where
autopilot is likely to be activated

(2) Autopilot is less likely to be activated during adverse conditions like
rain, ice and snow

(3) The 100 million figure includes all types of vehicle deaths, including
those from motorcycles. Motorcycle deaths should not be compared to car deaths

(4) The vehicle price is lost in such an average. Safety reports are best done
when considering cost. Obviously a 2016 Tesla is going to be safer than a 1990
beater

I'd like to see someone report a more useful statistic for comparison using
available data such as this [1]. This guy [2] says it is 1 in 150 million when
considering only divided roads. That would make Tesla's autopilot less safe
than the average American driver given current statistics.
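The back-of-the-envelope comparison behind that last claim (taking the 150-million-mile figure from [2] at face value, and ignoring the huge uncertainty that comes with a single fatality) is just:

```python
# Rates per 100 million miles, derived from "one fatality per N miles".
autopilot = 1e8 / 130e6       # Tesla's headline figure: 1 fatality in 130M miles
divided_roads = 1e8 / 150e6   # human drivers on divided roads only, per [2]

print(f"Autopilot:     {autopilot:.2f} fatalities per 100M miles")
print(f"Human drivers: {divided_roads:.2f} fatalities per 100M miles")
```

On that (very rough) apples-to-apples slice, Autopilot's single-event rate comes out higher than the human baseline, not lower.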

[0]
[https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/auton...](https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/autonomousveh_ol316)

[1] [http://www.iihs.org/iihs/topics/t/general-statistics/fatalit...](http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/state-by-state-overview)

[2]
[https://www.reddit.com/r/technology/comments/4sgxv1/while_te...](https://www.reddit.com/r/technology/comments/4sgxv1/while_tesla_appears_surprised_over_criticism_of/d59eazl)

~~~
greglindahl
Given the large error-bar on the Tesla data, you have no idea if Tesla is less
safe or more safe than the average American driver.

~~~
studentrob
I'm using the same data Tesla used when they made their claims.

~~~
greglindahl
And that adds to a well-reasoned and statistics-informed discussion on HN,
because... ?

~~~
studentrob
None of Tesla's recent press releases indicate their data is error-prone [1]
[2]

Rather, they mention a "wealth of internal data demonstrating safer, more
predictable vehicle control performance when the system is properly used."

I'm drawing a different conclusion from the same data. I'm not claiming it's
gospel. I'm just an internet commenter.

Forthcoming investigations from the NHTSA and NTSB may shed more light.

[1] [https://www.teslamotors.com/blog/tragic-loss](https://www.teslamotors.com/blog/tragic-loss)

[2]
[https://www.teslamotors.com/blog/misfortune](https://www.teslamotors.com/blog/misfortune)

~~~
greglindahl
I was trying to engage you in a discussion about _your_ error, not Tesla's.
Hopefully you're an internet commenter trying to make Hacker News a better
place?

~~~
studentrob
> Hopefully you're an internet commenter trying to make Hacker News a better
> place?

For sure. Through this discussion about assisted-driving cars, I also try to
contribute to a safer driving community, more transparency from companies
offering driver-assistance systems, thoughts about the best ways to advance
AI, and better capital allocation.

> I was trying to engage you in a discussion about your error, not Tesla's.

I don't see what error of mine that you are pointing out here. It sounds like
you're talking about Tesla's data:

>> Given the large error-bar on the Tesla data, you have no idea if Tesla is
less safe or more safe than the average American driver.

------
noonespecial
We're at the place with self-driving that air travel found itself in during
the 1920s.

From here on in, each new technological update and safety regulation will
likely be based on some unfortunate and unforeseen event.

Like they say when you study the rules of aviation: Each one is written on a
tombstone.

~~~
gkya
Most often, you don't die in an air accident if you're not on a plane. With AI
vehicles sharing the road with manual vehicles and pedestrians, the case is
different.

------
kiba
If autopilot is slightly better than a human driver, why not use it?

It's an incredibly low standard, sure, but more lives are saved, and it will
only get better from then on.

The only problem is the public's perceived unwillingness to accept such risks
in using autopilot.

~~~
SCAQTony
Tesla over-promised: if they had called the Tesla a 'driver-assisted vehicle'
rather than an 'autopilot car', this would be a different argument.

You can't claim you have a driverless car (autopilot) when the driver is
expected to keep their eyes on the road, hands on the wheel and a foot near
the brake. That is over-promising.

~~~
skykooler
Tesla has never claimed "driverless" - the driver has to be there (well, now
- earlier firmware versions didn't check for weight in the driver's seat).
What they did do is market it as "autopilot", which means different things to
different people. In an airplane, "autopilot" merely flies the plane in the
direction and at the altitude set by the pilot; it does not avoid other
aircraft, take commands from ATC, or anything like that. So, technically,
calling Tesla's auto-steer "autopilot" is pretty accurate. The problem is
that most people, being unfamiliar with aircraft systems, think of autopilot
as "something that flies the plane on its own" and analogize that to
driverless cars.

You've never flown in a pilotless airplane. Autopilot is not AI.

------
ams6110
> You don't have to buy a Tesla. You don't even have to like the cars.

If only that were the case. It seems to me that, more likely, when someone
says Teslas are too expensive or not practical, they get attacked as Luddites
and imbeciles.

------
outworlder
Ever since I started driving on US streets, I've been shocked and terrified by
the number of people checking their cellphones in dangerous situations:
inching the car forward at a traffic light without even looking ahead, texting
while doing 70+ mph on a bridge, yelling at their kids in the back seat,
changing lanes without checking their blind spots...

I'd really prefer if those vehicles were driverless instead. And mine as well.

------
WWKong
0.76 fatalities for 100 million Tesla autopilot miles driven? Is that
extrapolated?

~~~
hughes
Not sure what you mean by extrapolated. It doesn't mean that 0.76 people died;
it means that 1 person died after 130 million autopilot miles were driven.

------
madengr
Best quote:

"These individuals are assisted in their quest by a media that long ago
decided that it was completely okay with killing the society on which it
parasitically feeds."

------
dodosayak
When Tesla stops building "someone may die" into their decision matrix, I'll
consider buying a Tesla.

~~~
Someone1234
By the same logic you should never buy a car period.

All forms of transport have to accept the possibility of fatalities to a
certain extent. The test for Tesla is are they safer, not are they perfect.

