
People are reporting collisions with Tesla’s Smart Summon feature - Alupis
https://www.thedrive.com/news/30074/people-are-already-reporting-collisions-with-teslas-driverless-smart-summon-feature
======
dkarl
If you're incredulous that Tesla would let people oversee a moving car at
15mph in a parking lot, wait until you hear what happens next after the car
gets to them!

I think driving is a special case for safety, because we have put humans in
charge of controlling 2000+lb cars at high speed in close proximity with other
humans doing the same thing, and we've either designed or retrofitted this
activity into almost every dense human habitation on the planet. Obviously we
would never make such a decision now, but it gives a special context to
today's safety decisions. It is a major ongoing source of death and injury as
well as a major source of pollution, and bringing that era to a close is a
special priority. In the United States roughly 100 people per day die in car
accidents, and many more are seriously injured, so you could say that every
day by which we can accelerate the end of human driving saves a hundred lives.

~~~
DougWebb
100 people per day out of how many people driving per day? I'm pretty sure
it's a very small percentage. How does that compare to the percentage of self-
driving car accidents relative to the number of self-driving cars? My gut
feeling is that if you replaced every car on the road with today's self-
driving cars, the accident rate would go way up.

There's probably a danger-maximum somewhere before 100% replacement, though.
Some mix of highly-predictable self-driving cars and poorly-predictable human-
driven cars will cause the most accidents, and a lower number of either will
cause fewer accidents.

~~~
dkarl
_There's probably a danger-maximum somewhere before 100% replacement, though.
Some mix of highly-predictable self-driving cars and poorly-predictable human-
driven cars will cause the most accidents_

That's an interesting point. Since the technology will be improving at the
same time adoption is increasing, I don't know if we'll be able to observe
that maximum, but I think you're right. One thing to consider is that once
there are a significant number of self-driving cars on the road, they'll start
communicating in a much faster and richer way than human drivers are able to
communicate. If self-driving cars can match human collision rates when
interacting with human drivers, they will achieve much, much lower collision
rates with each other. That effect might lower the percentage at which safety
starts to improve.

Another variable will be the built environment. Over the years we have
developed road systems that are highly enriched with features to assist human
drivers, and of course we will begin to do the same for self-driving cars.

Nobody is talking about those improvements right now because they aren't
relevant to getting the first cars on the road. The engineers working on self-
driving cars have to tackle the hardest version of the problem that will ever
exist first, and only then can they start making it easier.

Confession: for years I predicted to my friends that the first autonomous
passenger vehicles to coexist with untrained members of the public would be in
a theme park or a city center closed to other vehicles, with sensors and other
assistance built into the environment, because I thought we'd have to start
with an easier version of the problem.

------
molmalo
Honest question: can somebody explain the use case for this feature to me? I
don't get it.

I mean, if the summoned car came to me in a totally autonomous way, it would
be useful in very large parking lots, like an airport, campus, etc... Or I
could call it while still leaving my apartment/office, and the car would be
there by the time I get to the door. But if I need to have the car in my line
of sight, just standing still while watching it and holding the button, with
me being aware of and responsible for the car not bumping into any obstacle,
then it means I'm pretty close to the vehicle, so why not just walk the very
short distance to it?

~~~
LogicX
My brother just had serious back surgery. We've been dropping him at the store
entrance, and picking him up. He can sit fine, but walking is difficult. I can
see this being a benefit for certain people with injuries or disabilities.

~~~
targonca
Are there no dedicated parking spots for the disabled in the US?

~~~
GordonS
If it's anything like the UK, plenty of non-disabled people use them
regardless - because they drive a massive Q7 which doesn't fit into regular
spaces, or just because they're closer to the store/school/whatever.

Pretty despicable if you ask me, but there does seem to be a large segment of
the population that just doesn't give a shit.

------
brainless
"You are still responsible for your car and must monitor it and its
surroundings at times within your line of sight because it may not detect all
obstacles. Be especially careful around quick moving people, bicycles, and
cars."

Is this a joke? I have huge respect for Tesla and SpaceX, but how can they
release a feature that self-drives a car (even a tiny distance) and expect
someone to oversee the moving car?

The software engineer in me just died.

~~~
natch
As you should know as a software developer, development of major new
capabilities is an incremental process, and for something like self-driving
cars it's expected that there will be a transitional period before full
autonomy. Summon in parking lots is a good transitional step.

Remember, Tesla is not just training cars here; humans also need to learn to
deal with this and not freak out. There will be fender benders along the way,
most of them probably caused by other drivers, but bugs are also a
possibility.

~~~
coldtea
> _As you should know as a software developer, development of major new
> capabilities is an incremental process_

Hopefully software developers with such an idea are not allowed in medical
devices, aviation, missile guidance, space industry, industry robotics, and
automobile...

~~~
natch
Developments in all the areas you mention start with baby steps and
incrementally progress toward greater and greater capabilities. This should
not be news to anyone here.

~~~
coldtea
Development yes. Releasing them as products (which is the case we're
discussing) when they have people-killing faults, no...

~~~
natch
Plenty of people die in cars every day. Cars are released products, right? You
are making no sense here. And the collisions reported with smart summon are
fender benders not “killings”. No need to hype up the drama level.

~~~
coldtea
> _Plenty of people die in cars every day. Cars are released products, right?_

Yes, but current car deaths are due to operator (driver) error, not to a
self-driving parking-summoner killing people.

Companies don't (or aren't supposed to) release production cars with known
people-killing faults. And when some are nonetheless released (which happens
sometimes, e.g. a defect found in the brake system), companies get fined
and/or people go to jail for it. It's not all A-OK just because "people die in
cars every day" anyway...

> _You are making no sense here._

Oh, the irony.

------
cj
As a motorcyclist, these videos are sickening. Especially the guy purposely
using smart summon to cross LIVE traffic (not in a parking lot).

These kinds of low-speed collisions are not a big deal to car drivers. But for
someone on a motorcycle, having a driverless Tesla obliviously pull out into
live traffic in front of them (even at low speed) can cause significant
injury.

Please, people... don't be stupid.

~~~
gruez
>But for someone on a motorcycle, having a Tesla with no driver obliviously
pulling out into live traffic in front of a motorcycle

I don't get it. Do motorcyclists use whether someone's in the driver's seat to
determine whether the car is moving or not? Shouldn't they be looking at the
brake/head lights?

~~~
cj
The point is not to needlessly increase the risk for other people using the
road for the sake of showing off your car’s cool party trick.

But to directly answer your question: in fact, most people on a motorcycle
will try to look at the driver, when possible, to check if the driver is
looking at them to gauge whether the driver sees them or not. In the case of a
Tesla, it’s impossible to know if the car’s sensors know you’re approaching...

Also, I don’t know any human beings who share a driving pattern similar to the
Tesla’s while it’s being summoned. It seems to start and stop unpredictably
and does so in a very jerky manner (and most humans do not drive as slowly as
in the videos, and yes, sometimes driving slowly is more dangerous than
driving at regular speeds, and most humans also won’t drive on the wrong side
of a lane, which the Tesla often does in videos I’ve watched)... it’s just
very unpredictable.

------
antpls
The feature being a worldwide premiere, with no past experience to draw on, I
honestly expected that Tesla would have added more sound and visual warnings
while the Summon feature is running, such as flashing the car's headlights and
emitting a regular beep (like when a big truck is reversing), to catch the
attention of other drivers, or even to warn blind people when the Tesla is
near.

Then, once the feature is polished with more real world data, they can always
remove the warnings later.

The video with the white car coming through is especially scary. From all the
sensors and marketing materials I know of from Tesla, I would expect the Tesla
to have detected that white car much sooner and stopped automatically. There
was really no ambiguity about what decision to take here.

~~~
BoorishBears
Does the Model 3 have a concept of stop signs (not just recognizing them, but
understanding arrival order and how long to wait) and right of way?

Or are they expecting to get away with just avoiding cars and pedestrians
while breaking driving rules because it's a parking lot, and blaming owners
when it inevitably falls short?

Tesla's track record leads me to believe the latter, but I'd love to find out
something to the contrary.

-

Actually, it looks like this is still controlled by holding a button? So yeah,
guessing it has no concept of those, and the almost-crash in video #2 would be
expected behavior. Marvelous.

~~~
mysterydip
If four teslas arrive at an intersection at the same time, do they all crash?
:)

~~~
mannykannot
You say this as a joke, but I suspect they have been programmed to drive
fairly aggressively, though probably to come to a halt before any collision
that it can predict. If so, we might soon see two or more Teslas deadlocked,
their fenders inches apart.

------
gpm
So, the author managed to find 3 concrete examples.

One where someone else backed into the Tesla.

One where no collision occurred.

One where the Tesla ran into a garage.

Only the last seems at all worrisome...

I vote for changing the headline to singular collision.

~~~
ajross
Right, the first was an at-fault incident by the non-Tesla driver for sure.

In the second, the vehicle was allowed to drive across a traffic lane. And in
fact it attempted to do so, but stopped at the threshold when it saw a vehicle
approaching (which also hit the brakes). I fail to see how this isn't exactly
the behavior claimed and desired.

~~~
rohit2412
Hmm, the Tesla definitely crossed into the road in that video.

If driving into live traffic and stopping in the middle of it is acceptable,
then okay, self-driving cars are here!

~~~
ajross
Are we looking at the same video?
[https://twitter.com/eiddor/status/1177749574976462848](https://twitter.com/eiddor/status/1177749574976462848)

It pulls out as if it were going to continue across the traffic lane and very
clearly stops _right at_ the threshold. At no point is it in the traffic lane,
nor in the path of the gray vehicle that stops.

It's true that it "looked like" a human driver who was going to continue
straight out, so the other driver (correctly) hit the brakes.

But spinning this like the Tesla "stopped in the middle of traffic" is just
plain wrong.

~~~
FireBeyond
The Tesla stopped when / because the 'driver' released the button. Watching
the vehicle until that point, it hadn't slowed or made any hint that it was
about to stop or do anything but continue forward, oblivious, and that -would-
have caused an at fault collision.

~~~
ajross
Sorry, what's the evidence for the reason why the vehicle stopped? I mean, it
stopped. That's what it's supposed to do. You have some cite showing that it
wasn't a sensor detection?

And of course it hadn't "slowed": at this speed (looks like about 1 m/s) the
time taken to stop (at the ~0.7G a car on its wheels can achieve) is about 4
frames of that 30 Hz video, and it happens over a distance of about 7 cm, most
of which is the vehicle just rocking forward on its suspension. Be real, we're
talking about parking-lot maneuvering here.
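
For what it's worth, the arithmetic above checks out. A minimal sketch,
assuming the rough figures stated in this comment (~1 m/s, ~0.7 g braking, 30
Hz video):

```python
# Sanity check of the stopping figures above: ~1 m/s parking-lot speed,
# ~0.7 g of braking, 30 Hz video.
G = 9.81                     # m/s^2, gravitational acceleration
v = 1.0                      # m/s, approximate speed in the clip
a = 0.7 * G                  # m/s^2, assumed braking deceleration

stop_time = v / a            # time to come to rest
stop_dist = v**2 / (2 * a)   # distance covered while stopping
frames = stop_time * 30      # frames of 30 Hz video

print(f"{stop_time:.2f} s, {stop_dist * 100:.0f} cm, {frames:.1f} frames")
# prints "0.15 s, 7 cm, 4.4 frames"
```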

~~~
FireBeyond
Evidence? The author of the video saying he took his hand from the 'button'?

Okay, now we need to do some math. The evidence that a 4,000 lb vehicle comes
to a stop even from 5 mph in... 3 inches? "Most of which is the car rocking on
its suspension." Removing distance from the equation, because g-forces are
related to time of deceleration, using your numbers,
[https://rechneronline.de/g-acceleration/](https://rechneronline.de/g-acceleration/)
says more like -2.3G, which is far from a gentle stop.

You can also plainly see the other car decelerate too, over far more than '4
frames'.
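
For comparison, the distance-based version of the same calculation, via
a = v^2 / 2d (a sketch using only the 5 mph and 3 inch figures stated above,
not the calculator's time-based inputs):

```python
# Deceleration implied by stopping from 5 mph within 3 inches,
# converting to SI units first.
G = 9.81             # m/s^2
v = 5 * 0.44704      # 5 mph in m/s
d = 3 * 0.0254       # 3 inches in meters

a = v**2 / (2 * d)   # assumed constant deceleration over that distance
print(f"{a / G:.1f} g")
# prints "3.3 g" -- also nowhere near a gentle stop
```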

------
Jasper_
All evidence so far suggests that self-driving cars are more dangerous than
human drivers when similar conditions are taken into account.

The comparison numbers that Tesla parrots are between Teslas with modern
safety features and in good weather conditions, and the entire rest of road
fatalities, including motorcycles. Comparing similarly bodied cars in similar
weather conditions, Tesla's self-driving has more reported collisions per mile
driven.

~~~
theferalrobot
Do you have some sources on this? Every study I am finding even comparing like
for like shows SDVs are about the same or safer. Also the facts I was able to
find from Tesla do a lot of comparisons, not just the obscure ones you cite.

~~~
masklinn
> Do you have some sources on this?

The fatality rate for human-driven cars is ~1.5 per 100 million miles driven,
and that's across the entire range of driving conditions.

The autonomous car industry (level 3) had totaled under 10 million miles
driven as of the Uber fatality, putting its rate far above that despite
benefiting from only good conditions.

Tesla just reached 1 billion miles on Autopilot early this year and has 5
fatalities. That's a lower rate than human driving, but (again) one can assume
those miles are in good conditions pretty much exclusively.
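
Normalizing the rough figures above to a common per-100-million-mile basis (a
sketch using only the numbers quoted in this comment, not authoritative data):

```python
# Fatalities per 100 million miles, from the rough figures quoted above.
PER = 100e6  # normalization basis: 100 million miles

figures = {
    "human drivers":      (1.5, 100e6),  # ~1.5 per 100M miles, all conditions
    "level-3 industry":   (1.0, 10e6),   # 1 fatality in under ~10M miles
    "Tesla on Autopilot": (5.0, 1e9),    # 5 fatalities in ~1B miles
}

for name, (deaths, miles) in figures.items():
    print(f"{name}: {deaths * PER / miles:.1f} per 100M miles")
# prints 1.5, 10.0, and 0.5 respectively
```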

~~~
mentat
Only in good conditions? Have you seen the videos of the Uber kill? Come on...
95% of humans would have had the same outcome.

~~~
Jasper_
The dashcam footage released by Uber was misleading; their cameras and sensors
were nowhere near as good as the human eye. The Tempe police concluded that
85% of humans could have stopped the car in time: [0]

> In simulating conditions of the crash and consulting their textbooks,
> detectives found that Herzberg could have been seen 143 feet down the road
> by 85 percent of motorists.

[0] [https://www.phoenixnewtimes.com/news/self-driving-uber-
crash...](https://www.phoenixnewtimes.com/news/self-driving-uber-crash-
avoidable-drivers-phone-playing-video-before-woman-struck-10543284)

~~~
jmpman
I drove the section of road. If I wasn’t distracted by my phone, I would have
stopped properly 100% of the time.

~~~
jmpman
Downvoted... I brought up the phone distraction because the Uber safety driver
was distracted by watching video on her phone.

------
SrslyJosh
Nobody who has an inkling of how hacky self-driving is would try this.

~~~
GaryNumanVevo
I cringe whenever my friend tells me his Tesla Model 3 will be full
self-driving and he'll be able to earn passive income in the future.

------
tjoff
It is bold to imply that what Tesla does is anything but directly harmful to
the goal of ending human driving.

Tesla and Uber have done a terrific job of pushing the limits of self-driving,
and with their recklessness they risk postponing its acceptance.

~~~
dkarl
Maybe; it's hard to say. I think the reputation of human driving is so low
that it's hard to go lower, but opinion on that is pretty polarized. As a bike
commuter in a quickly growing city, I'm aware that there are a lot of people
who think driving is a fundamentally safe activity that is only made dangerous
by unwelcome novelties such as cyclists, pedestrians, etc. But there's also a
lot of awareness that it's really a shit show: humans are terrible drivers who
kill people by the thousands every year. It's such an outlier that I don't
think normal safety standards apply. I think the only standard that self-
driving cars should need to meet is being comparable to human drivers.

If self-driving cars replaced humans the day they became safer than human
drivers, they would become our second most deadly consumer technology next to
guns. Crazy to think about, isn't it? As safety improves, self-driving cars
can save thousands of lives per year and _still_ be our second most deadly
consumer technology. We should get used to that idea and look forward to
cranking the death rate lower and lower as technology improves. Humans behind
the wheel will always be a menace.

~~~
tjoff
Well they are nowhere near human drivers. And yes, human drivers are terrible.

Thing is that human drivers vary _a lot_. So we have excellent drivers and we
have awful drivers. And also we have tired drivers, bored drivers, angry
drivers and we have sporadically inattentive drivers.

If we were to aim for "at least the same number of accidents as a human per
mile" that would be a pretty disastrous AI. Because the AI would be much more
uniform in its performance.

A human driver will send text messages and won't even slow down before hitting
a stand-still truck right in front of them. A human driver will mix up the
brake pedal and speed straight through an intersection.

An AI that does comparable damage (obviously wouldn't make the _same_
mistakes) would be pretty darn scary to sit inside or be near.

There is also the very important factor that a self-driving car needs to
behave like a human so that it doesn't cause accidents just with its presence.

PR disasters await: when people die because a self-driving car did something
that a conscious human would never do, people won't react rationally. Tesla is
just begging for a well-deserved ban on self-driving cars, and for a stigma to
attach to them. All while killing some people.

There is no silver lining to it.

------
Havoc
Taking move fast and break things to the next level

~~~
PhantomGremlin
Yup. The following is from a previous HN Tesla discussion:

    
    
       Tesla Will Have Full Autonomy in 2020, Musk Says
    
          Tesla has an advantage here in that they
          don't feel the need for their autonomy
          to be particularly safe.
    

These problems and accidents are a direct result of Tesla's laissez-faire
attitude toward autonomous driving. They do whatever the fuck they think they
can get away with, consequences be damned.

So far, they've been allowed to get away with quite a lot.

~~~
floatingatoll
Please don’t use HN’s code formatting for quoting text, as it isn’t readable
on mobile. [https://i.imgur.com/wiBoZyb.png](https://i.imgur.com/wiBoZyb.png)

~~~
PhantomGremlin
Sorry. Yeah, that's ugly.

I've used code formatting before, and I've tried to keep the lines short. But
obviously not nearly short enough for some mobile users.

Let me turn it around. How would _you_ format those two sentences for mobile,
while still maintaining my intent of showing that the second was a response to
the first?

~~~
floatingatoll
The same way emails do, approximately:

> Quoted paragraph here. Many words are in this paragraph.

> Second paragraph here.

Reply here. Note that you’ll need to leave a blank line between paragraphs if
you’re doing extended multi-paragraph quotes, which is unwieldy - but so are
multi-paragraph quotes. For those, either quote less (if you’re replying),
quote interleaved with replies (if it’s a multi-segment reply), or reply first
and quote second^.

^ if it’s external content you’re including from an external source that you’d
like to reference from your comment, as opposed to a footnote.

Here’s the relevant segment from the HN faq, for the record:

[https://news.ycombinator.com/formatdoc](https://news.ycombinator.com/formatdoc)

 _Blank lines separate paragraphs. Text surrounded by asterisks is italicized,
if the character after the first asterisk isn't whitespace.

Text after a blank line that is indented by two or more spaces is reproduced
verbatim. (This is intended for code.)

Urls become links, except in the text field of a submission._

------
moltar
Ok, so, from the article:

1. Lexus backed into Tesla

2. Nearly collided on a busy lot; both cars stopped in time.

I’m really not seeing a problem yet.

Just an alarmist, fake news article.

~~~
BoorishBears
Are you joking?

In the first case the Lexus was in the wrong, but a human would have
intervened with that loud auditory device affixed to the steering wheel, or
moved further out of the way.

The Tesla freezes like a deer in headlights after it turns into the path of
the car backing up. I wonder if it even signaled before turning. The fact that
the owner is worried the other driver thinks they were driving leads me to
believe it didn't behave the way a human driver would have; not signaling
makes sense.

-

In the second case it ignored right of way and almost caused a serious crash!

In another it ran into a wall!

If anything this article is being incredibly charitable by trying to focus on
the cases where it doesn't fail...

~~~
pound
>> In the second case it ignored right of way

This is exactly the thing that people should _not_ expect from it in the very
beginning. It's the owner's responsibility to _monitor surroundings_ and make
the car move when it's safe to do so. The car will navigate, but it's not full
self-driving obeying all the rules of city traffic.

~~~
BoorishBears
The real world is not a train line. A car that moves on a track while you hold
a button, then stops when you let go is not safe.

Even when it is safe to move from your point of view, it doesn't exclude you
from having to signal to other drivers.

It doesn't exclude you from situations that were safe but are no longer and
require the car to move out of the way in a maneuver more complex than a few
feet forward or backwards.

-

Not to mention all the scenarios Tesla touts in the launch are also completely
contradictory to monitoring surroundings.

You wouldn't drive with a hand full of groceries, or your baby crying in your
lap, because you'd be driving distracted.

Why is Tesla bragging you can control the car while dealing with these
distractions when they haven't taken the steps to make it safe to do so?

------
tpmx
Related - if you like spending time on reddit: please consider visiting
r/realtesla. Sometimes it's a bit too negative, but mostly I find it a nice
counterbalance to that creepy "everything is _always_ awesome" subreddit.

------
sunstone
I like the Tesla cars and am definitely thinking of getting one, but the
self-driving stuff? I'll let that age for a decade or two.

~~~
Shivetya
I own a TM3. Unless Tesla can make "smart" summon work, the idea that they
will ever have full autonomous driving down is just a pipe dream. What the car
can do on a long drive is nothing short of amazing compared to any other
brand, but treat it like your kid with their new driver's license: you watch
over it like a hawk.

At most I use it to slide out of a space where someone got too close, usually
on purpose, or as a party trick. Let it drive out of my sight, oh, hell no.

~~~
throwaway66920
Isn’t that the intended use case? You’re not supposed to let it drive out of
your sight.

------
chvid
It looks like it is running the same software as my vacuum robot.

------
perspective1
Tesla, I'm disappointed. You say your cars are the safest on the road, that
your self-driving AI is safer than humans. You spar with regulatory bodies
along those lines -- arguing that you're protecting consumers. And then you
roll out a fluff feature that causes accidents.

~~~
rohit2412
Then you shouldn't be trusting Tesla's marketing. Did you think Elon Musk had
personally made breakthroughs that ML scientists at Google are struggling
with?

~~~
JJCoh
As much as Google likes to brand themselves as an AI company, it is pretty
much all marketing fluff. What Google has actually deployed is all super
simple modeling because of the scale they have to run at. They can't deploy
expensive models because they have to run across a billion+ users. They have
made some contributions on the NLP side, but mostly they just have the best
consumer packaging of AI.

------
new_realist
This feature was pushed out way before it was ready, all so Tesla can
recognize some deferred FSD revenue in Q3, which ends on Monday.

------
benj111
Apparently the car doesn't try to read road markings. I stand by the comment I
made 2 days ago in response to that:

"Isn't that below minimum viable product for something expected to be used
with other people around?"

[https://news.ycombinator.com/item?id=21089373](https://news.ycombinator.com/item?id=21089373)

------
GaryNumanVevo
Honestly, what's interesting here: people are rushing to try out the v1.0
software of a feature from a car company, potentially risking their expensive
vehicles, because of FOMO from all the other Tesla owners posting about Summon
before them.

------
grizzles
I love Musk, but this is a bad feature because he hasn't looked at the
actuarial case against it. TONS of car accidents happen in parking lots.

------
imglorp
This seems like a huge opportunity for the kind of insurance fraud you often
see in Asia. Someone arranges to be struck by a car (i.e. flings themselves in
front of it), then splits the medical payout with a crooked doctor.

------
fyfy18
Most of these look like user stupidity rather than anything wrong with the
car, especially the video of the guy who tries to summon his car across a
road.

I imagine summon is pretty dumb, but all Tesla's talk of autopilot and self-
driving cars has made people believe it is a lot smarter than it actually is.

~~~
missosoup
If it's a user-facing feature, and it fails catastrophically when some
preconditions aren't met, and it doesn't have the ability to verify those
preconditions, then allowing users to invoke it is probably not user
stupidity. It's just poor design.

The same topic is being explored in the 737 MAX clusterfuck where Boeing
failed to provide sufficient safeguards OR sufficient training for non-expert
users to manage catastrophic failure. People are stupid, yes. But you can't
blame people being stupid as an excuse for insufficient safety engineering,
just like you can't blame drag force being proportional to the square of the
velocity. These are just the constraints we have to engineer around.

~~~
hef19898
True! I have the impression that Tesla is trying to "innovate" itself out of
the corner they are finding themselves in. Already said multiple times, but
there are valid reasons why the automotive sector is working the way it is.
Sure, as everywhere a lot of stuff is just there because it was always there.
Go ahead and break that. But _understand_ first which ones can be ignored and
which ones can't.

It kind of worked for SpaceX by ignoring all the political bullshit in
aerospace. I'm slowly wondering whether that was luck or intention, though.

Just how the latest Tesla features ever became road legal is beyond my
understanding, though.

~~~
loceng
Tesla vehicles are the safest ever made - I'd say allow Tesla to innovate away
from whatever stagnancy the status quo auto manufacturers have allowed. User
education is part of good design - and perhaps driver tests will need to start
including proper use of such systems. They're likely road legal because the
technology is innovative and laws haven't been written for it yet - it takes
time for the regulatory processes to work.

~~~
FireBeyond
> Tesla vehicles are the safest ever made

Citation needed.

~~~
loceng
I mean, I guess I can clarify that they're the safest mass consumer vehicle?

I'll let you read through these links to decide where you stand:

[https://www.greencarreports.com/news/1124439_nhtsa-to-
tesla-...](https://www.greencarreports.com/news/1124439_nhtsa-to-tesla-stop-
claiming-your-cars-are-the-safest)

[https://www.cnn.com/2019/08/07/business/tesla-
model-3-safety...](https://www.cnn.com/2019/08/07/business/tesla-
model-3-safety/index.html)

...

It sounds like the regulatory body doesn't like that Tesla is claiming a 5.4
rating out of 5, as they say they don't give out ratings higher than 5 - and
it sounds like Tesla is using public data points to calculate that 5.4. We'll
have to wait for this to go to court to see whether Tesla's method of stating
a 5.4/5 rating is unreasonable - or whether the guidelines are, if they "cap"
what rating a vehicle can reach regardless of comparable data. As Tesla states
in one of those articles, ~40% of vehicles have a 5-star rating - so they
argue it's important to be more specific, to allow consumers to better
differentiate.

------
thrower123
I can't believe that this is a real thing.

Are we so lazy that we can't walk to our cars now?

~~~
sixothree
We waste vast amounts of space on parking spots. We destroy perfectly good
land just in case someone might want to park there.

This could change the shape of shopping and make the world a better place.

~~~
thrower123
And? This doesn't do anything about that if you still need to be line-of-
sight.

~~~
sixothree
That won't always be the case.

------
RaceWon
And so if a person accidentally drops their phone, the car eventually stops
when it meets an immovable object? Is this correct?

~~~
reitzensteinm
No, if you release the button it'll stop (barring some kind of fault). It will
also stop before hitting anything (again, barring some kind of fault).

It's not ready, though. The Tesla might not technically have been at fault in
this clip, but driving is _full_ of situations where one party makes a
mistake, and another takes corrective action to prevent an accident.

A self driving system that is never "at fault" but crashes regularly into
drivers that are at fault is not good enough. It needs to drive defensively.

The timing of this release is suspiciously close to the end of Q3. It concerns
me that this was rushed out the door early to recognize some revenue.

[https://twitter.com/DavidFe83802184/status/11777611732713922...](https://twitter.com/DavidFe83802184/status/1177761173271392256)

~~~
RaceWon
> No, if you release the button it'll stop

OIC, thanks for the info that I was too lazy to DuckDuck

------
danielovichdk
AI taking over and teasing owners.

Or just a bad idea for fat and lazy people.

~~~
dhdjdkd
Pretty much - ppl are too lazy to walk to their bloody cats and drive their
own cars because they want to keep starting at the car video they’re watching

------
samstave
It was a freaking typo, guys, Jesus Christ. iPads suck to type on sometimes.

------
mindfulplay
The carelessness and dangerous way this entire thing has been "rolled out" is
alarming.

People are touting this as some sort of "amazing" feature; I think Elon Musk
is a crazy whackjob who puts people's lives at risk.

All the videos I saw from Tesla were "ideal, well-lit" situations; well, of
course! University graduate students have Legos and basic robotics that do
that crap. Getting it to work in "normal" (read: edge-case) situations is the
most difficult part. And they conveniently just ignored that part.

It's insane people are putting these death machines on public roads.

Before people claim "this is just a parking lot", imagine how many parents
push strollers with babies in them or how many kids run around in parking
lots.

~~~
GhostVII
I think "death machines" is a bit hyperbolic, the speed is capped at 5mph, and
I imagine that Tesla is better at avoiding hitting humans in parking lots than
a normal driver since it has sensors all around it. Avoiding hitting people
isn't the hard part, navigating in a predictable way and following the
unwritten expectations of driving in a parking lot is, and that is something
Tesla doesn't seem to do a great job of yet.

~~~
rantwasp
yes. you have a couple thousand pounds of "self-driving" technology that does
not do well in parking lots (that's like 95% of where I usually park my car).
It's definitely not going to hit or kill anyone. Also, it's the responsibility
of pedestrians 100% to look for cars.

Give me a break. You can either handle what your feature advertises or you
cannot. This is in the cannot bucket - the same way autopilot is in the cannot
bucket.

The sad story with Tesla is that they should probably focus on the one thing
that works beautifully on the car and that is the fact that it's electric.
Focus on making a beautiful, insanely fast electric car. People love it
already - just stop w/ the gimmicks.

~~~
GhostVII
There is a big difference between not doing well in parking lots, and being
dangerous in parking lots. From the videos I have seen, it doesn't do well in
parking lots because it is too cautious - it is safe, but not very functional.
Now of course we have to wait longer before determining if it is safe or not,
but I would imagine that it is not hard for Tesla to avoid hitting people, and
if it does hit someone it is only going at walking speed anyways.

~~~
rantwasp
To me doing unpredictable things, in a parking lot, is dangerous. I would like
to not be hit, period.

And good luck with the liability when accidents happen. You explain that it
was the car and the car manufacturer explains it was you. Meanwhile I'm in a
wheelchair or worse for the rest of my life.

I am also not a big fan of experimenting with whether something is safe using
other people's lives. I would think that if you're going to put something out
there you will be liable if it's malfunctioning.

You want to have really cool features that don't work in the real world? Play
with them on your private estate - just don't force people to be guinea pigs.

Also as a side rant: we live in a world where a significant chunk of people
don't have clean water or access to the internet, but hey, I can remotely
summon my car! (but have to be sure it's in my line of sight...). How about we
focus on giving everyone clean water and internet and stop with the gimmicks?

------
ztjio
The feature is in beta, and it is enabled on cars lacking the latest hardware.
I think it would be reasonable to expect issues, especially for those cars.

I know it's very popular to cherry-pick incidents and attack Tesla these days,
but nobody else is pushing the envelope. And nothing about this feature is
putting anyone at significant risk. These are inconveniences that have
occurred, and nothing else. In the meantime, the pursuit of self driving cars
is pushed forward greatly through the collection of this data.

In the end, this will benefit everyone.

For now, we can sit back and enjoy the cringey videos this will produce.

~~~
consolasfont
> The feature is in beta

I. Don't. Care.

The way I see it is that if someone's Tesla crashed into my car in the parking
lot then as far as I am concerned it is 100% the fault of the owner and the
owner should compensate me 100%.

The fact that it wasn't technically their fault but instead it was bad
software matters nil to me. To me the only person I interacted with is the
Tesla owner, and that's who I will put blame on. I don't care if the owner is
able to defer blame to someone else. They can do that, but I want "mine" from
the owner.

~~~
Erlich_Bachman
What part or letter of the word "beta" don't you understand? Can't you just
not use the functionality until the company says that it is complete? There is
plenty of other functionality in a Tesla car, why not use that?

~~~
sschueller
The Public road is not a beta testing ground.

