
After 6 months of working fine, Tesla software update drives at barriers again - SamuelAdams
https://np.reddit.com/r/teslamotors/comments/b36x27/its_back_after_6_months_of_working_fine_2019515/
======
SilasX
I would be a lot more forgiving of these screwups if Tesla didn't constantly
swear up and down that they've solved self-driving cars.

As far back as 2016 they were claiming they had full SDC capability above
human driver safety [1], and their recent Model Y announcement suggests that
the only thing holding it up is regulatory approval, and not failure to
achieve the desired spec.

>Model Y will have Full Self-Driving capability, enabling automatic driving on
city streets and highways pending regulatory approval, as well as the ability
to come find you anywhere in a parking lot.

[1] [https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware](https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware)

[2]
[https://news.ycombinator.com/item?id=19397942](https://news.ycombinator.com/item?id=19397942)
(linking HN because it's hard to find the text in the page with their UI)

~~~
slg
I don't want to dismiss your whole point, because it is certainly valid, but
that isn't really the issue here. It is entirely possible for bugs like this
to exist in the self driving tech _and_ for Tesla to be correct in their
claims that Autopilot is on average safer than a human driver.

It is obviously troubling to see self driving cars run into solid and
stationary objects, but human drivers do that all the time too. The question
shouldn't be whether this technology is perfect, it should be whether this
technology is safer than humans. You and I certainly don't have enough data to
say one way or another on that. I would bet even Tesla doesn't have enough
data to say definitively. However, writing this tech off as unsafe just
because it makes what seems like an obvious mistake is a great way to slow
progress, which will result in more human deaths long term.

~~~
rayiner
I think you're missing the forest for the trees. We're not even at the point
where we're comparing safety rates. Can a Model Y drive through DC? Can it
ignore traffic signals in favor of hand signals from a uniformed traffic
officer, or follow the directions of a construction worker alternating traffic
in turns through a road that they've decided to narrow to one lane?
Can it deal with an unannounced presidential motorcade rerouting traffic?
Because that happens literally every day in DC. If Tesla can't handle that,
it's inaccurate to say it has "full SDC," even before you get into an analysis
of safety.

~~~
slg
As far as I'm aware, Tesla has never claimed their cars will be able to handle
all those situations. "Self driving car" is a vague term that has different
definitions depending on who you ask. You are considering "self driving" as
level 5 autonomy and Tesla probably considers it somewhere between 3-4
autonomy. I don't think it makes sense to get angry because their definition
of the phrase is not the same as your definition of the phrase.

~~~
dragontamer
Tesla skirts morality.

The feature is called 'Full self driving'. Their sales reps regularly told
customers to take their hands off the steering wheel of their "Autopilot".
Sure, the fine print says "always pay attention", but there's an entire
marketing scheme going on here which is borderline dishonest.

~~~
darkpuma
> _" Sure, the fine print says "always pay attention","_

In fact the not-so-fine print in the user manual says to keep your hands on
the wheel, but the promotional material tells a different story, as does Musk
when he takes his hands off the wheel on national television.

~~~
salawat
Honestly, I'm surprised a false advertising case hasn't been pursued if this
is the case. Marketing practices this inaccurate are well outside the puffery
defense.

------
tbabb
I have been yelling about this for a long time: Tesla is not going to be able
to deliver full self-driving as promised. For one, they don't have the
hardware (ranging is terrible; they need stereo cameras), and for another,
their software strategy seems to be a dead end.

They need sensor fusion. The system needs to make maximum use of all the
information available to it: Where is the road striping? Where are the other
cars going? Where are the road signs and signals? (If there's one in your
path, you certainly shouldn't drive into it!) Are there camera-visible
obstructions? What were the interpretations and actions of previous Tesla
trips along the same route?

In these problem cases, all data _except_ the left and right lane striping
seems to be completely ignored. There was even more information at the fatal
offramp location (cross-striping over the lane separation zone), which the
vehicle drove straight over. The system is not making maximum use of the
information available to it; in fact it is using hardly any of it at all,
fixating instead on what it _thinks_ is the single most salient piece of data.

Sensor fusion algorithms tend to behave the opposite way-- each additional
piece of data informs the interpretation of _all_ the other data. You can have
very poor-quality data, but if it is even moderately over-constrained, your
state estimate can be very good in spite of it. I think it would be completely
reasonable to have a neural net in the loop of a sensor fusion algorithm, with
fusion constraints informing the NN's interpretation, and the NN's estimates
feeding back into the fusion algorithm as uncertain data.
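As a toy sketch of why fusion helps (all numbers hypothetical), inverse-variance weighting shows how even several noisy estimates of, say, lane-center offset constrain each other:

```python
def fuse(estimates):
    """Fuse independent Gaussian estimates (mean, variance) by inverse-variance weighting."""
    precision = sum(1.0 / var for _, var in estimates)
    mean = sum(m / var for m, var in estimates) / precision
    return mean, 1.0 / precision

# Three noisy views of the lane-center offset in meters (made-up values):
# lane striping, lead-car path, and a map prior.
fused_mean, fused_var = fuse([(0.3, 0.5), (0.1, 0.4), (0.2, 0.2)])
# The fused variance is smaller than any single source's variance.
```

Fixating on one source is equivalent to sending the other variances to infinity, which is exactly the failure mode described above.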

IMO Tesla will do at _least_ one of:

* Very expensively retract their promise of full self driving for delivered vehicles

* Completely overhaul/redesign their driving software and start again nearly from scratch

* Get into a regulatory/legal tangle with the NHTSA/courts/DOJ over the deaths their system is causing.

~~~
tim333
Musk is more optimistic than you - says late 2020 for sleep while the car
drives
[https://www.youtube.com/watch?v=Y8dEYm8hzLo&t=10m25s](https://www.youtube.com/watch?v=Y8dEYm8hzLo&t=10m25s)

Also quite interestingly he says there will be a big jump forward in quality
when they switch to their own computing hardware (18m40 or so)

~~~
x38iq84n
> late 2020 for sleep while the car drives

Musk is often overly optimistic and he keeps underestimating the problem at
hand. I call BS on this; it won't be ready in 5 or even 10 years. And then
there is regulatory approval.

~~~
tim333
Indeed though he's delivered some stuff. It'll be interesting to see how it
goes.

~~~
tim333
Though the 2017 LA to NYC drive is behind schedule.
([https://www.theverge.com/2016/10/19/13341100/tesla-self-driving-autonomous-road-trip-la-nyc](https://www.theverge.com/2016/10/19/13341100/tesla-self-driving-autonomous-road-trip-la-nyc))

------
SamuelAdams
Additional context: on March 23, 2018, a Tesla Model X, while using autopilot,
drove into a barrier and killed the driver. This was fixed in a software
update, but the issue seems to have resurfaced.

NTSB analysis of the March 23 2018 collision:
[https://www.ntsb.gov/investigations/AccidentReports/Pages/HW...](https://www.ntsb.gov/investigations/AccidentReports/Pages/HWY18FH011-preliminary.aspx)

Tesla's statement of the March 23 2018 collision:
[https://www.tesla.com/blog/update-last-week%E2%80%99s-accident](https://www.tesla.com/blog/update-last-week%E2%80%99s-accident)

~~~
MagicPropmaker
Well, it wasn't really "fixed" -- the driver is still dead.

~~~
craftyguy
Sure, if you construe the meaning of 'fixed' to mean 'roll back all events and
time', which I don't think anyone expects. They probably fixed the software
bug.

~~~
falcolas
I think that it's important to note that "we'll fix bugs via live patches" is
a deadly decision when it comes to self driving software.

Just discussing the incident in terms of a "software bug" does a disservice to
the severity of the issue.

~~~
gwbas1c
Remember, autopilot and autosteer are not self-driving. Tesla is very explicit
that the driver must remain alert, and supervise, at all times.

That being said, drowsy driving is a thing, and it's very easy to fall asleep
behind the wheel. The car really needs a better strategy to handle this
situation.

~~~
netsharc
Tesla's legal department is very explicit, the marketing department? Not so
much...

------
steelframe
The day I took delivery of my 2016 Model X with AP 1.0, Tesla announced AP
2.0. A friend of mine immediately ordered a Model X with AP 2.0 and rubbed it
in my face.

For the entire next year, my AP 1.0 (which is non-Tesla technology -- Mobileye
rocks) had no trouble doing adaptive cruise control and lane assist. Meanwhile
his AP 2.0 would brake suddenly and swerve all over the place. It took a full
year of OTA updates before his AP 2.0 was finally on-par with the
functionality that I had the whole time. Of course, by then Tesla pulled a
"we're sorry, but the princess is in another castle" and came out with AP 2.5.

Now this kind of stuff doesn't matter to me. I got tired of that company's
shit and have pulled out of the Teslasphere entirely. I'm now driving a non-
Tesla EV, and I'll never look back. I'm also letting my government
representatives know that they should support a common EV charging standard
and keep Tesla so-called "self-driving" shit off public roads.

------
sschueller
Eventually we need the NTSB to certify updates before they are pushed out over
the air, similar to what the FAA does.

These OTA updates are not OK for large machinery; they endanger not only the
Tesla driver but everyone else on the road.

~~~
kitsunesoba
I agree, but is there a way that could happen without slowing the process to a
crawl? Depending on what’s involved it could easily push the gap between
updates from months to years.

~~~
toomuchtodo
This is a feature, not a bug, and should be expected in life critical systems.
Would you want Boeing to push updates out as frequently as Tesla does with the
same sparse release notes Tesla provides (“bug fixes”) when safety system
functionality is modified?

Disclaimer: I own a Model S

~~~
anth_anm
Imagine if Boeing got away with just pushing a software update to the 737 MAX8
and saying "it's fixed now".

~~~
CamperBob2
Well, it might have prevented the second crash if they had treated the matter
with a bit more urgency. Depends on whether the two incidents really had a
common cause, which is looking like the case.

Of course it's also looking like they should have grounded the fleet after the
first crash, given the history of the aircraft prior to its last flight.

------
erobbins
I think Tesla will eventually have a catastrophic accident and be sued and/or
criminally prosecuted into oblivion. I feel sorry for the people who are going
to have to die for this to happen.

This trope that humans are bad drivers is, in general, crap. Humans are very
good drivers. The US has 7.3 deaths per billion km driven. This means if you
drive 50km a day, every day, you are (essentially) guaranteed to die... after
7500 YEARS. You have less than a 1 in 10 million chance of dying on any given
trip you take. That is NOT risky, and is NOT dangerous.
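As a quick sanity check, the parent's figure holds up as back-of-the-envelope arithmetic:

```python
# Sanity check of the fatality-rate arithmetic above.
deaths_per_billion_km = 7.3
km_per_day = 50
km_per_death = 1e9 / deaths_per_billion_km           # ~137 million km per fatality
years_to_expected_death = km_per_death / (km_per_day * 365)
# roughly 7,500 years of daily driving per expected fatality
```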

~~~
kirse
_I feel sorry for the people who are going to have to die for this to happen._

I try to explain this to friends who are far too optimistic on self-crashing
car technology. Self-driving cars (SDC) ultimately trade one class of problems
that result in death (human-attention deficit) for another class of growing
issues (sensor malfunctions/incapabilities, software defects).

Ultimately SDC deaths end up as bugs/features on some random devteam's
backlog, and I have no desire to have a JIRA ticket named in my honor.

In my opinion, by the time all the money and effort is spent making SDC's
capable of successfully driving from point-A to point-B in the near infinite
possible conditions they could encounter, it would have been 50x cheaper to
simply build a fully modernized high-speed rail network over existing highways
and roads.

~~~
1123581321
You are saying that the cost to develop successful self-driving cars is 1-50
quadrillion dollars, using an optimistic estimate of rail costs. That does not
seem reasonable. Perhaps $200 billion across all companies has been invested
in self-driving cars so far (e.g., Waymo is just a fraction of that).

~~~
kirse
Agreed, my SWAG was heavily weighted toward the W. Updated. I still think
we're 50 years off from near-flawless SDCs though, assuming the current level
of effort.

------
paul7986
Scary... the mistakes/bugs of the software developers behind these robotic
cars aren't just going to bring down a business application (lose money) but
kill their customers and innocent drivers.

Progress to the point where it's safer is going to be a killer, and we the
drivers on the road are unwilling guinea pigs for billionaires' dreams/goals.

~~~
nathanaldensr
Unwilling? Don't buy a Tesla. Don't believe the hype. It's that simple.
Granted, there is nothing stopping _any_ ECU from killing you, but I'd trust a
company like Honda _way_ before Tesla.

~~~
Tomte
I generally have zero interest in cars and don't follow the new models, but my
impression from articles I've seen in recent years is that Volvo is actually a
top contender for driver-assistance systems (when you don't fool yourself
into thinking you have an autopilot, but you really want sensible safety
augmentation features).

Is that impression accurate?

~~~
mongol
I have no idea. Volvo certainly has the culture to do something good with it.
But do they have the money and resources required? Today they are owned by the
Chinese Geely. I don't know what partnerships and capital they can work with
to compete with the top contenders (who I assume have Silicon Valley capital
behind them).

~~~
sangnoir
Geely has invested a lot into Volvo[1], and Volvo are innovating in
interesting ways[2] . I would choose the electric Polestar 2 over a Tesla in
the same price-bracket due to Volvo's culture of safety. Hopefully the cheaper
versions will be released soon.

1\. [https://www.bloomberg.com/news/features/2018-05-24/volvo-is-better-than-ever-thanks-to-this-chinese-billionaire](https://www.bloomberg.com/news/features/2018-05-24/volvo-is-better-than-ever-thanks-to-this-chinese-billionaire)

2\. [https://arstechnica.com/cars/2019/02/volvo-spinoff-polestar-reveals-its-battery-ev-the-polestar-2/](https://arstechnica.com/cars/2019/02/volvo-spinoff-polestar-reveals-its-battery-ev-the-polestar-2/)

------
bsaul
I wonder how unit tests work with NNs (or if they're even a relevant concept
at all).

You could replay some testing video frames and make sure the objects are
correctly identified, but I suppose that's already what training is about...

If an issue like this resurfaces, does it mean that the original frames
leading to the 2018 accident aren't part of the training set (or at least
frames from someone driving in this kind of scenario)?
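A minimal sketch of how that could be wired up (the `model` interface and frame names here are hypothetical): frames from past incidents are held out of training entirely and become a permanent regression suite that every candidate build must pass.

```python
# Replay-style regression sketch: frames from past incidents are held out of
# training, and every candidate model must still flag the hazard in them.
# `model` stands in for the real perception stack (hypothetical interface:
# it maps a frame to a dict of {label: confidence}).
def regression_check(model, incident_frames, label="barrier", min_conf=0.9):
    return all(model(f).get(label, 0.0) >= min_conf for f in incident_frames)

# Stub "models" standing in for two software builds:
fixed_build = lambda frame: {"barrier": 0.97, "lane_line": 0.99}
regressed = lambda frame: {"barrier": 0.30, "lane_line": 0.99}
frames = ["frame_2018_incident_001", "frame_2018_incident_002"]
# regression_check(fixed_build, frames) passes; the regressed build is caught.
```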

~~~
skwb
You don't unit test a NN. You can unit test certain functions, but
fundamentally this is an integration test.

This is why serious scientific training is needed to understand these complex
systems when health and safety are on the line.

~~~
leggomylibro
"Autonomous vehicle integration test track" could make for a great setting in
a spy thriller. The villain could own the megacorporation which makes the
cars, and the heroes could find evidence of their evil plot among the
sprawling acres of labs and Potemkin streets. But then, in the distance, the
sound of revving engines...

Seriously though, I wonder if that sort of physical test track will become
popular. You would load your build onto an idle car, queue it up, and make
sure that it didn't hit any of the silhouettes which spring up, unusual
traffic and weather conditions, etc. They must already do that in some
capacity, right?

~~~
saalweachter
> "Autonomous vehicle integration test track"

Just say "the 101", it's shorter.

------
x38iq84n
I have always wondered... Does the AP have some higher-level notion of object
permanence, continuity (road behind a horizon or after a curve) and things
like that? Does it track a pedestrian that is momentarily hidden behind an
obstacle and will probably reemerge in a second or two on the other side? Does
it expect that kids may run after that ball that just flew from behind a car?
Does it continuously track and improve classification of all objects in the
field of vision, with their trajectories and speeds if they are moving?
Personally I don't think it does, otherwise it would not erratically slam into
clearly visible and marked large objects in its way, and it would be aware of
a truck moving in a perpendicular direction, and so on. I am of the opinion
that without such higher-level awareness it can never succeed. I hope to learn
about the state of the self-driving art.

~~~
dragontamer
I don't think neural networks are wired to "remember" things. In theory, they
could be hooked up that way. But your typical convolutional neural network is
looking at things frame-by-frame.

In theory, ANNs could have an output layer that passes data from one frame to
another frame to assist things. But there's no real programming to "hardcode"
something like object permanence into an ANN. You pretty much throw a bunch of
data into the system and hope for the best.

~~~
hemogloben
NNs are just the first step in the pipeline. Their outputs (detected objects,
segmentation, etc) will be piped into other software that builds higher level
models.

Considering the path-planning requirements, I would be absolutely shocked if
Autopilot wasn't building history models and estimated paths for the objects
around the vehicle (other cars, etc.).
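A toy sketch of such a history model (purely illustrative, not Tesla's actual pipeline): a constant-velocity track that coasts through a short occlusion, which is exactly the "object permanence" asked about upthread.

```python
class Track:
    """Constant-velocity track that survives short occlusions (illustrative only)."""

    def __init__(self, pos, vel):
        self.pos, self.vel, self.missed = pos, vel, 0

    def predict(self):
        # Advance the track one frame even with no detection: object permanence.
        self.pos += self.vel
        return self.pos

    def update(self, measurement):
        if measurement is None:       # occluded this frame
            self.missed += 1
            return self.predict()     # coast on the last known velocity
        self.missed = 0
        self.vel = measurement - self.pos
        self.pos = measurement
        return self.pos

# A pedestrian at x=0 moving +1 m/frame disappears behind a truck for two frames:
t = Track(pos=0.0, vel=1.0)
positions = [t.update(m) for m in (1.0, 2.0, None, None, 5.0)]
# The track keeps moving through the occlusion instead of vanishing.
```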

~~~
JanSolo
Agreed; I imagine they use neural networks to detect and classify objects
which are then saved into a scene-graph for use in pathing.

I expect what happened was that they trained their NNs for improved detection
in one area but unknowingly reduced it in another. Perhaps now it can detect
tricycles 99% of the time, but barrier detection went down to only 30%. Having
worked with NNs, it's very common to see gains in one domain that come at the
cost of reduced performance in another.
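A sketch of the kind of release gate that would catch this (class names and recall numbers are made up):

```python
# Release-gate sketch: a new model may not ship if any safety-critical class
# regresses beyond a small tolerance (all class names and numbers hypothetical).
CRITICAL = {"road_barrier", "pedestrian", "vehicle"}

def safe_to_ship(old_recall, new_recall, tolerance=0.01):
    return all(new_recall[c] >= old_recall[c] - tolerance for c in CRITICAL)

old = {"road_barrier": 0.97, "pedestrian": 0.95, "vehicle": 0.99, "tricycle": 0.80}
new = {"road_barrier": 0.30, "pedestrian": 0.96, "vehicle": 0.99, "tricycle": 0.99}
# The tricycle gain doesn't excuse the barrier regression: safe_to_ship(old, new) is False.
```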

~~~
stevenjohns
They must have known. I haven't worked with NNs for a few years, but I don't
believe the methodology has changed to the point where you would stop testing
against different held-out data sets.

Barriers are a pretty big part of driving on roads and highways, and the only
way detection could have been unknowingly reduced is if they just weren't
testing the NN against data containing them.

------
mslev
Watching the video, '2019.5.15 - Try 2' is interesting. You can see the car
moving normally, then it starts to follow the black crack in the road and
moves to the right; at this moment, the white Nissan in front is blocking the
white lines ahead where the lanes actually split.

Does AP use other cars as reference points, or just the road? Ideally in this
situation it would be both: "The line has disappeared, and there's a new one
now, but that car went over it". Instead it seems to just be following
whatever lines it can see. Does that make sense?

Note- not at all defending the AP behavior here. Just thinking out loud.

~~~
treis
>Watching the video, '2019.5.15 - Try 2' is interesting. You can see the car
moving normally, then it starts to follow the black crack in the road and
moves to the right; at this moment, the white Nissan in front is blocking the
white lines ahead where the lanes actually split.

It seems like it's failing in different ways:

Try 1 - Toughest to tell, but it looks like it failed to recognize any lines
and kept going straight, which aimed it at the barrier. Hard to tell if the
car would have recovered.

Try 2 - Looks like the car tried to go left into the closed lane. Seems like
an error in detecting the barriers closing the road. I'd guess that it would
have avoided the concrete barrier and driven down the closed lane.

Try 3 - This one looks like it picked the wrong lane marker to be the left
side of the road. In that it thought the right lane marker of the closed lane
was actually the left lane marker. This one probably ends up with a smashed
car and dead driver.

------
alanh
My Model 3 suddenly changed lanes today for no discernible reason. I think it
mistook the adjacent lane for a continuation of mine. I should have hit
'Record' to save the footage from TeslaCam.

(Notably, AutoPilot is not supposed to change lanes without explicit
confirmation from the driver, which is clearly illustrated on the
dashboard/panel.)

It's a stretch of road on which I have previously used AutoPilot many times.

~~~
gpm
Recently I accidentally changed lanes in an intersection. The road had 3 lanes
in each direction (+ separate streetcar tracks down the center), including the
right lane which was required to turn. I was in the center lane. After the
intersection the road was still a 3-lane road, which didn't quite register in
my brain. I moved from the center lane to the right lane (figuring the right
lane had exited, so I was supposed to be in the new right lane) when I should
have remained in the center lane.

While I am a novice driver, I've gone through that intersection before without
blinking or doing the wrong thing. It's not a particularly complicated
intersection.

Anyways, point is, driving is surprisingly hard. I think counting anecdotes on
the internet probably gives you a sample heavily biased against Tesla, because
most people don't go post "so I did this stupid thing" but they do post "so my
car did this stupid thing".

~~~
alanh
My car and I are stupid in different ways. It has perfect attention and will
always brake in time to avoid a fender-bender in traffic. It’s also a lot
worse than I am about reading and communicating intent with other drivers or
simply recognizing the objects around us.

------
systemspeed
With demand for self-driving vehicles as high as it is, yet with the lag in
advancement of computer vision, I think it's about time for a serious
discussion about smart roads. I realize this isn't entirely relevant, but I
can't be the only one thinking that we're missing the forest for the trees by
trying to solve a transportation problem while simultaneously solving a vision
problem.

~~~
progfix
Might as well put them on rails (Smart-Rails(TM)) and make a railway network
for smart people. Transportation problems solved!

------
jaimex2
Warning: Autosteer is intended for use only on highways and limited-access
roads with a fully attentive driver. When using Autosteer, hold the steering
wheel and be mindful of road conditions and surrounding traffic. Do not use
Autosteer on city streets, in construction zones, or in areas where bicyclists
or pedestrians may be present. Never depend on Autosteer to determine an
appropriate driving path. Always be prepared to take immediate action. Failure
to follow these instructions could cause damage, serious injury or death.

[https://www.tesla.com/content/dam/tesla/Ownership/Own/Model%...](https://www.tesla.com/content/dam/tesla/Ownership/Own/Model%203%20Owners%20Manual.pdf#page=75)

~~~
kurtisc
And then... [https://www.businessinsider.com/elon-musk-breaks-tesla-autopilot-rule-2018-12?r=US&IR=T](https://www.businessinsider.com/elon-musk-breaks-tesla-autopilot-rule-2018-12?r=US&IR=T)

------
Tomte
The comments are scary. People defending Tesla's misleading advertisement re:
full self-driving hardware etc.

Because clearly "having full self-driving hardware" only means the hardware is
there, not that the car can actually self-drive.

I bet those people find themselves misunderstood in pretty much every
discussion with people outside their nerdy circle.

~~~
mikejb
On the earnings call, Musk even claimed that the Navigate on Autopilot feature
set is "full self-driving" [1]

[1] [https://youtu.be/oB0sAGykWT8?t=2895](https://youtu.be/oB0sAGykWT8?t=2895)

------
mcguire
One might suspect, given the reintroduction of the bug, that Tesla doesn't
understand their code.

------
syntaxing
Tesla should just partner with or acquire a LiDAR startup like Baraja [1]
already... They can let the computer vision do all the magic they want; just
have one scanning LiDAR as a redundant system so that the car doesn't run into
anything in front of it.

[1] [https://www.baraja.com/](https://www.baraja.com/)

~~~
ip26
I don't know why, but supposedly they've made "no LiDAR" their hill to die on.

~~~
syntaxing
Yeah, it's such a weird mentality. Their new patent on a "CNN" ASIC is pretty
neat but still doesn't solve a lot of their problems. Are they banking on a
magical depth CNN architecture to be released or something?!

------
gwbas1c
The video really doesn't convey what happened. It's not really clear when the
driver took over.

My Model 3 is very odd when going through forks like this. It just has trouble
picking one side of the lane or the other. Yesterday I got a little scared
when it swerved back and forth while trying to figure out how to take an exit.

------
sabareesh
Since the Model X incident I have made sure not to drive in the first lane. I
always drive in the 2nd lane, and it is much better.

~~~
bdcravens
For a $700 car I might consider crazy compromises, but $70k+? Absolutely not.

~~~
warp_factor
That's what I find so interesting with Tesla owners. They spend a fortune on a
car, then they minimize every single issue they have with it. My explanation
for this is that for a lot of owners, the car is a way to be part of a hyped
group more than a utilitarian object (which is what a car should be).

~~~
mikestew
Happens with a lot of stuff, certain American-made motorcycles, for instance.
You're in the club now, and the only way to stay _in_ the club is to carry the
manufacturer's water. In no case are we to admit that we spent a bunch of
money on something that does a poor job of fulfilling its advertised purpose.

~~~
bdcravens
Or more relevant to the HN audience, Apple.

~~~
mikestew
What, Tesla wasn't relevant enough for you to not take an "obligatory" dig at
Apple product owners?

~~~
bdcravens
Trust me, if I'm taking a dig, I'm pointing at myself; I have probably $7k of
Cupertino products within arm's reach at the moment. But I have to admit
there's times when the comparison is apt.

------
Scoundreller
That’s a very strange way to close a highway.

Where are the blinkenlights? The words? The flashing arrows? The plastic
jersey barriers?

The buckets full of sand or water?

The repainted lines directing you to the right?

Can’t tell if this is temporary or long-term closure, but if it’s a multi
month thing, I would expect more “Don’t you dare fork to the left” signalling.

Not everyone is a local.

~~~
jakeogh
Why would any of that be relevant?

~~~
Scoundreller
Because the same things that confuse humans can confuse computers that are
largely trained with human data.

~~~
dymk
Millions of drivers per year use roads like this in California and Washington.
They’re not very confusing for a human driver who’s paying attention.

~~~
Scoundreller
I’m from Toronto, so I never underestimate drivers’ ability to do stupid
things, like driving into streetcar-only tunnels past the end of the asphalt,
despite ample signage. 26 times.

[https://torontolife.com/city/queens-quay-streetcar-tunnel-drivers-event/](https://torontolife.com/city/queens-quay-streetcar-tunnel-drivers-event/)

~~~
BoorishBears
How many hundreds of thousands of cars went through for those 26 to make that
mistake?

Now imagine if every Toyota did this instead.

FSD is being touted as a "force multiplier" for safe driving, but it could end
up being a force multiplier for deadly mistakes.

~~~
Scoundreller
Few care about the 26 that missed the signs.

It’s the multi-hour shutdown of the city’s 3rd busiest transit route that
connects to the continent’s 3rd busiest train station that impacts a lot of
people.

------
newnewpdro
I find it absurd how the NHTSA doesn't treat Tesla similarly to how it treated
Toyota ~10 years ago with the sudden unintended acceleration controversy. [1]

Tesla has demonstrably flawed and dangerous vehicles operating on our roads.
These persistent autopilot bugs are killing people. If we treated the Tesla
autopilot as a licensed driver, it'd have its license suspended. (Not that it
could have passed a driving test in the first place)

[1]
[https://en.wikipedia.org/wiki/2009%E2%80%932011_Toyota_vehic...](https://en.wikipedia.org/wiki/2009%E2%80%932011_Toyota_vehicle_recalls#Investigations)

~~~
DarmokJalad1701
> These persistent autopilot bugs are killing people.

Are there any numbers on this?

------
cmurf
So far, Tesla automation seems to handle only edge cases. I have never
rear-ended or side-swiped anyone; getting into such an accident, or avoiding
one, is an edge case for me. Useful, but not really automation.

For it to even have a chance, during lane keeping, of doing something only a
sleeping or suicidal driver would do? It's utter bullshit. It's less safe than
a human.

Autonomous driving is overhyped vaporware. Autonomous airplanes would be
easier than cars, and we are still nowhere near that.

------
ummonk
Maybe it's merely that I'm used to cones rather than barriers of that sort,
but that stretch of road looks rather confusing to me as a human as well.

------
FreedomToCreate
Tesla can only really gather depth data from their radar, while the cameras
run a DNN to detect features and make adjustments to steering and speed based
on fusing those two pieces of data together. An error in the radar, or the DNN
detecting features incorrectly because of lighting, a road color change, or an
object on the road the model was not trained on, can cause problems like this.

~~~
zmarty
"Tesla can only really gather depth data from there radars" \- they can also
do it visually through depth from motion
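In its simplest form (a 1D sketch with made-up numbers; real structure-from-motion handles full 3D geometry and forward motion), the ego-motion between two frames acts as a stereo baseline:

```python
def depth_from_motion(baseline_m, focal_px, disparity_px):
    """Triangulate depth from two camera positions (simplified 1D sketch).

    baseline_m:   distance the camera moved between the two frames
    focal_px:     focal length in pixels
    disparity_px: how far the feature shifted between the frames
    """
    return baseline_m * focal_px / disparity_px

# A feature that shifts 20 px while the car advances 1 m (focal length 1000 px,
# hypothetical values) sits about 50 m away:
depth = depth_from_motion(baseline_m=1.0, focal_px=1000.0, disparity_px=20.0)
```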

------
sandos
What I don't understand is: that yellow sign is actually very visible compared
to other things the neural nets can detect. Why isn't it being detected?
Recognizing things as visible as that sign should not be a huge problem.

------
dreamcompiler
I used to be a big supporter of Tesla, but I now have to apply Internet-of-
Shit Rule #1 to them: No software updates _ever_ without explicit owner
approval.

And "owner" is me, not Elon Musk.

------
jtaft
Can they simulate their sensors' input pretty well? Do they use car-driving
simulators (video-game like) to see what would happen in similar scenarios?

Of course, real world testing would still be necessary.

~~~
jaimex2
When a customer disengages or corrects AP, that period's sensor data is
recorded and uploaded to Tesla. The uploads are curated and fed into the
neural network training data-set.

They basically have an infinite amount of real-world data coming in.

------
lgleason
Tongue in cheek... this is planned obsolescence. A new model with upgraded
features is announced, so you start to brick the old ones by making them run
into things. :) Of course, there is that pesky issue of potentially
injuring/killing the driver and passengers....

In all seriousness, these things are getting better, but Skynet is not here
yet. Just because a company, governmental agency, etc. says something is safe,
you always need to evaluate things yourself... and it's probably not a good
idea to rely on these autopilots unless you want to win a Darwin award. I love
the cars, but the hype around stuff like this is a bit annoying.

------
GoToRO
That's one way to keep drivers alert I guess... Add a little bit of randomness
into your boring drive.

------
perfunctory
It troubles me that we use the term "software bug" to describe this.

------
anth_anm
Said it before, saying it again: OTA updates to safety-critical systems are
not a feature. They're a bug.

