This comment from OP (beastpilot) is pretty frightening:
> Yep, works for 6 months with zero issues. Then one Friday night you get an update. Everything works that weekend, and on your way to work on Monday. Then, 18 minutes into your commute home, it drives straight at a barrier at 60 MPH.
> It's important to remember that it's not like you got the update 5 minutes before this happened. Even worse, you may not know you got an update if you are in a multi-driver household and the other driver installed the update.
Very glad the driver had 100% of their attention on the road at that moment.
Elsewhere from the comments: "Yes, the lane width detection and centering was introduced in 10.4. Our X does this now as well, and while it's a welcome introduction in most cases, every once in a while, in this instance or when HOV lanes split, it is unwelcome." So basically, if I'm understanding this right, they introduced a new feature that positions your car a little more consistently in the lane, at the small cost of sometimes steering you at full speed head-on into a barrier.
Remember Tesla's first press release on the crash and how it mentioned "Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times"? I imagine the number that have driven it successfully in that lane since that update was rolled out sometime in mid-March is rather smaller...
And that's exactly what makes Tesla PR so disingenuous: they know better than anybody what updates they did and how often their updated software passed that point, and yet they will happily cite numbers they know to be wrong.
So, now regarding that previous crash: did that driver (or should I say occupant) get lulled into a false sense of security because he'd been through there many times in the past and it worked until that update happened and then it suddenly didn't?
I'm a huge fan of Tesla and I thought that bit of PR was horrifyingly bad. It came across as completely dismissive of any further investigation or accepting of even the possibility of responsibility.
Now that these other videos are showing up, and further details (the update) are emerging, that PR should bite them in the ass hard enough that they decide never to handle an incident that way again.
I don't want this to kill Tesla -- I sincerely hope they make their production goals and become a major automobile manufacturer -- but their handling of this should hurt them.
I'm also curious if any of the people at Tesla saying, "we call it autopilot even though we expect the human driver to be 100% attentive at all times" have studied any of the history of dead man's vigilance devices.
I've got to take issue with you there. Tesla Engineering knows better than anybody what updates they did and how often they update etc.
Tesla PR knows nothing about what updates the engineering team did. At least some people in Tesla PR probably don't even know the cars update their software regularly.
It's bad practice for them to speak out of turn, but I can absolutely see the PR team not having a good grasp of what really indicates safety and their job is to just put out the best numbers possible.
I'm sorry but Tesla PR == Tesla. If they don't have a good grasp on this they should STFU until they do. That would make Tesla even worse in my book.
Their job is not to put out the best numbers possible, their job is to inform. Most likely they were more worried about the effect of their statement on their stock price than they were worried about public safety.
If they do put out numbers (such as 85K trips past that point), then it is clear that they have access to engineering data with a very high resolution; it would be very bad if they only used that access to search for numbers that bolster their image.
Well, I kind of came to hope that Tesla PR was less BS than other companies' PR. But that memo basically shows they're no different from all the rest, just twisting the truth to avoid negative stories until the fuss dies down.
The entire point of a PR department is to gather that information before releasing it. Yes, they wrangle it to put the company in the best light, but that doesn't mean they should be operating in ignorance. They have access to Tesla employees that outside reporters do not (or at least, should not) have.
It seems like the collective pack of Teslas could feed telemetry back to the mothership as regression tests for any new version. In other words, any change would have to re-drive (in simulation) a big sample of what has already been driven, especially the spots where the human had to correct.
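A minimal sketch of that idea, in Python. Everything here is hypothetical: `Frame`, `replay`, and `candidate` are made-up names, and nothing reflects Tesla's actual telemetry format or software.

```python
# Hypothetical sketch of replaying logged drives as a regression gate for a
# new steering policy. All names are illustrative; this is not Tesla's API.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Frame:
    lane_offset: float    # meters from lane center, as sensed at this timestep
    human_steer: float    # steering input the human driver actually applied
    human_override: bool  # True where the human had to correct the software

def replay(frames: List[Frame],
           policy: Callable[[float], float],
           tolerance: float = 0.05) -> List[Tuple[int, float, float]]:
    """Re-drive logged frames through a candidate policy.

    Returns (frame index, human steering, candidate steering) for every
    frame where the candidate diverges from what the human demonstrably
    wanted. Override frames are the most valuable test cases: they mark
    spots where the previously shipped software was already wrong.
    """
    failures = []
    for i, f in enumerate(frames):
        got = policy(f.lane_offset)
        if abs(got - f.human_steer) > tolerance:
            failures.append((i, f.human_steer, got))
    return failures

# A trivial candidate: steer proportionally back toward lane center.
def candidate(lane_offset: float) -> float:
    return -0.5 * lane_offset

# Two logged frames: one ordinary, one where the human had to take over.
log = [Frame(0.2, -0.1, False), Frame(1.0, -0.9, True)]
print(replay(log, candidate))  # the override frame fails: [(1, -0.9, -0.5)]
```

The point of the gate would be that a release candidate cannot ship while it still reproduces any behavior a human driver had to correct.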
> Very glad the driver had 100% of their attention on the road at that moment.
Yes, and 100% is critical. That required pretty quick and smooth reactions. The car started to follow the right-hand lane, then suddenly re-centered on the barrier. The driver had about a second to see that and correct. That's just the sort of situation where even a slightly inattentive driver might panic and overcorrect, which could cause a spinout if the roads were wet or icy or cause a crash with parallel traffic.
And endanger others who have decided to opt out of the beta test. Keep in mind that in any accident, all bets are off with respect to the traffic around you; even if you ram into a stationary barrier, you could easily rebound into other traffic.
That's the best kind of small print. The kind that disclaims all liability no matter what. I really don't get why we accept this stuff. It's akin to driving without a license. After all, as long as a Tesla 'autopilot' can't take a driving test it has no business controlling cars.
Not to mention the fact that, if their software screws up and kills you, they'll rush to Twitter in an attempt to smear you using personal data and deceptive statements.
I don't know why people use this thing. The cars are nice, and justify their purchase on their own, without this autopilot feature. But I don't think you can do good autopilot without lidar.
I really don't think the cars are that nice... this is an aside from the main issue here, but the build quality is poor and the design is bland, both internally and externally (that massive iPad-like screen - yuck!).
The big YouTuber who rebuilds salvage Teslas made the point that almost every Tesla-owning YouTuber has a video showing their car getting hauled away on a tow truck. One of the possible reasons that Tesla's service dept. doesn't provide maintenance history on salvage cars is that the amount of service would reflect poorly. It doesn't seem uncommon for a Model S with 80,000 miles to have several door handle replacements, at least one whole drive unit replacement, a battery contactor replacement, and an infotainment system replacement, in addition to all the recalled/redesigned components that are replaced before failure. I still think Teslas are quite nice, but a bullet-proof Honda they are not.
Not just your own risk... There are other people on the road who share that risk. It makes me angry at Elon Musk himself. First he didn't sterilize that stupid thing he sent into interplanetary space. Now he's putting people's lives at risk.
Have to disagree with you there. It is in a fairly normal solar orbit. There are a shitload of dead third stages and satellites in solar orbit. They were not sterilized. National space agencies have been launching things at Earth escape velocities without sterile procedures for 40 years. If it were designed to land intact on Mars, that would be another story.
Yeah, but those were coming back to Earth, where they came from. This is an object that could land on Mars or another planet.
Great measures have been taken in the past to ensure that other planets aren't contaminated before we've had a chance to understand their existing biology. Elon Musk is the kid who comes and knocks over some other kids' block tower for his own amusement.
There may already be life on other planets. Whether or not it's simple is a short-sighted human judgement.
Look at the big picture. We risk denying these planets the ability to evolve in isolation. That is a decision that cannot be reversed. Do we really want to do that? Maybe so, but it ought to be a conversation. Great measures have been taken to reduce the risk of contaminating other planets with Earth based lifeforms. Then, this belligerent guy comes along and disregards all that.
> Whether or not it's simple is a short-sighted human judgement.
You realize I was talking about your "contamination" (e.g. bacterial organisms, microscopic lifeforms, etc)?
> We risk denying these planets the ability to evolve in isolation.
Seems like a fairly limited risk. It is more likely these planets have no form of life at all and that we'd seed their only life (if it could sustain there).
> Great measures have been taken to reduce the risk of contaminating other planets with Earth based lifeforms.
You're conflating craft that were designed to land on other planets and look for life on them with a spacecraft that isn't.
> The driver had about a second to see that and correct.
And if there had been a crash, Tesla probably would have said that the driver had an unobstructed view of the barrier for several seconds before impact (conveniently omitting at what point AP started to swerve).
From the OP's comments on Reddit - he said that it happens about 90% of the time on that stretch and that he has more footage of the same exact incident.
He was likely prepared for it, which kinda makes it even scarier in a way. An inattentive driver would have totally botched this.
After working in large codebases, I discovered that there are all sorts of assumptions in the code, and that a small change in one function can break an assumption relied on a long way away. I came to the conclusion that any change basically invalidates all your tests. An update could introduce any possible problem that wasn't there before, so all your experience with the system as a driver is effectively reset to zero.
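A toy example (mine, not from the thread) of that kind of distant assumption-breakage: a function that happens to return sorted output, and a far-away caller that silently depends on the ordering.

```python
# Toy illustration of a local change breaking a distant, unstated assumption.
# load_ids_v1 happens to return sorted output; far-away code silently relies
# on that ordering for binary search. Dropping the sort looks harmless.
import bisect

def load_ids_v1():
    return sorted([42, 7, 19])   # sorted only as an incidental side effect

def load_ids_v2():
    return [42, 7, 19]           # "harmless" refactor drops the sort

def has_id(ids, target):
    # Distant caller: silently assumes ids is sorted and binary-searches it.
    i = bisect.bisect_left(ids, target)
    return i < len(ids) and ids[i] == target

print(has_id(load_ids_v1(), 7))  # True: [7, 19, 42], the assumption holds
print(has_id(load_ids_v2(), 7))  # False: 7 is in the list, yet the search misses it
```

The unit tests for `load_ids` itself would pass either way; the breakage only shows up in a caller the author of the refactor may never have read.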
Or to put it differently, I don't want to be driving on the same road as you and your rooted self-driving car. You may be a great sysadmin/coder, and the Tesla guys may be too, but both of you are changing random stuff without any communication with each other... I've seen enough servers.
It's one thing to remove the warning stickers from a gas-fired oven; it's another when you're commanding several thousand pounds of unintentional killing machine.
Having everything be completely open is really important, but I don't want you to have root or control over updates if you are using your car on a public road.
Only certified (by the government) updates should be allowed on self driving cars. It should be at least a misdemeanor to have your own code on the car.
Do these things really matter unless you are going to verify the software yourself? (and if you are, kudos to you!)
While, in principle, users could organize their own communal verification programs for open software, that does not happen in practice, even when the software is widely used and needs to be safe, or at least secure (OpenSSL...).
I just don't understand the willingness people have to put their lives in the hands of unproven technology like this. I mean, I don't even trust regular old cruise control in my cars and I keep my finger on the "cancel" button whenever I use it during long highway drives.