To a more general point, when driving a car I should not have to ask, "is it going to work this time?" When I hit the brake pedal, I have a high degree of confidence that the car will scrub off speed. When I hit the gas, I don't worry about whether it will go (except in our old VW). When I set Autopilot, I expect that it won't plow into the back of the car in front of it. Whoops, scratch that last one. Instead, I'll constantly be wondering if it's going to do its job.
Of course they do that. The functionality is completely useless if you still have to fully engage with driving the car. Cruise control is useful for resting the driver’s leg. This doesn’t require a reduction in attention. Partial automation’s benefit comes specifically from relieving the demand for attention. To turn around and say that you still have to be fully attentive and in full control is to say that the automation is entirely useless.
But they need to be honest with themselves - in the public's mind these are 'self-driving' vehicles, and there's no doubt that Tesla has benefited from the idea that its cars have this technology, and from the corresponding image of its cars as a new generation of vehicle.
It isn't helped (or it is, depending on your perspective) by naming the system 'auto-pilot', which implies to the lay person an automatic system.
Really this isn't much more than lane-assist, adaptive cruise control and emergency braking, terms which do not imply you can delegate control to a computer.
There's really only so far Tesla can go in its attempts to educate the public. In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars. Autopilot as a name for the technology is in line with how it's used and expected to perform in commercial airplanes. You still have to know how to fly the plane and you still need to stay vigilant.
Thankfully, they have a long way to go before they reach the limits of how far "only so far" is.
In fact, they've been strenuously stating, over and over again, that these aren't self-driving cars.
The first link I came to when searching "tesla fully automated hardware" was this: https://www.tesla.com/blog/all-tesla-cars-being-produced-now... I read the page twice, and didn't see a single outright statement that the cars aren't self-driving. Oh, I could infer it, but Tesla never came right out and said it, let alone "strenuously".
EDIT: because my search string was a bit loaded, in that I knew it would find the link I'm looking for, let's be more fair with:
"tesla fully automated driving" - the link above is the first listing
"tesla autonomous driving" - the link above is the 2nd link.
In summary, should I care to know more about Teslas and their autonomous driving abilities, I will likely be led to a page that never mentions that these cars can't drive themselves, but strongly implies that they can.
They only state that in response to "Tesla car involved in crash"-style reports. Otherwise, the messaging they send out is "our cars are basically self-driving." At best, Tesla is guilty of sending out severely mixed messaging, and at worst, they're lying to people and claiming that it's not lying because there is a footnote that's telling the truth (that is designed so that no one will see it).
The Tesla fans keep repeating this, but it flies in the face of the description of Autopilot on Tesla's website. I can't tell if they are being deliberately ignorant, or what. But the fact that Tesla claims you can press a button and summon your car, at least in my mind, says "cars that drive themselves".
- Advanced cruise control
- Lane Following
Doesn't sound as sexy, though. :-)
Have you read the marketing on the website?
All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.
So it's not like this is a case of "it's called something that it isn't really", or mismanaged expectations. Tesla is explicitly stating the car drives itself. They still claim it's "Autopilot". Is what they are selling anything close to what is described there?
Furthermore, even if you're totally aware it's only assistance, there is a tendency to become more easily distracted. That's what a Google study found about five years ago.
I am a huge Tesla fanboy, and a stockholder. But be careful: if you are struggling to remain vigilant, turn off Autopilot.
Again, where do you get that this is a "misconception among the general public"? Read the marketing. Watch the demos. Listen to the claims.
"Tap your phone and the car will return to you". Where does being vigilant behind the wheel fit with that?
Cogging Cruise Control?
The video says it doesn't have room to stop before hitting the first vehicle.
Not with ABS, you won't. Yes, I've tested this exact scenario. YMMV, based on the manufacturer.
The Ars article was not as clear-cut.
I am not passing judgement, merely making an observation/prediction.
Detecting these patterns is easier than pedestrian and bike detection because the problem space is so much more bounded. It's hard to give them a pass for not doing it.
I, and the other drivers behind me, avoided hitting the deer because we're used to dealing with them. Should we give Tesla a pass if it hits one because detecting a stationary deer is harder than detecting a stationary car?
1. Indicate to the driver that they are drifting out of their lane
2. Warn about possible slow or stationary object ahead
If the driver knows that the car won't take action on its own, but will help them by telling them about things they can't see (or when they are distracted), then we wouldn't be blaming Tesla but the driver.
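The two warnings above can be sketched as a tiny alert-only loop. This is a hypothetical illustration, not any manufacturer's actual system: the function names, sensor fields, and thresholds (0.5 m lane offset, 3-second time-to-collision) are all my own assumptions.

```python
def assistance_alerts(lane_offset_m, obstacle_distance_m, own_speed_mps):
    """Return warnings for a purely advisory assist system.

    The system never steers or brakes on its own; it only tells
    the driver what it sees. All thresholds are illustrative.
    """
    alerts = []

    # 1. Lane-departure warning: alert if the car has drifted
    #    more than ~0.5 m from the lane center (assumed threshold).
    if abs(lane_offset_m) > 0.5:
        alerts.append("lane departure")

    # 2. Forward-obstacle warning: alert if the time-to-collision
    #    with a slow or stationary object drops below ~3 seconds.
    if obstacle_distance_m is not None and own_speed_mps > 0:
        ttc = obstacle_distance_m / own_speed_mps
        if ttc < 3.0:
            alerts.append("slow or stationary object ahead")

    return alerts

# Drifting at 10 m/s with an object 20 m ahead -> both warnings fire.
print(assistance_alerts(0.7, 20.0, 10.0))
```

The point of the design is that responsibility stays unambiguous: because the software only warns and never acts, the driver remains in control at all times.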
Disclosure: I currently own an Audi vehicle with this technology that has been available since 2009.
This is a stationary, non-metal car. It shows that Tesla's vision system cannot even recognize a car, much less any random obstruction on the road.
I re-watched the video and it really doesn't look like the Tesla is braking. The video says "unable to brake in time". That sounds misleading to me. It is not clear whether the Tesla tried to brake but the distance was too short, or whether it didn't attempt to brake at all. Which leads me to conclude there's an agenda here.
In the U.S. at least, it's highly unlikely that the driver in front would be found legally at fault in a similar accident. In most cases of rear-end collisions, the driver who impacts the rear of the car will be found at fault.