The obvious question to me is: if "Full Self-Driving" doesn't imply autonomy, what is it supposed to imply? Here is the relevant quote from the article:
> While testifying last week, Tesla engineer Eloy Rubio Blanco repudiated the allegations that the name “Full Self-Driving” was chosen to deceive the public. Blanco stated that Tesla drivers did not believe that their vehicles were autonomous.
It doesn't say anywhere else how he thinks the public is supposed to interpret "full self-driving." As a native speaker of English, I'm baffled to think of what else it could be intended to mean.
> if "Full Self-Driving" doesn't imply autonomy, what is it supposed to imply?
100% agree. Furthermore, if neither "Full Self-Driving" nor "Autopilot" is meant to imply autonomy, what term of art would imply it?
Both of those terms imply autonomy IMHO, and if we're proposing to (eventually) introduce a third term which Tesla agrees would imply autonomy, then it feels like splitting hairs to argue that the first two don't but the third does.
Pretty much all of the other car companies have managed to settle on rather generic[0] names for their level 2-3 autonomy implementations: Ford BlueCruise, GM Super Cruise, BMW Driving Assistance Professional, Mercedes Drive Pilot (the first certified level 3 system in the US), Toyota Safety Sense 3.0, Rivian Highway Assist, etc.
The names all imply that the car is doing something to help you drive, but none of them imply further meaning about their capability or even suggest anything about their autonomy. Tesla is pretty much the only company to embrace names--Autopilot, and later, "Full Self-Driving"--that come loaded with cultural and linguistic meaning.
Doing so allowed Tesla to borrow the excitement and prestige of future developments: look not so much at where we are, but where we're going and how some small pieces of that future are there in the current system. Which worked great for their marketing efforts, but didn't do so well in educating drivers about what their cars were actually capable of or doing.
Anyhow, the SAE eventually developed a taxonomy with six levels of driving automation, which was later adopted by the Department of Transportation.[1] It's not a perfect taxonomy, but it does illustrate that autonomy is a spectrum. It's definitely intended more for the industry and regulators than for consumers, though; if a car has "level three conditional automation" systems, that needs additional context to be useful to consumers.
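For reference, the six J3016 levels can be summarized roughly like this (a paraphrase in code form; the exact names and definitions are in the SAE standard itself):

```python
# SAE J3016 driving automation levels, paraphrased.
SAE_LEVELS = {
    0: "No Driving Automation",          # warnings at most; human does everything
    1: "Driver Assistance",              # steering OR speed assist, not both
    2: "Partial Driving Automation",     # steering AND speed, human supervises constantly
    3: "Conditional Driving Automation", # system drives, human must take over when asked
    4: "High Driving Automation",        # no human fallback within its design domain
    5: "Full Driving Automation",        # drives anywhere a human driver could
}

def human_is_fallback(level: int) -> bool:
    """At levels 0-3 a human driver remains the fallback; at 4-5 the system is."""
    return level <= 3
```

By that scheme, the systems named above all live around levels 2-3, which is exactly the range where the gap between marketing and capability bites hardest.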
It doesn’t matter at all what they claim they meant. Self driving and autopilot are both terms in the zeitgeist from sci-fi that objectively refer to autonomous behavior. Tesla claiming anything else is what they meant is almost certainly bad faith and even if it isn’t it should still be false advertising simply because they chose these terms.
This is exactly where Tesla's marketing failed. It irritates me as a pilot that they have given the term a new definition.
Autopilots have been around a long time, and have a very specific meaning. They require training, knowing what their limitations are, constant monitoring, and frequent manual input.
I’m not sure what Tesla thought they were doing by billing it as a “hands off” feature.
As much as I dislike Musk and the mess Tesla has created with "Full self-driving", I don't think you can blame Tesla marketing for the public misconception of what an aircraft autopilot does.
That's because the Tesla autopilot feature is actually rather similar to a simple aircraft autopilot, with very similar limitations. At its core, it's really just a lane-following driver assist combined with adaptive cruise control. If you ignore the "adaptive" part, it's roughly equivalent to an aircraft autopilot in heading mode, plus autothrottle.
Tesla marketing did lean into that misconception, benefited from it, and might have made it worse. But they didn't create it, and the public misconception predates Tesla by a wide margin... it just didn't really matter before then.
From what I can tell, the misconception arises from the fact that the public knows flying an aircraft is hard; the amount of training pilots undergo proves that. Therefore, they assume an aircraft autopilot must be advanced to handle the perceived complexity of keeping an aircraft stable in the air.
The introduction of autoland, and the way it was presented in magazine articles by journalists, probably made the problem much worse.
The actual algorithms used for autopilots are really simple.
A simple autopilot will do nothing more than hold the wings level so the pilot doesn't need to continually keep their hands on the controls. From there, more functionality is added as layers.
Heading mode makes the autopilot always point toward a compass heading. Nav mode adds a computer that changes the requested compass heading every time the aircraft reaches a nav point.
A more advanced autopilot will add modes for speed hold and altitude hold that trade between altitude and speed. When combined with the autothrottle, the autopilot can fly constant-rate climbs and descents.
The autoland mode simply combines heading mode with a constant-rate descent to follow the glide slope.
While it's true that a modern autopilot can take off, fly to a destination, and land without the pilots touching the flight controls, the pilots are required to constantly switch between the various modes, feed in nav points, and adjust the autopilot's settings. At the same time, they are doing a bunch of other tasks to keep the plane flying and safe.
The simplicity is very intentional. Pilots are expected to know what the autopilot is doing at any point and understand why the autopilot is doing that. They are expected to spot when it's doing something weird very quickly and disconnect it.
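To make that layering concrete, here is a minimal sketch of how a heading-hold mode and a nav mode stacked on top of it might look. The gains, tolerances, and names here are purely illustrative, not real avionics code:

```python
def wrap(angle_deg):
    """Wrap an angle in degrees to the range [-180, 180)."""
    return (angle_deg + 180.0) % 360.0 - 180.0

def heading_hold(current_hdg, target_hdg, gain=1.0, max_bank=25.0):
    """Proportional heading hold: command a bank angle proportional to the
    heading error, clipped to a maximum bank angle."""
    error = wrap(target_hdg - current_hdg)
    return max(-max_bank, min(max_bank, gain * error))

class NavMode:
    """Nav mode layered on top of heading hold: step to the next leg's
    heading once the current one has been captured."""
    def __init__(self, waypoint_headings, capture_tolerance=2.0):
        self.headings = list(waypoint_headings)
        self.tolerance = capture_tolerance
        self.index = 0

    def bank_command(self, current_hdg):
        target = self.headings[self.index]
        captured = abs(wrap(target - current_hdg)) < self.tolerance
        if captured and self.index < len(self.headings) - 1:
            self.index += 1          # advance to the next leg
            target = self.headings[self.index]
        return heading_hold(current_hdg, target)
```

The point is the shape of the thing: each layer just computes a setpoint for the layer below it, which is why a trained pilot can reason about what the autopilot is doing at any given moment.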
It requires constant adjustment: for density altitude changes, waypoint changes, and ATC inputs, and, depending on turbulence, you may need to disable it or alter its inputs. It has to be constantly monitored (I am a low-hour pilot, but I have had one need to be disconnected due to a malfunction).
It’s basically a glorified cruise control for airplanes. It will fly a general route, but anything beyond that requires manual input (at least for small plane autopilots).
A good example might be a cop rerouting traffic. In the airplane world this would be an ATC route change, requiring you to reprogram waypoints. I’m afraid of what Tesla’s “autopilot” does in this situation.
One last addition: in the airplane world, the PIC has ultimate control of, and responsibility for, the airplane. A ton of disclaimers and training go along with a real autopilot to make sure you know what its limitations are. Tesla has also failed here IMO.
Especially since Tesla (or at least Musk) was pushing the idea that Tesla owners would be able to earn passive income by letting their "Full Self-Driving" equipped cars act as self-driving taxis, competing with Uber/Lyft on the "Tesla Network".
Unfortunately the testimony was not recorded, so we can't hear/read it in full with context. Perhaps he was talking about the fact that FSD was labeled as a beta and therefore bugs were to be expected? Tesla does in fact use the future tense in all their marketing ("your car will be able to drive itself with updates"), and this engineer might not be taking into account the expectations created by the name versus explicit claims about performance.
From the other excerpted testimony it does not seem like he is covering for Tesla in general so further context would be helpful.
This is just his opinion, and easily falsifiable. The court's job is to ascertain whether a reasonable person would consider the claim to mean the car is autonomous.
There's a lot of unwarranted drama around this. Reposting a blip from what I wrote below:
>>FSD controls speed, steering, braking, obstacle avoidance, traffic maneuvering, parking, navigation in a manner that, yes, I would call "Full Self Driving." It is very far from perfect, and Tesla has certainly communicated it as being better than it really is. There's a high dose of caveat emptor, but FSD does a lot more than any other car I've ever driven or owned. The fact that I will not let it attempt to navigate certain intersections, either out of embarrassment or risk of getting hit, does not diminish the fact that my car will give it the ol' college try.
I think it's reasonable, certainly not baffling, to call its performance "Full Self Driving." As noted below, I also think Tesla should be obligated to a generous refund/trial program, which to me absolves the ticky tackiness of "what the definition of FSD is"
Autopilot is great. It does the boring repetitive parts of driving. It behaves predictably in response to stimuli, and generally errs on the side of caution. This causes it to be suboptimal in ways that the driver can account for. As an example: when making lane changes, Autopilot is very sluggish, to the point that if there's any traffic at all, you have to give it a kick on the accelerator to complete the move.
FSD is a trashfire. I'd say the closest description is like you're constantly supervising a 15 year old learning to drive for the first time. Where I live, it'll disregard no-right-on-red and creep past stop lines that are set back from a light. It'll try to take turns and then lose track of which lane it wanted. The cognitive load of babysitting FSD is worse than just doing the driving myself.
> I'd say the closest description is like you're constantly supervising a 15 year old learning to drive for the first time.
This is precisely it. It'll work a lot of the time, deceptively well, but you need to be as attentive as a driving instructor to intervene when it makes a mistake.
> FSD is a trashfire. I'd say the closest description is like you're constantly supervising a 15 year old learning to drive for the first time.
Rented a Tesla for a day a couple of years ago, and can confirm. For the first time in my life, I spent the day experiencing a car repeatedly trying to murder me, and mainly during what's supposed to be easier highway driving at that.
Even Autopilot can't be trusted in my experience. I've had it slam on the brakes countless times when the lighting changes (going under an overpass or through a tunnel). I've had it swerve over yellow lines, or onto the shoulder, or into a turning lane when I wasn't planning on turning. The only time I ever use it is when I'm going straight on a highway and I need to blow my nose... then I quickly turn it off.
When you're cruising along on the highway with assisted driving on, you just use your turn signal and it switches lanes for you if it's safe, or waits until it's safe to do so.
For long-distance driving it's a super relaxing experience.
Autopilot is nothing special nowadays. The majority of cars have (at least as options) systems that are often better (for example, with much, much less phantom braking).
> Where I live, it'll disregard no-right-on-red and creep past stop lines that are set back from a light. It'll try to take turns and then lose track of which lane it wanted.
This all sounds like a typical driver. I imagine it's matching the training set pretty well
FSD beta is significantly better on highways than the current standard Autopilot. It is a total crapshoot on surface streets. The cognitive load is still definitely higher than manual driving on surface streets; on the highway it's lower.
This is why I think some of their safety stats are misleading. It’s much easier to rack up safe mileage on the highway. I’d like to see some standard set for the driving needed to develop these stats, similar to what the EPA does for emissions. It has flaws for sure, but it’s less likely to be gamed.
This makes me think of a recent experience I can relate to. I got a new iPhone that has a feature called, and I quote, an "Always-On display." According to https://support.apple.com/en-us/HT213435:
Always-On display goes dark when you don't need it
To save battery life, the display is completely dark when:
• Your iPhone is lying face down
• Your iPhone is in your pocket or bag
• [...]
These seem reasonable to me, at least they haven't caused any noticeable problems. But it goes on, including:
• You haven't used your iPhone for a while (your iPhone learns your activity patterns and turns the display off and on accordingly, including if you set up an alarm or sleep schedule)
This one's kinda infuriating. Every now and then, it's dark when I look at it, and I try to repeatedly "teach" it that I'm not asleep and it shouldn't have gone dark, or give up and try to ignore that it's not doing what I expected from it.
What makes it frustrating more than anything is the feature's name: "Always-On display." If not for that, I think its actual best-effort behavior is fairly reasonable.
I always turn off any of these “learning” features, if possible. They are never correct. The worst part is when you do something different one day, and suddenly it thinks you’ve permanently changed your behavior.
I’ll never understand where the confidence in these features comes from.
Android's always-on display turns off most of the screen, leaving just a clock and a set of notification icons. And the displayed content constantly shifts position to avoid screen burn-in.
It looks like iOS, on the other hand, just dims your lockscreen slightly? I can see why they'd have to get smart with how often they leave the display on. It'd be distracting while you're trying to sleep and sounds like a battery killer. In a way, Apple may be using the phrase "Always-on display" more literally than you'd expect, even if it is less practical.
Now the only way to get a cursor is to tap and hold.
Why this change? Was double-tapping to select a word too difficult for most users to understand? It’s not like tap-and-hold to get a cursor to insert words/punctuation is any more intuitive.
*) Disclaimer: neither "verified" nor "fact" implies by any means that somebody believes that the stated and/or expressed sentiment or observation is actually true.
**) Neither "somebody" nor "nobody" imply the presence, existence, or absence of existence of a sentient being, nor does "believe" imply a cognitive process or state of mind. Neither do "observations" or "express" imply an activity by any sentient, semi-sentient, or algorithmic agency, nor is "true" an expression of judgement.
Watching law happen in the media is funny. I don't think a reasonable-person test would work here. "Nobody would believe it really was self driving" is trying to say "when we said that, we didn't mean that," which simply swaps one lawsuit (risk to the community) for another (you lied to the SEC).
I don't have a time machine, so it's a bit untestable, but what do you, reasonable HN readers, think the person in the street would say if you vox-popped "what does FSD mean"?
Fully is pretty well understood.
Self has a really specific meaning.
Driving.. is what happens to make cars move safely, or not.
But "Fully self driving" actually means "not fully self driving" .. ok.. how did the "not" get in there?
All of synthetic motor oil is a synthesis of components, individually chosen and assembled. It is an engineered product, as opposed to natural motor oil, which is a naturally occurring product that has been refined. Both being fossil fuel petrocarbons is neither here nor there — it’s not a claim made.
The original (apparently staged) FSD promo video featured a tagline which read, “The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
As reported, he seems to only be saying "there might be bugs". The "certain limitations" had better have been explored by the plaintiff lawyers.
I'd hate to be the engineer caught between perjury and Musk's interests. It'd take a lot of RSUs to compensate.
The name really doesn’t matter. I don’t know why everyone is always arguing about it. It’s very clear that it isn’t autonomous when you buy and use it, and that most of what you’re buying is future updates.
Watch the feature video on www.tesla.com/autopilot. It has been there since 2016 claiming the driver is just there for legal reasons. Nothing is clear except that Tesla wants customers to think they sell autonomous cars. Of course they also insist the opposite when convenient
There's been much written about how fake the video is as well, including it actually getting into an accident during one of the many takes.
> All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.
> The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates.
It is certainly easier to make your case if you look only at that one video from 2016 which was released as a tech demo and pretend that is the only thing Tesla has ever said about it and that anyone mistook that for something your car could do at the time. (A Tesla can in fact do that today, other than transitioning from self-driving to parked.)
It’s not a problem that the video was intended to show that. They’re pretty clear that that is the end goal, and it is not currently available, but your car will be updated to do that at some point. They weren’t claiming it did that at the time, they were demoing what it would do in the future and that the hardware could do that.
Because it's just a name. What actually matters is whether anyone who actually paid for this feature thinks it immediately makes their car fully autonomous. Nobody thinks that. So it's easier just to squabble over the name itself, ad infinitum, for no reason.
Sorry, that's against the site rules and all. But what else is there to say about such colossal delusion and doubletalk?
If they'd said, for instance, 'full driving' we'd have thought sure, it has all the features but still, take care it might be buggy.
If they'd said 'self driving' again we'd think, it can do it but maybe not in all circumstances.
But they said it all; 'Full self-driving' means it does the whole enchilada. It can mean nothing else. It's normal English, well-understood by anybody who is listening. No ambiguity. No wiggle room.
These guys ought to be enjoined to stop selling their irresponsible death machines.
It's mind blowing what Tesla and Musk are allowed to do.
And don't get me started about the doubletalk! Musk clearly thinks he's above the law now (and for some reason, this seems to be the case for now). There are countless other examples, some not even debatable.
For example, when he bought the first 9% of Twitter. Legally, he should have disclosed that at the 5% threshold, which he did not. It's estimated that this saved him ~$150 million. There is nothing to debate: the law is clear, and he intentionally did not disclose his purchases. One would think it's not a complicated case, yet a year later nothing has been done! Nothing. The SEC is suing him now, but... not really for wrongdoing! They want to bring him in to testify. I don't know what's going on, but clearly he's getting a lot of free passes.
That’s the problem when you get too big, isn’t it? Musk now has $149 million extra that he can spend on fighting legal battles with the SEC and still come out at least a million ahead. Does the SEC have $149m in spare resources to fight this fight?
Unfortunately this is the math I believe the enforcement agencies have to do.
What does $149M get you in a legal argument? I realize this must seem like a very naive question, but if the law said "you must disclose at 5%", and someone fails to do so, shouldn't that be a simple case?
Nikola, the EV truck company, was full of shit. The truck couldn't drive under its own power. Tesla FSD on the other hand, is just not as good as advertised. Kinda like how a Lindt Truffle isn't velvety bliss, or a 787 Dreamliner is still hard to sleep on, or even BMW's excessively superlative "ultimate driving machine." The 4 cylinder Z3 was a terrible vehicle.
One can wish FSD meant perfection in every circumstance all the time, and Tesla certainly implied in its marketing that it does. In reality, FSD is really amazing at times and 'what-the-hell was it thinking' at times. But it does control speed, steering, braking, obstacle avoidance, traffic maneuvering, parking, navigation in a manner that, yes, I would call "Full Self Driving." It is very far from perfect, and Tesla has certainly communicated it as being better than it really is. There's a high dose of caveat emptor, but FSD does a lot more than any other car I've ever driven or owned. The fact that I will not let it attempt to navigate certain intersections, either out of embarrassment or risk of getting hit, does not diminish the fact that my car will give it the ol' college try.
I do think Tesla should get a substantial hand slap though, and should be required to give an extended free trial to everyone to see exactly what it is that FSD can and cannot do, and also give those who purchased FSD outright an extended refund window.
The difference between saying your car is "the ultimate driving experience" and that your car has a "full self driving" mode is the difference between puffery and straight up fraud.
If I asked 100 people what "ultimate driving experience" meant, I would get 43 different answers, 58 of them being people telling me it's bullshit nothingness.
If I asked 100 people what it meant for a car to be "full self-driving," 99 would say it means the car could fully drive itself. The final one is too busy sucking up to Elon.
No, you took the analogy too far. It's just a trademark name that doesn't fully live up to the hype. Let's look at GM's Supercruise, which is marketed as "hands free driving." Well guess what: you still need hands to drive it.
I believe people are being overly dramatic and feel entitled because of Elon's overcommitment. I agree that Tesla overpromised and underdelivered. Meanwhile, I am happy to use and optionally pay for FSD as delivered. It is really amazing and significant technology.
> The fact that I will not let it attempt to navigate certain intersections, either out of embarrassment or risk of getting hit, does not diminish the fact that my car will give it the ol' college try.
If there are regularly occurring street driving situations it can't be trusted in for safety reasons, even (perhaps especially) if it "will give it the ol' college try" in those circumstances, I suppose you can argue whether that indicates it is fraudulent to describe it as "Full Self-Driving" or merely defective Full Self-Driving, but I think that's a fairly fine distinction in what Tesla has done wrong.
> and should be required to give an extended free trial to everyone to see exactly what it is that FSD can and cannot do
That seems to be exactly the wrong solution for a system which, by your own description, is unsafe in some of the situations one would reasonably expect "Full Self-Driving" to handle.
It does work, though. That's all I'm saying. I totally agree that the marketing implies it should be more reliable. In my experience, in all circumstances, it errs in being painfully cautious, which is where I intervene.
I mean, I pay $200/mo for FSD because I like it so much.
I've said it before: We're past the point where Elon is solely responsible. We need to start looking at the blatant negligence of government officials at both the state and federal levels for letting him get away with it for so long. We are so far past the point where anyone could possibly believe that anyone letting this continue has good intentions.
It is (was?) actually called Full Self-Driving Beta, which to an engineer does explicitly indicate that bugs are to be expected. Not sure if they're still calling it Beta, but at the time of this accident they certainly were.
It's hilarious seeing hn turn on Musk and Tesla (and self-driving cars in general) after years of being told here that "you just don't get it" when pointing out the absurdity of Tesla being worth more than all the other auto companies combined.
No, I think in general HN has been more wary/skeptical of Musk than the general public. The turning point to me was the pedophilia accusation against someone who was actually saving kids, but since SolarCity at least you could find people who were extremely critical (and since then I've looked at the SolarCity video critically, and people here were right: it was snake-oil salesman tactics).
HN, generally speaking, is _very_ vulnerable to hype. It has, collectively, largely dismissed crypto-stuff now, for instance, but it took a _long time_. Up to a few years ago, any story about Bitcoin (or even more obviously nonsense crypto things like DAOs and NFTs) had comments full of breathless “this will change the world” energy.