It seems there’s a huge disconnect between people who know how to drive and are confident in their skills, and those who are scared to drive and want to offload it to Musk.
I’m actually afraid to drive behind a Tesla and either keep extra distance or change lanes if possible. I still have more faith in humans not to randomly brake than in beta FSD.
It’s one thing to put your own life in the hands of this beta model, and it’s another to endanger the life and property of others.
Yeah, it's not that I'm scared of my own driving; I mean that if other people had it, the roads would be safer. Things that have happened in the last 3 days of moderate driving around a city:
1) A car gets impatient at some traffic turning right on a green light in a construction zone and comes into my lane, i.e. into oncoming traffic. We have to swerve out of the way and slam on the brakes to avoid them.
2) A guy gets impatient behind me as I slow down over a speed bump, tailgates me a few inches behind, then passes me by cutting into oncoming traffic in a no-passing zone, cuts me off, again with a few inches to spare, and speeds through the neighborhood, where we eventually meet at the stoplight at the end of the road.
3) Every single day people run red lights. At almost every light where people are turning left on a green, there are 4-5 cars that go through after it turns red.
My safety score on my Tesla is 99. I am an extremely safe driver. I wish FSD were more common because so many people are terrible drivers.
Your Tesla tells you you're a safe driver. Well, geez, I'm sure the $55,000 (maybe more?) you spent tells you things you like to hear. I mean, one of the criteria is "Late Night Driving". Why not driving between 4pm and 6pm, when there are a lot more cars on the road and conditions are statistically less safe?
And looking at the rest of the criteria, they're OK, but hardly comprehensive. This is, like, the bare minimum. It doesn't measure what I would call "doing stupid shit": crossing several lanes of traffic to get into a turn lane; forcibly merging where you shouldn't; nearly stopping in the middle of traffic before merging into a turn lane at the last minute; straddling lanes because you're not sure whether you really want to change lanes or not; making a left turn from a major artery onto a side street at a spot not protected by a light; coming to a near-complete stop for every speed bump, then rushing ahead to the next.
And a host of other things that demonstrate the person does not consider other people on the road at all.
Here's an entire article about how to game the safety score:
One of the tips is to "drive around the neighborhood when traffic is light".
And the car doesn't ding autopilot for behaviors it would knock a human for. Because the assumption is that the car would know better I guess. But then why isn't the safety score simply a deviation from what autopilot would do in a situation? If autopilot would brake hard to avoid a collision, shouldn't you?
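To make the idea concrete, here's a minimal sketch of what a score defined as "deviation from what autopilot would do" could look like. To be clear, this is invented purely for illustration: the field names, scaling, and telemetry here are assumptions, not anything Tesla actually exposes or computes.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Sample:
    # One hypothetical telemetry sample: what the human did vs. what the
    # reference (autopilot-style) model would have done at the same moment.
    human_brake: float  # driver's brake input, 0.0-1.0
    model_brake: float  # brake input the reference model would apply
    human_steer: float  # driver's steering angle, degrees
    model_steer: float  # steering angle the reference model would choose

def deviation_score(samples: list[Sample]) -> float:
    # Score a drive as 100 minus the average deviation from the reference
    # policy: a driver who does exactly what the model would do scores 100,
    # and big mismatches pull the score down.
    if not samples:
        return 100.0
    deviations = [
        abs(s.human_brake - s.model_brake) * 100  # brake mismatch, scaled to 0-100
        + abs(s.human_steer - s.model_steer)      # steering mismatch in degrees
        for s in samples
    ]
    return max(0.0, 100.0 - mean(deviations))

if __name__ == "__main__":
    drive = [
        Sample(0.0, 0.0, 1.0, 1.5),  # cruising, tiny steering difference
        Sample(0.2, 0.8, 0.0, 0.0),  # model would brake hard, driver barely braked
    ]
    print(f"score: {deviation_score(drive):.1f}")
```

Under that kind of scoring, failing to brake when the model would brake hard costs you points, which is exactly the mismatch being described above.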
The Tesla doesn't just tell me I'm a safe driver, it also gives me a cheaper insurance rate due to the safe driving. The safety score is related to my monthly insurance premium.
So maybe it's to stroke my ego? But they're also putting their proverbial money where their mouth is.
The insurance you get is from Tesla, which has a vested interest in its own Safety Score. It's not giving a cheaper insurance rate "due to the safe driving"; it's giving a cheaper rate due to you having a better score on the metrics it decided on.
You see how that's circular, right? It does not mean you are a safe driver.
…what? No, I am not following that. The safety score is a representation of the metrics used to determine the insurance rate. This is not in any way circular. The insurance rate and the safety score are representative of the same thing: safe driving.
They're both from Tesla. There's no real proof that they're representative of safe driving. And you kind of want it to flow both ways: "the safety score represents good driving, so I get cheaper insurance" and "I get cheaper insurance so the safety score must represent good driving".
You're trying to use each of these things to validate the other so you can then claim it's something else entirely.
I think it's telling that autopilot is allowed to drive in a manner that would otherwise negatively impact your safety score. Either autopilot is unsafe or the safety score doesn't measure actual safe driving.
Man, I think you're really confused about what the "safety score" is. Here it is stated a simpler way:
Tesla tracks driving style and gives you an insurance rate based on those driving habits.
My insurance rate is about $90, which is the lowest the rate goes, because the things it tracks (following distance, for one) correspond to a lower liability risk for the insurance underwriter, which, in this case, is Tesla.
The monthly price is directly correlated to driving habits. You seem to be getting confused about "safety score".
I'm genuinely confused as to what about this doesn't make sense to you.
Tesla is the underwriter. They give a monthly premium based on driving habits.
Tesla is the insurer. Tesla is also the manufacturer of the insured item.
You are using, as proof of the claim, the fact that the group making the claim says they're right. This is big "We've investigated ourselves and found nothing wrong" energy.
You are talking to someone who is excited about… The Future (tm). And will defend their purchase and choices to the death. So I doubt there is any reason to continue. The fact they even gave us their Tesla-given “safety rating” like it was something to give a shit about says a lot.
We are all glad your car is giving you head pats and game badges.
Also, the Tesla driving score reminds me of the Windows “Experience Index” score. Which was complete and utter toss. But made people feel good about their systems.
> I’m actually afraid to drive behind a Tesla and either keep extra distance or change lanes if possible.
Irrespective of the Teslas or FSD:
If you're afraid of an accident due to the vehicle in front of you braking regardless of the circumstances, then you're following too closely. It doesn't matter if the braking event is anticipated or unexpected. If you're not confident in your ability to avoid an accident if the vehicle in front of you slams on their brakes, then you are following too closely.
> If you're afraid of an accident due to the vehicle in front of you braking regardless of the circumstances, then you're following too closely.
I know what you're saying, but that is not what I meant. Also, I don't follow too closely.
The difference here is that a driver will almost always brake depending on what's happening in front of them. So, if you pay attention not only to the car in front, but to the cars in front of them and in the neighboring lanes and so on, you can sense and detect patterns in how a particular driver is driving. There are many skittish drivers who brake every second, and some who don't, and so on. Basically based on your driving experience you can predict a little.
The problem here is that this stupid POC FSD will brake randomly or change lanes randomly or whatever, so there is no way you can predict it, and hence my concern and issue with it. I just prefer to change lanes, but that's not always an option.
> Basically based on your driving experience you can predict a little.
Yes, we all do that AND you're using that predictability to take liberties with safety, such as following too closely. FSD's unpredictability exposes the vulnerability in your driving process and makes you feel uneasy.
You (and everyone else) can follow too closely AND FSD can be an unsafe steaming pile of crap. It's not an either-or situation.