Why should the car use a hardcoded average rate of 0.250 kWh/mi on a car I have owned for four years, when it knows my lifetime average is 0.300 kWh/mi?

There are plenty of smarter things it could do by default other than "it's hard, meh".

They're going to figure out coast-to-coast self-driving this year, but they can't project a reasonable battery range estimate given temperature, driver history, and weather? Do they need more GPUs?
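
For concreteness, a minimal sketch of the arithmetic behind the complaint, assuming a hypothetical 75 kWh of usable pack energy (not an actual Tesla constant): the displayed range is just remaining energy divided by an assumed consumption rate, so swapping the rated rate for the driver's lifetime average changes the number by a sixth.

    def estimated_range_mi(remaining_kwh, kwh_per_mi):
        # displayed range = remaining energy / assumed consumption rate
        return remaining_kwh / kwh_per_mi

    remaining_kwh = 75.0      # hypothetical usable pack energy at a full charge
    rated_rate = 0.250        # kWh/mi, the fixed rate baked into the display
    lifetime_rate = 0.300     # kWh/mi, what this driver actually averages

    print(estimated_range_mi(remaining_kwh, rated_rate))     # 300.0 mi shown
    print(estimated_range_mi(remaining_kwh, lifetime_rate))  # 250.0 mi realistic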

That might be a useful metric, but it won't always be the most accurate.

If you mostly drive around town, that estimate will be way off when you get on a highway to go to Grandma’s house.

There’s probably a better way to do it, but Tesla seems to be optimizing for avoiding the “why does the website say 300mi but my car shows 200mi fully charged” support question, in exchange for a different set of support questions.

I think ideally the car would give a best-guess estimate, along with a clear breakdown of why it is more or less than the rated range. I just don't think that's clearly the "most accurate" option. The most accurate estimate requires knowing where you are going.


So make it a menu option: a range estimator setting with best/worst/spec modes, something like the sketch below.
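
A rough sketch of that idea, with hypothetical numbers (the rates and modes here are illustrative, not anything Tesla ships); the point is that the same remaining energy maps to very different displayed ranges depending on which rate the setting picks.

    # hypothetical per-mode consumption rates; "spec" is the rated constant,
    # "best"/"worst" could be derived from the driver's efficiency history
    RATES_KWH_PER_MI = {
        "spec":  0.250,
        "best":  0.230,
        "worst": 0.320,
    }

    def displayed_range_mi(remaining_kwh, mode):
        # same remaining energy, different assumed rate per setting
        return remaining_kwh / RATES_KWH_PER_MI[mode]

    for mode in ("best", "spec", "worst"):
        print(mode, round(displayed_range_mi(75.0, mode)))   # 326 / 300 / 234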

It feels like one of those Muskisms where one bending of the truth requires more and more bending further down the line.

If they weren't over-optimizing for the EPA test to claim an almost unachievable range, they wouldn't need the car's range meter to lie as well. But once you fib, you have to keep fibbing and keep your fibs straight.
