This is unsurprising but I'm glad that it is being researched "formally".
I think this level of automation in-between cruise-control and full self driving is incredibly dangerous and should be made illegal. It is well known that humans can't maintain focus on repetitive tasks for any significant length of time. Driving is already quite repetitive (and we see lots of accidents due to lack of focus) but the more automation you add the worse it gets. Especially when you add irresponsible marketing such as calling it "Autopilot" which strongly suggests that it can "Pilot" itself "Automatically".
My proposal would be to hold the manufacturer responsible for the behaviour of their products as soon as they pass a very basic form of automation (such as anything more than speed-maintenance cruise control). They shouldn't be allowed to claim that the driver is responsible, and they should have to pay damages to both property and life when accidents occur.
> I think this level of automation in-between cruise-control and full self driving is incredibly dangerous and should be made illegal. It is well known that humans can't maintain focus on repetitive tasks for any significant length of time.
I would rather see conditional-use licenses that require full automation. E.g. maybe Tesla gets a license that allows using Autopilot, but only on highways with a speed limit equal to or lower than 55 MPH, and only when it isn't raining or snowing.
Then just require Autopilot to enforce those conditions. Have it refuse to engage if it's raining or snowing. If your route needs to take an offramp, have it loudly announce "Your route needs to take this exit, but I cannot drive there. Please take the wheel." If you don't take the wheel, it just keeps going down the highway or pulls over or something.
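A rough sketch of the kind of enforcement logic I mean, in Python. The class names, thresholds, and behaviours here are entirely hypothetical, just to illustrate "refuse to engage / force a handover":

```python
from dataclasses import dataclass

# Hypothetical license conditions -- made-up values, not any real policy or API.
MAX_LICENSED_SPEED_LIMIT_MPH = 55

@dataclass
class Road:
    is_highway: bool
    speed_limit_mph: int

@dataclass
class Weather:
    raining: bool
    snowing: bool

def may_engage(road: Road, weather: Weather) -> bool:
    """Refuse to engage outside the conditions the manufacturer is licensed for."""
    return (
        road.is_highway
        and road.speed_limit_mph <= MAX_LICENSED_SPEED_LIMIT_MPH
        and not weather.raining
        and not weather.snowing
    )

def handle_upcoming_exit(driver_took_wheel: bool) -> str:
    """The route leaves the licensed conditions, so demand a handover."""
    print("Your route needs to take this exit, but I cannot drive there. Please take the wheel.")
    # Fail safe: if the driver doesn't respond, stay on the highway or pull over
    # rather than attempt a manoeuvre the system isn't licensed for.
    return "driver_in_control" if driver_took_wheel else "continue_on_highway_or_pull_over"

print(may_engage(Road(is_highway=True, speed_limit_mph=55), Weather(raining=False, snowing=False)))  # True
print(may_engage(Road(is_highway=True, speed_limit_mph=70), Weather(raining=True, snowing=False)))   # False
```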
Then let them slowly increase the situations they're licensed for through some kind of testing process.
> My proposal would be to hold the manufacturer responsible for the behaviour of their products as soon as they pass a very basic form of automation
While I agree, I think this is a significant dampener on user-owned self-driving cars. It's really unappealing to sell a product where you're liable for what end-users do with it. Especially a product the end-user has physical control of.
On the tame side, what if they don't update it? On the more malicious side, what if they physically tamper with it? Or if they purposefully put it in terrible situations to see what it does? You just know there will be some kind of Tiktok fad involving doing stupid things with self-driving cars.
Most of those are far less of an issue with company-owned cars you hire as a taxi. The user doesn't have physical control, and the company can maintain control of the route. Makes it a lot easier to avoid problem areas, and prevents users from doing dumb things with routes like trying to drive through an airport or something.
I just wanted to bring up that that's probably the death knell for getting to own a self-driving car. Maybe it's still worthwhile.
> maybe Tesla gets a license that allows using Autopilot, but only on highways with a speed limit equal to or lower than 55 MPH, and only when it isn't raining or snowing.
This is in line with my thinking of the manufacturer taking responsibility. If they can guarantee that it is safe only in certain conditions they can enable self-driving only in those certain conditions. If they can guarantee that it is safe everywhere, they can enable it everywhere.
> It's really unappealing to sell a product where you're liable for what end-users do with it.
Yes, this is a serious problem that would need to be addressed before implementing a system like this. It would make third-party repairs and modifications difficult. (They would probably have to take responsibility or disable the self-driving features. Of course maybe that is the right model, the repair shop takes responsibility but if they show the problem wasn't their fault then they can turn around and demand damages from the original manufacturer. Unfortunately now there is a lot of annoying legal work generated.)
> On the tame side, what if they don't update it?
This would be up to the manufacturer to decide. Maybe the car refuses to self-drive if it isn't updated. Of course, the reasons for removing the feature would need to be made clear to the consumer (or they could get a refund).
> I just wanted to bring up that that's probably the death knell for getting to own a self-driving car. Maybe it's still worthwhile.
I don't think it is the death knell, but it definitely blurs the line of who really owns the car, and it will likely slow the adoption of individually owned self-driving cars. That does have downsides, but to me they look better than letting this dangerous combination of an insufficiently "smart" vehicle and a distracted driver zoom around the roads.
To be honest I think this will be the case with any such autopilot. Any system that doesn't require regular attention will be ignored or forgotten. Even drivers without autopilot become inattentive when driving doesn't take their constant attention. Human beings are just not good at this.
This is why military watch standers are often asked to make rounds instead of standing in one place. Paying attention to something that requires no intervention will make even heavily trained soldiers start to daydream, fall asleep, and wander off.
I think training would be necessary for people to safely use autopilot in cars. However, getting people to take driving seriously is an uphill battle already.
Something I don't understand about the press coverage of Tesla Autopilot/FSD is the omission of what (to me) is the single most important metric of Autopilot's relative safety: the number of accidents per unit mile driven, with and without Autopilot. I'm not saying this is the only number that matters, but it's the one that matters the most, right?
Tesla voluntarily releases quarterly safety data[0], and yet I literally cannot find a news article that mentions it in the first page of Google results for "tesla autopilot safety."[1][2][3][4]
What am I missing? Isn't this the most important metric to report to the public in any discussion of Autopilot's safety?
The first thing that jumps out at me about that number is that it doesn't appear to account for the type of road.
If I had to guess, most people use "Autopilot" when driving down the highway, while fewer use it in city driving. Since highway driving is safer, this likely skews the numbers significantly. Basically, the situations where "Autopilot" can be used, and is used, are likely already situations where accidents are less likely.
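To make the skew concrete, here's a toy calculation with entirely made-up numbers: even if Autopilot were exactly as accident-prone as a human driver on every road type, the aggregate per-mile rate would still look better for Autopilot simply because of where it gets engaged.

```python
# Toy illustration (made-up numbers) of how the road-type mix alone can skew
# an aggregate "accidents per mile" comparison.

# Hypothetical per-road-type accident rates, identical for both modes.
HIGHWAY_RATE = 0.5  # accidents per million miles
CITY_RATE = 5.0     # accidents per million miles

# Hypothetical mileage mix: Autopilot is engaged mostly on highways,
# manual driving covers far more city miles.
autopilot_miles = {"highway": 9_000_000, "city": 1_000_000}
manual_miles = {"highway": 4_000_000, "city": 6_000_000}

def aggregate_rate(miles: dict) -> float:
    """Overall accidents per million miles given a highway/city mileage mix."""
    accidents = (miles["highway"] * HIGHWAY_RATE + miles["city"] * CITY_RATE) / 1_000_000
    total_millions = (miles["highway"] + miles["city"]) / 1_000_000
    return accidents / total_millions

print(f"Autopilot: {aggregate_rate(autopilot_miles):.2f} accidents per million miles")  # 0.95
print(f"Manual:    {aggregate_rate(manual_miles):.2f} accidents per million miles")     # 3.20
# Autopilot "looks" roughly 3x safer here purely because of where it is used,
# not because it drives any better on a given road.
```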
Yes, I'm sure that breaking it down by road type, or by driving skill (as measured by the number of driving infractions before buying a Tesla), etc., would be even better.
But that, to me, doesn't change the fact that the currently released metric is still absolutely necessary to report in any news article, with all of its caveats.
Autopilot is the only way I can feel safe operating the touch screen. When I want to focus on the road I still have to watch the screen to see when to wiggle the wheel, so you always need an eye on the screen no matter what's going on.