So far, Teslas have crashed into two semitrailers, a street sweeper, a stalled car partially in the left lane, a stalled truck partially in the left lane, a fire truck, and two road barriers. We don't know about the latest one yet, but in all the other cases, the crash was not preceded by automatic emergency braking.
Tesla lacks basic obstacle detection. The original Mobileye system had "recognize car" as the first step. If it didn't look like a car, it wasn't an obstacle. Tesla claims to have replaced that with their own system, but we're still seeing full-speed collisions with obvious obstacles.
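A minimal sketch of that failure mode, with hypothetical names rather than anyone's actual pipeline: a classification-gated check drops whatever it can't label, while a geometry-gated check treats any solid return in the travel path as an obstacle.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str         # classifier output, e.g. "car" or "unknown"
        confidence: float
        distance_m: float  # range to the return, in meters

    def obstacle_if_classified(d: Detection) -> bool:
        # "Recognize car" first: anything the classifier can't name is dropped.
        return d.label in {"car", "pedestrian", "cyclist"} and d.confidence > 0.7

    def obstacle_if_solid(d: Detection) -> bool:
        # Geometry-gated alternative: any solid return in the path counts.
        return d.distance_m < 100.0

    # A perpendicular trailer the classifier has never seen before:
    trailer = Detection(label="unknown", confidence=0.3, distance_m=60.0)
    print(obstacle_if_classified(trailer))  # False -> no braking
    print(obstacle_if_solid(trailer))       # True  -> brake

The whole thread is basically about which of those two gates ships in production cars.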
I don't understand the rest of the comment though. Hard to reconcile "self-driving" with "ignore stationary objects".
I don't think of anything available to end users today as "self-driving". Autopilot is a better analogy, even if people don't seem to understand that.
Then there's the matter of autopilot being a bad analogy even if it were technically accurate (and I dispute that it is, because AFAIK airplane autopilots officially allow pilots to take their hands off the controls, unlike Tesla's system), because the general public has serious misconceptions about what airplane autopilots actually do. Aircraft pilots receive serious training, so they don't share those misconceptions, which is why autopilots aren't a problem in aviation. But much of the general public is under the mistaken impression that pilots of modern airliners simply push a button and take a nap while the airplane takes off, flies, and lands itself. You and I both know that's not reality, but the misconception is widespread, which makes Tesla's use of the term likely to mislead, and therefore irresponsible.
See pages 73-74 of the Model 3 owner's manual: https://www.tesla.com/content/dam/tesla/Ownership/Own/Model%...
Warning: Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.
Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic.
Warning: Autosteer is not designed to, and will not, steer Model 3 around objects partially or completely in the driving lane. Always watch the road in front of you and stay prepared to take appropriate action. It is the driver's responsibility to be in control of Model 3 at all times.
 - Nissan - https://youtu.be/HkL67DgleQY
 - Ford - https://youtu.be/Z_9218dTXXY
 - Audi - https://youtu.be/nUlK6fpveXg
 - BMW - https://youtu.be/xsQvq4WlUYU
And it's not like there haven't been issues with autopilot in aviation, or people bemoaning it as some moral hazard for pilots.
"We can't make this work without inconveniencing the driver, but dead people don't sue!"
No idea why people thought companies would have handled this differently.
Beyond that, it's the usual overselling of "neural networks" and other classification technologies as AI, which they are not.
"To understand the strengths and weaknesses of these systems and how they differ, we piloted a Cadillac CT6, a Subaru Impreza, a Tesla Model S, and a Toyota Camry through four tests at FT Techno of America's Fowlerville, Michigan, proving ground. The balloon car is built like a bounce house but with the radar reflectivity of a real car, along with a five-figure price and a Volkswagen wrapper. For the tests with a moving target, a heavy-duty pickup tows the balloon car on 42-foot rails, which allow it to slide forward after impact.
The edge cases cover the gamut from common to complex. Volvo's owner's manuals outline a target-switching problem for adaptive cruise control (ACC), the convenience feature that relies on the same sensors as AEB. In these scenarios, a vehicle just ahead of the Volvo takes an exit or makes a lane change to reveal a stationary vehicle in the Volvo's path. If traveling above 20 mph, the Volvo will not decelerate, according to its maker. We replicated that scenario for AEB testing, with a lead vehicle making a late lane change as it closed in on the parked balloon car. No car in our test could avoid a collision beyond 30 mph, and as we neared that upper limit, the Tesla and the Subaru provided no warning or braking. (Emphasis mine)
Our tests also exposed the infallibility myth that surrounds computers and automated vehicles. Driving the same car toward the same target at the same speed multiple times often produces different results. Sometimes the car executes a perfectly timed last-ditch panic stop. Other times it brakes late, or less forcefully, or even periodically fails to do anything at all. In our stationary-vehicle test, the Impreza's first run at 50 mph resulted in the hardest hit of the day, punting the inflatable target at 30 mph. It was only on the second attempt that the Subaru's EyeSight system impressively trimmed the speed to just 12 mph before the collision. All the results in the charts in this feature show a car's best performance."
Note that these tests were executed against an object intended to replicate the radar profile of a light vehicle. I expect worse results against a semi's trailer. Tesla is not alone in facing this challenge.
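The usual explanation for that cut-out failure (an assumption on my part about these specific cars, but well documented for radar-based ACC/AEB generally) is that the radar stack deliberately discounts stationary returns, since overhead signs, bridges, and manhole covers also look like big stationary reflectors. A toy sketch of that heuristic, with illustrative names and thresholds rather than any vendor's actual code:

    EGO_SPEED_MPS = 20.0  # our own speed, ~45 mph

    def ground_speed(relative_speed_mps: float) -> float:
        # Radar cheaply measures speed relative to us (Doppler);
        # add our own speed to estimate the target's ground speed.
        return EGO_SPEED_MPS + relative_speed_mps

    def is_tracked_target(relative_speed_mps: float) -> bool:
        # Common heuristic: keep moving targets, drop stationary "clutter"
        # (bridges, signs, manhole covers... and, unfortunately, parked cars).
        return abs(ground_speed(relative_speed_mps)) > 1.0

    lead_car = -2.0     # moving 2 m/s slower than us: ground speed 18 m/s
    parked_car = -20.0  # closing at exactly our speed: ground speed 0 m/s
    print(is_tracked_target(lead_car))    # True  -> tracked, braked for
    print(is_tracked_target(parked_car))  # False -> filtered out, hit at speed

Delete the filter and you brake for every overpass; keep it and you brake for nothing parked. That's the trade-off these tests are probing.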
To be clear, these systems are a net win: without automatic emergency braking, fatality rates would be much higher. People will still die, but fewer people than before.
Good to know.
The fact that a Tesla crash causes the NTSB to investigate is a sign that these crashes are incredibly rare.
Tesla is either going to have to hobble it enough that drivers pay attention, or significantly improve it, if that's possible.
Self-driving cars are decades away from reality, and self-driving trucks are even further away.
If only we could figure out how to detect solid objects. Alas, such a difficult problem, so hard to implement that no one should hold their breath in hopeful anticipation that a carmaker might add solid-object detection to its sensor package, so that cars can do the most important thing everyone expects of them.
But, you know, when you make a self-driving car, hazard avoidance and collision detection actually turn up much lower on the list of priorities than you'd expect.
(In driver's ed, we learned not to overdrive our headlights: don't drive so fast that your stopping distance extends beyond your field of good lighting. Should an object suddenly appear in the light cone, your reaction time, your tires' friction against the road surface, and your car's inertia together determine whether you can stop in time, and keeping all of that in your situational awareness is what makes you a safe driver at night. Without it, as a human, you are not a safe driver.
And yet robots are judged by a different standard, even though, for them, it is always nighttime and the world is perpetually a cloudy environment shrouded in mysterious darkness. Too bad that's not what sells cars.)
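To put rough numbers on that, a back-of-the-envelope stopping-distance check (assumed round values: 1.5 s reaction time, friction coefficient 0.7 for dry asphalt, ~60 m of useful low-beam range):

    G = 9.81            # m/s^2
    REACTION_S = 1.5    # assumed driver reaction time
    MU = 0.7            # assumed tire-road friction, dry asphalt
    HEADLIGHT_M = 60.0  # assumed useful low-beam range

    def stopping_distance_m(speed_mps: float) -> float:
        # reaction distance + braking distance v^2 / (2 * mu * g)
        return speed_mps * REACTION_S + speed_mps**2 / (2 * MU * G)

    for mph in (40, 55, 70):
        v = mph * 0.44704  # mph -> m/s
        d = stopping_distance_m(v)
        verdict = "overdriving your headlights" if d > HEADLIGHT_M else "ok"
        print(f"{mph} mph: {d:.0f} m to stop ({verdict})")

With those assumptions, 40 mph stops in about 50 m, but 55 mph needs about 81 m and 70 mph about 118 m, well past where the low beams end.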
Can't we just put signs on everything?
By placing an industry-standardised logo on the corners of vehicles, and giving it away free as stickers or screw-on signs, we could identify anything that ought to be considered solid and necessary to avoid, regardless of its velocity.
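For what it's worth, robotics already has a version of this idea in fiducial markers. A minimal sketch using ArUco tags as a stand-in for the proposed standardised logo (assumes OpenCV 4.7+; the input frame name is hypothetical):

    import cv2  # pip install opencv-contrib-python

    # ArUco tags stand in for the proposed "solid object, avoid me" logo.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    frame = cv2.imread("dashcam_frame.jpg")  # hypothetical input frame
    if frame is not None:
        corners, ids, _rejected = detector.detectMarkers(frame)
        if ids is not None:
            # Any detected tag means: solid, avoid regardless of velocity.
            print("solid-object markers found:", ids.ravel().tolist())

Markers like these survive viewpoint and lighting changes far better than trying to classify arbitrary trailer shapes, which is roughly the appeal of the sticker idea.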
In my opinion we aren't going to see Level 4+ autonomous vehicles until we start including their requirements in the design of our built environment.
We've designed roads and cars to accommodate the shortcomings and limitations of humans.
Is there any reason to expect robotic autonomous systems to have the same set of shortcomings as humans?
Consider that Google/Waymo seems a lot more serious about CV research than Tesla, and yet their prototypes all use LIDAR and CV sensor fusion. If Waymo thinks LIDAR is important, why should I trust Tesla when they say it isn't? And remember there is a strong profit motive for Tesla to downplay the necessity of LIDAR: LIDAR is expensive and bulky right now, but Tesla wants to advertise their cars as containing all the hardware necessary for self-driving so they can profit from the automation hype. Waymo uses LIDAR because they're trying to make it function while Tesla scoffs at LIDAR because they're trying to sell cars with hype.
Yes, there will be bugs (all software has bugs), and they will decline over time. Meanwhile, the accident rate on Autopilot is probably better than that of human drivers.
"Will crash into a stationary object at high speed" is an intentional design decision.
I wouldn’t go so far as to call it a feature, but it’s certainly not a bug in the traditional, or common, usage of the term.
This case (truck perpendicular to travel path) should literally be the easiest test case for object detection, other than larger stationary objects like buildings and bridges.
I suspect this is going to be a perpetual problem for vehicles without lidar, but I'm not particularly clueful.
That's interesting. I can imagine, lacking real depth information, why that would happen. Especially for specialty trailers, like the ones that pull logs or wind turbine blades... where there's a large open area under the cargo.
That's my biggest objection: I want either to be 100% disengaged or nothing. A system that wants, but doesn't require, you to pay attention, and so allows you to check out, is absurd. It's hard enough reacting to things when you are driving; a system that lets you stop paying attention but might ask for immediate responses to situations it can't handle? Hell no.
We've had our Tesla since November, and it has given us two 30-day trials of Autopilot; the "no thanks" is so obfuscated that it's easier to say yes and then opt out after it enables it.
Some of the features, like Intelligent Cruise Control, would be nice, but no way in hell do I want the half-assed Autopilot feature.
Stop marketing Tesla's Level 2 ADAS as "Autopilot", maybe? It would be a pretty radical change, but it would get rid of what is by far the most common misconception about it: namely, that it allows for even momentary distraction by the driver, as the autopilot on an aircraft would. In fact, it does not. On the flip side, they could fairly reintroduce the Autopilot name once they reach at least Level 3 on the ADAS scale, which allows for momentary distraction in highly selective conditions, such as (for a start) when the car is basically restricted to moving at pedestrian-safe speeds anyway, due to intense congestion or other reasons.
And then the same for autopilot.
Tesla's stingy with their mileage data though, so you have to get academic.
Combining our crash data with the MIT numbers, we arrive at the following fatal Tesla accident rates:
- Autopilot: 0.25 per 100 million miles
- Human Pilot: 0.20 per 100 million miles.
(Put another way: one fatal Autopilot crash every 400 million miles, one fatal human-piloted crash every 500 million miles.)
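For anyone checking the arithmetic, the conversion from a per-100-million-mile rate to a one-crash-every-N-miles figure, using the rates as quoted:

    def miles_per_fatal_crash(rate_per_100m_miles: float) -> float:
        return 100e6 / rate_per_100m_miles

    print(miles_per_fatal_crash(0.25))  # 400,000,000 miles (Autopilot)
    print(miles_per_fatal_crash(0.20))  # 500,000,000 miles (human pilot)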
A few confounders, though:
* Autopilot is used mostly in favorable road conditions (highways, clear weather), while humans drive in every condition, for example snow or fog
* Teslas are on average newer than the average car on the road, increasing the likelihood of surviving a crash
* Teslas are upper-class vehicles, further increasing the likelihood of surviving a crash
All in all, the cards are stacked in favor of Tesla in a bare (fatal) accidents-per-mile comparison.
They did it to themselves, though: bringing glorified cruise control to market while touting it as the safest way to travel by car, and calling it Autopilot.
I own a TM3. I do not have the FSD features, but those were not involved in this crash. I do know Enhanced Autopilot doesn't seem to do a good job of tracking certain objects, especially stationary objects in the path of travel. While a truck would not be stationary, I really do not see any excuse for not seeing one; at minimum, two cameras had eyes on it. One of my favorite "what does it see" videos: