Pedestrian detection systems don’t work well, AAA finds (arstechnica.com)
126 points by lelf 13 days ago | 117 comments





Once I'd had it pointed out to me that these articles go out of their way never to refer to people on foot as actual people, I just can't unsee it.

The only thing referred to as human here is the driver, and the only person whose safety is discussed is the driver. Clearly at some point during the composition process the writer got bored of using the term "pedestrian", because it looks like he hit up the thesaurus, discovered the term "biped", and decided that would be a good term to use. (But I bet you a fiver these cars won't stop for kangaroos or ostriches other than as part of some generic collision avoidance system.)

In the writer's defence, the sentence about the driver's safety is a mite ambiguous - but it could have been more explicit, so I'm going to be uncharitable here. And I do give him points for sneaking the term "run people over" into the last sentence, because he could so easily have referred to it as an accidental unavoidable pedestrian collision incident or something.


Entire cities are built around our love for cars — the ideology of the superiority and desirability of cars is deeply buried within the fibre of society. It could be that the author thinks they are just being neutral, while the neutral thing would be to describe this technology as something that regulates the interaction between equally valuable human beings.

This is car culture 101: expensive cars (and their drivers) are at the top of the food chain; cyclists and pedestrians are at the bottom. Depending on where you live this might differ (e.g. in the Netherlands cyclists are on the same level as motorists and you can really see it in the architecture).


> Entire cities

And the modern police force.


Kind of like how they go out of their way to refer to people operating cars as drivers instead of actual people?

This kind of language is par for the course because these articles basically use a less terse version of whatever language the study or official report uses and those kinds of documents tend to shy away from things like "people driving cars" and "people walking".

You are finding meaning where there is none.


One of the side-effects of using "human centered language" tends to be needless verbosity and resulting clumsy writing as the human being writing the piece of writing labors to appropriately center the human being or human beings being discussed.

No, this is actually a real thing that happens when the news covers driver pedestrian interactions.

https://usa.streetsblog.org/2018/03/28/how-coverage-of-pedes...


Yes. Thank you for this post - this was exactly the sort of thing my comment was referring to.

To me, the term "bipeds" seemed like a stylistic choice to emphasize the shortcomings of the systems -- the algorithms are not sophisticated enough to understand what a "person" is, so the author used a different word to emphasize that fact.

And the rest of the article is about a test with crash dummies. Why would the author talk about "people" when the tests were performed with dummies?


> And the rest of the article is about a test with crash dummies. Why would the author talk about "people" when the tests were performed with dummies?

Because the upshot of the tests is that if your car is equipped with ACAS that "detects pedestrians," then it may lull you into a false sense that your car will avoid running people over and killing them even if you stop paying attention. The reality is that the ACAS is not foolproof and you shouldn't rely on it. But these systems are likely to create a Volvo Effect, where the people who have them drive much more recklessly than people without them.


What you may be describing is "Risk Compensation:"

https://en.wikipedia.org/wiki/Risk_compensation

That has been borne out with studies of how people adjust to airbags, SUVs, pickups, and other vehicles they emotionally perceive as being safe.

One of the things I've read about risk compensation is that it has very little correlation to the actual safety of a vehicle, and a lot of correlation to certain features that people emotionally associate with safety.

For example, modern automobiles are safer when they provide plenty of rear visibility. However, many inexperienced drivers feel safer when cocooned in opaque materials, so they feel safer in vehicles with less glass. It feels to them like they're wrapped in protective steel.

Likewise, in a previous era people felt safer in longer vehicles, even if those same longer vehicles were less maneuverable and therefore more likely to get into an accident in the first place.

Length of vehicle is highly visible, the ability to avoid an accident is not.


"Not foolproof" is the most generous way possible to speak about pedestrian-avoidance systems that hit the pedestrian in every single tested case.

From the article:

Tesla: 20 tests, 20 kills - 0% success

Chevy Malibu: 20 tests, 20 kills - 0% success

Honda Accord: 20 tests, 4 kills - 80% success

Toyota Camry: 20 tests, 12 kills - 40% success

There were other tests (also failed) but the article didn't give full details on those so I won't include them here.
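For clarity, the success percentages listed above are simply collisions avoided over total runs. A minimal sketch of that arithmetic (variable names are mine, not from the AAA report):

```python
# AAA figures as quoted above: (test runs, collisions)
results = {
    "Tesla": (20, 20),
    "Chevy Malibu": (20, 20),
    "Honda Accord": (20, 4),
    "Toyota Camry": (20, 12),
}

for car, (runs, hits) in results.items():
    # Success rate = runs where the dummy was NOT hit, as a percentage.
    print(f"{car}: {100 * (runs - hits) / runs:.0f}% success")
```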

Looking at that, one manufacturer has a product that works poorly and three manufacturers should probably face sanctions for selling their "product" in a US marketplace.


These tests seem less than useful if you're not comparing them to the baseline: people driving by themselves. As the article points out, pedestrian deaths are up, and that's not because of this technology.

It's hard to compare because people and machines fail in different ways. Machines don't really know what people look like or how they behave, and are easily misled. People get drunk, get tired, get angry, look at their phones, and generally get confused. Humans have remarkable talents for putting together a 3D scene with only two eyes, but those eyes are extremely restricted in the range of wavelengths they perceive.

If we get aggressive in shutting down the cause of accidents, we're going to ban cars entirely. The machines are far from perfect; they may even be far from good. But the machines are getting better, and humans are getting worse.


The “Volvo effect” refers to standardized testing mostly reflecting the income and education level of one’s parents. I can’t find any research showing that Volvo owners drive more recklessly, but I’d believe it. Do you have any more info?

The theory amongst some motorcyclists is that they know they’re shit drivers, so they go buy a Volvo. They’re not driving like idiots because they’re in a safe car; they’re in a Volvo because they know they’re a menace.

Hard hypothesis to test. Just stay away from Volvos.


That's not a theory, that's some kind of urban legend or folklore, and it's nonsense. Repeating it without a shred of substantiation is equally nonsensical.

If you want to "stay away from Volvos" on account of it, fine. Please also consider throwing a pinch of salt over your shoulder when you spill it, and crossing the street when you see a black cat.


Easy, cowboy. If you took that as being passed off as serious science or some shit, you need to go grab another cup of coffee. Or maybe cut back.

That's ok. This is the REAL reason to give Volvo drivers plenty of room:

https://www.youtube.com/watch?v=_I3hnTaTufg


Volvo even goes out of its way to convince you that your car will avoid running people over and killing them even if you stop paying attention. https://youtu.be/pjQt2lEZIXg?t=144

I approach driving as a full-focus activity at all times, never trusting the other participants in their actions and reaction time, and I am cautiously trusting that the system my Volvo has will try to prevent harm if all else fails. Their approach and mission statement around safety are very no-nonsense, and the Pilot Assist system manages to avoid being hyped as revolutionary, but it has just the right features to help me on long commutes. If their software development pipeline is half as refined as their overall engineering, I'm comfortable recommending them.

Aren’t you referring to the seatbelt effect? As far as I know, Volvo drivers are still considered more timid on average?

Just because they aren't perfect doesn't mean they don't make your car safer. The writing here almost sounds like the author would prefer not having it. Having AEB on your car can't make it more dangerous unless the AEB system accelerates into pedestrians. You have two options:

1. A car that only brakes when you react

2. A car that brakes when you react, and also sometimes when you don't.

The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturer's fault for rolling out systems that are better than nothing, even if they aren't perfect.

Edit: I'll acknowledge that unexpected braking events might mean that AEB equipped cars have a higher risk of being rear-ended, but this test was about pedestrian safety. I'm including this edit just because I know someone will bring it up thinking it proves that they are smart even if it wasn't the point of the study.


The trouble is that, humans being humans, drivers come to rely on these systems and decrease their own vigilance, ultimately resulting in greater danger to pedestrians.

There may be some evidence this is true for certain specific features, but I think particularly with AEB, the vast experience most drivers have with the feature is just the false positives. Most people just get annoyed by it and want it off.

The very few times someone will have a positive experience with AEB it’s likely to be a huge adrenaline inducing panic response. As in, oh fuck fuck fuuu.. phew!

I think it would take quite a bit of psychosis to actually drive in a way that you were “relying” on AEB.


I've had the collision alert give false alarms from approaching another car too quickly, but I have never had it alert or brake for a pedestrian ahead.

I think if drivers are getting annoyed by false pedestrian collision detection alerts, they are approaching pedestrians far too closely/quickly.

> I think it would take quite a bit of psychosis to actually drive in a way that you were “relying” on AEB.

You mean like approaching an intersection with phone in hand, and eyes on the phone? I see it every day.


> approaching an intersection with phone in hand, and eyes on the phone

I think the point GP was making is that those people who are prone to do so will do that without AEB just as much as with.

I'm a full driving automation sceptic who believes that even automatic transmission makes people more dangerous drivers, but even I can follow the argument.

Unlike many other elements of car automation, AEB isn't meant to move a task from doing it yourself to supervising the computer; it's the reverse of that. And even if AEB worked 100% reliably every time, drivers would still try to keep AEB engagements at zero, because the sudden maximum braking is so much less comfortable than regular braking.


Yes, AEB is specifically designed to avoid the problems we saw with earlier interventions like ABS where users who don't understand the technology may mistake its safety interventions for their own skill and so trick themselves into driving worse than before.

AEB deliberately chooses maximum braking to a full stop because that does NOT feel like skill, it correctly feels like you nearly died just there. The experience is intended to be "Oh shit" not "See, I was never going to hit that, it's fine".


Re your first paragraph, I don't agree - people will consider it safe to drive and not pay attention 'because the car will stop me hitting anything'.

During rush hour in my UK city I reckon about 1 in 50 drivers I see are using mobile devices in hand whilst driving. Most appear to be lone female car drivers; the most dangerous are male lorry/van drivers (who presumably are using delivery/routing apps).


> I think if drivers are getting annoyed by false pedestrian collision detection alerts, they are approaching pedestrians far too closely/quickly.

What? No, it means they are approaching bridges at perfectly normal speeds. Or glare at sunrise/sunset. Or random things blowing across the road that aren't actually pedestrians. False positives are a real problem.


If AEB is to be part of a holistic automated driving experience, then it's clear it's not good enough yet. I think that is worth knowing so we can make sure we don't have users accidentally rely on them.

I’ve never seen anyone “rely” on automatic emergency braking before to drive more recklessly. Such a concept reminds me of an old Aerosmith video where some kid runs a car into a wall just because they know the airbags will save them.

It’s not about driving in ways that are visibly more reckless. It’s about thinking they don’t need to pay as close attention to the road.

That really isn’t how automatic emergency braking works though; it’s a last-minute thing that you are already screwed on if it activates. Seat belts, blind spot detectors, lane keep assist, and auto cruise (with back off) inspire much more false confidence than something many people are never going to be able to experience directly.

It's not encountering it that inspires false confidence. It's knowing that the car has it that inspires false confidence. People think that it keeps you from hitting things, so they worry less about hitting things, so they focus less on not hitting things.

The feature isn’t tangible until it is used. Psychologically, people aren’t going to trust it very much because they have no idea about its capabilities. Automatic emergency braking is very much like air bags in that way, which inspire a lot less confidence than seat belts because most people never experience how air bags actually work (whereas they can feel the tug of the seat belts constantly).

The best safety features are intangible until needed for this reason.


> Psychologically, people aren’t going to trust it very much because they have no idea about its capabilities.

What makes you believe this? Just because you won’t trust it until it happens doesn’t mean that the average driver won’t. It’s advertised as a feature of the car for a reason.


> The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturer's fault for rolling out systems that are better than nothing, even if they aren't perfect.

Not quite. Imagine you are driving along a narrow but straight section of road. The road is wet, there are leaves on the surface and shallow puddles of water. You're doing 50 mph (or 80 km/h) and you feel comfortable because visibility is good, there are no other drivers, no animals, you can see for hundreds of meters in front of you, and there are empty fields beyond one row of trees on each side of the road. Then suddenly a flicker of light or some shadow makes your AEB system brake hard, as if trying to avoid hitting a pedestrian right in front of you. Your car's ABS does its best, but the leaves, the water and the speed mean your car starts sliding; when the tires recover traction the whole car is angled 20 degrees to the left. You have no time to make a correcting turn to the right, and you end up crashing into one of those thick trees still going 30 mph (or 50 km/h). If you're lucky you're OK. The car is totalled. Will you buy another car with that automated braking system?


You’re describing an absolute failure of the ABS system. If your car has such a massive flaw little else matters; it’s inherently unsafe. Granted, if you had turned off whatever version of vehicle stability assistance came with the car and ABS had some major defect it’s possible, but again at that point you’re describing several failures.

I don’t mean to suggest the AEB system is without flaws, but if applying sudden braking causes the brake lines to rupture or whatever, that’s a serious failure on its own.


Braking hard loads weight onto the front tyres, which changes handling in every car ever. My car braking (unexpectedly) when I'm maneuvering (e.g. around an object) means I have no idea who is responsible for the outcome. My decision was overridden by the car.

Same is true of ABS and traction control, if you take the view that your decision was to apply this much braking or power to the wheels.

Yet ABS and traction/stability control are safer for the vast majority of drivers (who are terribly poorly trained on vehicle dynamics).


> Same is true of ABS and traction control,

I have to disagree. After using a car in various slippery conditions the way ABS and traction control works is very predictable. This new system is more like an airbag than ABS. You can't really test it in a controlled environment and get familiar with it. You will be surprised when it activates. The key question everyone is asking is: is it going to be a positive surprise or a negative one?


Try ABS on gravel or sandy soil or most traction control systems on wet leaves. It's easy to find a situation where the direct driver input would be preferred over the nanny systems over-riding. (I do agree that overall, they improve safety.)

It may be easy for you, but I've had a tough time outperforming ABS on ice and leaves. And the ABS of today is better than the ABS of the '90s. In pretty much every case where I've done real-world emergency hard braking I have either appreciated ABS or wished it was there. The last time I drove a vehicle without ABS was riding a motorcycle; if ever there was a candidate for ABS it's motorcycles, yet surprisingly ABS has taken much longer to become standard on bikes. Did I manage to avoid plowing into a fool on the interstate? Yes, and it was thanks to hours of self-training on perfecting my braking technique. Would I prefer not to have worried about it at all, knowing that the bike would pulse the front wheel a hundred times a second and maintain traction 95% of the time? Absolutely I would.

All of my traction tests on severe ice have shown that the ABS does about the same as I can, which is to say, terribly. Drivers should know that ABS can't stop a car that has no traction. That is an education problem, not an ABS problem. If you can reliably reproduce a case where ABS does worse than a driver with a few hours of training, I'm pretty sure that there are some organizations that want to talk with you.


On any loose gravel drive [or most surfaces where a sliding tire deforms the surface], it's trivial for an untrained driver to out-brake an ABS-equipped car 9 (or 10) out of 10 times.

Try it sometime.

Or read the NHTSA study on it: http://www.nhtsa.gov/DOT/NHTSA/NRD/Multimedia/PDFs/VRTC/ca/c...



I maintain there is a fundamental difference between an electronic system which attempts to help a car do what the driver has asked, and one which just puts the brakes on hard, without driver input.

I know what you're saying, but I think it's a false equivalence.


> You’re describing an absolute failure of the ABS system.

In that scenario, braking hard enough to engage ABS at all is the failure. It is not a failure on ABS's part unless you consider all ABS faulty. Some loss of yaw control is inevitable whenever ABS is engaged. ABS only gives better yaw control than completely locking the wheels.


Going from “straight sections” of the road, applying brakes and getting:

> when the tires recover traction the whole car is angled 20 degrees to the left.

That’s not just some loss of yaw control. That’s an ABS system utterly failing to do its job. The point of an ABS system is to avoid the exact situation described.


That's pretty normal behavior, in my experience.

In that case, I honestly believe you may want to get a new vehicle, or at a minimum get yours checked by a competent mechanic. There have been plenty of advancements in ABS and traction control, but this is literally what they were designed to deal with.

Ex: https://m.youtube.com/watch?v=iz9_4qiSXkw

Especially if rather than stopping with a slight twist it’s twisting enough that hard braking means you often lose control.


I don't think I have ever seen any vehicle whose ABS wasn't essentially rapid cadence braking, and in bad road conditions, the car will turn a bit during the intervals while the wheels are locked (so I generally found threshold braking more useful). If the road is bad enough, even leaving the wheels free won't necessarily keep the car pointed forward. That said, I haven't really had issues recovering from these situations.

Didn't once have an unexpected braking event on my old '14 Mazda 3 Astina (6MT sedan w/radar cruise as well).

LOVED that car so much. They just get the UX right.

Tested it by driving through empty cardboard boxes three times from a steady 30 km/h, not moving the inputs a millimetre:

First time detected and applied braking, but stopped 5cm into them, only just knocking top one off the bottom.

The next two times it stopped just in front. Each time the engine stalled into stop-and-go mode, IIRC, and the car applied the hazards automatically.

(Was I one of the only three drivers who ever verified the system's workings?)

In dense highway traffic, on a few instances it alerted me to stopped traffic ahead when I was checking mirrors and tired. Was able to stop myself before AEB.

Found myself using radar cruise all the time after that, as it applies braking earlier.

Loved the rear cross sensor, especially when pulling out from 45° angle parking, and doubly so when parked next to a truck. One does not simply reverse into these. AFAIK Tesla still doesn't offer rear cross alert.

Did I mention BLIS with visual indicators in the mirrors? Or the fully functional HUD? All this for a fraction of the cost of a Tesla or other luxury vehicles offering the same - in 2014 this list was far shorter than today. Really miss it, would buy again if we had to have another 2nd car.


> Just because they aren't perfect doesn't mean they don't make your car safer.

My reading of this article is different: indeed they do make your car safer, if you're driving it. But we're being told that self-driving cars are just around the corner and regulators are beginning to allow self-driving cars on our streets. This article shows just how poor the technology is, and how we are nowhere near ready for self-driving cars on our streets.


>But we're being told that self-driving cars are just around the corner

I'm sure not hearing many people saying that these days. I'm not sure there's a real consensus on exactly where things will be in 10, 20, and 50 years. But there seems to be a pretty broad consensus at this point that pretty much nothing is "just around the corner."

Once most of the people with a vested interest in self-driving being imminent found they couldn't credibly keep to that storyline, things got quiet in a big hurry.


To be fair to self driving cars, none of the vehicles being tested are Waymo vehicles. There are obviously some that are much farther along on this than others.

> The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturers fault for rolling out systems that are better than nothing, even if they aren't perfect.

When you bring up fault, it implies you view this as a conscious change in behavior.

It could be. But many driving habits can't really be conscious decisions as they're judgement calls that one couldn't explain if pressed. If visibility is poor and you decide to drive slower than the speed limit, you are picking a speed that feels right. The way you approach an intersection or a curve with bad line of sight is based on your sensation of the possible risk.

It seems impossible to figure out how people respond to a moral hazard individually, but there is a strong enough signal in aggregate that the phenomenon is well known by insurers.

If the unconscious factor outweighs the benefits of the system, an AEB is a net negative.


There's been evidence for a while that these safety systems do cause drivers to not pay as much attention:

https://www.wired.com/2011/07/active-safety-systems-could-cr...

Like this article says, in city traffic that's stop and go, these systems could prevent rear-endings. But it could also make drivers more complacent and pay less attention in situations where these systems don't work as well (higher speeds on freeways, or with pedestrian detection).


I don't get it. Are you arguing that because the systems quietly encourage habits of mind that the human driver isn't responsible for knowing that these habits exist and correcting them? Because that subtly implies that human drivers shouldn't be at fault for anything the car does. And that's a pretty scary attitude for me considering that I walk and bike on the roads you drive on.

There are good rules of thumb for things like approaching intersections (slow down, move foot to brake pedal), and you can use e.g. the dotted lines of the road to decide whether you're going too fast for conditions. The rule of thumb is that you pick a dot that just enters your field of view, then count how many seconds it takes for it to disappear below your hood. If it's less than about 5, you are driving too fast for conditions and should slow down. And for a safe following distance on the highway, the rule of thumb is 3 seconds for a stripe appearing from behind the car in front of you.

It's not nebulous at all.
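The arithmetic behind the 3-second following rule is simple enough to sketch. A minimal illustration (the function name is mine, not a standard API):

```python
def min_following_gap_m(speed_kmh: float, gap_seconds: float = 3.0) -> float:
    """Distance travelled in `gap_seconds` at `speed_kmh`: the minimum
    following gap the 3-second rule implies at that speed."""
    return (speed_kmh / 3.6) * gap_seconds  # km/h -> m/s, then * seconds

# At 100 km/h, a 3-second gap works out to roughly 83 m.
print(round(min_following_gap_m(100.0), 1))  # 83.3
```

The point of counting seconds rather than estimating metres is exactly that the rule scales with speed automatically.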


> Are you arguing that ... the human driver isn't responsible ...

I'm arguing that "responsibility" and "fault" are appropriate for resolving a legal dispute, but they don't tell you if a system as a whole is more safe or not.

> The rule of thumb is that you pick a dot that just enters your field of view...

How often do you do this check? And why?


How much can you unconsciously change your behavior based on a system that has probably never triggered for you on a pedestrian?

You heard a salesman explain it to you, or read about it when purchasing the vehicle.

> Just because they aren't perfect doesn't mean they don't make your car safer.

Sure it can. If "drivers" rely on this instead of being vigilant themselves, then it absolutely does make the car less safe.


Euro NCAP has a category named "Vulnerable Road User Protection". Tests try to determine how forgiving the front of the car is when you run someone over.

Cars can get extra points in that category if they have AEB features. But if it turns out that AEB isn't effective, this just makes dangerous cars (like SUVs or vans) appear safer than they really are.

So I'd say it's fair to criticise AEB. It's misleading to claim "AEB makes your car safer for pedestrians" when in reality a car with a lower hood and bigger windows without AEB would be a lot less dangerous for pedestrians.


Right, but right now there is a real issue (https://www.google.com/search?channel=fs&q=tesla+driver+auto...) with people b̶e̶i̶n̶g̶ ̶t̶r̶i̶c̶k̶e̶d̶ ̶i̶n̶t̶o̶ believing that their cars are more autonomous than they actually are. It's good to assume that your car will not avoid accidents by itself.

Abrupt, unexpected braking could cause a rear-end collision, which could conceivably kill quite a few more people than if the braking had not occurred. Imagine a car braking hard directly in front of an 18-wheel semi-trailer. That could lead to a pile-up and quite a few people dead.

I don't know if you started typing your comment before I got my edit up or not, so I'll give you the benefit of the doubt.

Yes, unexpected braking events can cause rear-end collisions, but that was not the point of this study. If a pedestrian walks in front of a car unexpectedly you can either try not to hit them by braking (or possibly swerving if the situation permits), or you can just take the action you're implying and plow right on through them because your unexpected braking might cause some other people to get hurt.

My 2016 Subaru Forester with Eyesight has gotten confused and applied the brakes while weaving through the gates at military bases (these are low speed and lined with barriers), but other than that I have had no unexpected braking events in my car that weren't justified. In 40k miles the Eyesight system in my car has helped with about six close calls (I can't know if they would have become collisions), and haven't been rear ended. Right now I'm going to say that AEB systems add plenty of safety value to offset the increased risk of a rear end collision.


A rear-end collision is caused by the following driver being too close. A car in front could at any time, for any reason, use full braking force, and you have to plan your following distance appropriately.

That said, a good way to get an angry rant from a truck driver is to talk about cars cutting in front so they no longer have an appropriate stopping distance.


> That said, a good way to get an angry rant from a truck driver is to talk about cars cutting in front so they no longer have an appropriate stopping distance.

If you do that then I'd imagine that the trucker is going to be more likely to just steamroll you rather than risking jackknifing.


If you hit a pedestrian because you don't want a 16-wheeler up your backside, you're as much at fault for murdering that person as the trucker who murders you by following too closely.

seriously? I get it, drivers in general are not cautious enough of pedestrians and cyclists. but if a 16-wheeler is following so closely that the other driver is at risk of getting rear-ended when they stop, the pedestrian is probably toast anyway. the semi is just going to push the car straight through the pedestrian. now instead of one person dead there's 10 tons of mass careening through the street out of control, putting a lot more lives at risk.

The same is true if there is a pedestrian, so the fault lies with the truck driver being too close to stop in time. Perhaps that means the law should mandate automatic braking technology on trucks.

How is this different from a human slamming on the brakes to avoid the accident? Or a human crashing into the truck? These situations can also lead to a rear-end collision the same way as AEB would.

A lot of drivers also look ahead of the vehicle directly ahead of them, so they have some sense of when the vehicle in front of them is likely to slow down so that they can be better prepared to slow down themselves. If the AEB has a false positive and suddenly slows the vehicle, it would likely catch drivers following that vehicle off guard.

> If the AEB has a false positive and suddenly slows the vehicle, it would likely catch drivers following that vehicle off guard.

Then stop following so closely. Problem solved. Don't tell me not to have a safety feature in my car because another driver isn't driving safely.


Really? You've never had a car cut in front of you from the inside lane because they just had to pass three cars before making this exit? I see this happen way too often. Please stop assuming every rear-end collision is the fault of the rear driver.

From an insurance "at fault" perspective it's basically always going to be the fault of the following driver. However, I can pretty much guarantee that, if you go out on the road and randomly slam on your brakes, you'll get someone to hit your vehicle in pretty much no time flat.

>From an insurance "at fault" perspective it's basically always going to be the fault of the following driver.

It is a quick simple rule that avoids dealing with he said she said situations in the days before dashcams and the inaccuracy of this approach would roughly balance out across the insurers. Sure it screws the people who aren't actually at fault but they can't (or couldn't, before dashcams) prove that anyway. This kind of thing will probably go away in the future because everybody will just have dashcams.


That's not the situation the parent post was talking about "If the AEB has a false positive and suddenly slows the vehicle"

The case you're describing is exactly what AEB is designed to protect you from and if the car behind you couldn't react as fast as your AEB and he rear-ends you, then he was following you too closely.


True, but spotting a pedestrian through the windows of the car in front of you is much more challenging than spotting a lineup of brake lights from cars ahead. Keeping a safe following distance is by far the more influential factor in a collision occurring or being avoided.

> Abrupt, unexpected braking could cause a rear-end collision

If you’re tailgating. Don’t tailgate.


Or they're cutting. Don't cut.

> Just because they aren't perfect doesn't mean they don't make your car safer.

There is no evidence that they make your car safer, but there is evidence that AEB fails in common situations or in ways a human driver wouldn't.


It is irrelevant if AEB fails when a human doesn't: the human and AEB systems are complementary. When the human driver doesn't fail, then it's okay for AEB to fail. AEB only needs to pick up the slack when a human isn't fast enough to get on the brakes. Even if it only works in 10% of those situations, it's still preventing collisions.

> It is irrelevant if AEB fails when a human doesn't

This is only true with false-negatives. A false-positive (say, on a pedestrian-free, controlled access highway) could easily be actively harmful if it causes abrupt changes in vehicle behaviour that would not occur if a human were in full control.


> It is irrelevant if AEB fails when a human doesn't

Maybe in the eyes of the law. If pedestrian-detection-equipped vehicles cause more deaths, however, it is quite relevant.


It matters to me as a pedestrian because drivers will become reliant on their AEB, and when it fails, they might run me over.

We better design these systems not to fail then. Seems obvious but I think that’s the solution to what you’re describing. Having an unreliable human as a backup is not going to do it because as you say, and I agree, humans will become reliant on it.

The systems are not reliable yet. Fortunately humans are not totally reliant on them yet either so humans are still effective as a safety backup.

The trick is to make sure the crossover point (when humans become so reliant that they are ineffective safety backups) comes after the system becomes highly reliable. Unfortunately humans are already unreliable to begin with even without AEB, so we are just going to deal with a few incidents from autos, both those without AEB, and, until it becomes perfect, those with AEB.


> We better design these systems not to fail then

Too late, we're commenting on an article about them failing in common scenarios.


But we're not done designing them. This is an ongoing project.

Actual studies show it decreases rear-end collisions by around 40% and pedestrian insurance claims by 35%: http://bestride.com/news/new-study-by-iihs-shows-automatic-e...

It's weird that they didn't include Subaru's eyesight system, given that it generally performs much better than Tesla.

https://www.caranddriver.com/features/a24511826/safety-featu...


Agree completely - Eyesight impresses me on a regular basis with the complexity of scenarios it seems to be able to react to, both when it comes to collision alerts / automatic braking and when used for assisted cruise control.

I haven’t had any experiences with pedestrians at speed in front of the vehicle, but the rear automatic braking in my Subaru is very aware of / sensitive to people walking behind it.


NCAP footage seems to show the opposite.

https://youtu.be/cMiZa3HgRVE?t=125

The systems work as described and don't claim to work in every scenario.

https://www.tesla.com/sites/default/files/model_3_owners_man...


I’ve seen that NCAP video and was really impressed by it at the time. It doesn’t jibe with the results in TFA at all - where it sounds like the Tesla (and most of the other cars they tested) hit the dummy every time, and often never even slowed down.

I wonder how to explain the disparity. Could the video you linked have been taken with AutoPilot enabled maybe?


From reading the description it seems like AAA used tests where there was a bend in the road and / or the pedestrian was moving across, so they weren't actually in the path of the vehicle until a couple seconds before impact. These systems are going to be conservative in classifying potential obstacles as in need of emergency braking, as they should be - you don't, for example, want your car slamming the brakes because it sees a pedestrian on the sidewalk next to the lane you're driving on.

That said, it should be noted that the Honda Accord actually did quite well in these tests.
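That conservatism can be sketched roughly as a predicted-path check: the system only brakes once the pedestrian's extrapolated position falls inside the lane within the car's stopping envelope. All names and thresholds below are hypothetical, not any vendor's actual logic.

```python
# Illustrative sketch only: brake when a pedestrian's predicted position
# intersects the vehicle's path within its braking distance. Thresholds
# (lane width, margin, deceleration) are made-up round numbers.

def should_brake(ped_x, ped_y, ped_vx, vehicle_speed,
                 lane_half_width=1.5, reaction_margin=0.5, max_decel=8.0):
    """ped_x: lateral offset from the vehicle centerline (m),
    ped_y: distance ahead (m), ped_vx: pedestrian lateral speed (m/s),
    vehicle_speed: m/s, max_decel: assumed braking capability (m/s^2)."""
    if vehicle_speed <= 0:
        return False
    time_to_reach = ped_y / vehicle_speed           # when the car arrives
    ped_x_then = ped_x + ped_vx * time_to_reach     # predicted lateral position
    in_path = abs(ped_x_then) < lane_half_width
    # distance needed to stop, plus a reaction-time margin
    stop_dist = vehicle_speed**2 / (2 * max_decel) + reaction_margin * vehicle_speed
    return in_path and ped_y <= stop_dist
```

Under this kind of logic a stationary pedestrian 3 m off the centerline (on the sidewalk) never triggers braking, while one crossing toward the lane only triggers once their predicted position falls inside it - which, at walking speed, can leave only a second or two of margin, matching the AAA scenario.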


Touchscreen dash; training people to expect their car will do stuff for them (beep if something is behind, alert if someone's in the next lane, brake if something is in the way, drive); cellphones (both drivers' and pedestrians'); pedestrian crosswalks in the middle of busy roads with new signaling most drivers have never been formally trained on; motorcycle lanes between car lanes; bike lanes on the side of the road, sometimes second lane from the side, sometimes on thruways. The list goes on and it's getting worse every day. Keep it simple, stupid.

FYI, worth reiterating: a deep learning vision system will not necessarily recognize a dummy as a human, particularly if it operates in both the visible and IR spectra.

That's a fair point which also occurred to me while reading the article. It is, I think, indicative of a deeper issue with using ML in these sorts of safety contexts. If the only way to really test your safety system is to actually put people in danger your whole concept may be problematic.

That's why Tesla's approach is pretty brilliant IMO. It's easy to collect samples where there was hard braking and a real, actual human was visible in the path of the car while the car was under human control. No dummies are needed, and the AI was not in control of the car, so there's no ethics issue either. Your Tesla will upload such samples automatically if Tesla's deep learning system wants them.

So one good thing about the (perhaps unfair? perhaps disproportionate?) media attention that failures of semi-autonomous systems get: over the long term, I'm pretty sure it helps focus the teams to make the tech better.

It's how aviation got insanely safe. Every passenger airplane crash is scrutinized. The very distortion in thinking that makes people think flying in airplanes is more dangerous than cars is what motivates every crash to be scrutinized.


It's how aviation got insanely safe

Not by media attention though; it took an authoritative, hard-line government body to reach that level.


I have former colleagues working on some of these systems, and they tell me some pretty scary horror stories of the simple and obvious ways in which they fail. Some of the stories that I have heard just blow my mind - and seem all-too-reminiscent of the VW emissions-test defeat-device scandal.

There's so much talk about false-positives in this thread, but I have yet to experience that on my own car, even when people cut me off dangerously close. Not once has AEB mistakenly been activated by the car.

My car doesn't have AEB, but it does have the front collision warning sensors. I get plenty of false alarms, particularly with an S-turn (quick right then left turn). On the final turn, there's a guard rail which sets off the FCW about 25% of the time I drive through there.

Last time this was discussed on HN:

https://news.ycombinator.com/item?id=21151117


Looking at the methodology here, I don't think this test is likely to be representative. There are two important differences between a test dummy and a pedestrian. First: pedestrians glow in infrared; and second, pedestrians move, even if it's only shifting in place. Both of these features of pedestrians are extremely useful for a sensor that needs to find them, so I expect using a still, cold dummy to make them perform worse than they would in the real world.

> Looking at the methodology here, I don't think this test is likely to be representative. There are two important differences between a test dummy and a pedestrian. First: pedestrians glow in infrared; and second, pedestrians move, even if it's only shifting in place.

The dummies were articulated, moved, and designed to mimic the infrared response of a real human. See section 4.2.


> ..pedestrians glow in infrared; ..

> ..using a still, cold dummy..

The kind of infrared you're thinking of is probably not the kind the sensor can detect. Only thermal imaging cameras can detect the far-infrared range - the one that can help distinguish hot objects from cold ones. 'Normal' IR cameras only capture near infrared, which tells you nothing about temperature.


> The kind of infrared you're thinking of is probably not the kind the sensor can detect. Only thermal imaging cameras

Parent is talking about far-infrared (ie, thermal) sensors. Many car models have high-resolution thermal sensors.



Automotive night vision is expensive and unusual.

What is the reasoning for the vehicle not also sounding its horn while it's braking?

Probably because it's only legal to use the horn if it's to avoid an accident, but it's almost always legal to use the brakes.

Welcome to the real world, where out-of-lab conditions are far harder than expected.

Now let's put a person who is talking on the phone and texting at the same time behind the wheel, and have that person do the same exercise that AAA conducted.

These systems would work better if people were made out of metal.

It's a bit scary that the Tesla Model 3 doesn't fare well at all - and this is the same company shipping live metal torpedoes through parking lots via the Summon feature.

Scary.


I think this is a little over the top. Vehicles being summoned have a top speed of 5 MPH, hardly a "torpedo".


