Hacker News
Autonomous vehicles are great at driving straight (ieee.org)
27 points by jnord 7 days ago | hide | past | favorite | 42 comments





The title is accurate, but it implies that AVs are more dangerous than human drivers in other types of driving.

You have to read the whole thing to understand that the implication is false.

SAE Level 4 self-driving vehicles (those capable of full self-driving without a human at the wheel) performed especially well by several metrics. They were roughly 36 percent less likely to be involved in moderate injury accidents and 90 percent less likely to be involved in a fatal accident. Compared to human-driven vehicles, the risk of rear-end collision was roughly halved, and the risk of a broadside collision was roughly one-fifth. Level 4 AVs were also close to one-fiftieth as likely to run off the road.

So if you ignore all the horrific accounts of Tesla FSD casualties because Tesla is Level 2, not Level 4, then AVs are already saving lives.

Level 4 AVs are worse at dusk and dawn, better in rain and fog, worse when making turns, better at staying alert with no fatigue, better at staying on the road.


Great! So if I am hit (blind pedestrian) by an AV at dusk, I can at least say it statistically wouldn't have happened if I weren't out and about so late? Sorry, but these "is safer than human drivers" statistics don't decrease my AV anxiety. No, they make it even worse, because the wording makes it obvious some of you want them to look good.

The overall fatality rate went down by 90%, though. That’s a fairly big deal: 30,000 fewer fatalities per year in the US alone. The ideal, of course, is no cars: but if you have to have a car, the next-best is no humans near it or in it.
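A rough sanity check of the fatality arithmetic above (the annual baseline is an assumed round figure, not a number from the article or the thread):

```python
# Rough sanity check of the "30,000 fewer fatalities per year" figure.
# Assumption (not from the thread): recent US traffic deaths run on the
# order of 35,000-43,000 per year, per NHTSA estimates.
annual_us_traffic_deaths = 35_000   # assumed round baseline
fatality_rate_reduction = 0.90      # 90% fewer fatal accidents, per the article

lives_saved = annual_us_traffic_deaths * fatality_rate_reduction
print(f"~{lives_saved:,.0f} fewer fatalities per year")  # on the order of 30,000
```

With any baseline in that assumed range, a 90% reduction lands in the low-to-mid 30,000s, so the commenter's figure is at least the right order of magnitude.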

> The ideal, of course, is no cars:

Of course? While I can get behind mixed-use development that integrates walking and biking systems with public transit, cars will always be necessary. It's foolish to think cars are a scourge on the earth: not every errand or trip can be walkable, covered by public transit, or even affordable for a family.


It depends on what the accident pattern at dawn and dusk looks like. If it's dangerous enough to pedestrians then maybe we need to restrict autonomous cars at those times until they have better testing.

But unless you only go out at dusk and dawn, you should be glad that these particular level 4 systems are controlling cars.


"You should be glad" is so very patronising; I can't write here what I think of that condescending behaviour.

It's pretty normal to say people should be glad about unambiguously good numbers. I don't know why you think it's patronizing.

In the scenario given, your risk is lower. What do you not like?

If you think the numbers are wrong, that's reasonable. But this thread of conversation was based on taking the numbers at face value.


The comparison between machine and humans uses the results of the average human driver. Not all humans are alike. Some cause accidents, others don't. There's some randomness involved (every human might be the cause of an accident at some point), but most accidents are caused by individual behavior of specific humans.

For a convincing comparison, remove all those human accidents caused by intentional violations / recklessness, drivers looking at their phones, and known but ignored medical conditions. And compare again.

On average we might be safer with all-AV traffic. But we might be even safer by removing the most dangerous drivers and allowing the rest to continue to drive manually. That's the capability of automation to aim for, not the average human.


The police do their best to remove dangerous drivers! And yet, people keep exercising free will and breaking the law and rarely get caught.

As with most systems involving human error, improving the humans has a limit if their livelihoods do not depend on strict adherence to the rules (and even then, you get violations!) The most complete way to remove human error is to remove humans.


Most traffic enforcement is focused solely on speed, and typically in areas where speed doesn't matter as much. Drunk driving enforcement is typically after the fact (i.e., the accident has already happened).

On top of that, jail time is minimal and people with suspended licenses obviously still drive because you need to in order to live.


Turns are pretty common, you know. And the article said that even in their dataset, level 4 autonomous vehicles were twice as likely to get into an accident at a turn as a human driver.

The article—and the repository for the dataset—are also pretty vague on just what the conditions were for the autonomous vehicles being measured. Were they being kept in areas that their manufacturers know are safe and within their capability to drive well? Were they being taken out on arbitrary roads? Back roads? Dirt roads?

The article doesn't mention snow. Were they being driven in the snow at all? When the lines on the road—and even the whole road itself—are covered with a uniform sheet of white, barely any markers to indicate where there's road and where there isn't?

Were they being driven in construction zones, where you're explicitly asked to drive across and outside of the lines?

I'm very well aware that once self-driving cars are able to handle arbitrary conditions like this even as well as an average human, they'll be amazing and save many people from serious injury and death. I'm genuinely looking forward to a future where I can get in the car and tell it where I want to go, then relax the whole way—not least because I know that one day, I'll very likely be frail enough not to be able to safely drive myself anymore.

However, I'm deeply skeptical that so-called level 4 autonomous vehicles are truly capable of all this without a human driver at this point in time.


Level 4 inherently means a limited ODD (operational design domain). That means there are conditions and situations in which these vehicles aren't allowed to operate. No one is even claiming that they operate safely in snowy conditions today, so why draw this specific line in the sand?

It's also worth noting that AV companies have done snow testing. Waymo specifically tests in Tahoe. None of that data is publicly available for us to even discuss, though.


> so why draw this specific line in the sand?

Simple: because I live in an area where these conditions are common, so self-driving cars will be 100% useless to me until they can solve it.

I am glad that they are starting to be feasible in areas without as many different road conditions, but until they can handle the conditions in my area, I can only benefit from them extremely indirectly.

...And furthermore, because it is frustrating to see people talk about "self-driving cars working in normal driving conditions and being ubiquitous" (which hasn't happened in this thread that I've seen, but I've seen it a bunch in the past) when what they really mean is "self-driving cars working in SF and a few other limited areas in the US Southwest". It just reeks of a lack of concern for anything outside Silicon Valley—which is a depressingly common problem on this site.


Waymo tested in NYC for years. The state was fairly hostile to the efforts and passed a few laws to restrict competition with the taxis. That's a major reason why there isn't much interest in New York state.

There have been various testing and public deployments across the southern half of the US, not only California and the Southwest. Texas, Georgia, Tennessee, and Florida all come to mind, plus Pennsylvania due to CMU. The deployments in California and Arizona are the most mature, but they're not the exclusive focus.

I'm not going to get into the bits about silicon valley, because frankly that's not a terribly interesting conversation to have.


It's fascinating that your defense of someone being tired of the CA-centric attitude here is to then turn around and hide behind the NYC-centric attitude.

If it's not SF or NYC, it barely matters to most HNers, which is rather annoying to see. It very obviously causes a heavy bias and arrogance, since there's no real balance with the areas outside those two metros.


There is another side to this coin.

Those companies don't give a flying f** about you or your snow.

They want to create an autonomous taxi service for a city and monopolize its public transport.

They might be producing PR materials about private self-driving cars, but they don't want that; it's silly. They want to own all the cars and rent them to as many people as possible.

To me the whole argument is pointless.


The sentiment against self driving is fundamentally irrational. Something bad might happen. Therefore, we stick with human drivers (because nothing bad ever happens with those, right?!). Of course the reality with human drivers is that they get into all sorts of lethal trouble all the time. So much so that traffic deaths are a pretty high ranking cause of death. Doing better than human drivers is not that big of a technical challenge because most of them simply aren't that great.

What the article shows is the obvious: overall machines perform better. As you would expect from something that doesn't suffer from tunnel vision, fatigue, distractions, alcohol abuse, etc and has really good reflexes and responsiveness. Of course self driving cars aren't perfect. But neither is their human competition. And the cars are pretty good already and getting better over time. These numbers are not exactly set in stone.

The circumstances where accidents are more likely to happen are also where people struggle. And of course getting cars to do better under those circumstances might be solved with technology: better sensors, more cautious routines that adapt to visibility, etc. Or simply deciding not to drive based on the risks. That's a decision humans are very bad at. Human drivers taking unnecessary risks is why people get stuck in winter storms and other bad weather. Going on the road when authorities are broadcasting warnings not to is exactly the kind of moronic behavior that gets people killed.

As for FSD, it gets a lot of the press, but there are quite a few self-driving taxi programs now operating in a growing number of countries and cities. That's all Level 4 autonomy, of course, but the rollouts are mostly going fine. People who have taken Waymo rides seem to mostly enjoy it. I think restricting the driving areas for these things has more to do with liability than technical risk at this point. But the more cities manage to deploy this stuff without incident, the less of an argument there is to block it elsewhere.


Before anyone gets excited, this is Level 4, e.g. Waymo, not your Tesla.

I’m a bit skeptical that there is enough data for a meaningful comparison at this point. Humans can and do drive in significantly more exciting situations than Waymo, and that’s probably going to remain true for a good while yet.


I've never gotten to ride in a Waymo, but my understanding is that they are very limited in the areas and routes they will take. Perhaps I'm off here, but I almost think of it as a trolley with tires, taking specific known roads and turns it can handle.

I know it won't drive on the highway, and I've also heard it avoids unprotected lefts.


If your definition of "limited" is the near-entirety of SF, most of central Phoenix metro, and the bits of LA between Santa Monica and downtown, I suppose you can label Waymo that.

Personally, I don't consider it remotely similar to a trolley. Aside from the obvious, there's not many differences between a Waymo and a good Uber within the service areas. You call a taxi and it shows up to drive you, unprotected lefts and all.


I looked at waymo.com and the only city where anyone can take a ride is Phoenix; I didn't realize it's still not generally available in SF or LA (or Austin). Though if it can do all these things now, why don't they open it to the public?

Good to hear, I didn't realize they covered that much area now. I'm in Austin and hoping they will get service to my house.

They actually cover quite a bit more than that, those are just the public areas today. They have employee operations going down the peninsula, for example.

Waymo is in Austin. Rainey street gets clogged with them on the weekend. I've seen them navigate Mopac without any trouble, too.

Like, they know to skirt traffic on Mopac via the clever use of service roads? And hopping off Mopac at 24th to slide along Pease Park?

What is this clever use of service roads?

A Texas highway like Mopac often has 2 additional travel lanes if one is willing to get off/on the highway repeatedly and to make questionable usage of merging lanes. For years, this was the way to travel Mopac in stop-and-go traffic. Dunno these days.

Oh you mean frontage roads.

App says it’s still coming soon, I guess I need an invite code.

Talk to me when they can reliably cover upstate NY in winter and construction season.

MIT's John Leonard was saying years ago that unprotected lefts are one of the hard problems. They're so situationally dependent and can depend a lot on social signaling. I have to make one to get into my driveway probably daily on a fairly busy road. I've had drivers flash their lights to let me know that, yes, they see me and are going to let me turn. I've also (rarely) been honked at by someone who was not about to slow and didn't think I was giving them enough space/turning quickly enough.

(Many other hard problems of course but unprotected lefts are something that drivers face multiple times a day in often ambiguous situations.)


Off topic, but when you're waiting for a gap to make your turn into your driveway, do you pull to the left edge of your lane to allow people behind you to pass on your right? (I observe people who don't do this: they dwell in the middle, and often even the right side, of their lane. I'm wondering if they are just oblivious to the traffic building up behind them, or if there's a safety reason to block people from passing on the right, or some other safety reason to stay to the right of their lane.)

I do try to pull left as far as reasonable without getting into the opposite lane but they may or may not be comfortable passing me on the right given there's a minimal shoulder on the road with one lane in each direction. It depends how adventurous they are, whether there's plowed snow, and how big their vehicle is.

If you're trying to suggest Tesla will win, you're off base. Waymo does the hard stuff without major problems. Tesla does the easy stuff while killing people.

I'm not suggesting anything of the sort, why does Tesla trigger people so easily?

This paper is extremely difficult to parse. They are aggregating together ADS incidents, regardless of whether the ADS was in command of the vehicle. The other thing that is making their data quite hard to interpret is that they aren't including half of Waymo operational data, because they are using accidents reported to the California DMV. Accidents are only reported to the DMV when they happen in the absence of a paying passenger. Any incidents that occur with a customer aboard are reported to the CPUC instead.

I've been thinking about some problems recently and one that keeps popping up is "engineering vs science". Autonomous vehicles are the epitome of engineering vs science. Engineering is dirty, pragmatic, and full of trial and error. Science is pure, abstract (not necessarily practical), and proven. The human brain is great at engineering stuff (that's how we survived natural selection), but computers are fantastic proof machines. When you want to map a problem that our brain solves through dirty engineering tricks onto something grounded in math and theory, you lose the emergent magic of all the little hacks that made the system possible.

Science is not like that really. Science is mostly experimental, which is pretty much trial and error. The theories produced through that process can closely fit observed results and yet still be wrong, like Newton’s theory of gravitation. There isn’t much in science that is truly pure and proven all the way down to its foundation, though something doesn’t need to be pure or proven to have good predictive power and utility. And there isn’t much that is truly abstract in science since everything is ultimately trying to explain how material reality works, even physics.

Math is the only thing that is like what you’re describing.


You're right—what I said applies to math only. I guess I was thinking about how ML engineers try to frame a problem in math terms whereas our human brain finds the most efficient circuits to solve the problem.

I'm still finding your comments a little confusing; and, now, bordering on the self-contradictory. I think you'll find that the proofs presented in the lower maths tend to be of "the book" form and really overstate the simplicity and clarity available to mathematics 'at the edge'. On the flip side, modern engineering sometimes gets to benefit from the work of centuries and the final (refined) output of science: the solutions are very elegant!

tl;dr "The authors highlighted two significant negative outcomes for level 4 AVs. It found they were over five times more likely to be involved in an accident at dawn and dusk. They were relatively bad at navigating turns as well, with the odds of an accident during a turn almost doubled compared to those for human-driven vehicles."

You're not summarizing if you grab the negative data points and remove the positive ones.

They are good at: going straight, not killing people, staying on the road, entering lanes, rain

They are bad at: dawn/dusk, turns



