
It's part of the growing myth that is "self-driving cars".

People seem to hand-wave and make up what these supposed future self-driving cars will do - and worse, they hand-wave and assert they'll be objectively better at X than humans, without any evidence to back up the assertion.

Self-driving cars are made by fallible humans using fallible programming languages and constructs. They can't possibly account for every situation or scenario - but people hand-wave and say they magically will.

Sure, one day you'll be able to sleep in the back seat of your car or read a book while it precisely weaves you through traffic - only to navigate you right off a cliff. Or the neighbor's kid with a laser pointer will prevent your car from turning into the driveway.

Driver-assisted cars are the real future.




> they'll be objectively better at X than humans, without any evidence to back up the assertion.

Google's road-tested self-driving car is already safer[1] than a human driver. Suggesting that a computer will be a more reliable processor of data and calculator of maths than a human is not something which needs data to back it up. The ability of drivers is so variable, and in the aggregate there are so many of them, that it's almost self-evident that a self-driving car which crosses a very low threshold for entry ("being road legal") will be better than a human.

> They can't possibly account for every situation or scenario - but people hand-wave and say they magically will.

Nobody is saying that they will, any more than people argue that autopilot on a plane will. It's very plain to see that right now, as of this second, there is a self-driving car which is safer than a human driver. It is not yet legal to buy, but that doesn't change the fact that it's safer. It may be that a bug crops up which kills a few people, but that doesn't make it less safe than a human driver; it makes the cause of death for some of the users different to "human error".

[1] http://bigthink.com/ideafeed/googles-self-driving-car-is-rid...


It doesn't have to be infallible. It only has to be a single order of magnitude better than any human driver, and most people will start using it.

How many other sectors have abandoned human intervention after computers surpassed human performance?


The important question then becomes: is society OK with bugs and shortcomings in software and hardware killing people? (This is based on the assumption that even driverless cars will not be perfect - some people will still die on the road.)

So far, society seems not to be OK with this (as in, we'd rather a person do the killing, even if we think that killing was wrongful).

We aren't OK with autonomous robots carrying weapons, even though they might be objectively better at guarding prisoners and military bases, or at killing "bad guys" in bank robberies. We freak out when a fatality occurs at an automotive plant, and those robots only pivot in place!

If society is going to agree that we're all OK with a bug left by some short-sighted engineer being responsible for people's deaths - then OK. However, I wager people aren't really OK with this; most just haven't really considered this aspect yet.


A lot of the backlash against autonomous weapon systems is fed by the last 50 years of sci-fi movies showing what might happen (however unrealistic). Self-driving cars are a different thing, and there isn't really an equivalence.

Sure, there will be legal issues (in a crash, who is responsible: the driver, the manufacturer, or the programmers?), but they will get resolved with time and case law.

The economic advantages of self-driving cars are huge (unless you drive for a living, but then progress is what it is). About 35,000 people a year die on American roads; an order of magnitude improvement would save roughly 31,500 lives a year, as sketched below - and that's just accidents resulting in fatalities; many more people suffer life-changing injuries. This generation of drivers might not like it, but as the cars get better and better at driving themselves, the next generation will hand over more and more of the responsibility, until a human driving a car manually on the road looks like an anachronism.
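A quick back-of-the-envelope check of that arithmetic (the 35,000 figure is the one quoted above; the factor of ten is the hypothetical "order of magnitude" improvement, not a measured number):

    # Back-of-the-envelope: effect of a hypothetical 10x safety improvement
    deaths_per_year = 35_000   # annual US road fatalities (figure cited above)
    improvement = 10           # "an order of magnitude" safer (assumed)

    deaths_after = deaths_per_year / improvement  # ~3,500
    lives_saved = deaths_per_year - deaths_after  # ~31,500

    print(f"Remaining deaths: {deaths_after:,.0f}")  # Remaining deaths: 3,500
    print(f"Lives saved:      {lives_saved:,.0f}")   # Lives saved:      31,500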

Also, people aren't ever going to be happy with a bug in hardware or software killing someone, but we are currently 'happy' with allowing tens of thousands of people to die in car accidents. If the motorcar had been invented in 2000, many people would have wanted to ban it immediately.

"You want to operate a 2500KG metal box at 40mph in proximity to people!? oh hell no!"


There is no reasonable argument for preferring that more people should die as long as the agents of their deaths are the kinds of biological organisms we're used to. What we happen to be already accustomed to has no relevance in determining what we ought to do in the future, except in trivial cases where the different alternatives don't lead to widely distinct numbers of casualties.



