
You seem to be implying that in the short run it's ok for them to kill some extra people. One, I don't think that's necessary; Waymo is a good example of how a safer approach is also apparently no less effective. Two, you're presuming that we will get to self-driving cars that are economically viable and safer than current human-driven ones, something that is not a given. And three, it's not clear to me who gets to decide exactly how many unwilling people should be sacrificed on the altar of technological progress, but I hope it's not us and it sure shouldn't be Musk.


https://www.youtube.com/watch?v=zdKCQKBvH-A

Here's a video of a Waymo car in Phoenix. Around minute 17 it gets confused by a cone and stops in the middle of a two-lane road, straddling both lanes.

Then Google sends support, but just as they arrive, the car drives away from them.

They were lucky it didn't lead to an accident. And as long as they keep the fleet to 600 cars, then yeah, the number of accidents will be much lower than with Tesla's Autopilot, which has shipped in more than a million cars.

My point is not to rag on Waymo, just to inject some reality.

We don't have a choice between a "safe" and an "unsafe" way of developing self-driving software.

We have a choice between "test software we know can't handle all situations on real roads and make it better based on that testing" or "we'll never have self-driving software".

Except the other option will more likely be: "The U.S. doesn't allow testing of self-driving software on real roads, so a Chinese company develops it and captures a trillion-dollar market in the U.S."


The Waymo car in that video drove extremely safely given that it was confused. It was conservative: it thought it might have seen an obstacle, so it stopped. Did this inconvenience other drivers? Absolutely. But it was not a major safety risk. In fact, slowly coming to a stop is the legal and correct thing for a confused or impaired human driver to do. In comparison, Teslas seem to rapidly and suddenly brake for no explainable reason while traveling at high speed, and they do so routinely. Further, Teslas have other safety issues that are indicative of sloppy design (such as the electronically actuated passenger doors that will trap back-seat occupants in the car if the electrical system fails). This is a failure of Tesla specifically, and regulating to stop it wouldn't really slow down others like Waymo.


> ”Teslas seem to rapidly and suddenly brake for no explainable reason”

The reason is widely known. Phantom braking is caused by rogue radar reflections that confuse the car into thinking there's an obstruction in its path, triggering automatic emergency braking (AEB).

The real question is, why does it happen more often with Teslas than with other cars equipped with radar AEB? Maybe Tesla’s is just more sensitive.
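
To make the sensitivity trade-off concrete, here's a toy sketch in Python. Every probability, threshold, and the frame-agreement filter is invented for illustration; this is not Tesla's (or anyone's) actual AEB logic. The point is just that requiring several consecutive high-confidence radar frames before braking cuts phantom events dramatically, at the cost of a little reaction time:

    import random

    # Toy model of radar AEB false positives. All numbers are made up.
    GHOST_PROBABILITY = 0.001  # chance a frame returns a spurious "ghost" echo
    BRAKE_THRESHOLD = 0.7      # obstacle-confidence score needed to brake

    def radar_confidence() -> float:
        """Fake confidence score for one frame with no real obstacle.
        Usually low, but a rare ghost reflection (e.g. off a bridge or
        overhead sign) looks just like a genuine obstruction."""
        if random.random() < GHOST_PROBABILITY:
            return random.uniform(0.75, 0.95)  # ghost: clears the threshold
        return random.uniform(0.0, 0.3)

    def phantom_brake(frames: int) -> bool:
        """Brake only if `frames` consecutive readings clear the threshold.
        Demanding agreement across frames filters out isolated ghosts."""
        return all(radar_confidence() > BRAKE_THRESHOLD for _ in range(frames))

    trials = 1_000_000
    print(sum(phantom_brake(1) for _ in range(trials)))  # ~1000 phantom events
    print(sum(phantom_brake(3) for _ in range(trials)))  # ~0: needs 3 ghosts in a row

A single-frame trigger phantom-brakes on roughly 0.1% of frames here, while the three-frame filter needs three ghosts in a row (about a one-in-a-billion event). A more "sensitive" tuning, in this framing, is just a lower threshold or fewer required frames.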


Heh, I guess their fix for that was to just get rid of radar altogether.


Exactly. There's a big difference between approaching this problem with a "first do no harm" perspective and a "move fast and kill a few people" perspective.

And this part from the previous poster strikes me as a big problem: "They were lucky it didn't lead to an accident. And as long as they keep the fleet to 600 cars, then yeah, the number of accidents will be much lower than with Tesla's Autopilot, which has shipped in more than a million cars."

That seems like an excellent reason to keep the number of active cars very, very small, rather than, as stated, an excuse for shrugging at a fleet that, at the same per-car rate, would produce at least 1,667 times as many deaths (1,000,000 cars / 600 cars ≈ 1,667).


100% agree! And I don’t think the approach needs to be “first, do no harm”. I would be very happy with “move at a normal pace and do your best not to kill anyone.”

But “move fast and kill people” is ludicrous and it’s exactly what Tesla is doing.


> We have a choice between "test software we know can't handle all situations on real roads and make it better based on that testing" or "we'll never have self-driving software".

This is like saying that we'll never have a cure for cancer if we can't experiment on the public without their consent.

The bar for medication, at least, is proving safety first, before testing on large numbers of people and allowing the public to buy it.


> The bar for medication, at least, is proving safety first, before testing on large numbers of people and allowing the public to buy it.

Except for vaccines, apparently...


> We don't have a choice between a "safe" and an "unsafe" way of developing self-driving software.

It's also entirely possible Waymo hasn't achieved peak perfection in its software development practices despite doing better than Tesla, and that another entity could do it more safely.


How many unwilling people are sacrificed because we won't ban alcohol and all impairing drugs without exception and then enforce those bans with immediate, Judge Dredd-style summary execution?

I bet you that'd reduce traffic fatalities dramatically too.

How far do you want to go to 'save lives'?

I guarantee you with 100% certainty I can design a society that will 'save lives' at every turn for every single activity, and I can guarantee you with 100% certainty you wouldn't want to live in it.


You seem to ignore that we've already tried banning alcohol and currently ban many drugs. We scrapped banning alcohol for a reason.


You are arguing a fake hypothetical. Tesla's accident data shows that its Autopilot feature reduces accidents.

https://www.tesla.com/VehicleSafetyReport

Fortunately, our regulators don't take a one-size-fits-all approach to how new technologies can be developed.


It doesn't, because it compares overall accident rates against self-selected "good driving conditions" as defined by the software: only highways, only good enough weather, only cars in a good enough state of maintenance.
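
The numbers below are invented, but they show the mechanism: if Autopilot only engages on easy highway miles, comparing its crash rate to the all-conditions human average can make it look safer even when it is worse than humans on those same miles.

    # Selection bias in crash-rate comparisons. All rates are invented
    # for illustration; none of this is real data.
    # Crashes per million miles, split by conditions:
    human_highway = 1.0      # humans on easy highway miles
    human_city = 5.0         # humans on hard city miles
    autopilot_highway = 2.0  # autopilot, highway only: worse than humans there

    # Suppose half of all human miles are highway and half are city:
    human_overall = 0.5 * human_highway + 0.5 * human_city  # = 3.0

    # The naive comparison pits autopilot's highway-only rate against the
    # human all-conditions rate and declares autopilot "safer":
    print(autopilot_highway, "<", human_overall)  # 2.0 < 3.0
    # ...even though on the same highway miles autopilot (2.0) crashes
    # twice as often as humans (1.0).

An apples-to-apples comparison would condition both rates on the same road type, weather, and vehicle condition.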


Your notion is that everything is fine because, according to Tesla, a company with a leader known for telling whoppers, they are killing fewer people on net?

Even if we trust them on that stat, which I certainly don't, that still doesn't mean they aren't killing people unnecessarily.



