It's worth noting that we can make assumptions about how a human would have handled the situation, because BlueCruise is not full self-driving. There was a human at the wheel.

So the answer in both cases is definitely "There was a human operating the vehicle, they saw the vehicle running headlong into stopped traffic, and they failed to brake in time." They may, perhaps, have trusted the semi-automation too much, but that's no more of an excuse for rear-ending a stopped car than setting regular cruise control to 55 and failing to disengage when you see traffic stopped ahead.

We want the technology to do better than a human, but (unless NHTSA turns up something truly bizarre and awful, like "the driver tried to brake and BlueCruise overrode the command," which should be impossible) the situation in both of these cases is that people died because the car was being driven by a human and the human made a bad call.




This is an important take for this system. BlueCruise does a lot to check for attentiveness with its gaze tracking, so the driver would have been looking at the road at least a few seconds before the accident. As someone who has done several hundred miles on BlueCruise, I can say it's pretty touchy about whether you're paying attention. Even taking a drink from a cup for a little too long will get it to start alerting you and eventually kick you out of it.
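For anyone curious what that escalation might look like in code, here is a minimal, purely hypothetical sketch of a gaze-based attention monitor with escalating alerts. The timing thresholds, class, and method names are all assumptions made up for illustration; this is not how Ford's BlueCruise actually implements it.

    # Hypothetical sketch of a gaze-based attention monitor with escalating
    # alerts, loosely in the spirit of the behavior described above. This is
    # NOT Ford's BlueCruise code; thresholds and names are illustrative only.
    import time

    ALERT_AFTER_S = 3.0      # assumed: start alerting after ~3 s eyes off road
    DISENGAGE_AFTER_S = 8.0  # assumed: hand control back after ~8 s

    class AttentionMonitor:
        def __init__(self):
            self._eyes_off_since = None  # timestamp when gaze left the road

        def update(self, eyes_on_road: bool, now: float) -> str:
            """Return 'ok', 'alert', or 'disengage' based on look-away time."""
            if eyes_on_road:
                self._eyes_off_since = None
                return "ok"
            if self._eyes_off_since is None:
                self._eyes_off_since = now
            elapsed = now - self._eyes_off_since
            if elapsed >= DISENGAGE_AFTER_S:
                return "disengage"
            if elapsed >= ALERT_AFTER_S:
                return "alert"
            return "ok"

    # Simulate a driver whose eyes stay off the road (say, a long drink).
    monitor = AttentionMonitor()
    start = time.monotonic()
    for tick in range(10):
        print(f"t={tick}s, eyes off road -> {monitor.update(False, start + tick)}")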



