
How do you teach a car that a snowman won’t walk across the road? - pseudolus
https://aeon.co/ideas/how-do-you-teach-a-car-that-a-snowman-wont-walk-across-the-road
======
Shutaru
If you haven't tried to solve the self-driving car problem in a practical
environment, there's a powerful tendency to get caught in edge cases. The vast
majority of situations encountered by an autonomous vehicle involve complex
but conventional problems, and yield relatively well to rules-based coding.

If someone builds a snowman by the side of the road, my AV may mistake it for
a human and start calculating possible paths it might take. But then the same
thing can happen with utility poles, mailboxes, or piles of garbage. We find
that if the potential human remains stationary, moving around it cautiously is
usually a sufficient solution.
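The stationary-object logic described above can be sketched roughly as follows. This is a hypothetical illustration, not Cruise's actual code; the `Track` type, labels, and thresholds are all assumptions:

```python
from dataclasses import dataclass

# Illustrative thresholds (assumptions, not real tuning values)
STATIONARY_SECONDS = 3.0   # how long an object must sit still
STATIONARY_SPEED = 0.1     # m/s; below this we call it "not moving"

@dataclass
class Track:
    label: str                 # e.g. "pedestrian", "pole", "unknown"
    speed: float               # current estimated speed, m/s
    seconds_stationary: float  # how long it has been effectively still

def plan_for(track: Track) -> str:
    """Decide how to treat a potential human near the roadway."""
    if track.label == "pedestrian" and track.speed > STATIONARY_SPEED:
        # A moving human: predict possible paths and yield.
        return "predict_paths_and_yield"
    if track.seconds_stationary >= STATIONARY_SECONDS:
        # Snowman, pole, mailbox, or a very patient person:
        # treat as a static obstacle and pass with extra margin.
        return "cautious_pass"
    # Not yet sure it is stationary: slow down and keep watching.
    return "slow_and_observe"
```

The point of the sketch is that once something has stayed still long enough, its label matters less than its behavior: a cautious pass handles the snowman and the mailbox with the same rule.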

On the other hand, if someone builds a snowman in the middle of the street,
the initial response of an AV and a human driver will be the same: stop and
analyze the situation. No responsible human is going to drive through a
snowman, so that's not an option for the AV either. Since the snowman will
always remain stationary, however, the AV is likely to treat it like other
road-obstructing obstacles (such as double-parked vehicles, another
significant but more common problem) and eventually move around it.
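That stop-observe-reroute behavior amounts to a small state machine. Again, a hypothetical sketch under stated assumptions (the state names and timeout are mine, not anything from a real AV stack):

```python
# Illustrative timeout: how long to observe before concluding the
# obstruction is static (an assumption, not a real parameter).
OBSERVE_TIMEOUT = 5.0  # seconds

def next_state(state: str, seconds_stopped: float,
               obstacle_moved: bool) -> str:
    """Tiny state machine for an obstruction blocking the lane."""
    if state == "driving":
        return "stopped"              # never drive through it
    if state == "stopped":
        if obstacle_moved:
            return "driving"          # it was a person after all
        if seconds_stopped >= OBSERVE_TIMEOUT:
            # Static obstruction: treat it like a double-parked
            # vehicle and plan a path around it.
            return "route_around"
        return "stopped"              # keep observing
    return state
```

A human driver runs essentially the same loop: stop, watch for a few seconds, and if the thing in the lane clearly isn't going anywhere, go around it.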

Don't get me wrong -- it would be great if we could imbue an AV with "common
sense." But I would argue that you can solve at least 99% of the problem
without it.

~~~
Shutaru
Please note, I currently write self-driving car code for Cruise Automation in
San Francisco.

