
Tesla approach to self-driving is harder but might be only way to scale - hunglee2
https://electrek.co/2020/06/18/tesla-approach-self-driving-harder-only-way-to-scale/
======
imheretolearn
> Waymo and many others in the industry use high-definition maps. You have to
> first drive some car that pre-maps the environment, you have to have lidar
> with centimeter-level accuracy, and you are on rails. You know exactly how
> you are going to turn in an intersection, you know exactly which traffic
> lights are relevant to you, you know where they are positioned and everything. We
> do not make these assumptions. For us, every single intersection we come up
> to, we see it for the first time. Everything has to be solved — just like what
> a human would do in the same situation.

This seems like the right approach to me given the accuracy, or lack thereof,
of LIDAR. However, I do not like Tesla's approach of testing Autopilot in
production, i.e., deploying it in real-world environments and expecting that
_humans_ will use the technology responsibly. It is analogous to fixing a
software design flaw by "convention," where you expect your clients to
understand the convention when they use your code, as opposed to fixing the
design flaw in the software itself. Unlike a software crash on a desktop, a
bug in (or irresponsible use of) Autopilot is literally the difference
between life and death.
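To make the "fix by convention" analogy concrete, here is a minimal sketch
(hypothetical function names, nothing to do with Tesla's actual software): the
first version relies on callers honoring a documented convention, while the
second enforces the constraint in the design itself.

```python
def engage_autopilot_v1(driver_attentive):
    # Fix "by convention": the documentation says callers must only engage
    # when the driver is attentive, but nothing in the code enforces it.
    return "engaged"

def engage_autopilot_v2(driver_attentive):
    # Fix in the design itself: the API refuses unsafe use outright,
    # so a careless caller cannot quietly violate the contract.
    if not driver_attentive:
        raise ValueError("refusing to engage: driver not attentive")
    return "engaged"
```

With v1, misuse succeeds silently; with v2, the same misuse fails loudly at
the call site, which is the point of fixing the flaw in the software rather
than in the documentation.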

> Tesla has the giant advantage of collecting a ton of data from hundreds of
> thousands of cars, but he describes finding the right data as finding a
> needle in a haystack.

> When Tesla owners drive around and see a stop sign blocked by foliage, they
> make a voice command that sends the data to Tesla and you gather points

As long as the data is anonymized, this doesn't seem like an issue in
principle. That said, given how many of Tesla's vehicles have been hacked
into, I wouldn't feel comfortable handing over that much data to Tesla or
_any_ other company.

