
Human eyes are unlikely to be the only point in parameter-space that's sufficient for driving. Cameras can do IR, 360° coverage, higher frame rates, wider stereo separation... but of course nothing says Teslas sit at a good point in that space.




Yes, agreed, but that's a different point - I was reacting to this specifically:

> Humans use only cameras.

Which in this or similar forms is sometimes used to argue that L4/5 Teslas are just a software update away.


Ah yeah, that's making even more assumptions. Not only does it assume the cameras are powerful enough, but also that there's already enough compute. There's a sensing-power/compute/latency tradeoff: you can get away with poorer sensors if you have more compute that can filter/reconstruct useful information from crappy inputs.
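A toy sketch of that tradeoff (purely illustrative, nothing to do with any real perception stack): a deliberately noisy "cheap" sensor, where spending more compute and latency on averaging repeated readings recovers a cleaner estimate. All numbers here are made up for the example.

```python
# Illustrative only: trading compute/latency for sensor quality.
# A noisy sensor is sampled n times and averaged; more samples
# (more compute, more latency) shrink the error roughly as 1/sqrt(n).
import random

random.seed(0)
TRUE_DISTANCE = 10.0   # hypothetical ground truth, in meters
NOISE_STD = 2.0        # a deliberately poor sensor

def noisy_reading():
    return random.gauss(TRUE_DISTANCE, NOISE_STD)

def estimate(n_samples):
    """Average n readings from the noisy sensor."""
    return sum(noisy_reading() for _ in range(n_samples)) / n_samples

for n in (1, 16, 256):
    err = abs(estimate(n) - TRUE_DISTANCE)
    print(f"{n:4d} samples -> error {err:.3f} m")
```

The same budget could instead buy a better sensor and a single reading; where the optimum sits depends on the relative cost of sensors, compute, and the latency the task can tolerate.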


