
I understand what you mean, though drivers already take personal risks on behalf of corporations.

If you are an Uber driver and enable FSD on your Tesla, it's neither Uber nor Tesla that will get in trouble if you have an accident.

For now, the situation with Waymo can work like a driving lesson: the remote driver is the driving instructor, and the Waymo AI is the student doing the driving while being monitored.

The driver is responsible, and can then potentially sue Waymo (or have an agreement with them) if it caused damages beyond their own responsibility (for example, if the driver hits the brakes and Waymo overrides their manual commands).

Once this setup proves safe, drivers will get more comfortable babysitting more than one car at a time, essentially solving the scale issue you mention.
