As long as they keep upgrading the hardware for free until they can deliver full self-driving to the owners who chose that option, it shouldn't matter, should it?
Tesla said in October 2016 that cars with the current "HW2" generation hardware would be capable of FSD, and that they would be upgraded to that in mid-2017.
If they don't have FSD by mid-2019, that's two years off that target. Deadlines slip, companies overpromise, this is a hard problem, all that jazz, but by early 2020, people who leased Teslas that they were told would have FSD are going to be deciding what their next move is. If part of the calculation is "FSD still ain't here yet, but Tesla is promising a free upgrade to HW4 which they say will really lay the foundation for FSD this time Elon pinky-swears," I would submit that it's gonna matter.
Tesla will lose its last strong differentiation from the competition, and Kona follow-ups will eat Model 3 margins. Tesla will retreat to where it is profitable (sports cars) and will likely want to double down on electric trucks, probably with a specialized highway autopilot.
I’m not worried about people signing up to beta test dangerous tech on themselves, I worry about bystanders who made no such choice. That’s what really needs to be dealt with, and if people want to take facetious arguments about the overall lethality of existing tech to the public, let them, but I hope they brought pitchfork and torch repellent.
Does it matter for the people that die in the meantime because someone didn't read the instructions, the product description, or the in-car warnings?
When you select full self-driving capability in the Tesla configurator, it states explicitly and prominently that this isn't enabled (or ready!) yet, and is subject to legal approval.
Anyone who buys a Tesla, thinks it is fully autonomous, and dies as a result is failing natural selection. Third parties (pedestrians and other drivers), on the other hand, are victims, but have there been any examples of this with Tesla cars? My not-full-self-driving AP2 Model X can detect pedestrians, and will also emergency stop if it detects an obstacle. This is aside from the fact that I, as the driver, should be emergency stopping if I see a forthcoming collision with a pedestrian!
Think of Autopilot like a horse. You still need a driver, but at least you've got another set of eyes on the road. Sometimes it will notice things you don't, and sometimes it will get spooked by things it shouldn't, but two sets of eyes and two brains are better than one.
One of two things could happen. Either Tesla ships full self-driving (FSD), and it had better deliver when it does (i.e. it has to work reliably).
Or it puts off its customers again, promising new hardware 4, 5, 6, ... that will be capable of FSD, in which case people had better read the instructions (and pay attention at all times, lest they end up in a divider or under a trailer).
I bought an AP1 Tesla just before they started shipping AP2. I kept my AP1 because I doubted that they'd really be able to ship FSD anytime soon. Shortly after, a friend of mine bought an AP2 Tesla. For the next year, my AP1 performed much better at the features either one provided -- adaptive cruise control and lane-keeping -- than his AP2. And now we're at the point where it's "just kidding, you really need AP3." Still with no end (FSD) in sight. Two years down the road, and no regrets keeping my AP1.
If it isn't, then I would hope Tesla makes it right somehow (providing HW4 for free, or something similar), or they'll open themselves up to liability for false advertising.
Why should we believe Tesla that HW3 is actually capable of FSD this time?