
Tesla has a self-driving strategy other companies abandoned years ago
https://arstechnica.com/cars/2019/03/teslas-self-driving-strategy-is-outdated-and-possibly-dangerous/
======
Animats
Waymo keeps plugging away. Each year, for the last few years, the number of
miles between disconnects they report to the CA DMV has doubled. Three more
doublings, and that number will be bigger than average miles between accidents
for human drivers. Then they can ship a product.

Nobody else is even close.

This problem is slowly being solved by normal engineering practices. The
"fake it til you make it" players are being left behind. Uber has been shown
to be totally incompetent. Tesla really just has a good lane follower and a
mediocre car follower. Apple has been trying to hand-wave by talking about
"significant disconnects" vs. all the disconnects the DMV requires them to
report.

The LIDAR industry is struggling, but there is progress. Quanergy seems to
have been mostly hype.[1] Continental, the big European auto parts maker,
bought Advanced Scientific Concepts, which makes and sells a good but
expensive flash LIDAR used in DoD and space applications. They packaged it up
for automotive use, and are waiting for the self driving industry to catch up.
That technology uses exotic indium-gallium-arsenide sensor ICs, which are
expensive in small quantities but would probably be affordable if they could
sell a few million a year.

This looks like a problem that's being solved. Just not fast enough for
startups used to quick payoffs in software.

[1] [https://www.bloomberg.com/news/features/2018-08-13/how-a-billion-dollar-autonomous-vehicle-startup-lost-its-way](https://www.bloomberg.com/news/features/2018-08-13/how-a-billion-dollar-autonomous-vehicle-startup-lost-its-way)

~~~
perl4ever
"Three more doublings, and that number will be bigger than average miles
between accidents for human drivers. Then they can ship a product."

I am skeptical of that, because I expect that even if disconnects are ultra-
rare overall, and the testing is representative of normal usage, there will be
enough people that frequently use the technology in ways far off on the tail
of the distribution that for them, disconnects will be much less rare. And
that might create a PR blowup or worse.

I think what is worrying me may be something called "heteroscedasticity".

[https://en.wikipedia.org/wiki/Heteroscedasticity](https://en.wikipedia.org/wiki/Heteroscedasticity)
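The worry can be put in toy numbers. A minimal sketch (every figure below is invented for illustration; none are real Waymo or Tesla statistics): even if the fleet-wide average looks great, a small subgroup driving mostly in hard conditions sees a far worse rate.

```python
# Toy model of a heteroscedastic fleet: most miles are easy, a few are hard.
# All numbers are made up for illustration.
easy_share, hard_share = 0.95, 0.05   # fraction of fleet miles of each kind
easy_rate = 1 / 500_000               # disconnects per easy mile
hard_rate = 1 / 5_000                 # disconnects per hard mile

# The fleet-wide average rate is a mileage-weighted blend of the two.
overall_rate = easy_share * easy_rate + hard_share * hard_rate

print(round(1 / overall_rate))  # fleet average: ~84,000 miles per disconnect
print(round(1 / hard_rate))     # hard-condition drivers see one every 5,000
```

The headline fleet average looks more than an order of magnitude better than what the tail-of-the-distribution drivers actually experience, which is exactly the PR-blowup scenario described above.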

~~~
est31
For self-driving cars, what would that be? You can misuse a non-self-driving
car by speeding, giving you above-average chances of accidents. You can be
old, giving you above-average chances of being involved in accidents (per km
driven). You can smoke, giving you above-average chances of getting lung
cancer. You can misuse a partially self-driving car by somehow convincing it
that you are following traffic while actually not, and then blaming any
accidents on the self-drive feature.

But fully self-driving cars only have two inputs: starting point and
destination. It doesn't matter whether you are old, sleeping, 13, or a dog.
Unless some roads are more susceptible to accidents than others, or
manufacturers let people fiddle with how much a car should risk accidents in
order to get a little decrease in travel time, I don't see any possible way.
And if manufacturers really do give that option, it's truly their fault.

~~~
paulddraper
> Unless some roads are more susceptible to accidents than others

By chance do you happen to live somewhere without icy roads?

~~~
Phlarp
I live in Minnesota and have my entire life. I hope very much that self
driving cars lead to a more sane response to icy conditions, which is to not
drive. If the vehicles are in a position to enforce that at a technical level
then all the better.

~~~
paulddraper
I agree. I was simply pointing out that "unless some roads are more
susceptible to accidents than others" certainly happens, very frequently.

But I too am hopeful that machines are better than human drivers in all
scenarios.

------
martythemaniak
There's a few things to keep in mind when discussing self-driving tech.

1) Self driving tech doesn't exist today. It simply does not. There's lots of
people working on it and they are using different strategies, but it is not
clear how long it will take, which strategy is technically superior, which
strategy is more economical, etc etc. There's a lot of very strong opinions
floating around (LIDAR is essential! No, vision and lots of data! No, 3D maps
or bust, etc etc). It is important to keep in mind that we don't know the
future, the people busy inventing it don't know how things will play out and
outside observers know even less. The tech industry is littered with strong
opinions which have aged terribly.

2) Musk's first principles aside, Tesla was never going to do LIDAR. They're
in the business of shipping real cars to real people, and there was never any
possibility of mixing that with LIDAR. If they had gone with LIDAR and started
a research program like Uber, they would have started years after Google, with
less data, far fewer resources, and absolutely no technical advantage. It is
far, far more likely that this program would have bankrupted them than beaten
Google. In essence, whether Musk's first-principles reasoning is real or a
sales pitch doesn't matter. Tesla's choice was their current approach, or
nothing at all.

~~~
Someone1234
> They're in the business in shipping real cars to real people and there was
> never any possibility of mixing that with LIDAR.

Can you explore that a little? We have consumer products that contain LIDAR
already (e.g. Neato Vacuums). The sensors no longer cost thousands of dollars,
and could supplement other types of inputs to provide a broader picture of the
road.

~~~
Gladdyu
Yes, but you can't install a LIDAR via a software update. They've been
promising full self-driving mode and getting closer by updating the software;
the owners who already have the current car won't be too thrilled if they get
cut off from the state of the art in Tesla's self-driving tech.

~~~
FireBeyond
... particularly since they were charged up to three or five thousand dollars
more for Elon's "promise" that some day (apparently this year) it will be
available...

------
peeters
I've always found that half the issue is with the name: Autopilot. On an
airplane, autopilot means you can take your focus off of both the physical and
mental act of moving the plane through the air, and if it needs you to
intercede there is pretty much no emergency in which you wouldn't have a few
seconds to react.

In a car, Tesla's Autopilot simply doesn't give you that. If you are reading
and your autopilot exits, you could be dead before your focus can return to
the road. Planes fly in wide open spaces far from hazards. Cars drive in busy,
tight spaces where a fraction of a second lapse in control can be fatal.

I get that Tesla's implementation is analogous to what you find on a jet. But
the environments make them very different. So start by dropping the name. It's
not autopilot, it's hands-free driving.

~~~
jniedrauer
I don't think this accurately considers how the term "autopilot" is actually
used in aviation. Autopilot can mean anything from "hold a gyro estimated
heading but not pitch or speed" to "maintain airspeed and follow the localizer
down a glide slope while cross checking GPS and ILS."

> if it needs you to intercede there is pretty much no emergency in which you
> wouldn't have a few seconds to react.

Traffic avoidance is certainly one. Airspeed dropping while the autopilot
tries to maintain altitude is another. This can result in a stall and complete
loss of control under the wrong circumstances.

Autopilot does _not_ mean "read a book and let the plane fly itself." You are
always cross checking instruments, visually scanning for traffic, running
checklists, rehearsing your next steps, etc.

~~~
danaliv
You're absolutely right, but the issue isn't what an aircraft autopilot
actually is. It's what the word "autopilot" evokes in the minds of the non-
pilot public. To them, autopilot means fully automated with no oversight.
("Planes fly themselves!" and other such nonsense.)

~~~
toomuchtodo
Please don’t force marketing to conform to the lowest common denominator.

People have _some_ responsibility to educate themselves about the products
they’re using.

~~~
threeseed
You are being ridiculous.

The definition of autopilot has been this way for decades. You could literally
ask anyone of any age what it means and everyone would have the same answer.

~~~
erikpukinskis
That's not true.

------
zaroth
I’ve read so many of these articles obviously written by someone who has no
hands-on experience with the Tesla Autopilot and what it actually does for the
driving experience.

Maybe Tesla will get to a point where my TM3 will drive me to work while I
nap, or use my phone. One thing that’s pretty amazing is that if they can get
to that point, it will come as a free automatic software update or an
available for purchase hardware upgrade on my current car.

This month my car will be getting 5% faster acceleration from an OTA update -
how cool is that?

But right now, the reality is that I can engage AutoPilot on the highway and
it immediately and dramatically changes the driving experience. Instead of
focusing on _steering_ the car, I am focused on situational awareness. I am
not just looking down my lane, I am looking at the cars around me, who is
passing who, what might be coming up around that bend, that guy on his phone
next to me, etc.

I can look at other drivers and assess whether they are paying attention and
avoid them if needed.

Because the car is actually lane keeping - not what everyone else calls lane
keeping, which (surprise) is a total lie - but actively steering the car in
the lane around curves and centering it in the lane, probably more precisely
than I would if I were steering manually.

I strongly believe that if AutoPilot never advanced much beyond its current
capability, and the current functionality simply became more widespread, we
will wonder 15 years from now: how did people operate their vehicles without
this level of assistance? How could you be properly aware of your surroundings
if you had to be so preoccupied with minor steering inputs?

If you use your phone and watch a movie or put on makeup while driving, you
are breaking the law and endangering yourself and those around you.

Every time a new technology comes to cars people say it will distract, or
hypnotize, or lull drivers into distraction (see wipers, radios, automatic
transmissions, original cruise control) and anyone who has actually driven
with AutoPilot knows this feature is no different.

As a Tesla owner I enjoy and appreciate Tesla’s real-world approach to self-
driving and it makes my life better and my drive safer. Thank you Tesla.

~~~
ypzhang2
The issue is that Autopilot is more of a feature geared towards convenience.

You are doing all of the driver-attention work, but someone who activates
Autopilot isn't required to. I think a lot of the argument against Tesla is
that Autopilot isn't doing enough in that arena.

GM's Super Cruise is a lot more full-featured with regard to driver
attentiveness. It might not be as useful in terms of being usable everywhere,
but it's definitely more well-rounded in terms of forcing driver
attentiveness.

~~~
zaroth
Anyone who drives a car is technically, legally, and ethically responsible for
being capable of operating the machine and for doing so in a responsible
manner.

Tens of thousands of people a year fail to do exactly that. Tens of thousands
of fatalities every year, not to mention maimings and injuries, countless
billions of dollars in property damages and lost wages, because humans
generally fail in their legal, and ethical responsibility to operate their
motor vehicles responsibly and in conditions they could handle.

There have been a few cases where Tesla drivers have made the same failing.
Sometimes the AutoPilot feature was engaged at the time. Sometimes they were
just pushing the vehicle too hard and ended up on the wrong side of physics.
The Tesla is stupidly fun to drive and sometimes I drive more spiritedly than
I should. Being too fun to drive could get people hurt or killed too (and it
has).

I strongly believe a responsible driver is safer with an AutoPilot feature
than without one, mostly from my own personal experience, as the data point I
used to cite is somewhat controversial.

Just like the amazing cornering and torque of the Tesla can be abused and even
lead to fatalities when pushed too far, so can the AutoPilot.

I personally think it’s a mistake to make the feature significantly less
useful to a responsible driver to try to possibly prevent these edge cases. If
it were possible to make the attentiveness features entirely non-intrusive,
then absolutely they should be added. But in reality the attentiveness
features are already intrusive to a responsible driver and detract from the
experience.

Interestingly, Volvo seems to be going in a different direction. They believe
that as the manufacturer it's their responsibility to create a product that
keeps safe even a human attempting to operate it irresponsibly or illegally.
They're adding hard speed limits well below the functional limit of the
hardware, and contemplating systems like breathalyzers and fatigue detection
which would entirely disable the vehicle.

I personally don’t want to live in a world where every product I use is sizing
me up and deciding how I should use it, whether it’s a chef’s knife, jet ski,
automobile, or semi-automatic.

In the meantime what I love most about AutoPilot is how it can only possibly
get better over time, and every single car in the fleet is benefiting from
that. That’s as long as the regulators don’t fuck it up.

------
devy
The top editor-promoted comment[1] on the article sums it up very well.

    
    
      The attention problem is well known in engineering. 
      It is very hard to get a human to concentrate on something 
      that will turn up good more than 99% of the time, 
      even when there's serious or fatal consequences of failure. 
      Trains are the classic example - tracks are almost always clear, 
      signals are almost always correct which means you have to
      devise all sorts of systems to keep the driver alert.
    

[1]: [https://arstechnica.com/cars/2019/03/teslas-self-driving-strategy-is-outdated-and-possibly-dangerous/?comments=1&post=36970595#comment-36970595](https://arstechnica.com/cars/2019/03/teslas-self-driving-strategy-is-outdated-and-possibly-dangerous/?comments=1&post=36970595#comment-36970595)

~~~
justapassenger
Sadly, Silicon Valley companies don't think that best practices of engineering
apply to them. It's all about disruption, both with products and safety.

~~~
WrtCdEvrydy
"We're going to apply well known engineering to an existing problem" doesn't
sell VCs as much as "We're going to disrupt things and launch satellites
illegally"

------
CPLX
It's funny how much people have danced around all this, including this
article, but what Musk has done with his statements on self-driving
capabilities is called _lying_.

~~~
xeromal
If someone has a delusional idea, is it still a lie? I don't think Musk 'lies'
to earn more money. I think he is delusional about how long certain things
take to do, and at worst is trying to keep his companies afloat so he can
continue with his ideas. I'd say his sins are less than those of the CEO of
Enron, for instance. It's hard to blame the guy when he successfully built a
rocket company.

~~~
fastball
You could almost call him the successful / less pathological Elizabeth Holmes.

He promises the future before it has arrived, but unlike Theranos, he actually
delivers a product. That product might be inferior to what was promised, but
it is an improvement on the status quo nonetheless. And once released, Musk
does not lie about its capabilities.

~~~
SheinhardtWigCo
This is on the order page for the Model 3 right now:

> Summon: your parked car will come find you anywhere in a parking lot.
> Really.

Except that feature hasn’t shipped and there’s no ETA. I call that a lie. I
appreciate what Tesla is doing to the industry but I wish they wouldn’t be so
dishonest in their marketing.

~~~
Negitivefrags
Actually that particular feature is currently in beta testing and the ETA is a
few weeks.

It has a limit of 150 feet and I’m sure in practice it will just annoy other
drivers unless the parking lot is fairly empty, but there you are.

[https://electrek.co/2019/03/01/tesla-enhanced-summon-self-driving-parking-lots/](https://electrek.co/2019/03/01/tesla-enhanced-summon-self-driving-parking-lots/)

------
kevin_thibedeau
> "We already have full self-driving capability on highways," Musk said during
> a January earnings call.

Except for obstacles that coincide with whitelisted locations their sensors
can't handle. It's only a matter of time before someone dies because of that
hack.

------
cbames89
This is a (perhaps short-term) failure of first principles reasoning. Elon's
known to favor thinking about things from first principles, and there's no
theoretical reason that vision can't work. However, technical limitations
might take a while to overcome.

~~~
ForHackernews
Vision alone fails pretty frequently even with a human brain and 700 million
years of evolution behind it. Count me as a skeptic that cameras + computer
will produce a safe self-driving system.

~~~
munificent
_> 700 million years of evolution behind it_

Sure, but only about 0.000014% of that time occurred while cars existed. It's
not like we evolved to control big metal objects hurtling down artificially
made flat surfaces.

~~~
mikeash
On the other hand, while our brains did not evolve to control cars, our cars
were designed to be controlled by our brains. A system designed from scratch
for autonomous vehicles would look radically different.

------
haberman
I got a Tesla last year. I love it. But it's been painfully obvious to me that
Autopilot is nowhere close to something that will let you take a nap.

Often when I'm stopped at a light, cars that are standing completely still
will appear to be constantly moving forwards and backwards on the display. My
best theory so far is that the Tesla's spatial model is getting thrown off by
the other car's turn signal. This does not inspire confidence.

In general, I find the radar-enhanced cruise control very reliable (so nice in
bad traffic), but autosteer is flaky at best.

~~~
FireBeyond
> In general, I find the radar-enhanced cruise control very reliable (so nice
> in bad traffic)

Which is great, but hardly unique to Tesla. Every major manufacturer out there
offers adaptive cruise control. My car even recognizes the difference between
stop-and-go rush-hour traffic and "queue" mode (exiting parking lots after
events, etc.).

------
tbabb
IMO Tesla does not have the hardware on the Model 3 to do full self-driving.

I believe that stereopsis (multiple cameras using parallax to solve for per-
pixel depth) is necessary to get a practical, well-functioning self driving
system working. LIDAR is just too expensive and not good enough, but
stereopsis is extremely flexible and can have extremely high angular
resolution.

Combine naive stereopsis with temporally-coherent sensor fusion (e.g. a well-
designed Kalman filter), and I think you could have very robust ranging.
Humans are already very good at this with two narrowly spaced eyes (stereopsis
to 1/4 mile range is not unreasonable for a person)-- but a car is not limited
to a 1.5-inch stereo baseline; it could have stereo cameras on opposite
sides of the windshield. That would _hugely_ increase the depth sensitivity,
even at moderate resolution-- parallax can be detected well below the Nyquist
limit (since Nyquist cuts off frequency, but does not destroy phase).
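The baseline argument can be made concrete with the pinhole stereo model, where disparity in pixels is focal length (in pixels) times baseline over depth. The focal length below is an assumed value for illustration, not a real Tesla camera spec; the baselines come from the figures in the comment.

```python
# Pinhole stereo model: disparity (pixels) = focal_px * baseline / depth.
# All numbers are illustrative assumptions, not measured camera specs.
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    return focal_px * baseline_m / depth_m

focal_px = 1400            # assumed focal length for a ~2 MP automotive camera
eye_baseline = 0.038       # the ~1.5 in human baseline mentioned above
windshield_baseline = 1.2  # cameras at opposite ends of a windshield

# Disparity at 400 m (~1/4 mile), the human stereopsis range cited above:
human = disparity_px(focal_px, eye_baseline, 400.0)
car = disparity_px(focal_px, windshield_baseline, 400.0)
print(human, car)  # the wide baseline yields ~32x the disparity signal
```

The improvement is linear in the baseline, which is why moving from eye spacing to windshield width buys so much depth resolution at long range.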

Tesla is totally failing at even the basic level of environment awareness
(cf. cars which have been driving into exit dividers), which is what I
consider to be the easy part of self-driving (the hard part is getting the
machine to participate in a nonverbal social environment, which is what the
roadway is). Rumor has it that Teslas can't detect obstacles far enough ahead
to avoid them at more than 30mph-- absolutely abysmal, if true.

If it were me, I would put cameras on the Tesla windshield in this pattern:

    
    
        xXoO      OoXx
    

Where x and o are long-range and wide-field cameras, and caps and lowercase
are dynamically exposure-adjusted to capture both brights and darks. Each pair
has the largest possible parallax baseline. And I would do a big, fat sensor
fusion on all eight of them to get a high-res, depth-augmented HDR map of the
surroundings. No fancy, expensive sensors. Just a large number of cheap
cameras and sophisticated software. Maybe do some tricks with cycling
'attention' or multi-res hierarchies to keep the computation load down and
realtime.

Tesla has only three cameras with different FOVs and a very narrow baseline--
I doubt they are doing stereopsis. And I don't think they can do the job
without it.

------
jaimex2
I'm confused.

The article seems to extrapolate that Tesla has failed in autonomous driving
because it removed some lines from its Autopilot page. It's still very much in
progress and progress is routinely confirmed by the company. Musk is known for
blowing timelines but they do get delivered.

When it talks about an "old approach" I'm again confused. No one else
crowdsources driving data from a real-world fleet. They have a unique
unsupervised-learning data source from shadowing drivers.

As for attentiveness: I don't see how drivers not paying attention is any
different from mobile phone use. Currently Autopilot is very clear on telling
you at every chance that it's an assistant. Saying it will kill people is like
blaming phones for drivers texting while driving and getting killed. These
people die because they are breaking the law and not in control of their
vehicle.

~~~
g6nhe9twPd66
> Currently Autopilot is very clear on telling you at every chance that it's
> an assistant.

Yep, which is why this article is just clickbait for Tesla haters. Every Tesla
owner is fully aware that it's not self-driving in the sense that you can take
a nap.

~~~
jaimex2
Arse Technica indeed.

------
wurst_case
I wonder, though, if this had more to do with how much they are actually
investing in AI vs electric car tech. Surely Tesla is an EV company first and
an AI company second. I know they like to brag about their autopilot, but it
seems to me that having a lower-cost car is more important than having an
expensive lidar system on board.

~~~
Skunkleton
Most car companies make lots of money on add-on features. The base model
might not be super profitable, but the base model with the self-driving
package probably makes some good money. Or it would, if they didn't have to
keep doing recalls to upgrade the hardware.

------
Shivetya
TM3 owner; the car has EAP. I have the option to buy FSD for two thousand but
haven't jumped. Not only do I not believe in it, I don't have a need for it.
It might be useful down the road for resale, though.

That being said, two thoughts.

First, if they want me to buy it, then demo it in passive mode in my car. That
is, there is space where the current speed limit is shown. Use that space, and
below it, to show what signs and signals it has seen recently, in order of
importance. Currently it does not see speed limit signs, and if that is wholly
FSD territory then Tesla is overcharging compared to other systems.

Second, even at a standstill, cars around me jump, and I am not sure it sees
stationary cars when I am driving. The best example I have is a two-lane road
through a subdivision I usually take; the outer lane is a long turning lane,
but people tend to stop there to let kids out for the water park. I cannot
recall my car showing a car when someone is loading/unloading kids, but it
does see cars moving in that lane when I overtake them. So is it just going by
too fast for the stopped car in the lane over to register? It knows it's a
lane. I am not sure, but I will wait to see how it develops.

------
abbbacccus
I am still trying to figure out what started and sustained the enormous self-
driving car hype of the last few years. I understand why people who don't know
much about technology would buy into it, but I don't understand why so many
relatively tech-savvy people have bought into it and why huge amounts of money
have been invested into it. It should be obvious - and should have been
obvious when the hype started, as well - that to create fully self-driving
cars that can operate as well as a competent human driver in a full spectrum
of real road situations would require solving extremely difficult technical
problems that are nowhere near solved and that cannot necessarily be solved
any time soon simply by throwing money at them. So what explains the hype and
the investment money? Out of the people responsible for the hype, what
fraction were/are merely deluded and what fraction were/are lying?

~~~
justtopost
We desperately want it to be true, so we ignore the truth long enough to try
earnestly to make it so. It's the brave and ignorant charge of the new
generation. Which raises the question: is it enough to mean well and act in
earnest, or is ethics reserved for those who have the luxury of self-
reflection?

------
syntaxing
Not having LiDAR for anything over level 3 self-driving capabilities seems
like a very bad idea...Computer vision right now just does not have the
spatial awareness that you need for self driving capabilities. I wish Tesla
would work on something similar to the Kinect v3 but for long range (above 20
m).

~~~
leesec
Having LiDAR in any car for self driving seems like a prohibitively expensive
idea though. For instance: See how no company has released a commercial
product with LiDAR.

And I definitely think Computer Vision will get better before LiDAR gets cost
competitive.

~~~
Robotbeat
I agree.

However, suppose the opposite: Suppose LiDAR gets cheap first. Like, I don't
know, $1000 per unit and compact enough to not be a big burden on the rest of
the vehicle.

Tesla charges $5000 for the FSD add-on. In principle, they could easily afford
to retrofit vehicles with the cheap LiDAR plus maintain any advantages the
full computer vision system would add (such as identifying vehicle types to
assist in predictive behavior modeling and avoidance strategies).

So trying to go the full Computer Vision (with ancillary radar) route actually
has pretty low opportunity cost for Tesla.

Another thing: I wonder how much simply much-improved geolocation could help?
That failure from last year of driving right into a (failed) collision
attenuator (where an off-ramp split off) could've been avoided simply by
having high-confidence, half-meter-or-better geolocation (from GPS, cell, IMU,
etc fusion) and good mapping. And with good mapping/geolocation combined with
other sensors, the sensors could focus on identifying changes to the expected
road condition, perhaps increasing their robustness. (and when networked, they
could use sensing from other vehicles ahead of the current vehicle to assist
in navigation as well)

~~~
leesec
AFAIK all the big companies working on this problem include high-definition
mapping as a major component. The problem is gathering that data without
having an actual LiDAR. So some companies like Waymo will literally just have
cars with LiDARs drive all around every street and map it for them. This is
great but means they're limited by when they travelled the road and conditions
at the time.

Ideally, you have a mapping system that is constantly updating (every time
your fleet drives by something new like construction on the road it auto-
updates the map).

Comma.ai is working on a vision+GPS Kalman filter solution along these lines
up to 10 cm accuracy. I'd guess Tesla is as well.
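As a sketch of what such a fusion filter does, here's a minimal 1-D Kalman filter that dead-reckons on odometry between fixes and blends in each noisy GPS measurement. The noise variances are invented for illustration; this is not Comma.ai's or Tesla's actual implementation.

```python
# Minimal 1-D position filter: predict with odometry, correct with GPS.
# gps_var and odo_var are illustrative assumptions, not real sensor specs.
def kalman_fuse(gps_fixes, odometry_deltas, gps_var=9.0, odo_var=0.01):
    x, p = gps_fixes[0], gps_var          # initial state and its variance
    estimates = [x]
    for z, dx in zip(gps_fixes[1:], odometry_deltas):
        x, p = x + dx, p + odo_var        # predict: dead-reckon, uncertainty grows
        k = p / (p + gps_var)             # Kalman gain: how much to trust the fix
        x, p = x + k * (z - x), (1 - k) * p   # update: blend the GPS fix in
        estimates.append(x)
    return estimates

# Consistent fixes and odometry along a straight line track perfectly:
print(kalman_fuse([0.0, 1.0, 2.0, 3.0], [1.0, 1.0, 1.0]))  # [0.0, 1.0, 2.0, 3.0]
```

Because the odometry variance is much smaller than the GPS variance, a wild GPS fix only nudges the estimate rather than yanking it, which is how the combination beats either sensor alone.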

And then yes, after that the vision would be primarily for localization.

~~~
Robotbeat
Indeed, as recent Tesla patent(s) show: [https://electrek.co/2018/12/09/tesla-patent-technology-accurate-gps-positioning-for-self-driving-vehicles/](https://electrek.co/2018/12/09/tesla-patent-technology-accurate-gps-positioning-for-self-driving-vehicles/)

------
ableal
_" Self-driving cars also benefit from lidar sensors, and the best ones cost
thousands—if not tens of thousands—of dollars each. That's too expensive for
an upgrade to a customer-owned vehicle. But the economics are more viable for
a driverless taxi service, since the self-driving system replaces an expensive
human taxi driver."_

That's probably the crucial point. For now, lidar is needed for safe
operation, and too expensive for mass deployment in private cars.

~~~
alkonaut
Commercial (not even taxi, but transport) will be the first to be autonomous
because of the larger savings possible. That we'll see useful autonomy for a
$50k car driven by individuals, rather than in a $1M transport vehicle
operated by a huge logistics company isn't likely.

------
skwog
ITT: Hacker News devolves into Slashdot

[https://old.reddit.com/r/funny/comments/az2c2i/life_well_spent/](https://old.reddit.com/r/funny/comments/az2c2i/life_well_spent/)

------
jayess
If self-driving can navigate US streets fully autonomously, even in bad
weather, that will be impressive. Now transplant that to just about anywhere
else in the world, and it will be impossible. Mexico City, Rome, Buenos Aires.
Hah, no chance.

------
agumonkey
Are there efforts to invert the problem? Instead of fully independent
vehicles, having a bit of a road-signaling system (a reincarnation of the
embedded radio tracks of 1950s America)?

------
woodandsteel
Tesla has taken on so many things that have never been done before and
succeeded, it would not be surprising if it failed at one of them.

------
EngineerBetter
The big missing piece is not LIDAR, but causal reasoning. Autopilot and
similar 'AI' cannot reason; it has no mental model with which to ask 'what
if', and can't use counterfactuals to consider what would happen if it didn't
do something. It's just glorified pattern matching, currently.

Reading Judea Pearl's The Book Of Why certainly sobered my outlook on AI.

~~~
andreyk
Well, that's not necessarily true... self driving in particular usually uses
planning, which is basically all about 'what if'. But it's true this is
relevant to AI in general.

------
_pmf_
It also has a factory automation strategy that other companies abandoned
decades ago.

------
pascoej
Anthony Levandowski seems to have the same opinion. I'm a big believer in
vision, it'll be interesting to see how it plays out.

------
Robotbeat
Interestingly, for air travel, there's another strategy: Rely entirely on GPS
and commanding from air traffic control (with systems and structure in place
to minimize damage in case those systems fail or are insufficient).

That's what Zipline uses for their _fully operational_ medical delivery system
in Rwanda.

The drones are completely blind, directly, but they travel in pre-defined
flight paths and communicate with one another and follow directions from
flight control.

For self-driving cars, a similar approach might be extremely good GPS
geolocation combined with network-wide sensor fusion and mapping. In
principle, you could even do real-time optical or synthetic aperture radar
from orbit or via persistent aircraft to allow the central controller to
identify hazards and obstacles and to update maps in real time without much at
all happening on-board except low-latency reactions. Most highways already
have much of their length covered by cameras for the local Department of
Transportation, and even many intersections and sidestreets have surveillance
cameras. A low latency connection to that system, upgraded with higher
fidelity, could significantly help autonomous systems. Might be another very
helpful public good for cities to provide, much like GPS is provided.

...and on the engineering side, you can have vehicles designed specifically to
reduce pedestrian injury, such as not having the large, boxy grilles found on
SUVs. Such things are almost entirely cosmetic and reduce efficiency, so Tesla
doesn't have them on their cars. Going beyond that, external airbags may help
as well. Zipline uses a foam body and a fail-safe parachute to stop the
vehicle if there's a problem.

~~~
kec
If a plane can only get a GPS fix to within a 15m radius that's no big deal,
whereas if a car has a 15m fix it's running down pedestrians on the sidewalk.
Planes also don't have a habit of flying in tunnels or urban canyons where
obtaining a gps fix is even more problematic.

~~~
lutorm
Planes already have problems with GPS outages and military jamming exercises.
Imagine if all cars within a 100-mile radius suddenly stopped working.

~~~
Robotbeat
...that's why you develop systems designed to bring them to a safe stop (in
addition, stable IMUs can carry navigation for several seconds, plenty of time
to do so). That should be standard on all autonomous systems, TBH.

And BTW, there are 4 GNSS systems, each run by a different country/entity. The
odds of them all failing at the same time are very small.
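To put the independence claim in numbers (and it is a claim: a common-mode event like severe ionospheric weather would violate it), independent outage probabilities multiply. The per-system probability below is purely illustrative.

```python
# If the four constellations (GPS, GLONASS, Galileo, BeiDou) failed
# independently, the joint outage probability is the product of the four.
# 0.001 is an invented per-system outage probability, for illustration only.
p_single = 0.001
p_all_out = p_single ** 4
print(p_all_out)  # on the order of 1e-12
```

The multiplication only holds if the failures really are independent, which is exactly what a shared atmospheric effect would break.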

~~~
ksherlock
GPS is (negatively) affected by atmospheric variability. Are the other 3
systems immune to it?

