
The Battle for Best Semi-Autonomous System: Tesla Autopilot vs. GM SuperCruise - kartD
http://www.thedrive.com/tech/17083/the-battle-for-best-semi-autonomous-system-tesla-autopilot-vs-gm-supercruise-head-to-head
======
Animats
GM did a nice job there. They have a good solution to the "expecting the
driver to watch and take over" problem. Their system watches the driver
constantly to make sure they're in position to take over, but doesn't require
a hand on the wheel. GM is also more careful about when to allow the control
system to engage.
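The "watch the driver instead of the wheel" idea boils down to a simple gating loop: stay engaged only while the driver looks ready to take over. A minimal sketch of that logic, where the 2-second threshold and the boolean eyes-on-road signal are my own hypothetical stand-ins for whatever GM's camera system actually measures:

```python
# Minimal sketch of camera-based driver monitoring: keep assistance
# engaged only while the driver's attention never lapses too long.
# The threshold and the gaze signal are illustrative assumptions.

MAX_EYES_OFF_S = 2.0  # assumed tolerance before escalating

def supervise(gaze_samples, dt=0.1):
    """gaze_samples: iterable of booleans (True = eyes on road), sampled
    every dt seconds. Returns 'engaged' if attention never lapses past
    the threshold, else 'handover'."""
    eyes_off = 0.0
    for on_road in gaze_samples:
        eyes_off = 0.0 if on_road else eyes_off + dt
        if eyes_off > MAX_EYES_OFF_S:
            return "handover"  # alert the driver / begin safe disengagement
    return "engaged"

assert supervise([True] * 50) == "engaged"
assert supervise([False] * 25) == "handover"  # 2.5 s eyes off the road
```

The point of gating on gaze rather than wheel torque is that it measures readiness directly, instead of a proxy that drivers can defeat by resting a hand on the wheel.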

Tesla's crashes on "autopilot" have mostly been in situations where the system
should have detected an obstacle. Four times, a Tesla on autopilot has slammed
into an obstacle partly blocking the left edge of a lane. All those crashes
were in freeway situations, where Autopilot is supposed to be reliable. That's
inexcusable. Tesla's radar has inadequate resolution for the job, the vision
system doesn't really recognize fixed obstacles, and they don't have LIDAR. So
they don't detect big things like fire trucks and street cleaning trucks that
are mostly on the shoulder and partly into the lane.

~~~
SimbaOnSteroids
Why aren't we using sonar for this sort of thing? Bats have nailed down
echolocation even in extremely heavy bat traffic, but I don't see it being
vigorously pursued by tech companies.

~~~
screye
Lidar kind of works the same way, but is a lot faster (speed of light vs speed
of sound)

I could also see sonar taking a lot more energy to use all the time as
compared to Lidar.
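The speed gap matters more than it might seem at driving speeds. A back-of-the-envelope comparison (the 100 m target distance and 70 mph car speed are my own illustrative numbers, not from the thread):

```python
# Round-trip time-of-flight for a pulse to a target and back,
# comparing lidar (light) with sonar (sound in air).
SPEED_OF_LIGHT = 299_792_458.0   # m/s
SPEED_OF_SOUND = 343.0           # m/s in air at ~20 C

def round_trip_time(distance_m, speed_m_s):
    """Time for a pulse to reach a target and return."""
    return 2.0 * distance_m / speed_m_s

target = 100.0  # metres, a plausible highway sensing distance
lidar_t = round_trip_time(target, SPEED_OF_LIGHT)   # well under a microsecond
sonar_t = round_trip_time(target, SPEED_OF_SOUND)   # over half a second

car_speed = 31.3  # m/s, roughly 70 mph
print(f"lidar: {lidar_t * 1e6:.2f} us, sonar: {sonar_t:.3f} s, "
      f"car moves {car_speed * sonar_t:.1f} m during the sonar ping")
```

By the time a sonar ping returns from 100 m, a highway-speed car has moved roughly 18 m, so the reading is stale before it arrives; lidar's measurement is effectively instantaneous by comparison.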

~~~
mandevil
LIDAR has severe range limitations, such that it's basically unusable for
highway speed driving (or at least did, <mumble> years ago when I worked on
autonomous driving), which is why Tesla doesn't use LIDAR.

Looking at Thrun et al.'s paper on Stanley ([https://www-cs.stanford.edu/people/dstavens/jfr06/thrun_etal_jfr06.pdf](https://www-cs.stanford.edu/people/dstavens/jfr06/thrun_etal_jfr06.pdf)):

"The effective maximum range at which obstacles can be detected with the laser
mapper is approximately 22 m. This range is sufficient for Stanley to reliably
avoid obstacles at speeds up to 25 mph. Based on the 2004 Race Course, the
development team estimated that Stanley would need to reach speeds of 35 mph
in order to successfully complete the challenge. To extend the sensor range
enough to allow safe driving at 35 mph, Stanley uses a color camera to find
drivable surfaces at ranges exceeding that of the laser analysis."

They built a neural network to analyze terrain and find safe paths, trained
it while driving at 25 mph with the LIDAR providing the training signal, and
then let it work at higher speeds where the LIDAR would be able to panic stop
the car but not much more.
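The coupling between sensing range and safe speed is just stopping-distance arithmetic: the car must be able to stop within the distance at which it first sees an obstacle. A rough sketch, where the reaction time and deceleration figures are my own illustrative assumptions:

```python
# Why a ~22 m lidar range caps safe speed: the car must stop within
# the range at which an obstacle first becomes visible.
# Reaction time and deceleration are illustrative assumptions.

def stopping_distance(speed_m_s, reaction_s=0.5, decel_m_s2=6.0):
    """Reaction distance plus braking distance under constant deceleration."""
    return speed_m_s * reaction_s + speed_m_s ** 2 / (2.0 * decel_m_s2)

MPH = 0.44704  # metres per second per mph
for mph in (25, 35, 70):
    v = mph * MPH
    print(f"{mph} mph -> needs ~{stopping_distance(v):.0f} m to stop")
```

With these assumptions, 25 mph needs about 16 m (inside Stanley's 22 m range), 35 mph needs about 28 m (already beyond it), and 70 mph needs roughly 100 m, which is why a camera with longer effective range had to take over path finding at higher speeds.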

~~~
giobox
I may be way off base here, but hasn't the LIDAR field evolved enormously
since 2006 (the date of that paper)? Last I read, prototype solid-state LIDAR
systems have ranges around 150 m, although I don't follow this field all that
closely.

Given that the major players like Waymo/Uber/Apple etc. seem to be betting
very heavily on LIDAR, one would assume these challenges are being resolved. I
suspect the traditionally very high cost of LIDAR was a much bigger factor in
Tesla's decision not to use it than any technical reason. Cameras plus radar
are vastly cheaper to ship for now.

------
gtowey
According to the article GM has decided to simply restrict the domain of their
semi-autonomous offering to only work on divided highways. The author says
that the result is innovative and brilliant. But the problems they're trying
to solve are not nearly as ambitious or challenging as those Tesla has decided
to tackle.

While it can be argued that GM should receive kudos for releasing a feature
that works reliably, I don't see it as innovative.

It kind of reinforces that traditional car makers are never going to offer
anything revolutionary to the car market. They're going to use the work of
others to incrementally add features to cars that they think will increase
sales. They won't take risks with technology.

~~~
013a
Their decision to limit the initial problem domain is genius. When it comes to
autonomous driving, safety is paramount. Limiting it to highways is a good way
to ensure that you can maximize the system's safety and performance, while
still getting a product to market that is ahead of the competition and which
allows you to gather data and iterate to expand in the future.

SpaceX wants to go to Mars. Why haven't they yet? Do you consider them
uninnovative? Or maybe they're just taking their time getting the technology
right, because they know (1) they're leading the industry already, and (2) if
they fuck it up, it would be a highly expensive mistake they probably wouldn't
recover from.

If your definition of "innovation" is "irreparably risking your business and
customers' lives in the pursuit of profit", I sincerely hope I never have to
interact with you in any way in the future.

~~~
ramenmeal
Curious though: would you buy this car over a Tesla (basing your decision on
the self-driving features only)? I ask because if I buy a Tesla, I have the
idea that one day it will be completely self-driving. With this Cadillac, I
would expect it to only ever do this specific subset of self-driving for the
life of the car.

~~~
semi-extrinsic
> I have the idea that one day it will be completely self-driving

Why do you think that? Can you explain how this would happen, when the car
appears to lack the physical hardware (e.g. LIDARs, computing power) to do so
in anything but clear conditions on highways?

Moreover, why would Tesla be interested in doing so? They gain nothing, reduce
a customer's desire to upgrade to the latest model, and put themselves into
major extra liability if the system ever fucks up.

~~~
trevyn
They would gain a massive fleet of self-driving cars overnight.

Remember, Elon doesn’t think like other industrialists. I fully believe that
he views profit as little more than a tool to reach his other goals.

Also, having worked on autonomous vehicles, my opinion is that LIDAR is really
overrated, and _if_ compute power is an issue, that can be a straightforward
modular upgrade.

~~~
bonestamp2
> LIDAR is really overrated

Fair enough, although it does have better range than ultrasonic and it can see
through rain/fog/snow better than cameras.

------
stephenbez
(Slightly off topic) Has anyone else been around the Cruise Automation self
driving cars in SF? I have and I can't believe they are allowed on the road
given how poorly they drive.

While biking to work today, there was a Cruise car behind me, driving at about
7 mph and randomly stopping in the middle of the road constantly for no reason
at all. Lots of cars kept honking at it the whole time.

Last week I saw a Cruise car start to make a left-hand turn at an
intersection, but its sensors must have thought someone was in the crosswalk,
so it stopped in the middle of the lane of oncoming traffic, causing other
cars to abruptly stop and start honking.

If this was a normal car, I'd call 911 to report a drunk driver. I wonder if I
should just call 911 for a dangerous driver in this case. Is there any way I
can report this behavior and get their cars off the road?

In case anyone wants to do some investigative reporting, all their cars come
out of a garage labeled Borden Decal Co.
([https://goo.gl/maps/F85WPTwkmbQ2](https://goo.gl/maps/F85WPTwkmbQ2)). If you
follow them around you can probably observe similar behavior.

~~~
cloudwalking
Yes I always get a good laugh watching the Cruise cars trying to drive. They
are not smooth and do weird things that a human driver would only do when
drunk.

------
abalone
It's kind of a weird review. For example it rates Cruise's lane changing
superior -- "as perfect as it gets" -- because it simply _requires you to do
it manually_. (The "perfection" is in how the UI clearly communicates
auto/manual transitions.)

I mean say what you will but that's just a weird way to rate autonomous
driving systems.

~~~
Negitivefrags
If you had used this feature on a Tesla you would understand.

Cruise ranks 0 (it gets out of the way) but the Tesla ranks -5 (actually
dangerous to use).

The Tesla dealership literally recommended leaving the feature turned off.

~~~
abalone
Fine, then give them both Fs, don't call 100% manual lane changing "perfect".

The reviewer also experienced _frequent_ highway disengagements by the
SuperCruise system which you can observe in the video, yet still rated it
higher than Tesla, citing only one Tesla accident. That's also weird!

------
thebluehawk
I'm really confused. When describing the "Operational Domain", the author
explains that a Tesla he drove 18 months ago handled a sample drive on FDR
with two disengagements, while the Cadillac wouldn't stay engaged when he
tried the same drive. Then he says the Cadillac wins by a hair. How does a
brand new system that doesn't work reliably win over a system that's been
working better for over 18 months?

~~~
make3
the whole field of modern "AI" is about 6 years old (since AlexNet won
ImageNet), so 18 months is quite a long time.

More importantly, a snapshot review of current performance is a reasonable
thing to do, just as a more long-term, reliability-oriented review would be.

~~~
felippee
This is a huge misconception. Modern AI is at least 30 years old: the first
convnets were conceived in the 1980s, LSTM in the early '90s. A lot of work
went into deep learning before AlexNet eventually won ImageNet. To some degree
the last few years were not the "emergence" of deep learning but rather the
culmination of many years of research and development.

~~~
make3
I agree with you about the theoretical origins, but you'll agree that we
didn't have anything remotely as deep or as effective before AlexNet, and that
there was not remotely as much attention on the topic.

------
redcomplier
I'm not sure most readers know this, but SuperCruise has nothing to do with
Cruise Automation. SuperCruise was a GM project long before Cruise was a
company or acquired by GM. Different technology and teams altogether.

------
xversilov
I'll be driving manually for a long time. I have no interest in beta testing
anybody's autonomous systems at 70mph.

~~~
undersuit
You're part of the beta test if you're on the road with any car using
autonomy.

~~~
rubicon33
Yeah, well, you're also beta testing some 16-year-old's brain on 6 pints of
beer, or some 90-year-old's brain that missed its afternoon nap.

You're beta testing a lot of shit on the road. I pray for the day when the
only thing we're really testing is the software, and not some idiot's brain.

------
uptownfunk
My driver has had to catch the wheel on more than one occasion. When we're in
the carpool lane and the lane opens up for other cars to merge in, the
autopilot doesn't see the edge of the lane and starts swerving into the other
lane. Also, we did a trip to LAX from San Diego, and the dividers come pretty
close to the edge of the lane; we were always a little nervous the car was
going to hit them. Overall it rides so smoothly, but those few glitches make
me quite nervous sometimes. It is kind of surreal to be riding in an almost
self-driving car. Tesla is taking steps to add safety features. Lately my
driver sped up to overtake a car and the autopilot disengaged and wouldn't
re-engage until we put it in park.

------
simion314
I see many people using this reasoning: because there are people who drive
drunk, text while driving, or get distracted, we need the autopilot.

I think this is wrong. We just need an AI or other tech that detects these bad
or distracted drivers and doesn't let them drive/start the car, or, for
distractions like texting, safely stops the car and maybe reports the driver.
There must be solutions other than letting bad drivers or bad AI drive around.

~~~
dna_polymerase
Perfect, let's report all the drunk people! I'd love to live in a world where
my car reports me to the authorities whenever I don't obey its definition of
good behaviour! \s

~~~
simion314
Why is it not fair to have more ways to stop drunk people from driving,
instead of depending on a policeman stopping you? Is your need to drive drunk
more important than the lives of others?

Let's assume your car won't start if you're drunk; there could be a fail-safe
mode you activate in an emergency, so you can still start the car and drive to
a hospital. In case the car is broken and reporting false positives, we could
have rules so you get compensated by the people at fault for the false
positive.
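The interlock-plus-override idea can be sketched as a few lines of logic. Everything here is hypothetical: the BAC limit, the override flag, and the event log are stand-ins for whatever a real system and its regulations would specify:

```python
# A minimal sketch of the ignition-interlock-with-override idea described
# above. All thresholds, names, and the logging scheme are hypothetical.
from dataclasses import dataclass, field

@dataclass
class IgnitionInterlock:
    bac_limit: float = 0.08              # assumed legal blood-alcohol limit
    events: list = field(default_factory=list)

    def can_start(self, measured_bac: float, emergency_override: bool = False) -> bool:
        if measured_bac < self.bac_limit:
            return True
        if emergency_override:
            # Allow the start, but record it so a false positive (or abuse)
            # can be reviewed later and compensated or penalized.
            self.events.append(("override", measured_bac))
            return True
        self.events.append(("blocked", measured_bac))
        return False

lock = IgnitionInterlock()
assert lock.can_start(0.02)                           # sober: starts normally
assert not lock.can_start(0.12)                       # over the limit: blocked
assert lock.can_start(0.12, emergency_override=True)  # fail-safe: starts, logged
```

The key design point in the comment is that the override is never silently free: every use leaves an auditable record, which is what makes after-the-fact compensation for false positives possible.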

Your argument is like saying that forcing people to learn to drive is too
much.

My point is that the bad-drivers problem has more solutions than just a
less-bad AI autopilot.

~~~
dna_polymerase
My argument is that people should stop and think before asking for something
like this. If the car is capable of detecting drunk people and shutting the
car down, it could do so for any reason the car manufacturer likes, or even
worse, the manufacturer could sell access to that system to the NSA, your
bank, your insurance, whatever.

Did you really assume I'd like more drunk people driving around?

~~~
simion314
I accept your concerns, but the problems you raise could be solved with laws,
right? You have laws for medical data; you can make laws for your car data.
The Tesla and the other cars can also track you and send your data to the
NSA; is there less potential for bad things in those automated cars?

Again, my point is that you can reduce car accidents with solutions other
than the sexy self-driving-cars solution, which is better than a drunk or
texting driver but worse than a beginner.

Let's assume we are not in a bad country with bad laws and evil companies,
and we want to reduce fatalities in accidents starting next month. Can we
create some hardware and software that checks if the driver is drunk, and
checks if the driver is paying attention: not texting, not having a medical
problem, not sleeping at the wheel?

My opinion is that we could implement that, and it would be better than
having AI that hits stopped trucks, or that gets updated monthly with who
knows what new features that could make it worse in some cases.

What I would suggest is to apply the scientific method: create these devices,
find some people to use them (give them some tax exemptions or something),
then look at the numbers. If the NSA is a danger in your country, then I don't
think Tesla has less bad potential; I would fix the NSA problem, like finding
a way to make them follow the laws and not spy on everyone. If the companies
are bad, then make laws and don't let them track you or sell your data. Many
lives are lost, so we should try more solutions and not give up because of
the NSA or bad companies.

Until self-driving cars are as good as a regular driver, we should not allow
them to drive merely because they are better than a bad driver.

Edit: I am sorry I accused you of wanting to drive drunk; that was not OK. I
read your comment as "I don't want some law to add a device to my car to
prevent me doing illegal things on public roads." I am also against companies
that collect your data; I am just frustrated by people who want the
sub-optimal solution of bad self-driving cars to the problem of drunk or
tired drivers.

------
ibson99
Sometimes I wonder how long it will take for these technologies to become
available to poor people in Africa.

------
ibdf
So I guess now we are settling for semi-autonomous vehicles. This would have
been impressive 10 years ago (maybe more). They are just playing catch up
while they could have been trying to be innovative all along.

~~~
empath75
I think it's more that there is a big dangerous gap between semi-autonomous
vehicles and fully self-driving cars, and car companies are trying to figure
out where to draw the line. Tesla, I think, takes too many chances, and GM
probably is erring so far on the side of caution that most people won't use
it.

Personally, I don't really have any use for a self-driving car that makes me
pay attention to the road. The primary 'cost' of driving isn't that I have to
move the steering wheel a bit; it's that I can't do something else while I'm
driving.

------
ucaetano
Kind of a pointless comparison when one of the vehicles compared isn't and
never will be commercially available...

~~~
umurkontaci
It's already commercially available on Cadillac CT6.

[http://www.cadillac.com/sedans/ct6-sedan](http://www.cadillac.com/sedans/ct6-sedan)

~~~
ucaetano
Not the Cadillac, the Tesla. The article compares the no-longer-available
first version of the Autopilot, powered by MobilEye.

------
DonHopkins
"Head to head" is an unfortunate way to compare automobiles, like playing
chicken.

[https://www.youtube.com/watch?v=sxooLC9EwII](https://www.youtube.com/watch?v=sxooLC9EwII)

