
Autopilot: an open source driving agent - fabrice_d
https://github.com/commaai/openpilot
======
Voloskaya
Not everything seems to be open source. I can't seem to find the vision part,
only the binary, see:
[https://github.com/commaai/openpilot/tree/master/selfdrive/v...](https://github.com/commaai/openpilot/tree/master/selfdrive/visiond)

~~~
bri3d
Based on a really cursory inspection of the binary and symbols, the binary
seems to be an OpenCV project which uses some common image processing
algorithms (Sobel edge detection, Hough to find straight lines).
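The pipeline described above (gradient-based edge detection feeding a line finder) is the textbook approach; as a generic illustration of the Sobel step only, not anything reversed out of the actual visiond binary, a stdlib-only sketch:

```python
# Minimal Sobel gradient-magnitude sketch in pure Python -- the classic
# first stage of a lane-finding pipeline (edges in, Hough lines out).
# Illustration of the textbook technique only, not comma.ai's code.
def sobel_magnitude(img):
    """img: 2D list of grayscale values; returns per-pixel gradient magnitude."""
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal-gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical-gradient kernel
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A bright vertical stripe (a crude "lane marking") responds strongly at
# its borders and not at all in its flat interior.
img = [[255 if 3 <= x <= 5 else 0 for x in range(9)] for _ in range(5)]
mags = sobel_magnitude(img)
print(mags[2][3] > 0, mags[2][4] == 0)
```

A Hough transform would then vote line parameters over the thresholded edge map to recover the lane boundaries as straight-line segments.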

There doesn't seem to be a large amount of embedded data indicative of a
trained model or neural network.

So, it seems that the vision algorithms probably mirror those in the common
literature and represent a reimplementation or iterative improvement on
existing lane-finding systems.

More reversing would probably yield greater insight :)

~~~
commaai
There's a neural network in there :)

~~~
bri3d
Ha, straight from the horse's mouth - I stand corrected :) I think this will
be the first binary I've pulled apart with one, look forward to it!

------
jaclaz
Actually the thingy is called "openpilot"; there is already enough confusion
with Tesla calling its thing "autopilot".

------
revelation
This seems to be a code dump from the comma AI startup. It also indicates
their autonomous driving platform runs on a "stripped version" of Android.

Which seems to indicate that they are clinically insane, given they are using
a system that can easily have excessive scheduling latencies under load to
steer a car.

~~~
6stringmerc
People like me laughed at Android platform adherents because audio latency was
a huge hurdle to being even remotely on par with Apple's Core Audio, but,
believe it or not, 8 years later (give or take) Android has caught up. Point
being, what might be "clinically insane" to you is simply a "fun challenge" to
people who may be smarter than the both of us combined.

~~~
wyager
It's not about being clever with Android programming; it's about the fact that
Android is not by any stretch of the imagination a real-time system and should
absolutely not be controlling a car.

------
dejawu
This is great IMO. I was incredibly dismayed to hear they were canceling the
Comma One without any further action, without even attempting to work with
regulators. This seems like the culmination of that - now the responsibility
for safety is in the hands of individual users/developers and not on Comma,
which is what got them in trouble in the first place.

Now that this is open source though, I wonder what their next project will be?

------
sounds
I have my reservations about this code. I am a long time free software
developer so it is hard to say it but it seems clear to me that a single
entity (from a legal perspective) has to "own" the self-driving car software.

In the future bits of a self-driving car system may get open-sourced the way
Facebook Open Compute has emerged in data centers, but first the proprietary
implementations will need to pave the way in the legal world.

It's not all bad news, though: I'd love to see a free software project that
converted my lawn mower into a self-driving lawn mower...

~~~
jstanley
> it seems clear to me that a single entity (from a legal perspective) has to
> "own" the self-driving car software.

That's not clear to me _at all_. It seems clear to me that the person
operating the vehicle has ultimate responsibility for how it is driven,
whether they're personally driving it or not.

~~~
andars
Interesting opinion. However, this isn't current U.S. law. See, for example,
the numerous successful suits against Toyota for unintended acceleration.

In addition, it may soon be questionable to say that an occupant of a vehicle
is "operating" said vehicle if it drives entirely autonomously.

------
moflome
Interesting project, and valuable within the context of a larger autonomous
system like Elektrobit's Robinos [0] autonomous vehicle framework.

[0] [https://www.elektrobit.com/products/eb-robinos/](https://www.elektrobit.com/products/eb-robinos/)

------
Matthias247
Really, a safety-critical hard-realtime system with control loops on top of
Python, a general-purpose OS, and a CAN-USB adapter?

There are reasons why we have realtime operating systems, deterministic bus
systems and dedicated CPUs for these kinds of applications.

That might be an interesting research, prototyping or simulation platform. But
nothing you want to have on a real public road.

~~~
hasbroslasher
I'm not really following. If it _works_ then _why not_ use a control loop on
top of Python? We are engineers, after all.

~~~
throwaway729
Because by "works" you mean "I tried it out a bunch of times and nothing bad
happened, so must be production ready", because that's what you do when you
build websites and desktop CRUD apps and nobody ever died cause the web server
choked under load or the web page rendered a bit funny or the request took a
full quarter second because the GC kicked off as a wave of requests came in.

And then one day a one-in-ten-million event that never would've surfaced
during testing _does_ happen and the OS crashes or the interpreter hangs or
some non-determinism causes a period of non-responsiveness or a bit gets
mangled because you don't have any redundancies and solar rays are a thing or
... and someone dies. And of course that didn't happen in testing, because all
of those things are rare possibilities.

And then it happens again.

And again.

And then you're in court being sued for millions. And a bunch of industry
experts who _do_ build their systems with redundancies and _do_ design their
systems with a safety-first mindset come to the stand and rip you apart for
not taking even the most basic precautions. And when it comes out just how
many best practices you completely ignored, you're mostly just hoping beyond
hope that all of the legal problems stay on the civil side of the
civil/criminal divide.

Or, to say it in a sentence, "because the sort of exceptional circumstances
that safety-critical software needs to handle with grace are very difficult or
impossible to account for with a desktop machine running interpreted code on
top of Linux."

~~~
hasbroslasher
I like that you reduced that to a one-liner, and I understand your point. I
sure as hell wouldn't hook my car up to this software.

But I also feel like you're being a bit hasty. Obviously a Python script isn't
going to turn into a Tesla overnight. But maybe it'll help you find a few bugs
before you throw all that time and effort into building the real deal. When I
look at a Github repo that claims to be a self-driving car I don't say to
myself "yep, looks production ready," I say "Cool prototype, now let's break
it." To me it seems entirely reasonable to get a working Python prototype 90%
of the way there and then send it off to the OS programmers to design
something that actually meets the concept of "production ready".

~~~
throwaway729
Yes, I absolutely agree with you on that. Nothing wrong with prototyping in
whatever setting is most convenient.

I think w/ self-driving cars there is an interesting ethical question. The
full auto cars on the roads today definitely aren't production ready, but they
also have constant safety drivers. Probably even ACC systems are tested in the
wild with a safety driver before production.

Basically, "is the safety driver sufficient to justify running prototype
software in the real world?"

~~~
hasbroslasher
Yeah, that's a tough question that we as a society will have to wrestle with.
Knowing that humans are imperfect drivers as well doesn't make it any easier.
Even cars today have crippling safety flaws -- remember that one Lexus model
that had a sticky gas pedal? Or worse yet, the Ford Pinto. I'm genuinely
interested to see how the governments of the world weigh in on this, if at
all.

Intuitively, though, I think that buying/using the software is tantamount to
accepting its imperfections, so long as they are adequately (factually)
presented to you beforehand. You're signing off your ability to make your own
decisions, but are still responsible for them.

------
amelius
Is there a closed-loop simulation environment to test this? By that I mean
that the sensor inputs (images of the road, etcetera) are generated by the
simulation environment, with which the driving agent interacts.

~~~
pmuk
Have you seen their Dash / chffr phone apps?

[https://itunes.apple.com/us/app/dash-train-self-driving-cars...](https://itunes.apple.com/us/app/dash-train-self-driving-cars/id1146683979?ls=1&mt=8)

[https://play.google.com/store/apps/details?id=ai.comma.chffr](https://play.google.com/store/apps/details?id=ai.comma.chffr)

------
6stringmerc
So, ladies and gentlemen of Hacker News, following all the ire shoveled
GEOHOT's way for closing up his project, can you please show me any comparable
open source contribution from a major manufacturer? Not like white paper
bullshit. I mean, where has Tesla, Audi, or GM shared this kind of platform,
without Sam Altman writing a think-piece about how they're not as good as
Cruise and he knows the truth and doesn't need to listen to fuckin' lawyers?

Sorry, maybe that was an inappropriate aside, but I know I'm loud & proud team
GEOHOT and have yet to see the Giants of the industry really show they can
fend off a determined David.

------
agumonkey
His latest video
[https://www.youtube.com/watch?v=vU26Pa6ERkY](https://www.youtube.com/watch?v=vU26Pa6ERkY)

------
influx
Curious why this isn't using PEP8? Another Xoogler?

------
openasocket
Correct me if I'm wrong, but if I were actually to install this in my car,
that car would no longer be street legal, right?

~~~
striking
The car would likely still be street legal. (IANAL) However, allowing this
system to take control of your car on a public road would not.

Another way to say this is, there's nothing dangerous about the hardware. You
could be using it as a logging platform for the data on the CAN bus in your
car. But letting the software send commands and take control from a driver is
a different story.

~~~
darawk
> The car would likely still be street legal. (IANAL) However, allowing this
> system to take control of your car on a public road would not.

IAANAL, but I don't think even that is true. I think it actually is legal to
allow it to take control of your car, provided you are still sitting in the
driver's seat and can take control back at any time.

~~~
openasocket
> can take control back at any time

But if you're going down the highway and the AI decides to suddenly swing the
wheel hard right as quickly as possible, no human being would have a reaction
time good enough to recover without crashing. I'd be pretty concerned if you
could add an autopilot to your car and drive it around on public roads without
any sort of demonstration of safety.

~~~
ams6110
People are allowed to work on their own cars, including brakes, suspension,
steering, etc. Happens every day in driveways everywhere. I do it myself.

~~~
openasocket
But there are restrictions on what parts you can use. I can't just make my own
engine and drive around with it without getting that engine certified, right?

~~~
ams6110
I don't know of any reason why not. Certified by who? Who would ever know?

I can put a Chevy engine in a Ford, and plenty of people do much more than
that, e.g. homebrew EVs, etc.

------
alinspired
Can this be good for other applications, ie drones or RC cars ?

------
jesionaj
Hmm, the usage of .dbc files is interesting. That format is owned by Vector
and is proprietary. I'm curious as to what the limits of using DBC are, as I'd
love to integrate .dbc files into the product I currently work on without
paying thousands of dollars for the official API from Vector.
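For what it's worth, the DBC grammar itself is plain text and readable without Vector's tooling; a minimal stdlib-only sketch covering just the BO_ (message) and SG_ (signal) lines, with a made-up example message (field layout per Vector's published conventions; not a full parser):

```python
# Parse the BO_/SG_ subset of a DBC file with plain regexes.
# Sketch only; real DBC files carry many more record types (comments,
# value tables, attributes) that this ignores.
import re

BO_RE = re.compile(r'^BO_ (\d+) (\w+): (\d+) (\w+)')
SG_RE = re.compile(r'^ SG_ (\w+) : (\d+)\|(\d+)@(\d)([+-]) '
                   r'\(([^,]+),([^)]+)\) \[([^|]+)\|([^\]]+)\] "([^"]*)"')

def parse_dbc(text):
    messages = []
    for line in text.splitlines():
        m = BO_RE.match(line)
        if m:  # BO_ <can_id> <name>: <size_bytes> <sender>
            messages.append({'id': int(m.group(1)), 'name': m.group(2),
                             'size': int(m.group(3)), 'signals': []})
            continue
        s = SG_RE.match(line)
        if s and messages:  # SG_ <name> : <start>|<len>@<order><sign> (f,o) [min|max] "unit"
            messages[-1]['signals'].append({
                'name': s.group(1), 'start': int(s.group(2)),
                'length': int(s.group(3)),
                'factor': float(s.group(6)), 'offset': float(s.group(7)),
                'unit': s.group(10)})
    return messages

# Hypothetical message, not taken from any real .dbc file
sample = '''BO_ 228 STEERING_CONTROL: 5 ADAS
 SG_ STEER_TORQUE : 7|16@0- (1,0) [-4096|4096] "" EPS
'''
msgs = parse_dbc(sample)
print(msgs[0]['name'], msgs[0]['signals'][0]['length'])
```

Whether shipping a parser like this infringes anything is exactly the legal question raised above; the format is simple, but its ownership is not.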

------
CamperBob2
Why is the zipfile encrypted?

------
sheeshkebab
Awesome! Not sure about using it in my real car, but I'd stick it on an RC car
if it works.

Wondering whether this can run on a Raspberry Pi...

------
halayli
You want me to trust my life to source code that uses zmq? No thanks.

------
devy
George Hotz once said[1] that if Tesla Autopilot is the iOS, Comma.ai's system
is the Android. Today he proved himself a man of his word by open sourcing
Comma.ai's openpilot.

Big kudos to Geohot!

[1]: [http://learnbonds.com/131150/tesla-ios-self-driving-comma-ai...](http://learnbonds.com/131150/tesla-ios-self-driving-comma-ai-android/)

------
pmoriarty
I read an interesting ethical dilemma related to self-driving cars recently.

Imagine that a self-driving car detects a child in the road, and the only way
to avoid hitting the child is to crash the car into a wall, certainly killing
the driver.

What choice should the car be programmed to make?

Who should decide what that choice is?

Who is ethically responsible for that choice?

In a normal car that is driven by a human, the choice is obviously made by
that human and the responsibility is theirs. But with self-driving cars it's
not so clear.

~~~
antisthenes
Other threads related to autopilot (mainly Tesla), with great posts that
explained in detail why this problem is unlikely to arise in reality and that
the resultant action undertaken by the system would be the same in both
scenarios (e.g. apply brakes as hard as possible).

IIRC part of the explanation was that a human is unlikely to make a qualified
choice in a situation like this anyway, so programming a decision matrix based
on the utility of the target into the crash avoidance mechanism is moot.
Something like 99.999% of the safety benefit would be achieved by a faster
brake response time.

~~~
skuhn
It's also pointless to worry about because the proposed dilemma greatly
overestimates the ability of computer vision systems to determine what an
object actually is.

These systems are looking for obstacles and road contours. They don't know
that the obstruction in the road is a human baby, let alone the composition of
an adjacent wall.

