
Announcing Project Zero
http://googleonlinesecurity.blogspot.com/2014/07/announcing-project-zero.html
======
arkem
This sounds like a new name (and possibly more executive support) for what
those guys have been doing for years now.

The guys in that article have been working on finding bugs in non-Google
products for a little over two years and you can see their past results
through the advisory credits they've received at Microsoft and Adobe as well
as from open source projects like FFmpeg.

For example see Ben Hawkes on this list of Google CVEs in other companies'
products:
[https://www.google.com/about/appsecurity/research/](https://www.google.com/about/appsecurity/research/)

~~~
tptacek
My read of the announcement is that they are now hiring vulnerability
researchers to work on arbitrary targets, the way security product companies
do to staff their research labs.

Starting back in the 1990s, companies like ISS and McAfee staffed teams of
bugfinders to comb through high-profile software for vulnerabilities, and kept
score with advisories (and with IDS/IPS signatures and scanner checks for
those bugs, which they'd have a semi-proprietary interest in, since they found
them). They'd use this to market their expensive products. The researchers
generally had a long leash so long as they were (a) looking at software that
customers might care about and (b) were actually finding things.

What Google seems to be doing is starting a lab like that, but without the
hooks into products and marketing; instead: Google has cash, wants to retain
security researchers, and so will throw money at a vulnerability lab run for
the common good, partially for the PR win, partially for the knock-on benefits
to Google of having lots of good security people, and yes, partially for the
good of humanity.

(FWIW: I worked at what I think was the industry's first vulnerability lab, at
SNI, which eventually became McAfee's vulnerability team).

~~~
eli
Is working at one of those labs as much fun as it sounds?

~~~
tptacek
Yes. Tim and I got something like 8 months to do this:

[http://cs.unc.edu/~fabian/course_papers/PtacekNewsham98.pdf](http://cs.unc.edu/~fabian/course_papers/PtacekNewsham98.pdf)

... where we discovered (or were at least first to publish) two whole new
attack classes, tracked down something like 6 different super expensive
security products and got them up in a lab, designed and implemented a new
programming language, and succeeded in giving a giant middle finger to
surveillance software.
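
The two attack classes from that paper, insertion and evasion, come down to the IDS and the end host reassembling the same packet stream differently. A toy, self-contained simulation of the overlapping-segment case (plain Python dictionaries standing in for real TCP reassembly, not actual wire traffic):

```python
def reassemble(segments, favor_new):
    """Reassemble (offset, data) segments into one byte stream.

    favor_new=True keeps the later copy of any overlapping byte;
    favor_new=False keeps the first copy seen. Both are plausible
    reassembly policies, and real stacks genuinely differ here.
    """
    buf = {}
    for offset, data in segments:
        for i, b in enumerate(data):
            pos = offset + i
            if favor_new or pos not in buf:
                buf[pos] = b
    return bytes(buf[i] for i in sorted(buf))

# Attacker sends overlapping segments: the second overwrites part of the first.
segments = [(0, b"GET /innocent"), (5, b"attack!!")]

host_view = reassemble(segments, favor_new=True)   # "last wins" policy
ids_view = reassemble(segments, favor_new=False)   # "first wins" policy

print(host_view)  # b'GET /attack!!'
print(ids_view)   # b'GET /innocent'
```

Here the "end host" ends up with a request the "IDS" never saw; because the monitor can't know which policy every end host uses, it can be desynchronized from the traffic it's supposed to inspect.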

It's the most fun I've had in my whole career.

Not for nothing, but working at a software security consultancy is a close
second; you lose the freedom to choose your targets (at least 80% of the
time), but the work is the same.

~~~
chrisfosterelli
That's fascinating! I'm curious what sort of education and experience you had
to land a job like that, do you mind sharing? I'd love to work in a lab like
that one day, but I'm not sure what is considered "good enough" to get a
career in security rather than just a hobby.

~~~
tptacek
I'm entirely self taught. I have a single semester of college. I took psych
and political science. :)

Here's two starting points:

\- a reading list: [http://amzn.to/cthr46](http://amzn.to/cthr46)

\- how we hire: [http://matasano.com/careers/](http://matasano.com/careers/)

~~~
freehunter
I've been looking into buying Gray Hat Python as more of my job starts to
require scripting, but I'm put off by that first review (and overall, the
reviews aren't glowing). Interesting that it comes recommended by you, someone
whose opinion I respect.

I don't know how long ago your list was made; would you still recommend Gray
Hat Python?

~~~
tptacek
I don't think it's an especially great programming book, but it is a great
cross-section of the programming tasks you actually do when working in a
vulnerability research lab (or software security consultancy, for that
matter).

~~~
droopybuns
Did you ever look at "Violent Python"? Any opinion as an alternative?

[http://www.amazon.com/Violent-Python-Cookbook-Penetration-En...](http://www.amazon.com/Violent-Python-Cookbook-Penetration-Engineers/dp/1597499579)

~~~
meowface
I'm not tptacek, but...

After lightly reading through both books, I think Gray Hat Python is a great
book for more advanced security concepts, especially on the reverse
engineering and exploit dev side of things, but isn't a very good book for
learning Python or programming.

Violent Python on the other hand is a great book for beginners to Python and
programming, and it teaches both pretty well, but it only goes into surface
level security concepts for the most part.

Gray Hat Python is closer to a Windows API/x86 assembly book than a Python
one. Violent Python is a real Python book and mostly covers general
information security and network security concepts.

Gray Hat Python is also purely application security. Debugging, reversing,
hooking, writing shellcode, exploiting... Violent Python is almost entirely
network security, with one chapter on forensics. Exploit dev vs. exploit user.

It depends on your experience level and what you want to actually learn. If
someone was brand new to Python, application security, and even programming,
I'd recommend reading Violent Python first and then Gray Hat. If someone has
more advanced security knowledge and has some decent programming skills
already, I'd probably tell them to skip Violent Python.

Or if they wanted to focus on appsec vs. netsec, I'd direct them to one or the
other based on that. If you want both, you should definitely read both.

~~~
dobbsbob
Black Hat Python is out in November:
[http://www.nostarch.com/blackhatpython](http://www.nostarch.com/blackhatpython)
'Automating offensive forensics' should be an interesting chapter.

~~~
meowface
Looks pretty interesting. I'll probably buy it.

------
meritt
1) What steps will Google be taking to ensure _timely_ vendor acknowledgement
and fixing of issues? It often seems a public disclosure is the only thing
which will trigger them to finally act (and then they like to fire back with
litigation). If a vendor simply ignores Google, will the bug go unannounced
indefinitely? Is there a reasonable timeline in which all bugs - fixed or not
- are made public?

2) Will Google take bug submissions from 3rd parties and offer any degree of
anonymity and/or protection for the bug-finder? I routinely come across gaping
security flaws but when we have an over-zealous judicial system and the CFAA,
it's often not worth the personal risk to get something fixed.

~~~
tptacek
Regarding point (2): if you're thinking about things like SQLI and XSS
vulnerabilities in websites, that's not the kind of research Google is likely
to be doing. But if you're thinking about finding memory corruption flaws in
software you install on your own machines: you have very little to worry about
from the CFAA to begin with.

~~~
meritt
Good to know, thank you. What in your opinion is the best way for someone to
submit a security vulnerability (the sort which could land them into legal
trouble, they don't have permission to be penetration-testing, could be viewed
as malicious, etc) or is it simply not worth the risk?

~~~
tptacek
You need to clarify. Are you talking about submitting a vulnerability in
someone else's website, or in a product you installed on your own computer?

In the latter case, it's pretty straightforward. Your legal liabilities in
that situation are (so long as you don't demand money) civil (you may have
violated a click-wrap that will probably prove toothless against security
research). You can tell the vendor directly if you like (from experience, this
isn't fun; you'll probably spend a couple hours in tier 1-2-3 tech support
hell). If it's an important target, you can also talk to bug bounty programs.

In the former case: it's not straightforward. Start by scouring the
target's website to see if they have a disclosure program, which will probably
equate to permission for looking for flaws in their site. If they don't, it's
very possible that you've broken the law in whatever process you used to find
the vulnerability. Submit anonymously and carefully. If they prove themselves
to be cool, you can always take credit later.

~~~
r0m4n0
You would think they would be more focused on vulnerabilities in websites and
the underlying technology, as Google is an internet company (though they have
morphed into a wider spectrum). Most recent public security outcries have
related to internet services, so my assumption was that would be their focus.

Part of me hopes they tread in this gray area so that we are forced to address
issues with the CFAA and how it's presently enforced.

It doesn't matter if you are sitting on a bean bag chair at a Google lab: you
have no defense in criminal court if you are unauthorized to test for a vuln
in someone's system. Disclosing a flaw is sufficient grounds for a felony.

~~~
tptacek
This isn't a "grey area". It's illegal to test web applications run by other
people for security vulnerabilities. The examples you've seen of above-board
security research targeting web apps fall generally into these buckets:

(a) Web apps run by other companies but which are available for download to
run on one's own machines

(b) Web apps run by other companies that have published bug bounties or other
forms of permission for testing

(c) Web apps tested carefully and, at first, usually anonymously (or, if not,
then by researchers working from jurisdictions where CFAA is hard to enforce)

~~~
r0m4n0
Right, they mentioned researching bugs like Heartbleed. Discovering and
disclosing bugs in underlying open source technology hasn't been the target of
prosecution under the CFAA in the past, so I follow you there.

If a prosecuting attorney wanted to, couldn't they charge the researchers that
discovered the bug? CFAA doesn't specify intent to do harm...

If someone used the disclosure of the bug to do harm, you have assisted in
unauthorized access of information.

~~~
tptacek
I don't understand what you're asking. The Heartbleed research they did
probably didn't have any CFAA implications for them.

The distinction isn't between "open source" and "closed source". It's between
"software running on machines you own" and "software running on other people's
machines".

Hundreds of thousands of vulnerabilities have been discovered in the past
decade and a half. None of those researchers have been prosecuted for
disclosing the vulnerabilities. If you disclose a bug and someone unrelated to
you breaks the law with it, CFAA does not say you're liable.

~~~
r0m4n0
haha not to drag this conversation out but just a hypothetical scenario:

I download and install Drupal on my web server. I find an SQL Injection
vulnerability in the login form. I post on a public forum the vulnerability
where someone proceeds on their own to deface a government website using that
knowledge. You don't think they would charge you in assisting?
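
For concreteness, the login-form SQLi in that hypothetical is the textbook string-interpolation bug. A minimal, self-contained sketch (sqlite3 stand-in, not actual Drupal code) of the flaw and its parameterized fix:

```python
import sqlite3

# Throwaway in-memory database with one user.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, password TEXT)")
db.execute("INSERT INTO users VALUES ('admin', 's3cret')")

def login_vulnerable(name, password):
    # String interpolation lets attacker input rewrite the query itself.
    query = ("SELECT COUNT(*) FROM users WHERE name = '%s' "
             "AND password = '%s'" % (name, password))
    return db.execute(query).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterized query: input is bound as data, never parsed as SQL.
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return db.execute(query, (name, password)).fetchone()[0] > 0

# Classic bypass: the injected OR clause makes the WHERE always true.
print(login_vulnerable("admin", "' OR '1'='1"))  # True: auth bypassed
print(login_safe("admin", "' OR '1'='1"))        # False: treated as a literal
```

The fix is mechanical, which is part of why disclosing this class of bug in downloadable software is routine.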

~~~
tptacek
No, they would not. The equivalent of this scenario happens _all the time_. In
the one case I'm aware of where the developer of exploit code was found
criminally liable for its use, that developer had a direct relationship with
the person who actually did the exploiting (for commercial gain).

~~~
r0m4n0
The law doesn't require relationship or commercial gain. I can't share my
story (and I'm sure there are others out there) but I can assure you: if you
mess with the wrong people, the CFAA has no bounds

~~~
tptacek
The story you told about discovering SQLI in Drupal simply isn't covered by
CFAA.

------
saurik
> Every bug we discover will be filed in an external database.

[https://code.google.com/p/google-security-research/issues/li...](https://code.google.com/p/google-security-research/issues/list?can=1)

I am surprised that they didn't use something (other than Google Code) where
one could more easily search by vendor or product, or other kinds of tags like
"privilege escalation possible", "CVE-2014-0160", or "stack overflow" (as far
as I know Google Code Issues doesn't support tags of any kind; maybe I'm
wrong?), but I can see the appeal of using off-the-shelf code from somewhere
else in Google.

~~~
vog
As of now, this database contains some strange entries. Perhaps they have
issues with access permissions?

    
    
      01 | Invalid | This is a test
      49 | Invalid | <please do not file bugs here>
      50 | Invalid | <please do not file bugs here>
      51 | Invalid | Random Guy Has Access To File Bugs
      52 | Invalid | Google PR doesn't respond to press inquiries
      53 | Invalid | The issue is the blog
      54 | Invalid | FR
      55 | Invalid | <please do not file bugs here>
      56 | Invalid | hello

~~~
AaronFriel
Google Project Zero has its Project #1: secure their bug tracker.

------
Cthulhu_
Another article or comment I read about this (or maybe an interview, IDK)
highlighted the main reason behind this endeavour: more bugs squashed means a
safer internet, a safer internet means people will be more likely to click on
ads. Because ads have a bit of a trust issue; ad networks have been used to
distribute malware via legitimate sites, and sites behind ads have frequently
been serving malware themselves.

So basically, similar to Google's other 'free' endeavours (Chrome, SPDY),
this is another project intended to make the web safer, faster, more trusted,
which by extension leads to more ad impressions / clicks.

------
p4bl0
The HN discussion on the Wired article was interesting:
[https://news.ycombinator.com/item?id=8035726](https://news.ycombinator.com/item?id=8035726)

------
danielweber
They say they are hiring. Where is the job announcement?

~~~
tptacek
If you want to be on that team, don't wait for the job announcement. Actually:
don't ever wait for job announcements. :)

------
melvinmt
It reads a bit like they're trying to recruit NSA employees to come work for
Google...

~~~
hessenwolf
Hmmm... job-for-life with enforced zealous national patriotism, vs five-year
job with enforced zealous corporate patriotism (and everything is so god-
damned colourful).

They'd have a tough sell to get the NSA employees I would think.

~~~
tzakrajs
Don't forget the part where they make significantly more money than working
for a federal acronym.

~~~
ForHackernews
The NSA pays quite well, and has excellent benefits. Also, if you want to work
on really seriously challenging mathematics outside of academia, the NSA is
close to being the only game in town.

~~~
jonknee
The NSA pays government salaries, if you want real money you need to be at an
outside vendor (a la Snowden). I'm sure the benefits are great, but so are
Google's.

[http://www.glassdoor.com/Salary/NSA-Salaries-E41534.htm](http://www.glassdoor.com/Salary/NSA-Salaries-E41534.htm)

[http://www.opm.gov/policy-data-oversight/pay-leave/salaries-...](http://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2014/GS.pdf)

------
antocv
Time for some trust-building with the community, after that um, unpleasant
discovery that NSA has been up in that Google, and Google willingly accepted.

About Google, Facebook, Apple, Microsoft etc, we have seen the list, Ive got
to say, scumbags.

They sold us all out and threw us under the bus.

Lavabit - a simple company put up more of a fight than a multi billion dollar
giant.

Never forget.

~~~
viraptor
"unpleasant discovery that NSA has been up in that Google, and Google
willingly accepted." [citation needed]

Specifically for the willingly accepted part.

~~~
spacefight
That fibers have been tapped by state actors was well known pre-Snowden (I
recall reading a story about how subs were used to tap Russian cables), yet
Google only completed encrypting their private fibers post-Snowden.

~~~
mike_hearn
The assumption pre-Snowden was that only "hostile states" did that kind of
thing, e.g. fibres in and out of China were probably tapped, but most of the
internet wasn't. Because, you know, search warrants do exist.

It's rather arrogant to say "everyone should have known". Snowden's leaks have
been making waves for a year solid now exactly because he showed that reality
was the worst-case scenario that only the most extreme of the extreme had
postulated previously.

~~~
vezzy-fnord
_It's rather arrogant to say "everyone should have known"._

No, it isn't.

[http://en.wikipedia.org/wiki/DCSNet](http://en.wikipedia.org/wiki/DCSNet)

[http://en.wikipedia.org/wiki/ECHELON](http://en.wikipedia.org/wiki/ECHELON)

[http://en.wikipedia.org/wiki/Clipper_chip](http://en.wikipedia.org/wiki/Clipper_chip)

[http://en.wikipedia.org/wiki/Room_641A](http://en.wikipedia.org/wiki/Room_641A)

[http://en.wikipedia.org/wiki/Project_SHAMROCK](http://en.wikipedia.org/wiki/Project_SHAMROCK)

[http://en.wikipedia.org/wiki/President%27s_Surveillance_Prog...](http://en.wikipedia.org/wiki/President%27s_Surveillance_Program)

etc.

Of course, no one listened. It was much easier to just decry everyone as a
"crazy conspiracy theorist", wasn't it?

~~~
spacefight
This. And thanks to everyone for the down-vote, much love from Google these
days, eh?

~~~
antocv
Jesus, the Google puppets are out strong on this one.

Amazing.

------
liricooli
Security is a priority for all these corporations only to the bare minimum
degree that would make most people feel secure enough.

I don't buy it anymore.

~~~
liricooli
I see the next PR headline: "We found so many bugs in all the vendors'
programs. At Google, there are no 0-days. Why not switch to our platform?"

------
tribaal
Good to see Google doing this!

Let's hope the bugs found are published in a timely and open fashion. Yes,
they say they will, but the proof is in the pudding :)

------
TomGullen
> Security is a top priority for Google

Can't take this seriously any more. Used to like Google, not any more.

Remember this was a company co-operating with the NSA.

~~~
bellerocky
> Can't take this seriously any more. Used to like Google, not any more.

> Remember this was a company co-operating with the NSA.

I don't have the same suspicions you do, and I don't know of any evidence of
Google willingly co-operating with the NSA beyond their legal obligation to,
but it might make more sense for Google to spin off this team into a non-
profit organization that they fund. Project Zero would get more credibility
and independence, and maybe more publicity and contributions from the
community. Maybe there's some kind of financial benefit for Google in
separating it out of its org, but I don't understand corporate structures or
finance even a little.

