What our systems found was definitely a compromised JS file, and others on this thread have posted something similar to what we saw. This is not a false positive.
We have detailed help for webmasters in this kind of situation:
One thing I strongly suggest to any webmaster in this situation is to look for the server vulnerability that allowed this file to be compromised in the first place. We sometimes see webmasters simply fix the affected files without digging into the security hole that allowed the hack, which leaves the server vulnerable to repeat attacks.
Happy to answer questions.
>> Malicious software is hosted on 4 domain(s), including cobbcountybankruptcylawyer.com/, stephaniemari.com/, northgadui.com/.
What does this mean? How do these sites relate to php.net?
Verified owners in Webmasters Tools get more info.
As a student interested in security, will the js file be provided so we can examine and learn how this was done outside of privileged access?
Okay, looks like Safe Browsing says it is no longer suspicious. And I think someone already provided the JS file below.
Google's Safe Browsing looks pretty cool! Really powerful infrastructure. I wonder if they did this in cooperation with VirusTotal?
Can VirusTotal recognize this?
Another thing: other search engines don't seem to have this built in. I wonder whether people using DDG will ever think about querying Google Safe Browsing.
Does Google provide an API (besides just querying Safe Browsing directly)?
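It does. Google publishes a Safe Browsing Lookup API: you POST a list of URLs and get back any threat matches. A minimal sketch of building such a lookup request (the `clientId` value and the choice of threat types here are illustrative placeholders, and you'd need your own API key to actually send it):

```python
import json

# v4 Lookup API endpoint; requests require ?key=YOUR_API_KEY appended.
API_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_request(urls):
    """Build the JSON body for a Safe Browsing v4 threatMatches:find lookup."""
    return {
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

body = build_lookup_request(["http://php.net/"])
print(json.dumps(body, indent=2))
# To actually query, POST this JSON body to API_ENDPOINT with your key
# (e.g. via urllib.request). An empty {} response means no match was found.
```

So a DDG user (or DDG itself) could in principle check URLs against Google's list this way, subject to the API's usage terms.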
There are huge repercussions for any website that gets blacklisted with the Stop Badware clearinghouse, not least the inability to figure out exactly where the problem actually is, because the information the company you work for gives a webmaster to resolve the problem is ridiculously minimal. There are no notifications (unless you are signed up for Google Webmaster Tools), and restoring a website to normal globally can take anywhere from 48 hours to two weeks. Millions of developers rely on the PHP website daily to do their jobs, and you've now made it that much harder for us to do ours.
Stop Badware needs a serious overhaul. At the very least, they should contact the contacts in the WHOIS record for the domain BEFORE doing anything. Give the website owners 24 hours to resolve the issue before blacklisting the site. And give them a heck of a lot more information to go on than some vague text.
Also, several anti-virus vendors out there use the clearinghouse database in their products... 6 to 12 months after the original blacklisting. So this will happen all over again 6 to 12 months from now. Finding contacts at anti-virus vendors to get domain blocks removed is a lot harder than getting removed from the blacklist on the Stop Badware site.
The CORRECT solution for this situation was to find a contact at PHP who could resolve the issue quickly and amicably. Seriously, how hard is it to locate Rasmus' e-mail address? Always try to find a human contact before using Stop Badware. You can chalk up using Stop Badware for the PHP website as the dumbest decision you've made this year. Hopefully this decision of yours will raise the ire of the Internet just enough to force the company you work for to revamp Stop Badware so it doesn't suck, Google Webmaster Tools so it doesn't suck, and the reporting tools for sending information to Stop Badware so they also don't suck.
Why? This isn't a responsible disclosure, "we found a potential vulnerability but we don't know if it's being exploited yet" kind of situation. This is a "there's a real threat to anyone visiting that site via your search engine right now" kind of situation.
As a user, I'd be much happier if the search engine flagged this immediately.
As a site owner, if someone found malware on my site I'd want to know ASAP too. Obviously it would be helpful if they sent me a notification and made the specific details of the identified threat available. However, I could hardly criticise them for blacklisting my site while it should be blacklisted, or for claiming that we were dangerous while we were actually serving malware.
Not clearing up the blacklists promptly after the threat is identified and removed is an entirely different question. If you're going to go around blacklisting sites then I think you also have a responsibility (and, for that matter, you should also have a legal obligation) to remove them from the blacklist with similar efficiency if you're notified that the threat has been removed. Claiming that someone's site is dangerous when it isn't is defamatory, and should be treated as such.
> You sir, get -9,001 Internets.
Cut this shit out, you're not on reddit.
It's incorrect to say that Google doesn't attempt to contact the site owner. According to the Webmaster Tools support site, Google will send notices to several common email addresses when it blacklists a site.
Why should Google do that? Because it's Rasmus? They don't have to do that, period.
The CORRECT SOLUTION is to protect users FIRST and not allow the site to infect more computers.
IT IS NOT Google's responsibility to warn webmasters that their sites are infected (though they may be warned by email automatically).
IT IS the webmaster's responsibility to audit his website's security, which obviously did not happen with php.net. If they get punished for that, that's FAIR, because it will force them to take security more seriously.
It's high time people moved from httpd to something else like nginx. httpd is insecure by default; this is not how you deal with security. As for PHP, since it doesn't promote any good security practices by default, it should be avoided.
A million times THIS!