Hacker News
How to beat AVG's fake traffic spew (theregister.co.uk)
7 points by lurkage on July 2, 2008 | hide | past | favorite | 5 comments


Is this really as important an issue as The Register is making it out to be? So the user fetches your page without looking at it. That's how the web works. Bandwidth, at least with my host, is nearly free... so who cares. Your ads and javascript-based analytics aren't being hit, so it shouldn't be affecting you monetarily. And even if it is, this is how the web works. Clients can do whatever they want. They can scan your page for malware, they can not display it, or they can download it over and over for no reason.


I agree. At most I think it could be argued that what AVG does is impolite client behavior. Do enough people even use this product that it's a problem to begin with? This isn't much different from all those web accelerators that were popular a few years ago, and I don't remember anyone griping about those taking up bandwidth. This whole thing seems like a non-issue.

I also wonder if the people talking about filtering or redirecting this AVG traffic are some of the same people clamoring for net neutrality. Granted, this is a little different because we're not talking about filtering at the ISP level, but keep in mind this effectively degrades the quality of a product that someone has paid for. If this was being done by Comcast or Time Warner we'd be raising bloody hell.


Personally, I use the web server hits to let me know what people are interested in. I don't burden my users with javascript analytics.

For the moment I can just drop all the data from IE6 users, but if AVG is serious about this they will have to start randomizing their signature or it won't work.
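As long as the signature stays fixed, dropping it from the logs is a one-liner. Here's a minimal sketch, assuming the scanner announces itself with an IE6-style user-agent carrying an extra numeric token (the exact pattern, and the `;1813` suffix used below, are assumptions about the reported signature):

```python
import re

# Assumed signature: an MSIE 6.0 user-agent with a trailing ";1813" token,
# as reportedly sent by AVG's scanner. Adjust the pattern to whatever your
# own logs actually show.
AVG_SIGNATURE = re.compile(r'MSIE 6\.0;[^"]*;1813')

def real_hits(log_lines):
    """Yield only the access-log lines that don't match the scanner signature."""
    for line in log_lines:
        if not AVG_SIGNATURE.search(line):
            yield line

sample = [
    '1.2.3.4 - - [02/Jul/2008] "GET / HTTP/1.1" 200 '
    '"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1);1813"',
    '5.6.7.8 - - [02/Jul/2008] "GET / HTTP/1.1" 200 '
    '"Mozilla/5.0 (X11; Linux) Firefox/3.0"',
]
print(list(real_hits(sample)))  # only the Firefox hit survives
```

Of course, this only works until they randomize the string, which is exactly the problem.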

AVG is a robot. They should use the robot exclusion protocol and clearly identify themselves in their user agent string.
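For comparison, a polite robot that did identify itself would be trivially easy for a site owner to opt out of via robots.txt. A sketch with Python's stdlib parser, using "AVG-LinkScanner" as a hypothetical user-agent name (AVG doesn't actually send one, which is the complaint):

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt, user_agent, url):
    """Check a robots.txt body to see whether user_agent may fetch url."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

# A site owner could shut the scanner out entirely -- if it identified itself:
robots = """User-agent: AVG-LinkScanner
Disallow: /
"""
allowed(robots, "AVG-LinkScanner", "http://example.com/page")  # False
allowed(robots, "Mozilla/5.0", "http://example.com/page")      # True
```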


My understanding is that the whole point of this feature is to identify bad sites. It defeats the whole purpose of the feature if it identifies itself as a robot and site owners can easily filter or redirect the traffic.


It depends on the site. A non-malevolent site that has malware injected into its HTML will still be caught. The injected content never gets a chance to react to the identification.

A malevolent site can just detect their mechanism and work around it, like not showing malware to IE6 users unless they ask for the content a second time in quick succession. (Might not work; I don't know if the AVG copy gets cached. One would hope it doesn't, for the sake of sites that do browser sniffing.)



