That makes it sound like Find Articles is quite likely to have a lot of duplicate content, one of the things that Google's update was meant to penalize:
> This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful.
I'd hope those particular articles would still surface, then. Their other content, though, is of low value.
I'm sure this can also be used for evil purposes by content scrapers, but as a startup with limited resources, an automated (and free :) way to stop such content would be great.
If my site is a content farm, then surely so are sites like Stack Overflow and TripAdvisor, since I'm using the same model of moderated and curated user-generated content. And while I don't think my site is quite as useful as Stack Overflow (what is?), I've been running it as a labour of love since 1997, and I've had countless emails from people who've found it useful, so I must be doing something right.
"we've still got several changes in the pipeline"
Matt, are these changes/domains related to the farmer update specifically, or just in general? Any idea when we can expect to see something?