Hacker News

For now, we're excluding Reddit posts that are clearly automated and making sure the YouTube content is not sponsored, which the YouTube ToS requires you to disclose.

We'll have to dig deeper into how to filter out spammy reviews. I can imagine analyzing a user's post history or detecting content that was clearly GPT-written, but it's hard to really tell. I know there are things like Amazon review analyzers out there, but we'll have to learn more about this. I wonder if the people of HN have any suggestions on this front.
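The post-history idea could start with crude heuristics before anything ML-based. A minimal sketch, assuming posts have already been fetched — the `Post` shape, the brand-mention ratio, and all thresholds here are hypothetical, not anyone's actual filter:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    subreddit: str
    created: datetime
    body: str

def looks_spammy(posts: list[Post], brand: str) -> bool:
    """Crude heuristics: an account that mentions the same brand in
    most of its posts, or posts at a very high rate in only a couple
    of subreddits, is more likely to be astroturfing.
    Thresholds are arbitrary placeholders."""
    if not posts:
        return False
    brand_mentions = sum(brand.lower() in p.body.lower() for p in posts)
    distinct_subs = len({p.subreddit for p in posts})
    span = max(p.created for p in posts) - min(p.created for p in posts)
    high_rate = len(posts) >= 20 and span < timedelta(days=1)
    return brand_mentions / len(posts) > 0.5 or (high_rate and distinct_subs <= 2)
```

Something like this would obviously have false positives (genuine fans of one product) and false negatives (accounts that pad their history with normal posts), so it's a starting point, not a solution.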

There'll probably be a lot of AI-generated reels online soon that look like they're from real people, too. I wonder what platforms like TikTok and YouTube will do about this. If this ends up being a huge problem, we can probably try to use ML methods to check whether a video was filmed in the real world.




What does "clearly automated" mean?


For now, it's just removing AutoModerator and things labeled as bots. Now that I'm reading this again, I realize this doesn't really help, since bots pretending to be people recommending products don't get filtered out.


That strikes me as very naive. Reddit bots are never marked as bots; that's the whole point of astroturfing. YouTubers aren't diligent about disclosing sponsorships either, regardless of what the ToS says.

Slightly outdated (2018), but they found that only 10% of Youtube videos disclosed sponsorship: https://www.engadget.com/2018-03-28-youtube-influencers-spon...

A recent report by the European Commission found that only 20% of overall "influencer" posts disclosed sponsorships: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_...


I think this is a very good point. We've focused a lot on correctly matching reviews with products / brands, but haven't taken a hard enough look at astroturfing.



