Hacker News

Hmm. Isn’t “AI” an algorithm too?

The theme of training a content moderation AI by having humans mark metadata on the content runs throughout the article.

“Might provide a potential reduction” feels overly hedged. LLMs are genuinely decent at taking content as input and responding to a prompt like “tell me about this.”

There is an undesirable human price being paid to label content to train an algorithm. Maybe if it were done by people with tech skills they could reduce it. Instead, for now, the work appears to be outsourced, traumatizing people who don’t have the health support or the salaries of those building the AI in the first place.

Someone is choosing to send this work far, far away to a much more vulnerable population, and maybe in time it won’t have to be that way.

If any group is capable of making a difference in addressing the concerns about this approach, it’s the people most capable in the tech, not the least.

Content recognition is something AI did pretty well before the current wave of AI. If anything, this might boost tools like OpenCV.

Training a content moderation AI is meant to do what social media hasn’t been able to do: moderate content at scale without relying on humans.

“AI” is a popular word in the zeitgeist right now, but that doesn’t mean it’s disconnected from everything that already exists; it’s built on existing knowledge and practices.



