
I was listening to a podcast/article being read in the author's voice, and it took me an embarrassingly long time to realize it was being read by an AI. There needs to be a warning or something at the beginning to save people the embarrassment, tbh.



I think it will eventually be good public policy to make it illegal to post massive amounts of AI-generated text without disclosing it. As with all illegal things on the internet, it's difficult to enforce, but at least it would make it harder and less likely.


How about articles written by human charlatans claiming they are 'doctors' or 'scientists'? Or posters claiming something happened that didn't? Like a... pro bullsh*tter claiming he was denied an apartment rental because of his skin color. He could make a lot of money if that were true, but the poster is still taking up ad space, paid for by the poor 'suffering' minority. Another example: 'influencers' who, pretending to be (or really being) experts, advise you on forums about products. They tell mostly the truth, but leave out some negative details and competing products and solutions, without disclosing their connections to the businesses.

Shorter version: intentional bullsh*tting never ends; it's in human, and AI, nature, like it or not. Having several sources used to help, but now, with the flood of generated content, that may no longer be the case. Used right, this has a real effect on business. That's how small sellers live and die on Amazon.


Escape your aster*sks \* , please.


How would you accomplish this without every website asking for phone number and photo ID?


you people keep forgetting two things:

- there isn't a world government to enact such laws

- people would break those unenforceable laws


The Internet could be governed. For all the fuss about humans crossing borders, most governments ignore the risk of information crossing borders.


lol


What if it was good enough?



