
The ability to fake photos has been around for a while and it hasn't been a big deal. People have, on the whole, proven savvy enough to mostly discern real from fake. My prediction is that the same will be true for ML-faked videos.


For quite a while, and probably still now, advertising photos have been routinely doctored without people realising it (e.g. fashion or makeup models). This was a scandal in the UK a few years ago; one outcome was that plus-size models began to be featured more often. So in some cases faking can go unnoticed and be a serious problem.


Besides, look at how many reddit submissions are just an image of a headline, or a picture with a caption. Not even a link to a source. Yet they get thousands of upvotes and reactions, taken at complete face value.

To worry about fake photos/videos seems out of touch with the current state of the internet. People will upvote and bicker about a screenshot of the text of a tweet. Why even bother with ML when an image of an outrageous sentence is enough?


That's what I thought too, until I saw The Shining starring Jim Carrey.

https://youtu.be/HG_NZpkttXE



