I've seen lots of pearl-clutching about the "dangers of deepfakes" etc., but frankly, it sounds like Wall Street's problem. Nobody asked them to build automated market-making bots that scrape Twitter for their signals.
I guess in that case I'd amend "Wall Street's problem" to "Wall Street's fault." But the fact that a Wall Street hiccup can impact the rest of society (note: in this case it did not) is a separate problem from deepfakes impacting either Wall Street or the rest of society.
It's notable that some news agencies and Twitter users were able to suss out the fake news, even if Wall Street bots weren't. So as long as we aren't launching nukes based on unconfirmed tweets, a little patience should be sufficient to avert most catastrophes that Wall Street would like to claim as our shared responsibility.
Like what? Ultimately the rumor was quashed by readers and news agencies who identified it as fake (by checking whether there really was an explosion at the Pentagon). What scenarios should we be worried about that we shouldn't also be worried about for any other kind of unconfirmed report?
It's not like militaries are declaring war based on unconfirmed news (unless they created it...), so what other time-sensitive reactions do you have in mind that could cause a catastrophe? I have trouble thinking of a scenario that could be caused specifically by deepfakes but not by any other kind of unconfirmed report.
If someone is taking drastic action based on fake news, then the proximate problem is the drastic action, and the secondary issue is the fake news.
1. Does this photo even show the Pentagon? I guess I don't recognize it? It could be a photo of a smoke cloud anywhere, and it's not clear that AI adds much value here, since this is also trivial to Photoshop. It seems like the text of "EXPLOSION AT THE PENTAGON" is doing most of the work.
2. Is there any evidence that the stock market dip was real? And that it was attributable to this photo? Seems like it's probably just noise.
Presumably, a well-done Photoshop could have looked much more realistic than this image, which has the characteristic AI warping distorting the sidewalk. Which raises the question: did any photoshopped images do something like this in the last 15 years, and if not, why not?
Makes you wonder about the credulity of these news sources. Like the average person, I consume news every day, and this post on HN at nearly 5:00 PM local time is the first I'm hearing of it.
I heard about it mid-day in the context of how it was spread by verified accounts on $8chan (the microblogging site formerly known as Twitter) and picked up from there by news feeds.