Hacker News

This is what's being accomplished by one amateur with a little bit of hardware. Imagine what will happen when you have professional services (legal or otherwise) making altered videos of whoever or whatever you want.

Want court evidence of your wife cheating? Fake grainy hidden camera video. Want to discredit a politician? Here he is picking up a male escort and using racial slurs. These sorts of things could be used to start wars.

Sure, deeper analysis might prove that this video isn't real, but that won't stop the immediate public reaction and response. The knee-jerk reaction to what the public sees is the end-goal. I'll bet good money that this technology plays a major role in future elections.




The problem I see with tech like this is that it muddies the water for everyone. I find it very strange that technologies like the internet appear to have created a world with more ambiguity and less transparency than we've had before - mostly because people now can find support for any kind of belief online. Some people are either willing to learn or have learned how to get information from reputation based systems and some will just go rogue and create their own reality. I hope democracy survives this shift: Because not only will we have to discuss the facts, now we will also have to discuss: What are the facts?


We've always had to discuss what the facts are - the extreme bias between newspapers and TV channels picking and choosing facts to provide to their users, even on the same news stories - news source A chooses not to provide background info, news source B chooses to provide background info from one primary source only, news source C chooses to provide background info from another source, news source D chooses to provide a general overview but with a slight bent towards assuming that its readers hold values X, Y and Z, etc etc.

You can go back and read stories from the 1950s which have these issues and more. There has never been a time where all individuals involved in a democracy have understood all the facts. Popular politics has always been driven by newspapers and special interest groups.

In my country, implementation of a certain policy has very clear correlation with a marked increase in suicide rates - the slow rollout of this specific policy across areas of the country provides an easy way to show this. Certain newspapers refuse to report on this - it doesn't match their ideology - after all, when pointed out, people holding this ideology will often point out that suicide is one's own decision and it's not the Government's problem.


>We've always had to discuss what the facts are - the extreme bias between newspapers

That's an assertion, but others say otherwise. According to this video below, Americans have argued values in the past, but agreed on the facts.

E.g. global warming. People agreed it was happening, but differed on what to do about it, if anything. Today, we dispute its very factuality.

https://youtu.be/XirnEfkdQJM


That video of the past was faked.

And the experts you cite that say it is real are in on it.

It's the Establishment trying to suppress the truth. That's what they say today. With more technology, the underdog with AI can beat the Establishment in a few weeks. And there will not be any way to distinguish what's true, any more than you can distinguish a Go line played by AlphaGo from any other really good Go line.


Which policy is causing suicides?

Seems like you'd mention it unless it is a politically charged issue, perhaps related to gender relations or immigration or something.


It's related to a benefits reform designed to cut costs by giving people less and making them jump through more hoops to get it, without providing any additional support.


In the UK there have been several benefit reforms.

One of these is a simplification of a bunch of benefits into something called "Universal Credit". It's a good idea, with some nice points. It's being implemented terribly, and is causing significant harm.

When someone makes a claim there is a minimum wait of six weeks before they get any money. In England a landlord can apply to the courts to repossess a rented home if the tenant misses 8 weeks of rent. So the rollout of UC is causing some people to be evicted from their homes.

The old benefits were paid fortnightly. UC is paid monthly. People have to budget very carefully to last all month. Poor people tend to be bad at budgeting, and being poor makes it harder to budget.

There is a very strict, punitive, sanctions regime in place. You'll see people being given sanctions because they attended a medical appointment or had to look after a child. People are expected to spend 35 hours a week searching for work, and they have to provide evidence that they've done so. That could be ok, but it means a claimant will send very many applications to unsuitable jobs, rather than spend a couple of hours polishing an application for a more suitable job. There's a lot more about sanctions and suspension: the rules are very strict; there are very many rules; the DWP and JC+ don't know what their own rules are and sometimes give incorrect advice. So sometimes a claimant will ask DWP or JC+ what to do, and will follow that instruction, and end up being sanctioned because it was the wrong thing to do.

Benefits are paid by the Secretary of State for the Department of Work and Pensions. That person is busy being a government minister, so they employ "decision makers" who look at the various acts of parliament, statutory instruments, and case law and apply those to the details of the claim. The quality of decision making is particularly poor at the moment.

Two judges (who are involved in appeals) have said this. Here's a useful article: https://linustechtips.com/main/topic/872276-ai-assisted-fake...

Note here they're not just talking about UC, but also about PIP and ESA. (PIP and ESA are disability benefits. ESA is an out of work benefit.)

To get PIP or ESA the claimant will need an independent medical. This is provided by a nurse, a physiotherapist, or an occupational therapist. These people are employed by private companies. Those companies have said that over 30% of the assessments they do are not acceptable. https://twitter.com/CommonsWorkPen/status/942773136390590465

Imagine that - imagine if fully one third of everything you did was not acceptable.

Finally, because the claim is about suicide: https://www.disabilitynewsservice.com/shocking-nhs-stats-sho...

We have to be a bit careful with these figures. (We'd expect the population claiming benefits for a disability to include more people who've had suicidal thoughts).


You just set your credibility on fire with the phrase "the extreme bias between newspapers and TV channels picking and choosing facts". What you are describing is literally the opposite of how objective journalism is performed. That you believe this is how professional newsrooms function indicates you haven't worked in the industry.


That you and vertex-four disagree not only on the facts, but on the nature of facts indicates that perhaps his point has some merit. ;-)

FWIW, I've been that primary source for some major news stories, and the final story usually bears just a passing resemblance to reality as I experienced it. Assuming good faith on the part of the reporter, I have to assume that reality as I experienced it bears only a passing resemblance to reality as other primary sources experience it, which is congruent with other known facts like the difficulty of communication, the existence of cognitive biases, and the field of psychology. Objectivity is a myth; it's oftentimes a useful myth, but it's worth remembering that the reporter has his own biases and preconceptions as well, and as the person choosing what to report upon, these will necessarily make it into the final piece. The audience has their own biases as well, so they won't be reading exactly the same piece that the reporter wrote.


"That you and vertex-four disagree not only on the facts, but on the nature of facts indicates that perhaps his point has some merit."

Unlikely. One of us has spent a large chunk of their professional career working in and around newsrooms.


The fact is that facts lie and don't always capture the big picture and they certainly don't capture the importance of an event in and of itself.

With millions of events happening each and every day the simple act of choice is incredibly meaningful.

Your own belief in the objectivity of modern newsrooms proves my point.

Clearly both Al-Jazeera and Fox News do fact checking... yet one would be left with a very different view of the world depending on who did the reporting.


It certainly proves nothing of the kind, and your use of the phrase "belief" is disingenuous. It implies an element of faith that is entirely lacking from any of my assertions. My statements are made based on direct observation of 35 independent newsrooms and direct access to the AP wire over a period of three and a half years. Fox is an outlier in the industry.


The fact that you use an argument from authority means you are the opposite of an authority on objective truth.

If anything, a newsroom is the one place where getting objectivity wrong has no consequences. In a typical company, if you are getting the truth about something wrong (e.g. what people want, what people are willing to pay, where the market will move), the lack of profit margins will refute you.

In a newsroom however objective truth is a hygiene issue like a cook washing his hands. They should. But if they don't it has very few consequences.


When I read newspapers, I often find that important context is missing from every article on a given story, unfortunately - and different context is missing per paper, to fit the biases of that paper.


I wouldn't say it's a shift so much as an arms race between hoaxers and investigators.

In the olden days you certainly had fewer ways to fake footage (no image editing or digital forms of manipulation), but at the same time it was also much easier to trick people into believing falsehoods. I mean, look at the older stuff on the Museum of Hoaxes. It'd be shot down in minutes in 2017, since it's obvious the people involved did very little work to back up their claims. You could literally walk into a town where a famous man's son went missing ten years ago and pretend to be him, with no one the wiser for months, despite not looking anything like the missing man and knowing very little about his life. [0]

[0] https://en.wikipedia.org/wiki/Tichborne_case

Or pretend to be a missing Russian princess/foreign visitor/royal with no one being any the wiser. Or lie in newspapers about how Astronomers found a society of advanced humanoids on the Moon. Either way, while there were few ways to falsify information to the same quality as can be done now, it was made up for by how easy it was to trick people in a world without the internet, a decent press/journalists or easily hidden cameras.

As history went on, obviously the number of ways you could falsify evidence increased, as did the convincingness of the trickery. However, at the same time it also became much easier to debunk false claims, so the quality of the fakers had to get better if they wanted to stand a chance of fooling anyone.

Hence the arms race. The better technology gets at letting people forge evidence, the better it also gets at letting people debunk fakes.


Just as food for thought, being able to lie or con people is also not necessarily a bad thing: Giorgio Perlasca was able to save more than 5000 Jews from deportation by posing as the Spanish consul in Nazi-occupied Budapest.

https://en.m.wikipedia.org/wiki/Giorgio_Perlasca


That too. Lying isn't necessarily evil, and the fact it's possible has averted a lot of tragedies as well as caused others.


I think this is a fallacy; the olden days were way worse. To put it in perspective: the idea that carrots improve vision is propaganda created by the British, yet see how many people believed it. Humans inventing and accepting fake beliefs to comfort themselves was never a new thing. Governments gaslighting their own citizens is Communist Party 101. People always want to believe that yesterday was better and blame their problems on the new.


> Carrots improving vision is propaganda created by the british yet see how many people believed it.

Not exactly propaganda, more misdirection; maskirovka, as our Russian comrades call it.


> I find it very strange that technologies like the internet appear to have created a world with more ambiguity and less transparency than we've had before

Any technology created by humans will be a mirror of human shortcomings. Techno-utopians believe that technology will somehow make humanity better. It won't. We'll just use it to do the same terrible shit we've been doing since the dawn of time.


This is another means by which advanced communications and information capabilities undermine social trust.

https://www.reddit.com/r/dredmorbius/comments/6jqakv/communi...


All of this has been possible for the last century.

For instance, at least in Russia you just get an actor that looks like the politician, film them doing some bad stuff and blackmail them. Pretty simple.

We have had doctored photos/videos for as long as the mediums existed.

If anything it will saturate, and you won't be able to tell what is real or not. Most people will default to "fake" and have to rely on the trustworthiness of the source. Just like now. We have Photoshop and there isn't mass chaos.


> Most people will default to "fake" and have to rely on the trustfulness of the source.

Until faked video of the trusted source him- or herself appears.


I think we’ll probably have to move to a model where any security footage or images used in court must be digitally signed at the hardware level, with a verification process similar to how CAs work now. Of course, this will still be tamperable, but tampering would have to be much more premeditated and sophisticated.
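The idea can be sketched roughly as follows. This is a toy illustration, not a real design: it uses a hypothetical shared per-device key with an HMAC, whereas an actual CA-style system would use an asymmetric key pair so verifiers never hold the signing secret.

```python
import hashlib
import hmac

# Hypothetical secret provisioned into the camera at manufacture.
# A real scheme would embed a private key in secure hardware and
# publish a CA-certified public key instead.
DEVICE_KEY = b"example-device-key"

def sign_frame(frame_bytes: bytes, timestamp: str) -> str:
    """Camera side: bind the frame content to a capture timestamp."""
    msg = hashlib.sha256(frame_bytes).digest() + timestamp.encode()
    return hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes: bytes, timestamp: str, tag: str) -> bool:
    """Verifier side: recompute the tag and compare in constant time."""
    expected = sign_frame(frame_bytes, timestamp)
    return hmac.compare_digest(expected, tag)

frame = b"raw sensor data for one frame"
tag = sign_frame(frame, "2017-12-01T12:00:00Z")
assert verify_frame(frame, "2017-12-01T12:00:00Z", tag)
# Any edit to the pixels (or the timestamp) breaks verification.
assert not verify_frame(frame + b"tamper", "2017-12-01T12:00:00Z", tag)
```

Even in this toy form it shows the property being claimed: a signature proves the bytes came out of a particular device unmodified, nothing more.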


The privacy implications of that would be scary. It seems on par with North Korea's Red Star watermarking, which has been dubbed a "dictator's wet dream".

https://qz.com/583311/north-koreas-dissident-tracking-comput...


That is absolutely never going to be workable, because you can just film another video being played back.


I mean, there are all kinds of ways to bound uncertainty. You’re right, you could film another camera, but that could be addressed separately. Digital signing would only prove that a video came directly from a particular camera’s sensors. To assert the veracity of a video, you’d then have to prove that particular camera was at the scene, and the light coming into the lens was authentic. Most court footage is incidental, not intentional, so the sort of doctoring you describe is unlikely in a majority of cases. Where it is likely, different measures could be taken to bound uncertainty.

Separately, generating false images or videos for libel purposes would become more difficult; if a camera that has never been owned by a person is used to produce a signed video that is pornographic in nature, the authenticity of that video is immediately suspect. It would take a much more sophisticated attack to produce a mathematically convincing fake video.


if you are worried about somebody actively faking footage, and are willing to go to such drastic DRM measures to prevent it, you need to explain how it is actually going to solve the problem, not just handwave.


I can imagine watermarking at the light sensor level. It would need some very special provisions for compression, though.


Problem is, this way journalists won't have anonymous sources anymore and you will be able to track any citizen filming power abuses.


Well, but then you'd have to trust the light sensor manufacturer to provide secure keys. Remember when the NSA hacked into Gemalto and had a chance to steal its SIM card keys?

However, that's still interesting; isn't it possible that each light sensor has certain watermark-like characteristics arising from the manufacturing process?


Yes, fingerprinting of digital cameras has been a thing for at least a decade [https://www.eurekalert.org/pub_releases/2006-04/bu-bur041806...]
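That fingerprinting is typically based on the sensor's fixed-pattern noise: each pixel responds slightly differently, leaving a faint pattern in every image from that camera. A toy sketch of the matching step, assuming a reference noise pattern for the camera has already been estimated (the data here is made up for illustration):

```python
import math

def correlation(a, b):
    """Normalized cross-correlation between two noise residuals."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

# Toy data: a camera "fingerprint" (per-pixel noise pattern), a
# residual from the same camera (fingerprint plus scene noise), and
# a residual from a different camera.
fingerprint = [0.5, -0.3, 0.8, -0.1, 0.2, -0.6, 0.4, 0.1]
scene_noise = [0.05, -0.02, 0.03, 0.01, -0.04, 0.02, 0.0, 0.03]
same_camera = [f + n for f, n in zip(fingerprint, scene_noise)]
other_camera = [-0.4, 0.6, -0.2, 0.3, -0.5, 0.1, -0.3, 0.2]

# The same-camera residual correlates strongly with the fingerprint;
# an unrelated camera's residual does not.
assert correlation(fingerprint, same_camera) > correlation(fingerprint, other_camera)
```

In practice the residuals are extracted by subtracting a denoised version of each image, and the comparison runs over millions of pixels, but the decision rule is the same: high correlation with a known fingerprint suggests the footage came from that sensor.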


While the technique does make fraudulent videos easier to make, I don't see this in and of itself as that fatal to public trust. Faking content to hurt someone has been around forever. We then learned not to trust gossip, not to trust uncorroborated sources, etc.

I hope it might actually help people be more skeptical of what they're reading and seeing. And hopefully it will help news media be more skeptical of their sources as well. If that doesn't happen then this will be a disaster.


> We then learned not to trust gossip, not to trust uncorroborated sources, etc.

Somewhat ironically, I feel a citation is needed here.


Makes me think of the phrase "pics or it didn't happen".

Granted it was a lot more common before photoshop-style software was everywhere.


Video by itself will shortly be a medium that no one believes as ground truth, just as the written word or a drawing is now. We should be able to construct better cameras with tamper-proof footage. But I'm not sure how, since with real-time editing we cannot trust trivial things like secure timestamping. Secure video blockchain, anyone?
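The tamper-evidence half of that idea can be sketched as a hash chain: each frame's record commits to the previous one, so retroactively editing any frame breaks every subsequent link. (A toy sketch only; a real system would also need a trusted time source and signed, externally anchored checkpoints.)

```python
import hashlib

def chain_frames(frames):
    """Build a hash chain: each link commits to the frame bytes
    and to the previous link's hash."""
    links = []
    prev = b"\x00" * 32  # genesis value
    for frame in frames:
        link = hashlib.sha256(prev + frame).digest()
        links.append(link)
        prev = link
    return links

def verify_chain(frames, links):
    """Recompute the chain and compare every link."""
    return links == chain_frames(frames)

frames = [b"frame-1", b"frame-2", b"frame-3"]
links = chain_frames(frames)
assert verify_chain(frames, links)

# Altering any earlier frame invalidates the rest of the chain.
tampered = [b"frame-1", b"FAKED-2", b"frame-3"]
assert not verify_chain(tampered, links)
```

This only proves the footage wasn't edited after capture; it says nothing about whether the scene in front of the lens was real, which is the harder problem raised upthread.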


> We then learned not to trust gossip, not to trust uncorroborated sources, etc.

If any more proof were needed (beyond Facebook, Fox, ads, government propaganda...) that this never happened, I think the Trump debacle should be final evidence that it's a fantasy.


You overestimate people


This just gives more power to "authority". If subjects can't believe their own eyes anymore, who will tell them what to think? NBC would be happy to take the job...


How long will it take for this approach to be used to splice people into child pornography? The court of public opinion might sentence the victim to maiming or death.


Such things have been known to happen. Emmett Till. The Dutch de Witt brothers in 1672 are a notorious example. False testimony, witnesses, and evidence are all things.

http://tywkiwdbi.blogspot.com/2010/09/corpses-of-de-witt-bro...


The end game I see is that people won't trust photos or videos anymore. That politician picking up an escort and using racial slurs? Real video. But he'll say "oh, that's just AI, I would never do that" and now he has the plausible deniability.

The end result is more freedom and privacy because people won't trust "hard evidence" anymore. I think this might be a good thing.


As I have said for years, our current systems rely on the inefficiency of an attacker.

Progress in technology ruins that illusion.

Want to play some heads up poker or chess with someone? How sure are you they aren't getting instructions from a program through some means?

Don't have anything to hide? How about an AI to dig up plausible sounding parallel construction stories to convince a jury?

And in a few decades we're talking about flirting, humor, writing stories, producing any sort of content.

There will be a time when a lot of these false positives will still be accepted, until people just stop trusting EVERYTHING.


The propaganda potential will be interesting as well. My favorite recent example so far: https://www.youtube.com/watch?v=LYEY4x0_RDY

With a few obvious exceptions -- like one particular shot of Ivanka's face that crosses the line from realism all the way past comedy into horror -- some of these replacements are almost good enough to start fooling people. Voice synthesis is approaching the 'solved problem' point as well (e.g., https://www.youtube.com/watch?v=YfU_sWHT8mo ). Not there yet, but getting there.


Most obvious use-case to my mind is stock (or crypto) price manipulation. I wonder what would happen to the ETH price if Vitalik Buterin suddenly "announced" he has seen the light and is quitting ETH to work on DOGE :)


This is why we all should be skeptical of the media ( traditional and social ) as public skepticism is the only thing preventing mass hysteria. If the public were a tad bit more skeptical and didn't believe everything they read in news or social media, we would have avoided the wars of the past 15 years.

But every criticism of traditional media here gets flagged. Which is ridiculous for a "hacker" news site, when you think about it and the hacker ethos.


These days you don't even seem to need any evidence. An allegation seems to be enough to get you dethroned.


You can't update the state of the technology and assume that everything else will remain unchanged...


Deeper analysis by whom? Who cares about deeper analysis anymore?


> Want to discredit a politician? Here he is picking up a male escort and using racial slurs.

Worked so well in a recent election...

> These sorts of things could be used to start wars.

Wars have been started on much flimsier pretexts, unless you are talking about getting the other side so irate that they fire first.


And that’s why mandating that everyone use their real, verified identity everywhere on the web is needed. You can still comment and post whatever anonymously, but it won’t hold any weight!


That doesn't really make sense in this context; if someone is pretending to be you using your own face and voice, using real identities is kind of the point, isn't it?


Ummm, but if it’s not you, the uploader, through a verified ID system (blockchain, maybe), is held liable immediately. Also, by the time such a system is in place, the public will be even more skeptical of believing what they see on the net as truth. That's already happening with those 40 and below, in terms of finally waking up to all the fakery on the web. I just saw on FB a video where a girl crosses her eyes and they get stuck. The majority of commenters below 40 noted it’s fake. Those above 40 commented otherwise.

Either we instate an ID system or the web becomes a joke no one believes. The latter I guess wouldn’t be a bad thing.


Well I disagree.

There are many ways to disprove such video, and if the technology exists, we already know that it exists, so it will make any video, real or not, much less relevant.

The tale of new technology bringing war and mayhem might make sense, but in reality it doesn't. You just wish it would happen, because you might want some fiction that asks "what if?", but it's just a wish. It's also why I dislike Black Mirror as a TV show. Dystopian science fiction is boring most of the time.


What a misplaced comment. Perhaps you would like to add the many ways the video could be disproven. Instead, you chose to ramble on about personal preference in sci-fi entertainment.



