Let's back this up a few steps. You're an actor, you appear on television. By being filmed and accepting the pay offered to you, you're agreeing to allow these images of you to be disseminated to the general public, for their enjoyment. But what if somebody finds you attractive, and looks at your picture whilst... you know. Can you sue? No, they're well within their rights to do so. What if they cut your face out and place it over a Playboy centerfold? Same deal. Several technological innovations later, here we are. Fundamentally, nothing has changed. Fundamentally, people are still 100% within their rights to combine images legally obtained in this way. And post them online. This may not be what the Internet was created for, but this was always what it was used for.
Are they? If you distribute movies and media to the public, you normally are breaking laws. (Hence the infamous FBI warning on movies for a couple decades now.) Likewise, you cannot just use people's likeness in marketing and other public uses without their permission. So I'm not at all sure that just because you have an image, you have the rights to create and distribute derivative works from it.
Tell that to Prince and Peter Cushing. ;)
> If you distribute movies and media to the public, you normally are breaking laws.
That's not what we are talking about. You can transform any public image for parody, criticism, etc. Porn is considered speech so anyone can make pornographic parodies/etc. This is especially true if fans are doing so for fun and not for profit.
Of course reddit has a right to ban it from their platform, but you as a fan can transform any public image and criticize, parody, etc it.
Maybe, maybe not. Transforming generally creates a derivative work. On its face that requires permission from the owner of the copyright of the work, but there may be an exception that allows it in particular cases. In the case of parody in particular, it MAY be covered by fair use.
A lot of people on the net think that parody is automatically fair use, usually from misunderstanding the Campbell v. Acuff-Rose Music, Inc., 510 U.S. 569 (1994) case.
Briefly, in the case, the district court said that parody was fair use. The appeals court said because it was commercial parody it presumptively could not be fair use.
The Supreme Court said they were both wrong, and it might be parody and sent the case back down to the lower courts.
A lot of people just looked at it as the Supreme Court reversing the appellate court's reversal of the district court's ruling that parody was fair use, and took it as the Supreme Court therefore saying parody was fair use. (I don't blame people for misunderstanding--the press is generally terrible at reporting Supreme Court decisions. They often fail to interpret Supreme Court rulings in the context of the lower court decisions that led to the case.)
This is true. It's not censorable free speech, meaning the government can't prohibit it ahead of time. But speech can be subject to tort claims despite surviving the First Amendment. A deepfake victim can sue for damages in civil courts, and there's a 99.99999% chance they'd win massive damages every time.
> This is especially true if fans are doing so for fun and not for profit
Profit motive may affect the amount of damages, but it doesn't affect whether or not the victim can sue and win in a court of law. Even in the US, you'd be paying out significant damages to someone for damage to reputation or use of likeness.
In any case, you shouldn't pretend as if this is morally okay just because you can focus in on each individual step and find a way to justify it. There are a lot of crimes that clearly fall into this bucket: stalking, for example. Any one interaction may be innocent, but it's the sum total of actions that completes the picture of abuse.
If you endorse distributing fake naked pictures of celebrities, you are a bad person, no matter your line of reasoning.
This idea of altering images has always confused me. When I was a kid, I got a scanner and a printer. The first thing I did was take a picture of a Battletech Griffin 55 ton battlemech (a giant war robot) and replaced the robot head with that of my mom, titling it “Grifmom.” I thought it was awesome and she burst into tears and beat me terribly. I never understood why. And this didn’t have any nudity or offensive material, just a normal face on a robot body.
Perhaps this is related to your aversion?
Do you imagine others naked? Is that disrespectful?
It’s really interesting hearing about other people’s moral systems.
You can't simply copyright your likeness.
As for socially, consider this: They have used the likeness of Bruce Lee, Carrie Fisher, Prince, and Peter Cushing in commercial works, undoubtedly without prior consent. We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi). This appears to be quite socially acceptable, so I'll run with the idea that "socially acceptable" is pretty fluid when it comes to celebrities' likenesses.
Which country are we talking about? Because that is not the case in the U.S. https://en.wikipedia.org/wiki/Personality_rights#United_Stat...
> They have used the likeness of Bruce Lee, Carrie Fisher, Prince, and Peter Cushing in commercial works, undoubtedly without prior consent.
Who is "they"? Who has been using Fischer's and Cushing's likenesses without contractual agreements?
> We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi)
Public photography has legal protections. Doesn't mean that if you took a photo of Obama at the gym you can use the photo in an ad campaign that suggests his approval of commercial usage.
They licensed their likenesses from their estates, or used licensed footage (e.g., film clips of Bruce Lee).
We also have "public" nude photos taken and distributed of celebrities on a regular basis (paparazzi).
Photos taken in public are not subject to the same tort concerns as private photos, or faked photos purporting to be of a celebrity.
Mass redistribution of somebody's image is not a "right".
A photograph of something is open to interpretation or dispute, but a video is a series of such stills, each one slightly different, each of which adds to the provenance of the whole.
You have two images, one physically overlaying the other...it's not even remotely the same thing as a single integrated image, technically or legally. Integrating the images changes everything.
> Fundamentally, people are still 100% within their rights to combine images legally obtained in this way. And post them online.
Completely. False. You may have the right to combine images for your own personal use (in the US, ignoring discussions of CP), but you absolutely do not have the right to distribute those images, and the associated tort actions both pre-date and have survived the First Amendment.
What if someone combines it with some kind of "deep ageing", to create artificial child porn? Sexual representations of children, even if completely artificial and involving no victims, are illegal in many jurisdictions, but not all.
We're only scratching the surface of what semantic editing of video is going to be capable of. It's a very big barrel of worms.
Just because the technology is out there doesnt mean we throw our hands in the air and give up. Yes it’s there, it’s going to be used for most of the nefarious cases we can imagine, but that doesn’t mean we have to tolerate someone using our image against our will.
In fact, I would imagine that the data privacy advocates I often see on hn should see this as a logical extension of the privacy protections they want to see across the web.
No, Lyft employees should not be able to view our trip history willynilly. No, the NSA should not be able to gobble up all of our google searches for profiling us. And no, we should not have to suffer being put in porn against our will because we are a person in the public eye.
This stuff should be treated like revenge porn, IMO. Functionally it’s the same even if the technical implementation is different
Then there is the case where the program generates porn for me. If I privately generate porn using a famous person's image, to me that's my right. No different than me fantasizing about them in my head.
That’s why I think this is a poor argument.
I wouldn't be violating their privacy any more than taping a picture to a pillow and humping it. Are you claiming that is illegal? If not, then it feels like it's the quality of the 'content' you are against, not the act.
These are centralized entities. In theory it is very easy to implement sound internal controls surrounding the access of that type of data given audit logs or whatever. The issue with deepfakes is that, given a reasonable stack of images, anyone can produce their own porn. If you're a woman with many, many photos on a public Instagram account, you can be a likely target given that anyone can go ahead and scrape your images. However, if you're one of the millions of wives who doesn't have a large presence on social media, then the deepfake can easily be traced back to the spouse who has access to that private stack of photos.
The normal caveats apply: you need access to a GPU, probably access to Tor to download the source in case GitHub shuts down the repos, etc. But when there is a will there is a way.
The question is whether or not my likeness is my data, which I don't think has ever been settled. Anyone can take a photo of me in public and the photographer owns the copyright. Are security cameras recording me violating my privacy? On the other hand, football players have to be paid to have their likeness appear in games.
I think this might be the catalyst to resolve these issues once and for all.
This means that you can demand money if the photographer sells that photo to anyone, especially if the sale is for commercial use.
Meanwhile nobody fixates on all the horrible stuff that can be done with cameras or Photoshop or Poser.
You can, or will be able to soon, do previously Hollywood standard effects at home. You could put Tom Cruise in your wedding video. You could put yourself in Mission Impossible. You could have the cast of the Magnificent Seven in the movie you created with a few mates. You could do tons of imaginative, funny, creative stuff.
Celeb porn, sticking the head of a baby on the body of an adult porn star, or any other sad/weird porn-related use - all of which has been done with images for a quarter of a century - are the least interesting things about this.
As to the reddit ban, I doubt anyone is surprised.
It's not even that complicated. Petite actress + child's face. I take solace in the fact that, from what I saw of the subreddit, it seems quite difficult to create something decent easily.
Actually I imagine that day won't come.
- Porn is made with consent, where participants should be reasonably aware that it will be published
- Porn tarnishes reputations
If you pasted Daisy Ridley's face in a crowd in, say, China, doing every day stuff, no one would rightly care because there's no real potential unless you are doing it for some ulterior motive.
They have been, since inception, illegal, in the sense that they represent a tortious act. They might be illegal in the criminal sense, depending on the jurisdiction, even in the US if absent meaningful context bringing the fake under the protection of the First Amendment as a form of speech.
What if I make a fake video of a specific person being physically attacked?
He didn't consent to be attacked and it may tarnish his reputation. Is it unethical to do any of the above?
1. r/deepfake exists, draws novelty, and has appeal in part because it's explicitly identified as fake. It's part of the name. So not only is there no claim to it being real, it's explicitly identified as fake. It's hard to argue how there's any nonconsensual anything when the parties involved in the deepfake porn all agree and understand it's fake.
2. Let's say that someone posts deepfake porn as real porn. Now this is a different issue, one closer to libel. That's where there's some misattribution of something to the potential victim. The victimization comes from the assertion that it is about the individual (as opposed to a deepfake creation, where the opposite is being asserted).
3. Let's say that, out of curiosity, you, in the privacy of your own home, on your own hardware, create deepfake porn involving your spouse (who fully consents and wants you to do so because it's arousing to them) and publicly available imagery. You do not distribute it. By the "nonconsensual porn" logic, though, you have now engaged in something akin to sexual assault, by engaging in nonconsensual photography. But this is absurd, because the public figure has suffered nothing, and nothing was obtained from them without their consent. It's your (and your spouse's) creation.
4. Let's take this a step further, and say that a year from now you create software that will create a simulation of a person solely from its knowledge of what humans look like. Let's say that you obtain something that looks like a celebrity. Have you now created nonconsensual photography?
5. Let's say you find a person who is the doppelganger of a celebrity--a dead-ringer lookalike. You film them in porn. (This has actually been done.) Is that nonconsensual porn, because the celebrity's likeness is being used without their consent? The porn actors/actresses consented, though--why is deepfake any different, when there's nothing to consent to? Why does the porn actors'/actresses' consent supersede that of the celebrities whose likeness they resemble?
The logic behind this reddit (and pornhub, etc.) decision is full of holes as far as I'm concerned, and it creates a very dangerous precedent concerning consent. It essentially gives people power over others' likenesses due to their popularity.
If that fake sex gif you made of your wife gets spread around her workplace, you better believe that would be an extremely humiliating and emotionally traumatic experience. An experience these rules will be created to prevent.
What's there to be on the fence about? It's terrible and an infringement of free speech. But Reddit has the right to ban it as a private company. It just makes Reddit look terrible and hypocritical.
But considering they are planning on IPOing, I guess they have to sell out.
The only thing that makes this not a threat to regular folks is the fact that it needs a lot of reference images to build the model. But imagine a politician or activist: they have a lot of images on the net, so this could take fake news to another level. Yes, if this happens, the news media and legal system will probably not take shady videos seriously without verification (especially now, while the algorithm is still in the middle of the uncanny valley, so for the moment fakes are easy to recognize without needing experts). But think about your friend, uncle, or cousin who shares his "echo chamber" posts on Facebook.
Or who knows, maybe I shouldn't binge Black Mirror, and this only gets limited to porn and good uses. Like a new era for stunts in movies.
Because mainstream media is great at fact checking all the bullshit they report on? Some do, sure, but there is a lot of shit on mainstream news.
I hope that happens, and as soon as possible. Otherwise, there will be an 'uncanny valley' period of sorts, where people are able to mass-produce this video, but not everyone is aware that it exists. When it really is flawless, or close enough to it, societal change will begin. This will also mean that video evidence will likely no longer hold up in court (which is good, because it was already possible to fudge manually with video editing tools).
That said, I think this will again drive some more people to Voat. What it definitely won't do is stop people from doing deepfakes.
I long for the olden days when BBSes allowed for lots of diverse stuff without people being banned for mentioning a leaked Game of Thrones episode, or a CGI boob.
Right now, I feel similarly toward Reddit as I do toward Google: moving in an anti-user direction, but with no viable alternative to quit to.
These ideas basically self-select fringe groups.
Here’s a picture I drew of your mom. Look at those stink lines.