Scarlett Johansson hits AI app with legal action for cloning her voice in an ad (theverge.com)
74 points by grammers 7 months ago | hide | past | favorite | 107 comments



This isn't really an AI issue. You can't use someone's likeness for commercial gain without their consent. This is well-established. It's true for actual footage. Being AI-generated changes nothing.

You'll see the same thing in defamation cases. Someone will no doubt use AI to create someone's likeness to say something that person finds reprehensible and whoever created that AI may try to argue it's just an AI. It won't matter then either.

Where this gets more interesting is if whoever created the AI gets held responsible for what people do with it. The kneejerk reaction in the tech community will be to say "no, it's just a tool" but there's such a thing in legal terms as a reckless disregard for the consequences making you culpable.


It does raise the question that if it becomes cheap and easy enough to do, could a celebrity even afford to protect their brand? You can sue a company for doing this. You can't really sue 10,000 companies. This could result in a massive shift downwards for the value of a celebrity endorsement.


The first company I saw that offered AI voice cloning was essentially pitched as an entity that existed just to protect a celebrity's likeness. They'd sell to celebs and become, e.g., the exclusive vendor of authorized Taylor Swift AI voice cloning, never license it to anyone, and sue anyone else who cloned her voice or image. Seems like a reasonable plan.


> This could result in a massive shift downwards for the value of a celebrity endorsement.

We should also expect large numbers of authorized AI endorsements. Imagine you're some actor and a foreign company wants to make a TV advert for yogurt and pay you $1000. Might as well say yes - as long as they use AI, you don't even need to lift a finger. In fact, you can authorize thousands of ads across the world and pull in a nice million dollars, while becoming 'the advert guy', all without doing any actual work.


Once you race to the bottom, you'll find that those million-dollar deals turn into 500k deals, and so on. Celebs can do a lot more work than they do today. They protect their brand in part by saying no to the shit brand partnerships that would drag down their reputation. Hey, Daniel Craig's AI likeness endorses Bob's oil and lube, it must be good!


Is that what the whole Hollywood strike is about though?

re: this film -- https://en.wikipedia.org/wiki/The_Congress_(2013_film)


As much as I hate it, so many problems will be solved by a lack of anonymity on the web, and by gatekeepers preventing unauthorized access.

Yes, I know. There are enormous upsides to anonymity. But at the same time, so much harm would vanish, if every word written, and every deed done, was indelibly linked to the entity responsible.

I think, sadly, this is where all this will end up.


> so many problems will be solved by a lack of anonymity on the web

Honestly, I can't think of very many problems that would be solved by removing anonymity. There are a few, mostly related to reducing actual criminal activity, but not enough to compensate for what we'd lose.


Well you'd immediately get rid of all spam...

And hate speech/fraud/scams could be punished.


I think spam levels would be unchanged. Cases of fraud are indeed one of the kinds of crime that would be reduced -- but absolutely not eliminated.


Exactly. It's not hard to imagine cloning agencies buying avatar/voice of people dirt cheap before they get famous and reselling the rights.


> there's such a thing in legal terms as a reckless disregard for the consequences

Lots of tools can be dangerous if misused. We hold the people who misuse the tools responsible, not the people who made the tools. Why would AI be any different?


We hold tool makers responsible all the time. It all depends on how the tool is meant to be used and how it's marketed.

For example, a voice-cloning AI that's marketed and sold to people diagnosed with ALS, so they can clone their voice in case they lose the ability to speak, won't carry any legal liability for its maker if it's misused. However, if that same tool is marketed and sold to clone celebrity voices for commercials, then that company is going to have a bad time in court.


> if that same tool is marketed and sold to clone celebrity voices for commercials

That wouldn't be "reckless disregard for consequences", though. It would be straight up fraud.


That's not true in all situations, is it? If someone constructs something dangerous like an explosive and then hands it off for someone else's use, they'd be culpable as well, wouldn't they?


It all depends. If I sell bottles of arsenic for the purpose of killing rats, I won't be held culpable if someone uses it for murder. If I sell bottles of arsenic for the purpose of murder, I would be.


> If someone constructs something dangerous like an explosive and then hands it off for someone else's use, they'd be culpable as well, wouldn't they?

As others have commented, it would depend on the person's intent. If an explosive making company sells some explosives to a construction company for use in excavation, but then the construction company misuses the explosive and kills people, the explosive making company wouldn't be liable. But if person A makes some homemade explosives and gives them to person B to put into a bomb, and the bomb kills people, yes, we would hold person A liable. But even in that case the basis wouldn't be "reckless disregard for consequences", it would just be straight up murder.


Manufacturing and selling explosives in general is restricted, but assuming you follow the regulations you are not culpable for what people who legally purchase them do with them afterwards.


Regarding that knee-jerk reaction: one clear difference here is that the company does not just sell the tool, as in hardware; they sell a model that already contains an impression of the likeness of Johansson's voice, derived from training data that was built by the supplier, not by those who created the ad. If they'd sold a blank slate to the ad-maker, who then fed it with Johansson content until it was able to create the fake, the situation would be very different.


I've seen it asserted a lot, but I'm not clear on the exact mechanism in play here.

Is it copyright?

Is it some protected legal mechanism encoded in law, or is it agreement(s) between parties (Broadcasters/TV/Film producers)?

Is it only in play if you explicitly say it's a particular person? Can you use an impersonator to voice over something without referencing the person in question/give no grounds to suggest it's a particular person? What if you hire a good lookalike? A soundalike?


You're looking for the "right of publicity" (and "misappropriation") [1][2].

[1]: https://firstamendment.mtsu.edu/article/misappropriation/

[2]: https://www.inta.org/topics/right-of-publicity/


> This isn't really an AI issue.

Whether a sound-alike produced with AI voice cloning, combined with the specific explicit disclaimer on the material in this case, is, under existing law, a use of protected likeness is an AI-specific question in the application of existing right-of-personality laws.

It won't be the last such question, I suspect.


Original article:

https://variety.com/2023/digital/news/scarlett-johansson-leg...

From the guidelines [1]:

> Please submit the original source. If a post reports on something found on another site, submit the latter.

[1] https://news.ycombinator.com/newsguidelines.html


To me, these kinds of events support the idea that no AI-specific law is needed. AI is already illegal to use insofar as it's used to commit a crime, and it's already subject to tort law.


*only if your pockets are deep enough to afford to sue the corporations who are misusing your likeness without your consent

Having something ratified as a law makes any lawsuit a slam dunk and discourages abuse even by wealthy corporations.


> Having something ratified as a law makes any lawsuit a slam dunk

No, it doesn't; otherwise there would be no lawsuits proceeding to trial without settlement under established laws.

Having law that (whether on its face or through case law) clearly applies to the situation that you claim occurred obviously helps, but it doesn't make any suit an automatic slam dunk.


I think it's too early to make that claim. Anyone can sue anyone else for pretty much anything. The question is: will the case survive and have a reasonable chance of success?

We might find out that this case dies in the first phases of proceedings because the law is not on Johansson's side.


Why would you suppose that? Broadly, celebrity likeness is a recognized legal concept prior to the advent of AI, so why would sprinkling magic AI dust on the case change things?

https://en.wikipedia.org/wiki/Personality_rights


What is less clear is whether existing criminal and tort law is adequate to cover the concerns that arise with AI, but, yes, a lot of the things AI is being perceived as being abused for are at least on the fringes of existing legal regimes and may not need updates, or at least not major updates, to adequately handle them.


This makes me wonder, what will AI mean for professional impersonators? Is this the end of Saturday Night Live? Or will satire be considered fair use?

I have a feeling we will definitely need some new legislation in the next 10 years for regulating AI.


The concept of "Fair Use" is well-established. Impersonators are safe, provided the work is "transformative".

Satire is the textbook example of fair use.


AI-specific regulation is needed for the same reasons that crypto-related regulation is needed: the tech exists, and anyone can use it to do bad things without being tracked. Murder is illegal, but if there were a way to commit it without anyone knowing, it would become quite popular.

In terms of deepfakes or crypto scams, what we ultimately need is more literacy on the matter, which won't prevent AI from doing these things but will let us acknowledge that's where we're at. Specific laws to get everyone on the same page are absolutely necessary.


That would be letting the courts interpret how the existing law applies to a new situation, though. They may decide something that is not ideal.


>They may decide something that is not ideal.

I don't follow your argument. That same problem applies to legislators and regulators.

Also, I don't see any legal novelty. Same shit, different tech.


Legislators are at least supposed to consider different positions, interests and consequences and decide in the interest of the citizenry. Courts are supposed to just look at the law and precedents.

And I meant novelty for AI in general. This specific case does seem like same shit different tech.


From a legal standpoint it is not a new situation. If you create some novel invention and use it to rob a bank, the laws against robbery do not need reinterpretation.


> the 22-second ad showed Johansson behind the scenes while filming Black Widow, where she actually says “What’s up guys? It’s Scarlett and I want you to come with me.” But then, the ad transitions away from Johansson, while an AI-generated voice meant to sound like the actress states: “It’s not limited to avatars only. You can also create images with texts and even your AI videos. I think you shouldn’t miss it.”

Wow, this is really amateur hour for whoever did that ad without permission. It's like this person never got a letter from their cable company for bittorrenting and had no idea anyone would mind them using major film studio actors/content in a for-profit ad without paying for it.

it's like there's a reason when you go to the gym, all the music is some piped in re-recorded version of famous songs with an unknown singer doing it, rather than the actual music.


> it's like there's a reason when you go to the gym, all the music is some piped in re-recorded version of famous songs with an unknown singer doing it, rather than the actual music.

Wait, is that a real thing? I’ve never heard of that before.


It's definitely real. And it's not well-known covers. The only place you'll hear these covers is on this channel, where presumably they paid for performance rights as a work for hire and they own the songwriter rights in their catalog. Sometimes they have the real performance, so you'll get real bands/songs mixed in.


Googling around to find who these vendors are, it might be Rockbot https://rockbot.com/ and others who are the big players here. This is really where I wonder where the AI solutions are. Feed it a Taylor Swift song, get a high quality cover recording with comparable vocals of a different voice of that song back which can be licensed much more cheaply (still has to be covered by ascap/bmi etc).


> when you go to the gym, all the music is some piped in

Heh, I hate it so much when gyms play music at all that I only use gyms that don't play any.


Yeah, I remember we had to pay licensing fees for just turning on the radio in the office. It wasn’t cheap either.


> where she actually says “...

It's the real clip that'll be the basis for this court case. Actual copyright infringement.

The AI-generated portion is much more murky legal waters.


> The AI-generated portion is much more murky legal waters

https://en.wikipedia.org/wiki/Personality_rights

> 4) appropriation of name and likeness [..] violation of a right of publicity most closely aligns with appropriation. The right of publicity often is manifest in advertising or merchandise

It specifically quotes a relevant part of California's civil code:

> Any person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner, on or in products, merchandise, or goods, or for purposes of advertising or selling, or soliciting purchases of, products, merchandise, goods or services, without such person’s prior consent [..] shall be liable for any damages sustained by the person or persons injured as a result thereof.


Tom Waits sued Frito Lay for using his voice / musical stylings in ad: https://www.mentalfloss.com/article/79648/when-tom-waits-sue...


This is why you saw "Justin Guarini as Lil Sweet" briefly in those Dr. Pepper ads, lest they get on the wrong side of Prince's estate: https://www.youtube.com/watch?v=-1H_CaCR46M

More recently, there was the Michelob Ultra commercial that had a ton of look-alikes (and a sound-alike), but they basically one by one said "this isn't the person you thought it was". https://www.youtube.com/watch?v=ZCk9vfbnlfA


Regarding Justin Guarini in that ad: is that really the best wig they had? It looks so plasticky...


and ScarJo did a whole album of Tom Waits covers a few years ago. This is the next logical thing to do!


That's a weird website


[dupe]

More discussion over here: https://news.ycombinator.com/item?id=38105463


Highly interested to see the result of lawsuits like this. Technically, a person's image, even a famous person's, can be sourced from the public domain and that is enough to generate an AI avatar.

Wonder how these AI celebrity clones will be regulated, if they are.


> Highly interested to see the result of lawsuits like this. Technically, a person's image, even a famous person's, can be sourced from the public domain and that is enough to generate an AI avatar.

Copyright status is irrelevant to right of personality/publicity claims, which is what this lawsuit is.


Am I the only one who thinks that ChatGPT's voice chat sounds extremely similar to Scarlett in "Her"?


Amen - let's see more of this!


> Scarlett Johansson hits AI app with legal action for cloning her voice in an ad (theverge.com)

It is important to note that the legal action is against the company's use of her appearance rather than the capabilities of the app itself.

However, I wonder if voice and appearance cloning capabilities are something that would be classified as unsafe by the regulations stemming from the "Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence"?


The real world version of 'Looker' isn't all that far away.


I really, really wish I had possessed the fortitude to finish the sci-fi novel I started writing a few decades ago. The backdrop was post Reality Crisis, which occurred after faking pretty much anything became simple and available to the masses. Huge waves of paranoia had overtaken distressingly large portions of the population, who managed to uncover actual conspiracies in their digging for what truth really was, swaths of people had been convicted on falsified evidence, major elections were overturned, and so on. Nobody knew what to believe any longer.


Oh that sounds interesting. But soon you'll be able to do the same thing by making a documentary... and likely it will be far wilder than your imagination ever foresaw.


My only hope in all this mess is that it pushes us back towards local communities.


That's what I come up with when I try to think of what the best-case result of all this will be. I sincerely hope that's what will happen (or something better that I haven't thought of).

What I think is more likely, though, is it will result in a serious increase in the rate of people withdrawing from society and community altogether.


Maybe you should ask ChatGPT to write it for you, section by section, chapter by chapter, until it's done.

Then publish it, with Art Imitating Life as the title.

(Cite me in your dedications, pls :-)


We seem to be barrelling straight into implementing most of the fictional dystopias that people have thought up.


You're broke and alone and the air is poison but at least the fake Scarlett Johanssons are cheap.


Was that the recent Black mirror episode with Salma Hayek? That one was fantastic

E: no it wasn't, I was thinking of "Joan is Awful"



How does this work if you just find someone who sounds exactly like her to do the ad? Do you then bring the new person into court to prove they sound like her and it's their real voice?


Same intent, similar result, so I'd expect same consequences.


Sounds like there is scope for a viral campaign where the actor is actually paid, but pretends not to be and threatens to sue etc.

Maybe someone already did that?


We should be free to use celebrity voices for private purposes. I would love if I could replace Siri’s voice with Scarlett Johansson instead.


> We should be free to use celebrity voices for private purposes.

We are, in the US anyway. It's just when you distribute IP to others that things get dicey.


Why? Just because you would love to do something with it isn't a reason. What's a reason we should be able to do this?


"Because I'd love to" is a great reason to do something. The objection usually comes from the negative impacts of what you want to do, not because it's inherently an invalid piece of justification.


If there is zero difference to Johansson either way, and the use is purely private and never shared, then why is personal enjoyment insufficient justification?


Usually it is insufficient justification for those who want to profit off something people do freely for fun.


If they choose to sell the rights.


Straight-up ads on Facebook with Elon Musk's face and voice saying he's doing a new crypto project. It's so illegal, but it's allowed to happen.


Haven’t seen too much discussion about how AI will democratize media production. Celebrity is a pretty significant form of inequality that weirdly spills over into things like political and social influence. If the phenomenon of media celebrity disappears, isn’t that a good thing?


Celebrities exist primarily because people want them to. They aren't celebrities because they are rich; they are rich because celebrity is easily monetizable. This predates modern media, so it's hard to see why technology shifts would change it fundamentally.


I can't see AI getting rid of celebrities because one of the reasons people like them is that it's a form of human connection (even if it's parasocial). Also, there will still be inequality - same as it is now, in that people/companies with the most computing power will be the ones on top.


If anything, "parasocial" things are on a steep upward trajectory. There is a larger number of smaller "celebrities" than before, though. Going from low channel count (cinema, TV, analogue radio, major publishers) to wideband (millions of YouTube channels, etc etc). It probably follows a Pareto-esque power law distribution.


A while back I emailed a podcast about an issue with my account. I got a reply back from one of those hosts. It was a bizarre moment as I was starstruck to be communicating with someone I'd been listening to for years, but for them it was nothing. I'm sure they don't feel famous, they seem to live a very average life, but they felt famous to ME.


It's not going to disappear, instead people who work to build a reputation, media celebrities but also scientists and researchers and reporters and politicians will have their reputation and image appropriated and used by others without their consent, for speech, for pornography, for propaganda, etc.

That's not to say that celebrity isn't a form of inequality or some perfect meritocracy, but your face and voice being used by others isn't inherently democratizing, but instead advantages bad actors.


Why is AI going to cause celebrities to disappear?


Media celebrities exist because in order to produce entertainment content with convincingly human characters, you need to plug real humans into the image and sound replication machine.

I imagine that it won’t be too long until you don’t need a real human anymore, and we can generate realistic human characters at will.

At that point, why would we still plug real humans into the replication machine? It will be considered old fashioned very quickly.


Why would people accept that? It seems like the point where people accepted that, they'd accept their entire lives being virtualized. It seems weird to think celebrities are some kind of special case compared to.. everything else in the world. We are humans, we look up to and admire other people. For example: I think people would accept a completely virtual vacation or friends before giving up on the idea of even looking up to other people.


Personalised entertainment, where every element is generated specifically to your personal preferences, is going to make the scaling required for celebrity status quite challenging for a mere human performer.

It won't be too long before enough talented nobodies sell everything about themselves for personally life-changing amounts that we can create anything from scratch on the fly.

But there will always be other types of celebrities.


Brand dilution


Nonsensical. AI won’t stop the existence of popular individuals.


“EVERYONE will be Super! And when everyone is Super (chuckle), no one is.”


There is much more to being a celebrity than just the person's image and voice - even if currently that is a big part.


If someone impersonates Scarlett Johansson or sounds a lot like her naturally is it illegal to make money from this? Why does AI have less rights than humans here?


That’s not the issue and the article clearly states it as well.

The ad has this bit “What’s up guys? It’s Scarlett and I want you to come with me.”

That strongly suggests endorsement. It wouldn’t matter if it was AI or Human, the ad is deceptive and uses someone’s likeness without rights to represent that person.


It also sounds like they used a clip from a Marvel movie in the ad, presumably without licensing it - I'm surprised Disney aren't involved in the lawsuit too.


Whether it is legal for a human depends on what they are doing. On one extreme, impersonating as a parody is clearly legal protected speech, while impersonating to trick people into thinking it is actually her is clearly illegal fraud. Then you have publicity rights that govern the use of one's likeness for commercial purposes, which apply to tools like this, and which have some clear cases and a whole ton of grey area.

Judges generally don't care about the internal details of the tech involved, instead focusing on the end result, ruling no differently than if a human had done the same thing with different tools. The problems come when lawyers talk the judge into misunderstanding what that end result is, using false analogies and incorrect explanations of the tech.


Impersonation can be illegal whether you're a man or a man using a machine.

> Why does AI have less rights than humans here?

I'm sure that's not quite what you meant, because that question in itself is ridiculous. Obviously dumb bricks of plastic and silicon don't have rights.


1. "AI" doesn't have rights, let alone human rights.

2. The ad didn't feature images of a person that looked like her, it featured actual images of her that were manipulated to shill the product. This obviously isn't legal. You can't just cut out a picture of Tom Hanks and put him in your ad with a speech bubble saying "buy this product", what was done here is no different from that.


The AI has zero rights. It is not a person, it is not sentient.



You talk as if an AI were its own lifeform that just happened to look like her. (AI rights at this stage, really?) But this is just a company blatantly using her image to promote an app; the AI is irrelevant:

> The ad, reviewed by Variety, begins with an old clip of Johansson behind the scenes of Marvel’s “Black Widow.” Johansson says, “What’s up guys? It’s Scarlett and I want you to come with me…” before a graphic covers her mouth and the screen transitions into AI-generated photos that resemble the actor. A fake voice imitating Johansson then continues speaking, promoting the AI app.


It's complicated. See articles, for example, on the "right of publicity":

https://firstamendment.mtsu.edu/article/right-of-publicity/

There's certainly precedent for where using a visual likeness ended up being a copyright violation.


You have to pay [1] the Hebrew University of Jerusalem when you use something that’s only a characterisation of Albert Einstein - I believe it’s considered part of his estate’s intellectual property.

[1] https://einstein.biz/


The difference, at least from my perspective as a viewer, is an advert that used someone impersonating a celebrity would need to be marked as such. If an advertiser used a celebrity impersonator in order to make people think the celebrity endorsed, or was at least willing to be associated with, the brand that would be grounds for the celebrity to sue, and for the advertising regulator to fine the brand.

By recreating a voice in AI the company is effectively doing the same thing - associating that celebrity with their brand without permission. That's reasonable grounds for the celebrity to use the court system to say "That's not me, and I want to be paid for the damage this association may have caused to my status."


Why would a voice synthesizer and ad firm have any rights here? There's no "AI" here in the sense of some sentient being, are you stupid?


Yes it's illegal for a human to impersonate Scarlett Johansson to endorse a product for money. Why would AI have more rights than a human here?


> Why does AI have less rights than humans here?

I'm not sure but this seems like an issue that shouldn't be taken for granted.


> Why does AI have less rights than humans here?

Because "AI" is not people. "AI" is not even conscious.


Companies are not people either, but have claimed all the same legal rights as us (and more). I'm unsure whether AI should be able to do this, hence the badly received question! I think everyone's disgust is triggered by the idea of AIs getting rights like humans, but it's interesting to think about, IMO.



