kaiwen1's comments | Hacker News

There are many people with voices similar to Scarlett Johansson's. If SJ is unwilling to be a voice actor for OpenAI, then why should OpenAI not find a similar voice and use that instead? SJ certainly does not have a monopoly on all voices similar to hers. Anyone in possession of such a voice has the same right as SJ to monetize it. And someone did in fact exercise that right. If you compare the Sky voice to SJ's, they're not the same.

OpenAI's mistake was caving to SJ. They should have kept Sky and told SJ to get lost. If SJ sued, they could simply prove another voice actor was used and make the legitimate argument that SJ doesn't have a monopoly on voices similar to hers.


I too am mystified.

I think what’s going on here is that Scarlett is famous, and so media outlets will widely cover this. In other words, this latest incident hasn’t riled up people any more than usual — if you scan the comments, they’re not much different from how people already felt about OpenAI. But now there’s an excuse for everybody to voice their opinions simultaneously.

They’re acting like the company literally stole something.

It also didn’t help that OpenAI removed the Sky voice. Why would they do that unless they have something to hide? The answer of course is that Scarlett is famously rich, and a famously rich person can wage a famously expensive lawsuit against OpenAI, even if there’s no basis. But OpenAI should’ve paid the cost. Now it just looks like their hand was caught in some kind of cookie jar, even though no one can say precisely what kind of cookies were being stolen.


IANAL, but I think the mistake they made was constantly referencing the movie 'Her' when talking about Sky.


Regardless of the exact voice spectrum, the plot would apply with any flirty female voice. It was not a movie about Scarlett Johansson. It was a movie about AI eliciting a relationship.

For the “her” reference(s?), was there anything beyond the single tweet?


> It was not a movie about Scarlett Johansson. It was a movie about AI

With Johansson voicing the AI. And now they're marketing their AI sounding like Johansson, referencing the movie that had Johansson voicing the AI.

Yeah, no similarities at all there.


> they're marketing their AI sounding like Johansson

This is subjective. I, personally, don't hear it, at all: https://news.ycombinator.com/item?id=40435695


100%. This whole thing is more stupidity than anything else. There is nothing wrong with using a voice that sounds like her. There is everything wrong with referencing the movie and sort of implying it is the voice from the movie. They could have easily let others make the connection. So dumb.


Why is it wrong to explicitly mimic a part played in a movie? Are we saying that the actor owns their portrayal of the role?

OpenAI should’ve owned their actions. "Yes, we wanted to get a voice that sounded like the one from Her." There’s nothing wrong with that.


> OpenAI should’ve owned their actions. "Yes, we wanted to get a voice that sounded like the one from Her." There’s nothing wrong with that.

https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


Not an IP lawyer, but I think the company that produced the movie owns the relevant IP, and Johansson might also own IP around it.

You can have an opinion on it, but they are going to get sued. Just like I can't take Moana and throw her in an ad where it says "I like [insert cereal here]", they can't take a character and use it without expecting Disney/whoever to come sue them.


Actors get a lot of rights to their likeness.

So, yes maybe?


Hmm. Being able to say "thou shalt not make a character similar to Her" is a lot like saying "thou shalt not make a video game character similar to any other." It’s not an explicit copy, and their name for Sky was different. That’s the bar for the videogame industry; why should it be different for actors? Especially one that didn’t show her face.

This whole thing is reminiscent of Valve threatening to sue S2 for allegedly making a similar character. Unsurprisingly, the threats went nowhere.


You've really contorted the facts here. This isn't a character, it's a voice.

The voice sounds remarkably like Scarlett Johansson's.


It’s the other way around. The contortionists are on the other side of the issue. We’re talking about OpenAI hiring someone to use their natural speaking voice. As movies say, any similarity to existing people is completely coincidental from a legal perspective.

From a moral perspective, I can’t believe that people are trying to argue that someone’s voice should be protected under law. But that’s a personal opinion.


> We’re talking about OpenAI hiring someone to use their natural speaking voice.

How do you know?


They said so, and it’s what I would have done. I have no reason not to believe them.

Unfortunately a commenter pointed out that there’s legal precedent for protecting people’s voices from commercial usage specifically (thanks to a court case from four decades ago), so I probably wouldn’t have tried this. The cost of battling it out in the legal system outweighs the coolness factor of replicating Her. I personally feel it’s a battle worth winning, since it’s bogus that they have to worry about some annoyed celebrity, and your personal freedoms aren’t being trodden on in this case. But I can see why OpenAI would back down.

Now, if some company was e.g. trying to commercialize everybody’s voices at scale, this would be a different conversation. That should obviously not be allowed. But replicating a culturally significant voice is one of the coolest aspects of AI (have you seen those recreations of historical voices from other languages translated into English? If not, you’re missing out), and that’s not even what OpenAI did here.


Do you always believe everything a corporation tells you?

If so, I have a bridge you might be interested in buying


No. But in this particular case, there are two factors that make that irrelevant for me. One, I would have made their same mistake. (If I was Sam, I too would have found it a really cool idea to make GPT have the voice of Her, and I too would not have realized there was one dumb court case from the 80s standing in the way of that.)

Two, it’s bogus that conceptually this isn’t allowed. I’m already anti-IP — I think that IP is a tool that corporations wield to prevent us from using "their" ideas, not to protect us from being exploited as workers. And now this is yet another thing we’re Not Allowed To Do. Great, that sounds like a wonderful world, just peachy. Next time maybe we’ll stop people from monetizing the act of having fun at all, and then the circle of restrictions will be complete.

Or, another way of putting it: poor Scarlett, whatever will she do? Her voice is being actively exploited by a corporation. Oh no.

In reality, she’s rich, powerful, and will be absolutely fine. She’d get over it. The sole reason that she’s being allowed to act like a bully is because the law allows her to (just barely, in this case, but there is one legal precedent) and everyone happens to hate or fear OpenAI, so people love rooting for their downfall and calling Sam an evil sociopath.

Someone, please, make me a moral, ethical argument why what they did here was wrong. I’m happy to change my mind on this. Name one good reason that they shouldn’t be allowed to replicate Her. It would’ve been cool as fuck, and sometimes it feels like I’m the only one who thinks so, other than OpenAI.


"This is perfectly legal!"

Actually, there's a similar court case from 1988 that creates legal precedent for her to sue.

"That's just one case! And it's from 1988! That's 36 years ago: rounded up, that's 4 decades!"

Actually, there's a court case from 1992 that built on that judgement and expanded it to define a specific kind of tort.

"That's bad law! Forget the law! I demand a moral justification."

Anyway, asking a person if you can make money off their identity, them saying no, and you going ahead and doing that anyway seems challenging to justify on moral grounds. I don't think you're willing to change your mind, your claim notwithstanding.


If you approach a debate from a bad faith standpoint, don’t be surprised when the other person doesn’t change their mind. "I think you’re a liar" is a great way to make them nope out.

Which is a shame, since you had a decent argument.

Except it isn’t. Again, you’re acting like OpenAI tried to profit off of Scarlett. They tried to profit off of the portrayal she did in the movie Her. These are not the same thing, and treating them as interchangeable is some next level moral rationalization. One is taking advantage of someone. The other is what the movie industry is for.

Now, where’s this case from 1992 that expanded and defined the scope of this?


> Except it isn’t. Again, you’re acting like OpenAI tried to profit off of Scarlett. They tried to profit off of the portrayal she did in the movie Her.

Ahhh... so you admit OpenAI has been shady, but you argue they're actually ripping off Spike Jonze, not Scarlett Johansson?

HEH. The people who say Sam is shady aren't really interested in this distinction.

(And you're wrong: both ScarJo and the film's producers own aspects of the character they created together.)


> Again, you’re acting like OpenAI tried to profit off of Scarlett. They tried to profit off of the portrayal she did in the movie Her.

From her statement:

> I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and AI. He said he felt that my voice would be comforting to people.

So, they wanted to profit off of her voice, as her voice is comforting. She said no, and they did it anyway. Nothing about, "come in and do that song and dance from your old movie."

> where’s this case from 1992

https://news.ycombinator.com/item?id=40435928


> If you approach a debate from a bad faith standpoint, don’t be surprised when the other person doesn’t change their mind.

Yeah, so stop doing that then.


> I have no reason not to believe them.

Seems you must have reason to want to believe them.

Otherwise you'd have noticed all the reasons not to.


> then why should OpenAI not find a similar voice and use that instead?

That's assuming they did, right now they're asking us to pretty please trust them that their girlfriend from Canada is really real! She's real, you guys! No I can't show her to you.


Agree. And what about people who look similar to SJ? Are they precluded from acting jobs, simply because SJ became an actor first?


I encourage you to look through this case: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


1. The case is from 1988. That’s the year I was born. Societal norms are in a constant state of flux, and this one case from 36 years ago isn’t really an indication of the current state of how case law will play out.

2. Ford explicitly hired an impersonator. OpenAI hired someone that sounded like her, and it’s her natural voice. Should movies be held to the same standard when casting their actors? This is about as absurd as saying that you’re not allowed to hire an actor to play a role.


Midler is actually quite similar. Midler didn't want to do a commercial, and refused an offer, so they hired a sound-alike who fooled her friends. The appellate court held that Ford and its advertising agency had "misappropriated" Midler's voice.

Waits v. Frito Lay, Inc was '92, and cited it. They used a Tom Waits-sounding voice on an original song, and Waits successfully sued:

> Discussing the right of publicity, the Ninth Circuit affirmed the jury’s verdict that the defendants had committed the “Midler tort” by misappropriating Tom Waits’ voice for commercial purposes. The Midler tort is a species of violation of the right of publicity that protects against the unauthorized imitation of a celebrity’s voice which is distinctive and widely known, for commercial purposes.

https://tiplj.org/wp-content/uploads/Volumes/v1/v1p109.pdf

Of course, who knows what a court will find at the end of this. There is precedent, however.


Thank you. I didn’t know it was similar specifically for voices in commercial use.

That’s annoying, but we live in a country with lots of annoying laws that we nonetheless abide by. In this case I guess OpenAI just didn’t want to risk losing a court battle.

I still think legal = moral is mistaken in general, and from a moral standpoint it’s bogus that OpenAI couldn’t replicate the movie Her. It would’ve been cool. But, people can feel however they want to feel about it, and my personal opinion is worth about two milkshakes. But it’s still strange to me that anyone has a problem with what they did.


I was born in 1983 and it is wrong to make profit off of someone else's art without their permission. It isn't strange at all. This includes using an impersonator. This excludes parody intentions.

So the overall argument isn't strange, you just disagree without having articulated exactly what biases you to disagree. It is moral disagreement ultimately.


> OpenAI hired someone that sounded like her, and it’s her natural voice.

They say so, yes. Seems like they didn't want to go through discovery in order to prove it.


> The case is from 1988. That’s the year I was born. Societal norms are in a constant state of flux, and this one case from 36 years ago isn’t really an indication of the current state of how case law will play out.

Correct. While Midler presents a similar fact pattern and is a frequently taught and cited foundational case in this area, the case law has evolved since Midler toward an even stronger protection of celebrity publicity rights, one that is even more explicitly not concerned with the mechanism by which the identity is appropriated. Waits v. Frito Lay (1992), another case where a voice sound-alike was a specific issue, has been mentioned in the thread, but White v. Samsung Electronics America (1993) [0], while its fact pattern wasn't centered on sound-alike voice appropriation, may be more important in that it underlines that the mechanism of appropriation is immaterial so long as the appropriation can be shown:

—quote—

In Midler, this court held that, even though the defendants had not used Midler's name or likeness, Midler had stated a claim for violation of her California common law right of publicity because "the defendants … for their own profit in selling their product did appropriate part of her identity" by using a Midler sound-alike. Id. at 463-64.

In Carson v. Here's Johnny Portable Toilets, Inc., 698 F.2d 831 (6th Cir. 1983), the defendant had marketed portable toilets under the brand name "Here's Johnny"--Johnny Carson's signature "Tonight Show" introduction–without Carson's permission. The district court had dismissed Carson's Michigan common law right of publicity claim because the defendants had not used Carson's "name or likeness." Id. at 835. In reversing the district court, the sixth circuit found "the district court's conception of the right of publicity … too narrow" and held that the right was implicated because the defendant had appropriated Carson's identity by using, inter alia, the phrase "Here's Johnny." Id. at 835-37.

These cases teach not only that the common law right of publicity reaches means of appropriation other than name or likeness, but that the specific means of appropriation are relevant only for determining whether the defendant has in fact appropriated the plaintiff's identity. The right of publicity does not require that appropriations of identity be accomplished through particular means to be actionable. It is noteworthy that the Midler and Carson defendants not only avoided using the plaintiff's name or likeness, but they also avoided appropriating the celebrity's voice, signature, and photograph. The photograph in Motschenbacher did include the plaintiff, but because the plaintiff was not visible the driver could have been an actor or dummy and the analysis in the case would have been the same.

Although the defendants in these cases avoided the most obvious means of appropriating the plaintiffs' identities, each of their actions directly implicated the commercial interests which the right of publicity is designed to protect.

–end quote–

> Ford explicitly hired an impersonator. OpenAI hired someone that sounded like her, and it’s her natural voice.

Hiring a natural sound-alike voice vs. an impersonator as a mechanism is not the legal issue, the issue is the intent of the defendant in so doing (Ford in the Midler case, OpenAI in a hypothetical Johansson lawsuit) and the commercial effect of them doing so.

[0] https://law.justia.com/cases/federal/appellate-courts/F2/971...


Nice write up, thanks.

Unrelated, but as someone who came along into this world after Carson's Tonight Show, I had no idea that that moment from The Shining was a play on that. Today's lucky 10,000.


You keep harping about the "forty-year-old law!" (actually only 36), as if that meant it were somehow bad or irrelevant.

So I guess you wouldn't mind if someone killed you, since laws against murder are much older than that? Shit, outmoded old boomer thinking, amirite?

Wow, when you realise how you're coming off here...


>OpenAI's mistake was caving to SJ. They should have kept Sky and told SJ to get lost. If SJ sued, they could simply prove another voice actor was used and make the legitimate argument that SJ doesn't have a monopoly on voices similar to hers.

Yes, they should not have reached out to her again, but now they are screwed. In no way will they want a trial and the associated discovery. SJ can write her own ticket here.


OpenAI caved immediately because they knew they would lose a lawsuit and be looking at a minimum of an 8 figure payout.

Voice impersonation has been a settled matter for decades. It doesn't matter that they used another actress. What matters is that they tried to pass the voice off as SJ's voice several times.


> OpenAI's mistake was caving to SJ. ... If SJ sued, they could simply prove another voice actor was used and make the legitimate argument that SJ doesn't have a monopoly on voices similar to hers.

Or... hear me out... maybe they couldn't prove that, which is why they caved. Caved within a day or so of her lawyers asking "So if it's not SJ's voice, whose is it?"


Can they make that claim if SJ's voice exists in the training data before fine-tuning? We don’t know what they train on.


If they told the voice actor to try to impersonate SJ, then Scarlett does have a case.

That may not be how it should work, but it is very much how the law currently works.


Not quite. All they had to do was tell themselves in a discoverable email that they were going to seek out someone who sounded like ScarJo, or send emails to the recruiter saying they wanted someone to mimic the Her voice.


See Waits vs. Frito Lay.


In that case, as I understand it, the voice actor intentionally mimicked Waits, purposefully using his intonations, style of speech, and phrasing, all of which were not natural to the voice actor. He was intentionally mimicking Waits. I doubt the same claim can be made of the Sky voice actor.


Doesn't matter if the voice was natural or not if there are emails at OAI saying "find us someone who sounds just like ScarJo." I suspect there are and that's why SamA turned pussy and ran.


OpenAI had the opportunity to prove as much in court and chose not to.


Sky sounded quite different from SJ. Taking the voice down makes it seem like they're admitting guilt.


Just do math. A student driven merely by the pleasure of doing math without concern for external validation is lucky. But if external validation is a driver, that's lucky too. In both cases, math gets learned.


Molecular Biology of the Cell is one of my all-time favorite books. Read it 15 years ago, rereading it now in the 7th (newest) edition. If someone has read the books you've listed, done the exercises, and has a comprehensive understanding of the material, but no biology degree, only a passion for learning about cell biology, are there job options to keep feeding that passion?


Work/volunteer as a minimum wage software developer lab rat (or whatever your day job speciality is). There are plenty of labs that are in need of free labor when it comes to software/engineering support in general, just ask around.


If you're an experienced software developer, you can also just get a job as a software developer at any number of companies in pharma and biotech. No need to do it for free.


This is wonderful! If you want to go deeper, I highly recommend the textbook "Molecular Biology of the Cell". I discovered it in a bookstore 20 years ago and it consumed me for months. Every paragraph was a revelation. After hearing me speak in awe of this book for years, my wife recently bought the latest (7th) edition, which we're now reading together and I'm still mesmerized. Nothing compares to the astonishing complexity of a cell.


I bet the editors read it not once, but several times. And I bet it was even noticed years later and posted on websites where many others read it, were impressed, and commented on it. Don Knuth’s letters are special.


Why would the Russian military use a US satellite communications service to direct their operations in Ukraine? They would expect US intel to capture everything and share it with Ukraine.


93% of teens use YouTube and 63% use TikTok, and 20% "almost constantly" [1]

Imagine if 93% of _all_ people used TikTok "almost constantly", and imagine if the content consisted almost entirely of misinformation causing instability at scale in alarming, obvious ways.

Does it matter whether this outcome is driven by malicious foreign actors or uncaring algorithms optimizing profits? Shouldn't it be restricted on principle?

Free speech should be protected for individuals and also groups at some scale. But why should entities with vast, nearly universal reach among the youth be permitted to delude them at scale to the detriment of our collective future? This can't be the right answer to free speech.

[1] https://www.pewresearch.org/internet/2023/12/11/teens-social...


It’s astounding that until very recently it was standard practice to put humans on top of untested, first-of-kind rockets.


Well, to be fair, NASA is much more risk-averse than SpaceX.

That’s why everything takes so much longer, the “classic” way.

But the way SpaceX does things, gets seriously good results, quite quickly (and gives us some great videos of stuff getting blowed up).


I think it's the other way around. NASA flew the shuttle 135 times and lost two of them, but they kept going after the first failure. In both failures, NASA had been advised of the root cause before the orbiters' destruction, and management ignored the advice. Meanwhile the safety track record for Falcon 9 exceeds that of the shuttle, and it exceeded it before humans were put on top of Falcon 9. The Apollo program did very few launches and yet they put humans on top of it. No, I think what you might say is that NASA in the 60s cared a lot less about safety than SpaceX does today, though perhaps NASA today cares a lot more about safety than NASA 20 years ago (and definitely more than 60 years ago).


Lip-service to safety versus actual safety that looks unsafe.

I deal with bureaucracies a lot and this is how they do everything: it has to look good, but they don’t actually care to make things good.

E.g.: fill out a ton of paperwork about how secure the web application is against hacking, but nobody reviews the source code for vulnerabilities. Or they fill out the paperwork and report the app as “secure” even when third parties like me are listing vulnerability after vulnerability.

The report is what mattered, not reality.

OFFICIALLY, on paper, the Shuttle was very safe.


GP might have meant that NASA is more risk-averse in the business sense of the word risk, as in they don't want to risk failure, therefore they don't risk success. SpaceX definitely doesn't have that sort of risk aversion.


It was five nines "safe".


We haven't fully stopped. The Orion that flew on SLS's first flight didn't test the life support system, yet the second launch is already going to carry a crew.


"very recently" is more than 40 years ago, when the Space Shuttle made its first flight in 1981. That's closer in time to Sputnik than it is to today.


What are you referencing???


This article is trying desperately to avoid stating the obvious: that the evidence now points directly at a failed Hamas missile launch.


If it states "obvious" and then later we find out the obvious is wrong, they lose all of their credibility. Publications should only report, not make conclusions (let someone else connect the dots and report on their conclusions instead).


Unlike the situation in which everyone jumped on the same headline within 5 min before any evidence or footage was even identified and then had to walk back from it?


Again, if you read the article, it would probably say X claims Y, not Y is true. Like BBC articles:

- Hundreds feared dead at Gaza hospital as Israel denies strike (17 hours ago)

- Hospital blast in Gaza City kills hundreds - health officials (18 hours ago)

- Dozens killed as Israeli strikes hit southern Gaza refuge areas (1 day ago)

The last one is before the hospital was hit.

If you have a specific article with a headline, what was it?

The only thing I can find is:

https://www.thejc.com/news/news/bbc-criticised-after-reporte...

Which wasn't a headline, but rather an in-person live report. And ya, that's still bad, but it isn't as bad as crafting an article with time to think and still coming up with a bad headline. And even in the live reporting, the byline still has attribution ("health officials: at least 500 killed in strike on hospital"), meaning I would need to actually hear what was said live to pass judgement (saying "health officials are saying that X is true" is quite different from saying "X is true").


The titles were changed after the fact; the BBC did have “Israeli strike hits hospital killing over 500” on their main page for several hours last night.


Ok, I'll have to take your word for it, or maybe someone archived it?


The problem is that they started by implying the opposite. They should have been at least this ambiguous to begin with.


Hamas? I understood the Israelis were stating it was Palestinian Islamic Jihad's. Which evidence are you relying on to be sure it was Hamas?


Are you insinuating that Hamas is not an Islamic jihadist organization?


I made no such assertion. However, PIJ != Hamas. It's factually wrong and unhelpful to conflate the two.


This. I own two Pint Xs and have ridden almost every day for three years. I've had two nosedives at full speed and ran both out. I'll never upgrade to a faster model for this reason. You can't run out a nosedive going any faster. Also, the Pint X is just so nimble and fun. To me it's the sweet spot.

