OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show (washingtonpost.com)
574 points by richardatlarge 6 months ago | 1206 comments




Well, here are some things that aren't really being disputed:

* OpenAI wanted an AI voice that sounds like SJ

* SJ declined

* OpenAI got an AI voice that sounds like SJ anyway

I guess they want us to believe this happened without shenanigans, but it's a bit hard to.

The headline of the article is a little funny, because records can't really show they weren't looking for an SJ sound-alike. They can just show that those records didn't mention it. The key decision-makers could simply have agreed to keep that fact close to the vest -- they may well have understood that knocking off a high-profile actress was legally perilous.

Also, I think we can readily assume OpenAI understood that one of their potential voices sounded a lot like SJ. Since they were pursuing her they must have had a pretty good idea of what they were going after, especially considering the likely price tag. So even if an SJ voice wasn't the original goal, it clearly became an important goal to them. They surely listened to demos for many voice actors, auditioned a number of them, and may even have recorded many of them, but somehow they selected one for release who seemed to sound a lot like SJ.


Clearly an SJ voice was the goal, given that Altman asked her to do it, asked her a second time just two days before the ChatGPT-4o release, and then tweeted "her" on the release day. The next day Karpathy, recently ex-OpenAI, tweeted "The killer app of LLMs is Scarlett Johansson".

Altman appears to be an habitual liar. Note his recent claim not to be aware of the non-disparagement and claw-back terms he had departing employees agree to. Are we supposed to believe that the company lawyer or head of HR did this without consulting (or more likely being instructed by) the co-founder and CEO?!


They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.


My guess: Sam wanted to imitate the voice from Her and became aware of the Midler v. Ford case, so he reached out to SJ. He probably didn't expect her to decline. Anyway, that prior case says you cannot mimic another person's voice without their permission, and the overall timeline indicates OpenAI's "intention" to imitate. It does not matter if they used SJ's voice in the training set or not. Their intention matters.


Please don't take this as me defending OpenAI's clearly sketchy process. I'm writing this to help myself think through it.

If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

- It's fine to hire a voice actor.

- It's fine to train a system to sound like that voice actor.

- It's fine to hire a voice actor who sounds like someone else.

- It's probably fine to go out of your way to hire a voice actor who sounds like someone else.

- It's probably not fine to hire a voice actor and tell them to imitate someone else.

- It's very likely not fine to market your AI as "sounds like Jane Doe, who sounds like SJ".

- It's definitely not fine to market your AI as "sounds like SJ".

Say I wanted to make my AI voice sound like Patrick Stewart. Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising. If so, would it have been OK for OpenAI to do all this as long as they didn't mention SJ? Or is SJ so clearly identifiable with her role in "Her" that it's never OK to try to make a product like "Her" that sounds like SJ?


There’s a special branch of law called “right of publicity” or “name and likeness.”

It protects celebrities who rely on endorsements and “who they are” for income.

It very clearly prohibits copycats with near-likeness as a workaround to getting permission from a celebrity.

OpenAI asked SJ to use her voice. That right there helps her case immensely.

She said no. They went ahead anyway, presumably with someone (or several someones) with a similar voice.

They publicized the product by referencing SJ.

These facts are damning.

They might be just a part of the story. Maybe 100 actresses, all sounding roughly the same, were given the offer over a two year period.

Maybe they all were given the same praise. Maybe one other, who signed an agreement, was praised on social media much more.

But this isn’t a slippery slope or a grey area. SJ was asked and said no.

That prohibits using a similar-sounding copycat and publicizing it as SJ.


Altman tweeting "her" didn't help his case


Most of those "rights of publicity" don't apply in the US though. If they did, half the actors in movies couldn't get work, because quite a few of them get work as the off-brand version of X in the first place.


Off the top of my head I can't think of a single example of any actors that come across as an "off-brand" version of someone else. What are some of the examples you have in mind?


I don't think this has anything to do with this case, but there certainly are some "types" in Hollywood with doppelganger actors.

Isla Fischer / Amy Adams

Margot Robbie / Samara Weaving / Jaime Pressly

Justin Long / Ezra Miller

Not to mention all of the token/stereotype characters where it hardly matters who the actor is at all. Need to fill the funny fat lady trope? If Rebel Wilson wasn't available, maybe they can get Melissa McCarthy.

The voice from Her isn't even the first voice I'd think of for a female computer voice. That trope has been around for decades. I'm sure OpenAI just wanted SJ specifically because she's currently one of the most popular celebrities in the world.


Javier Bardem & Jeffrey Dean Morgan

Bryce Dallas Howard & Jessica Chastain

Selena Gomez & Lucy Hale

Amy Adams & Isla Fisher

Keira Knightley & Natalie Portman

Not to mention that anytime an actor ages out of the early adulthood age range, a lookalike begins to play the same roles.

If you can't think of any examples, it's probably because you haven't been able to tell them apart yourself.


> Keira Knightley & Natalie Portman

Wait, which one is the "off-brand" here?

> Bryce Dallas Howard & Jessica Chastain

I must confess I used to confuse these two!

Another one you didn't mention: until relatively recently, I thought actors Elias Koteas (Exotica) and Christopher Meloni (Law & Order: Special Victims Unit) were the same person!


Logan Marshall-Green - Tom Hardy

Dennis Quaid - Harrison Ford

JJ Feild - Tom Hiddleston

Gerard Butler - Russell Crowe


Every imitation comic maybe?


You recognize them as an "off-brand version of X" right?

Then there's no actual confusion.


There isn't so much in this case either if you actually listen to the voices, which most people don't seem to have done.


>She said no. They went ahead anyway

This is where your post breaks down. Many people say they don't think the voice sounds like SJ. Others do. But it appears you've made up your mind that they deliberately emulated her voice?


There's no clear line for this. To get a definitive conclusion, you would need to take this to court, with a lot of investigation. I know this kind of ambiguity is frustrating, but the context and intention matter a lot here, and unfortunately we don't have a better way than a legal battle to figure it out.

Thanks to Sam, this OpenAI case is clearer than others, since he created plenty of clear evidence against himself.


Sure. And I'm not necessarily concerned about the outcome of the seemingly inevitable lawsuit. I'm more interested in calibrating my own moral compass here. If I were asked to participate in this, where would my comfort zone be?

I'll defer to a judge and jury about the legalities. As you noted, Sam gave them a lot of help.


My own thought is - every artist in history has taken inspiration from previous artists. The British voice actor in the above example surely studied greats in his field like Sir Patrick. We don't typically mind that. Where I think the line is between standing on the shoulders of giants and devaluing someone else's art is how well digested, for lack of a better term, the inspiration has been. Is the later artist breaking the components out, examining each, and assembling them with the insights they gained? Or are they seeking to resemble the end result as much as possible? I think that separates the cheap hack from the person who 'steals like an artist'


When it comes to whether something is "wrong", in general intent matters a lot and what they did was communicate an obvious intent. There are certainly ways they could have avoided doing so, and I'm not sure I understand the value of trying to dissect it into a dozen tiny pieces and debate which particular detail pushes it over the line from ambiguous to hard-to-deny? Maybe I don't understand what kind of clarity you're trying to achieve here.

This particular area of law or even just type of "fairness" is by necessity very muddy, there isn't a set of well-defined rules you can follow that will guarantee an outcome where everyone is happy no matter what, sometimes you have to step back and evaluate how people feel about things at various steps along the way.

I'd speculate that OAI's attempts to reach out to SJ are probably the result of those evaluations - "this seems like it could make her people upset, so maybe we should pay her to not be mad?"


What if they hired 10 different voice actors with no intent to have someone like SJ, but one voice actor actually did sound like the one from Her, so they liked it the most and decided to go with it? And what if only after the fact they realized that it is quite similar to SJ in general, decided to reach out to her, and also went along with the Her idea due to the obvious similarities?


Such a situation strains credulity. Voice acting is a profession, and professionals by their nature are aware of the wider industry, including the competition. SJ was the world’s highest paid actress in 2018-2019. The film Her was nominated for 5 Academy Awards, including Best Picture, and won Best Original Screenplay.

Even if this did go down the way you suppose, once they realized the obvious similarities, the ethical thing to do was to not use the voice. It doesn’t matter if the intention was pure. It doesn’t matter if it was an accident.


>If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

Taking the recent screwup out of account... It's tough. A commercial product shouldn't try to associate itself with another brand. But if we're being realistic: "Her" is nearly uncopyrightable.

Since Her isn't a tech brand, it would be hard for some future company to get in trouble based on that association alone. Kind of like how Skynet could in theory have been taken by a legitimate tech company, and the Terminator IP owner would struggle to seek reparations (to satisfy curiosity, Skynet is a US govt. program, so that's already taken care of).

As long as you don't leave a trail, you can probably get away with copying Stewart. But if you start making Star Trek references (even if you never contacted Stewart), you're stepping in hot water.


> It's probably not fine to hire a voice actor and tell them to imitate someone else.

Pretty sure this is fine, otherwise cartoons like The Simpsons or South Park would've gotten in trouble years ago.


This is specifically covered in cases like Midler v. Ford, and legally it matters what the use is for. If it's for parody/etc it's completely different from attempting to impersonate and convince customers it is someone else.


Midler v. Ford is a bit different from ChatGPT in that it was about a direct copy of a Midler song for a car ad, not just a voice sounding similar saying different stuff.

You can hear them here:

Midler version https://www.youtube.com/watch?v=WFVhL0jbutU&t=22s

Car ad https://youtu.be/hxShNrpdVRs


A male voice actor impersonated Julia Child for a commercial years ago; the writing was in a parodying style. She sued and won.


Parody is a protected fair use case.

You can make a Saturday Night Live sketch making fun of Darth Vader.

You cannot use a Darth Vader imitator to sell light sabers.


Sure, you can't sell light sabers. Can't you use a Darth Vader voice impersonator to sell vacuums? What about a voice that sounds like generic background actor 12?


You seem to be conveniently ignoring the fact that OpenAI is selling conversational AIs.


Exactly, parody isn't a commercial product.


There's a distinction between parody and impersonation, yes.


> If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

Yes

> Say I wanted to make my AI voice sound like Patrick Stewart

Don't tweet "engage" or "boldly go where no man has gone before" when you release the product and you should be ok.


> It's fine to hire a voice actor who sounds like someone else.

Not necessarily, when you're hiring them because they sound like someone else—especially someone else who has said that they don't want to work with you. OpenAI took enough steps to show they wanted someone who sounded like SJ.

> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.

See https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. and also Tom Waits vs. Frito-Lay.

> as long as they didn't mention SJ

Or tried to hire SJ repeatedly, even as late as 2 days before the launch.


> If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

No.

The fact that it sounds very much like her and it is for a virtual assistant that clearly draws a parallel to the virtual assistant voiced by SJ in the movie (and it was not a protected use like parody) makes it not OK and not legal.

> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.

Nope.

If you made it sound identical to Patrick Stewart that would also likely not be OK or legal since his voice and mannerisms are very distinctive.

If you made it sound kind of like Patrick Stewart that is where things get really grey and it is probably allowed (but if you're doing other things to draw parallels to Patrick Stewart / Star Trek / Picard then that'd make your case worse).

And the law deals with grey areas all the time. You can drag experts in voice and language into the court to testify as to how similar or dissimilar the two voices and mannerisms are. It doesn't nullify the law that there's a grey area that needs to get litigated over.

The things that make this case a slam dunk are that there's the relevant movie, plus there's the previous contact to SJ, plus there's the tweet with "Her" and supporting tweets clearly conflating the two together. You don't even really need the expert witnesses in this case because the behavior was so blatant.

And remember that you're not asking a computer program to analyze two recordings and determine similarity or dissimilarity in isolation. You're asking a judge to determine if someone was ripping off someone else's likeness for commercial purposes, and that judge will absolutely use everything they've learned about human behavior in their lifetime to weigh what they think was actually going on, including all the surrounding human context to the two voices in question.


A random person's normal speaking voice is nobody's intellectual property. The burden would have been on SJ to prove that the voice actor they hired was "impersonating" SJ. She was not: the Washington Post got her to record a voice sample to illustrate that she wasn't doing an impersonation.

Unless & until some 3rd other shoe drops, what we know now strongly --- overwhelmingly, really --- suggests that there was simply no story here. But we are all biased towards there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.


If "Her" weren't Sam's favorite movie, and if Sam hadn't tweeted "her" the day it launched, and if they hadn't asked SJ to do the voice, and if they hadn't tried to reach her again two days before the launch, and if half the people who first heard the voice said "Hey, isn't that SJ?" -

Then I'd say you have a point. But given all the other info, I'd have to say you're in denial.


That's not how any of this works.


I'm just listing known facts.


Wrong.


> the Washington Post got her to record a voice sample

Actually it only says they reviewed "brief recordings of her initial voice test", which I assume refers to the voice test she did for OpenAI.

The "impersonating SJ" thing seems a straw man someone made up. The OpenAI talent call was for "warm, engaging, charismatic" voices sounding like a 25-45 year old (I assume SJ would have qualified, given that Altman specifically wanted her). They reviewed 400 applicants meeting this filtering criterion, and it seems they threw away the 395 that didn't remind Altman of SJ. It's a bit like natural selection and survival of the fittest. Take 400 giraffes, kill the 395 shortest ones, and the rest will all be tall. Go figure.


You’re right that a random person’s voice is not IP, but SJ is not a random person. She’s much more like Mr. Waits or Ms. Midler than you or I.

I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.

You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.


> You are dead right that the order of operations recently uncovered precludes misappropriation.

I don't think that follows. It's entirely possible that OpenAI wanted to get ScarJo, but believed that simply wasn't possible so went with a second choice. Later they decided they might as well try anyway.

This scenario does not seem implausible in the least.

Remember, Sam Altman has stated that "Her" is his favorite movie. It's inconceivable that he never considered marketing his very similar product using the film's IP.


There are 4 other voices including male and androgynous ones. It wasn’t a second choice. Those voices have been available since 2023.


That's irrelevant. According to Midler v. Ford:

"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."

Ford having multiple ads would not have changed the determination.


The voice doesn’t sound like her and the article shows there’s a fair amount of proof to back up the claim that it wasn’t meant to and that there was no attempt to imitate her.


> The voice doesn’t sound like her

And yet so many people think it does. What a weird coincidence.

> there’s a fair amount of proof to back up the claim that it wasn’t meant to

Sam has said his favorite movie is "Her". Sam tweeted "her" the day it was released. Sam wrote to ScarJo to try to get her to do the voice. OpenAI wrote to her two days before the release to try to get her to change her mind. A large percentage of the people who hear the voice thinks it sounds like ScarJo.

I think we're just going to have to agree to disagree about what the evidence says. You take care now.


I interpret it completely differently given that the voice actor does not sound like SJ.

1. OpenAI wants to make a voice assistant.

2. They hire the voice actor.

3. Someone at OpenAI wonders why they would make a voice assistant that doesn’t sound like the boss’s favorite movie.

4. They reach out to SJ, who tells them to pound sand.

Accordingly, there is no misappropriation because there is no use.


I understand that the voice actor does not sound like ScarJo to you.

But you need to understand that it does sound like ScarJo to a lot of people. Maybe 50% of the people who hear it.

Those kinds of coincidences are the things that make you lose in court.


The voice is different enough that anyone who listens to samples longer than 5 seconds side by side and says they can’t tell them apart is obviously lying.

All the reporting around this I’ve seen uses incredibly short clips. There are hours of recorded audio of SJ speaking and there are lots of examples of the Sky voice out there since it’s been around since September.


To elaborate on the other comment -

It doesn't even need to sound like the person. It's about the intent. Did OpenAI intend to imitate ScarJo?

Half the people of the world thinking it's ScarJo is strong evidence that it's not an accident.

Given that "Her" is Sam's favorite movie, and that he cryptically tweeted "her" the day it launched, and that he reached out to ScarJo to do the voice, and that the company reached out to her again to reconsider two days before the launch -

I personally think the situation is very obvious. I understand that some people strongly disagree - but then there are some people who think the Earth is flat. So.


I don't think the intent matters (though it's moot in this case because I think there is clear intent): If someone unknowingly used a celebrity's likeness I think they would still be able to prohibit its use since the idea is that they have a "right" to its use in general, not that they have a defence against being wronged by a person in particular.

For example, if someone anonymously used a celebrity's likeness to promote something, you wouldn't need to identify the person (which would be necessary to prove intent) in order to have the offending material removed or prevented from being distributed.


> I don't think the intent matters

The closing statement of Midler v. Ford is:

"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."

Deliberate is a synonym for intentional.


The passage you cited reads "we hold only that when" (compare with "we hold that only when") which I understand as that they are defining the judgment narrowly and punting on other questions (like whether the judgment would be different if there were no intent) as courts often do. In fact the preceding sentence is "We need not and do not go so far as to hold..."

It might make sense for intent to be required in order to receive damages but it would surprise me if you couldn't stop an inadvertent use of someone's likeness. In fact the Midler case cites the Ford one: 'The defendants were held to have invaded a "proprietary interest" of Motschenbacher in his own identity.'. I think you can invade someone's "proprietary interest" inadvertently just as you can take someone's property inadvertently; and courts can impose a corrective in both cases, in the first by ordering the invasion of proprietary interest be stopped and in the second by returning the taken property.


Fair enough. But then Midler v Ford doesn't support your argument. Do you have a case that does?


No. (I did cite the Ford statement about "proprietary interest" which I think supports my argument).

I'm not familiar with all the case law but I assume that no case has been brought that directly speaks to the issue but people can and do discuss cases that don't yet have specific precedent.


Well - sure - for exotic areas of the law. Can the president pardon himself, etc.

Just seems like this area isn't that exotic.


I don't think that's true. I can't cite them off the top of my head, but when I read about Supreme Court cases, often a big point of contention is whether they decided to issue a narrow or broad ruling. Sometimes they decide to hear a case or not based on whether it would serve as a good basis for the type of ruling (narrow/broad) they want to issue.

And the legal universe is vast with new precedent case law being made every year so I don't think the corpus of undecided law is confined to well known legal paradoxes.

As for this case it doesn't seem that odd to me that the issue of intent has never been at issue: I would expect that typically the intent would be obvious (as it is in the OpenAI case) so no one has ever had to decide whether it mattered.


I dunno man. You sound like a nice guy and all - but I don't think you can make hypothetical legal arguments. It needs to be rooted.

I don't see much merit in continuing our discussion. You take care now.


That's a different standard: "Can you tell them apart side-by-side" vs. "does this sound like person X" or "is this voice exploiting the likeness of person X". It's the latter question that is legally relevant.


Well, I’m glad at least I know the things that make you lose in court now. I’m appreciative for the lesson.


No problem. You take care now.


I cannot read the article because of its paywall - is there actual proof OpenAI reached out to Johansson - or is it just being alleged by her lawyers?

It seems she has every reason to benefit from claiming Sky sounded like her even if it was a coincidence. "Go away" payments are very common, even for celebrities - and OpenAI has deep pockets...

Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?


Johansson is a super successful actress and no doubt rejects 95% of roles offered to her, just as she rejected Altman's request to be the voice of ChatGPT.

She doesn't need "go away" payments, and in any case that is not what we're looking at here. OpenAI offered her money to take the part, and she said no.

According to celebrity net worth website, SJ is worth $165M.


> Johansson is a super successful actress and no doubt rejects 95% of roles offered to her

> She doesn't need "go away" payments

> According to celebrity net worth website, SJ is worth $165M.

I have no idea what Johansson's estimated net worth or her acting career have to do with this. Wealthy people sue all the time for all kinds of ridiculous things.

The voice is, in fact, not Johansson. Yet, it appears she will be suing them nonetheless...

It's not illegal to sound like someone else - despite what people might be claiming. If it turns out to be true that Sky's voice actor was recorded prior to the attempted engagement with Johansson, then all of this is extra absurd.

Also, Sky doesn't sound like Johansson anyway... but apparently that isn't going to matter in this situation.


Midler v. Ford Motor Co. would disagree. There is a viable legal precedent, and it is very arguably illegal.

> The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval. The appellate court reversed the district court's decision and ruled in favor of Midler, indicating her voice was protected against unauthorized use.

https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


They’d have to prove the voice actor was imitating SJ. If OpenAI recorded the entire interview process as well as the sessions where they recorded the actor for samples to steer the model with then it should be open and shut. There’s also the fact of the 4 other voices. Who are they meant to be?


If her lawyers are half competent, then they wouldn’t lie. They may not tell the whole truth, but we’re not discussing what wasn’t said here.

As for your second question, yes. Otherwise you have a perfect workaround that would mean a person's likeness is free for all to use, but we already decided that is not acceptable.


> Otherwise you have a perfect workaround that would mean a person's likeness is free for all to use, but we already decided that is not acceptable

That is not decided. There have been high profile cases where someone's likeness was explicitly used without permission, and they still had no recourse. It was argued the person was notable enough they could not protect their likeness.

Regardless, it appears to be debated whether Sky even sounded like Johansson, which will make this very difficult for anyone to argue (being subjective and all). If the Sky voice actor was recorded prior to engaging with Johansson (which has been claimed by OpenAI), then it seems even more difficult to argue.

In the end, this will net Johansson a nice "go away" payday and then everyone will forget about it.


>Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?

Correct, that is not allowed in the US.


Sure, no-one is disputing that, and despite this Altman then contacts SJ again two days before release asking her to reconsider, then tweets "her" to remind the public what he was shooting for. The goal could have just been ChatGPT with a voice interface, but instead Altman himself is saying that the goal was specifically to copy "her".


He's not necessarily saying that was the goal from the start. All he is admitting with that tweet is that it is indeed (he finds it to be) reminiscent of "Her".


Well, from the start, Altman wanted SJ to do the voice. Perhaps he'd never seen or heard of the movie "her", and the association is just coincidental?

After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ. I guess that's just the kind of voice he likes.

So, Altman has forgotten about SJ, has 5 voice talents in the bag, and is good to go, right? But then, 2 days(!) before the release he calls SJ again, asking her to reconsider (getting nervous about what he's about to release, perhaps?).

But still, maybe we should give Altman the benefit of the doubt, and assume he wanted SJ so badly because he had a crush on her or something?

Then on release day, Altman tweets "her", and reveals a demo not of a sober AI assistant with a voice interface, but of a cringe-inducing AI girlfriend trying to be flirty and emotional. He could have picked any of the five voices for the demo, but you know ...

But as you say, he's not admitting anything. When he tweeted "her" maybe it was because he saw the movie for the first time the night before?


Sky has been in the app for many months; the release last week didn't add the voice, it merely added a mode which allows much more natural back-and-forth interaction via voice, which is, indeed, fairly reminiscent of the voice interface in "her".

He probably just really wants to actually have SJ's voice from the film. But SJ doesn't really have a right to arbitrary west coast vocal fry female voices. Without the "disembodied voice of an AI" element, I don't think most people would note "Oh she sounds like SJ". In fact, that's what the voice actress herself has said -- she hadn't gotten compared to SJ before this.


If down was up and blue was red then blueberries on the floor would be strawberries on the ceiling.


> After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ.

According to the article, the word After here is incorrect. It states the voice actor was hired months before the first contact with SJ. They might be lying, but when they hired the voice actor seems like it would be a verifiable fact that has contracts and other documentation.

And like others, not defending OpenAI, but that timeline does tend to break the narrative you put forth in this post.


The 2 days thing won’t fly. The omni model can probably produce any voice when led with a few seconds of audio. Meta developed a similar model a while back but didn’t open source it out of fear of deepfakes and there are commercial offerings like elevenlabs and open source ones like bark et al. So the last minute ask wasn’t nerves but a last ditch attempt to score a rebound for the launch.


The voice, or the plot/concept of the movie? Her was about an AI having enough realism that someone could become emotionally attached to it. It was not a movie about Scarlett Johansson's voice. Any flirty female voice would be appropriate for a "her" tweet.


Maybe, but that isn't what Altman did. He specifically tried to hire SJ. Twice.


And they hired a voice actress who sounds exactly like Rashida Jones [1] rather than SJ, 6 months before.

These don't have to be related. Maybe they are, but the confidence that they are is silly, since having a big celebrity name as the voice is a desirable thing, great marketing, especially one that did the voice acting for a movie about AI. My mind was completely changed when I actually listened to the voice comparison for myself [1].

[1] https://news.ycombinator.com/item?id=40435695


Exactly, just reminiscent of the movie. So the tweet really proves nothing in this case, unlike what some people seem to believe.


And if you commission an artist to draw a redheaded superspy clad in tight black leather for an ad for your product, it need not look like Black Widow from the MCU.

But if it does look very much like her, it doesn't really matter whether you never intended to.


You can see plenty of discussion elsewhere in this thread regarding how similar the result actually ended up being.

All I'm saying in the comment you are replying to is that it's incorrect to claim that Altman's saying that "the goal was specifically to copy 'Her'".


Lots of people are disputing it.


Where has anyone disputed "that they hired the actor that did the voice months before they contacted SJ"? Links preferred please.


It’s actually just a few comments away, and visible for me from the parent of your comment.

> Well, from the start, Altman wanted SJ to do the voice. Perhaps he'd never seen or heard of the movie "her", and the association is just coincidental? After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ.

I suppose some might imagine asserting the opposite as a distinct concept from disputing but there you have it. You should be able to find a link quite easily.


I don't see where that quote says "they contacted SJ before they hired the actor that did the voice" or anything similar. Perhaps you need to taste it a bit more before you try again.

"Whether or not Altman wanted SJ's voice" and "whether or not they got someone else to do SJ's voice before asking SJ to do it" are two completely independent matters.


> After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ.

What does 'after' mean to you? Strange response.


What does "picks" mean to you? Strange response.

Do you think they only got one person to do it or something?

If they weren't looking at multiple options then ... why did they still ask SJ?

Oh right because they asked SJ because they hadn't picked yet.


Even if the voice actor was sourced before they originally contacted SJ, it was clearly the intent to sound like her. There are so many other distinctive voices they could have chosen, but instead they decided to go as close to "her" as they could. Many people thought it was SJ until she stated it wasn't. I appreciate the voice actor may sound like that naturally, but it's hardly coincidental that the voice that sounds most like the voice from "her" was the one chosen for their promotion. It is clearly an attempt at passing off.


>Even if the voice actor was sourced before they originally contacted SJ, it was clearly the intent to sound like her.

Her, being the voice SJ did for the movie, not SJ's conversational voice which is somewhat different.

If OpenAI were smart, they did it in a Chinese-wall manner and looked for someone whose voice sounded like the movie without involving SJ's voice in the discussion.


This is not a thing. They hired a voice actor, who spoke in her normal speaking voice. That voice is not SJ's intellectual property, no matter what it sounds like. Further, I don't know how you can say any intention here is "clear", especially given the track record on this particular story, which has been abysmal even after this story was published.

They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine. Laurence Fishburne does not own his "welcome to the desert of the real" intonation; other people have it too, and they can be hired to read in it.

Again: the Post has this voice actor reading in her normal voice. This wasn't an impersonator.


> I don't know how you can say any intention here is "clear"

You are suggesting that it is coincidence that they contacted SJ to provide her voice, they hired a voice actor that sounds like her, they contacted SJ again prior to launch, and then they chose that specific voice from their library of voices and tweeted the name of the movie that SJs voice is in as a part of the promo?

I haven't suggested what they have done is illegal, given that the fictional company that created the AI "her" is unlikely to be suing them, but it is CLEARLY what their intent was.


What part of "actor" in "voice actor" did you not understand? You don't hire an actor to play themselves generally. "SJ" was not playing herself in Her.


> They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine.

Except that is simply not true. If their intent was to sound like Her, and then they chose someone who sounds like Her, then they're in trouble.


That is false. Read the appellate decision for Midler v. Ford. Remember that's a case where the first court to hear it said "lol no".


You should read it yourself, and the Waits case, and any other impersonator case.

You can use impersonators for parody, but not for selling products.


They didn't use an impersonator.


> "Hey, this sounds just like the actress in 'Her', great, let's use them"

You agree that OpenAI is choosing the voice because it sounds like SJ. How exactly is that different from impersonation?


That's perfectly fine. SJ does not have an intellectual property claim on someone else's natural speaking voice. This is addressed directly in Midler v. Ford.

You don't know that's what happened, but it wouldn't matter either way. Regardless: it is misleading to call that person an "impersonator". I'm confident they don't wake up in the morning and think to themselves "I'm performing SJ" when they order their latte.


It’s not perfectly fine. If a company uses an actress because she sounds similar to a character they want to associate with their product, they are liable for damages whether or not the actress lists “impersonator” in her job description.

The key here is intent. If there was no intention for OpenAI to model the voice after the character Samantha, then you're right, there's no foul.

But as I have explained to you elsewhere, that beggars belief.

We will see the truth when the internal emails come out.


It was not claimed that they cloned ScarJo's voice. They hired a soundalike when they couldn't get the person they wanted. Use or lack of use of AI is irrelevant. As I said before, both Bette Midler and Tom Waits won similar cases.

Since they withdrew the voice this will end, but if OpenAI hadn't backed off and ScarJo sued, there would be discovery, and we'd find out what her instructions were. If those instructions were "try to sound like the AI in the film Her", that would be enough for ScarJo to win.

I know that the Post article claims otherwise. I'm skeptical.


> It was not claimed that they cloned ScarJo’s voice.

There were some claims by some people when the issue first arose that they had specifically done a deepfake clone of SJ’s voice; probably because of the combination of apparent trading on the similarity and the nature of OpenAI’s business. That’s not the case as far as the mechanism by which the voice was produced.


It's technically possible that the Sky voice/persona is half voice actress and half prosody/intonation ("performance") from SJ/"her". Clearly the ChatGPT TTS system is flexible enough to add emotion/drama to the underlying voice, and that aspect must also have been trained on something.

Clearly a lot of people (including her "closest friends") find the ChatGPT demo to have been very similar to SJ/"her", which isn't to deny that the reporter was fed some (performance-wise) flat snippets from the voice actor's audition tape that sounded like flat sections of the ChatGPT demo. It'd be interesting to hear an in-depth comparison from a vocal expert, but it seems we're unlikely to get that.


> They hired a soundalike when they couldn't get the person they wanted.

Your opinion may vary, but they don't sound alike to me: https://news.ycombinator.com/item?id=40435695


Your immediate acceptance that a timeline representing the best spin of a deep-pocketed company in full crisis-PR mode proves the story "false", full stop, no caveats, is... I wouldn't say mind-bending, but quite credulous at a minimum. The timeline they present could be accurate, but the full picture could still be quite damning. As Casey Newton wrote today [1]:

> Of course, this explanation only goes so far. We don’t know whether anyone involved in choosing Sky’s voice noted the similarity to Johansson’s, for example. And given how close the two voices sound to most ears, it might have seemed strange for the company to offer both the Sky voice and the Johansson voice, should the latter actor have chosen to participate in the project. [...] And I still don’t understand why Altman reportedly reached out to Johansson just two days before the demonstration to ask her to reconsider.

They absolutely have not earned the benefit of the doubt. Just look at their reaction to the NDA / equity clawback fiasco [2], and their focus on lifelong non-disparagement clauses. There's a lot of smoke there...

[1] https://www.platformer.news/openai-scarlett-johansson-chatgp...

[2] https://www.vox.com/future-perfect/351132/openai-vested-equi...


>They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.

People lose their rational mind when it comes to people they hate (or the opposite I suppose). I don't care for Sam Altman, or OpenAI one way or another, so it was quite amusing to watch the absolute outrage the story generated, with people so certain about their views.


I don't understand the point you are trying to make. The essential question is whether they were trying to imitate (using a voice actor or otherwise) Scarlett Johansson's voice without her permission. Nothing in the article refutes that they were; whether they sought the permission before or after they started doing the imitation is irrelevant. Others have pointed to previous case law that shows that this form of imitation is illegal.

Moreover I can't see any reasonable person concluding that they were not trying to imitate her voice given that:

1. It sounds similar to her (It's unbelievable that anyone would argue that they aren't similar, moreso given #2).

2. Her voice is famous for the context in which synthetic voice is used

3. They contacted her at some point to get her permission to use her voice

4. The CEO referenced the movie which Johansson's voice is famous for (and which again depicts the same context in which the synthetic voice is being used) shortly before they released the synthetic voice.


Except the story isn't false? They wanted her voice, they got her voice*, they did marketing around her voice, but it's not her voice, she didn't want to give them her voice.

Notice how the only asterisk there is "it's technically not her voice, it's just someone who they picked because she sounded just like her"


>> They hired the actor that did the voice months before they contacted SJ.

Are you saying that story is false?


Yeah, but then again, I totally expected this when opening the comment threads. Same happened with the RMS debacle, same happened with similar events earlier, same happened on many a Musk story. It seems that a neat narrative with a clear person/object to hate, once established, is extremely resilient to facts that disprove it.


Right. Even if you think OpenAI isn’t a good place, this is an investigation by an established newspaper that refuted some of the more serious accusations (that OpenAI got a Johansson impersonator - they didn’t; that they modified the voice to sound like Johansson - evidence suggests this didn’t happen). When the reaction is “I don’t care that an investigation refuted some of the accusations”, it demonstrates someone isn’t openly approaching things in good faith.

Likewise, if someone’s attitude is - “OK, maybe there’s no paper trail, but I’m sure this is what the people were thinking”, then you’ve made an accusation that simply can’t be refuted, no matter how much evidence gets presented.


> refuted some of the more serious accusations (that OpenAI got a Johansson impersonator - they didn’t

A lot of the argument here comes down to whether the article does refute that. I don't believe it does.

What it refutes is the accusation that they hired someone who sounds like Johansson after she told them she would not do it herself. That was certainly a more damning accusation, but it's not an identical one.

But in my view, it requires a pretty absurd level of benefit of the doubt to think that they didn't set out to make a voice that sounds like the one from the movie.

Maybe good for them that they felt icky about it, and tried to get her for real instead, but she said no, and they didn't feel icky enough about it to change the plan.

Do you believe the article "refutes" that? Does it truly not strike you as a likely scenario, given what is known, both before and after this reporting?


> A lot of the argument here comes down to whether the article does refute that.

It clearly refutes the claims that they got a Johansson impersonator. The article says this is a voice actress, speaking in her normal voice, who wasn’t told to mimic Johansson at all. You can say that you personally think she was chosen because people thought she sounded similar to Johansson, even though there’s no evidence for that at this point. But the claim - which was made several times in discussions on here before - that she is a Johansson impersonator is factually incorrect.

> But in my view, it requires a pretty absurd level of benefit of the doubt to think that they didn't set out to make a voice that sounds like the one from the movie.

I tried it several times in the past and never once thought it sounded like Johansson. When this controversy came out I looked at videos of Her, because I thought Johansson could have been using a different voice in that movie, but no - the voice in Her is immediately recognizable as Johansson's. Some have said Sky's was much closer to Rashida Jones, and I agree, though I don't know how close.


I think this is quibbling over the definition of "impersonator"?

I think the most plausible thing that happened is that they thought "hey it would be so awesome to have an AI voice companion like the one in Her, and we can totally do that with these new models", and then auditioned and hired someone that sounded like that.

Does it not fit the definition of "impersonator", since they didn't explicitly tell the person they hired to impersonate the voice from the movie? Sure, fine, I guess I'll give it to you.

But it doesn't refute "they wanted to use a voice that sounded like the one in Her", and there are a number of indications that this was indeed the case.


> When the reaction is “I don’t care that an investigation refuted some of the accusations”, it demonstrates someone isn’t openly approaching things in good faith.

When the reaction is "it doesn't matter, it's still not ok to copy someone's voice and then market it as being that person's voice or related to that person's voice" and your reaction is to cast that as being something else, it demonstrates you are not openly approaching things in good faith.


An "investigation"?

Let's note that OpenAI didn't release the names of the voice talent since they said they wanted to protect their privacy...

So, how do you think the reporter managed to get not only the identity, but also the audition tape from "Sky"? Detective work?

An interesting twist here is that WashPo is owned by Bezos, who, via Amazon, is backing Anthropic. I wonder how pleased he is about this piece of "investigative reporting"?


> they said they wanted to protect their privacy

This very well could be a contractual obligation.


OpenAI allowed the reporter to hear some snippets from the audition tape. Not exactly my idea of an "investigation".

There are multiple parts to the voice performance of ChatGPT - the voice (vocal traits including baseline pronunciation) plus the dynamic manipulation of synthesized intonation/prosody for emotion/etc, plus the flirty persona (outside of vocal performance) they gave the assistant.

The fact that the baseline speaking voice of the audition tape matches the baseline of ChatGPT-4o only shows that the underlying voice was (at least in part, maybe in whole) from the actress. However, the legal case is that OpenAI deliberately tried to copy SJ's "her" performance, and given her own close friends noting the similarity, they seem to have succeeded, regardless of how much of that is due to having chosen a baseline sound-alike (or not!) voice actress.


Have you listened to both voices in the comparisons floating around? There is no way any of SJ's closest friends or family would be fooled if that voice called them up pretending to be SJ.


What facts disprove OpenAI making a voice that sounds like SJ such that the movie Her is referenced by Altman, and why is that actress upset?


> What facts disprove OpenAI making a voice that sounds like SJ

The objective parts of this are disproved in several ways by the very article under which we're commenting. The subjective parts are... subjective, but arguably demonstrated as false in this very thread, through examples of SJ vs. Sky to listen to side by side.

> such that the movie Her is referenced by Altman

You're creating a causal connection without proof of one. We don't know why Altman referenced "Her", but I feel it's more likely because the product works in a way eerily similar to the movie's AI, not because it sounds like it.

> and why is that actress upset?

Who knows? Celebrities sue individuals and companies all the time. Sometimes for a reason, sometimes to just generate drama (and capitalize on it).


> You're creating a causal connection without proof of one. We don't know why Altman referenced "Her", but I feel it's more likely because the product works in a way eerily similar to the movie's AI, not because it sounds like it.

There's no proof needed. A marketer doesn't market something for no reason.

We are all capable of interpreting his statement and forming an opinion about its intent. Indeed, the entire point of making any statement is for others to form an opinion about it. That doesn't make our opinion invalid - nor does the whining and backpedaling of the person who made the statement.

Your opinion may be different than others, but I doubt that would be the case if you were truly approaching this situation in an unbiased way.


You want to say that the dispute here is over ignoring objective facts, but it isn't. I haven't seen anybody here ignoring the facts laid out by this article.

The dispute is instead about statements just like your We don't know why Altman referenced "Her", which, on the one hand, you're right, the mind of another person is technically unknowable, but on the other hand, no, that's total nonsense, we do indeed know exactly why he referenced the movie, because we're social animals and we absolutely are frequently capable of reasoning out other people's motivations and intentions.

This is not a court of law, we don't have a responsibility to suspend disbelief unless and until we see a piece of paper that says "I did this thing for this reason", we are free to look at a pattern of behavior and draw obvious conclusions.

Indeed, if it were a court of law, that's still exactly what we'd be asked to do. Intent matters, and people usually don't spell it out in a memo, so people are asked to look at a pattern of behavior in context and use their judgement to determine what they think it demonstrates.


The objective parts don’t disprove that OpenAI set out to make an AI that sounded like Scarlett Johansson to use as a marketing ploy. In fact, I’d argue it’s more likely that’s exactly what the evidence suggests they did. But maybe a judge will get to rule on whose interpretation of the facts is correct.


I also see this dynamic on these same kinds of threads, but what I see is that one side is very sure that the facts disprove something, and the other side is very sure they don't. I've been on both sides of this, on different questions. I don't think there is anything weird about this, it's just a dispute over what a given fact pattern demonstrates. It's totally normal for people to disagree about that. It's why we put a fairly large number of people on a jury... People just see different things differently.


It's unhelpful because the massive comment chains don't bring anything to the "discussion" (this is literal celebrity gossip, so I'm having a hard time using 'discussion', but wait, this isn't Reddit, how could I forget, we're the enlightened HN). It just devolves into one's priors: do you hate or love OpenAI and sama for unrelated reasons? It's just a sports bar with the audience a few drinks in.


I mostly agree with you, but would ask: Why are you here, reading this thread? This isn't, like, a thread about something interesting that is being tragically drowned out by all this gossip. It's just an entirely bad thread that we should (and probably do) all feel bad about getting sucked into.

But the tiny sliver of disagreement I have with "this is a bad thing to discuss here and we should all feel bad" is that some people who frequent this site are sometimes some of the people involved in making decisions that might lead to threads like this. And it might be nice for those people to have read some comments here that push back on the narrative that it's actually fine to do stuff like this, especially if it's legal (but maybe also if it isn't, sometimes?).

The way I see this particular discussion is: outside the tech bubble, regardless of the new facts in this article, people see yet another big name tech leader doing yet another unrelatable and clearly sleazy thing. Then what I see when I come to the thread is quite a few of the tech people who frequent this site being like "I don't get it, what's the problem?" or "this article totally refutes all of the things people think are a problem with this". And I feel like it's worth saying: no, get out of the bubble!


I'm surprised by many people's reaction to this too, especially given that, IF it were possible right now, the industry would steal everybody's lunch in an instant without thinking about the consequences. This case is like an appetizer for what's to come if things keep going this way.


> I mostly agree with you, but would ask: Why are you here, reading this thread? This isn't, like, a thread about something interesting that is being tragically drowned out by all this gossip. It's just an entirely bad thread that we should (and probably do) all feel bad about getting sucked into.

I was just reading the comments after reading the article to see if anything new came up, and was pretty appalled at the quality of commentary here. I'm not participating in the thread more because it's not worth it.

> But the tiny sliver of disagreement I have with "this is a bad thing to discuss here and we should all feel bad" is that some people who frequent this site are sometimes some of the people involved in making decisions that might lead to threads like this. And it might be nice for those people to have read some comments here that push back on the narrative that it's actually fine to do stuff like this, especially if it's legal (but maybe also if it isn't, sometimes?).

I've been involved in big decisions in other Big Tech companies. I'm proud of having fought to preserve Tor access to our offerings because I believe in Tor despite the spam and attacks it brings. I don't know about other folks in these positions, but if I were to read discussion like this, I'd roll my eyes and close the thread. If a random incoherent drunk ranter told me something was wrong with my ideas, I'd dismiss them without much hesitation.

> The way I see this particular discussion is: outside the tech bubble, regardless of the new facts in this article, people see yet another big name tech leader doing yet another unrelatable and clearly sleazy thing.

Because journalists know there is anti-tech sentiment among a segment of the population and so they stoke it. I don't know that much about this case, but a different story I've been following for a while now is California Forever's effort to create a new city adjacent to the Bay Area. Pretty much every article written about it calls the city a "libertarian city" or "libertarian, billionaire dream". I'm involved in local planning conversations. I've read over their proposals and communications. They never, ever, mention anything about libertarianism. They're not proposing anything libertarian. They're working with existing incorporation laws; they're literally acting as a real-estate developer the same as any other suburban tract developer anywhere else in the US. But the press, desperate to get clicks on the story, bills it as some "libertarian city".

This "bubble" that you speak of is literally just a bubble created by journalists. I'm not saying that tech hasn't created some new, big, real problems nor that we shouldn't discuss these problems, but we need to recognize low-effort clickbait where we see it. This article and thread [1,2] talk about the reasons why, and it's not simple or straightforward, but at this point I consider most (not all) tech journalism to basically be tabloid journalism. It's meant specifically to drive clicks.

The only silly thing is some folks on HN think this site is somehow more high-brow than some general social media conversation on the news. It's the same social media as everywhere else, it's just more likely that the person talking is a software nerd, so the clickbait they fall for is different. My comment is my attempt as a community member to remind us to strive for something better. If we want to be more than just another social media site then we need to act like it. That means reading articles and not reacting to headlines, having good-faith conversations and not bringing strong priors into the conversation, and actually responding to the best interpretations of our peers' comments not just dunking on them.

[1]: https://asteriskmag.com/issues/06/debugging-tech-journalism

[2]: https://news.ycombinator.com/item?id=40201818


> I don't know about other folks in these positions, but if I were to read discussion like this, I'd roll my eyes and close the thread.

I'm loath to reply because I know you don't want to engage anymore, but I think it's fair to reply to your reply on this point:

I'm sure you're right that the people involved in this will be defensive and roll their eyes, and I'm sympathetic to that human reaction, but it's also why society at large will continue along this path of thinking we suck.

If we roll our eyes at their legitimate criticism of all this sleazy stuff that is going on, then they're just right to criticize us.

And sure, "we shouldn't care if people at large think we suck because we roll our eyes at their criticism of our sleaziness" is a totally valid response to that. But I'm certainly not going to take up our cause against the inevitable backlash, in that case.

> This "bubble" that you speak of is literally just a bubble created by journalists.

I don't think so. I'm honestly sympathetic to how you've become convinced of that by the California Forever thing, which I agree has gotten a raw deal in the press. But I think this tech / SV / HN bubble is nonetheless a real thing. I work inside that bubble but live outside it. I spend a decent amount of time during my days reading and (often foolishly, like today) commenting on threads here.

But I spend a lot of my evenings and weekends with friends and family (in Colorado) who are very distant from our "scene". And I'm telling you, in my twenty-year career, I have watched the evaluation shift from "the internet is so awesome, google search is amazing, you know how to do software, that's so cool!" to "I don't know how you can stomach working in that industry". Sure, the media has had some impact on this, but we've also been super arrogant, have screwed up tons of stuff that is visible and salient to lots of people, and have seemed completely oblivious to this.

This episode is just one more example of that trend, and I think it's crazy to think "nah, this is all fine, nothing to see here".


I'm not sure what RMS has to do with Altman. I'm also not sure why you think people just want to hate on Musk when it took a decade of his blatant lies for most people to catch on to the fact that he's a conman (remember, everyone loved him and Tesla for the first 5 or 10 years of lies). But the comparison between Musk and Altman is pretty apt, good job there.


Well, I'm not sure what you mean by 'conman'. Wildly successful people aim high, a lot. They don't meet 80% of their goals, and that is perfectly ok. Even as low as 20% success on a lot of these moonshot things sets you ahead of the masses who aim very low and get there 100% of the time.

This whole idea that someone has to comply with your idea of how one must set goals, and then meet them, is not something other people have any obligation to measure up to. Also, what's the deal about his lies? He can say whatever he wants and not get there. He hasn't sworn an oath to you or anyone else, and he is not at fault for not measuring up.

Musk might not get to Mars, he might end up mining asteroids or something. That is ok. That doesn't make him a conman.

tl;dr: Anyone can say, work at, and fail at anything they want. And they don't owe anybody an explanation for a darn thing.


It's not aiming high when anyone competent and informed can tell him there's no way, and he pays many competent and informed people to tell him.

He can set all the goals he wants. Setting a goal is not the same as telling people the company that you are dictator of is going to do something.

He's not setting goals, he is marketing, and he does it very well.

As far as how he's a conman https://news.ycombinator.com/item?id=40462194 although you already know that full well so you'll continue thinking he's some sort of hero.


> It's not aiming high when anyone competent and informed can tell him there's no way, and he pays many competent and informed people to tell him.

You know that this is exactly how SpaceX won big? There were many competent, credentialed people telling Musk that reusable rockets are a pipe dream, all the way to the first Falcon 9 landing and reflight. Some of them even continued giving such "competent and informed" advice for many months afterwards.

> Setting a goal is not the same as telling people the company that you are dictator of is going to do something.

That's literally what it means, though.


> You know that this is exactly how SpaceX won big? There were many competent, credentialed people telling Musk that reusable rockets are a pipe dream, all the way to the first Falcon 9 landing and reflight. Some of them even continued giving such "competent and informed" advice for many months afterwards.

And there were also people saying it could be done. Where were those people for self-driving? (Oh right they were in the Facebook comment section with no relevant knowledge.)

> That's literally what it means, though.

No, it's not at all.

A goal is personal, or perhaps organizational. You need not announce something on Twitter in order to set a goal for yourself or for your company.


It’s human nature: people see others achieve what they cannot, and try to pull them down. You see this wrt Musk on this site a lot, too.


It has nothing to do with this. There are many successful people and businesses that I admire, and a number of notable examples of those I do not. The two your comment mentions are simply part of that latter group. I think for good reason. (But of course I would think that...)


I’m not talking about you specifically here. What you’re saying could be true for you, and not true for the community as a whole. With the benefit of experience, I can tell for certain there’s (on average) a strong undercurrent of jealousy against people perceived as overly ambitious, particularly if they are successful in their ambitions. This is not specific to this site, of course, or even to the tech community in general.


You're right that it could be the case. But I disagree that it is likely to be, in this specific case, and the other specific case you cited.

I think many people distrust and dislike Altman and Musk in particular because of their own specific behavior.

Some people are hated because people are just jealous, but other people use that as an umbrella excuse to deflect blowback from their behavior that they entirely deserve. I believe this is one of those latter cases.


But you don't even know what their behavior really is. What you do know is mostly the narrative created by fake news media. At best it's lies by omission; at worst, outright smears and fabrications. I don't know much about Altman, but people who worked with Musk hold him in very high regard and say he's the real deal. Yet if you go by the narratives you'd think the exact opposite.


I don't know about you, but judging by what Elon says in his social media posts, I'm going to assume that the people who worked with Musk are poor judges of character. Or these same people happen to have a ton of TSLA stock or some other vested interest in Elon, so kissing his ass is good for their bottom line.


I don’t care what he says. Last I checked he’s free to say whatever he likes. Attempts to police the speech of others should be no more socially acceptable than farting in a crowded elevator. I care what he actually accomplishes. If his being a little unhinged leads to greater accomplishments, that’s a worthwhile tradeoff in my book.


Judging people for their behavior is not "policing the speech of others".

The hilarious thing about this line of reasoning is that I am saying "that thing he said is not acceptable to me" and people like you say "that thing you said about that thing he said is not acceptable to me". The pattern here is "saying X is not acceptable to me", just different values of X. If what I'm doing is policing his speech, and that isn't acceptable to you, then what you're doing is policing my speech, which also shouldn't be acceptable to you.

A paradox? No, not at all, because the resolution of this paradox is just: Nobody here is policing anybody's speech. Everyone is just expressing their own opinions, in a completely normal way.

I'm free to care about how people behave, and not just what they accomplish. And you're free both to not care about that, and also to judge people who do care.

All of these streets go two ways!


Do you think there’s more to “behavior” than just speech?


Yes.

But I wish you'd made your actual point, instead of asking this vague question. I don't want to guess what point this question is a setup to, but who knows if I'll ever make it back to this thread to find out what point you're going to make.


These are famous people who do and say things in public all the time, which I can see and evaluate entirely on my own.

That in no way implies that people who have worked with these people or know them socially will agree with my assessments of them. People just assess things differently from one another. We all care about and prioritize different things.

(For what it's worth, it's possible that I also know a few people who have worked for or with Musk, and have incorporated the nuances of their views into mine, to some extent...)


Ah yes, the real deal.

How's that self driving coming along?

How about the Cybertruck being a few years late?

How about the low cost car being cancelled?

How about the you have all the hardware you need - oh wait oopsie you need to pay us thousands more

How about the taking Tesla private tweet?

How about the repeatedly and flagrantly violating government contracts that are basically his company's only revenue because he's too powerful for consequences?

How about...

Definitely the real deal.


Self driving is actually coming along better than I thought it would. Version 12 behaves like a human driver.

The real source of your frustration, though, is that he does not fully subscribe to the neoliberal dogma and lets millions of others say whatever they like. The totalitarian left can't handle that; it's a visceral reaction.


> Self driving is actually coming along better than I thought it would. Version 12 behaves like a human driver.

LOOOOOOOOOOOOOOOOOL

Anyway that would be cool if it was true but doesn't change what a conman he is. He said he was gonna make it true how long ago? Yeah.

> The real source of your frustration, though, is that he does not fully subscribe to the neoliberal dogma and lets millions of others say whatever they like. The totalitarian left can't handle that; it's a visceral reaction.

That's not even remotely true. Seriously. No part of it.

- It's been obvious to me that he was a conman since he started lying at Tesla, a decade or more ago. 'Fraid to say that alone disproves your unfounded personal attack since he didn't own a social media platform at the time.

- He doesn't allow people to say whatever they like unless he agrees with them.

- He (ok, and perhaps you) is the only totalitarian in this conversation.

But hey good job trying to make this out to be about politics instead of about what a terrible human Musk is, I know that's the only way for you Repugnantcans to cope with the cognitive dissonance.


That may be true, but you specifically said "on this site" and now you're saying "this is not specific to this site". And no, the vast majority of people "try[ing] to pull them down" to Musk are doing it because he's an egotistical hypocritical whiny jackass.


That was what Elizabeth Holmes claimed as well; however, we know that some people who try to achieve greatness are grifters. A pithy saying doesn't change that reality.


You can't seriously claim there's any equivalence between Altman/Musk and Holmes. The former two have something to show for their ambition; Holmes was basically a fraud with no substance behind her whatsoever.


So it's okay to commit fraudulent acts if "you have something to show for it"?

Even if Altman were a good person, he is the face of a company that is taking some very suspicious actions. Actions that got the company cooked in litigation. So those consequences will associate with that face: consequences for not following robots.txt, for asking forgiveness rather than permission from other large companies, and now this whole kerfuffle.


I'm not comparing the products, I'm criticizing the statement that people are just jealously trying to bring down those who attempt to achieve greatness. Also, you can have a great product and still do ethically and legally questionable things that people will criticize.


Tbf, Altman really screwed this up with that tweet and the very sudden contact right before launch. There probably wouldn't be much of a case otherwise.

If I had to guess the best-faith order of events (more than OpenAI deserves):

- someone liked Her (clearly)

- they got a voice that sounded like Her, subconsciously (this is fine)

- someone high up hears it and thinks "wow this sounds like SJ!" (again, fine)

- they think "hey, we have money. Why not get THE SJ?!"

- they contact SJ, she refuses, and they realize money isn't enough (still fine, but there's definitely some schadenfreude here)

- marketing starts semi-independently, and they make references to Her, because famous AI voice (here's where the cracks start to form. Sadly, the marketer may not have even realized what talks went on).

- someone at OpenAI makes one last Hail Mary before the release and contacts SJ again (this is where the trouble starts. MAYBE they didn't know about SJ refusing, but someone in the pipeline should have)

- Altman, who definitely should have been aware of these contacts, makes that tweet. Maybe they forgot, maybe they didn't realize the implications. But the lawyers' room is now on fire.

So yeah, Hanlon's razor. This could be a good-faith mistake, but OpenAI did a good job of ruining their goodwill even before this PR disaster. Again, sweet schadenfreude, even if we assume none of this was intentional.


Just how many "Good faith mistakes" is a company / CEO permitted to make before a person stops believing the good faith part?


I'm a pretty forgiving person. I don't really mind mistakes as long as they are 1) admitted to, 2) followed by steps to actively reverse course, and 3) followed by guidelines to prevent the same mistakes from happening.

But you more or less drain that good faith when you are caught with your pants down and decide instead to double down. So I was pretty much against OpenAI ever since the whole "paying for training data is expensive" response during the NYT trials.

----

In general, the populace can be pretty unforgiving (sometimes justified, sometimes not). It really only takes one PR blunder to tank that good faith. And much longer to restore it.


Mistakes should be made once and once only, irrespective of good or bad faith. It is no longer a mistake when you do the same misstep over and over again, it is a deliberate pattern of behaviour.


If they got the voice anyway, why did they contact her?


The population of this site reacts to all stories like this. It’s only Gell-Mann Amnesia that causes your mind to bend.


That's even extra flattery for her.


Legally, the issue isn't what they were thinking when they hired the actor, it's what the intent and effect was when they went to market. (Even if there were documentary evidence that they actively sought out an actor for resemblance to SJ's voice from day one, the only reason that would be relevant is that it would also support that that was their intent with the product when it was actually released, not because it is independently relevant on its own.)

Whether or not they had any interest in SJ’s voice when they hired the other actor, they clearly developed such an interest before they went to market, and there is at least an evidence-based argument that could be made in court that they did, in fact, commercially leverage similarity.


No, that's not the legal standard here.


It is a curious reaction, but it starts to make sense if some of these posters are running ops for intelligence agencies. Balaji Srinivasan noted that as the US started pulling out of foreign wars, the intelligence apparatus would be turned inward domestically.

Some of it can also be attributed to ideological reasons, the d/acc crowd for example. Please note I am not attacking any individual poster, but speculating on the reasons why someone might refuse to acknowledge the truth, even when presented evidence to the contrary.


> Are we supposed to believe that the company lawyer or head of HR did this without consulting (or more likely being instructed by) the co-founder and CEO?!

Yes this is pretty typical. The CEO doesn’t make all decisions. They hire people to make decisions. A company’s head of legal could definitely make decisions about what standard language to use in documents on their own.


It could have simply been the other way around: they auditioned some unknown voice actors, then someone noted that one of them sounded like Scarlett Johansson. They optimistically contacted SJ, assuming she would agree, but then had to back off.


Sky does not really sound like SJ, though, if you listen side by side. According to OAI's timeline, they intended to have Sky in addition to SJ. OAI's voice models, including Sky, predate the GPT-4o voice assistant. Also:

"In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”"

It did not seem like an issue before, and the Sky voice was public many months before GPT-4o. I don't believe SJ can claim to own all young, attractive women's voices, whether they are used for a voice assistant or not. It seems like the issue is being blown out of proportion. It does make a good story. The public perception of AI right now is generally negative, and people are looking for reasons to disparage AI companies. Maybe there are good reasons sometimes, but this one is not it.


> It seems like the issue is being blown out of proportion.

It kinda feels like it's on purpose. Someone in a previous thread mentioned that this might have been a cynical marketing ploy, and I'm warming up to the theory. After they recorded the Sky VA, they figured out a whole marketing campaign with SJ to promote the voice feature. After she turned them down (twice), they released with just enough crumbs referencing the movie to goad SJ into committing a first-degree Streisand.

With the slow roll out, everyone would have forgotten about the feature the day after the announcement but now it's been in the news for a week, constantly reminding everyone of what's coming up.


Calling it a Streisand on behalf of SJ is wrong, though. She wanted it to be a topic of discussion and succeeded at that.


In addition, they have prepared for a full court case beforehand, with all their ducks in a row, in theory. I am not a lawyer, so I am not sure if the law works this way, but this might help them defend their case and set a precedent.


Are we really at a point where tech companies will bait lawsuits just to get more PR? Clearly they need to be struck down to remember why old-school companies go out of their way to avoid even the possibility of lawsuits.


That sounds plausible actually. The controversy has given OAI free marketing.


I'm also curious, legally speaking, is it an issue even if Sky's actress does sound like Scarlett? What if OpenAI admits they intentionally chose someone who sounded like Scarlett? Does it matter whether she was using her natural speaking voice or intentionally mimicking Scarlett's voice and mannerisms?

This seems similar to the latest season of Rick and Morty. Whether justified or not in that particular case, it rubs me the wrong way a bit in principle to think that a production can fire someone only to hire someone else to do a near-perfect copy of their likeness. If (as in the OpenAI case) they'd gone further and trained an AI on the impressions of Justin's voice, would that have been considered an AI impersonation of Justin with extra steps?

All of which is to say, this seems like a pretty interesting legal question to me, and potentially broader than just AI.


The fired actor would have already signed away any claim to the character's likeness. The likeness the company cares about is that of the character, not of the actor portraying the character. The actor never owned the character, so the actor shouldn't be miffed that someone else gets the part for future performances.


That's probably the case. Having said that, there are also a lot of one-off side characters which use Justin's distinctive voice style, although I can't remember specifically whether that was the case in the latest season, and I'm not aware that detailed information about their internal agreements is public knowledge either way. I was speaking more about the general principle, not strictly that particular situation. Maybe I'm wrong, but it seems like there are some interesting overlapping legal and moral dilemmas in all of the discussions about both situations, regardless of what the specific facts of the OpenAI and R&M cases may be.


Yes, I can see a plausible argument that a character is so intertwined with a well-known real-life persona that a company can't replicate a character without borrowing some of the persona's value. One might also make the case that the actor developed so much depth in an initially thin character that they deserve more credit than just acting the part.

I don't personally subscribe to the notion that the recent legal invention of intellectual property is a moral right. Capitalism has been doing just fine as a productivity motivator. We don't need to capitalize expression of ideas, let alone pure ideas. I accept the tradeoff of the temporary monopolies of copyright and patent, and I appreciate that trademark and trade secrets disincentivize bad behavior. But I have no desire to try to find new boxes to store new kinds of intellectual property, like Scarlett Johansson's right to monopolize performances of a character in an app that remind people of her performance of a character in a movie. Such a kind of property right is not necessary.


See Midler vs Ford, Glover vs Universal or Stefani vs Activision for prior cases in this area. Courts usually side with the person being imitated.


"SJ can't own all female AI voices" is attacking a straw man version of the complaint, which is much narrower. The question is whether OpenAI deliberately fostered the impression of an association between their product and her performance, which she had so far refused.

To your point, there have been many female assistant voices on the market, including Sky -- but what might have tripped the line of impersonation was the context in which this particular one was presented and marketed. I don't know where exactly that line should be, but you can certainly reject this kind of marketing without stifling anybody's legitimate career.


Regardless of the moral implications, "sounds almost exactly the same" is not copyright infringement. Perhaps it could be trademark infringement if she had trademarked her voice like Harley-Davidson attempted (and failed) to trademark the sound of their motorcycles, but "sounds alike" is a pretty hard case to prove, and it's completely blown away if they can demonstrate that another human sounds indisputably similar.

People do celebrity impressions all the time, and that's not infringement either, because it's not actually copying that person's voice.

I'm sympathetic to SJ in this matter, especially after the Disney Black Widow debacle, but it sounds like she had the opportunity to write herself a nice check, and she turned it down.

On the basis of this article, it sounds like she doesn't have the cause of action that she had believed she had; I imagine that her legal team are now advising a fast settlement, but OpenAI's legal team might prefer to milk the free publicity for as long as they can, especially if they are fairly certain they would prevail at trial.


It isn't about copyright; it's about passing off, which is described in detail elsewhere in these threads. It's about intention and what the customer believes. If customers might believe it's SJ, due to sama's tweets, the general likeness in voice, the context (a voice assistant), and the public info about them trying to get SJ to do this, then that's passing off, even if it wasn't trained on her voice per se. There are numerous law cases about this.


> it sounds like she had the opportunity to write herself a nice check, and she turned it down.

If I were SJ, I'd turn it down too. She's in no need of money, and selling her voice to OpenAI would make most creators and every single voice actor hate her (not to mention the Twitter mob).

In the majority of creative circles, the current social norm is to hate AI, so touching AI in any way is too risky for one's reputation.


It is probably worth paying attention to the water WaPo is carrying for OpenAI here, alongside their publisher's announcement about prioritizing the use of AI in their newsrooms.


It doesn't seem like you'd need "shenanigans" for this. Lots of voice actors are capable of doing voices that sound like other people, and some even have a natural voice that happens to sound very similar to a particular more noteworthy celebrity. AFAIU, the rights to your likeness only apply to your likeness, not to the likeness of someone else who happens to look or sound a lot like you.

For a case that doesn't involve AI at all, consider situations where a voice actor in a cartoon is replaced (sometimes while still alive) by someone who can perform a voice that sounds the same. Decisively not illegal. Most people don't even find it immoral, as long as the reason for getting rid of the original voice actor wasn't wrong on its own (e.g. Roiland).


> For a case that doesn't involve AI at all, consider situations where a voice actor in a cartoon is replaced (sometimes while still alive) by someone who can perform a voice that sounds the same. Decisively not illegal.

Because there are contractual clauses. Do you think Dan Castellaneta owns the voice of 'Homer Simpson'? Or does Fox own that? It would be crazy to develop a show and then be held hostage by your voice actors for all future shows - what if they get hit by a car?


The attempts to sound like Mel Blanc after his death just don't sound right to me. Or maybe it's just the bad scripts.


The article clearly disputes this. They hired and worked with the voice actor for Sky months before the first time SJ was contacted, and the voice actor used for Sky never had the movie Her or SJ's name mentioned to her a single time.


The movie Her predates all of this by years, and Sam Altman even tweeted "her"! The OpenAI team is clearly well aware of Scarlett's voice (it's inconceivable that the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry). Of course they knew.

When auditioning actors "months before", they could still look for an actor who, guess what, sounds like SJ, even "before the first time SJ was contacted".

As the actor, I'd likely also be looking to emulate SJ in Her; it's clearly what the client was looking for.


> it's inconceivable that the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry

Let's not exaggerate. It was a somewhat popular movie, yes, but not really defining, and far from the first example of a conversational AI speaking in a woman's voice. There are plenty of examples in movies and TV shows.

If anything, the seminal work in this space is Star Trek casting Majel Barrett-Roddenberry as the voice of computer systems with conversational interfaces, as early as 1987 (or 1966, if she had that role in the Original Series; I don't remember those episodes too well), all the way to ~2008 (or to 2023, if you count post-mortem use of her voice). That is one distinctive voice I'd expect people in OpenAI to be familiar with :).

Also, I can't imagine most people knowing, or caring, who voiced the computer in Her. It's not something that most people care about, especially when they're more interested in the plot itself.


> Let's not exaggerate. It was a somewhat popular movie, yes, but not really defining, and far from the first example of a conversational AI speaking in a woman's voice. There are plenty of examples in movies and TV shows.

I'm honestly surprised so many people are making this argument, seemingly with a straight face.

It would have been a pretty weak argument even without the tweet from Altman (it is not an exaggeration to say it is the canonical "AI voice companion" cultural artifact of our times; rather, it requires exaggeration to downplay it), but the CEO's own marketing of the connection weakens the argument past the point of plausibility.

Surely there are better defenses available! But with this line ... phrases like "don't piss on me and tell me it's raining" and "don't believe your lying eyes" keep popping into my mind for some reason ...


The quotation above is

> it's inconceivable that the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry

rather than

> It is inconsistent that Sam personally wasn't aware

(He obviously was)

I'd agree that Majel Barrett-Roddenberry is the prime example of a computer voice interface for most nerds… but then I looked up when Her was released and feel old now because "surely it's not 11 years old already!"


If "her" wasn't the canonical example of a near-future AI assistant in a film, why then would Sam bother to tweet the single word "her" following launch? I think that film is far more influential than you give it credit for here.

Everyone in tech who saw that tweet knew what it meant - a single word. The tweet doesn't even require additional context or explanation to almost anyone in this industry.

There is also a clear difference in the behaviour of the "computer" in Star Trek vs "her": what OpenAI shipped is far more like the personality of "her" than the much more strait-laced examples in Star Trek, where the computer was virtually devoid of emotional-sounding responses at all.


Just anecdotally, I personally didn't know about the existence of that movie before this whole drama began. Sam, who tweeted that, probably did know, however.


Who cares.


What matters is whether OpenAI leadership had the movie Her in mind, and the AI in Her is more similar to LLMs than the Next Generation Star Trek main computer.


Computers have had conversational voices at least since Lost In Space.


I think we can mostly all agree that "her" presents a much more realistic portrayal of a near-term future than Lost in Space et al.


I was referring to the robot's voice.


Even if that's true, why would that be illegal or unethical? She can't possibly have a copyright on all voices that sound like "her".


There have been cases where it was decided that a person had rights[0] to their distinctive voice, as an extension of the Right of Publicity[1]. For example Midler v. Ford Motor Co.[2], and one or two other cases I've seen mentioned but can't remember.

[0]: Though not necessarily "copyrights"?

[1]: https://higgslaw.com/celebrities-sue-over-unauthorized-use-o...

[2]: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


Midler v. Ford is a poor comparison for this case specifically because of: (1) hiring an impersonator, (2) copying a song, and (3) intending to confuse.

If what OpenAI is saying is true, then none of the conditions apply. I'd say (1) is unlikely, (2) is very unlikely, and (3) is maybe, at least to some degree.


I would suggest that (3) is a solid "yes" given the other communications around this, and honestly the similarity of the final tone of voice.

Very little suggests an intent to confuse more than tweets from company leaders confirming that there was intent to confuse. What is left on the table is whether actual confusion occurred, which is different.


NIL rights are pretty broad, and more like trademark rights than patents or copyrights. The main test isn't similarity, it's whether there is brand confusion. Karpathy and Altman's tweets indicate brand confusion.

Still, this isn't recognized in every state or country, and there aren't many cases yet (although there are laws).


Sure, it could have happened, but it seems we don’t have evidence either way.

Tweeting “her” months later doesn’t prove anything. That Tweet might superficially look like evidence of intent, but if you think about it, it’s not.


Counterpoint: if you think about it, yes it is.


To spell it out, based on the date, it's very weak evidence for something that happened many months before.


To spell it out in the way that doesn't require a crazy level of suspension of disbelief:

Based on the date (right after the public release of the assistant), it is actually very strong evidence for "we thought it would be awesome to have an AI voice companion that sounds like the one in Her", which, combined with the (undisputed) revelation that they indeed tried (twice) to get the person who did that voice, is a very strong indicator of the intent of the thing that happened many months before.


Yes, actually trying to hire Scarlett Johansson is very good evidence that they thought it would be awesome if they succeeded in hiring her as a celebrity voice. A one-word tweet adds nothing to that.


Sure it does. Intent is all about a pattern of behavior. It's part of the pattern.


Right. And that's extremely hard to believe. A discovery search of the internal emails should give us a definitive answer.


To find this "extremely hard to believe", you have to argue that this story, which has multiple sources unaffiliated with OpenAI, contemporaneous documentary evidence, and is written by a reporter with every incentive in the world to roast OpenAI, is directly wrong about facts plainly reported in the story.

I think you have to want this story to be wrong to think it's wrong. It's a tough beat! Everyone was super sure OpenAI did this bad thing just a couple days ago, and now they're feeling sheepish about it.


The reporter was quite unquestioning.

OpenAI reached out to SJ to use her as a voice actor.

Why was the casting call for non-union actors only? SJ is a SAG union member. "Non-union actors, unless you happen to be her?"

Her agent feared for her client's safety? Why? And to the point where the agent wants to stay anonymous, too?

Highly ironic, given that ultimately, a single voice statement from the voice actor could be far more conclusive support for OpenAI's defense.

Altman was not "intimately involved"... well, other than the latter attempt to sign SJ two days prior to launch coming from him personally...

The agent said the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI. Weird they'd not say anything to her when they were making these references public.


Out of all journalistic outlets, WaPo, being owned by Bezos, has the least incentive to roast OpenAI in this case.

Wouldn't it be nice if Bezos/Amazon could make Alexa sound like a voice actress that sounds a lot like ScarJo without repercussions? First step is to shape the public opinion.


It isn't the specific facts covered by the article that I find hard to believe. It's claims like "they probably weren't even aware of the movie and definitely weren't trying to create a voice that sounded like it".

It's true that their behavior is less damning, with the facts from this reporting. And that's a good thing. But it's not true, in my opinion, that the article demonstrates that there is no behavior to criticize.


I'm sorry friend, I don't know what to tell you.

I find it extremely hard to believe.


I can see that you do.


He literally posted "her" on Twitter the day before the press release. The willful ignorance here is astounding.


I wrote a comment to this exact effect yesterday. We didn't have both sides of the story; let's give Sam/OAI, like, literally just a single day to present their own side. Now he has, and I am sure all the people shouting down Sam at every turn still give no shits and will move the goalposts.


That doesn't mean anything. They could have been and were likely developing the process and technology while having Johansson in mind the whole time.

> never had the movie Her or SJ's name mentioned to her a single time

How do you know that?


The article says:

>The agent, who spoke on the condition of anonymity to assure the safety of her client, said the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI.


It was mentioned by Sam himself on Twitter. So this is a bald-faced lie I can't believe anyone is buying.


So we have an anonymous statement from a puff piece, while multiple verifiable public records show OpenAI publicly mentioning Johansson and Her. Count me skeptical.


I would say that OpenAI wanted something that sounded like Her, which in turn sounded like Scarlett Johansson.

I also think the "sounded like" is less clear than you think. Is it similar? Yes. But I'm not sure where the line is on how similar is too similar; I for sure didn't think it was Scarlett Johansson. By saying it is Scarlett Johansson and relating it to her, though, our brains will make the association. That is marketing.


Since they asked two days before it was launched back in September, my guess is that the voice had already been created by then.


But there's nothing wrong with this!

Let's say I'm making a movie. I have an old wizard character similar to Gandalf in Lord of the Rings, so I contact the guy who played Gandalf in Lord of the Rings. He says no, so I hire a different actor who also fits the "old wise wizard" archetype.

Is any of that illegal?


> I guess they want us to believe this happened without shenanigans, but it's bit hard to.

Right. And the question is, did they actually use SJ's voice as part of their training data? Because there's a lot of that available, given all her works.

There's a reason why they wanted 'her', specifically. What reason is that? If they could just work with a no-name voice actress (likely for far cheaper), why not just do that from the get-go? It could be a marketing gimmick, and maybe they wanted her name more than just the voice to add to the buzz. If it's not that, then the sequence of events doesn't make sense.


Except the voice does NOT sound like SJ.


> In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”


This isn't the timeline, though? The actor for Sky was hired and cast before they even reached out to ScarJo. The idea that they wanted to literally reproduce "Her" feels like motivated reasoning to me.


I don't understand.

If you literally use SJ's image or voice, then you're in trouble.

If it's an SJ lookalike or soundalike (and you don't claim otherwise), there's no problem.

Right? What's the "shenanigans?"


> If it's an SJ lookalike or soundalike (and you don't claim otherwise), there's no problem.

This isn't true. At least with respect to "soundalike" see, e.g., Waits v. Frito-Lay 978 F.2d 1093 and Midler v. Ford Motor Co. 849 F.2d 460.


Lol what a joke.

The famous person "owns" the sound of their voice, and the non-famous person does not.

Get wrecked peasants.


Notice it's not "Famous person vs Non-famous person". It's "Famous person vs Corporation". And in those cases it was not about the voice alone, but the use of it (which was found to be intentionally soundalike and misleading).


Non-famous person wasn't sued, but they are unemployable.

"Can Corporation hire Famous person, or Non-Famous person?"


Imagine being sued by a celebrity for looking or sounding like them. Shouldn't have chosen to be born with a similar voice.


This whole thing is so bizarre.

A lot of people look handsome, sound handsome, and are even stylish. So some rich person can randomly claim someone looks or sounds too close to them, and hence that person needs to be forced to wear a mask or be prevented from even talking?

So what happens next, when AI bots begin to sing, compose music, teach, paint or anything for that matter?


I don't think it's as trivial a thing as vanity, though that would be a convenient culprit. Instead, I believe what makes more sense is endorsement and one's right to their likeness. If the person had merely sounded similar, it probably wouldn't be an issue. Rather, it seems that in each case the party caught out intentionally sought to convince people that she endorsed the product and/or may have been financially tied to it.


Your second statement may not be true, legally, and at the very least many (including the actress in question) believe it is not true, ethically.


I think a better characterization would be:

* OpenAI wanted an AI voice that is SJ's voice

* SJ declined

* OpenAI got an AI voice from another person that sounds like SJ


That would require a step 3 where they get in a time machine:

> But while many hear an eerie resemblance between “Sky” and Johansson’s “Her” character, an actress was hired to create the Sky voice months before Altman contacted Johansson, according to documents, recordings, casting directors and the actress’s agent.


In that case, it's going to be even harder to prove any wrongdoing by OpenAI.


This controversy is great marketing for SJ, too.


Well said.


So what? They’re free to hire whoever they want to be a voice actor. It’s not illegal for them to hire someone that sounds like Barack Obama.


If you say "yes we can" as your corporate announcement of that person who sounds like Obama, and one of your employees (or rather ex-executives) says "the secret ingredient in AGI is Obama", it actually can be illegal. The main issue in NIL rights (as with trademarks) isn't similarity - it's brand confusion.


The thing that worried me initially was that:

- the original report by Scarlett said she was approached months ago, and then two days prior to launch of GPT-4o she was approached again

Because of the above, my immediate assumption was that OpenAI definitely did her dirty. But this report from WaPo debunks at least some of it, because the records they have seen show that the voice actor was contacted months before OpenAI contacted Scarlett for the first time. (It also goes to show just how many months in advance OpenAI works on projects.)

However, this does not dispel the fact that OpenAI did contact Scarlett, that Sam Altman did post the tweet saying "her", and that the voice has at least "some" resemblance to Scarlett's voice, at least enough to have one group saying that it does and another saying that it does not.


I don't know; to me, it just sounds like they know how to cover all their bases.

To me, it sounds like they had the idea to make their AI sound like "her". For the initial version, they had a voice actor that sounds like the movie, as a proof of concept.

They still liked it, so it was time to contact the real star. In the end, it's not just the voice, it would have been the brand; just imagine the buzz they would have gotten if Scarlett J was the official voice of the company. She said no, and they were like, "too bad, we already decided how she will sound, the only difference is whether it will be labelled as SJ or not".

In the end, someone probably felt like it was a bit too dodgy as the resemblance was uncanny, so they gave it another go, probably ready to offer more money. She still refused, but ultimately it didn't change a thing.


Agreed — seems like they had a plan, and probably talked extensively with Legal about how to develop and execute the plan to give themselves plausible deniability. The tweet was inadvisable, and undoubtedly not part of the actual plan (unless it was to get PR).


I am sure it was for free PR. Streisand effect trap for ScarJo.


They removed ChatGPT's most popular voice in response, causing anger among many of their customers... for PR?


I can't say much about that particular voice because I almost never used ChatGPT's voices. They are too slow.

I need 1.5x speed even if I have to use a worse voice. I am a TTS power user and have been listening to all online text since the 2010s. Maybe GPT-4o has a more flexible voice; perhaps you can just ask it to speak faster.
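(Tangential, but since speed is the blocker: as I understand it, the standalone TTS endpoint already exposes a playback-speed knob, even if the in-app GPT-4o voice doesn't. A minimal sketch, assuming the OpenAI Python SDK's speech API and its documented `speed` parameter; the model, voice, and output filename below are just illustrative choices, not anything specific to the Sky situation:)

    # Minimal sketch: generate speech at 1.5x via the TTS endpoint's `speed` parameter.
    # Assumes OPENAI_API_KEY is set in the environment.
    from openai import OpenAI

    client = OpenAI()
    response = client.audio.speech.create(
        model="tts-1",      # standard TTS model
        voice="alloy",      # any available voice name works here
        input="Reading this aloud at one and a half times normal speed.",
        speed=1.5,          # documented range is roughly 0.25 to 4.0
    )
    # Write the audio to disk; newer SDK versions may prefer the streaming-response helper.
    response.stream_to_file("speech.mp3")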


They also need to have a terse mode, to avoid all the throat-clearing that I've seen in videos.


> unless it was to get PR

I think this possibility doesn't receive enough attention, there is a class of people who've figured out that they can say the most scandalous things online and it's a net positive because it generates so much exposure. (As a lowly middle class employee you can't do this - you just get fired and go broke - but at a certain level of wealth and power you're immune from that.) It is the old PT Barnum principle, "They can say whatever they want about me as long as they spell my name right." Guys like Trump and Musk know exactly what they're doing. Why wouldn't Sam?

Johansson's complaint is starting to look a little shaky, especially if you remove that "her" tweet from the equation. I wouldn't put this past Altman at all; he knows exactly what happened and what didn't inside OpenAI, so maybe he knew she didn't have a case and decided to play sociopathic 3D chess with her (and beat her in one round).


Johansson might not win a lawsuit, but she isn't looking bad at all. She is totally standup in the Arts vs. Big Tech AI cultural battle. (See also Apple's recent "iPad crushes all artist material" commercial.)

Nothing in this article changes the essence of her complaint.

The only real, though partial, rebuttal to her is that OpenAI copied a work product she did for a movie, and the movie was more than her voice, so it's not totally her own work. So maybe the movie team as a whole has a stronger complaint than the voice actor alone.

She didn't lose any game of wits. She just got done dirty by someone who got away with it. She doesn't need money from them. She has respect from the people who matter; Sam and OpenAI behaved badly, like big tech always does. If OpenAI permanently stops using the Johansson-like Sky voice, she'll win what she wanted.

Of course, anyone whose voice sounds like an AI has the unpleasantness of that experience, and a rich person is more able to endure it than a regular Johansson.


As a lowly middle class employee it could be interpreted externally as “representing the company” which is why you see disclaimers like “all views are my own” on some social media profiles. Sam is the company, so they can’t get mad at him, and beyond that he’s a private individual saying whatever he wants on social media without lying.

In order to sue, there need to be damages, and if they didn't copy the voice then the rest doesn't matter, which Sam and team clearly knew and were quick to work with the news. I agree that smart people take advantage of what they can get away with, but this controversy couldn't have turned out better for increasing brand awareness, good or bad (as you say, just like Trump and Musk know how to do).


A more charitable scenario might be that they hire the voice actor and it sounds a bit like her. Someone suggests, "Why don't we just get Scarlett to do it properly? Wouldn't that be cooler?" They reach out and she says no. They decide to continue with the one that sounds a bit like her.


Genuine question:

Why in the world would one expect the more charitable scenario?


It's just a best practice that serves as a healthy counterbalance to cognitive biases, that might otherwise urge us to convict without evidence.

It's not necessarily what will prove true at the end of the day but I think we owe people the presumption of innocence.


I think we owe people outside of a commercial environment the presumption of innocence and benefit of the doubt. But we owe profit-seeking corporations (or their officers) neither, and the assumption should be that they are simply amorally doing whatever maximizes profit. As soon as someone hangs their shingle out there as a business, our presumptions should change.


Is it necessarily a bad bias to assume OpenAI is still behaving as it's been behaving during its entire history: recklessly taking other people's IP?


Yes, because the courts have yet to decide whether OpenAI has been "recklessly taking other people's IP" in an illegal way. Right now, it's only something believed by people who wish it to be true; legally, it's not clear just yet. In contrast, actually doing SJ impersonation here would be a much clearer violation. There's a huge gap between the two deeds, and I don't see the reason to just assume OpenAI crossed it.

It's like this: the people dropping leaflets in your physical mailbox are delivering spam, but you wouldn't automatically assume those same people are also trying to scam you and your neighbors by delivering physical letters meant to trick people into parting with their savings. In both cases, the messages are spam, but one is legal and the other is not, and there's a huge gap between them.


Exactly wrong; it's the job of the law to "be careful," not of the people.

Those of us accusing and talking about it have no power -- thus there is literally no harm, and possibly some good, in putting them on the defensive about this.

edit: In fact, the First Amendment of the Constitution essentially directly upholds the idea of "people saying whatever they want" in this regard.


I think the way I would split the difference here is that your point should inform how we think about regulation and investigation. How we write rules, how we decide to proceed in terms of investigating things.

But when it comes to specific questions that hinge on evidence, I think you have to maintain the typical presumption of innocence, just to balance out the possibilities of mob psychology getting out of control.


Yes, you owe people that.

No, you do not owe "corporations, especially those with a tendency, incentive, and history of being ruthless in this way."

Wise up, people.


Because it follows the legal principle of innocent until proven guilty? Unlike the wild, dystopian "OpenAI must have cloned Scarlett Johansson's voice" speculation.


This is goofy. "The law" should be careful because the law has power to do real harm.

People don't need to be careful just talking; in fact we generally support the idea of "people saying whatever" in the form of the First Amendment.



Until proven otherwise I try not to assume malice in every action.


That's the same thing, in fewer words. It doesn't change that the beginning and the end are still about imitating the original, and this is a billion-dollar corporation, not an Elvis impersonator doing a little show.


This will be used as a template by the entertainment industry to screw over so many people.


How? This kind of thing is already illegal. If I’m producing a commercial for Joe’s Hot Dogs, and I hire a voice actor who sounds like Morgan Freeman, and he never says “I’m Morgan Freeman” but he’s the main voice in the commercial and the cartoon character he’s voicing looks like Morgan Freeman… well, many consumers will be confused into thinking Morgan Freeman likes Joe’s Hot Dogs, and that’s a violation of Morgan Freeman’s trademark.


No, that's definitely not illegal. Voices are not trademarkable, only jingles (melody, words + tone), and of course specific recordings of voices are copyrighted. The ONLY way they get in trouble is if they claim to be Morgan Freeman.


"Trademark" is not the correct way to talk about it, but commercial use of an impersonation can be a breach of the right of publicity, even if they don't actually say they're the person.


> Voices are not trademarkable

But they are subject to right of publicity in many US jurisdictions.

Which, while more like trademark than copyright (the other thing that keeps getting raised as if it should dispose of this issue), is its own area of law, distinct from either trademark or copyright.

> The ONLY way they get in trouble is if they claim to be Morgan Freeman.

That’s…not true. Though such an explicit claim would definitely be a way that they could get in trouble.


> In the United States, no federal statute or case law recognizes the right of publicity, although federal unfair competition law recognizes a related statutory right to protection against false endorsement, association, or affiliation

https://www.inta.org/topics/right-of-publicity/#:~:text=In%2....


The important word in that quote is “federal”. In the US, right of publicity is a state law right in many states (often of particular note because of the concentration of tech and entertainment industries, it is a state law right in California.)


A reasonable person would assume that the voice in the commercial is Morgan Freeman, which could be very problematic for the commercial maker.


How do commercial Elvis impersonators (or commercial Elvis impersonators in commercials) get away with it?


Because no reasonable person thinks it’s actually Elvis.


Voices may not be trademarkable but I'm pretty sure styles and intonations are.


Which part of that is illegal? Because I don't see anything.


It's in the very article:

> He compared Johansson's case to one brought by the singer Bette Midler against the Ford Motor Company in the 1980s. Ford asked Midler to use her voice in ads. After she declined, Ford hired an impersonator. The U.S. appellate courts ruled in Midler's favor, indicating her voice was protected against unauthorized use.


There's a gray area. If Ford Motor Company had hired an actor who happened to sound a lot like Bette Midler using their normal speaking voice, Ford would have had a much better chance of defending their case.

As I understand it, that's essentially OpenAI's defense here.


Which they themselves have totally undermined.


So if I happen to sound like Tom Hanks, anyone recording me in passing would be breaking the law? How does anyone see that as reasonable?


That's a bit of a strawman: you're twisting the scope and then arguing against that instead.

A more similar context would be: they ask Tom Hanks to create a voice similar to Woody, the cowboy from Toy Story. Tom Hanks says no, Disney says no. Then they ask you to voice their cowboy. It's obviously related: they tried the OG, failed, and went for a copycat after.

But if they had never approached Tom Hanks or Disney, then there would be room for deniability - without mentioning real names, it would require someone to judge whether it's an unauthorized copycat or just a random actor voicing a random cowboy.

It was a bad play on their part.


You're describing a situation different from the one I replied to, though... https://news.ycombinator.com/item?id=40454969


Ok got it.

So in your opinion, if a movie needs to have a tall, skinny redhead, and they approach someone who has those qualities and the role is turned down, then it would be illegal to cast any other tall, skinny redhead.

That sounds absurd to me. If you have a role, obviously the role has qualities and requirements.

And just because person 1 who happens to have those qualities turns you down, it is still valid to get a different person who fulfils your original requirements.


Yeah I think the person you’re responding to gave a bad example. I like to give examples involving commercials because they center the issue of celebrity endorsement of and association with a brand, which is the thing at issue in this OpenAI case (public corporate keynotes are essentially just multi-hour commercials).


This sort of thing happens all the time though, doesn't it? Take any animated film that gets a TV show spinoff: their voice actors get replaced all the time, and I really can't imagine that spinoff TV shows are dependent on getting all the original voice actors to grant permission. How many different actors have voiced Looney Tunes characters over the years? Didn't Ernie Hudson audition to play his own character for the Ghostbusters cartoon and lose out to someone else? And these are all cases where there is clear intent to sound as close to the original actor as possible.


I Am Not An Entertainment Lawyer either, so I can’t answer with too much certainty, but I’d suspect the actors have a clause in the contract they sign allowing the character to be played by someone else post-contract. Think about how many clauses were in the last employment contract you signed.


Sure, I would imagine such clauses exist to make this sort of thing a lot cleaner and easier. I also don't see how it could be reasonable to assert that in the absence of such a clause, an actor owns the rights to the voice of that character and no one else can portray that character with the same or similar voice without their express consent. Not everyone who creates something is signing Hollywood acting contracts with their hired actors, and I just can't see any court asserting that "CoolTubeProductions YouTube Channel" can't continue to produce more animations with their "Radical Rabbit" character just because they didn't have that sort of clause when they hired a voice actor at their college for the first year.


> In the end, someone probably felt like it's a bit too dodgy as its resemblance was uncanny

What if it wasn’t a computer voice model but rather a real-life voice actress that you could pay a few cents to try to imitate Scarlett Johansson’s voice as best as she could?

That’s effectively what’s happening here, and it isn’t illegal.

I guess it also leads to the bigger question: do celebrities own their particular frequency range? Is no one allowed to publicly sound like them? Feels like the AACS DVD encryption key controversy all over again.


>guess it also leads to the bigger question

People are allowed to sound like other people. But if you go to actor 1 and say we want to use your voice for our product, and they say no, and then you go to actor 2 and tell them I want you to sound like actor 1 for our product, and then you release a statement: hey, you know that popular movie where actor 1 just used their voice in a context extremely reminiscent of our product?!? Well, listen to what we got: (actor 2 voice presented)

Then you may run into legal problems.

https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

on edit: assuming the reports I am reading are accurate that the actress used for the voice work was not instructed to sound like the Her vocal work, it sounds like a suit would probably not be successful.


The other actress wasn't the only one involved in the production; she provided input, but OpenAI building a voice model would involve a lot of data and input. They had to have a model ready to go when they asked SJ for permission immediately before launching; possibly they had one that had been built from her, and another, legal-approved one that they had converged to be close to the first but that didn't include her as a direct source.


> What if it wasn’t a computer voice model but rather a real-life voice actress that you could pay a few cents to try to imitate Scarlett Johansson’s voice as best as she could?

> That’s effectively what’s happening here, and it isn’t illegal.

Profiting from someone else's likeness is illegal.


Right of publicity. Profiting off their image without their permission will get you sued. Even if you use an impersonator. If there is a chance the public will connect it with them, you are probably screwed.

e.g.

Vanna White vs Samsung - https://w.wiki/AAUR

Crispin Glover Back to the Future 2 lawsuit - https://w.wiki/AAUT#Back_to_the_Future_Part_II_lawsuit


> That’s effectively what’s happening here, and it isn’t illegal.

It is more complicated than that. Check out Midler v. Ford Motor Co. or Waits v. Frito-Lay.


Ford hired impersonators, she's not an impersonator, that's her real voice.

She's allowed to be a voice actor using her real voice.

You can point to the "Her" tweet, but it's a pretty flimsy argument.


This is correct, and is very different from both the Midler and Waits cases. The courts are never going to tell a voice actor she can't use her real voice because she sounds too much like a famous person.

And besides, it sounds more like Rashida Jones anyway. It's clearly not an impersonation.


They are unlikely to tell the voice actor anything, since OpenAI is the problematic party here.


> You can point to the "Her" tweet, but it's a pretty flimsy argument.

I'm not making arguments which are not already explicitly written in my post.

My argument is simple: jorvi commented that you can hire "a real-life voice actress" to "try to imitate Scarlett Johansson’s voice as best as she could", and that is not illegal.

I said that the legality of that is more complicated. What jorvi describes might or might not be illegal based on various factors. And I pointed them towards the two references to support my argument.

I explicitly didn't say in that comment anything about the OpenAI/ScarJo case. You are reacting as if you think that I have some opinion about it. You are wrong, and it would be better if you would not try to guess my state of mind. If I have some opinion about something you will know because I will explicitly state it.


Whether the actor was an impersonator or not is still up for debate. I can see an argument being made when you consider the entire context.


I don't see any argument considering the entire context, care to explain?


> they gave it another go, probably ready to offer more money, she still refused, but in the end, it didn't change a thing.

That's not what she said happened. She said they released it anyway before she and Sam could connect, after Sam had reached out, for the second time, two days prior to the release.


> In the end, someone probably felt like it's a bit too dodgy as its resemblance was uncanny, they gave it another go, probably ready to offer more money, she still refused,

That was just a few days before launch, right? What was their plan if she said yes at that point? Continue using the "not-her" voice but say it was her? Or did they also have her voice already cloned by then and just needed to flip a switch?


> Continue using the "not-her" voice but say it was her? Or did they also have her voice already cloned by then and just needed to flip a switch?

One or the other. It doesn't really matter, as SJ herself would not necessarily have been able to make sure it was not her, or just a glitch in how the tech works with her voice.


Sky doesn't sound like the movie, much less "uncanny".


I think it sounds overly enthusiastic though, to the point that it sounds fake. Very overacted and dramatic. I wouldn't want to chat with that voice.

Though admittedly, so does Johansson in "Her". I don't think the voices are very similar but the style is.


Just imagine, if these voice chatbots get popular... they will likely change how people talk!


I think at the moment it's the opposite: it seems to be based on how many celebrities talk. When we watch them on TV, YouTube, TikTok, whatever, they're not real people but just playing a role. It's not how they really are in their real life. The overacted enthusiasm is like a marketing tool.

As people tend to look up to celebrities and admire them, they start associating this with good things, and I think this is why they adopted such styles for chatbots.


Sure, but Sky is still not SJ.


A plausible alternative explanation for asking Johansson:

  (1) They cast the current actor to test the technology and have a fallback.  The actor sounds somewhat different from Johansson but the delivery of the lines is similar.  

  (2) They then ask Johansson because they want to be the company that brought “Her” to life.  She declines.  

  (3) They try again shortly before the event because they really want it to happen.

  (4) They proceed with the original voice, and the “her” tweet happens because they want to be the ones that made it real. 
Asking shortly before the release is the weakest link here. It’s possible they already had a version trained or fine tuned on her voice that they could swap in at the last minute. That could explain some of the caginess. Not saying it’s what happened or is even likely, but it feels like a reasonable possibility.


My unsubstantiated theory: They have a voice trained on Johansson's body of work ready to go, but didn't release it because they didn't get her permission. This explains why they were still asking her right up to the ChatGPT-4o release. Then people (including Johansson) associate this Sky voice with Johansson and Her. OpenAI realizes it looks bad, despite not being intentional, so they pull Sky for PR reasons.


Yes, but it changes the narrative from “they couldn’t get Scarlett to record the voice, so they copied her voice” to something much less malicious. Contacting Scarlett, when you already have voice recordings ready but would prefer someone famous, isn’t that bad of a thing imho.


> Yes, but it changes the narrative from “they couldn’t get Scarlett to record the voice, so they copied her voice” to something much less malicious.

I don't think it's less malicious if they decided to copy her voice without her consent, but just didn't tell her until the project was underway, then continued even after she said no.

There's legal precedent that hiring a copycat is not OK, so it's not like proving it was a copycat salvages their situation.

I wouldn't be surprised if the real reason they hired a copycat early is because they realized they'd need far more of Johansson's time than she'd be willing to provide, and the plan was typical SV "ask forgiveness not permission, but do it anyway regardless."


They used a different person, so it is not her voice.


> They used a different person, so it is not her voice.

That doesn't matter because it's an impersonation. Ford lost, even though they didn't use Bette Midler's voice either: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


It seems like the key difference is that the advertisements in those cases involved people who sounded like particular musical artists, singing songs that those artists were well-known for singing. If you hired the woman who voiced Sky to say lines that Scarlett had in some of her movies, that would be similar. The fact that this is a chatbot makes it somewhat of an echo of those cases, but it strikes me (a former lawyer) as being a bridge too far. After all, you have to balance Scarlett's rights against the rights of someone who happens to have a voice that sounds like Scarlett's (it would be different if this were someone doing an impersonation of Scarlett, but whose natural voice sounds different).


Tom Waits sued Frito Lay for using an impersonator to sing about Doritos.


Ford commissioned a cover of a 1958 song[1] using a singer that would clearly be mistaken for Bette Midler's existing cover of that song, as part of an advertisement campaign where they first tried to get the rights to the original songs.

If you listen to the imitation version linked from that Wikipedia article and the original 1958 version, you'll hear that they didn't only find a singer that sounded like her, but also copied the music and cadence from Bette's version.

I think that's way past whatever OpenAI did in this case. It would be analogous if they were publishing something that only regurgitated lines Scarlett Johansson is famous for having said in her movies.

But they're not doing that, they just found a person who sounds like Scarlett Johansson.

This would only be analogous to the Ford case if the cover artist in that case was forbidden from releasing any music, including original works, because her singing voice could be confused with Bette Midler's.

Now, would they have done this if Scarlett Johansson wasn't famous? No, but we also wouldn't have had a hundred grunge bands with singers playing up their resemblance to Kurt Cobain if Nirvana had never existed.

So wherever this case lands (likely in a boring private settlement) it's clearly in more of a gray area than the Ford case.

1. https://en.wikipedia.org/wiki/Do_You_Want_to_Dance


The story addresses this as well.


It's not an impersonation; it is the actor using their own natural voice.

"We believe that AI voices should not deliberately mimic a celebrity's distinctive voice — Sky's voice is not an imitation of Scarlett Johansson but belongs to a different professional actress using *her own natural speaking voice*"


With the law, it is often about intent. If OpenAI had the intent to make a voice that sounded like Scarlett Johansson's in her, then I think that might be problematic for OpenAI. I am not a lawyer though.


If the goal was to make the voice sound like the one from Her, then it's still illegal.

Same way you can't get someone who sounds like a famous celebrity to do voice work in a commercial and just let people think it's the famous celebrity when it's not.


Unless they can clearly demonstrate that the voice was reproduced from raw voice actor recordings, this could be just a parallel construction to cover their asses for exactly this sort of case.


Intent matters.

When discovery happens and there's a trail of messages suggesting either getting ScarJo or finding someone that sounds enough like her, this isn't going to look good with all the other events in the timeline.

If it goes to court, they’ll settle.


>> When discovery happens and there's a trail of messages suggesting either getting ScarJo or finding someone that sounds enough like her, this isn't going to look good with all the other events in the timeline.

I'm not a lawyer, but this seems unfair to the voice actor they did use, and paid, who happens to sound like ScarJo (or vice versa!)

So if I sound like a famous person, then I can't monetize my own voice? Who's to say it isn't the other way around; perhaps it is ScarJo that sounds like me and I'm owed money?


There isn't an unfairness to the voice actor. She did her job and got paid.

The problem here is that someone inside of OpenAI wanted to create a marketing buzz around a new product launch and capitalize on a movie. In order to do that they wanted a voice that sounded like that movie. They hired a voice actor that sounded enough like ScarJo to hedge against actually getting the actor to do it. When she declined they decided to implement their contingency plan.

Whether they're liable is for a jury to decide, but the case precedent that I've seen, along with the intent, wouldn't look good if I were on that jury.


>> There isn't an unfairness to the voice actor. She did her job and got paid.

If her customers can get sued for using her voice, then this voice actor can never get another job and can never get paid again -- all because she happens to sound like ScarJo. That seems unfair to the voice actor.


It is not that the voice is similar to Scarlett, it is that it appears that Scarlett's identity was intentionally capitalized on to market the voice.

If you had a voice like Scarlett, and you were hired to create the voice of an AI assistant, there's no legal problem - as long as the voice isn't marketed using references to Scarlett.

However, in this case, the voice is similar to Scarlett's, AND they referenced a popular movie where Scarlett voiced an AI assistant, and named the assistant in a way that is evocative of Scarlett's name, and reached out to Scarlett wanting to use her voice. It is those factors that make it legally questionable, as it appears that they knowingly capitalized on the voice's similarity to Scarlett's without her permission.

It is about intent, and how the voice is marketed. Voice sounds like a famous person = fine, voice sounds like a famous person and the voice is marketed as being similar to the famous person's = not fine.

It is not a clear-cut 'this is definitely illegal' in this case, it is a grey area that a court would have to decide on.


How is it unfair to the voice actor? Is she getting sued? Is she paying damages? Is she being prevented from doing her work? No.

> Who's to say it isnt the other way around, perhaps it is ScarJo that sounds like me and i'm owed money?

It seems like you don't get the fundamental principle underlying "right of publicity" laws if you are asking this question.


>> How is it unfair to the voice actor? Is she getting sued? Is she paying damages? Is she being prevented from doing her work? No.

Seems she is prevented from doing work: if companies can get sued for hiring/using voice actors who sound like ScarJo, then any voice actor who sounds like ScarJo has effectively been de-platformed. Similarly, imagine I look very much like George Clooney -- if George Clooney can sue magazines for featuring my handsome photos, then I lose all ability to model for pay. (Strictly hypothetical, I am a developer, not a fashion model.)

>> It seems like you don't get the fundamental principle underlying "right of publicity" laws if you are asking this question.

Totally, I have no idea of the laws here, but very curious to understand what OpenAI did wrong here.


You are skipping past intent and turning it into strict liability. That's not the case.

>Totally, I have no idea of the laws here, but very curious to understand what OpenAI did wrong here.

It is illegal to profit off the likeness of others. If it wasn't, what's to stop any company from hiring any impersonator to promote that company as the person they are impersonating?


Doesn't matter. Waits v. Frito-Lay.


That's an impersonation of a parody song in his style. This is a voice actor who has a voice that's kinda similar to ScarJo and kinda similar to Rashida Jones but not quite either one doing something different.

Cases are not a spell you can cast to win arguments, especially when the facts are substantially different.


In both cases the companies are specifically trading on creating confusion of a celebrity’s likeness in an act that celebrity trades in, and with the motivation of circumventing that very celebrity’s explicit rejection of the offer for that very work.

Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.

Yelling "Parody!" isn't some get-out-of-jail-free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.


> In both cases the companies are specifically trading on creating confusion of a celebrity’s likeness in an act that celebrity trades in, and with the motivation of circumventing that very celebrity’s explicit rejection of the offer for that very work.

Are they? Where did they advertise this? The voice doesn't even sound that much like ScarJo!

> Just because one is a singer and the other is an actor isn’t the big difference you think it is. Actors do voice over work all the time. Actors in fact get cast for their voice all the time.

It's a very big difference when the jurisprudence here rests on how substantial the voice is as a proportion of the brand, especially in the presence of the other disanalogies.

> Yelling, “Parody!” Isn’t some get out of jail free card, particularly where there is actual case law, even more particularly when there are actual laws to address this very act.

Sure -- If you read that back, I'm clearly not doing that. An impression in a parody in the artist's unique style (Waits) was a case where it was a violation of publicity rights. This is radically different from that. It's not clear that Midler and Waits have much bearing on this case at all.


Which is not as similar as people keep saying, though: both that case and Bette Midler's involved singers, who perform as themselves and are their own brand.

Consider when a company recasts a voice actor in something: e.g. the VAs for Rick and Morty have been replaced, and Robin Williams was not the voice of the Genie in Aladdin 2 or the animated series.


Recasting a voice actor when there was a contract with the prior actor (and such a contract would typically allow for recasting) is one thing.

Copying a famous actor’s voice without any kind of agreement at all is something else.


I'm not sure if that's enough to protect OAI. It feels like they wanted SJ, found a similar voice actor as a version 1, tried to "officially" get SJ's voice, and when that failed, continued on instead of pulling it. It still feels like quite a deliberate move to use her likeness, and the "contact 2 days before" sounds like they really wanted to get her okay before using the other VA's voice.


Sounds more plausible that someone pointed out to them internally they could be in a heap of trouble if Scarlett objected after they released it. It doesn't matter if it was actually her voice or not; it matters if people think it was her voice. If someone pointed this out late in the process, then yeah, there would have been a mad scramble to get Scarlett to sign off. When she didn't, that put them in a bad spot.


Why would they have taken down the voice if they were operating on a level of truth in their favor?


"out of respect" for the angry woman rather than argue with her, you never had this problem with a wife/girlfriend?


Is it a crime for voice actors to sound similar to, say, Darth Vader?


ITYM

> Is it a crime for voice actors to sound similar to, say, James Earl Jones?

And the answer is, of course: It depends. For one thing, it depends on whether the company using the sound-alike's voice is in a business closely related to the theme of Star Wars, and whether they market whatever it is they're marketing by referring to Jones' iconic performance as Vader. ("<PANT> ... <PANT>") If they do that, then yes, it most likely is.


No, I specifically asked about Darth Vader, the fictional character that has been voiced by various voice actors (including the original trilogy, clone wars, etc). Presumably Earl Jones does not sound like Darth Vader in his day to day life, but this is not about Earl Jones, it is about the character.


[flagged]


Who slandered who?


Those who take sides in a totally unconfirmed story. If the side you are siding with isn't right, you are part of the problem.


Yeah, sus af because of the call 2 days before they released it to the world. And they were just asking for it when they tweeted the frickin "her". I mean, come on.


Are we just nitpicking now because we are bored?


I never comment on HN I’ve just always been a long time lurker but I feel like I’m going crazy here reading comments.

SJ is not the “AI” portrayed in the movie Her. And AFAIK she does not in fact have all the same idiosyncrasies and tones in real life as the voice does in the movie, because she was in fact directed to act like that.

Not only that but the voices are not the same because there was another actress for sky as we have seen.

To me it seems as if the case for SJ is DOA unless it comes out somehow that they in fact trained on her voice specifically. But since that doesn't seem to be the case, I have no idea how SJ can legally own all voices that sound like hers.

It would obviously be a different story if OpenAI were saying that Sky was SJ, but that's not the case. To me the question should be "can the studio own the character in Her that OpenAI was copying, and anything similar?" Given that systems like Siri were already out there in the world when the movie came out, and we knew this tech was on the way, the answer should be no, but IANAL.

I’m not a huge fan of OpenAI anymore and I think they deserve criticism for many things. But this situation isn’t one of them.

Clarification: Of course if it turns out that they in fact trained on SJ or altered the voice to be more like hers then I'd think differently. I still think the studio has more of a claim, though, looking from the outside and not being a lawyer.

Edit: clarification


It's not a question of owning all voices that sound like her, it's a question of "are customers deceived into thinking it is her" and "does it affect SJ negatively to be associated with this sound alike" when her income comes partly from her distinctive voice (much like Morgan Freeman). Sam Altman tweeting "Her" right before the announcements is what builds the case for SJ.

Imagine we hired a Leo Messi look-alike and made him play football badly, or something worse. If viewers can clearly tell it's not him, it falls under parody, but if we use camera trickery to keep viewers in doubt, we could be in legal trouble.


I think Morgan Freeman is a useful comparison to make. Imitations of his voice have been used in a lot of political campaign videos (not sure how many of them got permission). An imitation of his voice was also used in a UK "morethan" advert where they did seek permission and pay him. Another highly popular AI voice would be David Attenborough, used in any number of videos.


Random TikTokers will use AI that sounds like a celebrity, and they get away with it. There are many reasons why it's not a big fuss. They're not really selling a product directly, so it could be considered fair use, unlike ChatGPT, which is a paid product (they're literally selling the voice as a feature). There is also the intent; it is obvious to a reasonable person that a Morgan Freeman AI voice on a random TikTok video is not actually his real voice, so you can't really make any sort of claim that they're masquerading him as actually saying whatever the script is. It's just for fun. And finally, you can't really sue thousands and thousands of TikTokers.


Exactly. Lots of voices sound like other peoples’ voices. We aren’t that unique.

SJ doesn’t get to own the voice rights to everyone that sounds at all like her just because she is famous.


It is not about the voice. Rather, it is about using the fame of a known actress to boost the product. If your inner motive is to sound like her because she is well known, differences in voice do not matter much.


The voice was called Sky and OpenAI wasn't using her likeness to promote the voice or product. She isn't that well known, I didn't even know she was in Her.

There's 1 billion English speakers, there are going to be voice overlaps.


The issue is that they contacted her and advertised based on her. They clearly wanted it to sound like her. Now that it sounded like her, there is reason to draw conclusions about OpenAI's real motives.


How did they advertise this as being SJ? It is worth pointing out that Sky kind of sounded like the voice from Her, not SJ. The voice in Her is a voice SJ did for that movie. It isn’t her normal speaking voice. Does she get to own the likeness of every kind of voice she’s made, or could make?


> How did they advertise this as being SJ?

Altman tweeted a reference to Her. (Literally just that text.)


And he could have meant a whole lot of different things by that, including the perfectly reasonable observation that the voice assistant tech is at parity with what we saw in _Her_.

The assertion that this one tweet, in the absence of any other official communication, constitutes advertising the product to be voiced by Scarlett Johansson is a _huge_ stretch.

I'm certain OpenAI would have been much happier with SJ as the voice actress. But how different does a female voice have to be to _not_ be considered an impersonation of someone?


How does that only refer to the voice? And how is that advertising, or using SJ's likeness or image?

I think the fact that the AI in the OpenAI demos acts at a similar level to the one in Her is far more impressive than the fact that the voice may sound like the one SJ did for that movie. Impressively mimicked voices aren't that new. Amazon sold a Samuel L. Jackson Alexa voice years ago.


It's such an easy lesson to learn, here: if you are a founder or in any position where your speech relative to a product or service matters, get your ass off Twitter!



> The voice was called Sky

That the voice is called Sky is actually part of what's suspicious about this to me. They had a whole world of female names to choose from for the voice that would recreate "Her" (and there's plenty of evidence that suggests that the movie was used as inspiration), and they chose one that started with the same rare consonant cluster as this actress. The only other names that Wikipedia lists with that consonant cluster are Skyler and Scarlett [0]. If they truly were trying to separate themselves from her rather than subtly cue the likeness, why Sky?

> She isn't that well known, I didn't even know she was in Her.

She's the second-highest-grossing actor (and the highest-grossing actress) of all time [1]. You might not know her (and neither do I), but that says more about you and me than it does about her.

[0] https://en.m.wikipedia.org/w/index.php?title=Category:Englis...

[1] https://en.m.wikipedia.org/wiki/List_of_highest-grossing_act...


> and they chose one that started with the same rare consonant cluster as this actress

Okay so we can't use voices that are similar to actress voices, and we can't use names that start with the same "rare consonant cluster" as actress names.

This is getting ridiculous


> Okay so we can't use voices that are similar to actress voices, and we can't use names that start with the same "rare consonant cluster" as actress names.

No, we just can't advertise or get other clear benefits based on the fame of something well-known without considering these entities.

It is about the overall picture, and in this case there is very high relation to the movie and Scarlett.


Tweeting a single word, a pronoun no less, is a "very high relation". And that is a legal bar?

Most people have never heard of Her. It wasn't a very big movie, especially outside tech circles. Most of my friends who aren't in tech would have no clue that Altman was referring to an AI in a movie.


Agreed, if you take each piece of the puzzle in isolation it sounds silly and totally not worth Johansson's time. But that's how most legal cases are: there's no one single piece of evidence that is conclusive proof in and of itself, there is a collection of facts that together form a reasonable basis for concluding a level of intent that meets the burden of proof.

I'm not convinced that the pieces all add up to a slam dunk, but you can't dismiss them one by one, you have to look at the whole.


Not that well known? She's one of the most famous actresses in the world! Seriously, go look at her Wikipedia page.


SCarlett AI. Sky. I know it's a reach but along with the Her tweet, you never know.


Yes, and they released in May 2024 = 05/(2+0+2+4) which is 0.625, which proves it for sure.

Come on, we don’t have to resort to reading entrails.


What about the back and forth trying to hire her, and she refusing?

Sounds like: "Eh nevermind, we are going to use it anyway and BTW, I'm going to tweet 'HER' "

You don't think that will have any weight whatsoever in a lawsuit?


Not if they didn't actually use it, no.


We would not see these headlines if they had never asked her and never mentioned her. Then it would be just the voice of some voice actress which might or might not sound like Scarlett.


But he DID tweet the ref to the movie, so that's settled.


So if I have a company that sells, say, manure, I can search and hire a voice actress that sounds exactly like Scarlett to promote me in radio ads? And write a tweet that vaguely implies that it's really her?


Yes to the first bit, no to the second.

I don't think a reasonable person would interpret Sam's tweet as claiming that Scarlett recorded the voice.


There's legal precedent against the first one. Tom Waits successfully sued Frito Lay after they used a Tom Waits soundalike in a commercial.

https://www.latimes.com/archives/la-xpm-1990-05-09-me-238-st...


If I recall, that only won because they contacted Tom first and he said no, which isn't what happened with OAI/SJ.


Exactly. I barely know who this actress is. To me, it sounds like the tens of thousands of other white American voices. How is this remotely too similar?


Well as the article states, there’s legal precedent protecting actors from having their voices “impersonated” by other actors. The fact that Altman tweeted “her” and contacted Johansson can make the case for the intent to impersonate.


What if that's just the VA's natural voice? Must she stop doing VA?


No she just has to do so without promoting it as having been done by Scarlett Johansson.


It wasn't? None of the people I know thought it was Scarlett Johansson.

When Altman tweeted "her" I just thought: "wow, these voices sound really realistic, kinda like the movie"

Is this lawsuit worthy?


Except there wasn't any promotion. The only thing close would be Sam's tweet, but this was not an official statement and could easily be explained as referring to the concept of voice assistants in general. The fact that SJ was contacted twice was actually publicized by SJ and not OpenAI.


> The only thing close would be Sam's tweet, but this was not an official statement

How is a tweet from the CEO not an official statement?


it's the CEO's personal statement.

but yes, even a personal statement may have some value in a case, but you ignored the second part of the GP's criticism.

you're reaching that the tweet of the poster of "Her" meant "hey guys, this is voiced by SJ"

it simply could mean that "hey guys, it sounds as good as the voice assistant from Her"

it is voice assistant software after all...

mind reading a tweet does not make a good case, especially with the timeline noted in TFA.


> it's the CEO's personal statement

Wat.

> mind reading a tweet

He says the name of the movie! We don’t need to know his state of mind, just the promotional effect.


> Except there wasn't any promotion. The only thing close would be Sam's tweet,

Yeah, that's promotion.

> but this was not an official statement

Yes of course it was. That's what being a CEO fricking means.


She isn't the issue, she isn't being sued.


She absolutely is being deplatformed and her rights are violated.

If every customer who hires her gets sued, that is basically the same as making it illegal for her to be a VA.


I don't know what it is about this website that makes discussion of legal issues so frequently poor as it is here right now.

>She absolutely is being deplatformed and her rights are violated.

No. And 'deplatforming' isn't illegal last I checked, whatever you mean it to be.

>If every customer who hires her gets sued, that is basically the same as making it illegal for her to be a VA.

They aren't getting sued because she sounds like ScarJo. In fact, it's not clear they are being sued at all. What is illegal, which you do not seem to appreciate, is that regardless of whatever a particular individual looks or sounds like, it does not create a right in others to profit over this similarity in likeness. You cannot hire a Harrison Ford impersonator to pretend to be Harrison Ford and promote your products. That you re-contextualize this as Harrison Ford look-alikes being deprived of work is just your own sad confusion.


> You cannot hire a Harrison Ford impersonator

Good thing that the actress just used her own natural voice then and wasn't going around repeating lines from a movie or dressed up as the more wealthy celebrity.

I would hope that you don't want to ban all her potential customers from hiring her.

This is absolutely about her rights to sell her own natural voice to potential customers. If it were illegal for her customers to hire her, then this is basically making her job illegal just because someone who happens to have more money and power than her has a similar voice.


>Good thing that the actress just used her own natural voice then and wasn't going around repeating lines from a movie or dressed up as the more wealthy celebrity.

It isn't about her, it's about how OpenAI used her voice. I'm not sure why that isn't getting through to you. It was already pointed out that there is no dispute with the voice actress. This is a bizarre conversation!

>I would hope that you don't want to ban all her potential customers from hiring her.

Obtusely repeating yourself doesn't change the law nor does it reflect any effort on your part to actually engage in this conversation and my response.

>This is absolutely about her rights to sell her own natural voice to potential customers. If it were illegal for her customers to hire her, then this is basically making her job illegal just because someone who happens to have more money and power than her has a similar voice.

Using the word absolutely does not make you right.

>If it were illegal for her customers to hire her, then this is basically making her job illegal just because someone who happens to have more money and power than her has a similar voice.

It's not illegal for her customers to hire her. It's illegal for businesses to capitalize on the likeness of individuals without that individuals permission. If OpenAI did not make several allusions to ScarJo and Her, there would be no ground to stand on. But they did!


> It isn't about her, it's about how OpenAI used her voice.

It absolutely is about her.

If her customers get sued because a more rich and powerful person has a kinda similar voice as hers, then the effect is basically the same as making it not legal for her to work.

Her customers shouldn't be sued because a rich person has a similar voice to hers.

> It's not illegal for her customers to hire her.

Oh great! So you agree that she should be allowed to do this work, and nobody should be sued for it, as long as she isn't going around saying lines from a movie, or dressed up as someone else.

She should be fully allowed to sell her services to whatever customer she wants, and those customers shouldn't be targeted, as long as she isn't doing an impersonation, which she isn't, as she is simply using her natural voice that happens to sound similar to a rich and powerful person's.


>It absolutely is about her.

No, it's not. She won't get sued.

>If her customers get sued because a more rich and powerful person has a kinda similar voice as hers, then the effect is basically the same as making it not legal for her to work.

They won't get sued because the voice is similar. They will only get sued if her customers represent her voice as ScarJo's.

What is so hard about that for you to understand?


> They will only get sued if her customers represent her voice as ScarJo's.

Hooray! You agree with me that everyone in the situation is completely in the clear, because she isn't saying that her voice is ScarJo's, and she isn't going around repeating movie lines or dressing up as her.

Also, here is an opinion from someone who actually seems to know what they are talking about, and has legal experience:

https://news.ycombinator.com/item?id=40451286

"After all, you have to balance Scarlett's rights against the rights of someone who happens to have a voice that sounds like Scarlett's"

They straight up said this. So yes, according to someone who actually has legal experience, it is about her rights. Even if someone isn't directly being sued, according to them, who used to be a lawyer, this matters. And you, the non-lawyer, are not in agreement with an actual expert on the matter.

But whatever. I am more than happy to come back to you and your comments later, when either a lawsuit doesn't happen, or openAI wins. (They haven't even been sued yet!). That way I can have proof that you were wrong on this. I will let you know.


What an obnoxious rhetorical technique you are engaging in.

>Hooray! You agree with me that everyone in the situation is completely in the clear, because she isn't saying that her voice is ScarJo's, and she isn't going around repeating movie lines or dressing up as her.

I never said she represented she's ScarJo. It is OpenAI that did these representations and that's why ScarJo's attorney reached out to them. I have never said the voice actor did anything wrong and I have clarified this with you several times.

>Also, here is an opinion from someone who actually seems to know what they are talking about, and has legal experience:

I know exactly what I'm talking about and I'm an attorney. Of course I don't expect you to do anything with this fact but to try and shove it back in my face.

>And you, the non-lawyer, are not in agreement with an actual expert on the matter.

I am a lawyer, read my comment history if you'd like. Yawn. The person you are quoting is just providing their own characterization of the issue, but there is no actual right of the voice actress that is being violated. It is her customers that have a constraint, she's free to work as she pleases. Of course she can't on her own pretend to be ScarJo to profit off of it, but nobody accused her of that.

>But whatever. I am more than happy to come back to you and your comments later, when either a lawsuit doesn't happen, or openAI wins. (They haven't even been sued yet!). That way I can have proof that you were wrong on this. I will let you know.

I'm absolutely right about this, I can promise you that much.


> I'm absolutely right about this, I can promise you that much

Great, then when the results come in, we can re-evaluate your statements!

If either OpenAI wins, or there is never any lawsuit to begin with, then we can know that you were wrong on all of this.


Whether or not there is no lawsuit has absolutely no bearing on the fact that right of publicity laws exist and ScarJo has a claim against OpenAI.


> Well as the article states, there’s legal precedent protecting actors from having their voices “impersonated” by other actors.

The article doesn't state that. It says that about singers. Very different.


Thanks that’s fair.


> because she was in fact directed to act like that

It's still Scarlett Johansson's voice and acting. The same role with the same lines read by different actors would be very different. Imagine, for example, that they had cast Tilda Swinton as Samantha. Even with the same script it would probably end up a very different character. Actors aren't interchangeable.

It's very clear that OpenAI was trying to make ChatGPT sound like Samantha from Her. Whether they used Scarlett Johansson's voice to train, or excerpts from the movie, or had writers come up with typical responses that sound similar to Samantha, are details, and it's up to the lawyers to figure out whether this is legal or not.

But the indisputable fact is that OpenAI took heavy inspiration from a movie, and did so without permission. You could argue that taking inspiration from a popular movie is fair game, but I'm not sure where the line is between "inspiration" and a blatant rip-off.


In fact, the movie was originally recorded with Samantha Morton doing the voice of the AI, but she was replaced with Johansson at the last minute!


Really? Are there any leaks of the original version? I would love to see the difference.


I don’t think it’s been leaked. I could have sworn that I’ve seen one scene with the original voice on like a DVD special feature, but I can’t find anything online


But is that the point? Here is a relevant precedent, for instance, that may or may not change your mind:

Tom Waits is a singer known for his raspy singing voice. Back in the late 1980s, Frito-Lay, Inc., the makers of Doritos, thought it was a great idea to run an ad in which the music had the atmosphere and feel of a Tom Waits song. Except the professional singer they hired for that got the job done a bit too well: the sound of his voice in the commercial was so close to Tom Waits' work (he had sung for ten years in a band covering Tom Waits songs) that in November 1988, Waits successfully sued Frito-Lay and the advertising company Tracy-Locke Inc. for voice misappropriation under California law and false endorsement under the Lanham Act [1].

Now, when you hear Tom Waits speak in interviews, I find that his voice does not sound nearly as raspy as in his performances. But the point is that it does not matter so much whether OpenAI used the actual voice of Johansson or hired someone to imitate her performance.

Given the fact that Johansson was initially contacted by OpenAI to provide her voice and declined, we can surely assume that the selection of the particular voice actress they ended up using was no coincidence.

[1] http://law2.umkc.edu/faculty/projects/ftrials/communications...


>Given the fact that Johansson was initially contacted by OpenAI to provide her voice and declined

This order is wrong according to the article; the VA was contracted before ever reaching out to SJ.

Additionally, here is a relevant anecdote, for instance, that may or may not change your mind:

>In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”

It would suck to be blacklisted from your career because your voice may sound too similar to another famous person, if viewed from a certain light.


Clearly we need mandatory laryngeal surgery for anyone who sounds like anyone else. Also, if you look like someone else, well, plastic surgery or a hood for you.


> This order is wrong according to the article; the VA was contracted before ever reaching out to SJ.

Be that as it may, it's clear that OpenAI had early on considered a Her-like voice for their product. According to OpenAI, they started with over 400 candidate voices and narrowed them down to five [1]. I find it would be quite amazing if the one that sounded very close to the Her voice was chosen purely by coincidence - and then they went:

- Wait a minute, you know what she sounds like? Have you ever seen that movie Her?

- Man, you're right, it does sound quite like that voice. Wasn't that Scarlett Johansson in the movie?

- Yeah, I think so.

- Whoa, whoa, tell you what: why don't we hire Scarlett Johansson directly?

- After we just went through 400 voices to select these five?

- So what? She's a star! Think of the marketing impact! "OpenAI has developed real-life her"

- Cool, dude! But what are the odds that Johansson would do that?

- I guess there's only one way to find out...

- Yeah, man, you're right. Let's do it!

I wonder if that was how it happened...

> It would suck to be blacklisted from your career

That's true. Is that what's happening?

[1] https://openai.com/index/how-the-voices-for-chatgpt-were-cho...


The point is: for a little podcast to use some AI to make a couple of ephemeral jokes about real people should absolutely be allowed, and might be one of the few moral use cases of AI (see the Dudesy podcast humor, like the George Carlin standup and Tom Brady standup).

But for a massive tech company to fuck over an individual artist in such a blatantly disrespectful way is hugely different.


[flagged]


Because the guidelines say:

> Converse curiously; don't cross-examine.

> Assume good faith.

> Please don't post insinuations about astroturfing, shilling, brigading, foreign agents, and the like. It degrades discussion and is usually mistaken.

https://news.ycombinator.com/newsguidelines.html


When I first used ChatGPT's voice assistants I was like "Wow, this one is clearly Scarlett Johansson from Her, they even copy her mannerisms."

No amount of unverifiable "records" (just pieces of paper provided by somebody who has a multimillion dollar incentive to show one outcome) will change my mind.

But if they can produce the actual voice artist I'd be more open-minded.


Funny, I'm the opposite. I saw clips from the film after the controversy (it's been ten years since I saw the film itself) and Sky sounds nothing like Johansson to me. No amount of unverifiable "records".


1. The Sky voice currently available in the app is a different model from the one they presented (one is pure TTS, the new one in GPT-4o is a proper multimodal model that can do speech in and out end to end)

2. Look at these images and tell me they didn't intend to replicate "Her": https://x.com/michalwols/status/1792709377528647995


Which one are we saying sounds like Johansson? I'm talking about the TTS voice in the app, is everyone else talking about the multimodal voice from the 4o demos?

Also, whether they *intended* to replicate Her and whether they *did* in the end are very different.



OK, I watched this expecting to be convinced.

I think they might have mimicked the style. The voice, though, is not even close. If I heard both voices in a conversation, I would have thought 2 different people were talking.


Truthfully, you can no longer trust yourself (whichever side you're on in this debate). We're all now primed and we'll pick up any distinguishing characteristics. You'd have to listen to them in a blind test and do this with several clips that do not reveal which ones are OpenAI and which are from a movie or something else that spoils it.

And I wouldn't put the metric at 50/50, as if it needs to be indistinguishable. A reasonable bar is whether it sounds __like__ her, which could hold even while people identify the chatbot 100% of the time! (e.g. what if I just had a roboticized version of a person's voice?) The truth is that I can send you clips of the same person[0], tell you they're different people, and a good portion of people will be certain that these are different people (maybe __you're different__™, but that doesn't matter).

So use that as the litmus test either way. Not whether you think they are different, but rather "would a reasonable person think this is supposed to sound like ScarJo?" Not you, other people. Then, ask yourself if there was sufficient evidence that OpenAI either purposefully intended to clone her voice OR got so set in their ways (maybe after she declined, but had hyped themselves up) that they would have tricked themselves into only accepting a voice actor that ended up sounding similar. That last part is important because it shows how such a thing can happen without anyone ever explicitly stating such a requirement (and maybe without even recognizing it themselves). Remember that we humans do a lot of subconscious processing (I have a whole other rant on people building AGI -- a field I'm in fwiw -- not spending enough time understanding their own minds or the minds of animals).
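For what it's worth, that kind of blind test is easy to run yourself. Here's a minimal Python sketch of the idea (the file names and labels are hypothetical, not real OpenAI or movie assets; it assumes you've already cut and loudness-matched the clips and play them manually outside the script):

  import random

  # Hypothetical pool of pre-normalized clips; labels stay hidden from the listener.
  CLIPS = [
      ("clip_sky_1.wav", "sky"),
      ("clip_sky_2.wav", "sky"),
      ("clip_her_1.wav", "her_movie"),
      ("clip_her_2.wav", "her_movie"),
      ("clip_ctrl_1.wav", "control"),  # unrelated female voice as a control
      ("clip_ctrl_2.wav", "control"),
  ]

  def run_blind_trial(clips, n_trials=12, seed=None):
      """Present clips in random order, ask one yes/no question per play,
      and return the fraction of 'yes' answers per hidden label."""
      rng = random.Random(seed)
      tallies = {label: [0, 0] for _, label in clips}  # label -> [yes, shown]
      for _ in range(n_trials):
          path, label = rng.choice(clips)
          answer = input(f"Play {path}. Does this sound like the movie voice? [y/n] ")
          tallies[label][1] += 1
          if answer.strip().lower().startswith("y"):
              tallies[label][0] += 1
      return {label: yes / shown for label, (yes, shown) in tallies.items() if shown}

  if __name__ == "__main__":
      # If the 'sky' rate tracks 'her_movie' rather than 'control', that supports
      # the "sounds like" claim even when listeners can still tell the clips apart.
      print(run_blind_trial(CLIPS))

The reason to tally "sounds like" judgments per hidden label rather than score raw discrimination is exactly the point above: listeners can tell two clips apart every time and still judge one to evoke the other, which is the question that actually matters here.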

Edit:

[0] I should add that there's a robustness issue here, and it is going to be a distinguishing factor for people determining if the voices are different. Without a doubt, those voices are "different", but the question is in what way. The same way someone's voice might change day to day? The difference similar to how someone sounds on the phone vs in person? Certainly the audio quality is different, and if you're expecting a 1-to-1 match where we can plot waveforms perfectly, then no, you wouldn't ever be able to do this. But that's not a fair test.


I agree they don't sound the same. But, since it's a subjective test, OpenAI was pretty Twitter-foolish to push the "Her" angle after being explicitly rejected by SJ. It's just inviting controversy.


Without commenting on the debate at large, it’s a bit funny to read this comment.

I mean, voice cloning a year or two ago was basically science fiction; now we're talking about voices being distinguishable as proof it's not cloned, sourced, or based on someone.

FWIW I also thought it was supposed to be the her/sj voice for a long time, until I heard them side by side. Not sure where to stand on the issue, so I’m glad I’m on the sidelines :)


Thank you for providing a nice side-by-side. This makes it clear to me the voices are not very similar at all. If Johansson had agreed, I have to imagine they would've been able to make a much closer (and less annoying!) voice.


The cadence and speed in Her are much too fast for any mass consumer product.


I keep reading in the media that Sky was introduced as part of ChatGPT-4o, but that's incorrect. Sky's been around since they introduced the mobile iOS app.

While Sky's voice shares traits with SJ's, it sounds different enough that I was never confused as to whether it was actually SJ or not.


I don’t think you understand. 4o introduces a new multimodal Sky replacing the old one. They have only released clips of the new voices. It’s never been in the iOS app. The one you refer to is the old voice model. If you listen to the linked video above it’s very obviously not the same voice (I use Sky on iOS btw)

To be honest the new Sky is obnoxious and overly emotive. I'm not trying to flirt with my phone.


I've listened to the clips and yes, while 4o Sky is more emotive, it's just that - a more emotive Sky. All the elements that people are pointing to - the husky/raspiness - were present in the pre-4o Sky.


Well, I thought it would be similar, but at least with how the Sky voice sounds through the phone speakers, I can hardly find any resemblance.


Those don't sound anything alike, except being two female voices. Sky is clearly a bit lower and with a lot more vocal fry.


Are you using this as an argument about how similar they are? The voice sounds distinctly different, no problem discerning between the two.


I am of two minds here; regardless of the "closeness", there is a whole field of comedy that does impressions of others. That is what is so difficult about the AI discussion. Clearly, there are plenty of humans who can mimic other humans visually, in prose for writing, in voice and mannerisms, etc.

Leaving the IP issue aside, they could clearly have hired a voice actor to closely resemble Johansson, maybe without additional tweaks to the voice in post-processing. If they did do that, I am not totally sure what position to take on the matter.


The important thing is that they never said it was Johansson. They were not pretending to be her. They are not imitating her likeness whatsoever.


Some employees were definitely thinking of Scarlett Johansson, even ignoring the reference to the film "Her":

https://x.com/karpathy/status/1790373216537502106


Karpathy doesn't work for OpenAI anymore tho.


The OpenAI one is recording the audio from a phone, whereas the movie version is into a mic directly. They will sound different, but there are elements that are the same. Anyone using these to compare, though, and saying they don't hear the difference isn't comparing apples to apples.

However, the fact that there is a debate at all proves there should be more of an investigation done.


Holy Crappyness Batman! The OpenAI clip is so bad. Homeboy keeps stepping on "her" lines. So from this I come away with he's just a rude asshat that doesn't know how to socially interact with people, she's just too damn chatty and doesn't know when to shut up, or maybe it was just really bad editing? Either way, it's not an intriguing promo to me in the least.


I think the whole thing was scripted beforehand and approved by Sam Altman, of course.


That doesn't really make it better, because now a) it was a horrible script, and b) they didn't try to clean up the audio from "her" with anything more than a fade. If you told me this was just some intern making a video, then maybe, but now that you've told me it was scripted, it just makes it so much worse to me.


Genuine question, what's wrong with trying to replicate in real life an idea from a SciFi movie?

I understand that it could be problematic if OpenAI did one of two things:

- imitated Scarlett Johansson's voice to impersonate her

- misled people into believing that GPT-4o is an official by-product of the film Her, like calling it “the official Her AI”

The first point is still unclear, and that's precisely the point of the article

For the second point, the tweets you posted clearly show that the AI from Her served as an inspiration for creating the GPT-4o model, but inspiration is not trademark infringement

Will Matt Damon receive royalties if a guy is ever stuck on Mars?


> Genuine question, what's wrong with trying to replicate in real life an idea from a SciFi movie?

The thing is, there are several cases where a jury found this exact thing to warrant damages.

But honestly, that is irrelevant. The situation here is that OpenAI is facing a TON of criticism for running roughshod over intellectual property rights. They are claiming that we should trust them, they are trying to do the right thing.

But in this case, they're dancing on the edge of right and wrong.

I don't mind when a sleazy company makes "MacDougals" to sell hamburgers. But it's not something to be proud of. And it's definitely not a company that I'd trust.


Pretty sure the CEO of OpenAI tweeted "Her." after the reveal of the voice.

Isn't that a suggestion that what they're doing is similar to "the Her AI"?


Yes, the unprecedented conversational functionality of the GPT-4o demo could be compared to the AI in the movie. Why assume that the tweet was about the voice sounding like Scarlett Johansson?


It's a suggestion that they were inspired by the movie, not that they are releasing a product under the "Her" trademark

It's a movie, not a patent on female-voiced AI assistants


Imagine if Facebook came to you and wanted an exclusive license to white-label whatever you work on, then after you rejected them they went and copied most of your code, changed the hue or saturation of some of the colors, and shipped it to all of their customers. (There are definitely hours of Scarlett Johansson talking in the dataset that GPT-4o was trained on.)

Would that be ethical?

EDIT: or even better, imagine how OpenAI would react if some company trained their own model by distilling from GPT4 outputs and then launched a product with it called “ChatGPC”. (They already go after products that have GPT in their name)


> then after you rejected them

The article shows the timeline is more like: them already licensing a similar product to your more famous one, then you saying no, and them continuing to use the existing similar one.

> But while many hear an eerie resemblance between “Sky” and Johansson’s “Her” character, an actress was hired to create the Sky voice months before Altman contacted Johansson, according to documents, recordings, casting directors and the actress’s agent.


Facebook does do this, and Google, and Microsoft, and Apple. I believe they call it "Getting Sherlocked."


Same here. In the demo it never sounded like SJ to me. After the story broke I listened to clips from Her and the 4o demo. It doesn't sound like SJ.


And then there's me, and I'm somewhere in the middle. When I first heard that voice, I didn't really think anything of it. But retrospectively, given the media reporting, from Sam Altman tweeting about the movie to the reports of approaching Scarlett Johansson, I can make that connection. But I would not have without the context. And without real reporting I would have dismissed it all as speculation.


Yeah, I can hear the resemblance, but it's not the same. I actually said they should copy SJ's voice for a bigger "her" effect when I saw the demo.


The voice artist put out a statement through her lawyer. She also stated her voice has never been compared to Scarlett in real life by anyone who knows her.


That's because Scarlett's voice is a pretty generic white upper-middle-class woman's voice, with a hint of vocal fry and a slight hint of California (pretty typical given the pervasiveness of media from California).

She's not exactly Gilbert Gottfried or Morgan Freeman.


Now I'm just sad that it doesn't respond in a flirty Gilbert Gottfried style voice.


This is Gilbert Gottfried reading 50 Shades of Grey.

https://youtu.be/XkLqAlIETkA?si=8nLtWaBwq3Swum1i


I heard this comment in my mind in a flirty Gilbert Gottfried voice.

Thank you for the laughter.


I'd like to hear her raw voice compared to the polished product. Listen to famous singers' acoustic recordings vs. their heavily audio-engineered final cuts. Big difference. I think if you played this OpenAI "Sky" voice to a sample population and said it was a famous person's voice, SJ would come up frequently.


This is just Scarlett Johansson trying to destroy some small voice actor. I greatly dislike what OpenAI is doing, but this is just ridiculous.


Scarlett Johansson is apparently so devious she managed to get OpenAI to reach out to her to license her voice and likeness.

She even set up the CEO by having him directly negotiate with her, which I’m sure he also did with the alleged small voice actor. Then she perfected her scheme by having that same CEO publicly tweet “her” - timed with the release of the voice product - referencing SJ’s movie of the same name, where she voiced a computer system with a personality.

She even managed to get OpenAI to take down the voice in OpenAI’s words “out of respect” for SJ while maintaining their legal defense publicly that the voice was not based on hers.


Is it illegal to hire a voice actor that sounds like Darth Vader? No. Is it illegal to hire a voice actor that sounds like Her? No. Would it be appealing to have SJ voice act for them? Sure. Does that mean it's illegal for another voice actor to (according to some) sound similar to a character from a popular movie? No. All of these things can be true together.


The issue isn’t hiring a voice actor to imitate someone, that can be fine. The issue is what you can do with the recordings after you have them.

Making a YouTube instructional video on how to imitate voices that includes clips of a film, for example, would be fine. Reuse the exact sounds from that YouTube video in a different context, though, and you’re in legal trouble.


Right, but making a YouTube instructional video on how to imitate voices, where you only use the imitation voice, is fine. Which is closer to what happened here it seems like.


Illegal?

It probably isn’t criminal, which is what you seem to be asking, although it very well might be depending on the facts.

More importantly, under the available facts SJ likely has a claim for civil damages under one or more causes of action. Her claims are so strong this will likely end up with a confidential settlement in an undisclosed sum without even needing to file a lawsuit. If she does file a lawsuit, there is still greater than 90% likelihood OpenAI settles before trial. In that less than 3% chance the case proceeds to a verdict, then you’ll have your answer without having to make bad arguments on HN.


> In that less than 3% chance the case proceeds to a verdict, then you’ll have your answer without having to make bad arguments on HN.

From the HN guidelines: Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.


You should reread your comment I was replying to where you asked and (incorrectly) answered multiple rhetorical questions and reflect on the HN policy you cited.

Allow me to help you correct your answers:

>Is it illegal to hire a voice actor that sounds like Darth Vader? No.

Actually, yes it can be.

>Is it illegal to hire a voice actor that sounds like Her? No.

Once again it can be.

>Does that mean it's illegal for another voice actor to (according to some) sound similar to a character from a popular movie? No.

Yet again it can be.

As a lawyer who’s been practicing for over 10 years, I can tell you IP law and contract law are far more complex and nuanced than your rhetorical questions and answers suggest.


From the HN guidelines:

Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

Please don't fulminate. Please don't sneer, including at the rest of the community.

Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

Have a nice day, or actually don't, since you work in one of the most evil professions there is ;)


You asked yourself questions and answered your own questions wrong; it’s not a big deal. Read the HN guidelines and, as they say, try to be more curious. There’s really no need to walk through life so angry and bitter when you are wrong.

Also rest assured evil lawyers are not the source of your problem. Maybe one day, if you and your IP/likeness get ripped off like OpenAI did to SJ, you’ll find lawyers aren’t so evil after all. Maybe you will come to realize that for every evil lawyer there is always a lawyer fighting on the other side against that evil.

Once again what do I know, I’m an evil lawyer that does pro bono legal services for children that have been abused, abandoned and neglected as well as victims of torture at the hands of foreign governments as part of my evil profession.

Good luck to you!


SJ doesn't know who the voice actor is. Her objection is with OpenAI's actions.


Why would she?


I can't read minds unfortunately.


This shows how bad it is. If you're proactively sharing a package of docs with the Washington Post, you're toast.

Altman's outreach, his tweet, and the thousands of tweets and comments talking about how similar Sky is to ScarJo is enough to win the case in California.


The Washington Post comprehensively refuted the story. This is like the "this is good for Bitcoin because ____" meme, but in reverse.


They literally didn't question any of OAI's claims. They just regurgitated them.

Their casting was explicitly non-union-only, yet they repeatedly kept hitting up a union actor.

What fears for the actress’s safety have been raised such that not only does she need to stay anonymous, but her agent does too?

"Altman was not involved"... yet he personally reached out to SJ to try to close the deal?


They refuted it based on select documents handed to them by OpenAI.


Then we can add this to the long list of insane lawsuits going the wrong way in California.

They asked SJ, she said no. So they went to a voice actor and used her. Case closed, they didn't use SJ's voice without her permission. That doesn't violate any law to any reasonable person.


Likeness rights are a real thing, and it's not a stretch to say they infringed on them by going to a famous person to use their likeness, getting denied, then using another actor telling them to copy the first actor's likeness.

This is why all Hollywood contracts have actors signing over their likeness in perpetuity now; which was one of the major sticking points of the recent strikes.


>> "then using another actor telling them to copy the first actor's likeness"

Assumes facts not in evidence


And in fact clearly rebutted by the evidence that the actor says they never told her to copy anyone or ever mentioned Johansson or Her.


At a bare minimum, they would have given her direction to sound the way she does. Voice actors have lots of range, and that range would have been on her demo reel.


Agreed. I was married to a voiceover actress. Their range can be quite large :)


The anonymous actor, as reported by the anonymous agent, "fearing for her safety".


It even assumes the opposite, since they asked SJ after recording the original voice.


It's nice of you to clearly state what reasonable persons should believe violates the law. Alas, your contention about what reasonable people believe about the law isn't actually the law.


> They asked SJ, she said no. So they went to a voice actor and used her.

My guess is they would have gone with that voice actor either way. They had four different female voices available (in addition to multiple male voices): 2 for the API, and I believe 2 for ChatGPT (different API voices are still available; different ChatGPT ones aren’t). If Johansson had said yes, it’s likely they would have added a fifth voice, not gotten rid of Sky.


This has echoes of Crispin Glover and Back to the Future 2. They didn't rehire him and got someone else to play his character.


> That doesn't violate any law to any reasonable person.

Midler v Ford is already precedent that using a different actor isn't inherently safe legally.


I predict the case will have parallels with Queen's lawsuit against Vanilla Ice: the two songs (Under Pressure and Ice Ice Baby) are "different" in that one has an extra beat, yet the latter is an obvious rip-off of the former.

Perhaps merely having person A sound like person B isn't enough, but combined with the movie and AI theme it will be enough. Anyway I hope he loses.


You have no idea what they did, unless you work there.

All you know is what somebody being sued for multi-millions of dollars (and whose trustworthiness is pretty much shot) claims they did. And frankly, given the frequency and ease of voice cloning, there are very few people who can say with confidence that they know 100% that nobody at the company did anything to that effect.

What employee, if any, could say with 100% confidence that this model was trained with 100% samples from the voice actress they allege and 0% from samples of Scarlett Johansson/her? And if that employee had done so, would they rat out their employer and lose their job over it?


It's not (or shouldn't be) about things that have some finite probability (no matter how small) of being true, but rather about what can be proven to be true.

There's no doubt a very small (but finite) probability that the voice sounds like a grey alien from Zeta Reticuli.

That doesn't mean the alien is gonna win in court.


I'm not saying they'll necessarily win in court, all I'm saying is I'd wager my life savings that they intentionally created a voice that sounded like Scarlett's character from Her.

Anybody on this forum who says that it's entirely impossible or that it's conclusive that they didn't use her voice samples simply isn't being logical about the evidence.

TBH I really like the voice and the product, but I'm having a lot of trouble wrapping my head around the number of people who seem rather tribal about all this.


If they did clone her voice, they did a poor job of it. Other than that the voice is female there's not a whole lot of resemblance in tone and timbre.


"Reasonable" is doing a ton of work here.


"Reasonable" does a lot of work throughout the entire legal system.

If there's one constant that can be relied upon, it's that "things that are reasonable to a lawyer" and "things that are reasonable to a normal human being" are essentially disjoint sets.


> “Reasonable” does a lot of work throughout the entire legal system.

Yes, but here it’s not being invoked in the sense of “would a reasonable person believe based on this evidence that the facts which would violate the actual law exist” but “would a ‘reasonable’ person believe the law is what the law, indisputably, actually is”.

It’s being invoked to question the reality of the law itself, based on its subjective undesirability to the speaker.


>"Reasonable" does a lot of work throughout the entire legal system.

Yet it never becomes anywhere near the significant fulcrum you made it out to be here, filtering between the laws you think are good and the laws you think are bad. Further, you seem to mistake attorneys with legislators. I'd be surprised if a reasonable person thinks it is okay to profit off the likeness of others without their permission. But I guess you don't think that's reasonable. What a valuable conversation we're having.


No, it has nothing to do with "legislators". The "reasonable man" standard is all over case law, and there are about a bazillion cases where attorneys have argued that their client's behavior was "reasonable", even when it was manifestly not so by the standards of an actual reasonable man.

You can, as they say, look it up.

https://en.wikipedia.org/wiki/Reasonable_person


>No, it has nothing to do with "legislators".

You seem incredibly confused. Legislators pass legislation, not lawyers. So it was never a question as to what lawyers thought reasonable laws are. State representatives determined that it was a good idea to have right of publicity laws and that is why they exist in many large states in the US.

> The "reasonable man" standard is all over case law

Yes, as I already pointed out to you, and another poster did as well, this "reasonable man" standard has nothing to do with your prior use of the word reasonable as an attempt to filter out which laws are the ones you think are okay to enforce.

>You can, as they say, look it up.

You should take your own advice!


> You seem incredibly confused. Legislators pass legislation, not lawyers.

I'm not "confused" about anything.

Yes, legislators pass laws, but how those laws are actually applied very much depends on the persuasive skills of lawyers.

If your hypothetical held, and you could use the printed law as passed by legislators essentially as a lookup table, lawyers would serve no purpose.

But somehow people spend tons of money on them nonetheless.


>I'm not "confused" about anything.

You are very confused. The reasonable person standard has absolutely nothing to do with your initial post where you quoted it.

>If your hypothetical held, and you could use the printed law as passed by legislators essentially as a lookup table, lawyers would serve no purpose.

What the fuck are you talking about? The stuff I see people here say about the law is INSANE. You don't need a lawyer in the US if you are an individual person, you can represent yourself. What the hell does any of it have to do with a lookup table? I've never seen something so deeply confused and misguided.


> "things that are reasonable to a lawyer" and "things that are reasonable to a normal human being" are essentially disjoint sets.

In litigation, any question whether X was "reasonable" is typically determined by a jury, not a judge [0].

[0] That is, unless the trial judge decides that there's no genuine issue of fact and that reasonable people [1] could reach only one possible conclusion; when that's the case, the judge will rule on the matter "as a matter of law." But that's a dicey proposition for a trial judge, because an appeals court would reverse and remand if the appellate judges decided that reasonable people could indeed reach different conclusions [1].

[1] Yeah, I know it's turtles all the way down, or maybe it's circular, or recursive.


I don't think the mannerisms of a performance are something that's copyrightable, though. It sounded like they used a voice actor who was instructed to speak with an intonation similar to Her, but Scarlett Johansson's voice is more raspy, whereas Sky just sounds like a generic valley girl.


For a case to the contrary: Midler v. Ford -- a case in which Ford hired one of Bette Midler's ex-backup singers to duplicate one of her performances for an ad (after trying and failing to get Midler herself). Ford never said this was actually Midler -- and it wasn't -- but Midler still sued and won. https://law.justia.com/cases/federal/appellate-courts/F2/849...


Ford gave explicit instructions to imitate a copyrighted performance. That's because that specific recording was owned by a record studio.

If you can describe a woman's voice and mannerisms and the result sounds similar to a copyrighted performance, that is a natural circumstance.

If you want an example of purposefully imitating something with a copyright, look at GNU. Anyone who had looked at the UNIX code was realistically prevented from writing their own kernel with similar functions. But if a handful of folks describe what the kernel ended up doing, and some <random> guy comes up, in his own head, with some C code and assembly that ends up with the same high-level functions, well, that's just fine, even if you include the original name.

The details matter. There is absolutely enough vocal difference; it doesn't take an audiologist to hear that the two voices sound different, though very close. It would not be hard for the producers to describe "a" voice whose description would overlap heavily with ScarJo, and then the marketing team reached out to see if she would attempt to fill the existing requirements. When she said no, they found a suitable alternative. If the intent was to have ScarJo do the voice, and she said no, and they did it anyway, that's illegal.


Off topic to the thread and your point, but are you confusing GNU with the Compaq BIOS reverse engineering and reimplementation? I hadn't heard this story about GNU (and what kernel)?


> Ford gave explicit instructions to imitate a copyrighted performance.

That case isn't copyright law, Ford had obtained rights to use the song itself.


Copyright isn't at issue here; it's instead likeness rights.


> I don’t think the mannerisms of a performance are something that’s copyrightable, though.

Yes, this discussion is about right of publicity, not copyright.

Copyright is not the whole of the law.


"Her" is one of my favorite movies of all time, and not once while watching the demo did I think that it sounded specifically like ScarJo. The whole concept, of course, made me think of "Her", but not the voice itself.


As a non-American, I only hear Scarlett Johansson's voice in the examples I've heard; to me it clearly is an impersonation. Maybe stateside that specific voice sound is more common and thus less recognisable as Scarlett Johansson's.


They did produce the actual voice artist!


Where? Right now you have "An anonymous person says that an anonymous person said this to him in an email".

That's a pretty low bar for "produced the actual voice artist".


To the Washington Post, which verified it. The Post doesn't much care if you can verify their work, because no reasonable person believes they're making this up.


Words are important. The WaPo didn't verify the voice actor at all:

- "The agent said the actress confirmed..."

- "In a statement from the actress provided by the agent..."

The WaPo hasn't spoken to or verified who the voice actor is.


I don't see that here: https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

Is my google-fu failing me and I'm not looking in the right place?


If you read the WaPo article that's the topic of this thread, you'll see that the actual voice artist is quoted in the article.


No. You'll see that the anonymous artist's anonymous agent supplied a quote he got in an email to WaPo.


This whole thing is starting to feel like another Sam Altman spotlight production. There's enough evidence to show no wrongdoing, but it was handled in a way to make people think there was a scandal. Maximum spotlight for relatively low risk. I wonder if people will get tired of being jerked around like this.


I'm genuinely not sure what you're trying to say here. Are you claiming that this was somehow engineered by Altman on purpose to draw attention, because all publicity is good publicity? Or engineered by his enemies to throw mud at Altman, because if you throw enough some of it will stick?

Occam's Razor argues that Sam simply wanted ScarJo's voice but couldn't get it, so they came up with a clone that is probably technically legal but ethically murky.


> they came up with a clone that is probably technically legal but ethically murky.

Isn't that what OpenAI does all the time? Do ethically murky things, and when people react, move the goalposts by saying "Well, it's not illegal now, is it?".


I would like to think that a normal person, having not been able to hire voice work from a specific well-known actor, and wanting to avoid any appearance of impropriety, would use a completely different voice instead. Sam isn't dumb; he knew the optics of this choice, but he chose it anyway, and here we all are, talking about OpenAI again.


"Talking about OpenAI again" yes. But also reinforcing the "too questionable for me to ever willingly use" image I have of Sam, OpenAI, and their projects.

Maybe that's just me, and it is a win for them on the whole. Hopefully not.


IDK, we were talking about OpenAI regardless. I think this will be a hit on him as a leader. How big? I don't know. It seems to me this is not a brilliant move.


Yeah but we were also talking about the competitors.


OpenAI is playing the short game by trying to strongarm everyone as fast as they can, which might mean there isn't much of a moat in what their generative AIs choose to emit.

Similarly, other companies might be inadvertently playing the long game by building a better (legal/technical) foundation than OpenAI's, and may quietly replace them when they slow down or burn themselves out. I think OpenAI is in overdrive and showing signs of overheating now.


It's not a clone. What is ethically murky about it?

You want Brad Pitt for your movie. He says no. You hire Benicio Del Toro because of the physical resemblance. Big deal.

Having seen "Her" and many other Scarlett Johansson movies, I didn't think for a second that GPT-4o sounded like her. On the contrary, I wondered why they had chosen the voice of a middle aged woman, and whether that was about being woke. It wasn't until social media went hysterical that I realized the voices were sort of similar.


If it's a sequel and Brad Pitt was in the first movie and you use trickery to make people think he's in the second movie, there's a case. See Crispin Glover, the dad from Back to the Future, who was NOT the upside-down dad in BTTF2. They settled for 760k USD.


Openai never claimed it was Scarlett Johansson though. They didn't trick anyone.


Spielberg & co never claimed Glover was in BTTF 2. The replacement actor is credited. However they heavily implied that Glover came back, by approximating his appearance with prosthetics, preventing his face from being seen up close, and having the replacement actor mimic Glover's voice.

Glover sued and won.


> they heavily implied that Glover came back, by approximating his appearance with prosthetics, preventing his face from being seen up close, and having the replacement actor mimic Glover's voice

Do you think OpenAI did something similar here? In your case there is some expectation carried over from the first movie; OpenAI doesn't have anything similar. I'm really for people getting credit for their work/assets, and I would be on the individual's side against big tech, but I think the case OpenAI and SJ have at hand is already on the path to set a wrong precedent, regardless of which of them wins, if either.


The movie "Her", which Sam Altman claimed ChatGPT was in some sense connected to, came first.


But there is a connection to it. It's about an AI assistant, which is what OpenAI is releasing. Disregarding Scarlett Johansson completely, it makes total sense that Sam Altman made that tweet.


Not explicitly, no. They just heavily implied it.

Typical sleazy open AI / Sam Altman behaviour, AFAICS.


Sam tweeting "Her" is a clear as daylight indication that they are deliberately trying to associate the voice with ScarJo's performance.

They're squarely in the zone with knockoff products deliberately aping the branding of the real thing.

"Dr Peppy isn't a trick to piggyback on Dr Pepper, it's a legally distinct brand!" might give you enough of a fig leaf in court with a good lawyer, but it's very obvious what kind of company you're running to anybody paying attention.


Or he tweeted 'her' to compare his product with the movie AI's conversational abilities. It just depends on how one subjectively interprets a single syllable.


That would be a rather weird and boneheaded thing to do, when you've already twice approached said AI's voice actor and been rejected.

There are any number of human-sounding movie AI's, but apparently only one whose actor has specifically and repeatedly rejected this association.

Does he keep getting into ethical hot water because he's a reckless fool, or because he doesn't really care about ethics at all, despite all the theatre?


Why not both?


There are many many many movies with conversational AIs. Why that one?


'Her' is clearly the most similar to that demo, even ignoring the fact that they have slightly similar voices.

Of the films I've seen anyway.


> I wondered why they had chosen the voice of a middle aged woman

AIs and automated systems, real and fictional, traditionally use women more than men to voice them. Apparently there was some research among all-male bomber crews that this "stood out", the B-58 was issued with some recordings of Joan Elms (https://archive.org/details/b58alertaudio ) and this was widely copied.

(obvious media exception: HAL)


You then tweet out, "look, it's Brad Pitt in my movie".


They never claimed it was Scarlett Johansson's voice though.


If the objective is to absolutely always avoid reading between the lines, then yes you're correct.


In public, other than Sam’s tweet.

Do you think nobody said anything like this in an email or Slack?


> I wondered why they had chosen the voice of a middle aged woman, and whether that was about being woke

really weird line of reasoning. Siri, Alexa, Google Home… etc.


> for relatively low risk

This was rocket fuel for activists trying to get a nationwide personality rights law on the books. That would almost certainly increase costs for OpenAI.


> That would almost certainly increase costs for OpenAI.

And every one of its competitors. I think regulatory capture would be just as much, if not more, of a victory for OpenAI.


The evidence shows wrongdoing with ass-covering.


I think people will get really sick of all the drama once the paperclips start chiming.


Clippy was ahead of his/her/its time.


xer


I don't think you understand. It's extremely well established in law: you can't approach someone to voice an advert for you, get told no, and then hire an impersonator to do it. Take all the AI hype bullshit and the cult-of-personality bullshit out of it. What Altman did is very standard and very clearly not allowed. He will end up paying for this in monetary terms, and paying further for it in the reputation damage - in that no one can trust OpenAI to conduct business in good faith.


> It's extremely well established in law: you can't approach someone to voice an advert for you, get told no, and then hire an impersonator to do it.

Can you explain and/or cite the legal basis here? What cases? What law?


It's termed personality rights[1] and this would be appropriation of her likeness. There's good reason that famous actors actually get commercial work and we don't just hire soundalikes all the time.

[1]:https://en.wikipedia.org/wiki/Personality_rights#United_Stat...


This assumes that the voice actress was an impersonator. By her own statements, no one who knows her has said that her voice sounds like Scarlett Johansson (personally, I agree). And she was auditioned and hired before SJ was even approached. I don't think that this falls under the "very standard" scenario you reference.


Grab 'em by the nothing burger.


You don't get the fame of being the psychopath among the Silicon Valley CEOs for nothing.


I honestly don't understand how delusional you have to be to think OpenAI wanted this to happen.


It's a very cheap way to get people to realize GPT-4o is something new.


So they planned to remove ChatGPT's most popular voice, causing anger among many of their customers?


If I didn't much care for my critics, then letting them invent a lot out of a story I can rebut easily is worth waiting a few days, knowing full well I can publish it widely whenever I want.

An ordinary person worries all the time about dealing with the legal system. A big company does it all the time.


I mean clearly having Scarlett Johansson on board was plan A.

Bringing the voice offline and then revealing it was a recording of someone else who coincidentally sounded exactly the same is definitely plan B or C though.

I don't understand how you can trust OpenAI so much to think it was all an accident.


Read what I said again


I did, and it still doesn't make sense. Now what?


> I honestly don't understand how delusional you have to be to think OpenAI wanted this to happen.

(1) I've become tired of the "I honestly don't understand" prefix. Is the person saying it genuinely hoping to be shown better ways of understanding? Maybe, maybe not, but I'll err on the side of charitability.

(2) So, if the commenter above is reading this: please try to take all of this constructively. There are often opportunities to recalibrate one's thinking and/or write more precisely. This is not a veiled insult; I'm quite sincere. I'm also hoping the human ego won't be in the way, which is a risky gamble.

(3) Why is the commenter so sure the other person is delusional? Whatever one thinks about the underlying claim, one would be wise to admit one's own fallibility and thus uncertainty.

(4) If the commenter was genuinely curious why someone else thought something, it would be better not to presuppose they are "delusional". Doing that makes it very hard to be curious and impairs a sincere effort to understand (rather than dismiss).

(5) It is muddled thinking to lump the intentions of all of "OpenAI" into one claimed agent with clear intentions. This just isn't how organizations work.

(6) (continuing from (5)...) this isn't even how individuals work. Virtually all people harbor an inconsistent mess of intentions that vary over time. You might think this is hair-splitting, but if you want to _predict_ why people do specific irrational things, you'll find this level of detail is required. Assuming a perfect utility function run by a perfect optimizer is wishful thinking and doesn't match the experimental evidence.


I honestly don’t understand why people care about this story at all.


Goes to character


I honestly don't understand how delusional you have to be to not think OpenAI wanted this to happen.


People see what they want to see.

If it wasn't for us being biased by the surrounding circumstances I don't think people would have confused their voices. Their voices are not that similar. I probably personally know people with a voice as similar to SJ as Sky's. You probably do too.

The voice actress says the same: "I’ve never been compared to her by the people who do know me closely."

But then suddenly a story emerges and their voices are indistinguishable. All of these extra details shouldn't have even mattered.


Everyone mentions the "her" tweet, but I'm surprised to see nobody mention this tweet from ex-OpenAI employee Karpathy: https://x.com/karpathy/status/1790373216537502106

If it sounds nothing like her, and there was no intent to make it sound like her, why would he tweet "The killer app of LLMs is Scarlett Johansson"?

Many people made the comparison right after it was released https://x.com/search?q=scarjo%20until%3A2024-05-14&src=typed... and https://x.com/search?q=johansson%20until%3A2024-05-14&src=ty...


Because the functionality is extremely similar to the AI depicted in Her?


the tweet doesn't say "killer app is Her", it says it's "SJo". Very big difference


Try not to hurt your spine there.


Because of "Her", I imagine, and all the memes about GPT-4o. I always thought of the "Her" and GPT-4o memes as "they both have a real-time charming female AI assistant", not as them having the same voice.


The tweet, plus all of the last-minute attempts to get Scarlett to sign off on it, signals an intent to try and make it sound like "her".

why be an apologist?

OpenAI was given an opportunity by Scarlett to prove that they did not intend to make it sound like her, and instead of responding their choice was to take down the voice. (yet another signal)

I think you'd have to be willfully ignorant to believe there wasn't some intent here.

Whether or not it's legal to copy someone's likeness in this fashion is another story.


"the surrounding circumstances" are relevant to the question.


I thought it sounded like SJ when I watched the demo live https://news.ycombinator.com/item?id=40345775&p=6#40346221


Another data point: I didn't.


Or, as the case may be, hear what they want to Her-- eh, hear.


>People see what they want to see.

Sounds like a female. Sounds like a white person. Sounds like between 30 and 40. It must be SJ.


If she were not the voice actress in the smash hit movie that their product is directly inspired by, then this would be a great point!


I never made the connection between the Sky voice and Scarlett Johansson's. I've seen many of her movies. She has an extremely distinctive voice that has a certain huskiness to it and the Sky voice totally lacks that.

Some voices are sexy and both of them fall into that category -- but that's beside the point.

That aside, it is genuinely pleasant to have a conversation with ChatGPT-4o, and some of that has to do with the voices. There's a kind of irony here, because people generally imagined that AI would be cold, logical, unempathetic, etc. But LLMs are the opposite; they're extremely polite and deferential. Meanwhile they aren't that good at logic!


I don't find any of the OpenAI voices sexy or deferential. They sound fake-happy to me, like a customer service phone menu or an elementary school teacher, and reek of Bay Area vocal fry [1] and lilt. I wish there were a greater diversity of accents and speaking patterns available, such as can be seen on the Speech Accent Archive [2].

1. https://www.youtube.com/watch?v=Q0yL2GezneU

2. https://accent.gmu.edu/browse_language.php?function=find&lan...


I made that connection but never really thought it was the same. I just thought it was very close

> Meanwhile they aren't that good at logic

This statement is contentious; depends on what level of abstraction you're looking at.

> Extremely polite and deferential

This is a setting that can be turned off btw.


Counterpoint: it was the first connection I made when I heard the voice. I also really enjoyed the movie Her.


I think it's pretty obvious that OpenAI had decided at an early stage that the new voice should resemble the voice of SJ in "Her", regardless of which voice actresses they then contacted and in which sequence.


Yeah this seems pretty clear to me too. The voice was also overtly horny, way more so than the others are. I say this as a gay man - I was uncomfortable listening to her talk about ISA specifications like she had a fetish for them.

Seemed like an easy "pop culture" win, akin to Cortana / Halo and Pacific Rim / GLaDOS.


As a hetero man, I didn't get the overt horniness out of that voice; maybe somewhat flirtatious at most. Are you maybe projecting?


How would I be projecting? Some of the tones she made were borderline moaning. I noticed it most on a poor connection, too, which made little sense.


I was perusing some Simpsons clips this afternoon and came across a story to the effect of "So and so didn't want to play himself, so Dan Castellaneta did the voice." It's a good impression and people didn't seem very upset about that. I am not sure how this is different. (Apparently this particular "impression" predates the Her character, so it's even easier to not be mad about. It's just a coincidence. They weren't even trying to sound like her!)

I read a lot of C&D letters from celebrities here and on Reddit, and a lot of them are in the form of "I am important so I am requesting that you do not take advantage of your legal rights." I am not a fan. (If you don't want someone to track how often you fly your private jet, buy a new one for each trip. That is the legal option that is available to you. But I digress...)


Surely there’s some kind of difference between “voice impression for a two-line cameo in one episode of an animated sitcom” and “reproducing your voice as the primary interface for a machine that could be used by billions of people and is worth hundreds of billions of dollars.”

Is there a name for this AI fallacy? The one where programmers make an inductive leap like, for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.


If famous actors could sue over the use of a less-famous actor that sounds just like them, what's to stop less-famous actors from suing over the use of a famous actor who sounds just like them in big-budget movies? ... and that's when you discover that "unique voice" is a one-in-a-million thing and thousands of people have the same voice, all asking for their payout.


> what's to stop less-famous actors from suing over the use of a famous actor who sounds just like them in big-budget movies?

Not having idiots (or ChatGPT) for judges.


> Not having idiots (or ChatGPT) for judges.

Always the Achilles heel of software engineers' (not so) clever legal strategies.


That's a common retort on HN but it's information free. Judges are at least theoretically and often in practice bound to follow both written law and logic, even if it yields apparently silly outcomes. The prevalence of openly political judgements in the news makes it seem like this isn't the case, but those judgements are newsworthy exactly because they are shocking and outrageous.

If voices being similar to each other is found to be grounds for a successful tort action then it'd establish a legal precedent, and it's very unlikely that precedent would be interpreted as "whoever the judge heard of first wins".


> it's very unlikely that precedent would be interpreted as "whoever the judge heard of first wins"

No, it's whoever's voice is famous. The voice per se isn't valuable, its fame is. Personality rights are precedented [1].

> voices being similar to each other is found to be grounds for a successful tort action then it'd establish a legal precedent

It's not about similarity. It's about property. Johansson developed her voice into a valuable asset. It's valuable because it's Scarlett Johansson's voice.

Tweeting Her explicitly tied it to Johansson, even if that wasn't the case up to that point.

[1] https://en.wikipedia.org/wiki/Personality_rights


> It's valuable because it's Scarlett Johansson's voice.

I think demonstrating that this is a substantial part of the attraction of OpenAI's tech will be difficult.


>> It's valuable because it's Scarlett Johansson's voice.

> I think demonstrating that this is a substantial part of the attraction of OpenAI's tech will be difficult.

I think it's totally irrelevant if her voice "is a substantial part of the attraction of OpenAI's tech." What matters is they took something from her that was her valuable property (her likeness). It doesn't matter if what they took makes up 99% of the value or 0.00001%.


It does when it comes to this being a useful topic.

They didn't take her likeness; they recorded someone else. The only claim she has is that someone who sounds like her will add value to their product more than if the person didn't sound like her. At which point the question is: how much value?

(Even that isn't a claim in and of itself, of course, but it might be the basis for a "I'll make people not like you so pay me restitution from your marketing budget to avoid a court case" shakedown.)


> They didn't take her likeness; they recorded someone else.

Those aren't mutually exclusive: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. Also, https://arstechnica.com/tech-policy/2024/05/sky-voice-actor-...:

> The timeline may not matter as much as OpenAI may think, though. In the 1990s, Tom Waits cited Midler's case when he won a $2.6 million lawsuit after Frito-Lay hired a Waits impersonator to perform a song that "echoed the rhyming word play" of a Waits song in a Doritos commercial. Waits won his suit even though Frito-Lay never attempted to hire the singer before casting the soundalike.

----

> The only claim she has is that someone who sounds like her will add value to their product more than if the person didn't sound like her. At which point the question is: how much value?

That may be relevant when damages are calculated, but I don't think that's relevant to the question of if OpenAI can impersonate her or not.


Yeah - that's an interesting case. Definitely goes against my instincts as to what should be, but case law is like that.

I suppose in this case the claim has to be something like: "They hired someone who sounds like my impersonation of a character in a film called Her".


Yeah, but it's not Scarlett Johansson's voice and therefore not her property. It's one that sounds similar, but is different, and thus belongs to the true voice actress.


> it's not Scarlett Johansson's voice and therefore not her property

It's not her voice. But it may have been intended to sound like her voice. (I believed this less twenty-four hours ago, but I'm hesitant to grant Altman the benefit of the doubt.)

If it were her voice, would you agree that seems distasteful?

> one that sounds similar, but is different, and thus belongs to the true voice actress

They marketed it as her voice when Altman tweeted Her.


> They marketed it as her voice when Altman tweeted Her.

Even that is not open and shut. He tweeted one word. He certainly wanted an association between the product and the movie, but it is a much more specific assertion that that one word demonstrates an intent to associate the product's voice actress with the actress who voiced the comparable product in the movie.



Nobody stops anyone from suing, but the less-famous actor would have to make a plausible case that the big-budget movie intended to copy the voice of the less-famous actor.


> for example, if a human can read one book to learn something, then it’s ok to scan millions of books into a computer system because it’s just another kind of learning.

Since this comes up all the time, I ask: What exactly is the number of books a human can ingest before it becomes illegal?


This is a bit like someone saying they don't want cars traveling down the sidewalk because they're too big and heavy, and then having someone ask how big and heavy a person needs to get before it becomes illegal for them to travel down the sidewalk.

It misses the point, which is that cars aren't people. Arguments like "well a car uses friction to travel along the ground and fuel to create kinetic energy, just like humans do", aren't convincing to me. An algorithm is not a human, and we should stop pretending the same rules apply to each.


Is that a good example? People have been arguing in court about that exact thing for years, first due to Segway and then due to e-scooters and bikes. There's plenty of people who make arguments of the form "it's not a car or a bike so I'm allowed on the sidewalk", or make arguments about limited top speeds etc.


> first due to Segway and then due to e-scooters and bikes

Those aren't cars.

But you've identified that the closer something comes to a human in terms of speed and scale, the blurrier the lines become. In these terms I would argue that GPT-4 is far, far removed from a human.


Legally they're vehicles sometimes, and sometimes technically not supposed to drive on sidewalks. Perhaps that's the Segway equivalent of fair-use scientific research on crawled web data.


Research exception is an explicit statutory exception to copyright, not a fair use case.


> Is that a good example?

Yes. It is pertinent not only to this particular instance (or instances, plural; AI copyright violations and scooters on sidewalks), but illustrates for example why treating corporations as "people" in freedom-of-speech law is misguided (and stupid, corrupt, and just fucking wrong). So it is a very good example.


It's easy to explain the difference between a person and a car in a way that's both specific and relevant to the rules.

If we're at an analogy to "cars aren't people", then it sounds like it doesn't matter how many books the AI reads, even one book would cause problems.

But if that's the case, why make the argument about how many books it reads?

Are you sure you're arguing the same thing as the ancestor post? Or do you merely agree with their conclusion but you're making an entirely different argument?


Thank you, love this response.


Then again, bicycles are neither people nor cars, and yet they lay claim to both the sidewalk and the road, even though they clearly are neither, and are a danger and a nuisance on both.


Depends on similarities between existing data and generative outputs, so minimum is zero. Humans are caught plagiarizing all the time.


Plagiarism is not illegal, it's merely frowned upon, and only in specific fields. Everywhere else, it's called learning from masters and/or practicing your art.


wtf.


Learning is just a conditioned response to inputs.

You were conditioned to give that response.

If I ask an AI about the book Walden Two, for example, it can reproduce and/or remix that. Knowing is copying.

[Why Walden Two? BF Skinner. And an excellent book about how the book was lived: https://www.amazon.com/Living-Walden-Two-Behaviorist-Experim... ]


It's true.


For a human? Whatever they can consume within their natural life.


Does natural life still count if a person is using an artificial heart?

What about if they have augmentation that allows them to read and interpret books really fast?

It’s not an easy question to answer…


"But what if a person was so thoroughly replaced with robot parts to be just like a computer" is just "if my grandma had wheels, she would be a truck, therefore it's not so easy to say that cars aren't allowed to drive inside the old folks home".

People and software are different things, and it makes total sense that there should be different rules for what they can and cannot do.


Your first question doesn't change the answer, and your second question depends on a premise that isn't real.

Natural life is plenty simple in this context.


100 per second.


> Surely there’s some kind of difference between “voice impression for a two-line cameo in one episode of an animated sitcom” and “reproducing your voice as the primary interface for a machine that could be used by billions of people and is worth hundreds of billions of dollars.”

There are too many differences to understand what you're saying. Is the problem that the company doing it has too much money? Fox is also pretty wealthy.

I think the pertinent question is: does having it sound like Scarlett Johansson mean they get to access billions of people? If not, then while she might get paid out a few million, it'll be from OpenAI's marketing budget and not because of actual value added.


How unique is a voice? I'm sure there are enough people out there who sound like Johansson. There's probably some argument for voice + personality + face + mannerisms, some gestalt that's more comparable to copying the likeness of a "person". But OpenAI is copying a fictional character played by Johansson; it's not her. Do actor/esses get to monopolize their depiction of fictional characters? Especially when it's not tied to physical representation. What if OpenAI associated it with an avatar that looks nothing like her? I'm sure Hollywood and/or the actors' union is figuring this out.


> “Do actor/esses get to monopolize their depiction of fictional characters? Especially when it's not tied to physical representation.”

If Annapurna Pictures (the production company that owns the rights to “Her”) made a sequel where the voice AI is played by someone else than Johansson but sounded the same and was marketed as a direct continuation, I think there would be a lawsuit.

She didn’t write the script or develop the character, but I think there’s enough creative authorship in her voice portrayal that it would be risky for the production company.


But OpenAI isn't making a sequel to Her. I feel like there would be pre-existing legal text in the contract about reprising the role in the event of a franchise if Johansson has leverage, or about the ability to cast a close facsimile if the studio has leverage. Right now Johansson has leverage in the court of public opinion, not necessarily in law. What if OpenAI used a cartoon cat avatar that sounded like "Her"? What if they have one interaction that doesn't comport with "Her"'s personality from the movie, thereby indicating a different being? Is there some comprehensive method-acting documentation outlining the full complexity of a fictional character? Seems like there are many ways for OpenAI to make the voice sound like her but not embody "Her", but they'd rather strategically retreat because of optics. IANAL, but I am interested in seeing how this will get resolved in court.


They aren't making a sequel, they are doing an unlicensed video game conversion.

Reading all these musings here about a possible "there is no bad publicity" approach, I'm starting to wonder if the plan, had Johansson signed up, was to achieve court-drama publicity by getting sued by Annapurna Pictures. "Can a three-letter tweet be the basis of a copyright case?"


It's entirely routine for actors and actresses to be replaced in follow up works. The industry is full of examples, but here's 1 off the top of my head:

In Iron Man 1, Rhodey is played by Terrence Howard. For Iron Man 2, he wanted too much money in the contract, so they replaced him with Don Cheadle.

Wouldn't it be a dumb world to live in if a single actor in the cast can halt the production of a new work via lawsuit because they own the character?



"How unique is X" is something we can start to get quantitative answers to with strong machine learning models, and for most things people care about, it seems like the answer is "not very unique at all".


I think it's a very similar question to "how unique is your cooking". Most people aren't unique cooks.



It's not a fallacy. Behind the AI are 180M users inputting their own problems and giving their own guidance. Those millions of books only teach language skills; they are not memorized verbatim, except in rare instances of duplicated text in the training set. There is not enough space to store 10 trillion tokens in a model.
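
To make the "not enough space" point concrete, here is a rough back-of-envelope sketch; the corpus size, parameter count, and precision below are illustrative assumptions, not figures from OpenAI:

    # Back-of-envelope comparison of training-data size vs. model size.
    # All numbers are assumptions for illustration, not OpenAI's actual figures.
    tokens = 10e12            # assumed training corpus: ~10 trillion tokens
    bytes_per_token = 2       # token IDs stored as 16-bit integers
    corpus_tb = tokens * bytes_per_token / 1e12   # ~20 TB of raw token data

    params = 1e12             # assumed model size: ~1 trillion parameters
    bytes_per_param = 2       # fp16/bf16 weights
    model_tb = params * bytes_per_param / 1e12    # ~2 TB of weights

    print(f"corpus ~{corpus_tb:.0f} TB vs. model ~{model_tb:.0f} TB")
    # The weights are roughly an order of magnitude smaller than the data,
    # so verbatim storage of the entire training set is not possible.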

And if we wanted to replicate copyrighted text with an LLM, it would still be a bad idea; it's better to just find a copy online, which is faster, more precise, and usually free. We here are often posting paywalled articles in the comments; it's so easy to circumvent the paywalls we don't even blink twice at it.

Using LLMs to infringe is not even the intended purpose, and it only happens when the user makes a special effort to prompt the model with the first paragraph.

What I find offensive is restricting the circulation of ideas under the guise of copyright. In fact copyright should only protect expression not the underlying ideas and styles, those are free to learn, and AIs are just an extension of their human users.


I know there are some exceptions in US law for the use of parody(?).


Sure, but what does that have to do with this?


> I was perusing some Simpsons clips this afternoon and came across a story to the effect of "So and so didn't want to play himself, so Dan Castellaneta did the voice."

IANAL, but parody and criticism are covered under Fair Use doctrine for Copyright law in the United States [1]. The Simpsons generally falls into that category, which is why they rarely get into trouble.

[1] https://en.wikipedia.org/wiki/Fair_use


The current system, where you're allowed to "exploit" other people's image but only if it's parody, seems like a bit of an absurd loophole. Arnold as president in the Simpsons is okay, but Arnold as an AI-generated president in an action movie - suddenly not okay.

Both arguably contributing the same minuscule amount to the "public discourse".


That example isn't really pertinent, because in the case of the Simpsons it's fairly certain that the actors and actresses sign away the rights to their likeness to the company, otherwise there'd be major issues if one ever quit, became unable to work, just wanted a bunch of money, or whatever. There's probably some poor analogy with how if you write software, your company [generally] owns it.

For something more general look at Midler v. Ford [1], and lots of other similar cases. Ford wanted to get Midler to sing some of her songs (that Ford owned the copyright to) for a commercial. She refused, so they hired an impersonator. They never stated it was Midler in the commercial, but they were nonetheless sued, and lost, for violating her 'rights of personality', even for content they owned the copyright to! Uncopyrightable characteristics highly associated with a person are still legally protected. Similar stuff with fight refs: lines like 'Let's get it on!' or 'Let's get readddy to rumble.' are literally trademarked, but that's probably not even strictly necessary, since they would be implicitly protected by rights of personality.

[1] https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


I know it's pedantic, but Ford did not own the copyright to either the original Bette Midler performance recording or the lyrics/melody of the song. The marketing company that prepared the 'Yuppie Campaign' for Ford did negotiate a license for the lyrics/melody from the copyright holder. It doesn't make a substantial difference, but commenters have been using wide-ranging analogies in this thread and I wanted to make sure nobody built an argument about the precedent case on a flawed foundation.


This sort of thing happens a lot, and is of course legal even if it isn't polite. I remember a decade or so ago, when having "celebrity" voices for your GPS was a thing, there was an interview with the actor Michael Caine about how some company wanted him to do a GPS voice. He declined, and later found out that they had used an impersonator to make a voice that was obviously supposed to be his.


Just to clarify for people who don't read it: the article isn't claiming this was trained on the voice of someone doing a Scarlett Johansson impression. It says it was trained on the natural voice of someone who sounds similar to Johansson, hired months before Altman reached out to her.


Who cares about this nothingburger


Lots of people, apparently: One of the biggest threads I've seen on here in the last weeks (months?). So not a "nothingburger".


I had similar thoughts based on a podcast I listened to once about voice actors hired for film spin off merchandise and whatnot. It's very common to look for voices that approximate a fictional character's voice, that was originally done by a different actor or actress.

Thinking about that episode, I imagine the legal risk is less in trying to sound like Scarlett Johansson, and more in trying to sound like Samantha, the AI character in Her. Warner Brothers or Spike Jonze probably has some legal rights to the character, and an argument could be made that OpenAI was infringing on that. The viability of that argument probably depends on how much people conflate the two or believe that the one was meant to represent the other.


Parody is protected in the US. The Simpsons can get away with a lot of stuff because of it


"The agent, who spoke on the condition of anonymity to assure the safety of her client, said the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI."


But will either be mentioned by ChatGPT?


I can't help but think that this was all planned. It is a very intricately planned and ingeniously executed marketing ploy to make sure everyone knows about the company, the new release, and the fact that there is a voice now, and it even gets everyone to look into it just to "see for themselves". Whether this was done with ScarJo in the loop or not doesn't really change the outcome, but it would be nice information (which we'll probably never get) for understanding how cut-throat whoever came up with this idea actually is.

I am impressed


Regardless of the legal outcome, there now exists a corpus of public text from news coverage of this incident, upon which OpenAI's models will be trained, which correlates SJ's name with reporting on a 2024 OpenAI voice.


And what does this imply?


Imagine what GPT-5 might say in the future, once it's been trained on a snapshot of the Internet including these articles:

> whose voice do you use?

> It's based on Scarlett Johansson's performance in "Her."


I think you're giving far too much credit to those involved. Not everything is a plot.


It _does_ make me wonder whether spin doctoring and plotting will be available as a model in the near future.


It's not very impressive.

As a result of the negative publicity, most people know OpenAI as the company that steals things from other people. Most of them will never hear that this one time, OpenAI didn't actually do the thing they were accused of doing.

That's the problem with having a repeat liar as a CEO: you lose any credibility for those rare instances when you're actually telling the truth.


>most people know OpenAI as the company that steals things from other people.

That is already the perception of AI in general, especially evident if you reflect back on the Github Copilot launch.

People move on as the usefulness of the next shiny thing fills the void of time.


People literally do not care about theft.

An ad-blocking generation of people who pirated music and movies with the zeal of a god-given right is now complaining about AI taking people's work. OK.

If GPTxyz is convenient and makes their life easier, they will use it.


This appears to have been all planned, as they know ScarJo likes to sue. If they win this case, it will be open season in the future for them to hire voice actors who sound like established celebrities.


While I cannot say whether you're right or wrong, we share a similar thought! So much so that I feel like this is not the first time OpenAI has pulled this level of antics just to get more exposure. Seriously... I have spent time pondering the same thing every time they get into the news on the basis of drama.

It's either our delusion (you and me) or they have someone in the marketing department who has a really good grasp of how to ride the news cycle wave.


The New-Coke strategy ?


This is some Trump supporter levels of copium. Loads of people now think Sama is a dick to SJ and doesn't care about the consent of artists, no matter what the records show. Tweeting "Her" was fucking moronic, and just releasing a product that bloody worked would have been far better for OAI and the whole AI industry.


> an actress was hired to create the Sky voice months before Altman contacted Johansson

> the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI. The actress’s natural voice sounds identical to the AI-generated Sky voice, based on brief recordings

Given this, I don't think anyone at OpenAI did anything wrong in this instance except Sam Altman. After getting explicitly rejected by Johansson he should not have asked again, and he definitely should not have referenced her character in that tweet. And this whole thing could have been avoided if they had just used a different voice for the demo instead of the voice that happened to sound the most like her.

They should have learned from Weird Al. Famously, he technically doesn't need permission to do song parodies but he asks anyway and respects the artist's wishes if they say no.


>> an actress was hired to create the Sky voice months before Altman contacted Johansson

>> the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI. The actress’s natural voice sounds identical to the AI-generated Sky voice, based on brief recordings

> Given this I don't think anyone at OpenAI did anything wrong in this instance except Sam Altman.

Not necessarily. The OpenAI people could have had the internally stated goal of making a soundalike of Johansson, sorted through the applicants to find the actress who sounded closest to Johansson, then given direction to steer the actress's performance to be similar to Johansson's in Her. All the while never mentioning Johansson or her movie to the actress directly.

Maybe they were trying to mimic the precedent of Compaq reverse engineering and cloning the PC BIOS using a "Clean Room."


> definitely should not have referenced her character in that tweet.

Did he, though? The character's name was Samantha. The name of the _movie_ was Her.


And who was the "her" the title referred to?


I listened to a few clips of each and was expecting them to sound more similar than they do.

Sure they're both female voices with some similarities, but they sound like distinctly different people if you listen to the two back to back.


The press says the Sky voice is indistinguishable from Scarlett Johansson's, but I hear zero similarity to her voice in any of her films. Besides, of course, that it is a standard-issue, unaccented, white Anglo female voice.


It sounds only like Scarlett Johansson to me. I watch a lot of TV and film and can't think of a single person it sounds like other than Scarlett Johansson. I think the same would be true for most people, not just in the US. The tweets (one referencing Scarlett Johansson by name, and one from Sam just saying "her") also confirm my bias here.

In all honesty, I thought this was just another thing AI vacuumed up without thought. Weird how when it's a "real" celebrity's work that gets put through the AI sausage machine people get skittish.


Can you post a link to the audio you think sounds like her? I’m very curious


Then why pull the voice in the first place?

If it goes to court I'm sure discovery will unearth a bunch of emails and Slack messages pertaining to this, as well as documentation about the makeup of their training sets and casting and performance notes for the voice talent. Hopefully they're under legal hold now.


> Then why pull the voice in the first place?

Several possible reasons.

1. They're not actively malicious and have the emotional capacity to feel embarrassed

2. They know they were talking about her voice and making comparisons and that this is A Bad Look and they've had a terrible week PR-wise already

3. Might be a trademark violation, I'm not sure how that law works (in general let alone voice) but there's a reason why Pear Computer had to change their logo


1 is possible, 2 strikes me as unlikely (how bad of a week was it? I didn't get that sense). 3 doesn't make sense — this has nothing to do with trademark law. A logo is a 'mark' that identifies a brand, but a voice is not a 'mark.'


#2 because they're humans and humans act emotionally — it's entirely possible for them to just go "aaaa" like all the people in comments sections do regularly.

#3 famous people do seem to have some control over their likeness; I don't know how real that control is or anything about how it works, just that it looks like this from the outside. IANAL, so treat that as an analogy, not a literal legal claim.


Exactly. All this reporting says is that the actress wasn’t explicitly told to copy “Her”. We still don’t know about the intentions of OpenAI throughout all of this. With Sam’s seeming obsession with the movie, are we really supposed to believe that the company never discussed it internally?


Pulling things at first signs of an outcry seems to be the thing to do. It's hard to know how folks would have reacted had they not pulled it.


Quote from the Post article:

But while many hear an eerie resemblance between “Sky” and Johansson’s “Her” character, an actress was hired to create the Sky voice months before Altman contacted Johansson, according to documents, recordings, casting directors and the actress’s agent.


> But while many hear an eerie resemblance between “Sky” and Johansson’s “Her” character,

I wonder to what extent it is because they were prompted to listen for that. Would they still hear the resemblance if they didn't know who to compare it to?


I heard it when they were showing the demo without any prompting "to listen for it", as did many others.

https://x.com/search?q=scarjo+until%3A2024-05-14&src=typed_q...

https://x.com/search?q=johansson+until%3A2024-05-14&src=type...


He said, she said.

Maybe the courts can decide, rather than the court of popular opinion, when we don't have access to the actual evidence?

If they're that confident, why did they take the voice down?

What will they do if she continues legal action?

They can say anything, and all we know for sure is that they're not usually honest. So... let's see how it goes in court, huh?

...but I guess it won't go there, because they do not want that discovery. (:


Kind of concerning to see so much sketchbag behaviour from possibly the top AI company, so consistently. Really inspires confidence in the future, and it's sad that they can get away with it and still succeed because of the tech and their big names.

It feels like it took years before people started hating Google and Tesla, but these guys wasted very little time pretending to be good.

Do I still pay for and use them? Unfortunately yes, but I won't spend a second defending them or thinking they are trying to do anything remotely good, safe or ethical. Once things settle and hopefully someone else takes over, I can move my money elsewhere.


I feel like this is the most important comment I read before giving up — Altman seems to have tried to move fast and break things (a dubious strategy at best) but now he's going full Musk/Shkreli.


Kind of makes the sama ouster seem more reasonable if this is the quality of his calls


I was so confused how so many people at the time didn't think it was clearly reasonable. The man has already taken over an "open" AI company and subverted its non-profit mission and governing structure. I was sympathetic to the employees who didn't want to see their giant equity compensation evaporate, but how anyone else looked at that power struggle and thought he was on the right side of it remains beyond me.


In today's climate you are considered guilty until you prove you are innocent. People who accused you don't have to show any proof.


There is nothing new about people forming their own opinions without following the rules of courts of law. Indeed, it wouldn't have been so important to enshrine the "innocent until proven guilty" principle into legal systems, if it were already the normal way for people to react to things.

It's ok for people to just be people. Courts should be held to a higher standard, but that doesn't mean regular people should be expected to act like courts.


In today's climate I can infer from OpenAI's past and present actions what their intent was.


Today's climate, where Silicon Valley billionaires occasionally have to endure some mild pushback.


Hiring a different voice artist might show that they didn't use deepfake technology to imitate Johansson’s voice, but it absolutely doesn't prove that the voice isn't an imitation and one for which they would have been liable under existing law.


voice imitation is illegal?


Discussed a lot in the last thread on the issue, but, yes, imitation of celebrity voices for commercial purposes can violate the right of publicity (also known as the right of personality) in many US jurisdictions, including California (this is a matter of state statute and/or common law, not federal law).


Copying likeness can be.


Not for commercial purposes and not in California. Otherwise you’d just hire an impersonator and never pay for celebrity endorsement.


Worth pointing out that this doesn't really do the same thing. Some percentage of the time it will be detected by some of your customers and it won't work the same for them.

It's not just about seeing a celeb face. There's more going on, as was discovered in the '80s and '90s, I think (that's when it became a big thing, at least).

Built into it is the implication that the celeb likes the product. Additionally, products and branding have become a part of our personalities (IMO bad trend). So in a way you can BE like the celeb.

You will be just as cool as Michael Jackson if you just hold a Pepsi in your hand. Sure, none of us really believes this at face value, but it's implied and hinted at.


That's what I said, copying likeness can be illegal.


And what's wrong with hiring an impersonator?


If not for parody, it's normally a violation of likeness rights.


Are those rights only celebrities enjoy?


Kind of. Not in the sense that there is a binary "celebrity/not-celebrity" divide, but the interest protected, and the potential damages from violating it, are associated with the kind of public commercial exposure that comes with celebrity.


No, they extend to anybody.


So if you hire a voice actor that naturally sounds like a celebrity then you should be safe from any legal action. Otherwise they could sue each other.


Look up the concept of mens rea - intent matters. If you hire somebody because you want a look-alike/voice-alike, that is legally distinct from hiring somebody who happens to have a certain voice.

https://en.wikipedia.org/wiki/Mens_rea


Not exactly.


Exactly. When you cover an artist on your show, you have to get permission first.

I am not a copyright expert, but I do own a few Weird Al albums and he is very diligent about obtaining permissions from artists he is covering.


> Weird Al albums and he is very diligent about obtaining permissions from artists he is covering.

IIUC, Weird Al probably doesn't need permission for his parodies, legally speaking. He does get it anyway.


> I am not a copyright expert

It’s not a copyright issue, it’s a right of publicity issue, a completely separate legal issue (conceptually, more trademark-like than copyright-like, but distinct from either.)


In the movie “Her”, there is a smart AI assistant that you can talk to and is friendly and has good natural language.

How is it not obvious to everybody that this is what Sam and the OpenAI people are referencing with their tweets?

Scarlett Johansson's voice is certainly pleasing, but it feels misplaced to assume that the voice is the important thing here and not everything else.


> How is it not obvious to everybody that this is what Sam and the OpenAI people are referencing with their tweets?

Because according to Johansson, Sam also approached her twice in an attempt to license her voice legitimately. That's not the sort of thing that happens when you make an incidental reference to a film; he wanted that voice, so badly that he tried making a second offer (and got refused).

> but it feels misplaced to assume that the voice is the important thing here

We're just as surprised as you are; honestly, Sam should have known better than to drill down and insist on something as stupid as copying a misanthropic psychological horror film.


I know it's kind of nitpicking, but it was never intended to be misanthropic or a psychological horror. It's a sci-fi romantic drama; I don't really see where the misanthropy or psychological horror aspects are?


> I know it's kind of nitpicking, but it was never intended to be misanthropic or a psychological horror.

Then it just became that by mistake.

> It's a sci-fi romantic drama;

If that was the intention, it failed rather convincingly IMO.

> I don't really see where the misanthropy or psychological horror aspects are?

All over.


I might be conflating it with Black Mirror at this point; it's been a while since I saw either (and I wasn't fond of either when they were released).

Still, I think Sam ought to see the writing on the wall regardless when people come out en masse to say "this is kinda creepy, dude." It feels pretty obvious to me that his behavior is verging on obsessive and definitely toeing the line of technology fetishism.

That's just my impression, though ¯\_(ツ)_/¯


So you don’t even know what any of this is referencing?

HER is not a horror movie. It’s about a very helpful AI assistant, and in the movie people fall in love with it.

It would be really cool to have ScarJo voice the AI, which is obviously why she was asked. She didn’t want to, so they just hired another, different person to do it.

ScarJo's voice is not some super unique cornerstone of the product (obviously, since they didn't even use it), and it seems like having her associated with it would be for marketing purposes. Imagine her presenting at their product demo, and talking to "herself".

All of this feels like people are really reaching for reasons to hate Sama. In your case you aren’t even familiar with the movie people are talking about.


I find the whole thing just bizarre. I agree; the only thing I'm confused about is why OpenAI ever contacted Scarlett to begin with. Did they want her to represent the product? The voice is really just a generic flirtatious woman's; this wouldn't have been a problem if they'd never contacted her. We need a proper explanation as to why they contacted her in the first place.


The whole thing was absurd. The voice doesn't even sound like her.


It was an opportunity for the representatives of soon-to-be-extinct professions to bash the technology that spells their doom. The inconvenient fact that the AI didn't actually sound like the actress didn't matter.


That may be your opinion, but we have plenty of social media conversations from when the voice first came out 7 months ago, before any of this was a controversy, and the only two names consistently brought up as who the voice reminds people of are Scarlett Johansson and Rashida Jones.


Maybe because they are famous, every voice sounds like them? Regardless, Johansson has a characteristic hoarseness in her voice which is not present in the robot voice.


Anyone who thought OpenAI would just take SJ's voice isn't thinking things through, but anyone who doesn't see the request as a courtesy and the follow-up as a "get behind this or get nothing" is blind. This was a strong-arm move; SJ's consent was always optional to them. It would have been good press. Now that it's the opposite, they'll still get the voice they wanted, and everyone will forget what a shitty move this was within a month. To me, the worst part is how this makes Sam Altman out to be a complete asshole with no sympathy. SJ made a movie; she wanted to leave it at that. Sam Altman forced her to represent a product, forever. That's fucked up.


> Sam Altman forced her to represent a product, forever. That’s fucked up.

He never mentioned her by name, he pointed out how the AI demo OpenAI created is similar to the _character_ from Her.

If you don't want to be linked to a character forever, maybe don't act? It's just such a ridiculous statement to make. Like making an argument that no one should talk about Obi Wan because Alec Guinness didn't like being recognized from Star Wars.


It’s my understanding he directly asked twice: once early on and then again right before the unveiling.

And your example is absurd: a fictional character in a fantasy vs. a fictional product we're on the cusp of creating. I've never watched Her, so I can't comment more.


> he pointed out how the AI demo OpenAI created is similar to the _character_ from Her.

Are you a mind-reader, or how do you know that that was all he intended?


> He never mentioned her by name, he pointed out how the AI demo OpenAI created is similar to the _character_ from Her.

And yet Karpathy (a co-founder of OpenAI) did mention her by name as "the killer app of LLMs", less than 24 hours after the announcement.

> The killer app of LLMs is Scarlett Johansson.

https://x.com/karpathy/status/1790373216537502106


Karpathy no longer works at OpenAI. He hasn't since February.


Huge discussion 2 days ago [0](1497 points, 1001 comments), related [1](141 points, 191 comments)

[0]: https://news.ycombinator.com/item?id=40421225

[1]: https://news.ycombinator.com/item?id=40414249


It seems increasingly difficult for common people to protect their voices, especially when even Scarlett Johansson can't manage it. As a part-time voice actor with a unique voice, I'm concerned about what I should do if my voice is used without permission and the company denies it. How can I protect myself in such a situation?


I think we'll find voices are not that unique.

A voice can be zero-shot encoded into a vector of a few hundred KB: timbre, prosody, lots of characteristics. That's less information than a fingerprint. And more importantly, it's something you can dial in with a few knobs simply by listening by ear.

It's why your brain can easily hear things in other people's voices. They're not hard signals to reproduce. Some people with flexible vocal ranges can even impersonate others quite easily.

I'm sure most people have gotten, "you sound like X" once or twice. Not unlike the "you look like Y" comments.

Voices really aren't that fingerprint-y.
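
To make that concrete, here is a minimal sketch of the embedding idea, assuming the open-source resemblyzer library (any speaker-embedding model would do, and the .wav file names are placeholders):

    # Sketch only: compare two recordings via pretrained speaker embeddings.
    # Assumes `pip install resemblyzer`; the file names below are placeholders.
    import numpy as np
    from resemblyzer import VoiceEncoder, preprocess_wav

    encoder = VoiceEncoder()  # small pretrained speaker-embedding model

    emb_a = encoder.embed_utterance(preprocess_wav("speaker_a.wav"))
    emb_b = encoder.embed_utterance(preprocess_wav("speaker_b.wav"))

    # Cosine similarity: closer to 1.0 the more alike the two voices are.
    sim = float(np.dot(emb_a, emb_b) / (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))
    print(f"cosine similarity: {sim:.3f}")

The point being that "how alike do these two voices sound" collapses into one number over a small fixed-size vector, which is consistent with the "not very fingerprint-y" intuition.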

If we really want to split hairs and argue from biology, who "owns" the voice of a set of identical twins?


I agree on the technical aspect.

But still, there are some voices that are just highly associated with one person in everyone's mind, like David Attenborough's. For example, if I heard Attenborough speaking my local train announcements, but it was actually an impersonator, I think I would feel like the company was taking advantage of Attenborough's voice. I.e., they would be using the fact that everyone knows this voice to their advantage, without actually paying Attenborough.

While voices aren't technically that unique, when linked to certain situations or when heard by enough people, they become unique in that context. I'm sure no one will care about Attenborough's voice 100 years from now.

Or hm, maybe AI voice tools will keep his voice alive forever in Planet Earth spinoffs, just like Sinatra has been resurrected for mashups.


I don't have helpful advice for what you asked (spend the money to get a legal expert's opinion would be my advice), but if I was a voice actor, I would see three paths:

1. Push forward legislation/regulation/lawsuits/public opinion via whatever method is available, probably unions or other collective power.

2. Embrace the technology. Maybe build a voice model of yourself, sell the license affordably and broadly to those that want to take advantage of the convenience and scalability (as in number of phrases) of voice AI but don't want the mess of wading into unsettled legal territory. Or learn what voice AI is good at and what it is bad at and find your niche. Survive by being at the cutting edge of this new world, setting the standards and being knowledgable.

3. Walk away from an industry that is either dying or about to become unrecognizable.

The genie's not going back in the bottle.


This whole industry is built on top of ripped-off content, appropriated from many sources without compensation. A few big lawsuits and things could take an unpredictable turn.


What are the unique aspects of a sound? A lot of people look and sound stunningly alike.

As a recent example, in Baldur's Gate 3 Andrew Wincott voiced Raphael, an NPC antagonist who to my untrained ear sounded exactly like Charles Dance, and the character model bore more than a passing resemblance to Mr. Dance as well.

It was not a Charles Dance carbon copy but all aspects of the character were strongly aligned with him.

I'm wondering where the line is drawn between style and the personal aspects of one's craft.

Some of this is probably part of personal perception.


Wincott and Dance are both British actors who began their careers on stage, so they have similar accents, cadences, and vocal mannerisms common to stage actors. For example, both of them speak like Patrick Stewart, another English actor who began his career on stage. But otherwise they all clearly have very different voices: different timbres, different vocal fry, and only one of them (Dance) can sing well, with a surprisingly large vocal range (see his performance as the Phantom in Phantom of the Opera).

In this case, the actress OpenAI selected was clearly chosen for her similarity to SJ. And that by itself would have been okay, because the actress is speaking in her natural voice, and SJ doesn't have a monopoly on voice acting...but OpenAI went further, and had the unknown[1] actress base her inflections, cadence, and mannerisms on SJ's performance in the movie Her. And Altman even tweeted the movie's name to advertise the connection.

The problem is that there is well-settled case law stretching back over several decades that makes this a slam-dunk case for SJ, because it doesn't matter that OpenAI didn't "steal" her voice, they stole her likeness.[2] It wasn't just some unknown actress speaking in her own voice; it was an actress with a voice similar to SJ's, given lines and direction by OpenAI with the clear intent of mimicking SJ's voice performance in one of her more famous roles.

[1] There is a very short list of a few actresses who both sound like SJ and do voice-over work circulating around Hollywood, so a lot of people have a pretty good idea of who it is, but nobody will identify the actress unless she identifies herself, out of solidarity.

[2] Likeness rights are quite strong in the U.S. They're even stronger in Europe.


Every single thing in your second paragraph is directly contradicted by the article at hand, yet you state them as if they were established facts, as opposed to things you just made up.


No, I read the article.

WaPo's reporting states that the individual in charge of the interaction, Jang, modeled it after Hollywood movies, and worked with a film director specifically to accomplish this goal. And the executive responsible for the artistic decisions, CTO Murati, was conveniently not made available for WaPo to interview.

OpenAI has no credibility here, given its extensive history of dissembling as a company. If Her and SJ weren't the driving inspiration for the Sky voice, they would have made Murati available to explicitly refute those claims. Her absence speaks volumes.

And OpenAI dropping Sky immediately speaks even louder. It means that somewhere there is a smoking gun that would destroy them in court. [Edit: it turns out the smoking guns were already public: in addition to the CEO's Her tweet, his co-founder Karpathy explicitly linked the voice product to SJ. Game. Set. Match.]


Try loading a random voice file done by real (voice) actors into Audacity, switch the view to spectrogram mode, drag down to expand, and compare it to yours. A professionally done voice should look like neatly arranged salmon slices; yours will look like PCIe eye diagrams.

You can also compare multiple voice files recorded by similar-sounding but different individuals; they rarely look similar on spectrograms.
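
For anyone without Audacity handy, roughly the same visual comparison can be done in Python; a sketch assuming librosa and matplotlib are installed (the file names are placeholders):

    # Sketch only: plot dB spectrograms of two recordings for visual comparison.
    import numpy as np
    import librosa
    import librosa.display
    import matplotlib.pyplot as plt

    fig, axes = plt.subplots(2, 1, figsize=(10, 6))
    for ax, path in zip(axes, ["professional_take.wav", "my_take.wav"]):  # placeholders
        y, sr = librosa.load(path, sr=None)  # keep the native sample rate
        db = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
        librosa.display.specshow(db, sr=sr, x_axis="time", y_axis="log", ax=ax)
        ax.set_title(path)
    plt.tight_layout()
    plt.show()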


Sure, except literally no one actually does this. You listen to a voice and it sounds similar in your head? That's who you picture when you hear it. Unless you're a robot I guess.


The point is that human voices are technically and verifiably unique, which is tangential to (or perhaps at odds with) the average person's perception.


I don't see how that's relevant here - court cases about situations like these are decided on the criterion of "if you show this to an average person on the street, would they be able to tell the difference", not "if you load this up in a specialized piece of software and look at the spectrogram, is there a difference".


I think that will be a very clever and useful defense against CCTV footage and DNA analysis reports! Best legal advice ever.


I don't understand why you are being sarcastic right now. Trademark cases are always decided on "if a person was shown this logo/song/whatever could they mistake it for the trademarked property of another company", not "well if you load it up in Paint you can see that some pixels around the edges are different so it's technically not the same logo your honour!".


so... your position is now in favor of SJ? I don't see consistency in your comments other than that the aim is to downplay uniqueness of voice to justify OAI's actions after the fact.


No, my position hasn't changed - the average person on the street might think this voice sounds like SJ, but since SJ doesn't own exclusive rights to anyone else in the world sounding like her, I don't think she has legal ground to stand on, unless OpenAI pretended it was actually her. But I know for certain that the case will not be decided on spectrograms of the voice.


Who's Charles Dance? :)

As for Scarlett Johansson, I remember her from the Ghost in the Shell live-action movie controversy. Not fondly.


> with a unique voice

Is your voice truly unique out of the 8 billion out there in the world? Nobody could plausibly pass as you?


There are only 330M Americans. Just having American throat-development patterns narrows you down to a group of roughly 4% of the world's population, and it only goes down from there - e.g., the PNW has only 13M people total, half that by gender, which puts someone from there in a group of about 0.08% of the world.

You might think voice is something you're born with. It isn't entirely; it partially comes from your languages and background. So the random chance of someone half a world away literally sounding like you by DNA alone is quite low.
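
A quick back-of-the-envelope check on those figures, as a tiny sketch assuming roughly 8 billion people worldwide:

    # Rough sanity check of the group sizes mentioned above (all figures approximate).
    world = 8_000_000_000
    us = 330_000_000
    pnw = 13_000_000

    print(f"US share of world population: {us / world:.1%}")           # ~4.1%
    print(f"PNW, one gender, share of world: {(pnw / 2) / world:.2%}")  # ~0.08%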


You're talking about a literally identical voice due to throat development, cultural background, etc. - which is obviously technically true, but I imagine the number of people who sound 99% like you (where a casual listener can't tell the difference) must be quite large.

Case in point - my wife has twin brothers. Even though I've been interacting with them for over 10 years now, they sound exactly the same to me. If I close my eyes, there is zero chance I could tell them apart by voice alone. I know, it's an anecdote - but while I'm sure you could tell them apart by some really small thing that they do, to someone who isn't actively looking for those cues they are, for all intents and purposes, the same.


1) Your income depends on a physical quality you have.

2) Your ability to mine cash off this physical quality depends on the inability of this quality to be reproduced.

3) This quality can now be reproduced.

I would think very hard and very long about staying in this particular business. Personally I think there is still plenty of work left because not everyone is happy with going full sci-fi dystopia, but it will be niche and scrappy.

"I have unique characteristics that make me an excellent programmer. I earn money by tweaking for-loops. Recently, GPT is being able to tweak for-loops better, faster and more cheaply than I can. How can I protect myself in case companies decide to replicate my unique abilities?"


There's nothing that can be done technically. A near-perfect voice-changing model can be built from 3-5 minutes of conversation on top of a base model, if all the user wants is a voice indistinguishable from yours.

But, IMO, the value of mud sludge on a table that's indistinguishable from a sandwich is tiny. Fake Chanels are 10^2-10^5 times cheaper than the real thing no matter how close they are. Don't listen to people begging you to join the counterfeiting ring; they don't make much anyway.


I don't think this will be a concern for long. Either the tech isn't good enough and lacks emotive nuance to the point where a human is still preferred, or it is good enough and there is no point in basing it on a human actor in the first place versus using an original, wholly fabricated voice or appearance.

If the tech actually works well enough to stand in for humans, I think we will very quickly see recording real humans in fictional pieces as old fashioned.


>How can I protect myself in such a situation?

Choose a different career, like maybe public-opinion influencer.


It's a sign of the times when OpenAI tacitly encourages forming a relationship with a computer system, and controversy erupts over IP rights for voice actors.

Higher-minded discussions certainly take place on a range of issues in AI. Can I rely on my AI to tell the truth? Is it ethical to use an AI for military applications? How do I make sure my AI doesn't turn into Archie Bunker? Even (IMO) fringe issues like whether it will exterminate humanity.

Those seem like rather abstract concerns, though. It would seem that your average person mostly cares about who's getting paid.


So in that sense it's a good thing that "Open" AI are just as venal and stupid on small shit like this as on those bigger and more serious issues. So the great unwashed, with their limited attention spans (and, TBF, realistically much less insight on the big underlying issues), can see for themselves, in terms they can grasp, what a-holes they are.

ETA: Seems to work on non-IT mundanes, at least. Not on tech-bro fanbois, of course, which seem to be the only kind of people defending "O"AI / Altman here.


OpenAI is running laps around the media and everyone is eating it up

Notice that every month or so they have a few new "scandals" with high intrigue but noticeably iron-clad legal and political "cover your ass" investments/politicking.

Meanwhile they are getting deep into bed with Apple, building an ad market (the worst possible scenario for users, IMO), and generally cementing all of these commercial inroads for revenue.

I’d be impressed if it weren’t so destructive and psychotic

No press is bad press


This isn't true at all. It can seem true for a surprisingly long time, but sometimes bad press is bad press.

Tesla's "no press is bad press" strategy recently came home to roost. No matter how you slice it, alienating your core constituency of customers is bad business.

This "bad" press certainly might not actually be bad. But it just isn't true that all press is good.

It's a nice quip, but in practice, the actual details matter.


I can’t help but think this is a matter of bias. I think the voice sounds a bit like Scarlett Johansson. But I’ve been told by two different people that I sound like Charlie Sheen… when I’m on the phone.

This feels a bit to me like confirmation bias: “OpenAI is selling an AI voice tool just like in that movie! Surely that’s what they’re going for!”

That said, the fact that they contacted her twice about it does feel awfully suspicious


As soon as Scarlett Johansson said "no" (which I assume is true), OpenAI needed to go in very different directions to avoid anything that might look bad. It doesn't matter if they used a sound-alike or built a model from actual source material; anything would look bad. It looks especially bad when the company insists chatbot outputs are synthesized and not copyright infringement.


"Looking bad" is not illegal. If OpenAI didn't do anything illegal, saying it shouldn't do something because it might "look bad" is unfounded. In general, listening to what people say and avoiding things that might "look bad" to someone is a bad life strategy.


They may or may not have done anything illegal. I think the facts in this article help their legal case significantly!

It is a bad life strategy as an individual to worry too much about the judgement of others.

But marketing and PR are important parts of running a business. It may be annoying that your customers care about things that you see as silly PR stuff, but it still matters to the business.


> It may be annoying that your customers care about things that you see as silly PR stuff, but it still matters to the business.

Well, it was the business that engaged in the silly PR stuff in the first place, so... They knew it would matter to the business; they just hoped it would go differently. That's why they did it in the first place.


Some people die due to hunger, some people die due to wars, some people die due to curable diseases, but somehow, what matters most for some folks is whether OpenAI copied the voice of Scarlett Johansson.

The human mind is a curious place.


Some of us can hold more than one thought in our head at once.


There is no implication anywhere that this is "what matters most" to anyone.


Is there even a general consensus on likeness protection? I'm not gonna defend OAI ever now, but tbh the concept of likeness feels too stretched to me. If one person naturally looks, sounds, or behaves like another, do they violate that person's rights? How can likeness be illegal if it's not a direct theft of their work? Are photos of movie stars illegal to print? Where does likeness end? Is likeness even yours, or does it live in people's minds?

I can make enough arguments and counterarguments, but this whole thing doesn't sound convincing. If I want to change my voice to sound like Michael Jackson and walk like him, it's no one's business if I do that, even publicly.

I understand the concerns of the "look and sound" models here, but reality changes with time; thick ice becomes thin, and you have to adapt too. Progress isn't responsible for preserving everyone's job, especially one built on such an ephemeral concept. That only worked for a while.


Sama has gone silent. It’s plausible they’re in negotiations or settlement talks with SJ. But he doesn’t often go silent. Even when he’s losing his job.


Convenient story for a company that's proven difficult to trust at every possible turn.


I thought this was a right-of-publicity case where the "her" tweet basically misled people into believing the voice is Scarlett Johansson's, who is against the use of AI tech like many other Hollywood people?


The issue is Sam even asking SJ. And then sending her another ask before the release.

If they just released, people would be like hey it’s like “Her”.

Sky doesn’t sound like SJ. It’s a different voice.

Sam didn’t have to tweet “her”.

The problem with CEOs is they can't keep their mouths shut. Same with Elon. They have a God complex and need to be the center of attention.

If Elon just kept it to science memes, Tesla would be a much larger company.

But they can’t keep their heads down and execute. They gotta be out there with their megaphones alienating the very crowd that got them there.

At this point, I feel OpenAI would be a more successful company without Sam.


"The actors should be nonunion."

Five small, unremarked-upon words that illustrate OpenAI's positioning perfectly.


Why did OpenAI comply with Johansson’s Cease and Desist letter and take the voice down? If they legit hired a different actress their response should have been “Go ahead and sue us”.


They may have taken it down while they did an internal investigation to make sure. Or, regardless of their prospects of winning it, they may have not wanted to endure the cost or bad publicity a lawsuit would bring them. Neither option seems crazy to me.


IANAL, but I suspect it was just to show that they complied with the request, so that if the law rules against them, they can minimize the damages. The less time that voice is available, the less of a case Johansson's side has to try to extract money out of them.

Also, lawsuits are expensive and go on forever. I think sometimes it's cheaper and easier to just roll your eyes and take the L, even if you're in the right (at least in the short term).


Tom Waits provides the template here. He successfully sued Frito-Lay for voice impersonation in a Doritos ad. The major similarity: Waits, like Johansson, declined an offer to use his voice in advertising.


And so OpenAI can't legally... use any adult white female's voice in their product?

I'm not a lawyer, but doesn't there have to actually be a reasonable voice resemblance to conclude that there's impersonation? In a side-by-side "taste test" I don't think these two voices are very similar.


> And so OpenAI can't legally... use any adult white female's voice in their product?

No, they can't legally use an adult white female's voice that might be mistaken for Johansson's in their product and imply it has anything to do with Johansson's performance in the movie _Her_.

So... Smart tweet there, Sam. Really smart.

(I don't quite get why the Techbro - venture capitalist sphere is so enamored with this guy. From all I've seen reported about him, he seems not only a grade-A arsehole, but dumb as a fucking brick. But maybe they identify with that.)


It's always the least significant thing that everyone cares the most about. Because people are stupid.

This one case is a pretty grey area. But what is not a grey area is voice-cloning tools like ElevenLabs, which can and do clone voices very well.

Forget about stealing one person's voice. Or a lot of people's voices. This technology will soon be able to replace everyone's skillset. Give it 2-5 years.

This type of reaction is how we know that humans will not maintain control of the planet for much longer.


> This type of reaction is how we know that humans will not maintain control of the planet for much longer.

But who will? There will always be some people who own the technology; they will hold control over other people with the help of the technology, but not the technology itself, which has no intent whatsoever. But I agree that people will be rendered powerless or even redundant, which calls their existence into question.


Eventually things will be run by new types of AI lifeforms.


Why don't I get to choose the voice I interact with? More and more it feels like "AI" is gonna be a 1%er, gate-kept, corporate-curated "experience" with significant guard-rails and fences and walls and moats and signs telling me to keep off the grass.

The wealthy and powerful will again monopolize this power for their own benefit despite AI being the product of the sum of human technological civilization.


Technically? Maybe not. In spirit? Sure as heck.


This could be confirmation bias.

Here's an alternative explanation that fits the same facts. Sam tweeted "her" not due to the voice, but due to the existence of a conversational system that resembled the system in the movie. This primed people to hear Scarlett Johansson's voice in a generically cheerful female voice actor who was contracted over half a year ago. Scarlett's lawyers then encouraged her to write a public letter, helping her with the wording, in order to steer the public narrative and place pressure on OpenAI to settle with her financially out of court.


It's an alternative explanation all right


> One thing the artificial intelligence company didn’t request, according to interviews with multiple people involved in the process and documents shared by OpenAI in response to questions from The Washington Post: a clone of actress Scarlett Johansson.

OpenAI found records showing they did nothing wrong, in response to questions from WaPo.


I think it's really problematic that the government is protecting voice actors' careers. It's like disallowing cars on the roads to protect horse carriages. Clearly, with the new technology, a whole economic sector is gone and irrelevant overnight. Now amateurs and small projects can afford to add good-sounding voices to their creations. This is good news in the end.

The same goes for actors and their likenesses... just stop protecting ultra-wealthy celebrities. They'll be a bit poorer, but they're going to be okay. You're just holding back progress.

I can imagine that in a decade some place like China, which doesn't care about protecting celebrities, will have movies with dozens of Tom Cruises, Arnolds, and Johanssons, and will just be pumping out better-quality content at affordable budgets. Young talented directors won't be hamstrung by these legal roadblocks.


That's a pretty generous take on the situation. Sam Altman isn't some Robin Hood character taking from the rich to give to the poor. If AI companies can keep operating with impunity, taking as much data as they want with no compensation for the creators or consequences for infringement, that's not good.

I agree that the technology is great, and it will empower small creators, but I'm also worried about the cowboy behaviour of all these tech billionaires.


In this context they aren't "creators", because they don't create anything. These actors are not being compensated because they haven't actually performed any additional work or done any acting.

If you record my voice at a conference and then create a synthetic replica... why would I care? You didn't make me do any additional work or anything.


So if someone created a deep fake porn video of you, that wouldn't bother you either? Because, after all, you didn't do any work or anything.


Look, we're all just wearing these meat bags for a little while. I personally don't know anyone who would care if I was deep-faked in a porn; it wouldn't have any professional consequences (why would my boss, or friends for that matter, watch it in the first place?), and ultimately it's as fictitious as Lord of the Rings. Really, people get too riled up about salacious bits in the first place, while we're on the subject.


So what if someone made a digital twin out of you, and started using you in other work? Suddenly you're being used in commercials, political campaigns, spam, or whatever.

I'm not buying that people here are "fine" with this. This is one of those things people might be fine with, until they find themselves in that exact spot.

Now, what kind of people will find themselves in that spot? Celebrities, obviously.

It's one thing to be used in material that will defame your character (spam, fraud, porn, whatever); it's another to be used in material that will potentially take away your livelihood.

If someone clones, say, Tom Cruise and makes a movie with his digital twin, he sure as shit is entitled to royalties for that. People go to see the movie because they think it is Tom Cruise, not because it's some generic AI avatar of him.


> I'm not buying that people here are "fine" with this. This is one of those things people might be fine with, until they find themselves in that exact spot.

I'm gonna suggest that people who are blase about this issue are comfortable in the knowledge that it will never affect them. HN contributors might have 99+ problems, but being lusted over by the internet at large isn't one of them.


> Suddenly you're being used in commercials, political campaigns, spam, or whatever.

That's a deepfake problem, not a porn problem. The "deepfake == porn" characterization is not helpful if those are the real problems.


Sorry, but I've been told almost 12x12 times (or more) that "you look familiar, like so-and-so" (where so-and-so has ranged from local friends to celebrities). I don't make any effort to look like anybody. I don't care about the poor celebrities; they make the choice to put themselves in the public eye, and frankly we could stand to have fewer of them. If someone defames my character, I can take legal action if I feel inclined to do so, and that's okay; that's why we have a system of law.


> I can take legal action if I felt inclined to do so, and that's okay, that's why we have a system of law.

So you're basically saying that the legal system should have precedent against deepfakes, so that you would be able to argue on some basis against deepfakes made of you.

The people now getting concerned about this are setting those precedents, so that when shit hits the fan in your life (it probably won't), you will have an easier path in court.


I'm most worried about bad precedent being set, yes. I'm far more likely to go into voice acting and be negatively affected by these kinds of rulings than to be someone who needs them to protect my likeness.


It's not really an analogous situation at all

In the case of a pornographic video there is no issue if it's clear from the context or content that it isn't actually me doing what's in the video

When you talk to ChatGPT, I don't actually think Scarlett Johansson is speaking to me

If I make a fake phone call recording with her synthetic voice, claim it's real, and it somehow hurts her, then that's an issue - but that's a different legal matter entirely.


Not the person you replied to, but IMO it depends.

My reaction would never be "we must make it so people can't do deepfakes anymore". That would cause people to stop using it for positive/benign things as well. If someone is spreading deepfake porn of someone, and you could make the case that they are doing so in order to harm that person's reputation, then legal action would make sense, I think.


> My reaction would never be "we must make it so people can't do deepfakes anymore". That would cause people to stop using it for positive/benign things as well.

Are there actually any "benign" uses for deepfakes?


Making videos of politicians doing goofy things. Say, Donald Trump scoring a goal in the UEFA Champions League. Or 'deepfakes' involving their pets doing something funny.


Said #9 of those Chinese Tom Cruise clones... ;-)

(No no, you're perfectly right [except perhaps about "the technology is great, and it will empower small creators"], but yagotta admit, your example in juxtaposition with your user id is funny.)


The vast majority of voice actors are piss-poor, not ultra-wealthy celebrities. The ultra-wealthy celebrities just happen to be the only ones who can legally defend themselves and create a media fuss.

You're basically suggesting that it's okay to copy anyone's voice and appearance without ever giving them compensation and without regard to personality rights. That's insane. Even for someone who thinks this should be allowed in principle (I certainly don't think so), there would need to be strict safeguards. Or, do you want your person and voice to appear in a commercial for <insert organization, product, or cause you don't support at all and despise>?


As long as it's clear it's not actually me and I'm not personally endorsing the product, then what is the problem? Here you are talking to OpenAI's system, and it's clear Scarlett isn't personally answering you and the answers don't represent her or her views.


That's not what you suggested, though. You said that young talented directors will make movies with an AI-generated Tom Cruise anyway and insinuated that this is what we should allow. That's the opposite of "...being clear that it's not me." By the way, the law already allows all this when it's clear that no particular person is imitated. We're talking about the cases when it's not clear.

Or do you suggest to have different laws for celebrities and poor actors?


I don't really understand what's confusing..

If a director makes a movie with an AI Tom Cruise, it's not ambiguous whether the real Tom Cruise participated or not. The goal is not to fool everyone into thinking he was in it (because that would be trivially denied by the real man). There is a list of credits at the end if you're somehow confused. So if the movie is about drowning Scientologists, you'll know it's not supported by the real Tom Cruise.

It's similar to if you were to paint a picture of him sodomizing a goat. You don't immediately think "damn, he's a real sick bastard". You just assume it's a fake thing created by the creator/director. Nobody is hurt (well, maybe his feelings a bit).

If you do make a thing that confuses people and makes them think it's the real Tom Cruise, and he's somehow hurt by this, then that's kinda messed up and should be illegal.

In this case with the AI chatbot it's not confusing. I don't think Scarlett is on the other end of the line. Everyone knows it's not really her. It just sounds like her


There was no confusion on my part, you're confirming that you're propagating what I thought you were. I indeed think that's insane and plain immoral. It seems we have very different views on personality rights. I suppose we have to agree to disagree on that one.


Well... welcome to the new world, old man/woman.

Regardless of what you and I think, this just seems inevitable, and we're all just going to have to get used to it. Just as you can't stop people from drawing and painting other people, you won't be able to stop people from using AI to render their image/voice. It's all getting cheaper and more accessible.


It's not inevitable. There are already powerful laws against this where I live, and I have no doubt that the EU will sharpen them additionally, too. It's already being discussed. Illicit AI copies of people can be treated pretty much the same as other counterfeit goods - physical goods get confiscated, servers are shut down, people who do it on a commercial scale get arrested, etc. The enforcement is not more complicated than what the movie and software industry has already been doing for decades. The video game industry has already set the precedents.

I mean, honestly, I think it's kind of bizarre that you think a Chinese movie maker could make a film with a digitally cloned Tom Cruise in it and get away with it. Maybe in China, but not in the rest of the world.


Lord.. that's like the exact opposite of the world I want to live in. More regulations, more controls on the internet. More tracking and DRM. All to protect the rich bastards and entrenched interest. Acting and voice acting have the potential to become dead professions. This is great. Embrace it and move on.

It just goes against the whole cyberpunk future we grew up dreaming about. Against A Declaration of the Independence of Cyberspace and all that. I think in the end the bureaucrats won't win, though. It just slows down the inevitable.

As always it'll probably start with porn. There will be porn of everyone and it'll be shared. Then that will become normalized. Then it'll spread into other, more socially acceptable areas. Maybe first under the guise of "parody", and then it'll just become normal. Just as streaming and pirating have made music sales irrelevant. Now musicians make money with concerts... and somehow the famous ones are still filthy rich. Didn't seem to hurt them one bit.

I'm sure the Tom Cruises of the future can still go to conventions and give speeches at private events.


> Lord.. that's like the exact opposite of the world I want to live in.

So the world you want to live in is a Neuromancer dystopia, check.

> All to protect the rich bastards and entrenched interest.

Yeah, because that's not at all what the "AI" tech bros you want to protect are, right?


> As long as it's clear it's not actually me and I'm not personally endorsing the product then what is the problem?

At least the second time I've seen you making that argument on this page. So however long it was between your (at least) two comments to this effect, apparently it wasn't long enough for you to realise that the whole purpose of deepfake technology is to make it NOT "clear" that it's not actually you and you're not personally endorsing the product.


> like if they disallowed cars on the roads to protect horse carriages

What? Nobody is banning OpenAI from licensing voices. The censure is on, at the very least, using an unlicensed likeness to promote their new products without compensation. (Assuming Sky truly is a clean-room product.)

Likeness just became a tradeable product. That wasn't true before. The better analogy is in recognising mineral rights, including crude oil, after the utility of it was recognised and traded on [1].

> ultra wealthy celebrities

We have a hundred-millionaire atop a multi-billion-dollar industry fighting a billionaire atop a multi-billion-dollar company. Nobody gets to cry poverty.

> can imagine in a decade some place like China which doesn't care about protecting celebrities

Would positively love to see Altman try to pull this stunt with Xi Jinping's voice.

[1] https://info.courthousedirect.com/blog/history-of-mineral-ri...


"Likeness just became a tradeable product. That wasn't true before."

Only because the government is making it that way. It's not an inevitability. It's a shortsighted move that doesn't add any value to society. It only serves to make celebrities even more wealthy


> Only because the government is making it that way. It's not an inevitability.

Likeness wasn’t mass producible. It is now. That isn’t because of government but technology.

> doesn't add any value to society

According to whom? Certainly not Johansson or OpenAI.

> only serves to make celebrities even more wealthy

You don't see how it hurts an entry-level actor when their first short skit becomes a substitute for a life's work?


What makes a likeness a likeness?

A measure of similarity? Then I demand that all people who sound like me license their voice from me.

A claim that the voice originates from a certain person? Then you don't need any licensing in this case.


> What makes a likeness a likeness?

I'm not sure. Precedented personality rights would be a good place to start [1].

I'd argue for a higher standard of evidence for human-produced voices; Midler v. Ford Motor Co. seems as good as any [2]. But a lower burden for synthesised voices, given the difficulty in proving intent and the mass producibility of them.

> A claim that the voice originates from a certain person? Then you don't need any licensing in this case

Altman basically claimed as much by tweeting about Her in its context. At that point, he is using her fame to market his products without her permission.

[1] https://en.wikipedia.org/wiki/Personality_rights

[2] https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


Oh, I wasn't aware of the last bit (brokenwall). Now we're entering a gray area, depending on what exactly was said and how much of a contribution the voice actors made to their characters.


I'd like to see him go further again,

deliver all discussions on Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era via an animated Pooh Bear with the voice from the movie.

https://www.theregister.com/2024/05/23/china_xi_jinping_chat...


It boils down to dollars and cents.

Why should the creative sources (artists, actors, writers, etc.) be left out of the cut, while the tech companies are reaping the rewards?

"But those stars are rich, they'll survive."

Yeah, maybe - but the creative world is 0.001% wealthy people, with the rest being people who barely get by and could earn more money doing pretty much anything else.

I get the argument about copyright protections stifling progress, but it bugs me something fierce that people here are essentially saying it's OK for the AI/ML creators to become filthy rich, while the people they are ripping off should just do something else.


They won't become filthy rich based on any one person's voice b/c anyone else can create a synthetic replica as well (unless they have some secret training data or something). It becomes a commodity, as free as the air. Voice acting ceases to be a real career, but in exchange it becomes accessible to everyone for pennies.


And where does this stop?

Say you wake up one day, and find out some AI copy made a digital twin/clone out of you. Your voice, your looks, your style of writing, your style of speaking. Everything that is you, they've cloned.

And then they use a digital you in commercials, movies, or whatever. And, of course, you're not entitled to a single cent - because it's not you, just something that looks, sounds, and acts like you. Hell, no matter how much you hate the use, there's nothing that can be done - because this is for the greater good of tech progress.

This is some pretty fundamental stuff that needs to be sorted out, ASAP.


You're being a bit melodramatic, but again, the distinction is pretty clear. As long as there is a clear distinction between what is me, and made by me, and what was done by a machine/company/etc., then I don't really see the issue.

If people make videos of me having sex, or fighting aliens or selling laundry detergent.. again.. why should I care?

As long as nobody is saying "Hey you're actually talking to Scarlett live right now!" it's not hurting her.


> As long as there is a clear distinction between what is me, and made my me, and what was done by a machine/company/etc. then I don't really see the issue

What would be this clear distinction if the internet was spammed full of you doing things you would never do? Why should anyone care to find out the real you amongst the fake ones?


This issue was raised quite poignantly in the Eminem hit song -https://www.youtube.com/watch?v=rkqMbsmLrtA


Good call, had the same song in mind when writing the comment.

But anyway, different times: Eminem rapped about copycat rappers, not about someone deepfaking your whole being on the internet.


First of all, this is already beyond the scope of the current issue. We know Scarlett is not on the other end of the line with OpenAI

And in a world where there are deepfakes of every famous person.. People will finally stop trusting everything they see online. With the way things are going at the moment we're going to get to that point with or without legislation. Realistically you just won't be able to ban deep fakes worldwide. People will just assume if they come across a video online that it's fake - as they should be doing already in all honesty

If you made a concerted effort to fool people into thinking it's the real person then it'd be illegal. Especially if you're out to hurt them somehow. In the same vein as libel


> And in a world where there are deepfakes of every famous person.. People will finally stop trusting everything they see online.

And in a world where there are deepfakes of every person, famous or not... People won't be able to trust anything they see anywhere. Welcome to Hell.


Imagine if switchboard operator careers were protected. We still would not have the Internet today.


OpenAI already won anyways. Either they'll pay a fine to Johansson or settle out of court.

But the media hype effect the demo was supposed to bring has already happened, so they don't really need the voice anymore.

The fine will be the cost of doing advertising/marketing, they absolutely knew what they were doing.


Not only that. They established the comparison between their product and Her. This will last for a long time.


I always get suspicious when it's a company providing records saying they didn't do something, especially when they have access to technology that can be used to produce documentation that appears to be legitimate but was completely computer-generated.


I don't really care either way, but one thing that seems odd to me is how unlikable people seem to find the other voices (myself included)

If this were the massive creative effort they make it out to be, it seems like they'd have netted another solid result or two


I also dislike the voices. They sound great, but they are very overenthusiastic. "Awesome!" "That's great!" I don't want my computer to talk like a California valley girl.

How about concise and factual? It shouldn't be wooden and emotionless, of course, but it shouldn't sound like a floofy-brain.


Yeah, I just think it should make use of the nonverbal channel we use to communicate, otherwise we get frustrated at the inefficiency


To all those going around in circles debating the legality of hiring similar-sounding voice actors (spoiler: it's still illegal), there's a great round-up post by Zvi M on this — you want to be looking here if you're interested:

https://thezvi.wordpress.com/2024/05/22/do-not-mess-with-sca...


Props to Scarlett for turning down what was likely a big chunk of change. I can see how being the official voice of AI could turn out to be counterproductive for her in the longer run.


This is bigger than Will Smith turning down the role of Neo for the Matrix movie.


What long run? She is 39 in Hollywood.


I guess the $150m+ net worth acts as a bit of a cushion/retirement nest egg. I mean the longer run in terms of AI development - if it gets very polarised, I'm not sure you want to be the poster girl.


Of course, it’s possible that the intent or otherwise wasn’t the objective. They’ve succeeded in bringing together several associations which allude to a sophisticated and peaceful future for all of us, in spite of the possibility of any minor legal hiccups. Whether the man with something of the dark about him was responsible, involved or unaware, the company continues to lay out its strategy, concerns and targets in plain sight. I’ll try to remain outside of this chaotic arena.


> On Monday, Johansson cast a pall over the release of improved AI voices for ChatGPT, alleging that OpenAI had copied her voice

False reporting. The SJ statement contains no such allegation.


Why would they want a seductive, sexy voice like SJ's anyway? That's just distracting and not conducive to the AI product being helpful or increasing productivity.


OpenAI and sama should get no benefit of doubt given his conduct the past year or so, starting with their refusal to say whether or not they trained on youtube data.


Ethics aside - if anyone thinks OpenAI didn't walk through the legality of their moves beforehand and the procedures they followed (which I'm sure are documented), I would be shocked. They are moving fast, but I'd be certain that they knew they had relatively good legal footing. Johansson is rightfully taking them to court - likely this is all maneuvering for a settlement.

Either way this brings up artists rights in an AI world which is a good thing.


What do you think? Is it possible to give a polite, slightly anxious translator bot a metallic-sounding British accent without having to pay C-3PO's voice actor?


Has the default voice on the mobile app changed in the last few weeks? I don't recall what voice name I had selected before, but it was amazing quality. I thought the voice was Rashida Jones [1], whose voice is in some ways similar to SJ's.

[1] https://www.imdb.com/name/nm0429069/?ref_=tt_cl_t_3


Oh goodness, this is just the kind of behavior that shows how incapable OpenAI and Altman himself are of conducting their business in a responsible manner. Just the thing you don't want to see in the field of AI. Up next, SJ AI generated revenge porn in retaliation for her causing the ruckus. Of course, completely disassociated from anything at OpenAI (wink, wink).


A voice isn't owned. It exists as a transient event of sound waves moving through space, shaped and modulated by the atmosphere, surfaces, and distances. Without a medium and the presence of listeners, these vibrations are meaningless. Thus, a voice exists only as a form of interaction with its physical environment. It's a communal event that doesn't belong to anyone.


That’s a rather idiosyncratic interpretation that doesn’t align with current views or legal structures in the Western world (look up ‘publicity rights’)

‘Ownership’ isn’t a property of the universe. It’s a value imposed by human society and philosophy. The physical reality you describe is true but irrelevant


Is someone's voice their IP? Is it more valuable property because they are famous? What type of IP? Trademark? Without their name and image in combination, is a voice/likeness actually defensible?

Training a computer to have any actual-human sounding voice is likely to almost match someone's voice.

I haven't taken an IP class since 2004, but I'm not sure if there's a real case here is there?


I think it's key here that if someone else trained the voice and sounds like Scarlett Johansson, and there was a payment to that person, and that person exists, it feels to me like she won't have a strong case.

Now, if it was trained on voices from various IP, or was "computer generated", then I think we have an argument that it was trained on her voice.


Nah, that’s too broad of an angle. I mean, you wouldn’t take Barack Obama’s voice and start making video tutorials with it.

“My fellow gamers…”

But what if it’s worse than just video tutorials, something vile? You simply wouldn’t want to have your voice associated with that.


Totally anecdotal, but I have no idea who the voice of Siri is. And if I met them, I as a layperson would think "you sound like Siri", not vice versa, as in this case.


If Right to Publicity laws indeed favor Scarlett here, then the law is really outdated and needs to catch up with the current paradigm.

A company wanted a voice, had something in mind, approached a voice actor who has a similar voice to what they had in mind, got rejected, then approached the next candidate and worked with her. Simple as that. If this is illegal, I don't know what is legal.


OpenAI had been working with the "Sky" voice actress for months before first contacting Scarlett Johansson.


And one of the co-founders of OpenAI (Karpathy) also quite literally said "The killer app of LLMs is Scarlett Johansson." the day after the announcement.

https://x.com/karpathy/status/1790373216537502106


He wouldn't be the only person to want Scarlett Johansson's voice inside ChatGPT; I read several comments wanting that (I guess the "Sky" voice doesn't sound so similar after all). Btw Karpathy doesn't really work for OpenAI anymore.


> btw Karpathy doesn't really work for OpenAI anymore.

You say that as if it would give him less credibility.

In reality, it is of course the exact other way around.


According to this cherry-picked article.


When I first used ChatGPT the voice was similar enough that I thought to myself “oh that’s cool they got Johansson to do the voice.”


Maybe she’s attractive because she meets a very median set of attractiveness characteristics.

I.e. maybe being hot is actually less about being unique and more about being consistent.

Maybe sultry female voices only have so much variety.

I didn’t think of Scarjo during the demo. That said, I don’t need robots to be sexy, so it doesn’t matter.

I feel the real goal is to slow OpenAI down with distractions.


I'm pretty confused throughout the whole thing, because I never got to hear the damn voice that sounded so similar to SJ! The demo voice was overly dramatic, and sounded nothing like her IMO. I've searched everywhere and couldn't find the "Sky" voice (I guess because they took it down?).


It's just annoying that they launched a product using a voice that sounded similar enough to generate such a controversy. Why, if you have all these great "generative AI" systems at your fingertips, you couldn't just generate some other completely random voice is beyond me.


Ask it for an image or for some text. Then cross-check. Many times you can see: it doesn’t make stuff up, it steals and remixes. Same thing here.


If they are so confident that they didn't copy her voice, then why did they pull that character/version?


I wondered about this also. Maybe they pulled it temporarily so that their side of the story could get out in circulation. Then they'll reinstate the voice after a while on the grounds that "as we all know, this was developed separately and in advance of any conversations with Scarlett".

If they had kept the voice then it might seem like they were reacting and possibly caught off-guard. Instead, it looks like they're stepping back, assessing the facts, and proceeding with due caution.


Sam Altman's statement said they pulled the voice out of respect for Scarlett Johansson.


IMO that's consistent with them trying to look like they're being responsible and considerate (even though they plan to re-release the voice in a short while, after their side of the carefully-planned story is out).

In general, I assume anything a CEO says after a scandal breaks is whatever the crisis management people told him to say.


An interesting point of comparison for this vocal similarity is udio.com.

If you pick a particular genre, sometimes the output can feel like many similar singers' voices merged together... and sometimes not.

I remember noticing the Sky voice going away, and mannerisms aside it felt a little more expressive and upbeat than I expected.


As someone who watched the 4o demo, enjoys MCU works, and saw Her, but wasn't even aware of the connection between Black Widow and Samantha (no idea who SJ was until this whole thing), a lot of the comments on this post are absolutely ridiculous.


A lot of comments seem to forget that she was reached out to two years before, ignoring that and going straight to the line about working with a voice actor for months and then asking SJ one more time.

Additionally, I'm glad no one here is a lawyer; people should stay in their lane.


Sorry, 7 months* before.


It doesn't really matter now; the thought has already crept into most people's minds, whether they copied her voice or not.

Sam probably should have changed the voice as soon as Scarlett noped out from the deal. All this furore could have been avoided.


That's assuming he wanted to avoid the furore. The opposite seems at least as plausible.


We are getting into the details of what is copying. If you can find someone with the identical voice who is a different person, is that all it takes? It seems to me the intention was to hire someone who sounded like the character from Her.


The voices are not identical tho.


If you're interested in the background of voice misappropriation lawsuits, Tom Waits is a great deep dive:

https://www.youtube.com/watch?v=H6y1kc8Equk


If using a different voice actor to imitate someone is ok, then why did the George Carlin videos get in so much trouble and have to be taken down?

This would be a loophole to imitate anybody, including in music, right? Like using imitations of Tupac.


When the original news broke, I don't think the assumption was ever that they actually used her voice, as of course they would be sued instantly. Rather, they wanted her, especially after the movie Her, but to be safe they first got another woman to record a voice that would sound very similar, and then later asked Johansson in the hope of getting her instead. She said no, so they tried one more time before releasing; she still said no, so they went with the very-much-like-her-but-not-her version and made reference to the movie Her to leave little doubt who you should think of when listening to that voice. So the above news doesn't seem to change any of that, other than at least confirming they didn't completely go off the rails and actually clone her voice from her actual content without permission, which would have been insane.


"Sky" voice has been available for 8 months now, they didn't ask for Scarlett Johansson voice several times before releasing it.


People are completely missing the point in this thread. This is a civil action where the plaintiff need only prove their case based on the preponderance of the evidence.

The case law is clear and it is linked all up and down these threads, so I won't reproduce it here. It does not matter if it was a voice actor who sounded just like her, or if it was a trained AI voice that never used a single recording from ScarJo; what matters is whether OpenAI intended to gain from reproducing the likeness of ScarJo. Intent is the key, not even how similar the voices are or the source.

Given that the Jury of average joes will be given this instruction directly by the judge, you can almost hear the plaintiff's lawyers case. "OpenAI contacted Scarlett 9 months before release asking to use her voice. She refused. OpenAI contacted her two days before release again asking to use her voice and she refused. Then, just prior to launch the CEO of the company tweets "Her" despite the fact that they could not secure an agreement with my client. The CEO of OpenAI, when engaged in a massive launch and PR campaign, referenced the likeness of my client in a clear attempt to produce economic benefit."

The two contacts before made the case 50/50 from the plaintiff's perspective. sama tweeting "Her" right before the launch is him spiking the football in his own endzone. The defense only has technicalities. At the civil level of burden of proof this is an absolute slam dunk case for the plaintiff. OpenAI will settle for a very large undisclosed amount of money. No way they let this go in front of a civil jury.


Does anyone not see proof of intent in Sam's tweet ("Her"), Karpathy's tweet ("The killer app of LLMs is Scarlett Johansson.") and the name Sky ("SCarlett AI") itself?


>an actress was hired in June to create the Sky voice, months before Altman contacted Johansson, according to documents, recordings,...

Documents, recordings, etc. can be artificially created these days, if I'm not mistaken?


Imo it doesn't sound like SJ, and if they can produce the actual recorded voice lines from the actress they used, plus whatever model they used to clone her voice, it will be trivial to prove that if need be.


@sama has done a good job at portraying himself as an elon/zuck hybrid visionary. he's either going to deliver on the agi promise or be the next @sbf_ftx. there's no in between.


> @sama has done a good job at portraying himself as an elon/zuck hybrid visionary.

Yeah, because those are such universally-beloved role models that modelling oneself on them is a genius move.


Unless they can point to a voice actor whom they did copy, it will be very difficult for them to prove that it wasn't trained to replicate Scarlett Johansson. Was the model trained on movies? Were annotators instructed to compare to the movie "Her"? Lots of ways to see this become problematic.

The fact that Sam Altman was requesting a licensing deal days before launch would suggest that they had a known problem: the model was too close to Scarlett Johansson's voice. In the generous case, this could come down to a few documents from product conception indicating that they wanted the model to replicate the movie "Her."


That's not really how proof works. SJ needs to prove it was modeled after her. Seeing as it doesn't sound like her, this whole thing is DOA.


In civil court, all that is needed is a preponderance of evidence, as I recall. There would certainly be enough evidence to get discovery kicked off and have lawyers reviewing internal OpenAI docs.


[flagged]


Alert


Says the paper that just entered into an agreement with OpenAI


You know they're lying because their mouth is moving.


the best they could do, with the full resources of OpenAI, is get some second hand quotes from a supposed agent to the supposed actress

strange, no?


I hope they produce all the proof of their innocence with AI. Maybe some people will open their eyes in 20-30 years after finding such records :D


Nice way to hijack some public attention for both parties. I have no doubt they will both financially benefit from this in one way or another.


Sky sounds more like Rashida Jones than SJ, to me.


The whole outrage is so stupid. It is a very stupid fact of our modern capitalist system that some people can get a ton of money just for who they are, after becoming famous for various reasons (a lot of luck for many). It is just not fair that some people can get so much money without doing any real work while most of the regular people have to work their ass off just to survive. It is worse than unfair; it is terribly inefficient.

In this case they even tried to do the right thing and offered her compensation. She declined, probably because she thought it wasn't enough money (never underestimate the vanity/cupidity of women).

In the end they showed that they were just being "nice": they don't even need her work output of voice acting, they can just create a similar enough version just fine.

And the fact is that it isn't her voice. She didn't do the work; she refused. It also should be clear that there is bound to be another woman in the world with similar physical characteristics who has a voice close enough to hers. She just cannot own a particular voice characteristic; she could have owned the work associated with her voice acting, but she refused.

The whole outrage is just dumb, I really hope she loses in court because otherwise it is going to set a very problematic precedent.

There are enough people in the world profiting from various positions without actually doing the equivalent work that we don't need to give them even more money.


A sensational lie spread much quicker than the truth even on HN, with no sign of course correction.

I hope Sam feels fine after so much baseless harassment.


It's funny how uncritical protection of Mr. Altman became a kind of religion for some, probably because such people have certain hopes for his product, so they uncontrollably lose themselves in the view that everything he does needs to be cool & right.

But no, he's a human making mistakes. A lot of mistakes, as it turns out.


For me it is the blatant disregard for truth people on the internet usually have when dealing with public personalities.

I have the same instinct to criticize hate mobs against him as I have when they're against Anita Sarkeesian or Brianna Wu.


A 'lie' that OpenAI themselves believed enough to take the voice down from their app to check.

Presumably they trained this voice, and then wanted an official celebrity endorsement to make it better for marketing. At some point the people at OpenAI forgot it wasn't actually Scarlett. I don't think you can be critical of HN for believing it too in that case.


A lie gets halfway around the world before the truth has a chance to get its pants on.


You are getting confused by an implementation detail. They wanted to copy her voice, and they did it. They asked for permission and she said no. So they went ahead and did it anyway.

The voice actor involved is irrelevant.


>> The voice actor involved is irrelevant.

Quite the contrary, unless you believe that Scarlett Johansson owns the rights to the voice of anyone who sounds like her?


Midler v Ford


So I didn't know about this case, and indeed, it's an interesting one. I think the key difference here is that Ford hired an impersonator to sing Midler's song, so you know... the default assumption is that it's Midler singing. In OpenAI's case, unless they presented the voice as being the voice of Johansson, I don't see how hiring someone who merely sounds like her would be an issue. The "her" tweet from Altman is of course problematic in this light, but I guess it will be left up to the court to decide.


> I think the key difference here is that Ford hired an impersonator to sing Midler's song, so you know....the default assumption is that it's Midler singing

Check out the Tom Waits case.


They also asked her multiple times.


Sure, but until she made that information public it had no role in OpenAI's advertising of their features. It's as if Ford had hired a Midler impersonator to sing some other song not originally by her, and then she pointed out "well, Ford actually wanted to hire me some time ago but I said no". It's like... ok? But you provided no services for them; they didn't use any of your lines, songs or anything else that you produced. The only "crime" here is that they hired someone who sounds like you.


Headline presents a premise that represents a fundamental misunderstanding of the law. You don’t have to actually use the person in question to be found liable for what OpenAI is accused of doing.

Famous case here is Back to the Future Part II, where the producers hired another actor and used prosthetics to make him look like Crispin Glover. Crispin isn't actually in the movie, but people thought he was because they used tech to make it look like it was him.

Sam tweeting “Her” is sort of the smoking gun here in showing it was their intention to make people think it was the same voice. Whether or not it actually was doesn’t matter per precedent in the law. What matters is that they tried to make people think it was Johansson. Sam’s tweet handed OpenAI’s lawyers a dumpster fire.


OpenAI could also introduce the actor they hired more prominently, but that doesn't seem to have happened as of yet.


This headline should be _some_ records show.


If AI isn't going to be voiced like Majel Barrett, this isn't a future I'm especially interested in.

Computer, end program.


Every week they get free publicity. Sam Altman is a PR genius. Just look at how much discussion he generates.


Could he not just have bought the rights from the studio that owns “Her”, and have been in the clear?


Nice of Jeff to give Sam some free PR crisis management. Class solidarity brings a tear to my eye.


What also concerns me is the piggybacking on the entire likeness crafted by the artist responsible for the actual movie Her.

Did OpenAI give any credit to the artist responsible for the free creative direction they copied for their AI's voice? I would imagine that, more than the voice actor, the person responsible for casting Scarlett and writing the movie would deserve something.


Has Open AI paid for anything they've taken without asking? Maybe, but we wouldn't know because nothing they do has any transparency. They absolutely suck as a company.


As an aside, I find it bewildering, the hate for success that I see on this site. Ostensibly, the readers are either startup-founder-adjacent people who are dying for OpenAI's success, or techie/intellectual types, who I assume aren't looking for monetary success and who I would have thought would not be bothered by someone else going down a completely different path.


Listen, with any new technology, someone is going to bring a court case to challenge it. This is one example of a case that OpenAI will need to contend with in the future.

It’s not about hate for success. Well, maybe some of it is, but most is discourse about how several shady or questionable practices seem to be coming out of one company (OpenAI) or one person (Altman).


>As an aside, I find it bewildering the hate for success that I see on this site.

Some of us find the "great man" worship bewildering. You know, how we think the leaders of these companies are geniuses and have insightful things to say about everything.


I don't mean only OpenAI; I include Google (which basically doesn't have any leader), MSFT, TSLA (which does, I know...), etc.

I know that there are problems with each of those companies, but to hate all of them seems like a pattern to me


> I know that there are problems with each of those companies, but to hate all of them seems like a pattern to me

Yeah, the pattern is that all these companies behave in ways so problematic that they generate hate. That shows the problem is with the companies, not the people who hate on them.

(If there's a problem with any people outside them, then that would be with the ones who don't hate on the companies. "Hey, just because all these plantations are based on slavery, what's with the hate on plantation owners?!?" Does that seem like something "good people" should say, to you?)


I think it's twofold. On the one hand, people don't like the CEO types who are present everywhere, hyping up anything and who always must give their two cents. The likes of Musk and Sam Altman. I include myself in this group; I, for example, hate how people literally still buy into the Elon-Musk-is-our-world's-Iron-Man narrative.

On the other hand, if it's about companies, I think the issue is more complex. Partially it might be jealousy of Silicon Valley (I know MS is not based there) and their working environment and salaries. But it is also criticism of the tech bubble. Developers outside of that small region are no idiots either... and sometimes they get tired of the vibes being sent out by all the SV hype people... like anything there is great and will change the world, and anything not from there is unimportant and dull. I think yes, there are some world-changing things happening in the Valley and also in those companies, but a big chunk of the rest is just an overabundance of capital that people need to park somewhere, so they throw it at idiotic startup ideas. This capital was generated by reaping in profits (Meta, Google) from all over the world while avoiding taxes through questionable corporate constructs. See Google: they make money by selling ads. They can only do that because of the population in their target country and how much those people can spend. This again depends on domestic investments like infrastructure and education, paid for by taxes that Google doesn't contribute to.

This is at least my source of scrutiny when looking at all those players. But to be fair, I don't hate (all of) them.


It's actually a Japanese actress doing an impression of Scarlett Johansson.


Which would be a pretty cut-and-dried violation of likeness rights.


This was a Ghost in the Shell joke.


Whether they intended to sound like her or not, it doesn't sound like her.


OpenAI may not have copied, but they definitely "sampled" the voice.


So how would the process of training a speaking AI go? Would you input the actor's voice samples and subtitles from a movie, then train it till the output is similar enough to the actor's voice from the movie?


Just a couple of minutes of data and 10-20 minutes of training with RVC WebUI[1] on the included base model, fed into VC Client[2], gets you 90% of the way there. But that's a nearly year-old method, so I'm sure OAI has its own completely novel architecture for the extra 5% of fidelity.

1: https://github.com/RVC-Project/Retrieval-based-Voice-Convers...

2: https://github.com/w-okada/voice-changer


What test data would they use?


Get tapes from 100 actors. Select the one who sounds closest to Scarlett
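
For illustration, a minimal sketch of how that ranking could be automated, assuming the resemblyzer package and hypothetical file names ("reference_voice.wav", an "auditions/" folder of candidate tapes): embed each tape with a pretrained speaker encoder and sort by cosine similarity to the reference clip.

    # Sketch only: rank candidate voice tapes by speaker similarity to a reference clip.
    # Assumes `pip install resemblyzer numpy`; the file paths below are hypothetical.
    from pathlib import Path

    import numpy as np
    from resemblyzer import VoiceEncoder, preprocess_wav

    encoder = VoiceEncoder()  # pretrained speaker-embedding model

    # Hypothetical reference clip and folder of audition tapes
    reference = encoder.embed_utterance(preprocess_wav(Path("reference_voice.wav")))

    scores = []
    for tape in sorted(Path("auditions").glob("*.wav")):
        emb = encoder.embed_utterance(preprocess_wav(tape))
        # Embeddings are L2-normalised, so a dot product is cosine similarity
        scores.append((float(np.dot(reference, emb)), tape.name))

    for score, name in sorted(scores, reverse=True):
        print(f"{score:.3f}  {name}")

Whether such a score would mean anything legally is another matter; it only measures how one speaker-verification model hears the voices, not how a jury would.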


Hey, the tape from Johansson herself will probably be in the top ten most popular choices! Maybe not top five, but at least top twenty, for sure! ;-)


Completely sidestepping whether OpenAI did a scummy or underhanded thing here: I don't find Sky's voice to be all that close to Scarlett Johansson's. Scarlett has a "hoarseness" to her voice that is completely missing from Sky. It's difficult to describe, but you can hear it in any clip from the movie Her, and that's what she actually sounds like in most movies.

I can completely buy that they were looking for a voice actress that sounded kind of like Scarlett, but this mimic isn't perfect because it misses this "raspiness".


Can it be argued that they copied "Her" voice, not Scarlett Johansson’s voice?

I mean, yes, Scarlett Johansson is the actress, but she is not playing herself in the movie. I didn't watch the movie, but I guess she matched her voice to the character, an AI called Samantha, who is not Scarlett Johansson.

It is not like the "Midler vs Ford" case that is often referred to, where Ford hired a singer to sound like Midler; there it was Midler singing as Midler, not acting a fictional character.

Maybe Warner Bros could complain, they are the owners of the character OpenAI imitates. In the same way that Disney (rather than Scarlett Johansson) would complain if someone used the Black Widow character without permission.


It's funny seeing some people's heads get yanked back and forth because they have a predetermined bias against Sam and OpenAI. Critical thinking would have saved you the trouble.


It's funny seeing some people immediately being convinced by this weak-sauce defense. Critical thinking might have saved you the embarrassment.


End of the IP era. Content produced by AI is expressed as generated, which means most of these products are generated, not copied.


Does Altman really think that was the goal, to “win”?

To defeat Scarlett Johansson?

The point is it’s the wrong thing to do, regardless of whether or not he can legally get away with it.

No wonder Silicon Valley’s reputation for ethical behavior is in the toilet.


Welcome to our incredible future where massive AI models will require us to choose between our lying eyes/ears and indecipherable collection of tensors. Nothing will be provable or protectable.


If we remove the “well technically” bs, they did copy her voice and they did so deliberately, the only detail they hide behind is that they did so in a less direct way than they could have done it.


but how could you falsify all those records so quickly and convincingly, oh nm


So if you just so happen to hire someone who coincidentally sounds like "her", and you haven't even seen the movie, no harm no foul, right? After all, the alternate voice actress has a right to use her voice as well.

But if you deliberately seek out the actress who voiced "her" and then happen to get a similar sounding alternate after the "her" actress refuses, you're in legal violation. Is that right?

I'd like to have seen this go to court.


Honestly, I didn't think it sounded remotely like her. Even after the allegation surfaced and I went back and listened, I still don't think it sounds anything like Scarlett Johansson.


This article is a paid hit-piece. Trumpian language and all: “people are saying—very good people, the best people—that the voice wasn’t copied. It was a perfect call.” It is so obvious that her voice was stolen, and they are paying to try to cover it up.


yeah but, "kind of" copied, and samalt reinforced that with refering to "her" in this tweet.

samalt is walking in the shadows of ethical/non-ethical line and he seems obviously is proficient in that.

however, even in such cases he does not hesitate to walk in the border of non-ethical is worrisome for the future of the ethics in ai.


Oh cool we're currently in the gaslighting phase after someone gets caught doing something they shouldn't.


Funny how this post was up on the front page after twelve hours.


Protected voice? Pianos all sound the same, how come they aren’t protected?

Voice is just an instrument. I love finding reasons to hate on big tech, but "it sounds like me" (intentional or not) is bullshit. If I build a piano that sounds just like your piano… tough luck, there are two pianos now.


Yeh, we can tell.


While legally there is probably no recourse, the business goodwill with consumers is gone.

Scarlett Johansson has a lot of fans, and they will now see your company as a problem.

Legally you might get a pass... but business wise it is a big Nope... nerd hubris strikes again. =)


So the hard claim in the headline is based on this thought process: If they didn’t specifically mention Scarlett Johansson or Her to the voice actress, this proves they weren’t trying to copy it. Seriously? Awful journalism, sorry.


Comments full of people reading the headline and assuming that what OpenAI did here is fine because it's a different actress, but that's not how "Right of publicity" (*) laws work. The article itself explains that there is significant legal risk here:

> Mitch Glazier, the chief executive of the Recording Industry Association of America, said that Johansson may have a strong case against OpenAI if she brings forth a lawsuit.

> He compared Johansson’s case to one brought by the singer Bette Midler against the Ford Motor Co. in the 1980s. Ford asked Midler to use her voice in ads. After she declined, Ford hired an impersonator. A U.S. appellate court ruled in Midler’s favor, indicating her voice was protected against unauthorized use.

> But Mark Humphrey, a partner and intellectual property lawyer at Mitchell, Silberberg and Knupp, said any potential jury probably would have to assess whether Sky’s voice is identifiable as Johansson.

> Several factors go against OpenAI, he said, namely Altman’s tweet and his outreach to Johansson in September and May. “It just begs the question: It’s like, if you use a different person, there was no intent for it to sound like Scarlett Johansson. Why are you reaching out to her two days before?” he said. “That would have to be explained.”

* A.K.A. "Personality rights": https://en.m.wikipedia.org/wiki/Personality_rights


The Midler case is readily distinguishable. From Wikipedia:

> Ford Motor created an ad campaign for the Mercury Sable that specifically was meant to inspire nostalgic sentiments through the use of famous songs from the 1970s sung by their original artists. When the original artists refused to accept, impersonators were used to sing the original songs for the commercials. Midler was asked to sing a famous song of hers for the commercial and refused. Subsequently, the company hired a voice-impersonator of Midler and carried on with using the song for the commercial, since it had been approved by the copyright-holder. [1]

If you ask an artist to sing a famous song of hers, she says no, and you get someone else to impersonate her, that gets you in hot water.

If you (perhaps because you are savvy) go to some unknown voice actress, have her record a voice for your chatbot, later go to a famous actress known for one time playing a chatbot in a movie, and are declined, you are in a much better position. The tweet is still a thorn in OA's side, of course, but that's not likely to be determinative IMO (IAAL).

1: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.


> later go to a famous actress known for one time playing a chatbot in a movie, and are declined, you are in a much better position

But they asked her first!:

"Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and Al. He said he felt that my voice would be comforting to people....

https://twitter.com/BobbyAllyn/status/1792679435701014908

So it's: ask Johansson, get declined, ask casting directors for the type of voice actors they are interested in, listen to 400 voices, choose one that sounds like the actor, ask Johansson again, get declined again, publish with a reference to a Johansson film, claim the voice has nothing to do with Johansson.

[EDIT] Actually it looks like they selected the Sky actor before they asked Johansson, and they claim that she would have been the 6th voice; it's still hard to believe they didn't intend it to sound like the voice in Her, though:

https://openai.com/index/how-the-voices-for-chatgpt-were-cho...


> So its: ask Johansson, get declined, ask casting directors for the type of voice actors they are interested in, listened to 400 voices, choose one that sounds like the actor

Except it doesn't sound like Johansson, I don't know why people keep saying this. At best, the voice has a couple of similar characteristics, but I didn't think for one second that it was her. Can James Earl Jones sue if someone uses a voice actor with a deep voice?


Also looking from the perspective of the lesser-known voice actress, does Scarlett Johansson have the right to trample the lesser-known voice artist's future job opportunities by intimidating her previous employers?

Imagine being a potential future employer of the lesser-known artist: would you dare hire her, knowing that Johansson's lawyers might come after you?

Is this lesser known voice artist now doomed to find a job in a different sector?

Voice archetypes are much much older than Johansson, so by symmetry arguments, could those earlier in line sue Johansson in turn?

When a strong person is offered a job at the docks, but refuses, and if then later another strong person accepts the job, can the first one sue the employer for "finding another strong man"?

At some point the courts are being asked to uphold exceptionalist treatment and effectuate it with taxpayer dollars, moving executive branches in cases of non-compliance.


> Also looking from the perspective of the lesser known voice actress, does Scarlett Johansson have the right to trample the lesser known voice artists future job opportunities by intimidating her previous employers?

Right, it would be one thing if there was evidence that OpenAI asked the actress to imitate Johansson. But people are saying that using this voice actress at all without Johansson's permission shouldn't be legal, which is a bizarre claim. If someone thinks my voice sounds similar to a celebrity's, now that celebrity owns my voice? In any other situation, everyone here would think such a standard would be completely outrageous.

(For what it's worth, I didn't find the Sky voice to sound anything like Scarlett Johannson personally)


Exactly, everyone claiming so hasn't actually listened to it [1], or they're basing their opinion off of "suspicious correlations", like that Altman mentioned "her" just before releasing a voice-interactive AI assistant.

[1] https://x.com/chriswiles87/status/1792909936189378653


Maybe because it's the most famous recent good movie about voice interface AI?

He wasn't citing Lucy or whatever other garbage.


Some may have listened to it and just have such a poor opinion of OpenAI that they allow it to cloud their judgment. Both voices sound like white women in a similar age group, must be Scarlett Johansson.


> Some may have listened to it and just have such a poor opinion of OpenAI that they allow it to cloud their judgment

I think that's exactly it, or they're critical of all corporations, and they're jumping all over suspicious timelines, like that they tried to convince her 9 months ago and again 2 days before the new release as some kind of evidence of malfeasance.


And not just white women, Rashida Jones is biracial and a lot of people think the voice sounds like hers. I agree that it actually sounds much more like Jones's than Johansson's - I think I'd be able to accurately distinguish between Sky and Johansson's voice just about every time, but I'm not sure I would be able to do the same with Sky and Jones's.

And as far as I can tell, OpenAI also had 4 different female voices - two from the API, and two in ChatGPT. So there are several different types of voices they covered.


> Rashida Jones is biracial and a lot of people think the voice sounds like hers.

I listened again and you're right, it does sound more like Rashida Jones.


> Exactly, everyone claiming so hasn't actually listened to it

Completely false. Even the journalists at the launch of "Sky" last year were singling it out of the batch of voices, and specifically asking OA about how it sounded like Johansson: https://www.washingtonpost.com/technology/2023/09/25/chatgpt...


You mean people who are incentivized to stir up controversy were trying to stir up controversy? I'm shocked. This is definitely not a case where self interest would cloud anyone's judgment.

In case it's not clear, my moderately low opinion of Open AI is bested only by my even lower opinion of journalists.


The article says it sounded "somewhat similar." It doesn’t sound like anyone mistook it for Johansson, or thought it was the same voice:

> The new personas for ChatGPT are named Sky, Ember, Breeze, Juniper and Cove. Each of the personas has a different tone and accent. “Sky” sounds somewhat similar to Scarlett Johansson, the actor who voiced the AI that Joaquin Phoenix’s character falls in love with in the movie “Her.” Deng, the OpenAI executive, said the voice personas were not meant to sound like any specific person.


No one said it was identical (and OA is rebutting a strawman when they expostulate on how they trained on a real human voice actress, which is not what Johansson's statement accused them of to begin with).

The point is here is someone who 'actually listened to it' at the debut long ago, and immediately asked the OA people about it. I don't know what more you want for similarity. They launched several voices, without any of the mentions of Johansson or _Her_ that have been brought up here or the controversy, and back then and there, on the spot, people felt the need to ask about how one sounded 'somewhat similar' to Scarlett Johansson specifically.


I would bet these charitable readings of OA intentions are going to get wiped away by some internal emails found during discovery. It does sound like Mr. Altman was talking about it publicly in a tweet, it’s probable there are internal comms about this.


> Also looking from the perspective of the lesser known voice actress, does Scarlett Johansson have the right to trample the lesser known voice artists future job opportunities by intimidating her previous employers?

This is weird, if not bizarre. Scarlett didn't do anything. Literally no action besides saying no. Then a company decides to impersonate her and use her performance in a movie as implicit marketing for a product. That's the company's problem, not hers.


This is exactly the right argument. Accepting the lawsuit would set a precedent for an insidious combination: the Matthew effect plus a chilling effect.


> Is this lesser known voice artist now doomed to find a job in a different sector?

The lesser-known voice actor is dooming themselves to find a job in a different sector by contributing to the development of technology that will almost certainly replace all voice actors.


> Except it doesn't sound like Johansson, I don't know why people keep saying this.

Would you ever say “except strawberries aren’t tasty, I don’t know why people keep saying this”?

Maybe it doesn’t sound like Johansson to you, but it does sound like her to a lot of people. Worse, evidence points to Altman wanting you to make that connection. It’s trying to sound like her performance in one specific movie.


> Maybe it doesn’t sound like Johansson to you, but it does sound like her to a lot of people.

I guarantee you that nobody who's ever heard her voice actually thinks that, go on:

https://x.com/chriswiles87/status/1792909936189378653

> Worse, evidence points to Altman wanting you to make that connection. It’s trying to sound like her performance in one specific movie.

There is no such evidence.


For ten seconds I thought "Wow, that is strikingly similar to her". Then I realized that was the example from the movie. I tend to agree: not identical or even close, but definitely some similarities. I don't see a jury ruling that they're too similar.


>I guarantee you that nobody who's ever heard her voice actually thinks that

Including Johansson herself and her family?

>After much consideration and for personal reasons, I declined the offer. Nine months later, my friends, family and the general public all noted how much the newest system named “Sky” sounded like me.

We can talk biases, but I think we're pretty far from "guarantee" in this matter.

>or is it manufactured clickbait outrage now bordering on conspiracy theory?

Johansson already has lawyers at the ready. This goes beyond some publicity stunt. You can be cynical, but I struggle to call a potential legal proceeding a conspiracy.

Linking to two small soundbites isn't going to undo an entire court case. I doubt I always sound the same in two identical line readings.


Sky sounds like Scarlett to me. Not all the time, not in every sentence or statement, but frequently enough that combined with Altman's outreaches to Scarlett there's no way I'm giving them the benefit of the doubt.

Try to ask yourself: is it at all plausible that others operating in good faith might come to a different conclusion, particularly given the surrounding details? If you can't find your way to a "yes" then I'm not sure what to say, you must view everyone (seemingly the majority though who knows for sure) to be deluding themselves or trolling others.


> I guarantee you

You definitely do not. Think of the craziest conspiracy theory you can, one that has been debunked countless times, and I’ll show you people that have been shown the evidence yet still believe the conspiracy.

This case is not a conspiracy theory. But it is something where you disagree with a popular opinion. The point is that you’re projecting your thought pattern and way of understanding the world into everyone else instead of being empathetic and putting yourself on the other side.

Look, I get it. I also thought that the outrage about Apple’s recent ad was disproportionate. But I’m not going to go out of my way to defend the ad. I don’t give a hoot that a massive corporation is getting a bit of bad press for something largely inconsequential, I just wish the outrage had been directed at something which actually matters and go on with my life.

> There is no such evidence.

With this kind of dismissal, I don’t think it’s worth continuing the conversation. I’ll leave you to it. Have a genuinely good weekend.


> This case is not a conspiracy theory. But it is something where you disagree with a popular opinion

Is it a popular opinion? Where's the evidence of that? Or is it maybe just a manufactured clickbait outrage now bordering on conspiracy theory? I provided a link that clearly demonstrates the very foundational claim is wrong, and the article has already clarified the mistakes in the timeline that conspiracists are citing as evidence of malfeasance. This controversy is a textbook nothing burger and it annoys me to no end that people keep falling for these tactics.


I already said I didn’t think it was worth continuing the conversation, so I hope I don’t regret trying to be kind.

> Is it a popular opinion or is it manufactured clickbait outrage now bordering on conspiracy theory?

Please realise that calling something “manufactured clickbait outrage” itself borders on conspiracy theory. Again, empathise, look at your argument with an outside eye.

> it annoys me to no end

Don’t let it. Being annoyed on the internet only ruins your day. And the more unhinged you become, the less sense your arguments will make and the more it will backfire.

Unless you have a personal relationship with either Altman or Johansson, I recommend you let this one go. It’s not worth the cost of any mental wellness. Some issues are, but not this one. Save yourself for the issues that are important.

Again I wish you a relaxing weekend.


>> Except it doesn't sound like Johansson, I don't know why people keep saying this. At best, the voice has a couple of similar characteristics,

To add to this, the legal standard isn't whether it sounds "like" her. It has to be a replication of her voice. Millions of people may sound or look "like" another person, that doesn't mean they are a copy of that person.

The best case study in voices imho is David Attenborough. He has a distinct voice that many have sought to replicate. But you know who else had that voice? Richard Attenborough (the actor from Jurassic Park). They are brothers. Sadly, Richard recently passed. Their voices are unsurprisingly nearly interchangeable, along with a thousand other people with similar life stories. So who gets to own complete rights to the distinctive "Attenborough" voice? In any other area of intellectual property the answer is simple: nobody. It doesn't exist and/or was created by people long before any living Attenborough walked the earth.

Similarly, courts ruled that GTA did not steal from Lindsay Lohan. One cannot own the image of a generic California blonde in a red bikini. So why should Johansson own a sexy/breathy voice with a nondistinctive accent?


I've been saying this for days, and I'm pretty firmly in the OpenAI critic camp.

The only reason people think it sounds like her is because they've biased themselves into it because of all the context surrounding it.


> they've biased themselves into it

Maybe the fault for that belongs to the company who tried to create the association in your mind by using a similar voice and tweeting about that one movie with the voice.

That’s basic advertising. They knew what they were doing. It’s just that it may have backfired.


I don't think it's a particularly similar voice to begin with, and the technology in general is very 'Her' like - which would also explain why they reached out to ScarJo to do an additional voice.

I think the tweet was dumb but in my layperson's understanding of the law and the situation, I doubt OpenAI is at any huge legal risk here.


It doesn't really matter their legal risk here, IMO. What matters is the court of public opinion in this case.

Even if they are able to show irrefutable proof that it wasn't ScarJo and is in fact another person entirely it will not matter.

This is one of those times that no matter what the facts show people will be dug in one way or another.


You seem to be missing the point: It wasn't that "people ... biased themselves into it"; it was the name, the ex-board-member's direct reference to Johansson, and Altman's tweet that did that.


This. It's what people want to hear. If you heard that voice in a vacuum with no context and asked someone which famous person it is, I doubt many people would say Johansson. Some, sure, but not the majority.

The only thing Sam did wrong was play too fast and loose with the implication of "Her" given that he had been talking to ScarJo. Lawyers should have told him to just STFU about it.


I was confused because it doesn’t really sound much like her normal voice, and I didn’t see Her. So I looked it up, and it sounds a little bit more like her voice in Her, the movie where she’s adding some AI-like affectations to her voice.

I guess they decided to remove it for PR reasons or something.


If they ask James Earl Jones to do it, he says no, and then they hire someone with a deep voice in order to sound like him? Yes.


Except it doesn't sound like her, and that's not even the correct sequence of events.


It absolutely does sound like her, in this context where she is the voice in the AI voice companion zeitgeist.

In just the same way that if the context were "magical space villain in a black mask and cape" and someone was hired with a deep voice, it would be a clear rip-off of James Earl Jones.

And it requires a level of credulity that I find impossible to sustain to think that "the sequence of events" here doesn't start with "let's have a voice actor casting call and pick a voice that sounds like the one in the movie about this that everyone loves". I won't, however, be shocked if nobody ever wrote that down in a medium that is subject to legal discovery.


If I play the voices back to back, nobody thinks they're the same person.

If I ask which one is SJ, people that have seen her films know; those who don't, don't. (Hint: only one sounds hoarse, like an early smoker's voice, and self-assured even in the 'Her' role; only one sounds impossibly 2020s valley girl chipper and eager to please.)

Sure seems like all the people dogpiling last week either didn't do a basic listen or ignored it, since it's much better for click farming to mock the cock of the walk.


> only one sounds impossibly 2020s valley girl chipper and eager to please.

I immediately thought "grade school teacher", although I was listening to the clip where she was telling a story.


Read the main article again. The voice actor was already auditioned and chosen before they went to Johansson. Maybe they had a change of plan and thought it would be cool to have Johansson's voice rather than an unknown actor's.


Not a lawyer, but wouldn't the important intent be the one when the voice is released, not the first moment someone gets hired? The economic harm happens when you publicize the voice in a way designed to gain value from Johansson's reputation without her permission, not when you record it. The tweet and requests speak directly to that moment.


Yes, if she lost movie roles or other contracts because people assumed they could license the OpenAI voice, then she could claim she was harmed. However, OpenAI removed the voice and this situation is widely publicized, so it is hard to prove that she is being harmed now.


Is there any quality difference between hiring a voice actor specifically to provide the voice for an AI compared to cloning an actor's voice from their movies?

Much of the coverage I've seen through social media on this (including Johansson's statement) gives the impression that this is what OpenAI did. If the quality of doing that would be worse than using a voice actor to imitate Johansson's voice, what is the value of the publicity that gives the impression that their technology is more advanced than it is, compared to whatever they end up settling this for?


The point isn't the time line of hiring the voice actor. The question is whether OpenAI was deliberately trying to make the voice sound like Johansson.

Suppose someone asked Dall-e for an image of Black Widow, as in the first Avengers movie, to promote their brand. If they then use that in advertising, Johansson's portrait rights would likely be violated. Even (especially) if they never contacted her about doing the ad herself.

This is similar to that, but with voice, not portrait.


That's because one can make the argument that Dall-e was regurgitating. It would be different if you got somebody who happens to look like her to pose in a similar way.


I don't think this is entirely right (not a lawyer)

You can't hire an artist to draw Black Widow in the likeness of Scarlett Johansson's Black Widow. The issue isn't how the art is made, it's whether the end result looks like her.

I think there may be additional issues (to be determined either in the courts or by Congress, in the US) with regard to how Dall-e makes art, but elsewhere in the thread someone mentioned the Ford/Bette Midler case, and that does seem to be relevant (also, though, not exactly what happened here).

I don't have the expertise to know how similar this is to the case at hand.


> it's still hard to believe they didn't intend it to sound like the voice in her though:

Especially when you have ex-OAers, who had been working there at the time on 'J.A.R.V.I.S.', tweeting things like https://x.com/karpathy/status/1790373216537502106 "The killer app of LLMs is Scarlett Johansson."


The actress did impersonate Her though.

It's not just a random "voice for your chatbot", it's that particularly breathy, chatty, voice that she performed for the movie.

I would agree with you completely if they'd created a completely different voice. Even if they'd impersonated a different famous actress. But it's the fact that Her was about an AI, and this is an AI, and the voices are identical. It's clearly an impersonation of her work.


> The actress did impersonate Her though.

Did she? The article claims that:

1. Multiple people agree that the casting call mentioned nothing about SJ/her

2. The voice actress claims she was not given instructions to imitate SJ/her

3. The actress's natural voice sounds identical to the AI-generated Sky voice

I don't personally think it's anywhere near "identical" to SJ's voice. It seems most likely to me that they noticed the similarity in concept afterwards and wanted to try to capitalize on it (hence later contacting SJ), as opposed to the other way around.


>I don't personally think it's anywhere near "identical" to SJ's voice. It seems most likely to me that they noticed the similarity in concept afterwards and wanted to try to capitalize on it (hence later contacting SJ), as opposed to the other way around.

So your theory is that this was completely coincidental. But after the voice was recorded, they thought, "Wow, it sounds just like the voice of the computer in Her! We should contact that actress and capitalize on it!"

That's what you're going with? It doesn't make sense, to me.


Listen to the side by side comparisons. Sky has a deeper voice overall, in the gpt4o demo Sky displays a wider pitch range because the omni model is capable of emotional intonation. Her voice slides quite a bit while emoting but notably doesn't break and when she returns to her normal speaking voice you can hear a very distinct rhotic sound, almost an over-pronounced American accent and she has a tendency towards deepening into vocal fry especially before pauses. I'd describe her voice as mostly in her chest when speaking clearly.

Now listen to SJ's Samantha in Her and the first thing you'll notice are the voice breaks and that they break to a higher register with a distinct breathy sound, it's clearly falsetto. SJ seems to have this habit in her normal speaking voice as well but it's not as exaggerated and seems more accidental. Her voice is very much in her head or mask. The biggest commonality I can hear is that they both have a sibilant S and their regional accents are pretty close.


I was thinking someone thought "oh that sounds a fair bit like SJ in Her, if we can get SJ onboard, perhaps we can fine-tune what we got to sound like SJ in Her".


> But after the voice was recorded, they thought,

... that it would be even better to have a famous voice from Her than the rather generic female voice they had, but their proposal was declined. Well, oops, but SJ, famous as she is, doesn't own the rights to all female voices other than her own.


No-one had to explicitly say any of that for it to still be an impersonation. Her was a very popular film, and Johansson's voice character was very compelling. They literally could have said nothing and just chosen the voice audition closest to Her unconsciously, because of the reach of the film, and that would still be an impersonation.


> They literally could have said nothing and just chosen the voice audition closest to Her unconsciously, because of the reach of the film, and that would still be an impersonation

That's a very broad definition of impersonation, one that does not match the legal definition, and one that would be incredibly worrying for voice actors whose natural voice happens to fall within a radius of a celebrity's natural voice ("their choice to cast you was unconsciously affected by similarity to a celebrity, therefore [...]")


What you're arguing fails to pass the obviousness test; if I were running the company it would be blindingly obvious that the optics would be a problem, so I would start to collect a LOT of paperwork documenting that the casting selection was done without a hint of bias towards a celebrity's impression. Where is that paperwork? The obviousness puts the burden on them to show it.

Otherwise your argument lets off not just this scandal but an entire conceptual category of clever, sleazy moves that are done "after the fact". It's not the Kafka trap you're making it out to be.


> if I were running the company it would be blindingly obvious that the optics would be a problem, so I would start to collect a LOT of paperwork documenting that the casting selection was done without a hint of bias towards a celebrity's impression. Where is that paperwork? The obviousness puts the burden on them to show it.

I think optics-wise the best move at the moment is quelling the speculation that they resorted to a deepfake or impersonator of SJ after being denied by SJ herself. The article works towards this by attesting that it's a real person, speaking in her natural voice, without instruction to imitate SJ, from a casting call not mentioning specifics, cast months prior to contacting SJ. Most PR effort should probably be in giving this as much of a reach as possible among those that saw the original story.

Would those doing the casting have the foresight to predict, not just that this situation would emerge, but that there would be a group considering it impersonation for there to be any "hint of bias" towards voices naturally resembling a celebrity in selection between applicants? Moreover, would they consider it important to appeal to this group by altering the process to eliminate that possible bias and providing extensive documentation to prove they have done so, or would they instead see the group as either a small fringe or likely to just take issue with something else regardless?


> Would those doing the casting have the foresight to predict, ...

Yes, this should all have been obvious to those people. It would require a pretty high degree of obliviousness for it to not be obvious that this could all blow up in exactly this way.


It blew up by way of people believing it was an intentional SJ deepfake/soundalike hired due to being rejected by SJ. I think this article effectively refutes that.

I don't think it blew up by way of people believing simply that those doing the casting could have a hint of a subconscious bias towards voices that sound like celebrities. To me that seems like trying to find anything to still take theoretical issue with, and it would've just been about something else had they made the casting selection provably unbiased and thoroughly documented.


Again, I think it requires a high degree of obliviousness to not have the foresight during casting to think, "if we use a voice that sounds anything like the voice in the famous smash hit movie that mainstreamed the idea of the kind of product we're making, without actually getting the incredibly famous voice actress from that movie to do it, people will make this connection, and that actress will be mad, and people will be sympathetic to that, and we'll look bad and may even be in legal hot water". I think all of that is easily predictable!

It seems way more likely to be a calculated risk than a failure of imagination. And this is where the "ethics" thing comes into play. They were probably right about the risk calculation! Even with this blow-up, this is not going to bring the company down, it will blow over and they'll be fine. And if it hadn't blown up, or if they had gotten her on board at any point, it would have been a very nice boon.

So while (in my view) it definitely wasn't the right thing to do from a "we're living in a society here people!" perspective, it probably wasn't even a mistake, from a "businesses take calculated risks" perspective.


> "if we use a voice that sounds anything like the voice in the famous smash hit movie that mainstreamed the idea of the kind of product we're making [...]

I think it's deceptively easy to overestimate how likely it is for someone to have had some specific thought/consideration when constructing that thought retroactively, and this still isn't really a specific enough thought to have caused them to have set up the casting process in such a way to eliminate (and prove that they have eliminated) possible subconscious tendency towards selecting voice actors with voices more similar to celebrities.

But, more critically, I believe the anger was based on the idea that it may be an intentional SJ soundalike hired due to being turned down by SJ, or possibly even a deepfake. Focusing on refuting that seems to me the best PR move even when full knowledge of what happened is available, and that's what they're doing.


I'm sorry, but your first paragraph is a level of credulity that I just can't buy, to the point that I'm struggling to find this line of argument to be anything besides cynical. The most charitable interpretation I might buy is that you think the people involved in this are oblivious, out of touch, and weird to a degree I'm not willing to ascribe to a group of people I don't know.

If you are an adult living and working in the US in the 2020s, and you are working on a product that is an AI assistant with a human voice, you are either very aware of the connection to the movie Her, or are disconnected from society to an incredible degree. I would buy this if it were a single nerd working on a passion project, but not from an entire company filled with all different kinds of people.

The answer is based on "they wanted a voice that sounds like the one in Her, but the person whose voice that is told them no, but then they did it anyway". The exact sequence of events isn't as important to the anger as you seem to think, though it may be more important to the legal process.


My claim is not that they hadn't heard of the movie Her. It's that, while setting up auditions, the chain of thought that would lead them to predict a group would take issue in this very particular way (marcus_holmes's assertion that unconsciously favoring the VA's audition would constitute impersonation), to adopt the proposed rigor (setting up auditions in a way that eliminates the possibility of such bias, plus paperwork to prove as much), and to consider it worthwhile appeasing the group holding this view, is not so certain to have occurred that the seeming lack of such paperwork can be relied on to imply much at all.

I would go further and say that chain of reasoning is not just uncertain to have occurred, but would probably be flawed if it did, in that I don't think it would noticeably sway that group. As opposed to the evidence in the article, or some other forms of possible evidence, which I think can sway people.

> The exact sequence of events isn't as important to the anger as you seem to think, though it may be more important to the legal process.

Less the order of events, and more "seeking out an impersonator and asking them to do an imitation" vs "possibility of unconscious bias when selecting among auditions"


The way you write it makes it sound very complicated, but in this situation, I would definitely think "we better be really careful about who we hire here in order to avoid people making the connection with the movie voice, unless we can actually get Scarlett Johansson to do the voice", and that thought process would take less than 5 seconds.

And it is not unusual at all for there to be things that everyone knows should not be written down, but either discussed only in person, or left implicit. There is usually a few slip ups though, which would come out in discovery.

> "possibility of unconscious bias when selecting among auditions"

I think "conscious but not stated to the actress" is the more likely explanation, that is not inconsistent with this reporting.

For what it's worth, if this does go to court (which I doubt), and there is discovery and depositions, and they don't find any documentation, or get any statements suggesting that this was indeed understood to be the goal, then I would be a lot more convinced.

But I think it's a giant stretch to have the base case be that nobody thought of this and they were all shocked, shocked! that people made this connection after they released it.


Wouldn't say it's complicated, but it is a specific point. Attacking claims like "they were all shocked, shocked! that people made this connection after they released it" is meaningless when that is not a claim I'm making or relying on. This stems from me disputing a claim that the VA impersonated SJ/her, because of possible unconscious bias of the casting directors, and the supposed obviousness that they would've set up and extensively documented the auditions in such a way to disprove that.

I'd be more convinced, at least of the fact that it would have even been a good call, if I saw outrage sparked by the possibility of unconscious bias, as opposed to what can or has been addressed by other forms of evidence. Claims along the lines of "I'd totally have thought [...]" made in retrospect are entirely unconvincing, particularly in cases where the suggested thought is not sufficient.


I don't know why you started focusing on "unconscious bias", but as I said already, I don't think that's what happened.


> I don't know why you started focusing on "unconscious bias"

That's what I've been taking issue with from the beginning of this chain[0]. In all but one comment since then I've explicitly specified "[un|sub]conscious bias".

On that topic, would you agree with me that it is not "obvious" that they would predict a group would take issue in this very particular way such that it would necessitate setting up and documenting auditions to prove they have eliminated such bias, and then additionally determine it worthwhile to actually do so?

[0]: https://i.imgur.com/PLBdxmN.png


Fair enough. I guess I just shouldn't have responded. I can't really say whether I agree with you or not; I think the whole line of speculation is a non sequitur.


A lot of legal constructs are defined by intent, and intent is always something that is potentially hard to prove.

At most, the obviousness should put the burden of discovery on them, and if they have no records or witnesses that would demonstrate the intent, then they should be in the clear.

> I would start to collect a LOT of paperwork documenting that the casting selection was done without a hint of bias towards a celebrity's impression.

IMO having records that explicitly mention SJ or Her in any way would be suspicious.

IANAL


So the fact that they tried to recruit SJ (twice) is the evidence that I find suspicious. Plus Altman's tweets. It's not just suspicious, it's obvious.


SJ's voice has some very distinctive characteristics and she applies distinctive inflections. None of that inflection, tonality, or those characteristics are present in the chatbot voice. Without those elements, it can be said to be a voice with vaguely similar pitch and accent, but any reasonable "impersonation" would at least attempt to copy the mannerisms and flair of the voice they were trying to impersonate.

Listening to them side by side, the OpenAI voice is more similar to Siri than to SJ. That Sam Altman clearly wanted SJ to do the voice acting is irrelevant, considering the timings and the voice differences.

The phone call and tweet were awkward, though.


Exactly. Anyone who listens to both side by side should be able to clearly distinguish them.


I think that reaches too far. Intent should be a defining part of impersonation. IANAL and I don't know what the law says.


Intent on whose part, though? Like, supposing, arguendo, that the company's goal was to make the voice sound indistinguishable from SJ's in Her, but they wanted to maintain plausible deniability, so they instead cast as wide a net as possible during auditions, happened upon an actor who they thought already sounded indistinguishable from SJ without special instruction, and cast that person solely for that reason. That seems as morally dubious to me as achieving the same deliberate outcome by instruction to the performer.


> happened upon an actor who they thought already sounded indistinguishable from SJ without special instruction, and cast that person solely for that reason

So who was doing the selecting, and were they instructed to perform their selection this way? If there were a lawsuit, discovery would reveal emails or any communique that would be evidence of this.

If, for some reason, there is _zero_ evidence that this was chosen as a criterion, then it would be pretty hard to prove the intent.


I have this sinking feeling that in this whole debate, whatever anyone's position is mostly depends on whether they think it's good that OpenAI exists or not.


No, I'm happy that OpenAI exists. But alarmed that they're being so mendacious.

If they just said "we loved the film, we wanted that feel, SJ wasn't willing, so we went for it anyway. Obviously that's backfired and we're rethinking" then I would have a thousand times more comfort than this corporate back-covering bullshit.


It sounds more like Rashida Jones than SJ to me.

I think part of this PR cycle is also the priming effect, where if you're primed to hear something and then listen, you do hear it.


Then OpenAI did the priming by referencing "Her".


Who’s making those claims, exactly? That will tell you a lot about their likely veracity.


First two claims are "according to interviews with multiple people involved in the process", direct quotes from the casting call flier, and "documents shared by OpenAI in response to questions from The Washington Post". Given the number of (non-OpenAI) people involved, I think it would be difficult to maintain a lie on these points. Third claim is a comparison carried out by The Washington Post.


This is why things are decided by juries. You may well truly believe this all seems unrelated and above board. But very few people will agree with you when presented with these facts, and it would be hard to find them during jury selection.


> The actress's natural voice sounds identical to the AI-generated Sky voice

No it doesn't.


> > The article claims that: [...]

> > 3. The actress's natural voice sounds identical to the AI-generated Sky voice

> No it doesn't.

That's a verbatim quote from the article (albeit based on brief recordings).

I haven't heard the anonymous voice actress's voice myself to corroborate WP's claim, but (unless there's information I'm unaware of) neither have you to claim the opposite.


> That's a verbatim quote from the article

The clips are all online for you to listen to them yourself. The article can say what it likes, it's just wrong.


Just to make sure: have you correctly understood the article's claim?

It's saying that the anonymous voice actress's natural voice sounds identical to the AI-generated Sky voice (which implies it has not been altered by OpenAI to sound more like SJ, nor that they had her do some impression beyond her own natural voice).

If so, could you link the clips of the voice actress's natural voice, to compare to the AI version? I've searched but was unable to find such clips.


The AI voice is what’s in question here and there are plenty of examples; just listen to ones that aren’t intentionally selected to try to create the impression of similarity. Sky has been around for half a year, so you don’t have to limit yourself to the tech demo, but even if you do, if you listen to the whole thing you’ll hear that the normal speaking voice is very different from ScarJo’s.


I know and agree. The article's claim is that the anonymous voice actress's natural voice sounds identical to the AI-generated Sky voice. That is, VA = AI, not AI = SJ or VA = SJ. Which corroborates with the claim that OpenAI was not asking her to do any particular impression.



Also, Sam had a one-word tweet: “her.” So it looks like there was something going on.


It’s an obvious comparison to make for the technology, I don’t think it was meant as “it sounds like ScarJo”


> The actress did impersonate Her though

This is unclear. What is clear is OpenAI referenced Her in marketing it. That looks like it was a case of poor impulse control. But it is a basis for a claim.


> What is clear is OpenAI referenced Her in marketing it.

Because they're building a voice-mediated AI, duh.


How do you explain the many people saying that the voices do not sound especially similar?

"The pitch is kiiiiiind of close, but that's about it. Different cadence, different levels of vocal fry, slightly different accent if you pay close attention. Johansson drops Ts for Ds pretty frequently, Sky pronounces Ts pretty sharply. A linguist could probably break it down better than me and identify the different regions involved."

https://old.reddit.com/r/singularity/comments/1cx24sy/vocal_...

There is also a faction claiming that Sky's voice is more similar to Rashida Jones's than Scarlett Johansson's:

https://old.reddit.com/r/ChatGPT/comments/1cx9t8b/vocal_comp...


Given the breadth and range of female voices available, this is way too close to be just a coincidence.


There are approximately 4 billion women in the world. Given that I know a few people who sound very similar to me, I would say that there are (subjectively) perhaps 1,000 to 10,000 different types of women's voices in the world.

This would mean that a celebrity could possess a voice similar to 0.5 million to 5 million other women, and potentially claim royalties if their voice is used.


This type of estimate is sadly deeply flawed. Voices are affected not just by ethnicity but also by language and culture. I know because I can hear slight but noticeable differences in tone between communities just 30-50 miles apart, within the same social classes and everything. Bilinguals also sound noticeably different from monolinguals, even in their primary language.

I think people think others sound the same not because they're similar to begin with, but because voices homogenize under peer pressure. There's a proverb: "the nail that sticks out gets hammered down". Most people have probably hammered their voices flat so as not to stand out.


Fine. So the entire casting process for this role picked a 1-in-10,000 match for Her. That's not coincidence.


You first said that Sky is "clearly an impersonation" of Johansson. Now you say that it's not a coincidence they chose Sky's voice actress. These are two different claims. It may not be a coincidence in the sense that they may have chosen Sky's actress because she sounds similar to Johansson. But that alone doesn't constitute an impersonation. Impersonation means deliberately assuming the identity of another person with the intent to deceive. So you'd have to demonstrate more than a degree of similarity to make that case.


Isn't it TTS + style transfer? Or is it e2e from text to waveform?


In the Ford case they hired an impersonator to sing a song Midler was famous for singing, so it's clearly an impersonation.

In OpenAI's case the voice only sounds like her (although many disagree) but it isn't repeating some famous line of dialog from one of her movies etc, so you can't really definitively say it's impersonating SJ.


> it's that particularly breathy, chatty, voice that she performed for the movie.

Good luck proving that in court.

“Your honor, our evidence is that the audio clips both sound breathy.”


That and repeated, rejected attempts to hire ScarJo to do the voice. I expect they have ample documentation on that.


Did you not read the article? They did the voice months before that.


Then why did they ask her two days prior to launch? A potential jury would be very, very curious to know that.


A very simple answer is that they also wanted SJ as a voice. That doesn't mean the original voice was trying to copy SJ.

They had a voice, the natural comparison when watching the interaction is to Her, and there was likely still time to get SJ's actual voice before a public rollout.


In particular, I'm pretty sure you can bake a voice model well enough for a canned demo in two days, given we have cloning algorithms that will do it with 15 seconds of samples.

Production ready? Probably not, but demos don't have to be.
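For a sense of how low the bar is, here's a rough sketch of zero-shot cloning with an open-source stack. I'm assuming the Coqui TTS package and its XTTS v2 model; the model string, file names, and demo text are illustrative, not anything OpenAI actually used:

    # Zero-shot voice cloning sketch using the open-source Coqui TTS package
    # (pip install TTS). Model name and file paths are illustrative assumptions.
    from TTS.api import TTS

    # XTTS v2 can clone a voice from a short reference recording.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    tts.tts_to_file(
        text="Hi there! Here's a quick canned demo line.",
        speaker_wav="reference_clip_15s.wav",  # hypothetical ~15-second sample
        language="en",
        file_path="demo_output.wav",
    )

That's inference-only cloning, nowhere near a polished production voice, but it's enough to show why a two-day turnaround for a demo isn't far-fetched.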


If you believe a known conman like Sam Altman didn't intend to steal SJ's voice I have a bridge to sell you.

See my comment from yesterday re him being a known conman: https://news.ycombinator.com/item?id=40435120


I find this whole thing very confusing because I never thought the voices sounded identical or even very similar. I was initially even more confused because I thought I couldn't find clips of the SJ AI voice, only to realize I had, but it didn't sound like her.

This opinion is independent of Sam being a conman, scammer, creepy Worldcoin weirdo, and so on.


It’s not about what you believe but rather what you can prove


I may be wrong, but I believe this case would be made to a jury, not to a judge.

I think it would be hard to seat a jury that, after laying out the facts about the attempts to hire Johansson, and the tweet at the time of release, would have even one person credulous enough to be convinced this was all an honest mix-up.

Which is why it will never in a million years go to a trial.


Can someone explain to me the outrage about mimicking a public person's voice, while half the people on Hacker News argue that it's fine to steal open source code? I fail to see the logic here. Why is this more important?


The order of the events is different, but it still comes down to whether OA had a specific voice in mind when building the chatbot.

By your logic, I could go find a Tim Cook-looking and -sounding guy, make a promotional video with him praising my new startup, ping Tim Cook to check if by any chance he'd miraculously be willing to do me the favor himself and spare me the trouble in the first place, but still go on and release my video ads without any permission.

"I did all the preparation way before asking the celebrity" wouldn't be a valid defense.


You could have an ad with a voiceover saying all sorts of positive things about your company, in a voice that sounds somewhat like Tim Cook. What you could not do is use a body double, dress someone up like him, or otherwise give the impression that your startup is loved by Tim Cook himself. He doesn’t have a monopoly on slightly southern accents.


The tweet + approach is probably sufficient to bring a lawsuit and get into discovery, and then it'll come down to whether there are smoking-gun documents (e.g. internal emails comparing the voice to Her, etc.).

It's likely that someone internally noticed the similarity, so there are probably some kind of comms around it, and it will very much depend on what was written in those conversations.


Pulling the voice when ScarJo complained is not a good look. I’m sure her attorneys would be very excited to do discovery around that decision should it come to trial.

It won’t though, this is primarily a PR problem for OpenAI. Which is still a real problem when their future product development depends entirely on everyone giving them tons of data.


There are probably a number of other cases. The one I remember is when Sega asked Lady Miss Kier of Deee-Lite fame to use her public image for a game. Nothing came of it, but Sega made the character Ulala[1] anyway. If you grew up in the 90s the character's name was strongly connected to Lady Miss Kier's catch phrase, but unfortunately she lost the suit and had to pay more than half a million.

[1] https://en.m.wikipedia.org/wiki/Ulala_(Space_Channel_5)


> Lady Miss Kier's catch phrase

Not to over-analyze your use of language, but using the possessive here makes it seem like she personally owned that phrase or its use was associated with her. First, I don't know if that's true. Did she say, "Ooh la la," constantly, or is it just something she said at the beginning of the music video (and possibly studio recording) of the one hit from Deee-Lite, Groove Is In The Heart? Moreover, that phrase is a fairly well-known and widely-used one, see: https://en.wikipedia.org/wiki/Ooh_La_La. It certainly was not original to her nor would its use indicate an association to her. To your point, its use plus the aesthetic of the character does seem like a reference to Lady Miss Kier's appearance in that music video (if not also her style and appearance more generally, I don't know if that is how she always looks). But she didn't sue everyone else on this list for the use of her supposed catch phrase, ooh la la.

I hate to say one person's fame is so great that they get special or different treatment in court, but I think "Lady Miss Kier" was punching above her weight in trying to sue over use of her image. Her fame was a flash-in-the-pan one-hit-wonder in the early 90s, no offense to any Deee-Lite fans. It was certainly a great song (with some help from Herbie Hancock, Bootsy Collins, and Q-Tip).

https://www.youtube.com/watch?v=etviGf1uWlg


Not a native speaker, so "catch phrase" is maybe not the right term or too strong.

Prompted by your comment I read up about the case and from what I understand Sega wanted to use the song in a promotion and not (what I remembered) her likeness.


Counter-intuitively, I think this puts Johansson in a stronger position.

OpenAI did not want to copy her the actress, they wanted to copy HER the performance from the movie.


Wouldn't that apply to entertainers like Rich Little whose entire career was him doing his best to exactly impersonate famous peoples' voices and mannerisms?


Parody tends to fall into various exemptions. Of course, one might argue that OpenAI's work itself is a parody of AGI.


If someone impersonates people but doesn't disguise/hide their appearance, there's no risk of confusing listeners about who is making the voice.


Also a lawyer, and the Midler case is apparently not understood so narrowly. The possible chilling effect on the employability of actors who happen to look or sound just like already famous actors rankled me, too, and I really got into it with my entertainment law professor (who has had some quite high profile clients). His advice in no uncertain terms was basically “Sorry to those actors, but if you try to get away with hiring one of them knowing they’re likely to be confused with someone more famous, you’re stupid.”


>The tweet is still a thorn in OA's side, of course, but that's not likely to be determinative IMO (IAAL).

It amounts to nothing, as it was a single word and they could spin that any way they want; it's even a generic word, lol. The "worst" interpretation of that tweet could be "we were inspired by said movie to create a relatable product", which is not an unlawful thing to do.


I see this a lot in spaces like HN focused on hard science and programming: this idea that judgments couldn't possibly consider things like context beyond what has been literally spelled out. To paraphrase a fitting XKCD, they have already thought of that "cool hack" you've just come up with [0].

I lack the knowledge to make a judgment one way or another about whether this will go anywhere, because I know very little about this area of law, more so in the US. However, the idea that tweeting the title of that specific movie, in the context of such a product release, where one of the voices has a cadence and sound similar to a specific actor in that same movie, an actor they had approached beforehand, could have no legal bearing seems naive. Is it really that doubtful that a high-priced legal team could bring a solid case, leading to a severe settlement, or more if she wants to set a precedent?

Clichéd Mafia talk is not a foolproof way to prevent conviction in most jurisdictions.

[0] https://xkcd.com/1494/


The sad thing is: most probably absolutely nothing will happen.

These startups break laws, pay the fines and end up just fine.

Remember that it was Sam Altman who proposed to change the YC questionnaire to screen applicants by incidents where they have successfully broken rules. YC even boasts about that.


> These startups break laws, pay the fines and end up just fine.

That thought process puts them in the same boat as: Theranos, FTX, Lehman Brothers, Bear Stearns, Countrywide Financial, Enron, Washington Mutual, Kidder Peabody, and many other companies which no longer exist.

So if (since?) they lack the ethical compass to not break the rules, perhaps a simple history of what happens to companies that do break the rules might be useful guidance...


They're likely going to write Scarlett Johansson a large check and get a new voice.

What else do you think is going to happen?

The entire company gets shut down?


Why would it be a particularly large check? What are her damages?


Then you don’t know ScarJo. She doesn’t fuck around and she has enough money to put legal fees where her mouth is. She was the vanguard of actors suing for streaming royalties, for example.


Or to rephrase it, the laws don't apply to the rich. They just have to pay a little bit more "tax" (which probably still works out to be less than typical small business has to pay % wise anyway) to be allowed to break them.


Do we really want someone's voice to be copyrightable, to a point where similar sounding people can't be used?

This is such a weird issue for HN to be upset about when other IP related issues (e.g. companies suing for recreating generic code, patent trolls, trademark colors, disregard of paywall, reverse engineering, etc), people here overwhelmingly fall on the side of weaker IP protections.

I guess the difference is that some people just pick the side of "the little guy", and in the example of a centi-millionaire beautiful actress vs. a billion-dollar founder, the scales tip to the actress.


What sort of damages can Scarlett Johansson expect to get if OpenAI launches with the Sky voice for a short while, then pulls it quickly after the backlash (like they did)?

Are punitive damages commonplace for such scenarios?


I’m sure someone can do some type of math about her general pay rate per performance minute then multiply it by the millions of hours Sky has been heard using her voice. I think that number would likely be quite high.


Any damages are probably going to be way cheaper than the cost of an equivalent publicity campaign.

Bad PR is also PR.


> > Several factors go against OpenAI, he said, namely Altman’s tweet and his outreach to Johansson in September and May. “It just begs the question: It’s like, if you use a different person, there was no intent for it to sound like Scarlett Johansson. Why are you reaching out to her two days before?” he said. “That would have to be explained.”

I think there's a pretty reasonable answer here in that the similarities to Her are quite obvious, and would be regardless of whose voice it was. If you wanted it to be SJ, reaching out right at the last minute seems rather odd, surely you'd reach out at the start?

There are three timelines that seem to be suggested here

* OAI want the voice to sound like SJ

* They don't ask her, they go and hire someone else specifically to sound like her

* They work on, train and release the voice

* OAI, too late to release a new voice as part of the demo, ask SJ if they can use her voice

This requires multiple people interviewed to be lying

Or

* OAI hire someone for a voice

* They train and release the voice

* People talking to a computer that reacts in this way is reminiscent of Her

* "We should get SJ as an actual voice, that would be huge" * Asks SJ

A third one, probably more middle of the road:

* OAI hire someone for a voice

* They train and release the voice

* People talking to a computer that reacts in this way is reminiscent of Her

* "Is this too similar to SJ? Should we ask them?"

* Asks SJ

> He compared Johansson’s case to one brought by the singer Bette Midler against the Ford Motor Co. in the 1980s. Ford asked Midler to use her voice in ads. After she declined, Ford hired an impersonator. A U.S. appellate court ruled in Midler’s favor, indicating her voice was protected against unauthorized use.

Sure, though it's worth noting that they hired a Bette Midler impersonator to sing a cover of a Bette Midler song (edit: after asking and getting a "no").

To be honest, I'm not really that convinced it sounds like her

https://youtu.be/GV01B5kVsC0?t=165

https://youtu.be/D9byh4MAsUQ?t=33


Maybe they didn't need SJ to do anything to train the voicebot, as it was already trained with the material from the movie.

(I'm in the camp that the Sky voice doesn't really sound like Johansson's.)


Here’s the thing though: if I were OpenAI, I’d be more interested in the actor sounding like the voice agent in Her than like Scarlett Johansson.

After all, Scarlett was playing a role in the movie (lending her voice to it), and they wanted to replicate this acted out role.

If the intent alone mattered, OpenAI should be in the clear. More so if they never specially instructed this voice actor to “sound like Scarlett”.

On the other hand, Sama reaching out to Scarlett directly a number of times isn’t a good look. Perhaps they felt that Scarlett had already done it (acted out the voice agent they were trying to bring to life) and she would truly understand what they were going for.

Maybe it was also a bit for marketing and the buzz-worthy story it might generate (“OpenAI worked with ScarJo to create their life-like AI voice. Scarlett was also the voice behind the AI in ‘Her’.”).

However, I’m not a lawyer and the law could very well view this more favourably towards Scarlett.


I don't get it. He was looking for a specific voice. Scarlett Johansson is one of the people who have that kind of voice. She wasn't interested. It's only logical to approach a different person with the same kind of voice.

It's kinda nasty for one person to monopolize work for all actors that have a voice similar to theirs just because she's the most famous of them all.


100% this.

I'm in the exact same camp, but for some reason the HN crowd thinks that Scarlett has a right here because the other voice actor has a similar voice. Apparently there's an [archaic] law called the right of publicity (or something like that) that makes even working with someone with a similar voice illegal. According to that restrictive logic no one can do anything on Earth, as they might be doing/looking/sounding similar to someone else who might get offended, as everyone's offended by literally anything nowadays.

I frankly want to see a lawsuit of OpenAI vs. Scarlett on this one, where OpenAI wins.


"Mitch Glazier, the chief executive of the Recording Industry Association of America, said that Johansson may have a strong case against OpenAI"

Of course he does. The RIAA thinks nearly everything is illegal, and is generally mocked or critiqued on this site when they go after some shared MP3s or whatever.

His opinion is neither authoritative nor informative.


Oh lovely, RIAA mingling in among the good guys...

This may well be the only taint on Johansson's case.


> After she declined, Ford hired an impersonator. A U.S. appellate court ruled in Midler’s favor, indicating her voice was protected against unauthorized use.

That's because it was impersonating her, not merely sounding like her. If they hadn't tried to sell it as her, they would have been fine.


That's the problem here too, right? Sam implies the voice is Scarlett with his references to Her.

To me it all just shows these tech bros are just spoiled little brats with strong incel energy. I 100% expect a scandal in a few months where it turns out Sam has a bunch of VR porn bots generated by AI that just 'happen' to look and sound EXACTLY like some celebs...


> indicating her voice was protected against unauthorized use

But it wasn't her voice, it was the voice of the impersonator. By that logic, the impersonator can never speak without authorization because the impersonator would use Bette Midler's voice.


They wanted people to either think that it was Bette Midler, or someone that sounded very like her, to gain the benefit of association with her. They wanted to use some of Bette's cultural cachet without her permission.


OpenAI hires voice actress. Then it contacts famous actress for a license. Why. The implication is that OpenAI knows there is a potential legal issue.

Then later, when OpenAI promotes voice actress with an apparent reference to famous actress' work, and famous actress complains, OpenAI "pauses" the project. If OpenAI believes there is no issue, then why pause the project.

This behaviour certainly looks like OpenAI thought it needed a license from famous actress. If there is another explanation, OpenAI has not provided it.


They contacted Johansson after the Sky voice was created, they didn’t create it because she declined.

The voice actor isn’t a Johansson imitator, and the voice isn’t an imitation.

The only similarity between the Sky voice and Johansson’s is that it’s a white American female, so by your logic a significant percentage of the US population has a case.


> They contacted Johansson after the Sky voice was created, they didn’t create it because she declined.

Her statement says otherwise:

"Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and Al. He said he felt that my voice would be comforting to people.

https://twitter.com/BobbyAllyn/status/1792679435701014908


The Sky voice was released with the first ChatGPT voices last year in September, so there's no contradiction there unless they asked her on the 1st of September and somehow trained another voice within the few weeks after she said no.

Here's a video that someone posted in October talking to the same Sky voice: https://www.youtube.com/watch?v=SamGnUqaOfU


> and somehow trained another voice within the few weeks after she said no.

Er, that is totally possible? You act like it's not a machine learning system. You train new stuff in hours or days easily, especially if you have good tooling. Imagine saying this of, say, Stable Diffusion image LoRAs: "this X artist LoRA couldn't be based on X because it was somehow trained within the few weeks after X said no!"

All the timing means is that, in good management & startup fashion, because they needed multiple voices, they had a voice pipeline going, and so could have a bunch of female voices in the pipeline for optionality. And if licensing Johansson didn't work out, you have a plan B (and C, and preferably D). This is big business, you don't do things serially or not have backups: "'hope' is not a plan".


They could do it, but my point is that people are using the September rejection date as evidence of them copying her voice afterwards, because it was 7 months before GPT-4o, while not being aware that the voice had already been in the app for those 7 months.


> Her statement says otherwise

In what way?

That in no way contradicts the fact that the Sky voice was created first, although it does seem to suggest a misunderstanding by Johansson that this was to be an exclusive deal to be "the" voice, leading to the incorrect conclusion that the Sky voice was created after she declined, and must therefore be an impersonation (despite sounding nothing like her/Her, as she herself must know better than anyone). Stretch after stretch after stretch. (Being kind.)

In fact the recordings used for training were made in June/July 2023, which is before Johansson was contacted as a possible "also-ran": https://openai.com/index/how-the-voices-for-chatgpt-were-cho...


Imagine this being such a legal minefield that I can't sell my own voice because a celeb sounds a bit like me.


You can sell your voice to whoever you want.

What you can't do is USE that voice in a way that seeks to mislead (by however much) people into believing it is someone else.

I'm really not sure why people can't understand that it is intent that matters.


I don't think OpenAI wants people to think ChatGPT 4 or whatever is Scarlett Johansson, that would not make any sense.


Then what were OpenAI hoping to get out of their association with her? Why go to the effort of getting in contact? Why reference the film Her?


What is the angle here? They tell people that it's Scarlett Johansson responding to them instead of a computer? To what end? I just don't get it. And I think anyone who confuses a computer program for a real person has bigger problems than being potentially defrauded by OpenAI.

And people have been making computers sound like humans for a very long time without anyone suggesting that it's some attempt at fraud.


> What is the angle here? They tell people that it's Scarlett Johansson responding to them instead of a computer? To what end? I just don't get it.

Then don't worry about it. It doesn't matter anyway. Ponder the questions in the comment you replied to instead; the ones you evaded by asking these irrelevant ones.


The angle is that most people understand the value of celebrity endorsement? And that OpenAI is seeking the endorsement that would come along with association with Scarlett Johansson and the movie Her?


They contacted her to use her voice. I mean, the Sky voice doesn't sound like Scarlett Johansson.

> Why reference the film Her?

Because they developed an AI some people are bonding with. Which is the bigger deal, the voice or the AI? You tell me.


There’s a certain group of people on this site that do not want OpenAI, Apple, and others to have done wrong.


A celeb creates a particular voice personality for a role as an AI in a very successful movie. You create an AI, and "co-incidentally" create a strikingly similar voice personality for your AI. Not a legal minefield, you copied the movie, you owe them.

They could have used any accent, any tone, anyone. Literally anyone else. And it would have been fine. But they obviously copied the movie even if they used a different actress.


Yes, they tried to copy the film Her, yet do you think the AI voice character in the film was original? Do no other films with female AI voices predate that film? And would the actor have a claim on the fictional character they played in the film, versus the film’s “owners”?


Obviously there have been lots of female voices for AIs in movies. But not all of them sound the same. OpenAI could have created a new voice personality for their AI by hiring almost any female voice actress available. But they didn't. They chose to copy Her.


Listen to the voices side by side:

https://www.reddit.com/r/ChatGPT/comments/1cwy6wz/vocal_comp...

And here is voice of another actress ( Rashida Jones ):

https://www.youtube.com/watch?v=385414AVZcA

which actress do you think is similar to the openAI sky voice?


Which sounds more like Johansson's voice on Her, the Sky voice, or Emma Watson?

Again, given the vast range of voices (even female voices) available, choosing one that sounds so close to Her, given the subject matter of the film (and the OpenAI leadership's references to the film), this is not coincidence.


Did I claim coincidence? The OpenAI Sky voice has been available among many other voices since last September, and I shared evidence that there are other actors who could claim the voice is more similar to theirs. Yet you think the voice actor from the film Her has a right to prevent the use of the Sky voice now? On what basis? I doubt most people who watched the film Her even know who did the voice acting. And per the evidence I shared above, the claim that it's the same voice, versus another popular actor's, is weak.


You wouldn’t be the one liable. Any company that hired you might be liable if you sound like a famous voice and the more famous individual had already declined using their voice.

The company would also be liable if they used your voice and claimed it was someone more famous.

Ultimately you’re not liable for having a similar voice because you’re not trying to fool people into thinking you’re someone else. It’s the company that hired you who’s doing that.

This is why tribute acts and impressionists are fine…as long as they are clear they’re not the original artist


Because of laws and regulations like these, innovation is getting slower and slower.

I'm afraid the whole world will get regulated like EU someday, crippling innovation to a point that everyone's afraid to break a law that they aren't even aware of, and stop innovating.


Exactly, that would be absolutely nonsensical.


> The only similarity between the voice and Johansson’s is that it’s female, so by your logic half of California has a case.

My understanding was that the voices sound quite similar. I haven't heard the original Sky voice so don't know. Are there any samples online?


Listen to the voices side by side:

https://www.reddit.com/r/ChatGPT/comments/1cwy6wz/vocal_comp...

And here is voice of another actress ( Rashida Jones ):

https://www.youtube.com/watch?v=385414AVZcA

If you click through and listen, please reply and answer these questions: which actress do you think is similar to the OpenAI Sky voice? And what does that tell you about the likely court result for Johansson? And having reached this conclusion yourself, would you now think the other actress, Rashida Jones, is entitled to compensation based on this similarity test?


Thanks! Hadn’t heard it before now. They don’t sound all that similar to me, though perhaps I am a bit detail-oriented on account of my linguistics background.


You might be right, but still if it goes to court, OpenAI knows they have an uphill battle. Any jury is going to love Johansson.

The voice doesn't need to sound the same for OpenAI to lose.


Are you asking me to trust that a video clip posted to the ChatGPT subreddit by u/SWAMPMONK is providing an accurate audio recording of the openAI sky voice?


You can literally go look up the original demo footage on YouTube too, which is what I did (before that comparison was created), and I didn't think it sounded anything like SJ.

Put them side by side... I can hear a similarity, but they're also distinctly different (the OAI voice doesn't have the huskiness of the Her VA, IMO, but they're both Californian women with a bubbly demeanour).



Wait, what's the music artifact at 0:54? What are they even doing!?


If you program a digital voice to sound like someone, it's no different from using an impersonator. A voice actress playing a role in a movie (Her) is 100% that actress's voice too.

People are so weird on this. OpenAI screwed up, they know it, their actions show it, there isn't much to discuss here.


Hmmm. This isn’t voice acting though. I suspect that we’ll find that OpenAI used thousands of Johansson’s voice samples for general training to give the voice a “Her” feel and then found someone with a similar voice for fine tuning but had Johansson said yes, they could then have had her do it instead.

If the records show that they did train Sky with Johansson’s voice samples it will be an interesting case.


The jury will decide on the latter.

At any rate, Altman made clear allusions hinting that they were capable of synthesizing ScarJo's voice as a product feature. The actress responded that she had not consented, and now OpenAI's defense is that they hired a different actress anyway.

...which means they misled everyone about the capabilities of the tech, which is, y'know, even worse.


Exactly. And regardless of the timeline outlined, when discovery happens, there'll be a bunch of internal messages saying they want ScarJo to do the voice or to find someone who matches her closely enough. They went down both paths.

This will settle out of court.


her


I’ll admit that my cynicism is in overdrive but I wonder if OpenAI deliberately provoked this. Or at least didn’t mind it as an outcome.

More and more you see legal action used as a form of publicity (people filing frivolous lawsuits, etc.). A lawsuit like this would help keep OpenAI looking like an underdog startup being crushed by the rich and powerful rather than the billion-dollar entity it actually is.


However strong her case may be, in lawsuits like these the prosecution usually needs to prove *beyond reasonable doubt* that OpenAI copied her voice, and did so intentionally. That seems very tough given the evidence so far. Yes, she might drag the case on for a long time, but I doubt that will cause the slightest dent in OpenAI's business.


> Crimes must be proven beyond a reasonable doubt, whereas civil claims are proven by lower standards of proof, such as the preponderance of the evidence.

[1] https://www.findlaw.com/criminal/criminal-law-basics/the-dif...


This would be a civil case, not a criminal one, so there is no prosecution, and "beyond a reasonable doubt" is not the standard either; Johansson's lawyers would only need to show that the balance of probabilities lies in her favour.


You misunderstand how personality rights work.

Called it in the other thread and calling it in this one, there is no wrongdoing on OpenAI's side.

Looking/sounding like somebody else (even if its famous) is not prosecutable. Scarlett Johansson has nothing in this case, whether people like it or not. That's the reality.


> whether people like it or not. That's the reality.

That is exactly it - people do not like how OpenAI is acting. Whether or not there is legal action to be had is an interesting tangent, but not the actual point - OpenAI's actions are ticking people off. Just because something is legal does not mean it is the right thing to do.


Nobody said looking/sounding like someone else is "prosecutable", and this willfully obtuse reading is getting annoying.

Many people here, including you, seem to be under the impression that a person who sounds like a celebrity can, because they are not that celebrity, do whatever they want with their voice regardless of whether or not they seem to be passing off as or profiting from the persona of that celebrity. This is not the case.

When others point this out many people, again including you, then go "so you're saying the fact that someone sounds like a celebrity means they can't do anything with their voice - how absurd!", and that isn't the case either, and nobody is saying it.

This binary view is what I'm calling obtuse. The intent matters, and that is not clear-cut. There are some things here that seem to point to intent on OpenAI's part to replicate Her. There are other things that seem to point away from this. If this comes to a court case, a judge or jury will have to weigh these things up. It's not straightforward, and there are people far more knowledgeable in these matters than me saying that she could have a strong case here.

People have now said this an absurd number of times and yet you seem to be insisting on this binary view that completely ignores intent. This is why I am calling it willfully obtuse.

If the above are misrepresentations of your argument then please clarify, but both seemed pretty clear from your posts. If instead you take the view that what matters here is whether there was intent to profit from Scarlett Johannson's public persona then we don't disagree. I have no opinions on whether they had intent or not, but I think it very much looks like they did, and whether they did would be a question for a court (alongside many others, such as whether it really does sound like her) if she were to sue, not that there is any indication she will.

Edit: And I should say IANAL of course, and these legal questions are complex and dependent on jurisdiction. California has both a statutory right and a common law one. Both, I think, require intent, but only the common law one would apply in this case as the statutory one explicitly only applies to use of the person's actual voice. (That seems a bit outdated in today's deepfake-ridden world, but given that the common law right protected Midler from the use of an impersonator, perhaps that is considered sufficient.)

https://www.dmlp.org/legal-guide/california-right-publicity-...


> You misunderstand how personality rights work .. Called it in the other thread

One of the great things about HN is you get all kinds of experts from every field imaginable.

> is not prosecutable

Yikes.


Who said it was ‘prosecutable’?


Scarlet Johansson is threatening legal action against OpenAI for this.

Are you not aware of this?


> Scarlet Johansson is threatening legal action against OpenAI for this

Scarlet Johansson cannot prosecute anyone. She can sue them, in civil court, for civil damages. Prosecution is done in connection with crimes. Nobody is alleging any crimes here.


prosecute: to officially accuse someone of committing an illegal act, and to bring a case against that person in a court of law

Source: Cambridge's dictionary (but any other would work as well)


> From Cambridge's (or any other) dictionary

Where did you get this? I'm seeing "to officially accuse someone of committing a crime" [1]. Criminality is essential to the term. (EDIT: Found it. Cambridge Academic Content dictionary. It seems to be a simplified text [2]. I'm surprised they summarised the legal definition that way versus going for the colloquial one.)

You have to go back to the 18th century to find the term used to refer to initiating any legal action [3][4].

[1] https://dictionary.cambridge.org/dictionary/english/prosecut...

[2] https://www.cambridge.org/us/cambridgeenglish/catalog/dictio...

[3] https://verejnazaloba.cz/en/more-about-public-prosecution/hi...

[4] https://www.etymonline.com/word/prosecute


Scroll down on that site, literally. Words have more than one meaning.

Here's what I get from MacOS's dictionary: institute legal proceedings against (a person or organization).

I can also be pedantic and insist that, even under the strict interpretation you are vouching for ...

>Looking/sounding like somebody else (even if its famous) is not prosecutable.

... is a correct argument.


If this is really your hill to die on, go for it. Using "prosecute" to refer to civil litigation hasn't been standard English in any dialect since circa 1850.


Easy peasy, yes/no answer.

From your understanding, is looking/sounding like somebody else (even if its famous) prosecutable or not?


> is looking/sounding like somebody else (even if its famous) prosecutable or not?

No. And if a lawmaker claimed they would like it to be, and then claimed they meant civilly litigible, they’d be labelled dishonest. (Even if it was an honest mistake.)


>Looking/sounding like somebody else (even if its famous) is not prosecutable.

Great, then we agree. :^)


Yeah, I agree too. It's not prosecutable... But it is sue-able.

So, to return to your original point: Did you have one?


I think it’s still common informal usage to prosecute a (moral) case. Maybe more common in the UK where you can bring a literal private prosecution.

Although I think what lawyers say these days is that it’s not colorable.


> it’s still common informal usage to prosecute a (moral) case

Sure, those are other definitions [1], e.g. to prosecute an argument. Within a legal context, however, it is black and white.

> in the UK where you can bring a literal private prosecution

For crimes. One wouldn't say one is prosecuting a defendant for e.g. libel. (Some states have private prosecution [2].)

[1] https://www.merriam-webster.com/dictionary/prosecute

[2] https://en.wikipedia.org/wiki/Private_prosecution#United_Sta...


The third definition listed on your Merriam-Webster link seems to be what's applicable here, and very clearly describes the term as applicable to any legal action.

This is consistent with my understanding of the term as a native English speaker, having experienced the term "prosecute" being used in reference to both criminal and civil cases in all forms of discourse, verbal and written, formal and informal, for decades, and only first encountering the claim that it shouldn't be used for civil cases here in this thread, today.


> very clearly describes the term as applicable to any legal action…only first encountering the claim that it shouldn't be used for civil cases here in this thread, today

Partly why I used that citation. It’s one of the few (adult) dictionaries that acknowledges as much.

I wouldn’t go so far as to say the Webster 3b usage is incorrect—it’s in some dictionaries and was historically unambiguously correct. But it’s non-standard to a high degree, to the extent that Black’s Law Dictionary only contains the criminal variant. (I’ll leave it open whether publicly referring to someone who has only been sued as someone who has been prosecuted, when intended as an attack, qualifies as defamation.)

More to the point of clear communication, I’d put it in a similar category as claiming one’s usage of terrific or silly was intended in its historic sense [1]. (Though I’ll admit my use of “nice” falls in that category.)

All that said, I’m very willing to entertain there being a dialect, probably in America, where the historic use is still common.

[1] https://www.mentalfloss.com/article/84307/7-words-mean-oppos...


I'd regard what you're calling "historical" as being the actual standard usage in conventional speech -- again, I've never encountered the notion that "prosecute" doesn't apply to civil litigation until today, and I've been speaking English for decades, and have had many discussions involving legal cases, conversed with lawyers, signed contracts, etc. throughout my life.

The fact that you're referring to an intra-disciplinary dictionary to make the opposite argument implies that the narrower definition is jargon, and not an accurate representation of the common meaning of the term.


I didn’t know that about the states. Thanks.


Why would that make it more common in the UK? Being able to bring a private prosecution strengthens the distinction: a regular citizen can both sue someone over a civil wrong and prosecute someone for a criminal offense. That makes it even more clearly nonsense to refer to suing someone for slandering you as "prosecuting" them, because you can bring prosecutions and that is not one!


Like I said, a moral case. There are also things like “trial of the facts”


Go away with the “well, actually”.

It’s a civil dispute.


The term "prosecution" applies to civil cases as well as criminal ones.


> Scarlet Johansson is threatening legal action against OpenAI for this.

Not true. SJ's statement seeks info and does not threaten legal action.


All we need is discovery. They’ll settle before that because we all know they are swimming in dirt.


Yes, how dare another woman have a voice that might somewhat sound like Scarlett?


Can’t really fault the voice actor for having a specific voice.

The question is whether or not OAI traded on the similarity to Scarlett Johansson's likeness. The "her" tweet raises suspicions.


Even if they did. How does that matter?

Let's say I want to shoot a movie. I'd love to have Scarlett star in it. But I can't afford her, so I hire some B actress who kind of looks like her. What's wrong with that?

The only way I could see this being wrong is if they then processed the voice of this different person to make it sound more like Scarlett.


> Even if they did. How does that matter?

How the heck could that not matter?!?

Making the voice similar enough that the public will associate it with the recognition and popularity of the movie is obviously capitalizing on the movie, including Johansson's performance. If they did that intentionally, they owe Johansson (and the movie studio) recompense for that. And Altman's tweet rather obviously shows it can at the very least be argued that such intent was present.

> Let's say I want to shoot a movie. I'd love to have Scarlett star in it. But I can't afford her, so I hire some B actress who kind of looks like her. What's wrong with that?

Depends on intent. If that's all you do, then fine. But if you dress your movie up to heavily imply it's a sequel to a movie with Johansson in it, and put the actual name of your "B actress" down in the fine print(1), and then tweet the name of the Johansson movie when yours is released... Do you think movie makers get away with shit like that? Or that they should?

What is it about this that you don't understand? Do you genuinely not see that those are two different things, distinguished by intent? Or did you just not know that intent is often a rather vital part of whether something is legal or not?

___

(1): Or omit it altogether, like this allegedly-existing Sky voice actress is still (last I read, before the weekend) anonymous.


They definitely should. Plenty of movies are based on other movies or are homages to them, and typecasting has existed for as long as cinema has.

I stand by my point: if they did any processing that made the voice sound more like Scarlett, they most definitely should compensate her.


They asked her. She said no.


They should rename the voice:

Sosumi

Come on OpenAI - do it!


Yeah, that would be stupid AF.

Hope you have a better one sometime soon.


Thanks for the contribution. I think challenging the Screen Actors Guild and calling bullshit on "I own the sound of my voice" would be a good thing.

OpenAI already took a stance like this in challenging copyright threats against LLMs.

But maybe you haven't considered it that way. Or you think unions (representing the 1%) are cool! I think they are stupid AF.


Actual Washington Post title:

>OpenAI didn’t copy Scarlett Johansson’s voice for ChatGPT, records show

>>A different actress was hired to provide the voice for ChatGPT’s “Sky,” according to documents and recordings shared with the Washington Post.


Ok, we've reverted the title. Submitted title was "OpenAI and Scarlett Johansson's Voice". Thanks!


You can hear both voices for yourself and tell they are different, but y'all are such NPCs that you just believe the bullcrap the media spoon-feeds you, not the literal sounds in your ears.

New theory: HN is a honeypot for dumb people that Y Combinator studies to figure out how to make money from them.

Previous theory: it was an Alzheimer's-style "fake bus stop" used to round up imposter hackers and keep them contained while the real Hackers did stuff.


This comment and the replies are extremely insightful. Humans are flawed; just because high-IQ Newton invented physics doesn't mean he was good at resisting the FOMO of the South Sea stock. https://pubs.aip.org/physicstoday/article/73/7/30/800801/Isa...

Personally I think it has diluted the hackers from other sites (like me), but your theory sounds much stronger. A lot of new sites and ideas are not pay-to-win, they are buy-to-belong: crypto, 3D printers, gaming forums, PC hardware forums, AI. These communities generate free marketing and product updates to convince you that they're good things to buy.

I found HN great for some things, but you did click this link too. This is a celebrity gossip thread and you joined it. I found John Walker's site from here; I didn't know about it before. https://www.fourmilab.ch/hackdiet/

'Real hackers' are live on git, maybe Twitch. Chill out and listen to what your general peer has to say here, and imagine non-hackers discussing it. Nobody outside here cares about this drama except Twitter bots. Check out Google Trends. https://trends.google.com/trends/trendingsearches/daily?geo=...


I'm curious why, if that's what you think about this place, you keep coming here?


Frankly speaking, both sound plausible!


I'm losing faith in Hacker News commenters too. Such an incredible display of lack of critical thinking ("oh, he tweeted Her? Must be because he's trolling ScarJo, not because the movie is literally like the real life product") and bandwagoning over this piece of news, all because HN wants to hate OpenAI.

Not to mention, on the AI Paint thread, there was this heavily upvoted guy saying its servers are paid for by the stock market and tech bubble like some kind of conspiracy theorist, completely ignoring that it's run locally.

I don't know why I keep coming here. I guess it's because I'm addicted. At least on Reddit you could leave subreddits and block people when they get too incomprehensible from your own perspective.


For what it's worth, sometimes I get very frustrated with HN, but I think the reason I keep coming back is because it provides me with a lot of context. Eg, I pick up a lot of information about techniques people have applied to certain problems, which I recall when I recognize a similar problem, and that gives me a place to start.

But I have had to cut myself off from a given community because my relationship became unhealthy with it many times. I get that.


Thank you.


“Can you generate me some records that indicate we didn’t copy…” /s


These white swans prove there are no black swans! /s


I am assuming they deliberately wanted the voice to sound like Her (the movie) for marketing, so they copied the voice, then tried to get Johansson's permission once the legal department raised objections. They went through with it anyway when they did not get permission. Altman has shown time and again that he will ignore rules and laws if they hamper his goals. This is what I think happened; I am not saying it did, and I could be wrong.


To everyone eagerly citing the article: google "who owns the Washington Post".


Maybe we should wait.

Everything OpenAI and Altman related seems to have multiple layers like an onion.

If we wait long enough we might get documents which show they used Scarlett Johansson's voice but hired an actress to claim it's hers.

One month later it might be the opposite again.

In times of AI fakes, real evidence is hard to find.


I don’t think OpenAI has a chance to win this in open court.

They are very likely to settle out of court. Investors get a bit anxious with pending litigation.

But I honestly hope Johansson does not. She certainly has the runway to take it all the way. Make them look like fools in open court. Show the people their real colors.


Why do you feel OpenAI doesn't have a chance to win in court given the information presented in this WaPo article? It seems fairly conclusive to me.


Copying by hiring a sound-alike/impersonator is an implementation detail. They knew full well that they were trying to copy her voice. Then they asked. And they got a no. And then they did it anyway.

I don't see how the implementation details matter at all.


It's their natural voice.

Are they not allowed to hire a person that sounds like... any famous person?

Aren't others in this thread claiming how similar voices can be, and that they're not unique?


The VA has not been identified, but I bet that she has a showreel in which you can hear at least ten different natural voices that are not at all like SJ's performance in 'her'. A sad read for the local undertaker. A maniacal shouty ad for the monster truck rally. A motherly read for baby formula. Some fun accents. A dramatic read.


Is it though? Or did the voice actor try to sound more like Scarlett?


I think the optics of this aren't great. You ask someone for their voice, they say no, and then (maybe coincidentally) you release a voice designed to sound very similar. Legally this may well be fine, but come on, didn't anyone think this might make them look bad?


Sure, but to me it looks bad from the other perspective too if this is made into a problem. A person's natural voice is their voice and some celebrity should NOT be able to dictate how it is used.


[flagged]


what?


Read up about "Right to publicity" laws or just read the article itself, which explains she may have a strong case against OpenAI here.


Hollywood elites know they are cooked. It’s only a matter of time before corporations like OpenAI and Microsoft make celebrities and actors as valuable and useful as the next rank and file employee.

I personally want an AI Taylor Swift that can sing to me whatever song I want, and I would like it to be cheap and owned by a corporation.


"She worked closely with a film director hired by OpenAI to help develop the technology’s personality."

So you are trying to tell me there was zero chance that this film director was aware of the movie "Her" and was influenced by it?

Why doesn't the voice sound like the Enterprise's computer from TNG? I don't mean the sound, I mean the cadence: more professional, and not like a sex-line operator.


What's your point here? Did they or did they not use another voice actor who gave them permission to use their voice? The voice is similar to Johansson's, sure, but it's not her voice and it wasn't an AI generated version of her voice. I'm failing to see what OpenAI did wrong here.


Scarlett Johansson will forever be associated with this voice named "Sky", the official voice of "Skynet" that will wipe us all off the face of the earth! The unfolding of the movie Her would be a walk in the park compared to armies of Terminators.

It sounds too close to Scarlett for me to believe this was not the goal, whether they hired somebody else or not, unless they can prove beyond a doubt that no audio post-processing was done. Just listen to famous musicians performing acoustic or unprocessed versions of their songs to see how much a voice or sound can be crafted in production.

[Edit]

  1. A movie that features an AI voice of a female voiced by Scarlett Johansson.

  2. A real-life AI company, OpenAI, is trying to put a distinct voice to their AI product vs. a canned voice.

  3. Sam Altman, CEO of OpenAI, contacting SJ to ask her to be the AI voice.

  4. SJ refuses.

  5. CEO of OpenAI tweets "her" the title of the movie in #1 above.
Audio processing with DSP methods and current audio-engineering craft, or training an AI to make it sound like SJ, would be the thing to prove. Get the raw audio of the actress and the finished sound, compare how they were steered toward the final product, and compare against a spectrogram of SJ saying the same words, if you can get them.
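
For what it's worth, here is a rough sketch of the kind of spectral comparison being described, assuming librosa is available and that you have clips of the same phrase from each speaker (the .wav filenames are made-up placeholders). It extracts MFCCs as a crude timbre fingerprint and aligns the two clips with dynamic time warping; a lower score only means the spectral envelopes track each other more closely, and says nothing about legal "likeness":

  import librosa

  def mfcc_features(path, sr=16000, n_mfcc=20):
      # Load audio at a fixed sample rate and return per-frame MFCCs,
      # a rough fingerprint of the spectral envelope (timbre).
      y, sr = librosa.load(path, sr=sr)
      return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)

  def dtw_score(path_a, path_b):
      # Align the two recordings with dynamic time warping and return
      # the mean per-step cosine cost along the warping path.
      a, b = mfcc_features(path_a), mfcc_features(path_b)
      D, wp = librosa.sequence.dtw(X=a, Y=b, metric="cosine")
      return D[-1, -1] / len(wp)

  print(dtw_score("sky_sample.wav", "her_sample.wav"))
  print(dtw_score("sky_sample.wav", "rashida_sample.wav"))

An actual forensic comparison would presumably use speaker-embedding models and controlled recordings rather than a toy script like this, but it illustrates the "same words, compare the spectra" idea.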

My common sense and the above facts say OpenAI did whatever they did to get close enough to SJ's voice. SA pursued it a few times, no? It definitely sounds enough like her to me.



