
I am guessing this is only because they first tried to hire the originals before hiring sound-alikes... otherwise, would it mean that if your voice sounds similar to someone else, you can't do voice work?



Good voice actors can do a whole range of voices, including imitating many different people. The cases where someone got in trouble are the ones involving misrepresentation. If it goes to court, there's discovery, and if the OpenAI people gave the voice actor specific instructions to imitate Scarlett Johansson, after having denied doing so, there could be big trouble. We don't know that, but it looks likely given how they first approached her and how they seemed to be going for something like the "Her" film.


> would it mean that if your voice sounds similar to someone else, you can't do voice work?

Maybe only when the director's instructions are "I want you to sound like XYZ".


Exception: parody is covered under the First Amendment.


>Wheel of Fortune hostess Vanna White had established herself as a TV personality, and consequently appeared as a spokesperson for advertisers. Samsung produced a television commercial advertising its VCRs, showing a robot wearing a dress and with other similarities to White standing beside a Wheel of Fortune game board. Samsung, in their own internal documents, called this the "Vanna White ad". White sued Samsung for violations of California Civil Code section 3344, California common law right of publicity, and the federal Lanham Act. The United States District Court for the Southern District of California granted summary judgment against White on all counts, and White appealed.

>The Ninth Circuit reversed the District Court, finding that White had a cause of action based on the value of her image, and that Samsung had appropriated this image. Samsung's assertion that this was a parody was found to be unavailing, as the intent of the ad was not to make fun of White's characteristics, but to sell VCRs.

https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_America,_Inc.

Maybe it depends on which court handles the case, but OpenAI's core intent isn't parody; it's to use someone's likeness as a way to make money.

(I am not a lawyer)


Surely you can do this if you're playing a character, though, such as an actor on SNL impersonating Trump or Obama?


Yes, parody's fine when it's clearly parody. But if you try to pretend that Trump or Obama (rather than an impersonator) is endorsing a product, you're in trouble.


But OpenAI has only ever said that the ChatGPT voice has nothing to do with ScarJo.


What you say after the fact doesn't really matter...


It really comes down to whether the voice was intended to sound like the original. Reaching out to the originals first implies intent, which makes the case easier.

It would be harder to make a case if they had simply hired someone who sounds similar, but if they did that with the intention of sounding like the original, that's still impersonation; it's just harder to prove.

If they just happened to hire someone who sounded like the original, then that's fair game IMO.

IANAL



