
That's true. As I suggested, this case may well be open and shut. But what if it were Google's or Meta's voice that sounded exactly like Sky, i.e. without any history of failed negotiations? The amount of "likeness" would technically be identical.

Are companies better off not even trying to negotiate to begin with?


Not sure legally. I chose the words “perceived similarity” intentionally to encompass scenarios in which the similarity is coincidental but widely recognized. Even in that case, I believe the original person should be entitled to a say.


It'd be fine not to negotiate if you weren't going to use a voice that sounded famous. If I were offering something that let you make any kind of voice you want, I would definitely not market any voice that sounded familiar in any way. Let the users do that (which would happen almost immediately after launch). I would use a generic employee in the example, or the CEO, or I'd go get the most famous person I could afford who would play ball. I would then make sure the marketing materials showed the person I was cloning and demonstrated just how good my tool was at getting a voice match.

What I wouldn't do is use anything that remotely sounds famous. And I would definitely not use someone who had said "no thanks" beforehand. And I would under no circumstances send emails or messages suggesting staff create a voice that sounds like someone famous. Then, and only then, would I feel safe marketing a fake voice.


Sounds judicious. You probably wouldn't get sued, and would prevail if sued. However, the question of how much human voice space Scarlett can lay claim to remains unsettled. Your example suggests that it might be quite a bit, if law and precedent cause people to take the CYA route.

Consider the hypothetical: EvilAI, Inc. would secretly like to piggyback on the success of Her. They hire Nancy Schmo for their training samples. Nancy just happens to sound mostly like Scarlett.

No previous negotiations, no evidence of intentions. Just a "coincidental" voice doppelganger.

Does Scarlett own her own voice more than Nancy owns hers?

Put another way: if you happen to look like Elvis, you're not impersonating him unless you also wear a wig and jumpsuit. And the human look-space is arguably much bigger than the voice-space.


> However, the question of how much human voice space Scarlett can lay claim to remains unsettled

I don't think it's that unsettled, at least not legally. There seems to be precedent for this sort of thing (cf. cases involving Bette Midler or Tom Waits).

I think the hypothetical you create is more or less the same situation as what we have now. The difference is that there maybe isn't a paper trail for Johansson to use in a suit against EvilAI, whereas she'd have OpenAI dead to rights, given their communication history and Altman's moronic "Her" tweet.

> Does Scarlett own her own voice more than Nancy owns hers?

Legally, yes, I believe she does.


There are other ways public figures are treated differently in US courts. It's much more difficult for them to prove libel or slander, for instance: they have to prove actual malice and intent, whereas a private citizen just has to prove negligence. I imagine "owning" their likeness in a broader sense is the flip side of that coin.


I know toying with these edge cases is the “curious” part of HN discussions, but I can’t help but think of this xkcd: https://xkcd.com/1494/


HN discussions, grad school case studies, and Supreme Court cases alike. Bad cases make bad laws, edge cases make extensive appeals.



