But this is just using a voice. It might even be their natural voice. I don’t think there’s fraud, because it wasn’t presented as Scarlett’s voice. If it also wasn’t presented as not being her voice, then maybe those other two would apply, though is allowing a service to use your voice the same as endorsing it? Is sounding like someone enough to count as impersonating them?
This situation lands in a grey area where I can’t endorse or condemn it. I mean, it would have been smarter to just use a different voice: find a celebrity who would sign on, or use an unrecognisable voice. Ethical or not, legal or not, it was stupid.
I read that Scarlett’s family & friends couldn’t tell it apart from her actual voice.
I’d say that “Open AI”, or whatever they’re called, trained it specifically and only on her voice.
The seems-like-a-narcissistic-machiavellian-sociopath CEO, what’s-his-face, tried to get her to agree to this,
she wouldn’t agree,
he tweeted “her” when releasing the update (after Scarlett’s movie),
she lawyered up,
he backed down…
I’d say it’s a clear case of identity-theft-for-profit of a celebrity by a consistently narcissistic, machiavellian sociopath who’s leaving corpses of “integrity” all over the place.
There’s law (the right of publicity) that protects celebrities from unauthorised use of their likeness, and rightly so:
it’s the “coin” their career is made of, right?
It was explicitly represented as her voice when he tweeted “Her” in relation to the product, referencing a movie in which she voiced the AI. It’s not a legal grey area in the US. He sank his own ship here.
I’m mostly going off of this article and a few others I’ve read. This article notes:
Celebrities have previously won cases over similar-sounding voices in commercials. In 1988, Bette Midler sued Ford for hiring one of her backup singers for an ad and instructing the singer to “sound as much as possible like the Bette Midler record.” Midler had refused to be in the commercial. That same year, Tom Waits sued Frito-Lay for voice misappropriation after the company’s ad agency got someone to imitate Waits for a parody of his song in a Doritos commercial. Both cases, filed in California courts, were decided in the celebrities’ favor. The wins by Midler and Waits “have clear implications for AI voice clones,” says Christian Mammen, a partner at Womble Bond Dickinson who specializes in intellectual property law.
There’s some more in there:
To win in these cases, celebrities generally have to prove that their voice or other identifying features are unregistered trademarks and that, by imitating them, consumers could connect them to the product being sold, even if they’re not involved. That means identifying what is “distinctive” about her voice — something that may be easier for a celebrity who played an AI assistant in an Oscar-winning movie.
Taken together with the fact that the CEO, when announcing the product, made a direct reference to the movie in which she voiced an AI assistant, I think that’s enough that a normal person would “connect them to the product being sold.”
If they didn’t need to license it, why did they repeatedly try?