Should Scarlett Johansson Sue OpenAI? A Case Review

Written by Rosslyn Elliott - Pub. May 29, 2024 / Updated May 30, 2024


On May 20, 2024, Scarlett Johansson released a statement to major media outlets objecting to OpenAI’s use of a woman’s voice strikingly similar to her own for its chat personality known as “Sky.”

Johansson’s statement (given in full below) alleged that Sam Altman, the CEO and co-founder of OpenAI, contacted her in September 2023 to ask her to perform the voice for Sky. Johansson declined the offer at that time.

Johansson Contacts Lawyers, Issues Statement on Vocal Similarity

When the Sky chat voice debuted online, Johansson’s friends and family contacted her to say that the voice of the artificial intelligence program sounded remarkably like her own. In particular, Sky’s voice resembled the voice Johansson used to play Samantha, an AI program in the movie Her.

The voice is warm and flirty, with a distinctive upbeat inflection and occasional “vocal fry” (a term for a gravelly sound speakers get when using their voices at the bottom of their range).

Here is a sample of Johansson’s voice in the film Her:

[Embedded audio clip: Johansson as Samantha in Her]

And here for comparison is the voice used by OpenAI’s Sky:

[Embedded video from OpenAI: demo of the Sky voice conversing with a human]

Sam Altman also tweeted the single word “her” at the time of the release of the Sky product, an apparent reference to the film.

Johansson states that Altman’s initial recruiting pitch made the case that the use of her voice would “help consumers feel comfortable with the seismic shift concerning humans and AI.” The pitch made it clear to Johansson that Altman wanted her specific voice for the product.

Sam Altman Denies Deliberate Imitation of Johansson’s Voice

In response, Sam Altman issued a statement on May 20: “The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers. We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson.”

Altman apologized that OpenAI did not “communicate better” and said that the company would “pause” use of the Sky voice.

AI, Human Voices, and the Law

Whether Johansson would have a strong legal case against OpenAI could have important implications for the use of AI in the future. Her situation is related to controversies about deepfakes, though the fact that the Sky voice was recorded by a different actress means it is not technically a deepfake.

An audio deepfake is an actual AI-created replica of a person’s voice. As Lauren Leffer wrote in Scientific American earlier this year, AI audio deepfakes are now very hard to distinguish from the original voices they imitate.

Many states have enacted laws against deepfakes, though the federal government has yet to reach a consensus on the subject. (There is no existing federal ban even on deepfake revenge porn, one of the most damaging uses of the technology.)

 

[Image: a digital female face beside a human female face. Caption: Deepfakes cause confusion]

Deepfake Voices vs. Human Imitations

But what is the legal situation if a recognizable person’s voice is convincingly replicated by another human, not by a computer?

As Brian Fung recently pointed out on CNN.com, existing California law protects people from having their voices used to market or sell products without permission. That protection has held up in California courts even when the voice in question merely sounds like the person and is not that person’s actual voice.

Fung cites the suits Bette Midler brought against Ford Motor Co. and Tom Waits brought against Frito-Lay. Both singers won after “sound-alike” singing voices imitating theirs were used in commercials for these large companies.

Will Right-of-Publicity Laws Set a Precedent Against AI Deepfakes?

Even more important than Johansson’s individual case is the larger question of how and when the unauthorized use of people’s voices and images will be regulated.

Under current law, people can sue other people or corporations for defamation (libel if written, slander if spoken) over false claims about a real person. For example, if a corporation claims that a celebrity said or did something she did not, that celebrity may win damages for any injury that results from the falsehood.

28 U.S. Code Section 4101 defines defamation as false speech that causes “damage to reputation or emotional distress” or presents “any person in a false light.”

Yet, against all common sense, there is no federal legal penalty for creating a voice or image that gives the impression that a real person said or did something they never actually did.

In other words, anyone with access to AI tools can create a copy of someone’s voice that says false things that harm that person’s reputation. And there is currently no legal penalty for that kind of AI-generated defamation.

 

[Image: a cyborg with sound waves coming from its mouth. Caption: Who owns your voice?]

Do We Own Our Own Voices and Images?

California courts have already determined that celebrities have some rights to control the use of their voices and images through right-of-publicity laws. But mostly, those laws restrict the unauthorized use of recognizable voices for marketing and sales.

With AI, the question will be whether people have the rights to their own images and voices in all situations.

For example, if an AI user fakes a celebrity voice without using it to market a product, is that acceptable fakery? What if the celebrity voice deepfake implies that the celebrity has made controversial statements that would be damaging to that celebrity’s reputation and livelihood?

The law is just as crucial for ordinary people who are not celebrities. If private citizens do not own their own voices and images, there will be endless ripple effects from constant deception and fraud.

Crimes and Fraud Enabled by Deepfake Audio

Deepfake audio already enables millions of dollars in thefts from corporations around the world. Deepfake voice clones convince people to pay ransoms for supposedly kidnapped relatives who are actually safe and sound, despite the terrifying things their “voices” say over the phone.

And theft and extortion are only the beginning of the manipulation that will result from unregulated deepfakes.

In Asia and Europe, deepfakes have been all over the media for months as political activists attempt to influence elections. Cyber experts have been warning that deepfakes will cause widespread confusion in the U.S. electoral system.

Laws to Prevent Exploitation of Voices and Images

More states are taking legislative action to prevent the unauthorized use of real people’s images and voices.

The Johansson case may influence legislation even though it apparently involves human imitation rather than a deepfake.

The fact that an AI company is embroiled in a high-profile right-of-publicity scandal may help push along the federal government’s painfully slow lawmaking. Thus far, Congress has continued to reject or delay legislation that would protect people from deepfakes in all walks of life. But as more ethical problems with AI surface, the U.S. government may be pressed to take action.

 

[Image: a digital gavel in front of a wall of programming code. Caption: Federal law is too slow for technology]

 

Full statement by Scarlett Johansson on OpenAI and unauthorized voice imitation

“Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and A.I. He said he felt that my voice would be comforting to people. After much consideration and for personal reasons, I declined the offer. Nine months later, my friends, family and the general public all noted how much the newest system named ‘Sky’ sounded like me.

“When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference. Mr. Altman even insinuated that the similarity was intentional, tweeting a single word, ‘her’ — a reference to the film in which I voiced a chat system, Samantha, who forms an intimate relationship with a human.

“Two days before the ChatGPT 4.0 demo was released, Mr. Altman contacted my agent, asking me to reconsider. Before we could connect, the system was out there. As a result of their actions, I was forced to hire legal counsel, who wrote two letters to Mr. Altman and OpenAI, setting out what they had done and asking them to detail the exact process by which they created the ‘Sky’ voice. Consequently, OpenAI reluctantly agreed to take down the ‘Sky’ voice.

“In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity. I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.”

 

 

 

 


About the author

Rosslyn Elliott

Rosslyn Elliott has over a decade of experience as a writer, editor, and in-house journalist. She earned a B.A. in English from Yale University and has written professionally in many fields including technology and IT. She has won kudos for her work helping tech startups establish their brands. Having lived all over the USA, Rosslyn has first-hand knowledge of the strengths and quirks of top internet service providers. She now writes on all things internet, including Wi-Fi technology, fiber infrastructure, satellite internet, and the digital divide. As a TV fan, she also enjoys reviewing channel choices and cool gadgets for satellite TV and streaming services. Her personal experience as a researcher, career changer, and remote worker inspires her to guide others to their own online opportunities. After work, she likes to kick back with a good craft beer and speculate about A.I. with friends.
