AI Voice Cloning Is Plaguing Celebrities Like Tom Hanks

Tom Hanks warned his fans on Instagram about drug ads that are using AI versions of his voice
Tom Hanks attends the "Asteroid City" New York Premiere on June 13, 2023. Dia Dipasupil/Getty Images
Juliette Fairley

Actor Tom Hanks’s voice is allegedly being used fraudulently to promote pharmaceuticals on social media and other tech platforms, and legal experts say there’s not much he or any other celebrity can do about it.

That’s because there aren’t sufficient federal and state laws regulating the use of AI to replicate the voices or likenesses of public figures, according to GPTZero CEO Edward Tian.

“Laws need to catch up to AI use in the US and globally,” Tian told The Epoch Times. “People have been able to create AI-generated content of celebrities without facing legal action.”

GPTZero is a browser extension that’s designed to detect content written by artificial intelligence.

The box-office movie star warned his fans on social media about the AI cloning this week.

“There are multiple ads over the internet falsely using my name, likeness, and voice promoting miracle cures and wonder drugs,” Hanks alleged on Instagram on Sept. 1. “These ads have been created without my consent, fraudulently and through AI.”

In the statement, Hanks said he has Type 2 diabetes but added that he works only with his board-certified doctor regarding treatment.

He also stopped short of naming the companies. Last October, also on Instagram, Hanks called out an unnamed dental plan that he accused of using an AI-generated image of him.

“I have nothing to do with these posts or the products and treatments or the spokespeople touting these cures,” Hanks wrote in the recent post. “Do not be fooled. Do not be swindled. Do not lose your hard earned money.”

The widespread availability of both commercial and open-source AI tools is part of the problem, according to Pindrop CEO and co-founder Vijay Balasubramaniyan.

Pindrop is an Atlanta-based software company that specializes in voice security and fraud detection.

“Tackling this challenge requires vigilant consumers, better social media oversight to manage misleading content and risky links, improved AI control mechanisms from commercial AI-generation tools, and regulations that increase the cost for fraudsters,” Balasubramaniyan told The Epoch Times.

Hanks isn’t alone. “Heart Like a Truck” country singer and “Yellowstone” star Lainey Wilson also claims to be a victim of AI voice cloning.

The recording artist testified before the House Judiciary Subcommittee on Courts, Intellectual Property, and the Internet on Feb. 2 in support of the No AI Fraud Act.

“It’s not just artists who need protecting,” she said. “Fans need it too. It’s needed for high school girls who have experienced life-altering deep fake porn using their faces and for elderly citizens convinced to hand over their life savings by a vocal clone of their grandchild in trouble. AI increasingly affects every single one of us.”

If approved, the proposed No AI Fraud Act would create legal mechanisms to prevent AI platforms from using Americans’ likenesses and voices without authorization.

Lainey Wilson attends the 2024 Billboard Women in Music Awards in Inglewood, Calif., on March 6, 2024. Michael Tran/AFP via Getty Images

State laws are also needed to protect celebrities and the average consumer, according to attorney Mark Hirsch, who argues policymakers must move quickly to stay ahead of AI as it improves.

“The worst thing that could happen if no laws are passed is the exploitation of celebs’ voices and images for scams, fraud, or false endorsements will continue and there will be a loss of privacy for famous and regular people,” Hirsch told The Epoch Times. “Privacy protections should be made stronger to cover situations unique to AI.”

Hirsch is a personal injury lawyer with the firm Templer & Hirsch in Aventura, Fla.

The unauthorized use of a celebrity’s AI-cloned voice or image can reduce public trust and damage their brand and reputation, according to Bitmind CEO Ken Miyachi.

“Celebrities invest years building their brand, with teams working tirelessly to craft their public image,” Miyachi told The Epoch Times. “Deep fakes and AI voice cloning can rapidly erode this hard-earned reputation, potentially causing severe damage to their credibility and career.”

Miyachi wants celebrities and the general public to demand that social media platforms implement stricter policies and faster removal of unauthorized voice clones.

“The more videos and audio clips someone posts, the more material there is for these AI models to train on,” he said. “The more content they have, the better the cloning gets. That’s why it’s so easy for celebrities’ deepfakes to sound almost identical to the real thing.”

Juliette Fairley
Freelance reporter
Juliette Fairley is a freelance reporter for The Epoch Times and a graduate of Columbia University’s Graduate School of Journalism. Born in Chateauroux, France, and raised outside of Lackland Air Force Base in Texas, Juliette is a well-adjusted military brat. She has written for many publications across the country. Send Juliette story ideas at [email protected]