FCC Outlaws AI-Generated Voices in Robocalls to Combat Impersonation Scams, Election Interference

The Federal Communications Commission’s hearing room in Washington on Dec. 14, 2017. Brendan Smialowski/AFP via Getty Images
Wim De Gent

The Federal Communications Commission on Thursday outlawed robocalls that contain voices generated by artificial intelligence (AI)—a decision that sends a clear message to scammers that exploiting such technology to extort people and mislead voters will not be tolerated.

The unanimous ruling targets robocalls made with AI voice-cloning tools under the Telephone Consumer Protection Act, a 1991 law restricting junk calls that use artificial and prerecorded voice messages.

The announcement comes as New Hampshire authorities are advancing their investigation into AI-generated robocalls that mimicked President Joe Biden’s voice to discourage people from voting in the state’s first-in-the-nation primary last month.

Effective immediately, the regulation empowers the FCC to fine companies that use AI-generated voices in their calls, or to block the service providers that carry them. It also opens the door for call recipients to file lawsuits, and gives state attorneys general a new mechanism to crack down on violators, according to the FCC.

As AI-generated voice-cloning and image-creation tools have become more accessible, scammers have latched on to the new programs, abusing the technology for a range of nefarious ends, FCC Chairwoman Jessica Rosenworcel said in a statement.

“It seems like something from the far-off future, but this threat is already here,” she said. “Already we see this happening with Tom Hanks hawking dental plans online, a vile video featuring Taylor Swift, and calls from candidates for political office that are designed to confuse us about where and when to vote.”

Some scammers have even preyed on grandparents by mimicking their grandchildren’s voices and begging for money.

“All of us could be on the receiving end of these faked calls, so that’s why we felt the time to act was now,” the FCC chairwoman told the Associated Press.

The Federal Communications Commission’s then-Commissioner Jessica Rosenworcel testifies before the House Energy and Commerce Committee's Communications and Technology Subcommittee on Capitol Hill in Washington on Dec. 5, 2019. Chip Somodevilla/Getty Images

Under the Telephone Consumer Protection Act, telemarketers generally cannot use automated dialers or artificial or prerecorded voice messages to call cellphones or landlines without prior written consent from the call recipient.

The new ruling classifies AI-generated voices and impersonations as “artificial,” making them subject to the same act’s restrictions, the FCC said.

Steep fines will be imposed on those who break the law, with top penalties of more than $23,000 per call. The law also gives call recipients the right to take legal action, and to potentially recover up to $1,500 in damages for each unwanted call.

A bipartisan group of 26 state attorneys general responded to a November FCC inquiry concerning AI-generated voice scams, urging the agency to move forward with a ruling. In addition, the agency has a memorandum of understanding with 48 state attorneys general to collaborate in the fight against illegal robocalls, and it is also looking to implement AI-based tools to help detect scammers.

Election Disruption

Experts familiar with artificial intelligence applauded the FCC’s decision, but some warned that voters should not let down their guard.

“The true dark hats tend to disregard the stakes and they know what they’re doing is unlawful,” Josh Lawson, director of AI and democracy at the Aspen Institute, told the Associated Press.

“We have to understand that bad actors are going to continue to rattle the cages and push the limits,” he said, warning voters to prepare for personalized spam targeting them by phone, text message, and social media.

The robocalls that sought to influence New Hampshire’s Jan. 23 primary election used an AI-generated impersonation of Joe Biden’s voice, employed his oft-used phrase “What a bunch of malarkey,” and falsely suggested that voting in the primary would preclude voters from casting a ballot in November.

Those calls reached thousands of state residents, mostly registered Democrats.

“New Hampshire had a taste of how AI can be used inappropriately in the election process,” said New Hampshire Secretary of State David Scanlan. “It is certainly appropriate to try and get our arms around the use and the enforcement so that we’re not misleading the voting population in a way that could harm our elections.”

On Tuesday, New Hampshire Attorney General John Formella said that investigators had identified the calls as originating from the Texas-based Life Corp and its owner, Walter Monk. The calls were allegedly transmitted by another Texas-based company called Lingo Telecom. Both companies have been subpoenaed.

A task force of attorneys general in all 50 states and Washington, D.C., sent a letter to Life Corp warning it to stop its illegal calls immediately.

In a statement, Lingo Telecom said it “acted immediately” to help with the investigation and quickly identified and suspended Life Corp when contacted by the task force. The company said it “had no involvement whatsoever in the production of the call content.”

But according to the FCC, both companies have been investigated for illegal robocall activities in the past. In 2003, the FCC issued a citation to Life Corp for sending out illegal prerecorded and unsolicited advertisements. More recently, the AG task force has accused Lingo Telecom of being the gateway provider for 61 suspected illegal calls from overseas. The Federal Trade Commission issued a cease-and-desist order against the company under its prior corporate name, Matrix Telecom, in 2022.

The incident is not the first time robocallers have sought to mislead voters. In 2020, two conservative provocateurs made over a thousand calls to predominantly Black areas falsely warning people that voting by mail could increase their risk of arrest, debt collection, and forced vaccination. The FCC fined the men $5.1 million for violating the Telephone Consumer Protection Act.

The Associated Press contributed to this article.
From NTD News.