The FBI is warning the public that criminals are harvesting photos and videos from social media to create deepfakes used in sextortion schemes.
“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency said in the alert. “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
The agency noted that it has observed an increase in the number of victims reporting sextortion as of April this year, with fake media being created using “content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats.”
According to the FBI, malicious actors typically make one of two demands: they want victims either to pay money or send gift cards in exchange for the fake images or videos not being shared with their family members or social media friends, or to provide real sexually themed images or videos of themselves.
As a result, the FBI urges the public to “exercise caution” when posting photos and videos of themselves online, including on social media and dating apps.
“Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity,” the agency said.
“Advancements in content creation technology and accessible personal images online present new opportunities for malicious actors to find and target victims,” the agency added. “This leaves them vulnerable to embarrassment, harassment, extortion, financial loss, or continued long-term re-victimization.”
The FBI shared a list of recommendations, including asking parents to monitor their children’s online activities and run frequent online searches of their children’s information to know what is publicly available.
“Consider using reverse image search engines to locate any photos or videos that have circulated on the internet without your knowledge,” one recommendation reads.
Warnings
In recent years, U.S. authorities have been issuing warnings about deepfake technology. The Federal Trade Commission (FTC), for example, has warned that scammers can use AI to clone voices and impersonate family members over the phone.

“A scammer could use AI to clone the voice of your loved one. All he needs is a short audio clip of your family member’s voice—which he could get from content posted online—and a voice-cloning program,” the commission wrote. “When the scammer calls you, he’ll sound just like your loved one.”
In such a situation, the FTC asks people to hang up, particularly if the caller wants to be paid via wire transfer, cryptocurrency, or a gift card.
“Don’t trust the voice. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends,” the commission said.
Legislation
In May, Rep. Joe Morelle (D-N.Y.) introduced a bill that would make sharing sexualized, non-consensual deepfakes illegal.

“As artificial intelligence continues to evolve and permeate our society, it’s critical that we take proactive steps to combat the spread of disinformation and protect individuals from compromising situations online,” Morelle said. “I’m proud to have introduced this legislation that takes common-sense steps to protect their privacy, and I look forward to working with my colleagues to pass it into law.”
Fran Drescher, president of SAG-AFTRA, applauded the legislation: “Deepfakes are violations, objectification and exploitation, and must be made illegal and punishable by law. This bill is a powerful step to ensure that this technology is not used to cause harm.”