Facebook Correctly Followed Rules in Leaving Edited Video of Biden Online: Board

A user’s post featuring President Joe Biden was examined by an oversight board.
Meta Founder and CEO Mark Zuckerberg arrives to testify before the Senate Judiciary Committee in Washington on Jan. 31, 2024. (Madalina Vasiliu/The Epoch Times)
Zachary Stieber

Facebook’s decision not to take down an edited clip of President Joe Biden with one of his granddaughters was proper under the company’s policies, a board that adjudicates disputes said on Feb. 5.

“The Board agrees with Meta that the content does not violate Meta’s Manipulated Media policy as currently formulated,” the Oversight Board said in a statement.

In May 2023, an unidentified user of Facebook, which is owned by Meta, posted a seven-second video featuring President Biden with Natalie Biden, one of his granddaughters.

The clip showed President Biden and Ms. Biden in Delaware after they voted in the 2022 elections.

The original video showed Ms. Biden placing an “I Voted” sticker on President Biden before he put a sticker on her chest, where she had indicated he should place it. President Biden then kissed his granddaughter on the cheek.

The edited video showed the moment President Biden put the sticker on Ms. Biden, and the kiss, without showing the moments before and after, according to Meta and its board. Neither has provided the post in question.

The user who posted the video added a soundtrack and a caption that accused President Biden of being a “sick pedophile” and described his supporters as “mentally unwell.”

Another user complained, but the company opted not to take action. That user then appealed the decision to the Oversight Board, which said in October 2023 that it was taking the case.

After reviewing Facebook’s policies and the post, the board said the decision to keep the post up was appropriate.

That’s because Facebook’s policy on manipulated media applies only to videos made with artificial intelligence (AI) that show people saying things they did not say.

“Since the video in this post was not altered using AI and it shows President Biden doing something he did not do (not something he didn’t say), it does not violate the existing policy,” the board said. “Additionally, the alteration of this video clip is obvious and therefore unlikely to mislead the ‘average user’ of its authenticity, which, according to Meta, is a key characteristic of manipulated media.”

The policy states in part that it covers “videos that have been edited or synthesized, beyond adjustments for clarity or quality, in ways that are not apparent to an average person, and would likely mislead an average person to believe:
  • A subject of the video said words that they did not say, AND
  • The video is the product of artificial intelligence or machine learning, including deep learning techniques (e.g., a technical deepfake), that merges, combines, replaces, and/or superimposes content onto a video, creating a video that appears authentic.”

Recommendation

In addition to the finding, the board said that Meta should think about expanding the policy, which it described as “lacking in persuasive justification” and being “incoherent and confusing to users.”

“The policy’s application to only video content, content altered or generated by AI, and content that makes people appear to say words they did not say is too narrow. Meta should extend the policy to cover audio as well as to content that shows people doing things they did not do,” the board said. “The board is also unconvinced of the logic of making these rules dependent on the technical measures used to create content. Experts the board consulted, and public comments, broadly agreed on the fact that non-AI-altered content is prevalent and not necessarily any less misleading; for example, most phones have features to edit content.”

Many people who submitted comments to the board said that they felt the video should not be allowed to stay up.

“FB needs to start cracking down on these,” one wrote. “If it’s altered then say it is when posted and if it’s altered to bring harm to a person then it should not be allowed.”

“This altered video is not an opinion, but it was entirely false. It is misinformation,” another wrote. “Misinformation is a lie. No credible organization with any integrity supports or promotes lies.”

Rep. Adam Schiff (D-Calif.) was among those speaking out against the video, calling on Meta to take action against “cheap fakes,” or content doctored without the help of AI.

A number of others, though, cautioned against expanding Meta’s policy.

“Efforts to add a new policy to counter simply edited videos that may be considered by some to be misinformation could significantly harm both political and non-political expression, be abusable by those with more resources and internet trolls, and present a problem that will be impossible to handle at scale with any semblance of fairness,” David Inserra and Jennifer Huddleston of the Cato Institute said.

They described the content in question as a video of President Biden that “is simply edited with a loop” and noted that such videos of public figures are commonly made and shared online.

Meta said in a statement that it welcomed the board’s decision. “After conducting a review of the recommendations provided by the board in addition to their decision,” it said, “we will update this post.”
Zachary Stieber is a senior reporter for The Epoch Times based in Maryland. He covers U.S. and world news. Contact Zachary at [email protected]