
Meta Urged To Change Policy Following Fake Joe Biden Post


(CTN News) – Meta’s policy on deepfake content urgently needs updating ahead of upcoming elections, the company’s oversight board said Monday in a decision concerning a manipulated video of US President Joe Biden.

A video of Biden voting alongside his adult granddaughter went viral last year after it was edited to falsely suggest he had touched her chest inappropriately.

The video was reported to Meta, and subsequently to the company’s oversight board, as hate speech.

The oversight board, which independently reviews the platform’s content moderation decisions, found that Meta’s decision to leave the video online was technically sound.

However, it argued that the company’s rules governing manipulated content are no longer fit for purpose.

The board’s warning comes amid fears of rampant misuse of artificial intelligence-powered tools for disinformation on social media during this pivotal election year, in which large portions of the world’s population head to the polls.

The board described Meta’s current policy as incoherent, lacking persuasive justification, and excessively focused on how content is created.

The board stated that its focus should instead be on the “specific harms that it aims to prevent (for example, to electoral processes).”

Meta responded by stating that it was reviewing the Oversight Board’s recommendations and would provide a public response within 60 days.

In the Biden case, the board found that Meta’s rules were not violated “because the video was not manipulated using artificial intelligence nor was it misrepresented as Biden saying something he did not say.”

Nonetheless, the board noted, “non-AI-altered content is prevalent and doesn’t necessarily mean less misinformation.”

Smartphones are generally equipped with easy-to-use editing features that can turn content into disinformation, sometimes referred to as “cheap fakes,” the board said.

The board also noted that altered audio, unlike video, does not fall within the current policy’s scope, even though deepfake audio can be highly effective at deceiving listeners.

New Hampshire authorities have already begun investigating possible voter suppression after a US robocall impersonating Biden urged residents not to vote in the Democratic primary.

Given the number of elections in 2024, the oversight board recommended that Meta reconsider its manipulated media policy as soon as possible.

SEE ALSO:

TikTok Will Combat Misinformation About The Feb 8 Election

Alishba Waris is an independent journalist working for CTN News. She brings a wealth of experience and a keen eye for detail to her reporting. With a knack for uncovering the truth, Waris isn't afraid to ask tough questions and hold those in power accountable. Her writing is clear, concise, and cuts through the noise, delivering the facts readers need to stay informed. Waris's dedication to ethical journalism shines through in her hard-hitting yet fair coverage of important issues.
