(CTN News) – Meta's independent Oversight Board, established by the social media giant in 2020, has criticized the company for removing posts that depicted human suffering in the Middle East conflict.
In response, the board overturned two decisions to remove posts and called on Meta to be more responsive to the evolving situation in the war between Hamas and Israel.
One of the cases involved Instagram removing a video that showed the aftermath of an Israeli strike near Al-Shifa Hospital in Gaza City, which depicted injured or killed Palestinians, including children.
The second case involved Facebook removing a video of an Israeli woman pleading with her kidnappers not to harm her during Hamas raids on Israel.
The board acknowledged the difficulty of these decisions and emphasized the importance of protecting freedom of expression while preventing the incitement of violence or hatred.
The board also strongly urged Meta to preserve removed posts that may contain evidence of human rights violations.
According to the board, Meta said it had temporarily lowered the thresholds for automatically removing posts with potentially harmful content following the Hamas-led attack on Israel.
The board highlighted that the use of automated tools for content moderation on Facebook and Instagram increases the risk of removing posts that depict the harsh reality of the conflict.
Oversight Board co-chair Michael McConnell emphasized the importance of these testimonies, not only for the individuals sharing them but also for users worldwide who seek timely and diverse information about significant events. Some of these posts, he noted, could serve as crucial evidence of potential serious violations of international human rights and humanitarian law.
While the board's decisions on individual pieces of content are binding, Meta noted that the board's broader policy recommendations are not.
The conflict between Israel and Hamas has resulted in numerous casualties and has evoked strong emotions globally.
Social media platforms have been inundated with violent imagery and fabricated content aimed at spreading misinformation, making moderation especially difficult.
In October, the European Union asked Meta to provide information about the spread of violent and terrorist content on its platforms.
Similar investigations are also being conducted on TikTok, owned by China-based ByteDance, and X, formerly known as Twitter.