Meta Oversight Board Says Removal of Israel-Hamas War Videos Was Unfair

Meta’s Oversight Board criticized the company’s automated moderation tools for being too aggressive after two videos depicting hostages, injured civilians and possible casualties in Israel’s war with Hamas were unfairly removed from Facebook and Instagram. In a report released on Tuesday, the external review panel concluded that the posts should have remained online, and that removing the content carried high costs for “freedom of expression and access to information” about the war. (Warning to readers: The following description may be disturbing.)

One of the since-deleted videos, posted on Facebook, depicts an Israeli woman pleading with her kidnappers not to kill her during the Hamas attack on Israel on Oct. 7. The other, posted on Instagram, shows what appears to be the aftermath of an Israeli strike on or near al-Shifa Hospital in Gaza City. The post contained footage of Palestinians killed or injured, including children.

In the case of the latter video, the board said, both the deletion and the denial of user appeals to restore it were handled by Meta’s automated review tools, without any human review. The board reviewed the decision on an “accelerated 12-day timeline,” and after the case was accepted, the video was reinstated with a content warning screen.

In its report, the board found that the moderation thresholds, which were lowered following the Oct. 7 attack to catch violating content more easily, “also increased the likelihood that Meta would mistakenly remove conflict-related, non-violating content.” The lack of human-led moderation during such crises can lead to “erroneous removal of speech that may be of significant public interest,” the board said, adding that Meta should more quickly allow content shared “for purposes of condemnation, awareness-raising, news coverage, or calls for release” and apply a warning screen instead.

The board also criticized Meta for demoting the two posts once the warning screens were applied, which prevented them from appearing as recommended content to other Facebook and Instagram users, even though the company acknowledged that the purpose of the posts was to raise awareness. Responding to the board’s decision overturning the removals, Meta said there would be no further updates on the case because the panel had not issued any recommendations.

Meta isn’t the only social media giant to come under scrutiny for its handling of content related to the Israel-Hamas war. Verified users on X (formerly Twitter) have been accused of acting as misinformation “superspreaders” by the watchdog group NewsGuard. TikTok and YouTube have reportedly seen a surge in illegal content and false information, and the EU has launched a formal investigation under its Digital Services Act. By contrast, the Oversight Board’s case highlights the risks of over-moderation, and the tricky line platforms must tread.
