NEW YORK (Reuters) – Meta’s oversight board on Tuesday called on the company to end its blanket ban on the Arabic word “shaheed,” or “martyr” in English, after a year-long review found the Facebook owner’s approach was “overbroad” and had unnecessarily suppressed the speech of millions of users.
The board, which is funded by Meta but operates independently, said the social media giant should remove posts containing the word “shaheed” only when they are linked to clear signs of violence or if they separately break other Meta rules.
The ruling comes after years of criticism of the company’s handling of content involving the Middle East, including in a 2021 study Meta itself commissioned that found its approach had an “adverse human rights impact” on Palestinians and other Arabic-speaking users of its services.
Those criticisms have escalated since the onset of hostilities between Israel and Hamas in October. Rights groups have accused Meta of suppressing content supportive of Palestinians on Facebook and Instagram against the backdrop of a war that has killed tens of thousands of people in Gaza following Hamas' deadly raids into Israel on Oct. 7.
The Meta Oversight Board reached similar conclusions in its report on Tuesday, finding Meta’s rules on “shaheed” failed to account for the word’s variety of meanings and resulted in the removal of content not aimed at praising violent actions.
“Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.
Meta currently removes any posts using “shaheed” in referring to people it designates on its list of “dangerous organizations and individuals,” which includes members of Islamist militant groups, drug cartels and white supremacist organizations.
The company says the word constitutes praise for those entities, which it bans, according to the board’s report.
Hamas is among the groups the company designates as a “dangerous organization.”
Meta sought the board's input on the topic last year, after starting a reassessment of the policy in 2020 but failing to reach consensus internally, the board said. It revealed in its request that "shaheed" accounted for more content removals on its platforms than any other single word or phrase.
A Meta spokesperson said in a statement that the company would review the board’s feedback and respond within 60 days.
(Reporting by Katie Paul; Editing by Lincoln Feast.)