Meta's oversight board has called upon the tech giant to revoke its blanket prohibition on the commonly used Arabic term "shaheed," which translates to "martyr" in English.
After a thorough year-long examination, the board, operating independently despite being funded by Meta, criticised the company's overreaching stance, asserting that it unduly curtailed the speech of countless users.
According to the board's assessment, which was made public on Tuesday, Meta should only take action against posts containing the term "shaheed" if they exhibit clear indications of violence or contravene other company policies.
In its report, the board found that Meta's rules regarding "shaheed" failed to account for the word's varied meanings and led to the removal of content that was not intended to glorify violence.
"Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all," Oversight Board co-chair Helle Thorning-Schmidt said in a statement.
The ruling follows years of criticism aimed at Meta's handling of content related to the Middle East.
In 2021, Meta itself commissioned a study revealing that its methods had a detrimental impact on the human rights of Palestinians and other Arabic-speaking users.
The scrutiny intensified during the Israel-Hamas war, with rights groups accusing Meta of suppressing pro-Palestinian content on Facebook and Instagram amid a conflict that has claimed thousands of lives in Gaza since Hamas' attacks on Israel on October 7 last year.
Meta currently deletes any posts that use the word "shaheed" to refer to individuals listed under its "dangerous organizations and individuals," such as members of Islamist militant groups, drug cartels, and white supremacist organisations.
According to the board's report, Meta considers the use of the word as praise for these banned entities, which is why it removes such content. Hamas is among the groups the company designates as a "dangerous organization."
Meta began reassessing the policy in 2020 but, according to the board, failed to reach internal consensus, prompting the company to seek the board's input last year. The board disclosed that "shaheed" accounted for more content removals on Meta's platforms than any other single word or phrase.
A spokesperson for Meta stated that the company would review the board's feedback and provide a response within 60 days.