Meta's Oversight Board on Tuesday called for the company to end its blanket ban on the Arabic word "shaheed" ("martyr"), following a year-long review that concluded the Facebook owner's approach was "overbroad" and unnecessarily restricted the speech of millions of users. The board, which is funded by Meta but operates independently, said the social media giant should remove posts containing the word "shaheed" only when they are linked to clear signs of violence or when they separately violate other Meta rules.

The decision follows years of criticism of the company's handling of content involving the Middle East, including a 2021 study that Meta itself commissioned, which found that its approach had an "adverse impact on the human rights" of Palestinians and other Arabic-speaking users of its services. That criticism intensified after hostilities between Israel and Hamas began in October 2023. Human rights groups have accused Meta of suppressing pro-Palestinian content on Facebook and Instagram amid a war that has claimed tens of thousands of lives in Gaza.

The Oversight Board reached similar conclusions in its report on Tuesday, finding that Meta's rules on the word "shaheed" failed to account for the word's range of meanings and led to the removal of content that was not intended to praise acts of violence. Reuters quoted Helle Thorning-Schmidt, co-chair of the Oversight Board, as saying: "Meta has been operating under the assumption that censorship can improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all."

Meta currently removes any post that uses the word "shaheed" in reference to individuals on its list of "dangerous organizations and individuals," which includes members of militant Islamist groups, drug cartels and white supremacist organizations. The board noted that Meta sought its advice on the issue last year, having begun a reassessment of the policy in 2020 without reaching an internal consensus. In its request, Meta revealed that "shaheed" accounted for more content removals on its platforms than any other single word or phrase.

A Meta spokesperson said the company would review the Oversight Board's feedback and respond within 60 days.
