Meta Platforms announced on Tuesday that it will lift the blanket ban it imposed on the Arabic word for "martyr," after a year-long review by its Oversight Board found the social media giant's approach "excessive."

The company, which owns Facebook and Instagram, has faced years of criticism over its handling of content related to the Middle East, including a 2021 study commissioned by Meta itself, which found that its approach had a "negative impact on human rights" for Palestinians and other Arabic-speaking users of its services. That criticism has intensified since fighting between Israel and Hamas began in October.

The Oversight Board, which is funded by Meta but operates independently, began its review last year because the word accounted for more content removals on the company's platforms than any other single word or phrase. The review concluded in March that Meta's rules on "martyr" failed to account for the word's range of meanings and led to the removal of content that was not intended to glorify violence.

"Meta has operated on the assumption that censorship can improve safety, but evidence suggests that censorship can marginalize entire populations while not improving safety at all," Oversight Board co-chair Helle Thorning-Schmidt said in March.

Meta acknowledged the review's findings on Tuesday, saying in an announcement that the term "martyr" is used in many different ways by communities around the world, across cultures, religions, and languages, and that its blanket approach could result in the widespread removal of content that was never intended to support terrorism or glorify violence.

The Oversight Board welcomed the change, saying Meta's policy on the word had led to the censorship of millions of people across its platforms.