The latest case relates to a post from last October that contained an online meme that made reference to the potential use of violence against those who criticized the Prophet Muhammad. It included hashtags that called Macron the devil and urged a boycott of French goods.
In its initial ruling, Facebook said the post broke its community standards around violence and incitement, though the company told the Oversight Board that there was tension between what was legitimate religious speech and a possible threat of violence.
In its ruling, a majority of the independent group, which is funded by Facebook, disagreed.
“A majority found that, for this specific post, Facebook did not accurately assess all contextual information and that international human rights standards on expression justify the Board’s decision to restore the content,” the group said in a statement on Friday in reference to the decision.
A representative for Facebook was not immediately available for comment.
The outside group said that the post was not likely to cause harm, suggesting that the tech giant had been overly sensitive to potential hate speech issues that either targeted or came from Muslims across its platforms. Facebook will now have to reinstate the post, and the Oversight Board recommended that the company provide users with more information about how it enforces its standards on potential violent threats.
Part of the group’s mandate is to help Facebook decide what posts fall afoul of the company’s content policies, though the Oversight Board, for now, can only review posts that have already been removed from the platform.
That involves trying to create a body of cases on difficult freedom of expression issues that can be used to update the tech giant's approach to future problem areas, though critics say the group does not have enough power to effect meaningful change.