Meta claims that it has deleted or marked as inappropriate over 795,000 posts since the war between Israel and Hamas broke out

The company reported that, in Hebrew and Arabic alone, it removed "seven times as many pieces of content on a daily basis for violating our Dangerous Organizations and Individuals policy."

Like X (formerly Twitter), Meta is taking a stand against Hamas terrorism. The company, which owns WhatsApp, Facebook and Instagram, says it has deleted or marked as inappropriate more than 795,000 posts since the war between Israel and the terrorist group broke out. Meta made the announcement in a press release explaining how it is monitoring content:

Since the terrorist attacks by Hamas on Israel on Saturday, and Israel’s response in Gaza, expert teams from across our company have been working around the clock to monitor our platforms, while protecting people’s ability to use our apps to shed light on important developments happening on the ground.

The company reported that one of its measures was aimed solely at monitoring content in Hebrew and Arabic. In those two languages alone, Meta claims to have deleted 795,000 posts.

Acting on material found between October 7 and 10, the company removed "seven times as many pieces of content on a daily basis for violating our Dangerous Organizations and Individuals policy." To do so, it assembled a team that, as explained in the press release, monitored content in Hebrew and Arabic:

We quickly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation in real-time. This allows us to remove content that violates our Community Standards or Community Guidelines faster and serves as another line of defense against misinformation.

Other measures taken by Meta

In addition, Meta says it has taken extra precautions to protect the identity of the hostages. To that end, the company is "temporarily expanding" its policies on violence and incitement and is removing content that clearly identifies the hostages, "even if it's being done to condemn or raise awareness of their situation." Meta also explained what it is allowing:

We are allowing content with blurred images of the victims but, in line with standards established by the Geneva Convention, we will prioritize the safety and privacy of kidnapping victims if we are unsure or unable to make a clear assessment.