Facebook and Instagram have implemented a temporary policy change that will allow users in some countries to post content that is normally prohibited, including calls for harm against or even the death of Russian soldiers or politicians. The change first emerged in a Reuters report citing internal emails to moderators. In them, moderators are told that calls for the death of Russian President Vladimir Putin or Belarusian President Alexander Lukashenko will be allowed, as long as they do not contain threats against others or “indicators of credibility”, such as saying where or how the act will take place.
In a statement sent to The Verge, Meta spokesperson Andy Stone said: “As a result of the Russian invasion of Ukraine, we have temporarily made allowances for forms of political expression that would normally violate our rules, such as violent speech like ‘death to the Russian invaders’. We still won’t allow credible calls for violence against Russian civilians.”
The New York Times has confirmed that the policy applies to people using the service from Ukraine, Russia, Poland, Latvia, Lithuania, Estonia, Slovakia, Hungary, and Romania. The Times also notes that in 2021, Vice reported that Facebook moderators received similar temporary instructions about “death to Khamenei” content, and quoted a spokesperson as saying that Facebook had made that particular exception in certain previous cases as well.
Facebook’s Community Standards on hate speech and on violence and incitement have continued to receive updates since the company began publishing them publicly in 2018. This change is just the latest example of how platforms are treating content originating from or related to the countries involved in the invasion since the fighting started.
An update to the Reuters report includes the text of the message sent to moderators, which reads as follows:
We are issuing a spirit-of-the-policy allowance to allow T1 violent speech that would otherwise be removed under the hate speech policy when: (a) targeting Russian soldiers, EXCEPT prisoners of war, or (b) targeting Russians where it’s clear that the context is the Russian invasion of Ukraine (e.g., the content mentions the invasion, self-defense, etc.).
Typically, moderation guidelines would dictate removing language that dehumanizes or attacks a particular group based on its identity. But the emails quoted by Reuters argue that the context of the current situation requires reading posts from the listed countries about generic Russian soldiers as a proxy for the Russian military as a whole, and that, absent a credible call to violence, moderators should take no action against them.
Still, it’s unclear whether such posts would have been deleted even without this direction. The policy already has many carve-outs and exceptions, and it explicitly states that several cases require additional information or context before enforcement, including:
Content attacking concepts, institutions, ideas, practices, or beliefs associated with protected characteristics, which are likely to contribute to imminent physical harm, intimidation, or discrimination against the people associated with that protected characteristic. Facebook looks at a range of signs to determine whether there is a threat of harm in the content. These include, but are not limited to: content that could incite imminent violence or intimidation; whether there is a period of heightened tension, such as an election or ongoing conflict; and whether there is a recent history of violence against the targeted protected group. In some cases, we may also consider whether the speaker is a public figure or occupies a position of authority.
The Russian government’s response to the report is unknown, and there have been no updates from its censorship agency Roskomnadzor, which banned Facebook earlier this month.