As we’ve seen from far too many recent articles, Facebook is under global pressure from the governments of many different countries to both be more conscious of, and to take more responsibility for, the content that is posted on their platform.
As a result, Facebook announced earlier this month that they will begin taking down posts which support or promote white nationalism, and white separatism. In response, commentators are debating whether this and similar measures represent a genuine desire on Facebook’s part to change (see also Zuckerberg’s recent statement that the internet needed more regulation), or simply a desire to avoid further criticism, which has certainly been coming thick and fast recently.
Hate Groups And Radicalisation
As mentioned, there has been a lot of criticism of Facebook recently, including potential fines and lawsuits thanks to their cavalier treatment of user data and privacy, but this specific move comes in the wake of the live-streamed terrorist attack in New Zealand, where an attacker killed 50 people at two mosques.
It was reported that the (alleged…apparently you have to say that) attacker was a member of a white supremacist group. A leaked document containing guidelines for Facebook moderators revealed that, although white supremacist content and groups are technically banned, Facebook drew a distinction between that and white nationalist and separatist content.
The distinction was drawn in order to avoid shutting out other nationalist and separatist groups and movements (such as, for example, Basque separatists in Spain) and to allow them a platform for their views. However, Facebook has now decided that white nationalism and separatism, especially in Europe and the US, are different from other types of nationalism.
Offering Support Groups
According to Facebook, while people will still be able to demonstrate pride in their ethnic heritage, posting or searching for white nationalist content may now generate a pop-up directing people to “Life After Hate”, an organisation founded by former members of extremist groups that provides help and support for people seeking to leave hate groups.
Easier Said Than Done
Whatever the motivation, Facebook has made fairly clear what their intentions are moving forward in this regard. However, the real problem is now starting to show itself: the difficulty of monitoring, identifying, and even defining what constitutes content that should be removed or blocked under this new policy.