We have all seen something we don't like on Facebook, be it an ex on holiday with their new partner or a group photo from the night you threw up drunk outside the takeaway. But what about the darker side of Facebook? The groups and pages that post the most horrific memes, posts, and photos, all of which go against Facebook's guidelines? How are they brought to justice?
I am a member of lots of groups on Facebook. From my university alumni group to fans of the ITV show 'Home Fires', it seems that every cause or niche has one. But should we give extreme groups and views a space to come together? And how does Facebook deal with them?
From immigration to abortion, we all have views, and for each of our views there is an opposite; that's just how life works. In my research for this post, I searched for groups that went against my beliefs, like "stop muslim immigration" and "refuse to date women that have lost their virginity". I was horrified and angered to my very core. How can something as offensive as declaring that another race, religion, or gender should be hurt or driven from their homes be published, and stay that way?
What do Facebook do about these extreme and violent views that gather and multiply on their website?
Last year I posted a photo of my younger self on the site. It was removed within minutes for "breaching community standards". I asked why, but no explanation was given; there was no reason for the photo to have been taken down. That same day I reported an image a friend had posted declaring that all "moslems should be castrated". That image, supposedly, didn't break any of Facebook's rules or regulations.
At the beginning of this year, it was reported that over 70 social and racial justice groups, including the Black Lives Matter movement, had been removed or had posts removed. A coalition headed by the American Civil Liberties Union wrote to Facebook in protest, but said the reply was inadequate and merely restated Facebook's own community guidelines. At the same time, videos of animal abuse, livestreams of suicide, and threats of violence against those same groups all go uncensored.
More recently, videos of police brutality from the Catalonia independence referendum have been taken down, while violent threats towards those who voted are left to fester. In July, American woman Francie Latour's account of her children being verbally abused in a grocery store was removed. An older man had called her young children the 'n' word, and she wasn't allowed to voice her grievances on Facebook.
So what does it take to get something taken off Facebook?
According to Facebook's own guidelines, content can be removed for: direct threats, self-injury, dangerous organisations (such as terrorism or organised crime), bullying and harassment, attacks on public figures, criminal activity, sexual violence and exploitation, and regulated goods. Facebook also state that they restrict the display of nudity, remove graphic content, and will remove any kind of hate speech. So my photo of four-year-old me in a swimming costume on holiday could possibly have triggered a big "that is a child in a swimming costume" button and been removed, but why was my friend's hate speech not in breach of any guidelines?
In May this year, Facebook defended their decision not to remove videos of abortion, violent death, and self-harm, saying they did not want to censor people in distress or push them away. The website's moderation guidelines were leaked to The Guardian; those guidelines instructed moderators not to remove controversial content. They also reportedly trained moderators to shield certain "protected groups", including white males, from hate speech while leaving others exposed.
But have Facebook fallen on their own sword?
In January, Facebook removed a picture of a statue of Neptune in Bologna for being "sexually explicit", and removed a famous image from the Vietnam War because it showed a naked child. These joined the long list of things removed by Facebook for being "explicit", including the Little Mermaid statue in Denmark, a tasty-looking simnel cake, and a pop art image of someone licking an ice cream.
Germany is drafting a law that would see Facebook (and other social media sites) hit with fines of up to $55 million for failing to remove hate speech. Will this be enough to see Facebook step up and enforce its guidelines on all groups, not just the privileged?
Rebecca Broad is a graduate of the University of Sunderland, returning to start a master's in January. She now lives with her parents in Teesside.