Social media is something of a double-edged sword. On the one hand, its ability to unify billions of people around the world in an instant has revolutionised the ways in which we connect and engage with each other. Take the 2017 Women’s March, for instance; one “I think we should march” post sparked the largest single-day march since the anti-Vietnam war protests of the 1960s. At its worst, however, social media gives everyone unbridled access to a global platform. Whilst that may seem harmless, it can be the beginning of a very toxic equation: unregulated social media platform, plus angry person with a keyboard and unsolicited opinions, equals nasty troll.
If you’re not familiar with the term, a troll is someone who posts inflammatory or off-topic messages online with the intention of causing upset and distress to others. As an active social media user, I have witnessed my fair share of trolling, ranging from juvenile “you’re stupid” remarks to violent death threats. The severity of some comments led me to question who (or what) was behind the abuse. I tended to picture a horrid, unnatural creature that lived under a bridge and was nothing like me at all. That was until I came across a recent study carried out at Stanford University, which found that under the right circumstances, anyone can become a troll. The study even suggested trolling is potentially contagious, finding that exposure to abusive comments online can increase a person’s likelihood of exhibiting the same behaviour. It also found that time of day and mood were factors: people were more likely to exhibit anti-social behaviour earlier in the week and late at night.
Without delving too much into the psychology of why people troll, it’s fair to say there is a level of dissociation between typing abusive comments and their real-world effects. This perceived lack of consequence is one way in which trolling can be explained, alongside an increased sense of anonymity and desensitisation to the online environment. In an effort to raise awareness of its impacts, more celebrities are coming forward to reveal the very real ways in which trolling has affected them. One such celebrity who has taken a stand is Jesy Nelson.
Her documentary “Odd One Out” has been praised for its raw and honest account of her experiences of online trolls. It details her fairy-tale rise to fame, starting in 2011 when she entered and won the X Factor as a member of the girl band ‘Little Mix’. Within hours of winning, Nelson explains, she was subjected to an onslaught of online abuse. As she reads aloud messages such as “fat, ugly singing whale” and “go kill yourself” to the camera, it makes for heartbreaking but important viewing. In fact, given the overwhelmingly positive public response to the documentary and the prevalence of the trolling problem, there have been calls for it to be made mandatory viewing in schools. Showing the documentary in schools isn’t a bad idea, but doing so feels much like trying to stop the bleeding of a severed limb with a single plaster. The problem is systemic, and social media platforms themselves must take responsibility for how their online space is used.
It’s no surprise that social media is where most trolling takes place, with 84.71% of users admitting they have experienced abusive behaviour at least once. As a result, there has been growing pressure in recent years for platforms to manage the problem, and to give credit where credit’s due, there have been some clampdowns. For instance, Instagram has removed the like button and introduced new AI features that alert people when their comment may be considered offensive and ask if they still want to post it. Facebook is considered the most regulated platform: it gives users the ability to remove abusive comments completely, permanently deleting them from everyone’s view, and it requires commenters to register their names and e-mail addresses. Interestingly, the Centre for Countering Digital Hate (CCDH) published a report titled ‘Don’t Feed the Trolls’, which found that social media platforms actually benefit from online trolling. That is because abusive and offensive comments often generate a significant amount of attention from users, which in turn boosts overall time spent online, a core metric for platforms such as Facebook. Given this, maybe it’s not that surprising social media platforms are seemingly slow on the uptake.
There have been more traditional efforts to combat the problem through government programmes and consultations, such as Stop Online Abuse and the Internet Safety Strategy, all of which aim to teach users how to avoid and deal with trolls. When it comes to the law, however, there is no catch-all legal definition of trolling, and consequently trolling is not in and of itself a crime. This is because the legal system must strike a balance between a person’s right to free speech and what is in the public interest. That balance makes it hard to define what constitutes online anti-social behaviour and has allowed trolls to slip through the net. At present, a troll can be prosecuted under hate crime, anti-harassment and computer misuse laws, as well as under some common laws relating to defamation. The threshold for prosecution under these laws is extremely high, however, and it is evident the present law is ill-equipped to deal with the rising crimes in this area. A White Paper titled Online Harms has been proposed and, if successful, will place a new duty of care on social media firms to protect users from harmful content. To ensure firms adhere to the new legal duty, an independent regulator will be appointed with the power to impose hefty fines on those found in breach. Companies that will fall within the remit of the potential new law include social media platforms, public discussion forums, search engines and messaging services. If passed, the law will be the first of its kind.
It is clear change is starting to take root, and it cannot come soon enough. Statistics show that online abuse is increasing, with up to one quarter of adults in the UK being targeted; 24% of those trolled admitted they have contemplated taking their own life, 37% went on to develop social anxiety and 36% became depressed. Numbers aside, the problem has been ignored for too long by government and social media firms alike. It is high time that we begin to manage what has become a faceless evil and work to reshape the ways in which we engage with each other online, in order to ensure the internet remains free, open and, most of all, safe.