Online racism in football: What more can the law do?

The last week has seen yet another disgusting spike in the abuse of professional footballers by those who purport to be football fans. Social media platforms often wring their hands and express outrage whilst doing little or nothing to take offensive posts down. It is well past time for something to be done by way of stricter regulation of social media platforms and the effective prosecution of those who post abuse online. The time has come for the criminal law to be updated to better deal with online abuse, and for greater liability to attach to the platforms that allow abusive material to be propagated.

Liability of those who send and enable the racist abuse of footballers online

Those who racially abuse footballers online may commit communication offences under section 1 of the Malicious Communications Act 1988 or section 127 of the Communications Act 2003. These offences overlap in that they both target indecent and grossly offensive communications, but there are key differences. In simple terms, the MCA 1988 criminalises malicious communications where one of the sender’s purposes is to cause “distress or anxiety” to the recipient. The CA 2003, however, criminalises both (1) the sending of a message which is “grossly offensive or of an indecent, obscene or menacing character”; and (2) the sending of a message which is known to be false and sent for the purpose of causing “annoyance, inconvenience, or needless anxiety” to another.

There are other criminal offences with which a person may be charged when racist abuse has been posted online. It is an offence under section 4A of the Public Order Act 1986 to intentionally cause a person harassment, alarm or distress. Where a player such as Anthony Martial is racially abused online to such an extent that he has to enhance his home security, the requirement for alarm or distress to be caused will clearly be met.

Crucially, a public order offence may be converted into a racially aggravated offence under the Crime and Disorder Act 1998 – and attract a higher sentence – where an offender demonstrates, or is motivated by, racial hostility. For example, a student who sent a number of racially offensive tweets following the on-field collapse of footballer Fabrice Muamba in 2012 was convicted of a racially aggravated section 4A offence and sentenced to 56 days’ imprisonment. Even if an offender is not convicted of a racially aggravated offence, section 66 of the Sentencing Act 2020 requires the court to treat racial hostility as an aggravating factor in sentencing.

The criminal law is not, however, without its problems when applied to the online environment. Having concluded in 2018 that the current criminal offences are ill suited to addressing abusive online conduct, the Law Commission is now reviewing communication offences with a view to recommending reform. It has identified several issues with the current law, including the following:

  • The MCA 1988 and CA 2003 offences overlap and are ambiguous, creating uncertainty for online users, technology companies and law enforcement agencies. The term “gross offensiveness” in particular has been criticised for its ambiguity, making it difficult for the public to understand the line between criminal and non-criminal communications.
  • Certain online behaviours – such as coordinated “pile on” harassment – are not adequately addressed by the communications offences.
  • In the case of the MCA 1988, the communication must have an intended recipient. This means that racist abuse posted on a general webpage, as opposed to ‘@’ a particular player, may not amount to an offence.

The Law Commission has provisionally proposed a new offence to replace the MCA 1988 and CA 2003. The new offence would criminalise the sending or posting of a communication likely to cause harm to a likely audience where the defendant intended to cause harm or was aware of such a risk. It is clear that online racist abuse would fall within the proposed new communications offence, and it would possibly be caught in circumstances in which it would not be under the current law. Whether the Law Commission in fact recommends a new offence, and the shape that it may take, will not be known until the final recommendations are published later this year.

Prosecution is of course not the only tool to address online racial abuse against footballers. Football Banning Orders can be imposed under the Football Spectators Act 1989 to exclude offenders from attending football matches, but only if they have been convicted of a ‘relevant offence’. Unfortunately, communication offences are not ‘relevant offences’ for the purposes of the Act, though it is arguable that they should be. After all, those who send vitriolic abuse to footballers, racist or otherwise, may also pose a risk of causing violence or disorder at a football match.

Outside of the criminal law, football clubs themselves may take disciplinary action. In 2020, a Manchester City supporter and Newport County supporter were given lifetime bans from their respective clubs following their convictions for making racist football chants during games. There is no reason why football clubs should not take similar action where racist abuse occurs online.

Liability of social media platforms

The public has become increasingly concerned about the apparent immunity of social media platforms from any blame or sanction when racist abuse is posted on their sites. It was this lack of action that prompted the #Enough campaign in April 2019. The campaign featured a 24-hour boycott of social media platforms to call for stronger action to be taken by social networks and footballing authorities in response to racist abuse. Since then, there has remained an ever-growing appetite for the increased regulation of social media platforms.

An overhaul of the legal framework governing platform liability is on the horizon. On 15 December 2020, the Government published its response to the Online Harms White Paper. The response sets out plans for an Online Safety Act, and aims to “usher in a new age of accountability for tech companies”. The new regulatory framework will impose a duty of care to make companies take responsibility for harmful content on their platforms, which will include online abuse and hate crime.

In terms of enforcement, Ofcom will have the power to issue fines of up to £18 million or 10% of annual global turnover and to engage in “business disrupting measures”. Interestingly, the Government response suggests that criminal sanctions remain a possibility for senior managers who fail to comply with regulatory information requests. This is likely to be as far as any criminal liability will ever wash up on the shores of a social media company. It would be almost impossible for the actions of social media platforms to attract criminal liability through other routes. Without specific corporate offence models, such as failure to prevent offences, establishing corporate criminal liability would require proving culpability on the part of those individuals who constitute the directing mind and will of the platform.

Changes to the regulatory framework governing social media platforms are therefore to be welcomed. Social media platforms appear to be doing nowhere near enough at present to prevent abusive material being posted. A cynic would no doubt say that controversy is always good for them, as it drives user engagement and generates ad revenue. Without proper enforcement, we are left with a position where a troll can abuse a player with little chance of any consequence. Yet players who have responded to abuse on the pitch – for example Eric Dier and Wilfried Zaha – face playing bans and hefty financial penalties from the FA for failing to “turn the other cheek” and ignore abuse that in other walks of life might be considered outrageous provocation.

Peter Parker was counselled with the words “With Great Power Comes Great Responsibility”. Until social media platforms balance their enormous power with greater responsibility they will – quite rightly – face stricter regulation. Freedom of speech is not an unfettered right, and the social media platforms know that. It is the view of the authors that, sadly, little will change without Government action. The Government should impose liability, whether civil or criminal, on those platforms that host racist abuse. No other course will eradicate this cancer. Addressing online racist abuse in society of course requires more than just criminal law and regulatory reform, but such changes will remain crucial in effecting the necessary education and cultural change.

Jim Sturman QC is joint Head of Chambers at 2 Bedford Row. Alex Davidson is currently a pupil in chambers and has previously worked at the Law Commission.



