Once again, social-media companies are facing criticism as their platforms are used to racially abuse football players, following the dramatic conclusion of the Euro 2020 men’s tournament on Sunday night.
And make no mistake, it is becoming increasingly difficult for the technology giants to defend these incidents. How can they not be held more responsible for the content shared with millions on their platforms?
In a nutshell, social-media platforms have historically not been categorised, for regulatory purposes, as publishers or broadcasters in the way that traditional media such as the BBC are.
If racist comments appeared below this article, written not by me but by someone who had read it, the BBC would be held to account and the UK regulator, Ofcom, would investigate, intervene and decide on a penalty, probably a fine. But Ofcom does not yet have such powers over the likes of Facebook, TikTok, YouTube and Twitter, which have until now been largely self-regulating, although that is coming, as part of the long-anticipated Online Safety Bill.
Whether the threat of large fines is enough to focus the minds of these multi-billion-dollar businesses remains to be seen. And it is not just in the UK that regulation is planned.
In fairness, while the BBC does have a large global presence, it does not have to deal with anything like the volume of written and video content, uploaded in real time by anybody and everybody, that a platform such as Facebook, with its two billion users, does.