
They were arrested for posting during the riots – will it change anything?


Charges following disorder felt significant, but for social media giants there was no day of reckoning.


Tyler Kay, 26, and Jordan Parlour, 28, were sentenced to 38 months and 20 months in prison respectively for stirring up racial hatred online during the summer riots.

Charges in the aftermath of the disorder felt like a significant moment, in which people had to face real-life consequences for what they said and did online.

There was widespread recognition that false claims and online hate contributed to the violence and racism on British streets in August. In their wake, Prime Minister Keir Starmer said social media “carries responsibility” for tackling misinformation.

More than 30 people were arrested over social media posts. From what I have found, at least 17 of those have been charged.

The police will have deemed that some of those investigated did not meet the threshold for criminality. And in plenty of cases, the legal system could be the wrong way to deal with social media posts.

But some posts that did not cross the line into criminality may still have had real-life consequences. So for those who made them, no day of reckoning.

And nor, it seems, for the social media giants whose algorithms, time and time again, are accused of prioritising engagement over safety, pushing content regardless of the reaction it can provoke.


At the time of the riots, I had wondered whether this could be the moment that finally changed the online landscape.

Now, though, I’m not so sure.

To make sense of the role of the social media giants in all this, it’s useful to start by looking at the cases of a dad in Pakistan and a businesswoman from Chester.

On X (formerly known as Twitter), a pseudo-news website called Channel3Now posted a false name for the 17-year-old charged over the murders of three girls in Southport. This false name was then widely quoted by others.

Another poster who shared the false name on X was Bernadette Spofforth, a 55-year-old from Chester with more than 50,000 followers. She had previously shared posts raising questions about lockdown and net-zero climate change measures.

The posts from Channel3Now and Ms Spofforth also wrongly suggested the 17-year-old was an asylum seeker who had arrived in the UK by boat.

All this, combined with further untrue claims from other sources that the attacker was a Muslim, was widely blamed for contributing to the riots – some of which targeted mosques and asylum seekers.

I found that Channel3Now was connected to a man named Farhan Asif in Pakistan, as well as a hockey player in Nova Scotia and someone who claimed to be called Kevin. The site appeared to be a commercial operation looking to increase views and sell adverts.

At the time, a person claiming to be from Channel3Now’s management told me that the publication of the false name “was an error, not intentional” and denied being the origin of that name.

And Ms Spofforth told me she deleted her untrue post about the suspect as soon as she realised it was false. She also strongly denied she had made the name up.

So, what happened next?


Farhan Asif and Bernadette Spofforth were both arrested over these posts not long after I spoke to them.

Charges, however, were dropped. Authorities in Pakistan said they could not find evidence that Mr Asif was the originator of the fake name. Cheshire police also decided not to charge Ms Spofforth due to “insufficient evidence”.

Mr Asif seems to have gone to ground. The Channel3Now site and several connected social media pages have been removed.

Bernadette Spofforth, however, is now back posting regularly on X. This week alone she’s had more than one million views across her posts.

She says she has become an advocate for freedom of expression since her arrest. She says: “As has now been shown, the idea that one single tweet could be the catalyst for the riots which followed the atrocities in Southport is simply not true.”

Focusing on these individual cases can offer a valuable insight into who shares this kind of content and why.

But to get to the heart of the problem, it’s necessary to take a further step back.

While people are responsible for their own posts, I’ve found time and time again that this is fundamentally about how different social media sites work.

Decisions made under the tenure of Elon Musk, the owner of X, are also part of the story. These decisions include the ability to purchase blue ticks, which afford your posts greater prominence, and a new approach to moderation that favours freedom of expression above all else.

The UK’s head of counter-terror policing, Assistant Commissioner Matt Jukes, told me for the BBC’s Newscast that “X was an enormous driver” of posts that contributed to the summer’s disorder.


A team he oversees called the Internet Referral Unit noticed “the disproportionate effect of certain platforms”, he said.

He says there were about 1,200 referrals – posts flagged to police by members of the public – in relation to the riots alone. For him, that was “just the tip of the iceberg”. The unit saw 13 times more referrals relating to X than to TikTok.

Acting on content that is illegal and in breach of terror laws is, in one sense, the easy bit. Harder to tackle are those posts that fall into what Mr Jukes calls the “lawful but awful” category.

The unit flags such material to the site it was posted on when it believes the post breaches that site’s terms and conditions.

But Mr Jukes found Telegram, host of several large groups in which disorder was organised and hate and disinformation were shared, hard to deal with.

In Mr Jukes’s view, Telegram has a “cast-iron determination to not engage” with the authorities.

Elon Musk has accused law enforcement in the UK of trying to police opinions about issues such as immigration, and there have been accusations that action taken against individual posters has been disproportionate.

Mr Jukes responds: “I would say this to Elon Musk if he was here, we were not arresting people for having opinions on immigration. [Police] went and arrested people for threatening to, or inciting others to, burn down mosques or hotels.”

But while accountability has been felt at “the very sharp end” by those who participated in the disorder and posted hateful content online, Mr Jukes said “the people who make billions from providing those opportunities” to post harmful content on social media “have not really paid any price at all”.

He wants the Online Safety Act, which comes into effect at the start of 2025, to be bolstered so it can better deal with content that is “lawful but awful”.

I contacted both X and Telegram, but neither responded to the points the BBC raised.

During the riots, Telegram said its moderators were “actively monitoring the situation and are removing channels and posts containing calls to violence” and that “calls to violence are explicitly forbidden by Telegram’s terms of service”.

In its publicly available guidelines, X continues to state that its priority is protecting and defending the user’s voice.

Almost every investigation I do now comes back to the design of the social media sites and how algorithms push content that triggers a reaction, usually regardless of the impact it can have.

During the disorder, algorithms amplified disinformation and hate to millions, drawing in new recruits and incentivising people to share controversial content for views and likes.
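To make that design concrete, here is a minimal, purely illustrative sketch of an engagement-first feed ranker. The class, function names and weights are invented for illustration and do not describe any real platform’s system; the point is simply that a feed ranked only on predicted reactions rewards whatever provokes the strongest response, harmful or not.

```python
# Illustrative sketch only: ranks posts purely by predicted engagement,
# with no measure of harm in the score. All names and weights are invented.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float      # model's guess at likes the post will attract
    predicted_replies: float    # replies often spike on provocative content
    predicted_reshares: float   # reshares spread the post to new audiences

def engagement_score(post: Post) -> float:
    # Reactions of any kind count the same way: outrage scores like delight.
    return (1.0 * post.predicted_likes
            + 2.0 * post.predicted_replies
            + 3.0 * post.predicted_reshares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most "engaging" first, with no penalty for content likely to provoke harm.
    return sorted(posts, key=engagement_score, reverse=True)
```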

Why doesn’t that change? Well, from what I have found, the companies would have to be compelled to alter their business models. And for politicians and regulators, that could prove to be a very big challenge indeed.

