On 29 July 2024, in an attack that has become seared into the British psyche, Axel Rudakubana murdered three children at a Taylor Swift-themed dance class in the Merseyside town of Southport. All three were under 10 years old.

Hours later, fake news accounts and far-right figures on social media began to spread misinformation about the attack, most notably the false claim that the perpetrator was “Ali Al-Shakati”, a Muslim migrant who had only recently arrived in the country. By 3pm the day after the attack, that false name had received more than 30,000 mentions on X and was even being recommended to app users as a “trending topic” by X’s algorithm.

Far-right organisers and groups used the traction of those stories to organise a march in Southport that quickly turned violent, with participants throwing bricks and bottles at a local mosque and setting cars and police vans alight after officers were deployed to protect the building. The violence soon spread across the country, targeting mosques and hotels housing asylum seekers. Where no such targets could be found, far-right rioters set fire to libraries, Citizens Advice offices and cars owned by care workers instead. Before the violence died down, immigration lawyers were fleeing their towns after receiving death threats from the far right.

In the weeks that followed, the focus increasingly shifted to social media’s role in the unrest. It was, after all, misinformation spread through sites like X that first ignited the baseless claim that the attack had been carried out by a foreign migrant. In the aftermath, the government cracked down, handing prison sentences both to those who stirred up racial hatred and violence and to those who spread dangerous misinformation online.

But in all the mainstream debate about the perils of misinformation, there has been precious little focus on exactly how this problem came about, and on whether some of its root causes run far deeper than many would like to admit.

The psychology of misinformation

“Obviously, misinformation played an important role,” says Stephan Lewandowsky, a psychologist at the University of Bristol who specialises in misinformation. “In this instance, the link between the specific false information that was spread on social media and the actions by the rioters afterwards was so unique and precise that it’s very difficult to deny.”

That kind of direct causality is rare in our complex, oversaturated information ecosystem, where the individual drivers of events can be hard to pinpoint.

But the story is more complicated than social media inventing misinformation. “From the beginning of recorded history, you find that a lot of communication happens through fiction. It’s like we’re telling each other stories to communicate things,” says Walter Scheirer, author of A History of Fake Things on the Internet.

Scheirer argues that much of human communication is about expressing an idea or belief rather than factually recounting events. The main difference now is that social media has changed the size of the audience and gives largely equal credence to any voice, meaning you can give this “enormous megaphone to anybody”.

Deep psychological needs draw people to conspiracy theories, according to Karen Douglas, an expert in the psychology of conspiracy theories at the University of Kent. Narratives like those around the Southport murders help people meet both epistemic needs – the desire for “clarity and certainty” – and existential needs – the desire to “feel safe and to have some control” over what is happening. Given that, they can often be hard to dislodge.

“People are looking for ways to understand what is going on and they don’t like the uncertainty that often surrounds unfolding events. Also, a simple explanation is often not very appealing. People assume that there must somehow be a bigger explanation, or more going on than people know about,” she explains. “Once conspiracy theories are out there, they are difficult to quell, especially when some of the facts are still unknown. Even after that, once people believe in a conspiracy theory, it is often difficult to convince them otherwise.”

Accelerationist tendencies

But social media is uniquely placed to accelerate those kinds of narratives. Lewandowsky explains that social media sites create a “false consensus” effect for users: by connecting like-minded people across huge distances, they lead users to believe that far more people share their views than they would if they were limited to physical interactions. “And we do know that people act on their beliefs more when they feel others are sharing it,” he says.

Facts like these may go some way to explaining why a neo-Nazi based in Finland was revealed to be one of the main instigators of the violence, as an organiser of the Southport Wake Up channel on the messaging app Telegram, where much of the far-right action was planned.

“The other thing that social media does is that the whole business model of these platforms is based on creating engagement because that’s how they make their money. If we stay on the platform because we’re engaging with content, then they can show us ads, and they collect money for the ads,” adds Lewandowsky. “And human beings pay attention to material that evokes outrage and anger, negative emotions. For the platforms, that’s wonderful because they can sell ads … but it’s not so good for a democratic society.”

And social media has now grown to a scale never envisaged in the early days of the internet. As part of his research, Scheirer found that even in the earliest days of computer networking there were servers “about UFOs and government conspiracies”.

While those communities were, by their very nature, self-contained, today’s open social media landscape means that when a message is put out, it “amplifies and amplifies” until it reaches a massive global audience. Discourse then tends to centralise more and more around those individual messages or ideas.

“A lot of my research is trying to reimagine a less hostile internet. And I think the answer really is pulling back from these global social media services which don’t really serve a lot of good,” he says. “The internet was never really designed to be a database of facts. There’s this idea from the 1990s that it is this information superhighway, but that’s an idea that a bunch of large tech companies came up with long after the internet was created.”

Specific policies adopted by the platforms themselves may also have contributed to the violence.

“There is no doubt that social media is playing a really important role,” says Joe Mulhall, director of research at Hope Not Hate. He points to the reinstatement of Andrew Tate and Tommy Robinson on X, where both have built huge followings and helped spread misinformation about the Southport attack.

“Tommy Robinson’s reach has grown since his X account was reinstated last year,” explains Mulhall. “His last two demos in London have attracted tens of thousands of people and his X following is now over 800,000 people, meaning he once again has an enormous reach online.”

Their influence speaks to the fact that the modern far right isn’t the same as the one Britain saw even as recently as the 2000s.

This “post-organisational” far right, as it is often called, is far more decentralised and harder to pin down ideologically than the far right of decades past, which was built around individual political parties or groups. It is now much more disparate, defined by individualised, fluid sets of fears, angers and ideologies, and shaped by specific far-right figures such as Tommy Robinson or Andrew Tate rather than by groups or parties with set agendas. Even the far right, it seems, isn’t free of social media influencers.

Political context matters

But several of those Computer Weekly spoke to made clear that, while social media may have provided the spark, the fuel that turned it into a conflagration came from elsewhere.

Past research by the Centre for Media Monitoring found that nearly 60% of articles in the British press on Islam were negative or spread baseless tropes about Muslims, with certain newspapers, like The Mail on Sunday, being the worst offenders.

Academic research has even found that media coverage can directly drive support for populist right-wing parties such as UKIP, while experts at the University of Leicester’s Centre for Hate Studies have specifically claimed that politicians and the media can help fuel hate crime in the UK.

“What I find frustrating in the media coverage on this is that everybody is very happy to talk about social media, but no one is recognising the political context in which this has been unfolding for the last 15 years in the UK,” says Lewandowsky. “If you look at the track record of the tabloid media and the previous government and the language they use, then these events of the last week or so become far less surprising because the context was created for them.”

In the aftermath of riots that feel society-changing, it remains to be seen whether those much harder questions about responsibility will be asked.


