Social media has played a significant role in fueling the anti-immigration riots sweeping across towns and cities in the United Kingdom, and agitators such as Elon Musk have not been sitting on the sidelines.
The Tesla CEO and owner of X (formerly Twitter) posted on Sunday that “civil war is inevitable” in response to a post blaming the violent demonstrations on “mass migration and open borders.”
On Monday, a spokesperson for the UK prime minister responded to Musk’s comment, stating, “There’s no justification for that.” Musk’s decision to amplify anti-immigrant rhetoric highlights the role of online misinformation in inciting real-world violence—a growing concern for the UK government, which vowed Tuesday to bring those responsible for the riots, as well as their online supporters, to justice.
Later on Tuesday, a 28-year-old man in Leeds became the first person to be charged with using “threatening words or behaviour intending to stir up racial hatred” online, according to the UK Crown Prosecution Service. The charges related to “alleged Facebook posts,” Nick Price, the director of legal services at the CPS, said in a statement.
In recent days, rioters have damaged public buildings, set cars on fire, and hurled bricks at police officers. They also set ablaze two Holiday Inn hotels, in northern and central England, that were believed to be housing asylum seekers awaiting decisions on their claims. Hundreds have been arrested.
The riots began last week after far-right groups claimed on social media that the person charged with a horrific stabbing attack that left three children dead was a Muslim asylum seeker. The online disinformation campaign stoked outrage against immigrants. However, the suspect, Axel Rudakubana, was born in the UK, according to police.
Despite police clarifications, false claims about the attack—the worst mass stabbing targeting children in the UK in decades—continued to spread online. According to the Institute for Strategic Dialogue (ISD), by mid-afternoon on July 30, a false name attributed to the alleged asylum seeker had been mentioned more than 30,000 times on X by over 18,000 unique accounts. The ISD noted that platform algorithms helped amplify the misinformation.
The UK government suggested that bots, possibly linked to state-backed actors, may have played a role in spreading the false information.
Tackling ‘Online Criminality’
Social media companies, despite having internal policies against hate speech and incitement to violence, have struggled with enforcement, especially during crises. Isabelle Frances-Wright, a technology expert at the ISD, noted that content moderation systems often fail under the pressure of a crisis.
Musk himself has promoted incendiary content on X, raising concerns about the platform’s role in spreading misinformation. Following the October 7 Hamas attack on Israel, Musk endorsed an antisemitic conspiracy theory, for which he later apologized. Under his ownership, X has relaxed content moderation policies and reinstated previously banned far-right accounts, including figures like Tommy Robinson, who has been stoking the UK protests.
The UK government has vowed to prosecute “online criminality” and is pushing social media companies to act against misinformation. UK Home Secretary Yvette Cooper criticized social media platforms for amplifying violence and disinformation, stating that the situation “cannot carry on like this.”
During a cabinet meeting on Tuesday, UK Prime Minister Keir Starmer promised swift justice for those involved in the riots, both online and offline. The UK’s Online Safety Act, which was adopted last year, creates new duties for social media platforms, including the removal of illegal content. However, the legislation is not yet in effect as Ofcom, the regulator, is still developing codes of practice and guidance. Once in force, the law will allow Ofcom to fine companies up to 10% of their global revenue for non-compliance.
Source: CNN