
The rise of right-wing disinformation in democracies while regulations lag

Misinformation and disinformation are global threats that must be stemmed
Fake news illustration

Political polarisation has reached critical levels across many parts of the world, sometimes even leading to political violence and the undermining of democracy. This escalation has been gradual, nurtured by rapid technological change in social and political spheres, particularly the growth of social media.

The recent Capitol Hill riots, for example, illustrate that social media can fuel an assault on democratic values and institutions.

Platforms such as Twitter have been associated with pro-democracy or citizen rights’ protests like the Arab Spring of 2011, and more recently the Hong Kong protests of 2019-2020, the Black Lives Matter movement of 2020, and the 2020-2021 farmers’ protests in India.

On the flip side, social media platforms have also been increasingly used by the right wing and by governments to create new narratives, suppress minorities, or quash dissent. As regulation gets underway, it is important to project a concrete “egalitarian, freedom-enhancing, and pro-democracy vision of the internet” against the many actors combating it.

In doing this, the real challenge is to balance the right to free speech and the containment of misinformation and disinformation, especially on hard-to-regulate social media platforms. Misinformation refers to false information spread without the intention to mislead. Disinformation, meanwhile, goes a step beyond to intentionally manipulate facts and disseminate false information with some motive.

This phenomenon is enabled by the epistemic chaos reigning in the social realm: society is left without any generally trusted institutions that can function as providers or arbiters of truth.

When disinformation and democracy collide

One direct consequence has been the quick emergence of fake news even in relation to the most seemingly mundane democratic events. Coupled with mostly unregulated social media platforms, this has resulted globally in the emergence of conspiracy theories, some would argue, at both extreme ends of the political spectrum. By encouraging the rejection of established narratives—and sometimes facts—these are capable of exerting great influence on people.

As such, conspiracy theories have the potential to undermine the pillars of democracy and to distract attention from issues of importance. Interestingly, former US President Donald Trump’s statements—and tweets—often propelled belief in these sometimes-baseless ideas and theories. For example, according to a 2020 YouGov poll, 27% of American adults believe climate change and global warming are a hoax – the second highest share in the world.

Following the 2016 presidential elections in the US, the Republican party sought increasingly to appeal to an active radical fringe of voters by legitimising far-right conspiracy theories. These disinformation strategies—usually relating to majority appeasement, national security and immigration fears—are also, in turn, promoted by many media outlets.

This is harming democratic discourse. It has led to the near-elimination of cross-partisan debates that were common prior to the age of social media. Rather tellingly, 73% of American adults are of the view that Democrats and Republicans not only disagree over plans and policies, but also cannot agree on basic facts.

Another harmful impact of these trends is growing distrust in established institutions. There has been a marked erosion of trust in the presidency and in Congress, from 52% and 40% in 1975 to 39% and 13% in 2020 respectively. In the most extreme scenarios, the continued propagation of fake news and conspiracy theories can result in violence. Following the Capitol Hill riots, the FBI is warning about the possibility of further attacks on state capitols.

A global phenomenon

India, the world’s largest democracy, hasn’t been immune to this bug either. Worryingly, disinformation there takes the shape of hate narratives based on religious identities—especially between Hindus and Muslims—as politicians seek to appease their voter bases. The startup Logically, which uses human fact-checkers alongside machine learning, found that approximately 50,000 fake-news stories were published during the 2019 general elections. These stories were subsequently shared two million times. Both the incumbent BJP and the chief opposition party, the Indian National Congress, were party to this spread of content that was deemed “divisive and conspiratorial”.

More and more, Indian political parties employ ‘cyber troops’ or trolls to spread false narratives. Tasked with manipulating public opinion online, they represent a new and more extreme stage of political competition. This manipulation sometimes results in physical violence: the propagation of rumours and false information, spread in no small measure through WhatsApp, has contributed to the lynching of Muslims in the country. Online manipulation is a favoured electoral strategy because conspiracy theories spread quickly amid low digital literacy and cultural biases. Coupled with deep internet penetration and the ubiquity of platforms such as WhatsApp, this makes for a dangerous cocktail of factors. Additionally, by rewarding sensationalism, social media platforms’ algorithms reinforce users’ prejudices, complicating the work of fact-checkers.

Misinformation and disinformation have become key issues in the fight against the pandemic as well. Globally, there have been several instances of prominent personalities sharing unscientific and wrong information in relation to the coronavirus and vaccines. For example, Didier Raoult, a French physician and microbiologist specialising in infectious diseases, has played a major role in sharing information that is not factual over social media. He shared a message contradicting French health authorities and advocated the use of hydroxychloroquine against COVID-19.

Mr Raoult’s statement gained him a wide following. In a brief span of time, there were over 1.1 million people following 90 different Facebook groups dedicated to him. Among his supporters, 89% accepted his stance that “the ministry of health is in league with the pharmaceutical industry to hide from the general public the noxious effects of vaccines”.

A lot being done, but it’s still early days

Making attempts to curb the spread of conspiracy theories is imperative. The EU is the first major political entity to put forward new legislation with the Digital Services Act. This legislation is designed to create a single set of rules for EU users to stay safe online while protecting their freedom of expression. It also helps local authorities hold tech companies accountable. For instance, larger companies will be required to subject themselves to further scrutiny. They will need to publish an annual report detailing their handling of major risks, including users posting illegal content, disinformation that could sway elections and the unjustified targeting of minority groups. 

Additionally, online platform operators will have to prioritise complaints raised by “trusted flaggers” who have a track record of highlighting valid problems. These measures demonstrate the importance of compliance by large tech companies that own and manage the platforms through which most disinformation is disseminated.

The Internet is now officially regarded as a public space and the responsibility to regulate it has been transferred to states. Currently, social media giants seem to have adopted either a proactive approach—such as Twitter banning President Trump’s account—or a passive stance, waiting for official regulations, such as in the case of Facebook stepping away from regulating online speech.

But more recently, the political right and conservatives have found themselves increasingly at odds with the major platforms. They contend that social media platforms have an anti-conservative bias, and they object to platform oversight of their posts: tweets or posts can be taken down if a platform deems them misinformation, which conservatives argue infringes on the right to free speech.

Nevertheless, given that their profits rely primarily on advertising revenue, social media platforms have an interest in promoting virality and polarisation to boost online traffic. More importantly, algorithms amplify users’ behavioural patterns and steer them towards confirmation bias through online interactions, resulting in a wider spread of conspiracy theories.

More countries are now looking to reform online governance. In 2017, the Honest Ads Act was introduced in the US, while the EU’s high-level expert group published a report on disinformation in 2018. The proposed US legislation would require the likes of Google, Facebook and Twitter to keep copies of advertisements, and to publicise and keep track of all targeting and transaction information relating to those advertisements.

France enacted a law in 2018 that defines fake news and requires a judge to decide, within 48 hours of being notified, whether a news item must be removed. France’s opposition argued that this law falls short of the principle of proportional justice and conflicts with existing penal codes.

On the part of large social media companies, a first step towards assuming greater responsibility could be profit diversification, with the aim to decrease reliance on advertisement revenue and ultimately curb the urge to maximise online traffic. Profitability and business viability cannot be sustained on a bedrock of political polarisation and radicalisation.

While there is a growing trend toward more regulations by governments to moderate online content, other sweeping solutions are also being proposed, including taxing internet platforms that host user-generated content, employing independent auditors to audit content, making algorithms open for the public, and perhaps a new industry on trust-and-safety-as-a-service.

Author profile
Associate Faculty at the University of London – SIM-Global Education

Dharish David is an associate faculty for the University of London at the Singapore Institute of Management – Global Education (SIM-GE), teaching courses relating to political economy. He previously worked at the Asian Development Bank Institute in Tokyo as a research associate on Green Growth and Infrastructure Development in Asia. He has written widely on infrastructure development, Asian political economy, economic development and international relations.

Author profile
Graduating student at the University of London - SIM-Global Education

Benjamin is a graduating student at the University of London - SIM - Global Education, soon to pursue his Master's in Political Economy. He is also an author at The IAS Gazette, a news site run by undergraduates in foreign affairs at SIM Global Education.
