
Authoritarian states have weaponised the vulnerabilities of technology and the unchecked power of social media to exploit the very tenet of democracy that they despise – free expression – to undermine democracy itself.


Democracy is on life support! At least, that appears to be a dominant narrative in our public discourse. Amid populist upheavals motivated by genuine grievances against the established status quo and its ruling elites, technology—in particular social media—has taken extreme and polarising voices to scale. In a world of near-ubiquitous and evermore intrusive connectivity, fuelled by micro-segmenting algorithms, democracy as an idea remains ill-equipped to defend itself against the powerful and persistent centrifugal forces of technology. Although this is fundamentally a crisis of our own doing, or at least our acquiescence, the vulnerabilities of technology have now become weapons of choice for militarily and economically weaker authoritarian states such as Russia and China in their seemingly unrestrained struggle against the West. Hence, the aim of this paper is to highlight the threat posed by social media, and those who seek to exploit it, to democratic ideals, as well as to propose a model to protect them, represented by the acronym DARE (Defend, Assist, Regulate, and Educate). Although a significant challenge, as this paper describes, standing up against the forces seeking to undermine democratic values is ultimately a matter of choice, rather than capability. While not a panacea for the underlying tensions enabling polarisation within our societies, acting swiftly to stem the damage caused by such technologies must take centre stage if we are to save the democratic experiment.

‘…if Russia tries to fly a plane in the United States, there's a Pentagon whose job it is and will successfully shoot down that plane before it gets anywhere near the United States, but if they try to fly an information plane into the United States to sow division, Facebook and Google's algorithms say, "What zip code do you want to target?"’ (Harris, 2020, 41:32min)


Democracy is on life support! At least, that appears to be an evermore present narrative in our public discourse. In fact, entire library shelves have been, and continue to be, filled with books highlighting the imminence of this threat. Titles such as Vanishing Voters, Democracy in Crisis?, Democracies in Flux, Is Democracy a Lost Cause?, Can Democracy be Saved?, Suicide of the West and Why We Hate Politics are merely the tip of the doomsday iceberg. The virality of the topic alone ought to ring alarm bells, and although the narrative of ‘Democracy in Crisis’ is nothing new (Ercan & Gagnon, 2014), its contemporary epilogue indeed paints a concerning picture of ‘democratic decline and rise of anti-politics’ (Flinders, 2016, p. 182). This is underpinned by, as summarised by Tormey (2016), a significant waning in the various measures used to assess the health and wellbeing of representative democracies, most notably voter turnout, party membership, trust in politicians, and interest in politics. In other words, we are experiencing a waning of political trust, a recognised prerequisite for democratic rule (Van der Meer, 2016). Now, technology—in particular social media—has taken these vulnerabilities and legitimate grievances to scale by creating tribal echo chambers that amplify divisive voices into toxic and radicalised narratives that threaten the very foundations of democratic societies—trust, informed dialogue, a shared sense of reality, mutual consent, and participation (Anamitra, Donohue, & Glaisyer, 2017).

Making matters worse, this is unsurprisingly coinciding with a resurgence of authoritarian regimes, geopolitical turmoil and demagogues of all varieties gaining greater influence, resulting in a concerted and seemingly credible challenge to the democratic model of governance (Flinders, 2016). While competition between ideologies and political systems is nothing new, the current battlefield is far from level. Relying on the ubiquity of technology and the internet, authoritarian regimes, most notably Russia and China, have turned a signature strength of democracy—freedom of speech—into its most vulnerable weakness. By their very nature, democracies believe in and rely on the open exchange of ideas and information. However, in a world of ubiquitous and evermore intrusive connectivity, fuelled by micro-segmenting algorithms, democracy, as an idea, remains ill-equipped to defend itself against the divisive and persistent centrifugal forces of its challengers. As a result, the purpose of this essay is not merely to expose the dangers posed by technology, in particular social media, but also to stress the importance of standing up for democracy. While it may be politically advantageous to ‘cry wolf’ over the actions of the Dragon and the Bear, figuratively speaking, we must rise beyond rhetorical accusations and cosmetic interventions and take an active stance in the defence of democracy. In other words, and to introduce the unashamedly choreographed acronym, democracies need to DARE (Defend, Assist, Regulate, Educate) and step up in defence of liberal ideals. The following pages aim to contribute towards this end.

The price of hubris

Liberal democracy won the 20th century, but the 21st is still up for grabs. Triumphing over fascism in the first half of the last century and communism in the second, liberal democracy was set to become the dominant political system. At least that was the rhetoric echoing across democratic halls of the West, captured neatly by Francis Fukuyama’s (1992) ‘End of History’ proclamation. Democracy had been recognised, to paraphrase Churchill, as the worst form of government except for all the others. In other words, and recapitulating a point made by Goldberg (2018), democracy, more than any other system of governance, has suppressed our innate tribalism and in-group bias. As a result, what he refers to as the ‘Miracle’ of democracy, underpinned by free market capitalism, allowed for the establishment of trust between complete strangers. Coupled with the mediating power of civil society—the vast social ecosystem of family, schools, churches, associations, sports, business and local communities—between the electorate and the elected, we were able to overcome, as Martin (2018) explains, our evolutionary need to use violence as a means to secure survival and reproduction. More than any other system, it managed to morph this destructive and primitive drive into productive ends. In short, democracy has proved capable of taming human nature for the benefit of the many, rather than the few.

However, this hubris has led to a loss of focus and investment in the very foundation on which democracy rests—the wellbeing of its people. As we wallowed in our seeming success, spurred on by globalisation and promises of never-ending growth, the cancer of wealth inequality started to spread. For decades, democracies, headed by the might of the United States (US), reinforced the now-debunked neoliberal promise that the rising tide of free markets would ‘lift all boats equally’. We now know that this was a myth (Markovits, 2019; Martinez, 2009; Sandel, 2020). While the global elite grew their wealth, the working and middle classes across the developed world experienced a stagnation of real growth, leading to increasing disillusionment with the promises of their elected leaders (Piketty, 2017). No other event lifted the veil of hypocrisy more than the 2008 Global Financial Crisis, where the culprits of the damage escaped not only unpunished but rewarded for their misdeeds. It is therefore unsurprising that we are witnessing a growing disillusionment with democracy across the Western world, where the percentage of people who consider it essential to live in a democracy has plummeted (Pew Research Center, 2019; Taub, 2016).

A natural consequence of this reality is the rise of populist sentiment against the ruling elite, giving birth to far-left movements—best described as insurgencies—as well as a wave of near-militant right-wing ‘new “alpha males” of international politics [who] agitate their audiences with promises of border walls, rallying cries for patriotic national rejuvenation, ‘anti-neoliberalism’, historical revisionism and Olympic ethno-nationalism’ (Gonzalez-Vicente & Carroll, 2017, p. 993). The result of this quagmire is a growing schism between different segments of society, with a reduced ability to ‘re-imagine a different way of living; to re-connect with those around us; to re-interpret challenges as opportunities or to re-define how we understand and make democracy work’ (Flinders, 2016, p. 182). To this volatile mix, we have added social media.

Our attention, for sale

‘With technology, you don’t have to overwhelm people’s strengths. You just have to overwhelm their weaknesses. This is overpowering human nature. And this is checkmate on humanity’ (Orlowski, 2020). This highly controversial statement by Tristan Harris, a tech whistleblower and former design ethicist at Google, provides an insider’s perspective on the dangers of modern technology. As individuals, we are ill-equipped to combat its growing penetration into our lives. Nowadays, most of our day-to-day activities are routed through our smart devices. Where we traditionally had to look up a number in a phone book, consult a city map, phone the local fast-food store to order, hail a cab on the side of the road or trawl through textbooks for knowledge, we now merely have to do some ‘googling’ before diving into the bottomless ocean of information in the palm of our hand. However, while this comes with indisputable advantages, we are now also continuously connected and reachable, meaning that the ‘world’ continuously competes for our attention. It is here that the business model of social media—defined here as internet-based communication tools such as Facebook, Twitter and YouTube that afford ‘users the capability to create profiles, connect with other individuals, and build and navigate their list of connections within a bounded network’ (Rochefort, 2020, p. 228)—comes into its own. The more time we spend ‘engaged’ with these enticing and ‘free’ platforms, the more marketing bandwidth they can sell to an advertiser. In short, our attention is the product being sold, and the fact that social media companies have become some of the wealthiest in the history of humankind is a testament to the effectiveness of this business model (Hendricks & Vestergaard, 2019).

Although some might argue that this supply and demand symbiosis merely affirms the value of capitalism as the most efficient economic system, the real problem here is that this perpetual level of access to users significantly impacts any notion of independent choice. Many of us have—unwittingly—abdicated the selection of content we engage with to the sole discretion of ethically and morally unbounded algorithms designed to keep us glued to our devices (Napoli, 2019). In other words, what we pay attention to in the near-ubiquitous digital bubble of our smartphone is merely stovepiped information curated by our own biases and previous ‘likes’, aptly tuned by supercomputers to provide us with a more personalised viewing experience than we could have ever selected ourselves. It is therefore not a surprise that, for example, 70% of the more than one billion hours of video viewed on YouTube each day are driven by the application’s algorithms (Solsman, 2018). In other words, the majority of the content viewed on this platform is not chosen by the viewer but by the platform itself, which should in itself be alarming. Making this worse, as highlighted by Tufekci (2018), are YouTube’s algorithms, which assume that users are drawn to content more extreme than where they started, resulting in a progressive slide to evermore sensational and extreme videos. Although this meets the platform’s intent of keeping us ‘engaged’ for longer periods of time—which, at face value, may seem reasonable and even welcome from a consumer’s perspective—the primary unintended consequence is an ongoing radicalisation of the viewer (Tufekci, 2018).

Importantly, an additional—and perhaps more sinister—unintended consequence of this hyper-segmented and individualised content across the major social media platforms is the gradual blurring between fact and fiction. Referred to as ‘Truth Decay’ by Kavanagh and Rich (2018), our reality is shaped by influencers spreading ‘alternate facts’, which are saturating the digital marketplace with radical voices of questionable intentions while sidelining traditional and vetted sources of information. Alas, we now live in a world where false news spreads six times faster than fact (Vosoughi, Roy, & Aral, 2018) and where each moral or emotional word posted increases its virality by 20 percent (Brady, Wills, Jost, Tucker, & Van Bavel, 2017). This dilution of epistemic certainty has also accelerated the spread and potency of ‘conspiracy theories’ that pretend to explain significant social and political events (Douglas, et al., 2019). Importantly, once someone believes one conspiracy theory, they are more likely to believe others, distancing them even further from a commonly agreed upon reality (Goertzel, 1994). Making this problem significantly more dangerous are bot farms—large-scale automated accounts impersonating people—that like and share fake news and misinformation by orders of magnitude compared to their human counterparts (Lazer, et al., 2018). The result is that we end up living ‘in the shattered prism of a shared reality, where we’re each trapped in a separate shard’ (Harris, 2020, 33:15min) without an effective way to reconnect. In short, and to quote Edward Osborne Wilson, ‘we have paleolithic emotions; medieval institutions; and god-like technology’ (Harvard Magazine, 2009), making our attention span exceptionally vulnerable to influence by a business model that thrives on scandal, division and conflict.
Taking this threat to a whole new level, authoritarian states are actively seeking to capitalise on this self-induced entropy by deliberately injecting further division and misinformation.   

Assisting our suicide

Into this malaise of broken-down communication and echo-chambered radicalisation, we are now witnessing an acceleration of decay ably assisted by authoritarian states, most notably Russia and China (Morris, et al., 2019). Falling under the umbrella of ‘grey zone’ warfare—‘an operational space between peace and war, involving coercive actions to change the status quo below a threshold that, in most cases, would prompt a conventional military response’ (Morris, et al., 2019, p. 8)—these actors are exploiting the already polarised information domain to sow further unrest. In the case of Russia, Western democracies are being bombarded with a ‘firehose of falsehood’ that seeks to saturate the information domain before allowing social media’s algorithms to do the polarising (Christopher & Matthews, 2016). Increasing the potency of this modus operandi, actors such as the Kremlin-backed Internet Research Agency (IRA) have, for example, run a coordinated campaign over the past six years and across more than 300 sites and platforms—since titled ‘Secondary Infektion’—seeking to destabilise Western democracies. Relying on fake social media accounts, bot farms and forged documents representing all sides of Western politics, the aim remains to instigate unrest, confusion and conflict (Nimmo, et al., 2020). To cite a specific example, between 2015 and 2017 on Facebook alone, the IRA funded 3,393 advertisements reaching 11.4 million Americans. This was reinforced with 470 IRA-created Facebook pages injecting 80,000 pieces of organic content, which were seen by 126 million Americans (US House of Representatives Permanent Select Committee on Intelligence, 2017). As recently uncovered, these tactics continued throughout the 2020 US presidential election (Frenkel & Barnes, 2020).

Learning from the success of these tactics, China recently also stepped up its information operations by investing in Beijing-backed media outlets and cultural institutions operating abroad with the aim of spreading misinformation and undermining the cohesion of democratic governments (Morris, et al., 2019). This was reinforced by extensive social media campaigns targeting audiences outside of the Chinese mainland, seeking to shape perceptions, curtail global responses to the Chinese Communist Party’s (CCP) influence operations in Hong Kong and Taiwan, and muddle the global response to the COVID-19 pandemic (Wallis, et al., 2020). As highlighted by Thomas, Zhang and Wallis (2020), YouTube has removed more than 2,000 channels linked to coordinated CCP shaping operations since April 2020, which sought to undermine the credibility of Western countries in general, and the US in particular, and their management of the pandemic. This ongoing effort relies heavily on the Chinese diaspora abroad to serve as the digital billboard of the CCP that can, and frequently does, amplify its messages within democratic nations. The sheer scope of these efforts, suggesting a significant investment of time and resources, is indicative of the persistence and patience of China in pursuing its destabilising goals (Thomas, Zhang, & Wallis, 2020).

While it remains beyond the scope of this essay to explore in detail the many and varied efforts by both Russia and China to destabilise the West, suffice it to say that their digital fingerprints have left little doubt about the extent of their operations or their level of sophistication (Babbage, et al., 2020; Christopher & Matthews, 2016). Through sustained and deliberate information warfare, these campaigns have exploited—and are certain to continue exploiting—the West’s inadequate response to their divisive efforts. In their view, while they remain militarily and economically inferior, this is the only way to achieve strategic objectives in the ever-present great power competition (Rosenberger & Gorman, 2020; Sanger, 2018). Yet, democracies cannot afford to play by the rules set by these regimes, as this will lead to the inevitable erosion of the very principles that make democracies democratic—freedom of expression, including the ‘freedom to seek, receive and impart information and ideas of all kinds’ (United Nations, 1966). On the other hand, democracies cannot remain passive and must act.

Time to DARE

Democracy is fragile and at risk (Hartcher, 2020). Rather than standing idly by, and to now delve into this convenient acronym, democracies need to DARE—Defend, Assist, Regulate and Educate—to counter the threat. While arranged in a manner that achieves simplicity and aesthetic appeal, these four actions are equally important and mutually reinforcing. Most importantly, perhaps, they by no means seek to replace the need for democracies to deal with the underlying issues giving rise to legitimate grievances within their populations. Rather, they are intended to reduce the current polarising effect of social media that continues to be exploited by bad-faith actors such as Russia and China.

Defend

Although it is indisputable that governments around the world have started to take measures to defend against both the polarisation that eventuates organically through social media algorithms and state-sponsored misinformation and division operations (Robinson, Coleman, & Sardarizadeh, 2019), there remains an imperative to accelerate and coordinate our response. For example, many European Union (EU) countries have created government-run fact-checking bodies that aim to undermine the virality of—in this instance largely Russian—misinformation and ‘fake news’. However, they have been slow to adapt, lack coordination and have drawn considerable criticism and accusations of censorship (Robinson, Coleman, & Sardarizadeh, 2019). Other parts of the Western world have been even slower to respond, resulting in an information vacuum that authoritarian regimes like Russia and China have proven both eager and increasingly capable of filling (Rosenberger & Gorman, 2020). This has left democracies exposed and vulnerable, with their defence characterised by a haphazard kaleidoscope of solutions that lack the potency and urgency required to combat this disease (Lazer, et al., 2018).

As highlighted by Huang (2020), one example of an effective government-led response comes from Taiwan, where a coordinated effort sought to undermine Chinese misinformation about the COVID-19 pandemic. Seeking to act pre-emptively, the government monitored social media platforms around the clock and called out ‘fake news’ and misinformation well before they could ‘go viral’. As an innovative way to neuter the polarising effect of Chinese propaganda, Huang (2020) goes on, the government used humour and memes, thereby making government content more likely to be shared. It also synchronised its messaging with the news cycle, publishing factual information ahead of the evening news and allowing the traditional media to contribute towards its spread. Importantly for other democracies, research supports this pre-emptive approach, suggesting that ‘forewarning is perhaps more effective than retractions and refutation of propaganda’ (Christopher & Matthews, 2016, p. 9). In other words, first impressions matter, and actively seeking to get ahead of false and misleading information gives democracies the best chance of reducing its virality and potency within their populations (Ullrich, Lewandowsky, Fenton, & Martin, 2014).

Additionally, given the scope and impact of state-sponsored misinformation campaigns, we must implement a whole-of-government effort against those intending harm. While we must be cautious about employing capabilities, especially any that may trigger a conventional military response, we should exercise our right to self-defence against incursions within our digital borders and efforts to undermine societal cohesion. It is no secret that most Western nations have begun a process of bolstering their cyber capabilities, but whether these should be used to defend against attacks on social media platforms remains a contentious issue. However, in the face of indisputable grey zone warfare by states like Russia and China, there is an imperative not only to build resilience but also to create deterrence. A crucial step in this effort is to publicly expose misinformation campaigns. What Ford (2020) calls ‘attribution diplomacy’ involves rapidly seeking to confirm and then attribute responsibility to malign actors. Equally, he goes on, this ought to include the imposition of a range of consequences on the actor, including the targeting of malicious activity at its source. This option is also articulated in the 2018 US National Cyber Strategy (The White House, 2018), and although the document only implies the use of military capabilities, these should certainly form part of a whole-of-government response. Importantly, and leading into the next section of this essay, the same document stresses the need for a concerted effort by likeminded nations.

Assist

Another avenue gaining increasing global traction is the need to coordinate and synchronise efforts between likeminded nations against misinformation campaigns by authoritarian states such as Russia and China. An insightful example is the East StratCom Task Force established by the EU, which seeks to build cooperation and understanding within its member states as well as with partner nations (European Union External Action, 2018). It seeks to achieve this through transparent communication and the overall strengthening of the media environment, as well as through a dedicated effort to improve EU capacity to predict and respond to disinformation activities by external actors (High Representative of the Union for Foreign Affairs and Security Policy, 2018). Importantly, although coordinated centrally from its headquarters in Brussels, the nature of this collaborative effort allows the EU to target Russian misinformation campaigns in local languages, going some way towards reducing their impact at the local level (European Union External Action, 2018). This level of coordination proved critical against a recent Kremlin-led attack pushing fake news about COVID-19 in English, Spanish, Italian, German and French, which relied on ‘contradictory, confusing and malicious reports to make it harder for the EU to communicate its response to the pandemic’ (Emmott, 2020).

Another notable effort, although assessed as slow in development (Polyakova & Fried, 2019), is the Global Engagement Center within the US State Department. With a lofty mission of seeking to ‘direct, lead, synchronize, integrate, and coordinate’ efforts against foreign state and non-state misinformation (US Department of State, n.d.), it also stresses a focus on protecting not only the stability of the US but also that of its allies and partner nations. This is an argument also presented by Rosenberger and Gorman (2020), who rightly emphasise the need for a multilateral approach that not only combats the authoritarian strategy of causing intra-national mayhem but also counters their efforts to fracture international cohesion between likeminded democracies. In response to the same realisation, the North Atlantic Treaty Organization (NATO) has established the Cooperative Cyber Defence Centre of Excellence as a vehicle to combat grey zone threats. Through the conduct of research and publication of findings, as well as the provision of training to member states, NATO hopes to develop greater deterrence against misinformation and ‘fake news’ campaigns, in their case predominantly by Russia (Thompson, 2020).

Although noble efforts, the current trajectory of division across the West clearly suggests that they are not enough (Thompson, 2020). Efforts as recent as September 2020, when Russian military intelligence established 214 accounts, 35 pages and 18 groups on Facebook, as well as 34 Instagram accounts, to spread misinformation confirm this (Nimmo B., Francois, Eib, Ronzaud, & Carter, 2020). Democracies need to accelerate their collective efforts to build resilience and cooperation. While, for example, holding workshops to raise awareness and build capability among mid-level officials of South-East Asian and South-West Pacific nations, as carried out by the Australian Department of Foreign Affairs and Trade (Department of Foreign Affairs and Trade, 2020), is a worthwhile and much-needed act, it is in no way sufficient to counter the deluge of misinformation on social media by Russia and China. A relevant and contemporary example is the spread of conspiracy theories and ‘fake news’ by these actors about COVID-19, intended purely to sow division, doubt and confusion within democracies (Ignatidou, 2020). The fact of the matter is that even the most astute nations, such as Taiwan, are finding the onslaught difficult to defend against and are requesting help (Zhang, 2020). We can only imagine what awaits those less attuned to the dangers of these efforts. To supplement our incipient efforts, we need to demand action from the social media companies themselves.

Regulate

As a result of this realisation, the justified call to regulate social media platforms is gaining increasing attention. While many continue to argue that regulating these platforms undermines their ability to compete in the ‘information market’ as well as their users’ freedom of choice and speech (Samples, 2019), there is little evidence that social media companies are willing to sufficiently regulate themselves (Taylor & Hoffmann, 2019). Hence, doing nothing to force change will continue to deliver the same result. This point was best summed up by the former head of Facebook Australia, Stephen Scheeler, who stressed that ‘[m]ost companies aren't capable of self-regulation because there is an agency problem. Unless there are negative consequences, then you don't take things seriously. Negative consequences drive behaviour and focus the mind’ (Redrup & Tillett, 2019). In other words, it is unreasonable to expect a company to regulate against the very business model that drives its profits. There is therefore a need—and indeed a responsibility—for governments to step in and legislate measures that will ensure social media cannot continue to drive division across democracies.

While stipulating an exhaustive list of necessary regulations remains beyond the scope of this paper, a few principal changes need to be mentioned. The first and foremost would be to limit the amount of information social media companies can collect on users. Although this data fuels their business model, compiling near-infinite data points on user behaviour, without users understanding how they will be used, creates an asymmetry that can be exploited ‘against’ them. In other words, the information garnered through the analysis of user behaviour is what social media platforms use to predict content that will keep users engaged, with the ultimate aim of exposing them to micro-segmented marketing of products (Orlowski, 2020). Additionally, there is an imperative to force the private sector to protect user data and privacy, as harvesting such data is a recognised route taken by authoritarian regimes to target citizens of democracies for purposes of manipulation and control (Rosenberger & Gorman, 2020). Equally important, and to force greater accountability on social media platforms, is to change their status from mere technology companies to media companies. This initiative is heavily contested by social media companies, with arguments ranging from claims that they do not produce unique content to the fact that there is no human editorial intervention in anything shared on their platforms (for a comprehensive analysis of this argument, see for example Napoli & Caplan, 2017). However, a simple fact remains—these companies provide content to users while selling their attention to advertisers, which is a principal defining feature of a media company (Ingram, 2012), making their resistance somewhat futile.

Further, and to combat the virality and influence of misinformation, we should consider the advice of Haidt and Rose-Stockwell (2019), who suggest legislating for the complete removal of ‘like’ and ‘share’ counts from posted content, which would allow users to judge it on its merits rather than through ‘group think’. Equally, they go on, social media platforms should be forced to challenge users posting content that has been flagged as misinformation with a confirmatory question such as: ‘Are you sure you want to post this?’ While not foolproof, this simple act has been shown to significantly reduce the proliferation of misinformation (Haidt & Rose-Stockwell, 2019). Finally, they and others (see for example Helmus, et al., 2018) suggest that social media platforms must demand basic identity verification before a user can open an account. This is an opinion shared by Schick (2020), who argues convincingly that this simple—and arguably common sense—regulation would immediately limit opportunities for bad-faith actors, such as Russia and China, to exploit the inherently negative side effects of these platforms. Although not exhaustive, these measures would go some way to limiting the damage caused. However, and as highlighted by Harris (2020), we have a ‘decade of damage to undo’, meaning that regulation is but one piece of the puzzle. We must also educate the public about these dangers.

Educate

The last, but certainly not the least, component proposed in this model is the need to arm citizens of democracies with education and knowledge about the dangers of social media and the polarisation it is capable of injecting into our public discourse. Kavanagh and Rich (2018) provide a compelling argument that a lack of education, coupled with our cognitive biases, is among the principal drivers behind our ongoing disagreements about facts and data, as well as the blurring line between opinion and fact. To counter this, we must increase our collective efforts to provide the public with tools to recognise and build resilience against misinformation and ‘fake news’. One step in this direction, as suggested by Cherner and Curry (2019), is to mandate that social media literacy be taught in schools, with a focus on developing skills in analysing, critiquing and responding to content that appears in students’ social media feeds. Such an effort would certainly capture the younger demographic growing up with social media as an ever-present medium of communication.

However, while such an intervention is undoubtedly required, we must also focus on educating adults. Shedding light on how this may be achieved (albeit with adult university students as their data set), McGrew, Smith, Breakstone, Ortega and Wineburg (2019) of the Stanford History Education Group showed that, after two 75-minute lessons on evaluating the credibility of online sources, participants were more than twice as likely to effectively search for, evaluate and verify social and political information online. Given the consistent and relentless attempts by states like Russia and China to sow division within democracies using social media, and the overwhelming evidence of their effectiveness, there may be sufficient justification to mandate that such lessons be given in workplaces and community centres, and be shown on public television broadcasting stations. While some might call this alarmist, the public across democracies certainly appear increasingly concerned about the state of democracy and therefore hungry for information on how to defend it. This is perhaps why the Netflix documentary The Social Dilemma, highlighting the dangers of social media, became the platform’s most popular movie in September 2020 (Spencer, 2020).

Although it may appear obvious, it is important to stress that any large-scale, government-mandated education effort may itself be cast as ‘fake news’ and manipulation by those already trapped in the contagion of conspiracy theories. Importantly, such efforts may also inadvertently undermine the perceived credibility of truthful news (Lazer, et al., 2018). Rather than deterring us, however, this should encourage us to develop a strategy that involves civil society groups and independent research institutions working in concert with government institutions, as proposed by Rosenberger and Gorman (2020). The aim must be to build resilience within the populations of democracies rather than merely to counter misinformation.


As this essay has sought to highlight, democracy, as an idea, indeed faces a number of challenges. First, there is growing internal disillusionment with the political system, born of the failed promise of prosperity for all prophesied by generations of its leaders. In turn, this disparity has given rise to particularly volatile populist movements, from both the left and the right of the political spectrum, each seeking, in its own way, recognition of its plight and a change to the status quo. To this set of circumstances we have added social media, which, through its algorithm-driven echo chambers, has accelerated the pace of polarisation. Importantly, as stressed in the preceding pages, authoritarian states such as Russia and China have recognised the potential these strained and combustible circumstances harbour. Although incapable, for now at least, of challenging the West through military and economic means, they are seeking to level this asymmetry by exploiting and amplifying the vulnerabilities embedded within the social media ecosystem. Through relatively inexpensive means, they rely on algorithm-driven segmentation of the populace to sow further polarisation, hoping that it will spill over into open hostilities within and between established democracies. To combat this attack, democracies need to DARE (Defend, Assist, Regulate and Educate) to protect the democratic experiment against these tactics. Although not a panacea for all our troubles, acting swiftly may give us a fighting chance. In parallel, we need to work on resolving the underlying ailments that have plagued our societies. Only then can we hope to take democracy off life support.


Anamitra, D., Donohue, S., & Glaisyer, T. (2017). Is Social Media a Threat to Democracy? Redwood City, CA: The Omidyar Group.

Babbage, R., Bianchi, J., Snelder, J., Yoshihara, T., Friedberg, A., & Rolland, N. (2020). Which Way the Dragon? Sharpening Allied Perceptions of China’s Strategic Trajectory. Canberra: Centre for Strategic and Budgetary Assessments.

Bean, T. (2020, September 24). ‘The Social Dilemma’ Is About To Become The First Documentary On Netflix To Achieve This Incredible Milestone. Forbes.

Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313-7318. doi:10.1073/pnas.1618923114

Cherner, T. S., & Curry, K. (2019). Preparing Pre-Service Teachers to Teach Media Literacy: A Response to “Fake News”. Journal of Media Literacy Education, 11(1), 1-31.

Connaughton, A., Kent, N., & Schumacher, S. (2020). How people around the world see democracy in 8 charts. Pew Research Center.

Department of Foreign Affairs and Trade. (2020). Department of Foreign Affairs and Trade Submission to the Select Committee on Foreign Interference through. Canberra: Department of Foreign Affairs and Trade.

Douglas, K. M., Uscinski, J. E., Sutton, R. M., Cichocka, A., Nefes, T., Ang, C. S., & Deravi, F. (2019). Understanding Conspiracy Theories. Advances in Political Psychology, 40(1), 3-35. doi:10.1111/pops.12568

Ecker, U. K. H., Lewandowsky, S., Fenton, O., & Martin, K. (2014). Do People Keep Believing Because They Want to? Preexisting Attitudes and Continued Influence of Misinformation. Memory and Cognition, 42(2), 292-304.

Emmott, R. (2020, March 18). Russia deploying coronavirus disinformation to sow panic in West, EU document says. Reuters.

Ercan, S. A., & Gagnon, J.-P. (2014). The Crisis of Democracy: Which Crisis? Which Democracy? Democratic Theory, 1(2), 1-10.

European Union External Action. (2018, December 5). Questions and Answers about the East StratCom Task Force. Retrieved from European Union External Action.

Flinders, M. (2016). The Problem with Democracy. Parliamentary Affairs, 69, 181-203.

Ford, C. A. (2020, October 19). Responding to Modern Cyber Threats with Diplomacy and Deterrence. Retrieved from US Department of State.

Frenkel, S., & Barnes, J. E. (2020, September 1). Russians Again Targeting Americans With Disinformation, Facebook and Twitter Say. The New York Times.

Fukuyama, F. (1992). The end of history and the last man. New York: Free Press.

Goertzel, T. (1994). Belief in Conspiracy Theories. Political Psychology, 15(4), 731-742.

Goldberg, J. (2018). Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics Is Destroying American Democracy. New York: Crown Forum.

Gonzalez-Vicente, R., & Carroll, T. (2017). Politics after National Development: Explaining. Globalizations, 14(6), 991-1013.

Haidt, J., & Rose-Stockwell, T. (2019, December). The Dark Psychology of Social Networks: Why it feels like everything is going haywire. The Atlantic.

Hanitzsch, T., Van Dalen, A., & Steindl, N. (2018). Caught in the Nexus: A Comparative and Longitudinal Analysis of Public Trust in the Press. The International Journal of Press/Politics, 23(1), 3-23. doi:10.1177/1940161217740695

Harris, T. (2020, September 24). Welcome to the Cult Factory. (S. Harris, Interviewer) Making Sense Podcast Episode 218.

Harris, T. (2020, October 6). Your Nation’s Attention for the Price of a Used Car. Your Undivided Attention Podcast Episode 25.

Hartcher, P. (2020, October 16). Democracy is fragile. Handle with care. The Sydney Morning Herald.

Harvard Magazine. (2009, September 10). An Intellectual Entente. Harvard Magazine.

Helmus, T. C., Bodine-Baron, E., Radin, A., Magnuson, M., Mendelsohn, J., Marcellino, W., . . . Winkelman, Z. (2018). Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe. Santa Monica, CA: RAND Corporation.

Hendricks, V. F., & Vestergaard, M. (2019). The Attention Economy. In Reality Lost: Markets of Attention, Misinformation and Manipulation. Cham: Springer Open.

High Representative of the Union for Foreign Affairs and Security Policy. (2018). Action Plan Against Disinformation. Brussels: European Commission.

Holleran, A. (2008, November 7). ‘Such a Rough Diamond of a Man’. The New York Times.

Huang, A. (2020, August 11). Chinese disinformation is ascendant. Taiwan shows how we can defeat it. The Washington Post.

Ignatidou, S. (2020, October 5). Covid lies go viral thanks to unchecked social media. Retrieved from Chatham House.

Ingram, M. (2012, May 16). Facebook’s biggest problem is that it’s a media company. Gigaom.

Kavanagh, J., & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. Santa Monica, CA: RAND Corporation.

Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., . . . Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094-1096.

Markovits, D. (2019). The Meritocracy Trap. New York: Penguin Books.

Martin, M. (2018). Why We Fight. London: Hurst & Company.

Martinez, M. A. (2009). The Myth of the Free Market: The Role of the State in a Capitalist Economy. Sterling, VA: Kumarian Press.

McGrew, S., Smith, M., Breakstone, J., Ortega, T., & Wineburg, S. (2019). Improving university students’ web savvy: An intervention study. British Journal of Educational Psychology, 89, 485–500.

Morris, L. J., Mazarr, M. J., Hornung, J. W., Pezard, S., Binnendijk, A., & Kepe, M. (2019). Gaining Competitive Advantage in the Gray Zone: Response Options for Coercive Aggression Below the Threshold of Major War. Santa Monica, CA: RAND Corporation.

Napoli, P., & Caplan, R. (2017). Why media companies insist they're not media companies, why they're wrong, and why it matters. First Monday, 22(5).

Napoli, P. M. (2019). Social Media and the Public Interest: Media Regulation in the Disinformation Age. New York: Columbia University Press.

Nimmo, B., Francois, C., Eib, S. C., Ron, L., Ferreira, R., Hernon, C., & Kostelanci, T. (2020). Exposing Secondary Infektion: Forgeries, interference, and attacks on Kremlin critics across six years and 300 sites and platforms. Graphika.

Nimmo, B., Francois, C., Eib, S., Ronzaud, L., & Carter, J. (2020). GRU and the Minions. Graphika.

Orlowski, J. (Director). (2020). The Social Dilemma [Motion Picture].

Paul, C., & Matthews, M. (2016). The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica, CA: RAND Corporation.

Pew Research Center. (2019). Many Across the Globe Are Dissatisfied With How Democracy Is Working. Pew Research Center.

Piketty, T. (2017). Capital in the Twenty-First Century. London, England: The Belknap Press of Harvard University Press.

Polyakova, A., & Fried, D. (2019). Democratic Defense Against Disinformation 2.0. Washington DC: Atlantic Council.

Redrup, Y., & Tillett, A. (2019, March 28). Social media platforms can't self-regulate. Financial Review.

Robinson, O., Coleman, A., & Sardarizadeh, S. (2019). A Report of Anti-Disinformation Initiatives. Oxford, UK: Oxford Technology and Elections Commission.

Rochefort, A. (2020). Regulating Social Media Platforms: A Comparative Policy Analysis. Communication Law and Policy, 25(2), 225-260.

Rosenberger, L., & Gorman, L. (2020). How Democracies Can Win the Information Contest. The Washington Quarterly, 43(2), 75-96.

Samples, J. (2019). Why the Government Should Not Regulate Content Moderation of Social Media. CATO Institute.

Sandel, M. J. (2020). The Tyranny of Merit: What's Become of the Common Good? New York: Farrar, Straus and Giroux.

Sanger, D. E. (2018). The Perfect Weapon: War, Sabotage and Fear in the Cyber Age. Melbourne: Scribe.

Schick, N. (2020). Deepfakes: The Coming Infocalypse. New York: Hachette Book Group.

Solsman, J. E. (2018, January 10). YouTube's AI is the puppet master over most of what you watch. CNET.

Spencer, S. (2020, September 30). Netflix Movies: The 10 Most-Watched In September. Newsweek.

Taub, A. (2016, November 29). How Stable Are Democracies? ‘Warning Signs Are Flashing Red’. The New York Times.

Taylor, E., & Hoffmann, S. (2019). Industry Responses to Computational Propaganda and Social Media Manipulation. Oxford, UK: Project on Computational Propaganda.

The White House. (2018). National Cyber Strategy of the United States of America. Washington, DC: The White House.

Thomas, E., Zhang, A., & Wallis, J. (2020). Covid-19 Disinformation and Social Media Manipulation. Australian Strategic Policy Institute.

Thompson, T. L. (2020). No Silver Bullet: Fighting Russian Disinformation Requires Multiple Actions. Georgetown Journal of International Affairs, 182-194.

Tormey, S. (2016). The Contemporary Crisis of Representative Democracy. Senate Occasional Lecture Series at Parliament House 13 May 2016. Canberra: Parliament of Australia.

Tufekci, Z. (2018, March 10). YouTube, the Great Radicalizer. New York Times.

Tufekci, Z. (2018, March 19). Facebook’s Surveillance Machine. New York Times.

United Nations. (1966). International Covenant on Civil and Political Rights. New York: United Nations Human Rights Office of the High Commissioner.

US Department of State. (n.d.). Global Engagement Center. Retrieved from US Department of State.

US House of Representatives Permanent Select Committee on Intelligence. (2017). Exposing Russia’s Effort to Sow Discord Online: The Internet Research Agency and Advertisements.

Van der Meer, T. W. (2016). Political Trust and the 'Crisis of Democracy'. In R. Dalton (Ed.), Oxford Research Encyclopedia on Politics (pp. 1-22). New York: Oxford University Press.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. doi:10.1126/science.aap9559

Wallis, J., Uren, T., Thomas, E., Zhang, A., Hoffman, S., Li, L., . . . Cave, D. (2020). Retweeting through the great firewall: A persistent and undeterred threat actor. Australian Strategic Policy Institute Policy Brief No. 33/2020.

Zhang, L. (2020). How to Counter China’s Disinformation Campaign in Taiwan. Military Review, 21-32.

Vedran Maslic, ‘Euthanizing Democracy: The Assisted Death of Liberalism’, The Forge, published October 12, 2021.

