
Democracies need to shed the ethical baggage associated with counterpropaganda and harness the integrity of their institutions to engage in positive information offensives in a hyperconnected age.

The US’s troop withdrawal from Afghanistan in the summer of 2021 remoulded the global geopolitical terrain in ways the strategic punditry is still grappling with. The international media was awash with the scenes of the mujahideen confidently posing for the cameras as they occupied the complexes of Kandahar, peering straight into the West’s bone-weary democratic soul.

For the Taliban, the kinetic seizure came merely as a by-product of the victory in the information space.

Back in 2012, former CIA officer Arturo G. Muñoz, an expert on narrative warfare, had written the epitaph for the US’s contest in Afghanistan. In a monograph for the RAND Corporation, Muñoz undertook a frank appraisal of the information war in Afghanistan, with many American officers privately admitting to him that the US was losing the battle of minds and opinions.[1]

As the US’s information operations (IO) machinery spread thin across another internecine front—Iraq—all the Taliban had to do was play the waiting game as the kinetic battle succumbed to its natural attrition. At the same time, it kept dog-whistling into the ears of the Afghan populace with relentless propaganda, becoming the Faustian master of their souls.

The seeds of the Taliban’s victory may have been planted in 2009 or even earlier. Emerson T. Brooking, a senior fellow at the Atlantic Council, retraced the footsteps of the Taliban’s conquest of perceptions.[2] He believes that its digital influence strategy could be dated back to 2002, with crucial pivots towards the new media ecology in 2009 and 2017. Former US deputy assistant secretary of defense Michael Doran noted in 2008 that the Taliban’s version of the story used to make its way to the news tickers of the BBC within 26 minutes.[3]

The Taliban’s information superiority in Afghanistan bears an uncanny resemblance to Russia’s new approach to warfighting which, in the words of Chatham House fellow Keir Giles, is ‘simply a recognition of the primacy of the political over the kinetic’.[4] Giles believes that ‘kinetic operations are [now] frequently undertaken to produce an information effect instead of delivering effect in their own right’.[5]

Around the time Muñoz’s work was published, the US military undertook the much-anticipated overhaul of Joint Publication 3-13, Information Operations, absorbing the lessons from Afghanistan and Iraq. But by the time the new IO doctrine was instituted in the following years, the US military was already fighting the previous war. The emerging global information environment called for a whole-of-government response to IO rather than tactically limiting it to permissive geographies.[6]

While all that transpired, the US was hit with a barrage of Russian digital influence operations in 2016 that struck at the heart of its electoral democracy. It triggered a strange, self-perpetuating cycle of self-doubt and contention in the US polity which substantially degraded the quality of its national discourse.[7]

IO researchers Alicia Wanless and Michael Berk ask if the excessive credit given to Russia is in itself ‘a remarkable success for the Russian effort: the success in denying [American] democracy one of its core pillars—the capacity to have a public debate based on some sense of a shared reality and trust in institutions’.[8]

In recent years, liberal democracies like the US have been at the receiving end of unannounced information wars with no clear beginnings or ends—as adversarial campaigns transgressed into their homelands. Thriving in the ‘grey zone’ between war and peace, these campaigns are what Robert Johnson and Timothy Clack of Oxford have dubbed the ‘new form of “total” war’. It is an extension of the legendary media theorist Marshall McLuhan’s definition of the Cold War: ‘Real, total war has become information war… fought by subtle electric informational media—under cold conditions, and constantly.’[9]

Johnson and Clack suggest that the hyperconverged cyberspace, new media ecology and borderless social terrain guarantee that the ‘actors and audiences are proximate to each other all the time’. ‘Almost everyone,’ writes British journalist David Patrikarakos, ‘can now be an actor in war.’[10]

All of that has punctured holes in the fundamental structures of epistemic security that the liberal democracies had put up to shield their populations from malign influence (the term ‘epistemic’ here refers to how democracies nurture their own relationships with information). Those checks and balances had reasonably guaranteed that the West could function ‘as a series of deliberative democracies’ while not undermining its own exploitation of the information environments in foreign military theatres.[11] But as Wanless and Berk point out: ‘This traditional distinction between political versus military messaging, and domestic versus foreign audiences, is unfortunately no longer possible in a hyperconnected age.’[12]

Take, for example, the Smith-Mundt Act—a Cold War-era law which put up guardrails to prevent the US from influencing its own populace as it conducted IO abroad. The Act was built upon an epistemic construct of the twentieth-century Westphalian state which, as Patrikarakos explains, drew much of its power from its monopoly over the use of force and the near total control of information flows. The information environment has become too contestable, diffused and disintermediated for the old statist approach.[13]

The West’s institutional and political aversion to IO has its roots in history. David Welch, a historian specialising in propaganda, retraces it to the ‘deep mistrust developed on the part of ordinary citizens who realised that conditions at the [WWI] front had been deliberately obscured by patriotic slogans’.[14] The UK’s Ministry of Information and the US’s Committee on Public Information were promptly shut down after WWI.

Covert IO on foreign soil, however, did not abate. As Thomas Rid painstakingly documented in Active Measures, the American activities steadily increased as the Cold War heated up, with aggressive operations like the Kampfgruppe and LC-Cassock peaking in the sixties, until concerns about influencing the domestic populace became mainstream.

The strategic confusions and ethical dilemmas around what constitutes a legitimate defence against IO have affected the offensive IO capability as well.

The US political establishment has had an estranged relationship with cognitive warfare. In the world order that emerged after the collapse of the Soviet Union, its policymakers felt that there was little need for IO to retain any strategic importance.

‘The problem appears to emanate from,’ as Wanless and Berk put it, ‘the long and uneasy relationship between the role of propaganda in liberal democracies, as a mechanism of constructive political socialisation, and individual freedoms for access to information and opinion that constitute a cornerstone of these societies.’[15] The Western militaries stepped ‘valiantly into the breach to address the problem that extends beyond open hostilities by applying the existing definitions and tactical activities used in influence and cyber operations’.[16]

IO was naturally reduced to the theatre-centric operational structures of the US military.[17]

Back in 2009, Dennis M. Murphy, the then professor of IO at the US Army War College, wrote a critique with the self-explanatory title Why Warfighters Don’t Understand IO.[18] Many of his observations still hold true. He questioned the defanging of IO, a broader integrating function, by conflating it with the more tactically aligned psychological operations (PSYOP).[19] He also foresaw the taxonomic confusion arising from policy decisions like the multiple name changes and reversals of ‘PSYOP’ to the more benign-sounding ‘Military Information Support Operations’. The idea behind the renaming was to somehow disassociate IO from the deeply manipulative, ethically questionable aspects of covert propaganda. But that made the Western IO frameworks look impotent. Johnson and Clack find it striking that the NATO doctrine is completely defensive—not even mentioning offensive IO under the ‘fundamentals of information operations’.[20]

As power moves away from ‘hierarchies to citizens and networks’, Patrikarakos sees it as a ‘regression from centralised communicative modes to the more chaotic, network effects of an earlier, pre-twentieth century nation state’.[21] Academics Andrew Hoskins and Ben O’Loughlin write in War and Media that the ‘mediatisation’ of war—the paradigmatic shift in how the new media ecology records conflict—has broken the predictable chain of cause and effect, ‘creating greater uncertainty for policymakers in the conduct of war’.

While broader philosophical questions, such as the endurance of the Westphalian state, need examination, the minimum common denominator is that Western liberal democracies need to shed the ethical baggage associated with counterpropaganda and come up with new models of epistemic security suited to a hyper-pluralistic and hyperconverged information environment. (Counterpropaganda, by definition, is the covert or overt application of IO against a domestic or foreign audience that has been targeted by adversarial propaganda, to neutralise or reverse the latter’s effects.)

Hybrid threats exploit the delicate interfaces across the military, civilian and political apparatus, and society at large—part of a broader systems-of-systems approach to warfare—to possibly avoid a kinetic conflict by shaping the adversary’s information space.[22] Computational propaganda leveraging the inherent epistemic vulnerabilities of liberal democracies would be the key to that.

The question which remains to be answered is, are there any new constructs which could afford legitimacy to domestic counterpropaganda and, by extension, foreign covert IO?

In his book Striking Back, Thomas Kent, the former president of Radio Free Europe/Radio Liberty—a propaganda outfit of the US government—recommends looking at a ‘sliding scale of ethics’ for foreign covert IO.[23]

First and foremost, effective IO is not always wedded to falsity and deceit. Studies have shown ‘falsehoods closest to the truth are more easily delivered and consumed’.[24] While crafting covert IO, Kent notes, it is common to ask: ‘Does it pass the Washington Post test?’[25]

Most IO would eventually become public. Kent suggests looking at veteran diplomat Joseph Nye’s three ethics for government actions when that happens: ‘Whether the goal was worthy, whether there was proper analysis of the chances for success, and what unintended consequences the action brought on.’

Kent also points out that while the ethical opponents of covert IO may believe that ‘this is not who we are’, it is certainly who ‘we’ have been. He charts the Western institutional history of covert IO to the proclamations of President Harry S. Truman, who once described conflict as the ‘struggle, above all else, for the minds of men’.

For liberal democracies, having a resilient, plural and cognitively secure domestic information environment is the essential precondition to potent offensive IO.

Wanless and Berk note that a war extended over time is an ‘imitative and reciprocal activity’ where adversaries start mimicking each other.[26] Any shaping of the domestic information environment using tools like denial or censorship would make the US look more like Russia. They suggest a subtle delineation of persuasion as ‘a legitimised process of transparent and constructive socio-political engineering’ while building capabilities countering external influence.

Corneliu Bjola and Krysianna Papadakis of Oxford offer a compelling proposition for building cognitively secure democracies by drawing upon the lessons from the famed Finnish resilience to Russian disinformation.[27]

German philosopher Jürgen Habermas broadly delineated the interfaces of public engagement in a democratically governed society across macro and micro spheres. In a reflexive arrangement, the macrosphere allows decisions related to societal governance to be properly articulated (will-formation), while the microsphere channelises the lived experiences of individuals to create conducive conditions for critical engagement and political accountability (opinion-formation). Each function, Bjola and Papadakis write, ‘besieges the other but without conquering it’.[28]

The two researchers believe that disinformation targets this peculiar subtlety of balance and the inherent tension between the macro and micro spheres of governance by weaponising the embryonic agitational, dissenting social formations termed ‘counterpublics’. Counterpublics are generally associated with the ‘emancipatory inquiries’ of socially marginalised groups. Disinformation, however, seeds extremist social formations which hijack that discursive space reserved for the truly marginalised. Gradually, such ‘unruly counterpublics’ upset the symbiotic arrangement between the macro and micro spheres, leading to societal radicalisation. That is exactly how far-right groupings end up employing the same terminology and victim narratives as radical formations for women’s, workers’, ethnic, racial or sexual empowerment.

Bjola and Papadakis explain how Finland—using its civil society, media and administration—established circuit breakers at both the macro and micro spheres, each requiring a different defence of epistemic security. Especially at the microsphere, the framework of ‘true’, ‘false’ or ‘fake’ becomes less meaningful with the radical appeal of the alternate media (like Breitbart News and InfoWars in the US). Fact-checking, too, takes a backseat as it is outpaced by the virality of social media. The researchers conclude that a ‘society is only as resilient to disinformation as its most vulnerable segments’, recommending engagement rather than the ostracisation of the societal fringe.

What the democracies need to tap into is the latent potential of their robust institutions in narrating, warts and all, the stories of the triumphs of truth, pluralism, transparency and justice, without varnishing them with self-congratulatory messages.

The unlikely duo of defence technologist Oliver Lewis and filmmaker Chris DeFaria hit the nail on the head when they write that the generalised communications central to the modern bureaucracy, which emerged during the nineteenth century, resulted from seeing the ‘implementation of policy largely as an objective exercise in the abstract organisation of society through technocratic means’.[29] To avoid being misconstrued, being perceived as deliberately manipulating the electorate, or simply out of sheer cowardice, bureaucracies aspired towards scientific accuracy and the presentation of statements of fact.

Lewis and DeFaria are emphatic in saying that ‘this attitude misunderstands how narratives are received and failed to exploit the effectiveness of storytelling’. ‘We share a conviction,’ they write, ‘that the institutions of government in the UK and the US—and the institutions of most mature democracies—have lost the ability to tell a good story’.[30] They further exhort: ‘Without good stories, our institutions are hollow mouthpieces for pseudo-scientific policy proclamations that fail to capture the imagination of citizens.’[31] This resonates deeply with the rise of populist anti-science movements taking hold of the political discourse in the aftermath of the pandemic. They are perfect fodder for weaponisation.

There is a misplaced nostalgia around former institutions like the US Information Agency in the current policy conversations on cognitive security.[32] Simply resurrecting institutional models of the past is not going to solve emerging societal and national security problems that have no precedent.

What liberal democracies like the US require is a reconstruction of the epistemological foundations upon which their societies and institutions rest. To prevail in the modern template of war based upon ‘data, attention and control’, the focus should be on fundamentals like civic education.[33] It is going to be a crucial, painful and paradigmatic departure from the ‘neoliberal ideology which posits the idea that information surrounding political and social life should be delivered in a convenient, fun, and entertaining format’.[34]

 

The views expressed in this article are personal and not of Pukhraj Singh’s employer.

 

Footnotes

1 Arturo Muñoz, ‘U.S. Military Information Operations in Afghanistan’ (RAND Corporation, 2012).

2 Emerson T. Brooking, ‘Before the Taliban Took Afghanistan, It Took the Internet’, New Atlanticist, Atlantic Council, 26 August 2021, https://www.atlanticcouncil.org/blogs/new-atlanticist/before-the-taliban-took-afghanistan-it-took-the-internet/.

3 Robert Haddick, ‘This Week At War, No. 17’, Foreign Policy, 22 May 2009, https://foreignpolicy.com/2009/05/22/this-week-at-war-no-17/.

4 Keir Giles, ‘Russian Information Warfare: Construct and Purpose’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects, ed. Timothy Clack and Robert Johnson (Routledge, 2021), 142.

5 Giles.

6 Blagovest Tashev, Michael Purcell, and Brian McLaughlin, ‘Russia’s Information Warfare: Exploring the Cognitive Dimension’, MCU Journal 10, no. 2 (2019): 132–33.

7 Yochai Benkler, Robert Faris, and Hal Roberts, ‘Epistemic Crisis’, in Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (Oxford University Press, 2018), 3–44.

8 Alicia Wanless and Michael Berk, ‘The Changing Nature of Propaganda: Coming to Terms with Influence in Conflict’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 72.

9 Alicia Wanless and Michael Berk, ‘The Changing Nature of Propaganda: Coming to Terms with Influence in Conflict’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 68.

10 Timothy Clack and Robert Johnson, ‘Introduction’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 1–18.

11 Timothy Clack and Louise Selisny, ‘From Beijing Bloggers to Whitehall Writers: Observations on the “Invisible War”’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 263.

12 Alicia Wanless and Michael Berk, ‘The Changing Nature of Propaganda: Coming to Terms with Influence in Conflict’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 67.

13 David Aucsmith, ‘Disintermediation, Counterinsurgency, and Cyber Defense’, in Bytes, Bombs, and Spies: The Strategic Dimensions of Offensive Cyber Operations, ed. Herbert Lin and Amy B. Zegart (Brookings Institution Press, 2018), 343–56.

14 David Welch, ‘A Brief History of Propaganda: “A Much Maligned and Misunderstood Word”’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 25.

15 Alicia Wanless and Michael Berk, ‘The Changing Nature of Propaganda: Coming to Terms with Influence in Conflict’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 75.

16 Wanless and Berk.

17 Tashev, Purcell, and McLaughlin, ‘Russia’s Information Warfare: Exploring the Cognitive Dimension’.

18 Dennis M. Murphy, ‘Talking the Talk: Why Warfighters Don’t Understand Information Operations’ (US Army War College, 1 May 2009), https://csl.armywarcollege.edu/usacsl/publications/IP_4-09_-_Talking_the_Talk.pdf.

19 US Army War College, ‘Information Operations Primer’, 2011, https://apps.dtic.mil/sti/pdfs/ADA555809.pdf.

20 Timothy Clack and Robert Johnson, ‘Introduction’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 4.

21 David Patrikarakos, ‘Homo Digitalis Enters the Battlefield’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 42.

22 Daniel Kirkham, ‘Army’s Hybrid Threat (Part Two): Knowing Your Enemy’, The Cove, 16 June 2020, https://cove.army.gov.au/article/armys-hybrid-threat-part-two-knowing-your-enemy.

23 Thomas Kent, ‘The Covert Arts’, in Striking Back: Overt and Covert Options to Combat Russian Disinformation (The Jamestown Foundation, 2020).

24 Timothy Clack and Louise Selisny, ‘From Beijing Bloggers to Whitehall Writers: Observations on the “Invisible War”’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 261.

25 Kent, ‘The Covert Arts’.

26 Alicia Wanless and Michael Berk, ‘The Changing Nature of Propaganda: Coming to Terms with Influence in Conflict’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 71.

27 Corneliu Bjola and Krysianna Papadakis, ‘Digital Propaganda, Counterpublics, and the Disruption of the Public Sphere: The Finnish Approach to Building Digital Resilience’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects, ed. Timothy Clack and Robert Johnson (Routledge, 2021), 186–212.

28 Corneliu Bjola and Krysianna Papadakis, ‘Digital Propaganda, Counterpublics, and the Disruption of the Public Sphere: The Finnish Approach to Building Digital Resilience’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects, ed. Timothy Clack and Robert Johnson (Routledge, 2021), 191.

29 Oliver Lewis and Chris DeFaria, ‘“Does My Suffering Matter?” - Storytelling and the Military’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 84.

30 Oliver Lewis and Chris DeFaria, ‘“Does My Suffering Matter?” - Storytelling and the Military’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 81.

31 Lewis and DeFaria.

32 Matthew Armstrong, ‘The Irony of Misinformation and USIA’, 3 August 2021, https://mountainrunner.us/2021/08/the-misinformation-around-usia/.

33 Andrew Hoskins and Matthew Ford, Radical War: Data, Attention, Control (Hurst Publishers, 2022).

34 Gillian Bolsover, ‘Social Media, Computational Propaganda, and Control in China and Beyond’, in The World Information War: Western Resilience, Campaigning, and Cognitive Effects (Routledge, 2021), 122–38.

Pukhraj Singh, "Counterpropaganda is Not a Dirty Word", The Forge, Published: June 29, 2022, https://theforge.defence.gov.au/article/counterpropaganda-not-dirty-word. (accessed December 22, 2024).
