
The Challenge of Organisational Cybersecurity

In the modern warfighting environment, the cyber domain presents an ever-present and pervasive threat to military systems. For the ADF, the degradation of those systems has the potential to devastate mission assurance and operational capability. Consequently, how cybersecurity is perceived by the people involved in these processes can have a significant influence over planning and management.

Cybersecurity refers to the measures taken within computing systems to protect them from harmful attacks and security breaches (Merriam-Webster, 2017). The concept extends beyond cyber-physical systems and digital architecture to encompass the complex interaction of people with those systems.

The ADF’s strategic cybersecurity direction is highlighted in the 2020 Defence Strategic Update (Department of Defence, 2020). This document commits to deterring actions against Australia’s interests in the cyber domain and to heavy investment in information and cyber capabilities (Department of Defence, 2020, p. 33). These strategic areas of growth involve not only the development of technical mastery, but also investment in the cybersecurity strength of the ADF workforce.

The challenges that the ADF faces with regard to cybersecurity are not limited to the technical manoeuvres of adversaries, but also involve cultural fallacies within the Defence environment. No matter how secure technological solutions are, the exploitation of human factors, whether through traditional hacking or social engineering, remains a significant threat to Defence systems. This is illustrated by the growing preference for social engineering attacks against systems over technical exploits (Proofpoint, 2016, p. 2). The ADF is a ‘people-intensive business’ and its personnel are its greatest asset (Brodtmann, 2015; Thomas & Bell, 2007, p. 97), yet human psychological factors, including ignorance, negligence and apathy, leave those same systems exposed.

Perceptions of cybersecurity have deep roots in what individuals have previously been exposed to. The ideas that individuals hold on the topic rely on a combination of their own experiences and biases. From a cyber perspective, these spheres of influence range from motion pictures such as Hackers to popular culture artefacts such as memes. As society’s technological presence and reliance on digital systems rapidly evolve, so too does the need for individuals to understand the vital role they play in enabling and securing those systems.

Fundamentally, the stereotypes these portrayals create give individuals an opportunity for diffusion of responsibility, reducing the degree to which they feel personally liable for cyber breaches. The sentiment of “cybersecurity as everyone’s responsibility” is further weakened by the compartmentalisation of roles within Defence environments. Ultimately, a cultural reformation of what cybersecurity means to every member of the ADF is required to minimise the adverse consequences of a cyber-attack on mission assurance.

Divergence of Cyber Perceptions

Within the media, cybersecurity is often portrayed through the lenses of specific organisations or individuals, ranging from hacking groups to government departments. Each of these portrayals shifts people’s perception of cybersecurity away from their own sphere of responsibility, placing it instead onto abstract entities or idealised notions of what a cybersecurity expert is.

It is well established in psychology that people’s behaviour is influenced by their attitudes, subjective norms and perceived behavioural control (Ajzen, 1991, p. 179). Role models play a key part in shaping these norms and influencing people’s perceptions. The media is a powerful vehicle for presenting role models, including those in cyber, to audiences, and can thus influence their decisions and actions, especially within an occupational context (Bosma, Hessels, Schutjens, Van Praag, & Verheul, 2012, p. 410).

For many people who are not embedded within the cybersecurity community, perceptions of what constitutes cybersecurity are built from popular culture and stereotypes. Media can serve to improve exposure to and awareness of cybersecurity. This was exemplified by how the 1983 film WarGames influenced then US President Ronald Reagan’s discussions and perspectives on ballistic missile programs and prompted a national security decision directive relating to information systems security (Brown, 2008; Kaplan, 2016). It is essential to note that these films exist primarily for entertainment value, with dramatisation being a primary focal point. They do not represent the majority of cybersecurity incidents and can therefore oversimplify cybersecurity threats.

Online culture also has a strong influence on how cybersecurity is perceived, with masked hacking groups such as ‘Anonymous’ and online communities such as 4chan dominating the general public’s perception of cyberspace. Even something as simple as a Google Images search for “Cyber” returns images of circuit boards, binary code and people in black hoodies hunched over computer screens (Google, 2020). These images reinforce unhelpful messages about what cybersecurity is and who is involved with it, contributing to the average person’s diffusion of responsibility through a lack of association.

Combined, these stereotypical perceptions are damaging to the ADF’s cybersecurity posture. Overwhelmingly, they serve to exclude the typical information technology user in the ADF. Such ideas rarely align with the average person’s identity and allow for cognitive distancing from cybersecure behaviours. This has flow-on effects for cybersecurity recruiting, with personal views of roles potentially differing from the employment reality. Ultimately, the more individuals are able to distance their own identity from those stereotypically seen as responsible for or involved in cybersecurity, the less responsible they feel for cybersecure actions themselves.

When the majority of media portrays cybersecurity as something for ‘hackers’, ‘geeks’ or ‘technical specialists’, it inherently reinforces cybersecurity as a topic exclusively for those stereotypes. This enables diffusion of responsibility, with individuals projecting their cybersecurity duties onto these abstract ideas, people and organisations. In an ADF context, this thought process fosters the potentially damaging idea that cybersecurity is the concern of only certain units or organisations, and leaves gaping holes in the cybersecurity posture of the operational force.

The idea that individuals have a dominant brain hemisphere, that is that ‘left-brained’ people are logical and ‘right-brained’ people are creative, is a destructive myth that can influence people’s perception of their capability (Corballis, 2014; Marcus, 2017, p. 3).

In the ADF, this can translate into the idea that, in both personal employment and work processes, people are fundamentally ‘technical’ or ‘non-technical’. Jobs within the ADF tend to fall within these siloed categories and can therefore reinforce the idea. The false dichotomy of people’s ‘technicality’ can have devastating consequences for their sense of cyber responsibility, especially where it serves as an enabler for cybersecurity dismissiveness.

The Consequences of Divergence

In a military context, mission assurance involves “establishing and maintaining a reasonable degree of confidence in mission success” (Musman, Tanner, Temin, Elsaesser, & Loren, 2011, p. 210). As a key warfighting environment, the cyber domain contributes to the ability of the ADF to achieve mission assurance. Consequently, enabling cyber resilience is a key component of mission assurance, ensuring the systems on which the ADF relies are capable of withstanding attacks against their infrastructure. No system will ever be completely invulnerable, and risk identification and mitigation are important components of the cyber hardening process. However, they do not encompass a complete solution to mitigating mission degradation.

Cyber mission assurance is often viewed exclusively through a technical lens, and this focus is reflected in existing research. This includes architecture resiliency (Goldman, 2010) and frameworks grounded in traditional engineering risk management, such as Crown Jewels analysis, used to understand and model cyber-attack effects (Musman et al., 2011). Even within a military context, there is a significant focus on systems engineering approaches to understanding how cyber adversaries can influence mission assurance (Sullivan, Colbert, & Cowley, 2018). These methods play a key role in ensuring cyber-worthiness, however they rarely provide a complete solution for mitigating cyber-attacks. The human factor is therefore an essential element of the complex cyber environment.

Technical cybersecurity issues can be addressed more easily than social and cultural ones because their solutions are computationally logical. Whilst the required end state of people’s perceptions and behaviours within a cyber-secure ADF can be easily visualised, inciting that cultural change is a far more significant challenge. Consequently, it can feel more intuitive to invest in cybersecurity development that meets objective, measurable requirements than in subjective cultural transformation.

Cybersecurity Leadership Needs

Instilling the idea of “cybersecurity as everyone’s business” is crucial to ensuring ADF systems are protected. However, shifting the perception from technical cyber risk management and response to a more holistic, proactive cybersecurity posture requires significant cultural influence.

While there are silos of ADF capability specifically focussed on cybersecurity, this does not diminish the critical role that individuals play in engaging in protective cybersecurity practices. Technology is so pervasive throughout the ADF, and so critical an enabler of mission success, that protecting it is the responsibility of all people involved.

To further this concept, consider the roles and responsibilities of a person driving a car. Whilst a driver may not completely understand the mechanical details of how the engine operates, they still bear responsibility for driving that vehicle. The majority of car accidents do not occur solely through mechanical failure but involve human error (Green & Senders, 2004). Similarly, applying technical cybersecurity mitigations to Defence systems does not reduce the cyber responsibility of the people using those systems daily. Relying on ADF cybersecurity teams to protect systems from cyber-attacks is like relying solely on mechanics to prevent vehicle accidents on the road; both are incomplete solutions to the issues faced.

There are examples of a renewed understanding of cyber as an element of warfighting that is not limited by the technical ability of the people who interact with the domain. In recent years NATO has endeavoured to integrate cyberspace operations into its overall joint operations framework (Bigelow, 2017). There is an essential need to see cyber not simply as another domain, equivalent to land, sea or space, but rather as an enabler through which systems in those other domains function. Protecting cyberspace therefore serves the outcome of protecting the ADF’s other capabilities.

Colonel Bryant identifies the issue of cybersecurity perception clearly:

“Mission assurance in and through cyberspace is not fundamentally an IT problem, but a mission problem that requires a mission focus and approaches that go beyond what we have come to think of as traditional cybersecurity” (Bryant, 2016, p. 5)

One of the unintended consequences of perceiving people as either technical or non-technical is uneven cybersecurity representation and experience at the tactical and strategic levels. If technical skills are valued in a tactical environment and non-technical skills in a strategic one, representation in these contexts may lack the diversity needed to make quality, informed cybersecurity decisions. This also adversely affects experiences of career progression and, ultimately, retention. Diversity of thought is an essential component of good decision-making teams and is reduced when skillsets and expertise are limited (Jackson, May, Whitney, Guzzo, & Salas, 1995). When thought diversity cannot be achieved because of perceived cognitive or conceptual limitations, the risk of ‘groupthink’ and echo chambers grows. Cybersecurity is not a topic that should be confined to technical subject matter experts, nor should it be defined by people who have operated in a solely strategic context. The best cybersecurity discussions happen at the intersection of these environments.

The issues of cybersecurity perception and reality cannot be addressed through either a solely technical or a solely non-technical approach. There is an increasing need for cybersecurity ‘translators’ within Defence leadership: people with solid foundations in technical knowledge who can relay these challenges within larger strategic contexts. Cultural change across the ADF with regard to cybersecurity relies on bridging the gap between the specialist cybersecurity workforce and other employment categories. Every person who interacts with a digital system or information process has the potential to protect that system from a cyber-attack, or conversely to compromise it. Ultimately, “cybersecurity as everyone’s responsibility” needs to become more than just a statement; it must underpin the everyday actions of all Defence members.

References

Ajzen, I. (1991). The theory of planned behavior. Organizational behavior and human decision processes, 50(2), 179-211. 

Bigelow, B. (2017). Mission assurance: shifting the focus of cyber defence. Paper presented at the 2017 9th International Conference on Cyber Conflict (CyCon).

Bosma, N., Hessels, J., Schutjens, V., Van Praag, M., & Verheul, I. (2012). Entrepreneurship and role models. Journal of Economic Psychology, 33(2), 410-424. 

Brodtmann, G. (2015). Statement, Capability Through Diversity [Press release]. Retrieved from http://www.gaibrodtmann.com.au/statement_capability_through_diversity

Brown, S. (2008). WarGames: A Look Back at the Film That Turned Geeks and Phreaks Into Stars. Wired Magazine, 16(08). Retrieved from https://www.wired.com/2008/07/ff-wargames/. 

Bryant, W. D. (2016). Mission assurance through integrated cyber defense. Air & Space Power Journal, 30(4), 5-18. 

Corballis, M. C. (2014). Left brain, right brain: facts and fantasies. PLoS Biol, 12(1), e1001767. 

Department of Defence. (2020). 2020 Defence Strategic Update.  Retrieved from https://www.defence.gov.au/StrategicUpdate-2020/docs/2020_Defence_Strategic_Update.pdf

Goldman, H. G. (2010). Building secure, resilient architectures for cyber mission assurance. The MITRE Corporation. 

Google. (2020). Cyber – Google search. Retrieved from https://*******32jfA0z

Green, M., & Senders, J. (2004). Human error in road accidents. Visual Expert. 

Jackson, S. E., May, K. E., Whitney, K., Guzzo, R., & Salas, E. (1995). Understanding the dynamics of diversity in decision-making teams. Team effectiveness and decision making in organizations, 204, 261. 

Kaplan, F. (2016). ‘WarGames’ and Cybersecurity’s Debt to a Hollywood Hack. The New York Times. Retrieved from https://www.nytimes.com/2016/02/21/movies/wargames-and-cybersecuritys-debt-to-a-hollywood-hack.html.

Marcus, J. (2017). The Left- and Right-Brain Myth.

Merriam-Webster. (2017). Cybersecurity. Merriam-Webster Dictionary. Retrieved from https://www.merriam-webster.com/dictionary/cybersecurity

Musman, S., Tanner, M., Temin, A., Elsaesser, E., & Loren, L. (2011). A systems engineering approach for crown jewels estimation and mission assurance decision making. Paper presented at the 2011 IEEE Symposium on Computational Intelligence in Cyber Security (CICS).

Proofpoint. (2016). The Human Factor. Retrieved from https://www.proofpoint.com/sites/default/files/human-factor-report-2016.pdf

Sullivan, D., Colbert, E., & Cowley, J. (2018). Mission resilience for future army tactical networks. Paper presented at the 2018 Resilience Week (RWS).

Thomas, K., & Bell, S. (2007). Competing for the Best and Brightest: recruitment and retention in the Australian Defence Force. Security Challenges, 3(1), 97-118.
