
While technological advantage has always been a deciding factor in warfare, the rate of progress in the field is accelerating, and the complexities of new capabilities raise new and urgent ethical questions about their development and use.

Technology

Militarily relevant technologies are developing at a rapid rate, bringing urgency to the need to consider the ethical implications of new and emerging capabilities.[1] Drones have become part of the modern arsenal, altering the distance from and risks of violence for their operators. Evolving fields of Artificial Intelligence are challenging existing understandings of responsibility chains and the legal obligations that regulate them. Advances in biotechnology are opening up possibilities for military biomodification and raising questions about the medical ethics of altering the human body. All of these technologies, and others, overlap with peaceful civilian applications (so-called ‘dual use’), so their research and development must straddle both civilian and military ethical frameworks. The ADF will need to understand how these changes will affect traditional ethics frameworks and their applications.

In the last few decades, uncrewed vehicles have become a key component of modern militaries. Most prominently, this category includes uncrewed aerial vehicles (colloquially called drones, and also known as remotely piloted aircraft), but it also includes uncrewed naval vessels[2] and uncrewed land vehicles.[3] Applications of this technology range from humanitarian support to reconnaissance to lethal targeting. Particularly with regard to the last application, there is concern that the concept of ‘riskless warfare’—the ability to attack remotely without having personnel exposed to danger—could lower the political and social cost of going to war and make decision makers more likely to engage in warfare.[4] Wayne Phelps, in his book On Killing Remotely, also discusses the psychological consequences of killing from a distance. Remote operators experience many of the same physiological effects as those in active engagement but risk a lack of empathy from those who see their role as not ‘real’ combat.[5] In addition, they frequently transition emotionally between states of active combat and peaceful civilian life, leading to a situation described as a ‘never-ending deployment’.[6]

The human distance from violence can also be increased by the use of Artificial Intelligence (AI), currently at the forefront of military technological advancement. Present and potential future military applications of this technology include autonomous weapons and text-generating ‘bots’, as well as non-military-specific uses such as suicide prevention tools.[7] The ethical issues posed by this new and still-evolving technology are raising questions both in and out of military contexts, and involve numerous stakeholders at the state level, in international organisations, and in the private sector. Military-specific concerns include the diffusion of command responsibility,[8] clarity of the principle of Distinction,[9] and transparency and traceability in data and processes.

In particular, AI technology is now enabling greater abdication of human involvement in the actions of weapon systems. Lethal Autonomous Weapons (LAWs) are being developed by several well-resourced militaries around the world in what has been described as the third revolution in warfare, after gunpowder and nuclear weapons.[10] Degrees of automation in weaponry have existed since the invention of landmines, now restricted under various conventions,[11] but automation is expanding to the point where machines may select and kill human targets without the need for human input. In any practical ethical framework, clarity is necessary in defining regions on the spectrum from full human control to full autonomy. Several definitional schemes have been proposed.[12] Suggestions include Automatic < Automated < Autonomous,[13] and human ‘in the loop’ < human ‘on the loop’ < human ‘out of the loop’.[14] Various currently deployed systems, such as active protection systems, would fall within at least the first two levels.[15] There is mounting pressure globally to limit or prevent the highest levels of autonomous weapons, colloquially referred to as ‘killer robots’.[16]
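To make the ‘loop’ taxonomy concrete, the minimal sketch below models it as a simple engagement-gating rule: a fully autonomous (‘out of the loop’) engagement is never released, and an ‘in the loop’ engagement requires explicit human authorisation. This is purely illustrative; the HumanControl levels, the EngagementRequest type, and the gating rule itself are assumptions made for the example, not any fielded system or doctrinal definition.

```python
from dataclasses import dataclass
from enum import IntEnum


class HumanControl(IntEnum):
    """One proposed spectrum of human control over a weapon system."""
    IN_THE_LOOP = 0      # a human must positively authorise each engagement
    ON_THE_LOOP = 1      # the system acts, but a supervising human can veto
    OUT_OF_THE_LOOP = 2  # the system selects and engages without human input


@dataclass
class EngagementRequest:
    target_id: str
    control_level: HumanControl
    human_authorised: bool = False  # set by an operator for in-the-loop use


def may_engage(request: EngagementRequest) -> bool:
    """Illustrative gating rule: never release a fully autonomous engagement;
    require explicit authorisation when a human is in the loop."""
    if request.control_level is HumanControl.OUT_OF_THE_LOOP:
        return False
    if request.control_level is HumanControl.IN_THE_LOOP:
        return request.human_authorised
    return True  # on the loop: proceeds unless the supervisor intervenes


# Example: a supervised active protection system sits within the first two levels.
request = EngagementRequest("incoming-projectile-01", HumanControl.ON_THE_LOOP)
print(may_engage(request))  # True
```

Even this toy rule shows where the definitional pressure lies: everything of ethical consequence hides in how ‘authorisation’ and ‘veto’ are actually implemented and exercised.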

While there is agreement from the ADF and in the wider international community[17] that IHL applies to new technologies, there is broad concern that the current legal framework is inadequate to regulate emerging AI and autonomous technologies and lacks clarity in its application.[18] The Martens Clause may provide some assistance on the use of AI in weapons systems by highlighting the principles of humanity and the dictates of public conscience, as reflected in broader public opinion, as necessary considerations. It has also been observed that there is a distinction between weapons that are qualitatively illegal, such as biological weapons, and weapons that are themselves legal but may be used illegally, such as a rifle used to shoot a non-combatant.[19] This distinction extends beyond legality into ethics. It is therefore important to consider both the type of AI technology being developed and the context and ways in which that technology is deployed.

It has been suggested that increased automation in warfare may in fact lead to positive ethical outcomes.[20] Autonomous weapons could be programmed to be immune to many of the human weaknesses that lead to unethical action in conflict. The self-preservation instinct, fear, anger, desire for revenge, and hysteria can all be impediments to acting ethically. Additionally, electronic systems may be capable of taking larger amounts of data into consideration, with faster reaction times, and without interpretation being clouded by preconceived notions. Reporting of ethical violations could be made more reliable by robots that are not swayed by a sense of loyalty to comrades. It has also been noted that stress can negatively impact the neural circuits responsible for conscious self-control,[21] increasing the likelihood of gross immoral acts, such as sexual assault, that would not necessarily be committed by the same soldiers in non-combat situations.[22] It has even been suggested that greater abdication of kinetic conflict to robots could remove the need for humans in the field and therefore reduce the number of casualties.[23] Post-combat psychological injuries commonly endured by veterans, including PTSD and moral injury, would not be suffered by machines. However, these potential benefits should be considered in light of the ADF-P-0 ME’s rejection of consequentialism in favour of duty ethics.

Beyond kinetic autonomous weapons, advances in AI, such as generative AI, are enabling and accelerating possibilities in online manipulation. These activities frequently fall into the ‘grey zone’: activity that falls below the threshold of ‘warfare’ but nevertheless poses significant security risks and considerations of military interest.[24] There are already recorded cases of states using online ‘bots’ to further their interests or destabilise their adversaries.[25] While still an emerging technology, AI-created deepfakes pose a new threat to communications and to public understanding of and trust in institutions.[26]

These and other cyber tools are also available to unconventional non-state actors, including those that may have protected status as non-combatants under IHL.[27] However, if civilian hackers carry out cyber attacks that amount to military action as part of a conflict, they may be considered combatants. This risks not only the electronic infrastructure they are using being treated as a military objective, liable to counter cyber attacks, but could also render the hackers themselves and their surroundings a legitimate military objective for violent attack.[28] Additionally, in cyberspace it is often difficult to accurately trace the origin of a particular attack, making accountability for legal violations difficult to ascertain and military retaliation difficult to execute effectively and ethically. This difficulty is exacerbated by the possibility of civilian computers being hijacked and used for malevolent purposes without their owners’ knowledge or consent.[29] There also remains ambiguity about which non-physical entities, for example data,[30] could be considered an ‘object’ under IHL and therefore be either targeted or protected. Such an environment calls for a more integrated response from military and civilian parties[31] and a mutual understanding of the legal and ethical obligations of each.[32]

The field of biomodification is also opening up new possibilities for military research and operation, and consequent questions of ethical practice. Interest is at least threefold. It includes:

  • Options to enhance soldiers’ performance, creating ‘super soldiers’ with capabilities beyond their natural abilities.
  • Options to prevent or heal harm to soldiers in combat.
  • Options to attack and limit the abilities of adversaries (noting that chemical and biological weapons are prohibited under IHL[33]).

In the Future Land Warfare Report 2014, the ADF indicated willingness to consider performance-enhancing biotechnology both ‘in’ the body, such as drugs, and ‘on’ the body, such as exosuits.[34] Current ADF policy prohibits and actively polices the use of illicit[35] drugs and the misuse of prescription drugs, such as the use of steroids without a prescription.[36] It has also adopted parts of the sports-focused World Anti-Doping Code limiting the use of performance-enhancing drugs.[37] The relevance of sport ethics to a military context has been criticised by Adrian Walsh and Katinka Van de Ven in their paper on the ethics of ‘Super Soldiers’. They argue that the fundamental differences in aim and telos between sport (a voluntary striving for capability development) and war (a ‘melancholy duty’ necessarily engaged as a last resort) require fundamental differences in the ethical approach to biomodification.[38]

Importantly, any process intended for the first two options (enhancing performance or preventing harm) is seeking a clearly positive outcome. Current and theorised processes could:

  • Counter the need for sleep[39]
  • Enhance moral decision making[40]
  • Enhance speed of decision making[41]
  • Alter digestive systems to enable alternate food sources[42]
  • Reduce pain sensitivity[43]
  • Reduce likelihood of developing PTSD[44]
  • Improve physical strength and endurance[45]
  • Enable brain-computer interfaces[46]

These obvious benefits, which may even save lives on the battlefield, make a significant moral case for at least researching and considering military biomodification.

Reservations about the ethics of biomodification for military purposes have therefore tended towards consequentialist arguments. Especially while research is still being conducted, altering a system as complex as the human body often has unintended consequences. For example, steroids have well-documented health risks,[47] including mood disorders that may be particularly relevant in a military setting. In one case, US pilots under the influence of amphetamines killed and wounded Canadian soldiers engaged in a training exercise.[48] The use of trusted substances such as caffeine or alcohol is commonplace, though in the case of alcohol subject to restrictions. But any black-and-white approach to the management of biomodification is ethically negligent: while there is potential for significant benefit through a better understanding and implementation of new medical technology, an anarchic free-for-all of military biohacking would almost certainly have unforeseen and tragic outcomes.

When it comes to finding the right balance in regulating biomodification, there are a number of considerations. Medical ethics emphasises the need for free and informed consent in both research and practice, but there can be challenges to obtaining this in a military context. Because of the life-and-death nature of military service, it is possible that soldiers would feel pressured to consent to performance-enhancing drugs they would not otherwise personally accept.[49] Given the sensitivity of military strategy and research, there may also be difficulties in releasing all of the information required for consent to be adequately informed.[50] Beyond the consent issue, there is the possibility that biomodified soldiers could be subject to dehumanising attitudes, whether from their adversaries in warfare, perhaps increasing the likelihood of war crimes against them, or from their own societies when they reintegrate after hostilities have ended. The concept of jus post bellum is relevant here.

Most if not all of the technologies at the forefront of military interest, including those not discussed here such as quantum technologies, nanotechnologies, and anti-satellite capabilities, also have applications in the civilian world. This civil-military overlap in research and development is often called dual use. Philosopher Seumas Miller describes dual use dilemmas as generally consisting of well-intentioned, peacefully minded research and development later being utilised for malevolent purposes.[51] The rapid technological progress of recent years has increased the dependence of social development and state power on technological advancement, as well as the porousness of knowledge and ideas across state boundaries.[52] Information and communication tools are increasingly critical infrastructure.[53] Several military frameworks to encourage and benefit from dual use research and development exist internationally, such as the United States’ Defense Innovation Unit[54] and China’s Military-Civil Fusion strategy.[55]

This dual use issue raises several ethical concerns. From a researcher’s point of view, there is the question of whether ‘technology has no morals’[56] and pure scientific development is therefore above ethical concerns, regardless of its potential application, or whether some technology is harmful (either fundamentally or potentially) and should be prohibited. From a regulatory point of view, dual use research and development, and the funding and administration structures that enable it, straddle both military and civilian legal frameworks. Dr Annie Handmer highlights the risk of national governments being absolved of international legal responsibility when technology development occurs as a civilian activity undertaken by civilian institutions, and of those institutions being absolved of ethical responsibility by conducting ‘pure’ science without accountability for its potential uses.[57] This gives rise to the concept of an ‘infohazard’—information that has such potential for accidental or deliberate misuse that it requires protection or limitation.[58] Particularly in democracies and scientific fields, such protections and limitations must be balanced against the value of the free and open exchange of research and ideas for the benefit of humanity.

In conclusion, rapidly developing new technologies raise new ethical challenges for militaries. Uncrewed vehicles and AI are increasing the physical and psychological distance between combatants and the violence they undertake and, in the case of AI, raising questions about the diffusion of command responsibility and the adequacy of existing legal frameworks. Biotechnology likewise has potential for benefit but also poses medical moral challenges that are difficult to resolve in a military context. All of these technologies, and others, carry the complication of being dual use and must therefore answer to both military and non-military moral and legal structures. As the ADF’s activities continue to be deeply enmeshed with technological development, these issues will significantly shape how its ethical framework evolves.

Next in Navigating New Ethical Frontiers: Part 3, Role

Footnotes

1 T Munro, Message from the Chief Defence Scientist, https://www.dst.defence.gov.au/strategy/defence-science-and-technology-strategy-2030/message-chief-defence-scientist, Australian Government Department of Defence, n.d., accessed 4 November 2023.

2 J Wallace, Uncrewed surface vessel launched, https://www.defence.gov.au/news-events/news/2023-08-09/uncrewed-surface-vessel-launched, Australian Government Defence, 9 August 2023, accessed 24 November 2023.

3 J Joseph, Unmanned vehicles driving the future, https://www.defence.gov.au/news-events/news/2020-10-28/unmanned-vehicles-driving-future, Australian Government Defence, 28 October 2020, accessed 24 November 2023.

4 C Enemark, Armed Unmanned Aircraft and Military Ethics, https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Foreign_Affairs_Defence_and_Trade/Defence_Unmanned_Platform/Submissions, Submission to the Committee on The potential use by the Australian Defence Force of unmanned air, maritime and land platforms, Parliament of Australia, 2015, accessed 2023.

5 CJ Miller, [Book review of] On killing remotely: The psychology of killing with drones, https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/May-June-2023/On-Killing-Remotely/, Army University Press, 2023, accessed 24 November 2023.

6 CJ Miller, [Book review of] On killing remotely: The psychology of killing with drones.

7 N Sadler et al., Machine learning analysis of risk factors for progression from suicide ideation to suicide-related behaviours in contemporary Australian Defence Force members, https://defenceveteransuicide.royalcommission.gov.au/publications/machine-learning-analysis-risk-factors, Royal Commission into Defence and Veteran Suicide, 2023, accessed 2023.

8 A Etzioni and O Etzioni, Pros and cons of autonomous weapons systems, https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/May-June-2017/Pros-and-Cons-of-Autonomous-Weapons-Systems/, Army University Press, 2017, accessed 3 November 2023.

9 A Etzioni and O Etzioni, Pros and cons of autonomous weapons systems.

10 Campaign to Stop Killer Robots, Military and killer robots, https://www.stopkillerrobots.org/military-and-killer-robots/, Stop Killer Robots, 2021, accessed 3 November 2023; A Etzioni and O Etzioni, Pros and cons of autonomous weapons systems.

11 International Committee of the Red Cross, Rule 81: Restrictions on the Use of Landmines, https://ihl-databases.icrc.org/en/customary-ihl/v1/rule81, International Humanitarian Law Databases, n.d., accessed 3 November 2023.

12 A Etzioni and O Etzioni, Pros and cons of autonomous weapons systems.

13 M Taddeo and A Blanchard, A comparative analysis of the definitions of autonomous weapons systems, https://link.springer.com/article/10.1007/s11948-022-00392-3, Science and Engineering Ethics, 28, 37, 2022, accessed 29 November 2023.

14 B Docherty et al., Losing humanity: The Case against killer robots, https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots, Human Rights Watch, 19 November 2012, accessed 3 November 2023.

15 A Etzioni and O Etzioni, Pros and cons of autonomous weapons systems.

16 Human Rights Watch, Killer Robots, https://www.hrw.org/topic/arms/killer-robots, Human Rights Watch, n.d., accessed 4 November 2023.

17 A Guterres, Secretary-General's remarks at the launch of the Policy Brief on a New Agenda for Peace, https://www.un.org/sg/en/content/sg/speeches/2023-07-20/secretary-generals-remarks-the-launch-of-the-policy-brief-new-agenda-for-peace, United Nations, 20 July 2023, accessed 28 November 2023; J Weaver, Governments and hackers agree: the laws of war must apply in cyberspace, https://theconversation.com/governments-and-hackers-agree-the-laws-of-war-must-apply-in-cyberspace-216202, The Conversation, 25 October 2023, accessed 28 November 2023.

18 A Guterres, Secretary-General's remarks at the launch of the Policy Brief on a New Agenda for Peace.

19 A Etzioni and O Etzioni, Pros and Cons of Autonomous Weapons Systems.

20 A Etzioni and O Etzioni, Pros and Cons of Autonomous Weapons Systems.

21 A Etzioni and O Etzioni, Pros and Cons of Autonomous Weapons Systems.

22 Known as the ‘robots don’t rape’ argument.

23 E Riesen, The moral case for the development of autonomous weapon systems, https://blog.apaonline.org/2022/02/28/the-moral-case-for-the-development-of-autonomous-weapon-systems/, The Blog of the American Philosophical Association, 28 February 2022, accessed 29 November 2023.

24 The concept of the Grey Zone and its increasing role in the ADF’s operations is explored in more depth in the chapter on ‘Role’.

25 A Campbell, Transcript of General Angus J Campbell’s address to the 2023 ASPI Conference - Disruption and Deterrence [pdf 590KB], https://www.aspistrategist.org.au/wp-content/uploads/2023/09/Address-by-General-Angus-J-Campbell-AO-DSC-to-the-2023-ASPI-Disruption-and-Deterrence-Conference-14-Sep-2023.pdf, Chief of the Defence Force, 14 September 2023, accessed 28 November 2023.

26 A Campbell, Transcript of General Angus J Campbell’s address to the 2023 ASPI Conference - Disruption and Deterrence.

27 T Rodenhäuser and M Vignati, 8 rules for “civilian hackers” during war, and 4 obligations for states to restrain them, https://blogs.icrc.org/law-and-policy/2023/10/04/8-rules-civilian-hackers-war-4-obligations-states-restrain-them/, Humanitarian Law and Policy, International Committee of the Red Cross, 4 October 2023, accessed 28 November 2023.

28 T Rodenhäuser and M Vignati, 8 rules for “civilian hackers” during war, and 4 obligations for states to restrain them.

29 S Coleman, New Technologies and the Law of Armed Conflict, p 40.

30 E Massingham (host) (15 April 2021) ‘Data as an object in IHL’ [podcast], Law and the Future of War, UQ Law and the Future of War, accessed 4 October 2023; T Rodenhäuser and M Vignati, 8 rules for “civilian hackers” during war, and 4 obligations for states to restrain them.

31 KAA Khan, Technology Will Not Exceed Our Humanity, https://digitalfrontlines.io/2023/08/20/technology-will-not-exceed-our-humanity/, Digital Front Lines, n.d., accessed 28 November 2023.

32 Several of these complications parallel complications in unconventional conflict such as counterinsurgency that will be further discussed in the chapter on ‘Role’.

33 International Committee of the Red Cross, Rule 74: Chemical Weapons, https://ihl-databases.icrc.org/en/customary-ihl/v1/rule74, International Humanitarian Law Databases, n.d., accessed 7 November 2023.

34 A Walsh and K Van de Ven, Human enhancement drugs and Armed Forces: an overview of some key ethical considerations of creating ‘Super-Soldiers’, https://link.springer.com/article/10.1007/s40592-022-00170-8, Monash Bioethics Review, vol. 41, pp. 22–36, 2023; Modernisation and Strategic Planning Division - Australian Army Headquarters, Future Land Warfare Report 2014, https://researchcentre.army.gov.au/library/other/future-land-warfare-report-2014, Australian Army Research Centre, 2014, pp. 15–16.

35 Under Australian law.

36 Australian Government Department of Defence, Performance and image enhancing drugs and supplements, https://www.defence.gov.au/adf-members-families/health-well-being/services-support-fighting-fit/fact-sheets, Fact Sheets, Defence Health, n.d., accessed 7 November 2023.

37 A Walsh and K Van de Ven, Human enhancement drugs and Armed Forces: an overview of some key ethical considerations of creating ‘Super-Soldiers’; J Mazanov, Anti-Doping in Sport and Human Enhancing Technologies in Army, https://researchcentre.army.gov.au/library/land-power-forum/anti-doping-sport-and-human-enhancing-technologies-army, Australian Army Research Centre, 13 October 2017, accessed 7 November 2023.

38 A Walsh and K Van de Ven, Human enhancement drugs and Armed Forces: an overview of some key ethical considerations of creating ‘Super-Soldiers’.

39 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers, https://theconversation.com/stronger-faster-and-more-deadly-the-ethics-of-developing-supersoldiers-71086, The Conversation, 1 March 2017, accessed 28 November 2023.

40 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

41 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

42 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

43 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

44 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

45 A common use of steroids.

46 M Kosal and J Putney, Neurotechnology and international security: Predicting commercial and military adoption of brain-computer interfaces (BCIs) in the United States and China, https://pubmed.ncbi.nlm.nih.gov/37140225/, Politics and the Life Sciences, 42(1), pp. 81–103, April 2023, accessed 29 November 2023.

47 Better Health Channel, Anabolic steroids, https://www.betterhealth.vic.gov.au/health/healthyliving/steroids, Better Health Channel, 2022, accessed 7 November 2023.

48 A Walsh and K Van de Ven, Human enhancement drugs and Armed Forces: an overview of some key ethical considerations of creating ‘Super-Soldiers’.

49 A Walsh and K Van de Ven, Human enhancement drugs and Armed Forces: an overview of some key ethical considerations of creating ‘Super-Soldiers’.

50 A Henschke, Stronger, faster and more deadly: the ethics of developing supersoldiers.

51 S Miller, Dual Use Science and Technology, Ethics and Weapons of Mass Destruction, Springer Briefs in Ethics, Springer Cham, 2018.

52 E Kania, Technological entanglement, https://www.aspi.org.au/report/technological-entanglement, Australian Strategic Policy Institute, 28 Jun 2018, accessed 10 November 2023.

53 A Lele, ‘Military relevance of quantum technologies’, https://link.springer.com/chapter/10.1007/978-3-030-72721-5_8, in A Lele, Quantum technologies and military strategy, Springer, Cham, 2021, pp 117–143.

54 Defense Innovation Unit, Defense Innovation Unit (DIU), https://www.diu.mil/about, Department of Defense, United States of America, n.d., accessed 25 November 2023.

55 A Lele, ‘Military relevance of quantum technologies’.

56 AG Handmer, Making a success of ‘failure’: a Science Studies analysis of PILOT and SERC in the context of Australian space science, https://ses.library.usyd.edu.au/handle/2123/27383, [PhD thesis], University of Sydney, 2021, accessed 28 November 2023.

57 AG Handmer, Making a success of ‘failure’: a Science Studies analysis of PILOT and SERC in the context of Australian space science.


Disclaimer

The views expressed in this article are those of the author and do not necessarily reflect the position of the Department of Defence or the Australian Government.

