Press "Enter" to skip to content

Military Robotics

Story

Year 2040:

Darius works as a government researcher in Maryland, specializing in military robotics. While Darius used to have more control over the research he conducted, his higher-ups have been steadily pushing him toward robotic solutions that might be used in battle. The idea makes Darius uncomfortable, but it would be very difficult for him to find another job at the moment, and he thinks he may be able to either dissuade his superiors from pushing him in this direction or come up with some acceptable alternatives.

Understanding that Darius is concerned about the direction of his work, one of his higher-ups tries to convince him that this work is not inherently bad: “Automation could even help prevent loss of life. A robotic system will be far less prone to error; when soldiers are stressed or tired, they are more likely to miss, misjudge a situation, or even fire on friendly forces. For reasons like these, a robotic solution may be capable of reducing the number of casualties while potentially being more effective at the same time.” While Darius understands that reducing the number of casualties is a good thing, and that doing so can further prevent loss of life by lowering the incentive for new insurgents to enter the fight, he is still against working on a fully automated system that can take someone’s life.

Previously, Darius had only worked on automated systems that either prevented loss of life, such as bomb-disposal robots, or served simply as tools for soldiers, such as reconnaissance robots. His growing discomfort with straying any further from these applications leaves him with a difficult decision: he can continue to work, aiding in the creation of something he does not want to exist, or he can quit and be forced to find work elsewhere, a dangerous position in the current economy. Darius could also go one step further and actively protest the creation of such automated weapons, which would most likely get him fired and make it even harder for him to find a new job; he has seen how difficult that search is for whistleblowers.

What are Military Robots?

Military robots are autonomous platforms used by the military in a variety of roles. Most current military robots serve non-lethal roles, such as disposing of live explosives or acting as reconnaissance tools for soldiers. However, automated defense systems such as the Samsung SGR-A1 already exist and are capable of autonomously firing a weapon, though the SGR-A1 in particular is stationary. Aerial drones are in use as well, though they do not operate entirely autonomously. Some missile systems also include an autonomous component, such as a payload capable of splitting up and seeking out individual targets.[2, 3]

Discussion Questions

Each question below is followed by related perspectives and potential starting points for considering these ethical concerns.

Should an autonomous system have the ability to take a human life without any human intervention? If so, how might this affect legal and ethical concerns in other domains of robotics where loss of life is accidental? Additionally, what degree of certainty must the system reach before it is allowed to ‘take the shot’ or otherwise use lethal force? (A minimal sketch of such a certainty threshold follows this question.)
  • One possible opinion is that autonomous weaponry of this type should not exist whatsoever, an opinion that has been urged upon the UN.[11]
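One way to make the question of ‘degree of certainty’ concrete is as a confidence threshold in the system’s decision logic. The following is a minimal, hypothetical Python sketch; the thresholds, names, and three-way outcome are illustrative assumptions, not drawn from any real weapon system:

```python
from dataclasses import dataclass

# Illustrative thresholds -- choosing these values is exactly the ethical
# question under discussion; they are assumptions, not from any real system.
ENGAGE_THRESHOLD = 0.999   # confidence required before acting autonomously
REVIEW_THRESHOLD = 0.90    # below this, the system stands down entirely

@dataclass
class TargetAssessment:
    target_id: str
    hostile_confidence: float  # estimated probability the target is hostile, in [0, 1]

def engagement_decision(assessment: TargetAssessment) -> str:
    """Gate a lethal action behind a confidence threshold.

    Returns one of three outcomes rather than a simple yes/no, reflecting
    the human-in-the-loop alternative to full autonomy.
    """
    if assessment.hostile_confidence >= ENGAGE_THRESHOLD:
        return "ENGAGE"          # fully autonomous action
    if assessment.hostile_confidence >= REVIEW_THRESHOLD:
        return "DEFER_TO_HUMAN"  # a human operator must confirm
    return "STAND_DOWN"          # too uncertain to act at all

# A 97% confidence estimate sounds high, but is still routed to a human.
print(engagement_decision(TargetAssessment("t-042", 0.97)))  # DEFER_TO_HUMAN
```

Note that no choice of threshold dismisses the ethical question: even a 99.9% requirement accepts some rate of wrongful engagement, and deciding what rate is tolerable is precisely the dilemma posed above.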
Military equipment already in use, such as the land mine[2], has the potential to kill without human intervention and has no knowledge of who triggers it. It can be argued that the land mine has some form of agency; should we treat robots the same way? How can it be acceptable for a land mine to kill someone, but not acceptable for a robot to do so?
  • “However, in general, traditional weapons have a very low autonomous power compared to the new generation of military robots.” [2]
If a robot’s actions result in a wrongful death, who is responsible? If this responsibility is shared, how much of it does each associated party bear?[2, 6, 9]
  • Responsibility in cases such as this can be described as a ‘chain of responsibility’, wherein any member associated with the decision that led to the incident holds some amount of responsibility. Politicians may be responsible at the highest level, military commanders at a level below that, soldiers at a level lower still, and the robot itself at the lowest level.[2] (The sketch below illustrates one way such an allocation could be made explicit.)
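To see what a ‘chain of responsibility’ might mean in practice, consider forcing an explicit allocation in which each party’s share of responsibility is stated and the shares sum to one. The Python sketch below is purely illustrative; the parties follow the hierarchy described above, but the weights are invented assumptions, not values taken from [2]:

```python
# A purely illustrative allocation of shared responsibility, ordered from
# the highest level to the lowest as described above. The weights are
# invented assumptions, not values taken from [2].
chain_of_responsibility = {
    "politicians": 0.40,
    "military_commanders": 0.30,
    "soldiers_operators": 0.20,
    "robot": 0.10,
}

# Shared responsibility should account for the whole incident: shares sum to 1.
assert abs(sum(chain_of_responsibility.values()) - 1.0) < 1e-9

for party, share in chain_of_responsibility.items():
    print(f"{party}: {share:.0%} of responsibility")
```

The point is not that any particular numbers are correct, but that shared responsibility, once stated explicitly, can be examined and debated rather than left implicit.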
These robots will need to use whatever data they can gather from the environment to make decisions, including decisions about whom to attack. Given that biases, particularly racial biases and others based on physical appearance, are already common in automated systems, how much would such biases affect these robots and their decisions?
  • Facial recognition can become markedly less accurate when identifying members of particular demographic groups[21]. Mistakes stemming from such inaccuracy could cost the life of a completely unrelated party. (A minimal per-group error audit follows this item.)
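Disparities of the kind reported in [21] can be surfaced with a simple per-group error audit. The Python sketch below is a generic illustration; the groups and records are entirely made up, and the false-positive rate is chosen as the metric because wrongly flagging a non-hostile person is the error mode with lethal consequences here:

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, predicted_hostile, actually_hostile).
# The groups and outcomes are invented purely to illustrate the audit.
records = [
    ("group_a", True,  False), ("group_a", False, False),
    ("group_a", False, False), ("group_a", False, False),
    ("group_b", True,  False), ("group_b", True,  False),
    ("group_b", False, False), ("group_b", False, False),
]

# False-positive rate per group: how often non-hostile individuals
# are wrongly flagged as hostile.
false_positives = defaultdict(int)
negatives = defaultdict(int)
for group, predicted, actual in records:
    if not actual:
        negatives[group] += 1
        if predicted:
            false_positives[group] += 1

for group in sorted(negatives):
    fpr = false_positives[group] / negatives[group]
    print(f"{group}: false-positive rate = {fpr:.0%}")
# group_a: false-positive rate = 25%
# group_b: false-positive rate = 50%
```

An audit like this does not fix the underlying bias, but it makes the disparity measurable before such a system is ever deployed.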
The relative ease of use of these robotic systems may make it easier to carry out terror attacks. Should governments ensure that these systems do not become accessible to independent individuals and groups? [4]
A military robot may be produced with a vulnerability in its programming that allows it to be hacked. If loss of life results from such an incident, is it the fault of the manufacturer? [4]
Does the lack of human error[19] justify the use of automated soldiers, if they prove to be better at preventing the deaths of noncombatants?
  • Precision can be an important factor in war; “needlessly harming innocents can turn the populace against the counterinsurgency”[22], and reducing these needless casualties can in turn prevent further deaths in the future.

Themes

(Primary) Government, Safety and Security, Professional Responsibility

(Secondary) Fairness and Non-discrimination, Identity, Anthropomorphization

Resources

  1. Marchant, G. E., Allenby, B., Arkin, R., & Barrett, E. T. (2011). International governance of autonomous military robots. Colum. Sci. & Tech. L. Rev., 12, 272. https://heinonline.org/HOL/Page?handle=hein.journals/cstlr12&id=272&collection=journals&index=
  2. Hellström, T. (2013). On the moral responsibility of military robots. Ethics and information technology, 15(2), 99-107. https://link.springer.com/content/pdf/10.1007/s10676-012-9301-2.pdf 
  3. Voth, D. (2004). A new generation of military robots. IEEE Intelligent Systems, 19(4), 2-3. https://ieeexplore.ieee.org/abstract/document/1333028
  4. Khurshid, J., & Bing-Rong, H. (2004, December). Military robots-a glimpse from today and tomorrow. In ICARCV 2004 8th Control, Automation, Robotics and Vision Conference, 2004. (Vol. 1, pp. 771-777). IEEE. https://ieeexplore.ieee.org/abstract/document/1468925
  5. Lin, P., Abney, K., & Bekey, G. A. (Eds.). (2012). Robot ethics: the ethical and social implications of robotics. Intelligent Robotics and Autonomous Agents series. 
  6. De Boisboissel, G. (2017, May). Is it sensible to grant autonomous decision-making to military robots of the future?. In 2017 International Conference on Military Technologies (ICMT) (pp. 738-742). IEEE. https://ieeexplore.ieee.org/abstract/document/7988854 
  7. Schulzke, M. (2011). Robots as weapons in just wars. Philosophy & Technology, 24(3), 293. https://link.springer.com/content/pdf/10.1007/s13347-011-0028-5.pdf
  8. Singer, P. W. (2009). Military robots and the laws of war. The New Atlantis, (23), 25-45. https://www.jstor.org/stable/43152939?seq=1#metadata_info_tab_contents 
  9. Noorman, M., & Johnson, D. G. (2014). Negotiating autonomy and responsibility in military robots. Ethics and Information Technology, 16(1), 51-62. https://link.springer.com/content/pdf/10.1007/s10676-013-9335-0.pdf 
  10. Subbaraman, N. (2013, September 29). Soldiers <3 robots: Military bots get awards, nicknames … funerals. NBC News. Retrieved December 2, 2020, from https://www.nbcnews.com/technology/soldiers-3-robots-military-bots-get-awards-nicknames-funerals-4B11215746
  11. Bowcott, O. (2015, April 9). UN urged to ban ‘killer robots’ before they can be developed. The Guardian. www.theguardian.com/science/2015/apr/09/un-urged-to-ban-killer-robots-before-they-can-be-developed
  12. Davies, R. (2020, January 30). Dumb or smart? The future of military robots. Army Technology. www.army-technology.com/features/dumb-or-smart-the-future-of-military-robots/
  13. Lauterborn, D. (2016, April 13). Goliath tracked mine: The beetle that started the ROV craze. HistoryNet. www.historynet.com/goliath-tracked-mine-the-beetle-that-started-the-rov-craze.htm
  14. Noorman, M., & Johnson, D. G. (2014). Negotiating autonomy and responsibility in military robots. Ethics and Information Technology, 16(1), 51-62.
  15. Opfer, C. (2020, January 27). Are robots replacing human soldiers? HowStuffWorks Science. science.howstuffworks.com/robots-replacing-soldiers1.htm
  16. Robotic dog unveiled by the US military. (2012, September 12). BBC News. www.bbc.com/news/av/technology-19567351
  17. Royakkers, L., & Olsthoorn, P. (2019). Lethal military robotics: Who is responsible when things go wrong? In Unmanned aerial vehicles: Breakthroughs in research and practice (pp. 394-432). IGI Global.
  18. Sofge, E. (2014, March 7). Tale of the teletank: The brief rise and long fall of Russia’s military robots. Popular Science. www.popsci.com/blog-network/zero-moment/tale-teletank-brief-rise-and-long-fall-russia%E2%80%99s-military-robots/
  19. Subbaraman, N. (2013, September 29). Soldiers <3 robots: Military bots get awards, nicknames … funerals. NBC News. www.nbcnews.com/technology/soldiers-3-robots-military-bots-get-awards-nicknames-funerals-4B11215746
  20. Kerick, S. E., & Allender, L. E. (2006). Effects of cognitive workload on decision accuracy, shooting performance, and cortical activity of soldiers. In Transformational science and technology for the current and future force (pp. 359-362).
  21. Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on fairness, accountability and transparency (pp. 77-91). PMLR.
  22. Brown, J. M. (2007). To bomb or not to bomb? counterinsurgency, airpower, and dynamic targeting. Air & Space Power Journal, 21(4), 75.

Image Credit

Title: “MAARS-robot”
Creator: “c b”—https://www.flickr.com/photos/defwheezer/ 
Source: “The MAARS robot in use by the US military.”—https://www.flickr.com/photos/defwheezer/3392547922 
License: “CC BY-NC-SA 2.0”—https://creativecommons.org/licenses/by-nc-sa/2.0/ 
