Review Article | Open Access | Asian J. Soc. Sci. Leg. Stud., 2024; 6(4), 162-168 | doi: 10.34104/ajssls.024.01620168

ICRC's Intervention on AI-Based Weapons under International Humanitarian Law: A Critical Analysis

Anika Nargis*, Aynal Haque

Abstract

The International Committee of the Red Cross (ICRC) has exerted pressure on states to enact new international legislation that would prohibit the deployment of certain autonomous weapons and restrict the use of others, including those under third-party control. The ICRC is a neutral, independent organization that offers humanitarian protection and support to victims of armed conflict and other situations of violence. In addition to responding to emergencies, it promotes compliance with international humanitarian law and its incorporation into national law. Militaries are now making significant investments in artificial intelligence (AI), and there are already instances of AI being used in conflict. The ICRC has identified several areas in which armed actors are developing AI for use in combat, each of which raises serious humanitarian concerns. This paper critically examines the ICRC's interventions on AI-based weapons under international humanitarian law and how states have responded to the use of AI in warfare.

INTRODUCTION

Increasing knowledge of International Humanitarian Law (IHL) as it relates to armed conflict, and ensuring that the parties to a conflict abide by this body of law, is not easy in a world that is becoming more globalized and is characterized by chaos on a regional and international scale, unprecedented threats to global security, new forms of violence, and technological revolutions, especially in the field of information technology (Review, 2014). From 1864 until the present, the International Committee of the Red Cross (ICRC) has worked to advance IHL. The ICRC continues to place strong emphasis on evaluating the effects of armed conflict on those affected, taking into account issues of moral credibility and compliance with IHL, and ultimately advocating for policies, practices, and legal clarification or development where needed to lessen unfavorable outcomes for both combatants and civilians (ICRC Position Paper, 2019; Pourshah, 2024).

Its efforts can be summed up by tracing the organization's experiences and noteworthy moments in international relations throughout history, which led to specific attention being devoted to this field (Review, 2014). Looking back through history, it can be clearly seen that the ICRC has always played a vital role in the field of IHL. The ICRC keeps a careful eye on the development of new weaponry and methods of warfare, as well as militaries' use of them (ICRC, 'International Humanitarian Law and the Challenges of Contemporary Armed Conflict', 2013). In order to determine whether IHL applies to the use of these new weapons and warfare tactics, it also confers with all pertinent parties.

Review of Literature

This research employed doctrinal and non-empirical research approaches; accordingly, it relies on internet databases and various e-learning tools. Three legal research methodologies are used in combination: descriptive, analytical, and comparative. Because this is a doctrinal study, no field data or sample collection was required, and there are therefore certain limitations on data collection. We relied mainly on primary and secondary sources of information. The resources are studied from a critical and analytical perspective. Secondary sources include thorough analyses of numerous texts from the legal and non-legal literature, related publications, research papers, relevant cases, and so on.

Objectives of the Study 

The objectives of this research are:

i. To determine the ICRC's input into the application of AI under international humanitarian law;

ii. To describe the actions the ICRC has taken in this regard; and

iii. To determine how various states have responded to the use of AI in warfare.

The ICRC in the Development of International Humanitarian Law

The International Committee of the Red Cross (ICRC) is best recognized for its emergency response work around the world to assist victims of hostilities and internal conflict (ICRC, 'What is the ICRC's Role in Developing and Ensuring Respect for IHL', 2017). The ICRC has long had a special link with international humanitarian law. The ICRC's policy of protecting people during armed conflict and other forms of violence is aimed at gaining full respect for applicable law (ICRC, 'What is the ICRC's Role in Developing and Ensuring Respect for IHL', 2017). It cannot literally protect people; instead, it aims to minimize the dangers they face, stop and end the abuse they endure, raise awareness of their rights, and give voice to their concerns. Stated differently, the ICRC monitors adherence to IHL and notifies the relevant authorities of any violations (ICRC, n.d.). The ICRC's work is based on the Geneva Conventions.

The main purpose of the International Committee of the Red Cross has been to defend and assist victims of armed conflict since its establishment in 1863. It accomplishes this by carrying out operations throughout the globe, in addition to promoting the expansion of international humanitarian law (IHL) and applying pressure on governments and other armed actors to uphold it. The 1949 Geneva Conventions are a legacy of the Second World War (ICRC, n.d.). Based on the terrible experience gathered in that war, they significantly strengthened the legal protection afforded to war victims, especially civilians held by the enemy. Almost all nations have ratified the 1949 Geneva Conventions (ICRC, International Humanitarian Law and the Protection of War Victims). They have become truly universal law, since they have been accepted by the entire community of nations. The growth and development of humanitarian action, the Red Cross and Red Crescent Movement, and the Geneva Conventions are all part of this history (ICRC, n.d.). The ICRC's principal function was originally that of a mediator; however, when the need for an impartial intermediary between combatants became obvious, it grew more involved in field operations. The ICRC expanded its activity over the following fifty years, as national societies arose and the Geneva Convention was updated to encompass maritime conflict. International humanitarian law, the community of states, the Red Cross and Red Crescent Movement, warring parties, non-governmental organizations, and all individuals of good will must collaborate to improve the protections provided to war victims and their families.

The ICRC's Approach to AI-Based Technology

State-to-state conflicts are growing more frequent as a result of the rapid advancement of technology. The International Committee of the Red Cross (ICRC) must first comprehend the implications of newly developed weapons and their effects in order to address the rules governing this kind of warfare. Among the wide-ranging consequences that artificial intelligence (AI) and machine learning systems may have on human engagement in armed conflict are changes to decision-making generally, new forms of cyber and information warfare, and the increasing autonomy of weapon systems and other automated systems. Developments in technology can improve the safety of civilians in times of armed conflict: force can be applied more precisely, military decisions can be better informed, and military goals can be achieved without resorting to aggressive action or physical destruction. At the same time, new weapons and their application have the potential to endanger both military personnel and civilians, while also casting doubt on how IHL is interpreted and applied (ICRC, International Humanitarian Law and the Challenges of Contemporary Armed Conflict, 2013). Legal, military, technological, ethical, and humanitarian considerations are all interconnected in the ICRC's evaluation of the potential humanitarian impact of new weapons technology and the challenges it may pose under present IHL standards. According to the ICRC, governments, military personnel, and other major stakeholders in armed conflict must adopt a genuinely human-centered approach to the use of AI and machine learning systems, based on moral and legal standards (ICRC, Artificial Intelligence and Machine Learning in Armed Conflict, 2020). The use of artificial intelligence in weapon systems must therefore be approached with great care.

The ICRC focuses on two key topics concerning the use of artificial intelligence and machine learning: the first is their use in conflict, and the second is their use in humanitarian endeavors to protect soldiers and civilians (ICRC, Statements to the Convention on Certain Conventional Weapons (CCW), 2019). Whether AI and machine learning are used in warfare by governments or by non-state armed groups, it is unclear how such use will unfold, let alone what its consequences will be. Because there is a risk that humans will lose control over weapons and the use of force, the ICRC views autonomous weapon systems that possess autonomy in their "critical functions" of selecting and attacking targets as a serious humanitarian, legal, and ethical concern. This loss of control puts civilians in danger because of the unpredictability of the effects, and legal difficulties also arise (ICRC, Perspectives on Lethal Autonomous Weapon Systems, 2019).

Prospects of the ICRC

The International Committee of the Red Cross (ICRC), a humanitarian organization dedicated to safeguarding and aiding victims of armed conflict and other forms of violence, believes that ensuring a truly human-centered approach to the development and application of AI and machine learning is essential (Societies, 2015). Its mission is informed by the fundamental principle of humanity and by international humanitarian law. The ICRC approaches this issue from the following angles:

Cyber Operations

The International Committee of the Red Cross (ICRC) welcomes the fact that more governments and international organizations are acknowledging that international humanitarian law (IHL) applies to cyber operations during armed conflict. Not all cyber warfare makes use of AI and machine learning. Nonetheless, it is anticipated that emerging technologies will change the nature of cyber-attack and cyber-defense capabilities (ICRC Position Paper). AI- and machine-learning-enabled cyber capabilities might autonomously search for and exploit vulnerabilities, or defend against cyber attacks while also conducting counter-attacks (ICRC, Digital Risks in Situations of Armed Conflict, 2019). Such innovations have the potential to increase the number of attacks, and to change their nature and, possibly, their severity (ICRC Symposium Report, 2019). To minimize the human cost of armed conflict, the ICRC continues to prioritize ensuring that established standards under international humanitarian law are upheld in cyber warfare. It also underscores the need for those carrying out or defending against such attacks to consider the particular difficulties in safeguarding essential services and civilian infrastructure (ICRC, The Potential Human Cost of Cyber Operations, 2019).

Autonomous Weapon Systems

The possibility of losing human control over weapons and the use of force presents an immediate humanitarian, legal, and ethical dilemma for weapon systems that are autonomous in their "critical functions" of detecting and attacking targets (ICRC Challenges Report, 2019). The potential integration of AI and machine learning algorithms, namely those intended for "automatic target recognition", into autonomous weapon systems introduces a new level of unpredictability and raises concerns about bias and lack of accountability (ICRC, 2018). AI- and machine-learning-powered military robots are not necessarily autonomous weapons, because their software can be employed for control functions other than targeting, such as flight, navigation, or surveillance (ICRC Position Paper, 2019). The ICRC believes that autonomy in weapon systems, including AI-enabled systems, raises the most urgent concerns. However, the use of AI and machine learning to increase autonomy in military hardware, such as unmanned aircraft, land vehicles, and sea vessels, may also raise concerns about human-machine interaction and security (ICRC Position Paper, 2019).

Robotic Systems and Weapons with Autonomy in Physical Form

One common use of digital AI and machine learning techniques is the control of physical weapon systems. The most obvious example is the growing number of unmanned, autonomous robotic systems in the air, on land, and at sea, of varying sizes and purposes. Whether these robots are armed or not, and whether the software controls the entire system or specific tasks such as flight, navigation, observation, or targeting, artificial intelligence (AI) and machine learning have the potential to increase their autonomy (ICRC Position Paper, 2019). To inform and support its humanitarian work in particular operational contexts, for example by using predictive statistical analysis to help estimate humanitarian needs, the International Committee of the Red Cross (ICRC) has developed environment-monitoring visualizations that use AI and machine learning to detect and analyze large amounts of data (Hub, 2020). The ICRC also makes sure that, while taking into account a realistic assessment of the technology's capabilities and limitations, the development and application of the AI and machine learning tools it uses reflect the core principles and values of neutral, impartial, and independent humanitarian action (Hub, 2020).

Artificial intelligence (AI) and machine learning technologies have the potential to significantly alter how people engage in conflicts. This is especially true with regard to new forms of cyber and information warfare, enhanced autonomy for weapon systems and other unmanned systems, and decision-making processes in general. In light of their moral and legal obligations, governments, militaries, and other significant actors in armed conflict must adopt a truly human-centered approach to AI and machine learning systems (ICRC Position Paper, 2019). The employment of artificial intelligence in weapon systems should be treated with prudence.

States' Reactions to Artificial Intelligence in Warfare

Autonomous weapon systems, termed the "third revolution in warfare" after gunpowder and nuclear weapons, pose numerous obstacles and raise numerous questions within state institutions and civil society (Galeotti, 2019). The international community has yet to agree on a definition of AI-based weapons, owing to a lack of agreement on what autonomy actually entails (Galeotti, 2019). On the one hand, autonomous technology can enhance efficiency and hence prevent harm; on the other, no machine should be granted the authority to take human life. The International Committee of the Red Cross and the United Nations organized the inaugural Expert Meeting on Autonomous Weapon Systems in Geneva in 2014, sparking discussions around the world (ICRC, Autonomous Weapon Systems - Technical, Military, Legal and Humanitarian Aspects, 2014). The establishment of a Group of Governmental Experts in 2016 to examine the concerns around the growing autonomy of newly developed weaponry formalized the discussion. States have divergent opinions about using AI in combat, particularly within the European Union (ICRC, 2014).

The European Parliament has advocated for a ban on the development, production, and use of fully autonomous weapons that can carry out attacks without human intervention (N. Weizmann, 2014). States such as Austria, the only European country to join the NGO Campaign to Stop Killer Robots, are particularly supportive of this approach. Other European countries, such as France and the United Kingdom, hold more mixed positions. France emphasized that the technologies in question are dual-use in nature, with many civilian, peaceful, and legitimate applications to which the highest priority is given, and that states should not attempt to restrict development in this area. Currently, the only systems in use are teleoperated and semi-autonomous ones, although the armed forces of numerous countries, including the US, the UK, France, Israel, Russia, South Korea, and China, are working to increase the autonomy of the latter (Galeotti, 2019). For example, the US Navy is developing the Sea Hunter, a self-piloting anti-submarine vessel that will be able to follow the law of the sea based on its position. Platform-M, a "universal fighting platform" with an autonomous targeting system, is being developed by Russia and the Kalashnikov company (F.S., 2015). Finally, the SGR-A1, a sentinel robot deployed in the demilitarized zone between the two Koreas, is the weapon that most closely resembles the definition of LAWS given by the ICRC. If we take some recent conflicts into consideration, we can find practical uses of AI in warfare and states' reactions in this context.

Syrian Civil War

Ten years ago, a nonviolent demonstration against the government of Syria turned into a full-fledged civil war (Why Has the Syrian War Lasted 11 Years, 2022). The violence has claimed hundreds of thousands of lives, wreaked havoc on cities, and drawn in other nations. Mnemonic, a Syrian human rights organization, faced a massive challenge in 2017 (Murgia, 2021). It had almost 350,000 hours of footage containing evidence of war crimes, ranging from chemical attacks to the deployment of forbidden armaments, but it could never go through it all manually. Mnemonic planned to use AI to scan the Syrian Archive, a library of social media records from the war, for evidence that a specific "cluster" weapon known as the RBK-250, a metal shell packed with smaller bombs, had been used on civilians. RBK-250 shells are also frequently left unexploded, posing a threat for decades after a confrontation has ended.

War in Afghanistan

The United States was reacting to the nearly 3,000 people killed in the 9/11 attacks on New York and Washington, D.C., in 2001. US authorities blamed the attacks on Al-Qaeda, an Islamist terrorist group, and its leader, Osama Bin Laden, and launched the war in Afghanistan (BBC, 2021). When NATO allies joined the US, a new Afghan government came to power in 2004, but the terrorists' horrific atrocities continued. Though its effect was fleeting, President Barack Obama's "troop surge" in 2009 helped push back the Taliban. The Taliban regained power in 2021. From start to finish, the West fought the war in much the same way: B-52 bombers, a model that first saw service in 1955, carried out the first airstrikes in 2001, and the strikes that marked the end of US involvement in August were carried out by the same aging model of aircraft (Martin, 2021). The Taliban began the war with AK-47s and other basic, conventional weapons, but they now use the internet and mobile phones to conduct tactical messaging and persuasion campaigns, in addition to enhancing their armaments and central command capabilities.

Russia- Ukraine War

According to Ukrainian officials, Russian President Vladimir Putin launched a "full-scale invasion" of Ukraine on February 24, 2022, following weeks of building up a sizable military force close to the Ukrainian border and in neighboring Belarus (Sundbay, 2022). It marked the start of a catastrophic new chapter in the east of the country and a major escalation of an eight-year conflict that had already claimed countless lives. Face-recognition software is being used by Ukraine to track down Russian invaders and to identify Ukrainians killed in the ongoing fighting, which is notable because it is one of the few documented uses of artificial intelligence in the conflict (Tegler). Many academics seek to draw policymakers' attention to a growing body of research showing that AI and machine-learning systems can be abused in a variety of simple, easily accessible ways (Tegler, 2022). Russia may have gained a substantial technological edge through its air superiority, but it may have made a strategic error by delaying the deployment of its operational UAVs (Resitoglu, 2022). Currently, Russian airstrikes are growing increasingly severe, whereas Ukrainian aircraft are focusing on fewer but more significant strikes.

AI has the potential to be both empowering and asymmetric in its influence, which implies that a developing nation with limited resources may create robust AI software without needing to research, develop, and test a new physical military system (Gatopoulos, 2021). It is a powerful means for a country to overtake its competitors, generating potent innovations that can give it the edge needed to win a conflict. Countries like Russia and China, with their redesigned and streamlined forces, are aiming to outpace the world by investing extensively in future weapons research; they should take proper precautions in this regard. In the context of the development of certain new autonomous weapons, states must adapt International Humanitarian Law. The International Committee of the Red Cross (ICRC) has exerted pressure on nations to enact new international legislation that would restrict the use of certain autonomous weapons, including those under third-party control, and forbid the deployment of others.

CONCLUSION

As noted throughout this paper, the aim of the ICRC is to reduce the harm of warfare. To that end, the ICRC has already taken several initiatives within its mandate. States' concerns about autonomous weapon systems are also increasing day by day. The ICRC supports the steps already being taken by states as long as they serve human welfare. Artificial intelligence is one of the most crucial issues of our time, and there is a pressing need to improve the legislative framework in this arena. Given the rapid advances in technology and the employment of autonomous weapon systems, it is imperative that internationally agreed limits be imposed expeditiously. Within the confines of its mandate and areas of expertise, the ICRC stands ready to work in partnership with relevant parties at the national and international levels to accomplish this.

ACKNOWLEDGEMENT

First of all, we would like to thank the Almighty for giving us good health. We express our gratitude to our family members, who have sacrificed a lot for us. We would also like to express our gratitude to one of our respected teachers, Suprobhat Paul, for suggesting several key points regarding this research work.

CONFLICTS OF INTEREST

For this paper, the authors have disclosed no conflicts of interest.

AUTHOR CONTRIBUTIONS

A.N.: conceptualization and writing of the manuscript. A.H.: investigation and visualization. A.N. and A.H.: final checking and editing of the manuscript, and formal analysis. All authors involved in this research read and approved the manuscript for publication.


Article References:

  1. Alex Gatopoulos, (2021). 'Project Force: AI and the Military - a Friend or a Foe?' Al Jazeera. https://www.aljazeera.com/features/2021/3/28/friend-or-foe-artificial-intelligence-and-the-military
  2. Alex Sundbay, (2022). 'Russia's War in Ukraine: How It Came to This'. CBS News, New York City. https://www.cbsnews.com/news/ukraine-news-russia-war-how-we-got-here/
  3. Christopher Ankersen and Mike Martin, (2021). 'The Taliban, not the West, Won Afghanistan's Technological War'. MIT Technology Review. https://www.technologyreview.com/2021/08/23/1032459/afghanistan-taliban-war-technologicalprogress/
  4. Eric Tegler, (2022). 'The Vulnerability of AI Systems May Explain Why Russia Isn't Using Them Extensively in Ukraine'. Forbes, New Jersey.
  5. Gady F. S., (2015). 'Meet Russia's New Killer Robot'. The Diplomat. https://www.thediplomat.org-meet-russia-killer-robot
  6. ICRC Challenges Report, (2011); Neil Davison, 'Autonomous Weapon Systems under International Humanitarian Law', in Perspectives on Lethal Autonomous Weapon Systems, United Nations Office for Disarmament Affairs Occasional Paper No. 30.
  7. ICRC, (2013). 'International Humanitarian Law and the Challenges of Contemporary Armed Conflict'.
  8. ICRC, (2014). 'Autonomous Weapon Systems - Technical, Military, Legal and Humanitarian Aspects', Expert Meeting.
  9. ICRC Review, (2014). 'The International Committee of the Red Cross and the Promotion of International Humanitarian Law: Looking Back, Looking Forward'.
  10. ICRC, (2014). 'Autonomous Weapons: What Role for Humans?'
  11. ICRC and International Federation of Red Cross and Red Crescent Societies, (2015). The Fundamental Principles of the International Red Cross and Red Crescent Movement: Ethics and Tools for Humanitarian Action, Geneva.
  12. ICRC, (2017). 'What is the ICRC's Role in Developing and Ensuring Respect for IHL?', 24 August.
  13. ICRC Challenges Report, (2019). Perspectives on Lethal Autonomous Weapon Systems, United Nations Office for Disarmament Affairs Occasional Paper No. 201.
  14. ICRC, (2019). Position Paper: Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centered Approach.
  15. ICRC, (2019). Statements to the Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on Lethal Autonomous Weapons Systems, Geneva, 25-29.
  16. ICRC, (2022). 'History of the International Committee of the Red Cross'. https://www.icrc.org/en/history
  17. ICRC, (2022). 'International Humanitarian Law and the Protection of War Victims'. https://www.icrc.org/en/doc/resources/documents/misc/57jm93.htm
  18. ICRC, (2020). Reports and Documents: Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centered Approach (ICRC, 463-479).
  19. ICRC, (2022). Symposium Report: Digital Risks in Situations of Armed Conflict, March 2019, p. 9. www.icrc.org/en/event/digitalrisks-symposium
  20. ICRC, (2022). The Potential Human Cost of Cyber Operations, report of an expert meeting, Geneva. www.icrc.org/en/document/potential-human-cost-cyber-operations
  21. ICRC, (2022). Statement to the CCW Group of Governmental Experts on Lethal Autonomous Weapons Systems under Agenda Item 6(b), Geneva. https://tinyurl.com/y4cql4to
  22. ICRC and Brussels Privacy Hub, (2020). Handbook on Data Protection in Humanitarian Action, 2nd ed., Geneva.
  23. Madhumita Murgia, (2021). 'Researchers Train AI on "Synthetic Data" to Uncover Syrian War Crimes'. Financial Times. https://www.ft.com/content/8399873e-0dda-4c87-ba59-0e2678166fba
  24. N. Weizmann, (2014). 'Academy Briefing No. 8 - Autonomous Weapon Systems under International Law', Geneva Academy of International Humanitarian Law and Human Rights.
  25. Pourshah M. H., (2024). 'The Laws Governing Dual Citizenship in International Law', Asian J. Soc. Sci. Leg. Stud., 6(4), 146-153. https://doi.org/10.34104/ajssls.024.01460153
  26. Sofia Galeotti, (2022). 'Under the Rule of LAWS: Artificial Intelligence in Warfare'. https://securitypraxis.edu-underthe-rule-of-laws
  27. Syema Resitoglu, (2022). 'Artificial Intelligence in Russia-Ukraine War - Series' (TUIC, 2022).
  28. 'Taliban Are Back - What Next for Afghanistan?', (2021). BBC. https://www.bbc.com/news/world-asia49192495
  29. 'Why Has the Syrian War Lasted 11 Years?', (2022). BBC, London. https://www.bbc.com/news/world-middle-east35806229

Article Info:

Academic Editor
Dr. Sonjoy Bishwas, Executive, Universe Publishing Group (UniversePG), California, USA.

Received

June 3, 2024

Accepted

August 7, 2024

Published

August 21, 2024

Article DOI: 10.34104/ajssls.024.01620168

Cite this article

Nargis, A., and Haque, A. (2024). ICRC's intervention on AI-based weapons under international humanitarian law: a critical analysis, Asian J. Soc. Sci. Leg. Stud., 6(4), 162-168. https://doi.org/10.34104/ajssls.024.01620168
