The ongoing efforts to regulate the use of autonomous weapons systems.

Emirhan Darcan, PhD
Senior Expert
Global Center for Security Studies
Autonomous weapons systems will play an important role in future conflicts. This briefing paper describes the ongoing efforts to ban and restrict their use. From an international security perspective, Autonomous Weapon Systems (AWS) are defined as robotic weapons that can sense and act unilaterally, depending on how they are programmed, and they hold enormous potential. Their use is not without challenges, however: in terms of ensuring respect for human rights in particular, AWS threaten the fundamental right to life and the principle of human dignity. Weapons that can make their own decisions raise both legal and moral questions. Autonomy in these weapons means no human intervention, and this is precisely where the problem lies. What if autonomous weapons destroy not only enemy weapons but also kill civilians?
Banning and Restricting the Use of AWS

International Level

Since 2014, the issue has been discussed at the United Nations. In addition to some 70 countries, scientists and non-governmental organizations have participated in the Geneva talks held within the framework of the Convention on Certain Conventional Weapons. Lethal autonomous weapon systems were discussed in the negotiations held between 27 and 31 August. Robots that engage each other on the battlefield also fall under this definition.

Gathering at the International Conference on Artificial Intelligence, held in Buenos Aires in July 2015, experts released an open letter addressed to the United Nations. Alongside many of the world's leading artificial intelligence experts, the letter bore the signatures of prominent figures from business and science.

Artificial intelligence experts and activists, who came together at the United Nations in October 2015, emphasized the seriousness of the issue and reiterated their calls for a prohibition on autonomous weapons. In November 2015, the United Nations convened once again for a meeting on the regulation of conventional weapons. The experts who had signed the open letter hoped that the idea of banning autonomous weapons would be strongly expressed and accepted at this meeting.

The UN conference on conventional weapons that addressed AWS took place in Geneva, Switzerland, in 2017. Scientists worried about autonomous weapons had previously brought the issue onto the United Nations agenda and received the support of 19 countries. However, countries such as the United Kingdom and the USA did not support regulation, arguing that an arrangement on this subject would be useless given the difficulty of defining the concept of "human control".

In August 2018, experts from various countries came together at the UN headquarters in Geneva, Switzerland, to try to establish a framework for identifying and regulating computer-controlled weapons. At that meeting, some countries, including Israel, Russia, South Korea, and the US, opposed a ban, saying they wanted to explore the potential "advantages" of autonomous weapons systems.

UN member states discussed autonomous weapons again in November 2019 in Geneva. At this meeting, some security analysts argued that, as long as humans program them, such weapons are no more dangerous than other weapons. Even a non-binding declaration of the kind Germany called for was not adopted.

There is currently no binding international ban on killer robots. No progress has been made in the United Nations (UN) toward international legal regulation of autonomous weapons (AWS), also called "killer robots". Unfortunately, public pressure has not increased sufficiently either. Moreover, unlike in the case of landmines, an international ban has not been reached outside the UN. Consequently, the discussions on AWS at the Convention on Certain Conventional Weapons (CCW) meetings held in Geneva have produced no concrete results.

Additionally, in September 2018 the European Parliament adopted a resolution calling for a prohibition on the development and use of autonomous weapons systems that can kill without human intervention. Some European parliamentarians opposed the resolution, which also calls for an international ban in this area, on the grounds that it might restrict the development of artificial intelligence. Others worry that some countries would continue developing such weapons while other countries were prevented from doing so.

In addition to the efforts described above, the United Nations (UN) has set up a center in The Hague, the Netherlands, to examine developments in artificial intelligence and the threats they may pose. This center, founded as the Centre for Artificial Intelligence and Robotics, investigates risks arising from technological advances, ranging from autonomous robots to mass unemployment.

The Civil Society Level

Activities at the civil society level can be examined under two main headings: the first concerns United Nations decisions on killer robots, and the second concerns the role of companies in the development of these weapons.

The non-governmental organization known as the "Campaign to Stop Killer Robots" calls for a ban on lethal autonomous weapons, warning that advances in sensors and artificial intelligence will make it possible to produce weapons that can target and fire without human intervention. More than 3,000 well-known people signed a related open letter launched by the American "Future of Life" institute. Figures such as Tesla's boss Elon Musk and companies working on artificial intelligence such as Google have announced that they will not contribute to autonomous weapon systems. States and organizations opposed to robotic weapons demand that such weapons be banned through international norms and laws, because the decision to kill a person should never be left to a machine. In February 2019, a group of Microsoft employees calling themselves "Microsoft Workers 4 Good" called on the company to cancel the $480 million contract to supply 100,000 HoloLens headsets to the US Army for combat use, saying they had not signed up to develop weapons. Through the group's Twitter account, an open letter demanding cancellation of the IVAS contract was sent to the company's chief legal officer Brad Smith and CEO Satya Nadella.

The HoloLens protest recalls the earlier opposition of Google employees to the development of artificial intelligence programs for detecting people and objects in drone video surveillance footage. Following its staff's protest, Google withdrew from the Pentagon's Maven program and published principles stating that the artificial intelligence it develops would never harm people. In line with these principles, Google also withdrew from the bidding for the Pentagon's $10 billion JEDI cloud contract. Employees addressed a letter to CEO Sundar Pichai arguing that Google should not be in the business of war. However, Google makes its artificial intelligence framework "TensorFlow" freely available on the internet; with this open-source machine learning library, anyone can develop the applications they want.

More than 50 researchers in the field of artificial intelligence and robotics announced that they would boycott KAIST University in South Korea over its plans to develop artificial intelligence-assisted weapons. The academics participating in the boycott pledged to cut off all forms of academic cooperation with the university unless it guaranteed that any weapons developed by KAIST would remain under "meaningful human control". The boycott was triggered by KAIST's announcement in February 2018 that it was cooperating with the South Korean defense company Hanwha Systems. According to The Korea Times, the aim of the cooperation was "to develop military weapons technology that can destroy its targets without human control".

In May and June 2013, activists staged a "robot invasion" in front of the British Parliament, aiming to draw attention to the danger posed by robots capable of killing. In August 2018, officials from 80 countries came together under the United Nations Convention on Certain Conventional Weapons (CCW) to discuss prohibiting autonomous weapons. Moves toward a ban were blocked by the USA, Australia, Israel, South Korea, and Russia. Unlike in the negotiations on nuclear and chemical weapons, these countries argued that they "first want to explore the potential benefits" of lethal AWS.

Conclusion

The pervasiveness of mobile communication and the internet in daily life makes it easier to organize social movements. Internet-based social movements and protests may therefore push policy-makers to address how AWS technologies can strengthen, rather than weaken, human rights in the future. Especially given strong state interests and politically turbulent times, negotiating AWS regulations at the international level nonetheless remains necessary.

At the very least, social movements and protests should continue to raise awareness, helping the public follow developments in AWS at the national and international levels.


