The conflict in Ukraine puts 'killer robot' regulation efforts in jeopardy.

US: Russia's invasion of Ukraine is jeopardising international efforts to legally regulate the use of autonomous weapons, also known as "killer robots". Those efforts could collapse entirely if such weapons are deployed in Ukraine and prove effective.
No nation is yet known to have deployed autonomous weapons. Because such weapons would select and attack targets without human oversight, their potential use is contentious.
Arms control organisations are pushing for legally binding international agreements, similar to those governing chemical and biological weapons, before the weapons are ever used. However, the current geopolitical situation is blocking progress.

From July 25 to 29, a group of governmental experts at the United Nations will meet for the last time to discuss autonomous weapons. Insiders say that, despite considering the issue since 2017, the group has reached no consensus. Russia, because of its invasion of Ukraine, opposes international legal controls and is currently abstaining from the discussions, making unanimity impossible.

"There is no chance for a blanket ban on autonomous weapons, because the United Nations process operates under a consensus mechanism," says Gregory Allen at the Center for Strategic and International Studies in Washington, DC. However, he says there may still be room for progress through the widespread adoption of a code of conduct.

Such a code might be based on Directive 3000.09, the policy on autonomous weapons that the US introduced in 2012, the first of its kind. Pentagon rules require that directives be updated every ten years. According to Allen, Directive 3000.09 is frequently misinterpreted as banning autonomous weapons in the United States.

In fact, it specifies requirements for such weapons and a rigorous approval process, intended to reduce risk to friendly forces and civilians. There are exceptions for systems such as landmines and anti-missile defences. No system has yet been submitted for review, Allen says, because the regulations are so stringent and demand approval from, for example, the highest-ranking officer in the US military.

Attitudes could shift if Russia uses autonomous weapons in Ukraine. Russia has KUB loitering munitions that can be operated automatically: they can wait in a particular location and attack when a target is identified by the onboard system. Allen doubts these have been used autonomously, but says that in June, Russia's troops in Ukraine received similar Lancet loitering munitions, which can autonomously locate and attack targets. There are no reports of either being used in autonomous mode.

If Russia uses these weapons, Allen predicts that some in the US government will wonder whether they are necessary for effective deterrence now or in the future.

In the absence of legally binding international treaties or codes of conduct, autonomous weapons will keep evolving. Conflicts like the one in Ukraine will increase demand for more advanced weapons, and it may not be long before killer robots appear on the battlefield.
