The Pentagon's Killer Robots Are Ready for Battle: How Will the World React to the New Arms Race?

Learn how autonomous weapons can change the rules of war.

In light of the US Department of Defense's aggressive efforts to modernize the armed forces with fully autonomous drones and weapons systems, critics have expressed concerns about the possible start of a new arms race, which could significantly increase the risk of mass destruction, nuclear war and civilian casualties.

The initiative, dubbed "Replicator," is a massive effort to scale existing technologies to create a future armed force in which fully autonomous systems will be embedded in flying drones, aircraft, watercraft, and defense systems that are integrated through a computer center to synchronize and manage units.

Critics call autonomous weapons "killer robots" because they are controlled by artificial intelligence and can, in principle, destroy targets on their own without human intervention. There are currently no international treaties regulating the use of such weapons, which worries human rights groups.

Anna Hehir, head of autonomous weapons systems research at the Future of Life Institute (FLI), said: "It's really a Pandora's box that we're starting to open, and it's going to be very difficult to go back."

Deputy Secretary of Defense Kathleen Hicks first announced Replicator in late August at a defense conference in Washington, calling it a "game-changing" initiative that counters China's growing ambitions. She also stated that the initiative will comply with ethical guidelines for fully autonomous systems.

The main concern is that heavy reliance on AI weapons could make the decision to go to war easier. Critics also argue that algorithms should not take human lives at all, since they cannot understand the value of a life and may be biased.

These guidelines are based on Pentagon directives on the use of AI in military operations, updated in January 2023. They are designed to ensure that senior commanders and responsible officials carefully review and approve new weapons. Under this policy, a human must be involved in the decision before an AI-based weapon system can use force.

A Congressional Research Service report notes that this is a "flexible term" that is not applied consistently across contexts. It also points out that the phrase "human judgment on the use of force" does not necessarily mean direct human involvement in each engagement; it can refer to broader decisions about how forces are deployed.

In 2018, FLI published a petition calling for a ban on any autonomous weapon that kills a person without the involvement of a human operator. To date, the petition has been signed by 274 organizations and almost 4,000 people.

A 2016 FLI letter called for a complete ban on offensive weapons beyond actual human control. It was signed by more than 34,000 people, including tech billionaire Elon Musk and theoretical physicist Stephen Hawking.

The Western Security Alliance, NATO, has set some standards for the use of autonomous weapons, including the requirement to attack only legitimate military targets. But NATO officials acknowledge that it is unclear "when and where the law requires a human presence in the decision-making cycle" and "how broad the decision-making cycle is."

While at least 30 countries have called for a ban on lethal autonomous systems, the United Nations (UN) has yet to ratify any treaty regulating autonomous weapons, and to date there is no agreed definition of these weapons.

UN Secretary-General Antonio Guterres has called for a legally binding agreement by 2026 that restricts the use of technology and prohibits the use of deadly autonomous weapons without human supervision. The UN Convention on Certain Conventional Weapons has been discussing this issue since 2013.

The UN plans to take a more active look at the issue of autonomous systems this year; the General Assembly is expected to raise the issue of autonomous systems in October, and the Convention on Certain Conventional Weapons will discuss the topic in November.

Compliance with international treaties remains an open question, since the use of advanced AI technologies may be difficult to track and attribute. These and other issues require careful consideration and international cooperation to ensure the safe and responsible use of autonomous weapons systems.