
Autonomous weapons are here, but the world is not ready for them


This may be remembered as the year the world learned that lethal autonomous weapons have moved from a futuristic worry to a battlefield reality. It was also the year when policymakers failed to agree on what to do about it.

On Friday, 120 countries participating in the United Nations’ Convention on Certain Conventional Weapons could not agree on whether to limit the development or use of lethal autonomous weapons. Instead, they pledged to continue and “intensify” discussions.

Neil Davison, senior scientific and policy adviser at the International Committee of the Red Cross, a humanitarian organization based in Geneva, called the outcome disappointing and a missed opportunity.

The failure to reach an agreement came roughly nine months after the UN reported that a lethal autonomous weapon had been used for the first time in armed conflict, in the Libyan civil war.

In recent years, more and more weapons systems have incorporated elements of autonomy. Some missiles, for example, can fly without specific instructions within a given area, but they still generally rely on a person to launch an attack. And most governments say that, for now at least, they plan to keep a human “in the loop” when using such technology.

But advances in artificial intelligence algorithms, sensors, and electronics have made it easier to build more sophisticated autonomous systems, raising the prospect of machines that can decide on their own when to use lethal force.

A growing list of countries, including Brazil, South Africa, New Zealand, and Switzerland, argue that lethal autonomous weapons should be restricted by treaty, as chemical and biological weapons and land mines have been. Germany and France support restrictions on certain kinds of autonomous weapons, including those that target humans. China favors an extremely narrow set of restrictions.

Other nations, including the US, Russia, India, the UK, and Australia, oppose a ban on lethal autonomous weapons, arguing that they need to develop the technology to avoid being placed at a strategic disadvantage.

Killer robots have long captured the public imagination, inspiring both beloved sci-fi characters and dystopian visions of the future. A recent renaissance in AI, and the creation of new kinds of computer programs capable of outthinking humans in certain realms, has prompted some of the biggest names in technology to warn of the existential threat posed by smarter machines.

The issue became more pressing this year, after a UN report said a Turkish-made drone known as the Kargu-2 was used in Libya’s civil war in 2020. Forces aligned with the Government of National Accord reportedly launched drones against troops supporting Libyan National Army leader General Khalifa Haftar; the drones targeted and attacked people independently.

“Retreating supply convoys and Haftar-affiliated forces were … hunted down and remotely engaged by the combat drones,” the report said. The systems “were programmed to attack targets without requiring data connectivity between the operator and the munition.”

The news reflects how quickly autonomous technology is improving. “The technology is developing much faster than the military-political discussion,” says Max Tegmark, a professor at MIT and cofounder of the Future of Life Institute, an organization dedicated to addressing existential risks facing humanity. “And by default, we’re headed for the worst possible outcome.”
