Science Fiction, Meet Lethal Weapon

Autonomous weapons are almost here. From drone swarms to smart assault rifles, the face of warfare is about to get a makeover.

When Noel Sharkey was asked about autonomous weapons at a press conference in 2007, the concept was new to him. He began researching it immediately, discovering that smart weaponry was on the roadmap of almost every branch of the U.S. military. Twelve years later, having founded multiple organizations and attended dozens of United Nations conventions, he remains as horrified by the concept as ever.

“From my position as a professor of robotics and AI, I thought this was insane – that they had a science-fiction notion of what AI was – but they’ve carried on with it,” he says. 

Sharkey is an Emeritus Professor at the University of Sheffield and an active voice in a large community of scientists and activists who believe that computers have no place in life-or-death situations. He founded (and now chairs) the International Committee for Robot Arms Control (ICRAC), and holds roles in several similar organizations. These NGOs campaign at UN events across the globe in a bid to convince governments to halt the proliferation of autonomous weapons.

Public support for banning the development of smart weapons is widespread: an early 2019 survey conducted by Ipsos for Human Rights Watch and the Campaign to Stop Killer Robots found that 61% of adults across 26 countries oppose fully autonomous weapons. Yet many of the world’s most formidable militaries are continuing their research and development in the hope of minimizing human casualties on the battlefield.

Under the hood of military tech

Emerging technologies have disrupted almost every field, but nowhere is the impact on human life clearer than in the development of defense technologies. Drone technology and AI hold particular interest for military applications. 

According to Sharkey, the U.S. military can fly over 100 weaponized drones at once, and China has “flotillas of 50 little ships that can work together.” The idea is that these swarms will communicate only with one another, using machine vision to identify targets, with a human at the other end authorizing the strike. In military terms, it’s known as force multiplication, or magnifying the force of one person by enabling them to command multiple weapons.
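The specifications of these swarm systems aren’t public, but the command structure Sharkey describes – many platforms, one human approver – can be sketched in a few lines. The Python below is a purely illustrative mock-up of that human-in-the-loop flow; the class names and the target-nomination step are invented for the example, not drawn from any real system.

```python
from dataclasses import dataclass

@dataclass
class Drone:
    """Illustrative stand-in for one platform in a swarm."""
    drone_id: int

    def propose_target(self) -> str:
        # In Sharkey's description, onboard machine vision would
        # nominate a target here; this mock returns a placeholder.
        return f"candidate-target-{self.drone_id}"

def operate_swarm(drones, authorize):
    """Force multiplication with a human in the loop: one operator
    (the authorize callback) gates every strike the swarm proposes."""
    approved = []
    for drone in drones:
        candidate = drone.propose_target()
        if authorize(candidate):  # the single human decision point
            approved.append(candidate)
    return approved

# One operator 'commanding' 100 platforms; nothing proceeds without
# the human callback's approval.
swarm = [Drone(i) for i in range(100)]
strikes = operate_swarm(swarm, authorize=lambda target: False)
print(len(strikes))  # 0 - every proposed strike was denied
```

What the sketch makes concrete is where autonomy begins: replace that single `authorize` callback with an algorithm, and the system becomes, by most definitions, a fully autonomous weapon.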

Much of the development in this field is secretive, with few defense firms – if any – ready to divulge the extent of their research and testing. Russian arms maker Kalashnikov is one of the few firms to have demonstrated a product publicly. In 2017, the company announced a fully automated combat module that runs on neural networks and can control a variety of weapons, allowing it to identify and strike targets without a human decision-maker present.

Guns operated by humans have also been upgraded in various ways. Bullets that use lasers to correct their trajectory mid-flight (similar to target-locking in guided missiles) have been in development for over a decade, and Reuters reported in October 2019 that the U.S. Army had awarded contracts to several parties involved in creating smart rifles. These firearms can stay locked onto a target regardless of atmospheric conditions or the shooter’s aim.

A case of discrimination

The most telling part of the race toward smart weapons is how long they have been in development without being deployed. Sharkey cites the example of Red Owl, a counter-sniper device that uses two microphones to locate the source of sniper fire and shoot back. But there are many ways such a device can go wrong: a sniper positioned among civilians, for instance, would almost certainly mean civilian casualties if the system returned fire.

“One of these systems has been around for about 10 years now, and it’s never been deployed because it can’t tell the difference between ricochet and firing,” says Sharkey. Other automated weapons systems can similarly be tricked without much difficulty – just one of the many reasons that NGOs are so vocal about getting development in this area shut down.
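Red Owl’s internal workings aren’t public, but the acoustic principle Sharkey describes is straightforward: a gunshot reaches the two microphones at slightly different times, and that tiny delay implies a bearing. The Python sketch below shows time-difference-of-arrival (TDOA) direction finding under simple assumptions – the half-meter microphone spacing and the far-field (plane-wave) model are illustrative choices, not the device’s actual parameters.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
MIC_SPACING = 0.5       # meters between the two microphones (assumed)

def bearing_from_tdoa(delay_seconds: float) -> float:
    """Estimate a sound source's bearing from the time difference of
    arrival (TDOA) between two microphones, in degrees off broadside.
    Assumes a distant (plane-wave) source."""
    path_difference = SPEED_OF_SOUND * delay_seconds
    # Clamp to the physically possible range before taking the arcsine
    ratio = np.clip(path_difference / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

# Example: the right microphone hears the shot 0.7 ms after the left one
print(f"{bearing_from_tdoa(0.0007):.1f} degrees")  # ~28.7 degrees
```

The catch is exactly the one Sharkey raises: a ricochet or echo is itself an acoustic impulse arriving from its own direction, and nothing in this arithmetic can tell it apart from the original shot.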

As early as 2009, there were reports of Taliban agents intercepting live U.S. Army drone footage using software bought online for less than US$30. Sharkey argues that pitting two algorithms controlling lethal weapons against each other could be catastrophic, with outcomes impossible to predict. Nor would a human adversary find it especially difficult to game such a system into failing.

“The worst thing for a computer program is to keep putting it in unanticipated circumstances. It will fail, eventually, and you can help it to fail with fairly low-tech situations,” he says.

As advanced as machine vision is becoming, Sharkey believes that robots will never be capable of assessing an environment contextually and discriminating between military and civilian targets.

“You’re not allowed to kill surrendering forces, surrendering combatants. You’re not allowed to kill mentally ill soldiers, either,” Sharkey says. “So you’ve got all these problems of who you’re meant to be able to kill, what’s the legitimacy of the target, and I really don’t think these things are capable of making those decisions.”

Drone swarms and autonomous jets are also designed to operate beyond the limits of human reaction speed. Experimental aircraft have reached Mach 7 – roughly 2,400 meters per second, meaning a vehicle covers more than half a kilometer in the quarter-second it takes a human simply to react – and uncrewed jets like China’s Anjian (meaning ‘Dark Sword’), designed to perform maneuvers impossible for a human pilot, are in development. There’s little chance of a human being fast enough to take in every detail before making a fully informed strike.

Sharkey’s third argument against AI weapons is short and to the point: “It’s against human dignity to have a machine delegated with the decision of life or death for a human.”

Focusing on the bigger picture

Whether or not civilians are soon to become unfortunate victims of killer robots, some aspects of military technology – primarily surveillance-related – are slowly filtering down the pipeline. A September 2019 Carnegie Endowment for International Peace report found that 75 of 176 countries surveyed are using AI for surveillance. Of these, 63 (84%) use technology from Chinese companies, with Huawei alone supplying 50 of the 63. Despite the privacy concerns, defense and law enforcement agencies have little choice but to embrace the technology.

“With departments of defense – their job is to protect our country as best as they can,” says Sharkey. “But as members of civil society, our job is to say, hold on, that’s going too far, we don’t want that done in our name.”

Similarly, Sharkey adds, governments feel unable to halt their development of autonomous weapons because they’re obligated to keep up with other nations. China, for instance, has voiced support for a ban on autonomous weapons, yet continues to develop them – reasoning that until an international ban is imposed, the U.S. and Russia will keep doing the same.

Conversely, while many might assume that militaries are the ones pushing hardest for smart weapons, armed forces are often more receptive to NGOs’ concerns than world leaders are. Militaries prefer to maintain control on the battlefield – a sentiment Sharkey and his associates share.

“[AI] is good for games of chess and things, where it’s a closed game with fixed rules,” says Sharkey. “But if the enemy has tricked you, or when the autonomous weapons arrive and it’s a crowd of civilians, at least the military will know it’s been tricked.”

It could be several decades – if ever – before fully autonomous lethal weapons are deployed in combat, but if and when that day comes, understanding the technology’s strengths and weaknesses will be essential. J. Robert Oppenheimer, often called the father of the atomic bomb, famously came to regret his creation amid the global arms race it sparked. It remains to be seen whether the inventors of killer robots will come to feel the same way.
