Rise of the killer robots: Experts call for a ban on machines that can automatically kill humans

  • British expert claims he could make autonomous killer robot today
  • Nobel Laureate says machines could be in widespread use in 20 years

 

Machines with the ability to attack targets without any human intervention must be banned before they are developed for use on the battlefield, campaigners against ‘killer robots’ have urged.

The weapons, which could be ready for use within the next 20 years, would breach a moral and ethical boundary that should never be crossed, said Nobel Laureate Jody Williams, of the ‘Campaign To Stop Killer Robots’.

‘If war is reduced to weapons attacking without human beings in control, it is going to be civilians who are going to bear the brunt of warfare,’ said Williams, who won the 1997 peace prize for her work on banning landmines.

They’re here! A robot in Parliament Square at the launch of the Campaign to Stop Killer Robots. Lethal armed robots which could target and kill humans autonomously should be banned before they are used in warfare, campaigners have said.

Weapons such as remotely piloted drones are already used by some armed forces and companies are working on developing systems with a greater level of autonomy in flight and operation.

‘We already have a certain amount of autonomy,’ said Noel Sharkey, professor of Artificial Intelligence and Robotics at the University of Sheffield.

‘I think we are already there.

‘If you asked me to go and make an autonomous killer robot today, I could do it.

‘I could have you one here in a few days,’ he told reporters.

But the technology is a long way from being able to distinguish between a soldier and a civilian.

‘The idea of a robot being asked to exercise human judgment seems ridiculous to me,’ Sharkey told Reuters.

‘The whole idea of robots in the battlefield muddies the waters of accountability from my perspective as a roboticist,’ he added.

Robonaut 2: Nasa today released this picture of Expedition 35 Flight Engineer Chris Cassidy with a humanoid robot being tested in the Destiny Laboratory aboard the Earth-orbiting International Space Station.


The British government has always said it has no intention of developing such technology.

‘There are no plans to replace skilled military personnel with fully autonomous systems,’ a Ministry of Defence spokesman told Reuters.

‘Although the Royal Navy does have defensive systems, such as Phalanx, which can be used in an automatic mode to protect personnel and ships from enemy threats like missiles, a human operator oversees the entire engagement,’ the spokesman added.

But the organizers of the Campaign to Stop Killer Robots say Britain’s rejection of fully autonomous weapons is not yet watertight.

‘We’re concerned that there is a slide towards greater autonomy on the battlefield and unless we draw a clear line in the sand now, we may end up walking into acceptance of fully autonomous weapons,’ said Thomas Nash, director of non-governmental organization Article 36.

Rapid advancements in technology have allowed countries such as the United States, China, Russia, Israel and Germany to move towards systems that will soon give full combat autonomy to machines, according to a report by Human Rights Watch.

‘We think that these kinds of weapons will not be able to comply with international humanitarian law,’ Steve Goose, Human Rights Watch executive director, told Reuters.
