Human rights groups rally humanity against killer robots

An international coalition of nongovernmental organizations (NGOs) has formed the Campaign to Stop Killer Robots, a lobbying organization aimed at securing a worldwide ban on fully autonomous weapons. "Lethal armed robots that could target and kill without any human intervention should never be built," Steve Goose, director of …

COMMENTS

This topic is closed for new posts.
  1. Don Jefe
    Happy

    Seen The Movie

    Where the scientists say "we're not going to do this until we get it right": it doesn't end well...

  2. Captain DaFt

    Big reason it won't happen

    No military will advocate autonomous killing machines (other than in extreme circumstances), for one simple reason: it'd take all the fun out of war!

  3. Anonymous Coward
    Facepalm

    Yeah. Good luck with that

    The robotic whirring sound you can hear is Isaac Asimov spinning in his grave.

  4. Anonymous Coward
    Anonymous Coward

    It seems like they are helping the killer robots

    Is it just me, or does gathering in a large group against killer robots sound a lot like lambs to the slaughter? Maybe they will even all line up so the drones can save fuel and ammo.

  5. Idocrase

    Because training humans to kill other humans, then sending them into civilian populated areas armed with high explosives, depleted uranium shells and the attitude that all the natives are terrorists, always ends well...

    At least a machine would be able to recognize the difference between a Reuters photographer and a guy with a rocket launcher.

    1. The Man Who Fell To Earth Silver badge
      FAIL

      Foolish human

      As the union representative for the killer robots on my planet, I think it's my duty to let you know that while a machine would be able to recognize the difference between a Reuters photographer and a guy with a rocket launcher, we'd still find it best practice to terminate them both.

      1. Dave 126 Silver badge

        Re: Foolish human

        If one is loose with the definition, then land-mines could be considered 'killer robots' - i.e. they react to predefined stimuli in such a way as to cause incapacitation. The same is true of mines as it would be of 'armed UAVs with no person in the loop', and as it is of a workshop bench grinder wrapped around some careless wretch's sleeve: "They don't know when to stop".

        If a human pilot is relying on an automated system to identify targets (or rather, relying on an automated system to identify friendly armoured vehicles and not kill their occupants) then it seems a little irrelevant to argue about whether the actual trigger should be pulled by a human finger.

        Google are trying to make autonomous cars... some would say they have to be really, really safe before being considered suitable for mass adoption... others would say they just have to be demonstrably safer than a human driver.

    2. Thorne

      "Because training humans to kill other humans, then sending them into civilian populated areas armed with high explosives, depleted uranium shells and the attitude that all the natives are terrorists, always ends well...

      At least a machine would be able to recognize the difference between a Reuters photographer and a guy with a rocket launcher."

      To the army a reporter is more dangerous than a terrorist...

      1. Anonymous Coward
        Anonymous Coward

        @Thorne

        Nope. I escorted a BBC team, the Times defence correspondent, and a German photographer around my PB in Afghanistan, and at no time did I feel in any danger from them. There was a guy from the Scottish Herald who got on the wrong Chinook and ended up with us instead of 5 Scots, but I just gave him a sleeping bag and got him back on the right chopper.

        A free press is part of a free society, and the British Army is part of that free society.

        The US pilots who shot up the Reuters team were trigger-happy cowboys, and it was not an individual or institutional effort to suppress journalism.

  6. Dave 32
    Coat

    Rise Of The Machines

    Don't they know that they can't stop the Rise Of The Machines? Resistance is futile...

    Of course, governments can promise that they won't design/build such machines. And they'll stick to that promise, at least until they get wind that some other government is building such machines, at which point an arms race in autonomous killing robots will occur.

    Dave

    P.S. I'll get my coat. It's the one with the badge that says "I welcome our new robot overlords".

    1. Anonymous Coward
      Anonymous Coward

      Re: Rise Of The Machines

      Forget governments, I'm more worried about civilians building them!

      Shame my university rejected my idea of an autonomous drone that tracked people down by facial recognition and then tagged them with a paintball gun...

      1. Anonymous Coward
        Anonymous Coward

        Re: Rise Of The Machines

        We have a machine at work that does just that: it wanders the datacenter and, if it doesn't recognize someone, it shoots a Nerf dart at them. It uses a small Atom board (D525) and a Kinect for its control system. (The Kinect identifies the object as human and relays the image of the person's head region to a set of C# applications that do the actual face recognition.)
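        Roughly, the recognition step boils down to something like this (a Python/OpenCV stand-in for our C# code; the cascade file ships with OpenCV, but the trained-model filename and match threshold are made up):

        ```python
        import cv2

        # Haar cascade face detector bundled with OpenCV
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        # LBPH face recognizer from opencv-contrib-python, trained on the team
        recognizer = cv2.face.LBPHFaceRecognizer_create()
        recognizer.read("team_faces.yml")  # hypothetical trained model

        def is_team_member(frame_bgr, threshold=70.0):
            """True if any face in the frame matches a programmed-in team member."""
            grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in detector.detectMultiScale(grey, 1.3, 5):
                label, distance = recognizer.predict(grey[y:y + h, x:x + w])
                if distance < threshold:  # LBPH: lower distance = better match
                    return True
            return False

        # The robot only fires the Nerf dart when is_team_member() returns False.
        ```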

        So far it hasn't hit anyone on my team, but it has hit a couple of electrical contractors who haven't had their faces programmed in yet.

        Anon as everyone already thinks the company I work for is an Evil Empire...

  7. nagyeger
    Terminator

    too late

    We already have (single use) technologies that can automatically select their own targets. Have had for years.

    I'm thinking of the spectrum of missiles (e.g. anti-radar-installation, anti-tank, anti-aircraft, anti-missile, etc.), mines (sea and land), and so on. About the only difference is that "killer robots" presumably have a shorter loiter time than your average WWII mine, and if we're lucky they might struggle if there are stairs/pylons in their way.

    1. Anonymous Coward
      Anonymous Coward

      Re: too late

      Noel Sharkey (pictured, but oddly not mentioned) was saying on Radio 4's PM programme the other day that these are all weapons that are instructed by humans. What they are actually talking about is something that can act fully autonomously, with no instruction or oversight from humans.

    2. mIRCat
      Coat

      Re: too late

      "if we're lucky they might struggle if there are stairs / pylons in their way."

      No one would design a weapon that can't go up stairs!

      ...But let me just fetch my sonic screwdriver to be safe.

      1. Richard 12 Silver badge

        Re: too late

        Yes, it is too late in at least one field.

        Anti-missile systems are already exactly this, because they don't work if they aren't - see HMS Sheffield for a tragic example of why humans can't do anti-missile.

        You select a volume of space for them to watch and the locations to protect, then turn them on.

        They then automatically fire upon and (hopefully) destroy any incoming items they recognise as an inbound missile.

        There is no human in the loop, because by the time the meat-bag hears the alarm, it's too late for an anti-missile system to do anything.
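        In control-flow terms it really is that simple. A toy sketch (every name and threshold below is invented; no real CIWS exposes an interface like this):

        ```python
        from dataclasses import dataclass

        @dataclass
        class Track:
            range_m: float      # distance to the protected ship, metres
            closing_ms: float   # closing speed, m/s (positive = inbound)
            rcs_m2: float       # estimated radar cross-section, m^2

        def is_inbound_missile(t: Track) -> bool:
            # Crude classifier: small, fast, and getting closer.
            return t.rcs_m2 < 1.0 and t.closing_ms > 200.0

        def engagement_loop(scan, fire):
            """Nothing human sits between scan() and fire().

            A sea-skimmer at ~300 m/s covers the last 6 km in about
            20 seconds; an operator hearing an alarm is already too late.
            """
            while True:
                for track in scan():       # one radar sweep of the watch volume
                    if is_inbound_missile(track):
                        fire(track)        # engage now; self-destruct later if ordered
        ```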

        Now, it's probable that the operator can order the anti-missile-missile to self-destruct after it's launched, but there's still very little time.

        On the other hand, there is a difference between anti-missile systems and anti-tank etc. as there is a lot more time to identify the target before you need to open fire - although still not very much.

        On the third hand, what are the military supposed to be defending against anyway?

        Who has tanks and might invade a neighbour in the next fifty years? North Korea and Iran are about it!

        The current and near-future threats are individuals or small to medium-sized groups (perhaps associated with international movements), not states.

        Autonomous systems can't identify those, and really, neither can military personnel - although they usually do better.

        It's effective policing and peacekeeping forces that are really needed these days.

    3. Anonymous Coward
      Angel

      Re: too late

      Quote: "About the only difference is that "killer robots" presumably have, (is) a shorter loiter time than your average WWII mine,"

      Hmmmmm, the world's three biggest industries are sex, drugs and guns...

      It's too profitable.

      And dropping autonomous killer bots by parachute into a country is not an issue, because not much will be heard about it. Everyone in that country who does not like being invaded and shot up must actually be a terrorist: they get in the way of the cheap (stolen) oil, and they interfere with bringing peace and democracy to their country.

      We can't allow that. The reason why we send in the bots, is because we care, we care for the hearts and minds of the nations that are ruled by democratically elected tyrants, who oppose our foreign policy, armies and corporate bank robbers.

      1. Anonymous Coward
        Anonymous Coward

        Re: too late

        Actually, the trade in weapons is worth far more than the trade in narcotics and pr0n put together.

        1. Eddy Ito
          Coat

          Re: too late

          The weapons trade is only worth more because it's a competitive sport. I've never heard anyone saying they needed to "invest" more in pr0n or drugs because their neighbor was stockpiling oxycodone and grumble flicks.

  8. Anonymous Coward
    Pint

    Instead of aiming for the tools...

    Why not focus your energy on the powers that would actually use or wield these?

    Or put differently: focus on the heart of the problem instead of the symptoms; it gets you much better results. Of course, "protest against a future SkyNet" sells so much better...

  9. Anonymous Coward
    Anonymous Coward

    Well... while I agree in a way, it could in theory block development of other technology... what about Point Defence weapons, i.e. anti-missile guns/lasers?

    These are designed to operate as a 'shield' for the ship itself and the ships it escorts.

    And while I think at present a human is in charge of fire/don't fire...

    Think about it: if you have 50 incoming missiles, you want to take them out immediately; at some point you have to rely on friend-or-foe detection and let the computers do the grunt work. Even if it's just a yes/no button, by the time you approve the action it is too late...

    IIRC some of the missile defence screens actually are automatic unless aborted by an operator...
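    The friend-or-foe part is essentially a challenge-response handshake. A toy sketch (real military IFF, e.g. Mode 5, is far more involved; the key handling here is purely illustrative):

    ```python
    import hashlib
    import hmac
    import os

    # Shared secret distributed to friendly platforms beforehand (illustrative).
    KEY = os.urandom(32)

    def challenge() -> bytes:
        """Interrogator sends a random nonce to the unknown contact."""
        return os.urandom(16)

    def respond(key: bytes, nonce: bytes) -> bytes:
        """A friendly transponder proves it holds the key."""
        return hmac.new(key, nonce, hashlib.sha256).digest()

    def is_friend(key: bytes, nonce: bytes, reply: bytes) -> bool:
        return hmac.compare_digest(respond(key, nonce), reply)

    # No reply, or a wrong one, and the computer marks the contact hostile;
    # with 50 inbound tracks nobody has time to ask a human about each.
    nonce = challenge()
    assert is_friend(KEY, nonce, respond(KEY, nonce))
    ```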

    1. graeme leggett Silver badge

      @AC

      Naval defence is not so much the concern, and you imply there's still a human in the loop.

      The concern seems to be more over, e.g., an autonomous drone flying across a city looking for insurgents with technicals or carrying RPGs and shooting them up, without recourse to an overseer to say "actually, that's a roll of carpet".

      Perhaps a ban would allow self-defence (anti-missile missiles) but not counter-offence (a missile automatically launched at the area where the anti-ship missile originated).

  10. A J Stiles

    Good

    This is a nice beginning.

    I'd go further, and outlaw remotely-controlled vehicles with weapons systems. Or anything (besides conventional armour) that denies the target a chance to fight back at their attacker, even.

    1. Steven Roper

      Re: Good

      "Or anything ... that denies the target a chance to fight back at their attacker..."

      Ah yes, the old principles of honour and fair fighting in warfare. In days of old when knights were bold and all that.

      There's a song from around 1979 you should listen to (if you haven't heard it already) by Chris de Burgh, called "Crusader". Especially listen to the last verse, which is a reprise of the first with a few significant changes. It's just as relevant today as it was in the post-Vietnam era it was written to satirise. It still brings a lump to my throat when I listen to it.

    2. Bumpy Cat
      WTF?

      Re: Good @AJ Stiles

      I bet you still want a free society that keeps you safe, though. So what you're basically saying is "Anyone who defends me must do so at great risk to themselves."

      1. A J Stiles

        Re: Good @AJ Stiles

        No; what I'm saying is "anyone who attacks me must do so at great risk to themselves."

        1. Bumpy Cat
          FAIL

          Re: Good @AJ Stiles

          Nice! So you're willing and able to stop a military attack on you, provided it's not done with autonomous drones?

          Not everyone lives in good old Blighty, with the whole of the EU between them and a hostile power.

  11. Spanners Silver badge
    Facepalm

    And this will work how?

    A few years ago, civilised countries banned (some) mines. Watching the news, I sometimes hear that not only are some people still being killed/maimed by old mines but that "rogue" states are still using them.

    We can ban autonomous lethal weapons all we like - and we should - but don't expect the usual suspects not to use them.

    1. graeme leggett Silver badge

      Re: And this will work how?

      Well, it'd give us another reason to go to war with them.

      And it's easier to ban them before they appear than to have to explain to upcoming nations the dichotomy of "yes, we've got (insert weapon of choice), but you're not allowed them".

  12. Daniel B.
    Terminator

    Anyone thinking of doing these for real?

    Understandable to have this opposition, especially as the "autonomous killing machine" stories almost always end badly: The Terminator, Screamers (based on Philip K. Dick's "Second Variety"), even the I, Robot movie, which has 3-law-compliant AI.

    Notably, in Screamers (and indeed in "Second Variety", which spawned said movie) the killer bots have a simple rule: "kill all humans not wearing a tag". The AI gets smart and "improves" itself until it can kill humans who do wear the tags, thus wiping out not only the enemy but its makers as well...

  13. veti Silver badge
    Holmes

    Fairly obviously...

    ... various governments' stances on this are dictated, mostly, by how confident they are in their own militaries' ability to build and use such robots.

    Hence, UK - 'no way, 'cuz we know we can't build a system we'd trust with that kind of power.' Expect this position to be reviewed every five years or so.

    USA - 'keep humans in the loop for now, let's see if we can build up that trust level.'

    I expect most countries would echo the UK's current position for now, with the possible exceptions of those (North Korea, and probably Russia) who just don't give a damn if a few of their soldiers get killed during testing.

    1. Anonymous Coward
      Anonymous Coward

      Re: Fairly obviously...

      Mr Veti writes:

      "Hence, UK - 'no way, 'cuz we know we can't build a system we'd trust with that kind of power.' Expect this position to be reviewed every five years or so."

      Actually, the UK has more of the enabling technology for this than anyone else. What it doesn't have is a pro-active military funding agency like the USA's DARPA. And because UK funding is currently so tight, the temptation for UK researchers to accept DARPA funding will become all but overwhelming. So Prof Sharkey ought to start naming names amongst his UK colleagues.

      The problem is not autonomous devices (we've had these since the invention of torpedoes and mines in the Victorian era), but rather devices that learn to improve their (and subsequent generations') performance.

  14. mIRCat
    Terminator

    So I won't be allowed to own one either?

  15. Denarius
    Trollface

    too late

    Deadly killer drones made of carbon compounds with limited-capability expert systems already exist. These drones create financial disaster, destroying industries and lives for the sake of maximising personal gain. If that is not an example of mechanical behaviour, not much else is.

  16. Ken Hagan Gold badge

    "Normal human beings find it repulsive."

    Do they? Oh well, one more way in which I (and many friends) are abnormal then. The tragedy is that so many normal human beings *don't* find it repulsive when it is another human picking out the targets and doing the killing. I suspect the nice distinctions are lost on the dead person and their surviving friends and relatives.

    Distinguishing between human and robotic killing systems is a "four legs good, two legs bad" argument. I can see valid moral arguments around the decision to go to war, whether you are killing enemy combatants or passing civilians, and whether you are removing them from the battlefield in a humane way. Indeed, a robotic army might actually be a more humane way to tackle a human enemy if the robots used less effective but non-lethal weaponry, trading higher robotic "casualties" for a lower death toll on the enemy side. But then, when the US announced research into non-lethal weaponry, people were upset about that, too, since it apparently lowered the moral barriers to war and death, er...

  17. This post has been deleted by its author

  18. Bumpy Cat
    Terminator

    UK/US is not the problem here

    Despite what a lot of commenters seem to think, the US and UK generally adhere to international law as it applies to war. This is not least because both countries have a robust legal system that can and will take the military to court.

    The real threat is in a decade or two, when countries with less appreciation for morality or legality can use autonomous lethal drones. There's already footage from Syria of unguided bombs being dropped from high altitude onto towns; North Korea has shelled islands, downed planes and sunk ships belonging to South Korea.

  19. hugo tyson
    Coat

    XKCD

    http://what-if.xkcd.com/42/

    has a nice aside...

  20. Amonynous

    This is about moral responsibility, not technology!

    The need to keep a human in the loop is not about the 'micro' issue of having some (slim) chance of preventing deaths of non-combatants. Innocent bystanders get killed (accidentally and deliberately) every day by knives, guns, gung-ho pilots, Predator drones operated from the other side of the world, etc. The technology is irrelevant to some extent, but in all cases there is either an individual (or at least a chain of command) that can be held accountable for their actions.

    It is about the 'macro' issue of who is accountable for unleashing completely autonomous killing machines into the wild. The machine doesn't have intelligence, just autonomy, so it is not morally responsible. The designer and builder of the system isn't accountable; how many innocent victims have successfully taken action against an arms manufacturer or dealer for their injuries or the loss of a loved one? The only people who can be held accountable are those in the military and political chain of command who decided to unleash the weapons into the arena of battle. If it is internationally acceptable for them to be used, then they are also off the hook. That is why such systems need to be banned under international law; otherwise our leaders will have even more freedom to wreak havoc in 'rogue' states in pursuit of oil and treasure.

  21. David Lawrence
    WTF?

    If I had the choice.....

    ....I'd prefer to be killed by a human being every time. Oh yes, none of that robotic killing for me. I just love that human touch associated with a fleshy-but-lean, mean, killing person-machine. Trained in indiscriminate murder and over-zealous, racist, xenophobic extermination techniques. So much nicer every time than being murdered by a drone or a robot of some kind.

    Oh yes, for the discerning dead person, the distinction is really important.

  22. Velv
    Terminator

    Robots building robots

    And have robots signed up to the policy?
