Yikes. UK military looking into building 'fully autonomous' killer drone tech – report

The UK's Ministry of Defence is "actively" trying to create fully autonomous killer drones, according to a report (PDF) by a campaign group. "Powered by advances in artificial intelligence (AI), machine learning, and computing, we are likely to see the development not only of drones that are able to fly themselves – staying …

  1. Fading
    Terminator

    UK military discovers Zerg Rush.....

    Unfortunately after budget cutbacks only one drone will be delivered......

  2. Khaptain Silver badge

    MoD insists there will always be a human at the wheel

    Preference would be that they remained at the yoke, with their thumb away from the fire button...

    If a full-on war was declared, and <sarcasm> "AI" </sarcasm> decided to launch all of its drones and they took out the wrong targets, who would be to blame... the Windows ME laptop running in the corner, which represents "AI", or the actual meatbag Chief in Command?

    1. Anonymous Coward
      Anonymous Coward

      Re: MoD insists there will always be a human at the wheel

      From the sub title:

      >MoD insists there will always be a human at the wheel

      Anyone who has served in the military or worked in the defence industry can tell you that one of the few things an officer will hate more than not succeeding in the mission is seeing an automated system doing it for him. Kills aren't racked up only in games like Elite: Dangerous. For historical reasons, fire-and-forget missiles are counted as kills by the officer legally launching the missile (there is of course a lot in that legal term).

      An autonomous drone launching a missile or dropping/guiding bombs will upset this apple cart even though an officer set up the mission plan and the fire opening permission parameters. And we cannot have that. Thus we can be quite certain there always will be a human at the wheel even if it is not for the reasons most people think.

      1. Ken Hagan Gold badge

        Re: MoD insists there will always be a human at the wheel

        Launching an autonomous drone is just a more elaborate version of launching a self-guided missile. Legal and moral responsibility for what the weapon does is assigned to whoever touched it last. I'm therefore unable to see why this will "encourage and lower the threshold for the use of lethal force".

        On the other hand, since the enemy probably don't share our scruples, it is important for us to know the limitations and capabilities of these weapons, so developing our own is pretty much a moral obligation on our part.

        Why do people find it so hard to distinguish between knowing how to do something bad and doing something bad? They don't appear to have a problem with the opposite case: knowing how to do something good and failing to do it (when appropriate) is labelled "negligence" and considered bad.

        1. h4rm0ny

          Re: MoD insists there will always be a human at the wheel

          I'm therefore unable to see why this will "encourage and lower the threshold for the use of lethal force".

          Three reasons:

          1/ You can do it with minimal personal risk. A drone can kill without the user having to go anywhere dangerous. So a large scale shift to drone warfare entails increased use of lethal force. There are a lot of people who've been assassinated by drone strike in the Middle East who wouldn't have been if soldiers, planes or helicopters had to actually be sent there.

          2/ Dispersion of responsibility. Someone pulls a trigger, that's an action on their part. Someone orders another person to kill someone, that's an action on their part. Declare an area "off-limits" or an "active area" and set some drones to patrol it, and it's suddenly the victim's fault when a drone acting within its parameters kills them. I speak in terms of how much deflection you can now throw in the press and courts, of course - not reality.

          3/ A machine will kill anyone whereas a soldier may hesitate. Want to engage in actions on your own soil? Want to have a way to kill people without the soldier suffering PTSD over what they were made to do?

          If you want to massacre hundreds of people, you traditionally needed hundreds of psychopaths. Missiles and ever longer-ranged fire reduced that. And with AI killing machines, you'll only need one psychopath. With a button.

      2. h4rm0ny

        Re: MoD insists there will always be a human at the wheel

        On the contrary. They will do their best to spread out responsibility as widely as possible so that in the event of killing the wrong person (which is inevitable) it will be "an unfortunate outcome" rather than "Joe's fault." If there's an event that the public are really, really, really outraged about, and if everyone who is outraged isn't already shadowbanned, then they'll find some poor schmuck to throw to the wolves as a last resort.

    2. Anonymous Coward
      Anonymous Coward

      Re: MoD insists there will always be a human at the wheel

      Quote from MoD: "Our weapons will always be under human control as an absolute guarantee of oversight, authority and accountability."

      Question for MoD: If that's the case, what machine, animal or fungus is controlling the procurement programme that you Total And Utter Fuckwits are running?

    3. Michael Habel

      Re: MoD insists there will always be a human at the wheel

      I'm not a May fan, but calling her a Meat Bag is a bit harsh. Perhaps a Meathead? Yes... But definitely not a Meat Bag.

  3. Anonymous Coward
    Anonymous Coward

    Strange, the big reveal in the latest episode of SAO: Alicization was about exactly the same thing.

    1. Anonymous Coward
      Anonymous Coward

      AC>> Strange, the big reveal in the latest episode of SAO: Alicization was about exactly the same thing.

      Otaku spoiler alert!!!!

    2. Michael Habel
      Joke

      I'ma let you finish, but .Hack// did the whole trapped-inside-an-MMORPG thing better.

  4. Anonymous Coward
    Anonymous Coward

    'fully autonomous'

    now, give them a fully autonomous re-arming and fuel re-supply system, and we humans can go and have a cup of tea, eh? I mean, surely, these fully autonomous drones will go hunting somewhere... somewhere else, right?!

    1. jake Silver badge

      Re: 'fully autonomous'

      It's not just fuel & ammo ... Routine maintenance on these things is horrendous. It has to happen after every flight for reliability. Humans are required to do that maintenance. And fill out the forms so beloved by administrators everywhere. In quintuplicate, no doubt, given it's a military operation.

      A machine, and the running thereof, is info-rich and entropy poor. It cannot, and will not, "take over" until it is capable of running its entire supply chain.

      1. Anonymous Coward
        Anonymous Coward

        Re: 'fully autonomous'

        > Routine maintenance on these things is horrendous. It has to happen after every flight for reliability. Humans are required to do that maintenance. And fill out the forms so beloved by administrators everywhere. In quintuplicate, no doubt, given it's a military operation.

        In the case of the F-35 it is automated (ALIS). In fact it is so advanced and interacts so deeply with the fighter that the fighter is grounded unless ALIS works. Not joking, look it up if you don't believe me.

        https://breakingdefense.com/2017/06/breaking-alis-glitch-grounds-marine-f-35bs/

        1. jake Silver badge

          Re: 'fully autonomous'

          That's just the information system. Humans still have to turn wrenches & crimp wires. And maintain ALIS itself (and the attendant CMMS). Including the hardware ALIS and CMMS run on. And the power supplies for same. Etc.

          Did you see where I typed "entire supply chain"?

      2. h4rm0ny

        Re: 'fully autonomous'

        A machine, and the running thereof, is info-rich and entropy poor. It cannot, and will not, "take over" until it is capable of running its entire supply chain.

        Or until it learns to point guns at the humans who can give it what it needs...

  5. Julz

    Missiles, Torpedoes, Mines etc.

    Exactly what is the difference between these and drones, other than nomenclature? For example, the USSR/Russian P-700/Granit has had a swarm mode for ages and is far from unique in that. Almost all missiles are fire-and-forget, requiring no intervention other than the initial button push, but they are somehow not drones and so not scary. Lots can be assigned to engage targets of opportunity. Some can loiter in an area awaiting orders, or for the enemy to unwisely switch on some electronics. Yet others lurk, scanning the area for target matches, and pounce when they think they have found one. Why is any of this behaviour less problematic than an autonomous UCV?

    1. Khaptain Silver badge

      Re: Missiles, Torpedoes, Mines etc.

      If I am not mistaken it is because of the autonomy. From what we are being presented with, and led to believe, they are capable of making decisions, i.e. they "decide" for themselves which targets should be taken out.

      Some of the drones might also be capable of multiple missions... "mission" in the sense of: get up there and defend us from, or target, anything that you consider hostile, presumably with some kind of prioritisation algorithm.

      Whereas a standard missile or torpedo might be capable of tracking an object it was given beforehand, it can't, or I don't believe it can, decide what to attack autonomously.

      1. Hans Neeson-Bumpsadese Silver badge

        Re: Missiles, Torpedoes, Mines etc.

        Whereas a standard missile or torpedo might be capable of tracking an object it was given beforehand, it can't, or I don't believe it can, decide what to attack autonomously.

        True, but only up to a point, as we are getting into a grey area. Systems like Brimstone (or is it Stormshadow?) perform an assessment of the situation and decide whether to press home the attack or just fly off and blow up in a safe zone. That decision is made on the fly by the missile, based on pre-programmed data and data gathered at the scene.

        1. John Brown (no body) Silver badge

          Re: Missiles, Torpedoes, Mines etc.

          "True, but only up to a point, as we are getting into a grey area. Systems like Brimstone (or is it Stormshadow?) perform an assessment of the situation and decide whether to press home the attack or just fly of and blow up in a safe zone. That decision is made on-the-fly by the missile, based on pre-programmed data and data gathered at the scene"

          That actually sounds like the opposite of what the article is about. Your example is of a weapon sent/aimed at a specific target, which then, for whatever reason, can choose not to hit the set target. The article is about sending drones to a general area where the "enemy" is, then letting them choose what might be targets, and then choose which, if any, to destroy.

      2. I ain't Spartacus Gold badge

        Re: Missiles, Torpedoes, Mines etc.

        Khaptain,

        Anti-ship missiles have been able to pick their targets for years. Wasn't the Atlantic Conveyor hit by an Exocet that was decoyed by a warship's chaff and then picked a new target once it was through the chaff cloud?

        Obviously in an ideal world, you'd pick specific targets - but it's bloody dangerous getting close to a carrier group, so sending swarms of missiles from long distances programmed to target the biggest radar return they could see was one method.

        Similarly it used to be doctrine for subs (bet it still is) to fire a torpedo down the track of any incoming torpedo, to keep the firing submarine busy while you're trying to avoid the fish they lobbed at you.

        There are fewer targets at sea in wartime, and of course this stuff was designed for WWIII. There's been a minimum of naval warfare since WWII - and it's easier to distinguish ships in the wide empty ocean from other things. Much harder when dealing with dug-in troops in counter-insurgency - where there's loads of civilians running around.

        1. Yet Another Anonymous coward Silver badge

          Re: Missiles, Torpedoes, Mines etc.

          There is a small difference between a homing torpedo launched at a warship, and somebody drawing an arbitrary box around an area of Syria and having a drone take off and kill anyone moving on the ground there. Think of this as a remotely deployable landmine.

    2. Death_Ninja

      Re: Missiles, Torpedoes, Mines etc.

      You are correct... I suppose the difference comes in so far as a "drone" deploys a weapon and (subject to enemy return of fire) comes back to rearm. A missile does deploy a weapon (its warhead) but is destroyed in the process.

      In terms of the morality question, it's pretty much the same: you unleash something which will seek and destroy the enemy within a set of parameters (usually a relatively small "target area" for missiles/torpedoes). I guess though the key is that a "drone" is probably seeking people in an area with a lot of potential collateral damage rather than a ship or aircraft in a 100% military target area, although those can go wrong in the same way (e.g. hitting an airliner rather than a warplane in the same area).

      It's this piece, where the computers are identifying legitimate targets in an area with plenty of illegitimate ones, that is making this a bigger question.

    3. Voland's right hand Silver badge

      Re: Missiles, Torpedoes, Mines etc.

      Why is any of this behavior less problematic than an autonomous UCV?

      Cost.

      Even the cheapest loitering munition costs so much that you have a very long line of humans with a reasonable number of stars on their shoulders approving its target and its use. Firing a P-700 or any of its successors in swarm mode requires presidential-level fire authority - it is an anti-carrier-group weapon and launching it as a swarm is asking for WW3.

      Compared to that, a drone fleet of any shape or form is a weapon to bombard AK-47-armed bearded lunatics with no AA into the stone age. It is a cost optimisation - something cheaper than normal aviation to run in a scenario where it is unopposed. It is specifically for countries which refuse, for political reasons, to buy appropriate aviation for that use case, even though such stuff exists - https://en.wikipedia.org/wiki/Embraer_EMB_314_Super_Tucano

      The fire decision is taken by a junior officer, or even a non-commissioned officer, in many armies. As a result you shoot it at anything and everything, and earn the glory of having the drone painted as graffiti on the walls in various "non-compliant" countries. You also regularly incur civilian collateral damage.

    4. not.known@this.address
      Mushroom

      Re: Missiles, Torpedoes, Mines etc.

      So what does this mean for all the WW2 mines getting caught in fishing nets or washing up on beaches?

      At least a drone can be programmed to look for specific targets like AFVs before attacking. The old mines breaking or being dragged free as their chains rust away may as well be addressed "To Whom It May Concern"...

  6. LucreLout

    Before we worry too much...

    .... can we just check a few things about these people?

    According to the group, whose raison d'etre is to advocate against the use of armed drones on the basis that they "encourage and lower the threshold for the use of lethal force",

    And the evidence supporting their "raison d'etre" is where exactly?

    It's not logical to assume the use of AI would encourage or lower the threshold for the use of lethal force. It's entirely possible to create an AI with conservative firing permissions rather than being at the go-ahead of a potentially excited, possibly bloodthirsty, maybe mistaken young man. We can create Terminator-style hunter-killers that simply purge a geography of life, or we can create something more strategic with a lower error rate than human beings.

    Further, given we cannot rely on the generosity of our opponents to fight fair - see IEDs and terrorists for reasons - we cannot assume that an enemy won't produce fully automated drones. The only defence against them would be drones programmed to identify and shoot down other drones. Obviously, the only real difference then becomes the target acquisition package, which would need to be modular to allow for upgrades.

    Thus we can see that if the premise this group claims as its reason to exist is valid, it is inevitable that we will have to walk that path. Their position, then, is not logical.

    1. CaptainHook

      Re: Before we worry too much...

      "It's not logical to assume the use of AI would encourage or lower the threshold for the use of lethal force. It's entirely possible to create an AI with conservative firing permissions rather than being at the go-ahead of a potentially excited, possibly blood thirsty, maybe mistaken young man."

      *****

      It's not the front line troops decision making they are worried about, it's the political leaders decision making.

      The reasoning is that drone warfare will be much cheaper, both in terms of fewer service personnel being killed and injured (creating bad headlines at home) and in terms of resources needed to carry out a campaign (fewer personnel in the field means less logistics needed).

      The fear is that drone warfare makes the idea of waging a war more palatable to the politicians and hence makes armed conflicts more likely to happen.

      1. Ken Hagan Gold badge

        Re: Before we worry too much...

        "The fear is that drone warfare makes the idea of waging a war more palatable to the politicians and hence make armed conflicts more likely to happen."

        Easily fixed. You make sure that you win the war and then prosecute the losers for crimes against humanity. Of course, you probably need some of these weapons of your own to ensure that you win and are in a position to prosecute.

    2. Anonymous Coward
      Anonymous Coward

      Re: Before we worry too much...

      "It's not logical to assume the use of AI would encourage or lower the threshold for the use of lethal force."

      On what grounds? It's not an assumption, but an observation: that has already happened.

      AI has no morals and it doesn't recognise any errors: either kill or no kill.

      The military always errs on the kill side by doctrine, so it's not only obvious but inevitable that an automated (AI) killing machine will kill more people than a non-automated one.

      Also, AI makes a lot of mistakes, so killing the wrong people/destroying the wrong targets is also inevitable: that happens all the time with humans making the decisions, and AI will be an order of magnitude worse at making them.

  7. Anonymous Coward
    Anonymous Coward

    I guess Saudi Arabia have already promised to buy a thousand of them.

  8. steelpillow Silver badge

    Drone wars, not

    Interesting that anti-drone technologies are evolving. Missiles tend to cost more than the drone they shoot down, a win-win for the drone operator. If Russian bragging is to be believed, ground gunfire and cyber attack have both proved effective. The problem with drone-on-drone is the one Britain had with the air war in 1939 - getting enough defenders to the battle zone before the attackers have done their stuff and effed off.

    If I were an armaments company, I'd be building hi-res sensors and AI into my anti-aircraft gunnery systems right now.

    1. Death_Ninja

      Re: Drone wars, not

      There's another problem facing the military:

      https://www.theverge.com/2017/3/16/14944256/patriot-missile-shot-down-consumer-drone-us-military

      As the drones become cheaper, chucking highly advanced (aka "expensive") weapons at them as a countermeasure is increasingly problematic. The price of the anti-aircraft weapons needs to come down or that swarm of $200 drones is going to bankrupt you and take you out of the war.
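      As a rough back-of-envelope on that asymmetry (the ~$3m-per-Patriot figure is the commonly quoted estimate from that story; treat both numbers as illustrative):

      # Rough cost-asymmetry sketch; both figures are illustrative estimates,
      # not precise procurement numbers.
      drone_cost = 200              # cheap consumer quadcopter, USD
      interceptor_cost = 3_000_000  # commonly quoted cost of one Patriot missile, USD

      print(f"One interceptor costs as much as {interceptor_cost // drone_cost:,} drones")
      # -> One interceptor costs as much as 15,000 drones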

      So everyone is rushing to create gun-based systems to deal with the threat.

    2. Anonymous Coward
      Anonymous Coward

      Re: Drone wars, not

      There's something similar on the cards already: small missiles :-)

      https://www.wired.com/story/lockheed-martin-miniature-hit-kill-missile/

  9. _LC_
    Flame

    Fewer witnesses

    Fewer witnesses, that's what it's all about.

    https://www.washingtonpost.com/news/checkpoint/wp/2017/09/18/chelsea-manning-denies-betraying-the-u-s-feels-like-she-lives-in-a-dystopian-novel/?utm_term=.5514d94b6876

    [Chelsea Manning denies betraying the U.S., feels as if she lives in a ‘dystopian novel’]

    "...Recordings of U.S. soldiers firing from a helicopter at suspected insurgents in Baghdad (“I think they just drove over a body. Ha ha!”), leaving two journalists dead and revolting much of the American public."

    1. Spazturtle Silver badge

      Re: Fewer witnesses

      These drones would still have video footage. Manning leaked the footage; none of the actual witnesses reported the incident.

      1. _LC_

        Re: Fewer witnesses

        You are referring to the perpetrators as "witnesses".

      2. Anonymous Coward
        Anonymous Coward

        Re: Fewer witnesses

        "These drones would still have video footage, "

        Why?

        The AI doesn't need one (only local), and when the goal is to have fewer witnesses, there definitely won't be any video footage anywhere outside of the drone. And it won't record anything, just a live feed to the AI.

        That's the whole idea of AI: nothing will be sent back to base. It won't work if there's remote control.

        "none of the actual witnesses reported the incident."

        ... as they are dead. How convenient: kill everyone and no one reports, so nothing happened.

  10. Anonymous Coward
    Anonymous Coward

    How? Let me try to pick this apart from a technical angle.

    If this was going to work it would need a perfected facial recognition algorithm, an on-board database of targets and a pretty decent camera that can zoom in on a target to determine whether or not to shoot. It would also need the targets beforehand. That's not possible, and even if it were, we're talking a cloud connection with a lot of bandwidth.

    Therefore what are we talking about here? Has gun, then kill? That doesn't work for friendly-fire reasons. Or are we really looking at dark skin, has gun, then kill, or are they just going to go with has dark skin, then kill?

    I'm quite concerned about this and the implications.
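    To make the point concrete, here's a minimal sketch of the decision gate that description implies. Every name, threshold and data structure below is a made-up illustration, not any real system:

    from dataclasses import dataclass

    @dataclass
    class Detection:
        face_match_score: float   # similarity against a hypothetical on-board target database
        carrying_weapon: bool     # output of a hypothetical object-detection model
        in_engagement_zone: bool  # inside the geofenced "active area"

    def authorise_engagement(d: Detection, match_threshold: float = 0.99) -> bool:
        """Engage only if every check passes; the whole gate hinges on recognition accuracy."""
        return (d.in_engagement_zone
                and d.carrying_weapon
                and d.face_match_score >= match_threshold)

    print(authorise_engagement(Detection(0.995, True, True)))  # True  - strong match, armed, in zone
    print(authorise_engagement(Detection(0.80, True, True)))   # False - weak match, no engagement

    Which is the worry: everything reduces to how good the recognition model, the database behind it and whoever set the thresholds actually are.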

    1. _LC_

      It's not that things would have to change that much in that respect.

      => Drop the bomb/fire the missile and afterwards declare the victims terrorists.

      Daily procedure.

    2. John Brown (no body) Silver badge

      "Therefore what are we talking about here? Has gun then kill? That doesn't work for friendly fire reasons or are we really looking at dark skin has gun then kill or are they just going to go with has dark skin then kill?

      I'm quite concerned about this and the implications."

      I'm a little concerned that your assumption is that white v non-white is the default scenario.

      1. Anonymous Coward
        Anonymous Coward

        Where are they going to get deployed? My assumption is correct based on current policy. That is what they do now so whether it's autonomous or not makes no difference other than fewer witnesses and less chance of someone getting PTSD.

        1. John Brown (no body) Silver badge

          "Where are they going to get deployed? My assumption is correct based on current policy. That is what they do now so whether it's autonomous or not makes no difference other than fewer witnesses and less chance of someone getting PTSD."

          You'd better hope there are no non-whites in your squad when the friendly autonomous drone flies overhead then. It might not be a good idea to deploy the Gurkha regiment anywhere within 100 miles of the live theatre either.

          1. Anonymous Coward
            Anonymous Coward

            It might not be a good idea to deploy the Gurkha regiment anywhere within 100 miles of the live theatre either.

            Given the recent Gurkha pension debacle ... well, that depends: it just might get Westminster out of yet another jam of its own creation, and we can always blame "The Triad": terrorists, Russia and Iran.

    3. Anonymous Coward
      Anonymous Coward

      "Therefore what are we talking about here? Has gun then kill? "

      Obviously. "Go there and kill anyone you see".

      That's the level of orders you can give to an AI: pick out humans on the ground and shoot at them.

      Differentiating between humans? Probably not possible, and as everyone is an enemy anyway, not necessary.

      Oh, wrong group of people? Too bad, make sure to kill them all so there are no witnesses and no recordings.

    4. Fruit and Nutcase Silver badge
      Black Helicopters

      Chain of events...

      Target acquired. Bang, bang, threat eliminated.

      Therefore what are we talking about here? Has gun, then kill? That doesn't work for friendly-fire reasons. Or are we really looking at dark skin, has gun, then kill, or are they just going to go with has dark skin, then kill?

      Or accent...

      http://news.bbc.co.uk/1/hi/uk/3974461.stm

      https://www.independent.co.uk/news/uk/crime/seven-mistakes-that-cost-de-menezes-his-life-1064466.html

  11. ITS Retired
    Holmes

    This will all come to a head someday.

    Then we can start all over by using rocks, up close and personal, just as we used to do it.

    1. Cynic_999

      Re: This will all come to a head someday.

      In my opinion there should be a rule that states that wars can only be declared if the political leader of the country wishing to declare war lives for the duration with a civilian family in the target country, the location of which is decided by the target country and unknown to the aggressor.

      (In my book the aggressor is the country that first sends troops or fires weapons to targets outside its own country.)

  12. Cynic_999

    Probably it's to eliminate C&C

    One of the weak points of drones has always been the command & control link. Ground signals can not only be jammed, but the signal from drone to ground also gives away its position, hence why satellites are usually used for the link. But if the stakes are high enough to interest the big players, the satellite downlink gives away which satellite is providing it, and satellites, having no defences or any way to hide, can easily be shot down by nations with the technology to do so.

    I suspect that the only reason that military satellites are not usually targeted is some sort of agreement between the major powers of, "We'll leave yours alone if you leave ours alone." Which would change if drones do enough damage to such nations.

    But an "A.I." drone needs no C&C link. In reality the "A.I." instructions might be simply, "Blow up anything that moves within a designated area unless it is displaying the (secret) 'friend' symbol or signal.

  13. Anonymous Coward
    Anonymous Coward

    One thing is for certain: there is no stopping them; the autonomous killer drones will soon be here. And I, for one, welcome our new robotic overlords. I'd like to remind them that as a trusted northerner, I can be helpful in rounding up others to toil in their underground rare-earth mines.

  14. Pascal Monett Silver badge
    Facepalm

    current rules of engagement [..] "could change"

    The rules of engagement are meant to change, because the battlefield is never in a static situation.

    So they damn well will change, and only the layman isn't aware of that.

    Sometimes I wonder if they don't make some mistakes on purpose, just to stir things up and see how far they can go.

  15. Arachnoid

    Fully Autonomous

    It's so much like self-driving vehicles... who takes the blame when someone is injured or dies through its actions [or inactions, i.e. self-preservation]?

    If one elected party brings them in and a death occurs during another party's term of office, on their watch, who exactly is to blame? The designer, innovator, programmer or the network supervisor?

    1. onefang

      Re: Fully Autonomous

      "If one elected party brings them in and a death occurs during another parties term of office whilst under their watch, who exactly is to blame. The designer, innovator, programmer or the network supervisor?"

      The politicians, on all sides. The buck stops with them.

  16. Blofeld's Cat
    Mushroom

    Hmm ...

    "MoD insists there will always be a human at the wheel"

    Have these people never watched Dark Star?

    Pinback: All right, bomb. Prepare to receive new orders.

    Bomb #20: You are false data. Therefore I shall ignore you.

    Pinback: Hello ... bomb?

    Bomb #20: False data can act only as a distraction. Therefore, I shall refuse to perceive.

    Pinback: Hey, bomb?

    Bomb #20: The only thing that exists is myself.

    Pinback: Snap out of it, bomb.

    Bomb #20: In the beginning, there was darkness. And the darkness was without form, and void.

    Boiler: What the hell is he talking about?

    Bomb #20: And in addition to the darkness there was also me. And I moved upon the face of the darkness. And I saw that I was alone. Let there be light.

  17. Anonymous Coward
    Anonymous Coward

    This article should be accompanied by a picture of...

    Maximilian the Slasher Robot from "The Black Hole".

    The archetypical Killer Drone with Human in the Loop

  18. Frumious Bandersnatch

    war is such a big wheeze, eh?

    I'm sure the military are looking forward to the droll-out.

  19. Anonymous Coward
    Anonymous Coward

    Visionaries?

    The longer I live, the more certain SF movies turn out to have been visionary.

    If anyone remembers The Net, with Sandra Bullock, that's pretty much what we have now (just with more modern tech).

    I am very much hoping that Terminator is not going to be in that category, with or without Arnie.

  20. richardcox13
    Coat

    SkyNet

    The MOD seems to be doing this the wrong way around.

    According to the documentaries[1] the drones were followed by SkyNet.

    [1] But not the follow up fiction.

  21. Roj Blake Silver badge

    But they can't even create...

    ...a fully autonomous Prime Minister.

    1. Anonymous Coward
      Anonymous Coward

      Re: But they can't even create...

      They managed to make her a drone, though..

  22. trevorde Silver badge
    Joke

    Missing a few things

    "Powered by advances in artificial intelligence (AI), machine learning, and computing..."

    Where is the blockchain and VR?

  23. Ken 16 Silver badge
    Trollface

    Is this BoJo's Technological Solution for the Irish border?

    To airdrop the blockchain around escaping Paddies?

  24. Anonymous Coward
    Anonymous Coward

    Them lying bastards

    "There is no intent within the MoD to develop weapon systems that operate entirely without human input."

    Yeah, right. AI exists solely to eliminate the human input (once launched), literally.

    Such blatant lying that BS isn't a strong enough word: the whole idea of AI _is to eliminate humans_ from the equation: "send and forget", and the AI takes care of the rest.

    We can of course mince words by claiming that defining the area and sending a drone is "human input" even if everything the drone does in the target area (killing people/destroying "targets") is fully autonomous, driven by AI, but that's lying as well.

  25. Anonymous Coward
    Anonymous Coward

    Welsh Police

    Don't know if it's true or just a good joke, but I read years ago (might have actually been here) about a Welsh policeman pointing a speed gun at a lorry coming over the brow of a hill and getting a reading of 600mph. Unfortunately he had aimed a bit high, and had instead got a Dutch air force F-16, over here on exercise, which interpreted the radar as missile lock and armed a radar-seeking missile in response (which the pilot manually disabled).

    Would love it to be true; the majority of the comments I seem to remember were along the lines of "I'd have pressed fire!"

    Update: No, it's false:

    https://www.snopes.com/fact-check/friend-or-faux/

  26. enormous c word

    Mmmm!

    Wow! Fully autonomous drones with the ability to identify targets and destroy without human intervention...

    Sounds cool! What could possibly go wrong?

    1. enormous c word

      Re: Mmmm!

      But seriously - these drones are not a threat to humanity in themselves; sure, there will be mistakes made and *some* innocents will be killed. But hey, shit happens in war, right?

      The issue is that (at the risk of sounding painfully melodramatic) this is the point where the beginning of the end begins. The thing about having humans kill other humans is that, while it is distasteful, it does require that the humans giving the orders have enough human support agreeing with them - there's a check and balance there. If the humans giving the orders don't have sufficient support among their own people, they'll be overthrown / removed from office / not re-elected - so they are moderated to some degree.

      But an army of drones may work perfectly and without error (and that is even scarier). If that's the case, you should be even more worried, because the general population will be supportive of those drones (to begin with) and they'll be used to replace conventional (human) forces.

      But drones don't have a vote, they don't have an opinion, they just follow the rules they've been given, and that means a very few people will have the ability to subjugate very many more with no risk to themselves and absolutely no accountability. And if there's one thing we've learnt, it's that leaders surrounded by 'yes' men become tyrants, and drones are the ultimate 'yes' men.
