Sorry Cori, I respectfully disagree...
The public needs a healthy dose of realism about how America has used and will use these technologies, and how the war on terror looks on the ground where it is waged.
They also need to understand why drones are deployed so often. IEDs. If you want to blow up soldiers as they pass rather than fight them in a ground war, then don't be so surprised when their mates blow you up with a drone. Our country's first responsibility is to the men & women we send to war - whether you believe they should be there or not is completely irrelevant to that point - and we owe them the very best protection that can be provided whilst they are deployed.
they never said why Salem and Waleed were caught in the crosshairs.
Well, most likely because of a mistake. Unfortunately, mistakes happen in war - as in any other walk of life, just with bigger bangs and worse consequences. If the enemy combatants would stop hiding amongst the civilian population, or if the civilians would simply move away from the men with guns, then collateral damage could be greatly reduced. Expecting one side not to fire back is unrealistic and unhelpful.
A human fired the missiles, but did so, in part, on the software's recommendation.
And they did so in part due to standing orders, rules of engagement, and the situation in the given area. I don't follow all of ReSharper's batshit-crazy recommendations (or all I'd have are untestable static classes), and blaming the software when a human acts on its mistake is missing the point. It's why we don't allow automated firing by the AI.
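The advisory-only point can be sketched as a simple human-in-the-loop gate. All names and thresholds here are hypothetical - this is a minimal illustration of the pattern, not any real targeting system:

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    target_id: str
    confidence: float  # model's confidence, 0.0 to 1.0

def authorise(rec: Recommendation, operator_confirms: bool) -> bool:
    """The software only recommends; nothing proceeds without an
    explicit human decision, regardless of model confidence."""
    if not operator_confirms:
        return False
    # Even with human sign-off, apply a confidence floor.
    return rec.confidence >= 0.95

# A high-confidence recommendation still does nothing on its own:
rec = Recommendation("example", confidence=0.99)
print(authorise(rec, operator_confirms=False))  # False
print(authorise(rec, operator_confirms=True))   # True
```

The design point is that the model's output is one input to a human decision, never a trigger by itself - exactly the distinction between "the software recommended" and "the software fired".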
in societies where most men are armed, and insurgents are interwoven and married into civilian populations, network analysis will always make mistakes.
Those societies and men have specifically chosen a higher rate of casualties amongst their neighbours and families by living among them as enemy combatants. You spend your day shooting at soldiers and blowing them up with IEDs, then seek to complain when a drone takes out your house while you're having dinner? Frankly, that isn't a reasonable complaint to make - you made your bed, now die in it.
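The quoted claim that network analysis will always make mistakes has a statistical basis worth spelling out: when true combatants are a tiny fraction of the population, even an accurate classifier flags mostly innocents. The numbers below are entirely hypothetical, chosen only to show the base-rate effect:

```python
# Base-rate illustration with made-up numbers.
population = 1_000_000
combatants = 1_000            # 0.1% of the population
sensitivity = 0.90            # fraction of real combatants flagged
false_positive_rate = 0.01    # fraction of civilians wrongly flagged

true_positives = sensitivity * combatants
false_positives = false_positive_rate * (population - combatants)
precision = true_positives / (true_positives + false_positives)

print(f"people flagged: {true_positives + false_positives:.0f}")
print(f"share of flagged who are actually combatants: {precision:.1%}")
```

With these assumed figures, roughly 10,890 people are flagged but only about 8% of them are genuine combatants - the mistakes are baked into the maths of rare targets, not just into sloppy analysis.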
Some of Google's people seemed less concerned about moral balance than they were to avoid public discussion of the contract at all.
Moral balance doesn't mean anything. You think your morals are the correct set. I think mine are. They won't always align, so whose morals get primacy? Thus, your morals mean nothing to me, in the same way as mine mean nothing to you. You can't expect the rest of the world to work per your own moral framework. It's astounding how many seemingly intelligent people cannot grasp that simple fact.
Weaponized AI is probably one of the most sensitized topics of AI – if not THE most.
It is, and rightly so. I'm not sure anyone is yet advocating rolling out Terminator-style hunter-killers that purge a location of all humans, but that day will come eventually, unless terrorism is knocked on the head as a means of conflict. If you wish to be martyred, stand and fight like a conventional army. If you're frightened of dying, well, stop picking fights with other nations, and stop blowing up their civilians. If you don't care about, or deliberately target, their civvies, then yours will one day become fair game, or at the very least collateral damage.
Let's take a moment to review what that phrase really means today. It means your civilians were viewed as expendable to the achievement of the mission. If that mission is to stop your menfolk blowing up our families, then it's wholly understandable why it is considered preferable for our drones to blow up your menfolk. Unfortunately for you, that may be after they pop home for lunch, and while aiding and abetting them, you might get killed too.
Under President Trump, the targeting rules have been made even looser, with predictable results: over 6,000 civilian deaths last year in Iraq and Syria alone.
As upsetting as that may be, how many lives were saved by the deaths of the primary targets, the enemy combatants? Gross numbers aren't nearly as useful as net figures. How many of our soldiers' lives are worth sacrificing to avoid what may be more or fewer civilian deaths if we use planes and tanks instead?
Do we even know whether drones kill more civvies than bombers, fighter jets, helicopters, or tanks? Are some of the objections really just emotive, because there's no risk to the life of the drone pilot?
We all have a role to play in the debate about where AI should be used. But the most important audience is AI developers and engineers.
We do. And the number of soldiers I've met with serious injuries and dead friends due to IEDs leads me to believe it is preferable to deploy drones to eliminate the terrorist threat than to have our guys out there with their asses in the breeze. See, ethical and moral standpoints vary from person to person, so while you may feel they're a great decision filter, the filter comes up short once we account for interpersonal differences.
This is true mainly for the populations of wealthy nations. While you and I bicker on Twitter, buy crap on impulse, or do any of the things that figure in these TED-talk dystopias, Orwell is out there: for the poor, the remote, the non-white.
Race may correlate with drone strikes, but it's absolutely not causal. The cause of drone strikes is terrorists planting IEDs, not prayer books or brown skin.
That's why some say engineering and computer science should be regulated like the old professions: medicine and law.
And I'd completely agree with you that they should be. However, don't for a second think that would prevent the development of autonomous drones or weapons.
Could unethical uses of AI land developers in hot water? Sure.
Illegal use, sure. Unethical? Not a chance. Your ethics have no bearing on anyone's actions but your own, just as my ethical framework guides only mine. You've no more right to expect that I'll act according to your ethics than I have to expect you'll act according to mine. That's the main problem with ethics.
That's what could solve the AI ethics debate – for those with the gift to code to think about what they are building.
If what "I" build helps save the lives of our soldiers who would otherwise be blown up by a terrorist IED in some godforsaken part of the world, then I could sleep real easy at night. There is, after all, nothing that mandates these clowns hide behind their wives when the drones come calling - in choosing to do so, they choose to make their families as expendable to us as they are to them.
I don't build drone software and never have, but I certainly have no moral objection to it. Quite the opposite.
If they chose to wield their power for good, who knows what they could do?
Define good.
This is the point where simplistic and emotive rhetoric breaks down. Is it good that drones save the lives of our troops? Yes, absolutely it is, and they absolutely do achieve that. Is it good that drones end the lives of terrorists before they can kill more of us? Yes, absolutely, and again they do achieve that. Is it good that terrorists hide behind their families in an attempt to avoid the consequences of their actions? No, it isn't - but who made that choice? So whose fault is it really?
I'll get more downvotes for this than a bacon sarnie in a mosque/synagogue, but the point is there is always more than one viewpoint, and a reason why emotion must be kept out of such debates.