Well this is fantastic. I guess we've entered the age where the Government will be able to censor material from the public without the public ever really knowing.
All in the name of terrorism and our safety of course.
UK Home Secretary Amber Rudd has announced a tool that purports to detect and block jihadist content online, and tech companies may end up being legally required to use it. London-based firm ASI Data Science was handed £600,000 by government to develop the unnamed algorithm, which uses machine learning to analyse Daesh …
Or rather, what we have actually done is fund another private firm whose owners/founders *know* people in the civil service (and within a couple of weeks this will come out), along with independent tests showing just how crap this 'amazing' tech actually is. Rudd has put her name to it, so we should be pretty sure of the outcome.
Wonder how much from Al Jazeera will get incorrectly flagged?
I wonder how much from Albertson's will get incorrectly flagged.
This will probably put the lid on the Theresa May bits where she wonders about shorting those people engaged in Labour; just when the exact opposite of what it supposedly discusses (plus honorifics and NHS LSD microdosing ad libitum for the people who voted stay, and not waiting to TKO the rest of the wait, or EU exit, or probably poorly coined political terms in general) is earnestly needed.
The department claimed the algorithm has an "extremely high degree of accuracy", with only 50 out of a million randomly selected videos requiring additional human review.
1) 'Randomly Selected' - Permanent Undersecretary of State's Pornhub bookmarked favourites?
2) '50 out of a million requiring additional human review' - there were some Pornhub videos he hadn't watched yet?
3) 'extremely high degree of accuracy' - CivilServantSpeak for 'Capita says they have fixed the bugs'.
I guess we've entered the age where ...
Where have you been this past decade?
Of course I agree with the point you were making. It's your wording I take issue with.
This post has been deleted by its author
Actually it is not 'used' by the Government at all. The purpose of the software is to intercept the upload process and prevent the video actually making it online to Vimeo, YouTube etc., who I presume would be requested to integrate the ASI software into their sites.
More information here
Well this is fantastic. I guess we've entered the age where the Government will be able to censor material from the public without the public ever really knowing.
Yes, the slippery slope has been trod upon, and there's no telling how steep it is or how slippery. What's next... political parties added to the mix? Anything the government takes a whim to ban? I'm surprised a certain leader hasn't jumped in on this about "fake news"....
Automated
Detection of
Extremist
Propaganda
Transmission
There you go. Snappy, took 2 minutes to come up with, apt... belies the reality. That'll be £6 million please. More likely:
British
Internet
Governance
Jihadi
Information
Zerg
Upload
Monitoring
I can suggest names for free. "Big Brother" or "Maybot".
The name Maybot is already in use.
Sadly, I found Maybot to be rather unconvincing. It made far too much sense and showed too much human feeling to pass for the real thing. But with further development it can only improve.
CensorBot.
Soon to be forked by every other Government Department, Quango and busybody who thinks they are entitled to block random content regardless of the actual legislation. (CEOP, FACT, Movie Pigopolists, City of London Plod - I'm looking at you)
The Capita cubicle slaves in some dodgy offshore tech hub are going to be busy operating the master blacklist that it really runs off. The "AI" is just a randomiser on top.
What does that actually mean? Either it detects something as what it's looking for or it doesn't. If it detects 94% then that's a meaningful figure. But what does "with 99.995% accuracy" mean? Unless it's a means of saying it has 0.005% false positives - which they could say more explicitly - I can't see that it has any meaning at all. I would instinctively distrust anyone who produces a statement like that. OTOH I suppose there might have been something meaningful that went into the Rudd "I don't really understand it but it went something like this" regurgitation mill.
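For what it's worth, the department's "50 out of a million" line only squares with "99.995% accuracy" if you read the accuracy claim as a false-positive rate. A quick back-of-the-envelope sketch (that reading is my assumption, not anything the Home Office has spelled out):

```python
# Sketch: what a 0.005% false-positive rate implies, assuming the
# department's "50 out of a million" figure is a count of false positives
# sent for human review. Purely illustrative; not measured values.

videos_checked = 1_000_000
false_positive_rate = 0.005 / 100  # 0.005%, i.e. "99.995% accuracy"

flagged_for_review = videos_checked * false_positive_rate
print(flagged_for_review)  # matches the quoted "50 out of a million"
```

Which tells you nothing about how much actual jihadist content it misses, of course.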
I wish she and Davis would swap jobs. He seems to have his head screwed on right about the Home Office and its doings while she seems sound on Brexit.
Even humans aren’t that accurate.
If you’re positively identifying terror videos at that rate, you must have a huge false-positive rate. So a movie like Mad Max Fury Road, featuring fighty young people with big guns on dusty desert roads, would probably be flagged as jihadi propaganda.
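The base-rate problem here can be sketched with made-up but plausible numbers (the daily upload volume and prevalence below are my assumptions, not figures from the article; the 94% and 0.005% are the claimed rates):

```python
# Base-rate sketch: even a tiny false-positive rate produces a lot of
# wrongly flagged videos when almost everything screened is innocent.
# daily_uploads and jihadi_fraction are illustrative assumptions.

daily_uploads = 500_000          # assumed videos uploaded per day
jihadi_fraction = 0.0001         # assume 1 in 10,000 is actual propaganda
true_positive_rate = 0.94        # the claimed 94% detection rate
false_positive_rate = 0.00005    # the claimed 0.005%

actual_bad = daily_uploads * jihadi_fraction
innocent = daily_uploads - actual_bad
caught = actual_bad * true_positive_rate
wrongly_flagged = innocent * false_positive_rate

# What fraction of everything flagged is actually propaganda?
precision = caught / (caught + wrongly_flagged)
print(f"flagged per day: {caught + wrongly_flagged:.0f}, "
      f"precision: {precision:.0%}")
```

On those assumed numbers roughly a third of what gets flagged is innocent content (your Mad Max uploads), and that's before anyone games the test set.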
"You laugh but an episode of Peppa Pig is banned in Australia...
(for saying spiders aren't dangerous, basically, which is a little bit misleading in Oz...)"
There are only three spiders in Oz dangerous to people: the funnel-web spider, the jumping spider, and the redback spider. The jumping bird spider is only dangerous to birds, and that one that went viral carting a mouse up the side of a 'fridge, well, it was just helping its mate get to the cold cheese. As for the so called dangerous white tail spider, that one's a myth.
"As for the so called dangerous white tail spider, that one's a myth."
Whitetails are aggressive (most spiders try to run away; white tails attack) and, whilst not particularly venomous, seem to have pretty filthy fangs, which lead to a high chance of infection when they bite.
A friend in New Zealand lost a finger (actually the entire metacarpal, back to the wrist) after being bitten by one. Apparently it still aches 20 years later.
> 99.995% is impossible
Not at all impossible for the tests they'll have run.
If you've _very_ carefully curated your test content, with an eye to claiming a high headline effectiveness rate, you could quite easily score damn near 100% (though you don't want 100%, because people would question that). Of course, the Government would *never* massage figures, so it couldn't possibly be that they're using the best result from a test designed to prove it works (rather than look for failure scenarios).
In real world conditions, definitely not going to happen.