Can you trust an AI data trust not to slurp your data?

Chris G
Silver badge

Another brick in the wall

The wall is in a cellar, behind the wall is a coffin that is having the final nails hammered in; inside the coffin is privacy.

Anonymous Coward
Anonymous Coward

Re: in a cellar

I'd go and rescue it, but I'm scared of the leopard...

Michael H.F. Wilkinson
Silver badge
Coat

This is clearly some strange new usage of the word "ethics" I wasn't previously aware of

to paraphrase Arthur Dent in the first episode of the HHGTTG radio play

Let's appoint a bunch of cronies from industry to write the rules in such a way as to optimise profit, and hang the consequences for the general public. What could possibly go right?

Sounds like another sell-out by government

My coat please, I want to get out of here!

deive

Re: This is clearly some strange new usage of the word "ethics" I wasn't previously aware of

An ethics committee writes procedures to get around ethical concerns. Literally the opposite of what most people would assume.

JohnFen
Silver badge

Easy answers to easy questions

"Can you trust an AI data trust not to slurp your data?"

Absolutely not.

Mark 85
Silver badge

Re: Easy answers to easy questions

The slurping is one thing... it's what is done with the slurped data that's the troubling part. Until that question is answered to the satisfaction of those being slurped, slurping should not be allowed. No "ifs", "ands", or "buts".

JohnFen
Silver badge

Re: Easy answers to easy questions

"The slurping is one thing... it's what is done with the slurped data that's the troubling part."

I consider both of those aspects about equally troubling, but the slurp is arguably the more important part. If the data isn't slurped, then it can't be abused. Once the data is slurped, there is literally no way of knowing what is being done with the data, so you can't know whether or not it's being abused. That makes the whole thing a matter of trust.

But note that when I use the term "slurp", what I mean is "spy" -- data which is collected without my consent. If I have given my informed consent, then I have decided that I trust whoever I give consent to. Without consent, there can be no trust.

Andy The Hat
Silver badge

Re: Easy answers to easy questions

By definition, AI may not be slurping your data today but, because it's AI, may decide to completely autonomously slurp data tomorrow ...

It's supposed to be AI, we're putting tax-payers' money into AI, at the very least it should demonstrate some semblance of AI ...

Or is it just a set of interpretation algorithms, lots of data and some deep data mining tools that people have rebadged ...? /cynical mode off/

JohnFen
Silver badge

Fox/Henhouse

"If large data processors like Google and Facebook help write the "ethics", they are writing the law that governs themselves."

Having two of the least ethical companies in the tech industry (where there is plenty of competition for the "unethical" crown) help to determine what is and is not ethical sounds like a fantastic idea!

alain williams
Silver badge

The wrong question ...

Some medical AI having or not having some medical data of mine is not the really important question; the real one is where else that AI passes the data, probably without my agreement and outside the scope of what the NHS trust thought it was agreeing to.

I don't want to find: myself getting spam for hypertension pills; or my mortgage rate going up; or being denied a job; or ... We all know that private data will end up in all sorts of places not envisaged, there will be flimsy excuses invented to give a veneer of justification - but the real reason will be some organisation's profit at my expense.

Mark 85
Silver badge

Re: The wrong question ...

but the real reason will be some ALL organization's profit at my expense.

FTFY.

Anonymous Coward
Anonymous Coward

@alain williams - Re: The wrong question ...

You forgot to mention the insurance companies, who can't wait to get their sticky fingers on your medical file (they already have your consumption habits). Combine this with the utter stupidity of people giving away their DNA to companies like 23andMe and you will see the magnitude of the sh*%t we're getting into. If this is the future, please stop right here so I can disembark!

Matthew Taylor

No.

"Can you trust an AI trust not to slurp your data"? No. You can't. They will slurp it all. All of it. Every last byte. Every time.

John Smith 19
Gold badge
WTF?

"post-data-ownership"

My f**king backside.

That's BS for people who want to make money out of your data without paying for it.

These guys are two-footed vermin.

JohnFen
Silver badge

Re: "post-data-ownership"

"That's BS for people who want to make money out of your data without paying for it."

I don't think paying for it is the important qualifier here -- I think getting your permission is (paying for it might be a condition you require to give permission, though).

Without them doing that, they aren't just people who want to make money from your data, they're straight-up thieves.

Anonymous Coward
Anonymous Coward

Turkeys' discussions before Thanksgiving...

Data slurping is the big hit because you can do it on the cheap and sell the harvested data for good money, over and over again. Add the fact that a vast majority of victims don't really mind, and you have a big winner. Just watch as the rest of the world jumps onto that bandwagon: it's as profitable as spam, and like spam it's not going to go away easily.

Sorry to be blunt, but it's not a few people muttering "Uh, I'd rather not" in some forum that will change that. The only thing that could (somewhat) help is a big public outcry, making politicians start worrying about their votes - more than about their future cushy public-sector jobs.

Honestly, how likely is that?

JohnFen
Silver badge

Re: Turkeys' discussions before Thanksgiving...

"Honestly, how likely is that?"

In the long run? Very likely. It is all but inevitable that the ubiquitous surveillance being forced on everyone will cause, at some point, widespread harm. When that happens, the shit will truly hit the fan.

At this point in time, though, I figure the only reasonable approach to take is that of self-preservation. When the disaster comes, I would prefer not to be a part of it, which means subverting as much surveillance as possible right now. There's no point in expecting the law to protect you at this point in time. Only tears lie at the end of that path.

Pascal Monett
Silver badge

So they've already redefined the notion of AI

And now they're redefining the notion of ethics.

Well, here's the thing: the public at large doesn't care about how the word AI is bandied about; it makes no difference to them. That public will, however, care about how their personal data is (ab)used, especially when those who know are educating the rest about the risks.

Not to mention the banking sector, which did a fine job in 2018 of educating its customers on the risks of surrendering too much data to people who are not worthy of having it.

Privacy is theft? That's where they want us to go? Well then, call me Lupin. Arsène Lupin.

Ledswinger
Silver badge

Re: So they've already redefined the notion of AI

That public will, however, care about how their personal data is (ab)used, especially when those who know are educating the rest about the risks.

By the time the public at large care enough to (try and) influence the liars, ignorami, shysters and crooks of Westminster, the free-for-all data gravy train will have long left the station. The reason that the government has appointed a panel of snout-in-the-trough rent-a-gobs to provide "technical advice" is specifically to create definitions to argue that the public sector giving people's data to private companies is in the best interests of the population at large, and thus of the individuals concerned, and wholly out of scope of the UK's GDPR-compliant rules.

Take a look at the fabricated business cases or farcical consultations associated with any government policy, and you'll see that they are NEVER about doing things correctly; these are merely abused to post-justify a poor decision taken by (at best) an Oxbridge arts graduate who didn't know anything about the decision they were taking.

For starters, if the twerps of government don't like the advice, they simply ignore it - which is why Professor David Nutt was sacked by the last Labour government, for applying science and data to the failed fifty-year-plus "war on drugs".

More generally, all the tinsel of technical advice, consultations and performance audits is merely the civil service going through a process before doing what it wanted to do in the first place. Wait a few days until the latest NAO smart meter progress report lands, and then watch how BEIS denies that the entire programme is a shambolic mess, then denies that this fact is due to the incompetent programme design by government, and further denies that the programme will deliver a fraction of the benefits alleged in the business case (and won't even consider revising the business case and concluding that the whole thing should be binned).

#define INFINITY -1

Re: So they've already redefined the notion of AI

Dave Nutt... sacked?

Mike 137

"And now they're redefining the notion of ethics"

Actually not. One recognised definition of ethics is 'the rules of conduct recognised by certain limited departments of human life' (1789) [Shorter OED 1933]. So in fact nothing much has changed since at least the late 18th century. What we should be talking about is morality.

#define INFINITY -1

Re: "And now they're redefining the notion of ethics"

According to 'Fowler's', ethics is the principle, morals the practice; by which means it is concluded that ethical and moral (adjective) are much the same. Additionally that the term 'ethical' is often used because 'moral' has connotations of sexual 'misbehaviour'. Oh for the days UK public servants read and quoted these (thought-provoking, not necessarily absolutely definitive) sources.

Adelio

Smart Meters?????

I still have no idea what use they are to the general public.

What difference does it make to me whether I have a smart meter or not? (Except that no one has to physically read the meter!)

Anonymous Coward
Anonymous Coward

It's a war just now..

Dear El Reg: before you contact me, no, I won't go on record with any of this.

Right now in the NHS it's a war, with researchers, universities and private companies all wanting patient information, all wanting it faster, all wanting it in more convenient ways for themselves. There are a limited number of those hated NHS middle managers fighting back, reminding directors of their legal and ethical obligations and pushing back. Those middle managers are mostly nearing retirement..

What we need is one NHS Trust to be HAMMERED by the ICO to put the fear of god into the rest of them; softly-softly will ultimately only result in more information going out as we are pushed to meet government-set targets or initiatives (internet access to your patient record, anyone?). Otherwise it's inevitable that YOUR information will end up in the hands of just about anyone who fancies having a look.

Tommy L

Andrew, did they turn your application down?

In all seriousness, I would have appointed you to the Board of the CDEI. I think you would have been a great counterpoint to the great and the good nodding things along.
