If the police have got one...
...it has already fallen into the wrong hands.
A secretive unlocking tool offered to cops and government agents has some computer security bods worried over its privacy implications. Known as GrayKey, the box is reportedly being marketed as a way to unlock iPhones without needing the key code. The hardware is reportedly offered in two forms: an internet-connected model …
The system PROBABLY works by overwriting the Apple CPU's (a mostly ARM-based design) instruction cache at the point where the password-entry counter is kept, which indicates when to lock the phone. The on-chip operation is similar to the following:
Enter_Password;
If Password IS_INCORRECT then
    Increment( Password_Entry_Count );
    if Password_Entry_Count IS_GREATER_THAN FOUR then
        Lock_Phone_Permanently
else
    Allow_Entry_Into_Phone;
---
You just have to keep setting the hard-coded password-entry count BACK to ZERO so that ALL combinations of the entry code (10,000 combinations for the four-digit codes on many Android and older Apple phones) can be tried until the phone opens.
Enter_Password;
JUMP_TO BYPASS_MEMORY_LOCATION    <<< Insert jump code here
If Password IS_INCORRECT then
    Increment( Password_Entry_Count );
    if Password_Entry_Count IS_GREATER_THAN FOUR then    <<< or reset hard-coded password count
        Lock_Phone_Permanently
else
    BYPASS_MEMORY_LOCATION: Allow_Entry_Into_Phone;
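The bypass described above amounts to zeroing the retry counter before every guess, so the lockout branch never fires. Here is a toy Python simulation of that idea; all names are invented for illustration, and no real phone keeps its counter in plain attacker-writable memory like this:

```python
# Toy model of a PIN check with a retry counter, plus a brute force
# that defeats the lockout by resetting the counter before each guess.
# Purely illustrative; every name here is invented.

class ToyPhone:
    def __init__(self, pin):
        self._pin = pin
        self.entry_count = 0
        self.locked = False

    def try_pin(self, guess):
        if self.locked:
            return False
        if guess != self._pin:
            self.entry_count += 1
            if self.entry_count > 4:
                self.locked = True    # Lock_Phone_Permanently
            return False
        return True                   # Allow_Entry_Into_Phone

def brute_force(phone):
    """Try every 4-digit PIN, zeroing the counter before each attempt."""
    for candidate in range(10_000):
        phone.entry_count = 0         # the "jump over the increment" step
        if phone.try_pin(f"{candidate:04d}"):
            return f"{candidate:04d}"
    return None

phone = ToyPhone("4821")
print(brute_force(phone))  # finds "4821" without ever locking the phone
```

At 10,000 candidates this loop is trivial; the entire difficulty on real hardware is in getting write access to the counter in the first place.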
For FINGERPRINT and FACE-RECOGNITION-based phones, you just have to find the memory location where the HASH-CODE value for the fingerprint and/or faceprint digital signature is kept, copy that hash code, and present it to the phone at the cache location where it is expected.
The phone interprets the copied data as the real code and unlocks the phone even IF the hash code is encrypted. Just make sure you feed your own or a specially re-encrypted hash-code value BACK to the faceprint or fingerprint decryption and recognition algorithm.
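What the commenter is describing is a classic replay attack: if a check merely compares a stored template hash against a presented value, anyone who can read that stored value can re-present it and pass. Real secure-enclave designs are built specifically to prevent this, so treat the sketch below as an illustration of the flawed design, not of any shipping phone; all names are invented:

```python
import hashlib

# Invented names, illustrating why "compare against the stored hash"
# is replayable by anyone who can read the stored hash.
stored_template_hash = hashlib.sha256(b"alice-fingerprint-features").hexdigest()

def naive_match(presented_hash):
    # The flawed design: the check only compares the presented value
    # with the stored template hash.
    return presented_hash == stored_template_hash

# Legitimate path: the hash is derived from a fresh sensor reading.
fresh = hashlib.sha256(b"alice-fingerprint-features").hexdigest()
print(naive_match(fresh))                  # True

# Replay path: an attacker who dumped the stored hash needs no finger.
print(naive_match(stored_template_hash))   # True -- the replay succeeds
```

This is why real biometric pipelines keep the template inside tamper-resistant hardware and bind unlock decisions to it, rather than exposing a comparable hash to the application processor.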
You can also electrically SHORT the YES/NO circuit pathway in the phone's bootstrap operation, so that the phone thinks the face recognition, faceprint or unlock code has ALREADY been entered and validated, simply by overwriting the memory location WHERE the accept/reject security-credentials branch occurs with an appropriate JUMP CODE that BYPASSES the entire verification process.
The microcode where this occurs is loaded from the encrypted BIOS portion of Apple phones and put into a secured cache area, which can be OVERWRITTEN with the appropriate JUMP instructions. If the phone tries to verify the loaded instructions via a hash-code comparison with a hard-coded digital signature or other credential, just overwrite the comparison process with a bypass JUMP code and continue on to the rest of the bootstrap process.
NOT THAT HARD TO DO !!! Even with the rather secure bootstrap process of Apple Phones which use MULTIPLE verifications during boot-up.
If you have physical access to the phone, then you can short specific pins on the PIC and get it to release your asymmetric key under various scenarios, to figure out HOW the key is created so you can recreate your own new key for presentation to the CPU.
I highly doubt that Apple or Google would keep a full key on a PIC but rather create a NEW key based upon parts of the obtained biometric signature AND from internal hardware serial numbers/signatures on an as-needed basis. You just want the ALGORITHM and THEN you can recreate your own keys for ANY type of phone you are trying to dissect or decrypt.
"Like people ever get their stuff back after it's stolen, er seized, by the police..."
I'm sure your high priced lawyer will be able to get it returned.
Wait, you are rich enough to afford a high priced lawyer aren't you? Because otherwise you're poor and there's probably a law against that.
That hasn't changed.
If someone you don't trust has unrestricted access to your phone, for two hours continuously...
... it's over. Forget it.
Historically this has always been true. It seems there was a brief period when we were all anxious to pretend it no longer applied, but that was only ever an illusion.
"Historically this has always been true. It seems there was a brief period when we were all anxious to pretend it no longer applied, but that was only ever an illusion."
I'll say! Back in my day we had to tattoo our secret stuff inside our butt cracks. Then the bulls got wise, and it was "Spread 'em!" every time!
The company has already taken the money from buyers, who will have a useless device when the exploit is patched in a new version of iOS.
If they can get hold of a new 0 day they can sell law enforcement an update (or whole new device) to work with that new version of iOS and get paid again. Lather, rinse, repeat.
Seems like they have their business plan pretty well figured out...at least until Apple fixes whatever underlying problem is allowing them to apparently guess passwords at wire rate!
Chief of Police replies, "No, it cannot be stolen. We keep it locked up in our secure vault, right beside the illegal drugs and illicit cash that we have seized. Here, let me show you. It's right in here beside... HEY! Who stole all the drugs and cash, again? And where's that $30,000 gadget?"
People that sell stolen used iPhones would pay $100,000 for the gadget. But those people that make and sell brand new iPhones might pay a million.
Gone In 60 femtoseconds. There will be a loud clap sound as the air rushes in to refill the 4x4x2-inch box shaped hard vacuum where the gadget used to be.
It's almost a game, it seems, but it's not. It's real and it can have real bad implications. I can understand that the cops would want one. But should they actually be allowed to have one? Seems that the IT industry just keeps chasing its tail here with exploits, patches, etc. And then with this, due process falls by the wayside.
The old saying "nothing to hide, nothing to fear" is not (maybe it never was) true. In this case, there's a lot to fear even if you haven't done anything wrong. Once the phone has been broken, anything can be planted on it.
If you don't want mandated backdoors, I think this is the lesser evil.
Of course, there are risks - it's inevitable. But it's still better than allowing someone to claim the need of unrestricted surveillance.
Anyway, I would ditch any electronic device (and any related account) after it has been seized for any reason.
Face it: the police have a mandate to stop crime, and they need to find evidence. With a valid warrant, they can open doors, safes, etc. etc. They can tap phones, install cams and microphones. It's a matter of fact - even democratic constitutions have provisions to allow for evidence gathering as long as it abides by the law. Even privacy is not an unlimited right - or say bye bye to any kind of justice.
Smartphones are no different. If there is a technical way to "open" something legally in search of evidence, they will use it - and yes, someone will make a business of it - they always did. This business could be more or less ethical - and may need to be regulated, and yes, there's a risk these tools could end up in the wrong hands.
It's like a weapon: it can be a gun in the hands of a police officer saving you, or an AR-15 in the hands of a murderer shooting at you, if there are no sensible regulations and controls.
Still it's better that there are expensive, difficult and limited ways to achieve it - because otherwise there will be a mounting pressure for backdoors, and it could be successful.
It's not hard to understand, but of course the anarchist conspiracy theorists who permeate the Internet see any kind of law enforcement as some kind of evil - until they are the victims.
Dear Sir,
The problem is not so much that there is such a device. The problem is that the police have access to it.
Now, I agree, that they need to be able to do their job, but even with the limited information they have today, they fail to do most of their job. Increasing the amount of information isn't going to make that better.
Also, if the police really think that this person is the perpetrator, it is indeed quite handy for them to be able to plant evidence on a device once they have unlocked it.
I do not know what the best solution for society is, however I do feel that unfettered access to people's belongings is not something that the police, or indeed the government, should have.
Best regards,
Guus
So, you don't trust the police. Buy an AR-15 and hide in the woods, then.
I understand there are a lot of morons in the police too. There are also good people who pursue true criminals respecting the law.
Would you like a world without law enforcement? Where only the powerful ones can enforce their own rules? And do you believe they will respect your rights to privacy, property, and life?
It's this anarchism that is destroying the foundation of democracy, and sends people like Trump to the top spots - and eventually you'll get exactly what you feared.
"Unfettered access to people's belongings" is against the law. "Planting evidence" too. And if they don't abide by the law, they can plant evidence outside your devices as well.
It can happen? Sure. Wearing a tinfoil hat won't save you - act to ensure democracy stands and it's not turned into a parody of it.
@LDS "Would you like a world without law enforcement? "
Another false dichotomy - most aren't arguing for that, what they want is a world where the police, all of them, are held to account for their abuse of power, not just let off the hook due to the colour of the victim's skin or the power of the police union, or which state official the Chief of ̶S̶t̶a̶s̶i̶ Police is dining with tonight.
Currently, there are only two kinds of cops - bad cops and those who cover for bad cops.
So, you don't trust the police. Buy an AR-15 and hide in the woods, then. *
Well, if that's not a sure-fire way of getting their attention, I don't know what is.
* Might work in the US as long as the authorities think you are just a lone nut hermit and not some cult, but try it in one of the tame forest parks in the UK. You'd end up 'Brazilianed' even if it were a plastic AR-15.
"Would you like a world without law enforcement? Where only the powerful ones can enforce their own rules? And do you believe they will respect your rights to privacy, property, and life?"
Professional policing was invented by Sir Robert Peel in the 1820s. Civilization existed before then. Police forces are an optional extra, not something essential for the existence of civilization.
https://www.thebalance.com/the-history-of-modern-policing-974587
Sir ! In support of /your/ post, I cite Stoke Newington Police Station ...
https://socialistworker.co.uk/art/4885/Metropolitan+Police%3A+a+long+history+of+corruption%2C+racism+and+criminality
https://en.wikipedia.org/wiki/Death_of_Colin_Roach
not enough to convince you ?
https://duckduckgo.com/?q=corruption+and+criminality+at+stoke+newington+police+station+london&t=ffab&atb=v98-1_b&ia=web
Smartphones are no different. If there is a technical way to "open" something legally in search of evidence, they will use it - and yes, someone will make a business of it - they always did. This business could be more or less ethical - and may need to be regulated, and yes, there's a risk these tools could end up in the wrong hands.
Still it's better that there are expensive, difficult and limited ways to achieve it - because otherwise there will be a mounting pressure for backdoors, and it could be successful.
====================================================================
The compromise of personal devices
1. provides a level of ubiquitous surveillance of both the owner and all interactions with other people or information unmatched, or even unapproximated, at any previous time
2. can often be automated at low cost as the technology matures, allowing use without economic or practical limits, and thus permitting the targeting of entire groups or populations.
3. produces information that cannot be secured. If rich governments of technically advanced nations cannot protect the information needed for top-secret vetting, the designs of their nuclear weapons, and the technologies of their not-yet-in-service 5th-generation jet fighters (and those are only the failures we know about), why would anyone think they will successfully protect databases containing every useful piece of personal information about everything and everyone?
4. these techniques will be redeveloped, copied, stolen, rented, and otherwise compromised by anyone with an ever decreasing amount of skills, cash, ruthlessness, or other forms of leverage or technical competence.
The only hope for individual freedoms and personal rights is a determined and continuing effort to completely prevent certain types of surveillance and monitoring, without any designed or tolerated exploits or 'doors' of any type.
The key phrase in your contribution is:
"It's like weapon, it can be a gun in the hands of a police officer saving you, or an AR-15 in the hands of a murderer shooting at you, if there is no sensible regulations and controls."
What you seem to be unaware of is that there ARE no SENSIBLE regulations and controls on the police (or any other agents of the state who might use technology like this on your phone/laptop/desktop etc)
We'd all be a lot more comfortable with State Surveillance if we knew (and could prove) that those doing the surveillance were themselves under the strictest form of surveillance. That's why I keep rabbiting on about Accountability Theatre.
"of course the anarchist conspiracy theorists who permeate the Internet see any kind of law enforcement as some kind of evil - until they are the victims."
I was with you until this sentence -- this completely misrepresents the resistance to police surveillance.
"If you don't want mandated backdoors, I think this is the lesser evil."
Genuinely puzzled by that remark. The possibility of products like this (inevitably falling into the wrong hands) is one of the main reasons *why* people don't want mandated backdoors.
"I would ditch any electronic device (and any related account) after it has been seized for any reason."
This. Once you've lost control of the device to that extent, it is completely untrustworthy. Also, for those who are truly concerned, don't rely solely on the built-in security measures for your security.
well, there are a lot of reasons:
a) fishing expedition by law enforcement - we'll find SOMETHING we can nail you for! [jaywalking as indicated by GPS, for example]
b) planted evidence. Not that hard, really. A few child pr0n pics in your browser cache, and now you're a sex offender!
c) "leaking" personal information found on your phone, as a means of coercion or outright blackmail [just plead guilty, or maybe some of those photos will get 'leaked' and you don't want THAT, now do you?]
Those three reasons ALONE ought to be enough to ALWAYS INSIST on keeping privacy, well, private.
> c) "leaking" personal information found on your phone, as a means of coercion or outright blackmail
Lest anyone think that is hypothetical: I happen to follow the news from Catalonia, and that is exactly what happens - the last time just one week ago.
Why should phones (PCs, tablets etc) be considered different to other physical stores of information?
You raise an interesting idea. The police can use info to blackmail people. I bet it happens fairly regularly. Should the source of that info matter? How is using photos from a cracked smart phone different from photos in a locked safe?
If you’re worried about this type of product allowing the police to carry out abuses (and I believe that to be a fair concern) and think that banning it is the solution I’d have to disagree. You don’t solve that problem by stopping them decrypting phones. That’s the kind of solution government ministers come up with (no offence intended). It takes a great deal of effort to solve the root cause. If the police commit abuses now, without access to such tech, banning it isn’t going to change anything. It just gives criminals an obvious place to store their dodgy info. You and I can still be abused by the state, and criminals can impede investigations into their activities. Doesn’t sound like the best place we could be in to me.
Obviously, this article is posted to grab some attention for iPhone security, and to warn users here that these tools can fall into the wrong hands. However, looking on the brighter side, it gives Apple something to work with when government officers keep getting these tools.
Apple could use this to push back when government officers try to force them to unlock their devices. Apple can basically draw the line for the government: if they want to unlock an iPhone, they are to ask the iPhone unlock providers to do it, and not Apple. It is just like asking a lock picker to unlock the lock, instead of the lock designer.
Whilst I appreciate privacy is important, security is also, and if this can help stop bad things happening, great.
And if they get my phone and discover the only dubious thing I do is read El Reg and they obtain all the pictures of my cat I've taken, I wouldn't consider that a bad thing, my cat looks awesome....
"And if they get my phone..."
Why would they have your phone? You haven't done anything wrong, have you citizen?
"...and discover the only dubious thing I do is read El Reg...",
You consider reading El Reg "dubious"? Against what standard do you measure this? I seriously doubt this is the only dubious thing you do. Something to hide, maybe? Come on, you can tell your Big Brother...
"...and they obtain all the pictures of my cat I've taken, I wouldn't consider that a bad thing, my cat looks awesome...."
An extraordinary claim requires extraordinary proof. Pics, or it didn't happen.
"my cat looks awesome...."
You're not into cats, are you?....
Yeah, these are his cat pictures.
@h3nb45h3r
It's not your phone that anyone would be interested in.
What about journalists working in repressive regimes, human rights activists trying to prosecute those in authority, whistleblowers in authoritarian governments.
Any of these and more could have their lives and work compromised by those in authority who want to shut them up by getting "evidence" from their phones.
I doubt the company selling these devices will be particularly fussy about who they sell them to. You may not have much to worry about but many many others will as a result of this.
Do you believe they won't try to access those data anyway because you just say no? Actually you would just give them an advantage because they will be able to access your data, while hiding theirs.
Good luck, then, maybe investigating those who killed a journalist or human rights activist in your country...
"privacy is important, security is also"
There can be no security without privacy.
"if this can help stop bad things happening"
It's hard to see how it could. This would be more useful after the bad thing has happened and the cops are trying to figure out who did it.
" I wouldn't consider that a bad thing"
Good for you. If it were me, I'd consider it to be a very bad thing!
> And if they get my phone and discover the only dubious thing I do is read El Reg and they obtain all the pictures of my cat I've taken, I wouldn't consider that a bad thing
The moment "they" get your phone you better start worrying.
Whether you have something to hide or not is irrelevant. If "they" think you do, you are in the shit. And if "they" don't like you or "they" decide you would make a plausible enough culprit, you're not seeing that cat of yours again.
"I wonder what happens when you set a passphrase instead of a PIN?"
It would depend on the hash. SHA256 would take a very long time.
iOS apparently limits the total # of failed attempts and/or the retry rate in order to mitigate the less secure PIN method. It's actually OK to do it Apple's way as long as there's no 0-day flaw [which they obviously need to fix, now].
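For a sense of scale, here is the back-of-the-envelope keyspace arithmetic behind that comment. The guess rate is an assumed figure for illustration, not a measured property of any device:

```python
# Rough worst-case brute-force times at an ASSUMED rate of
# 10 guesses per second (illustrative only).
RATE = 10            # guesses per second -- an assumption
SECONDS_PER_DAY = 86_400

keyspaces = {
    "4-digit PIN": 10**4,
    "6-digit PIN": 10**6,
    "8-char lowercase passphrase": 26**8,
}

for name, size in keyspaces.items():
    worst_case_days = size / RATE / SECONDS_PER_DAY
    print(f"{name}: {size:,} combos, worst case ~{worst_case_days:,.2f} days")
```

Even at this modest assumed rate, the jump from a 4-digit PIN (well under a day) to an 8-character lowercase passphrase (hundreds of thousands of days) shows why a passphrase changes the picture entirely, 0-day or not.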
Not entirely.
You may have a long, strong passphrase.
Can you guarantee the same about everyone in your vicinity?
It only takes one compromised device to record a meeting or a conversation, or to reveal both sides of messaging or email or documents... or credentials that can unlock other systems, or decryption keys, or...
At least not for the examples above like someone wanting to use a phone as a bug.
In such a case, the greater the variety (PINs versus passwords, Android vs iOS) the better, as an attacker only has to compromise the weakest link. In that big meeting, me sitting there with my iPhone that has always used a password since I bought a 3GS might take a donkey's age to brute force, but that's no problem if the guy next to me was an easier mark.
The FBI will continue to whine because some phones will be protected by passwords, and they still believe they deserve a backdoor and don't believe they should have to pay third parties for equipment to let them hack (some) phones.
Of course this is irrelevant for spy agencies, since this requires physical access. If they had physical access they'd take my phone apart, add some tiny little microphone the size of a grain of rice (the iPhone X is packed pretty tight, but there's probably room somewhere) that will record everything around me for a few days and then they can get close enough to me (maybe sit next to me while I'm in a restaurant) to command it to download the contents to them.
Selling to law enforcement....
0-day closed by Apple.
Rinse, lather, repeat. (Public money being the inference?)
"THIS COULD FALL INTO THE HANDS OF CRIMINALS!!!!"
0-day closed by Apple.
Someone might want their $30,000 investment back....
....mine's the bed with the horse's head in it.
Can't have it both ways. This is a zero-sum argument.
I wonder if they've got access to the location which stores the try count, and just keep resetting it after each attempt.
You know, like game hackers used to do *in the 80s*.
I won't write anymore, as it's clear from the chatter around the water cooler anyone under 45 simply doesn't know how this was done. Especially when you mention memory-mapped I/O ...
That's when you need your Trace Buster Buster. (May not be safe for work; contains ridiculous language.)
But then, of course, they'll show up with a Trace Buster Buster Buster.
A phone isn't a vault located in a military bunker. Phones should be looked at as the last place you keep sensitive information.
It's long been known, if someone gains physical control to your computer/device, etc... then they own it.
If not by using some 'secret killing box', then by another method.
So if you're a criminal conducting incriminating actions via your phone... don't be shocked if law enforcement uses it against you.
If you keep GPS active along with other 'features' active on your phone, don't be shocked when Google records your every move, puts the information into a database and then sells this information to Equifax; who then loses it when their database is breached. You chose to accept the risk. A phone shouldn't be looked at as being a secure safety deposit box located at Fort Knox.
You're InfoSec professionals. You're smart enough to look at this from the correct perspective of risk management. Don't get caught up in the emotion of this. Don't let the press or politicians twist your thinking. Keep your perspective true and remember, nothing is hack proof. So the loss or misuse of a box isn't any worse than someone not correctly securing information.
Someone has found a way to trip the code handling the IO from the cable (that proprietary interface no one can review), and it's causing some sort of soft reset which forgets how many tries you've had, or enters some sort of panic mode where you can directly talk to protected memory.
It's where I would have started looking.
I read the comments and gave this some thought.. ( some .. not much .. bear brain you know ) .. I am simply amazed that anyone would be stupid enough to keep any important information on a phone. A phone is a phone is a phone. So called smart technology is not smart at all security wise. People trusting such devices for anything important past making a date with Doris are in error. A) easily lost .. B) easily stolen C) can be broken into in various ways D) reveals your location E) Stingray anyone ? F) US Marshals Service with its planes knowing the location of ALL cell phones in the US .. need I go on ? Whatever way you look at it trusting a cell phone for privacy is foolish. Committing crimes using them even more so. Again , a phone is a phone is a phone. I know they're convenient but they can never be trusted.
There's more security in an encrypted text travelling on a microdot at the back of a stamp than a smartphone on a network.
One would hope that they'd insist on a demo. One of the cops will have an iPhone - make them break into his. If they spend money on it without proof or without another law enforcement agency they trust confirming it really works then hopefully the taxpayers find out so they can be rightfully fired.
A couple of commenters here have complained on the assumption that the readers are the ones walking around with sensitive data on their smartphones. I, for one, don't own a smartphone. So I'm not overly concerned about these matters as they affect my person. But my wife? I cannot get her to understand, let alone comply with, the most basic principles of data protection. Never mind households that don't have anyone in the industry.
As Dale Carnegie says, "If you want to persuade someone, speak to them in terms of their own interests."
What US officials should be most concerned with is that the device and its techniques are easily available to foreign intelligence agencies for the purposes of spying on and interfering with US corporations, civilians, and political campaigns.
This is more of a boon to America's enemies than to US police forces.
If you want to persuade the general public that this is dangerous don't do it using some complex argument about the police being a danger to public safety. Most people don't realize that. And most elected officials think they control the police.
To persuade the powers that be that allowing companies and government agencies to keep vulnerabilities secret is worse for them than the alternative.
Our insecurity is their insecurity.
- If our phones and computers can be cracked, then Dianne Feinstein's phones and computers can be cracked.
- If our phones and computers can be cracked, then the phones and computers of Republican and Democratic re-election campaign teams can be cracked.
- It means that the phones and computers of Goldman Sachs, the Koch brothers and George Soros employees can be cracked.
If our secret police can crack, then their secret police can crack.
If our intelligence agencies can crack, then their foreign intelligence agencies can crack.
Yeah, in Soviet Russia, in China, in the USA, even in Canada the police can kill you on video and generally get away with it. But that doesn't worry those in power since they think they control the police. Those in power would be/should be more worried that allowing these sorts of vulnerabilities to exist personally hurts them, their power and their wealth.
What GrayKey and Cellebrite are selling is the means for China to steal US trade secrets -- that is what our powers that be will care about.
The answer to this problem is obvious: don't carry "smart" phones. Indeed, never carry anything outside your home that has any personal data on it whatsoever.
A very, very dumb mobile phone with a limited directory function is probably safe-ish as that is not really a much greater security risk than any other subset of the generally available Phone Book.
Carrying a music-player, ebook reader with Project Gutenberg-like titles only or a video player with no Internet link, all for entertainment while travelling, should also be fairly safe. Mostly.
There's no inherent reason to trust law enforcement simply because they are law enforcement. Trust comes from acting in a way that is consistent with the norms and laws of the society that they operate in.
On that basis, the (bad/incompetent) police continually do their best to screw it up (and in doing so, screw it up for everyone, themselves included).
Fundamental theory. Security is built on a "trusted" item. Without that item, you can always break in. And pretty much every computer security item is based on a trust in the hardware. Once you have physical access to the device, the rest is just engineering.
Apple could have cooperated with law enforcement, unlocking phones after a valid warrant was issued, and only allowing the access in-house to prevent outside parties from learning Apple's secrets. But no, Apple played its childish game and so the genie has left the bottle. Too bad, so sad.