Biased much?
"Apple admits some iPhone 6Ses crappier than others." Ah, El Reg — always to be counted on for even-handed, unbiased, well-supported reportage when it comes to matters Cupertinian...
Apple has confessed that some iPhone 6Ses have worse battery life than others because their processors are made by two different chip bakers. The Cupertino giant said in a statement on Thursday that the handsets using TSMC-built 16nm A9 processors will have a slightly longer battery life than those powered by Samsung-built …
"It would be difficult to find an IT journalist so befuddled that he or she could simultaneously demand a stop to immigration and complain about the shortage of staff in the NHS."
Off-topic, but the shortage of medical staff in this country is DIRECTLY down to previous governments reducing funding for training. Labour and Tory are equally guilty on this.
"So the physics of gates isn't applicable to computers using the Harvard architecture?"
Harvard is a derivative. Your point is? Oh, I know, a straw man.
The physics of gates is irrelevant if you don't change the clock speed to utilise faster gates.
And someone modded you up on a tech site. Jesus H.
Not by definition nor even universally in practice. Plenty of architectures have used 'early out' instructions that essentially stall the processor until the result is known: for example multiply on the 386 - how long it took was a complex formula depending on the exact values being multiplied.
Faster logic could potentially shave a few cycles off such variable-duration instructions.
"Plenty of architectures have used 'early out' instructions that essentially stall the processor until the result is known: for example multiply on the 386 - how long it took was a complex formula depending on the exact values being multiplied."
And? It's still clocked, whether it's via the main system clock or an internal processor clock. If the clock speed isn't raised it ain't going any faster.
You've completely missed his point. The vast majority of hardware multipliers (and all of them aiming for high performance) are not synchronous multi-cycle designs, they're a single chunk of combinatorial logic. They are not "clocked" to anything. The difficulty arises because the logic is very deep, meaning a lot of propagation delays, and you can't be sure the result is going to be known in a single clock cycle. Hence the early out approach: instead of waiting the maximum amount of time for each invocation, you monitor carry values at strategic points in the logic. If those carry values have changed by a given deadline within the cycle you say "changes are still propagating through, we need more time". On the other hand if they stay the same you say "OK, things have stabilized, we'll know the answer by the end of this cycle".
How long it actually takes for those values to stabilize is NOTHING to do with the system clock - it isn't even connected to the multiplier proper.
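The early-out idea can be sketched in code. This is a toy model of my own, with made-up cycle counts that are nothing like real 386 timings; the point is only that latency depends on the operand values rather than being a fixed worst case:

```python
# Toy model of an 'early out' multiplier: cycle count depends on the
# operands, not on a fixed worst-case latency. A shift-and-add view:
# roughly one cycle per significant bit of the smaller operand, plus
# a base cost. All numbers are illustrative, not real 386 figures.

def multiply_cycles(a: int, b: int, base: int = 3) -> int:
    """Cycles ~ base cost plus one cycle per significant bit of the
    smaller operand; at least one data-dependent cycle."""
    small = min(abs(a), abs(b))
    return base + max(1, small.bit_length())

# A multiply by a small value finishes sooner than a full-width one.
assert multiply_cycles(12345, 2) < multiply_cycles(12345, 0x7FFFFFFF)
```

The controller's job, as described above, is simply to detect which case it is in before the end of the current cycle.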
@JeffyPooh
> So the physics of gates isn't applicable to computers using the Harvard architecture?
The physics of gates is only applicable to the maximum speed: if the processor (or core) is under-clocked then its speed is limited regardless of how fast the gate is. It's a bit like me walking to the shops with Usain Bolt: as long as he doesn't start running we'll both get there together.
"Err no, It won't do anything unless they also raise the clock speed for the 14nm chip. You Reg hacks do know how a von neumann computer works don't you? Hello?"
Well, I know how a von Neumann[sic] computer works and I recall what happened when we went from TTL/NMOS to CMOS. Same architecture but now it could run for hours off a battery instead of needing a mains operated PSU. The side benefit was not having to run at reduced clock to get to reliable operation at 70C ambient.
Hand held computers spend a lot of time at lowered clock speeds to reduce dissipation. Under heavy computing load they often have to slow down to avoid excessive heat dissipation. So if the 14nm process gives lower power consumption per Hz per gate, it can run with less throttling. Thus they may not need to raise the max clock speed, the performance boost comes from being able to run closer to it.
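The throttling argument above is just arithmetic. A rough sketch, with entirely made-up power and thermal numbers, of how a more efficient process sustains a higher clock under the same thermal budget:

```python
# Sketch of why lower power per gate means less throttling. The power
# figures and thermal limit here are invented for illustration only.

def sustained_clock(max_clock_ghz: float, watts_per_ghz: float,
                    thermal_limit_w: float) -> float:
    """Highest clock the thermal budget allows, capped at the design max."""
    return min(max_clock_ghz, thermal_limit_w / watts_per_ghz)

# Same 2.0 GHz design ceiling and 3 W budget; the more efficient
# process sustains a higher clock before throttling kicks in.
old = sustained_clock(2.0, watts_per_ghz=2.0, thermal_limit_w=3.0)
new = sustained_clock(2.0, watts_per_ghz=1.4, thermal_limit_w=3.0)
assert new > old
```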
"The side benefit was not having to run at reduced clock to get to reliable operation at 70C ambient."
If the ambient temperature around a smartphone is 70C then someone's put it in the tumble drier by mistake.
"So if the 14nm process gives lower power consumption per Hz per gate, it can run with less throttling."
If. And how often do smartphone CPU or GPUs need to run at max speed for any appreciable length of time that would cause them to get so hot they'd need to throttle? Maybe a few games, other than that...
Candy Crush would drain a colleague's phone in a very short space of time and turn the phone into a lovely pocket warmer. A fully charged phone would just about survive the journey into work but needed to see a charger ASAP. If there was a delay to the train it sometimes didn't last, and they couldn't then phone to say they would be late.
"If the ambient temperature around a smartphone is 70C then someones put it in the tumble drier by mistake."
Did I say I was working on mobile phones? I didn't. In fact at the transition from TTL/NMOS to CMOS very, very few people owned mobile phones, so you could have guessed that.
Instrumentation computers that ran in harsh industrial conditions.
I refer you to your own headline. That, and a mirror.
Given that whenever I see an iPhone it's either in use or plugged into the wall, I don't think anyone would really notice the reduced battery life, especially since that 2% is dwarfed by the battery life reduction that charging it so much causes.
When at the office or at the airport, I have to spend quite a bit of time to find a power outlet that isn't filled with those white power adapters. But what I find odd is that I see an even mix of Android, Apple, Blackberry, and Windows phones in the hands of my coworkers and fellow passengers. So why the discrepancy in the brands of phones being charged?
I would speculate it's twofold: firstly, Apple owners probably represent a high proportion of the population that uses airports, ranging from US students with MacBooks on a gap year to 90% of the people flying business class.
Secondly, I bet Apple has a bigger share of the BYOD market that a lot of business users have flipped to since BlackBerry jumped the shark. So Apple users are more likely to run their batteries down with dual usage.
Is it one or more of these?
i) apple owners, having beggared themselves buying the thing, need to grab as much free electricity as possible
ii) hubristic sense of entitlement to power sockets
iii) unnatural phobia that sudden loss of battery oomph will leave them bereft of social media contact (with other owners of shiny applely goodness)
iv) some sort of mating or territorial display
Quote
iii) unnatural phobia that sudden loss of battery oomph will leave them bereft of social media contact (with other owners of shiny applely goodness)
I'd have thought that it would be the opposite. After all, you are off social media for the duration of most flights, aren't you? Sure, some flights have net access but it tends to get very slow.
Thus the airport is the last chance for users of all mobiles to get their Social Media Fix.
Personally I think that planes should become a Faraday cage with no access to mobile networks once the door has closed.
Thumbs down for those railway companies who are removing 'Quiet Carriages' but I'm proud to be a Grumpy Old Man so my opinion don't count in today's hip world.
Even beats most tablets tested. Granted this is mainly a test of CPU efficiency, but it sure gives the lie to Samsung's stupid FUD about iPhone users huddling near chargers. Funny how Samsung stopped those ads once they dropped their replaceable batteries and put their users in the same situation.
https://browser.primatelabs.com/battery-benchmarks
Yes, it's a constant war. Finally Apple has crept up to where Samsung has been for years. Next year's Samsung will beat the iPhone, then maybe the year after Apple will be on top again. Ford vs. Chevy, Coke vs. Pepsi. Blah. And Apple's UI is finally creeping up on what Android users have taken for granted for several years at least. Maybe next they'll learn to not make them so damn fragile and actually include some water resistance, but I've always found putting form completely over function to be utter folly.
I've never much cared for Apple phones, and supporting hundreds of them has only reinforced my disdain of them. But to each their own.
"When at the office or at the airport, I have to spend quite a bit of time to find a power outlet that isn't filled with those white power adapters. But what I find odd is that I see an even mix of Android, Apple, Blackberry, and Windows phones in the hands of my coworkers and fellow passengers. So why the discrepancy in the brands of phones being charged?"
An 'even mix' of Blackberry and Windows phones next to iPhones and Android? You wouldn't be bullshitting for dramatic effect here would you?
I think that he meant to write "I see an (sic) mix of Android, Apple, Blackberry, and even Windows phones in the hands of my coworkers and fellow passengers."
More or less. The point I was trying to make was that there was a vastly disproportionate number of Apple devices charging vs. other brands.
I had actually posted that from the airport where I was sitting in a row with 19 other people at my gate; the flight had just been announced as delayed, so everyone had their phones out for one reason or another. In my row there were 8 iPhones, 8 Androids, 2 Windows phones (bright yellow Lumias), and 2 Blackberries; yet plugged in were all 8 of the iPhones, an Android, a BlackBerry, and a pair of outlets used to charge a MacBook.
>> I don't think anyone would really notice the reduced battery life, especially since that 2% is dwarfed by the battery life reduction that charging it so much causes.
Curious comment. Of course any modern device will stop charging its internal battery once it reaches 100%*. So just having a phone plugged in doesn't mean that it's actually charging.
So you must mean that keeping a battery charged towards the higher end of its charge capacity reduces its longevity... any evidence to support that notion?
* Also note that when your device says the battery is at 100%, it's not actually "full." That just means it reached the maximum "safe" charge capacity as determined by the charging hardware/software, i.e., the capacity where any more charge would probably be detrimental to longevity.
Longevity of owner.
Lithium batteries and cells explode if you hold them on float charge, as is possible with lead acid or NiMH. Interesting too that the higher-capacity NiMH cells have their life reduced on a NiCd-style float charge; they should only get enough current to cancel the self discharge. They don't explode, though, except if you "turbo charge" when already charged!
So a lithium cell charger has to disconnect if the voltage is too low or if the cell is charged. Unlike NiMH and lead acid, in a battery each cell must be separately tracked. This is why multi-cell packs (aka batteries) have a CPU built in. Note most single lithium cells either have a temperature sensor built in, or, if naked and soldered to the main board, there is one on the parent PCB.
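The per-cell tracking can be sketched as a simple supervision rule. A minimal sketch, assuming illustrative voltage limits (real pack controllers use the cell manufacturer's figures and also watch temperature and current):

```python
# Sketch of per-cell supervision in a multi-cell lithium pack: each
# cell is tracked individually, and charging is cut if any one cell is
# out of bounds. Voltage limits here are illustrative only.

MIN_V, MAX_V = 3.0, 4.2  # typical-ish Li-ion bounds, for illustration

def charge_allowed(cell_volts: list[float]) -> bool:
    """Charging only proceeds while every cell is within safe limits."""
    return all(MIN_V < v < MAX_V for v in cell_volts)

assert charge_allowed([3.7, 3.8, 3.7]) is True
assert charge_allowed([3.7, 4.2, 3.7]) is False  # one cell full: stop
assert charge_allowed([2.9, 3.8, 3.7]) is False  # one cell too low
```

This is why a single weak cell limits the whole pack: the controller has to stop as soon as the worst cell hits a limit.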
"Curious comment. Of course any modern device will stop charging its internal battery once it reaches 100%."
Ish. They usually charge to 100%, stop charging, run on battery down to 95%, and then begin charging again. The display remains fixed on 100% at this point (mostly to stop users questioning why their phone isn't charging even though it's plugged in and only on 96%). So while it's not charging constantly, it IS charging for about half the time it's listed at 100% and plugged in; the phone isn't running off the mains.
And yes, that still saps the battery life comparatively aggressively. Of course, given that most flagship phones are probably replaced after just a year to 18 months (as opposed to budget models, which are usually used for much longer and usually have replaceable batteries), it's probably not overly noticeable to anyone who's not still rocking an iPhone 4; the other flagships from that period pretty much all had user-swappable batteries. And really, how many Apple fanboys would still be using a phone that's 2 1/2 generations old?
"And yes, that still saps the battery life comparatively aggressively"
Really? I have a 4 year old desktop replacement laptop that is almost always on mains power (because I need the GPU and a bright screen). It does exactly what you say, as shown by the battery meter; the battery periodically gets a few percent of top-up charge. After 4 years the battery is probably around 10% below new capacity (I say probably because it has gone from W7 to W10 and this may affect the run time).
As I understand it with lithium cells, and roughly speaking, they tolerate around 300-500 discharge cycles, but this could more accurately be represented as a life in coulombs. So 5 times 20% discharges is about one full discharge. That's why I prefer phones with a long battery life; other things being equal with a daily charge the battery should last longer.
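That "5 times 20% is about one full discharge" rule is easy to put into numbers. A quick sketch, treating wear as equivalent full cycles; the 300-500 cycle tolerance is the rough figure quoted above, not a measured spec:

```python
# The '5 x 20% discharges ~ one full discharge' rule as arithmetic:
# express battery wear as equivalent full cycles (a coulomb count,
# roughly speaking). Figures are the rough ones quoted above.

def equivalent_full_cycles(discharge_percents: list[int]) -> float:
    """Sum of partial discharges, expressed as full charge cycles."""
    return sum(discharge_percents) / 100

daily = equivalent_full_cycles([20] * 5)    # five 20% discharges
assert daily == 1.0                         # ~ one full cycle

# A phone discharged 20% a day uses ~73 equivalent cycles a year,
# comfortably inside a 300-500 cycle budget.
yearly = equivalent_full_cycles([20] * 365)
assert 50 < yearly < 300
```

Which is the poster's point: a phone with a longer runtime takes shallower daily discharges, so the same coulomb budget stretches over more calendar time.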
"And really, how many Apple fanboys would still be using a phone that's 2 1/2 generations old?"
My iPhone 4S battery is still as good as my 5S battery. Both easily last a day with normal use, and usually two days if I forget to charge. That said I don't have that nervous twitch a lot of people have these days where I keep lighting up the screen to check the time/messages/facebook every 4 seconds so YMMV :)
"They usually charge to 100%, stop charging, runs on battery to 95%, and then begin charging again."
Errr. No.
They charge to 100%, stop charging and then run on the power coming in from the USB.
They will start charging again if the battery voltage drops below a certain level, which it will even if it's not powering the phone: all batteries self-discharge a small amount and, more importantly, voltage drops with temperature, and the phone will cool down once it stops charging.
Yes, keeping a Li-ion battery fully charged for long periods is bad for them, especially if the phone is being used, hot and fully charged is not a good condition for battery life.
But it's still better than constantly charging and discharging them.
The ideal state to maximize battery life is to keep it between 50% and 30% charge, at a cool but not too cold temperature, and only charge or discharge it very slowly or not at all. In other words, you get the most life out of your phone battery by switching it off and putting it in the fridge.
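The stop-at-full, restart-when-sagged behaviour being argued about here is a plain hysteresis loop. A minimal sketch with illustrative thresholds (real controllers key off cell voltage and temperature, not a displayed percentage, and vendors' exact values vary):

```python
# Minimal sketch of charge-controller hysteresis: stop at full, run
# the device from USB power, and only resume charging once the battery
# sags below a restart threshold. Thresholds are illustrative only.

FULL_PCT = 100
RESTART_PCT = 95  # resume point; real controllers key off voltage

def next_state(charging: bool, level_pct: int) -> bool:
    """Return whether the charger should be on for the next tick."""
    if charging:
        return level_pct < FULL_PCT   # keep going until full
    return level_pct < RESTART_PCT    # stay off until the battery sags

assert next_state(True, 99) is True    # still topping up
assert next_state(True, 100) is False  # full: stop, run from USB
assert next_state(False, 97) is False  # self-discharge, charger stays off
assert next_state(False, 94) is True   # sagged below 95: resume
```

The two thresholds are what stops the charger flapping on and off around 100%, which is the behaviour both posters above are describing from different angles.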
"So while it's not charging constantly, it IS charging for about half the time it's listed at 100% and plugged in; the phone isn't running off the mains."
I'm pretty sure you're COMPLETELY wrong, but out of curiosity, why do you think this?
I mean, I can take a battery out of a laptop and it will work fine when plugged in. So laptops can be powered off the mains while charging (or not charging) a battery. Why would phones be different?
It makes zero sense to me why anybody would design a device that ran on battery power even if it's plugged in.
Well, yes, it's pretty hard to see the battery meter through the spider web of cracks.
You never see an iPhone without a broken screen, and yet you never see a broken screen on any other phone. May I have an EU grant to study whether the root cause is the tech, or the users?
That's because the people with white wall warts actually use their phones while travelling. Useful phones use power, unlike less useful ones.
What you're seeing is people who want to keep listening to music, playing a few games and doing their emails.
We were staying with some friends at a large holiday cottage and another family they knew were staying too. They were a family of 5. All of them had iPhones. All they seemed to do was spend their time going "Where's my/your phone charger?? I need to charge my phone!!" Just never stopped.
Seems owning an iPhone is a stressful endeavour.
"We were staying with some friends at a large holiday cottage and another family they knew were staying too. They were a family of 5. All of them had iPhones. All they seemed to do was spend their time going "Where's my/your phone charger?? I need to charge my phone!!" Just never stopped."
A lot of the time when I go on vacation, it's to a location that doesn't have very good cell coverage (small towns, beaches/coasts, etc.). That alone will ruin your battery life.
Add to that that you might not know your way around very well, which means more phone use to get maps, look at restaurant reviews, etc. Also, when you're on vacation, you'll probably be taking more photos than usual and communicating with friends back home more than usual. All of this takes battery power.
Additionally, in vacation homes, beds don't necessarily have electrical outlets convenient to them, so charging overnight might not be an easy thing to do.
So yeah, I'm an iPhone user and I commonly find myself running low on battery when on vacation but not because I have an iPhone in particular. And when I'm not on vacation, I usually end the day with over 70% left.