* Posts by ThomH

2913 publicly visible posts • joined 18 Jun 2009

helloSystem: Pre-alpha FreeBSD project chases simplicity and elegance by taking cues from macOS

ThomH

Re: Global menu bars?

NeXT's were really cool — you could tear submenus off and leave them sitting wherever you wanted permanently as regular buttons; I think the only disadvantage to Acorn's take was discoverability, but only in the first five minutes before you'd been taught what all three mouse buttons were for.

Popular open-source library SDL moving development to GitHub despite 'calamitous design choices' in git

ThomH

You mean git push origin :branchname doesn't immediately say "remove the branch branchname from wherever it is you have set as the origin repository" to you?!?
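
(For the avoidance of doubt, the clearer modern spelling is git push origin --delete branchname; the colon form works by pushing 'nothing' to the remote branch, which is what deletes it.)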

ThomH

In the industry that embraced x86, clearly calamitous design choices are no obstacle to market dominance, and being dominant usually means being the best tool for the job. SDL's move is the right thing to do. If anything, the hand-wringing feels a bit overplayed.

Humble Apple Pie: Cupertino sweetens pot to get its DTK prototype machines returned after developer backlash

ThomH

Re: Globally, Apple didn't do bad on this

But you can fault them for having shortened the machines' useful period by failing to offer any real support whatsoever since the M1 Macs shipped, even before the software update cut-off; the original offer was $500 in return for a year of access to an ARM-based Mac, which then mutated unilaterally into "return it now and we'll chuck you a severely-limited $200 credit, or keep it a bit longer if you want but good luck with that".

Sending it back for a full $500 credit, that isn't limited to a certain subset of products and has ten months until expiry rather than three, is a much more palatable offer and probably makes a net difference of 0.0000% to Apple's bottom line.

I like Apple's products, though I always relish a good El Reg bite, but overall I think Apple heavily misstepped on this one.

Apple clinches Q4 smartphone shipments top spot as US sanctions elbow Huawei out of the major league

ThomH

Re: Most iPhone upgraders ever, per the Q1 results call

> Are you kidding? All the Apple fanbois I know pretty much foamed at the mouth to get one of these.

The most-used single model of iPhone in 2020 was the iPhone 7, which dates to 2016, per web stats.

So I think your sample set is probably biased; mostly other tech types, perhaps? Regardless, it is unambiguously true to say that most iPhone users do not upgrade in most years.

ThomH

Most iPhone upgraders ever, per the Q1 results call

I guess the much smaller percentage of people who bother to upgrade is made up for by the much larger total installed base? If so then that's an easier position to be in than actually having to come up with reasons why people should upgrade, every year, endlessly.

That said, a bumper [financial] Q1 for Apple isn't surprising given that the new handsets missed Q4 for the first time since the device launched.

What happens when the internet realizes the stock market is basically a casino? They go shopping at the Mall

ThomH

Re: Maybe not a bubble

"Do you treat your employees well?"

"We treat them equally."

Apple slapped with €60m lawsuit from Italian consumer rights org for slowing down CPUs in old iPhones

ThomH

The iPhone 6s is supported by the current version of iOS; rumours have it that this year's release will be the first to cut off the 6s and the rest of its generation.

ThomH

Re: Enough all ready...

Personally, I think Apple is culpable for selling some number of people new iPhones when they would have been happy with only battery replacements, especially where those people went to an Apple shop and specifically discussed the issue with Apple employees.

But the Italian lawsuit here seems to be predicated on the rate at which Apple's batteries became less effective being a malfunction? That feels like a stretch.

Smartphones are becoming like white goods, says analyst, with users only upgrading when their handsets break

ThomH

Re: in store sales

AT&T exec 1: Revenues from phone sales are dropping. What can we do?

Exec 2: You mean long-term?

Exec 1: Hahaha.

Exec 2: Hahaha, yeah, let's just start bombarding them with upsell and warranties and whatever. Run this nonsense into the ground. That should protect this year's bonus.

ThomH

I'm still on the iPhone 6s; unless and until it breaks or stops receiving security updates I expect to continue to be. It runs the current version of the OS (i.e. 14) without any obvious performance issues, and Apple released a security update for the OS two versions ago (i.e. 12) as recently as this month so I'm optimistic I'll get to 2023...

Which I like because in 2023 my iPhone 6s will be as old as the original iPhone was when the 6s came out.

I can think of a world of reasons why an original iPhone would have been next to useless in 2015; I can't think of anything an iPhone since 2015 adds that I actually consider to be worth an upgrade here in 2021. And even then, 2015 is an arbitrary pick just because that's the vintage of my current phone. If I still had a 6 (from 2014) or a 5s (from 2013) with no performance issues I think I'd be equally happy...

Must 'completely free' mean 'hard to install'? Newbie gripe sparks some soul-searching among Debian community

ThomH

Re: Free means somebody is not getting paid

I write and maintain a piece of open source software.

Do you know who I write it for? People who might want to use it. If they're also programmers then that's great but my priority is to make my piece of software a smooth part of their workflow; the less time they have to think about me or my link in the chain, the better.

Every time I get critical feedback that allows me to improve my software. I am tremendously grateful to those who not only take the time to try out my little effort, but put the extra energy into getting in touch when something isn't quite right.

So I guess I'm doing it wrong? I should just set up an autoreply that says "Oh, well, if you're sooooo clever, just fix it yourself"?

Barbs exchanged over Linux for M1 Silicon ... lest Apple's lawyers lie in wait

ThomH

Re: UNIX 03 certified

I think the advantage with Linux support would just be more options: some people prefer certain package managers, or subscribe to certain support packages, or just can't afford to tie themselves to a single supplier.

For those people, bringing Apple's new systems into the fold as an additional choice could be a win because they are ahead in some very tangible ways: battery life relative to performance being possibly the least ambiguous.

I get a lot of personal mileage out of macOS also being a UNIX but I would be very unlikely to build a corporate plan upon it. And not just because I'm orders of magnitude below that pay grade.

ThomH

Re: Look forward to this coming to fruition

The M1 won't be, being Apple's low-end offering. It's extremely competitive within its price segment both in terms of performance and power usage, but you don't have to get to the top of AMD's range before a Threadripper is doing what its name promises.

To put it another way: no, Apple has not bested processors that cost $4,000 standalone with one that comes with an entire computer around it for $700.

That being admitted, I have an M1 Mini and — for a regular, home computer — it's glorious.

Developers! These 3 weird tricks will make you a global hero

ThomH

Re: Saving the planet

That was almost always the first setting I changed when acquiring a new Mac — I think NextStep always did it that way.

... that was before the banishment of scroll bars, anyway. Even if you turn them on permanently in modern macOS they don't come with up/down buttons.

Crowdfunded Asahi project aims for 'polished' Linux experience on Apple Silicon

ThomH

I am an expat, living in the USA.

Would you like a list of other things which one over here often hears casually relied upon as "notoriously" true?

File format conversion crisis delayed attempt to challenge US presidential election result

ThomH

Re: Google Docs?

The pro-Trump commentators have actually gone full Brass Eye on this; per Fox Business yesterday:

"We’re eight weeks from the election, and we still don’t have verifiable, tangible support for the crimes that everyone knows were committed ... We know that’s the case in Nevada, we know it’s the case in Pennsylvania and a number of other states, but we have had a devil of a time finding actual proof."

Sadly he did not go on verbatim to add "Now that is scientific fact—there's no real evidence for it—but it is scientific fact".

Surface Laptop Go: Premium feel for a mid-range price, but Microsoft's Apple-like range once meant more than this

ThomH

Re: The same review in just a few words.

I read it more as: good laptop but I am sad that Microsoft's ambitions are so low.

And you thought that $999 Mac stand was dear: Steve Wozniak's Apple II doodles fetch $630,272 at auction

ThomH

Re: Three button mouse?

The 2005 Mighty Mouse was Apple's best value product — a mouse with a free trackball, rated for over ten hours of operation before the ball gunked up beyond use.

The current Magic Mouse can't quite match it for useful lifetime hours; it's a trackpad on top rather than a ball, great, but the crippling RSI will get you in very short order.

You've got to be shipping me: KatherineRyan.co.uk suggests the comedian has diversified into freight forwarding

ThomH

Re: Can somebody explain the economics?

I've always assumed it's a way for the domain-name squatters to earn an extra few pennies; presumably there's some service somewhere you can throw a small amount of money at for them to redirect some of the domains they're sitting upon for some period of time?

The one domain name I've ever given up, which I didn't even use, is now under the ownership of such a company and would apparently cost me $2,400 to buy back. Given that I'm not interested, and I don't imagine anybody else would be, maybe this is how they'll squeeze back some of the £10 or whatever that they invested in the hope that I might pay?

Apple appears to be charging Brits £309 to replace AirPods Max batteries, while Americans need only stump up $79

ThomH

Re: Compare to Apple's competitors...

The corrected price for Apple is £75. Suppose you spent £25 on the physical battery: the £50 difference covers close to six hours of your time at a little over £8/hour before you hit the Apple replacement cost.

Supposing you can figure it out in one hour, you'd need to be earning close to £100,000/year (roughly £50 per working hour) for the spudger solution not to be a saving.

ThomH

Re: Fix it yourself

For the record, the brands I saw were Anker and Aukey; I've heard of and used products by the former, and the latter at least passes the sniff test by having a full English-language website with no obvious grammatical or spelling errors and an online store, for which the domain name registration has been the same since 2014.

I agree thoroughly with the advice though — I once bought an off-brand charger for my PowerBook G4, and it killed the thing. Not until it was long obsolete, but nevertheless a completely stupid way to try to save forty quid.

ThomH

Re: Fix it yourself

Fact check ahoy! Apple currently charges the following prices for chargers:

USB-C: £49 for the 30W; £69 for the 61W; £79 for the 96W.

MagSafe 1 or 2: £79 for any of them.

In the US, replace the £ with a $ and add your local taxes.

Some people complain about the loss of MagSafe, but I think it's fantastic — those chargers seemed to have terribly flimsy cables that for me rarely lasted more than 18 months, whereas the USB-C cables plug in at both ends.

They're also standard USB-C chargers so you don't even have to pay crazy Apple prices if you do need to replace, e.g. the cheapest 30W USB-C charger on Amazon US right now is $17, close to being two-thirds cheaper than the Apple equivalent, and the cheapest thing to offer at least 96W is $32, less than half the Apple option.

Nothing you can easily do about those soldered-in batteries though.

Users complain iOS 14.2 causes some older iPhones to overheat, rapidly lose charge

ThomH

Meanwhile, back in reality, the most-used iPhone in Q3 2020 according to user agent identification was the iPhone 7 — the 2016 model.

Applying a weighted average to the iPhone models that together account for 99.5% of those currently in use, I got the mean year of introduction of a currently-active iPhone as 2016.99. Let's call it 2017.

So that's three years average.

A solid 38% are from 2016 or earlier. You have to go back to 2014 to drop below 10% of the total.

So, verdict on the statement that "Apple practically force their users to upgrade every year or two": a complete work of fiction.
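
If anyone wants to check the arithmetic, the weighted mean is just sum(share × year) / sum(share). A minimal sketch in C, using made-up illustrative shares rather than the real Q3 2020 usage figures:

    #include <stdio.h>

    /* Illustrative only: these model years and shares are placeholders,
       not the actual usage data referred to above. */
    int main(void) {
        const struct { int year; double share; } models[] = {
            {2014, 0.06}, {2015, 0.14}, {2016, 0.24}, {2017, 0.20},
            {2018, 0.16}, {2019, 0.13}, {2020, 0.07},
        };

        double weighted_years = 0.0, total_share = 0.0;
        for (size_t i = 0; i < sizeof models / sizeof models[0]; ++i) {
            weighted_years += models[i].year * models[i].share;
            total_share += models[i].share;
        }

        printf("Mean year of introduction: %.2f\n", weighted_years / total_share);
        return 0;
    }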

Apple's M1: the fastest and bestest ever silicon = revolution? Nah, there's far more interesting stuff happening in tech that matters to everyone

ThomH

Re: I fear that too much shiny is taking a toll on some people's attention span.

Oh, well on iOS you could have used the wilfully obscure CVOpenGLESTextureCache to share CoreVideo buffers (which can just be BGRA) between CPU and GPU without copying. But it doesn't guarantee no conversions or reorderings, it just minimises them.

... and it's not even Mac/iOS cross-platform. It's iOS only.

ThomH

Re: I fear that too much shiny is taking a toll on some people's attention span.

I was refuting that shared buffers are not something you can do with OpenGL. Though I neglected to include one important caveat: in Metal you can share texture storage and even compel the GPU to use linear ordering to avoid swizzling costs, if you know that that's the correct trade-off for you. I do not believe you can do this in OpenGL.

Your theory about Apple is trivial to discount, however:

The first meeting ever to discuss Vulkan happened in July 2014, and the call to form a working group happened in August. Metal was first released in June 2014.

So it is trivially false that "Apple stayed in the working group just long enough to steal most of the ideas ATI had put forward" — there was no Vulkan working group until after Metal had launched and Apple was never a member. For that reason one can also immediately discount the claim that "Metal is different to Vulkan for the sake of it".

Metal takes AMD's ideas from Mantle and adapts them to something that works across AMD, Intel and Apple's homespun GPUs. Wishing that Apple wouldn't be so quick to go it alone and so reluctant to adopt a later standard is valid; alleging a weird conspiracy doesn't really stand up.

ThomH

Re: Saturated market

I wasn't sure who to believe on this, so I checked StatCounter, which attempts to monitor trends through web traffic analysis. Make of that methodology what you will.

Worldwide it does indeed look like a ~16% share for macOS, on a gradual upswing, with Windows very slightly fading. In Europe macOS is nipping at 20%, and in North America it's more like 27.5%, but apparently the continent that likes Macs the most proportionally is Oceania where Apple gets almost to a third. I did not see that one coming.

But it's easy to oversell: in the worldwide all-OS chart, both Android and Windows are basically as important as each other, both hanging around just below the 40% total share mark; and within Apple's world, iOS appears to be used about twice as much as macOS.

ThomH

The power savings are the main benefit to a typical use case, I think — 20 hours on a charge for the 13" 'Pro'. But for people like me who develop native code, there's a huge reduction in build times, and for video and image editors there seems to be quite a bounce via the GPU.

So, benefits for people who just want to carry a laptop around and browse, for developers, and for media production. Isn't that essentially Apple's entire user base?

That comment is made while acknowledging the article's point, of course: many, many people do work that is entirely disjoint from Apple's user base, and this change will matter not one jot to them.

ThomH

Re: Wow.

Luckily Apple's first generation ARM processor was the A4, which was in those first-generation iPads.

So Apple now has a decade's experience at ARM-based SoCs, plus a couple of successful examples of architecture moves for the Mac. If even 1990s-era Apple managed not to screw something up then there's probably not too much cause for concern.

ThomH

Re: Wow.

To agree with you emphatically: it took two years from the release of the Raspberry Pi to persuading Broadcom to release specs for its VideoCore GPU and that was with heavy vendor pressure, so the probability of Apple ever releasing much about its GPUs or the AI stuff must be negligible.

So bare-metal Linux is likely always to be a second-class citizen, even if it comes to exist at all.

ThomH

Re: Headline-whoring

Pfft. Multiplexed bus on the 8080 plus two-phase clock input = instant fail. And I don't care what anybody says.

ThomH

Re: I fear that too much shiny is taking a toll on some people's attention span.

To be fair, `glMapBufferRange` was introduced with OpenGL 3.0 and permits a GPU buffer to be exposed within the CPU's address space for direct access where hardware supports it. Though `GL_MAP_PERSISTENT_BIT` which asks for a persistent mapping — i.e. one that isn't invalidated the next time you issue a draw command — arrived only in OpenGL 4.4 and therefore has never been available on the Mac. The OpenGL 4.4 specification was announced in 2013, so marginally before Metal but after Apple stopped putting any effort into OpenGL.
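
As a rough sketch of the difference, assuming an OpenGL 4.4 context and a loader such as GLEW (i.e. exactly what the Mac never offered):

    #include <GL/glew.h>

    /* Requires a current OpenGL 4.4+ context. */
    static void *map_persistent_buffer(GLsizeiptr size, GLuint *buffer_out) {
        const GLbitfield flags =
            GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;

        glGenBuffers(1, buffer_out);
        glBindBuffer(GL_ARRAY_BUFFER, *buffer_out);

        /* Immutable storage (OpenGL 4.4) is required for persistent mapping. */
        glBufferStorage(GL_ARRAY_BUFFER, size, NULL, flags);

        /* With GL_MAP_PERSISTENT_BIT this pointer stays valid across draw calls;
           with plain OpenGL 3.0-era glMapBufferRange you would have to unmap
           before the GPU could use the buffer. */
        return glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);
    }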

But, yeah, it's another OpenGL-style workaround for a workaround.

As someone who has recently converted a pile of OpenGL code to Metal, the big wins for me were formalised pipelines, resolving the threading question, and getting to be fully overt about which buffers are ephemeral at the point at which they're either loaded or unloaded from GPU cache.

Apple's tooling for Metal is also leaps and bounds ahead of where its macOS tooling for OpenGL ever was, especially with regard to profiling, debugging, etc, so it's nice to have that supported in a first-party capacity but I think that's probably just a comment on Apple's lackadaisical approach to OpenGL over the years. Other OpenGL-supporting environments do a much better job here — even iOS had pretty cool frame capture/edit/replay facilities for GL ES back when iOS still supported that.

Uncle Sam sues Facebook for allegedly discriminating against US workers in favor of foreigners on H-1B visas

ThomH

Re: Temporary visa holders often have limited job mobility

I was on an H-1B for four years, ending just inside the Trump period; during my H-1B I switched employer twice.

There's a filing fee for an H-1B transfer, and it has to be performed by the prospective employer, but there's no quota or interview and the paperwork is relatively small so all but the tiniest employers usually don't consider it too much of an obstacle.

I had in my mind that I wanted to live in at least a couple of places before my visa expired, and one of the moves was a bit of a misstep. But everything worked out in the end. I'm grateful to America for hosting me.

Apple's global security boss accused of bribing cops with 200 free iPads in exchange for concealed gun permits

ThomH

Re: Concealed?

As counterintuitive as it was to me, as a born-and-bred European, on my visit to Arizona I saw a lot of people openly carrying and it wasn't that concerning after a few hours. But I think that's because it was so common that everybody else was acting quite naturally, imparting a general sense of ease.

One guy had a bright silver six-shooter prominently strapped to his waist like some sort of cowboy (noting that he had no other cowboy accoutrements; it wasn't a costume), but, honestly, the thing looked so well-maintained and was so obviously a source of pride to him that it instinctively felt like he was probably also quite up on his safety and training. I don't really know anything about it, but it feels like that has to be an enthusiast's gun.

I'd probably have declined a hypothetical invite to his house though.

ThomH

Re: Cali is like leftist Europe, only the criminals are allowed guns

Texas and California are both nearly bang in the middle of the league when it comes to homicide mortalities per capita, so are poor data points. If you'll permit a broad correlation of level of gun control with which Presidential candidate each state voted for in 2020 then...

The five states with the highest rates are: Mississippi, Louisiana, Alabama, Missouri, New Mexico. Four reds and one blue.

The five with the lowest: Maine, Rhode Island, Vermont, New Hampshire, Nebraska. Four blues and one red.

The probability of 'blue, given that it is in the top half of the list' is 8/25. So if you say to me "this state is above the midpoint in terms of homicide mortalities", there is a 68% chance you're talking about a red state.

The bottom half of the list contains both Nebraska and Maine which split their votes, but if you bucket those according to who won the majorities then you end up giving each candidate 25 states each. So the numbers are symmetrical.

Therefore if you say to me "this state is below the midpoint" then there's a 68% chance you're talking about a blue state.

All that, of course, being relevant only if the initial assumption of a strong correlation between red/blue-ness and gun control strictness actually holds water. I might be falling for propaganda on this one.

Bloated middle age beckons: Windows 1.0 turns 35 and is dealing with its mid-life crisis, just about

ThomH

Ugh, I forgot about the Amiga; a disadvantage of allowing history to be rewritten by the victors — the main thoughts in my head were merely 'avoid Windows prior to 3.1', followed by 'avoid Mac OS 8 & 9, as even original-release Windows 95 has more chance of recovering from a crashed application'.

That said, I think the business world went for PCs just because of the wide array of vendors and the practicalities of procurement. Which is also one, but not the only, reason why you rarely see Mac-based workplaces today (outside of a couple of niches).

ThomH

No, I think it's genuinely the first. NextStep seemingly got there independently, but later.

Also props to RISC OS for being the first mainstream option to push vector fonts. Unless you started screwing up your system with proprietary Adobe nonsense, both Windows and the Mac were bitmap-only until the early '90s.

ThomH

Only if by 'ripped off' you mean: Apple paid Xerox in share options (shortly pre-IPO, so no shares yet), and Microsoft licensed relevant additional details from Apple.

In both cases the relevant companies proactively paid for what they were reproducing. Only Apple tried to turn around and sue afterwards, claiming that Microsoft's licence was limited solely to Windows 1.0.

The court found in Microsoft's favour in the latter case at least partly because they found that many of the things Apple claimed to own had only been licensed from Xerox, not purchased.

So I don't think there's even a moral case for 'ripped off' here, no matter how stretched. All copied ideas were paid for.

ThomH

Really?

Non-overlapping windows, most of which cannot be moved or sized. No substantial concept of a desktop — boots into a text prompt and runs one application at a time. Every application does more or less whatever it wants with the UI. No pull-down menus. No drag and drop at all — including for file manipulation. No concept of file associations, workflow is always application-first.

And, of course, it runs like a snail because the CPU is built of discrete components and it runs an interpreted language.

If you were to ask me to pick a computer for general office productivity purposes, between 1984 and 1990 it'd be any Mac, from 1990 to 1995 it'd be Windows or Mac, whatever, from 1995 to 2001 I'd rather use Windows, and after that it'd be back to whatever my employer was willing to provide.

CodeWeavers' CrossOver ran 32-bit Windows Intel binary on macOS on Arm CPU emulating x86 – and nobody died

ThomH

> Not really; you have an Apple account active and as you built the binary, the DRM, licenses and certificates were generated behind the scenes.

No they weren't. Notably because:

> "Starting with Xcode 12 beta 4, ...

As he clearly stated, he didn't use Xcode.

The signing tool that Xcode ships with is `codesign`, and from the terminal it must be invoked explicitly. Xcode merely automates it.

Use `codesign -d hello` if you really insist on being so tiresome in your counterfactual assertions. You'll see: "hello: code object is not signed at all".

ThomH

I think you're conflating Apple's new preferences for [i]distribution[/i] of software (i.e. as distinct from local build and run — once you're into the realm of 'this binary came from the internet... are you sure about this?'), which individual machines can opt out of, with the new architecture.

But that also means it's possible that I'm false bucketing things, so a citation would be appreciated.

ThomH

That's not what I'd call an open platform either. It's also notably completely disjoint from how macOS operates.

When you build and run a binary on macOS it runs the binary. If you live in a conjured world of fiction it might say something about provisioning profiles, I guess. But if we're playing that game then what I don't like about Windows is that whenever I build and run a binary it will give me the error:

"Your radiance is incoherent. Cannot dance, dance, dance, dance, dance to that radio."

ThomH

In macOS you can, amongst other things:

* download and run any software you want;

* download the development tools for free and write and run whatever you want; and

* use the terminal plus all the standard tools to, yes, write and run whatever you want.

Less useful things like X11 (useful on the Mac, that is) and Java may no longer be distributed directly from Apple but they remain available supposing you want them.

So if you want your ARM C compiler on a Mac then download Xcode, or install GCC or Clang in a more traditional UNIX fashion.

How Apple's M1 uses high-bandwidth memory to run like the clappers

ThomH

Right; it's exposed to all developers as Core ML and for now is slowly creeping into image, video and audio editors. Pixelmator Pro jumps on it for image processing, for example. It's not clear to me that there's more here than you'd get with a modern dedicated GPU on any other computer though so you're probably just looking at Apple optimising to do the task as best as can be done within the confines of a mobile SoC.

I'd be completely out of my depth trying to say anything beyond that.

Google yanks Apple Silicon Chrome port after browser is found to 'crash unexpectedly'

ThomH

I think they're trying to hit the marketing point that the Apple chip is more than just a CPU — it's an ARM processor plus GPU, machine learning bit, security stuff and RAM. Per the image on Anandtech, the GPU looks like the physically-largest part, although not by a lot, with the machine learning segment about half the size of the four performance cores.

That said, nothing yet that calling it an SoC wouldn't normally cover, and apply your usual reverence towards marketing-derived naming.

Microsoft unveils a Universal version of Office for Apple silicon

ThomH

Re: In the same week?

You say that, but Microsoft Office was pretty much the final thing to transition to Intel last time around; the first Intel Mac went on sale in January 2006 and the first public Intel build of Microsoft Office became available in January 2008.

That's a solid nine months after the other of the expected laggards, Adobe Creative Suite, which first reached users in April 2007.

Part of the difference is that Office is on a perpetual release cycle now whereas back then the Mac version was generally updated only every three or so years (previous versions were released in 2004, 2001 and 1998), but that didn't take much of the edge off.

ThomH

Re: This puzzled me a bit

Agreed entirely; my post was because the original poster seemed to have misinterpreted the article and I also thought it could have been a bit clearer.

Historical trivia: I had a quick search, and Office was never ported to Windows NT for PowerPC at all. Or in full to Alpha or MIPS, though Excel and Word made it to Alpha. Wikipedia believes that "[Microsoft] met problems such as memory access being hampered by data structure alignment requirements" which is a credible claim even if not necessarily a credible source.

ThomH

Re: This puzzled me a bit

I think it means that, on Windows, only the subscription version of Office currently provides ARM binaries.

The pay-once version is still a strictly-Intel build, and for running Intel code on ARM Microsoft supports only 32-bit binaries.

So as per the thrust of the article, Microsoft seems to be putting a lot more effort into supporting ARM processors in Macs than in PCs.

Python swallows Java to become second-most popular programming language... according to this index

ThomH

Re: Sin tax

The advantage of the ternary is that it requires that both options evaluate to the same type and avoids accidentally omitting the assignment in one branch or the other.

As with anything there's a risk of abuse, in this case through nested ternaries and statements with side effects, but in the intended use it's a less repetitive and equally readable way of expressing your intent.
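
A trivial illustration in C, nothing project-specific:

    #include <stdio.h>

    int main(void) {
        const int temperature = 31;

        /* One expression, one assignment: both arms must produce a value of a
           compatible type, and neither arm can silently skip the assignment. */
        const char *label = temperature > 30 ? "hot" : "mild";

        /* The if/else spelling repeats the assignment, and would still compile
           if one branch forgot it and left label2 uninitialised. */
        const char *label2;
        if (temperature > 30) {
            label2 = "hot";
        } else {
            label2 = "mild";
        }

        printf("%s / %s\n", label, label2);
        return 0;
    }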

Geekbench stats show Apple Silicon MacBook Air trouncing pricey 16-inch MacBook Pro

ThomH

I remember the 2015 Retina MacBook because I still use it.

That said, it's on its third logic board because they kept melting. Luckily only while it was still under warranty and not for the last couple of years, but nevertheless I'd be a little cautious.

If the reviews check out, I'll probably buy one of the Mac Minis. I guess we'll see.