Re: Burn
He's estimated to be worth more than $200bn so he can find the $1bn breakup fee if he has to, but I also wonder whether he's antagonising exactly the people who constitute the majority of his customers.
I think it's the opposite; if he bought Twitter and chose very selectively not to apply its various moderation policies, he could continue his ingratiation with the Republican Party and thereby gain such favours as he desires next time the pendulum of government swings back to them.
See also: the drama of the Texas AG vs Twitter, with Texas 'inexplicably' using government clout to further Musk's objectives.
> But they aren't in a special area. They are just on the desktop, and in those versions, they were the only things allowed on the desktop.
I think that's a slender distinction: they're not in a special area, but they are in an area that only they are allowed to be in.
That said:
> It's not a bar.
There it is: I lose. You can't invent the taskbar with something that isn't a bar. End of story.
> They actually shoot the actors on green screen and use a computer to replace the green (color humans don't have)
That's clearly not what the author is referring to; suggest you learn more about modern filming processes, such as filling a room with LED screens and rendering the effects directly onto those.
Especially in the MacBook Air, Apple did not have a good history of using Intel’s fastest — even when maximally configured, Apple’s final Intel-based Air declined the top-of-the-line mobile i7 for TDP reasons, using the 9W i7-1060NG7 rather than the 15W i7-1065G7.
So getting this level of performance _even in the base model_ is a huge performance leap for Apple customers.
Linus himself uses the quotation marks so I appreciate it's not to be read naively, but in what sense is the generic ARM stuff more 'multiplatform' than Linux has been until now? Is it about supporting something that's a little less cohesive than a traditional hardware platform, i.e. various vendors all providing very similar runtime environments but doing whatever they feel like with regards to startup — boot loaders, device trees, etc?
> Apple isn't a monopoly, Raspberry pi exists
As said by absolutely no-one.
However, Apple isn't a monopoly. Android exists. And the EU is drafting new legislation to address the damage to this market specifically because it can't just use its existing anti-competition law, because Apple isn't a monopoly. The EU believes it is doing harm without being a monopoly, as does the author.
Except it obviously doesn't, since Microsoft had control over 90% of the market, whereas Apple has control over around 20%.
... and that's why the EU is looking at the problem in terms of new legislation, to determine what they think overall market fairness requires, rather than targeting Apple specifically via anticompetition law, which can already be used to attack misuse of a monopoly position.
Microsoft had a monopoly, Apple doesn't. The rest of the world can innovate, whether Apple likes it or not.
> I would be more interested if they allowed >1 monitor to connect (where the monitors use DP).
The base M1 supports two monitors. The Pro and Max support four. The Ultra supports five.
No version of the M1 is limited to a single monitor.
However, this may be the faintest praise that anybody has ever posted, and I'm happy to admit that it took several minutes of searching to navigate Apple's confusing naming. Shouldn't the 'Max' be the best one by definition? And who are the better chips for, if not 'Pro's?
Nevertheless I remain very happy with my M1 Mini.
Yeah, NT4 was the one where they moved the GDI, along with print and video drivers, into kernel mode — buying both a speed boost and a step backwards in stability, especially as NT drivers weren't exactly anyone's priority at the time.
If memory serves, Windows Vista introduced the current model of putting only a thin shim into kernel mode and doing the overwhelming majority of the driver's work in user mode.
... and non-compete agreements are explicitly unenforceable in California, for the general public-policy reason that people shouldn't be able to bargain away their ability to engage in a lawful profession.
On the other hand, he'll be under a pile of NDAs and wouldn't have risen anywhere close to as far as he already has if he'd been the sort of person who obviously doesn't honour them.
Ugh, yeah. I have a Mac and in the past had an iPad, but my Kindle was always first choice for PDFs, despite the paperback-sized display and the awkward panning and zooming that the e-ink refresh rate makes of it, precisely because I can just drag and drop to it.
Well, that and not wanting to read off LCD when it can be avoided.
I haven’t had an iPad for the better part of a decade because I never really found any other use for it either.
Which work are you having difficulty with? Lack of USB input on the phones is the only thing I can think of that would push you to an Android rather than an iOS device in terms of productivity.
Otherwise, Office and Photoshop and Exchange and Slack and everything else is no big issue. Apple even finally started offering specs for AirPlay to partners a few years ago, so you can screen cast to your Roku, Samsung TV, etc.
I seriously can't think of a strong objective argument to prefer an iOS device over an Android or vice versa these days. It's just marginal preference amongst a sea of unexciting devices.
I suspect the unwarranted linkage to preemptive multitasking may relate to the reason that Chen got involved at all — per his blog entry he was chasing up on crash reports, and SoftRAM not only didn't actually compress but was also largely based on out-of-date Windows 3.1 DDK sample code, which, being for Windows 3.1, made no effort to be thread safe.
So SoftRAM would crash hard on Windows 95 as soon as a lot of processes started hitting memory issues at once.
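For anyone who hasn't been bitten by that class of bug, here's a minimal sketch of the failure mode (entirely hypothetical names, nothing to do with SoftRAM's actual source): a shared structure updated with no locking, which is fine under Windows 3.1's cooperative scheduling but falls apart the moment a preemptive scheduler interrupts one writer halfway through.

    // Illustration only: a free list maintained with no synchronisation,
    // in the style of code written for a cooperative scheduler. Under
    // preemption, two threads can interleave the two-step push/pop and
    // lose nodes, duplicate them, or leave a dangling pointer behind.
    #include <thread>
    #include <vector>

    struct Node { Node* next = nullptr; };

    Node* free_list = nullptr;        // shared and unprotected: the whole problem

    void push(Node* n) {
        n->next = free_list;          // a context switch between these two
        free_list = n;                // lines corrupts the list
    }

    Node* pop() {
        Node* n = free_list;
        if (n) free_list = n->next;   // may read a half-updated pointer
        return n;
    }

    int main() {
        std::vector<Node> pool(1000);
        for (auto& n : pool) push(&n);

        // Two "processes hitting memory pressure at once", in miniature.
        auto churn = [] {
            for (int i = 0; i < 100000; ++i)
                if (Node* n = pop()) push(n);
        };
        std::thread a(churn), b(churn);
        a.join();
        b.join();
    }

Run that with real contention and the list can end up corrupted; run the equivalent Win16 code under Windows 3.1, where nothing ever got interrupted mid-update, and the bug simply never shows.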
This is exactly right; IBM threw lawyers at absolutely everyone making a PC clone for a long time. The breakthrough was finally achieving 100% BIOS compatibility without using any of IBM’s original code, and with the necessary legal evidence to prove that the people who wrote the new code had never laid eyes on IBM’s.
IBM’s solution to that was to double down on lawyers and create the MCA bus and the rest of the PS/2 that would be much easier to protect against clones. But the horse had already bolted.
It has fantastic sound hardware — for me that’s the only bright spot. Otherwise it’s a framebuffer-only machine with nothing even close to the grunt necessary to do decent animation and a vertical resolution too low for productivity, and the memory layout is so arcane that even the official documentation names one of the many overlapping registers as the quagmire state.
The 65816 ends up being a net detriment because it has the same inefficient memory access patterns as the 6502 (there’ll always be two reads for single-byte instructions, read-modify-writes always have a spurious access in the middle, etc) but in a machine where large chunks of the address space are behind a 1MHz bus.
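To put rough numbers on it: on the original 6502 even a single-byte instruction like INX takes two cycles, the second being a fetch of the following byte that just gets thrown away, and a read-modify-write like INC $2000 takes six: opcode fetch, two address bytes, a read of the data, the spurious access while the increment happens internally, then the write of the result. As far as I recall the 65816 keeps essentially the same bus pattern, so on this machine plenty of those wasted accesses land on the slow side of that 1MHz bus.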
I’m a paid-up member of the Apple ecosystem, but I strongly doubt there’s a blame-the-user angle here, whether physical or otherwise.
I give it 99% odds that a bug in the software is to blame, whether the OS itself or one of the firmware updates that Apple bundled with the OS.
If I dare jump in: Microsoft aren't doing that bad a job, in my opinion.
Google also can't make a plan and stick to it, but in that case the users pay as whatever Google is abandoning simply ceases to be. Microsoft's discarded frameworks at least continue to function.
Apple can make a plan and stick to it, but that plan usually involves a large amount of technology churn and the assumption that developers will keep up. As a developer you at least never get stranded by a complete horse change, but as a user you can still expect unmaintained applications to expire.
UWP did require a non-standard compiler if you wanted to target it from C++, but a non-standard C++ compiler was never a requirement of UWP as such, because it was always mainly for the C# crowd.
Microsoft not only could be better, but is: C++/WinRT is the standard C++17 way into WinRT, provided as a header-only library for any old compiler.
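For anyone curious, the hello-world looks something like this (from memory, so treat the exact calls as approximate): it builds with plain /std:c++17 and a link against WindowsApp.lib, no /ZW or other language extensions in sight.

    // Roughly the canonical C++/WinRT starter: standard C++17, no C++/CX.
    #include <winrt/Windows.Foundation.h>
    #include <cstdio>

    using namespace winrt;
    using namespace Windows::Foundation;

    int main()
    {
        init_apartment();                          // initialise the WinRT apartment
        Uri uri(L"https://www.theregister.com/");  // an ordinary WinRT class, projected as plain C++
        wprintf(L"%ls\n", uri.Domain().c_str());   // WinRT's HSTRING surfaces as winrt::hstring
    }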
Alas, I have absolutely no idea how UWP maps to WinRT, how either corresponds to Win32 or .NET, or what WPF has to do with any of it. All I really know is: don't mention Silverlight.
I think Reunion is meant to clarify, even to idiots like me.
Here's what I usually see when a Mac user tries to create a shortcut to a network share:
They drag the network share icon to where they want the shortcut, holding down the command+option keys, and release.
Spoiler: if they're doing something that involves connecting to a network share, in an environment that hasn't already been dummified, they probably know how to use a computer.
Space Invaders is also 8080 based; as documented here, Dave's emulator has a main loop of:
• call the check-for-interrupt microservice; schedule RST x if so;
• otherwise, call the memory bus microservice to get the next opcode;
• call the appropriate opcode microservice as per the decision above.
The opcode microservices are implemented in a range of different languages: Swift, JavaScript, Visual Basic, C#, TypeScript, Python, Ruby, Perl, Java, Lua, Scala, D, F#, Kotlin, C++, Rust, Nim, Crystal, PowerShell, Deno, Go, Haskell, C and Dart.
And, yes, it's satire, but it's all really implemented and then profiled and discussed.
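Strip away the cross-language plumbing and the loop itself is the bog-standard fetch/dispatch you'd write in any single-language 8080 emulator; a rough sketch, with every name invented here rather than lifted from the project:

    #include <cstdint>
    #include <optional>

    uint8_t memory[0x4000];            // 16K covers Space Invaders' ROM, RAM and video RAM
    uint16_t pc = 0;
    bool interrupts_enabled = true;

    // Stub: the real cabinet raises two interrupts per frame (mid-screen and
    // vblank), each delivered as an RST instruction.
    std::optional<uint8_t> check_for_interrupt() { return std::nullopt; }

    uint8_t read_bus(uint16_t addr) { return memory[addr & 0x3FFF]; }

    // Stub: one handler per 8080 opcode would live behind this.
    void execute(uint8_t opcode) { (void)opcode; }

    int main() {
        for (long steps = 0; steps < 1000000; ++steps) {
            if (interrupts_enabled) {
                if (auto rst = check_for_interrupt()) {   // 1: interrupt pending?
                    execute(*rst);                        //    run the scheduled RST
                    continue;
                }
            }
            uint8_t opcode = read_bus(pc++);              // 2: fetch via the memory bus
            execute(opcode);                              // 3: dispatch to the opcode handler
        }
    }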
macOS may be more consistent but it still has its issues.
In every standard text box across every app: command+left and command+right act as home/end do on Windows, taking you straight to the beginning or end of a line.
But, in Terminal: command+left and command+right switch between open terminal windows. You of course have to remember to use the UNIX standards of ctrl+a and ctrl+e instead.
What's even more annoying is that the command+left/right terminal window switch isn't constrained to the current virtual desktop, unlike the OS-standard command+` for switching between windows of the active application.
So forgetting that terminal has decided arbitrarily to do its own thing can be very jarring.
Not to disagree with the main part of this comment, but the menu bar always relates to the frontmost application. If that application has any windows at all, they'll be on top.
That aside, I set my menubar to hide automatically and rarely mouse up to it other than when heading for the Apple menu for system preferences and suchlike; even when I want something from the menu bar it's almost always easier and faster to use the keyboard shortcut — either the specific one or command+shift+/ to open the search box.
I have two relevant suspicions:
Firstly, that Apple simply considers the menubar to be part of the brand. It made an awful lot of sense in 1984, when the Mac had a 9" display and ran exactly one application at a time, and from there the classic OS was stuck with it by the same complete lack of forward-planning that also prevented the addition of memory protection, preemptive multitasking, etc. So I'm imagining it was then carried forward to OS X because, you know, Macs have menubars, and if you're Apple circa 1999 then how many more of your customers can you afford to lose anyway?
Secondly, that almost everybody who writes a Mac app nowadays makes sure to put almost nothing in the menubar, because it's so disconnected. Even floating tool windows seem to be out of fashion. macOS just seems to be a little behind here; I think the same instincts finally fully manifested on Windows once we all accepted that the multiple-document interface — the old big parent window with multiple child windows, pull-downs belonging to the parent but acting on the currently-selected child, like a desktop within a desktop — was a terrible way to show multiple documents.
Pedantically: Xerox got a lot wrong; it's the descendants that built a lot of what we now recognise as a desktop.
The Xerox machines used fixed-location icons and fixed-location windows, and were fully modal for all manipulations — no drag and drop anywhere, ever, including something as simple as moving files from one directory to another. Everything on-screen was so fixed in place that the machines didn't even offer pull-down menus.
I'm not even sure you could call the modern desktop nailed into place until the 1990s: Mac OS was single-application prior to System 5 — you'd launch an application, it'd get the whole screen, you'd exit it to go back to the Finder, etc — and still so by default prior to System 7, and Microsoft kept everything inside its Program Manager box rather than offering the desktop as much more than wallpaper prior to Windows 95.
It's not forcing them, it's at most coercing them.
But that aside, it feels proportionate to me. Typhoid Mary lost her liberty for egregiously refusing to believe that she was highly infectious, evidence be damned.
So just losing your phone during a public medical emergency because you refuse to act rationally and are potentially endangering others is fair enough.
You know, given that it's already well-established that the liberty of individuals can be restricted for the benefit of society, and that actions should be punishable orthogonally to consequences.
... though it also sounds like an awesome idea for a cartoon.
Act one: villains invade only a small area of the globe.
Act two: everyone's like "Oh, that's just Ted's zone, who cares about Ted anyway?"
Act three: everyone remembers that they value and respect Ted, and turn up to help him save the day.
Worldwide Loyalty Team, Ho! Go go Worldwide Loyalty Team! This is Nancy from Worldwide Loyalty Team, if you guys don't stop messing around then your time is going to run out!