Jeez
Just as well everyone is diligently monitoring for updates and applying them as soon as possible.
Microsoft is urging everyone to install an emergency security update for all supported versions of Windows to fix a remote-code execution vulnerability. The vulnerability was found and reported to Microsoft by security researchers poring over internal memos leaked online from spyware-maker Hacking Team. This …
"Microsoft Windows does not release any OpenType fonts natively. However, third-party applications could install them and they could be affected by this change"
Sure the library is broken, sure it might well be Adobe's shit code, but the decision to run a Font library in kernel mode was all Microsoft. This particular class of problem has been pointed out to MS and the user community on numerous occasions, going back to at least NT 4.0.
I am hoping the fix stops running that library in kernel mode in addition to fixing the code, but the fact that MS & their fans have expended more energy on burying the bad news than fixing the problem so far doesn't give me much hope.
Also, how about this crazy idea: perhaps some servers don't need a GUI at all and can do without a font attack surface in kernel mode. I understand that for many computing roles it may be necessary to have some of the graphics subsystem run in kernel mode. You are going to have a tough time convincing me, however, that this is anything but ancient legacy gunk that Microsoft knows is not good, but which, being basic infrastructure, would be prohibitively disruptive/expensive to fix right now.
Sure, for a *server* it's wrong - actually, a server could even work without a graphics card.
For a desktop, it's needed to achieve the required performance, and not just for gamers. From some perspectives, games are even easier to manage because they usually get exclusive use of the screen. By the same token, it's more complex to handle multiple overlapping applications, each with windows needing rendering.
That's why the next version of Windows will introduce a *server* version without a GUI.
And that's why Linux is introducing newer graphics frameworks running in the kernel - if you want performance on the desktop you can't do without it.
>>> Perhaps some servers don't need a GUI at all and can do without a font attack surface in kernel mode.
But but but ... all compute devices should be running a unified OS. I mean it's not like there's an actual *reason* to have different server, desktop, tablet and phone capabilities. I mean they are all used for the same thing...
>But but but ... all compute devices should be running a unified OS. I mean it's not like there's an actual *reason* to have different server, desktop, tablet and phone capabilities. I mean they are all used for the same thing...
I hope the upvoters actually caught your sarcasm instead of agreeing literally.
It's not impossible. It's slooooooooooooooooooow.
That's why Linux is used by 1.46% of desktop users, and mostly by people who don't go beyond an SSH shell to a server. Try to use anything graphical and it's sluggish compared to other systems.
Just take the time to look at how font rendering works, and how a graphics card works, and maybe you'll understand why. And you'll also learn why most Linux desktops and window managers are pure crap.
"That's why Linux is used ... mostly by people who don't go beyond an SSH shell to a server."
This is a new and ingenious one. Is this your own material or do you have a script-writer?
Back in the day I did use Windows for more or less this purpose. The first example was when Visionware in Leeds did a nice package which included Windows/286 or 386 (look it up - it was a thing) and an X-server; it was a very good way of getting multiple sessions from a PC.
"Sure, Linux doesn't use Direct Rendering Manager in kernel, does it?
Face it, the amount of processing required by actual applications means most pixel calculations and settings have to happen close to the VRAM and GPU..."
If anyone is genuinely interested in finding out a bit more around the topic, I suggest they read some of the papers on how SGI implemented their early 3D accelerator hardware, drivers & libraries. Might be a bit of a hunt - they were published in the early 90s; I think I found them in IEEE Computer Graphics & Applications back in the day.
Anyway - in SGI's case performance was a tougher problem, as they were working with slower silicon than the NT 4.0 bods, and yet they decided to pay a *lot* of attention to stopping people from cracking the kernel and applications via the graphics hw & libraries. I can't believe that all those techniques passed Microsoft by, especially as they actually *hired* some of the SGI folks... Perhaps MS were simply too cheap to license the tech.
>Anyway - in SGI's case performance was a tougher problem, as they were working with slower silicon than the NT 4.0 bods, and yet they decided to pay a *lot* of attention to stopping people from cracking the kernel and applications via the graphics hw & libraries. I can't believe that all those techniques passed Microsoft by, especially as they actually *hired* some of the SGI folks... Perhaps MS were simply too cheap to license the tech.
Well, shoving the whole GDI system haphazardly into kernel space had to be a lot cheaper, and considering most of their customers don't know the difference, which company is still around in the end (in all but name)? Sad but true. People did learn something, though, and haven't shown much interest in Microsoft's engineering in mobile.
"Sure the library is broken, sure it might well be Adobe's shit code, but the decision to run a Font library in kernel mode was all Microsoft."
Well, consider that font handling is a basic OS function (meaning it gets used all the time) AND that graphics drivers are in kernel space for performance reasons, how else are you going to get smooth and speedy font rendering without tons of time-wasting context switching?
NT was a server OS; it doesn't really matter how fast a server OS repaints windows or scrolls the console screen. I don't remember NT 3.51 being noticeably slower than Windows 3.1 for general use, and as it was a server OS nobody was running Photoshop on it.
They should have gone with the idea of merging the Windows 95/NT lines into XP but having two XPs. A server XP done properly, a desktop XP which had stuff in the kernel if necessary, and one compatible API to rule them all. Instead their idea of a server OS is the same as a desktop OS with a registry key to be able to use Active Directory.
"Well, consider that font handling is a basic OS function (meaning it gets used all the time) AND that graphics drivers are in kernel space for performance reasons,"
I suspect that convenience and slinging the software out of the door as fast as possible also played a part.
"how else are you going to get smooth and speedy font rendering without tons of time-wasting context switching?"
There are a number of techniques you can use to reduce context switching without running complex application code in the kernel. Two of the simple and obvious ones are:
1) Build up a display list (usually made up of primitives), then render the list all in one go.
2) For fonts and other oft-replicated items you can cache the rendered glyphs so you don't need to keep re-rendering them. Some systems even cached glyphs in off-screen display memory as well.
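For anyone who wants the two techniques above spelled out, here's a minimal user-space sketch. Everything in it is hypothetical (the `GlyphCache` and `DisplayList` classes, the string "bitmaps") - it's an illustration of the idea, not any real windowing API: rasterise each glyph once and reuse it, and batch primitives so they cross into the driver in one go rather than one transition per draw call.

```python
class GlyphCache:
    """Cache rendered glyph bitmaps keyed by (font, size, char)."""
    def __init__(self):
        self._cache = {}
        self.render_calls = 0  # counts the expensive rasterisations

    def get(self, font, size, char):
        key = (font, size, char)
        if key not in self._cache:
            self._cache[key] = self._rasterise(font, size, char)
        return self._cache[key]

    def _rasterise(self, font, size, char):
        # Stand-in for the expensive outline -> bitmap step.
        self.render_calls += 1
        return f"<bitmap {font}/{size}/{char}>"


class DisplayList:
    """Accumulate primitives in user space, then submit them as one batch."""
    def __init__(self, cache):
        self._ops = []
        self._cache = cache

    def draw_text(self, x, y, font, size, text):
        for i, ch in enumerate(text):
            bitmap = self._cache.get(font, size, ch)
            self._ops.append(("blit", x + i * size, y, bitmap))

    def flush(self):
        # In a real system this whole batch would cross the user/kernel
        # boundary once, instead of once per primitive.
        batch, self._ops = self._ops, []
        return batch


cache = GlyphCache()
dl = DisplayList(cache)
dl.draw_text(0, 0, "Times", 12, "hello")
dl.draw_text(0, 16, "Times", 12, "hello")  # second line: all cache hits
batch = dl.flush()
print(len(batch), cache.render_calls)  # 10 blits, but only 4 rasterisations
```

Two lines of "hello" generate ten blit operations but only four rasterisations (h, e, l, o), and the whole lot is submitted with a single flush - which is roughly how you keep font rendering fast without parking the rasteriser in the kernel.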
If you want to learn more there are a lot of books & papers out there on the topic, and there are millions of lines of production code you can read through (for free). In my case I used to religiously read through every copy of IEEE Computer Graphics & Applications and every databook I could get my hands on. Hardcopies of early-90s CG&A may be scarce now, so it might be worth a look at computer.org to see if they have digital editions of the back-issues. If you are lucky you will find a corporate tech library or university that will be only too happy to have you take away all their old copies - just ask them.
"Well, consider that font handling is a basic OS function (meaning it gets used all the time) AND that graphics drivers are in kernel space for performance reasons, how else are you going to get smooth and speedy font rendering without tons of time-wasting context switching?"
I think this is the root cause: x86 is dreadful at context-switching which is why the decision was taken to put stuff that had deliberately been kept out of the kernel into it. I suspect it didn't make as much difference on the DEC Alphas that early on were given equal status to x86. Sigh, another instance of where the Wintel duopoly stifled innovation and quality.
Believe it or not, it wasn't a gamer requirement. It was a requirement for complex graphical applications, including DTPs, vector and bitmap graphics, CAD, etc.
Try to scroll a complex document with lots of text in different fonts, antialiasing, kerning, etc., and some complex graphics, and try to render it smoothly while the user scrolls or zooms it...
Then ask yourself why none of such applications are available for Linux - I mean *professional grade* applications...
I could argue that as well, but fine, let Windows be used for engineering workstation stuff. I am also grateful that the manufacturing production servers I support don't have a bunch of legacy Windows 3.11 crap shoved into the kernel (not Linux either, as Red Hat is now going down the same desktop-first-and-only path). The "Where have you been for the last 20 years?" post below nails the issue.
"Try to scroll a complex document with lots of text in different fonts, antialiasing, kerning, etc., and some complex graphics, and try to render it smoothly while the user scrolls or zooms it...
Then ask yourself why none of such applications are available for Linux - I mean *professional grade* applications..."
Then ask yourself why so many of those applications were originally Unix applications ported over to Windows NT. In some cases there were Linux ports as well, since one reason for Linux's success was the ease of porting from Unices. Then ask yourself whether the real reasons were commercial rather than technical.
"Try to scroll a complex document with lots of text in different fonts, antialiasing, kerning, etc., and some complex graphics, and try to render it smoothly while the user scrolls or zooms it..."
"But... but ... MUH OPTIMIZATIONS! I can't do it! HERP! DERP!"
I agree the situation would be completely hopeless if practically the whole company consisted of low-grade fakers unable even to understand how this "Operating System" they supposedly own actually works. Well thought-out optimizations and proper architecture would be right out, and everybody would think it a good idea to shit all over everything and do insecure stuff where it shouldn't be done.
As I suppose this is not the case at MS, some other factor must have been very important.
"It's the financial models that limit the kind of "professional" apps that Linux can support as well as the overall success of the OS."
What are you talking about? Commercial software sells on its own merits. They don't have to worry about how the OS is distributed. And there's absolutely nothing preventing you from selling commercial software for Linux. Licensing is on a program-by-program basis. The only things they're concerned about are appropriate market penetration and support. That's why there was tons of Mac-only software in the old days: because it was the tool of choice of certain professional niches.
"What are you talking about? Commercial software sells on its own merits. "
Agreed. For years I've been using seriously expensive protein modeling software that was written for Unix/SGI/Linux. The sort that needed a license server (or worse, only had a few copies available via a token system).
"It was a requirement for complex graphical applications, including DTPs, vector and bitmap graphics, CAD, etc."
Would you include video in this? In my household TV is very rarely watched directly, but via MythTV. Because of the constraints of a domestic environment this runs on an old fan-free Intel mini-ITX board with just standard Intel graphics, simultaneously shuffling multiple streams from the receivers (note the plural) onto disk and the watched programme off it.
I don't have many requirements for some of the other stuff you mention although LibreOffice & PDF viewing works quite well under KDE on the Debian laptop on which I'm typing this.
But I believe that another Unix-derived system is quite popular for such complex graphics. You may have heard of it. It's called OS X & comes from a little company called Apple. As they use a Mach-style kernel, where the principle is to shove as much stuff as possible into userland, I'd be surprised if it handled fonts or the like in kernel space, and I'm sure there are plenty of folk here who can give chapter & verse on that.
Twenty years ago, I was fiddling around with "fly through" views of 3D CATIA datasets at Boeing. Targeted at the high end IBM AIX workstations, they ran just fine on my little Dell PC running Slackware/X11 (back in the 1.2 kernel days). And these were being generated by clients running remotely (20 miles away, over leased lines).
Back then, the problem with running full up CATIA on Linux boxes were proprietary I/O drivers patched into the AIX X11 implementations, mainly for the specialized input h/w. Fast forward to today: Most of the high performance graphics stuff is licensed to individual apps (games, CAD systems, etc.). Ask for a license to hook some proprietary GPU API to an open X11 server? Forget about it. Graphics optimization must still be done on a per application basis. It's just buried deeper in Windows than in Linux.
"Believe it or not, it wasn't a gamer requirement. It was a requirement for complex graphical applications, including DTPs, vector and bitmap graphics, CAD, etc.
Try to scroll a complex document with lots of text in different fonts, antialiasing, kerning, etc., and some complex graphics, and try to render it smoothly while the user scrolls or zooms it...
Then ask yourself why none of such applications are available for Linux - I mean *professional grade* applications..."
DreamWorks might disagree with you, for one example...
Files run in the kernel in Windows. The Pope is Catholic. ATMFD.dll is such a file. Ursines defecate in areas surrounded by trees. Changing that particular process would mean that far too many third-party programs would have to be changed. Fonts are a particular issue as they require both software (documents) and hardware (printers) access to the system. If not, things would crawl when printing or displaying documents.
It's all so "luvvies" can make pretty Powerpoints and Word docs.
We've all been down this road before. Have you been living in a bunker somewhere for the last 20 years?
"Files run in the kernel in Windows" - with such an incomprehensibly vague and bizarre comment, I do wonder if you might bow out of further discussions. You'll only irritate those who understand stuff, and confuse those who don't.
How about "Flies run in the wake of the mature banana." Perhaps you could pick that up as your mantra and run about the woods at night. That'll keep you nicely out of the way of ordinary, decent everyday folk.
>Files run in the kernel in Windows. The Pope is Catholic. ATMFD.dll is such a file. Ursines defecate in areas surrounded by trees.
His point, I think (admittedly the writing is confusing), is to state some obvious things up front to make his case. More interesting to me is his implication that, in general, desktop optimization comes before all other competing interests, even in the core of the OS (which is so wrong, but also where Linux is headed).