Managing a fleet
Have you checked Ansible?
Sounds fun, got to try it to see if it can compile my own old Pascal compiler project (an attempted port of P-pascal that has gathered dust for about 30 years now...). But Win16 support? OK, if it scratches someone's itch, but it is hard to see why one of the worst programming environments ever, which isn't even "cool" by any measure, still excites anyone.
> If I'm replying to specific points in an email (or on a forum) I'll inline my replies with the individual points I'm addressing.
Yes, that is how we used to use email in days of yore. But Microsoft email clients make it maddeningly laborious. Even worse, some variants (at least the web one) may actually hide your nicely written mail if you quote material and reply under it, because the client thinks everything under the quote is part of the old thread, and helpfully hides it.
That is why, after decades of fighting it, I gave up, and started top-posting in work emails.
Microsoft really broke the classic email system.
Exactly. If you want to actually be in control of the computer you own, Linux, FreeBSD, NetBSD etc. are the only option. (Or rather, the only up-to-date option: you could also run MS-DOS, DR-DOS or very old versions of Windows, which could be sufficient depending on what you do. I heard George R.R. Martin used MS-DOS and WordStar to write the Game of Thrones books.)
> If you WANT that kind of "protection" (from pay-for certs by developers) it should be OPT IN ONLY,
In fact, you do get a certain degree of protection of that kind for Linux binaries, if you use only programs from the official repository of the distribution. These are signed. And it is definitely an opt-in mechanism.
> If the company believes there's business value in using that application, it will evaluate it, package it and deploy it in a secure manner.
And then forget to keep it up-to-date, so users get some turd 1.0 version while the developer is going at 3.1... Seen this happen. Fortunately the company I work at has not tried to enforce downloading only from the "software central", and in any case I now run Linux on the work computer, which they are enlightened enough to permit.
If Lenovo makes sure both Red Hat and Ubuntu run on their systems, then installing other distributions is likely to be painless. Recall it is about hardware support, and the part that talks to the hardware - the kernel - is shared infrastructure across all distributions. Even if Lenovo were to release some support only as blobs and closed kernel modules, instead of donating it to the kernel.org kernel, they really cannot make those usable only on Red Hat and Ubuntu. And why should they even try?
I think the free (as in no cost) aspect was less important than the fact that source (*) and documentation for Unix were available. You could set assignments like "add feature X to Unix". Try doing that with VAX/VMS.
(*) Strictly speaking, Unix was not "open source" as we now know it, but its source was easily available to academic institutions. Berkeley reimplemented bits and pieces and eventually released the result under a liberal open source license, leading to a lawsuit by AT&T, which was settled in 1994. The long-standing uncertainty about the legal status of BSD is one reason Linux is the most used free OS, and not FreeBSD or NetBSD...
Another implementation that was used in universities was Berkeley Pascal, which came with BSD Unix. As a student, I wrote some lengthy pieces of Pascal code in it for analysing Petri nets. (Now I hardly remember what those things were). This implementation produced native code for the VAX, and had enough nonstandard additions to make it just about practical for programming.
Compared to the alternatives available (K&R C and Fortran), it was the best choice for this task, in the sense that the resulting program was more likely to be correct.
> He could of course always open up on TikTok.
I noticed that gab.com posted a thread advertising themselves below one Trump tweet. They said they already had created an account for Trump.
Really wish he and his followers took their marbles there. This would improve Twitter considerably.
The book does not mention buying the 387, and at the time it was a rather expensive add-on, considered useful only for people with number-crunching needs. The Linux kernel also included 387 emulation code from quite early versions. So my guess is he did not have one in his first 386 computer. But further research is needed to be sure.
He had a Sinclair QL, but started writing Linux only after getting a 386-based PC. The man himself describes shopping for it:
"January 2, 1991. It was the first day the stores were open after Christmas, and my twenty-first birthday, the two biggest cash-generating days on my calendar. [...] It was at one of these small corner shops, sort of a mom-and-pop computer store, only in this case it was just pop. I didn't care about the manufacturer, so I settled on a no-name, white-box computer. The guy showed you a price list and a smorgasbord of what CPU was available, how much memory, what disk size. I wanted power. I wanted to have 4 megabytes of RAM instead of 2 megabytes. I wanted 33 megahertz. Sure, I could have settled for 16 megahertz, but no, I wanted top of the line." (from "Just for Fun", Chapter IV).
> I've got a photo somewhere of his old student 486, which is on display in a museum in Helsinki.
Thanks. I did not know it was on display (and I live in Helsinki). Definitely on my list of places to visit when museums reopen. Some info from the University of Helsinki web pages:
"The Power of Thought" is a permanent exhibition of the University of Helsinki and its students, teachers and researchers. [...] Later objects in the exhibition include a computer used by Linus Torvalds and a student boilersuit from 2007. The exhibition is situated on the 3rd floor of the main building of the University (Fabianinkatu 33).
The comment of the main program in GWMAIN.ASM is interesting as it says who wrote the original MS BASIC and when:
COMMENT *
--------- ---- -- ---- ----- --- ---- -----
COPYRIGHT 1975 BY BILL GATES AND PAUL ALLEN
--------- ---- -- ---- ----- --- ---- -----
ORIGINALLY WRITTEN ON THE PDP-10 FROM
FEBRUARY 9 TO APRIL 9 1975
BILL GATES WROTE A LOT OF STUFF.
PAUL ALLEN WROTE A LOT OF OTHER STUFF AND FAST CODE.
MONTE DAVIDOFF WROTE THE MATH PACKAGE (F4I.MAC).
*
> Algol 60 begat both Algol 68 and Pascal ... and quite a few other languages along the way (Simula, anyone?).
I never used Algol 60, but I did do an exercise in SIMULA-67 at the Helsinki University of Technology. It had the same syntax as Algol 60, but added classes with dynamically allocated objects, a garbage collector (like Java decades later...), strings, and a sensible I/O library.
It actually felt like a far more practical language than the Pascal compiler used in earlier courses. Pascal at the time omitted too many real-world features. It was impossible to write a portable program that processed a named file. In fact, even a portable program that reads a string from the terminal and prints something in response was impossible, because the INPUT stream was defined to work in a way only a theorist would love. Every implementation had a different workaround for this, or just redefined the I/O semantics, as Turbo Pascal did.
> and yet, for a while, at least, one could run Win16 software under Wine on 64-bit Linux.
Interesting. Perhaps it used instruction set emulation for the 16-bit code.
For Linux, there is a fork of DOSEMU (an old VM86-based system for running MS-DOS) that handles 16-bit code on 64-bit Linux using the modern virtual machine features, instead of VM86. I have yet to try this project myself, but here it is: https://github.com/dosemu2/dosemu2 .
> I think AMD's decision was fairly sensible; 16 bit modes had to go, and if not then, when?
AMD did not remove 16-bit support entirely. At start-up, even a 64-bit x86 CPU runs in "real" 16-bit mode. You could in principle boot it into MS-DOS, but I suspect modern peripherals and motherboards would cause compatibility issues. Also, if you run a 32-bit OS on it, the VM86 mode is still available.
DOSBox is probably the easiest to set up for casual MS-DOS use. It includes an emulated DOS, so there is no need to install FreeDOS or any other additional package. (If you really want, it is possible to run a real MS-DOS or FreeDOS inside DOSBox, but that is more complex and usually not needed.)
It is slower than a VM, but on modern machines its emulation executes programs at least as fast as they ran on actual '80s PCs. There is actually a setting for slowing things down, for some games.
Another thing a 64-bit Windows does not do is run MS-DOS applications out of the box. You have to install an emulator, such as DOSBox, or install VM software and run MS-DOS in it (not sure whether that is possible with VMware or VirtualBox these days).
These limitations really boil down to the decisions AMD made when x86 was extended to 64 bits. The VM86 mode is not supported when running as a 64-bit CPU.
> It is a mess for data transfer,
A few years back, I needed to write a Perl script to report test coverage data in a format Excel can read. As it happens, some users had Finnish-localized Windowses (we use the comma as the decimal separator, like the Germans), some not, so CSV output caused problems. I solved them by making the script output SYLK. This is an old text-based data interchange format (think RTF for spreadsheets), still supported by Excel and others. More complex than CSV, but not hard to generate from Perl if all you have to do is make a table with labels. And it is immune to the comma issue.
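My Perl script is long gone, but the idea is simple enough to sketch, here in Python instead of Perl, with made-up labels and values. The record layout (an ID header line, one C record per cell, an E terminator) follows the SYLK format as I remember it; treat the semicolon-doubling escape as my assumption:

```python
# Minimal SYLK writer sketch. Strings are quoted; numbers always use a '.'
# decimal point regardless of the Windows locale, which is exactly what
# sidesteps the comma-vs-point CSV problem.
def to_sylk(rows):
    lines = ["ID;PWXL;N"]  # header record identifying the producing program
    for y, row in enumerate(rows, start=1):
        for x, cell in enumerate(row, start=1):
            if isinstance(cell, (int, float)):
                k = repr(cell)  # bare number, '.' decimal point
            else:
                # quoted string; assumed escape: double any ';' in the text
                k = '"%s"' % str(cell).replace(";", ";;")
            lines.append("C;Y%d;X%d;K%s" % (y, x, k))
    lines.append("E")  # end-of-file record
    return "\r\n".join(lines) + "\r\n"

print(to_sylk([["module", "coverage"], ["parser", 0.83]]))
```

Saved with a .slk extension, a table like this opens directly in Excel with the numbers intact on both localized and non-localized systems.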
> and Europeans want to have at least one open-source foundation that is distinctly European.
Thanks and welcome, but we already have at least one, the Document Foundation (https://www.documentfoundation.org/), the home of LibreOffice.
It is a charitable foundation under German law: Gemeinnützige rechtsfähige Stiftung des bürgerlichen Rechts (a charitable foundation with legal capacity under civil law), so there!
VS Code is one of the code editors I occasionally try - and then go back to Emacs. The one-window restriction was a major reason. Even NetBeans (which has its own suite of other problems) is better in this respect. Supporting only one window was excusable in the MS-DOS age, but not afterwards.
Peeked at some of the files. This MDL language looks interesting: a lot like a variant of LISP, but with <> used instead of () in some places. Wikipedia has a description [ https://en.wikipedia.org/wiki/MDL_(programming_language) ] but no link to any implementation, so anyone porting Zork would have to start by writing one...
> Right up to the point you want to validate the random sequence of characters claiming to be data.
Sorry, but I have to disagree here. The syntax of JSON is far simpler than XML or ASN.1, so a coder is more likely to get it right, and anyway there are JSON parsing libraries for almost any programming language you care to name (I have never had to write a JSON parser because of this). These either turn the "random sequence of characters" into a nice data structure, or return an error.
The one situation where processing JSON is inconvenient is a shell script, but why on Earth would you want to do that? Just use Perl or Python instead.
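In Python, for instance, the standard library either hands you the data structure or raises a clean error; the sample document below is made up for illustration:

```python
import json

# A well-formed document parses into plain dicts/lists/strings/numbers;
# a malformed one raises JSONDecodeError instead of silently misbehaving.
text = '{"name": "widget", "count": 3}'
try:
    data = json.loads(text)
    print(data["count"])  # -> 3
except json.JSONDecodeError as err:
    print("not valid JSON:", err)
```

That try/except is the whole "validation" story for most programs: you never see a half-parsed structure.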
XML has valid uses, but unfortunately, at the top of its hype curve it was pushed as the data representation solution for everything. Frequently leading to 10K of XML boilerplate wrapping 1 byte of actual information... Nowadays JSON tends to be used instead, which makes far more sense for most purposes.
> You can build self-contained executables that run pretty fast without the bloat or hassle of shipping a large runtime.
On the other hand, last time I looked you cannot escape this self-containment, which means it is no good for smaller programs. Each exe carries much of the runtime system with it, so they start at about 2 MB in size. By contrast, the bloated version of the "cat" command on Fedora Linux is 46K.
Indeed, if it turns out that immunity to coronavirus is short-lived, an effective medicine may turn out to be the better solution (there is precedent: there is still no HIV vaccine, but the disease can be kept in check by an antiviral cocktail). Of course, at this point we do not know which it is, so pursuing both a medicine and a vaccine is a top priority.
In Finland, people are encouraged to have a battery-powered radio at home, for getting announcements and news in case of WW3, a zombie apocalypse or other disasters. We don't have DAB at all (it was tried for a year or two, then discontinued as nobody was listening), so I have a little Sony FM + shortwave radio that runs for days on two AA batteries.
What is the actual battery life of a modern battery-powered DAB radio? (Or how many batteries would you have to stockpile for a week of listening in your bunker?)
I had been using Chrome almost exclusively on Fedora Linux, but the other day the Webex remote meeting system my firm uses mysteriously stopped working on Chrome, claiming this platform combination is not supported. Still works fine on Firefox, so I had to switch. Not much difference otherwise, Firefox performance seems to have caught up. It used to render some JavaScript-heavy sites slower.
>> For instance, I would not block nudity but I would block violence, religion and pro-suicide sites.
OpenDNS allows this, it has various categories. However, you might not always agree on how a given site is categorized. For instance, it often classifies anything with nudity as porn, even though nudity has its own category (in my "family values", nudity by itself is not objectionable, it all depends on what the nude is doing).
> in any case if the Eclipse tool discussed in the article is NOT "scripty" but instead uses actual Java code
It sounds like a fork of VS Code, compatible with its plugins; therefore it, too, must be written in JavaScript, running on Node.js.
Whether that is "scripty" is a matter of debate. The performance of JavaScript on Node.js is pretty good, as it is JIT-compiled; certainly faster than, say, Python or Ruby. I have actually been meaning to investigate how much real speed and memory difference there would be between JavaScript on Node.js and C++ written in the modern style, using the standard containers and strings instead of a "C-like" style. The answer could be surprising. The C++ containers rely on scope-based (RAII) memory management, whereas Node.js uses garbage collection.
> who creates admin accounts with a localised name?
Maybe the localisation system? I don't know this particular system, but if you install Windows to use Finnish, it helpfully shows nearly everything in the UI in Finnish. The underlying admin user name might not be altered, but what the GUI shows is.
My son accidentally set the language to Spanish when setting up his new Windows PC (lucky it was not Hebrew or Chinese). That was fun. It still occasionally shows some texts in Spanish, even after we tried to tell it we want to run it in Finnish. Fixing it properly would probably require a reinstall.
As a kid I once did the reverse, treating white spot disease in my aquarium with ground-up malaria pills, left over from a journey in Africa (this was in the early 1970's when chloroquine was still effective in Africa). I reasoned it should work because both diseases are caused by protozoans. It was a success.
Actually, not so many people use chloroquine as a malaria prophylactic anymore, because in most places the malaria parasite has become resistant to it.
Doctors are currently throwing anything they can find in the medicine cupboard at the virus, in the hope something works. I suspect the effect was discovered because of this.
The common thread in all these exploits is speculative execution, which is performed by all advanced CPUs these days.
I wonder if this will spark interest in research on architectures that are reasonably efficient without speculative execution.
What was the last Intel CPU without speculation? The original Pentium, I guess.
As another commentard pointed out, most of the new stuff really appeared first in older languages. Sometimes much older. For example, Simula 67 (from 1967, as the name says) had most of the same features that make Go or Java safer: managed memory, run-time checks, no wild pointers, compilation with strong typing. Even classes and inheritance. But almost nobody uses it any more.
Go etc. add refinements, and also follow a syntactic style people are now familiar with. Simula 67's syntax is based on Algol: long keywords, begin ... end instead of { ... }, and so on.
I think one reason the older innovative languages have fallen by the wayside is that, at the time they were introduced, known implementation techniques did not allow making them fast enough for production use, and computers also were slower and had less memory. I recall Bjarne Stroustrup saying he started developing C++ for a project where he first tried to use Simula, but it ran too slowly.
So programmers were enticed by the low-level, anything goes C, and later C++. Then managed languages became more feasible, thanks to faster computers and innovations like better garbage collection algorithms and JIT, but these were applied to new languages instead of attempting to resuscitate old ones. It is easier to "sell" something syntactically C-like to programmers who grew up with it.
I'm fine with this. Having done most of my professional programming in C for decades, I now believe very few programs should be written in it. Mainly kernels and drivers. (Perhaps one should need a license to use it.) Everything else should be programmed only in managed, checked-to-hell-and-back languages. Even then programmers will keep making stupid errors, but there is some hope there will be a bit fewer of them, and that they are caught earlier.
> what is the point of these new languages other than an ego trip for its developers and a lock-in to the environment.
Would you then prefer coding in FORTRAN IV or COBOL, since newer languages are just ego trips? Of course not. No language can ever guarantee absence of bugs, but improvements can be made nevertheless.
Ob car analogy: Modern cars are much safer than what people drove in the 1960s, but fatalities still sadly occur. That does not mean all the improvements were pointless.
svn commit
like it's the year 2000: Apache celebrates 20 years of Subversion
I don't really get this "similar to CVS" claim. I have used both, and find SVN works totally differently. For example, you don't have branches, but separate directories that act as branches. OK, it works, but it is nothing like how CVS does things.
Version numbers in SVN are also entirely different: not 1.2, 1.3 ... per file, but a single repository-wide revision number, without any structure, that counts commits since the beginning.
In all, CVS and SVN have about as much in common as CVS and Git.