* Posts by Vincent Manis

32 publicly visible posts • joined 26 Jan 2008

Not even poor Notepad is safe from Microsoft's AI obsession

Vincent Manis

Re: Microsoft?

Actually, the DEC OSes that CP/M and MS-DOS were based on used / as an option character. Both / and - have about the same amount of history: - was used (I think) in at least some programs in CTSS (1960s), whereas DEC was using / only a few years later. Since MS-DOS was based upon CP/M, it's no surprise that they would use the same flag character. Apparently, when DOS 2.0 was designed, its authors actually wanted to support / in pathnames, but that would have been too great a change for DOS 1.0 users. I believe that every version of DOS from 2.0 onwards silently accepts / as a directory separator in pathnames.

So not “change for the sake of change” but more “we don't want to make incompatible changes”.
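To illustrate the last point (a sketch, not tested on real DOS, and the path is made up): the system calls take either separator; it's only the command-line parser that treats / as a switch character.

    /* On DOS 2.0+, the file system calls accept / as a directory
       separator, so this opens C:\DATA\LOG.TXT; only COMMAND.COM
       objects to the forward slashes. Illustrative only. */
    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("c:/data/log.txt", "r");  /* forward slashes */
        if (f == NULL)
            perror("fopen");
        else
            fclose(f);
        return 0;
    }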

RIP: Software design pioneer and Pascal creator Niklaus Wirth

Vincent Manis

Re: Then And Now

I visited the Community Memory storefront in Berkeley back in 1974. They had a Teletype connected to a remote Xerox Data Systems machine (possibly a Sigma 2 or 5), and it was running what we would now call BBS software. The CM folks wanted to use it as a tool for connecting community groups and individuals. I don't know what happened to the project.

Vincent Manis

Wirth, ALGOL68, and the Meta key

A couple of observations. First, ALGOL 68 did not succeed, but its complexity has been overstated. Partly due to the Report's famously impenetrable two-level grammars, and partly due to language features that were not yet well understood (parallelism and semaphores, among others), it acquired a reputation for being unimplementable, even though almost-complete subsets were built in the 1970s, and Algol 68 Genie flourishes to this day. Wirth and Hoare had very good reasons for rejecting it, but I would argue that modern C++ is at least as complex as ALGOL 68, if not more so.

Second, during his California stay, if not afterwards, Wirth had the nickname `Bucky'. At one point he suggested an Edit key to set the 8th bit of an ASCII character on a keyboard. This was the basis for the Meta, Super, and Hyper keys of various Stanford and MIT keyboards, and for the Alt, system logo, Command, and Option keys of modern keyboards. The bits these keys set are known as `bucky bits' to this day.
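For the curious, here's a minimal sketch in C of what setting a bucky bit amounts to (everything here is illustrative; 0x80 is just the 8th bit that the proposed Edit key would have set):

    /* A bucky bit is a modifier bit above the 7-bit ASCII code.
       Wirth's proposed Edit key would have set the 8th bit. */
    #include <stdio.h>

    #define BUCKY_META 0x80   /* the 8th bit */

    int main(void)
    {
        unsigned char c = 'a';                  /* 0x61 */
        unsigned char meta_c = c | BUCKY_META;  /* 0xE1: "Meta-a" */
        printf("a = %#x, Meta-a = %#x\n", (unsigned)c, (unsigned)meta_c);
        return 0;
    }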

Windows 12: Savior of PC makers, or just an apology for Windows 11?

Vincent Manis

This week, I bought a really nice ultraportable, a Lenovo Yoga 6. I had to endure Windows just long enough to turn off fast boot (which involved logging into my Microsoft account); then I attempted to install Debian. No luck there: it needed some custom wifi drivers. Rather than mess around with that, I installed Ubuntu. It installed flawlessly, and (although I'm not a Gnome fan) came up with an entirely usable user interface (which I'm replacing, of course). I don't think Ubuntu is the sole choice (though it makes sense for me as a Debian user), but there are a number of very polished distros that install cleanly, are usable by non-experts...and cost nothing.

Microsoft floats bringing a text editor back to the CLI

Vincent Manis

Just as nano is imitation pico.

I have my strong preferences regarding text editors, but find the Editor Wars tiresome. Whatever MS does will be liked by some and hated by others. People who don't like their choice can install their own. There is nothing more to say.

Chromebooks are problematic for profits and planet, says Lenovo exec

Vincent Manis

Re: Bad for the environment?

I don't know; on a Chromebook, I just go to Settings > Developer and enable the Linux environment. This gives me an almost complete Debian system (there are a few limitations) on which I happily run Emacs, TeX, Inkscape, GCC, and Chez Scheme. Chromebooks are admittedly not powerful machines, and I can point to various defects and limitations of this setup, but it works for me. In fact, a Lenovo Chromebook tablet with an external mechanical keyboard and trackball is my preferred travel/writing setup.

There are definitely things I don't care for in ChromeOS (like having to use a Google account), but the ability to run Android programs, Linux programs, and Chrome itself, along with the fact that Google promises updates to that machine until 2031, count for a lot.

Security? Working servers? Who needs those when you can have a shiny floor?

Vincent Manis

Re: Clean keyboards

This is in a laundry room in my apartment building. The plugs are hidden behind the machines; the washers themselves are made by Huebsch, and have many unpleasant failure modes, but no visible Pause button.

Vincent Manis

Re: Clean keyboards

I can't demand that a phone survive a wash cycle; if I could, my original post would have been pointless. That said, all reputable device manufacturers provide a statement about environmental factors. I randomly picked Samsung, and found https://www.samsung.com/ca/support/mobile-devices/galaxy-phone-dust-and-water-resistance-rating. There they show the Ingress Protection ratings of (some of?) the Galaxy range of phones.

Vincent Manis

Re: Clean keyboards

I am profoundly not fond of my iPhone and its horrible OS. But I'll give it credit: it survived an entire wash/rinse/spin cycle. The washing machine in question locks when started, so all I could do was stare and weep when I inadvertently loaded it along with the laundry. When it came out, it functioned flawlessly, and was unmarred except for a tiny scratch at one edge. I very much doubt I'll buy another iPhone, but the experience has taught me to demand that my next phone be similarly sturdy.

Techie wasn't being paid, until he taught HR a lesson

Vincent Manis

Re: Unique keys

Ah. Following Conway's Law (roughly “the design of a program mirrors that of the organization which created it”), I would assume that there were at least three different authorities involved, each deciding that they needed a unique ID, and none communicating with each other.

Shocks from a hairy jumper crashed a PC, but the boss wouldn't believe it

Vincent Manis

Re: capital of BC

Prince George is the capital of BC, just as Milton Keynes is the capital of the UK.

The Stonehenge of PC design, Xerox Alto, appeared 50 years ago this month

Vincent Manis

Re: the language that begat C

Making % dyadic (binary) only was a design decision. I did this on an IBM System/360, where addressing arbitrary bytes wasn't something you needed: in w%o, you start with a word pointer w, and add a byte offset o. I also had in mind word-addressed machines (which still existed in those days), where accessing an arbitrary byte in memory was nonsensical. (As much as I can capture my thoughts from 50 years ago.)
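For those who haven't seen BCPL, here is roughly what w%o did, transliterated into C (a sketch from memory, not Richards's actual definition):

    /* BCPL's dyadic %: byte subscripting from a word pointer.
       w is a word pointer, o a byte offset from w. On a
       word-addressed machine the compiler generates a word fetch
       plus a shift and mask, rather than a byte load. */
    #include <stddef.h>

    unsigned char byte_get(const unsigned *w, size_t o)
    {
        return ((const unsigned char *)w)[o];    /* v := w%o */
    }

    void byte_put(unsigned *w, size_t o, unsigned char v)
    {
        ((unsigned char *)w)[o] = v;             /* w%o := v */
    }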

Vincent Manis

Re: the language that begat C

*Grumpy* note. I invented the % operator for byte subscripting, and made the changes to the compiler and System/360 code generator to make it work. This work was communicated to Martin Richards without my name on it, and he in good faith incorporated it into his distribution.

Sorry, I just had to vent. :)

Vincent Manis

Re: the language that begat C

It actually stands for “Basic CPL”, where CPL was a language that Christopher Strachey and his colleagues worked on during the 1960s and early 1970s. It would have been an excellent language, had its developers ever converged on an actual final specification for it. Martin Richards noticed that a small subset would be ideal as a systems programming language; he developed a highly portable compiler that could be brought up on a new machine by writing about 100 lines of Fortran/assembler/whatever for the target. As well, backends could be written to produce good-quality code for a range of machines. (I benchmarked it in the early 1970s: on a couple of problems it produced code that ran about 1/3 faster than the same code compiled by IBM's non-optimizing Fortran G compiler.)

BCPL's downfall came because it worked best on a word-oriented machine, such as the IBM 7090/7094 on which it was first implemented. There was only one type, the word; you needed to add a minimal notion of data types to make byte or double-precision data work well. (This was exactly the reason that B was replaced by C.)

That emoji may not mean what you think it means

Vincent Manis

I too have eyesight that makes decoding emojis at normal text size very difficult. Further, I can't be bothered to learn the input methods for emojis. I have a compose key set up for typing accented letters, and so I can type <compose> : ) for a smiley. Apart from that, the whole emoji business leaves me cold, and therefore my response to it is <compose> p o o.
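For anyone who wants the same setup under X11, the sequences can go in ~/.XCompose (a sketch; whether you need the include line, and where the file lives, varies by distro):

    # ~/.XCompose -- personal compose sequences
    include "%L"    # keep the system defaults

    <Multi_key> <colon> <parenright> : "☺"  U263A   # <compose> : )
    <Multi_key> <p> <o> <o>          : "💩" U1F4A9  # <compose> p o o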

You need to RTFM, but feel free to use your brain too

Vincent Manis

Re: Check you can complete before you start

Back when I was a university teacher, I often included the instruction `Write the word OKAPI at the top of page 2 for an extra mark' in my midterm exams. Fewer than half of the students did this. Some who didn't said afterwards that they had read it, but thought I was joking! Life sometimes makes me sad.

Wine 6.0.1: For that one weird app on that one weird Mac

Vincent Manis

Re: It's astonishing that Wine exists...

Wine for me is a real life-saver. I use the Firestorm Second Life viewer on Linux, and the add-on voice subsystem doesn't work very well. Firestorm recommends using the Windows voice add-on instead, and thanks to Wine, that works like a charm.

Overload: A one-way ticket to a madman's situation

Vincent Manis

Re: phoning and testing

MVCL was always interruptible. On an interrupt, the hardware would update the registers to record how far the move had got; on return, the machine would just re-execute the MVCL, which picked up where it left off.

A much more interesting way of swamping a 360 or 370 with virtual memory was to process a huge matrix along rows, not knowing that Fortran stored arrays by columns. This could bring our university's 360/67 running a time-sharing OS to its knees, as the offending process ended up having to swap in a new page for each element access.
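The same trap exists today, just mirrored: C stores arrays by rows, so in C it's column-order traversal that thrashes. A sketch (sizes made up; with 4 KB pages, each column-order access below lands on a fresh page):

    /* C is row-major, the mirror image of Fortran. Walking a big
       array down a column strides a whole row's worth of bytes per
       access, so each touch can fault in a new page -- the same
       behaviour that brought the 360/67 to its knees. */
    #include <stddef.h>

    #define N 4096                  /* illustrative size */
    static double a[N][N];

    double sum_row_order(void)      /* sequential: pages reused */
    {
        double s = 0.0;
        for (size_t i = 0; i < N; i++)
            for (size_t j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    double sum_column_order(void)   /* strided: a new page per touch */
    {
        double s = 0.0;
        for (size_t j = 0; j < N; j++)
            for (size_t i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }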

Why is Wikipedia man Jimbo Wales keynoting a fake news conference?

Vincent Manis

It would be interesting to compare Wikipedia's error rate with Orlowski's

Intelligent people know that no source can be trusted completely, and that any worthwhile research uses a multiplicity of sources. I often use Wikipedia as a starting point, and frequently find its references and external links to be the most useful part of the article.

While there have been many accounts of individual Wikipedia pages that contain slanted or biased information, I have no reason to believe that the overall site's error rate is substantially higher than any other encyclopedia's.

Shoddy journalism is the journalist's fault. Uncritically cutting and pasting a Wikipedia article is no different from a school student copying a passage from a book without acknowledgement in a school report.

Finally, I look forward to seeing a link to Mr Orlowski's list of the 16,000 pages he claims are false.

'We think autonomous coding is a very real thing' – GitHub CEO imagines a future without programmers

Vincent Manis

Well, OK, then

I'm a fan of the Fargo film and TV series, which depict decent people going to hell through a series of really bad decisions. Similarly, programming ended in the early 1960s, when the FARGO programming system was created [https://en.m.wikipedia.org/wiki/FARGO_(programming_language)]. Not!

Of course, people don't write mortgage programs anymore, they use spreadsheets. We don't need to keep writing the same programs over and over.

Software development has changed many times over the decades. What hasn't changed is the need to have smart people create solutions to problems that nobody has solved before, and other smart people evolve those to meet changing needs. Intelligent programming assistants can definitely be a part of that.

Init freedom declared as systemd-free Devuan hits stable 1.0.0 status

Vincent Manis

Re: No - systemd doesn't offend me

I selected No because the choice of init system really makes almost no difference to me. I've been using Unix since 1975, and taught courses on Unix programming and system administration, so I find the issues relating to init systems interesting, but they have very little effect on me on a day-to-day basis. I've survived BSD init, SysV init, upstart, and now systemd, and none of those changes has made an appreciable difference to me, other than some minor modifications to scripts. Obviously, some sysadmins might find that such changes necessitate a great deal of work; but that hasn't applied to me.

Now, do I find systemd's design offensive? Yes, I do. But then I find most features in modern distros kind of offensive (the cat and true commands are exceptions ☺). I tend to like systems that are consistent, have as little clutter as possible, and match my notions about workflow. I would personally have liked it if Plan 9 had been modernized and made into a solid OS. But it wasn't, and Ubuntu and Debian do a fine job for me.

So what about Devuan? I hope the developers can produce a solid system. If it's easier to use, or more robust, or more scalable than Debian, good for them; maybe other distros will pick up on their ideas. The computer scientist in me finds such matters interesting; the computer user in me is happy with what he has.

systemd free Linux distro Devuan releases second beta

Vincent Manis

Re: @Vincent

Peter is of course correct; it was a brain fart on my part. In fact, by the time 7th Edition came out, I no longer had access to the 11/45 in question.

Reiterating my earlier point, much of every present-day Linux distro (and I do mean all of them) isn't the way I'd do it, but I have a life, and I just use what I have to get the job done. Linux/Unix on desktop, laptop, or server looks downright great compared to the Redmond alternative.

Some other posters have referred to disembowelment or the like. I prefer to judge software on its merits or demerits, without ad hominem comments.

Vincent Manis

My first Unix system was 7th Edition on a PDP-11/45, back in 1975. Since then, I've seen Unix (and now Linux) grow from an elegant, small, sparse system to the furry, ungainly creature it is today. In part, this is due to increasing performance, reliability, or functionality. In part, it's creeping featurism. Most of a modern Linux system isn't done the way I'd do it, ranging from duplicated functionality to unwanted applications, not to mention hit-and-miss documentation.

I suppose if I really cared, I'd use one of those distro generators to roll my own. But frankly, life's too short. I fire up my various Debian or Ubuntu systems, and for the most part, they just work. I haven't really noticed any decline in stability with the advent of systemd (which is definitely not done the way I'd have done it).

So I wish the creators of Devuan success, but frankly neither the presence nor the absence of systemd really makes much difference to me. I just use whatever I have to get whatever I need done.

Arthur C Clarke award won by Adrian Tchaikovsky

Vincent Manis

Re: Dubious award

If you are referring to the claims of pedophilia lobbed against Clarke, no evidence was found to support them (other than some claims in some tabloids). While those accusations were current, Clarke's knighthood was in abeyance. When the Sunday Mirror apologized for publishing those claims, the knighthood was granted. Clarke was definitely either gay or bisexual (regardless of what he said on the subject), and he may well have enjoyed the company of younger men, but there's a world of difference between that and pedophilia.

What's long, hard and full of seamen? The USS Harvey Milk

Vincent Manis

Re: Missing info

...and also the name of an eminent member of the US House of Representatives, representing a large part of Atlanta, Georgia, a former (and current) civil rights leader.

Vincent Manis

Missing info

Actually, the USNS Harvey Milk is the second ship in the John Lewis class, which are being named after civil rights leaders. And the `long, hard, and full of seamen' comment is about as courteous in discussing this matter as a reference to watermelon in a discussion of Obama's foreign policy would be. Both the article and the headline are seriously misleading.

Steve Jobs had BETTER BALLS than Atari, says Apple mouse designer

Vincent Manis

I don't like the one-button mouse, but I will say that it does make using the mouse simpler to explain. I have taught several seniors how to use computers, and the whole `left-click/right-click' thing is actually quite difficult to get across. So while I don't think the one-button mouse was a good idea, Apple actually had a case in favor of it. (I really like my Lenovo laptop with its TrackPoint and three mouse buttons, and a touchpad I disable.)

As for inaccuracy, mice have always been inaccurate. I was told in a graduate-level HCI course I took circa 1975 that the original SRI mice were built from miscellaneous electronic scrap, and that getting perfectly linear potentiometers was essentially impossible. As a result, to move the cursor in a straight line, the user had to move the mouse in a curve. Apparently, users adapted very quickly, and were genuinely unaware of this phenomenon.

US Social Security 'wasted $300 million on an IT BOONDOGGLE'

Vincent Manis

In my experience, government IT projects often don't come in on time or under budget, for several reasons:

1) Requirements creep, often triggered by political considerations (and often from the same sources as the later outrage about increasing costs).

2) Lack of coherent oversight by the client: instead of a small review committee with both technical and other members, approvals are often distributed among a large, diffuse, and ever-changing group of individuals.

3) Significant amounts of out-of-band communication between contractor and client, often necessitated by the first two factors, and resulting in a loss of institutional memory (`where did that requirement come from?').

4) Poor life-cycle models, veering towards waterfall and away from iterative/agile practices (and I am not carrying the agile banner here, just pointing out that waterfall development often produces gargantuan monsters that don't satisfy client needs).

5) Lack of end-user involvement (e.g., a senior program manager who thinks he can speak for clerical users of the system).

6) Lack of clearly identifiable milestones that relate to the actual project (e.g., `the Frobozz infrastructure is complete', rather than `the system can correctly perform transactions X, Y, and Z').

7) Emphasis on delivering milestone products, rather than on ensuring the products meet specified reliability and performance requirements.

8) Lack of proper audit procedures (e.g., looking at the numbers of closed and unclosed bugs in the bug database).

I could go on and on.

Of course, many private-sector projects go south in exactly the same ways. But government IT projects seem particularly susceptible to these dangers. As someone who has taught software engineering practices to industrial practitioners, it disturbs me greatly how easy it is for large organizations to ignore their own history and make the same mistakes over and over again.

A Year of Code timeline: History of a HYPEGASM

Vincent Manis

`Coding' is such a generic thing, and never should have been identified as a goal. Certainly HTML coding is a good and useful skill (though, to be fair, it's easily automatable, as many web frameworks have shown), but it has little to do with writing programs that accomplish useful things, whether they be things of interest to the individual author, or things of interest to enterprises and the public at large.

The Raspberry Pi folks don't primarily want to have people coding. They want to build something that can be played with, that is as open as possible, to stimulate interest in computer science and engineering. Their metric, as can be seen from their website, is much less `how many people learned to program the RPi?' than `how many people have done fun/interesting/useful things with the platform?'

This distinction seems to be very difficult to get across to people. Thirty years ago, I was running workshops on teaching with LOGO. I gave it up after realizing that to most of the participants, the goal of LOGO was not to learn how to solve interesting problems, but to `trick' the turtle into drawing cute pictures.

I thought it interesting that the timeline presented in the article made no mention of people who were in the business of teaching computer science and engineering. If I were a non-technical politician wanting to set up something like this, I'd have gone out and formed an advisory council with (say) 5 people from academia and 5 from industry (at the CTO level) to chart out a workable strategy. If I were doing that in Britain, Eben Upton would have been one of the first I would have called. So I guess that's why I'm not in politics.

Apple routs rivals in sat survey

Vincent Manis

Macs are for creative folks

I'm always intrigued by the notion that somehow Macs are for creative people, but PCs are for serious people. Maybe that was true back in the days when desktop publishing was new, but nowadays pretty much anything you can do on OS X can be done on Windows, and vice versa. Here's why I love my three-year-old MacBook Pro.

1. MagSafe power plug. Very little chance of destroying a computer, as I once did to a Dell Latitude laptop where the power cable ended up inextricably tied around my foot just as I stood up abruptly. Those little tiny touches do make a difference.

2. It's real Unix, so I can run computer science-y software that was written for Unix.

3. It really is plug-and-go. I have a Dell laptop with Ubuntu on it, and while it's a nice system, I had to do a lot of work to get everything working properly on it.

4. My antivirus software for that machine consumes zero bytes, takes 0% of the CPU cycles, and cost me $0.

5. Apple includes their development software at no cost with every system.

6. Apple's development software does not suggest that I become super-user in order to compile, test, and debug software. Visual Studio did exactly that on Vista, maybe it won't on Windows 7.

Not one of these is something that Apple's competitors couldn't duplicate (they'd have to license the MagSafe patents, I guess, but they could apply the same attention to detail to other aspects of how people actually use machines).

Windows systems fail on all six of these criteria. Unix/Linux systems pass on 2, 4, 5, and 6, but fail on 1 and 3. (Incidentally, I know my way around Unix, having started in 1975 on a Unix V7 system. Even so, when I want to send email, or write software, the last thing I want to be doing is to fiddle with system parameters to make everything work, and yes, I have had to do that with every system I have ever installed Linux on).

So my point is, Microsoft, the hardware vendors, and the Unix/Linux community COULD be competing with Apple. The criteria I listed here happen to be mine; other folks have their own, though `it just works' should be on everyone's. Where people miss the boat is to say `Oh, we aren't Apple, we do it our way'. Apple definitely has their failures, but their successes come from building things that satisfy needs (not just status) in people's lives. Other companies could do the same.

By the way, let me put in a plug for Mark Shuttleworth's goal of making Ubuntu compete with OS X. I don't know if they'll be successful, but they are definitely thinking the right way.

'Tofu' license pits open source against meat

Vincent Manis

@Sean Baggaley

1. The Open Source Definition is not the work of Richard Stallman or the FSF, neither of whom, on the whole, likes the phrase.

2. The non-discrimination clauses I quoted come from the Open Source Definition, which is published by the Open Source Initiative.

3. When Microsoft developed licenses that they considered to be open-source (note lower case), they submitted them to the OSI for approval.

4. The Wikipedia article on open source cites a reference ([6]) from 1990 showing the term used to signify `materials open to the public and freely available' [my wording], implicitly opposed to `controlled'.

5. I hope Sean would not consider a library to be public if it refused admittance to meat eaters, vegetarians, or any other group.

For the record, I have been using open-source software since about 1970, back when user groups such as SHARE (IBM) and DECUS (DEC) used to distribute it. The GNU Manifesto clearly says that RMS wanted to re-create the environment he'd experienced at the MIT AI Lab, so he can hardly claim to have invented the concept.

Vincent Manis
Thumb Down

Another article that misses the mark

Whatever this license might be, it most definitely isn't an open source license. The Open Source Definition has at least two clauses (no discrimination against persons or groups, and no discrimination against fields of endeavor) that, at least according to the article, are violated. Similarly, the Four Freedoms include the right for anyone to run the software, again violated, according to the article, by this license.

I do wish the article's author had been a bit more careful about using the term `open source' for something that is nothing of the kind.