* Posts by Michael Wojcik

12336 publicly visible posts • joined 21 Dec 2007

Foxconn factory fiasco could leave Wisconsinites on the hook for $300m

Michael Wojcik Silver badge

Foxconn EPCOT. Temple to the spirit of Buckminster Fuller. Sunsphere Mount Pleasant.

Version 251 of systemd coming soon to a Linux distro near you

Michael Wojcik Silver badge

Re: Horses for courses...

The idea of an active/passive pair of OS images has been around for decades. OS/400 had it in the '80s, and it wasn't considered novel then. Implementations have probably been around for half a century or so.

And of course you could roll your own with any number of boot managers. IIRC, even with BSD 4.3 Tahoe (1988) the bootloader could be told what label to use when locating the root partition.

Sure, it's "a benefit". It was a benefit 35+ years ago. Everything old is new again, generally without the implementors taking any note of the lessons learned in the previous incarnations.

Landmark case recognizes Bored Ape NFT as an asset

Michael Wojcik Silver badge

Re: BAYC

You are Walter Benjamin, and I claim my £5.

In all (well, some) seriousness, Benjamin's "The Work of Art in the Age of Mechanical Reproduction" does a good job of analyzing the psychological attachment to "original" works and how reproduction interacts with it, even if he indulges in his penchant for somewhat obscurantist language.

If you read his use of "aura" as referring to some real, measurable property of the object, it seems like rubbish; but understand it as an attitude people tend to have toward what they believe is an original, and it seems quite apt – more so now, even, than when it was written. It's also important to read the whole piece because there's a critical tone shift near the end that reveals his actual political thrust, in something of a twist ending. (I've heard plenty of people cite the essay in support of claims not justified by the text.)

Matthew Skala's "What Colour are your bits?" blog post is another great rumination on how these sorts of parasitic attributes can be attached to digital artifacts by (what people believe to be) their provenance, regardless of their actual content. You might have to grub about in the Internet Archive to find that one, though.

Michael Wojcik Silver badge

Re: Crazy times!

A classic example of where I understand what these words mean individually (although "Metaverse" is a bit sketchy) but now that they've been assembled into a sentence I'm just left scratching my head

I understand the phrase just fine, which I think is sadder. I may have no interest in participating in this madness, but I still seem to have dedicated far too many brain cells to understanding it.

SoftBank to stop investing 'randomly' after losing billions of dollars

Michael Wojcik Silver badge

How shocking...

... that the tech bubble is popping. Who could have predicted this?

I did enjoy Son's bit about "we will proceed with procedures of Arm's IPO", though. Looking forward to processing the proceedings of those procedures as they proceed; should be quite a procession.

MIPS discloses first RISC-V chips coming in Q4 2022

Michael Wojcik Silver badge

Re: another closed system with no upgrade path

Hennessy's MIPS project at Stanford (eventually spun out into the company MIPS) and Patterson's RISC project at Berkeley were both funded under DARPA's VLSI grant program, starting around 1980.

Both also took inspiration from IBM's 801 project, which started in 1974, and Tanenbaum's 1978 paper showing that most applications used only a small number of the instructions provided by the CISC architectures that dominated in that era.

IBM's 801 evolved into the ROMP CPU (begun '77, running '81, public demo '84), which became the CPU for the PC RT (1986); and the RIOS / POWER architecture (development '82-'89, available 1990). John Cocke was probably the most influential figure in RISC development at IBM, though there were certainly many others, such as Phil Hester.

There was a lot of cross-influence. Tanenbaum's study and others made an argument that was sufficiently compelling to drive parallel development on RISC architectures in a number of places.

LIDAR in iPhones is not about better photos – it's about the future of low-cost augmented reality

Michael Wojcik Silver badge

Tired: reality. Wired (or, rather, wireless): ubiquitous fiction, without even a smidgen of mental effort.

My first hands-on experience with VR and, to a limited extent, AR was at SIGGRAPH in Boston in ... 1989, I think. I wasn't impressed then, except for a few niche applications. I'm not impressed now.

"Rendering happens" indeed. For various meanings of "render".

Industry pushes back against India's data security breach reporting requirements

Michael Wojcik Silver badge

Re: Best practice

It's not "written in stone". It's a best practice, not a standard or law.

It's best practice because it takes time to assess whether a portscan or DoS or other event is actually a breach, much less one of any significance. As it is, as someone mentioned upthread, even after 72 hours we get a lot of noise with insignificant events being reported. Scans, DoSes, and other automated attacks happen all the time. Reporting all of those makes it impossible to separate the wheat from the chaff; it's worse than not reporting at all.

"my bank is hacked" is a meaningless phrase. It has no technical meaning. What matters are breaches and damage done, what's known and suspected and with what probability, what risks are present, what contingency plans are in place -- technical specifics. Your bank is "hacked", for some meaning of the term, every day, probably every hour. Everything on the public Internet is, to a first approximation. The public Internet itself is, with invalid BGP routing advertisements and the like (which number in the thousands per day).

It's a harsh world. Panicking at every little thing does not help.

Michael Wojcik Silver badge

Re: Better more than less

But, quicker reports of hacking attempts, successful and unsuccessful, are objectively a good thing.

I'm afraid that's simply incorrect. Faster and more voluminous reporting of attacks is not an absolute good.

For reporting to be of any use whatsoever, it has to be limited as much as possible, to a high probability, to events of interest. The vast majority of "hacking attempts" under any reasonable definition (as I noted above, phrases like "hacking attempts" are largely meaningless, but let's pretend they correspond to some useful concept) should most definitely not be reported. If we insist on reporting them, companies might as well run a client that just sends a continual stream of reports to the observer; that would be equivalent and more efficient.

Palantir summons specter of nuclear conflict as share price collapses

Michael Wojcik Silver badge

Re: "I'm fairly confident"

And for those that don't work, many of the failure modes create a big ol' mess somewhere. I dare say most of us would rather that didn't happen.

TurboTax to pay $141m to settle claims it scammed millions of people

Michael Wojcik Silver badge

Re: Paper tax returns

I do mine electronically, but archiving isn't an issue. I have paper and PDF copies of all my returns.

But I use the desktop Turbo Tax, not the web version. I wouldn't touch the web version if they paid me.

My complaint with Intuit is that this year they made it damned near impossible to get the desktop version. Every damn link from the website led back to the goddamned web version. I had to do multiple web searches and dig through results to get a download for the desktop version. Intuit really want to own your data, and to kill the standalone product.

It also does that obnoxiously obsequious thing when it's checking for updates, with messages like "OK, we know it's taking a little while". Microsoft does the same thing these days in products like Teams. Stop patronizing me, you bastards; I was writing software before you were a gleam in some traveling sales rep's eye.

Security is a pain for American Dental Association: Ransomware infection feared

Michael Wojcik Silver badge

Re: ADA attack

There's no question that ransomware has stained and pitted the once-gleaming surfaces of our network defenses, and you mustn't just brush it off or pay lip service to it. Incisive action is necessary. The troubling calculus is whether to pay the ransom. Pick your poison and scrape up the necessary resources to plane down the attack surface, fill in missing capabilities, and seal porous boundaries. White-box testing can help identify vulnerabilities before they bite. Rinse, spit, and repeat.

And, um ... something about dentures? I dunno.

Accenture announces 'Accenture Song' – not a tune, but a rebrand

Michael Wojcik Silver badge

Re: IBM also had a song...

IBM had an entire songbook. It was real. I held a physical copy in my own hands when I worked at IBM long ago (OK, circa 1990). In addition to the usual music and lyrics, it had black-and-white photos of choruses of besuited IBMers dutifully singing some of the numbers.

Corporate songs, songbooks, and even full-fledged musicals were A Thing in the US for a while.

And surely everyone knows the KPMG song? That one's famous. There are a bunch of remixes on YouTube.

Why is IBM selling post-quantum crypto when it's still a pre-quantum company?

Michael Wojcik Silver badge

Re: Latticies

Wait how long? "some algorithm" to do what?

If you'd held out for a hash that was never subsequently broken, as MD5 and SHA-1 both were, you'd have waited until SHA-2 was standardized in 2001 to do any asymmetric cryptography, HMACs, etc.

There are certain kinds of proofs of security that we can, in principle, find or create (depending on your stance on mathematical Platonism) for algorithms. We can say "under these assumptions, we can prove a lower bound on the amount of work needed to reverse this construction without the secret", for example. We can talk about Random Oracle proofs and the like.

We can't prove something is secure in an absolute sense, because 1) that would involve either proving a negative (there is no viable attack) or exhausting all possibilities; and 2) there's no such thing as "secure in an absolute sense". It's a nonsense concept.

And we use lots of cryptography which doesn't have particularly strong proofs. There's no proof of the hardness of factoring; there's just no published algorithm for general integer factoring that's better for large integers than GNFS. (There are special cases, such as when the factors are relatively close to one another and you can use Fermat's, where other algorithms are better.)
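(By way of illustration, and not something from the article: a quick Python sketch of Fermat's method, which shows why it only helps when the two factors are close together.)

    from math import isqrt

    def fermat_factor(n):
        # Assumes n is an odd composite; looks for n = a^2 - b^2 = (a - b)(a + b).
        a = isqrt(n)
        if a * a < n:
            a += 1
        while True:
            b2 = a * a - n
            b = isqrt(b2)
            if b * b == b2:        # a^2 - n is a perfect square, so we're done
                return a - b, a + b
            a += 1

    # With close factors the loop exits almost immediately:
    print(fermat_factor(10007 * 10009))        # (10007, 10009)
    # With the factors far apart (say, 3 times a large prime) it grinds more or
    # less forever, which is why it's a special case and not a GNFS replacement.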

That's why we have a three-stage NIST competition for PQC, which is now in stage three. And things continue to shake out; Rainbow was broken just a few weeks ago.

But we can say some useful things about PQC. Like, for example, that if there's an algorithm in complexity class BQP for solving lattice problems, then the complexity hierarchy collapses, which would be a Pretty Big Deal and seems Rather Unlikely.

The oldest PQC schemes, McEliece and NTRU, look to be reasonably secure. They've received a lot of attention. The problem is they're expensive. So people come up with related schemes that use smaller keys or smaller signatures or are faster or whatever, and then other people try to break them. (Well, that's one problem. They also don't have some results we'd be happy to see.)

Most of these schemes are variants on McEliece, which is conceptually pretty simple. You have a matrix-based error-correcting code with an efficient decoder. You disguise its generator matrix using a permutation P and an invertible linear transformation S, and publish the scrambled matrix; the sender encodes with it and deliberately injects some errors. P and S (along with the real decoder) are the private key; if you know them, you can undo the scrambling and correct out the errors that were injected into the message.
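(Purely as an illustration, and mine rather than anything from the article: a toy with that shape in Python, using a Hamming(7,4) code in place of the Goppa code McEliece actually uses. At this size it's a classroom exercise, not a cryptosystem; the names G, S, and P just follow the description above.)

    import numpy as np

    # Systematic generator for a Hamming(7,4) code, which corrects any single-bit error.
    G = np.array([[1,0,0,0, 1,1,0],
                  [0,1,0,0, 1,0,1],
                  [0,0,1,0, 0,1,1],
                  [0,0,0,1, 1,1,1]])
    # Matching parity-check matrix; every column is distinct, so a one-bit error
    # produces a syndrome that points straight at the flipped position.
    H = np.array([[1,1,0,1, 1,0,0],
                  [1,0,1,1, 0,1,0],
                  [0,1,1,1, 0,0,1]])

    # Private scrambling: an invertible S (inverse hard-coded here) and a permutation P.
    S    = np.array([[1,1,0,0],[0,1,1,0],[0,0,1,1],[0,0,0,1]])
    Sinv = np.array([[1,1,1,1],[0,1,1,1],[0,0,1,1],[0,0,0,1]])
    perm = np.array([3,0,6,1,4,2,5])                 # column permutation standing in for P

    Gpub = ((S @ G) % 2)[:, perm]                    # the public key: S*G with columns shuffled

    def encrypt(m, rng):
        e = np.zeros(7, dtype=int)
        e[rng.integers(7)] = 1                       # one deliberate error; Hamming can fix one
        return (m @ Gpub + e) % 2

    def decrypt(c):
        y = np.empty(7, dtype=int)
        y[perm] = c                                  # undo the permutation
        s = (H @ y) % 2                              # syndrome of the injected error
        if s.any():
            y[np.flatnonzero((H.T == s).all(axis=1))[0]] ^= 1   # flip the bad bit
        return (y[:4] @ Sinv) % 2                    # strip S to recover the message

    rng = np.random.default_rng(0)
    m = np.array([1, 0, 1, 1])
    assert (decrypt(encrypt(m, rng)) == m).all()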

I think it's a little early to be advertising baked-in PQC, since the NIST competition is still running. Even if we get practical quantum supremacy in, say, the next few years, breaking RSA or ECC with decent key sizes will be expensive; attackers won't be breaking everything left and right. And applications that deal directly with cryptographic resources such as keys and signatures will probably need changes to handle the huge keys and signatures typical of these algorithms, so organizations will be slow to switch to PQC.

The wild world of non-C operating systems

Michael Wojcik Silver badge

Re: Modula 2

Rust is about as close to the metal as C is. It just imposes more restrictions on the programmer, and provides more syntactic sugar and abstractions at the library level. But it runs comfortably in the same environments as C does.

For an OS, the reduced need for serialization of execution provided by Rust's object-ownership guarantees (which are imposed at build time, not run time) should overcome any incremental overhead introduced by good Rust programming practices.

For end-user applications, any performance difference is almost certainly irrelevant, because the vast majority of them will be I/O-bound.

For stuff in the middle -- execution environments and libraries and subsystems and whatnot -- some of it will be CPU-bound and performance-sensitive, but again most of those things will get more back from the reduction in locking than they'd lose from Rust's level of abstraction, which is not, after all, much higher than that of C, much less that of C++.

Michael Wojcik Silver badge

Re: What about Assembly Language?

Whether that's necessary depends on your definition of "compiler".

You write a minimal compiler in a subset of C for the same subset of the language, hand translate that to assembly — it can be quite straightforward — and assemble that.

That bootstraps the minimal compiler. Compile it with itself and confirm the assembly it produces is valid.

Now add more features, still using the subset in the compiler itself, and compile it with itself again. And so on.

Is the human who translates the initial compiler also a compiler? Or do you use that term only for a program? It's a question of definition, and people disagree. But you certainly don't need to create a compiler for the entire C language in assembly or some other HLL in order to create your first C compiler. You can start with a very small subset of the language. If you're willing to allow certain deviations from the standard, you can start with even less.

Then, of course, once you have a working compiler you can split it into a portable front end and just write the code generator for each new platform, as first pcc and later GCC did (as well as various others).

Michael Wojcik Silver badge

Re: VMS

Ah, VAX assembly. All those handy CISC opcodes, and the system macro libraries... I had a school assignment where we had to use a macro (SYS$UNWIND or something like that?) to implement exception handling. Very cool.

The VAX assembly course tied with the LISP and Scheme courses for being the most fun just to write code.

Michael Wojcik Silver badge

Re: Multics & PL/I

IBM has written chunks of its various operating system using PL/S and its successors.

Yep. IIRC the VMM (monitor) that underpinned AIX 2 on the RT PC was written in PL/M, and the OS/400 kernel in PL.8 (which used the "." rather than "/" in its name for some reason). PL/I variants were all the rage at IBM for a while.

Blockchain powered stock market rebuild started in 2017 delayed again

Michael Wojcik Silver badge

Re: Si non confectus, non reficiat*

Upgrading Itanium CPUs is ... tricky, these days.

But, yeah, we'd be happy to sell them a COBOL runtime for shiny new x86-64 blades running Linux. Or whatever their beverage of choice might be.

Michael Wojcik Silver badge

Re: Immutable record

As Bruce Schneier has pointed out multiple times, we have plenty of protocols for append-only ledgers. Blockchain is just a child's version of a Merkle tree,1 doing a job that can be done in better ways for most use cases.

There are good uses for Merkle trees (and other Merkle graphs). git is popular, for example. I've yet to see a convincing use case for blockchain. This doesn't look like one either.

1Just realized that in other posts I've written "Nagle trees", having somehow mixed up cryptographer Ralph Merkle with John Nagle of TCP's Nagle Algorithm fame. Oh the shame. Comes from working in too many technical domains, I guess.

Michael Wojcik Silver badge

Re: April 1st already?

Blockchain in COBOL on Itanium - really?

No, not really. The new system is not written in COBOL, according to the article. And while it's not entirely clear to me (I may have missed something), I suspect it's not running on HP Itanium hardware, either.

Not that it would be hard to do blockchain in COBOL. Blockchain is just a degenerate Merkle tree where you periodically prune all but one branch using a half-assed consensus mechanism. All you need to implement a Merkle tree is a cryptographic hash. And while it's a bit of a pain implementing a cryptographic hash in standard pre-2002 COBOL (I know, because I implemented MD5 in COBOL-85), that's not what any sane person would do anyway; they'd call a library function to do the actual hashing.
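(To make that concrete: a throwaway Python sketch, mine and certainly not whatever the exchange is running, of a Merkle root plus the degenerate one-branch chain, with hashlib's SHA-256 standing in for whichever hash a real system would use.)

    import hashlib

    def h(data):
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        # Hash the leaves, then pair-and-hash level by level until one root remains.
        level = [h(leaf) for leaf in leaves]
        while len(level) > 1:
            if len(level) % 2:                  # duplicate the odd node out
                level.append(level[-1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    def append_block(prev_hash, records):
        # The degenerate case: each block just commits to the previous block's hash.
        return h(prev_hash + merkle_root(records))

    chain = [h(b"genesis")]
    chain.append(append_block(chain[-1], [b"credit 10", b"debit 3"]))
    chain.append(append_block(chain[-1], [b"debit 7"]))
    print(chain[-1].hex())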

RIP: Creators of the GIF and TRS-80

Michael Wojcik Silver badge

Re: And how about GINGER, GIST, GIBBERISH, GIBLET, and GIN?

"Why the long face?"

Michael Wojcik Silver badge

Re: Loved my TRS-80 as a kid.

I did quite a bit of experimenting with a Model I as a student, and then later took programming courses (COBOL and Pascal) using Model IIIs. I did get a chance to play a bit with a Model II, which was completely unrelated to the I and the III; it had 8" floppy drives, which were already unfashionable at the time but possessed a businesslike air. "Oh, those 5 1/4 drives are for kid stuff."

I had a very interesting book that described an alternate OS for the TRS-80 Model III, and included annotated Z80 assembly for it. I never got around to trying to enter and assemble the whole thing and get it to run, but I did sketch out some routines for floppy access — which I'm sure wouldn't have worked as written, but were a good thought experiment.

None of the original line did raster graphics — they had block graphics with something like 1/6 of a character cell as the element, if memory serves — so you couldn't view a GIF on them. Perhaps on the CoCo?

Along with the Commodore PET/CBM, VIC-20, and 64; the Apple ][ and //e; and the Atari 400 and 800, the TRS-80s were formative 8-bitters of my youth, and gave me a lot of insight not just into programming but into system design and the differences among CPUs and machines. Really helped me avoid the "all the world's a VAX" syndrome.

British cops arrest seven in Lapsus$ crime gang probe

Michael Wojcik Silver badge

More info

Krebs has a good write-up.

There are some interesting points in the article. One is that the alleged Lapsus$ leader, the Oxford teen described in this story (whose name has been leaked but I'll follow convention and omit it), was doxxed by members of Doxbin, a doxxing site he purchased a year or so ago. Apparently Doxbin members were not happy with his administration of the site, and he agreed to sell it, but then dumped the content, greatly reducing its value.

But various organizations such as Unit 221B say they knew who he was before the public doxxing. Apparently some security researchers and the police have had him under investigation for months while they gathered evidence, and 221B's Allison Nixon mentioned alerting some victims before Lapsus$ attacked them.

Most of the big ransomware organizations are run in a fairly corporatist manner, as shown by the Conti leaks for example, or are run by governments in a bureaucratic or military style. Consequently they optimize for income or damage. Lapsus$ seems to have been going for reputation and press attention, which is not a great strategy for staying in business if you're running a criminal enterprise.

Michael Wojcik Silver badge

Re: I'd say to hire them...

Lapsus$ were successful primarily through extensive research and social engineering. One of their tactics was to gather a long list of employee names and the like, and then swamp help desks with requests for password resets.

Their differences from other prominent ransomware groups include going after high-profile targets, doing the work themselves rather than recruiting an affiliate network, and making more focused efforts rather than simply going for low-hanging fruit.

HP finance manager went on $5m personal spending spree with company card

Michael Wojcik Silver badge

Bling is functional. Veblen goods provide social signalling and satisfy psychological cravings that some people acquire through exposure to certain cultural artifacts.

Those may be functions you don't care about. Good for you; that's one (small) part of acting like a relatively efficient economic agent. It's a quirk of psychology, though. Nothing more.

I'm not interested in Veblen goods either. Some people — and I'm not saying you're one — regard that as some sort of moral superiority, and I think that's mistaken; it's far more complicated.

Michael Wojcik Silver badge

Re: Weird

There's a psychological pathology at work, at any rate. I've seen this in other cases, where people embezzle recklessly, with little or no realistic hope of getting away with it. It seems to be compulsive.

There was a somewhat similar case where the company treasurer for a Michigan firm embezzled millions for a 419 scam. You might think someone in that position would have to be able to recognize a 419 — it's not like they're sophisticated — but she just kept pumping company funds into it.

Hackers weigh in on programming languages of choice

Michael Wojcik Silver badge
Coat

Re: Who wrote this survey?

Even stock vim comes with the built-in :make command, which will drive make-based toolchains on most platforms with IDE-like behavior: it builds the target and scrapes warning and error messages, which you can then navigate to in the source using the :c* commands.

And stock vim comes with a lot of filetype and syntax macros and plugins, for things like syntax highlighting. And fancier plugins such as netrw and tohtml. And Towers of Hanoi.

Personally, I describe my IDE as "bash plus vim plus every other CLI program on the machine". But vim has IDE-like features, at least.

Russian IT pros flee Putin, says tech lobby group

Michael Wojcik Silver badge

Re: Russxit

I read an article somewhere — alas I do not recall the source — that claimed Putin's political career has been largely shaped by profound bitterness at what he felt was the USSR's abandonment of him and his KGB compatriots in East Germany during the fall of the Berlin Wall and German reunification. In this reading, his rule over post-Soviet Russia has been primarily concerned with correcting what he sees as weakness in the late USSR and the first nine years of post-USSR Russia, and attempting to create a new Russian empire.

There's a scene in Pamela Dean's Juniper, Gentian, and Rosemary where Gentian and her father are discussing Gentian's recently-thwarted adversary, who is also The Adversary, if you take my meaning. Gentian says she thinks she hurt his feelings, and her father remarks that "his feelings are permanently hurt". (Milton of course makes a similar observation, at somewhat greater length.)

That's how I imagine Putin. His feelings are permanently hurt, and he's happy to burn the world out of spite.

IBM files IP lawsuit against mainframe migration firm

Michael Wojcik Silver badge

Re: Lawyers and Tech...

Even though the PoOp documents the instruction set, some aspects may be patented. Patents are public documents too, after all; that doesn't mean you can just implement what they describe with no consequences. Note that IBM says two of the allegedly-violated patents are for "methods embodied in IBM mainframe instructions".

There are also potential issues in how you implement the instruction set. Two of the allegedly-violated patents are for optimizations. z is a CISC architecture; there's a lot of wiggle room in implementing some of those opcodes. That also means trade secrets and reverse engineering might apply, if IBM can demonstrate that LzLabs used optimizations that are present in z, aren't documented (in the PoOp, the patents, or anywhere else), and, it can convincingly argue, wouldn't have been arrived at independently. That would support their charge of license violations.

But, yeah, the other licensed software has to be a sticking point for this sort of thing.

FIDO Alliance says it has finally killed the password

Michael Wojcik Silver badge

Re: So much fail

Exactly. Smartphones are:

  • Fragile, but carried everywhere and frequently handled, and so frequently broken.
  • Expensive, with a high resale value and often with valuable contents, and so frequently stolen. And since they're expensive, they're unavailable to people of severely-limited means, which can include people with access to sensitive systems.
  • Not infrequently misplaced or lost.
  • Extremely complex and loaded with software of dubious quality, making them unreliable at the software level as well as the hardware.
  • Equipped with an enormous attack surface, and frequently compromised.
  • In my experience, often used without a passcode or other even minimal protection.

As authentication tokens, smartphones are abysmal. It would take some effort to devise a worse choice.

The FIDO alliance has spent most of its history barking up the wrong tree. This is no exception.

How not to attract a WSL (or any) engineer

Michael Wojcik Silver badge

Re: A fail on both counts

Or people with decent performance, plus a comfortable ride and ample room for cargo.

Hear us out: Smartphone lidar can test blood, milk

Michael Wojcik Silver badge

Liquid testing with smartphones, previous edition

See Yue and Katabi, "Liquid Testing with Your Smartphone", CACM 64.10 [Oct 2021]. (Everyone here is an ACM member, right?)

Yue and Katabi used an actual smartphone — an app they wrote running on an iPhone — to perform various tests on liquids by 1) vibrating the container (using the phone's vibrate function) to create wavelets, and 2) imaging the reflections of the wavelets off the bottom of the container (using the camera and light, obviously). That lets them compute the fluid's surface tension to nearly the same precision and accuracy as a lab tensiometer.

With that information, their app was able to do things like detect certain types of water contamination, test albumin concentration in urine, and determine the alcohol content of a beverage.1 Apparently there's some application for that last one.

The paper is worth reading. They had to deal with some interesting problems, and their solutions are intriguing. They also cite some related prior work, of course.

So while this latest work is interesting and potentially useful, and having multiple techniques at hand improves sensitivity and range of applications, it's not the first successful stab at "let's use the existing smartphone sensors to do chemical analysis".

1If you're trying this at home, don't mix up the containers for experiments 2 and 3.

Microsoft Visual Studio: Cluttering up developer disks for 25 years

Michael Wojcik Silver badge

Re: @Richard Speed -- Wait...Wha'?

Bloat hurts cache utilization, too. It's not just RAM footprint that matters.

Personally, I loathe all the modern IDEs I've used. None come close to the power of bash. Sure, it doesn't do much for software development by itself; but I can plug in my choice of editor, debugger, build system, etc., and indeed use a variety of them in the same session. And I have all sorts of tools to eliminate repetitive tasks.

Why should combining all of your toolchain into a monolithic application be a good idea?

An open-source COBOL contender emerges

Michael Wojcik Silver badge

Re: "guaranteed a job."

Oh, be reasonable. Are you telling me that EXEC CICS BIF DEEDIT FIELD(ws-foo) LENGTH(7) RESP(ws-bar) RESP2(ws-baz) END-EXEC1 is not intuitively obvious? Why, it's clear as day!2

But that's a legacy code base for you. Vast landscapes of undocumented, largely comment-free source — when source is available at all. Sometimes people just have faded listings on paper. Sometimes no source code at all. No one knows exactly what operations and calculations those applications perform. Replacing them would be a major feat in software archeology.

And why take the risk, when you can put a modern3 front end on the thing, by screen-scraping, or using an environmental mechanism like EXCI, or tacking a web API over the business logic? You get a new "user experience", management is satisfied, and none of the incomprehensible tangled mess needs to be unpicked.

1Not to be confused with BIF DIGEST, which does something entirely unrelated.

2It's snowing quite heavily here.

3For arbitrary definitions of "modern".

Michael Wojcik Silver badge

ISO COBOL won't get you far

Writing a COBOL front-end for GCC is a nice project. And there's some green-field COBOL development, and some number of existing COBOL applications which don't stray too far from the ISO standard; thus the success of GnuCOBOL and COBOL-IT.

But the standard allows tremendous latitude, and implementations have dramatically different behavior. Nested PERFORMs behave differently on different platforms, for example, and the differences control where the program ends up at the end of a PERFORM, so that matters a bit. MF Visual COBOL currently has 17 dialects to pick from at compile time, more than 30 options for tweaking the dialect, and 50 or so other ways to modify what sort of COBOL the compiler actually implements. (And that doesn't include the OO or managed dialects, conditional compilation, or other runtime-behavior options.)

And as others have pointed out, most real-world COBOL applications will depend heavily on the environment, whether that's one of the IBM mainframe ones such as CICS or something else. Yes, there are COBOL programs which just use POSIX / SUS or Windows APIs; but they're in the minority.

Michael Wojcik Silver badge

Re: "guaranteed a job."

We have plenty of COBOL customers who are not using CICS or IMS or z batch-mode environments.

It does seem to be common to tie COBOL jobs to particular environments: "we want someone who can write COBOL for Siemens BS2000", as if people capable of working on that machine are born that way and couldn't possibly have learned it.

That said, the COBOL standards leave ample room for major differences among implementations, and pretty much all of them have extensions. So you can't just move COBOL expertise from one implementation to another. And it's also true that real-world COBOL applications will have to deal extensively with their environments. But these are things which can be learned.

Michael Wojcik Silver badge

Re: How did it come to this? - Draft standard

Yeah, for many people, the draft standard is fine. You really only need to pay when you need to say you've consulted the actual, official thing.

Still, it's a problem, and I wish more commercial software-development organizations just budgeted to subscribe to the standards relevant to their work, for all their developers. I've settled too many language-lawyer questions just because I paid (out of my own pocket) for the standards.

For C90, back in the day, many people bought Schildt's The Annotated ANSI C Standard, which contained the complete text of C89 (the ANSI version, adopted wholesale by ISO the following year). The book was cheaper than the standard document from ANSI, but contained all of its text, plus Schildt's commentary.

The standard review on comp.lang.c was that the cost difference was the value of Schildt's annotations. Still, you were free to ignore those.

Chip world's major suppliers of neon gas shut down by Ukraine invasion – report

Michael Wojcik Silver badge

Subhead

Well that brought back memories.

I was at the Modern Language Association's annual convention1 one year and saw a panel on anime, where one eager young gentleman waxed enthusiastic about Neon Genesis Evangelion. He all but called it the greatest work of art in history. I thought, well, Gainax has pulled off one of the more elaborate trolls there.

(I mean, I enjoyed NGE, in a vaguely bemused way; but it's no Gunbuster.)

1A tolerably large affair at the time, in the early 2000s.

Startups competing with OpenAI's GPT-3 all need to solve the same problems

Michael Wojcik Silver badge

Re: Magic Sales Bot

On a more serious note, machine prose generation is already widely used commercially, for example in sports and financial reporting, and in the niche non-fiction market. Philip Parker's system is the best-known example of the last; his Icon Group International claims to have published over a million titles, either in electronic form or print-on-demand.

The first documented commercially-published machine-generated novel seems to have been True Love from SPb, in 2008. It wasn't very novel, actually, being a style-transformed pastiche of Anna Karenina,1 but it was produced by software. There appear to be a number of commercial projects in this area so it's likely there are quite a few more commercially-published machine-generated novels by now.

And of course there is extensive academic and commercial research in the field of machine-generated prose, and related fields such as machine-assisted prose generation.2 Doyle's efforts are not representative of the state of the art. A few years back CACM had a cover story on computational journalism, for example.

(Note, too, that in some other fields computer-generated "creative" work has been around longer and had greater success. Computer-generated classical music was winning praise from critics in the 1990s.)

1If you're going to steal, steal from the best.

2I've done some work in computational rhetoric, for example. Computational narratology, computational adaptations of formalist and structuralist literary analysis, computational folklore, computational psychology ... there are many cognate fields of research.

Michael Wojcik Silver badge

Re: "lack of common sense and inability to be accurate "

Only for an extremely reductive and unhelpful definition of "statistics".

Now show us evidence the human CNS is doing something fundamentally different.

Shazam! Two world-record lightning events recognised

Michael Wojcik Silver badge

Headline

Houston Fires Lightning Cannon at Mobile, Misses

Idea of downloading memories far-fetched say experts after Musk claim resurfaces in latest Neuralink development

Michael Wojcik Silver badge

Re: Not gonna happen any time soon.

"Sure, it's pretentious irrational bullshit, but it's pretentious irrational bullshit in a good cause."

No thanks. That's an excuse which has been used to cover a multitude of sins. I'm not going to give Musk a pass on this.

Hardware boffin starts work on simulation of an entire IBM S/360 Model 50 mainframe

Michael Wojcik Silver badge

Re: the computer on which virtualisation was invented.

Citation, please.

The Atlas was the first machine to have virtual memory, including simple memory protection, but I can't find any reference (aside from your post) that claims it had virtualization. And the Atlas Supervisor paper doesn't say anything about virtualization either.

Carked it, Diem? Zuckerberg's grand cryptocurrency thing may sell off assets for $200m

Michael Wojcik Silver badge

Re: What is the point?

Yes, Pascal's post was impressively wrong. We've seen him grind that axe several times before, though, and I expect we'll see it again.

To go back to the original question: There are many "stablecoins". The argument for them (which, as with all cryptocurrency projects, I do not find particularly persuasive) is that they combine the partial anonymity1 of cryptocurrency with better price stability.

There was a really nice overview of a number of stablecoin proposals and implementations, discussing the technical and economic implications of their design choices, in CACM (if memory serves) a couple years back. In many ways, the research around cryptocurrencies is far more interesting, and useful, than cryptocurrencies themselves.

1There's a substantial body of research on de-anonymizing cryptocurrency transactions using just the metadata available, to say nothing of various attacks against poor implementations and OPSEC, or making use of normal HUMINT (e.g. getting people to turn state's evidence).

Machine needs more Learning: Google Drive dings single-character files for copyright infringement

Michael Wojcik Silver badge

Mani had prior art circa 250 AD. Your "electronic states" bit is just an implementation detail.

Now, if you'd said "electronic states with rounded corners"...

IBM confirms new mainframe to arrive 'late in first half of 2022'

Michael Wojcik Silver badge

Re: odd wording

Not a problem, since IBM mainframe "MIPS" don't mean MIPS anyway, and haven't for a long time. They're a metric IBM derived years ago when z started getting architecturally more complex.

So they're only useful for comparing different z models, or figuring a rough estimate of workload size.

Michael Wojcik Silver badge

Re: Masochist

"Work consists of whatever a body is obliged to do. Play consists of whatever a body is not obliged to do."

Anyone who wants to get a taste of IBM 3xx life can try out the free Hercules emulator and the old IBM OSes and other software which is legally available for download and use on it. It's not the same as modern z, but quite similar in many ways, and the look and feel hasn't changed (at least if you still do things the traditional way).

Of course it might not be easy to find your way around. I believe there are various walkthroughs for getting things up and running with Hercules, though, and there are books for aspects of using z-and-predecessor OSes, such as Fake Your Way through MVS.

Michael Wojcik Silver badge

I think there aren't many niches for such

Selection bias.

The bulk of my work is in IBM mainframe environment emulation, so I see customers with z systems all the time. If you don't work with z or its predecessors, you're less likely to run into them. And they're expensive beasts to lease and maintain, so they don't proliferate the way Windows and Linux systems do. But there are plenty out there.

Just look at 3270 emulators. That's software for the particular proprietary green-screen terminal family that z uses.1 Considering that a great many mainframe apps are front-ended with Java or web UIs, or have no UI at all, that's got to be even more niche than the machine itself.

And yet you have free 3270 emulators that are still maintained and widely available, such as x3270/c3270. And you have a wide range of commercial emulators from various vendors: IBM's PComm; our HA Cloud, Reflection, and Rumba lines; bundled ones as in Microsoft's HIS; and various smaller offerings such as BlueZone, Rocket, and Turbosoft. All still viable products. And you have people writing new Internet Drafts proposing enhancements to the TN3270E protocol, and so on.

IBM wouldn't be creating new models if the market weren't there.

Of course, they're heavily virtualized (VM/CMS was nearly 30 years old when VMWare was founded), so it's rare to see a physical system, which might be running hundreds or thousands of LPARs across an organization.

1System i and its predecessors such as the AS/400 use 5250, of course.

If you want less CGI and more real effects in movies, you may get your wish: Inflatable film studio to orbit Earth

Michael Wojcik Silver badge

One-way ticket?

claims to be working on "the first ever Hollywood motion picture filmed in outer space," reportedly involving Tom Cruise

This is going to be a problem.1 No one's managed to bring Cruise down to earth yet.

1For small values of "problem".