The Steve Jobs of supercomputers: We remember Seymour Cray

Before Steve Jobs, there was Seymour Cray – father of the supercomputer and regarded as something close to a god in the circles he moved in. Jobs’ Apple Computer is reputed to have bought one of Seymour’s massive machines – a Cray – back in the day to design the brand-new Macintosh personal computer. This would have been a …


  1. Aristotles slow and dimwitted horse

    Pretty sure...

    The first real "CGI" in movies was done on a Cray X-MP. The Last Starfighter and such like...

    1. Richard Wharram

      Re: Pretty sure...

      Tron was done on a Foonly was it not?

      1. Bit Brain

        Re: Pretty sure...

        3 different companies did the CGI for Tron. According to "The New Magicians", an episode of the old TV series Equinox, one of them used the X-MP.

      2. Michael Strorm Silver badge

        Re: Pretty sure...

        Interestingly, despite the fact it *did* include quite a bit of innovative CGI, the majority of Tron's "high tech" look was done using non-digital masking, layering, film processing techniques and backlighting. (#)

        It certainly wasn't the first film to use any of these techniques- indeed, backlit animation was very popular in the early 80s for that neon/computer look back when real CGI was limited and expensive. However, it's been observed that it was probably the first (and will likely remain the only (##)) film to use them in such an ambitious and extensive manner- basically, every scene inside the computer that isn't CGI uses these film processing techniques to some extent.

        Ironically, Tron's reputation as groundbreaking CGI has overshadowed this (also impressive) use of traditional filmmaking techniques in such an original way.

        (#) See my comment on "Max Headroom" elsewhere for another example of early-CGI-that-wasn't-actually-CGI-at-all.

        (##) I say this because- with the huge technical advances and reductions in cost of actual CGI since Tron came out- there's no way anyone would do it that way today. Even if they wanted to replicate exactly the same appearance and feel, it would still be so much easier to do digitally that no-one with that sort of budget would do it the incredibly tedious and error-prone analogue way.

      3. John Smith 19 Gold badge
        Unhappy

        "Tron was done on a Foonly was it not?"

        From an article at the time, Tron used 3 different systems for different scenes in the film, with different models (wire frame versus frame buffer) and at least one using custom-built hardware.

        The article said it bankrupted all three production houses involved.

        The Last Starfighter was all done on a Cray and IIRC the company was quite proud that the animation got done at about 1/24 real time, i.e. 1 frame a second, from scratch.

        So pretty much what a competent Blender user could achieve on a new PC.

    2. Amorous Cowherder

      Re: Pretty sure...

      A Cray was used as a fancy prop in the Robert Redford/Ben Kingsley flick Sneakers. Redford, after being knocked unconscious, wakes up on the "seated" part of the Cray beside Kingsley, the evil genius paid by the Mafia to run their office "clerical" systems!

      1. oldcoder

        Re: Pretty sure...

        No. They wanted to use a Cray, but the company said no - they would never sell to a Mafia-backed company.

        So they used a fake that had some of the visual cues of a Cray.

  2. Anonymous Coward
    Anonymous Coward

    A superfast computer ...

    ... and a comfy bench when you're all calc'ed out.

    1. Teiwaz
      Coat

      Re: A superfast computer ...

      It's a wonder there is no 'themed' furniture, or lego?

      Maybe next time I'm building my own desktop I'll try for a mock-up (think I still have an 'action man' or two in a box somewhere to put on the seats).

    2. Ilgaz

      $38M chair, eh?

      You can also plug in your phone. Read what they did in the '70s:

      http://www.0x07bell.net/WWWMASTER/CrayWWWStuff/Cfaqp2.html#TOC12

      Regular Crashes

      The following is an approximate description of an event that took place in the late '70s:

      There was the time that an early Cray-1 serial number was running pre-ship reliability tests and was crashing consistently at roughly the same time in the pre-dawn morning hours. After much head scratching, somebody realized that the (newly hired) third shift custodial engineer was coming through on his regular pass through the checkout bays at about the time the failures happened. He was questioned when he came in the next night to find out if he might have been moving or operating any of the system machinery in the bay during his rounds. He denied doing anything of the sort. Later that night however he was noticed plugging his Hoover into one of the 60Hz utility outlets conveniently located on the base of a Cray-1 power supply 'bench' in an adjacent checkout bay. The outlets were handy for plugging in test equipment during checkout activities but an ECO (engineering change order) was issued soon afterward removing them.

      1. Tcat

        My 3 Daemons

        For my mid-life crisis I went into IT training.

        EMI, RFI and static were front and centre from minute zero when the real training began.

        I remember, with some irony: the police car radio triggering the stereo store's burglar alarm (when parked on one side of the building); a DSU/CSU mortally wounded because the power vacuum met the data cable; and the ID10T that ran Cat 3 inside an elevator shaft in active use.

        And the poor sod who generated so much static in his office that I found him an anti-static tech mat. To this day I keep in good service a pair of Birkenstocks with ESD protection built into the heel.

  3. Andy The Hat Silver badge

    Dribble mode <engaged> ...

    Sorry, I always loved the Cray-1 ... the Cray X-MP wasn't bad either ... vector processors ... ooh missus get your fingers off my keyboard!

    It may date me but I'd now be happy if I could simply have a Cray-1 as a seat - like a computer version of Clarkson's Top Gear furniture ... Full geekdom would be in sight :-)

  4. Anonymous Coward
    Boffin

    What use as a bitcoin miner?

    1. Dani Eder

      Bitcoin miner

      These days, anything other than custom bitcoin mining hardware isn't worth the electricity it consumes.

      The bitcoin network currently runs at 5.5 million petaflops. That far outclasses the 361 petaflops of the world's top 500 supercomputers *combined*. The reason the network is so fast is that these days it runs on custom chips that implement the hashing algorithm in hardware and are massively parallel. The downside is that these chips are useless for almost any other purpose: you can't reprogram them to do some other calculation, because the algorithm is wired into the silicon.

      The reason these custom chips were worth making is that newly mined bitcoins are worth $315 million a year at the moment, more than enough to justify a custom chip.
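      For the curious, what those chips grind through is (to the best of my knowledge) nothing more exotic than SHA-256 applied twice to a block header, repeated with different nonces until the result falls under a difficulty target. A toy Python sketch of that inner loop - the header bytes and target here are made up for illustration, not real block formatting:

        import hashlib
        import struct

        def double_sha256(data: bytes) -> bytes:
            # Bitcoin's proof-of-work hash: SHA-256 applied twice.
            return hashlib.sha256(hashlib.sha256(data).digest()).digest()

        def mine(header_prefix: bytes, target: int, max_nonce: int = 2**32):
            # Try nonces until the double-SHA256 of (prefix + nonce),
            # read as a little-endian integer, falls below the target.
            for nonce in range(max_nonce):
                header = header_prefix + struct.pack("<I", nonce)
                digest = double_sha256(header)
                if int.from_bytes(digest, "little") < target:
                    return nonce, digest
            return None

        # Deliberately easy target so the toy example finishes quickly;
        # an ASIC does this same computation billions of times per second.
        print(mine(b"example-header-", target=2**244))

      The point being that the silicon only ever does that one computation, which is exactly why it is useless for anything else.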

  5. Anonymous Coward
    Devil

    A question

    Steve Wozniak built the Macintosh in order to democratise computing

    So why are they so fucking expensive?

    1. Teiwaz

      Re: A question

      "Steve Wozniak built the Macintosh in order to democratise computing

      So why are they so fucking expensive?"

      Ah, I see your mistake was in thinking 'democratise' in the traditional sense, not in the modern sense where money buys legislature.

      But then Steve Wozniak's thinking was probably the former; the other Steve was more of the latter.

    2. James 139

      Re: A question

      Probably a combination of things, such as seeing the price people were willing to pay for a Cray.

      As for democratisation, don't confuse available to all with achievable by all.

    3. Steve Todd

      Re: A question

      To my knowledge Woz wasn't involved in the design of the Macintosh. He designed the Apple I and Apple II to that end, and they were cheap compared to commercial kit at the time.

      PC manufacturers these days have two basic options: compete on price (which leads to a race to the bottom), or compete on features (at which point PCs and Macs are at about parity for a given price point).

      You're not forced to buy a Mac, in the same way you aren't forced to buy a BMW rather than a Ford. Providing there are enough people out there who think that BMW/Apple are worth the extra then they make a living. That seems fundamentally democratic to me.

      1. MondoMan

        Re: A question

        Yep, the Macintosh was famously Jobs' baby. The democratisation anecdote refers to the Apple I/II, not the Macintosh.

      2. This is my handle

        Re: A question

        > or compete on features (at which point PCs and Macs are about on parity for a given price point).

        I hear this all the time from Apple users, and have always been skeptical, at least for US pricing. "One 3rd the number of mouse buttons, 3 times the price" is what I've always said.

        I'm eating my words though; parity is near. The local BestBuy offers the machine below for $1K US before taxes: Apple - MacBook Air® (Latest Model) - 13.3" Display - Intel Core i5 - 4GB Memory - 128GB Flash Storage - Silver. Comparable WinDoze machines vary from $800 - $1,400 depending on whether or not you want "touch", or will reformat the Win10 drive w/ Ubuntu, LOL.

      3. TheOtherHobbes

        Re: A question

        Woz was distantly involved in the pre-Jobs Jef Raskin-Mac. Raskin wanted a $500 computing appliance, with a supporting national network (!) and some unusual software (like a pseudo-APL "calculator") running on a conservative hardware spec to keep costs down.

        Jobs liked the all-in-one idea but wanted a more powerful spec because he'd been to PARC by then, and a 6809 with 64k wasn't going to make it happen.

        Raskin and Jobs weren't a mutual fan club, so Raskin left (he never forgave Jobs), leaving Jobs with the idea of friendly+cheap and a brilliant design team.

        The friendly part stuck, the cheap part didn't. Nor did Raskin's unusual software ideas - some of which would still be interesting today.

    4. Anonymous Coward
      Thumb Up

      Re: A question

      Yep, Commodore et al did more toward that goal!

    5. Anonymous Coward
      Anonymous Coward

      Re: A question

      Because they design and integrate both the software and the hardware so it works.

    6. SImon Hobson Bronze badge

      Re: A question

      > So why are they so fucking expensive?

      Ah, that was democratise as in ... make accessible to mere users, not just the nerds. And at the time, other computers (of decent spec) weren't exactly cheap!

      The vision (and as pointed out, it was Jobs, not Wozniak) was to make a computer that was easy enough to use that anyone could use it - that's the democracy bit. Instead of having to learn loads of nerdy crap to do anything, you just had this simple visual desktop metaphor that just about anyone could get to grips with.

      Alongside that was possibly the most important feature - a printed 3 volume set of developer info, Inside Macintosh, one whole volume of which was on how to "do the right thing" with the interface. Most developers followed this and so their programs were easy to pick up and use. For the few that ignored it, the users generally told them where to stick their crap UI and the developer either fell into line or the program flopped.

      There was much detail in the dev books (I had a set). Even the mundane things like "thou shalt have an Apple menu, and it shall contain ...", and "the next menu will be File, and it will contain ...", and so on.

      For those too young to remember, this really was a major milestone. Even without the graphical interface, just the user interface consistency was a major thing - back then every (almost always text-based) program "did its own thing", so having learned one program was naff all help in using another, because every developer had their own idea of how it should be done.

      And the general lack of modality was another breakthrough. At around this time, in ${dayjob} the standard word processor was IBM's DisplayWrite. This was highly modal - you went into one menu from which you could edit a document; if you wanted to print you had to save the job, exit that menu, go into the printing menu, print the file, and then go back to the edit menu to continue editing. I don't recall if it had background printing or whether you had to wait for printing to finish before doing something else.

      On the Mac it was all event driven - and that challenged some developers who were used to letting the user do half the work. So in a word processor, you could be typing away, and without a thought just whizz up to the File menu, select Print, and the program had to cater for that.

      I never got into "classic" programming on the Mac, but I did get to do a fair bit in HyperCard. For its day, that too was a real breakthrough, with fairly simple (but capable) programming, object orientation, and rudimentary database capability. I know a lot of stuff was built with that!

    7. Dadmin
      Holmes

      Re: A question

      Because they bothered to put more thought into the design and the coupling to the OS than anybody else in the biz did, or does. Apple only produced a single architecture at a time and took great pains to make it all "look and feel" better than your experience with any other desktop machine of the era. To that end, it's easy to build the entire computing experience - from opening the box, to setup, to booting, to working on it - when you're all one company doing one thing. Think about it: who else bothers to do this? Apple had an entire half of a building floor filled with Human Interface design engineers. This was back in the late 80s and early 90s when I worked there. No other computer hardware or software manufacturer went to these lengths to get the entire system to work together in a fluid way that makes you concentrate on your work, not on how to get past a fucking giant Windows modal dialog box that thwarts you from doing even the most simple tasks.

      Linux desktops are getting quite good, and I'm moving my primary home desktop from an old Mac Mini to a Raspberry Pi setup, with some time to go before I completely cut over, or at least as long as this old Mac stays running. But, as to your question: it's the amount of time they spent on crafting their product, plus supply and demand, that justifies the price. You pay a bit more for the design and the convenience factor, not because they put a cheap Intel heater inside it.

      Some notes relating to the article:

      -The first Cray at Apple Computer was a Cray X-MP

      -It was custom painted purple at a nearby auto body shop in Sunnyvale CA

      -The second Cray they purchased was the Cray Y-MP

      -I was present the day we ALMOST had a halon dump in the main Valley Green 3 computer room. This was back in the early 1990s and I was but a lowly network apprentice. Somehow the air in the computer room became much more humid (it's those chillers, I tell you!); at any rate the smoke detectors under the raised floor got a bit misty-eyed and started the countdown to HALON DUMP TIME! Luckily one of the Unix admins quickly assessed the situation and hit the "don't fucking dump the halon, no fire" button. Good times.

      -This is the same admin who showed me how to stitch together multi-part uuencoded porn pics from USENET. Good guy.

      1. Dan 55 Silver badge
        Windows

        entire computing experience... one company doing one thing... who else bothers to do this?

        At the time, Commodore, Atari, Acorn...

    8. Captain DaFt

      Re: A question

      "Steve Wozniak built the Macintosh in order to democratise computing

      So why are they so fucking expensive?"

      Well, it's all a matter of scale. The iPhone may be pricey, but it pales in comparison to what a Cray cost, and it can out-compute a room full of them.

      Anyone can buy an iPhone; only governments could afford the Cray.

      That looks like democratisation to me.

      1. Steve Davies 3 Silver badge

        Re: A question

        But..... I've still not seen anyone brave/foolhardy/idiotic/etc wearing an Apple Watch.

        Even in the Apple Store in Washington DC I visited earlier today (to get out of the rain...) I failed to spot anyone wearing one.

        Bit of a damp squib if you ask me.

        On the other hand, a quick glance around the BA lounge at Dulles showed hardly a non-Apple device in operation.

        Make what you want of that.

        Kudos to the Apple Store though, they did have a Mac Pro on show and working.

        Oh, and the Computer Museum at Bletchley Park has a Cray-1 complete with seats.

  6. Little Mouse

    In the nerdy corner of the school playground in the eighties I remember clearly that Cray computers were considered the absolute ultimate when it came to processing power.

    My mate swore blind that using one was absolutely definitely the ONLY way that Max Headroom could have been made (!)

    1. Teiwaz

      My mate swore blind...

      I'd a mate at school who always tried to outdo everyone, and didn't let his ignorance get in the way. He once claimed he was getting a Cray (I think he got an MSX, in the end).

      1. Destroy All Monsters Silver badge
        Childcatcher

        Re: My mate swore blind...

        I think he got an MSX, in the end

        Damn you Greg. Long time, no see!

      2. PaulyV

        Re: My mate swore blind...

        Ah, every school had one of those. Ours was called 'Peter' and it wasn't long before he had the nickname of 'The Incredible BullSh*tting Man'.

    2. Michael Strorm Silver badge

      Max Headroom

      @ Little Mouse; The irony- as I suspect you know, but others might not- being that Max Headroom himself wasn't CGI at all, but actor Matt Frewer wearing a load of prosthetics. I'm not sure anything approaching Headroom's appearance would have been possible- let alone practical- with computers of the time. (People- even intentionally fake-looking ones like Headroom- were always much harder to do convincingly for early CGI than shiny, flat-surfaced spaceships and plastic balls).

      Admittedly the effect was enhanced (I'm assuming) by what would then have been state-of-the-art digital editing effects et al, but that's still not CGI in itself. The rather simpler background graphics in some later episodes were apparently created on an Amiga, but that's hardly in the same league of complexity.

      1. Anonymous Coward
        Anonymous Coward

        Re: Max Headroom

        Max-Headroom was fantastic, shame it didn't last.

      2. Frank Bough

        Re: Max Headroom

        The BG gfx weren't done on an Amiga. I used to work with the guy who did them; I think it was a Matisse system that they used. Not sure.

        1. Michael Strorm Silver badge

          Re: Max Headroom

          @Frank Bough; Re: the background graphics, I was only going by what was written in the Wikipedia article (hence "apparently"), which claims that the Amiga was used for backgrounds in the later US TV show and that they were originally done by hand by the same guy who did the pseudo-computer-displays for the Hitchhiker's Guide to the Galaxy TV show. Neither of these claims is referenced, though, so I've no idea where they came from.

          1. Clueless_Shot

            Re: Max Headroom

            Thanks for a fun-to-read thread.

            I do hope that this can be answered

            I'm curious. Years ago, way back in the eighties, reading through one of my fabulous ZX Spectrum magazines, I recall reading about the BBC using the Cray at the Met Office to render an image of Max Headroom. Does anyone here remember this happening, or were you involved in it? I'm interested to find out, as it's bugging the heck out of me. I can't for the life of me remember how long it took - I seem to think it was a week to render one image?

            Anyone?

  7. Tromos

    Happy days

    I used Cray designed kit throughout most of my early career, loads of CDC 6000 and 7000 series and then Cyber17x. Managed to use (and sit on!) not one but two Cray-1s. Always loved the clean and elegant machine instruction sets. That's probably why I much preferred the Signetics and Motorola microprocessors to the Zilog and Intel ones when the personal computer revolution started.

    1. Michael H.F. Wilkinson Silver badge

      Re: Happy days

      Same here. My first programming was on a CDC 7600, and much later I did loads more serious work on the J932 and SV1e. Cray didn't just provide good hardware, but also cracking good compilers. They could recognize just about the most obfuscated bunch of for-loops as a matrix multiplication and replace the code with some optimized routine from their library. What I really liked about both the SV1e and J932 is how these shared-memory machines managed to attain an average performance of roughly two-thirds of the theoretical peak. Some really nifty scheduling going on, which is VERY hard to accomplish on clusters.
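      For anyone who never saw one of those compilers at work, the sort of thing they pattern-matched is just the classic triple loop below. This is only an illustrative Python rendering (the real code would have been Fortran); the compiler's trick was to spot the loop nest and substitute a hand-vectorised library routine rather than compile it literally:

        def matmul_naive(a, b):
            # The textbook O(n^3) loop nest: C[i][j] = sum over p of A[i][p] * B[p][j].
            n, k, m = len(a), len(b), len(b[0])
            c = [[0.0] * m for _ in range(n)]
            for i in range(n):
                for j in range(m):
                    s = 0.0
                    for p in range(k):           # inner loop is a dot product,
                        s += a[i][p] * b[p][j]   # ideal food for a vector pipeline
                    c[i][j] = s
            return c

        # Today you'd hand this to an optimised BLAS (e.g. numpy.dot) yourself;
        # the Cray compilers made that substitution for you.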

  8. John Smith 19 Gold badge
    Coat

    The cry of all big data apps is

    "Feed me, Seymour."

    "Feed me all night long."

    Time to be gone.

  9. Peter Simpson 1
    Thumb Up

    CYBER 74

    My university took delivery of a previously owned (they got a deal) CDC CYBER 74 while I was there. It was on this machine that I took my required assembly language programming course from the CDC applications engineer who came with the machine. 60 bit words, hardware floating point operations and a "count ones" instruction. "Anyone know why that's in the instruction set?" he asked. No one did. "By special request from a three letter government agency", he replied.

    // much more fun than waiting in line at 3AM for time on the PDP-11

    1. Loud Speaker

      Re: CYBER 74

      The "count ones" instruction was there because it is used for database joins (Codd, Date). It also has some use in image processing (not sure what).

      "Report the highest bit set" is generally found in the same context.

      Debugging Compass was not great fun.
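      A hedged guess at how that join use looks in practice: keep a bitmap per predicate, with bit i set when row i qualifies; AND the bitmaps together, and a population count of the result is the number of matching rows, without touching the rows at all. A small Python illustration (made-up data, obviously not Codd or Date's own example):

        # Bit i set means row i satisfies the predicate.
        orders_from_uk  = 0b10110110   # rows 1, 2, 4, 5, 7
        orders_over_100 = 0b01100011   # rows 0, 1, 5, 6

        both = orders_from_uk & orders_over_100   # rows matching both predicates
        print(bin(both))                          # 0b100010 -> rows 1 and 5
        print(bin(both).count("1"))               # population count = 2 matches
        # On the CDC/Cray machines that last step is the single "count ones"
        # instruction; modern Python spells it both.bit_count().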

      1. oldcoder

        Re: CYBER 74

        It is also good for very fast schedulers. Just assign a bit to signify the queue. On a 60 bit machine, that allowed for 60 priority queues - the first bit set to 1 gives you the queue index.
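        A rough Python sketch of that trick, purely illustrative - the real machine found the set bit with one instruction, and whether the OS scanned from the top or the bottom of the word is my guess:

          ready_mask = 0   # one bit per priority level; 60 bits on the 6000 series

          def mark_ready(queue: int) -> None:
              global ready_mask
              ready_mask |= 1 << queue          # queue now has runnable work

          def next_queue() -> int:
              # Index of the highest set bit, i.e. the queue to service next,
              # or -1 if nothing is runnable.
              if ready_mask == 0:
                  return -1
              return ready_mask.bit_length() - 1

          mark_ready(3)
          mark_ready(17)
          print(next_queue())   # 17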

      2. IvyKing
        Boffin

        Re: CYBER 74

        COMPASS == Complete pain in the ass

        One nice thing about the 6000/7000 series instruction set was that it was really easy to look through an octal dump, compared to what's involved with an x86 hex dump. I also have fond memories of setting A1 to A5 to read memory into the associated X register, and A6 and A7 to write to memory.

        Text processing on the other hand was a royal pain.

    2. DaveB

      Re: CYBER 74

      "Cray supercomputers early on featured a population count machine instruction, rumoured to have been specifically requested by the U.S. government National Security Agency for cryptanalysis applications."

      https://en.wikipedia.org/wiki/Hamming_weight

    3. John Smith 19 Gold badge
      Boffin

      "Anyone know why that's in the instruction set?"

        Small side point: I wonder if it uses the bit counting algorithm outlined in "Combinatorial Algorithms" by Reingold, Nievergelt and Deo, whose complexity varies with the binary log of the word length, takes a fixed number of cycles, and which they "do not give explicitly"?

        As opposed to the other 3 bit counting methods they happily describe in detail.
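        For reference, the standard fixed-step method (which may or may not be the one Reingold et al. decline to spell out) sums bits in pairs, then nibbles, then bytes and so on, so the work grows with the binary log of the word length rather than with the number of set bits. A sketch for a 64-bit word in Python:

          def popcount64(x: int) -> int:
              # Classic SWAR population count: 6 mask-and-add steps for 64 bits.
              x &= 0xFFFFFFFFFFFFFFFF
              x = (x & 0x5555555555555555) + ((x >> 1)  & 0x5555555555555555)
              x = (x & 0x3333333333333333) + ((x >> 2)  & 0x3333333333333333)
              x = (x & 0x0F0F0F0F0F0F0F0F) + ((x >> 4)  & 0x0F0F0F0F0F0F0F0F)
              x = (x & 0x00FF00FF00FF00FF) + ((x >> 8)  & 0x00FF00FF00FF00FF)
              x = (x & 0x0000FFFF0000FFFF) + ((x >> 16) & 0x0000FFFF0000FFFF)
              x = (x & 0x00000000FFFFFFFF) + ((x >> 32) & 0x00000000FFFFFFFF)
              return x

          assert popcount64(0b10110110) == 5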

  10. Anonymous Coward
    Anonymous Coward

    Flight Of The Navigator

    <--- this was also a use of earlier CG.

    Anyone know what it was done on?

    1. Christian Berger

      Re: Flight Of The Navigator

      http://dave.zfx.com/f1.html

      It was the Foonly F1

  11. Bob Merkin

    Typical management

    "Barack Obama issued an executive order in July telling his nation’s technologists to build the world’s fastest and most powerful supercomputer – one exaflop – by 2015."

    Wait until the last minute to give an order, then expect the impossible. I don't think they're going to make it.

    1. Anonymous Coward
      Facepalm

      Re: Typical management

      Well, we did send a man to the moon and return him safely.
