ARM creators Sophie Wilson and Steve Furber

The Story so Far: At Acorn, Sophie Wilson and Steve Furber have designed the BBC Micro, basing the machine on the ageing MOS 6502 processor. Their next challenge: to choose the CPU for the popular micro's successor. Now read on... While Sinclair attempted to move upmarket with the launch of the QL in early 1984, Acorn was …

COMMENTS


  1. JDX Gold badge

    Buy ARM?

    I had ARM shares which plummeted in the .com crash. I wonder if they're a good buy, or if Intel is about to mount its first serious step into mobile devices.

    1. Anonymous Coward
      Anonymous Coward

      Re: Buy ARM?

      http://uk.finance.yahoo.com/echarts?s=ARM.L#symbol=arm.l;range=my;compare=;indicator=volume;charttype=area;crosshair=on;ohlcvalues=0;logscale=off;source=undefined;

      You bought at the peak last time and you're wondering if you should buy at the peak now?

      And with Bubble 2.0 getting into full swing too ...

      I'd say look at their 10-year valuation unless you're day trading.

      1. Glenns
        WTF?

        Re: Buy ARM?

        I'm glad I'm not the only one wondering when Bubble 2.0 will burst... Instagram for $1 billion!! WTF!!

        They are expecting Facebook to be valued at around $100 billion... for a company with increasing costs and decreasing profits ($205m last quarter). I don't get it :S

    2. Anonymous Coward
      Anonymous Coward

      Re: Buy ARM?

      I bought ARM during the .com bubble, have held on to them, and I'm still waiting for them to get back to where they were then! Just about made it during the last couple of months... think I'll hold on to them a bit longer though.

    3. Christian Berger

      Hard to predict

      I'd say it depends on whether Intel or ARM manages to implement a stable "PC-like" mobile platform, leading the mobile market out of its "home computer" era and into its "PC era", when platforms suddenly _really_ span multiple vendors and you can separate the software from the hardware.

      Currently the mobile world is like home computers used to be: you couldn't be sure software bugs were ever fixed. Back then that only meant some minor inconvenience; today it can mean serious security holes. Whereas on a "PC" (or whatever stable hardware platform) you can simply update, upgrade or exchange your operating system without your hardware vendor's consent. It is, in fact, quite easy: you boot from a USB stick or CD-ROM and there you go, running another operating system without touching any stored data.

      This would be the next revolution.

  2. Chika
    Thumb Up

    Thumbs up for not mentioning the name of the person responsible for Acorn's "unravelling".

    Good article, though.

  3. Jefe Mixtli
    Stop

    NOOOOOOO!!!!

    If The Register has one habit that bugs me, it's that their prime articles, and the accompanying banner images, stick around on the right-side bar FOREVER!

    I swear we've been staring at those two guys holding that PARIS plane for MONTHS now.

    Please, don't tell me I'm still going to be looking at this article's banner pic when Christmas comes around!

    (Great articles, though)

    1. Anonymous Coward
      Thumb Up

      Re: NOOOOOOO!!!!

      Agreed, they stick around forever. I could swear they do it on purpose - there's nothing more annoying than having a photo of some lego characters regularly on your work screen over a period of months, people get the wrong impression ;)

  4. Mr C
    Thumb Up

    Nice informative article.

    If this article had anything resembling a "like" button, I'd press it now.

    GJ

    1. VinceH

      You could always use the "Rate this article ..." gadget underneath the link for the comments.

      (Disclaimer: The only time I ever do that is by accident, when reading on my tablet.)

      1. Graham Dawson Silver badge

        I'm not alone!

        I do that all the time, it's so very annoying.

        1. ratfox
          Thumb Up

          Re: I'm not alone!

          Oh no, you are not alone. I suspect web designers will learn to place important links right next to ads in the near future.

  5. Eddie Edwards
    Thumb Up

    Superb

    Almost brought a tear to my eye. A beautiful article about a beautiful piece of technology. My jaw dropped when I read that the ARM worked without Vcc applied :)

    1. Will Godfrey Silver badge

      Re: Superb

      Very much enjoyed the article.

      I might be wrong, but I think the Vcc issue was originally flagged up because the test unit would occasionally crash for no apparent reason, so they started to monitor just about everything and discovered the apparent lack of current. They then reasoned backwards that it was getting power via its inputs and that the crash happened when all inputs (data & address) were zero.

  6. ForthIsNotDead
    Pint

    Brilliant!

    I would just like to congratulate the author of this article. Beautifully written. A real credit to yourself, and The Register, sir. Thank you very much.

    It struck me that Hauser, despite not being the technical guy, is as bright as they come. He clearly recognised the talent he had with Wilson and Furber.

    A completely riveting read. Actually, the story of Acorn & ARM would make a very good book. Ditto Inmos, IMHO.

    Have a good day all.

  7. Glenns
    Thumb Up

    Fantastic article

    This is why I come to the Reg: so much nicer to read a well-informed, structured article as opposed to the usual "my dad's bigger than your dad" fanboy rantings on other sites.

    A*

  8. This post has been deleted by its author

    1. Christine Hedley Silver badge

      Quite a few years back now, though evidently not that far back... Don't know the exact date, sorry.

      1. AfternoonTea
        Paris Hilton

        He's been dressing as a woman since the 80's. Not sure when he had the chop though....

        1. Christine Hedley Silver badge

          "He's been dressing as a woman since the 80's. [...]"

          *She* has. Sorry to nitpick, it's a subject that's rather close to my heart.

  9. Eldnah

    s/pus/put/ on page 3

    Great article. Wow ... running on leakage.

  10. Peter Gathercole Silver badge
    Boffin

    IBM ROMP vs. ARM

    The IBM ROMP chip (aka the 801) was never intended to be a general purpose RISC processor. It was intended to power an office automation product (think of a hardware word-processor like WANG used to sell).

    As a result, although it could function as a General Purpose CPU, it was not really that suited for it. It was never a success because at the time, IBM could not see justification for entering the pre-Open Systems UNIX world. RT 6150 and the 6151 were intended as niche systems mainly for education, although they did surface as channel attached display front ends for CADAM and CATIA run on mainframes (and could actually run at least CATIA themselves). This changed completely with the RIOS RISC System/6000 architecture, where IBM was determined to have a creditable product, and invested heavily.

    In comparison, the ARM was designed from the ground up as a general purpose CPU. Roger Wilson (as he was then) greatly admired the simplicity and orthogonality of the 6502 instruction set (it is rather elegant IMHO), and designed the instruction set for the ARM in a similar manner. Because the instruction set was orthogonal (like the 6502, the PDP-11, and the NS320XX family), it made the instruction decoding almost trivial. It also made modelling the ARM on an Econet of BBC Micros (in BBC BASIC, no less) much easier, which allowed them to debug the instruction set before committing anything to silicon.

    They had to make some concessions on what they wanted. There was no multiply-add instruction, which appeared to be a hot item in RISC design at the time; to keep things simple and within the transistor budget, all they could do was a shift-add (via the barrel shifter), which, although useful, was a barrier to ultimate performance but great for multi-byte graphics operations.

    It was also simple enough so that they could design the interface and the support chips (MEMC, VIDC and IOC) themselves, achieving early machines with low chip counts.

    This is all from memory of articles in Acorn User, PC World, Byte and other publications. Feel free to correct me if my recollections are wrong.

    1. breakfast Silver badge
      Joke

      Re: IBM ROMP vs. ARM

      If ARM and WANG had worked together they could have created the Jenerally Executable Reliable Kompiler, which would have been a major release.

      1. This post has been deleted by its author

      2. Mike Moyle
        Coat

        @ breakfast -- Re: IBM ROMP vs. ARM

        Ah...! You're thinking of the ShortARM™ project.

    2. Anonymous Coward
      Anonymous Coward

      Re: IBM ROMP vs. ARM

      "There was no multiply-add instruction."

      The ARM processor in the original Archimedes had multiply and multiply-add instructions (MUL and MLA), though I seem to remember them being very slow. Perhaps the designers were disappointed because they didn't have the transistor budget for a fast multiply.

      1. starsilk

        Re: IBM ROMP vs. ARM

        The ARM1 chip had no multiply (or multiply-add) instructions. It was used in the Tube coprocessor for the BBC Micro that the article talks about (the £4500 one..). I actually got to use one briefly, long after it was obsolete...

        The lack of multiply was discovered to be causing performance problems, so a slightly revised chip (ARM2), which had MUL/MLA implemented, was used from then on (i.e. all the Archimedes series), although it was a bit of a hack: every instruction took a single clock cycle *except* MUL or MLA, which could take up to 16 clock cycles (still way faster than emulating multiply in software).

        1. Peter Gathercole Silver badge

          Re: IBM ROMP vs. ARM

          @starsilk. Thanks for the correction. I certainly knew about the multiply-add being missing, but I deliberately avoided talking about the multiply instruction being missing, because I just could not remember.

        2. Sandtreader
          Boffin

          Re: IBM ROMP vs. ARM

          MUL & MLA were indeed slow when both sides of the multiplication were variable, but lots of multiplies have a constant on one side, often sparse in bits (e.g. 2^N: 8, 16, 256; or 2^N + 2^M: 10), and the great trick (of ARM assembler hackers like me, and the, at the time, brilliant Norcroft compiler) was to unfold the multiply into shift-adds (one per set bit) using the barrel shifter, one cycle each.
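
          A rough sketch of the idea from memory (register numbers picked at random, and the exact syntax may be off): a multiply by the constant 10 unfolds into two ordinary single-cycle instructions, with the barrel shifter doing the work on the second operand.

          ADD R0, R1, R1, LSL #2   ; R0 = R1 + (R1 << 2) = 5 * R1
          MOV R0, R0, LSL #1       ; R0 = R0 << 1        = 10 * R1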

          One of my most treasured possessions is an original ARM-1 dot-matrix instruction set description with CONFIDENTIAL scrawled over it in red ink...

      2. Wilco 1
        Happy

        That brings back memories...

        ARM1 didn't have multiply instructions. ARM2 did have MUL and MLA, and used the ALU and shifter to perform multiplies 2 bits per cycle. It used early termination, so multiplies by small values were much faster. When writing assembler for the ARM2 I always made sure that the smallest value was in the right place, as X * Y would use a different number of cycles than Y * X... Given the low transistor budget it was the right design, just a pain to optimize for. Today even the smallest ARM CPU has a single-cycle multiply.
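
        Roughly what that looked like (from memory, and the exact early-termination rule is my recollection rather than gospel): MUL Rd, Rm, Rs stepped through Rs two bits per cycle and stopped once the remaining bits were zero, so you wanted the smaller value in Rs.

        MUL R0, R2, R1   ; R0 = R2 * R1 - finishes early when R1 is small
        MUL R0, R1, R2   ; same result, but the cycle count now tracks R2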

  11. cdilla
    Thumb Up

    An excellent read

    Thanks for this well presented and interesting article.

    More of this kind of thing!

  12. Steven Gray
    Thumb Up

    Great article

    I had an Archimedes around 1987 - specifically an A3000 - and its power became apparent running some simple Mandelbrot fractal code in BBC BASIC. I had the original code from some magazine listing and had applied it to my A3000's predecessor, a Sinclair QL.

    The QL's SuperBASIC was as wonderful as it was slow – the Mandelbrot set took 24 hours to draw. Even compiled from Pascal code, it still took 8 hours.

    The Archimedes? 45 of your earth minutes. Astounding.

    1. defiler

      Re: Great article

      The A3000 wasn't out until about 1989/1990 - the 1987 models were the A310, A410 and A440, if I recall correctly. I had the A420/1, which came out in 1989, and the A3000 came out soon after. A great piece of miniaturisation, but hamstrung slightly by the lack of a hard drive. 2.5" drives that could fit inside appeared later.

      As for the Mandelbrot drawing, I converted a BASIC program to ARM assembly and then hand-optimised it. The inner loop was 13 instructions long, and the rest of it was just dumping a value to the framebuffer. It could do 320x256 fractals at 5fps (although, to be fair, it mirrored one half, so 2.5fps). Such was the power of hand-optimised ARM code. You could bash out a program like that in half an hour, and then spend a week teasing out every extra clock cycle. And figuring out that not only was inline conditional execution faster than branching, but that LT is faster than GE (by one cycle).
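
      For anyone who never wrote ARM2 code, a rough sketch of what I mean by inline conditional execution (register names and the label picked at random, BBC BASIC assembler style): rather than branching around a single instruction and paying to refill the pipeline, you predicate the instruction itself.

      ; with a branch - a taken branch costs extra cycles
      CMP   R0, R1
      BGE   skip
      SUB   R0, R1, R0
      .skip

      ; with conditional execution - no branch at all
      CMP   R0, R1
      SUBLT R0, R1, R0   ; executes only when R0 < R1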

      Still that was ARM2/2.5/3, so I guess things have changed since. Now please excuse me - I've been typing this whilst being assaulted by a 3-year-old...

  13. Andrew James

    Great article

    This is the sort of thing I like reading on here. Something that's had some true care taken to make it worthy of its audience.

    And you even managed not to refer to Apple as a Foxconn-rebrander when they were mentioned! Bravo.

    It's staggering to think that the chip that powers billions of phones, cameras, hard drives, tablets, etc. started life as nothing more than an accident of design.

    And I agree, I don't think failure is a word you can use to describe Acorn. It didn't fail; it just evolved beyond itself.

    1. Dave Bell

      Re: Great article

      I agree, it wasn't Acorn and the ARM which failed.

      But there is something about the British industrial and financial environment which seems to let the winnings from these works of genius drift away out of reach.

      It's not just globalisation, and some factory that is so expensive to build that there can only be one on the entire planet. And we can't expect to spot the right investment choice every time. But what is it about this country which turns a successful entrepreneur into somebody fronting a TV show that tests how well people can run a market stall in Essex?

  14. chipxtreme

    I remember my first ARM chip: the ARM250 fitted to the Acorn A3010, bought for £500 from Dixons, which ran at a massive 12MHz - still a lot faster than the A3000s at school. I then went on to a Risc PC 600, which had an ARM610 chip running at 30MHz, later replaced the CPU board with a 40MHz ARM710, and eventually put in a 203MHz StrongARM CPU, which I later overclocked to 287MHz. I even put a deposit down on the cancelled Phoebe computer.

    When that fell through and Acorn collapsed, I eventually bought a Castle Iyonix PC, which had a 600MHz Intel XScale 80321.

    I still have the Risc PC in my mother's loft, but the only ARM chips I use these days are in my Samsung Galaxy Nexus.

    1. geekbrit

      I think you probably have quite a few ARM cores around - iPods use them (I'm not sure about today's iPods, but early ones used three), your hard disk drives have them, and in fact most of the major components of a PC have ARM in them - network cards, video cards, SSDs, even USB flash sticks. Then there are printers, routers, car dashboards, GPS units, brake systems - ARMs are everywhere!

  15. mark1978

    The sad thing is that Acorn should be occupying the same space that Apple is in now. Instead they let it fall apart, and so Apple is the biggest company in the world, not Acorn.

    1. Steven Jones

      Acorn's demise in perspective...

      The reason the Acorn project fell to pieces was partly because so many journalists, politicians and other influential parties were vociferous in opposing the use of "non-standard" (that is, non-Wintel) architecture machines, especially in education. That, and the domination of business by MS Office and the need to exchange documents, essentially led Acorn into an ever-declining market. There were some great applications written for the Acorn RISC machines (like Sibelius), but it was inevitable that they could not be sustained on that architecture. It's simply impossible to maintain a thriving development community of applications in such a narrow market based largely in one country.

      Don't forget there were many other non-Wintel casualties in the US, and a whole raft of alternative processors. Apple only just survived as a manufacturer of an alternative architecture because of its dominance in some important niche areas, such as the "creative" sectors, along with somewhat grudging support from MS via a port of Office (grudging, because it was something of a sop to US competition authorities). Acorn were never able to do what Apple did with non-computing products like the iPod. With all its troubles, Apple was much better financed, with a much more supportive investment sector and a larger market.

      As it is, it was inevitable that Acorn would end up, as its name indicated, the seed for a number of small/medium enterprises specialising in niche areas. Competing with Wintel was always going to be near impossible. That ARM emerged from it is something of a miracle, but to keep things in perspective, the vast majority of the income from products using this architecture accrues outside the company. Essentially ARM does not compete just on the excellence of its low-power processor designs and associated ecosystems, but because it is very, very cheap. ARM is not Intel, which can command income per processor perhaps 100x the royalties ARM achieves.

      1. Anonymous Coward
        Anonymous Coward

        Re: Acorn's demise in perspective...

        MS's continued support of Office for Mac may well be given grudgingly. It's worth pointing out, however, that the two central constituents of Office (Word and Excel) were both released for the Mac in 1985 and weren't really what I'd consider ports. The first Word for Mac was actually the first graphical WYSIWYG version of that software, and the first Excel for Mac preceded the DOS/Windows version by two years (that not being released until 1987). That the Mac had Excel is often given as one of the contributing reasons (along with DTP applications) for its continued existence following a rather lukewarm couple of years after its launch.

      2. Giles Jones Gold badge

        Re: Acorn's demise in perspective...

        That's always been the problem. It doesn't matter how good the hardware is: if there's no software available that you wish to use, then it's no good.

        This is what affects Linux, for instance. It may be great at many things, but it's not much good for running the popular tools people want to use: Office, Photoshop, etc.

        Even the Amiga and ST, which did have many cool tools (the ST was popular for Cubase in music studios), gave way to the PeeCee. So what hope did the Arc have when even the big US alternatives dried up?

    2. Anonymous Coward
      Anonymous Coward

      I heard

      I had heard (at the time) that one of the big problems Acorn had was that Apple would sue them out of existence if they released RiscOS systems in the US. This basically killed their potential for worldwide sales.

      Does anyone have any links about this?

      1. ThomH

        Re: I heard (@AC)

        I'm pretty sure Acorn systems were sold in the US via Olivetti. I think the main reason that RISC OS never gained the staying power of Mac OS is that Apple did the graphical desktop four years before Acorn and so managed to grab niches in publishing and design that sustained them when Microsoft came along and did the GUI for everyone else. Acorn's educational niche wasn't sustainable because, as noted above, there's a lot of political meddling in education and it's easy to score points with 'business picked Microsoft, we should be training them on Windows'.

        I guess it's a shame but the triumph of ARM makes it difficult to be very upset.

        1. Anonymous Coward
          Anonymous Coward

          Re: I heard (@AC)

          As I recall, Acorn tried selling to education in a small geographical area in the States. Then Apple rushed in and donated large numbers of computers to those specific schools, which of course killed the Acorn initiative.

        2. Anonymous Coward
          Anonymous Coward

          Re: I heard (@AC)

          As I recall, Arthur was out and about in 1987.

    3. Chika
      Facepalm

      Not sure who you mean here by "they". Acorn was mercilessly picked apart by beancounters for want of a decent BOFH and a lift shaft! All that on the eve of the launch of the fabled "Phoebe", or RISC PC 2, which, from accounts of the lone prototype known to have existed beyond Acorn's end, was a pretty stunning machine by the standards of computers back then.

      It's yet another example of how short-sighted money men are asset-stripping our industry, our inventiveness, heck, even our culture! The only bright point is that, on occasion, we can still shock these idiots with what we can do - I suspect that the Raspberry Pi, for example, must be giving some of the big corporates something to think about.

    4. Giles Jones Gold badge

      We don't seem to dominate much, as we don't seem to produce products with a good reputation for build and engineering.

      Acorn machines were the Linux of their day: fewer games and less commercial software, more education and development.

  16. Anonymous Coward
    Anonymous Coward

    More, please! I'm really enjoying all these retro-themed articles, especially the Acorn ones.

    Acorn may live on in various other forms, but the day they went belly up was a seminal day in my computing life. I'd started off with an Electron, graduated to a Beeb and then taken a three-year time out in the form of an Atari ST, as the Archimedes was way too expensive for an 11-year-old schoolboy. I remember seeing it - finally - at a computer shop in York, and marvelling at it in wonderment.

    In 1992, I finally got my RISC OS machine, in the form of an A3010, purchased from Dixons. Four years after that, I got a RISC PC 600. It was utterly brilliant. Friends of mine were predominantly PC users, and my RISC PC was light years ahead.

    Then Acorn died. And I was sad. The workstations division closed down for purely financial reasons. I moved to a PC after that, something I was sure I'd never do, and moved to OS X six years back.

    Nothing will ever recapture those golden Acorn years for me. Back then, computing was fun. Now, programming for a living, all the joy has gone.

    I still have two Electrons, my A3010 and my RISC PC. One day I'll get the RISC PC out again and relive those golden days.

    1. Mark Cathcart
      Pint

      History of VM..

      There are a couple of really good history documents on the creation and evolution of IBM's virtualization technology and the development of the IBM mainframe 360 and follow-on 370 architecture. The former was written by Melinda Varian, formerly of Princeton University, the latter by Jeff Gribbin, formerly of Rolls-Royce in the UK. Varian's contains some great pictures and covers generally less well understood aspects of the evolution of virtualization.

      http://web.me.com/melinda.varian/Site/Melinda_Varians_Home_Page.html

    2. Dan 55 Silver badge

      RISC OS is still a going concern. There will supposedly be a version of RISC OS ported to the Raspberry Pi. I might try it, just because.

      Back then I was a Spectrum fanboi and everything else was crap, but looking back the 'right way' of designing those 80s systems was arguably Acorn's way.

      Can we have an 'old codger' icon, please Reg sir?
