Arm emits Cortex-A76 – its first 64-bit-only CPU core (in kernel mode)

Arm's latest top-end processor core design – the Cortex-A76 unveiled today – is its first CPU to only run kernel-level code that is 64-bit, according to its inventors. The processor will still support applications that use 32-bit A32 and T32 instruction sets, as well as 64-bit A64, meaning your apps and other programs will run …

  1. Wolfclaw

    First-generation Arm-powered Macs, perhaps? Apple will have a licence, and A64 is desirable to them as a way to push owners of A32 devices to upgrade, if they only start issuing A64 updates in the future.

    1. Anonymous Coward

      Apple is already 64 bit only on iOS

      Apple doesn't use ARM's core designs, and already went 64-bit only with the A11. iOS 11 is 64-bit only as well, and support for 32-bit apps was dropped: all app updates had been required to include a 64-bit build for the previous two years, so any apps that were dropped hadn't been updated in at least two years.

      If they go ARM on Macs, it will be with their own SoCs, not ARM's designs, which are inferior in performance.

      1. Anonymous Coward

        Re: Apple is already 64 bit only on iOS

        @DougS

        "If they go ARM on Macs, it will be with their own SoCs, not ARM's which are inferior in performance."

        ARM don't make or sell SoCs.

        So your statement about inferior performance is bogus too.

        1. Anonymous Coward

          Re: Apple is already 64 bit only on iOS

          Fine, the CPU cores in the SoCs they use won't be ARM standard cores; they'll be Apple-designed cores.

          1. DrBed

            Re: Apple is already 64 bit only on iOS

            >Fine, the CPU cores in the SoCs they use won't be ARM standard cores; they'll be Apple-designed cores.

            Fine, so that means they will be even less compatible with the rest of the world.

            Own history lessons not learned.

  2. JassMan

    Shame there is still a spectre in the background

    Seems that in spite of knowing about Spectre (CVE-2017-5753) and speculative store bypass (CVE-2018-3639), they haven't spent a little more time mitigating these vulnerabilities.

    1. Anonymous Coward

      Re: Shame there is still a spectre in the background

      The design of this chip was already underway before those were known about, and new varieties of these attacks and similar ones are still being found, so it is a moving target. There probably won't be any 'fixed' CPUs released this year; there isn't enough time to make hardware changes to an in-progress design.

      Apple's A12 is unlikely to include such fixes in hardware either, since it is already being mass-produced by TSMC, so there wouldn't have been enough time for Apple to fix theirs. Possibly Samsung's next Exynos will have fixes, since it won't be mass-produced until the end of the year, but as this is a moving target it is very unlikely to fix everything known at the time, let alone whatever might be discovered after it is shipping in phones.

      1. JassMan

        Re: Shame there is still a spectre in the background @DougS

        I have just read that they are about to release updates for the Arm Cortex-A72, A73 and A75 for some of the other known vulnerabilities, so maybe the A76 has already had those problems removed at the design stage. I fully appreciate that it is seriously hard work designing a processor like the Cortex series, and that it has a long lead time. As a one-time owner of an Archimedes, I have always been a fan of RISC. It still doesn't mean it is not a shame that a new processor is released with known vulnerabilities.

    2. Gene Cash Silver badge

      Re: Shame there is still a spectre in the background

      One of the benefits of streamlining a CPU like this is reducing the attack surface for things like Spectre to pop up.

      1. dmcq

        Re: Shame there is still a spectre in the background

        I can't see there being any simple solution to Spectre without changes to the protection model. One could run managed code in its own address space, but that's expensive. Perhaps better would be a data protection mask that could be set and associated with any data, so managed code could only access what it was entitled to, and the hardware could treat violations the same way it treats Meltdown. I can't see them liking anything like that in their page tables, though. And it wouldn't be a short-term fix.
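
        For illustration, here is a minimal C sketch of the Spectre v1 pattern being discussed, plus the branch-free index masking used as a software mitigation today (the same trick as Linux's array_index_mask_nospec helper); the table and function names are made up for the example:

        ```c
        #include <stddef.h>
        #include <stdint.h>

        #define TABLE_SIZE 256
        static uint8_t table[TABLE_SIZE];

        /* Spectre v1 shape: the bounds check can be bypassed speculatively, so a
           mis-trained branch predictor lets table[idx] be loaded with an
           out-of-range idx, leaking data via the cache. */
        uint8_t read_unsafe(size_t idx)
        {
            if (idx < TABLE_SIZE)
                return table[idx];
            return 0;
        }

        /* Branch-free mask: all ones when idx < size, zero otherwise, so even a
           mispredicted path cannot form an out-of-range address. */
        static size_t index_mask(size_t idx, size_t size)
        {
            return (size_t)(~(long)(idx | (size - idx - 1)) >> (sizeof(long) * 8 - 1));
        }

        uint8_t read_masked(size_t idx)
        {
            if (idx < TABLE_SIZE)
                return table[idx & index_mask(idx, TABLE_SIZE)];
            return 0;
        }
        ```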

  3. J27

    Interesting that Intel and AMD still haven't done this with their CPUs, despite 64-bit x86 processors having been around much longer than 64-bit ARM processors.

    1. Joe Montana

      Other processors

      Intel and AMD have a lot more legacy 32-bit (and even 16-bit) code hanging around; removing 32-bit support would cause a lot of headaches. ARM chips are usually used in embedded devices, which are typically designed together with the software they will run, so it's far less of a problem.

      There have already been pure 64-bit chips with no 32-bit mode, such as the Alpha...

      Interestingly, while ARM64 is quite new, ARM's primary competitor in the embedded space is MIPS, and MIPS64 has been around since the early '90s, yet MIPS failed to capitalise on that lead over ARM.

      1. Anonymous Coward

        Re: Other processors

        MIPS didn't have anyone designing 'higher end' embedded CPUs the way ARM had with XScale, and with ARM's own later efforts targeting a bit above traditional embedded devices, such as the ARM11 and Cortex-A8 cores. As a result, early smartphone platforms like Windows Mobile ended up having no option but ARM (at least later in its lifecycle). Which is kind of ironic, given that MIPS began as a CPU for high-performance RISC workstations and servers...

        Apple probably didn't have much choice when choosing a CPU for the first iPhone either, though their early association with ARM probably made that a more likely choice than MIPS even if MIPS had been comparable at the time. Even then, those first couple of iPhones were pretty sluggish, which is probably why Apple started acquiring CPU design firms back in 2008 to begin building a team capable of designing custom cores.

  4. Will Godfrey Silver badge
    Unhappy

    A bit sad

    Those conditional instructions are absolutely beautiful if you're into tightly packed assembler. I'll certainly miss them (although I admit I don't do much assembly level work these days).

    To some degree they also mitigate not having speculation, as they are incredibly fast (one clock tick).
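
    As a rough illustration of what conditional execution buys you, here is a tiny C function that an A32 compiler will typically turn into a short branch-free sequence of conditionally executed instructions (the exact output depends on the compiler and flags, so treat the assembly in the comment as indicative only):

    ```c
    #include <stdint.h>

    /* Absolute difference without a branch. On A32 this typically compiles to
       something like:
           CMP   r0, r1
           SUBHS r0, r0, r1   ; takes effect only when a >= b
           RSBLO r0, r0, r1   ; takes effect only when a <  b
       Every instruction issues, but the condition flags decide whether it has
       any effect, so there is no jump for the pipeline to mispredict. */
    uint32_t abs_diff(uint32_t a, uint32_t b)
    {
        return (a >= b) ? a - b : b - a;
    }
    ```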

    1. Herring`

      Re: A bit sad

      IIRC, the original instruction set was the work of Sophie Wilson - British computing legend.

      1. Will Godfrey Silver badge
        Thumb Up

        Re: A bit sad

        @Herring

        Indeed so. She was a master of the 6502 instruction set (managing to cram the whole of BBC BASIC into a 16K EPROM) and used that as a starting point for the development of the ARM1.

      2. druck Silver badge
        Unhappy

        Re: A bit sad

        The original 26-bit-mode ARM ISA, with its combined PSR and PC, conditionals and multiple-register stacking, was a thing of beauty; it actually made you want to write in assembler. The evolution and expansion of A32 has degraded the ease of handwritten use, and I'm afraid A64 is very much leave-it-to-the-compiler territory. Sad indeed.

    2. MacroRodent
      Boffin

      Re: A bit sad

      The idea of the conditional instructions is to avoid jumps, which disrupted pipelines in the older, simpler processors without instruction reordering or speculation. Other RISC designs at the time used delay slots for the same reason: the instruction following a jump is always executed (compilers are supposed to find something useful to put there; if not, they insert a NOP). But delay slots don't help much once the pipeline gets deeper.

  5. Will Godfrey Silver badge
    Happy

    Following a train of thought

    I wonder what results you would get if you re-fabbed the ARM2 instruction set at modern feature sizes. It should be pretty fast, and ideal for highly deterministic designs. You would always know exactly how long any code sequence would take, and it would be somewhat more secure than some others!

    P.S. apologies for spamming the thread!

  6. Roland6 Silver badge

    32 v 64-bit

    "There is so much software out there that still runs in 32-bit mode on Arm-compatible chips"

    There is much software out there, and yet to be written for IoT, that doesn't need anything more than 32-bit. Also, given what was achieved with 16-bit architectures, 32-bit is overkill for some. It would seem the time has come to separate the architectures and have 32-bit-only and 64-bit-only chip families.

    The only question is whether you re-architect the 32-bit instruction set to get rid of the nice-for-the-programmer but hard-for-the-chip-designer instructions, or maintain legacy compatibility.

    1. druck Silver badge

      Re: 32 v 64-bit

      A32 will probably carry on as a microcoded emulation on top of a core solely optimised for A64 instructions.

  7. onefang

    "These features are super useful for those crafting highly tuned assembly code, or writing compilers – and the thing is, not many people are left doing that these days."

    While I've only ever written one compiler, I just love writing highly tuned assembly code.

  8. onefang

    "There is much software out there and yet to be written for IoT that doesn't need anything more than 32-bit, also given what was achieved with 16-bit architectures 32-bit for some is an overkill, it would seem the time has come to separate the architectures and have 32-bit only and 64-bit only chip families."

    8-bit CPUs work fine for much of the stuff IoT tries to do. How many bits do you really need to switch a light bulb on and off? Or even for dimming the light bulb?

  9. DenTheMan

    A non Meltdown chip?

    I would assume it beats the A75, which went Intel-like in adding Meltdown-prone speculation for speed.

    And benchmarks are bogus in that Meltdown mitigation changes it all.

  10. john.jones.name

    CoreMark anyone, anyone?

    How about actually getting benchmarks such as SPECfp or CoreMark?

    SPECpower or ULPMark?

    1. -v(o.o)v-

      Re: CoreMark anyone, anyone ?

      How do you propose to do that when the things are not available yet?

      The focus of the article was very clear: the technical architecture. And it was excellent.

  11. -v(o.o)v-

    Excellent article and there should be more of these

    I might be remembering wrong, but I recall El Reg having a lot more of these really in-depth technical articles in the past, going way beyond the usual rehashing of PR material.

    More of this please!!
