Intel tock blocked for good: Tick-tock now an oom-pah-pah waltz

For more than a decade, a staple of Intel's defense of Moore's Law has been a development methodology known as "Tick-Tock," but it's had its day. The news emerged in an almost-unnoticed 10-K filing from earlier this month, submitted by Chipzilla to US financial watchdog the SEC. Here's how the tick-tock works: Intel first …

  1. the spectacularly refined chap

    Beginning of the end for Intel?

    For perhaps the last ten years or so Intel have been at least a generation ahead of everyone else when it comes to chip fabrication - it's no coincidence that this is when AMD started falling by the wayside in terms of absolute performance. Look at the trend with each feature-size shrink: at each generation costs go up and up and the number of manufacturers goes down and down, until Intel are out in front by themselves.

    They've needed to be. x86 is ultimately a 40 year old ISA and has few redeeming features. Intel have only been able to stay competitive by throwing money at the problem and engineering themselves out of a design corner created by that very ISA. If we're getting to the point where not even Intel can shrink any more, and the other manufacturers have a chance to catch up, what is the future for x86 and Intel?

    The world has moved on over that same ten years and people are no longer so wedded to Wintel, thanks to the likes of Android and the iPhone; they can see that upping sticks to something else is not a complete impossibility. Back when the likes of MIPS and Alpha were a country mile ahead of Intel - despite much smaller R&D budgets, simply because of smarter design - people couldn't see an alternative to Wintel for the mass market at least. If the competitors catch up on fabrication technology, and again bring superior designs at the conceptual level, they could take on or even overtake Intel once more, this time with a better chance of supplanting them in the market.

    If that pans out, personally I'd be glad. IT's been pretty boring post-millennium, with ARM for mobile and x86 for power. If we get back to the point where there are half a dozen or more different platforms of note, the industry becomes a much more interesting place.

    1. PleebSmasher
      Dead Vulture

      Re: Beginning of the end for Intel?

      Let's see how AMD's 14nm chips do. If Zen lives up to the hype, x86 is not dead just yet.

      Intel will probably hang around. They make stuff like SSDs, and can probably continue to make CPUs steadily smaller for another 10 years. TSMC/ARM may be sporting an aggressive timeline for 10nm and 7nm, but those nodes can slip and might be dependent on EUV. 5nm and smaller is still possible, although it may require a complete rethink of the materials used and of how to deal with quantum tunneling. A three-year cycle officially buys more time to think about post-CMOS technologies. Eventually some technology will be stackable, and more decades of "Moore's law" performance gains will be sought.

      As for the alternate designs like MIPS, they have a lot of work to do to displace x86 or ARM. More likely we will see completely novel designs for neuromorphic computing emerging out of thin air.

      1. Alan Brown Silver badge

        Re: Beginning of the end for Intel?

        "(Intel)...can probably continue to make CPUs steadily smaller for another 10 years."

        7nm is getting pretty close to as small as it can get. Quantum tunnelling is already a problem but the greater issue is that at this scale the features are only a few atoms (not tens of atoms) wide.

      2. Dave 126 Silver badge

        Re: Beginning of the end for Intel?

        Only some types of game benefit from anything more than an i5 CPU - it's the GPU that makes the difference. I don't know how much that might change with Vulkan, DirectX 12 and AMD's APUs.

        So yeah, Intel CPUs get more power efficient but don't get much faster - though they do boast less-useless integrated graphics.

    2. goldcd

      Conversely though

      I used to look forward to the benefits of "a new CPU"

      Currently have an i7 2600K in my desktop which is (consulting my receipts folder) a few months shy of 5 years old - and I can see naff-all reason to upgrade. Didn't really need the power when I bought it and still don't now, the odd spot of video encoding aside.

      If it were a laptop, then it'd be worth it for the reduction in power consumption - but...

      Bluntly the greatest requirement for CPU anybody really needs is for gaming, and it's the GPU that's the bottleneck there. The next step is surely to get beyond basic 'integrated' graphics and bang out a populist chip that will give you the 1080p performance of a current-gen console.

      AMD are pretty well placed here, with both a new architecture coming in and all the current-gen console chipsets being theirs.

      If you don't game, I'm bemused as to why you'd need to upgrade anything at all. iPhone/iPad have proven that if you can get the tech into people's hands, they do tend to use it.

      1. Anonymous Coward
        Anonymous Coward

        How can Intel be dead?

        As long as Windows only (for all practical purposes) runs on x86, and Windows remains dominant on the desktop and important on servers, Intel will be fine.

        The PC market may be shrinking, but the server market is still slowly growing, and since server CPUs have a better margin than PC CPUs, they're just fine. And no, ARM CPUs aren't going to have any measurable impact on Intel for the foreseeable future. They may be cheaper, but they're also much slower. Even if that changes, it would be 5-10 years before they could steal a significant share of the server market.

        1. Flocke Kroes Silver badge

          As long as Windows dominant on the desktop ...

          The desktop is not dominant now, and non-gamers are noticing that it is more than they need. Smart phones outnumber desktops. A modern smart phone is more than powerful enough for a word processor and the majority of business software. When connecting a phone to a monitor and keyboard becomes convenient, another chunk will vanish from the desktop market.

          Microsoft saw this coming. They spent loads on a smart phone OS, a billion advertising it, over a billion propping up Nokia and more buying out the burned out shell. All because Microsoft knew that being dominant on the desktop will be the equivalent of being the dominant supplier of abacuses next decade. After their abysmal phone failure (35% market share -> 2%), Microsoft have switched to porting their products to Linux.

          Next target: servers. Google and Amazon account for a hefty chunk of the server market. They are designing their own silicon. I have seen plenty of rumours of an Amazon ARM or a Google ARM or Power. For now, I think these are big sticks to shake at Intel. Such sticks can be really effective: Itanium killed Alpha well before there was a functional Itanium CPU. The ARM architecture may already have changed the server market in ways we do not see, because we do not know the details of negotiations between Intel and Google/Amazon/Facebook.

          1. Medixstiff

            Re: As long as Windows dominant on the desktop ...

            "A modern smart phone is more than powerful enough for a word processor and the majority of business software. When connecting a phone to a monitor and keyboard become convenient, another chunk will vanish from the desktop market."

            I laugh at the smart phone manufacturers: they trumpeted to everyone that the desktop was dead, but they didn't learn from history. Instead you now have smart phones with 4GB of RAM and quad-core processors, which people are finding are more than enough for all their tasks - and that the new model with the slightly better specs and camera really isn't worth spending another $700 AU on, so they'll wait until the existing one dies or their two-year contract is up.

            That's the reason desktops stagnated: with multiple cores and 8GB of RAM, the software bloat wars that had run from the '80s to the early noughties were finally won.

          2. Ellis Birt 1

            Re: As long as Windows dominant on the desktop ...

            Recent developments from Redmond are continuing to move away from Wintel.

            With ASP.NET 5 Core and now SQL Server potentially being equally happy on Linux, and Redmond even having their own distro, Microsoft may let Windows fade slowly.

            Even on the desktop, they have abandoned the concept of selling OS upgrades. And let's not forget Win-RT. It may not have taken off, but the code-base still exists. As more application software becomes architecture-agnostic, ARM or another technology may well end up reigning supreme - even in Redmond!

          3. Anonymous Coward
            Anonymous Coward

            Re: Redmond - the dominant supplier of abacuses next decade

            Thanks

            Now I understand the strategy behind the Win 8 and Win 10 user interface

            Redmond Tile Tapping = Abacus Bead Sliding.

            Typical Redmond interpretation of User Requirements:

            Round Bead into Square Tile

          4. P. Lee

            Re: As long as Windows dominant on the desktop ...

            >smart phone is more than powerful enough for a word processor and the majority of business software.

            I wish. However, if this were true, Apple would have practically phased out x86 MacBooks in favour of ARM ones and Windows RT would have been a success.

        2. Ian 55

          Re: How can Intel be dead?

          "As long as Windows only (for all practical purposes) runs on x86, and Windows remains dominant on the desktop and important on servers, Intel will be fine."

          Yep.

          The converse is probably true too: when Windows loses its dominance, anyone who has been investing in alternatives is probably going to leave Intel in the dust.

      2. A Non e-mouse Silver badge

        @Golcd Re: Conversely though

        I think we need to ask: Exactly what have we been getting in the last few iterations of x86 CPUs?

        At first, it was easy to see what you got from a new CPU: a faster clock, more bits (16, 32, 64) and maybe some new capabilities (protected memory, virtualisation, floating-point maths in hardware). This was easy for most people to understand. Then they hit the thermal wall at around 4GHz, and had to start being smarter about the architecture.

        Now, the focus is more on power efficiency (which is good), but there's less actual speed boost. Sure, more cache and more cores help to a limited degree, but do ten cores versus four really help the average person who uses Word, Excel or Internet Explorer?
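
        For a rough sense of why, here's a back-of-the-envelope Amdahl's Law sketch in C - the 30 per cent parallel fraction is an assumption for illustration, not a measured figure:

            #include <stdio.h>

            /* Amdahl's Law: speedup = 1 / ((1 - p) + p / n), where p is the
             * fraction of the work that runs in parallel and n is the core count. */
            static double amdahl(double p, int n) {
                return 1.0 / ((1.0 - p) + p / n);
            }

            int main(void) {
                double p = 0.30;  /* assumed: 30% of a typical desktop workload parallelises */
                printf("4 cores:  %.2fx\n", amdahl(p, 4));   /* ~1.29x */
                printf("10 cores: %.2fx\n", amdahl(p, 10));  /* ~1.37x */
                return 0;
            }

        Unless the workload is overwhelmingly parallel, the jump from four to ten cores barely registers.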

        CPUs are getting lots more baggage around them: integrated I/O, integrated graphics, more niche instructions (anyone remember the VAX CPU?), systems management, etc. But the core of a CPU is the ALU, and none of these features are going to help improve the speed of the ALU. All we're getting is (roughly) the same throughput for less energy.

        Intel are being squeezed. At one end, ARM is doing (very) low power for good enough performance. At the high end, GPUs are doing the heavy parallel number crunching. What's left for Intel? They're now looking to integrate FPGAs onto the CPU die. How many people will need that?

        1. bazza Silver badge

          Re: @Golcd Conversely though

          @a non e-mouse,

          "Intel are being squeezed. At one end, ARM is doing (very) low power for good enough performance. At the high end, GPUs are doing the heavy parallel number crunching. What's left for Intel? They're now looking to integrate FPGAs onto the CPU die. How many people will need that?"

          You've forgotten Xeon Phi. This is a true monster of a chip. The next one is a true CPU, not a coprocessor. And it's got about the same grunt as Nvidia's best GPU.

          Being a CPU it has a lot of advantages over Nvidia's GPU - you don't have to load data over a PCIe bus. That means its peak performance is easier to sustain, so in real-world applications it will probably be faster than a GPU.

          And being just a bunch of x86 cores (well, more importantly, SSE cores - the x86 bit is largely irrelevant), it is a little easier to exploit.

          It's not quite as elegant as the great, late and lamented Cell processor, but it'll do.
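
          To illustrate the "easier to exploit" bit, here's a minimal sketch of the sort of code that maps onto a self-hosted many-core x86 part - plain C with OpenMP, nothing Phi-specific, and assuming an OpenMP-capable compiler:

              #include <stdio.h>
              #include <omp.h>

              #define N 1000000

              /* An ordinary loop: OpenMP spreads it across every core and the
               * compiler vectorises it into SSE/AVX lanes - no PCIe copies and
               * no separate device kernel, unlike the GPU offload model. */
              int main(void) {
                  static float a[N], b[N], c[N];
                  for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

                  #pragma omp parallel for simd
                  for (int i = 0; i < N; i++)
                      c[i] = a[i] * b[i] + a[i];

                  printf("c[42] = %f, max threads = %d\n", c[42], omp_get_max_threads());
                  return 0;
              }

          Contrast that with the GPU route, where the same loop needs a separate kernel plus explicit copies across PCIe before and after.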

          "Microsoft saw this coming. They spent loads on a smart phone OS, a billion advertising it, over a billion propping up Nokia and more buying out the burned out shell. All because Microsoft knew that being dominant on the desktop will be the equivalent of being the dominant supplier of abacuses next decade."

          I don't think MS have seen the future at all. They nearly did, and seven years ago had an opportunity to start defining an ARM server architecture. They did mobile instead, and now Qualcomm, AMD, etc. are leading the way in ARM servers with Windows Server utterly unprepared to exploit them. No wonder they're porting stuff to Linux, because that's faster than getting Windows Server running on ARM and getting all their server applications moved too.

        2. JeffyPoooh
          Pint

          Re: @Golcd Conversely though

          "...integrate FPGAs onto the CPU die. How many people will need that?"

          Glue logic gates might be useful for somebody trying to integrate the inevitable Intel PC-on-a-chip die into the micro-HDMI connector.

      3. Paul Kinsler

        Re: Bluntly the greatest requirement for CPU anybody really needs is for gaming,

        So, my 3D electromagnetic simulations /aren't/ maxing out the CPU on my workstation after all? And I don't have to wait for them to finish?

        perhaps: s/anybody/most users/

    3. Ken Hagan Gold badge

      Re: Beginning of the end for Intel?

      "x86 is ultimately a 40 year old ISA and has few redeeming features."

      Intel's original floating point model hasn't seen light of day for about a decade, having been superseded by SSE2. Both integer and floating point arithmetic models have been evolving since the mid-90s with MMX and various other TLAs. A modern desktop chip also devotes more than half its area to an integrated streaming processor that owes nothing to x86. Lastly, since the early 90s, those x86 instructions have been translated and re-ordered on-the-fly into whatever was convenient inside the chip. A modern x86 ALU has dozens of registers and the L1 cache is only a couple of clocks away.

      To be fair, everyone else's chips are the same. x86 lost the ISA wars against the RISC chips, but Intel responded with the ISA-less Pentium Pro and ISA hasn't mattered since then.

      1. bazza Silver badge

        Re: Beginning of the end for Intel?

        Hello Ken Hagan,

        "Intel's original floating point model hasn't seen light of day for about a decade, having been superseded by SSE2. Both integer and floating point arithmetic models have been evolving since the mid-90s with MMX and various other TLAs. A modern desktop chip also devotes more than half its area to an integrated streaming processor that owes nothing to x86."

        It's OK now, but the evolution of SSE has been a bit rubbish. It took them absolutely ages to include some fairly fundamental instructions like a fused multiply-add.

        MMX / SSE was for a long time an ever changing thing and was consequently very hard to develop for. About the only way to use it was to use Intel's IPP/MKL libraries, where Intel had put in the effort to account for the different versions of SSE that your application would encounter in the field. And this costs money. To not use it meant taking on the huge job of writing versions of your software for SSE2, SSE3, SSE4, SSE4.2, etc. Unsurprisingly, very few did.
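
        For anyone who hasn't had the pleasure, here's a minimal sketch of the kind of runtime dispatch you end up maintaining yourself if you don't buy IPP/MKL - the scale_* routines are hypothetical stand-ins, though __builtin_cpu_supports() is a real GCC/Clang builtin:

            #include <stdio.h>

            /* One routine per SSE generation - in real code each would be written
             * with that generation's intrinsics and tested separately. These are
             * stubs, purely to show the shape of the dispatch. */
            static void scale_scalar(float *buf, int n) { for (int i = 0; i < n; i++) buf[i] *= 2.0f; }
            static void scale_sse2(float *buf, int n)   { scale_scalar(buf, n); /* pretend: SSE2 body */ }
            static void scale_sse42(float *buf, int n)  { scale_scalar(buf, n); /* pretend: SSE4.2 body */ }

            int main(void) {
                __builtin_cpu_init();  /* GCC/Clang CPU feature detection */
                void (*scale)(float *, int) = scale_scalar;
                if (__builtin_cpu_supports("sse2"))   scale = scale_sse2;
                if (__builtin_cpu_supports("sse4.2")) scale = scale_sse42;

                float buf[4] = {1, 2, 3, 4};
                scale(buf, 4);
                printf("buf[0] = %f\n", buf[0]);  /* 2.0 whichever path was picked */
                return 0;
            }

        Multiply that by every hot routine and every new SSE revision, and the appeal of paying Intel for IPP/MKL becomes obvious.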

        In comparison, Altivec (the equivalent to SSE on PowerPC and POWER processors) was right first time. Motorola put the right instructions into it and didn't keep changing it. So people actually wrote software to use it. For example, in the overlap between PowerPC and Intel Macs, Photoshop was far quicker on PowerPC because Adobe had actually exploited Altivec pretty well.

        Itanium

        Itanium was slightly popular in the high-performance computing world because it always had a fused multiply-add in it. I saw the addition of FMA to x64's SSE as the signal that Intel had truly given up on Itanium; there was absolutely nothing left to recommend Itanic over x64.
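
        For reference, a fused multiply-add computes a*b + c as a single, singly-rounded operation. A minimal x86 intrinsics sketch (assuming an FMA3-capable part, compiled with -mfma):

            #include <immintrin.h>
            #include <stdio.h>

            int main(void) {
                __m128 a = _mm_set1_ps(3.0f);
                __m128 b = _mm_set1_ps(4.0f);
                __m128 c = _mm_set1_ps(5.0f);

                __m128 fused = _mm_fmadd_ps(a, b, c);            /* one FMA instruction */
                __m128 split = _mm_add_ps(_mm_mul_ps(a, b), c);  /* two instructions, two roundings */

                float f[4], s[4];
                _mm_storeu_ps(f, fused);
                _mm_storeu_ps(s, split);
                printf("fused: %f, split: %f\n", f[0], s[0]);    /* both 17.0 for these values */
                return 0;
            }

        That one instruction is what Itanium offered from day one; x86 only caught up years later.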

        "To be fair, everyone else's chips are the same. x86 lost the ISA wars against the RISC chips, but Intel responded with the ISA-less Pentium Pro and ISA hasn't mattered since then."

        Almost, but importantly, not quite everyone. ARMs are ARMs, there's no microcode (at least not in the same sense as x86's). You get 48,000 transistors running the ARM op codes, and there's no real instruction translation.

        It's important because of the transistor count - only 48,000. An equivalent x86 core needs several million to get the same performance (translation, pipelines, etc. etc), so it's not surprising that ARM wins on power consumption.

        1. Anonymous Coward
          Anonymous Coward

          Re: Beginning of the end for Intel? (Motorola, DSP)

          Interesting reading, in particular about SSE's inapplicability to real world stuff til V11 or whatever.

          Motorola of course have the benefit of years of real DSP experience, as well as the 680x0 and the less successful (!) 88000 families, so one might hope that Altivec got it right fairly quickly.

          Intel are of course the x86 company.

        2. Roo
          Windows

          Re: Beginning of the end for Intel?

          "It's important because of the transistor count - only 48,000. An equivalent x86 core needs several million to get the same performance (translation, pipelines, etc. etc), so it's not surprising that ARM wins on power consumption."

          More importantly the ARM cores are easier to test & the designs are easier to validate. Compare and contrast the errata for a Xeon vs pretty much any RISC core and tell me which one you would prefer to bet your life on... ;)

    4. Alan Brown Silver badge

      Re: Beginning of the end for Intel?

      " x86 is ultimately a 40 year old ISA and has few redeeming features"

      Intel tried to kill it (Itanium). AMD resurrected the zombie (x86_64) and the market forced Intel to keep providing it.

      1. DropBear

        Re: Beginning of the end for Intel?

        "AMD resurrected the zombie (x86_64) and the market forced Intel to keep providing it."

        Hmmm... okay go ahead laugh, but... what if they'd switch to some more convenient architecture and let their new built-in FPGA emulate a legacy x86 when it can't be avoided and do something more useful when legacy software is not running...?

        1. Roo
          Windows

          Re: Beginning of the end for Intel?

          "Hmmm... okay go ahead laugh, but... what if they'd switch to some more convenient architecture and let their new built-in FPGA emulate a legacy x86 when it can't be avoided and do something more useful when legacy software is not running...?"

          CPU cores really don't take up that much space (compared to, say, cache) these days; the chances are it would be faster, easier and more efficient to simply embed a full-blown x86 core instead. FPGAs with CPUs embedded in the corner have been around a long time now.

          Also, in terms of the dual-ISA approach, Intel have already had a crack at that with Itanium; it didn't seem to be a winner if you judge by sales...

    5. peter@oxscisoft.com

      Re: Beginning of the end for Intel?

      "x86 ... has few redeeming features"

      Dense ISA, register renaming, outstanding cache performance - see

      http://lkml.iu.edu/hypermail/linux/kernel/0302.2/1909.html

      by someone called Linus Torvalds

  2. Tom 64

    How did I know...

    ... that this article would include the word 'cadence' in it?

    Someone amongst the Intel marketing droids can't help themselves.

  3. Ragequit
    Pint

    Stagnation...

    Let's face it, everyone is in trouble, Intel or not, unless 'the next big thing' is realized. Some are hoping VR or AR is just that, but I don't think it'll happen overnight. That tech, especially the former, has been the dream of many a sci-fi/wonk/nerd for 30+ years now. I'd be very surprised if anyone is satisfied with the first product out the gate.

    Intel seems to be banking on hybrid CPU/FPGA tech for the hyperscale market, and perhaps they're hoping it will filter down into the mass market in VR products? I speculate machine learning would be helpful with some of the problems faced by emerging tech - e.g. adapting character animation to more closely match the normal movements of the user. That's important for the immersion factor. Not everyone has the same cadence (ahem) and people have quirks all their own. Could you do this with a GPU? Sure, but it's just one example, and efficiency could become important in mobile versions.

    AMD is of course banking on their GPU and APU to tackle some of the same problems. The problem is that they're facing more than one front here: Intel and Nvidia. AMD has, until this point, been slow to bring their ARM-based products to market, but it seems the OS/toolchains have almost caught up. Except that helps Nvidia as much as it does AMD. If 64-bit ARM takes hold, Nvidia stands to be a major player as well. Still early days. Zen will hopefully be a way for AMD to get back to profitability in the near term, but going forward they need to be adaptable and offer solutions others can't or won't provide. Intel is very protective of their distinct product lines and does their utmost to keep one from consuming the other. If AMD can provide competitive solutions that give customers options, rather than forcing them to color within Intel's lines, I think they have a fair shot at surviving.

    I think it's true Intel is going to have some trouble maintaining the status quo going forward. MS is at a turning point with their OS, which leaves x86 dominance in question, and Intel is trying to diversify a bit before the storm hits - i.e. new memory/storage tech, FPGAs, and even open-source software stacks that favor their products. Well, the latter isn't really anything new.

    At any rate things might actually start to get interesting for once. More competition and some cool tech from all camps. At least one can hope.

    1. allthecoolshortnamesweretaken

      Re: Stagnation...

      Mass market VR gear? That's as likely now as it was 20 years ago, in other words: not going to happen. If Intel bets the farm on that, they are dead. (But they won't, they are smart. They'll be keeping some options open, though.)

      What I'd like to see from Intel is a CPU that is not just another x86 again. It could still be bog-standard Wintel on the outside for the average user/mass market, but running in a VM on something new inside. (Hey, a man can dream, can't he?)

      As to oom-pah-pah - try some proper Humppa instead! Seriously, if you have a chance to watch these guys live, do it!

      1. Ragequit
        Devil

        Re: Stagnation...

        You did read my post, right? That I didn't think it would happen overnight? That it'll be some time before people will be satisfied? There's 30 years of unreasonable wish fulfillment to work through. Still, I think there will be some market for it. It might even be reasonably popular for a year or two before the hype machine stalls. The difference now is that we have more compute than we know what to do with. Granted, we need more than just compute, and that's where we're going to get hung up on reality tech. But that was one of the major hurdles back in the day. That and high-res displays. Back then it was too expensive to develop something better than Minecraft sans textures. Now we can at least get our foot in the door, tech-wise.

        People are going to have a wake-up call on what's passed for 3D graphics up till now. When you give people the chance to really look at things closely in an environment, texture, lighting and shadow quality are going to become obvious. Time to stick ray tracing in an FPGA?

      2. Dave 126 Silver badge

        Re: Stagnation...

        >Mass market VR gear? That's as likely now as it was 20 years ago, in other words: not going to happen.

        It's not just VR - the related technologies (Intel's RealSense, Google's Project Tango, MS's Kinect, plus Qualcomm and numerous bits of video-stitching cleverness) have applications in machine automation (most disruptively, potentially, self-driving cars), and in changing how many of us interact with computers*.

        On the VR front, things are very different to 20 years ago when the only consumer headset I remember was from Forte Technologies. Here's why:

        - low cost of entry for consumers to dip a toe in the virtual waters (just pop your 5" smartphone into a $5 lens contraption)

        - more sophisticated, dedicated headsets inexpensive since they share components (screen, gyros, accelerometers) with smartphones

        - video content cheaper to make, due to cheaper multiple digital cameras and fancy post-processing.

        - straightforward support from game-creation engines

        - support from widely recognised consumer brands, such as Sony, Samsung, Steam, Intel, Qualcomm

        I'm not saying that mass market VR is a sure thing, but the landscape is very different to the mid 90s.

        * I'm not a coder, and I'm not talking about flying through a file system a la Neuromancer or Jurassic Park. I'm a product designer and I want a magic workshop.

        1. Dave 126 Silver badge

          Re: Stagnation...

          Okay, a Magic Workshop.

          The easy shorthand would be 'Tony Stark's workshop from the Iron Man movies'. More immediately, my 'magic workshop' might consist of a Kinect sensor and a projector. The idea is to better integrate the designing (paper, CAD) and the physical making (a mixture of hand and machine techniques) for greater efficiency, whilst aiding safety.

          - physical measurements I make are instantly available in CAD, or used to drive dimensions I've already defined.

          - a projector can help me mark out cuts

          - cuts off power tools if my fleshy bits get too close

          - a self-correcting 3D printer. Parts are printed within a 3D scanner, so the printer can correct for alignment in real time.

          - jigsaws with steering assistance

          - mixing catalysts can be done with audio cues, so no measurement vessel is required.

          I'm just brainstorming, purely to provide examples of interacting with computers without a mouse and keyboard.

          An example of a 'smart desk' might be HP's Sprout desktop, which incorporates a 3D scanner, camera and projector, for rapid back and forth between physical media (pens and paints) and virtual (Photoshop, CAD etc).

          1. DropBear

            Re: Stagnation...

            - jigsaws with steering assistance

            Let it not be said that all I do is spew snarky quips and bring people down... behold the steering-assisted hand-held router.

            1. Dave 126 Silver badge

              Re: Stagnation...

              >Let it not be said that all I do is spew snarky quips and bring people down... behold the steering-assisted hand-held router.

              Haha! Thanks for that DropBear! Genuinely, I had not seen that before! It just goes to show that the difference between me and an MIT student is that I brain-fart an idea onto the Register, and an MIT student actually builds something potentially useful!

              : )

        2. DropBear

          Re: Stagnation...

          - low cost of entry for consumers to dip a toe in the virtual waters (just pop your 5" smartphone into a $5 lens contraption)

          ...where the applications are pretty much limited to watching surround panoramas - I foresee it becoming about as popular as the same kind of long-existing 360-pano-viewers embedded on some websites ever were (some of them are even still around!). Remember when the Quicktime pano plugin was all the rage...? Yeah, me neither.

          - more sophisticated, dedicated headsets inexpensive since they share components (screen, gyros, accelerometers) with smartphones

          That would be nice - as it is, they cost an arm, two legs, half a dozen kidneys and some change. It could be argued their actual BOM cost is a tiny fraction of that (much like with smartphones), so their price will surely come way down (much as it happened in the smartphone market with Apple and Samsung). ...oh wait. Hmmm. Maybe the Chinese will save the day...?

          - video content cheaper to make, due to cheaper multiple digital cameras and fancy post-processing.

          Have you actually seen what one of those "360" multi-cameras aimed at enthusiasts (not even pros) costs? Clue: almost a grand. Yup, aunt Jane will surely start shooting her holiday pics and videos with one of those any day now...

          - straightforward support from game-creation engines

          Is that why every single game studio that promised VR support in their Kickstarter a few years ago is now wishing they hadn't, going on about how even though the engine they use allegedly supports VR, actually making the game interface work half-decently in VR is much harder than they expected? Because that's all I hear whenever any of them mentions VR. It's always "yeah it's great _but_..."

          - support from widely recognised consumer brands, such as Sony, Samsung, Steam, Intel, Qualcomm

          ...the best thing about standards being of course (as everybody knows) that there's so many of them. And a new one is born every time someone decides to jump on the bandwagon and try their hand at VR. Yup, I'm sure those supporting folks are working together in such a sublime harmony pushing all in a single direction, not at all the way ants "collaborate" by pushing the same thing from all sides simultaneously...

          I'm not saying that mass market VR is a sure thing, but the landscape is very different to the mid 90s.

          Is it really? Because I'm so not seeing it...

          1. Dave 126 Silver badge

            Re: Stagnation...

            >...where the applications are pretty much limited to watching surround panoramas

            Some are professionally produced. And in any case, the same hardware can be used to watch traditional video content on a virtual big screen. I seem to recall the Sony Walkman also started life providing in-flight entertainment to a rich and well-connected person (the then CEO of Sony).

            >That would be nice - as it is, they cost an arm, two legs, half a dozen kidneys and some change.

            Some do, some don't. The higher end models are pricey due in part to the GPU requirements - but GPUs will always be sold anyway; Sony's planning a PlayStation 4.5 for driving 4K televisions and the same grunt could power a VR headset. And hey, Nintendo made a killing with its Wii console that wasn't trying to compete with the graphical prowess of its rivals.

            Not sure why you cite the £500+ flagship models of Apple and Samsung... every midrange phone today is a match for last year's flagship. All you've shown is that there are a fair few people out there who will spend £500+ on a gadget when a far cheaper gadget will do much the same job.

            >Have you actually seen what one of those "360" multi-cameras aimed at enthusiasts (not even pros) costs? Clue: almost a grand. Yup, aunt Jane will surely start shooting her holiday pics and videos with one of those any day now...

            I wasn't just talking about a 360 rig, but the sort of post-processing that lets people watch a sports match from any angle (very fancy post-processing). The very technology that Intel acquired last week. The sports viewing market is huge, and will happily punt on a few new technologies.

            I don't really want to watch Aunt Jane's videos, but Aunt Martha in Australia might want to virtually relive a family reunion. Really though, the £1000 price range was what half decent camcorders were a few years back, and they only seemed to be used for family parties and the like.

            >Is that why every single game studio that promised VR support in their Kickstarter a few years ago is now wishing they hadn't, going on about how even though the engine they use allegedly supports VR, actually making the game interface work half-decently in VR is much harder than they expected?

            This old chicken-and-egg scenario has been played countless times in IT. Why bother writing software for a platform with no users? Why bother buying a platform with no software? It always resolves itself one way or another.

            >Yup, I'm sure those supporting folks are working together in such a sublime harmony pushing all in a single direction, not at all the way ants "collaborate" by pushing the same thing from all sides simultaneously...

            Even their rival efforts serve to raise public awareness of VR.

            >Because I'm so not seeing it...

            You're trying very hard not to see it. But have an upvote for taking the time to expand upon your views. I do not agree with you sir, but will defend to virtual death etc etc

            1. Ian 55

              'The sports viewing market is huge'

              It is - and the tech to do multi-angle views in real time is impressive - but there are not many TV stations with significant sports rights, and they're the only realistic market.

    2. Anonymous Coward
      Anonymous Coward

      Re: Let's face it everyone is in trouble

      Except those that are tapping into the "spooks and military robotics" market.

      <TIN FOIL HAT ON>

      It can be argued that the "spooks and military" really helped to push Wobbly Windoze into desktop dominance for easy trawling.

      The arrival of GPS Mobiles meant the "spooks and military" could throw Redmond under a bus.

      Now they are pushing for Android dominance so they can track your every thought and deed.

      <TIN FOIL HAT OFF>

      Please welcome our new Sirius Cybernetics Corporation overlords.

      (That is not a request)

  4. Anonymous Coward
    Anonymous Coward

    More to this than technology

    Contributors have noticed that the end is nigh for the desktop x86 market: already so for non-gaming home users, and increasingly so (starting with longer lifecycles) for the IT department rollouts.

    Intel fanbois sing "but there's still Xeon".

    Yes, for now there is. Chip development costs a lot of money. Historically desktop revenues at Intel have helped pay for continued Xeon development. (Mobile revenues at Intel don't pay for anything much yet, do they?). But as desktop x86 isn't growing, might desktop CPU prices come down? Volumes surely will. And thus so will Intel's desktop profits.

    Even as Intel's desktop profit declines, Intel will be able to fund Xeon development from their piggy bank for a while longer, but the closer Xeon gets to having to fund its own development, the closer it gets to being completely unaffordable. Can anyone spell "death spiral"? Those who remember Alpha certainly can.

    Intel. The x86 company.

  5. JeffyPoooh
    Pint

    Meanwhile, in the Self-Driving Car labs...

    ...a boffin puts down his chalk and says, "We're gonna need a bigger..."

    "Boat?"

    "... No, vision system and AI processor. Get Intel on the line."

    Intel is going to be busy until The End of Time itself.

  6. Dave 126 Silver badge

    Intel's competitor is yesterday's Intel

    - in that my laptop is around 5 years old and still works fine for me (CAD, Photoshop etc). It is because the Intel chip in my PC is fast enough already that I haven't bought an Intel chip since.

    Curiously enough, my laptop has roughly the same RAM and internal storage as an iPad Pro. I'm not championing Apple now, but just noting that an ARM-based system is now being sold that could fulfil all my needs if specific productivity software was available.

    (I have no idea about how fast an iPad's Imagination GPU is compared to my aging nVidia 9600M GS with 32 CUDA cores, but I get the impression that the iPad can shunt polygons around cheerfully enough for many CAD purposes)

  7. Anonymous Coward
    Anonymous Coward

    What a time to be alive

    Moore's Law is finally dead.

    The next technological frontiers: virtual reality that is no longer gimmicky but fully integrated into daily life, and artificial intelligence which understands nuances and emotions. May they unfold within my lifetime.

  8. ColonelClaw

    If Intel's ticks and tocks are as you've described, then shouldn't it be called tock-tick? After all, you have to come up with the chip design first, before it can be shrunk.

  9. Colin Tree

    I can wait

    I basically need keyboard, mouse, display, audio, network.

    For years I've enjoyed slight improvements in response and processing power with each iteration.

    I purchase at the price/power sweet spot which isn't the fastest, newest.

    Currently awaiting a Cortex A72 SBC, I can wait.
