Boffins demo 'memcomputer', plot von Neumann's retirement

Shuffling stuff between memory and the CPU is one of the biggest wasters of time and electricity inside a computer, so boffins from the University of California, San Diego, have demonstrated a prototype “memcomputer” that has memory and compute in the same place. The memcomputer concept has been around since the 1970s, as noted by …

  1. Destroy All Monsters Silver badge
    Holmes

    I slipped on muh snake oil

    "We show an experimental demonstration of an actual memcomputing architecture that solves the NP-complete version of the subset sum problem in only one step and is composed of a number of memprocessors that scales linearly with the size of the problem”

    In other words: magic.

    I'll bet 100 units of the currently strongest currency that this computational version of a perpetual motion machine is not going to hold up for long. Some resource is sure to increase exponentially, or demand exponential precision, somewhere.

    Haven't thought about it yet, but lady nature doesn't like her no-go areas lifted, that's pretty certain.
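
    To make the objection concrete, here is what the naive classical search looks like (my own Python sketch, nothing to do with their hardware): it visits up to 2^n subsets, and a box built from only linearly many parts has to hide that 2^n somewhere.

        from itertools import combinations

        def subset_sum(values, target):
            """Return a subset of `values` summing to `target`, or None."""
            # Exhaustive search: up to 2^n candidate subsets for n values.
            for r in range(len(values) + 1):
                for subset in combinations(values, r):
                    if sum(subset) == target:
                        return subset
            return None

        print(subset_sum([3, 9, 8, 4, 5, 7], 15))   # -> (8, 7), from a space of 2^6 = 64 subsets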

    The claim also seems to be that you can build an implementation of a quantum computer (which appears to NOT be able to solve NP-complete problems in polynomial time, see the problem class BQP) using classical microelectronics. Nein, nein, nein.

    1. Steve Crook

      Re: I slipped on muh snake oil

      "but lady nature doesn't like her no-go areas lifted"

      Thing is though, we don't know where the hard limits are. The history of science and technology is littered with definitive statements about what is and is not achievable. Most turn out to be wrong.

    2. Anonymous Coward
      Anonymous Coward

      Re: I slipped on muh snake oil

      Ye cannae change the laws of physics, Jim

    3. Nigel 11

      Re: I slipped on muh snake oil

      Reading between the lines, I think that's probably "in theory, ignoring noise".

      Allowing for noise, either this piece of magic will fail, rendering the whole thing useless, or, assuming the boffins have more of a clue than the journalists, you'll be running a sort of physically accelerated annealing process that will give you an answer pretty close to THE answer to the NP-complete problem. However, it won't necessarily be that answer, and you may never be able to know whether it is or not. For a travelling salesman problem (route optimisation), that doesn't matter: any close approximation is equally useful. For other (crypto?) problems, it most certainly does.
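
      Something like this toy annealer (my own Python sketch, assuming the device behaves like a physical annealer, which is my reading rather than the paper's words) shows the distinction: it converges on a low-cost answer quickly, but nothing certifies the answer is exact.

          import math, random

          def anneal_subset_sum(values, target, steps=20000):
              """Approximate subset sum by simulated annealing; exact only if final cost is 0."""
              mask = [random.random() < 0.5 for _ in values]
              def cost(m):
                  return abs(sum(v for v, bit in zip(values, m) if bit) - target)
              cur = cost(mask)
              best, best_cost = mask[:], cur
              for step in range(steps):
                  temp = max(1e-3, 1.0 - step / steps)    # linear cooling schedule
                  i = random.randrange(len(values))
                  mask[i] = not mask[i]                   # propose: flip one element
                  new = cost(mask)
                  if new <= cur or random.random() < math.exp((cur - new) / temp):
                      cur = new                           # accept the move
                      if cur < best_cost:
                          best, best_cost = mask[:], cur
                  else:
                      mask[i] = not mask[i]               # reject: flip back
              return best, best_cost                      # best_cost == 0 means an exact subset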

      It's almost certainly a good idea to build one and study its operation. As with quantum computation, detailed information on what can and can't be achieved may lead to the next breakthrough in our understanding of our universe.

  2. frank ly

    Given the example shown:

    "varying the amount of charge moving through the transistor can be used to change the resistance property, which remains stored even when the memcomputer is powered off."

    Ah, it's a programmable analogue computer with simple built-in parameter memory. It can do a Fourier transform in real time but it couldn't run a simple spreadsheet with a pie chart. (Choose your own examples).
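
    A toy model of the quoted behaviour (mine, with invented constants, not the UCSD device): charge pushed through nudges an internal state, the state sets the resistance, and whatever value it ends at simply stays put.

        class ToyMemristor:
            """One-number non-volatile 'parameter memory': resistance set by past charge."""
            def __init__(self, r_on=100.0, r_off=16000.0, k=1000.0):
                self.r_on, self.r_off, self.k = r_on, r_off, k
                self.x = 0.5                       # internal state in [0, 1]

            def resistance(self):
                # Interpolate between the low- and high-resistance extremes.
                return self.r_on * self.x + self.r_off * (1.0 - self.x)

            def push_charge(self, dq):
                # Charge moved through the device shifts the state; clamping keeps it physical.
                self.x = min(1.0, max(0.0, self.x + self.k * dq))

        m = ToyMemristor()
        print(m.resistance())     # 8050.0 ohms before programming
        m.push_charge(2e-4)       # "program" it by passing charge through
        print(m.resistance())     # 4870.0 ohms -- and the value persists unpowered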

    1. Paul Kinsler

      Re: a Fourier transform in real time but it couldn't run a simple spreadsheet with ...

      Your requirements might be different, of course, but my use for Fourier transforms is much greater than that for spreadsheets (although, to be fair, I did use a spreadsheet about three years ago).

      It can be quite useful to dig up old ideas and see how much further you can push them with modern technology. Perhaps we should wait and see how far they can get before turning on the full-strength pessimism.

      1. John Brown (no body) Silver badge

        Re: a Fourier transform in real time but it couldn't run a simple spreadsheet with ...

        "Your requirements might be different, of course,"

        Exactly. There's nothing wrong with exploring the possibilities of highly specialised computing instead of "simply" trying to shoehorn more power into the generalised model we tend to prefer these days. Multi-core GPU graphics cards are a step in that direction, in that they are not general processing units but are optimised for certain types of calculation that the main CPU can be slow at dealing with.

        1. Michael Wojcik Silver badge

          Re: a Fourier transform in real time but it couldn't run a simple spreadsheet with ...

          There's nothing wrong with exploring the possibilities of highly specialised computing instead of "simply" trying to shoehorn more power into the generalised model we tend to prefer these days

          Sure, there's nothing wrong with it; in fact there's a lot right with it - that's what we call "research". But the reason "we tend to prefer" the "generalised model" of Von Neumann computers [1] is because economies of scale make them really fucking cheap. It's not just because we're lazy thinkers.

          Discrete digital computation certainly is far from ideal for many classes of problems, but often it's cheaper - in terms of all sorts of resources, and for both the short and reasonably long term - to just throw lots of digital-computation power at those problems, and call it good enough.

          [1] Albeit now with speculative OOO execution and many other tweaks that make them rather unlike the "pure" vN architecture of yore. But I digress.

      2. Anonymous Coward
        Anonymous Coward

        Re: a Fourier transform in real time but it couldn't run a simple spreadsheet with ...

        "my use for fourier transforms is much greater than that for spreadsheets "

        OK with that, though presumably if FFTs were that important to you, you might run them on a suitably chosen tool, e.g. something DSP based (which itself wouldn't run spreadsheets, but might interface to something that does run spreadsheets)?

        Equally, you probably like your payroll and your bank account to be run on something that (as others have noted) isn't an updated equivalent of an analogue computer - this new thing might be great for running certain classes of model, but it sounds relatively useless at the accurate, repeatable, simple arithmetic and logic required by payroll and by most banks.
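
        The repeatable-arithmetic point in miniature (standard Python, nothing exotic): even binary floating point drifts in ways payroll can't tolerate - never mind an analogue voltage - which is why money gets done in exact decimal.

            from decimal import Decimal

            print(0.1 + 0.2)                          # 0.30000000000000004 -- binary float drift
            print(Decimal("0.1") + Decimal("0.2"))    # 0.3 -- exact, as a payroll run requires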

      3. Michael Wojcik Silver badge

        Re: a Fourier transform in real time but it couldn't run a simple spreadsheet with ...

        It can be quite useful to dig up old ideas and see how much further you can push them with modern technology

        Yes, but this is hardly "dig[ging] up old ideas". That is, analogue computing is a relatively old idea, but there have been plenty of projects implementing analogue computers using microelectronics. Maybe this particular approach will bear commercially-viable fruit; but it's hardly the breathtaking breakthrough that the article implies.

          On a possibly-related note: I was reading in the Santa Fe paper the other day about a startup in Los Alamos that's pitching a combined-compute-and-memory chip. The article was of course devoid of plausible technical details - it had enough inaccuracies to make a Pentium FPU blush - so I don't know what the mooted design looks like, but clearly this is an idea that's in the air.

    2. This post has been deleted by its author

      1. John Robson Silver badge

        Re: Given the example shown:

        @1980s_coder - can it run Crysis?

        No, but it can tell you who will have won... maybe

    3. Anonymous Coward
      Anonymous Coward

      Re: Given the example shown:

      That's right. The results are approximate and limited by the noise floor; this is not a general-purpose computer in the regular sense. That's not to say that getting approximate answers quickly to things like Fourier transforms isn't useful; but discrete problems, like integer factorisation, do not suit this architecture well.

      For those not old enough to remember analogue computers:

      http://blog.analogmachine.org/2012/03/15/analog-computers/

      I once had an edition of a 1970s electronics magazine which had a project to build an analogue computer lunar lander game. Integrators were used to store your altitude, velocity and fuel remaining, and you controlled the rate of burn with a potentiometer.
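
      From memory (all constants invented), the game amounts to three integrators, which a few lines of Python can mimic with Euler steps:

          GRAVITY = 1.62            # lunar surface gravity, m/s^2
          THRUST = 3.0              # made-up maximum engine acceleration, m/s^2

          def step(alt, vel, fuel, burn, dt=0.1):
              """One tick; `burn` (0..1) is the potentiometer setting."""
              burn = burn if fuel > 0 else 0.0
              acc = -GRAVITY + THRUST * burn     # thrust fights gravity
              vel += acc * dt                    # integrator 1: velocity
              alt += vel * dt                    # integrator 2: altitude
              fuel = max(0.0, fuel - burn * dt)  # integrator 3: fuel remaining
              return alt, vel, fuel

          alt, vel, fuel = 100.0, -5.0, 10.0
          while alt > 0:
              alt, vel, fuel = step(alt, vel, fuel, burn=0.4)
          print("touchdown at", round(vel, 2), "m/s")   # gentle enough, or a new crater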

  3. Gideon 1

    “We show an experimental demonstration of an actual memcomputing architecture that solves the NP-complete version of the subset sum problem in only one step and is composed of a number of memprocessors that scales linearly with the size of the problem”

    Yeah, but the interconnect between all these memprocessors rises exponentially. As with an FPGA, you have to program all that interconnect too; that takes a long time, and each programming can only solve one problem.

  4. Gideon 1

    Optimal memory != optimal processor

    Silicon fabrication processes are optimised for what they are trying to make. Microcontrollers integrate processor, RAM, and flash memory on the same silicon by compromising the foundry process to get all the parts to work, but they all end up sub-optimal, which is why microcontrollers have lower performance and smaller memories than their dedicated counterparts.

  5. Paul A. Clayton

    A relevant blog post: http://www.scottaaronson.com/blog/?p=2212 (Found from this Computer Science Stack Exchange answer: http://cs.stackexchange.com/a/44167/4577)

    1. nijam Silver badge

      @Paul A. Clayton

      AFAIK whether or not NP-complete problems admit of solution in polynomial time remains an unsolved problem itself. I believe the weight of opinion says not, but that scarcely constitutes proof in the field of complexity theory.

  6. charles_hermann
    Stop

    @dESTROY aLL mONSTERS

    We're all already betting more than 100 EUR that this is nonsense ;)

    A scalable method of solving any NP-complete problem in polynomial time / space would take out most forms of cryptography - including those relied upon by banks. Even though their preprint has been around since last December, I don't see widespread panic & stashing of gold coins under beds just yet!

    If you look at their arXived paper, it's clear that scalability requires -complete- absence of noise, as the amplitudes of certain crucial oscillations decrease exponentially. No engineering magic is going to solve this one - even if they cool it to close to absolute zero, they will run into quantum uncertainty before they can attack a practical problem.

    They claim that a different encoding might get around this, but the one they reference only offers a quadratic advantage, which is a long way from what they need.

    It's not scalable.
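
    A back-of-envelope version of the noise argument (illustrative numbers only, not the paper's): if the informative amplitude falls off like 2^-n, any fixed noise floor caps the problem size at a few tens of elements.

        NOISE_FLOOR = 1e-6        # assumed relative measurement noise

        def max_problem_size(noise_floor):
            # Largest n for which an amplitude scaling as 2^-n stays above the floor.
            n = 0
            while 2.0 ** -(n + 1) > noise_floor:
                n += 1
            return n

        print(max_problem_size(NOISE_FLOOR))   # 19 -- nowhere near crypto-sized inputs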

    1. Muscleguy

      Re: @dESTROY aLL mONSTERS

      Is it not scalable, or do you just not understand or know how to make it scalable? They used to say that about valve architectures; then silicon came along. I don't think we are talking about fundamentals of the universe here, but limitations in materials, fabrication and programming, added to architectures.

      Computers and computing are not some fundamental property of the universe, unless you get really arcane in information theory, which both leaves practicality behind and suffers from a poverty of assumptions.

      We also have, and have had, dedicated units in our computers. What is a graphics card but an optimised sub-motherboard dedicated to one task? I remember the Mac SE/30, which had a maths co-processor added to the chip. I was doing 3D reconstruction (stacked dinner plates, not solids) of mouse legs, and the difference between a machine without one and with one was stark. On the former: input x,y,z, go away and do something else, maybe get yet another coffee while you wait. On the latter: watch it in real time, and cancel and go again if you get the x,y,z wrong.

      So I don't see this as a motherboard architecture; I see it as a dedicated add-on unit for speeding up particular tasks, or doing them more efficiently, and thus freeing the CPU for other tasks. Imagine how much slower your device would be if the CPU was handling all the graphics.

      1. charles_hermann

        Re: @dESTROY aLL mONSTERS

        It's not scalable, in the sense that a scalable version of what is described would break the laws of both thermodynamics and quantum mechanics. I'll freely admit I don't know how to do that :)

        If you believe that some different technology (valves -> silicon) might one day solve NP complete problems in P-time, that's up to you. I'd like to see the evidence, but this talk / paper doesn't offer any.

    2. Michael Wojcik Silver badge

      Re: @dESTROY aLL mONSTERS

      A scalable method of solving any NP-complete problem in polynomial time / space would take out most forms of cryptography

      I don't think so - only the asymmetric-key ones, and other primitives that rely on trapdoor functions. I suppose you could argue that constitutes a majority of the "forms of cryptography", depending on how you count them, but "most" seems like a stretch.

      Or do you know of an argument that if P=NP then symmetric cryptography is doomed? I'd be interested to hear it.

  7. phil dude
    Boffin

    P != NP....

    @Paul A. Clayton Thank you for dredging that up - it is an accessible explanation.

    This sort of "magical thinking" has got to be torpedoed when it rears its head.

    P.

    1. Anonymous Coward
      Anonymous Coward

      Re: P != NP....

      "Thank you for dredging that up - it is an accessible explanation."

      It is in my opinion a *very* accessible explanation, needing little more than an understanding of superposition and Fourier transforms.

      In particular (if I've understood correctly), the point is that a Fourier transform of infinite resolution needs an infinitely long (i.e. infinite-time) sample. Less precision permits shorter run times. Transfer my wages to an account numbered in the region of 08-00-28, please. Oh, hang on, that's pretty useless as a concept. Which may slow matters down somewhat.
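
      The time/precision trade-off is easy to check numerically (my numbers, using NumPy's FFT): the bin spacing of a discrete Fourier transform is 1/T, so more resolution costs proportionally more record time.

          import numpy as np

          RATE = 1000.0                                  # samples per second
          for duration in (0.25, 4.0):                   # seconds of recorded signal
              n = int(duration * RATE)
              df = np.fft.rfftfreq(n, d=1.0 / RATE)[1]   # spacing between FFT bins, Hz
              print(f"{duration} s record -> {df} Hz resolution")
          # 0.25 s gives 4 Hz bins; 4 s gives 0.25 Hz bins: precision costs time.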

      Why does anyone attempt to claim success with stuff like this?

      It's not like the Italian "faster than light" stuff, where they said "these are our measurements, what are we missing?" and ended up being somewhat misreported.
