IBM's Phase Change Memory computer can tell you if it's raining

IBM boffins have unveiled new work on in-memory computing: doing processing inside Phase Change Memory with no external CPU. Traditional computing requires a memory to hold data and an external processor to which the data is transferred, processed and then written back to memory. This is the Von Neumann architecture and can be …

  1. mako23

    It will not work... a sad, desperate attempt from a company that's past its sell-by date

    1. Lee D Silver badge

      Timing synchronisation would seem to be the problem, especially if you're talking temperature critical operations.

      I don't see how you'd be able to do these computations any faster than current technology; if anything, quite a bit slower. Certainly not without then having to push the results somewhere else to actually make use of them.

      It probably has some niche applications somewhere... no RF emissions? No central clock / space missions where clocks might not work reliably / time might change because of speed / etc.? I don't know. But as a mainstream technology? I don't see it.

      1. Dave 126 Silver badge

        And yet we are already aware of systems that 'compute' and 'store' in the same regions, for deciding actions based on fuzzy inputs in a very power efficient manner. Biological brains.

        So I'm not sure why the above two posters assume this approach is a useless dead end.

  2. Destroy All Monsters Silver badge
    Holmes

    Computronium!

    Sounds interesting, but of course only useful for some algorithms (those that have characteristics of "physicality": they are local, and far-away nodes need to be messaged first). In computers, memory is used as a "super-lightspeed" communication medium.

    But yeah, basically the Connection Machine 1 without the hypercube communication channels? (Upon which, it is always interesting to re-read The Book. It's likely floating around on the 'net somewhere. I mean come on MIT Press, USD 34 for a 30-year-old extended paper in B&W?)

    1. Doctor Huh?

      Re: Computronium!

      I'm glad I'm not the only one who came up with this, ahem, connection.

      The basic idea of moving the computation to the data has been around for decades and pops up frequently. My most recent favorite is the Netezza appliance, which essentially implemented smart disk storage using FPGAs and hard drives. SSD storage would have solved the low MTBF of the mechanical devices. But two factors have put that technology on life support:

      1. About 5 years ago, Hadoop became the shiny new thing (now it's Spark, and tomorrow it will be...?), and interest in dedicated appliances waned as interest in on-demand Cloud-based Hadoop grew.

      2. IBM bought Netezza. IBM buying your company is a more effective kiss of death than the one Michael Corleone gave Fredo, because IBM doesn't wait for Mom to die before putting out the hit.

      I can see Seth MacFarlane doing a whole bit on how this memory chip is an improvement in rain detection over the "weather rock" present in so many places.

  3. Doctor Syntax Silver badge

    This sounds like a technology for building neural networks with something like the degree of parallelism that the brain uses.

    1. DropBear

      I don't think so, in anything like its current form - I see zero interconnectivity in this. Just a bunch of cells, each one reacting locally to some external stimulus in a horribly rudimentary way. Without all the interconnectedness (and much more importantly, the part that can _modify_ the connections in a meaningful way) this sounds nothing like neural networks...

      1. Dave 126 Silver badge

        @DropBear

        There has been success in using neural networks that have some areas of just local connections. Indeed, it seems the bottlenecks that these introduce are essential to keep the amount of information low enough to be processed efficiently. The image in this first link below illustrates this well:

        https://arstechnica.co.uk/science/2017/10/inspired-by-brains-visual-cortex-new-ai-utterly-wrecks-captcha-security/

        More theoretical stuff here:

        https://www.quantamagazine.org/new-theory-cracks-open-the-black-box-of-deep-learning-20170921/
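        The local-connectivity idea above can be sketched in a few lines: each output unit sees only a small neighbourhood of the input, rather than everything, as in a fully connected layer. This is just an illustrative NumPy toy (the function name, sizes and weights are my own assumptions, not anything from the linked articles):

        ```python
        import numpy as np

        def locally_connected(x, weights, width=3):
            """Each output i depends only on inputs i..i+width-1 (a local patch),
            unlike a fully connected layer where every output sees every input."""
            n_out = len(x) - width + 1
            return np.array([x[i:i + width] @ weights[i] for i in range(n_out)])

        x = np.arange(8, dtype=float)      # 8 input units: 0..7
        w = np.ones((6, 3)) / 3.0          # 6 outputs, each averaging 3 neighbours
        print(locally_connected(x, w))     # a moving average: [1. 2. 3. 4. 5. 6.]
        ```

        The point being that the information bottleneck is built in: each output simply cannot see distant inputs.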

      2. Doctor Syntax Silver badge

        "Without all the interconnectedness"

        Sure, you'd have to add the interconnections. As it stands it's not complete. But the ability to do this at the density of memory cells does shove it well up the parallelisation scale and could make it an enabling technology.

  4. frank ly

    Analogue computing?

    From the description of heating effects causing changes to the electronic state, and hence the derived data value, it sounds like it could be used as a signal integrator (with a fixed natural leakage rate) and a trigger output if the heating rate due to the signal (minus the natural cooling rate) exceeds a certain value.

    I suspect it would be quicker, easier and more flexible if you multiplexed all the inputs into an A/D converter and then used 'traditional' digital programming methods to perform computations on them.
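    The integrator-with-threshold behaviour described above can be sketched as follows. This is a purely illustrative toy model (the leak rate, threshold and units are arbitrary assumptions, not device physics): each input pulse adds "heat", a fixed leak drains it, and an output fires when the accumulated level crosses the threshold.

    ```python
    def integrate(pulses, leak=0.2, threshold=1.0):
        """Leaky integrator: fast pulses out-run the leak and trigger;
        slow pulses never accumulate enough to fire."""
        level, fired_at = 0.0, []
        for t, p in enumerate(pulses):
            level = max(0.0, level + p - leak)   # add signal, subtract leakage
            if level >= threshold:
                fired_at.append(t)               # trigger output
                level = 0.0                      # reset after firing
        return fired_at

    print(integrate([0.6, 0.6, 0.6]))     # fast pulses: fires at t=2 -> [2]
    print(integrate([0.25, 0.25, 0.25]))  # slow pulses: never fires -> []
    ```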

  5. steelpillow Silver badge
    Thumb Up

    The return of analog

    It does sound like a return of analogue computing, in at least some respects. As I understand it, repeated small SET pulses may be applied to nudge the device conductance until sufficient material is affected by the phase change to register a 1. The pulses are summed analog-fashion in the conductance of the device, i.e. the amount of material currently changed. If so, then it is rather like the way human memory works, by strengthening already-existing synapse connections between neurons. I wonder if it can accumulate partial RESET signals to weaken memories, too. Now that would be something!
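    The accumulate-until-threshold behaviour described above can be sketched as a toy model. The pulse sizes and threshold here are illustrative assumptions, not real device parameters: repeated partial SET pulses nudge the cell's conductance up until it reads as a 1, and partial RESET pulses nudge it back down.

    ```python
    class PCMCell:
        """Toy phase-change cell: conductance accumulates analog-fashion."""
        def __init__(self, threshold=1.0):
            self.conductance = 0.0
            self.threshold = threshold

        def set_pulse(self, delta=0.3):
            """A partial SET: change a bit more material, raising conductance."""
            self.conductance = min(self.threshold, self.conductance + delta)

        def reset_pulse(self, delta=0.3):
            """A partial RESET: revert some material, lowering conductance."""
            self.conductance = max(0.0, self.conductance - delta)

        def read(self):
            """Reads 1 once enough material has changed phase."""
            return 1 if self.conductance >= self.threshold else 0

    cell = PCMCell()
    for _ in range(3):
        cell.set_pulse()
    print(cell.read())   # still 0: conductance 0.9 < 1.0
    cell.set_pulse()
    print(cell.read())   # 1 after a fourth partial SET
    ```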

  6. jMcPhee

    Must be...

    ...options vesting time at IBM. Or, perhaps they want to sell off Zurich so Ginni can ride another helicopter.

  7. Colin Tree

    tri-state

    I'm waiting for three-state memory

    +ve -ve zero

    true, false, don't care

    true, false, maybe

    true, false, stop bit

    Especially as a stop bit, serial storage, variable word length,

    a true or false flag could be stored as f+1,

    a very long number could be stored in n+1 bits,

    variable length instruction, execute

    how much redundancy is there in fixed length words ?
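    The stop-bit idea above can be sketched with a third symbol acting as a terminator, so a one-bit flag costs two symbols and an n-bit number costs n+1. The encoding below is an illustrative assumption of my own, not any standard scheme:

    ```python
    STOP = 2  # the third state: neither true (1) nor false (0)

    def encode(n):
        """Binary digits of n, least significant first, then a STOP trit."""
        trits = []
        while n:
            trits.append(n & 1)
            n >>= 1
        return trits + [STOP]

    def decode(trits):
        """Read digits until the STOP trit; no fixed word length needed."""
        value = 0
        for i, t in enumerate(trits):
            if t == STOP:
                return value
            value |= t << i

    print(encode(6))          # [0, 1, 1, 2]: three digits plus the stop trit
    print(decode(encode(6)))  # 6
    ```

    A fixed 64-bit word spends 64 symbols on every value however small; here the cost tracks the value's actual length.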

  8. annodomini2

    Please correct me if I am wrong...

    There seems to me to be two possible avenues:

    Both with Time delayed operation

    1. Slow-switching transistors (phase change), but with retained state, so you could set a switch state but only action it when needed. This, I think, is where the intended power savings in the circuit come from.

    2. Potential Analogue computer routes.

    The main issue will always be the mechanical thermal gradients in the circuit; I doubt you'd be overclocking this to 7GHz (not that it would stop someone from trying).

    But for applications where speed is less of an issue and integrity requirements are high, there could be products.

  9. Tom 64

    "boffin bunch"

    @El Reg

    Please use the correct collective noun for boffins: 'baffle'. E.g. the baffle of boffins ...
