New Artificial Intelligence Computer Chips Mimic the Brain

By Vicky Jiang

IBM revealed a new generation of experimental computer chips that simulate the human brain. The “Neurosynaptic Chips” emulate the interactions between neurons and synapses in the brain using advanced algorithms.
Cognitive computers built with the new chips would move technology beyond the traditional von Neumann architecture, in which memory and processing are kept separate. Freed from that bottleneck, they could learn dynamically through experience while collecting and analyzing information from multiple sensory modalities.
A cognitive computing system could monitor temperature, pressure, wave height, sound, and smell, and make informed decisions based on that input, such as issuing tsunami or traffic warnings.
So far, two prototype chips have been manufactured and are undergoing testing. The chips have the potential to be smaller and far more power-efficient than current designs.
The computer chips were inspired by neurobiological concepts. Each chip’s neurosynaptic core integrates memory, computation, and communication systems that parallel the synapses, neurons, and axons of biological nervous systems.
The chips mimic the brain’s ability to perceive, process, and make decisions. One chip contains 256 neurons and 262,144 programmable learning synapses.
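To make that architecture concrete, here is a minimal sketch of one such core in Python. It assumes a simple leaky integrate-and-fire neuron model driving a binary synapse crossbar; the 1024 × 256 crossbar dimensions multiply out to the 262,144 synapses cited above, but the threshold, leak, and input rates below are illustrative assumptions, not IBM’s actual parameters.

    import numpy as np

    AXONS, NEURONS = 1024, 256           # 1024 x 256 crossbar = 262,144 synapses
    rng = np.random.default_rng(0)

    # Illustrative parameters (assumptions, not IBM's actual values).
    weights = rng.integers(0, 2, size=(AXONS, NEURONS)).astype(float)
    threshold = 20.0                     # firing threshold
    leak = 1.0                           # membrane leak per tick
    potential = np.zeros(NEURONS)        # membrane potentials

    def step(axon_spikes):
        # One tick: integrate weighted input spikes, apply leak, fire, reset.
        global potential
        potential += axon_spikes @ weights         # route spikes through the crossbar
        potential -= leak
        fired = potential >= threshold             # neurons crossing threshold spike
        potential[fired] = 0.0                     # reset neurons that fired
        np.maximum(potential, 0.0, out=potential)  # clamp potentials at zero
        return fired

    for t in range(5):                             # drive with sparse random input
        spikes = (rng.random(AXONS) < 0.05).astype(float)
        print(f"tick {t}: {int(step(spikes).sum())} of {NEURONS} neurons fired")

Each tick integrates incoming spikes through the synapse matrix, applies a leak, and fires any neuron whose potential crosses threshold: the perceive, process, and decide loop in miniature.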
Having received close to $21 million in new funding from the Defense Advanced Research Projects Agency (DARPA), the U.S. military’s research and development arm, IBM and its university collaborators are ready to embark on Phase 2 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project after completing Phases 0 and 1.
According to a press release, in the long run, IBM intends to build a chip system with 10 billion neurons and 100 trillion synapses, consuming 1 kilowatt of power and occupying less than 2 liters of volume.
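Those targets put the design near biological scale: they work out to about 10,000 synapses per neuron, comparable to estimates for mammalian cortex, and a power budget of roughly 10 picowatts per synapse. A quick back-of-the-envelope check (only the targets come from IBM; the arithmetic is a sanity check):

    # Back-of-the-envelope check of the stated long-term targets.
    neurons  = 10e9      # 10 billion neurons
    synapses = 100e12    # 100 trillion synapses
    power_w  = 1000.0    # 1 kilowatt

    print(f"{synapses / neurons:.0f} synapses per neuron")  # 10000
    print(f"{power_w / synapses:.0e} W per synapse")        # 1e-11 W, ~10 picowatts
    print(f"{power_w / neurons:.0e} W per neuron")          # 1e-07 W, ~100 nanowatts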