The new IBM supercomputer chip mimics the human brain by using an architecture with 1 million neurons. Yet questions remain about the true purpose of a project backed by massive public funding.
At the Lawrence Livermore National Laboratory, scientists have begun testing IBM's TrueNorth computer chip—a brain-inspired mega chip IBM deems "the largest neurosynaptic chip" to date.
IBM's TrueNorth is the first of its kind. The chip grew out of a collaboration between IBM and Cornell University under the DARPA SyNAPSE program, which was granted $100 million in public funding; the idea for the chip was conceived in 2004.
While traditional computing chips use what's called von Neumann architecture, with separate memory and processors, TrueNorth's structure is an example of neuromorphic computing, which is designed to mimic the human brain.
Each chip is powerful, with 1 million neurons and 256 million synapses, and they are assembled on boards with 16 chips, creating systems with 16 million neurons and 4 billion synapses.
And because the chip processes information the way the brain does, with each neuron firing only when needed, it does not work constantly. IBM and others claim that this event-driven design makes the chip much more energy efficient.
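The event-driven idea can be illustrated with a toy simulation of a leaky integrate-and-fire neuron, a generic textbook model (not IBM's actual neuron circuit): the neuron accumulates input over time, and downstream work happens only at the moments it crosses a threshold and fires.

```python
# Toy leaky integrate-and-fire simulation illustrating event-driven
# computation: the neuron only "fires" (triggering downstream work)
# when its membrane potential crosses a threshold. This is a generic
# textbook model, not TrueNorth's actual neuron circuit.

def simulate(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # integrate input, with leak
        if potential >= threshold:
            spikes.append(t)   # event: spike fired
            potential = 0.0    # reset after firing
    return spikes

# Mostly-quiet input stream: work happens only at the spike times.
inputs = [0.0, 0.0, 0.6, 0.7, 0.0, 0.0, 0.0, 1.2, 0.0]
print(simulate(inputs))  # → [3, 7]
```

For nine time steps of input, the neuron spikes only twice; in a conventional clocked design, every element would do work on every cycle regardless of input.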
"Is it considered state of the art for what we have in terms of low-power chips?" said Nir Shavit, a professor at MIT's Computer Science and Artificial Intelligence Laboratory. "No. But it is the first one that pushes this idea of neuromorphic to the extreme."
Applications of the chip
So what is the purpose of the chip? According to Dharmendra Modha, researcher with the Cognitive Computing group at IBM's Almaden Research Center, neuromorphic computing "delivers deep learning with energy-efficiency, volume-efficiency, speed-efficiency, and scalability."
Aside from the chip itself, Modha said that IBM has created novel systems based on the chip, including "an end-to-end ecosystem consisting of a simulator; a programming language; an integrated programming environment; a library of algorithms and applications; firmware; tools for deep learning; a teaching curriculum; and cloud enablement."
The ecosystem, Modha said, is already in use at over 30 universities and government agencies.
"This is not a point solution, but rather a substrate for designing neural network-based intelligent business machines that will transform business, government, and society," said Modha.
So what are the real-world applications? Modha said it provides "energy-efficient, always-on content generation for wearables, IoT devices, smartphones." It can also give "real-time contextual understanding in automobiles, robotics, medical imagers, and cameras." And, most importantly, he said, it can "provide volume-efficient, unprecedented neural network acceleration capability per unit volume for cloud-based streaming processing and provide volume, energy, and speed efficient multi-modal sensor fusion at an unprecedented neural network scale."
The chip is also seen as having the potential to advance cybersecurity and other defense goals at the National Nuclear Security Administration (NNSA).
Over at Microsoft, they also see the potential for this kind of computing power. "One of the 'next big things' will be ultra-powerful machine learning coupled with novel accelerator hardware running at large scale in the cloud," said Doug Burger, distinguished engineer, director, client & cloud apps at Microsoft Research. "The space is moving so fast that lots of competing approaches should be explored; this way the industry will quickly learn what large-scale machine learning architecture will provide the big leaps in cognitive capabilities."
And the creation of the chip is not the endpoint for IBM. The company has a larger goal: to build a chip with 10 billion neurons. "In 2011, IBM had a prototype with one core and 256 neurons; in 2014, each grew to have 4,096 cores and 1 million neurons; today's system has 65,536 cores, 16 chips, and 16 million neurons," said Modha. "So, in a span of just five years, the technology has grown 65,536 fold!"
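As a quick sanity check on the arithmetic in Modha's quote: the "million" counts are binary round numbers (each chip has 4,096 cores of 256 neurons, i.e. 2^20 neurons), so the 65,536-fold figure works out exactly.

```python
# Counts quoted in the article (all binary round numbers).
neurons_per_core = 256    # 2011 single-core prototype
cores_per_chip = 4_096    # 2014 TrueNorth chip
chips_per_system = 16     # current 16-chip system

neurons_per_chip = cores_per_chip * neurons_per_core       # 1,048,576 ("1 million")
neurons_per_system = chips_per_system * neurons_per_chip   # 16,777,216 ("16 million")

print(neurons_per_system // neurons_per_core)  # → 65536
```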
The plan, he said, is to "innovate on architecture, systems, and software ecosystems to achieve the long-term goal of building a brain in a box." The "brain," he said, would not only have 10 billion neurons, but would draw less than 1 kW of power and occupy less than 2 liters of volume.
But while the power of the chip is truly unprecedented, some question the purpose behind its development.
"Why are we doing this?" asked Shavit. "Are we trying to push the frontier of hardware in terms of problems that we know how to solve today? Or to solve problems in the future?"
The answers, Shavit said, are frustratingly mysterious.
Over at Facebook, Yann LeCun, director of artificial intelligence research and an expert in neural networks, has written that he does not believe that the chip will be able to surpass the processors available today.
Shavit also doesn't see the reasoning behind the chip.
"It would be as if Henry Ford decided in 1920 that since he had managed to efficiently build a car, we would try to design a car that would take us to the moon," Shavit said. "We know how to fabricate really efficient computer chips. But is this going to move us towards human-quality neural computation?" Shavit fears that it's simply too early to try to build neuromorphic chips; instead, he argues, we should try much harder to understand how real neural networks compute.
"The problem is," Shavit said, "that we don't even know what the problem is. We don't know what has to happen to a car to make the car go to the moon. It's perhaps different technology that you need. But this is where neuromorphic computing is."
Also see

- IBM announces new Watson APIs for developers to make AI apps more human (TechRepublic)
- IBM Watson's next big challenge: Filing your taxes (ZDNet)
- How IBM's new 7nm chip busts Moore's Law, changes future of computing (TechRepublic)
- Configurable IoT chips present new hacking risks (TechRepublic)
- The computer that helped bring nuclear power to the world (TechRepublic)
- Security and Privacy: New Challenges (TechRepublic and ZDNet special feature)