The von Neumann architecture is 70 years old, and the Department of Defense is funding university researchers to come up with smarter approaches to big data problems.
Unique approaches to designing the computers of tomorrow are being planned at six new centers funded by the U.S. government's Defense Advanced Research Projects Agency (DARPA).
Industry research usually focuses on improvements to the ubiquitous von Neumann architecture, in which a central processor has exclusive access to memory and storage, and data must travel back and forth between them. Notable goals of the new centers include alternative architectures, such as putting logic directly into memory, along with modular designs that eliminate the need for motherboards.
(Whether "von Neumann architecture" is an accurate term or a misnomer has become a controversial topic in the past few years. It's well documented that in the 1940s John von Neumann wrote up the EDVAC architecture developed by J. Presper Eckert and John Mauchly, but computer historians are divided over which ideas were Eckert and Mauchly's, which were von Neumann's own, and whether he improperly accepted credit for the whole design.)
The concepts of near-data processing and intelligent RAM aren't new, but they have not been applied to entire systems or data centers in any mainstream way, said Kevin Skadron, a University of Virginia computer science professor leading CRISP (the Center for Research on Intelligent Storage and Processing in Memory), which includes seven other universities and has a 5-year, $200 million contract through DARPA and the nonprofit Semiconductor Research Corporation (SRC).
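The core idea behind near-data processing can be sketched with a toy simulation (this is an illustration of the general concept, not any center's actual design): pushing a filter down to where the data lives means only matching records cross the bus between storage and the host CPU, instead of the entire dataset.

```python
# Toy model of near-data processing: compare how many records must cross
# the storage-to-host "bus" when filtering on the host vs. at the storage.
# All names here are hypothetical, for illustration only.

records = list(range(1_000_000))

def host_side_filter(storage):
    # Conventional path: every record is shipped to the host CPU,
    # which then applies the predicate.
    transferred = len(storage)  # records moved across the bus
    result = [r for r in storage if r % 1000 == 0]
    return result, transferred

def near_data_filter(storage):
    # Near-data path: logic at the storage/memory applies the predicate,
    # so only the matches are shipped to the host.
    result = [r for r in storage if r % 1000 == 0]
    transferred = len(result)   # records moved across the bus
    return result, transferred

r1, t1 = host_side_filter(records)
r2, t2 = near_data_filter(records)
assert r1 == r2  # same answer either way; only the data movement differs
print(f"host-side filter moved {t1} records; near-data filter moved {t2}")
```

The computation is identical in both paths; what changes is where it runs, and therefore how much data has to move, which is exactly the cost these architectures aim to cut.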
"We're going to build on the ideas and the prior results that are in the literature," Skadron explained. "With 20 faculty in the center, we're going to be having a variety of approaches we will be exploring."
"We need to re-think the overall system stack. So we need new programming frameworks that insulate the programmer from these new architectural capabilities and allow them to write code that is highly portable. Hopefully we'll come up with our own unique approaches," Skadron continued. "You need to re-think the operating system as well. Instead of treating memory in the classic page-based fashion, you need a new way for the operating system to organize and protect memory."
Caching, prefetching, and parallel computing are examples of Band-Aid methods that compensate for the performance wall between processors and memory, so data center managers and system administrators are already seeing the need for such research, Skadron noted. "I would guess that many of them are spending a lot of time stalled waiting on memory or waiting on disk," a problem that could go away if the separation of processing and memory were removed [beyond stored-program concepts], he observed.
Another of the new centers is ADA (Applications Driving Architectures) with a $32 million grant led by the University of Michigan. Prof. Valeria Bertacco envisions simplifying application development by making processor and memory components modular in the same assembly, instead of limiting them to communication across a backplane. "When many more people can make the applications, that's what makes an ecosystem bloom," she noted, citing as a recent example the mobile applications revolution.
Bertacco said she'd like to work on processors tailored to algorithms, so that developers can pick the processor best suited for their task. ARM, IBM, Intel, Micron, and Samsung are supporting ADA, she added. Outreach is being planned to companies that make applications and operating systems.
The remaining centers are:
- ASCENT (Applications and Systems-driven Center for Energy-Efficient integrated Nano Technologies), University of Notre Dame ($26M);
- CONIX (Computing on Network Infrastructure for Pervasive Perception, Cognition, and Action), Carnegie Mellon University ($27.5M);
- BRIC (Brain-inspired Computing Enabling Autonomous Intelligence), Purdue University ($27M); and
- ComSenTer (Converged TeraHertz Communications and Sensing), University of California, Santa Barbara ($27.5M).
All six centers report to SRC's Joint University Microelectronics Program. Each center will have an annual symposium to share research.
The DARPA/SRC projects are tasked with looking a decade into the future, giving the centers plenty of research material before quantum computing makes silicon obsolete.