Experts from chip designer Arm on how chip design will evolve to ensure performance keeps advancing.
"Moore's Law is dead. Moore's Law is over."
So says Mike Muller, chief technology officer at chip designer Arm, the Japanese-owned company whose processor cores are found inside nearly all mobile phones.
Given that Moore's Law has been the engine driving the breakneck pace at which computers have advanced over the past 50 years, this statement might seem worrying.
But Muller is more sanguine.
"On one level it's true, but I'd say, certainly from my perspective and Arm's perspective, we don't care," he said, speaking at the Arm Research Summit 2018.
Muller and his colleagues have good reasons for their indifference to the end of Moore's Law, the prediction that the number of transistors on computer processors will double every two years.
For one, the bulk of Arm-based processors are sold into the embedded computing market, where there is still plenty of scope for transistors to get smaller and chips to get faster.
But more importantly, Arm believes the regular boosts to computing performance that used to come from Moore's Law will continue, and will instead stem from changes to how chips are designed.
Here are three ways that Arm expects processor design will evolve and advance.
1. 3D chips will continue to improve processor performance
Muller believes chip designers will continue to squeeze more performance from processors by stacking transistors and processor dies on top of each other.
"There's a whole bunch of stuff happening in 3D, whether that's within the silicon and 3D transistors stacking within a die, [or] stacking dies together," he said.
"It's become a reality in how you manufacture Flash chips, it's a reality in what you do in servers now, where you take your large multi-core CPU and cut it up into four smaller ones and stack them.
"The yields go up and you actually have better performance and higher yield by starting to stack those processors with memory."
2. Computers will rely on increasingly specialized chips
While today's systems already offload workloads to processors tailored to accelerate those tasks, for example, 3D rendering to GPUs or running trained machine-learning models to Google's TPUs, Muller predicts future systems will have an even wider range of specialized chips.
"You have to step back and say 'What are some of the tasks we're doing? How can we architect better accelerators to solve specific tasks? And how do we build accelerators?'" he said.
"It's a way of putting brain cycles into solving computational problems that isn't just brute force and transistors."
Greg Yeric, director of Future Silicon Technology for Arm Research, says there's plenty of runway to continue improving accelerators.
"For the next three to five years there's a lot to be gained just by making better CMOS-based machine learning," he says.
"The Google TPU is a great example. You don't need massive amounts of accuracy to do these calculations. You can cut back on power and delay by not counting as many bits in the decimal points. Simple things like that."
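The reduced-precision idea Yeric describes can be sketched in a few lines of Python. This is an illustrative toy (using NumPy and a simple symmetric 8-bit scheme, not the TPU's actual number format): the same dot product is computed in full 32-bit floating point and again with both operands squeezed into 8-bit integers, showing how little accuracy is lost.

```python
import numpy as np

# Toy weights and activations in full 32-bit precision.
rng = np.random.default_rng(0)
w = rng.standard_normal(256).astype(np.float32)
x = rng.standard_normal(256).astype(np.float32)

def quantize(a):
    """Map float32 values onto signed 8-bit integers plus one scale factor.

    This symmetric scheme is a simplification for illustration only.
    """
    scale = np.abs(a).max() / 127.0
    return np.round(a / scale).astype(np.int8), scale

wq, w_scale = quantize(w)
xq, x_scale = quantize(x)

# Integer dot product (cheap in hardware), rescaled to float at the end.
approx = int(wq.astype(np.int32) @ xq.astype(np.int32)) * (w_scale * x_scale)
exact = float(w @ x)

# The 8-bit result tracks the float32 one closely, at a fraction of the
# storage and arithmetic cost - the trade-off Yeric alludes to.
print(exact, approx)
```

The hardware payoff is that an 8-bit multiply-accumulate unit is far smaller and lower-power than a 32-bit floating-point one, so an accelerator can pack many more of them into the same die area.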
3. Computers will move beyond silicon chips
In the near future, it's possible we will reach the limits of conventional materials and technologies used to build processors, such as the CMOS (complementary metal-oxide-semiconductor) chips used today, says Yeric.
At this point, the chips inside systems will become more diverse, with traditional CMOS chips sitting alongside more exotic forms of information processors.
"We're going to start to see accelerators that are hardware differentiated," says Yeric.
"So, you can have a special kind of transistor that does one thing really well, [which] doesn't necessarily do all compute well, and you can have a chip that bolts onto a regular CMOS chip.
"Architecting those systems is going to be a bit more of a challenge. However, I really don't see that we're going to have a slowdown at the system level; it's just that the systems won't look like a big monolithic piece of CMOS."
Muller says to expect the likes of neuromorphic, spintronic, and even quantum photonic chips to find their way inside systems.
"There is going to be, in the 10-15 year timeframe, new technology to take us beyond CMOS," he says.
"Products as bought by people like you and me are just going to keep getting better and better.
"Our jobs [chip designers] might be getting harder and harder, but from a consumer's perspective there isn't going to be a slowing down," he says.