Hardware

Quantum leap: D-Wave's next quantum computing chip offers a 1,000x speed-up

With a new chip due out early next year, the maker of the D-Wave quantum system says its technology has the potential to revolutionize machine learning.


A close up of one of D-Wave's quantum processors.

Image: Google / YouTube

We may be decades from unlocking the true power of quantum computing, but D-Wave is promising to offer a taste of the future with its significantly upgraded quantum processor.

When it is released early next year, the Canadian firm's new quantum chip will be able to handle some 2,000 quantum bits (qubits), roughly double the usable number found in the processor in the existing D-Wave 2X system, and be capable of solving certain problems 1,000x faster than its predecessor.

D-Wave machines are multi-million-dollar computers that crunch data using "quantum transistors": tiny loops of niobium cooled to close to absolute zero by liquid helium. Only a handful of such systems are in use, run by Google together with the Universities Space Research Association, by Lockheed Martin, and by Los Alamos National Laboratory. However, D-Wave also offers access to its quantum computers via a cloud service.


Quantum computing is still a largely theoretical field, which studies how to exploit the bizarre and counter-intuitive way that matter behaves at an atomic level to develop hugely powerful machines. For certain tasks, quantum computers have the potential to be exponentially faster than existing systems, as well as being vastly more energy efficient. While a universal quantum computer doesn't yet exist, the D-Wave system utilises various atomic behaviors, such as entanglement and state superposition, to help solve a range of difficult computational problems.

"We've been on a trajectory, which has been doubling the number of qubits pretty much every year," said Colin Williams, director of business development and strategic partnerships at D-Wave.

Bumping up the qubits on the D-Wave processor moves the systems closer to being able to challenge conventional computers, and the new processor will also support additional features that allow for more efficient calculations.

"From internal tests, that looks like a really good thing to do. We've got some problems we've already sped up by a factor of 1,000 by exploiting that capability," said Williams at the CW TEC conference in Cambridge.

Unlike today's PCs, D-Wave systems are not universal computers.

Rather than being capable of performing any computational task asked of them, D-Wave machines are designed to tackle a specific class of task known as quadratic unconstrained binary optimization (QUBO), as well as related sampling problems. A very simple example of this type of optimization problem might be drawing up a plan for a house that comes as close to your dream spec as possible while staying within your budget.
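To make that concrete, here is a minimal sketch of what an unconstrained binary optimization problem looks like in code. The function and matrix below are illustrative inventions, not D-Wave's API: a real annealer tackles problems with thousands of variables, where brute-force enumeration like this would be impossible.

```python
import itertools

def solve_qubo(Q):
    """Brute-force minimiser for a tiny QUBO: find the binary vector x
    minimising sum_ij Q[i][j] * x[i] * x[j]. Illustrative only."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        e = sum(Q[i][j] * bits[i] * bits[j]
                for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = bits, e
    return best_x, best_e

# Toy "house plan": x0 = extra bedroom, x1 = garage, x2 = garden.
# Diagonal entries reward desired features (negative = good);
# off-diagonal entries penalise combinations that bust the budget.
Q = [[-3,  2,  0],
     [ 2, -2,  1],
     [ 0,  1, -1]]
print(solve_qubo(Q))
```

Here the lowest-energy assignment picks the bedroom and the garden but skips the garage, because the bedroom-plus-garage penalty outweighs the garage's value.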

The specialized work that the D-Wave processor can carry out could be useful in a number of areas, according to D-Wave, particularly training machine learning models.

However, there are still significant obstacles to the larger challenge of building a universal quantum computer, with several unresolved engineering challenges. By extrapolating from past trends in chip development, John Morton, professor of nanoelectronics and nanophotonics at UCL, predicts the first universal quantum processor won't be built until the 2030s.

Morton said that just as a calculator isn't a computer, so the D-Wave system isn't a universal quantum computer.

"A calculator solves a very specific set of problems. Lots of people use it, and you can use calculators across many different industries," he said.

"So when D-Wave shows you many different industries that might use a D-Wave machine, there may be many areas that it can be used in, but it remains a specialized device."

While Google isn't yet using the D-Wave machine to bolster its machine-learning efforts, the credibility of D-Wave's processors, which has been repeatedly challenged by some academics, was given a boost by a test run by the technology giant late last year.

The experiment found that the D-Wave 2X processor was 100 million times faster than a classical processor running a similar operation, but more importantly it demonstrated the future viability of D-Wave's chips, according to Williams.

"The main takeaway from that wasn't really a speed up result, because there are other classical algorithms that can do better," he said.

"That experiment showed the quantum tunnelling really was occurring in the D-Wave chip. It showed that even if the range of that tunnelling is finite, it is a useful computational tool.

"Google understood, as do we, that as we evolve the design of our chip to make it more densely connected, then the classical algorithms that currently work well for this problem will completely fall apart."

D-Wave has raised millions of dollars in funding from various investors, including investment bank Goldman Sachs, In-Q-Tel (the investment arm of the US Central Intelligence Agency), Bezos Expeditions (the investment arm of Amazon founder Jeff Bezos), BDC Capital, Harris & Harris Group, and DFJ.

A machine that can speak like a human

Another criticism of D-Wave's chip is that its specialized nature limits its usefulness, something Williams rejects.

"I wanted to put to rest the idea that the D-Wave chip only does one thing; it's actually been shown that the one thing it does can be used in lots of different fields," he said.

While not naming the organizations that had used D-Wave chips in this way, he said the processors had been used for trading-trajectory optimization in the financial sector, for working out how proteins fold in bioscience, for creating list filters that never miss a potential match (useful for security services checking terrorist watchlists), and for developing binary classifiers in AI and computer vision.

But it is in unsupervised machine learning, in which training data is fed into a neural network and the machine learns by identifying patterns, that Williams believes the D-Wave processor will make the greatest impact, perhaps explaining Google's interest in the technology.

"We think that machine learning and AI are fundamentally the best use cases for this kind of machine. It has the potential to be completely revolutionary, especially for unsupervised generative learning," he said.

"With the quantum chip, we have the potential to go back and address the original grand challenge of machine learning: 'How do you get unsupervised, generative machine learning to work as efficiently as you can?'.

"If you can do that, you can do amazing things with machine learning. You can make a machine that, having trained it, you can make it generate new data that is statistically indistinguishable from the kind of data on which it was trained."

Williams predicted that future generations of D-Wave chips could train machines to produce new and convincing works of art in the style of a master painter whose work they were trained on, or to be capable of human-like speech.

D-Wave has already experimented with machine learning on the chip, setting up a Boltzmann machine, a type of stochastic recurrent neural network, as well as a "Quantum Boltzmann machine", which Williams said is "fundamentally different from previous machine learning models".
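For readers who want a feel for what a Boltzmann machine does, the sketch below runs Gibbs sampling on a classical two-unit network, where a positive coupling makes the units tend to agree. This is an illustrative classical simulation, not the quantum version run on D-Wave hardware, and all the names in it are invented for the example.

```python
import math
import random

def gibbs_step(x, W, b, rng):
    """One Gibbs sweep of a classical Boltzmann machine: resample each
    binary unit from its conditional distribution given the others."""
    n = len(x)
    for i in range(n):
        # Local field on unit i: its bias plus weighted neighbour states.
        field = b[i] + sum(W[i][j] * x[j] for j in range(n) if j != i)
        p_on = 1.0 / (1.0 + math.exp(-field))  # sigmoid
        x[i] = 1 if rng.random() < p_on else 0
    return x

rng = random.Random(0)
# Two mutually reinforcing units: the positive coupling favours agreement.
W = [[0.0, 2.0],
     [2.0, 0.0]]
b = [0.5, -0.5]
x = [0, 0]
samples = [tuple(gibbs_step(x, W, b, rng)) for _ in range(1000)]
# States where the units agree should dominate the sample set.
agree = sum(1 for s in samples if s[0] == s[1])
print(agree)
```

Training such a network means adjusting the weights so the samples match the statistics of the training data, which is exactly the kind of sampling problem D-Wave pitches its hardware at.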

Williams doesn't see the D-Wave chip, or other quantum processors that might follow, as replacing classical computer chips, but as working alongside them.

"We recognise that it's not going to be the case that quantum computing replaces classical machines. The way quantum computing is going to change the world is it's going to augment classical systems," he said.

"For example, you could take the output of a quantum computer and use that as an input to a heuristic search algorithm. The idea is that the quantum algorithm could get you in the vicinity of a good solution and a classical algorithm could finish it off.

"We can also look at pre-processing techniques, to take a problem that is too big to fit on the quantum chip and break it down into a sequence of smaller problems."
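The hybrid pattern Williams describes, in which the quantum stage gets you near a good solution and a classical stage finishes it off, can be sketched as a greedy local search that polishes a seed. The seed here is just a hand-picked bit string standing in for an annealer's output, and the whole example is an illustrative assumption rather than D-Wave code.

```python
def energy(x, Q):
    """QUBO objective: sum_ij Q[i][j] * x[i] * x[j] (lower is better)."""
    n = len(Q)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def local_search(x, Q):
    """Greedy single-bit-flip descent: the classical 'finisher' that
    polishes a near-optimal seed into a local minimum."""
    x = list(x)
    best = energy(x, Q)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            x[i] ^= 1              # try flipping bit i
            e = energy(x, Q)
            if e < best:
                best = e           # keep the improving flip
                improved = True
            else:
                x[i] ^= 1          # revert the flip
    return x, best

# Toy problem matrix and a near-optimal seed (standing in for the
# output of the quantum stage).
Q = [[-3,  2,  0],
     [ 2, -2,  1],
     [ 0,  1, -1]]
seed = [1, 0, 0]
print(local_search(seed, Q))
```

From the seed, one improving bit flip reaches the minimum-energy assignment; in a real pipeline the annealer's job is to supply seeds close enough that this cheap classical step lands in the right basin.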

Beyond the 2000 qubit processor, Williams says that D-Wave has a design for a "next-generation chip" with a "fundamentally new topology, based on all the lessons we've learnt".

More details on the new D-Wave chip (for those with a grasp of quantum physics)

For those interested in the nitty-gritty, Williams spoke in more depth about the new capabilities of the 2,000 qubit chip.

Each D-Wave processor is designed for quantum annealing, using quantum physics to find a minimum energy state, which is useful in solving the optimization and related sampling problems mentioned above. Williams explained how the new chip will offer greater control of the annealing process.

"It's not just more qubits, we've changed a lot of other features too. On the previous D-Wave chips we only had the ability to look at one annealing trajectory. We've only been able to essentially turn off the initial Hamiltonian and turn on the final Hamiltonian one way," he said, referring to the Hamiltonian, a function that gives the energy of a system when given the state of that system.

"That's no longer true. With the 2,000 qubit chip, we're going to have a lot more control over parameters, a lot more control over the trajectory.

"We also have features for pausing the anneal and then ramping it quickly to the end; you don't have to anneal at a constant speed anymore.

"That's very interesting, because it allows you to probe the quantum state in the middle of the anneal, which is a critical feature from the point of view of Quantum Boltzmann Machines.

"We also have faster annealing: previous generations of systems could only anneal down to 20 microseconds. With the new system we can anneal down to five microseconds."
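The pause-and-quench schedule Williams describes can be pictured as a piecewise-linear path for the anneal fraction s, which runs from 0 at the start of the anneal to 1 at the end, plotted against time. The data format and validator below are illustrative assumptions for this article, not D-Wave's actual interface.

```python
def validate_schedule(points):
    """Sanity-check an anneal schedule given as (time_us, s) points:
    time strictly increasing, s non-decreasing from 0.0 to 1.0."""
    times = [t for t, _ in points]
    fracs = [s for _, s in points]
    assert times == sorted(times) and len(set(times)) == len(times), \
        "time must strictly increase"
    assert fracs[0] == 0.0 and fracs[-1] == 1.0, \
        "schedule must start at s=0 and end at s=1"
    assert all(a <= b for a, b in zip(fracs, fracs[1:])), \
        "s must not decrease"
    return points

# Pause-and-quench: anneal to s=0.45, hold for 10 us so the mid-anneal
# quantum state can be probed, then ramp quickly to the end.
pause_and_quench = validate_schedule([
    (0.0, 0.0),
    (20.0, 0.45),   # normal anneal to the pause point
    (30.0, 0.45),   # 10 us pause
    (30.5, 1.0),    # fast quench to finish
])
print(len(pause_and_quench))
```

The pause is the feature Williams highlights for Quantum Boltzmann Machines, since holding s fixed mid-anneal is what lets you sample the intermediate quantum state.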


About Nick Heath

Nick Heath is chief reporter for TechRepublic. He writes about the technology that IT decision makers need to know about, and the latest happenings in the European tech scene.
