Practically every new technology gets caught in a hype cycle. Many would argue that 5G networks are presently at the forefront of this, as promises about what 5G will be able to do are far removed from what initial network rollouts can actually provide. The overabundance of hype sets unrealistic expectations for a technology’s performance, inevitably leading to disappointment when that technology finally reaches consumers.

The hype cycle effect is slightly more indirect in enterprise technology. Immense amounts of money have been invested in technologies that never quite made it. This is, in part, a failure of planning: not every piece of technology can be directly and equally applied to every business use case, despite overzealous attempts to do so, as is increasingly the case with blockchain.

SEE: Quantum computing: An insider’s guide (free PDF) (TechRepublic)

Hewlett Packard Enterprise sees quantum computing in this light. Though HPE is not building quantum computers, the company is not betting against the technology.

“If you’re trying to find the ground state of a molecule that you think might be a great drug, then a quantum computer would be your go-to machine to do that,” Hewlett Packard Enterprise senior fellow Ray Beausoleil told TechRepublic. “I’m a big booster of quantum computing. I think the applications are going to be incredibly interesting and important. I just don’t think that the enterprise is going to be one of those places where those applications are found, unless you’re a pharmaceutical or materials company.”

Typical office work is not going to be improved by quantum computers; this is not a technology that an average accountant can utilize to improve their work. “Quantum computers are not very good at the three Rs–reading, writing and arithmetic. They don’t like to read big databases, they can’t give you very big complicated readouts, and they don’t do regular arithmetic as well as a regular old classical computer,” Beausoleil said.

Classical applications are poor fits for quantum systems

The difficulty of attempting classical calculations on quantum computers is not widely understood, leading to optimistic predictions about their applicability to general-purpose use cases.

“Suppose that you have a 200-layer deep neural network, 50 nodes per layer. That’s 200 by 50 by 50 weights that you need after you’re done training… let’s pretend that each of those weights is 50 bits,” Beausoleil said, setting up an example of processing a petabit of data using a machine learning algorithm. “In principle we could store [one petabit] in 50 qubits. However, those have to be 50 perfect qubits.”
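The petabit figure comes from the exponential size of a quantum register’s state space: a register of n qubits is described by 2^n complex amplitudes, so 50 qubits span roughly 10^15 values. A minimal sketch of that arithmetic:

```python
# An n-qubit register is described by 2**n complex amplitudes, which is
# why 50 "perfect" qubits could in principle hold petabit-scale information.
n_qubits = 50
amplitudes = 2 ** n_qubits
print(f"{n_qubits} qubits -> {amplitudes:,} amplitudes")
print(f"that is about {amplitudes / 1e15:.2f} x 10^15, i.e. petabit scale")
```

This exponential growth is also why classically simulating even modest numbers of qubits quickly becomes infeasible.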

The current generation of quantum computers–Noisy Intermediate-Scale Quantum (NISQ) systems–uses imperfect qubits that are subject to environmental noise and remain operable only briefly before reaching decoherence. It is possible to combine many noisy qubits to simulate a perfect qubit, with John Preskill estimating the conversion at roughly 1,000 noisy qubits per good qubit, while IBM researchers have seen some success by amplifying and measuring noise to extrapolate what a noiseless result would be.
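The IBM approach mentioned here, often called zero-noise extrapolation, can be illustrated with a toy calculation: run the same circuit at deliberately amplified noise levels, then fit the measured expectation values and extrapolate back to zero noise. The measurement values below are hypothetical stand-ins for illustration, not real device data.

```python
import numpy as np

# Toy zero-noise extrapolation: measure an observable at 1x, 2x, and 3x
# amplified noise, fit a line, and read off the value at zero noise.
noise_scale = np.array([1.0, 2.0, 3.0])   # deliberate noise amplification
measured = np.array([0.80, 0.62, 0.44])   # hypothetical expectation values

slope, intercept = np.polyfit(noise_scale, measured, 1)
zero_noise_estimate = intercept            # extrapolated value at noise = 0
print(f"estimated noiseless expectation: {zero_noise_estimate:.2f}")
```

Real mitigation schemes use more careful noise amplification and richer fit models, but the extrapolate-to-zero idea is the same.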

Beausoleil notes, however, that “the big problem is that no one has any idea how to efficiently take that petabit of information and encode it in 50 qubits. There is no algorithm for that,” adding, “There’s no such thing as a quantum hard drive.”

Every time the neural network is run, the data set must be re-loaded–which Beausoleil contends is not yet possible, and which poses a major obstacle to the adoption of quantum computers. “I can only get one classical bit of information out of every qubit. I have to re-run this 200 by 50 by 50 times, and then carefully plan my measurements so that each time I extract a unique weight,” he said. “Each time I’m doing that run, I’m re-encoding that petabit of data into the quantum register.”
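Beausoleil’s readout bound can be made concrete: one classical bit per qubit per measurement means one full run, including re-encoding the entire data set, for every weight extracted. Sticking with the 200-by-50-by-50 example from above:

```python
# One measurement yields one classical bit per qubit, so extracting every
# trained weight requires a separate run, and each run must re-encode the
# entire data set into the quantum register.
layers, nodes = 200, 50
weights = layers * nodes * nodes   # weights in the example network
runs_needed = weights              # one full run per extracted weight
print(f"{runs_needed:,} full runs, each re-encoding the data set")
```

Half a million re-encodings of a petabit-scale data set, with no known efficient encoding algorithm, is the crux of his skepticism about enterprise workloads.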

There’s no quantum roadmap

Unlike the semiconductor industry, with shared resources like the International Technology Roadmap for Semiconductors and observations like Moore’s Law, there is no shared wisdom about when higher-performance quantum computers will be available. After NISQ, the next step is to establish “quantum supremacy,” the threshold at which a quantum computer is demonstrably capable of performing a calculation that a traditional computer cannot practically complete. When this occurs, it will be a remarkable technical achievement; however, it will still fall short of being transformative.

“The questions that people are trying to answer to demonstrate quantum supremacy, no one cares about. They’re not important, groundbreaking questions,” Beausoleil said. “When a quantum computer can be used to answer a question that cannot be answered on a classical computer, and it is of real significance, that is when quantum computing is a thing.”

For more on quantum computing, learn how a new manufacturing technique could create scalable quantum computers, and how helium shortages will impact quantum computer research.