Written at Dallas Fort Worth Airport while stranded in the lounge by high winds and a dust storm that has grounded everything for miles in every direction. Dispatched to silicon.com via a low-cost commercial wi-fi service.

I first encountered the world of quantum mechanics way back in the 1970s as an undergrad. Like everyone before me, then and since, I struggled to make sense of the strange properties of this world of the ultra-small.

It all became real when I got into nuclear fission and fusion, which included some limited hands-on experiments. But it was some 20 years later, while working in high-speed optics, that my interest was really rekindled.

At a personal level I was involved with the design and construction of photonic detectors, while an adjacent team worked on quantum encryption. It was here that I discovered the potential for quantum computers.

They can span the analogue and the digital, and will most likely crack encryption codes and passwords almost instantaneously, not to mention solve many of our currently intractable problems.

Such a technology would render useless much of the encryption we have developed and deployed, while at the same time accelerating solutions to problems in genetics and proteomics – as well as chaotic and non-linear systems in general.
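To put a rough number on the encryption threat, here is a back-of-the-envelope sketch – my own arithmetic, not anything D-Wave has published – comparing a classical brute-force key search with the query count of Grover's quantum search algorithm. (Shor's algorithm is even more damaging to public-key schemes, breaking them in polynomial time.)

```python
# A toy comparison (mine, for illustration) of why quantum algorithms
# threaten today's cryptography. Grover's algorithm finds a key among
# 2^n candidates in roughly 2^(n/2) quantum queries, versus ~2^(n-1)
# classical guesses on average.

def classical_guesses(key_bits: int) -> float:
    """Average brute-force guesses to find an n-bit key classically."""
    return 2.0 ** (key_bits - 1)

def grover_queries(key_bits: int) -> float:
    """Approximate quantum queries for the same search via Grover."""
    return 2.0 ** (key_bits / 2)

for bits in (56, 128, 256):
    print(f"{bits}-bit key: classical ~{classical_guesses(bits):.2e} guesses, "
          f"Grover ~{grover_queries(bits):.2e} queries")
```

In effect a quantum search halves the key length, which is why so much of what we have deployed would need replacing.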

So quantum computing has been on my ‘probables’ list for some 20 years as the technology to take over from silicon when it comes to the end of its very long and fruitful road, sometime in the next 10 to 20 years.

Until this last week such a machine looked – at best – an academic and very long-term prospect. Then, right out of the blue, a Canadian company called D-Wave staged the first really comprehensive demonstration of an engineered quantum computer.

With over 100 patent applications filed, D-Wave has demonstrated a limited but successful degree of problem solving. The really big engineering deal here is the insulation of qubits from noise, and the biggest uncertainty is the degree to which the system will scale in practice.

While the computing chips require only a few nanowatts (one nanowatt is one-billionth of a watt), the cryogenics consume around 20kW to get down to a stable operating temperature of 4mK – just four-thousandths of a degree above absolute zero (-273.15C).

Although that power consumption is very small compared to the average server farm, it dwarfs that of the average PC. The really good news is that expanding the qubit depth per chip will not require a huge increase in cryogenic energy, so D-Wave expects to have a 32-qubit system later this year, with 512 in 2008 and 1,024 in 2009.
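A quick sanity check on those figures – the roadmap qubit counts are from above, plus the 16-qubit demo chip, while the per-qubit split is my own arithmetic – shows why a roughly fixed 20kW cryogenic overhead becomes ever less painful as the chips scale:

```python
# The cryostat's ~20kW is essentially fixed overhead, so the cooling
# cost per qubit collapses as qubit counts grow (16 = the demonstrated
# chip; 32/512/1024 from D-Wave's stated roadmap; the division is mine).

CRYO_WATTS = 20_000  # ~20kW of cooling, roughly independent of chip size

for qubits in (16, 32, 512, 1024):
    print(f"{qubits:>5} qubits: ~{CRYO_WATTS / qubits:,.0f} W of cryogenics per qubit")
```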

Any deep explanation of quantum physics and computing is tough to communicate, and I’m not going to try here. Suffice it to say that this might be the beginning of the end for conventional large-scale binary computing. If not, it sure will be a heck of a complementary technology. For those interested in reading more, D-Wave’s website is the place to start.

To date, D-Wave’s computer is optimised for complex simulations in arenas such as finance, proteomics, weather, wave motion, non-laminar fluid flow and pharmaceuticals. And while last week’s demo really was a proof of concept – a demonstration of what the final product could look like and what it might do – the key thing is that someone has at last got quantum computing to work reliably!
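For the curious, here is a toy classical illustration – with invented coupling values, and brute force standing in for the quantum hardware – of the Ising-style energy-minimisation problems that such optimisation and simulation tasks are typically mapped onto:

```python
# A toy sketch of the problem class D-Wave's machine is built around:
# finding low-energy states of an Ising model. The couplings below are
# made up for illustration; a real annealer explores this landscape in
# hardware rather than by exhaustive search.
from itertools import product

# J[i][j]: coupling between spins i and j; h[i]: per-spin bias (invented values)
J = {(0, 1): -1.0, (1, 2): 0.5, (0, 2): 0.3}
h = [0.2, -0.4, 0.1]

def energy(spins):
    """Ising energy E = sum_ij J_ij s_i s_j + sum_i h_i s_i."""
    return (sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
            + sum(hi * si for hi, si in zip(h, spins)))

# With only 3 spins we can check all 2^3 configurations exhaustively;
# at hundreds of qubits that brute force becomes hopeless classically.
best = min(product((-1, +1), repeat=3), key=energy)
print(f"ground state {best}, energy {energy(best):.2f}")
```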

For me, the most tantalising and perverse prospect? A working quantum computer might just help us understand quantum mechanics and the world of the really small for the first time!