
Peter Cochrane's Blog: There is no magic tech

No product can defy the laws of maths and physics

Compiled in my office over a nice warm coffee while snowed in with roads that are mostly black ice, and dispatched via my LAN connection

Because my areas of expertise straddle science, technology, engineering, academia and investing, I am often approached to ratify products, technologies and solutions. I often wonder why, because most investors seem to have already jumped off the springboard of commitment and decided they are right - and they are going to use the product, no matter what I say.

A lifetime in technology has seen endless cases of this, but I only have space for one!

I choose the 'magic' communications system that promises to deliver greater speeds and distances than ever before. Such systems appear on my desk with monotonous regularity, and most recently one popped up in the form of an ultra-wideband radio system that was going to oust all previous radio systems.

Of course the investors were convinced they were on to a winner. What could be more seductive than a system that can communicate faster and further than anything that has gone before?

Using minimal theory and mathematics and maximal graphics, I tried to explain why this could not be the case.

Let's start with just two concepts - one simple and one more complex.

First: The distance power law, which says the power received from a transmitter falls off as 1/d², where d is the distance between transmitter and receiver.

This is easily explained with some basic geometry (see the graph below). If we start from a point source of energy radiating out in all directions, then the amount of energy (or flux, if you will) flowing through a given surface at distance d will be in proportion to the inverse of the square of that distance, as depicted below for a rectangular or conical section.

More simply put: If we double the distance, the energy is reduced to a quarter. Or, more dramatically: If we increase the distance tenfold, the energy available to be collected by an antenna is reduced a hundredfold. Any system that claims to be doing better than this should be looked upon in the same way as a perpetual motion machine.
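
As a quick sanity check, here is a minimal sketch in Python - the distances are purely illustrative and not tied to any particular radio system - tabulating how the relative received power falls with distance:

def relative_power(d, d_ref=1.0):
    # Power received at distance d, relative to that received at the reference
    # distance d_ref, for an ideal point source radiating equally in all directions.
    return (d_ref / d) ** 2

for d in (1, 2, 10):
    print(f"{d}x the distance -> {relative_power(d):.4f} of the power")

# 1x the distance -> 1.0000 of the power
# 2x the distance -> 0.2500 of the power (a quarter)
# 10x the distance -> 0.0100 of the power (a hundredth)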

Second: The information-bearing capacity of a channel was first determined and published by Claude E Shannon a couple of years after I was born. This is not an easy one to explain but it is based on the ability to code and recover signals/information in a noisy channel.

In short: It is rocket science! For those of you with an engineering or physics background, it is worth looking at the resemblance to an entropy calculation.

Another way of thinking of this would be a man whistling a single note, or a newspaper page with a single letter on it. Neither would convey much information. But should the man whistle a tune, or the page be filled with characters, then we might guess that there is far more information.

Now if our whistler was in a football crowd, or our newspaper was not well illuminated, we would have some difficulty in recovering the information. Could we catch that tune, or could we read that text accurately? Shannon gives us an effective measure of the limits to what we might be able to do.

In brief: the information content is related to the disorder in a signal, and the amount that can be conveyed is related to the bandwidth, the transmission time and the signal and noise powers as follows:

I ≤ BT·log₂(1 + S/N)

Where I = Information conveyed in bits
B = Bandwidth of the channel
T = Transmission time
S = Signal power
N = Noise power

This says that this is the best we can do in theory - it represents an upper bound or limiting case. In practice we always do worse! So when people tell me they have a new communications product or system that can do better, I put it on a par with them telling me they can travel faster than light.
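
To make the bound concrete, here is a minimal sketch in Python; the channel figures are purely illustrative and not drawn from any product I have been asked to review:

import math

def shannon_limit_bits(bandwidth_hz, time_s, snr_linear):
    # Upper bound on the information conveyed: I <= B * T * log2(1 + S/N)
    return bandwidth_hz * time_s * math.log2(1 + snr_linear)

# A 3.1 kHz voice-grade channel used for one second at an S/N of 1,000 (30 dB)
print(shannon_limit_bits(3100, 1.0, 1000))  # about 30,900 bits, i.e. roughly 30 kbit/s

Any practical coding scheme on such a channel delivers less than this bound, never more.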

Now, with a little sleight of hand, and without delving into the mathematical detail, we can take the following engineering liberty: when the signal-to-noise ratio (S/N) is >> 1, this formula is well approximated by:

I ≤ k·B·T·(S/N)dB

where (S/N)dB is the signal-to-noise ratio expressed in decibels and k is a constant.
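
For anyone who wants to check the size of that liberty, a few lines of Python (again just a sketch, with arbitrary channel figures) compare the exact and approximate forms; writing the ratio in decibels makes the constant come out at k = 1/(10·log10 2), roughly 0.33:

import math

def exact_bits(B, T, snr_linear):
    # Shannon bound: I <= B * T * log2(1 + S/N)
    return B * T * math.log2(1 + snr_linear)

def approx_bits(B, T, snr_db):
    # Approximation for S/N >> 1: I <= k * B * T * (S/N)dB, with k = 1/(10*log10(2)) ~ 0.332
    k = 1 / (10 * math.log10(2))
    return k * B * T * snr_db

B, T = 1e6, 1.0  # a 1 MHz channel used for one second (illustrative values only)
for snr_db in (10, 20, 30, 40):
    snr_linear = 10 ** (snr_db / 10)
    print(snr_db, round(exact_bits(B, T, snr_linear)), round(approx_bits(B, T, snr_db)))

The two columns agree to within a few per cent once the signal-to-noise ratio is comfortably above 1.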

Now we have something that is very easy to plot and comprehend without totally bastardising all the basic concepts and understanding embedded in the theory. The graph looks like this:

The key thing here is the volume of the figure: a box whose sides are B, T and (S/N)dB. That volume represents the amount of information conveyed or received. I could equally well choose configurations giving the same volume but with different values on each axis:

And so there are an infinite number of possibilities dictated by the choice of modulation and coding, which in turn are often decided by the transmission medium and involve noise of various kinds. In some cases path variability and echoes plus interference are also present.

We might thus imagine the energy of a signal dispersed inside such a solid form in the same way that water is retained by the skin of a balloon. We can change the shape of the balloon but the amount of water stays the same. Similarly, different coding and modulation schemes can alter the ratios of the sides presented by Shannon's equation.

We can certainly trade off signal power against noise and/or bandwidth and time, but we can never exceed the bounds set by nature.
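
The balloon analogy is easy to demonstrate with numbers. A minimal sketch (the figures are illustrative only): three quite different combinations of bandwidth, time and signal-to-noise ratio, each conveying the same million bits.

import math

def bits(B_hz, T_s, snr_linear):
    # Shannon bound: I <= B * T * log2(1 + S/N)
    return B_hz * T_s * math.log2(1 + snr_linear)

# Three different 'balloon shapes', each holding about one megabit:
print(bits(1e6, 0.1, 1023))  # wide band, short burst, high S/N      -> 1,000,000 bits
print(bits(1e5, 1.0, 1023))  # a tenth of the band, ten times longer -> 1,000,000 bits
print(bits(2e6, 0.1, 31))    # twice the band, far lower S/N         -> 1,000,000 bits

Reshape the balloon however you like; the volume - the information conveyed - does not change.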

If you find someone claiming to be able to do so, take a closer look. The thing to remember is: there is no magic. There is no means of defeating the limits dictated by nature and stated so very concisely by thermodynamics. Also, there is no free lunch. Everything entails a cost and you certainly never get more out than you put in!

About

Peter Cochrane is an engineer, scientist, entrepreneur, futurist and consultant. He is the former CTO and head of research at BT, with a career in telecoms and IT spanning more than 40 years.
