As a scientist and engineer I've always struggled with the definition and understanding of analogue and digital. In broad terms it is easy to define these modes in the following manner: if something is continuous and well behaved then it is analogue; if it is discontinuous, with a limited number of states, then we consider it to be digital.
As I speak into this voice recorder my neurons are firing in a digital manner to create a continuum of spoken words in the form of a pressure wave, to be converted and impressed electromagnetically onto a tape in an analogue format. Later my PA will replay these words through an analogue machine directly to her ears, and in turn her neurons will fire in a digital manner as she rapidly presses keys on her laptop to create the column you are now reading.
So here we have multiple digital and analogue operations in a process that at first sight looks inherently and predominantly analogue. As a young student I had the good fortune to study both analogue and digital computing at a time when analogue was the predominant mode for most large scientific investigations. I found myself both programming digital machines using machine code and physically wiring analogue computers to model physical situations.
Interestingly, some of these early analogue computers did not involve electronics but water. It may seem quaint, but the reality is that a very few of those machines and techniques are still used, for the very reason that the complexity of the problems remains beyond our biggest digital machines. But there is a very rapid trend towards the digitisation of analogue computing, even for extremely complex models.
So where is all of this going? I look to the future with an increasing conviction that this current digital revolution is set to continue for some considerable time. I think we can look forward to digital computers at least a billion times more powerful than those we currently enjoy - and well within the next 30 years.
At the same time I look at the human race and see no progress whatsoever in our abilities – relatively speaking, we have stopped evolving. It seems abundantly clear that without some technological augmentation, our species will run out of steam and be surpassed by its own technology. Already we see products emanating from Russia that have a rudimentary ability to invent new technologies on the basis of our past history and knowledge subsumed into modest computers.
We still need that spark of human innovation to make the final decisions, to define the real requirements and to identify the economic routes to solution and market. But in this arena I can see the man-machine gap closing rapidly, and certainly within the next 25 years we could find ourselves pushed out of the loop.
Perhaps we shouldn't
Peter Cochrane is an engineer, scientist, entrepreneur, futurist and consultant. He is the former CTO and head of research at BT, with a career in telecoms and IT spanning more than 40 years.