Artificial Intelligence and life beyond the algorithm: Alan Turing and the future of computing

Turing is not just a historical figure; his work remains strikingly relevant - and raises tricky questions.

A statue of Alan Turing at Bletchley Park. Image: Jon Callas/Wikimedia Commons

It's hard to choose where Alan Turing had the biggest impact on history. The British mathematician is known as the father of computing thanks to his work on what he called a universal machine - which provided the framework for the development of digital computing - and he also helped significantly shorten the Second World War through his work with the codebreakers of Bletchley Park.
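The universal machine idea can be illustrated with a short simulator sketch: one program that executes any machine described by a transition table. The function name and the bit-inverting example table below are illustrative assumptions for this article, not code from Turing's work.

```python
# A minimal sketch of the "universal machine" idea: a single simulator
# that runs any Turing machine given as a transition table.

def run_turing_machine(table, tape, state="start", blank="_", max_steps=1000):
    """Run a one-tape Turing machine until it halts or max_steps is hit.

    table maps (state, symbol) -> (new_state, written_symbol, move),
    where move is -1 (left) or +1 (right). Returns the final tape string.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in table:  # no matching rule: the machine halts
            break
        state, written, move = table[(state, symbol)]
        cells[head] = written
        head += move
    if not cells:
        return ""
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Example machine: scan right, flipping 0s and 1s, halting at the blank.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_turing_machine(invert, "10110"))  # -> 01001
```

The point of the sketch is the separation Turing identified: the simulator is fixed, and swapping in a different table yields a different machine - the essence of the stored-program computer.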

But the interest in Turing is not just historical - his work is still relevant to some of the thorniest problems in tech, particularly around artificial intelligence. While a number of other academics and engineers had a role in the creation of digital computing, what sets Turing apart is the breadth of his influence, says S. Barry Cooper, a professor of mathematical logic at the University of Leeds.

"He is bringing ideas about computation to different areas and that's what's really significant about Turing - he made all these connections and he had a global over-arching view of how computation worked in many different contexts," he said.

The development of the digital computing on which we rely was just one element of his thinking. Even while engineers were struggling to turn his theory into physical computers - mechanical giants filled with glass valves - Turing was already working on even thornier questions, and his work at Bletchley may have helped broaden his outlook, said Cooper, who has co-authored a book on Turing.

"He's kind of inhabiting a pure mathematical world before going into Bletchley Park, where he's forced to engage with real-world problems. He comes out the other end and his late work is very much engaged with the nature of human thinking and the emergence of patterns in nature and so on."

In particular, Turing's work on artificial intelligence remains relevant and controversial.

"I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted. I believe further that no useful purpose is served by concealing these beliefs," Turing said in his 1950 paper Computing Machinery and Intelligence.

He wasn't right, but the emergence of artificial intelligence, whether in the form of Siri or Watson, remains a hot area of research. Most famously, in this paper Turing outlined 'The Imitation Game' (now the name of a new film about Turing), which he argued could be a method for testing machine intelligence. It's now better known as the Turing Test, and while there are a number of variations, the basic concept is that a machine that can convince a human of its intelligence should be thought of as a thinking machine.
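The imitation-game setup described above can be sketched as a small harness: a judge sees only anonymised transcripts from two hidden respondents, one human and one machine, and must say which is which. The function names, the canned machine replies, and the judge interface here are all hypothetical illustrations, not details from Turing's paper.

```python
import random

def machine_respondent(question):
    # Placeholder canned replies standing in for a real conversational system.
    canned = {
        "What is 2 + 2?": "4, I think.",
        "Do you enjoy poetry?": "Some of it, yes.",
    }
    return canned.get(question, "I'm not sure how to answer that.")

def run_imitation_game(judge, human_respondent, questions):
    """Hide a machine and a human behind anonymous labels, pose the same
    questions to both, and return True if the judge identifies the machine."""
    pair = [("machine", machine_respondent), ("human", human_respondent)]
    random.shuffle(pair)  # the judge must not know which label is which
    labels = {"A": pair[0], "B": pair[1]}
    transcripts = {
        label: [(q, respond(q)) for q in questions]
        for label, (_, respond) in labels.items()
    }
    guess = judge(transcripts)  # the judge names the label it suspects is the machine
    return labels[guess][0] == "machine"
```

What the harness makes concrete is Turing's move from a fuzzy question ("can machines think?") to an operational one: the judge never inspects the respondents, only their words.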

"The Turing Test has kind of framed people's thinking. Turing had this knack of focusing on fuzzy problems in kind of rather precise ways. He said, well, certain questions don't make sense, so let's try to pin this down in a practical way," said Cooper.

Turing effectively took a Victorian parlour game and turned it into a piece of modern science. And while the Turing Test has been criticised on a number of levels, it also reflects how trying to work out what 'thinking' or 'intelligent' means - and then applying this human concept to machines - is incredibly fraught, and that the appearance of thinking may be the closest we can get.

As Cooper notes: "Maybe there is a theoretical barrier which is being recognised in taking such an approach. Maybe there isn't an algorithm for testing intelligence and in that case what do you have - some kind of empirical approach."

Indeed, while Turing's work is responsible for the world of computing which we inhabit, it doesn't necessarily follow that he thought algorithms hold the answer to every question, and we should guard against the assumption that big data can make every decision for us, as Cooper points out. "We have to blame Turing for a lot: the way his work has been interpreted and the primacy of the algorithm these days - and the way in which human thinking has in many ways been marginalised particularly when you are thinking about large organisations."

The computer needs to be kept in context, says Cooper: it has changed our world and will continue to be important in everything we do, but so is the human input that Turing recognised in the Turing Test.

Indeed, Cooper argues that Turing's work on artificial intelligence also links up with his work on incomputability - how to solve problems that cannot be solved using standard digital computing. "Right the way through this 20-year period of discovery he's engaged with not just modelling how we compute but also modelling how we actually transcend what the computer does. It's an amazing body of thinking. This is why he is still significant to us: he was thinking about issues that are still issues for us, and in very basic ways that are still valid," he said.

Cooper added: "We haven't really got used to the idea that the standard model of computation isn't comprehensive enough to describe what's happening with the internet or what's happening with human thinking or at the quantum level and we are going to have to take that onboard at some point."

He added: "It feels to me that this is Turing's revolution in progress now ... it's very much part of the way people are thinking about problems now."
