With schools failing to get the next generation excited about computers, silicon.com chief reporter Nick Heath argues that lessons need to revisit how the information age got started.
While most people in Britain use a computer, few of us would pretend to know, or care, how they work.
For most, the computing technology we use every day is so complex as to be unknowable - barely distinguishable from magic - its inner workings known only to a technocratic elite.
Until I started my research to write an ebook on the history of computing, I pretty much held the same view. I was impressed by computers and marvelled at their evolution from the home computers of the 80s to the handheld virtual assistants they are today, but I was mostly cheerfully ignorant of what made these mysterious creations tick.
Partly it’s because computers have no wheezing pistons or clanking gears - only inscrutable chunks of silicon. The featureless surface of a microprocessor gives no indication of the labyrinthine complexity that lies beneath, of the myriad electronic components etched into its structure.
But writing this ebook meant travelling back to a time before computer parts had passed into the microscopic realm, to a point when fist-sized valves were the engines of information processing and computers filled rooms, rather than pockets.
At that time the mysterious components that today are locked away inside silicon chips were clearly visible, in the vacuum tubes that studded the walls and the cabinets manically gobbling punched cards. For me the old maxim ‘Seeing is believing’ held true. Looking at these early machines, the likes of the Eniac and the Colossus, and reading about how they worked crystallised a basic understanding of modern computing for me.
I got a sense of how computers process information by translating it into binary 0s and 1s, of how they represent those 0s and 1s by switching circuits on and off with valves, and of how those 0s and 1s can be manipulated by arranging electronic components into logic gates. Computers and what they actually do became tangible and relatable.
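To see how little magic is involved, the idea that valves acting as on/off switches can be combined into logic gates - and gates into arithmetic - can be sketched in a few lines of Python (a modern illustration of the principle, not anything resembling period code):

```python
# Model each "valve" as a signal that is either on (1) or off (0).
def AND(a, b):
    return a & b          # output is on only when both inputs are on

def OR(a, b):
    return a | b          # output is on when either input is on

def NOT(a):
    return 1 - a          # output inverts the input

def XOR(a, b):
    # Built purely from the gates above: on when exactly one input is on.
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two binary digits; returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))   # 1 + 1 in binary: sum 0, carry 1 -> (0, 1)
```

Chain enough of these gates together and you have the arithmetic unit of a computer - whether the switches are glowing valves or transistors etched into silicon.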
And that’s why I’m convinced that if learning about the roots of computing fuelled my understanding, the same knowledge could breathe life into the dire state of IT teaching in England.
For too long, IT in most schools has consisted of lifeless lessons on how to create spreadsheets and format a Word document, leaving children without the interest or skills to pursue a career in computing. Apart from coding, what better way is there to get kids excited about computing again than by showing them the steps along the path to today’s digital world?
Show them how punched cards were used to program the Jacquard Loom or to rapidly count data in the Hollerith Tabulating Machine, and help them understand the link between the iPhone and the 1940s electromechanical computer, the Z3. Do this and we shine a light into a computer’s inner workings and reveal the building blocks that today have vanished from view.
And if the technology itself gets a bit dry then liven lessons up with the human stories that go hand in hand with the march of computing. The story of how Colossus helped crack Hitler’s Lorenz cipher, thus shortening the war, will enliven the dullest of classrooms.
Introduce the history of computing into the National Curriculum and these machines will lose the air of rarefied mystery that intimidates and discourages so many people from trying to understand how they work.
Computers may be complex but their basic principles can be understood by everybody. Let’s try to get that message over by taking IT teaching back to where it all started.
From the abacus to the iPhone: The 50 breakthroughs that sparked the digital revolution is available to download now.