
Alan Turing, often considered the father of computer science,
was born a century ago, in June of 1912.
Arguably, and it would be a tough argument to win if you took the other side, computers have had a greater impact on civilization than any other machine since the wheel.
Sure, there was the steam engine, the automobile and the airplane, the printing press and the mechanical clock. Radios and televisions also made their share of societal waves.
But look around. Computers do everything TVs and radios ever did.
And computers tell time, control cars and planes, and have rendered printing presses pretty darn near obsolete.
Computers have invaded every realm of life, from work to entertainment to medicine to education: Reading, writing and arithmetic are now all computer-centric activities.
Every nook and cranny of human culture is controlled, colored or monitored by the digital computer.
Yet merely 100 years ago, no such machine existed. In 1912, the word computer referred to people (typically women) using pencils and paper or adding machines.
Coincidentally, that was the year that Alan Turing was born. If you don’t like the way computers have taken over the world, you could blame him.
No one did more to build the foundation of computer science than Turing. In a paper published in 1936, he described the principle behind all of today’s computing devices, sketching out the theoretical blueprint for a machine able to implement instructions for making any calculation.
Turing didn’t invent the idea of a computer, of course. Charles Babbage had grand plans for a computing machine a century earlier (and even he had precursors). George Boole, not long after Babbage, developed the underlying binary mathematics (originally conceived much earlier by Gottfried Leibniz) that modern digital computers adopted.
But it was Turing who combined ideas from abstract mathematical theory and concrete mechanical computation to describe precisely how, in principle, machines could emulate the human brain’s capacity for solving mathematical problems.
“Turing gave a brilliant demonstration that everything that can be reasonably said to be computed by a human computer using a fixed procedure can be computed by … a machine,” computer scientist Paul Vitányi writes in a recent paper (arxiv.org/abs/1201.1223).
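The machine Turing described is simple enough to sketch in a few lines of modern code. What follows is an illustrative mini-simulator in Python (the names and the unary-successor example are ours, not Turing's notation): a transition table maps each (state, symbol) pair to a symbol to write, a direction to move, and a new state.

```python
# Minimal Turing machine simulator (illustrative sketch, not Turing's notation).
# A machine is a transition table:
#   (state, symbol) -> (symbol_to_write, move_direction, new_state)

def run(transitions, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: unary successor -- scan right past the 1s, append one more.
succ = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run(succ, "111"))  # prints "1111"
```

Any fixed procedure a human computer could follow with pencil and paper can, in principle, be encoded as such a table, which is the heart of Turing's 1936 result.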
Tragically, though, Turing didn’t live to see the computer takeover.
He died a victim of prejudice and intolerance. His work lived on, though, and his name remains fixed both to the idealized machine he devised and to a practical test for machine intelligence, a test that foreshadowed powers that today’s computers have begun to attain.