
How Alan Turing set the rules for computing

On Saturday, British mathematician Alan Turing would have turned 100 years old. It is barely fathomable that none of the computing power surrounding us today existed when he was born.

But without Turing’s work, computers as we know them today simply would not exist, Robert Kahn, co-inventor of the TCP/IP protocols that run the Internet, said in an interview. Absent Turing, “the computing trajectory would have been entirely different, or at least delayed,” he said.

For while the idea of a programmable computer has been around since at least 1837 — when English mathematician Charles Babbage formulated the idea of his analytical engine — Turing was the first to do the difficult work of mapping out the physics of how the digital universe would operate. And he did it using a single (theoretical) strip of infinite tape.

“Turing is so fundamental to so much of computer science that it is hard to do anything with computers that isn’t in some way influenced by his work,” said Eric Brown, who was a member of the IBM team that built the “Jeopardy”-winning Watson supercomputer.

A polymath of the highest order, Turing left a list of achievements stretching far beyond the realm of computer science. During World War II, he was instrumental in cracking German encrypted messages, allowing the British to anticipate Germany’s actions and ultimately help win the war. Using his mathematical chops, he also developed ideas in the field of non-linear biological theory, which paved the way for chaos and complexity theories. And to a lesser extent he is known for his sad demise, an apparent suicide after being persecuted by the British government for his homosexuality.

But it may be in computer science that his legacy will be most strongly felt. Last week, the Association for Computing Machinery held a two-day celebration of Turing, with the computer field’s biggest luminaries, including Vint Cerf, Ken Thompson and Alan Kay, paying tribute to the man and his work.

Turing was not alone in thinking about computers in the early part of the past century. Mathematicians had been grappling with computable functions for some time, and Turing drew on the work of colleagues at Princeton University during the 1930s. There, Alonzo Church was developing the lambda calculus (which later formed the basis of the Lisp programming language), and Kurt Gödel had worked on the incompleteness theorems and recursive function theory. Turing built on the work of both mathematicians to create a conceptual computing machine.

His 1936 paper described what would later become known as the Turing Machine, or a-machine as he called it. In it, he described a theoretical device that operated on an infinitely long piece of tape containing a series of symbols. A machine head could read the symbols on the tape as well as write its own, and it could move along the tape, one cell at a time.
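To make the idea concrete, here is a minimal sketch of such a machine in Python. The simulator and its sample rule table (a machine that simply flips a string of bits) are invented for illustration and are not drawn from Turing’s paper; they only show the read-write-move loop he described.

```python
# Minimal Turing machine simulator: a head reads a symbol, writes a symbol,
# moves one cell left or right, and changes state -- the loop described in
# Turing's 1936 paper. The rule table below is an invented example.

def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state),
    with move being -1 (left) or +1 (right). A missing rule halts the machine."""
    cells = dict(enumerate(tape))            # finite stand-in for the infinite tape
    for _ in range(max_steps):
        symbol = cells.get(head, "_")        # "_" marks a blank cell
        if (state, symbol) not in rules:     # no applicable rule: halt
            break
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol             # write
        head += move                         # move one cell
    return "".join(cells[i] for i in sorted(cells))

# Example rules: walk right along the tape, flipping each bit, halt on a blank.
flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}

print(run_turing_machine("0110", flip_bits))  # prints "1001"
```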

“The Turing machine gave some ideas about what computation was, what it would mean to have a program,” said James Hendler, a professor of computer science at the Rensselaer Polytechnic Institute and one of the instrumental researchers of the semantic Web. “Other people were thinking along similar lines, but Turing really put it in a formal perspective, where you could prove things about it.”

On its own, a Turing Machine could never be implemented. For one, “infinite tapes are hard to come by,” Kahn joked. But the concept proved invaluable for the ideas it introduced into the world. “Based on the logic of what was in the machine, Turing showed that any computable function could be calculated,” Kahn said.

Today’s computers, of course, use binary logic. A computer program can be thought of as an algorithm or set of algorithms that a compiler converts into a series of 1s and 0s. In essence, a modern computer operates much like a Turing Machine, with finite memory standing in for the infinite tape.

“It is generally accepted that the Turing Machine concept can be used to model anything a digital computer can do,” explained Chrisila Pettey, who heads the Department of Computer Science at Middle Tennessee State University.

Thanks to Turing, “any algorithm that manipulates a finite set of symbols is considered a computational procedure,” Pettey said in an interview via email.

Conversely, anything that cannot be modeled by a Turing Machine cannot run on a computer, which is vital information for software design. The same style of analysis classifies problems that are computable in principle but intractable in practice, meaning an exact answer would take an unreasonable amount of time. “If you know that your problem is intractable, and you don’t have an exponential amount of time to wait for an answer, then you’d better focus on figuring out a way to find an acceptable alternative instead of wasting time trying to find the actual answer,” Pettey said.

“It’s not that computer scientists sit around proving things with Turing Machines, or even that we use Turing Machines to solve problems,” Pettey said. “It’s that how Turing Machines were used to classify problems has had a profound influence on how computer scientists approach problem solving.”
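One way to make that influence concrete, using an example invented for this article rather than one of Pettey’s: finding the subset of numbers whose sum lands closest to a target is a classic problem whose exact solution roughly doubles in cost with every extra input value. For large inputs, practitioners settle for a fast heuristic that gets close, which is exactly the trade-off Pettey describes.

```python
# Invented illustration of Pettey's point. The brute-force search below
# examines every subset, so its cost doubles with each added value. The
# greedy heuristic returns an acceptable (not always optimal) answer fast.
from itertools import combinations

def exact_best_subset(values, target):
    """Brute force: examine all 2^n subsets. Only feasible for small n."""
    best = ()
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            if abs(target - sum(subset)) < abs(target - sum(best)):
                best = subset
    return best

def greedy_subset(values, target):
    """Heuristic: grab the largest values that still fit. Fast, not optimal."""
    chosen, total = [], 0
    for v in sorted(values, reverse=True):
        if total + v <= target:
            chosen.append(v)
            total += v
    return chosen

values = [9, 7, 6, 5]
print(exact_best_subset(values, 13))  # (7, 6): hits the target exactly
print(greedy_subset(values, 13))      # [9]: close enough, without the exponential search
```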

At the time Turing sketched out his ideas, the world had plenty of sophisticated adding machines that allowed someone to perform simple calculations. What Turing offered was the idea of a general-purpose programmable machine. “You would give it a program and it would do what the program specified,” Kahn explained.

In the next decade, another polymath, John von Neumann, at the Institute for Advanced Study in Princeton, started working on an operational computer that borrowed from Turing’s idea, except it would use random access memory instead of infinite tape to hold the data and operational programs. Called MANIAC (Mathematical Analyzer, Numerator, Integrator, and Computer), it was among the first modern computers ever built and was operational in 1952. MANIAC used what is now called the Von Neumann architecture, the model for virtually all computers today.

Returning to Britain after his time at Princeton, Turing worked on another project to build a computer that used these concepts, called the Automatic Computing Engine (ACE), and pioneered the idea of a stored-program machine, which would become a vital part of the Von Neumann architecture.

As well as sparking the field of computer science, Turing’s work on cracking encryption may ultimately have helped save Great Britain from German occupation. People have argued that his work defining computers was essential to his success in breaking the encryption generated by Germany’s Enigma machine, work that helped bring World War II to an end.

“By today’s definitions, the Enigma was an analog computer. What he [and his team] built was much closer to [the operations] of a digital computer,” Rensselaer’s Hendler explained. “Essentially he showed the power of digital computing in attacking this analog problem. This really changed the whole way that the field thought about what computers could do.”

Having defined computational operations, Turing went on to play a fundamental role in defining artificial intelligence, or computer intelligence that mimics human thinking. In 1950, he authored a paper, “Computing Machinery and Intelligence,” that offered a way to determine whether a computer possessed human intelligence. The test involves a person having an extended conversation with two hidden entities, a computer and a man pretending to be a woman. (“In both cases he wanted pretending,” Hendler explained.) If the person cannot determine which party is the computer, the machine can be said to think like a human.

“He wanted to put human and computing on equal footing,” Hendler said. “Language is a critical skill for humans because it requires understanding and context. If a computer showed that level of understanding then you wouldn’t notice the difference.”

The test “has the advantage of drawing a fairly sharp line between the physical and the intellectual capacities of a man,” Turing wrote in the original paper.

As IBM’s Brown noted, Turing’s legacy is still strongly felt today. In his mathematics work, he showed that “there exist problems that no decision process could answer,” Hendler said. In terms of computers, this means, “You could never prove for all complicated computer programs that they are correct,” Hendler said. “You could never write a computer program that could debug all other computer programs.”
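The reasoning behind that last remark is the classic halting-problem argument, sketched below in Python purely for illustration. The halts function here is hypothetical, and Turing’s point is precisely that it can never actually be written.

```python
# Sketch of the argument behind Hendler's remark, for illustration only.
# Suppose a hypothetical function halts(program, data) could always decide
# whether program(data) eventually stops. Feeding the "contrary" program
# to itself forces a contradiction, so no such universal checker can exist.

def halts(program, data):
    """Hypothetical universal halting checker (assumed, never implementable)."""
    raise NotImplementedError("Turing showed no such function can exist")

def contrary(program):
    # Do the opposite of whatever halts() predicts about program(program).
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    return "halted"          # predicted to loop, so halt immediately

# contrary(contrary) would have to both halt and run forever, whichever
# answer halts() gave: the contradiction at the heart of the halting problem.
```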

But far from restricting the progress of computer science, the knowledge of such fundamental limits paved the way for building previously unimagined technologies. It allowed engineers to create immensely useful services such as Internet search engines, despite knowing that the answers those services provide would not always be complete.

“You have people who say we should never build a computing system unless we can prove it is secure. Those of us who understand Turing say, ‘Well, you can’t.’ So you must start proving some approximation of secure, which starts a very different conversation,” Hendler said.

And despite numerous attempts to pass the Turing Test, it still hasn’t been done, except within the most limited of topics. That means we will likely be working to meet Turing’s benchmark for years to come.

“You can’t say, ‘Siri. How are you today?’ and expect it to go on from there in any interesting way,” Hendler said.
