Imitating Life
By David Alan Grier

It seemed like a good idea at the time, but its value is something that you will have to judge. It arrived when I was walking the streets of Dongcheng on the morning of the 2015 CCF Awards Banquet. My travels took me past a cinema near the hotel. When I saw the movie posters, I thought that I should write about “The Imitation Game,” the new film about Alan Turing. The movie is quite popular in the United States and Europe, though it is also making computer scientists a bit uneasy.

As I started my research for the column, I began to doubt my inspiration. As far as I could tell, “The Imitation Game” has not yet arrived in China. If it has not arrived by the time this column is published, I will have to ask your indulgence: you will have to read a description of a movie that you have not seen and judge whether my comments seem real or make sense. I think that Turing would have appreciated the circumstance, because the imitation game, Turing’s own term for the activity that we now call the Turing Test, challenged his readers to distinguish between the actions of a human being and the actions of a computer.

“The Imitation Game” is actually the third movie about Alan Turing. The first, “Breaking the Code,” appeared in 1996, and the second, “Codebreaker,” was released during the 100th anniversary of Turing’s birth. All three are based on the same book, Alan Turing: The Enigma by Andrew Hodges. (A Chinese translation of the book is available from Hunan Science and Technology Press.) All cover more or less the same time in Turing’s life, a period that begins when he was working as a code breaker for the British government during the Second World War and ends with his death in 1954.

All three movies have generated frustrated anger in the computer science community. While computer scientists have generally been pleased that one of the founders of their field has been portrayed in films, they dislike the liberties that the filmmakers took with Turing’s story. “So much of it isn’t true,” complained a friend of mine at a recent IEEE editorial meeting. He was so disturbed by the portrayals that he claimed he would “create a web page devoted to correcting the errors in the films.”

Now, to be fair, “The Imitation Game,” like its two predecessors, follows the Hodges book fairly closely. If anything, “Codebreaker” strays furthest from the Turing story by casting a good-looking and articulate actor as Turing. Turing may have been a brilliant scientist, but he was decidedly plain looking and occasionally difficult to understand because he stuttered.

The current movie deviates from Turing’s life in how it portrays his technical contribution. This kind of deviation, of course, is the one that most angers engineers and scientists. We not only want recognition from the public for all that we have done to create the modern age, but also want that recognition to be honestly presented. We don’t want credit for things we have not done, but we do want every bit of acclaim for those accomplishments that are truly ours.

The major misrepresentation in “The Imitation Game” is that it portrays Turing as an electrical engineer. It shows him not only designing the logical operation of the machine, which he did, but also designing its electrical workings and even rolling up his sleeves to help build the first one. It is a dramatic scene but not a true one. The electrical engineering was actually done by others.

Turing was, of course, a mathematician specializing in logic. He had written one of his two most important papers, “On Computable Numbers, With an Application to the Entscheidungsproblem,” before he had begun graduate study at Princeton University in 1936. This paper, which has become one of the foundations of theoretical computer science, introduced one of the key concepts that we associate with him, the Turing Machine.

The other misrepresentations in “The Imitation Game” are minor and seem to have been done to make the narrative stronger. As with many dramas based on historical events, the movie reorders events, combines characters, invents characters, and foreshadows events in ways that could not possibly have happened in real life. Drama carries a burden. It must be interesting and engaging. When crafting a drama, we are always willing to sacrifice truth if we can make the story a little more compelling. Though, of course, we hope that we have to make such sacrifices only rarely.

Yet, drama is not the only thing that has used Turing’s story in its own way. In the computing field, we have used Turing’s life to create a history for our field, a history that tends to favor one kind of contribution and downplay others.

For a decade after his death in 1954, Turing was rarely mentioned in the computing literature. He became more visible in 1966, when the ACM decided to name its major award after him, and Herman Goldstine identified him as the founder of computer science in his 1972 book, The Computer from Pascal to von Neumann.  Goldstine argued that the ideas of modern computing had originated with Turing and been absorbed by von Neumann. Von Neumann had not only read Turing’s 1936 paper, but also talked with Turing when the young Englishman was in graduate school. At the time, von Neumann had an office in the Princeton mathematics building. “I am personally certain that it was this that made the development of the stored program computer so simple and painless,” Goldstine wrote. “Many of the ingredients were there in von Neumann’s capacious memory” when the time came to develop automatic computing.
Many scholars have objected to Goldstine’s portrayal and have noted that computing has strong engineering connections as well. Many of the early computing pioneers were not trying to build on an elegant mathematical theory of computing but were trying to solve practical problems with the best means available. Indeed, some of the early computing pioneers were so angered by Goldstine’s portrayal that they could not speak peaceably about the origin of the computer. Even today, you can meet relatives, colleagues, or students of those early computer scientists who are convinced that the early story of computing is badly biased.  

Still, we cannot deny that Turing is indeed an important figure in computing. Not only did he give us his 1936 paper that described the Turing Machine, he also wrote a second important paper in 1950, “Computing Machinery and Intelligence,” that gave us the standard for judging whether a computer program expresses intelligence. We have come to call that standard “the Turing Test.” In the paper itself, however, Turing called the test “the imitation game,” the same name as the recent movie.

But unlike the recent movie, we cannot make Turing both a mathematician and an engineer. He was a mathematician first and foremost. His contributions to the engineering of computing are far smaller than the two prominent ideas he left us for the mathematical foundation of computing. There are plenty of gifted engineers who contributed their talents to computing: Konrad Zuse, Howard Aiken, J. Presper Eckert, Sergey Lebedev. We need to honor them but not confuse them with mathematicians.

Of course, computing has proven to be a remarkably adaptable endeavor and has been able to absorb ideas from almost every field that it has touched. The richness of this heritage has sometimes made it difficult to identify the originators of some ideas and harder still to portray their contributions to the public. Still, we as computer scientists seem to feel a need to have the public recognize our work. We seem to feel this need more every day, as new ideas in computing seem to come less and less from computing professionals, and more and more from business people and entrepreneurs. The public may never see the story that we believe to be the true state of affairs. But we can at least hope that they will see a story that is indistinguishable from the truth or perhaps distinguishable in only the most minor of ways.


About David Alan Grier

David Alan Grier is a writer and scholar on computing technologies and was President of the IEEE Computer Society in 2013. He writes for Computer magazine. He has served as editor in chief of IEEE Annals of the History of Computing, as chair of the Magazine Operations Committee, and as an editorial board member of Computer. Grier formerly wrote the monthly column “The Known World.” He is an associate professor of science and technology policy at George Washington University in Washington, DC, with a particular interest in policy regarding digital technology and professional societies.