JULY-SEPTEMBER 2005 (Vol. 27, No. 3) pp. 93-95
1058-6180/05/$31.00 © 2005 IEEE
Published by the IEEE Computer Society
Stuart M. Shieber, ed., The Turing Test: Verbal Behavior as the Hallmark of Intelligence, Bradford Books, 2004, 336 pp., $35, ISBN 0-262-69293-7.
As Daniel Dennett notes in his essay in this volume, Alan Turing meant his famous test to forestall the very discussion it provoked. On the premise that we judge intelligence in humans through their conversational behavior, Turing proposed that a machine capable of carrying on a conversation indistinguishable from that of a human would be intelligent—that is, for all intents and purposes be thinking. The test of its capability would be an "imitation game," in which a human judge failed, say, 30 percent of the time to tell the difference between a computer and a human. Such a test would, Turing felt, avoid the difficulties of deciding whether machines can think on the basis of competing definitions of "machine" and "think." Whatever Turing's intention, his test gave rise to a continuing debate about just what it would mean for a machine to pass it or indeed whether in principle such a machine could be built.
With this book, Shieber, a specialist in computation and language at Harvard, offers a selection of the major contributions to that debate published since the 1950s, which he links by short, informative commentaries on the argumentative strategies they pursue.
The volume begins with a historical perspective provided by selections from the writings of René Descartes and Julien Offray de La Mettrie, both of whom took language as the mark of intelligence—the former to argue that humans were more than machines, the latter to argue that humans were just machines. Whatever Turing's private thoughts about the nature of humans, his test kept them distinct. The machine in question was to be a digital computer, which, being universal, could emulate any other machine. Whether humans were machines, or the brain a computer, was left open.
Shieber also gives Turing himself ample opportunity to explain his test. In addition to the original paper, "Computing Machinery and Intelligence," he includes two other pieces and the transcript of a British Broadcasting Corporation panel discussion for Turing's further thoughts on his test and its implications.
Shieber characterizes the major line of critiques of the test in terms of the wedge and the spark. The wedge refers to arguments that undermine the test as a sufficient condition of intelligence. That is, passing Turing's test is not enough to warrant thinking, as one can specify a machine that would pass it and yet demonstrably not be thinking. Shieber's examples include Keith Gunderson's toe-stepping rocks device, John Searle's "Chinese Room," and Ned Block's "Aunt Bertha." In each case, the machine is missing something essential to thinking: generality (Gunderson), intentionality (Searle), and "richness of information processing" (Block). In a bow to Frankenstein, Shieber refers to this something as the spark, which he characterizes as a "phlogistic substance" (p. 148), thus revealing his own view of this line of argument.
Another line, perhaps more sympathetic to Turing's overall point, questions the test as a necessary condition of intelligence. Essays by Richard Purtill, P.H. Millar, and Robert M. French argue that in essence it demands either too much intelligence or the wrong kind, thus excluding animal or extraterrestrial intelligence. French, for example, points to the complexly layered subcognition underlying human conversation and arising from the peculiarly human experience of the world.
Shieber leaves the main defense of Turing's test to Daniel C. Dennett and the closing word to Noam Chomsky in a hitherto unpublished essay from 2002. For Dennett, the arguments just mentioned reinforce its sufficiency as a criterion of thinking. Precisely because our use of language is so complex, it encompasses any other cognitive skill we might set as a condition. It is, he asserts, "a surefire, almost-guaranteed-to-be-fail-safe test of thinking by computer" (p. 287). Whether such a machine can be built is an empirical question that remains open.
If Dennett finds it ironic that the test triggered the discussion Turing sought to avert, Chomsky finds the discussion to be as futile as Turing feared it would be. Turing proposed two lines of research for the computer, which "have proven to be eminently worth pursuing" (p. 320): to expand the capabilities of machines and to explore "the intellectual capacities of a man" (p. 318). The first, says Chomsky, is uncontroversial, the second "a more complex affair, though of a kind that is familiar in the sciences" (p. 318). Although simulations often prove useful in such investigations, Chomsky does not see why Turing's test has any special value for Turing's agenda, either as a guide or a test of progress. Given the contents of this collection, that judgment seems unlikely to end the discussion.
Michael S. Mahoney
Arthur Porter, So Many Hills to Climb, The Beckham Publication Group, 2004, 385 pp., $19.95, ISBN 0-931761-08-5.
In a well-known picture that first appeared in Meccano Magazine in 1934, Douglas Hartree of Manchester University is standing behind a model of a differential analyzer made from Meccano parts. One of Hartree's students, Arthur Porter, is operating the machine. Two years later, Porter received a PhD for constructing a large-scale version of this machine and using it to solve a number of problems in science and technology. The account of Porter's work that appeared in IEEE Annals (vol. 25, no. 2) is an excerpt from the book I am reviewing here.
So Many Hills to Climb describes Porter's full and varied life, which began in England's Lake District—the home of William Wordsworth and Beatrix Potter—and continued peripatetically through government, business, and academia in England, the US, and Canada. After completing his doctorate, Porter held a two-year Commonwealth Fund Fellowship at the Massachusetts Institute of Technology.
Upon his return to England, he spent the war years with the Admiralty Research Laboratories. His professional career after the war consisted of a sequence of senior academic and industrial appointments at the Military College of Science in Shrivenham; Ferranti Electric in Toronto; the Imperial College of Science and Technology in London; the University of Saskatchewan in Saskatoon, where he was dean of engineering; and the University of Toronto, where he was chairman of the newly created Department of Industrial Engineering. Throughout all this work, he also found time to serve on numerous university committees and government commissions. His many honors include Fellow of the Royal Society of Canada and Officer of the Order of Canada.
So Many Hills to Climb gives an engrossing account of the rich personal and professional life of a prominent engineer, scientist, and administrator who was a personal friend or acquaintance of an impressive number of distinguished individuals including, by his own count, 21 Nobel Laureates. IEEE Annals readers will be interested in the chapter on his Manchester experiences, the technical aspects of which are described in the earlier Annals article. I was greatly entertained by the chapter "Saskatchewan—Rhapsody in Academe," which gives an account of the Porter family's adjustment to the academic culture of a small city on the Canadian prairies.
Arthur Porter's wife, Patricia, figures prominently in this book. She has been his companion and helpmate throughout their 63 years of marriage, which has included moves between three countries and almost three dozen houses. They now live in the retirement community of Bermuda Village in Advance, North Carolina. Now 95 years old, Porter still finds time amidst the joys of family, friends, and neighbors to think about technical problems that interest him.
University of Alberta
I. Bernard Cohen, The Triumph of Numbers: How They Shaped Modern Life, W.W. Norton, 2005, 199 pp., $24.95, ISBN 0-393-05769-0.
Imagine, if you can, a world without numbers. It would be difficult but possible. How would you market a can of black beans without the 50 individual numbers (not just digits) that I found on the label? Do you think it would be possible to hold an election without exit polls or to educate the young without Scholastic Aptitude Test numbers? Could your vital signs be encapsulated without any numerical parameters? A Swiss group recently won the famous America's Cup Yacht Race by using intensive aerohydroelastic modeling and state-of-the-art computer simulations. Do you think that the performance of the beautiful Viking longboats could have been improved by access to these algorithms?
Yes, our world is increasingly mathematized, and the future promises to see mathematics increasingly used in practically every aspect of our lives. The employment of numbers is an old story. A cuneiformist I'm acquainted with has written about the supplies and prices of beans and onions in ancient Babylon: "Through all of recorded history, every organized society or system of government has relied on numbers in some way; [but] no systematic analyses of these numbers occurred until well into the seventeenth century, the age of the Scientific Revolution." Yet, the tremendous acceleration of such use is fairly recent.
I. Bernard Cohen, a reputable historian of science, a student of George Sarton, and founder of the History of Science Department at Harvard, has given us, as his last (alas) book, the story of how the collection of data—epitomized by numbers—has entered into civilized life. Proceeding from antiquity to about 1900, the book concentrates on the "social physics" and "political arithmetic" that blossomed in the 18th and 19th centuries. Cohen gives us a collection of short and eminently readable snapshots of the process.
For example, Stephen Hales (1677–1761) "became the first person in history to measure the phenomenon we know as root pressure, i.e., sap in vines. His concern with numbers led him to be the first person to measure blood pressure in animals." Sir John Sinclair (1754–1835) collected and compiled vast amounts of social data in Scotland and was one of the first people to use the word statistics. Thomas Jefferson (1743–1826) suggested an algorithm for the reapportionment of representatives after a new national census. Antoine-Laurent Lavoisier (1743–1794), known primarily as the founding father of modern chemistry, undertook a census of the land in France that was under cultivation for farm production. P.C.A. Louis (1787–1872), a physician, described "the stages of disease and therapeutic outcomes in terms of numbers and not merely as a set of verbal descriptions."
And the brilliant Adolphe Quetelet (1796–1874), mathematician, astronomer, and statistician, "is held to be the founder of quantitative social science." Quetelet was an early discoverer of the stability of social statistics—particularly crime rates—and he wondered about individual responsibility for crime. Where is our vaunted free will? "Society," Quetelet wrote, "prepares the crime, and the guilty person is only the instrument by which it is executed." Quetelet's Treatise on Man contains tables of numerical data concerning every possible characteristic of life and social behavior. He created the notion of the "average man" (a problematic concept), an idea that has played a role even in certain judicial decisions. Quetelet's work was instrumental in the development of the statistical point of view in theoretical physics, as James Clerk Maxwell's work in statistical mechanics demonstrates.
Although I had read Lytton Strachey's 1918 satirical biographies in Eminent Victorians, I never knew that Florence Nightingale (1820–1910) had been a mathematics student of James Joseph Sylvester, or that her compilation of numbers led to sanitary and hospital reforms.
These are only a few of the names we find in this slim volume. For me, reading this book—which you can do in one sitting—was like eating salted peanuts: I wished for more. My interest whetted, I resolved to look into a few of Cohen's references.1
The books on the history of mathematics that I keep close at hand emphasize developments in pure mathematics and give applications a subsidiary role. If they mention applications, much more space is devoted to mathematical physics than to social physics, which gets at most a line or a footnote. I suppose that the reason for this neglect by the historians of mathematics is that social or political physics is more iffy, more contentious than mathematical physics. Then there are the naysayers—for example, Cohen mentions Thomas Carlyle and Charles Dickens. The latter ridiculed the newly formed British Association for the Advancement of Science, and in his novel Hard Times, Mr. Gradgrind, a man of calculations and of objective realities, becomes famous for his assertion, "Now what I want is facts. We want nothing but facts."
Yes, the world has become increasingly mathematized, and we might wonder why. Numbers have the power to organize, condense, epitomize, describe, prescribe, and occasionally, predict. In fact, complex social and ethical questions seem much easier to deal with when we reduce them to numbers, and numbers lend the cachet of objectivity in a way that mere prose does not. I.B. Cohen's book is a good introduction to a mode of thinking and acting that has now gone far beyond the simple data that his heroes compiled, interpreted, and moralized about.
Philip J. Davis
References and notes