Dartmouth College
Center for Computing Sciences
Pages: pp. 40-41
In 1900, David Hilbert presented 23 problems at the International Congress of Mathematicians held in Paris. Hilbert's problems spanned the spectrum from the rather trivial (Problem 3, on the equality of the volumes of two tetrahedra) to the probably impossible (Problem 6, on the axiomatization of physics). Nonetheless, they challenged many of the leading scientists and mathematicians of the 20th century and set the tone for mathematical research, especially in the early part of the century.
In Hilbert's day, there was much enthusiasm and optimism about the impact formal mathematical methods could have on science. The previous two centuries had witnessed unprecedented progress in mathematics and physics—Newton, Gauss, Cauchy, Euler, Maxwell, and Poincaré, to name but a few, introduced and explored fundamental new concepts that changed the way the world was viewed and the tools that scientists used. The axiomatization of science and mathematics was a steamroller then, brought to a grinding halt only in the 1930s by Gödel's results on the incompleteness of any axiomatic system sufficient to express simple arithmetic.
Today, much of our scientific enthusiasm and optimism lies in the power of computing, so it is wholly appropriate that this issue's theme articles address some of the computational challenges facing science and engineering in the 21st century. The main challenges deal with complexity in various guises—although we can build increasingly powerful machines, some problems still remain out of reach because of their intrinsic complexity.
Anthony Guttmann writes about fundamental conjectures in statistical mechanics and combinatorics that have no formal proofs even though significant numerical experimental data support them. This raises questions of what we are willing to accept as proof and whether theorem-proving techniques will become powerful enough to resolve such conjectures in the future.
Jonathan and Peter Borwein remind us that computing should provide insight and not necessarily precise quantitative results. In the absence of a resolution of the P = NP question, they argue that we need to think of computing as a vehicle for better understanding. This requires better integration of the various tools we use for formulating and analyzing complex mathematical and scientific questions.
Martin Haugh and Andrew Lo describe a natural class of problems in computational finance that are plagued by the "curse of dimensionality" and that currently remain outside the domain of practical solution. Their examples illustrate the hard fact that complexity is intrinsic not only to the natural and mathematical world but also to the world we have engineered and now must cope with daily.
These articles and others scheduled to appear throughout the year will refocus enthusiasm for computing by demonstrating that because of complexity, computing might only point the way to an answer—not provide the answer itself. Let us propose another challenge for computing in the 21st century.
Comparing the growth in computing power (as captured by Moore's law, for example) with the growth in human performance (as measured by standardized test scores or even Olympic sports records) shows that although human performance is flattening, machines are becoming increasingly more powerful. Maybe the two curves have already crossed in some areas, such as chess playing. The biggest challenge is that posed by a future in which computing power goes significantly beyond human neural processing power. Bill Joy articulated a view of that future in what is now dubbed the "Joy Hypothesis" (see the sidebar). Perhaps we could simply state the scientific challenge in this direction as "How can we build computer systems that go beyond the limitations of human thinking and yet still understand what they are doing?" Put another way, will computers still respect us in the 22nd century?
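The comparison of the two growth curves can be made concrete with a toy model. All parameters below are illustrative assumptions, not measured data: machine capability is modeled as a Moore's-law exponential (doubling every two years), and human performance as a logistic curve flattening toward a fixed ceiling, so that the two curves must eventually cross.

```python
import math

def machine_power(year, base_year=1970, doubling_years=2.0):
    """Toy Moore's-law curve: capability doubles every two years (illustrative)."""
    return 2 ** ((year - base_year) / doubling_years)

def human_performance(year, midpoint=1990, ceiling=100.0, rate=0.05):
    """Toy flattening curve: logistic growth toward a fixed ceiling (illustrative)."""
    return ceiling / (1 + math.exp(-rate * (year - midpoint)))

# Find the first year in which the machine curve overtakes the human curve.
crossover = next(y for y in range(1970, 2100)
                 if machine_power(y) > human_performance(y))
```

The crossover year that falls out of this sketch depends entirely on the made-up parameters; the point is only the qualitative shape of the argument — an exponential always overtakes a curve that saturates.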