An IEEE Intelligent Systems editorial board member recently said to me that she looked forward to reading my editorial each issue. I was surprised and pleased to realize that I also enjoyed writing them. And now this is my last editorial as editor in chief. After four years I'll be handing the reins to Jim Hendler at the start of 2005. My most immediate reaction is to wonder where the time has gone! All the usual clichés apply—it seems like only yesterday …, the time has flown by …, and so on. It's the kind of thing everyone finds themselves thinking periodically. But it hasn't always been this way.
As children, we all remember when time stretched ahead and seemed both limitless and languid. Whether it was the long summer vacation or a keenly anticipated event, time seemed to have a more leisurely pace. But as we get older, time seems to accelerate. We catch ourselves thinking how a year can have passed so quickly—where has the last decade gone—it seems like only a moment since I was starting high school. Associated with these thoughts is the dawning realization of ratios—the ratio of time lived to time to come, the ratio of time you've been working to the time left until retirement, and even the ratio of your time in a subject area to its age. The subjective sense of time plays other tricks too. I remember clearly as a 15-year-old standing in a small market town in the middle of rural England and impressing on my mind that image and the thought that one day I would look back on this moment and I would be 30 years older and living in the 21st century. It all seemed impossibly remote then, but it seems a moment ago now.
What's responsible for this flight of time, this headlong rush into the future? Is it all in the mind? Situations in which we become mentally absorbed accelerate the flow of time. Perhaps the technology we surround ourselves with is also partly to blame. The pace of events was certainly slower when I was a child—no email or faxes, mobile phones, or round-the-clock news services. Information overload and the constant demand to respond immediately can add to the sense of time becoming a scarce resource. However, I suspect this is part of the human condition. We've been lamenting the rush of time since the beginning of time. In the fifth century BC, Aeschylus wrote, "Time in his aging overtakes all things." Eight hundred years later, we find this in the Haggadah of the Palestinian Talmud: "Would that life were like the shadow cast by a wall or a tree, but it is like the shadow of a bird in flight."
What can we do about it? I can offer several ingenious solutions. In Catch-22, one character's secret of immortality was to contrive to be terminally bored all the time. For him every minute was an age, every hour an eternity. Perhaps that's not so attractive! Other people think life extension is the thing. More life might seem attractive at first blush—as long as it's quality time. As we rush through life, we notice time's effects: not quite as fleet of foot, unable to hear quite as well, can't focus on the fine print. There are those who put their faith in cryonics—who freeze heads and bodies. Others spend lives on vitamin supplements, reduced-calorie diets, and whole panoplies of tonics and treatments, techniques and methods. For some, much of AI's appeal is the hope that it will offer improved quality or even quantity of life—whether it's microrobots scouring your body and performing delicate surgery, cognitive prosthetics to help your memory, or even downloading yourself onto a hard drive for posterity! The attraction of hard drives escapes me, I'm afraid. I have the sneaking suspicion that some people are so preoccupied with extending life that they're no longer living it!
For me, one main reason for researching AI is a fascination with the nature and basis of human experience. Over the past four years in these editorials, I've often harked back to questions that are as much to do with psychology, neuroscience, philosophy, and sociology as with AI. They're questions having to do with memory, learning, language, and social networks. Ours is an interdisciplinary subject—one full of fundamental challenges.
Take the topic of consciousness. What is it that allows me to have first-person experiences as I type this? I have a definite sense of self; I am more than a set of behavioral responses. I am located in my body looking out, making sense of the world around me. I possess hopes and fears, memories and dreams. How is this sense of self supported? Where does it come from?
For many in psychology, philosophy, and AI, this subject is almost taboo. We're making impressive progress understanding many aspects of cognitive behavior. We can locate the region of the brain that gives rise to speech, we can observe the intricacies of visual processing, and we can identify structures associated with planning and reasoning. Many of us are implementing these abilities in software and hardware. But do any of us believe that we're nearer to an understanding of self and self-awareness?
The sense of self for some is all about memory; for others, it's the embedding of a physical sensing body in a richly textured environment. Another school of thought holds that it resides in an understanding of how language works and how the symbols of language acquire meaning. For still others, no amount of technical detail will ever amount to an explanation; the self remains a mystery. Many engaged in AI hold sincere religious convictions that place an explanation of self outside the material world. AI lets us investigate questions of real philosophical depth while building practical applications for immediate problems.
How has the world of AI and intelligent systems changed in the past four years? Of course, the overriding fact is that the world has changed. Pre- and post-9/11 are as significant for our subject (see my editorial "The Shape of Things to Come" in the Sept./Oct. 2001 issue) as anything we've seen in decades. The simple reason is that the applications and challenges have turned very much to issues of national and global security, and to the use of technology for surveillance, data mining, and so on. You'll see this reflected in future issues of Intelligent Systems that will cover homeland security and IS in government.
But security isn't the only global driver of AI. In 1999, a barrel of crude oil cost around US$18 (at today's prices); it currently costs just under $54. This simple fact means that everything from transportation to manufacturing and from health to agriculture needs to run leaner and more intelligently. Oil exploration and extraction, as in the 1980s, are once again a huge market for knowledge-intensive systems. The commodity markets themselves increasingly use agent technology to find the best deals. Environmental analysis calls for distributed intelligent sensor networks. Deployment opportunities for our work are everywhere.
How are we doing in terms of our technological solutions for these wide-ranging and ubiquitous-deployment opportunities? The thing that has continued to deliver is the hardware. Intel's Pentium 4 is now running at around 3 GHz; back at the end of 1999, the Pentium III was achieving 800 MHz. In the memory stakes, in 1999 decent PCs boasted 256 Mbytes of RAM and a 22-Gbyte hard drive—now 1 Gbyte of RAM and a 300-Gbyte hard drive is common. Indeed, you can buy a 1-Tbyte hard drive for around $1,000. These numbers attest to the continued truth of Moore's law.
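As a back-of-the-envelope check on that Moore's law observation, the figures above imply a doubling period for each component. This quick Python sketch computes it from the 1999 and 2004 numbers quoted in the text (the five-year interval and the specific figures are the only inputs):

```python
import math

def doubling_time(old, new, years):
    """Years per doubling implied by growth from `old` to `new` over `years`."""
    return years * math.log(2) / math.log(new / old)

# 1999 vs. 2004 figures quoted above (units cancel in the ratio)
print(f"CPU clock, 800 MHz to 3 GHz:   {doubling_time(800, 3000, 5):.1f} years per doubling")
print(f"RAM, 256 Mbytes to 1 Gbyte:    {doubling_time(256, 1024, 5):.1f} years per doubling")
print(f"Disk, 22 Gbytes to 300 Gbytes: {doubling_time(22, 300, 5):.1f} years per doubling")
```

All three land within hailing distance of the canonical 18-to-24-month doubling period, with disk capacity growing fastest of all.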
The question is, to what extent has hardware, and not our software ingenuity, been responsible for recent developments in AI: grid computing, large-scale data mining, real-time language processing, or intelligent systems for genomics? Is it the smart software or the very, very fast hardware that makes the difference? In my piece "Brute Force and Insight" (Nov./Dec. 2001), I noted how much we owed to these trends in power. What estimate can we make about the contribution of our software solutions? This might be hard to assess quantitatively, but we can see new methods that allow a new kind of IS approach.
These methods include integrating heterogeneous content through ontologies. This is evident in the establishment, over the last few years, of a W3C standard for expressing agreed-upon conceptualizations on the Web. There's an increased emphasis on netcentric solutions that assemble sets of intelligent services to solve particular problems as they arise. We see the increased use of statistical methods to tackle computationally hard problems—methods that remain informed by domain heuristics. Robust robotic platforms are at large in the real world, from unmanned autonomous vehicles to Sony's Aibo.
So, looking into the future, where will the action be—what will my successor observe? I've no doubt that the acceleration of hardware, the proliferation of content, and the increase in bandwidth and miniaturization will continue apace. Things will continue to get smaller and yet more powerful. As early as next year, some mobile phones will contain 2 Gbytes of storage. New paradigms of interaction become possible; new forms of local computation become feasible.
To understand the future, understand your past, or hire a science fiction writer. We've done both of these in our Histories & Futures department. In some ways, not much is new under the sun—from seamless information management as envisaged by Vannevar Bush to robots serving our every need à la Asimov.
Whatever the future holds, what always seems to challenge us are the sociotechnical issues. One of the innovations in this magazine has been our Human-Centered Computing department. The compelling essays in this department show how the success or otherwise of our best technologies derives from unanticipated aspects of the way humans modify, repurpose, or subvert any system.
I'll end my time here by paying tribute to all the IEEE Computer Society staff—those on the masthead and those in the background. Angela Burgess, Dick Price, Crystal Shif, and Dennis, Hilda, Pauline, Dale, Shani, Rebecca, Rita, and Monette—you've all been wonderful. Thanks to all my editorial board and to my Associate Editors in Chief Austin Tate and Bob Laddaga. Thanks to you, the readers—it has been a privilege.