Richard Sear, Frost and Sullivan
Abstract—Sentient tools—powered by incredible advances in artificial intelligence, deep learning, and data mining—represent the next stage of intelligent, aware, and social machines designed specifically to work with people. The Web extra at https://youtu.be/XtMFZ0sDzIk is an audio recording in which Science Fiction Prototyping editor Brian David Johnson talks with Richard Sear, vice president of consulting at Frost and Sullivan, about the next stage of intelligent, aware, and social machines designed specifically to work with people.
Keywords—Science Fiction Prototyping; AI; artificial intelligence; sentient tools; human-computer interaction
Over the years I've chronicled how different people use science fiction prototyping (SFP) to develop products and projects that stretch far beyond traditional engineering. More recently I've begun exploring work in which SFP is integral to future development and will be required to realize future technologies. This month's column delves into the coming age of sentient tools.
Significant technological advances as well as economic and cultural shifts are bringing about a new age of intelligent tools. These tools will be aware of their surroundings and able to make sense of and adapt to them. But, more than that, these tools will have a social awareness of the people using them. Sentient tools are the next phase in the evolution of computational systems and intelligent environments, building on advances in computational, sensing, and communications technology over the last 50 years.
Designed to work with people, these intelligent, aware, and social machines won't have a human level of consciousness and won't mimic, mirror, or replace humans or human interaction. They'll be designed to work with the human labor force to perform vastly complex tasks. Powered by incredible advances in artificial intelligence, deep learning, and data mining, these tools exemplify the class of technologies that we'll employ in the wake of autonomous vehicles and smart cities. Their applications will significantly affect education, the global workforce, and virtually all industries.
Truly understanding sentient tools' impact will require nearly unprecedented use of SFP. SFP will help us not only understand these tools' human, cultural, legal, and ethical effects but also expand the details of these possible futures. It's the social aspect of sentient tools that requires SFP. In fact, science fiction already has a long history of exploring sentient machines' effect on humans.
In science fiction, the narrative of machine sentience usually doesn't turn out well for the humans. The typical story unfolds this way: a person desires to make a sentient machine that's aware and, ultimately, smarter than humans. Despite being seen as crazy, the person perseveres and succeeds, usually at the cost of everything near and dear to them. But when this sentient machine comes to life, it decides to kill its creator and, oftentimes, the entire human race.
Mary Shelley explored this trope with great resonance in her 1818 novel Frankenstein, and her work was followed by an army of other science fiction authors and moviemakers. The narrative doesn't always follow these lines; many other stories have explored sentience in a subtler, sometimes less violent way.
2001: A Space Odyssey, however, isn't one of those stories. The 1968 film, cowritten by Arthur C. Clarke and Stanley Kubrick (who was also the director), explores sentience's darker side. The movie introduced us to HAL, one of the most powerful and enduring images of sentience gone wrong.
The HAL 9000 (Heuristically Programmed Algorithmic Computer) is a sentient computer that controls all the systems on the spaceship Discovery One and interacts with the crew. HAL's capabilities are impressive and diverse, and include speech, speech recognition, facial recognition, natural language processing, lip reading, art appreciation, interpretation of emotional behaviors, automated reasoning, and playing chess.
Events go horribly awry when HAL malfunctions and the crew decides to shut “him” down. HAL tries to protect himself by killing them. In his novel, Clarke explains that the root of the malfunction was the unresolved conflict between HAL's mission and the covert directives requiring him to keep the mission's true purpose a secret. HAL's solution is to kill the crew.
HAL's utterly calm voice is chilling as he meticulously bumps off the crew members. The film does a great job of exposing the humans' vulnerabilities. Because of the deep-space setting, the humans must depend completely on their sentient tool—and when that tool goes crazy, the effect is frightening. In fact, “HAL” has become pop-culture shorthand for technology run amok.
As we moved into the 21st century, a new crop of films and stories imagined a different outcome—that of a future with sentient, superintelligent machines that don't want to kill us.
In 2013, filmmaker Spike Jonze wrote, directed, and produced Her, in which a man develops a loving relationship with Samantha, also known as OS1, the first artificially intelligent OS. The plot is much like a conventional love story—except that one member of the couple is a sentient machine.
In his article “Why Her Is the Best Film of the Year,” Atlantic journalist Christopher Orr wrote: “By the end of the film, the central question Jonze is asking seems no longer even to be whether machines might one day be capable of love. Rather, his film has moved beyond that question to ask one larger still: whether machines might one day be more capable of love … than the human beings who created them.”3
The beauty of Her is that it portrays a very different possible future. It suggests that as sentient tools advance and become increasingly smart, they might not desire to obliterate the human race—they might just break up with us.
The effect of sentient tools has yet to be fully explored. What will it mean to work with tools that can think? How do we train the next generation of laborers to work with tools that can take over increasingly complex tasks? If our machines can think for us, what will we do next? And, probably most important, how should they interact socially with us?
This last question is precisely the area that SFP must explore. SFP can help us determine, among other things, how we can program or train our tools to interact with us and what will be considered appropriate behavior. As we delve into this very real science fiction future, SFP will be increasingly necessary—no other approach can really address the specific details, social situations, and effects.
The coming age of sentient tools is both exciting and challenging. It portends a future at once amazing and incredibly complex. Much remains to be explored about the future of super-smart social tools. But I know for certain that SFP will be key to mapping out this future.