In the past half century, driven by phenomenal advances in microelectronics closely following Moore’s law, computers of different kinds, forms, and shapes have evolved, and they have redefined and transformed almost everything we deal with. However, they still function on the same fundamental computational principles that Charles Babbage and Alan Turing envisaged and that John von Neumann and others subsequently refined. What’s in store for the next 50 years? Do the fundamental principles and assumptions that define modern computing — and that have guided us so far — require revolutionary rethinking? How might — and should — computing advance to address unmet demands and current and emerging challenges?
Demands on computing, storage, and communication will continue to escalate. We’ll use computers for newer applications and computationally more difficult problems that we still haven’t addressed satisfactorily. More people, even those at the bottom of the economic pyramid who haven’t yet benefitted from IT, and almost everything, including objects, animals, and buildings, will eventually use and rely on computers in some form.
Unfortunately, digital computing based on silicon and the conventional architecture is reaching its limits owing to fundamental physical limits, economic considerations, and reliability issues. It also struggles to address certain kinds of problems in domains such as weather forecasting, bioinformatics, robotics, and autonomous systems.
Thus, we must examine and advance new approaches that might seem radical or even difficult to realize. Researchers and industry are thinking differently about computational principles and pursuing new paradigms such as quantum computing, biologically inspired computing, and nanocomputing. We might soon need to embrace such approaches, particularly for some of the emerging applications. So, we need to understand the principles and potential of these paradigms and be aware of their current status and future prospects.
Quantum Computing
Quantum computers, which represent a new generation of computing based on quantum mechanics rather than conventional electronics, seem increasingly viable. Quantum computers based on quantum bits (qubits) promise dramatic speedups for certain classes of problems. However, renewed interest in quantum computing has only just begun, and researchers have demonstrated only a few of its potential applications. Nevertheless, we might well have a general-purpose quantum computer in 10 to 15 years.
Even so, conventional computers with quantum components aimed at optimization problems are already available from companies such as D-Wave Systems. According to researchers, by leveraging quantum entanglement, quantum computers could greatly speed up certain machine-learning tasks. In some cases, they could reduce computing time from hundreds of thousands of years to mere seconds. For more information, see the “Additional Resources” sidebar below.
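To make the qubit idea concrete: a qubit holds two complex amplitudes rather than a single 0 or 1, and measuring it yields each outcome with probability equal to the squared amplitude. The following pure-Python sketch — an illustrative classical simulation, not an example from any vendor’s toolkit — applies a Hadamard gate to put a qubit into an equal superposition:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
ket0 = (1.0 + 0j, 0.0 + 0j)  # the classical bit 0

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to (|0> + |1>) / sqrt(2)."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

def measure_probs(state):
    """Born rule: the probability of each outcome is the squared amplitude."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

superposed = hadamard(ket0)
print(measure_probs(superposed))  # each outcome is (approximately) equally likely
```

Note that simulating n qubits classically requires tracking 2^n amplitudes, which is precisely why conventional machines struggle to emulate large quantum systems — and why native quantum hardware is attractive.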
Biologically Inspired Computing
Biologically inspired designs adopt nature as a source of analogies or inspiration. This concept, also called biomimicry, biomimetics, or bioinspiration, has inspired many innovators and designers and resulted in innovations including Velcro, “cat’s eye” retroreflective road markers, fast swimsuits, aircraft, windmill turbine blades, and high-speed trains. Biologically inspired computing has produced such early results as genetic algorithms and neural and sensor networks. As the theme articles in this issue illustrate, we can draw on nature to address IT security, deal with ecological problems, and enhance machine intelligence.
Biological developments exist even at the device level. DNA is a dense, stable information medium, and advances in DNA synthesis and sequencing have made it an increasingly feasible high-density digital storage medium. According to a recent article, the entirety of human knowledge might soon be stored in a few kilograms of DNA, and in the future we might even store data in our skin.
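The storage idea rests on a simple observation: DNA’s four bases (A, C, G, T) can each encode two bits, so one byte maps to four nucleotides. This hypothetical Python sketch shows one naive mapping; real DNA storage systems additionally use error-correcting codes and avoid long runs of identical bases:

```python
# Each DNA base encodes two bits: a purely illustrative mapping.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)  # "CAGACGGC"
```

At two bits per base and roughly 650 daltons per base pair, this kind of density is what makes the “human knowledge in a few kilograms” claim plausible in principle.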
For further information on this topic, see the “Additional Resources” sidebar below.
Nanocomputing
Emerging nanotechnologies are squeezing more transistors into silicon chips by reducing their physical dimensions to a few nanometers. Such devices can bring millionfold increases in computing power while consuming far less power per function. Nanocomputing technology could revolutionize how we build and use computers. However, to realize these benefits, research must make major progress in device technology, computer architectures, and integrated-circuit processing, while addressing dependability challenges.
In This Issue
This issue features five articles exploring the foundations of the three approaches I just described.
In “Beyond Bits: The Future of Quantum Information Processing,” Andrew Steane and Eleanor Rieffel outline how quantum physics offers powerful methods of encoding and manipulating information that are impossible in the classical framework. They highlight applications of these methods in secure-key distribution for cryptography, rapid integer factoring, and quantum simulation.
In “Biologically Inspired Design: A New Program for Computational Sustainability,” Ashok Goel discusses how “biologically inspired design encourages designers to view traditional problems from novel perspectives” and “biological analogues could also help designers spawn new problem spaces, which might lead to the invention of new technologies.” He illustrates these observations with examples from ecology and outlines biologically inspired design concepts and their benefits.
Securing IT systems and applications is an ongoing problem, particularly in the context of increasing threats and sophisticated attacks on them. Can we apply insights from nature to IT security? Wojciech Mazurczyk and Elżbieta Rzeszutko address this question in “Security — A Perpetual War: Lessons from Nature.” They argue that although a direct connection between IT security and patterns in nature isn’t readily apparent, many analogies exist between these two vastly different domains. For instance, botnets, distributed denial-of-service attacks, intrusion detection and prevention systems, and other techniques use strategies closely resembling actions by certain species. They analyze these analogies and recommend that the IT security community turn to nature to search for new offenses and defenses for securing our cyber world.
As we look forward to autonomous smart systems, the potential exists to apply biological intelligence to enhance machine intelligence. In “The Convergence of Machine and Biological Intelligence,” Zhaohui Wu and several well-known experts discuss the opportunities for and challenges in leveraging both machine and biological intelligence, each of which has its own merits and can augment the other. They propose an architecture for a teachable “child machine,” a biological–machine system with human-like cognitive architectures for perception, cognition, and action. They also outline the challenges and trends in brain–machine interfaces.
Finally, in “Nanocomputing: Small Devices, Large Dependability Challenges,” Jean Arlat, Zbigniew Kalbarczyk, and Takashi Nanya briefly examine dependability and security challenges in the evolution of silicon technologies toward nanoscale dimensions. They offer perspectives on how researchers might overcome these challenges.
Understanding, mastering, and applying these and other emerging radical approaches is critical to steering the future of computing. I hope this issue inspires new computing paradigms and encourages researchers and developers from multiple disciplines to learn from each other and advance computing. Several questions, however, remain. How can we effectively address the challenges these paradigms pose? Will they prove viable and evolve into next-generation computers? Will they have transformational impacts, as conventional computing systems have, and improve the world?
These paradigms and ongoing initiatives by research institutions and companies such as D-Wave Systems, Google, IBM, and HP could change the computing landscape yet again. In embarking on this journey, however, we should recognize that computing is a means to an end rather than the end itself.
To stay abreast of current and future developments, stay tuned to upcoming issues of Computer, such as the special issues on Rebooting Computing in December 2015 and Emerging Computing Paradigms in September 2016. If you wish to contribute an article to these issues, contact me at san[at]computer[dot]org.
Finally, I invite you to join the conversation about next-gen computing paradigms that matter and share your thoughts in the comments section below or by emailing us.
S. Murugesan, “Radical Next-Gen Computing,” Computing Now, vol. 8, no. 6, June 2015, IEEE Computer Society [online]; http://www.computer.org/web/computingnow/archive/radical-next-gen-computing-june-2015.
San Murugesan is the editor in chief of IT Professional, the director of BRITE Professional Services, and an adjunct professor at the University of Western Sydney. He’s a corporate trainer, a consultant, a researcher, and an author. He’s the coeditor of the Encyclopedia of Cloud Computing (Wiley, 2015; https://sites.google.com/site/encyclopediaofcloudcomputing) and Harnessing Green IT: Principles and Practices (Wiley, 2012; http://eu.wiley.com/WileyCDA/WileyTitle/productCd-1119970059.html), the Handbook of Research on Web 2.0, 3.0, and X.0: Technologies, Business, and Social Applications (IGI-Global, 2009), and Web Engineering: Managing Diversity and Complexity of Web Application Development (Springer, 2001). Murugesan is on the Computer editorial board and edits its bimonthly column, Cloud Cover. He’s a fellow of the Australian Computer Society and the Institution of Electronics and Telecommunication Engineers, as well as a distinguished visitor of the IEEE Computer Society. Contact him via email (san[at]computer[dot]org), Twitter (@santweets), LinkedIn (http://tinyurl.com/sanlinks), or his website (http://tinyurl.com/sanbio).
- “The End of Moore’s Law,” The Economist, 10 Apr. 2015; www.economist.com/blogs/economist-explains/2015/04/economist-explains-17.
- “When Silicon Leaves the Valley,” The Economist, 6 Mar. 2014; www.economist.com/news/technology-quarterly/21598327-semiconductors-it-becomes-harder-cram-more-transistors-slice.
- C.C. Lo and J.J.L. Morton, “Will Silicon Save Quantum Computing?,” IEEE Spectrum, 31 July 2014; http://spectrum.ieee.org/semiconductors/materials/will-silicon-save-quantum-computing.
- R. Van Meter and C. Horsman, “A Blueprint for Building a Quantum Computer,” Comm. ACM, vol. 56, no. 10, 2013, pp. 84–93; http://cacm.acm.org/magazines/2013/10/168172-a-blueprint-for-building-a-quantum-computer/fulltext.
- W. Bourne, “D-Wave’s Dream Machine,” Inc., 19 May 2015; www.inc.com/will-bourne/d-waves-dream-machine.html.
- “Your Cognitive Future: How Next-Gen Computing Changes the Way We Live and Work,” IBM, 2015; www-935.ibm.com/services/us/gbs/thoughtleadership/cognitivefuture.
- Quantum Computing Primer, D-Wave Systems, 27 May 2015; www.dwavesys.com/tutorials/background-reading-series/quantum-computing-primer.
- “Quantum Computing Explained: An Introduction;” www.youtube.com/watch?v=owqBTgm6NXE.
- “Introduction to Quantum Computers;” www.youtube.com/watch?v=Fb3gn5GsvRk.
- “Quantum Computer in a Nutshell;” www.youtube.com/watch?v=0dXNmbiGPS4.
- “How Quantum Computing Will Change The World!;” www.youtube.com/watch?v=3BKIJCTLy-s.
- D.S. Modha, “Brain-Inspired Computing: A Decade-Long Journey;” www.youtube.com/watch?v=qE4kQh_30bA.
- L. Smarr, “Brain Inspired Computing;” www.youtube.com/watch?v=mrWzWPcI6jU.
- H. Markram, “Future Computing: Brain-Based Chips;” www.youtube.com/watch?v=PCql2DgW5sE.
- “From BrainScaleS to Human Brain Project: Neuromorphic Computing Coming of Age;” www.youtube.com/watch?v=g-ybKtY1quU.
- “IBM SyNAPSE Deep Dive Part 1;” www.youtube.com/watch?v=tAtmNYBObkw.