Issue No. 2, March/April 2001 (vol. 3)
In a conventional computer, information is represented by quantities that obey the laws of classical physics, such as the voltage levels in a logic circuit. But as microelectronic devices shrink, quantum physics becomes increasingly important.
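The contrast above can be made concrete: a classical bit holds one definite value, while a qubit is described by two complex amplitudes whose squared magnitudes give measurement probabilities. The following is an illustrative sketch only (not drawn from the article), using standard quantum-state conventions:

```python
import math

# A classical bit is one of two definite values, e.g. a voltage level
# read as 0 or 1.
classical_bit = 0

# A qubit, by contrast, is described by two complex amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1. Measuring it yields
# 0 with probability |alpha|^2 and 1 with probability |beta|^2.
# Equal superposition: both outcomes are equally likely.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

p0 = abs(alpha) ** 2  # probability of measuring 0 (approx. 0.5)
p1 = abs(beta) ** 2   # probability of measuring 1 (approx. 0.5)

print(p0, p1)
```

Until measured, the qubit carries both amplitudes at once, which is the property quantum computation exploits.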
R. Hughes, "Guest Editor's Introduction: Quantum Computation," Computing in Science & Engineering, vol. 3, no. 2, pp. 26, Mar./Apr. 2001.