Issue No. 2, March/April 2001 (vol. 3)
In a conventional computer, information is represented by quantities that obey the laws of classical physics, such as the voltage levels in a logic circuit. But as microelectronic devices continue to shrink, quantum effects become increasingly important.
Richard Hughes, "Guest Editor's Introduction: Quantum Computation," Computing in Science & Engineering, vol. 3, no. 2, p. 26, March/April 2001, doi:10.1109/MCISE.2001.908998.