University of Arizona
Pages: 13-14
Computing is pervasive in almost all modern scientific endeavors, but nowhere is the need for computing power and efficient computational techniques more apparent than in the optics and imaging fields. In fact, we've known the fundamental laws governing the behavior of electromagnetic waves, Maxwell's equations, since the mid-1800s, but they are difficult to solve under conditions that deviate from the ideal.
Until the introduction of modern computers and computational techniques, solutions to Maxwell's equations under nonideal conditions were intractable. Researchers had to use assumptions and restrictive boundary conditions to make solutions possible. Although these simplifications (scalar diffraction theory being the best example) have elucidated a variety of physical phenomena, they are, nonetheless, based on assumptions that do not always hold true. Modern computational power makes the search for vector solutions to Maxwell's equations under nonideal conditions more tractable; hence, researchers can study light's behavior in situations that previously were simply not amenable to computation.
A better understanding of the nature of light is just one example of where the marriage of computing and optics works well. Increases in computational power have also had a large impact on the optical engineering field. Since the late 1970s, for example, researchers have used computed tomography to image the human body. Traditionally, such systems use high-energy photons like x-rays and gamma rays because the physics characterizing this radiation's interactions with matter is well understood. Recent advances in the field of optical tomography, however, have led to tomographic imaging systems that use the near-visible spectrum to scan the object being imaged. These advances would not have been possible without modern computer technology.
Recent computer advances have enhanced imaging on both the small scale (such as electron microscopy) and the large scale (such as astronomy). Modern telescopes use a vast amount of computing power just to keep systems running. Some telescopes use deformable mirrors that correct for the negative effects of the Earth's atmosphere. These systems must measure atmospheric distortion and deform the secondary mirror in such a way as to correct for the effect. All of this must be done rapidly because the atmosphere is turbulent and ever changing, a problem that represents an enormous computational burden best solved via efficient, parallel computing techniques.
Finally, one of the most visible areas of computing in optics is that of lens design. Numerous techniques and software packages enable the design of multiple-element optical systems that correct for aberrations and other negative effects while retaining acceptable resolution and sensitivity properties. Computers can measure optical properties of lens systems efficiently and examine many different lens designs to best enable a researcher to build better optical systems.
Almost all modern optical systems designed or optimized via computer techniques rely on an accurate understanding of the forward model; that is, we must simulate the propagation of light from its source to whatever detector system is being used. For systems that image high-energy photons, this modeling might be somewhat straightforward (at least at first glance). Modern Monte Carlo techniques can efficiently model the attenuation, scatter, and absorption of photons traveling through a medium.
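To make the Monte Carlo idea concrete, the sketch below estimates photon transmission through a uniform scattering slab by sampling free paths from the exponential attenuation law and deciding at each interaction whether the photon scatters or is absorbed. This is a minimal illustration, not any particular production code; the parameter names (`mu_t`, `albedo`, the slab geometry) and their values are illustrative assumptions, and the scattering model is simplified to an isotropic direction cosine.

```python
import math
import random

def simulate_photons(n_photons, slab_thickness, mu_t, albedo, rng):
    """Monte Carlo estimate of the fraction of photons transmitted through
    a uniform slab.  mu_t is the total attenuation coefficient (per unit
    length); albedo is the probability that an interaction is a scatter
    rather than an absorption.  Illustrative sketch only."""
    transmitted = 0
    for _ in range(n_photons):
        z, uz = 0.0, 1.0  # depth and direction cosine; photon enters at z=0 heading +z
        while True:
            # Sample a free path length from the exponential attenuation law
            step = -math.log(rng.random() or 1e-12) / mu_t
            z += uz * step
            if z >= slab_thickness:
                transmitted += 1  # photon exits the far face
                break
            if z < 0:
                break  # photon backscattered out the entry face
            if rng.random() > albedo:
                break  # photon absorbed in the medium
            uz = 2.0 * rng.random() - 1.0  # simplified isotropic scatter
    return transmitted / n_photons

frac = simulate_photons(50_000, slab_thickness=1.0, mu_t=2.0,
                        albedo=0.9, rng=random.Random(42))
print(f"estimated transmission: {frac:.3f}")
```

With the albedo set to zero every interaction absorbs, so the estimate should converge to the Beer-Lambert value exp(-mu_t * L), which is a handy sanity check on such simulations.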
However, modeling the forward problem with less-energetic photons might be somewhat more difficult. These systems must account for reflection, refraction, absorption, and other physical phenomena. The steady increase in computational resources (faster computers, more of them, and better algorithms) has led to a steady advance in researchers' ability to model the forward problem for a wide variety of optical systems.
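As a small example of the kind of physics these forward models must track, the sketch below computes the refraction angle from Snell's law and the unpolarized Fresnel power reflectance at a planar dielectric interface. The function name and the sample indices of refraction are illustrative assumptions, not drawn from any specific system described above.

```python
import math

def snell_fresnel(n1, n2, theta_i):
    """Refraction angle (Snell's law) and unpolarized Fresnel power
    reflectance at a planar interface between media of refractive index
    n1 (incident side) and n2.  theta_i is the angle of incidence in
    radians.  Illustrative sketch only."""
    s = (n1 / n2) * math.sin(theta_i)
    if abs(s) > 1.0:
        return None, 1.0  # total internal reflection: all light reflects
    theta_t = math.asin(s)
    ci, ct = math.cos(theta_i), math.cos(theta_t)
    # Fresnel amplitude reflectances for s- and p-polarization
    rs = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)
    rp = (n1 * ct - n2 * ci) / (n1 * ct + n2 * ci)
    R = 0.5 * (rs ** 2 + rp ** 2)  # average the two power reflectances
    return theta_t, R

# Air-to-glass example at 30 degrees incidence
theta_t, R = snell_fresnel(1.0, 1.5, math.radians(30))
print(f"refracted angle: {math.degrees(theta_t):.2f} deg, reflectance: {R:.3f}")
```

At normal incidence this reduces to the familiar ((n1 - n2) / (n1 + n2))^2, about 4 percent for an air-glass boundary, which is a quick way to check the formula.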
With the enhanced understanding of the optical forward problem for many systems comes an enhanced ability to solve the inverse problem. Computational advances, for example, aid in the reconstruction of the basic properties of materials or patients that are investigated using light. Modern medicine, astronomy, and physics have benefited from an enhanced understanding of the optical forward problem and, thus, an enhanced ability to solve the inverse problem. It is my sincere hope that the articles in this issue will elucidate many of the themes currently prevalent in computational optics research and the benefits that such research can give society.
Advancing computational power and resources have led to advances in a wide variety of optical problems. The articles in this special issue represent promising areas of research covering a wide range of computational optics fields. The four articles cover lens design for optical data storage, fundamental electromagnetic theory, diffuse optical tomography of human tissues, and adaptive optics in astronomical applications. Each article focuses on the computational aspects and difficulties associated with the authors' work.
Together, these articles form a nice introduction to many of the complex issues underlying optics research today. However, a comprehensive overview of computing in optics would require far more space than a single issue of CiSE. In particular, the field of quantum optics is not represented here, but it's an exciting area of computational modeling.
This collection of articles is necessarily incomplete, because I chose just one group from each of the fields represented to write an article. Nevertheless, the authors, the CiSE editorial staff, and I have endeavored to make this special issue as interesting and enlightening to the general scientific audience as possible.