Total Tomography
March/April 2009 (Vol. 11, No. 2) pp. 12–13
1521-9615/09/$31.00 © 2009 IEEE

Published by the IEEE Computer Society
Michael Jay Schillaci
Although written by and primarily for experts in the field of discrete tomography (DT) and its applications, Advances in Discrete Tomography and Its Applications offers students, programmers, scientists, and engineers alike a wealth of well-presented theoretical and practical material. Gabor T. Herman is a prolific author and pioneer in the computerized tomography (CT) field and is recognized worldwide as an authority on medical image reconstruction techniques. Attila Kuba's vast contributions to the DT field include publishing original manuscripts, producing software packages—Idicon for the efficient conversion of Interfile images and the Discrete Reconstruction Techniques toolkit for DT reconstruction—and chairing or organizing several summer school programs and conferences on DT, medical imaging, and signal processing.
The current volume is part of Springer's Applied and Numerical Harmonic Analysis series, which seeks to highlight "the intricate and fundamental relationship between harmonic analysis and fields such as signal processing, partial differential equations, and image processing." DT is the process of reconstructing an image from a set of projections (that is, a set of measurements along a given direction) of that image. A cursory view of the extensive bibliography shows that DT has substantive applications in areas ranging from industry (detection of coking on turbine blades) to medicine (3D reconstruction of cardiac vessel structures).
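The projections the review mentions are easy to make concrete. For a binary image, the horizontal and vertical projections are simply the row and column sums; the sketch below (illustrative only, not code from the book) computes both:

```python
# Minimal sketch of discrete-tomography "projections": for a binary
# image, the horizontal and vertical projections are the row sums and
# column sums, respectively. Names here are illustrative assumptions.

def projections(image):
    """Return (row_sums, col_sums) of a binary image given as a
    list of equal-length rows of 0s and 1s."""
    rows = [sum(r) for r in image]
    cols = [sum(c) for c in zip(*image)]
    return rows, cols

# A 3x3 binary image and its two orthogonal projections.
img = [[1, 0, 1],
       [0, 1, 0],
       [1, 1, 0]]
print(projections(img))  # ([2, 1, 2], [2, 2, 1])
```

The reconstruction problem DT studies is the inverse: recover `img` (or any image consistent with the data) given only these sums, possibly along additional directions.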
Despite the sometimes dense mathematical content, the selected contributions are clear and complete. The editors wisely break the text into three sections—foundational material, reconstruction algorithms, and applications—all covered with equal depth and breadth. The text's exercises, exhaustive solutions, and expert remarks also make it a suitable reference for a special topics class. Guided by well-crafted chapter summaries, motivated students and professionals will find the text a welcome handbook and resource.
Although the text covers familiar concepts such as connectedness, convexity, and directedness, its claim to be a self-contained volume is stretched at times. In particular, some readers might find the treatment of finding components somewhat brief. More often—as the inclusion of a proof for the correctness and complexity measures illustrates—the coverage goes beyond the mundane. The text discusses important constructs such as decomposability at depth and gives reconstruction algorithm examples using multiple sets that have the same horizontal and vertical projections. The pertinent comments included by the author-experts add to the text's quality and readability and help underscore DT's applied nature. Care is also taken to point to relevant historical reviews and seminal articles in which the contact between disciplines plays a role in understanding.
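The ambiguity behind those reconstruction examples is worth seeing directly. Two distinct binary images can share identical horizontal and vertical projections (a "switching component"); the following sketch, with illustrative helper names of my own, exhibits the smallest such pair:

```python
# Two distinct binary images with identical horizontal and vertical
# projections -- the classic 2x2 "switching component" that makes DT
# reconstruction non-unique. Sketch only; names are not from the book.

def projections(image):
    """Row sums and column sums of a 0/1 image."""
    return [sum(r) for r in image], [sum(c) for c in zip(*image)]

A = [[1, 0],
     [0, 1]]
B = [[0, 1],
     [1, 0]]

assert A != B
assert projections(A) == projections(B)  # both ([1, 1], [1, 1])
```

Any image containing such a sub-pattern can have it "switched" without changing either projection, which is why additional directions or prior knowledge are needed to pin down a unique reconstruction.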
The implementation of DT is perhaps best refined in its application to electron microscopy. Because measuring the crystal lattice destroys some of its structure, microscopy provides a particularly stringent "control" in which to test DT algorithms. As no "closed and explicit forms" for the maximum a posteriori (MAP) or marginal posterior mode (MPM) estimators are known, the authors rightly employ a Markov chain Monte Carlo (MCMC) method. Within a hierarchical model consisting of label images, gray-value images, and measurements, the method avoids degradation by modeling the underlying distribution of label images. The authors carry out optimization via the Metropolis algorithm and provide extensive implementation notes.
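To make the Metropolis idea concrete, here is a toy sampler in the same spirit: it explores binary images by single-pixel flips, favoring states whose projections match the measured row and column sums. The energy function, temperature parameter, and all names are my illustrative assumptions; the authors' actual model works on a richer hierarchy of label and gray-value images:

```python
# Toy Metropolis sampler for binary-image reconstruction from row and
# column projections. Illustrative sketch only -- the energy and the
# parameters (beta, steps) are assumptions, not the chapter's model.
import math
import random

def energy(image, rows, cols):
    """Squared mismatch between the image's projections and the data."""
    r = [sum(x) for x in image]
    c = [sum(x) for x in zip(*image)]
    return (sum((a - b) ** 2 for a, b in zip(r, rows))
            + sum((a - b) ** 2 for a, b in zip(c, cols)))

def metropolis(rows, cols, steps=5000, beta=2.0, seed=0):
    """Random single-pixel flips; keep the best-matching image seen."""
    rng = random.Random(seed)
    n, m = len(rows), len(cols)
    img = [[0] * m for _ in range(n)]
    e = energy(img, rows, cols)
    best, best_e = [row[:] for row in img], e
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(m)
        img[i][j] ^= 1                       # propose one pixel flip
        e_new = energy(img, rows, cols)
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            e = e_new                        # accept the move
            if e < best_e:
                best, best_e = [row[:] for row in img], e
        else:
            img[i][j] ^= 1                   # reject: undo the flip
    return best, best_e

best, mismatch = metropolis([2, 1, 2], [2, 2, 1])
```

Occasionally accepting uphill moves (the `math.exp` branch) is what lets the chain escape local minima such as switching components, at the cost of needing many iterations.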
With material from the forefront of industrial nondestructive testing (NDT) applications, which combine CAD and DT data, the text provides a detailed analysis of measurement degradation by limited view angles and subsequent scattering. The text also extends pixel-based object reconstruction methods and compares them to filtered back-projection (FBP) results, as well as demonstrating that DT might help improve existing and emerging technologies.
Many of the lessons gleaned earlier in the text are brought to bear in the last chapter, in which the authors demonstrate that linear programming (LP) methods common in DT can be used to improve on CT. This change in perspective is significant. For practical purposes (that is, CPU time), CT has historically used analytic and iterative methods for reconstruction. Whereas "a typical LP problem has 15,000 constraints and 11,000 variables for a reconstruction with 16 projections on a 64 × 64 matrix," all of the iterative methods "have been improved by adding the possibility of taking into account information in addition to the measurements … [and] … interaction with external data is modeled mathematically with Bayes' theorem." Hence, an implementation of CT using a linear Dirac model in which the contribution of the points is weighted by the total number of points in the digital line will significantly decrease image acquisition time. Moreover, in testing the models, the authors vary the optimization techniques and cutoff frequency and put forth a clear agenda for improved image quality. Results such as these are of particular relevance to x-ray CT, which can be limited by x-ray penetration.
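The scale of the quoted LP is easier to appreciate by building the constraint system for a small grid. In the minimal core of such a formulation, each pixel becomes a variable in [0, 1] and each measured projection contributes one equality constraint; the sketch below (my own illustrative setup, not the book's full model, which adds further variables and constraints) constructs that 0/1 constraint matrix for row and column projections:

```python
# Sketch of how DT reconstruction becomes a linear program: pixel
# (i, j) of an n x m grid is variable i*m + j, and each row or column
# projection yields one 0/1 equality-constraint row. Illustrative
# assumption only; the chapter's LP models are considerably larger.

def lp_constraints(n, m):
    """Constraint matrix A for the row and column sums of an n x m grid."""
    A = []
    for i in range(n):                     # one constraint per row sum
        A.append([1 if k // m == i else 0 for k in range(n * m)])
    for j in range(m):                     # one constraint per column sum
        A.append([1 if k % m == j else 0 for k in range(n * m)])
    return A

A = lp_constraints(4, 4)
print(len(A), len(A[0]))  # 8 16: 8 constraints over 16 pixel variables
```

Even this bare two-direction core grows quickly: a 64 × 64 grid already gives 4,096 variables, and each additional projection direction adds a new band of constraints, which is how the quoted figures of thousands of constraints and variables arise.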
Bringing together an enormous amount of theoretical and practical knowledge from DT applications, Advances in Discrete Tomography and Its Applications fulfills its commitment to report on the field's growth. Surely, students, programmers, scientists, and engineers from medicine, industry, and research will find something valuable in this total tomography handbook.
Michael Jay Schillaci is a lecturer in the department of physics at Rochester Institute of Technology and is an independent computing consultant. His teaching and research interests include computational physics and multidisciplinary curriculum development, as well as models of human cognition using magnetic resonance imaging (MRI) and electroencephalographic (EEG) data. Schillaci has a PhD in physics and was formerly managing director of the McCausland Center for Brain Imaging at the University of South Carolina. He's a member of the American Physical Society (DAMOP, DCOMP) and the Cognitive Neurosciences Society. Contact him at