Vol. 18, Issue No. 02, February 2012
ISSN: 1077-2626
pp: 299-308
Wu Shin-Ting , University of Campinas, Campinas
Clarissa Lin Yasuda , University of Campinas, Campinas
Fernando Cendes , University of Campinas, Campinas
Curvilinear reformatting of 3D magnetic resonance imaging data has been recognized by the medical community as a helpful noninvasive tool for displaying the cerebral anatomy. It consists of automatically creating, with respect to a reference surface, a series of equidistant curvilinear slices at progressively deeper cuts. Compared with planar slices, it allows more precise localization of lesions and identification of subtle structural abnormalities. However, current curvilinear reformatting tools either rely on time-consuming manual delineation of guiding curves on 2D slices or require costly automatic brain segmentation procedures. In addition, they strip away the skin and skull, preventing a precise topographic correlation between the location of a brain lesion and the skin surface. This impairs the planning of craniotomy for neurosurgery and of the appropriate implantation of electrodes for intracranial electroencephalography in presurgical evaluation. In this work, we present a novel approach based on direct manipulation of the visualized volume data. By using a 3D painting metaphor, the reference surface can be defined incrementally, according to the principle that the user interacts with what she/he sees. In response, an animation of the reformatting process is displayed. The focus of this paper is a new volume tagging algorithm behind the user interactions. It works at an interactive frame rate on current graphics hardware.
Curvilinear reformatting, volume clipping and tagging, neurological diagnosis and surgical planning, 3D interaction.
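The core idea of curvilinear reformatting, a family of slices at equal distances from a reference surface, can be illustrated with a minimal sketch. The function name, the brute-force nearest-surface-voxel distance, and the toy volume below are illustrative assumptions; this is not the authors' GPU-based volume tagging algorithm, only the equidistant-shell principle it realizes.

```python
import math

def curvilinear_shells(shape, surface_voxels, spacing):
    """Tag each voxel of a small volume with the index of the
    equidistant curvilinear shell it falls in, measured as the
    Euclidean distance to the nearest reference-surface voxel.
    Brute force; only practical for tiny illustrative volumes."""
    nx, ny, nz = shape
    tags = {}
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                # Distance to the closest voxel on the painted surface
                d = min(math.dist((x, y, z), s) for s in surface_voxels)
                tags[(x, y, z)] = int(d // spacing)  # shell index
    return tags

# Toy volume: the reference "surface" is the plane z = 0,
# so a voxel at depth z lands in shell z.
shape = (4, 4, 4)
surface = [(x, y, 0) for x in range(4) for y in range(4)]
tags = curvilinear_shells(shape, surface, spacing=1.0)
assert tags[(2, 2, 3)] == 3
assert tags[(1, 1, 0)] == 0
```

For a curved, user-painted reference surface the same binning yields curved shells that follow the cortical anatomy; the paper's contribution is doing this tagging interactively on the GPU rather than by exhaustive distance computation.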

C. L. Yasuda, F. Cendes and W. Shin-Ting, "Interactive Curvilinear Reformatting in Native Space," in IEEE Transactions on Visualization & Computer Graphics, vol. 18, no. 2, pp. 299-308, Feb. 2012.