<p><b>Abstract</b>—We developed a three-dimensional (3D) digitized atlas of the human brain to visualize spatially complex structures. It was designed for use with magnetic resonance (MR) imaging data sets. Thus far, we have used this atlas for surgical planning, model-driven segmentation, and teaching. We used a combination of automated and supervised segmentation methods to define regions of interest based on neuroanatomical knowledge. We also used 3D surface rendering techniques to create a brain atlas that would allow us to visualize complex 3D brain structures. We further linked this information to script files in order to preserve both spatial information and neuroanatomical knowledge. We present here the application of the atlas to visualization in surgical planning, to model-driven segmentation, and to the teaching of neuroanatomy. This digitized human brain has the potential to provide important reference information for the planning of surgical procedures. It can also serve as a powerful teaching tool, since spatial relationships among neuroanatomical structures can be more readily envisioned when the user is able to view and rotate the structures in 3D space. Moreover, each element of the brain atlas is associated with a name tag, displayed by a user-controlled pointer. The atlas holds major promise as a template for model-driven segmentation. Using this technique, many regions of interest can be characterized simultaneously on new brain images.</p>
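The abstract notes that each element of the atlas is associated with a name tag displayed by a user-controlled pointer. A minimal sketch of how such a lookup can work, assuming the atlas is stored as a label volume (each voxel holds an integer region-of-interest ID) with a separate table mapping IDs to neuroanatomical names; the IDs, names, and data below are illustrative, not the authors' actual atlas:

```python
# Hypothetical lookup table: ROI id -> neuroanatomical name tag.
LABEL_NAMES = {
    0: "background",
    1: "left hippocampus",
    2: "right amygdala",
}

# Tiny illustrative 2x2x2 label volume: label_volume[z][y][x] -> ROI id.
label_volume = [
    [[0, 1], [1, 1]],
    [[0, 2], [2, 0]],
]

def name_at(pointer, volume, names):
    """Return the name tag for the structure under an (x, y, z) pointer."""
    x, y, z = pointer
    label = volume[z][y][x]
    return names.get(label, "unknown")

print(name_at((1, 0, 0), label_volume, LABEL_NAMES))  # -> left hippocampus
```

The same label volume can serve as the template for model-driven segmentation: after registering it to a new MR data set, every voxel of the new image inherits a candidate ROI label, so many regions of interest are characterized simultaneously.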
Index Terms—Brain atlas, magnetic resonance imaging (MRI), 3D visualization, 3D surface rendering, biomedical visualization.
Ron Kikinis, Robert M. Donnino, Hiroto H. Hokama, Chiara M. Portas, Ferenc A. Jolesz, Martha E. Shenton, David Metcalf, Andre Robatino, Robert W. McCarley, Dan V. Iosifescu, Pairash Saiviroonporn, Cynthia G. Wible, "A Digital Brain Atlas for Surgical Planning, Model-Driven Segmentation, and Teaching", IEEE Transactions on Visualization & Computer Graphics, vol. 2, no. , pp. 232-241, September 1996, doi:10.1109/2945.537306