IEEE Transactions on Visualization & Computer Graphics, vol. 17, no. 12, Dec. 2011
pp. 1775-1784
Thomas Rydell , Interactive Institute, Norrköping, Sweden
Camilla Forsell , C-research, Linköping University, Sweden
Anders Persson , Center for Medical Image Science and Visualization, Linköping University, Sweden
Anders Ynnerman , C-research, Linköping University, Sweden
ABSTRACT
Medical imaging plays a central role in a vast range of healthcare practices, and the usefulness of 3D visualizations has been demonstrated for many types of treatment planning. Nevertheless, full access to 3D renderings outside of the radiology department is still scarce, even for many image-centric specialties. Our work stems from the hypothesis that this under-utilization is partly due to existing visualization systems not fully accounting for the prerequisites of this application domain. We have developed a medical visualization table intended to better fit the clinical reality. The overall design goals were two-fold: similarity to a real physical situation and a very low learning threshold. This paper describes the development of the visualization table with a focus on key design decisions. The developed features include two novel interaction components for touch tables. A user study involving five orthopedic surgeons demonstrates that the system is appropriate and useful for this application domain.
INDEX TERMS
Medical visualization, multitouch, tabletop display, treatment planning.
CITATION
Thomas Rydell, Camilla Forsell, Anders Persson, Anders Ynnerman, "Multi-Touch Table System for Medical Visualization: Application to Orthopedic Surgery Planning", IEEE Transactions on Visualization & Computer Graphics, vol.17, no. 12, pp. 1775-1784, Dec. 2011, doi:10.1109/TVCG.2011.224