Semantic Layers for Illustrative Volume Rendering
November/December 2007 (vol. 13 no. 6)
pp. 1336-1343
Direct volume rendering techniques map volumetric attributes (e.g., density, gradient magnitude) to visual styles. Commonly this mapping is specified by a transfer function. The specification of transfer functions is a complex task and requires expert knowledge about the underlying rendering technique. In the case of multiple volumetric attributes and multiple visual styles, the specification of the multi-dimensional transfer function becomes even more challenging and non-intuitive. We present a novel methodology for specifying a mapping from several volumetric attributes to multiple illustrative visual styles. We introduce semantic layers that allow a domain expert to specify the mapping in the natural language of the domain. A semantic layer defines the mapping of volumetric attributes to one visual style. Volumetric attributes and visual styles are represented as fuzzy sets. The mapping is specified by rules that are evaluated with fuzzy logic arithmetic. The user specifies the fuzzy sets and the rules without special knowledge about the underlying rendering technique. Semantic layers allow for a linguistic specification of the mapping from attributes to visual styles, replacing the traditional transfer function specification.
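The core idea of evaluating linguistic rules with fuzzy logic can be sketched as follows. This is a minimal illustration of the general technique, not the paper's exact formulation: the membership functions, thresholds, and the rule wording are all illustrative assumptions, and the min t-norm is only one common choice for fuzzy AND.

```python
def ramp_up(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def ramp_down(x, a, b):
    """Membership falling linearly from 1 at a to 0 at b."""
    return 1.0 - ramp_up(x, a, b)

# Hypothetical linguistic variables over attributes normalized to [0, 1].
def density_high(d):
    return ramp_up(d, 0.4, 0.8)

def gradient_low(g):
    return ramp_down(g, 0.2, 0.6)

def evaluate_rule(density, gradient):
    """Rule 'IF density is high AND gradient magnitude is low THEN apply
    style X': the antecedent is evaluated with the min t-norm, and the
    result is the degree to which the visual style applies at this voxel."""
    return min(density_high(density), gradient_low(gradient))

# A voxel with high density and low gradient magnitude fully activates
# the style; a medium-density voxel activates it only partially.
print(evaluate_rule(0.9, 0.1))  # 1.0
print(evaluate_rule(0.5, 0.1))  # 0.25
print(evaluate_rule(0.2, 0.1))  # 0.0
```

In a full system each semantic layer would aggregate the activations of all its rules (e.g., with max) and use the resulting membership degree to blend its visual style into the rendering, so that the user only ever works with the linguistic rules.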


Index Terms:
Illustrative Visualization, Focus+Context Techniques, Volume Visualization
Citation:
Peter Rautek, Stefan Bruckner, Eduard Gröller, "Semantic Layers for Illustrative Volume Rendering," IEEE Transactions on Visualization and Computer Graphics, vol. 13, no. 6, pp. 1336-1343, Nov.-Dec. 2007, doi:10.1109/TVCG.2007.70591