Visualizing Large, Heterogeneous Data in Hybrid-Reality Environments
July-Aug. 2013 (vol. 33 no. 4)
pp. 38-48
Khairi Reda, University of Illinois at Chicago
Alessandro Febretti, University of Illinois at Chicago
Aaron Knoll, University of Texas at Austin
Jillian Aurisano, University of Illinois at Chicago
Jason Leigh, University of Illinois at Chicago
Andrew Johnson, University of Illinois at Chicago
Michael E. Papka, Argonne National Laboratory
Mark Hereld, Argonne National Laboratory
Constructing integrative visualizations that simultaneously cater to a variety of data types is challenging. Hybrid-reality environments blur the line between virtual environments and tiled display walls. They incorporate high-resolution, stereoscopic displays, which can be used to juxtapose large, heterogeneous datasets while providing a range of naturalistic interaction schemes. They thus empower designers to construct integrative visualizations that more effectively mash up 2D, 3D, temporal, and multivariate datasets.
Index Terms:
Data visualization, Stereo image processing, Monitoring, Visual analytics, Navigation, Three-dimensional displays, Computer graphics, Educational institutions, 3D visualization, large high-resolution displays, integrative visualization, immersive visualization, hybrid-reality environments
Khairi Reda, Alessandro Febretti, Aaron Knoll, Jillian Aurisano, Jason Leigh, Andrew Johnson, Michael E. Papka, Mark Hereld, "Visualizing Large, Heterogeneous Data in Hybrid-Reality Environments," IEEE Computer Graphics and Applications, vol. 33, no. 4, pp. 38-48, July-Aug. 2013, doi:10.1109/MCG.2013.37