Immersive Group-to-Group Telepresence
April 2013 (vol. 19 no. 4)
pp. 616-625
We present a novel immersive telepresence system that allows distributed groups of users to meet in a shared virtual 3D world. Our approach is based on two coupled projection-based multi-user setups, each providing multiple users with perspectively correct stereoscopic images. At each site, the users and their local interaction space are continuously captured by a cluster of registered depth and color cameras. The captured 3D information is transferred to the other location, where the remote participants are virtually reconstructed. We explore the use of these virtual user representations in various interaction scenarios in which local and remote users are face-to-face, side-by-side, or decoupled. Initial experiments with distributed user groups indicate that pointing and tracing gestures were mutually understood regardless of whether they were performed by local or remote participants. Our users were excited about the new possibilities of jointly exploring a virtual city, where they relied on a world-in-miniature metaphor for mutual awareness of their respective locations.
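The capture pipeline described above relies on registering each depth camera into a shared coordinate frame. As a minimal illustration (not the authors' implementation; all names and values are hypothetical), placing a camera's captured points into the shared world frame amounts to applying a rigid extrinsic transform p' = R·p + t per point:

```python
# Hypothetical sketch: registering a depth camera's local point cloud
# into a shared world frame via a rigid extrinsic calibration (R, t),
# as required when combining a cluster of registered depth cameras.

def transform_points(points, rotation, translation):
    """Apply a rigid transform p' = R*p + t to a list of 3D points."""
    out = []
    for x, y, z in points:
        xr = rotation[0][0]*x + rotation[0][1]*y + rotation[0][2]*z
        yr = rotation[1][0]*x + rotation[1][1]*y + rotation[1][2]*z
        zr = rotation[2][0]*x + rotation[2][1]*y + rotation[2][2]*z
        out.append((xr + translation[0],
                    yr + translation[1],
                    zr + translation[2]))
    return out

# Example: a camera mounted 2 m above the floor with axes aligned to
# the world frame (identity rotation), offset only by its position.
R_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t_camera = (0.0, 2.0, 0.0)
local_points = [(0.5, 0.0, 1.0), (-0.5, 0.0, 1.0)]
world_points = transform_points(local_points, R_identity, t_camera)
```

In a real system the rotation and translation for each camera would come from an offline calibration step, and the transformed clouds from all cameras would be merged before streaming to the remote site.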
Index Terms:
Calibration, Cameras, Servers, Streaming media, Image reconstruction, Image color analysis, Virtual reality, 3D capture, Multi-user virtual reality, Telepresence
S. Beck, A. Kunert, A. Kulik, B. Froehlich, "Immersive Group-to-Group Telepresence," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 4, pp. 616-625, April 2013, doi:10.1109/TVCG.2013.33