Displaying 1-8 out of 8 total
Some Implications of Eye Gaze Behavior and Perception for the Design of Immersive Telecommunication Systems
Found in: Distributed Simulation and Real Time Applications, IEEE/ACM International Symposium on
By John P. Rae, William Steptoe, David J. Roberts
Issue Date: September 2011
pp. 108-114
A feature of standard video-mediated communication (VMC) systems is that participants see into each other's spaces from the viewpoint of a camera. Consequently, participants' capacity to use the spatially-based resources that exist in co-located settings (...
Eye Tracking for Avatar Eye Gaze Control During Object-Focused Multiparty Interaction in Immersive Collaborative Virtual Environments
Found in: Virtual Reality Conference, IEEE
By William Steptoe, Oyewole Oyekoya, Alessio Murgia, Robin Wolff, John Rae, Estefania Guimaraes, David Roberts, Anthony Steed
Issue Date: March 2009
pp. 83-90
In face-to-face collaboration, eye gaze is used both as a bidirectional signal to monitor and indicate focus of attention and action, as well as a resource to manage the interaction. In remote interaction supported by Immersive Collaborative Virtual Enviro...
A Tool for Replay and Analysis of Gaze-Enhanced Multiparty Sessions Captured in Immersive Collaborative Environments
Found in: Distributed Simulation and Real Time Applications, IEEE/ACM International Symposium on
By Alessio Murgia, Robin Wolff, William Steptoe, Paul Sharkey, David Roberts, Estefania Guimaraes, Anthony Steed, John Rae
Issue Date: October 2008
pp. 252-258
A desktop tool for replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE™-like environments, creating a multiparty collaborative virtual space where avatars are animated with 3D gaze as well as h...
Supporting interoperability and presence awareness in collaborative mixed reality environments
Found in: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology (VRST '13)
By Angelika Peer, Anthony Steed, Benjamin Cohen, Franco Tecchia, Laith Alkurdi, Oyewole Oyekoya, Ran Stone, Stefan Klare, Tim Weyrich, William Steptoe
Issue Date: October 2013
pp. 165-174
In the BEAMING project we have been extending the scope of collaborative mixed reality to include the representation of users in multiple modalities, including augmented reality, situated displays and robots. A single user (a visitor) uses a high-end virtu...
Panoinserts: mobile spatial teleconferencing
Found in: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13)
By Anthony Steed, Fabian Wanner, Fabrizio Pece, Jan Kautz, Simon Julier, Tim Weyrich, William Steptoe
Issue Date: April 2013
pp. 1319-1328
We present PanoInserts: a novel teleconferencing system that uses smartphone cameras to create a surround representation of meeting places. We take a static panoramic image of a location into which we insert live videos from smartphones. We use a combinati...
Lie tracking: social presence, truth and deception in avatar-mediated telecommunication
Found in: Proceedings of the 28th international conference on Human factors in computing systems (CHI '10)
By Aitor Rovira, Anthony Steed, John Rae, William Steptoe
Issue Date: April 2010
pp. 1039-1048
The success of visual telecommunication systems depends on their ability to transmit and display users' natural nonverbal behavior. While video-mediated communication (VMC) is the most widely used form of interpersonal remote interaction, avatar-mediated c...
A saliency-based method of simulating visual attention in virtual scenes
Found in: Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology (VRST '09)
By Anthony Steed, Oyewole Oyekoya, William Steptoe
Issue Date: November 2009
pp. 199-206
Complex interactions occur in virtual reality systems, requiring the modelling of next-generation attention models to obtain believable virtual human animations. This paper presents a saliency model that is neither domain nor task specific, which is used t...
Eye-tracking for avatar eye-gaze and interactional analysis in immersive collaborative virtual environments
Found in: Proceedings of the ACM 2008 conference on Computer supported cooperative work (CSCW '08)
By Alessio Murgia, Anthony Steed, David Roberts, Estefania Guimaraes, John Rae, Paul Sharkey, Robin Wolff, William Steptoe
Issue Date: November 2008
pp. 21-27
Participants' eye-gaze is generally not captured or represented in immersive collaborative virtual environment (ICVE) systems. We present EyeCVE, which uses mobile eye-trackers to drive the gaze of each participant's virtual avatar, thus supporting remote ...