May-June 2014 (vol. 34 no. 3)
pp. 20-21
Andy Wilson, Microsoft Research
Hrvoje Benko, Microsoft Research
Advances in sensing technology are poised to spark the next shift in human-computer interaction, liberating users from the 2D plane of interaction currently supported by the mouse and touchscreen. New sensors can read users' shape and motion as they move about in three dimensions. What signal-processing algorithms and interaction models are appropriate for this mode of interaction above and beyond the screen? How can we use the more detailed, nuanced information made available by new sensors to enable more expressive interfaces, going beyond what a mouse can do but preserving its familiar predictability? The five articles in this issue deal with these questions, covering the spectrum from specialized sensing hardware to high-level interaction models, across multiple physical scales and applications.
Index Terms:
Special issues and sections, haptic interfaces, user interfaces, solid modeling, software development, tablet computers, sensors, stylus input, multitouch interaction, spatial interfaces, human-computer interaction, computer graphics, multimedia, graphics
Citation:
Andy Wilson, Hrvoje Benko, "Interacting above and beyond the Display [Guest editorial]," IEEE Computer Graphics and Applications, vol. 34, no. 3, pp. 20-21, May-June 2014, doi:10.1109/MCG.2014.54