Recognition and mapping of facial expressions to avatar by embedded photo reflective sensors in head mounted display
2017 IEEE Virtual Reality (VR) (2017)
Los Angeles, CA, USA
March 18, 2017 to March 22, 2017
Katsuhiro Suzuki , Keio University
Fumihiko Nakamura , Keio University
Jiu Otsuka , Keio University
Katsutoshi Masai , Keio University
Yuta Itoh , Keio University
Yuta Sugiura , Keio University
Maki Sugimoto , Keio University
We propose a facial expression mapping technology between virtual avatars and Head-Mounted Display (HMD) users. HMDs allow people to enjoy an immersive Virtual Reality (VR) experience, in which a virtual avatar can act as the user's representative in the virtual environment. However, synchronizing the virtual avatar's expressions with those of the HMD user is limited. The major problem of wearing an HMD is that it occludes a large portion of the user's face, making facial recognition difficult in an HMD-based virtual environment. To overcome this problem, we propose a facial expression mapping technology using photo-reflective sensors. The sensors attached inside the HMD measure the distance between the sensors and the user's face. The distance values of five basic facial expressions (Neutral, Happy, Angry, Surprised, and Sad) are used to train a neural network that estimates the facial expression of a user. We achieved an overall accuracy of 88% in recognizing the facial expressions. Our system can also reproduce facial expression changes in real time on an existing avatar using regression. Consequently, our system enables estimation and reconstruction of facial expressions that correspond to the user's emotional changes.
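The classification pipeline described above (sensor distance readings in, one of five expression labels out) can be sketched as a small neural network trained with softmax cross-entropy. The sensor count, hidden-layer size, and synthetic training data below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: 16 photo-reflective sensors inside the HMD, each giving a
# distance-proportional reading; 5 target expressions (as in the paper).
N_SENSORS, N_CLASSES, N_HIDDEN = 16, 5, 32
EXPRESSIONS = ["Neutral", "Happy", "Angry", "Surprised", "Sad"]

# Synthetic stand-in data: one cluster of sensor readings per expression.
centers = rng.normal(size=(N_CLASSES, N_SENSORS))
X = np.vstack([c + 0.05 * rng.normal(size=(200, N_SENSORS)) for c in centers])
y = np.repeat(np.arange(N_CLASSES), 200)

# One-hidden-layer network, trained with full-batch gradient descent.
W1 = 0.1 * rng.normal(size=(N_SENSORS, N_HIDDEN)); b1 = np.zeros(N_HIDDEN)
W2 = 0.1 * rng.normal(size=(N_HIDDEN, N_CLASSES)); b2 = np.zeros(N_CLASSES)

def forward(x):
    h = np.tanh(x @ W1 + b1)                     # hidden activations
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)   # softmax probabilities

lr = 0.5
for _ in range(300):
    h, p = forward(X)
    grad = p.copy()                              # d(loss)/d(logits)
    grad[np.arange(len(y)), y] -= 1
    grad /= len(y)
    dW2 = h.T @ grad; db2 = grad.sum(0)
    dh = grad @ W2.T * (1 - h**2)                # tanh backprop
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

def predict(x):
    """Map a batch of sensor-distance vectors to expression names."""
    _, p = forward(np.atleast_2d(x))
    return [EXPRESSIONS[i] for i in p.argmax(axis=1)]
```

On this cleanly separated synthetic data the classifier reaches near-perfect training accuracy; the paper's 88% figure reflects real sensor data and real users, where expression clusters overlap far more.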
Avatars, Face recognition, Face, Neural networks, Optical sensors
K. Suzuki et al., "Recognition and mapping of facial expressions to avatar by embedded photo reflective sensors in head mounted display," 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 2017, pp. 177-185.