2017 14th Conference on Computer and Robot Vision (CRV) (2017)
Edmonton, AB, Canada
May 16, 2017 to May 19, 2017
ISBN: 978-1-5386-2818-8
pp: 307-313
ABSTRACT
We present a novel user interface for aiming and launching flying robots on user-defined trajectories. The method requires no user instrumentation and is easy to learn by analogy to a slingshot. With a few minutes of practice users can send robots along a desired 3D trajectory and place them in 3D space, including at high altitude and beyond line-of-sight. With the robot hovering in front of the user, the robot tracks the user's face to estimate its relative pose. The azimuth, elevation and distance of this pose control the parameters of the robot's subsequent trajectory. The user triggers the robot to fly the trajectory by making a distinct pre-trained facial expression. We propose three different trajectory types for different applications: straight-line, parabola, and circling. We also describe a simple training/startup interaction to select a trajectory type and train the aiming and triggering faces. In real-world experiments we demonstrate and evaluate the method. We also show that the face-recognition system is resistant to input from unauthorized users.
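The abstract describes mapping the azimuth, elevation and distance of the user's face, as seen from the hovering robot, to the parameters of one of three launch trajectories. As a rough illustration of how such a mapping could be structured (this is not the authors' published implementation; the function names, gains, and exact geometry below are assumptions for illustration only), a minimal Python sketch:

```python
import math
from dataclasses import dataclass

# Illustrative sketch only: the paper does not publish this code. The gains and
# the exact pose-to-parameter geometry are assumptions chosen to mirror the
# abstract's description (face azimuth, elevation, and distance select the
# parameters of the robot's subsequent trajectory).

@dataclass
class FacePose:
    azimuth: float    # radians, bearing of the user's face from the robot
    elevation: float  # radians, elevation of the face relative to the robot
    distance: float   # metres, face-to-robot range

def straight_line_goal(pose: FacePose, gain: float = 10.0):
    """Slingshot analogy: fly away from the face, farther when the user
    stands farther back. Returns one (x, y, z) goal in the robot frame."""
    r = gain * pose.distance
    x = r * math.cos(pose.elevation) * math.cos(pose.azimuth)
    y = r * math.cos(pose.elevation) * math.sin(pose.azimuth)
    z = r * math.sin(pose.elevation)
    return (x, y, z)

def parabola_waypoints(pose: FacePose, gain: float = 10.0, steps: int = 20):
    """Ballistic-style arc whose range and apex grow with distance/elevation."""
    rng = gain * pose.distance
    apex = rng * math.tan(max(pose.elevation, 0.05)) / 2.0
    pts = []
    for i in range(steps + 1):
        t = i / steps
        d = t * rng
        z = 4.0 * apex * t * (1.0 - t)  # simple symmetric parabola
        pts.append((d * math.cos(pose.azimuth), d * math.sin(pose.azimuth), z))
    return pts

def circle_waypoints(pose: FacePose, gain: float = 5.0, steps: int = 36):
    """Orbit at a radius set by the aiming distance and a height set by elevation."""
    radius = gain * pose.distance
    height = radius * math.tan(pose.elevation)
    return [(radius * math.cos(2 * math.pi * i / steps),
             radius * math.sin(2 * math.pi * i / steps),
             height) for i in range(steps)]

if __name__ == "__main__":
    # Example: user aims from 2 m away, slightly left of and below the robot.
    aim = FacePose(azimuth=math.radians(15), elevation=math.radians(20), distance=2.0)
    print("straight-line goal:", straight_line_goal(aim))
    print("parabola apex point:", parabola_waypoints(aim)[10])
    print("first circle point:", circle_waypoints(aim)[0])
```

In the paper's interaction, a trajectory of the selected type would be generated from the aiming pose and executed only when the pre-trained triggering expression is detected.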
INDEX TERMS
autonomous aerial vehicles, face recognition, human-robot interaction, mobile robots, pose estimation, robot vision, trajectory control
CITATION

J. Bruce, J. Perron and R. Vaughan, "Ready—Aim—Fly! Hands-Free Face-Based HRI for 3D Trajectory Control of UAVs," 2017 14th Conference on Computer and Robot Vision (CRV), Edmonton, AB, Canada, 2017, pp. 307-313.
doi:10.1109/CRV.2017.39