Displaying 1-9 out of 9 total
Haptic Human-Robot Interaction
Found in: IEEE Transactions on Haptics
By Amir Karniel, Angelika Peer, Opher Donchin, Ferdinando A. Mussa-Ivaldi, Gerald E. Loeb
Issue Date: July 2012
pp. 193-195
The eight articles in this special section focus on haptic human-robot interaction. It had its origins in a tournament announced at the 7th Annual Computational Motor Control Workshop held June 2011 at Ben-Gurion University to compare algorithms for handsh...
 
Feature Extraction and Selection for Emotion Recognition from EEG
Found in: IEEE Transactions on Affective Computing
By Robert Jenke, Angelika Peer, Martin Buss
Issue Date: July 2014
pp. 1-1
Emotion recognition from EEG signals allows the direct assessment of the “inner” state of a user, which is considered an important factor in human-machine interaction. Many methods for feature extraction have been studied and the selection of both appropri...
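As a rough illustration of the kind of pipeline this entry refers to, the sketch below computes band-power features from EEG trials and applies a simple univariate feature-selection step. The band limits, sampling rate, and the SelectKBest/f_classif criterion are illustrative assumptions, not the methods compared in the paper, and `trials`/`labels` stand in for a loaded dataset.

```python
# Sketch only: band-power EEG features plus univariate feature selection.
# Band limits, sampling rate, and the selection criterion are assumptions,
# not the specific methods evaluated in the paper.
import numpy as np
from scipy.signal import welch
from sklearn.feature_selection import SelectKBest, f_classif

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz

def band_power_features(eeg, fs=128):
    """eeg: (n_channels, n_samples) array; returns one mean-power feature per band and channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in BANDS.values()]
    return np.concatenate(feats)

# trials: iterable of (n_channels, n_samples) arrays; labels: emotion class per trial (assumed given)
X = np.stack([band_power_features(trial) for trial in trials])
X_selected = SelectKBest(f_classif, k=20).fit_transform(X, labels)
```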
 
A Comparison of Evaluation Measures for Emotion Recognition in Dimensional Space
Found in: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII)
By Robert Jenke, Angelika Peer, Martin Buss
Issue Date: September 2013
pp. 822-826
Emotion recognition from physiological signals like electroencephalography (EEG) can be performed using different underlying emotion models. While dimensional emotion models have recently gained attention, measures to evaluate recognition methods that are ...
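For context, the snippet below shows two measures commonly used to score continuous valence/arousal predictions, root-mean-square error and Pearson correlation. These are generic examples of evaluation measures for dimensional emotion recognition, not necessarily the set compared in the paper.

```python
# Sketch only: two generic measures for continuous (valence/arousal) predictions;
# the measures actually compared in the paper may differ.
import numpy as np

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def pearson_r(y_true, y_pred):
    return float(np.corrcoef(y_true, y_pred)[0, 1])

# Illustrative values on a [-1, 1] valence scale
valence_true = [0.2, -0.5, 0.7, 0.1]
valence_pred = [0.1, -0.3, 0.6, 0.0]
print(rmse(valence_true, valence_pred), pearson_r(valence_true, valence_pred))
```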
 
Performance related energy exchange in haptic human-human interaction in a shared virtual object manipulation task
Found in: World Haptics Conference
By Daniela Feth, Raphaela Groten, Angelika Peer, Sandra Hirche, Martin Buss
Issue Date: March 2009
pp. 338-343
In order to enable intuitive physical interaction with autonomous robots as well as in collaborative multi-user virtual reality and tele-operation systems, a deep understanding of human-human haptic interaction is required. In this paper the effect of hapti...
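A minimal sketch of one way such energy exchange can be quantified: integrating the product of interaction force and velocity over time. The exact energy measure used in the paper, for example how positive and negative energy flow is attributed to each partner, may differ.

```python
# Sketch only: energy transferred through a haptic channel as the time integral
# of interaction force times velocity; the paper's exact energy measure may differ.
import numpy as np

def energy_exchange(force, velocity, dt):
    """force, velocity: 1-D arrays sampled every dt seconds; returns energy in joules."""
    power = np.asarray(force) * np.asarray(velocity)  # instantaneous power f(t) * v(t)
    return float(np.trapz(power, dx=dt))
```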
 
Role determination in human-human interaction
Found in: World Haptics Conference
By Nikolay Stefanov, Angelika Peer, Martin Buss
Issue Date: March 2009
pp. 51-56
Physical human-robot interaction can be significantly improved when being aware about the role each partner takes in a joint manipulation task. This holds especially in computer assisted teleoperation, where depending on the identified role of the human, d...
 
An HMM approach to realistic haptic human-robot interaction
Found in: World Haptics Conference
By Zheng Wang, Angelika Peer, Martin Buss
Issue Date: March 2009
pp. 374-379
A robot controller is developed for human-robot handshaking. The focus of the work is to provide realistic experiences for the human participant in haptic interactions with a robot. To achieve this goal, a position-based admittance controller is implemente...
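The abstract names a position-based admittance controller; the sketch below shows the usual structure of such a law, where the measured interaction force drives a virtual mass-spring-damper whose output is handed to an inner position loop. The parameter values and the explicit-Euler integration are illustrative assumptions, not the controller reported in the paper.

```python
# Sketch only: generic position-based admittance law. Parameters and the
# explicit-Euler integration are illustrative, not the paper's controller.
class AdmittanceController:
    def __init__(self, mass=1.0, damping=20.0, stiffness=50.0, dt=0.001):
        self.m, self.d, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0  # virtual position [m]
        self.v = 0.0  # virtual velocity [m/s]

    def step(self, f_ext, x_des=0.0):
        """f_ext: measured interaction force [N]; returns the reference position [m]."""
        # m*a + d*v + k*(x - x_des) = f_ext, integrated with explicit Euler
        a = (f_ext - self.d * self.v - self.k * (self.x - x_des)) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x  # tracked by the robot's inner position controller
```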
 
Predictability of a Human Partner in a Pursuit Tracking Task without Haptic Feedback
Found in: International Conference on Advances in Computer-Human Interaction
By Raphaela Groten, Jens Hölldampf, Angelika Peer, Martin Buss
Issue Date: February 2009
pp. 63-68
We are interested in whether humans create a model of their partner when they jointly manipulate an object in a virtual task without haptic feedback. In such a scenario the partner is perceived as a disturbance because she/he is responsible for inconsisten...
 
Exploring the Design Space of Haptic Assistants: The Assistance Policy Module
Found in: IEEE Transactions on Haptics
By Carolina Passenberg, Antonia Glaser, Angelika Peer
Issue Date: October 2013
pp. 440-452
Haptic assistants augment user commands to facilitate manipulation and to increase task performance. The strength of assistance, also referred to as assistance level, is one of the main design factors. While existing implementations mainly realize fixed as...
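As a toy illustration of the assistance-level idea mentioned above, the sketch below blends a user command with an assistant command through a weight alpha in [0, 1]. How that weight is chosen or adapted over time is exactly the design space the paper's assistance policy module explores; the function here is a hypothetical placeholder.

```python
# Sketch only: a fixed assistance level alpha blending user and assistant commands.
# How alpha is set or adapted is the design question studied in the paper.
import numpy as np

def assisted_command(u_user, u_assist, alpha):
    """Convex combination of the user's and the assistant's commands (alpha = 0: no assistance)."""
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * np.asarray(u_user, float) + alpha * np.asarray(u_assist, float)
```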
 
Supporting interoperability and presence awareness in collaborative mixed reality environments
Found in: Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology (VRST '13)
By Angelika Peer, Anthony Steed, Benjamin Cohen, Franco Tecchia, Laith Alkurdi, Oyewole Oyekoya, Ran Stone, Stefan Klare, Tim Weyrich, William Steptoe
Issue Date: October 2013
pp. 165-174
In the BEAMING project we have been extending the scope of collaborative mixed reality to include the representation of users in multiple modalities, including augmented reality, situated displays and robots. A single user (a visitor) uses a high-end virtu...