How does the robot feel?

Perception of valence and arousal in emotional body language displayed by a humanoid robot.

Mina Marmpena, Angelica Lim, Torbjørn S. Dahl

Full paper: https://www.degruyter.com/view/j/pjbr.2018.9.issue-1/pjbr-2018-0012/pjbr-2018-0012.xml?format=INT

Abstract: Human-robot interaction in social robotics applications could be greatly enhanced by robotic behaviors that incorporate emotional body language. Using as our starting point a set of pre-designed, emotion-conveying animations created by professional animators for the Pepper robot, we explore how humans perceive their affective content, and we increase their usability by annotating them with reliable labels of valence and arousal on a continuous interval scale. We conducted an experiment in which 20 participants were presented with the animations and rated them in the two-dimensional affect space. An inter-rater reliability analysis was applied to support the aggregation of the ratings into the final labels. The resulting set of emotional body language animations, labeled with valence and arousal, is available and can serve other researchers as a ground truth for behavioral experiments on robotic expression of emotion, or for the automatic selection of robotic emotional behaviors with respect to valence and arousal. To further utilize the collected data, we analyzed it with an exploratory approach and present some trends in the human perception of Pepper's emotional body language that may be worth further investigation.
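The aggregation step described in the abstract — checking inter-rater reliability before averaging per-animation ratings into a single valence (or arousal) label — can be sketched roughly as follows. The specific reliability coefficient (Cronbach's alpha here), the 0.7 acceptability threshold, and the toy rating matrix are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Inter-rater consistency for a (n_animations, n_raters) score matrix."""
    r = np.asarray(ratings, dtype=float)
    k = r.shape[1]                              # number of raters
    rater_vars = r.var(axis=0, ddof=1)          # sample variance of each rater's scores
    total_var = r.sum(axis=1).var(ddof=1)       # variance of per-animation rating sums
    return k / (k - 1) * (1.0 - rater_vars.sum() / total_var)

# Hypothetical valence ratings: 4 animations x 3 raters on a [-1, 1] scale.
valence = np.array([
    [-0.8, -0.7, -0.9],
    [-0.2, -0.1, -0.3],
    [ 0.3,  0.5,  0.2],
    [ 0.8,  0.9,  0.7],
])

alpha = cronbach_alpha(valence)     # should be acceptably high before trusting the mean
labels = valence.mean(axis=1)       # aggregated per-animation valence labels
```

The same computation would be repeated independently for the arousal dimension; a low coefficient would indicate that raters disagree too much for a simple mean to serve as a reliable label.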