Sung Park1,*, Seongeon Park2, Mincheol Whang2
CMC-Computers, Materials & Continua, Vol.71, No.2, pp. 3761-3784, 2022, DOI:10.32604/cmc.2022.023738
07 December 2021
Abstract Artificial entities, such as virtual agents, have become more pervasive. Their long-term presence among humans requires the virtual agent's ability to express appropriate emotions to elicit the necessary empathy from users. Affective empathy involves behavioral mimicry, a synchronized co-movement between dyadic pairs. However, the characteristics of such synchrony between humans and virtual agents remain unclear in empathic interactions. Our study evaluates participants' behavioral synchronization when a virtual agent exhibits an emotional expression congruent with the emotional context through facial expressions, behavioral gestures, and voice. Participants viewed an emotion-eliciting video stimulus (negative or positive)…