Open Access
ARTICLE
Empathic Responses of Behavioral-Synchronization in Human-Agent Interaction
1 Savannah College of Art and Design, Savannah, GA, 31401, USA
2 Sangmyung University, Seoul, 03016, Korea
* Corresponding Author: Sung Park. Email:
(This article belongs to the Special Issue: Deep Vision Architectures and Algorithms for Edge AI Computing)
Computers, Materials & Continua 2022, 71(2), 3761-3784. https://doi.org/10.32604/cmc.2022.023738
Received 19 September 2021; Accepted 20 October 2021; Issue published 07 December 2021
Abstract
Artificial entities, such as virtual agents, have become more pervasive. Their long-term presence among humans requires the virtual agent to express appropriate emotions to elicit the necessary empathy from users. Affective empathy involves behavioral mimicry, a synchronized co-movement between dyadic pairs. However, the characteristics of such synchrony between humans and virtual agents in empathic interactions remain unclear. Our study evaluates participants' behavioral synchronization when a virtual agent exhibits an emotional expression congruent with the emotional context through facial expressions, behavioral gestures, and voice. Participants viewed an emotion-eliciting video stimulus (negative or positive) with a virtual agent. The participants then conversed with the virtual agent about the video, including how they felt about its content. During the dialog, the virtual agent expressed emotions either congruent with the video or neutral. The participants' facial expressions, such as facial expressive intensity and facial muscle movement, were measured during the dialog using a camera. The results showed significant behavioral synchronization (i.e., cosine similarity ≥ .05) in both the negative and positive emotion conditions, evident in the participants' facial mimicry of the virtual agent. Additionally, the participants' facial expressions, in both movement and intensity, were significantly stronger with the emotional virtual agent than with the neutral virtual agent. In particular, we found that the facial muscle intensity of AU45 (Blink) is an effective index for assessing participant synchronization, which differs by the individual's empathic capability (low, mid, high). Based on the results, we suggest an appraisal criterion that provides empirical conditions to validate empathic interaction based on the facial expression measures.
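The synchronization index described above compares the human's and the agent's facial-expression signals with cosine similarity. The following is a minimal sketch of that comparison; the variable names and the per-frame action-unit intensity values are hypothetical illustrations, not data from the study.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length signal vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0  # no expression detected in one signal
    return dot / (norm_u * norm_v)

# Hypothetical per-frame intensities of one action unit (e.g., AU45)
agent_au = [0.1, 0.4, 0.8, 0.6, 0.2]
human_au = [0.0, 0.3, 0.7, 0.7, 0.3]

similarity = cosine_similarity(agent_au, human_au)
print(similarity >= 0.05)  # the paper's synchronization criterion
```

Cosine similarity is scale-invariant, so it captures whether the two expression time courses rise and fall together even if one person emotes less intensely than the other.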
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.