Open Access
ARTICLE
Acknowledge of Emotions for Improving Student-Robot Interaction
1 Department of Electrical and Electronics Engineering, Istanbul Bilgi University, Istanbul, 34060, Turkey
2 Department of Computer Engineering, Istanbul Bilgi University, Istanbul, 34060, Turkey
* Corresponding Author: Tugba Dalyan. Email:
Intelligent Automation & Soft Computing 2023, 37(1), 1209-1224. https://doi.org/10.32604/iasc.2023.030674
Received 30 March 2022; Accepted 25 May 2022; Issue published 29 April 2023
Abstract
Robot companions will soon be part of our everyday life, and students in the engineering faculty must be trained to design, build, and interact with them. The two affordable robots presented in this paper have been designed and constructed by two undergraduate students; one artificial agent is based on the Nvidia Jetson Nano development board and the other on a remote computer system. Moreover, the robots have been refined with an empathetic system to make them more user-friendly. Since automatic facial expression recognition is a necessary pre-processing step for acknowledging emotions, this paper tested different variations of Convolutional Neural Networks (CNN) to detect the six basic facial expressions plus the neutral face. The state-of-the-art performance of 75.1% on the Facial Expression Recognition (FER) 2013 database has been reached by the ensemble voting method. The runner-up model is the Visual Geometry Group (VGG) 16 network, which has been adopted by the two robots to recognize the expressions of the human partner and behave accordingly. An empirical study conducted with 55 university students confirmed the hypothesis that contact with empathetic artificial agents contributes to increasing the acceptance rate of robots.
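As a rough illustration of the ensemble voting idea described in the abstract, the sketch below averages the seven-class softmax outputs (six basic expressions plus neutral) of several trained CNNs on FER2013-style face crops. It assumes Keras/TensorFlow models saved to disk; the model file names, the class ordering, and the use of soft (probability-averaging) voting are illustrative assumptions, not the authors' released code or exact voting scheme.

```python
# Hypothetical sketch: soft-voting ensemble of CNN classifiers over the
# seven FER2013 classes. File paths and class order are placeholders.
import numpy as np
from tensorflow.keras.models import load_model

FER_CLASSES = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def ensemble_predict(model_paths, faces):
    """Average the per-class softmax outputs of several trained CNNs.

    faces: array of shape (n, 48, 48, 1), grayscale FER2013-style crops
    scaled to [0, 1]. Each model is assumed to end in a 7-way softmax.
    """
    probs = np.zeros((faces.shape[0], len(FER_CLASSES)))
    for path in model_paths:
        model = load_model(path)       # load one member of the ensemble
        probs += model.predict(faces)  # accumulate class probabilities
    probs /= len(model_paths)          # soft voting: mean probability
    return [FER_CLASSES[i] for i in probs.argmax(axis=1)]

# Example usage (paths are placeholders, e.g. a VGG16-based model plus others):
# labels = ensemble_predict(["vgg16_fer.h5", "cnn_a.h5", "cnn_b.h5"], face_batch)
```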
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.