Facial Expression Recognition using Neural Network for Dyadic Interaction
Sham, Abdallah Hussein
Computers are machines that do not share emotions the way humans do. With the help of Machine Learning (ML) and Artificial Intelligence (AI), however, social robots can become a reality. Such robots can already interact with people to a certain degree, but not yet as a person would. To reach that level, they need a better understanding of how people interact daily, and learning from the dyadic interaction between two people is a promising option. Participants' facial expressions are the main features that can be extracted from a dyadic interaction, and this can be done with a trained Deep Neural Network (DNN). In this thesis, a DNN known as Mini-Xception is trained on a pre-processed dataset and then tested on images: combined with a face-detection algorithm, the model detects a person's facial expression in each image. After successful results on still images, the model is tested on other media, first with a webcam and then on videos with more than one participant. Since people react to expressions, their reactions can also be driven by context; sad news, for example, may be the cause of a sad expression. The test results are therefore used for an analysis in which a correlation is constructed between facial expressions and context.
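The pipeline described above (face detection followed by Mini-Xception inference on the cropped face) can be sketched in a few lines of Python. The sketch below is illustrative, not the thesis's actual code: the weights file name (mini_xception.h5), the 64x64 grayscale input size, the FER2013-style label order, and the use of OpenCV's Haar cascade as the face detector are all assumptions based on common Mini-Xception implementations.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Hypothetical paths/labels: trained Mini-Xception weights and a
# FER2013-style emotion label order (both assumed, not from the thesis).
MODEL_PATH = "mini_xception.h5"
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

model = load_model(MODEL_PATH, compile=False)
# OpenCV's bundled Haar cascade serves as the face-detection algorithm here.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # webcam test, as described in the abstract
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        # Crop the detected face, resize to the assumed 64x64 grayscale
        # input, and scale pixel values to [0, 1].
        face = cv2.resize(gray[y:y + h, x:x + w], (64, 64))
        face = face.astype("float32") / 255.0
        face = np.expand_dims(face, axis=(0, -1))  # shape (1, 64, 64, 1)
        label = EMOTIONS[int(np.argmax(model.predict(face, verbose=0)))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    cv2.imshow("expressions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The same loop extends naturally from the webcam to recorded videos of two participants by replacing the capture index with a file path and keeping per-face predictions per frame.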