Please use this identifier to cite or link to this item: http://localhost:8080/xmlui/handle/123456789/7
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Harrati, Nouzha
dc.contributor.author: Bouchrika, Imed
dc.contributor.author: Tari, Abdelkamel
dc.contributor.author: Ladjailia, Ammar
dc.date.accessioned: 2023-01-15T13:58:45Z
dc.date.available: 2023-01-15T13:58:45Z
dc.date.issued: 2015-12-21
dc.identifier.uri: http://localhost:8080/xmlui/handle/123456789/7
dc.description: The human face is a rich source of information that is essential to social interaction and communication. Such information can be static, such as identity, age, gender and ethnicity, or dynamic, such as the emotional state of the person. As facial expression undoubtedly plays a key role in conveying human emotions and feelings, research into how human beings react to the world and communicate with each other still stands as one of the most significant scientific challenges to be addressed. The first research on facial emotion analysis dates back to the nineteenth century: Charles Darwin [1] is considered the pioneer, having conducted the initial study on emotion indication from animal and human faces. Ekman and Friesen [2] categorized human emotions into six basic emotions, each with its own particular content, appearance and distinctive facial expression. These basic emotions are considered universal across different ethnicities and cultures: happiness, sadness, fear, disgust, surprise and anger. The computer vision community has shown unprecedented interest in research on the automated classification of emotions based on facial expressions. [en_US]
dc.description.abstract: As facial expression undoubtedly plays a key role in conveying human emotions and feelings, research into how people react to the world and communicate with each other still stands as one of the most significant scientific challenges to be addressed. Recent research has shown that facial expressions can be a potential medium for various applications. In this research paper, we explore the use of texture-based facial features obtained using the Local Binary Patterns operator. The facial expression signature is constructed by encoding the textural information using a bag of features. Features are trained to robustly distinguish seven different facial emotions: happiness, anger, disgust, fear, surprise and sadness, as well as the neutral case. On a gallery dataset containing 76 images, a classification rate of 93.4% is achieved using a Support Vector Machine classifier. The attained results assert that automated classification of facial expressions using an appearance-based approach is feasible with acceptable accuracy. [en_US]
dc.language.iso: en [en_US]
dc.publisher: IEEE Xplore [en_US]
dc.relation.ispartofseries: aestin4;4
dc.subject: Facial [en_US]
dc.subject: Expressions [en_US]
dc.subject: LBP [en_US]
dc.subject: Bag of Features [en_US]
dc.title: Automated classification of facial expressions using bag of visual words and texture-based features [en_US]
dc.type: Article [en_US]
Appears in Collections: Articles



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
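The texture-based pipeline named in the abstract (LBP codes summarized into a histogram signature) can be sketched in pure Python. This is an illustrative toy, not the paper's implementation: the 4x4 image, the function names and the basic 8-neighbour LBP variant are all assumptions, and the bag-of-features vocabulary and SVM training stages of the paper are omitted.

```python
# Minimal sketch of an LBP texture signature, assuming a basic
# 8-neighbour Local Binary Patterns operator on interior pixels.

def lbp_codes(img):
    """Compute 8-neighbour LBP codes for interior pixels of a 2-D
    list of grayscale intensities."""
    h, w = len(img), len(img[0])
    # clockwise neighbour offsets starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                # set the bit when the neighbour is at least as bright
                if img[y + dy][x + dx] >= center:
                    code |= 1 << bit
            codes.append(code)
    return codes

def lbp_histogram(codes, bins=256):
    """Normalized histogram of LBP codes: the texture signature that
    a classifier (e.g. an SVM, as in the paper) would consume."""
    hist = [0.0] * bins
    for c in codes:
        hist[c] += 1
    n = len(codes)
    return [v / n for v in hist] if n else hist

# Toy 4x4 "face patch": a bright square on a dark background.
img = [
    [10, 10, 10, 10],
    [10, 50, 50, 10],
    [10, 50, 50, 10],
    [10, 10, 10, 10],
]
codes = lbp_codes(img)       # one code per interior pixel (4 here)
signature = lbp_histogram(codes)
```

In a full system, one such histogram per image (or per image block) would be fed to a Support Vector Machine for the seven-class emotion decision described in the abstract.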