
Open-access research publication repository
Henni, Khadidja; Mezghani, Neila; Mitiche, Amar; Abou-Abbas, Lina et Benazza-Ben Yahia, Amel (2025). An Effective Deep Neural Network Architecture for EEG-Based Recognition of Emotions. IEEE Access, 13, 4487-4498. https://doi.org/10.1109/ACCESS.2025.3525996
File(s) associated with this document: |
PDF
- Henni2025.pdf
File content: Publisher's version. Licence: Creative Commons CC BY. |
Document category: | Journal articles |
---|---|
Peer-reviewed: | Yes |
Publication stage: | Published |
Abstract: | Emotions arise from the human brain's reaction to objective events. Current research in EEG-based emotion recognition faces significant challenges due to the high dimensionality and variability of EEG signals, which complicate accurate classification. Traditional methods often struggle to extract relevant features from noisy, high-dimensional data, and they typically fail to capture the complex temporal dependencies within EEG signals. Recent progress in machine learning with deep neural networks has opened opportunities to develop methods efficient and practical enough to serve useful real-world applications. The purpose of this study is to investigate a novel end-to-end deep learning method for emotion recognition from electroencephalography (EEG) data, which prefaces a combination of a two-dimensional (2D) convolutional neural network (CNN) and a long short-term memory (LSTM) network with an autoencoder. The autoencoder layers seek a lower-dimensional encoding for optimal input signal reconstruction, and the 2D CNN/LSTM layers capture the spatial and temporal features that best describe the emotion classes present in the data. Experiments in four-category emotion classification on the public, freely available DEAP dataset showed that the method reached superior performance as measured by accuracy: 90.04% for "arousal", 89.97% for "valence", 87.73% for "dominance", and 90.84% for "liking". |
Official version URL: | https://ieeexplore.ieee.org/abstract/document/1082... |
Deposited by: | Ayena, Johannes |
Responsible: | Johannes Ayena |
Deposited: | 12 May 2025 19:53 |
Last modified: | 12 May 2025 19:53 |
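The data flow the abstract describes (autoencoder bottleneck, then 2D convolution over the encoded representation, then an LSTM over time, then a four-way classification head) can be sketched numerically. The sketch below is a minimal NumPy illustration, not the authors' implementation: all layer sizes, random weights, the single convolution kernel, and the softmax head are assumptions chosen only to make the shape flow concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions): one EEG window of 32 channels
# by 128 time samples, with a 64-dimensional autoencoder bottleneck.
n_channels, n_samples = 32, 128
latent_dim = 64

# --- 1) Autoencoder: lower-dimensional encoding per time step ---
W_enc = rng.standard_normal((latent_dim, n_channels)) * 0.1
W_dec = rng.standard_normal((n_channels, latent_dim)) * 0.1

x = rng.standard_normal((n_channels, n_samples))   # raw EEG window
z = np.tanh(W_enc @ x)                             # encoding (latent_dim, n_samples)
x_rec = W_dec @ z                                  # reconstruction
rec_loss = np.mean((x - x_rec) ** 2)               # training signal for the encoder

# --- 2) 2D convolution over the encoded "image" (latent x time) ---
def conv2d_valid(img, kernel):
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

kernel = rng.standard_normal((3, 3)) * 0.1
feat = np.maximum(conv2d_valid(z, kernel), 0.0)    # ReLU feature map

# --- 3) One LSTM step per time column, capturing temporal dependencies ---
hidden = 16
d = feat.shape[0]                                  # features per time step
Wx = rng.standard_normal((4 * hidden, d)) * 0.1
Wh = rng.standard_normal((4 * hidden, hidden)) * 0.1
h = np.zeros(hidden)
c = np.zeros(hidden)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for t in range(feat.shape[1]):
    gates = Wx @ feat[:, t] + Wh @ h
    i_g, f_g, o_g, g_g = np.split(gates, 4)
    i_g, f_g, o_g = sigmoid(i_g), sigmoid(f_g), sigmoid(o_g)
    c = f_g * c + i_g * np.tanh(g_g)               # cell state update
    h = o_g * np.tanh(c)                           # hidden state

# --- 4) Four-way softmax head (stand-in for the per-target classifiers) ---
W_out = rng.standard_normal((4, hidden)) * 0.1
logits = W_out @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)  # (4,)
```

Note that the published method trains the four DEAP targets (arousal, valence, dominance, liking) as separate classification problems; the single softmax head here is only a placeholder for whichever head is attached.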