OpenAlex Citation Counts


OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open the equivalent listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
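For readers who want to reproduce a listing like this programmatically, here is a minimal sketch against the public OpenAlex works endpoint, using its documented `cites:` filter and the `cited_by_count` / `best_oa_location` fields. The work ID below is a placeholder, not the OpenAlex ID of the requested article.

```python
# Minimal sketch: fetch one page of articles citing a given work from the
# public OpenAlex API (https://api.openalex.org). The WORK_ID is a placeholder.
import requests

OPENALEX_WORKS = "https://api.openalex.org/works"
WORK_ID = "W0000000000"  # placeholder OpenAlex work ID of the requested article


def citing_articles(work_id: str, page: int = 1, per_page: int = 25) -> list[dict]:
    """Return one page of OpenAlex works that cite `work_id`."""
    params = {
        "filter": f"cites:{work_id}",  # documented OpenAlex filter for citing works
        "per-page": per_page,
        "page": page,
    }
    resp = requests.get(OPENALEX_WORKS, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]


for work in citing_articles(WORK_ID):
    oa = work.get("best_oa_location") or {}  # "best Open Access location"
    print(
        work.get("title"),
        work.get("publication_year"),
        f"Times Cited: {work.get('cited_by_count')}",
        f"Open Access: {oa.get('landing_page_url', 'n/a')}",
    )
```

Paging through the remaining results is just a matter of incrementing the `page` parameter, which mirrors the pagination controls at the bottom of this listing.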

Requested Article:

Adaptive Multimodal Emotion Detection Architecture for Social Robots
Juanpablo Heredia, Edmundo Lopes-Silva, Yudith Cardinale, et al.
IEEE Access (2022) Vol. 10, pp. 20727-20744
Open Access | Times Cited: 54

Showing 1-25 of 54 citing articles:

An ongoing review of speech emotion recognition
Javier de Lope, Manuel Graña
Neurocomputing (2023) Vol. 528, pp. 1-11
Open Access | Times Cited: 61

Using transformers for multimodal emotion recognition: Taxonomies and state of the art review
Samira Hazmoune, Fateh Bougamouza
Engineering Applications of Artificial Intelligence (2024) Vol. 133, pp. 108339-108339
Closed Access | Times Cited: 18

Emotional Intelligence for the Decision-Making Process of Trajectories in Collaborative Robotics
Michele Gabrio Antonelli, Pierluigi Beomonte Zobel, Costanzo Manes, et al.
Machines (2024) Vol. 12, Iss. 2, pp. 113-113
Open Access | Times Cited: 11

Newman-Watts-Strogatz topology in deep echo state networks for speech emotion recognition
Rebh Soltani, Emna Benmohamed, Hela Ltifi
Engineering Applications of Artificial Intelligence (2024) Vol. 133, pp. 108293-108293
Closed Access | Times Cited: 5

Am I a Social Buddy? A Literature Review on Socially Appealing Design and Implementation Methods for Social Robots
Andreea I. Niculescu, Kheng Hui Yeo, Jochen Ehnes
Lecture notes in computer science (2025), pp. 187-196
Closed Access

Emotion recognition with hybrid attentional multimodal fusion framework using cognitive augmentation
Shailesh Kulkarni, Sharmila Khot, Yogesh Angal
International Journal of Information Technology (2025)
Closed Access

Intelligent Multimodal Artificial Agents that Talk and Express Emotions
Niyati Rawal, Rahul Singh Maharjan, Marta Romeo, et al.
Springer proceedings in advanced robotics (2025), pp. 240-254
Closed Access

Group Emotion Detection Based on Social Robot Perception
Marco Quiroz, Raquel Patiño, José Díaz-Amado, et al.
Sensors (2022) Vol. 22, Iss. 10, pp. 3749-3749
Open Access | Times Cited: 20

MMFN: Emotion recognition by fusing touch gesture and facial expression information
Yunkai Li, Qing‐Hao Meng, Yaxin Wang, et al.
Expert Systems with Applications (2023) Vol. 228, pp. 120469-120469
Closed Access | Times Cited: 9

A Framework to Evaluate Fusion Methods for Multimodal Emotion Recognition
Diego Peña, Ana Aguilera, Irvin Dongo, et al.
IEEE Access (2023) Vol. 11, pp. 10218-10237
Open Access | Times Cited: 8

An Assessment of In-the-Wild Datasets for Multimodal Emotion Recognition
Ana Aguilera, Diego Mellado, Felipe Rojas
Sensors (2023) Vol. 23, Iss. 11, pp. 5184-5184
Open Access | Times Cited: 8

IMPROVING E-LEARNING BY FACIAL EXPRESSION ANALYSIS
Amina Kinane Daouadji, Fátima Bendella
Applied Computer Science (2024) Vol. 20, Iss. 2, pp. 126-137
Open Access | Times Cited: 2

Analyzing the Influence of different Speech Data Corpora and Speech Features on Speech Emotion Recognition: A Review
Tarun Rathi, Manoj Tripathy
Speech Communication (2024) Vol. 162, pp. 103102-103102
Closed Access | Times Cited: 2

Attention and Meta-Heuristic Based General Self-Efficacy Prediction Model From Multimodal Social Media Dataset
Md. Saddam Hossain Mukta, Jubaer Ahmad, Akib Zaman, et al.
IEEE Access (2024) Vol. 12, pp. 36853-36873
Open Access | Times Cited: 2

Enhancing Multimodal Emotion Recognition through Attention Mechanisms in BERT and CNN Architectures
Fazliddin Makhmudov, Alpamis Kultimuratov, Young Im Cho
Applied Sciences (2024) Vol. 14, Iss. 10, pp. 4199-4199
Open Access | Times Cited: 2

A Robocentric Paradigm for Enhanced Social Navigation in Autonomous Robotic: a use case for an autonomous Wheelchair
Fabio Leite, Edmundo Lopes-Silva, José Díaz-Amado, et al.
(2024), pp. 112-119
Closed Access | Times Cited: 2

A Systematic Review of Human–Robot Interaction: The Use of Emotions and the Evaluation of Their Performance
Lara Toledo Cordeiro Ottoni, Jés de Jesus Fiais Cerqueira
International Journal of Social Robotics (2024)
Closed Access | Times Cited: 1

An adaptive and late multifusion framework in contextual representation based on evidential deep learning and Dempster–Shafer theory
Doaa Mohey El-Din, Aboul Ella Hassanein, Ehab E. Hassanien
Knowledge and Information Systems (2024) Vol. 66, Iss. 11, pp. 6881-6932
Open Access | Times Cited: 1

A Combined CNN Architecture for Speech Emotion Recognition
Rolinson Begazo, Ana Aguilera, Irvin Dongo, et al.
Sensors (2024) Vol. 24, Iss. 17, pp. 5797-5797
Open Access | Times Cited: 1

Dynamic Mathematical Models of Theory of Mind for Socially Assistive Robots
Maria L. M. Patrício, Anahita Jamshidnejad
IEEE Access (2023) Vol. 11, pp. 103956-103975
Open Access | Times Cited: 3

Emotion Recognition in Psychology of Human-robot Interaction
Mengyao Zhao
Psychomachina (2023) Vol. 1, pp. 1-11
Open Access | Times Cited: 3

The Face Detection/Recognition, Perspective and Obstacles In Robotic: A Review
Nafiz Md Imtiaz Uddin, Ata Jahangir Moshayedi, Hong Lan, et al.
EAI Endorsed Transactions on AI and Robotics (2022) Vol. 1, Iss. 1, pp. e14-e14
Open Access | Times Cited: 5

Multimodal Emotion Classification Supported in the Aggregation of Pre-trained Classification Models
Pedro J. S. Cardoso, João M. F. Rodrigues, Rui Novais
Lecture notes in computer science (2023), pp. 433-447
Closed Access | Times Cited: 2

Voice Response Based Emotion Intensity Classification for Assistive Robots
Hoashalarajh Rajendran, H. M. Ravindu T. Bandara, D. P. Chandima, et al.
(2023), pp. 1-6
Closed Access | Times Cited: 2
