OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.

Requested Article:

Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition
Wenbo Zheng, Lan Yan, Fei‐Yue Wang
IEEE Transactions on Affective Computing (2023) Vol. 14, Iss. 4, pp. 2595-2613
Closed Access | Times Cited: 23

Showing 23 citing articles:

DepMSTAT: Multimodal Spatio-Temporal Attentional Transformer for Depression Detection
Yongfeng Tao, Minqiang Yang, Huiru Li, et al.
IEEE Transactions on Knowledge and Data Engineering (2024) Vol. 36, Iss. 7, pp. 2956-2966
Closed Access | Times Cited: 20

A Systematic Review on Multimodal Emotion Recognition: Building Blocks, Current State, Applications, and Challenges
Sepideh Kalateh, Luis A. Estrada-Jimenez, Sanaz Nikghadam-Hojjati, et al.
IEEE Access (2024) Vol. 12, pp. 103976-104019
Open Access | Times Cited: 12

DepITCM: an audio-visual method for detecting depression
Lishan Zhang, Zhenhua Liu, Yumei Wan, et al.
Frontiers in Psychiatry (2025) Vol. 15
Open Access

Deep learning-based depression recognition through facial expression: A systematic review
Xiaoming Cao, Lingling Zhai, Pengpeng Zhai, et al.
Neurocomputing (2025), pp. 129605-129605
Closed Access

A spatial and temporal transformer-based EEG emotion recognition in VR environment
Ming Li, Peng Yu, Yang Shen
Frontiers in Human Neuroscience (2025) Vol. 19
Open Access

A multimodal shared network with a cross-modal distribution constraint for continuous emotion recognition
Chiqin Li, Lun Xie, Xingmao Shao, et al.
Engineering Applications of Artificial Intelligence (2024) Vol. 133, pp. 108413-108413
Closed Access | Times Cited: 3

Reading Between the Frames: Multi-modal Depression Detection in Videos from Non-verbal Cues
David Gimeno-Gómez, Ana-Maria Bucur, Adrian Cosma, et al.
Lecture notes in computer science (2024), pp. 191-209
Closed Access | Times Cited: 2

LSCAformer: Long and short-term cross-attention-aware transformer for depression recognition from video sequences
Lang He, Zheng Li, Prayag Tiwari, et al.
Biomedical Signal Processing and Control (2024) Vol. 98, pp. 106767-106767
Closed Access | Times Cited: 2

A Survey on Multi-modal Emotion Detection Techniques
Chintan Chatterjee, Nihir Shah, Sahil Bhatt, et al.
Research Square (Research Square) (2024)
Open Access | Times Cited: 1

A Least-Square Unified Framework for Spatial Filtering in SSVEP-Based BCIs
Ze Wang, Lu Shen, Yi Yang, et al.
IEEE Transactions on Neural Systems and Rehabilitation Engineering (2024) Vol. 32, pp. 2470-2481
Open Access

PCQ: Emotion Recognition in Speech via Progressive Channel Querying
Xincheng Wang, Liejun Wang, Yingfeng Yu, et al.
Lecture notes in computer science (2024), pp. 264-275
Closed Access

Depression recognition over fusion of visual and vocal expression using artificial intelligence
Chandan Gautam, A. Raj, Bhargavee Nemade, et al.
Indonesian Journal of Electrical Engineering and Computer Science (2024) Vol. 34, Iss. 3, pp. 1753-1753
Open Access

Depressive and mania mood state detection through voice as a biomarker using machine learning
Jun Ji, Wentian Dong, Jiaqi Li, et al.
Frontiers in Neurology (2024) Vol. 15
Open Access

Enhancing multimodal depression detection with intra- and inter-sample contrastive learning
Meiling Li, Yuting Wei, Yangfu Zhu, et al.
Information Sciences (2024) Vol. 684, pp. 121282-121282
Closed Access

Automatic Depression Detection Using Attention-Based Deep Multiple Instance Learning
Zixuan Shangguan, Xiaxi Li, Yanjie Dong, et al.
(2024), pp. 40-51
Closed Access

Are You Paying Attention? Multimodal Linear Attention Transformers for Affect Prediction in Video Conversations
Jia Qing Poh, John See, Neamat El Gayar, et al.
(2024), pp. 15-23
Closed Access

An adaptive multi-graph neural network with multimodal feature fusion learning for MDD detection
Tao Xing, Yutao Dou, Xianliang Chen, et al.
Scientific Reports (2024) Vol. 14, Iss. 1
Open Access

A twin disentanglement Transformer Network with Hierarchical-Level Feature Reconstruction for robust multimodal emotion recognition
Chiqin Li, Lun Xie, Xinheng Wang, et al.
Expert Systems with Applications (2024), pp. 125822-125822
Closed Access

So Many Heads, So Many Wits: Multimodal Graph Reasoning for Text-Based Visual Question Answering
Wenbo Zheng, Lan Yan, Fei‐Yue Wang
IEEE Transactions on Systems Man and Cybernetics Systems (2023) Vol. 54, Iss. 2, pp. 854-865
Closed Access | Times Cited: 1

Explainable Depression Detection using Multimodal Behavioural Cues
Monika Gahalawat
INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION (2023), pp. 721-725
Closed Access

Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications
Ioannis Pavlidis, Theodora Chaspari, Daniel McDuff
IEEE Transactions on Affective Computing (2023) Vol. 14, Iss. 4, pp. 2564-2566
Open Access

Page 1
