
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
Clicking an article title navigates to the article as listed in CrossRef. Clicking an Open Access link navigates to the work's "best Open Access location". Clicking a citation count opens the citing-article listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
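Listings like this one can also be retrieved programmatically from the OpenAlex API, which exposes citing works through the `cites:` filter. The sketch below shows how such a query URL is built; the work ID used is a hypothetical placeholder, not the actual ID of the requested article.

```python
import json
import urllib.request

def citing_works_url(openalex_id: str, per_page: int = 25) -> str:
    """Build an OpenAlex API query for works that cite the given work ID."""
    return (f"https://api.openalex.org/works"
            f"?filter=cites:{openalex_id}&per-page={per_page}")

# Hypothetical OpenAlex work ID; the real ID could be found via a title
# search such as https://api.openalex.org/works?search=EyeMU
url = citing_works_url("W1234567890")

# Uncomment to fetch the first page of citing works (requires network):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
#     for work in data["results"]:
#         print(work["display_name"], work["cited_by_count"])
```

Each result in the API response carries the same fields shown on this page: title, authors, venue, year, and citation count.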
Requested Article:
EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices
Andy Kong, Karan Ahuja, Mayank Goel, et al.
(2021)
Open Access | Times Cited: 16
Showing 16 citing articles:
GazePointAR: A Context-Aware Multimodal Voice Assistant for Pronoun Disambiguation in Wearable Augmented Reality
Jaewook Lee, Jun Wang, Elizabeth Brown, et al.
(2024), pp. 1-20
Open Access | Times Cited: 9
An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices
Yaxiong Lei, Shijing He, Mohamed Khamis, et al.
ACM Computing Surveys (2023) Vol. 56, Iss. 2, pp. 1-38
Open Access | Times Cited: 12
Large Generative Model Impulsed Lightweight Gaze Estimator via Deformable Approximate Large Kernel Pursuit
Xuanhong Chen, Muchun Chen, Yugang Chen, et al.
IEEE Transactions on Image Processing (2025) Vol. 34, pp. 1149-1162
Closed Access
Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices
Omar Namnakani, Yasmeen Abdrabou, Jonathan Grizou, et al.
(2023), pp. 1-17
Open Access | Times Cited: 10
Effects of a Gaze-Based 2D Platform Game on User Enjoyment, Perceived Competence, and Digital Eye Strain
Mark Colley, Beate Wanner, Max Rädler, et al.
(2024), pp. 1-14
Open Access | Times Cited: 1
The Ability-Based Design Mobile Toolkit (ABD-MT): Developer Support for Runtime Interface Adaptation Based on Users' Abilities
Junhan Kong, Mingyuan Zhong, James Fogarty, et al.
Proceedings of the ACM on Human-Computer Interaction (2024) Vol. 8, Iss. MHCI, pp. 1-26
Closed Access | Times Cited: 1
GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays
Omar Namnakani, Penpicha Sinrattanavong, Yasmeen Abdrabou, et al.
(2023)
Open Access | Times Cited: 3
The Impact of Gaze and Hand Gesture Complexity on Gaze-Pinch Interaction Performances
Yeongseo Park, Jiwan Kim, Ian Oakley
(2024), pp. 622-626
Closed Access
E2GCO: Event-driven Eye-Gaze-Controlled Operation for Smartphone Interaction
Dong Liang, Shaoming Yan, Yuanliang Ju, et al.
Research Square (Research Square) (2024)
Open Access
E2GO: Free Your Hands for Smartphone Interaction
Dong Liang, Shaoming Yan, Yuanliang Ju, et al.
Research Square (Research Square) (2024)
Open Access
UnitEye: Introducing a User-Friendly Plugin to Democratize Eye Tracking Technology in Unity Environments
Tobias Wagner, Mark Colley, Daniel Breckel, et al.
Proceedings of Mensch und Computer 2019 (2024), pp. 1-10
Closed Access
Knock Knock: A Children-oriented Vocabulary Learning Tangible User Interaction System
Xinrui Fang, Takuro Watanabe, Chengshuo Xia, et al.
(2022), pp. 35-39
Closed Access | Times Cited: 2
Software Framework for Implementing User Interface Interaction in IOS Applications Based on Oculography
Nikita Stanislavovich Afanasev
Russian Digital Libraries Journal (2022) Vol. 25, Iss. 3, pp. 198-245
Open Access | Times Cited: 1
Can Eye Gaze Improve Emotional State Detection on Off the Shelf Smart Devices
Jiwan Kim, Doyoung Lee, Jae-Ho Kim, et al.
(2023) Vol. 1, pp. 378-380
Closed Access
A Non-Contact Human-Computer Interaction Method Based on Gaze
Wenting Chen, Xin Sun
(2023), pp. 70-74
Closed Access