OpenAlex Citation Counts

OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the article's "best Open Access location". Clicking a citation count opens this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
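
A listing like this one can also be reproduced programmatically through the public OpenAlex REST API, whose /works endpoint accepts a cites: filter for retrieving citing articles. Below is a minimal Python sketch; the work ID is a placeholder rather than the actual OpenAlex ID of the requested article, and the fields used (display_name, publication_year, open_access, best_oa_location, cited_by_count) reflect the standard OpenAlex work schema.

```python
import requests

# Placeholder OpenAlex work ID; look up the real ID first, e.g. via
# https://api.openalex.org/works?filter=doi:<article-doi>
WORK_ID = "W0000000000"

# The /works endpoint with a `cites:` filter returns the citing articles,
# paginated (25 results per page by default).
resp = requests.get(
    "https://api.openalex.org/works",
    params={"filter": f"cites:{WORK_ID}", "per-page": 25, "page": 1},
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["results"]:
    oa = work.get("best_oa_location") or {}
    print(
        work["display_name"],
        work.get("publication_year"),
        "Open Access" if work.get("open_access", {}).get("is_oa") else "Closed Access",
        f"Times Cited: {work.get('cited_by_count', 0)}",
        oa.get("landing_page_url", ""),
        sep=" | ",
    )
```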

Requested Article:

EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques
Mohsen Parisay, Charalambos Poullis, Marta Kersten‐Oertel
International Journal of Human-Computer Studies (2021) Vol. 154, pp. 102676-102676
Open Access | Times Cited: 11

Showing 11 citing articles:

Technologies for Multimodal Interaction in Extended Reality—A Scoping Review
Ismo Rakkolainen, Ahmed Farooq, Jari Kangas, et al.
Multimodal Technologies and Interaction (2021) Vol. 5, Iss. 12, pp. 81-81
Open Access | Times Cited: 45

An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices
Yaxiong Lei, Shijing He, Mohamed Khamis, et al.
ACM Computing Surveys (2023) Vol. 56, Iss. 2, pp. 1-38
Open Access | Times Cited: 12

Eye-Gaze-Based Intention Recognition for Selection Task by Using SVM-RF
Shuai Wang, Hongwei Niu, Wanni Wei, et al.
Lecture Notes in Computer Science (2024), pp. 157-168
Closed Access | Times Cited: 1

Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing
Laxmi Pandey, Ahmed Sabbir Arif
Proceedings of the ACM on Human-Computer Interaction (2022) Vol. 6, Iss. ISS, pp. 328-353
Open Access | Times Cited: 6

BIGaze: An eye-gaze action-guided Bayesian information gain framework for information exploration
Seung Won Lee, Hwan Kim, Taeha Yi, et al.
Advanced Engineering Informatics (2023) Vol. 58, pp. 102159-102159
Closed Access | Times Cited: 3

The Impact of Gaze and Hand Gesture Complexity on Gaze-Pinch Interaction Performances
Yeongseo Park, Jiwan Kim, Ian Oakley
(2024), pp. 622-626
Closed Access

Eyes can draw: A high-fidelity free-eye drawing method with unimodal gaze control
Lida Huang, Thomas Westin, Mirjam Palosaari Eladhari, et al.
International Journal of Human-Computer Studies (2022) Vol. 170, pp. 102966-102966
Open Access | Times Cited: 2

A Comparison of the Fatigue Progression of Eye-Tracked and Motion-Controlled Interaction in Immersive Space
Lukas Maximilian Masopust, David Bauer, Siyuan Yao, et al.
(2021), pp. 460-469
Closed Access | Times Cited: 2

Compass+Ring: A Multimodal Menu to Improve Interaction Performance and Comfortability in One-handed Scenarios
Xin Chen, Dongliang Guo, Li Feng, et al.
(2023), pp. 473-482
Closed Access

Affordance-Guided User Elicitation of Interaction Concepts for Unimodal Gaze Control of Potential Holographic 3D UIs in Automotive Applications
Maryia Kazhura, Bea Vorhof, André Calero Valdez
2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (2023), pp. 14-19
Closed Access

Page 1
