OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

Clicking an article title takes you to the article as indexed in CrossRef. Clicking an Open Access link takes you to the work's "best Open Access location", and clicking a citation count opens this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
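If you'd like to reproduce a listing like this yourself, the data behind it is available from the public OpenAlex API. Below is a minimal sketch that fetches the works citing a given article via the `cites:` filter on the `/works` endpoint. The work ID shown is a hypothetical placeholder (look up the real ID by searching api.openalex.org/works for the article title), and the `requests` library is assumed.

```python
import requests

# Hypothetical placeholder: substitute the OpenAlex ID of the requested article.
CITED_WORK_ID = "W0000000000"

resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": f"cites:{CITED_WORK_ID}",  # works that cite this article
        "per-page": 25,                      # basic pagination, as on this page
        "page": 1,
    },
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["results"]:
    oa = work.get("open_access") or {}
    access = "Open Access" if oa.get("is_oa") else "Closed Access"
    print(f"{work['display_name']}")
    print(f"  ({work.get('publication_year')}) | {access} | "
          f"Times Cited: {work.get('cited_by_count')}")
```

The "best Open Access location" linked on this page corresponds to the `best_oa_location` field returned in the same response.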

Requested Article:

Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection
Ludwig Sidenmark, Christopher Clarke, Joshua Newn, et al.
(2023), pp. 1-15
Open Access | Times Cited: 15

Showing 15 citing articles:

GazeRayCursor: Facilitating Virtual Reality Target Selection by Blending Gaze and Controller Raycasting
Di Laura Chen, M. Giordano, Hrvoje Benko, et al.
(2023) Vol. 47, pp. 1-11
Closed Access | Times Cited: 9

Gaze, Wall, and Racket: Combining Gaze and Hand-Controlled Plane for 3D Selection in Virtual Reality
Uta Wagner, Mathias N. Lystbæk, Andreas Asferg Jacobsen, et al.
Proceedings of the ACM on Human-Computer Interaction (2024) Vol. 8, Iss. ISS, pp. 189-213
Open Access | Times Cited: 3

Cone&Bubble: Evaluating Combinations of Gaze, Head and Hand Pointing for Target Selection in Dense 3D Environments
Ludwig Sidenmark, Zibo Sun, Hans Gellersen
2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW) (2024), pp. 642-649
Closed Access | Times Cited: 2

Towards an Eye-Brain-Computer Interface: Combining Gaze with the Stimulus-Preceding Negativity for Target Selections in XR
G. S. Rajshekar Reddy, Michael J. Proulx, Leanne Hirshfield, et al.
(2024), pp. 1-17
Open Access | Times Cited: 2

Spatial Gaze Markers: Supporting Effective Task Switching in Augmented Reality
Mathias N. Lystbæk, Ken Pfeuffer, Tobias Langlotz, et al.
(2024), pp. 1-11
Open Access | Times Cited: 2

In-the-Wild Experiences with an Interactive Glanceable AR System for Everyday Use
Feiyu Lu, Leonardo Pavanatto, Doug A. Bowman
(2023), pp. 1-9
Open Access | Times Cited: 4

GazePuffer: Hands-Free Input Method Leveraging Puff Cheeks for VR
Yunfei Lai, Minghui Sun, Zhuofeng Li
(2024), pp. 331-341
Closed Access | Times Cited: 1

GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing
Baosheng James Hou, Joshua Newn, Ludwig Sidenmark, et al.
Proceedings of the ACM on Human-Computer Interaction (2024) Vol. 8, Iss. ETRA, pp. 1-20
Open Access | Times Cited: 1

Evaluating Node Selection Techniques for Network Visualizations in Virtual Reality
Lucas Joos, Uzay Durdu, Jonathan Wieland, et al.
(2024), pp. 1-11
Closed Access | Times Cited: 1

FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation
Chenyang Zhang, T. Chen, Eric Shaffer, et al.
arXiv (Cornell University) (2024)
Open Access

How Deep Is Your Gaze? Leveraging Distance in Image-Based Gaze Analysis
Maurice Koch, N. Pathmanathan, Daniel Weiskopf, et al.
(2024), pp. 1-7
Open Access

Towards an Eye-Brain-Computer Interface: Combining Gaze with the Stimulus-Preceding Negativity for Target Selections in XR
G. S. Rajshekar Reddy, Michael J. Proulx, Leanne Hirshfield, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2024)
Open Access

Bi-Directional Gaze-Based Communication: A Review
Björn Severitt, Nora Castner, Siegfried Wahl
Multimodal Technologies and Interaction (2024) Vol. 8, Iss. 12, Article 108
Open Access

Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts
Aaron L. Gardony, Kana Okano, Gregory I. Hughes, et al.
Human-Computer Interaction (2023) Vol. 39, Iss. 5-6, pp. 553-583
Open Access

Page 1
