
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
If you click an article title, you'll navigate to that article's record in CrossRef. If you click an Open Access link, you'll navigate to the best Open Access location, and clicking a citation count opens the citing-articles listing for that article. Basic pagination options appear at the bottom of the page.
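For readers who would rather pull this listing programmatically, the same data is available from the OpenAlex API. Below is a minimal sketch, using only the Python standard library, that fetches one page of works citing a given OpenAlex work; the work ID shown is a placeholder, not the actual identifier of the requested article.

```python
import json
import urllib.request

OPENALEX_API = "https://api.openalex.org/works"
CITED_WORK_ID = "W0000000000"  # placeholder: substitute the OpenAlex ID of the requested article

def citing_articles(work_id, per_page=25, page=1):
    """Return one page of works that cite the given OpenAlex work ID."""
    url = (f"{OPENALEX_API}?filter=cites:{work_id}"
           f"&per-page={per_page}&page={page}")
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data.get("results", [])

# Print each citing article roughly in the format used on this page.
for work in citing_articles(CITED_WORK_ID):
    title = work.get("display_name")
    year = work.get("publication_year")
    cited_by = work.get("cited_by_count")
    print(f"{title} ({year}) | Times Cited: {cited_by}")
```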
Requested Article:
Vouch: multimodal touch-and-voice input for smart watches under difficult operating conditions
Jae-Dong Lee, Changhyeon Lee, Gerard J. Kim
Journal on Multimodal User Interfaces (2017) Vol. 11, Iss. 3, pp. 289-299
Closed Access | Times Cited: 13
Showing 13 citing articles:
Review of Capacitive Touchscreen Technologies: Overview, Research Trends, and Machine Learning Approaches
Hyoungsik Nam, Ki‐Hyuk Seol, Jun‐Hee Lee, et al.
Sensors (2021) Vol. 21, Iss. 14, pp. 4776-4776
Open Access | Times Cited: 39
Less or More: Towards Glanceable Explanations for LLM Recommendations Using Ultra-Small Devices
Xinru Wang, Mengjie Yu, Hannah Nguyen, et al.
(2025), pp. 938-951
Closed Access
Voice-Visualized Message Interactions on Smartwatches
JooYeong Kim, Sooyeon Ahn, YoonJae Kim, et al.
(2025)
Closed Access
Usability and user experience evaluation of natural user interfaces: a systematic mapping study
Guilherme Corredato Guerino, Natasha Valentim
IET Software (2020) Vol. 14, Iss. 5, pp. 451-467
Open Access | Times Cited: 27
EasyAsk: An In-App Contextual Tutorial Search Assistant for Older Adults with Voice and Touch Inputs
Weiwei Gao, Kexin Du, Yujia Luo, et al.
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (2024) Vol. 8, Iss. 3, pp. 1-27
Closed Access | Times Cited: 1
Tailored, Multimodal and Opportune Interactions on a Wearable Sport Coach: The WE-nner Framework
Jean‐Claude Martin, Céline Clavel
Lecture Notes in Computer Science (2018), pp. 24-32
Open Access | Times Cited: 6
Text Entry on Smartwatches: A Systematic Review of Literature
Mateus Machado Luna, Fabrízzio Soares, Hugo Alexandre Dantas do Nascimento, et al.
(2018), pp. 272-277
Closed Access | Times Cited: 4
Vouch-T: Multimodal Text Input for Mobile Devices Using Voice and Touch
Minyoung Lee, Gerard J. Kim
Lecture Notes in Computer Science (2017), pp. 208-224
Closed Access | Times Cited: 3
Evaluating a voice-based interaction
Guilherme Corredato Guerino, Natasha Valentim
(2019), pp. 1-4
Closed Access | Times Cited: 3
Hearing loss prevention at loud music events via real-time visuo-haptic feedback
Luca Turchet, Simone Luiten, Tjebbe Treub, et al.
Journal on Multimodal User Interfaces (2023) Vol. 18, Iss. 1, pp. 43-53
Open Access | Times Cited: 1
Detection of Dyslexic Children Using Machine Learning and Multimodal Hindi Language Eye-Gaze-Assisted Learning System
Yogesh Kumar Meena, Hubert Cecotti, Braj Bhushan, et al.
IEEE Transactions on Human-Machine Systems (2022) Vol. 53, Iss. 1, pp. 122-131
Open Access | Times Cited: 2
Emerging Applications
Shuo Gao, Shuo Yan, Hang Zhao, et al.
Springer eBooks (2021), pp. 179-229
Closed Access | Times Cited: 2
Natural Multimodal Interaction in the Car - Generating Design Support for Speech, Gesture, and Gaze Interaction while Driving
Florian Roider
(2021)
Open Access | Times Cited: 1