OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find utility in this listing of citing articles!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking the citation count will open this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
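If you prefer to pull the same data programmatically, here is a minimal sketch against the OpenAlex REST API (https://api.openalex.org), which this listing is built on. The work ID below is a placeholder, not the actual identifier of the requested article.

```python
import requests

# Placeholder OpenAlex work ID -- substitute the ID of the article you care about.
WORK_ID = "W0000000000"

# Ask OpenAlex for works that cite the given work, 25 per page (as in this listing).
resp = requests.get(
    "https://api.openalex.org/works",
    params={"filter": f"cites:{WORK_ID}", "per-page": 25, "page": 1},
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["results"]:
    title = work.get("display_name")
    year = work.get("publication_year")
    cited_by = work.get("cited_by_count")
    # "best_oa_location" holds the best Open Access link, when one exists.
    oa_url = (work.get("best_oa_location") or {}).get("landing_page_url")
    print(f"{title} ({year}) | Times Cited: {cited_by} | {oa_url or 'Closed Access'}")
```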

Requested Article:

Feeling Colours: Crossmodal Correspondences Between Tangible 3D Objects, Colours and Emotions
Anan Lin, Meike Scheller, Feng Feng, et al.
(2021), pp. 1-12
Open Access | Times Cited: 32

Showing 1-25 of 32 citing articles:

Enhancing Emotional Support in Human-Robot Interaction: Implementing Emotion Regulation Mechanisms in a Personal Drone
Ori Fartook, Zachary McKendrick, Tal Oron-Gilad, et al.
Computers in Human Behavior Artificial Humans (2025), pp. 100146-100146
Open Access

Studying the Effects of Congruence of Auditory and Visual Stimuli on Virtual Reality Experiences
Hayeon Kim, In‐Kwon Lee
IEEE Transactions on Visualization and Computer Graphics (2022) Vol. 28, Iss. 5, pp. 2080-2090
Closed Access | Times Cited: 19

Exploring crossmodal correspondences for future research in human movement augmentation
Mattia Pinardi, Nicola Di Stefano, Giovanni Di Pino, et al.
Frontiers in Psychology (2023) Vol. 14
Open Access | Times Cited: 7

Pic2Tac: Creating Accessible Tactile Images using Semantic Information from Photographs
Karolina Pakėnaitė, Eirini Kamperou, Michael J. Proulx, et al.
(2024), pp. 1-12
Open Access | Times Cited: 2

Crossmodal Correspondence
Frontiers Research Topics (2024)
Open Access | Times Cited: 2

Conveying Emotions through Shape-changing to Children with and without Visual Impairment
Isabel Neto, Yuhan Hu, Filipa Correia, et al.
(2024), pp. 1-16
Open Access | Times Cited: 2

It’s Touching: Understanding Touch-Affect Association in Shape-Change with Kinematic Features
Feng Feng, Dan Bennett, Zhijun Fan, et al.
CHI Conference on Human Factors in Computing Systems (2022)
Closed Access | Times Cited: 10

Feel the Force, See the Force: Exploring Visual-tactile Associations of Deformable Surfaces with Colours and Shapes
Cameron Steer, Teodora Dinca, Crescent Jicol, et al.
(2023), pp. 1-13
Open Access | Times Cited: 5

Expanding the Interaction Repertoire of a Social Drone: Physically Expressive Possibilities of a Perched BiRDe
Ori Fartook, Karon E. MacLean, Tal Oron-Gilad, et al.
International Journal of Social Robotics (2023) Vol. 16, Iss. 2, pp. 257-280
Closed Access | Times Cited: 5

The Impact of Motion Features of Hand-drawn Lines on Emotional Expression: an Experimental Study
Yunhui Lin, Guoying Yang, Yuefeng Ze, et al.
Computers & Graphics (2024) Vol. 119, pp. 103897-103897
Closed Access | Times Cited: 1

Using Crossmodal Correspondence Between Colors and Music to Enhance Online Art Exhibition Visitors’ Experience
Qian Guo, Tingting Jiang
Lecture notes in computer science (2023), pp. 144-159
Closed Access | Times Cited: 3

Complexity Mediated Cross-modal Correspondence between Tone Sequences and Shapes
Jumpei Hayashi, Takeo Kato, Hideyoshi Yanagisawa
International Journal of Affective Engineering (2024) Vol. 23, Iss. 2, pp. 95-107
Open Access

Investigating Crossmodal Correspondences Between Vibrotactile Stimuli and Colors
Daniel Horst, Jumpei Hayashi, Takeo Kato, et al.
International Journal of Affective Engineering (2024) Vol. 23, Iss. 2, pp. 125-142
Open Access

Trace to Touch: Eliciting Gestures from Capacitive Touch Electrodes
Aarti Darji, Gunnika Kapoor, César Torres
Creativity and Cognition (2024), pp. 67-71
Closed Access

Sense-O-Nary: Exploring Children's Crossmodal Metaphors Through Playful Crossmodal Interactions
Tegan Joy Roberts-Morgan, Brooke Morris, Elaine Czech, et al.
(2024) Vol. 1, pp. 259-269
Closed Access

SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR
Hyunsung Cho, Naveen Sendhilnathan, Michael Nebeling, et al.
(2024), pp. 1-19
Open Access

Physicochemical features partially explain olfactory crossmodal correspondences
Ryan J. Ward, Sophie Wuerger, Maliha Ashraf, et al.
Scientific Reports (2023) Vol. 13, Iss. 1
Open Access | Times Cited: 1

Congruency of color–sound crossmodal correspondence interacts with color and sound discrimination depending on color category
Kenta Miyamoto, Yuma Taniyama, Kyoko Hine, et al.
i-Perception (2023) Vol. 14, Iss. 4
Open Access | Times Cited: 1

OdorV-Art: An Initial Exploration of An Olfactory Intervention for Appreciating Style Information of Artworks in Virtual Museum
Shumeng Zhang, Ziyan Wang, You Zhou, et al.
(2023), pp. 1-8
Closed Access | Times Cited: 1

Birdbox: Exploring the User Experience of Crossmodal, Multisensory Data Representations
Arika Dodani, Rosa van Koningsbruggen, Eva Hornecker
(2022), pp. 12-21
Open Access | Times Cited: 2

Examining the Visual Impact of Object Typeface on Event Participation
Chimeziem Elijah Nwankwo-Ojionu, Nor Azura Adzharuddin, Moniza Waheed, et al.
International Journal of Marketing Communication and New Media (2022) Vol. 10, Iss. 19
Open Access | Times Cited: 1

From Inclusive Theatre to inclusive technologies: Lessons learnt from co-designing Touch Tours with an Inclusive Theatre group
Alexandra Tzanidou, Al Husein Sami Abosaleh, Stephen Lindsay, et al.
(2023) Vol. 1, pp. 1367-1382
Open Access

Page 1 - Next Page
