OpenAlex Citation Counts

OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this listing for that article. Lastly, at the bottom of the page, you'll find basic pagination options.

Requested Article:

Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech
Mathieu Bourguignon, Martijn Baart, Efthymia C. Kapnoula, et al.
Journal of Neuroscience (2019) Vol. 40, Iss. 5, pp. 1053-1065
Open Access | Times Cited: 89

Showing 1-25 of 89 citing articles:

Natural infant-directed speech facilitates neural tracking of prosody
Katharina Menn, Christine Michel, Lars Meyer, et al.
NeuroImage (2022) Vol. 251, pp. 118991-118991
Open Access | Times Cited: 49

Crossmodal Phase Reset and Evoked Responses Provide Complementary Mechanisms for the Influence of Visual Speech in Auditory Cortex
Pierre Mégevand, Manuel Mercier, David M. Groppe, et al.
Journal of Neuroscience (2020) Vol. 40, Iss. 44, pp. 8530-8542
Open Access | Times Cited: 50

Neural Tracking in Infancy Predicts Language Development in Children With and Without Family History of Autism
Katharina Menn, Emma Kate Ward, Ricarda Braukmann, et al.
Neurobiology of Language (2022) Vol. 3, Iss. 3, pp. 495-514
Open Access | Times Cited: 30

Visual Influences on Auditory Behavioral, Neural, and Perceptual Processes: A Review
Collins Opoku-Baah, Adriana M. Schoenhaut, Sarah Vassall, et al.
Journal of the Association for Research in Otolaryngology (2021) Vol. 22, Iss. 4, pp. 365-386
Open Access | Times Cited: 39

Short report on the effects of SARS-CoV-2 face protective equipment on verbal communication
Enrico Muzzi, Carol Chermaz, Veronica Castro, et al.
European Archives of Oto-Rhino-Laryngology (2021) Vol. 278, Iss. 9, pp. 3565-3570
Open Access | Times Cited: 38

Measuring the cortical tracking of speech with optically-pumped magnetometers
Paul de Lange, Elena Boto, Niall Holmes, et al.
NeuroImage (2021) Vol. 233, pp. 117969-117969
Open Access | Times Cited: 29

Differential Auditory and Visual Phase-Locking Are Observed during Audio-Visual Benefit and Silent Lip-Reading for Speech Perception
Máté Aller, Heidi Solberg Økland, Lucy MacGregor, et al.
Journal of Neuroscience (2022) Vol. 42, Iss. 31, pp. 6108-6120
Open Access | Times Cited: 20

Schlieren imaging and video classification of alphabet pronunciations: exploiting phonetic flows for speech recognition and speech therapy
H. Talaat, Kian Barari, Xiuhua April, et al.
Visual Computing for Industry, Biomedicine, and Art (2024) Vol. 7, Iss. 1
Open Access | Times Cited: 4

Auditory cortex encodes lipreading information through spatially distributed activity
Ganesan Karthik, Cody Zhewei Cao, Michael I. Demidenko, et al.
Current Biology (2024) Vol. 34, Iss. 17, pp. 4021-4032.e5
Open Access | Times Cited: 4

Neural speech tracking contribution of lip movements predicts behavioral deterioration when the speaker's mouth is occluded
Patrick Reisinger, Marlies Gillis, Nina Suess, et al.
eNeuro (2025), pp. ENEURO.0368-24.2024
Open Access

Cortical tracking of speech in noise accounts for reading strategies in children
Florian Destoky, Julie Bertels, Maxime Niesen, et al.
PLoS Biology (2020) Vol. 18, Iss. 8, pp. e3000840-e3000840
Open Access | Times Cited: 32

“Entraining” to speech, generating language?
Lars Meyer, Yue Sun, Andrea E. Martin
Language Cognition and Neuroscience (2020) Vol. 35, Iss. 9, pp. 1138-1148
Open Access | Times Cited: 30

A representation of abstract linguistic categories in the visual system underlies successful lipreading
Aaron Nidiffer, Cody Zhewei Cao, Aisling E. O’Sullivan, et al.
NeuroImage (2023) Vol. 282, pp. 120391-120391
Open Access | Times Cited: 9

Neocortical activity tracks the hierarchical linguistic structures of self-produced speech during reading aloud
Mathieu Bourguignon, Nicola Molinaro, Mikel Lizarazu, et al.
NeuroImage (2020) Vol. 216, pp. 116788-116788
Open Access | Times Cited: 21

Auditory detection is modulated by theta phase of silent lip movements
Emmanuel Biau, Danying Wang, Hyojin Park, et al.
Current Research in Neurobiology (2021) Vol. 2, pp. 100014-100014
Open Access | Times Cited: 18

MEG Activity in Visual and Auditory Cortices Represents Acoustic Speech-Related Information during Silent Lip Reading
Felix Bröhl, Anne Keitel, Christoph Kayser
eNeuro (2022) Vol. 9, Iss. 3, pp. ENEURO.0209-22.2022
Open Access | Times Cited: 13

Neurodevelopmental oscillatory basis of speech processing in noise
Julie Bertels, Maxime Niesen, Florian Destoky, et al.
Developmental Cognitive Neuroscience (2022) Vol. 59, pp. 101181-101181
Open Access | Times Cited: 13

Neural Speech Tracking Highlights the Importance of Visual Speech in Multi-speaker Situations
Chandra Leon Haider, Hyojin Park, Anne Hauswald, et al.
Journal of Cognitive Neuroscience (2023) Vol. 36, Iss. 1, pp. 128-142
Open Access | Times Cited: 7

Cortical tracking of visual rhythmic speech by 5‐ and 8‐month‐old infants: Individual differences in phase angle relate to language outcomes up to 2 years
Áine Ní Choisdealbha, Adam Attaheri, Sinead Rocha, et al.
Developmental Science (2024) Vol. 27, Iss. 4
Open Access | Times Cited: 2

A comparison of EEG encoding models using audiovisual stimuli and their unimodal counterparts
Maansi Desai, Alyssa M Field, Liberty S. Hamilton
PLoS Computational Biology (2024) Vol. 20, Iss. 9, pp. e1012433-e1012433
Open Access | Times Cited: 2

Shared and modality-specific brain regions that mediate auditory and visual word comprehension
Anne Keitel, Joachim Groß, Christoph Kayser
eLife (2020) Vol. 9
Open Access | Times Cited: 20

A linguistic representation in the visual system underlies successful lipreading
Aaron Nidiffer, Cody Zhewei Cao, Aisling E. O’Sullivan, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2021)
Open Access | Times Cited: 16

Neural speech tracking benefit of lip movements predicts behavioral deterioration when the speaker’s mouth is occluded
Patrick Reisinger, Marlies Gillis, Nina Suess, et al.
bioRxiv (Cold Spring Harbor Laboratory) (2023)
Open Access | Times Cited: 5

Visual speech differentially modulates beta, theta, and high gamma bands in auditory cortex
Gowri Karthik, John Plass, Adriene M. Beltz, et al.
European Journal of Neuroscience (2021) Vol. 54, Iss. 9, pp. 7301-7317
Open Access | Times Cited: 12

Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age
Nina Suess, Anne Hauswald, Patrick Reisinger, et al.
Cerebral Cortex (2021) Vol. 32, Iss. 21, pp. 4818-4833
Open Access | Times Cited: 11

Page 1 - Next Page