OpenAlex Citation Counts

OpenAlex is an openly accessible bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you find this listing of citing articles useful!

Clicking an article title takes you to the article as listed in CrossRef. Clicking an Open Access link takes you to its "best Open Access location", and clicking a citation count opens this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
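This listing mirrors what the public OpenAlex REST API returns for a "cited by" query. As a rough sketch of how you could reproduce it yourself, the snippet below hits the `/works` endpoint with a `cites:` filter; the work ID `W0000000000` is a placeholder, not the real identifier of the requested article — look it up first (for example via the API's `search` parameter).

```python
import requests

# Placeholder OpenAlex work ID -- substitute the real ID of the
# requested article, e.g. found via
# https://api.openalex.org/works?search=Emerging+Cross-lingual+Structure
WORK_ID = "W0000000000"

# The /works endpoint accepts a `cites:` filter that returns every
# work whose reference list includes the given work.
resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": f"cites:{WORK_ID}",
        "sort": "cited_by_count:desc",  # most-cited citing articles first
        "per-page": 25,                 # one page of this listing
    },
    timeout=30,
)
resp.raise_for_status()
data = resp.json()

print(f"Total citing articles: {data['meta']['count']}")
for work in data["results"]:
    oa = work.get("open_access") or {}
    access = "Open Access" if oa.get("is_oa") else "Closed Access"
    print(f"{work['display_name']} ({work.get('publication_year')}) "
          f"| {access} | Times Cited: {work['cited_by_count']}")
```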

Requested Article:

Emerging Cross-lingual Structure in Pretrained Language Models
Alexis Conneau, Shijie Wu, Haoran Li, et al.
(2020)
Open Access | Times Cited: 220

Showing 1-25 of 220 citing articles:

Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, et al.
(2020)
Open Access | Times Cited: 4019

On the Cross-lingual Transferability of Monolingual Representations
Mikel Artetxe, Sebastian Ruder, Dani Yogatama
(2020)
Open Access | Times Cited: 557

MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
(2020)
Open Access | Times Cited: 394

Unsupervised Cross-lingual Representation Learning at Scale.
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, et al.
arXiv (Cornell University) (2019)
Closed Access | Times Cited: 369

From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers
Anne Lauscher, Vinit Ravishankar, Ivan Vulić, et al.
(2020)
Open Access | Times Cited: 233

InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training
Zewen Chi, Li Dong, Furu Wei, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
Open Access | Times Cited: 217

Are All Languages Created Equal in Multilingual BERT?
Shijie Wu, Mark Dredze
(2020)
Open Access | Times Cited: 196

Probing Pretrained Language Models for Lexical Semantics
Ivan Vulić, Edoardo Maria Ponti, Robert Litschko, et al.
(2020)
Open Access | Times Cited: 173

Unsupervised Domain Clusters in Pretrained Language Models
Roee Aharoni, Yoav Goldberg
(2020), pp. 7747-7763
Open Access | Times Cited: 171

A Primer on Pretrained Multilingual Language Models
Sumanth Doddapaneni, G. Ramesh, Mitesh M. Khapra, et al.
ACM Computing Surveys (2025)
Open Access | Times Cited: 2

Finding Universal Grammatical Relations in Multilingual BERT
Ethan A. Chi, John Hewitt, Christopher D. Manning
(2020)
Open Access | Times Cited: 137

Using social media data for assessing children’s exposure to violence during the COVID-19 pandemic
Pouria Babvey, Fernanda De Oliveira Capela, Claudia Cappa, et al.
Child Abuse & Neglect (2020) Vol. 116, Article 104747
Open Access | Times Cited: 129

Do Wide and Deep Networks Learn the Same Things? Uncovering How Neural Network Representations Vary with Width and Depth
Thao Nguyen, Maithra Raghu, Simon Kornblith
arXiv (Cornell University) (2021)
Closed Access | Times Cited: 88

X-FACTR: Multilingual Factual Knowledge Retrieval from Pretrained Language Models
Zhengbao Jiang, Antonios Anastasopoulos, Jun Araki, et al.
(2020)
Open Access | Times Cited: 82

On Negative Interference in Multilingual Models: Findings and A Meta-Learning Treatment
Zirui Wang, Zachary C. Lipton, Yulia Tsvetkov
(2020)
Open Access | Times Cited: 70

UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021), pp. 10186-10203
Open Access | Times Cited: 67

Lifting the Curse of Multilinguality by Pre-training Modular Transformers
Jonas Pfeiffer, Naman Goyal, Xi Lin, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2022)
Open Access | Times Cited: 53

InfoXLM: An Information-Theoretic Framework for Cross-Lingual Language Model Pre-Training
Zewen Chi, Li Dong, Furu Wei, et al.
arXiv (Cornell University) (2020)
Open Access | Times Cited: 69

A Call for More Rigor in Unsupervised Cross-lingual Learning
Mikel Artetxe, Sebastian Ruder, Dani Yogatama, et al.
(2020), pp. 7375-7388
Open Access | Times Cited: 54

Inducing Language-Agnostic Multilingual Representations
Wei Zhao, Steffen Eger, Johannes Bjerva, et al.
(2021)
Open Access | Times Cited: 47

BERT syntactic transfer: A computational experiment on Italian, French and English languages
Raffaele Guarasci, Stefano Silvestri, Giuseppe De Pietro, et al.
Computer Speech & Language (2021) Vol. 71, Article 101261
Closed Access | Times Cited: 46

Explicit Alignment Objectives for Multilingual Bidirectional Encoders
Junjie Hu, Melvin Johnson, Orhan Fırat, et al.
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021), pp. 3633-3643
Open Access | Times Cited: 41

Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank
Ethan C. Chau, Lucy H. Lin, Noah A. Smith
(2020), pp. 1324-1334
Open Access | Times Cited: 46

Page 1 - Next Page
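If you'd rather pull all 220 citing articles in one go instead of clicking through pages, OpenAlex supports cursor paging on the same endpoint. A minimal sketch, again assuming the placeholder work ID from above:

```python
import requests

WORK_ID = "W0000000000"  # placeholder -- substitute the real OpenAlex ID

citing = []
cursor = "*"  # "*" asks OpenAlex to open a fresh cursor
while cursor:
    resp = requests.get(
        "https://api.openalex.org/works",
        params={
            "filter": f"cites:{WORK_ID}",
            "per-page": 200,   # API maximum per page
            "cursor": cursor,
        },
        timeout=30,
    )
    resp.raise_for_status()
    page = resp.json()
    citing.extend(page["results"])
    cursor = page["meta"].get("next_cursor")  # None once exhausted

print(f"Fetched {len(citing)} citing articles")
```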
