OpenAlex Citation Counts


OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!

Clicking an article title takes you to the article as listed in CrossRef. Clicking an Open Access link takes you to the "best Open Access location". Clicking a citation count opens this listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.

Requested Article:

An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models
Alexandra Chronopoulou, Christos Baziotis, Alexandros Potamianos
(2019)
Open Access | Times Cited: 111

Showing 1-25 of 111 citing articles:

Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks
Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, et al.
(2020)
Open Access | Times Cited: 1660

It’s Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners
Timo Schick, Hinrich Schütze
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
Open Access | Times Cited: 568

Pre-trained models for natural language processing: A survey
Xipeng Qiu, Tianxiang Sun, Yige Xu, et al.
Science China Technological Sciences (2020) Vol. 63, Iss. 10, pp. 1872-1897
Closed Access | Times Cited: 439

TERA: Self-Supervised Learning of Transformer Encoder Representation for Speech
Andy T. Liu, Shang-Wen Li, Hung-yi Lee
IEEE/ACM Transactions on Audio Speech and Language Processing (2021) Vol. 29, pp. 2351-2366
Open Access | Times Cited: 267

Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning
Liangqiong Qu, Yuyin Zhou, Paul Pu Liang, et al.
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2022), pp. 10051-10061
Open Access | Times Cited: 104

Constrained Deep Reinforcement Transfer Learning for Short-Term Forecasting of Wind Discrepancies at Ocean Stations
Jun Zhang, Yaoran Chen, Hang Pan, et al.
Neurocomputing (2025), pp. 129491-129491
Closed Access | Times Cited: 2

Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting
Sanyuan Chen, Yutai Hou, Yiming Cui, et al.
(2020)
Open Access | Times Cited: 122

AI-Based Conversational Agents: A Scoping Review From Technologies to Future Directions
Sheetal Kusal, Shruti Patil, Jyoti Choudrie, et al.
IEEE Access (2022) Vol. 10, pp. 92337-92356
Open Access | Times Cited: 66

Making costly manufacturing smart with transfer learning under limited data: A case study on composites autoclave processing
Milad Ramezankhani, Bryn Crawford, Apurva Narayan, et al.
Journal of Manufacturing Systems (2021) Vol. 59, pp. 345-354
Closed Access | Times Cited: 56

True Few-Shot Learning with Prompts—A Real-World Perspective
Timo Schick, Hinrich Schütze
Transactions of the Association for Computational Linguistics (2022) Vol. 10, pp. 716-731
Open Access | Times Cited: 39

GeoLayoutLM: Geometric Pre-training for Visual Information Extraction
Chuwei Luo, Changxu Cheng, Zheng Qi, et al.
2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023)
Open Access | Times Cited: 33

Recent Advances in Generative AI and Large Language Models: Current Status, Challenges, and Perspectives
Desta Haileselassie Hagos, Rick Battle, Danda B. Rawat
IEEE Transactions on Artificial Intelligence (2024) Vol. 5, Iss. 12, pp. 5873-5893
Open Access | Times Cited: 11

AI Based Emotion Detection for Textual Big Data: Techniques and Contribution
Sheetal Kusal, Shruti Patil, Ketan Kotecha, et al.
Big Data and Cognitive Computing (2021) Vol. 5, Iss. 3, pp. 43-43
Open Access | Times Cited: 54

AdaPrompt: Adaptive Model Training for Prompt-based NLP
Yulong Chen, Yang Liu, Dong Li, et al.
(2022), pp. 6057-6068
Open Access | Times Cited: 31

It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners
Timo Schick, Hinrich Schütze
arXiv (Cornell University) (2020)
Closed Access | Times Cited: 49

Continual Learning for Natural Language Generation in Task-oriented Dialog Systems
Fei Mi, Liangwei Chen, Mengjie Zhao, et al.
(2020)
Open Access | Times Cited: 46

Better Document-Level Machine Translation with Bayes’ Rule
Lei Yu, Laurent Sartran, Wojciech Stokowiec, et al.
Transactions of the Association for Computational Linguistics (2020) Vol. 8, pp. 346-360
Open Access | Times Cited: 45

Using Similarity Measures to Select Pretraining Data for NER
Xiang Dai, Sarvnaz Karimi, Ben Hachey, et al.
(2019)
Closed Access | Times Cited: 43

The classification of EEG-based winking signals: a transfer learning and random forest pipeline
Jothi Letchumy Mahendra Kumar, Mamunur Rashid, Rabiu Muazu Musa, et al.
PeerJ (2021) Vol. 9, pp. e11182-e11182
Open Access | Times Cited: 39

A Comparative Study of Using Pre-trained Language Models for Toxic Comment Classification
Zhixue Zhao, Ziqi Zhang, Frank Hopfgartner
Companion Proceedings of the Web Conference 2021 (2021), pp. 500-507
Open Access | Times Cited: 36

The classification of EEG-based wink signals: A CWT-Transfer Learning pipeline
Jothi Letchumy Mahendra Kumar, Mamunur Rashid, Rabiu Muazu Musa, et al.
ICT Express (2021) Vol. 7, Iss. 4, pp. 421-425
Open Access | Times Cited: 34

Automated Source Code Generation and Auto-Completion Using Deep Learning: Comparing and Discussing Current Language Model-Related Approaches
Juan Cruz-Benito, Sanjay Vishwakarma, Francisco Martín-Fernández, et al.
AI (2021) Vol. 2, Iss. 1, pp. 1-16
Open Access | Times Cited: 32

XLM-T: A Multilingual Language Model Toolkit for Twitter
Francesco Barbieri, Luis Espinosa-Anke, José Camacho-Collados
arXiv (Cornell University) (2021)
Closed Access | Times Cited: 32
