
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
Clicking an article title takes you to that article as listed in CrossRef. Clicking an Open Access link takes you to the article's "best Open Access location". Clicking a citation count opens this same listing for that article. Basic pagination options appear at the bottom of the page.
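If you'd rather pull a listing like this programmatically, the public OpenAlex API exposes the same data. Below is a minimal sketch (no API key required): it builds the query that returns works citing a given article, sorted by citation count, 25 per page, and computes how many pages the pagination controls need. The work ID shown is a placeholder — look up the real OpenAlex ID for the requested article first.

```python
# Sketch: fetching a "citing articles" listing via the OpenAlex API.
# The work ID passed in below is a PLACEHOLDER, not the real ID for
# the requested article.
from urllib.parse import urlencode

OPENALEX_WORKS = "https://api.openalex.org/works"

def citing_articles_url(work_id: str, page: int = 1, per_page: int = 25) -> str:
    """Build the query URL listing works that cite `work_id`."""
    params = {
        "filter": f"cites:{work_id}",      # works whose references include work_id
        "sort": "cited_by_count:desc",     # most-cited first, as on this page
        "page": page,
        "per-page": per_page,
    }
    return f"{OPENALEX_WORKS}?{urlencode(params)}"

def total_pages(total_results: int, per_page: int = 25) -> int:
    """Number of pages needed for the pagination controls."""
    return -(-total_results // per_page)   # ceiling division

# "Showing 1-25 of 151 citing articles" implies 7 pages of results:
print(total_pages(151))                     # -> 7
print(citing_articles_url("W0000000000"))   # placeholder work ID
```

Fetching each URL (e.g. with `urllib.request` or `requests`) returns JSON whose `results` array carries the title, authorships, venue, and `cited_by_count` fields shown in the listing below.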
Requested Article:
Parameter-Efficient Transfer Learning with Diff Pruning
Demi Guo, Alexander M. Rush, Yoon Kim
(2021)
Open Access | Times Cited: 151
Showing 1-25 of 151 citing articles:
Visual Prompt Tuning
Menglin Jia, Luming Tang, Bor-Chun Chen, et al.
Lecture Notes in Computer Science (2022), pp. 709-727
Closed Access | Times Cited: 576
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
Bonan Min, Hayley Ross, Elior Sulem, et al.
ACM Computing Surveys (2023) Vol. 56, Iss. 2, pp. 1-40
Open Access | Times Cited: 558
AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, et al.
(2021)
Open Access | Times Cited: 387
BitFit: Simple Parameter-efficient Fine-tuning for Transformer-based Masked Language-models
Elad Ben Zaken, Yoav Goldberg, Shauli Ravfogel
(2022)
Open Access | Times Cited: 382
Parameter-efficient fine-tuning of large-scale pre-trained language models
Ning Ding, Yujia Qin, Guang Yang, et al.
Nature Machine Intelligence (2023) Vol. 5, Iss. 3, pp. 220-235
Open Access | Times Cited: 311
Towards a Unified View of Parameter-Efficient Transfer Learning
Junxian He, Chunting Zhou, Xuezhe Ma, et al.
arXiv (Cornell University) (2021)
Open Access | Times Cited: 250
VL-ADAPTER: Parameter-Efficient Transfer Learning for Vision-and-Language Tasks
Yi-Lin Sung, Jaemin Cho, Mohit Bansal
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2022), pp. 5217-5227
Open Access | Times Cited: 172
Frozen CLIP Models are Efficient Video Learners
Ziyi Lin, Shijie Geng, Renrui Zhang, et al.
Lecture Notes in Computer Science (2022), pp. 388-404
Closed Access | Times Cited: 100
Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models
Ning Ding, Yujia Qin, Guang Yang, et al.
Research Square (Research Square) (2022)
Open Access | Times Cited: 88
Bias and Fairness in Large Language Models: A Survey
Isabel O. Gallegos, Ryan A. Rossi, Joe Barrow, et al.
Computational Linguistics (2024) Vol. 50, Iss. 3, pp. 1097-1179
Open Access | Times Cited: 86
On the Effectiveness of Parameter-Efficient Fine-Tuning
Zihao Fu, Haoran Yang, Anthony Man-Cho So, et al.
Proceedings of the AAAI Conference on Artificial Intelligence (2023) Vol. 37, Iss. 11, pp. 12799-12807
Open Access | Times Cited: 50
Efficient Methods for Natural Language Processing: A Survey
Marcos Treviso, Ji-Ung Lee, Tianchu Ji, et al.
Transactions of the Association for Computational Linguistics (2023) Vol. 11, pp. 826-860
Open Access | Times Cited: 47
3DSAM-adapter: Holistic adaptation of SAM from 2D to 3D for promptable tumor segmentation
Shizhan Gong, 遠藤 忠, Wenao Ma, et al.
Medical Image Analysis (2024) Vol. 98, pp. 103324-103324
Closed Access | Times Cited: 22
Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning
Runxin Xu, Fuli Luo, Zhiyuan Zhang, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 102
Block Pruning For Faster Transformers
François Lagunas, Ella Charlaix, Victor Sanh, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
Open Access | Times Cited: 79
UniPELT: A Unified Framework for Parameter-Efficient Language Model Tuning
Yuning Mao, Lambert Mathias, Rui Hou, et al.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (2022)
Open Access | Times Cited: 46
Composable Sparse Fine-Tuning for Cross-Lingual Transfer
Alan Ansell, Edoardo Maria Ponti, Anna Korhonen, et al.
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) (2022), pp. 1778-1796
Open Access | Times Cited: 44
Vision Transformers are Parameter-Efficient Audio-Visual Learners
Yan-Bo Lin, Yi-Lin Sung, Jie Lei, et al.
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023)
Open Access | Times Cited: 38
Diversity-Aware Meta Visual Prompting
Qidong Huang, Xiaoyi Dong, Dongdong Chen, et al.
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2023)
Open Access | Times Cited: 26
Bridging Vision and Language Encoders: Parameter-Efficient Tuning for Referring Image Segmentation
Zunnan Xu, Zhihong Chen, Yong Zhang, et al.
2021 IEEE/CVF International Conference on Computer Vision (ICCV) (2023), pp. 17457-17466
Open Access | Times Cited: 25
A Survey on Model Compression and Acceleration for Pretrained Language Models
Canwen Xu, Julian McAuley
Proceedings of the AAAI Conference on Artificial Intelligence (2023) Vol. 37, Iss. 9, pp. 10566-10575
Open Access | Times Cited: 22
Revisiting Parameter-Efficient Tuning: Are We Really There Yet?
Guanzheng Chen, Fangyu Liu, Zaiqiao Meng, et al.
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2022)
Open Access | Times Cited: 31
CroApp: A CNN-Based Resource Optimization Approach in Edge Computing Environment
Yongzhe Jia, Bowen Liu, Wanchun Dou, et al.
IEEE Transactions on Industrial Informatics (2022) Vol. 18, Iss. 9, pp. 6300-6307
Closed Access | Times Cited: 30
AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
Chin-Lun Fu, Zih-Ching Chen, Yun-Ru Lee, et al.
Findings of the Association for Computational Linguistics: NAACL 2022 (2022)
Open Access | Times Cited: 28