
OpenAlex is an open-access bibliographic catalogue of scientific papers, authors, and institutions, named after the Library of Alexandria. Its citation coverage is excellent, and I hope you will find this listing of citing articles useful!
If you click an article title, you'll navigate to the article as listed in CrossRef. If you click an Open Access link, you'll navigate to the "best Open Access location". Clicking a citation count will open this same listing for that article. Lastly, at the bottom of the page you'll find basic pagination options.
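If you would rather pull the same data programmatically, a listing like this one can be reproduced (approximately) with the OpenAlex API's "cites:" filter. The sketch below is an illustration only: the work ID is a hypothetical placeholder rather than the real OpenAlex ID of the requested article, and the exact ordering on this page may differ from what the API returns by default.

import requests

# Hypothetical placeholder; look up the real OpenAlex ID first, e.g. by querying
# https://api.openalex.org/works?search=Logit Standardization in Knowledge Distillation
WORK_ID = "W0000000000"

# Fetch the first page of works citing it, 25 per page, mirroring this listing's pagination.
resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": f"cites:{WORK_ID}",
        "per-page": 25,
        "page": 1,
    },
    timeout=30,
)
resp.raise_for_status()

# Print title, year, citation count, and the best Open Access location (if any) for each result.
for work in resp.json()["results"]:
    oa = work.get("best_oa_location") or {}
    print(
        work["display_name"],
        work.get("publication_year"),
        f"Times Cited: {work.get('cited_by_count', 0)}",
        oa.get("landing_page_url") or "Closed Access",
        sep=" | ",
    )

Increasing the "page" parameter steps through the remaining results, just like the pagination options at the bottom of this page.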
Requested Article:
Logit Standardization in Knowledge Distillation
Shangquan Sun, Wenqi Ren, Jingzhi Li, et al.
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2024), pp. 15731-15740
Closed Access | Times Cited: 35
Showing 1-25 of 35 citing articles:
Student-friendly knowledge distillation
Mengyang Yuan, Bo Lang, Fengnan Quan
Knowledge-Based Systems (2024) Vol. 296, pp. 111915-111915
Open Access | Times Cited: 9
Adaptive lightweight network construction method for self-knowledge distillation
Siyuan Lu, Weiliang Zeng, Xueshi Li, et al.
Neurocomputing (2025), pp. 129477-129477
Closed Access | Times Cited: 1
Boundary-sensitive Adaptive Decoupled Knowledge Distillation For Acne Grading
Xinyang Zhou, Wenjie Liu, Lei Zhang, et al.
Applied Intelligence (2025) Vol. 55, Iss. 6
Closed Access
Applications of knowledge distillation in remote sensing: A survey
Yassine Himeur, Nour Aburaed, Omar Elharrouss, et al.
Information Fusion (2024), pp. 102742-102742
Closed Access | Times Cited: 4
A Feature Map Fusion Self-Distillation Scheme for Image Classification Networks
Zhenkai Qin, Shuiping Ni, Mingfu Zhu, et al.
Electronics (2025) Vol. 14, Iss. 1, pp. 182-182
Open Access
RMKD: Relaxed Matching Knowledge Distillation for Short-Length SSVEP-Based Brain-Computer Interfaces
Zhen Lan, Zixing Li, Chao Yan, et al.
Neural Networks (2025) Vol. 185, pp. 107133-107133
Closed Access
Development of a Lightweight Model for Rice Plant Counting and Localization Using UAV-Captured RGB Imagery
Haoran Sun, Siqiao Tan, Zhengliang Luo, et al.
Agriculture (2025) Vol. 15, Iss. 2, pp. 122-122
Open Access
Consistency knowledge distillation based on similarity attribute graph guidance
Jiaqi Ma, Jinfu Yang, Fuji Fu, et al.
Expert Systems with Applications (2025), pp. 126395-126395
Closed Access
Quality Grading of Oudemansiella raphanipes Using Three-Teacher Knowledge Distillation with Cascaded Structure for LightWeight Neural Networks
Haoxuan Chen, Huamao Huang, Yangyang Peng, et al.
Agriculture (2025) Vol. 15, Iss. 3, pp. 301-301
Open Access
Unambiguous granularity distillation for asymmetric image retrieval
Hongrui Zhang, Yi Xie, Haoquan Zhang, et al.
Neural Networks (2025), pp. 107303-107303
Closed Access
Personalized federated learning via decoupling self-knowledge distillation and global adaptive aggregation
Zhiwei Tang, Shuguang Xu, Haozhe Jin, et al.
Multimedia Systems (2025) Vol. 31, Iss. 2
Closed Access
FedDyH: A Multi-Policy with GA Optimization Framework for Dynamic Heterogeneous Federated Learning
Xuhua Zhao, Yongming Zheng, Jinjin Wan, et al.
Biomimetics (2025) Vol. 10, Iss. 3, pp. 185-185
Open Access
An accurate and efficient self-distillation method with channel-based feature enhancement via feature calibration and attention fusion for Internet of Things
Qian Zheng, Shengbo Chen, Guanghui Wang, et al.
Future Generation Computer Systems (2025), pp. 107816-107816
Closed Access
Object detection with dynamic high-/low-frequency knowledge distillation for real-world degradation
Junyi Zhao, Jinbao Li, Xinjie Chen, et al.
Alexandria Engineering Journal (2025) Vol. 124, pp. 110-120
Closed Access
A contrast enhanced representation normalization approach to knowledge distillation
Zhiqiang Bao, Di Zhu, Liang Du, et al.
Scientific Reports (2025) Vol. 15, Iss. 1
Open Access
Cross-domain visual prompting with spatial proximity knowledge distillation for histological image classification
Xiaohong Li, Guoheng Huang, Lianglun Cheng, et al.
Journal of Biomedical Informatics (2024) Vol. 158, pp. 104728-104728
Closed Access | Times Cited: 3
Knowledge in attention assistant for improving generalization in deep teacher–student models
Sajedeh Morabbi, Hadi Soltanizadeh, Saeed Mozaffari, et al.
International Journal of Modelling and Simulation (2024), pp. 1-17
Closed Access | Times Cited: 1
A Standardized-Based Knowledge Distillation Model for Multimodal Emotion Recognition in Conversation
Yihao Li, Turdi Tohti, Han Dongfang, et al.
(2024)
Closed Access
Few-Shot Learning Based on Dimensionally Enhanced Attention and Logit Standardization Self-Distillation
Y. Tang, Guang Li, Ming Zhang, et al.
Electronics (2024) Vol. 13, Iss. 15, pp. 2928-2928
Open Access
Provably Convergent Learned Inexact Descent Algorithm for Low-Dose CT Reconstruction
Qingchao Zhang, Mehrdad Alvandipour, Wenjun Xia, et al.
Journal of Scientific Computing (2024) Vol. 101, Iss. 1
Open Access
Research on the Great Multi-model Pyramid Training Framework and Enhanced Loss Function for Fine-grained Classification
Pengze Guo
(2024), pp. 414-422
Closed Access
Instance-Level Scaling and Dynamic Margin-Alignment Knowledge Distillation for Remote Sensing Image Scene Classification
Chuan Li, Xiao Teng, Yan Ding, et al.
Remote Sensing (2024) Vol. 16, Iss. 20, pp. 3853-3853
Open Access
Why does knowledge distillation work? Rethink its attention and fidelity mechanism
Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, et al.
Expert Systems with Applications (2024), pp. 125579-125579
Closed Access
Harmonizing Knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Z. Yan, Chaomin Shen, et al.
Lecture notes in computer science (2024), pp. 58-74
Closed Access
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation
Chaomin Shen, Yaomin Huang, Hao-ting Zhu, et al.
(2024), pp. 4543-4552
Closed Access