Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection. EMNLP (Findings) 2022.
Authors: Chenglong Wang, Yi Lu, Yongyu Mu, Yimin Hu, Tong Xiao, Jingbo Zhu.
Translated work: No
