Acta Scientiarum Naturalium Universitatis Pekinensis, 2022, Vol. 58, Issue (1): 29-36. DOI: 10.13209/j.0479-8023.2021.109
WANG Qian, LI Maoxi†, WU Shuixiu, WANG Mingwen
Key words: cross-lingual pre-training language model, neural machine translation, Transformer neural network, XLM-R model, fine-tuning
WANG Qian, LI Maoxi, WU Shuixiu, WANG Mingwen. Neural Machine Translation Based on XLM-R Cross-lingual Pre-training Language Model[J]. Acta Scientiarum Naturalium Universitatis Pekinensis, 2022, 58(1): 29-36.
URL: https://xbna.pku.edu.cn/EN/10.13209/j.0479-8023.2021.109
https://xbna.pku.edu.cn/EN/Y2022/V58/I1/29