Acta Scientiarum Naturalium Universitatis Pekinensis, 2020, Vol. 56, Issue (1): 53-60. DOI: 10.13209/j.0479-8023.2019.104


A Multi-Mechanism Fused Paraphrase Generation Model with Joint Auto-Encoding Learning

LIU Mingtong1, ZHANG Yujie1,†, ZHANG Shu2, MENG Yao3, XU Jin’an1, CHEN Yufeng1   

  1. School of Computer and Information Technology, Beijing Jiaotong University, Beijing 100044; 2. Fujitsu Research and Development Center, Beijing 100027; 3. Lenovo Research, AI Lab, Beijing 100085
  • Received: 2019-05-23; Revised: 2019-09-20; Online: 2020-01-20; Published: 2020-01-20
  • Contact: ZHANG Yujie, E-mail: yjzhang(at)bjtu.edu.cn

  • Supported by the National Natural Science Foundation of China (61876198, 61976015, 61370130, 61473294), the Fundamental Research Funds for the Central Universities (2018YJS025), the Beijing Natural Science Foundation (4172047), and the International Science and Technology Cooperation Program of the Ministry of Science and Technology (K11F100010)

Abstract:

The neural encoder-decoder framework has become the dominant approach to paraphrase generation, but two problems remain. First, the generated paraphrases suffer from inaccurate entity words, unknown words, and repeated words. To address this, we propose a multi-mechanism fused paraphrase generation model that improves the decoder by fusing the attention, copy, and coverage mechanisms: the copy mechanism copies words from the input sentence to improve the generation of entity words and unknown words, while the coverage mechanism models the history of attention decisions to avoid word repetition. Second, the limited scale of parallel paraphrase corpora restricts the semantic learning ability of the encoder. Within a multi-task learning framework, we propose to jointly learn an auto-encoding task that shares one encoder with the paraphrase generation task, so that both the parallel paraphrase corpus and the raw input sentences are exploited to strengthen the encoder. Experimental results on the Quora paraphrase dataset show that the proposed multi-mechanism fused model with joint auto-encoding effectively alleviates these problems and improves the quality of the generated paraphrases.
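For readers unfamiliar with the mechanisms named above, the following is a minimal sketch in the standard pointer-generator and coverage notation of See et al. (2017); it is an assumption that this paper's parameterization follows that convention, and the symbols below (p_gen for the generation probability, a^t for the attention distribution at step t, c^t for the coverage vector, x_i for source tokens) are illustrative rather than taken from the paper. At decoding step t, the output distribution mixes vocabulary generation with copying from the source, and the coverage vector accumulates past attention so that re-attending to already-covered positions is penalized:

P(w) = p_{\mathrm{gen}}\, P_{\mathrm{vocab}}(w) + (1 - p_{\mathrm{gen}}) \sum_{i:\, x_i = w} a_i^t

c^t = \sum_{t'=0}^{t-1} a^{t'}, \qquad \mathcal{L}_{\mathrm{cov}}^t = \sum_i \min(a_i^t, c_i^t)

The joint auto-encoding setup can likewise be sketched as one encoder feeding two decoders. The following is a hypothetical PyTorch illustration of that sharing pattern, not the paper's implementation: the GRU choice, the layer sizes, and the loss weight lam are all assumptions.

import torch
import torch.nn as nn

class JointParaphraseAutoEncoder(nn.Module):
    # Hypothetical sketch: a single encoder shared by the paraphrase-generation
    # decoder and the auto-encoding decoder, as the abstract describes.
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)   # shared by both tasks
        self.para_decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.ae_decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.project = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, para_in, ae_in):
        _, h = self.encoder(self.embed(src))                 # shared sentence encoding
        para_out, _ = self.para_decoder(self.embed(para_in), h)
        ae_out, _ = self.ae_decoder(self.embed(ae_in), h)
        return self.project(para_out), self.project(ae_out)

# Joint objective (the weight lam is an assumption): cross-entropy against the
# paraphrase reference plus cross-entropy for reconstructing the input sentence,
# so gradients from both tasks update the shared encoder.
#   loss = ce(para_logits, para_gold) + lam * ce(ae_logits, src_tokens)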

Key words: paraphrase generation, auto-encoding, multi-task learning, multi-mechanism fusion, attention mechanism, copy mechanism, coverage mechanism
