Acta Scientiarum Naturalium Universitatis Pekinensis, 2022, Vol. 58, Issue (1): 69-76. DOI: 10.13209/j.0479-8023.2021.102


A Context-Fusion Method for Entity Extraction Based on Residual Gated Convolution Neural Network

SU Fenglong, SUN Chengzhe, JING Ning   

  1. School of Electronic Science, National University of Defense Technology, Changsha 410073, China
  • Received: 2021-06-12  Revised: 2021-08-15  Online: 2022-01-20  Published: 2022-01-20
  • Contact: SU Fenglong, E-mail: xueshu2021(at)qq.com


Abstract:

In entity extraction methods built on conventional convolutional architectures, the size of the convolutional receptive field limits how much context each word can draw on, so the semantics of entity words within the whole sentence are under-considered and recognition suffers. To address this problem, the Residual Gated Convolution Neural Network (RGCNN) uses dilated convolutions and gated linear units with residual connections to model the semantic associations between words across multiple temporal scales simultaneously. The gating units adjust the amount of information flowing to the next layer of neurons, which alleviates vanishing gradients during cross-layer propagation, and an attention mechanism in the final layer captures the relevant semantics between words. Experiments on public named entity recognition datasets and domain-specific datasets show that, compared with traditional entity extraction frameworks, RGCNN is strongly competitive in both speed and accuracy, demonstrating the superiority and strong robustness of the algorithm.
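The core building block the abstract describes, a dilated convolution whose output is mixed with its input through a learned sigmoid gate, can be sketched as follows. This is a minimal illustrative PyTorch sketch assuming common GLU-style residual gating; the class name, channel width, kernel size, and dilation schedule are hypothetical choices for illustration, not the authors' released implementation, and the final attention layer mentioned in the abstract is omitted.

```python
import torch
import torch.nn as nn

class ResidualGatedConv1d(nn.Module):
    """One residual gated dilated convolution block (GLU-style gating)."""

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        padding = (kernel_size - 1) // 2 * dilation  # preserve sequence length
        # A single convolution emits both candidate features and gate logits.
        self.conv = nn.Conv1d(channels, 2 * channels, kernel_size,
                              padding=padding, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len)
        h, g = self.conv(x).chunk(2, dim=1)
        gate = torch.sigmoid(g)
        # The gate controls how much new information flows to the next
        # layer; the residual mix keeps a direct gradient path across
        # layers, which is what eases vanishing gradients.
        return x * (1.0 - gate) + h * gate

# Stacking blocks with increasing dilation widens the receptive field,
# letting each word relate to progressively more distant context.
encoder = nn.Sequential(*[ResidualGatedConv1d(128, dilation=d) for d in (1, 2, 4)])
tokens = torch.randn(2, 128, 50)   # (batch, embedding dim, sentence length)
features = encoder(tokens)         # same shape: (2, 128, 50)
```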

Key words: entity extraction, residual gated convolution, vanishing gradient, attention mechanism
