Acta Scientiarum Naturalium Universitatis Pekinensis, 2019, Vol. 55, Issue (1): 113-119. DOI: 10.13209/j.0479-8023.2018.065


N3LDG: A Lightweight Neural Network Library for Natural Language Processing

WANG Qiansheng, YU Nan, ZHANG Meishan, HAN Zijia, FU Guohong   

  1. School of Computer Science and Technology, Heilongjiang University, Harbin 150080
  • Received:2018-04-21 Revised:2018-08-15 Online:2019-01-20 Published:2019-01-20
  • Contact: FU Guohong, E-mail: ghfu@hotmail.com
  • Funding: National Natural Science Foundation of China (61672211, 61602160) and Natural Science Foundation of Heilongjiang Province (F2016036)

Abstract:

The authors propose N3LDG, a lightweight neural network library for natural language processing. N3LDG supports constructing computation graphs dynamically and automatically organizing their execution into batches. Experiments show that N3LDG constructs and executes computation graphs efficiently when training CNN, Bi-LSTM, and Tree-LSTM models. When training these models on a CPU, N3LDG is faster than PyTorch; when training CNN and Tree-LSTM on a GPU, N3LDG is also faster than PyTorch.

Key words: deep learning library, NLP, lightweight, CUDA
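
The abstract's two central ideas, dynamic construction of computation graphs and automatic batching of their execution, can be illustrated with a short sketch. The C++ below is not the N3LDG API: the Node and Graph types, the add and execute functions, and the operation names are hypothetical, and real CPU/GPU kernels are replaced by print statements. It only shows how nodes appended at run time can be grouped by operation type and executed batch by batch.

// Illustrative sketch only; NOT the actual N3LDG API. It demonstrates
// (1) building a computation graph dynamically and (2) grouping ready
// nodes of the same operation type into batches before executing them.
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// A node records which operation produced it and which nodes it depends on.
struct Node {
    std::string op;                // operation type, e.g. "lookup", "tanh"
    std::vector<Node*> inputs;     // dependencies in the graph
    bool executed = false;
};

// The graph is built dynamically: nodes are appended as the model code runs,
// and nothing is computed until execute() is called.
struct Graph {
    std::vector<std::unique_ptr<Node>> nodes;

    Node* add(const std::string& op, std::vector<Node*> inputs = {}) {
        nodes.push_back(std::make_unique<Node>());
        nodes.back()->op = op;
        nodes.back()->inputs = std::move(inputs);
        return nodes.back().get();
    }

    // Execute in waves; within each wave, ready nodes that share an
    // operation type are grouped so they could run as one batched kernel.
    void execute() {
        while (true) {
            std::map<std::string, std::vector<Node*>> batches;
            for (auto& n : nodes) {
                if (n->executed) continue;
                bool ready = true;
                for (Node* in : n->inputs) ready = ready && in->executed;
                if (ready) batches[n->op].push_back(n.get());
            }
            if (batches.empty()) break;
            for (auto& [op, group] : batches) {
                // A real library would launch one batched CPU/GPU kernel here.
                std::cout << "batched " << op << " x" << group.size() << "\n";
                for (Node* n : group) n->executed = true;
            }
        }
    }
};

int main() {
    Graph g;
    // Two sentences of different lengths share one graph; their per-position
    // lookup and tanh nodes at the same depth end up in the same batches.
    for (int len : {3, 5}) {
        Node* prev = nullptr;
        for (int i = 0; i < len; ++i) {
            Node* e = g.add("lookup");
            prev = prev ? g.add("tanh", {e, prev}) : g.add("tanh", {e});
        }
    }
    g.execute();
}

Running this prints one batched lookup over all positions of both sentences, followed by batched tanh steps whose batch size shrinks as the shorter sentence finishes, which is the kind of automatic batching the abstract describes at a conceptual level.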
