[1] Huang Jiannian, Hou Hanqing. Research on sentence segmentation and punctuation patterns for ancient agricultural texts. Journal of Chinese Information Processing, 2008, 22(4): 31-38 (in Chinese)
[2] Chen Tianying, Chen Rong, Pan Lulu, et al. Ancient Chinese sentence segmentation based on contextual n-gram models. Computer Engineering, 2007, 33(3): 192-193, 196 (in Chinese)
[3] Lafferty J D, McCallum A, Pereira F C N. Conditional random fields: probabilistic models for segmenting and labeling sequence data // Proceedings of the 18th International Conference on Machine Learning. Williamstown, 2001: 282-289
[4] Zhang He, Wang Xiaodong, Yang Jianyu, et al. A cascaded CRF-based method for sentence segmentation and punctuation tagging of ancient Chinese texts. Application Research of Computers, 2009, 26(9): 3326-3329 (in Chinese)
[5] Zhang Kaixu, Xia Yunqing, Yu Hang. Automatic sentence segmentation and punctuation of ancient Chinese based on conditional random fields. Journal of Tsinghua University (Science and Technology), 2009, 49(10): 1733-1736 (in Chinese)
[6] Huang H H, Sun C T, Chen H H. Classical Chinese sentence segmentation // Proceedings of CIPS-SIGHAN Joint Conference on Chinese Language Processing. Beijing, 2010: 1-8
[7] Wang B, Shi X, Tan Z, et al. A sentence segmentation method for ancient Chinese texts based on NNLM // Proceedings of CLSW. Singapore, 2016: 387-396
[8] Graves A. Supervised sequence labelling with recurrent neural networks. Studies in Computational Intelligence. Heidelberg: Springer, 2012: 22-29
[9] Schuster M, Paliwal K K. Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 1997, 45(11): 2673-2681
[10] Cho K, Merrienboer B V, Gulcehre C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation [EB/OL]. (2014-09-03) [2016-07-29]
[11] Tieleman T, Hinton G. RMSProp: divide the gradient by a running average of its recent magnitude: report of COURSERA "Neural Networks for Machine Learning" [R]. Toronto, 2012
[12] Salimans T, Kingma D P. Weight normalization: a simple reparameterization to accelerate training of deep neural networks [EB/OL]. (2016-06-04) [2016-07-29]
[13] Yao Y, Rosasco L, Caponnetto A. On early stopping in gradient descent learning. Constructive Approximation, 2007, 26(2): 289-315
[14] Hinton G E, Srivastava N, Krizhevsky A, et al. Improving neural networks by preventing co-adaptation of feature detectors [EB/OL]. (2012-07-03) [2016-07-29]
[15] Pham V, Kermorvant C, Louradour J. Dropout improves recurrent neural networks for handwriting recognition [EB/OL]. (2014-03-10) [2016-07-29]
[16] Moon T, Choi H, Lee H, et al. RNNDROP: a novel dropout for RNNs in ASR // IEEE Automatic Speech Recognition and Understanding Workshop. Scottsdale, 2015: 65-70
[17] Semeniuta S, Severyn A, Barth E. Recurrent dropout without memory loss [EB/OL]. (2016-03-16) [2016-07-29]
[18] Huang Z, Xu W, Yu K. Bidirectional LSTM-CRF models for sequence tagging [EB/OL]. (2015-08-09) [2016-07-29]
[19] Chen X, Qiu X, Zhu C, et al. Long short-term memory neural networks for Chinese word segmentation // Proceedings of EMNLP. Lisbon, 2015: 1197-1206