[13] Huang FL, Feng S, Wang D, Yu G. Mining topic sentiment in microblogging based on multi-feature fusion. Chinese Journal of
Computers, 2017,40(4):872−888 (in Chinese with English abstract). [doi: 10.11897/SP.J.1016.2017.00872]
[14] Huang FL, Yu G, Zhang JL, Li CX, Yuan CA, Lu JL. Mining topic sentiment in micro-blogging based on micro-blogger social
relation. Ruan Jian Xue Bao/Journal of Software, 2017,28(3):694−707 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/5157.htm [doi: 10.13328/j.cnki.jos.005157]
[15] Vo DT, Zhang Y. Don’t count, predict! An automatic approach to learning sentiment lexicons for short text. In: Proc. of the 54th
Annual Meeting of the Association for Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2016.
219−224. [doi: 10.18653/v1/P16-2036]
[16] Chen Y, Skiena S. Building sentiment lexicons for all major languages. In: Proc. of the 52nd Annual Meeting of the Association for
Computational Linguistics. Stroudsburg: Association for Computational Linguistics, 2014. 383−389. [doi: 10.3115/v1/P14-2063]
[17] Teng Z, Vo DT, Zhang Y. Context-sensitive lexicon features for neural sentiment analysis. In: Proc. of the 2016 Conf. on Empirical
Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2016. 1629−1638. [doi: 10.18653/v1/D16-1169]
[18] Tai KS, Socher R, Manning CD. Improved semantic representations from tree-structured long short-term memory networks. In:
Proc. of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th Int’l Joint Conf. on Natural
Language Processing. Stroudsburg: Association for Computational Linguistics, 2015. 1556−1566. [doi: 10.3115/v1/P15-1150]
[19] Zhang B, Xu X, Li X, Chen X, Ye Y, Wang Z. Sentiment analysis through critic learning for optimizing convolutional neural
networks with rules. Neurocomputing, 2019,356:21−30. [doi: 10.1016/j.neucom.2019.04.038]
[20] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
[21] Ma D, Li S, Zhang X, Wang H. Interactive attention networks for aspect-level sentiment classification. In: Proc. of the 26th Int’l
Joint Conf. on Artificial Intelligence. AAAI, 2017. 4068−4074.
[22] Wang Y, Huang ML, Zhao L, Zhu XY. Attention-based LSTM for aspect-level sentiment classification. In: Proc. of the 2016 Conf.
on Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2016. 606−615.
[doi: 10.18653/v1/D16-1058]
[23] Liu Q, Zhang H, Zeng Y, Huang Z, Wu Z. Content attention model for aspect based sentiment analysis. In: Proc. of the 2018 Int’l
Conf. on World Wide Web. Steering Committee, 2018. 1023−1032. [doi: 10.1145/3178876.3186001]
[24] Liang B, Liu Q, Xu J, Zhou Q, Zhang P. Aspect-based sentiment analysis based on multi-attention CNN. Journal of Computer
Research and Development, 2017,54(8):1724−1735 (in Chinese with English abstract). [doi: 10.7544/issn1000-1239.2017.20170178]
[25] Guan PF, Li B, Lv XQ, Zhou JS. Attention enhanced bi-directional LSTM for sentiment analysis. Journal of Chinese Information
Processing, 2019,33(2):105−111 (in Chinese with English abstract). http://jcip.cipsc.org.cn/CN/Y2019/V33/I2/105 [CNKI:SUN:MESS.0.2019-02-017]
[26] Zhou X, Wan X, Xiao J. Attention-based LSTM network for cross-lingual sentiment classification. In: Proc. of the 2016 Conf. on
Empirical Methods in Natural Language Processing. Stroudsburg: Association for Computational Linguistics, 2016. 247−256. [doi:
10.18653/v1/D16-1024]
[27] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I. Attention is all you need. In: Proc. of
the Advances in Neural Information Processing Systems. 2017. 5998−6008.
[28] Lin Z, Feng M, Santos CND, Yu M, Xiang B, Zhou B, Bengio Y. A structured self-attentive sentence embedding. arXiv preprint
arXiv:1703.03130, 2017.
[29] Wang Y, Sun A, Han J, Liu Y, Zhu X. Sentiment analysis by capsules. In: Proc. of the 2018 Int’l Conf. on World Wide Web.
Steering Committee, 2018. 1165−1174. [doi: 10.1145/3178876.3186015]
[30] Schuster M, Paliwal KK. Bidirectional recurrent neural networks. IEEE Trans. on Signal Processing, 1997,45(11):2673−2681.
[31] Ba JL, Kiros JR, Hinton GE. Layer normalization. arXiv preprint arXiv:1607.06450, 2016.
[32] Le Q, Mikolov T. Distributed representations of sentences and documents. In: Proc. of the Int’l Conf. on Machine Learning. JMLR Workshop and Conference Proceedings, 2014. 1188−1196.