[38] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I. Attention is all you need. In: Proc. of the
31st Int’l Conf. on Neural Information Processing Systems. Long Beach: Curran Associates Inc., 2017. 6000–6010.
[39] He XN, Deng K, Wang X, Li Y, Zhang YD, Wang M. LightGCN: Simplifying and powering graph convolution network for
recommendation. In: Proc. of the 43rd Int’l ACM SIGIR Conf. on Research and Development in Information Retrieval. ACM, 2020.
639–648. [doi: 10.1145/3397271.3401063]
[40] Ma QW, Yuan CY, Zhou W, Hu SL. Label-specific dual graph neural network for multi-label text classification. In: Proc. of the 59th
Annual Meeting of the Association for Computational Linguistics and the 11th Int’l Joint Conf. on Natural Language Processing. ACL,
2021. 3855–3864. [doi: 10.18653/v1/2021.acl-long.298]
[41] Ye J, He JJ, Peng XJ, Wu WH, Qiao Y. Attention-driven dynamic graph convolutional network for multi-label image recognition. In:
Proc. of the 16th European Conf. on Computer Vision. Glasgow: Springer, 2020. 649–665. [doi: 10.1007/978-3-030-58589-1_39]
[42] Katakis I, Tsoumakas G, Vlahavas I. Multilabel text classification for automated tag suggestion. In: Proc. of the 2008 ECML/PKDD
Discovery Challenge. Antwerp: ECML, 2008. 1–9.
[43] Lewis DD, Yang YM, Rose TG, Li F. RCV1: A new benchmark collection for text categorization research. The Journal of Machine
Learning Research, 2004, 5: 361–397.
[44] Yang PC, Sun X, Li W, Ma SM, Wu W, Wang HF. SGM: Sequence generation model for multi-label classification. In: Proc. of the 27th
Int’l Conf. on Computational Linguistics. Santa Fe: ACL, 2018. 3915–3926.
[45] Zhang ML, Zhou ZH. A review on multi-label learning algorithms. IEEE Trans. on Knowledge and Data Engineering, 2014, 26(8):
1819–1837. [doi: 10.1109/TKDE.2013.39]
[46] Yang PC, Luo FL, Ma SM, Lin JY, Sun X. A deep reinforced sequence-to-set model for multi-label classification. In: Proc. of the 57th
Annual Meeting of the Association for Computational Linguistics. Florence: ACL, 2019. 5252–5258. [doi: 10.18653/v1/P19-1518]
[47] Boutell MR, Luo JB, Shen XP, Brown CM. Learning multi-label scene classification. Pattern Recognition, 2004, 37(9): 1757–1771. [doi:
10.1016/j.patcog.2004.03.009]
[48] Chen GB, Ye DH, Xing ZC, Chen JS, Cambria E. Ensemble application of convolutional and recurrent neural networks for multi-label
text categorization. In: Proc. of the 2017 Int’l Joint Conf. on Neural Networks (IJCNN). Anchorage: IEEE, 2017. 2377–2383. [doi: 10.1109/IJCNN.2017.7966144]
[49] Wang R, Ridley R, Su XA, Qu WG, Dai XY. A novel reasoning mechanism for multi-label text classification. Information Processing &
Management, 2021, 58(2): 102441. [doi: 10.1016/j.ipm.2020.102441]
[50] Zhang XM, Zhang QW, Yan Z, Liu RF, Cao YB. Enhancing label correlation feedback in multi-label text classification via multi-task
learning. In: Proc. of the 2021 Findings of the Association for Computational Linguistics (ACL-IJCNLP 2021). ACL, 2021. 1190–1200.
[doi: 10.18653/v1/2021.findings-acl.101]
[51] Yan YY, Liu F, Zhuang XQ, Ju J. An R-Transformer_BiLSTM model based on attention for multi-label text classification. Neural
Processing Letters, 2023, 55(2): 1293–1316. [doi: 10.1007/s11063-022-10938-y]
[52] Kingma DP, Ba J. Adam: A method for stochastic optimization. In: Proc. of the 3rd Int’l Conf. on Learning Representations. San Diego, 2015.
Appendix: References in Chinese:
[13] Du XY, Chen Z, Xiang XG. Interpretable label-aware recommendation algorithm based on decoupled graph neural network. Ruan Jian Xue Bao/Journal of Software, 2023, 34(12): 5670–5685 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/6754.htm [doi: 10.13328/j.cnki.jos.006754]
[19] Li FF, Su PZ, Duan JW, Zhang SC, Mao XL. Multi-label text classification enhanced by multi-granularity information relations. Ruan Jian Xue Bao/Journal of Software, 2023, 34(12): 5686–5703 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/6802.htm [doi: 10.13328/j.cnki.jos.006802]
[26] Xiao L, Chen BL, Huang X, Liu HF, Jing LP, Yu J. Multi-label text classification based on label semantic attention. Ruan Jian Xue Bao/Journal of Software, 2020, 31(4): 1079–1089 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/5923.htm [doi: 10.13328/j.cnki.jos.005923]
[31] Li BH, Xiang YX, Feng D, He ZC, Wu JJ, Dai TL, Li J. Short text classification model combining knowledge awareness and dual attention. Ruan Jian Xue Bao/Journal of Software, 2022, 33(10): 3565–3581 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/6630.htm [doi: 10.13328/j.cnki.jos.006630]

