3250                                                       Journal of Software (软件学报), 2025, Vol. 36, No. 7


                     and Technology (ICET). Antalya: IEEE, 2017. 1–6. [doi: 10.1109/ICEngTechnol.2017.8308186]
                 [18]  Pouyanfar S, Sadiq S, Yan YL, Tian HM, Tao YD, Reyes MP, Shyu ML, Chen SC, Iyengar SS. A survey on deep learning: Algorithms,
                     techniques, and applications. ACM Computing Surveys (CSUR), 2018, 51(5): 92. [doi: 10.1145/3234150]
                 [19]  Li FF, Su PZ, Duan JW, Zhang SC, Mao XL. Multi-label text classification with enhancing multi-granularity information relations. Ruan
                      Jian Xue Bao/Journal of Software, 2023, 34(12): 5686–5703 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/6802.htm
                      [doi: 10.13328/j.cnki.jos.006802]
                 [20]  He JD, Xu BW, Yang Z, Han D, Yang CR, Lo D. PTM4Tag: Sharpening tag recommendation of stack overflow posts with pre-trained
                     models. In: Proc. of the 30th IEEE/ACM Int’l Conf. on Program Comprehension. ACM, 2022. 1–11. [doi: 10.1145/3524610.3527897]
                 [21]  Du XY, Wu ZK, Feng FL, He XN, Tang JH. Invariant representation learning for multimedia recommendation. In: Proc. of the 30th
                     ACM Int’l Conf. on Multimedia. Lisboa: ACM, 2022. 619–628. [doi: 10.1145/3503161.3548405]
                 [22]  Du XY, Wang X, He XN, Li ZC, Tang JH, Chua TS. How to learn item representation for cold-start multimedia recommendation? In:
                      Proc. of the 28th ACM Int’l Conf. on Multimedia. Seattle: ACM, 2020. 3469–3477. [doi: 10.1145/3394171.3413628]
                 [23]  Kim Y. Convolutional neural networks for sentence classification. In: Proc. of the 2014 Conf. on Empirical Methods in Natural Language
                     Processing (EMNLP). Doha: ACL, 2014. 1746–1751. [doi: 10.3115/v1/D14-1181]
                 [24]  Dieng AB, Wang C, Gao JF, Paisley J. TopicRNN: A recurrent neural network with long-range semantic dependency. In: Proc. of the 5th
                     Int’l Conf. on Learning Representations. Toulon: OpenReview.net, 2017.
                 [25]  Yang ZC, Yang DY, Dyer C, He XD, Smola A, Hovy E. Hierarchical attention networks for document classification. In: Proc. of the
                     2016 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego:
                     ACL, 2016. 1480–1489. [doi: 10.18653/v1/N16-1174]
                 [26]  Xiao L, Chen BL, Huang X, Liu HF, Jing LP, Yu J. Multi-label text classification method based on label semantic information. Ruan Jian
                     Xue Bao/Journal of Software, 2020, 31(4): 1079–1089 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/5923.htm
                     [doi: 10.13328/j.cnki.jos.005923]
                 [27]  Read J, Pfahringer B, Holmes G, Frank E. Classifier chains for multi-label classification. In: Proc. of the 2009 European Conf. on
                      Machine Learning and Knowledge Discovery in Databases. Bled: Springer, 2009. 254–269. [doi: 10.1007/978-3-642-04174-7_17]
                 [28]  Nam J, Loza Mencía E, Kim HJ, Fürnkranz J. Maximizing subset accuracy with recurrent neural networks in multi-label classification.
                      In: Advances in Neural Information Processing Systems 30. Long Beach, 2017.
                 [29]  Ozmen M, Zhang H, Wang PY, Coates M. Multi-relation message passing for multi-label text classification. In: Proc. of the 2022 IEEE
                      Int’l Conf. on Acoustics, Speech and Signal Processing (ICASSP). Singapore: IEEE, 2022. 3583–3587. [doi: 10.1109/ICASSP43922.2022.9747225]
                 [30]  Vashishth S, Sanyal S, Nitin V, Talukdar P. Composition-based multi-relational graph convolutional networks. In: Proc. of the 8th Int’l
                     Conf. on Learning Representations. Addis Ababa: OpenReview.net, 2020.
                 [31]  Li BH, Xiang YX, Feng D, He ZC, Wu JJ, Dai TL, Li J. Short text classification model combining knowledge aware and dual attention.
                      Ruan Jian Xue Bao/Journal of Software, 2022, 33(10): 3565–3581 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/6630.htm
                      [doi: 10.13328/j.cnki.jos.006630]
                 [32]  Guo L, Zhang DX, Wang L, Wang H, Cui B. CRAN: A hybrid CNN-RNN attention-based model for text classification. In: Proc. of the
                     37th Int’l Conf. on Conceptual Modeling. Xi’an: Springer, 2018. 571–585. [doi: 10.1007/978-3-030-00847-5_42]
                 [33]  Zhou P, Shi W, Tian J, Qi ZY, Li BC, Hao HW, Xu B. Attention-based bidirectional long short-term memory networks for relation
                     classification. In: Proc. of the 54th Annual Meeting of the Association for Computational Linguistics, Vol. 2 (Short Papers). Berlin: ACL,
                     2016. 207–212. [doi: 10.18653/v1/P16-2034]
                 [34]  Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. In: Proc. of the 5th Int’l Conf. on Learning
                     Representations. Toulon: OpenReview.net, 2017.
                 [35]  Veličković P, Cucurull G, Casanova A, Romero A, Liò P, Bengio Y. Graph attention networks. In: Proc. of the 6th Int’l Conf. on
                      Learning Representations. Vancouver: OpenReview.net, 2018.
                 [36]  Liu B. GCN-BERT and memory network based multi-label classification for event text of the Chinese government hotline. IEEE Access,
                     2022, 10: 109267–109276. [doi: 10.1109/ACCESS.2022.3213978]
                 [37]  Pal A, Selvakumar M, Sankarasubbu M. Multi-label text classification using attention-based graph neural network. arXiv:2003.11644, 2020.