Lyu SH, et al. Deep forest method based on interaction representation for multi-label learning. Journal of Software (软件学报), 2024(4): 1943

                     251]
[8] Vens C, Struyf J, Schietgat L, Džeroski S, Blockeel H. Decision trees for hierarchical multi-label classification. Machine Learning, 2008, 73(2): 185–214. [doi: 10.1007/s10994-008-5077-3]
[9] Liu SY, Song XH, Ma ZC, Ganaa ED, Shen XJ. MoRE: Multi-output residual embedding for multi-label classification. Pattern Recognition, 2022, 126: 108584. [doi: 10.1016/j.patcog.2022.108584]
[10] Zhang ML, Zhou ZH. Multilabel neural networks with applications to functional genomics and text categorization. IEEE Trans. on Knowledge and Data Engineering, 2006, 18(10): 1338–1351. [doi: 10.1109/TKDE.2006.162]
[11] Zhou ZH, Feng J. Deep forest. National Science Review, 2019, 6(1): 74–86. [doi: 10.1093/nsr/nwy108]
[12] Lyu SH, Yang L, Zhou ZH. A refined margin distribution analysis for forest representation learning. In: Proc. of the 33rd Int'l Conf. on Neural Information Processing Systems. Vancouver: Curran Associates Inc., 2019. 5530–5540.
[13] Yang L, Wu XZ, Jiang Y, Zhou ZH. Multi-label learning with deep forest. In: Proc. of the 24th European Conf. on Artificial Intelligence. Santiago de Compostela: ECAI, 2020. 1634–1641.
[14] Wang QW, Yang L, Li YF. Learning from weak-label data: A deep forest expedition. In: Proc. of the 34th AAAI Conf. on Artificial Intelligence, 2020, 34(4): 6251–6258. [doi: 10.1609/aaai.v34i04.6092]
[15] Chen YN, Weng W, Wu SX, Chen BH, Fan YL, Liu JH. An efficient stacking model with label selection for multi-label classification. Applied Intelligence, 2021, 51(1): 308–325. [doi: 10.1007/s10489-020-01807-z]
[16] Ma PF, Wu YX, Li Y, Guo L, Li Z. DBC-Forest: Deep forest with binning confidence screening. Neurocomputing, 2022, 475: 112–122. [doi: 10.1016/j.neucom.2021.12.075]
[17] Ma PF, Wu YX, Li Y, Guo L, Jiang H, Zhu XQ, Wu XD. HW-Forest: Deep forest with hashing screening and window screening. ACM Trans. on Knowledge Discovery from Data, 2022, 16(6): 123. [doi: 10.1145/3532193]
[18] Yu QZ, Dong ZH, Fan XY, Zong LC, Li Y. HMD-AMP: Protein language-powered hierarchical multi-label deep forest for annotating antimicrobial peptides. arXiv:2111.06023, 2021.
[19] Basu S, Kumbier K, Brown JB, Yu B. Iterative random forests to discover predictive and stable high-order interactions. Proc. of the National Academy of Sciences of the United States of America, 2018, 115(8): 1943–1948. [doi: 10.1073/pnas.1711236115]
[20] Liang SP, Pan WW, You DL, Liu Z, Yin L. Incremental deep forest for multi-label data streams learning. Applied Intelligence, 2022, 52(12): 13398–13414. [doi: 10.1007/s10489-022-03414-6]
[21] Liu WW, Wang HB, Shen XB, Tsang IW. The emerging trends of multi-label learning. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2022, 44(11): 7955–7974. [doi: 10.1109/TPAMI.2021.3119334]
[22] Behr M, Wang Y, Li X, Yu B. Provable Boolean interaction recovery from tree ensemble obtained via random forests. Proc. of the National Academy of Sciences of the United States of America, 2022, 119(22): e2118636119. [doi: 10.1073/pnas.2118636119]
[23] Chen YH, Lyu SH, Jiang Y. Improving deep forest by exploiting high-order interactions. In: Proc. of the 2021 IEEE Int'l Conf. on Data Mining. Auckland: IEEE, 2021. 1030–1035. [doi: 10.1109/ICDM51629.2021.00118]
[24] Kocev D, Vens C, Struyf J, Džeroski S. Tree ensembles for predicting structured outputs. Pattern Recognition, 2013, 46(3): 817–833. [doi: 10.1016/j.patcog.2012.09.023]
[25] Nakano FK, Pliakos K, Vens C. Deep tree-ensembles for multi-output prediction. Pattern Recognition, 2022, 121: 108211. [doi: 10.1016/j.patcog.2021.108211]
[26] Hinton GE, Salakhutdinov RR. Reducing the dimensionality of data with neural networks. Science, 2006, 313(5786): 504–507. [doi: 10.1126/science.1127647]
[27] Read J, Reutemann P, Pfahringer B, Holmes G. MEKA: A multi-label/multi-target extension to WEKA. Journal of Machine Learning Research, 2016, 17(1): 667–671.
[28] Tsoumakas G, Vlahavas IP. Random k-labelsets: An ensemble method for multilabel classification. In: Proc. of the 18th European Conf. on Machine Learning. Warsaw: ECML, 2007. 406–417.
[29] Zhang ML, Zhou ZH. ML-KNN: A lazy learning approach to multi-label learning. Pattern Recognition, 2007, 40(7): 2038–2048. [doi: 10.1016/j.patcog.2006.12.019]
[30] Benites F, Sapozhnikova E. HARAM: A hierarchical ARAM neural network for large-scale text classification. In: Proc. of the 2015 IEEE Int'l Conf. on Data Mining Workshop. Atlantic City: IEEE, 2015. 847–854. [doi: 10.1109/ICDMW.2015.14]