
[29] Kellermayer DI. Numerische Optimierung von Computer-Modellen mittels der Evolutionsstrategie, by Hans-Paul Schwefel (Birkhäuser, Basel and Stuttgart, 1977; 370 pages, hardback, SF 48, ISBN 3-7643-0876-1). Cybernetics and Systems, 1977.
[30] Beyer HG. Towards a theory of evolution strategies: Self-adaptation. Evolutionary Computation, 1995,3(3):311−347.
[31] Hansen N, Ostermeier A. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 2001,9(2):159−195.
[32] Friedrichs F, Igel C. Evolutionary tuning of multiple SVM parameters. Neurocomputing, 2005,64:107−117.
[33] Muller SD, Marchetto J, Airaghi S, et al. Optimization based on bacterial chemotaxis. IEEE Trans. on Evolutionary Computation, 2002,6(1):16−29.
[34] Shepherd J, McDowell DL, Jacob KI, et al. Modeling morphology evolution and mechanical behavior during thermo-mechanical processing of semi-crystalline polymers. Journal of the Mechanics and Physics of Solids, 2006,54(3):467−489.
[35] Schaul T, Glasmachers T, Schmidhuber J, et al. High dimensions and heavy tails for natural evolution strategies. In: Proc. of the Genetic and Evolutionary Computation Conf. 2011. 845−852.
[36] Wierstra D, Schaul T, Glasmachers T, et al. Natural evolution strategies. Journal of Machine Learning Research, 2011.
[37] Berny A. Selection and reinforcement learning for combinatorial optimization. In: Proc. of the Parallel Problem Solving from Nature. 2000. 601−610.
[38] Berny A. Statistical machine learning and combinatorial optimization. In: Proc. of the Theoretical Aspects of Evolutionary Computing. Berlin, Heidelberg: Springer-Verlag, 2001. 287−306.
[39] Amari S. Natural gradient works efficiently in learning. Neural Computation, 1998,10(2):251−276.
[40] Amari S, Douglas SC. Why natural gradient. In: Proc. of the Int’l Conf. on Acoustics, Speech and Signal Processing. 1998. 1213−1216.
[41] Omidvar MN, Li X, Yang Z, et al. Cooperative co-evolution for large scale optimization through more frequent random grouping. In: Proc. of the Congress on Evolutionary Computation. 2010. 1−8.
[42] Tahir MA, Bouridane A, Kurugollu F. Simultaneous feature selection and feature weighting using hybrid tabu search/K-nearest neighbor classifier. Pattern Recognition Letters, 2007,28(4):438−446.
[43] Hu Q, Che X, Zhang L, et al. Feature evaluation and selection based on neighborhood soft margin. Neurocomputing, 2010,73(10-12):2114−2124.
[44] Huang J, Cai Y, Xu X. A hybrid genetic algorithm for feature selection wrapper based on mutual information. Pattern Recognition Letters, 2007,28(13):1825−1844.
[45] Khan A, Baig AR. Multi-objective feature subset selection using mRMR based enhanced ant colony optimization algorithm (mRMR-EACO). Journal of Experimental and Theoretical Artificial Intelligence, 2016,28(6):1061−1073.
[46] Zhang X. Research of feature selection algorithm based on natural evolution strategy [MS. Thesis]. Changchun: Jilin University, 2020.
Appendix: Chinese-language reference
[46] Zhang X. Research of feature selection algorithm based on natural evolution strategy [MS. Thesis]. Changchun: Jilin University, 2020 (in Chinese).



ZHANG Xin (1994-), male, M.S. His research interests include evolutionary computation and reinforcement learning.

LI Zhan-Shan (1966-), male, Ph.D., professor, doctoral supervisor, CCF professional member. His research interests include machine learning and constraint reasoning.