
         [65]    Pham H, Guan MY, Zoph B, Le QV, Dean J. Efficient neural architecture search via parameter sharing. In: Proc. of the 35th Int’l
             Conf. on Machine Learning. 2018. 4092−4101.
         [66]    Elsken T, Metzen JH, Hutter F. Efficient multi-objective neural architecture search via Lamarckian evolution. In: Proc. of the 7th
             Int’l Conf. on Learning Representations. 2019.
         [67]    Cai H, Yang J, Zhang W, Han S, Yu Y. Path-Level network transformation for efficient architecture search. In: Proc. of the 35th
             Int’l Conf. on Machine Learning. 2018. 677−686.
         [68]    Liu H, Simonyan K, Yang Y. DARTS: Differentiable architecture search. In: Proc. of the 7th Int’l Conf. on Learning
             Representations. 2019.
         [69]    Zhong Z, Yan J, Wu W, Shao J, Liu CL. Practical block-wise neural network architecture generation. In: Proc. of the IEEE Conf.
             on Computer Vision and Pattern Recognition. 2018. 2423−2432.
         [70]    Dong JD, Cheng AC, Juan DC, Wei W, Sun M. DPP-Net: Device-aware progressive search for Pareto-optimal neural architectures.
             In: Proc. of the European Conf. on Computer Vision. 2018. 517−531.
         [71]    Zhong Z, Yan J, Wu W, Shao J, Liu CL. Practical block-wise neural network architecture generation. In: Proc. of the IEEE Conf.
             on Computer Vision and Pattern Recognition. 2018. 2423−2432.
         [72]    Chen T, Goodfellow I, Shlens J. Net2Net: Accelerating learning via knowledge transfer. In: Proc. of the 4th Int’l Conf. on Learning
             Representations. 2016.
         [73]    Goldberg DE, Deb K. A comparative analysis of selection schemes used in genetic algorithms. In: Proc. of the Foundations of
             Genetic Algorithms. Elsevier, 1991. 69−93.
         [74]    Cubuk ED, Zoph B, Schoenholz SS, Le QV. Intriguing properties of adversarial examples. In: Proc. of the 6th Int’l Conf. on
             Learning Representations Workshop. 2018.
         [75]    Liu H, Simonyan K, Vinyals O, Fernando C, Kavukcuoglu K. Hierarchical representations for efficient architecture search. In: Proc.
             of the 6th Int’l Conf. on Learning Representations. 2018.
         [76]    Elsken T, Metzen JH, Hutter F. Simple and efficient architecture search for convolutional neural networks. In: Proc. of the 6th Int’l
             Conf. on Learning Representations Workshop. 2018.
         [77]    Wistuba M. Deep learning architecture search by neuro-cell-based evolution with function-preserving mutations. In: Proc. of the
             Joint European Conf. on Machine Learning and Knowledge Discovery in Databases. 2018. 243−258.
         [78]    Real E, Moore S, Selle A, Saxena S, Suematsu YL, Tan J, Le QV, Kurakin A. Large-scale evolution of image classifiers. In: Proc.
             of the 34th Int’l Conf. on Machine Learning, Vol.70. 2017. 2902−2911.
         [79]    Xie L, Yuille A. Genetic CNN. In: Proc. of the IEEE Int’l Conf. on Computer Vision. 2017. 1379−1388.
         [80]    Kandasamy K, Neiswanger W, Schneider J, Póczos B, Xing EP. Neural architecture search with Bayesian optimisation and optimal
             transport. In: Proc. of the Advances in Neural Information Processing Systems. 2018. 2016−2025.
         [81]    Luo R, Tian F, Qin T, Chen E, Liu TY. Neural architecture optimization. In: Proc. of the Advances in Neural Information
             Processing Systems. 2018. 7816−7827.
         [82]    Bender G, Kindermans PJ, Zoph B, Vasudevan V, Le QV. Understanding and simplifying one-shot architecture search. In: Proc.
             of the 35th Int’l Conf. on Machine Learning. 2018. 549−558.
         [83]    Brock A, Lim T, Ritchie JM, Weston N. SMASH: One-shot model architecture search through HyperNetworks. In: Proc. of the 6th
             Int’l Conf. on Learning Representations. 2018.
         [84]    Zhang C, Ren M, Urtasun R. Graph HyperNetworks for neural architecture search. In: Proc. of the 7th Int’l Conf. on Learning
             Representations. 2019.
         [85]    Real E, Aggarwal A, Huang Y, Le QV. Regularized evolution for image classifier architecture search. In: Proc. of the AAAI Conf.
             on Artificial Intelligence, Vol.33. 2019. 4780−4789.
         [86]    Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, Huang Z, Karpathy A, Khosla A, Bernstein MS, Berg AC, Li FF.
             ImageNet large scale visual recognition challenge. Int’l Journal of Computer Vision, 2015,115(3):211−252.
         [87]    Lin TY, Maire M, Belongie S, Hays J, Perona P, Ramanan D, Dollár P, Zitnick CL. Microsoft COCO: Common objects in context.
             In: Proc. of the European Conf. on Computer Vision. 2014. 740−755.
         [88]    Zela A, Klein A, Falkner S, Hutter F. Towards automated deep learning: Efficient joint neural architecture and hyperparameter
             search. In: Proc. of the 21st Int’l Conf. on Artificial Intelligence and Statistics Workshop. 2018.
         [89]    Klein A, Falkner S, Bartels S, Hennig P, Hutter F. Fast Bayesian optimization of machine learning hyperparameters on large
             datasets. In: Proc. of the 20th Int’l Conf. on Artificial Intelligence and Statistics. 2017. 528−536.