[103] Saleh AY, Hameed HNBA, Salleh MNM. A novel hybrid algorithm of differential evolution with evolving spiking neural network for
pre-synaptic neurons optimization. Int’l Journal of Advances in Soft Computing and Its Applications, 2014, 6(1): 1–16.
[104] Schaffer JD. Evolving spiking neural networks: A novel growth algorithm corrects the teacher. In: Proc. of the 2015 IEEE Symp. on
Computational Intelligence for Security and Defense Applications (CISDA). Verona: IEEE, 2015. 1–8. [doi: 10.1109/CISDA.2015.7208630]
[105] Vázquez RA. Izhikevich neuron model and its application in pattern recognition. Australian Journal of Intelligent Information
Processing Systems, 2010, 11(1): 35–40.
[106] López-Vázquez G, Ornelas-Rodríguez M, Espinal A, Soria-Alcaraz JA, Rojas-Domínguez A, Puga-Soberanes HJ, Carpio JM, Rostro-González H. Evolving random topologies of spiking neural networks for pattern recognition. Computer Science and Information Technology, 2019, 9(7): 41–56. [doi: 10.5121/csit.2019.90704]
[107] Yusuf ZM, Hamed HNA, Yusuf LM, Isa MA. Evolving spiking neural network (ESNN) and harmony search algorithm (HSA) for
parameter optimization. In: Proc. of the 6th Int’l Conf. on Electrical Engineering and Informatics (ICEEI). Langkawi: IEEE, 2017. 1–6.
[doi: 10.1109/ICEEI.2017.8312365]
[108] Zhang AG, Han Y, Niu YZ, Gao YM, Chen ZZ, Zhao K. Self-evolutionary neuron model for fast-response spiking neural networks.
IEEE Trans. on Cognitive and Developmental Systems, 2022, 14(4): 1766–1777. [doi: 10.1109/TCDS.2021.3139444]
[109] Abadi M, Barham P, Chen JM, et al. TensorFlow: A system for large-scale machine learning. In: Proc. of the 12th USENIX Symp. on
Operating Systems Design and Implementation (OSDI 16). Savannah: USENIX Association, 2016. 265–283.
[110] Paszke A, Gross S, Massa F, et al. PyTorch: An imperative style, high-performance deep learning library. In: Proc. of the 33rd Int’l
Conf. on Neural Information Processing Systems. Vancouver: Curran Associates Inc., 2019. 8026–8037.
[111] Jia YQ, Shelhamer E, Donahue J, Karayev S, Long J, Girshick R, Guadarrama S, Darrell T. Caffe: Convolutional architecture for fast
feature embedding. In: Proc. of the 22nd ACM Int’l Conf. on Multimedia. Orlando: ACM, 2014. 675–678. [doi: 10.1145/2647868.2654889]
[112] Hazan H, Saunders DJ, Khan H, Patel D, Sanghavi DT, Siegelmann HT, Kozma R. BindsNET: A machine learning-oriented spiking
neural networks library in Python. Frontiers in Neuroinformatics, 2018, 12: 89. [doi: 10.3389/fninf.2018.00089]
[113] Fang W, Chen YQ, Ding JH, Yu ZF, Masquelier T, Chen D, Huang LW, Zhou HH, Li GQ, Tian YH. SpikingJelly: An open-source
machine learning infrastructure platform for spike-based intelligence. Science Advances, 2023, 9(40): eadi1480. [doi: 10.1126/sciadv.adi1480]
[114] Gütig R, Sompolinsky H. The tempotron: A neuron that learns spike timing-based decisions. Nature Neuroscience, 2006, 9(3): 420–428.
[doi: 10.1038/nn1643]
[115] Mozafari M, Ganjtabesh M, Nowzari-Dalini A, Masquelier T. SpykeTorch: Efficient simulation of convolutional spiking neural
networks with at most one spike per neuron. Frontiers in Neuroscience, 2019, 13: 625. [doi: 10.3389/fnins.2019.00625]
[116] Frémaux N, Gerstner W. Neuromodulated spike-timing-dependent plasticity, and theory of three-factor learning rules. Frontiers in
Neural Circuits, 2016, 9: 85. [doi: 10.3389/fncir.2015.00085]
[117] Javanshir A, Nguyen TT, Mahmud MAP, Kouzani AZ. Advancements in algorithms and neuromorphic hardware for spiking neural
networks. Neural Computation, 2022, 34(6): 1289–1328. [doi: 10.1162/neco_a_01499]
[118] Gewaltig MO, Diesmann M. NEST (NEural Simulation Tool). Scholarpedia, 2007, 2(4): 1430. [doi: 10.4249/scholarpedia.1430]
[119] Bekolay T, Bergstra J, Hunsberger E, Dewolf T, Stewart TC, Rasmussen D, Choo X, Voelker A, Eliasmith C. Nengo: A Python tool for
building large-scale functional brain models. Frontiers in Neuroinformatics, 2014, 7: 48. [doi: 10.3389/fninf.2013.00048]
[120] Fidjeland AK, Roesch EB, Shanahan MP, Luk W. NeMo: A platform for neural modelling of spiking neurons using GPUs. In: Proc. of
the 20th IEEE Int’l Conf. on Application-specific Systems, Architectures and Processors. Boston: IEEE, 2009. 137–144. [doi: 10.1109/ASAP.2009.24]
[121] Yavuz E, Turner J, Nowotny T. GeNN: A code generation framework for accelerated brain simulations. Scientific Reports, 2016, 6(1):
18854. [doi: 10.1038/srep18854]
[122] Stimberg M, Brette R, Goodman DFM. Brian 2, an intuitive and efficient neural simulator. eLife, 2019, 8: e47314. [doi: 10.7554/eLife.47314]
[123] Niedermeier L, Chen KX, Xing JW, Das A, Kopsick J, Scott E, Sutton N, Weber K, Dutt N, Krichmar JL. CARLsim 6: An open source
library for large-scale, biologically detailed spiking neural network simulation. In: Proc. of the 2022 Int’l Joint Conf. on Neural
Networks (IJCNN). Padua: IEEE, 2022. 1–10. [doi: 10.1109/IJCNN55064.2022.9892644]
[124] Xu XW, Ding YK, Hu SX, Niemier M, Cong J, Hu Y, Shi YY. Scaling for edge inference of deep neural networks. Nature Electronics,
2018, 1(4): 216–222. [doi: 10.1038/s41928-018-0059-3]