1784 Journal of Software (软件学报), 2025, Vol. 36, No. 4
[13] Cybenko G. Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals and Systems, 1989, 2(4):
303–314. [doi: 10.1007/BF02551274]
[14] Funahashi KI. On the approximate realization of continuous mappings by neural networks. Neural Networks, 1989, 2(3): 183–192. [doi:
10.1016/0893-6080(89)90003-8]
[15] Hornik K, Stinchcombe M, White H. Multilayer feedforward networks are universal approximators. Neural Networks, 1989, 2(5):
359–366. [doi: 10.1016/0893-6080(89)90020-8]
[16] Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature, 1986, 323(6088): 533–536. [doi:
10.1038/323533a0]
[17] Hinton GE, Osindero S, Teh YW. A fast learning algorithm for deep belief nets. Neural Computation, 2006, 18(7): 1527–1554. [doi: 10.1162/neco.2006.18.7.1527]
[18] Brown TB, Mann B, Ryder N, et al. Language models are few-shot learners. In: Proc. of the 34th Int’l Conf. on Neural Information
Processing Systems. Vancouver: Curran Associates Inc., 2020. 1877–1901.
[19] Zhang ML, Wang JD, Wu JB, Belatreche A, Amornpaisannon B, Zhang ZX, Miriyala VPK, Qu H, Chua Y, Carlson TE, Li HZ.
Rectified linear postsynaptic potential function for backpropagation in deep spiking neural networks. IEEE Trans. on Neural Networks
and Learning Systems, 2022, 33(5): 1947–1958. [doi: 10.1109/TNNLS.2021.3110991]
[20] Maass W. Networks of spiking neurons: The third generation of neural network models. Neural Networks, 1997, 10(9): 1659–1671.
[doi: 10.1016/S0893-6080(97)00011-7]
[21] Han S, Pool J, Tran J, Dally WJ. Learning both weights and connections for efficient neural networks. In: Proc. of the 28th Int’l Conf.
on Neural Information Processing Systems. Montreal: MIT Press, 2015. 1135–1143.
[22] Tavanaei A, Ghodrati M, Kheradpisheh SR, Masquelier T, Maida A. Deep learning in spiking neural networks. Neural Networks, 2019,
111: 47–63. [doi: 10.1016/j.neunet.2018.12.002]
[23] Stone JV. Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency. Sheffield: Sebtel Press,
2018.
[24] Wang B, Xue C, Liu H, Li X, Yin AR, Feng ZY, Kong YY, Xiong TZ, Hsu H, Zhou YL, Guo A, Wang YF, Yang J, Si X. SNNIM: A
10T-SRAM based spiking-neural-network-in-memory architecture with capacitance computation. In: Proc. of the 2022 IEEE Int’l Symp.
on Circuits and Systems (ISCAS). Austin: IEEE, 2022. 3383–3387. [doi: 10.1109/ISCAS48785.2022.9937272]
[25] Pan XQ, Luo W, Shuai Y, Hu SG, Luo WB, Qiao GC, Zhou T, Wang JJ, Xie Q, Huang ST, Liu Y, Wu CG, Zhang WL. Hardware
implementation of edge neural network computing for sensor with memristors based on single-crystalline LiNbO3 thin film. IEEE
Sensors Journal, 2023, 23(8): 8526–8534. [doi: 10.1109/JSEN.2023.3248123]
[26] Lee C, Sarwar SS, Panda P, Srinivasan G, Roy K. Enabling spike-based backpropagation for training deep neural network architectures.
Frontiers in Neuroscience, 2020, 14: 119. [doi: 10.3389/fnins.2020.00119]
[27] Wang YC, Liu HW, Zhang ML, Luo XL, Qu H. A universal ANN-to-SNN framework for achieving high accuracy and low latency deep
spiking neural networks. Neural Networks, 2024, 174: 106244. [doi: 10.1016/j.neunet.2024.106244]
[28] Lien HH, Chang TS. Sparse compressed spiking neural network accelerator for object detection. IEEE Trans. on Circuits and Systems I:
Regular Papers, 2022, 69(5): 2060–2069. [doi: 10.1109/TCSI.2022.3149006]
[29] Liu KF, Cui XX, Ji X, Kuang YS, Zou CL, Zhong Y, Xiao KL, Wang Y. Real-time target tracking system with spiking neural networks
implemented on neuromorphic chips. IEEE Trans. on Circuits and Systems II: Express Briefs, 2023, 70(4): 1590–1594. [doi: 10.1109/TCSII.2022.3227121]
[30] Feng LC, Zhang YQ, Zhu ZM. An efficient multilayer spiking convolutional neural network processor for object recognition with low
bitwidth and channel-level parallelism. IEEE Trans. on Circuits and Systems II: Express Briefs, 2022, 69(12): 5129–5133. [doi: 10.1109/TCSII.2022.3207989]
[31] Huang JQ, Serb A, Stathopoulos S, Prodromakis T. Text classification in memristor-based spiking neural networks. Neuromorphic
Computing and Engineering, 2023, 3(1): 014003. [doi: 10.1088/2634-4386/acb2f0]
[32] Zou CL, Cui XX, Kuang YS, Wang Y, Wang XN. A hybrid spiking recurrent neural network on hardware for efficient emotion
recognition. In: Proc. of the 4th IEEE Int’l Conf. on Artificial Intelligence Circuits and Systems (AICAS). Incheon: IEEE, 2022.
332–335. [doi: 10.1109/AICAS54282.2022.9869950]
[33] Kumar N, Tang G, Yoo R, Michmizos KP. Decoding EEG with spiking neural networks on neuromorphic hardware. Trans. on Machine
Learning Research, 2022, (6): 1–12.
[34] Mao RX, Li SX, Zhang ZM, Xia ZH, Xiao JB, Zhu ZX, Liu JH, Shan WW, Chang L, Zhou J. An ultra-energy-efficient and high
accuracy ECG classification processor with SNN inference assisted by on-chip ANN learning. IEEE Trans. on Biomedical Circuits and