Page 380 - 《软件学报》2025年第4期
P. 380
1786 软件学报 2025 年第 36 卷第 4 期
Neurocomputing, 2014, 138: 3–13. [doi: 10.1016/j.neucom.2013.06.052]
[57] Comsa IM, Potempa K, Versari L, Fischbacher T, Gesmundo A, Alakuijala J. Temporal coding in spiking neural networks with alpha
synaptic function. In: Proc. of the 2020 IEEE Int’l Conf. on Acoustics, Speech and Signal Processing (ICASSP 2020). Barcelona: IEEE,
2020. 8529–8533. [doi: 10.1109/ICASSP40776.2020.9053856]
[58] Park S, Kim S, Choe H, Yoon S. Fast and efficient information transmission with burst spikes in deep spiking neural networks. In: Proc.
of the 56th ACM/IEEE Design Automation Conf. Las Vegas: IEEE, 2019. 1–6.
[59] van Rullen R, Thorpe SJ. Rate coding versus temporal order coding: What the retinal ganglion cells tell the visual cortex. Neural
Computation, 2001, 13(6): 1255–1283. [doi: 10.1162/08997660152002852]
[60] Fairhall AL, Lewen GD, Bialek W, de Ruyter van Steveninck RR. Efficiency and ambiguity in an adaptive neural code. Nature, 2001,
412(6849): 787–792. [doi: 10.1038/35090500]
[61] Alam SH, Foshie A, Rose G. A runtime-reconfigurable hardware encoder for spiking neural networks. In: Proc. of the 2023 Great Lakes
Symp. on VLSI. Knoxville: ACM, 2023. 203–206. [doi: 10.1145/3583781.3590284]
[62] Hebb DO. The Organization of Behavior: A Neuropsychological Theory. New York: Psychology Press, 2002. [doi: 10.4324/
9781410612403]
[63] Caporale N, Dan Y. Spike timing-dependent plasticity: A Hebbian learning rule. Annual Review of Neuroscience, 2008, 31: 25–46. [doi:
10.1146/annurev.neuro.31.060407.125639]
[64] Song S, Miller KD, Abbott LF. Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience,
2000, 3(9): 919–926. [doi: 10.1038/78829]
[65] Dan Y, Poo MM. Spike timing-dependent plasticity: From synapse to perception. Physiological Reviews, 2006, 86(3): 1033–1048. [doi:
10.1152/physrev.00030.2005]
[66] Qiao GC, Hu SG, Wang JJ, Zhang CM, Chen TP, Ning N, Yu Q, Liu Y. A neuromorphic-hardware oriented bio-plausible online-
learning spiking neural network model. IEEE Access, 2019, 7: 71730–71740. [doi: 10.1109/ACCESS.2019.2919163]
[67] Joo B, Han JW, Kong BS. Energy- and area-efficient CMOS synapse and neuron for spiking neural networks with STDP learning. IEEE
Trans. on Circuits and Systems I: Regular Papers, 2022, 69(9): 3632–3642. [doi: 10.1109/TCSI.2022.3178989]
[68] Liu S, Wang JJ, Zhou JT, Hu SG, Yu Q, Chen TP, Liu Y. An area- and energy-efficient spiking neural network with spike-time-
dependent plasticity realized with SRAM processing-in-memory macro and on-chip unsupervised learning. IEEE Trans. on Biomedical
Circuits and Systems, 2023, 17(1): 92–104. [doi: 10.1109/TBCAS.2023.3242413]
[69] Processing Systems. Montréal: Curran Associates Inc., 2018. 1419–1428.
Qu LH, Zhao ZY, Wang L, Wang Y. Efficient and hardware-friendly methods to implement competitive learning for spiking neural
networks. Neural Computing and Applications, 2020, 32(17): 13479–13490. [doi: 10.1007/s00521-020-04755-4]
[70] Sun CY, Sun HH, Xu J, Han JN, Wang XY, Wang XY, Chen QY, Fu YX, Li L. An energy efficient STDP-based SNN architecture with
on-chip learning. IEEE Trans. on Circuits and Systems I: Regular Papers, 2022, 69(12): 5147–5158. [doi: 10.1109/TCSI.2022.3204645]
[71] Tavanaei A, Maida A. BP-STDP: Approximating backpropagation using spike timing dependent plasticity. Neurocomputing, 2019, 330:
39–47. [doi: 10.1016/j.neucom.2018.11.014]
[72] Gomar S, Ahmadi M. Digital realization of PSTDP and TSTDP learning. In: Proc. of the 2018 Int’l Joint Conf. on Neural Networks
(IJCNN). Rio de Janeiro: IEEE, 2018. 1–5. [doi: 10.1109/IJCNN.2018.8489263]
[73] Bellec G, Salaj D, Subramoney A, Legenstein R, Maass W. Long short-term memory and learning-to-learn in networks of spiking
neurons. In: Proc. of the 32nd Int’l Conf. on Neural Information Processing Systems. Montréal: Curran Associates Inc., 2018. 795–805.
[74] Neftci EO, Mostafa H, Zenke F. Surrogate gradient learning in spiking neural networks: Bringing the power of gradient-based
optimization to spiking neural networks. IEEE Signal Processing Magazine, 2019, 36(6): 51–63. [doi: 10.1109/MSP.2019.2931595]
[75] Shrestha SB, Orchard G. SLAYER: Spike layer error reassignment in time. In: Proc. of the 32nd Int’l Conf. on Neural Information
[76] Rathi N, Chakraborty I, Kosta A, Sengupta A, Ankit A, Panda P, Roy K. Exploring neuromorphic computing based on spiking neural
networks: Algorithms to hardware. ACM Computing Surveys, 2023, 55(12): 243. [doi: 10.1145/3571155]
[77] Zheng N, Mazumder P. Online supervised learning for hardware-based multilayer spiking neural networks through the modulation of
weight-dependent spike-timing-dependent plasticity. IEEE Trans. on Neural Networks and Learning Systems, 2018, 29(9): 4287–4302.
[doi: 10.1109/TNNLS.2017.2761335]
[78] Qiao GC, Ning N, Zuo Y, Zhou PJ, Sun ML, Hu SG, Yu Q, Liu Y. Batch normalization-free weight-binarized SNN based on hardware-
saving IF neuron. Neurocomputing, 2023, 544: 126234. [doi: 10.1016/j.neucom.2023.126234]
[79] Zhou P, Choi DU, Kang SM, Eshraghian JK. Backpropagating errors through memristive spiking neural networks. In: Proc. of the 2023
IEEE Int’l Symp. on Circuits and Systems (ISCAS). Monterey: IEEE, 2023. 1–5. [doi: 10.1109/ISCAS46773.2023.10182006]