
VALSE Paper Quick Review, Session 157: Surrogate Module Learning

Posted 2023-12-22 19:53 by 程一-计算所


Paper title:

Surrogate Module Learning: Reduce the Gradient Error Accumulation in Training Spiking Neural Networks

Authors:

Shikuang Deng (University of Electronic Science and Technology of China), Hao Lin (University of Electronic Science and Technology of China), Yuhang Li (Yale University), Shi Gu (University of Electronic Science and Technology of China)


Watch on Bilibili:

https://www.bilibili.com/video/BV1TN4y187Vu/



Abstract:

Spiking neural networks (SNNs) provide an alternative to conventional artificial neural networks, offering energy savings and high efficiency once implemented in hardware. However, because of their non-differentiable activation function and the temporally delayed accumulation of outputs, directly training SNNs is extraordinarily difficult, even when a surrogate gradient is adopted to mimic backpropagation. This non-differentiability introduces an intrinsic gradient error that is magnified through layerwise backpropagation, especially across many layers. In this paper, we propose a novel approach to reducing gradient error from a new perspective, called surrogate module learning (SML). SML constructs a shortcut path that back-propagates more accurate gradients to a given part of the SNN via surrogate modules. We then develop a new loss function that concurrently trains the network and enhances the surrogate modules' surrogate capacity. We demonstrate that when the outputs of the surrogate modules are close to the SNN output, the fraction of gradient error drops significantly. Our method consistently and significantly improves SNN performance on all experimental datasets, including CIFAR-10/100, ImageNet, and ES-ImageNet. For example, for the spiking ResNet-34 architecture on ImageNet, we increase SNN accuracy by 3.46%.
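To make the shortcut idea concrete, below is a minimal, self-contained PyTorch-style sketch, not the authors' released implementation: the names SpikeFn, SpikingLayer, SMLBlock, the mixing weight alpha, and the mimic_loss term are all illustrative assumptions, and the spiking layers are single-step (no membrane dynamics or time loop). It shows one simple way to route a fraction of the gradient through a differentiable surrogate module while keeping the forward value equal to the spiking output; see the code link below for the actual method.

# Illustrative sketch of the surrogate-module shortcut idea; all names and
# the exact mixing/mimic-loss rules are assumptions, not the paper's code.
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient (standard SG training)."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0.0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Rectangular surrogate window around the firing threshold.
        return grad_out * (v.abs() < 0.5).float()


class SpikingLayer(nn.Module):
    """One spiking layer: linear projection followed by the spike function."""

    def __init__(self, d_in, d_out):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)

    def forward(self, x):
        return SpikeFn.apply(self.fc(x) - 0.5)  # 0.5 = assumed firing threshold


class SMLBlock(nn.Module):
    """An SNN stage with a differentiable surrogate-module shortcut.

    The forward output equals the spiking output; in backward, a fraction
    `alpha` of the gradient flows through the differentiable surrogate path
    instead of the layerwise surrogate-gradient path.
    """

    def __init__(self, snn_stage, surrogate, alpha=0.5):
        super().__init__()
        self.snn_stage = snn_stage  # stack of spiking layers
        self.surrogate = surrogate  # small ANN mimicking the stage's output
        self.alpha = alpha

    def forward(self, x):
        s = self.snn_stage(x)  # spiking path (noisy surrogate gradients)
        a = self.surrogate(x)  # differentiable shortcut path
        # Value is exactly s; gradient splits (1 - alpha) to s and alpha to a.
        out = (1.0 - self.alpha) * s + self.alpha * (a + (s - a).detach())
        # Auxiliary loss pulls the surrogate toward the (detached) SNN output,
        # since the shortcut gradient is only accurate when a is close to s.
        self.mimic_loss = ((a - s.detach()) ** 2).mean()
        return out


if __name__ == "__main__":
    block = SMLBlock(
        snn_stage=nn.Sequential(SpikingLayer(16, 32), SpikingLayer(32, 32)),
        surrogate=nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32)),
    )
    x = torch.randn(8, 16)
    y = block(x)
    # Total loss = task loss + weighted mimic loss (the 0.1 weight is a knob).
    loss = y.sum() + 0.1 * block.mimic_loss
    loss.backward()
    print(y.shape, block.mimic_loss.item())

The a + (s - a).detach() expression is the usual straight-through pattern: it leaves the forward value at s while assigning that term's gradient entirely to the surrogate output a, which is what lets the shortcut deliver a lower-error gradient when the surrogate tracks the SNN stage closely.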


Reference:

[1] Shikuang Deng, Hao Lin, Yuhang Li, and Shi Gu. "Surrogate Module Learning: Reduce the Gradient Error Accumulation in Training Spiking Neural Networks." International Conference on Machine Learning (ICML), PMLR, 2023.


Paper link:

https://openreview.net/pdf?id=zRkz4duLKp


Code link:

https://github.com/brain-intelligence-lab/surrogate_module_learning


Speaker bio:

Shikuang Deng is a Ph.D. student at the University of Electronic Science and Technology of China, advised by Prof. Shi Gu. His research focuses on spiking neural networks.



Special thanks to the main organizers of this paper quick review session:

Monthly rotating AC: Li Zhang (Fudan University)

Quarterly rotating AC: Bin Fan (University of Science and Technology Beijing)
