Distillation-boosted spiking neural network for gesture recognition
Date: 2025
Publisher: IEEE
Abstract
Spiking Neural Networks (SNNs) are brain-inspired models known for their high computational efficiency, primarily due to their use of discrete spike signals that closely resemble biological neural processing. Although SNNs offer significant energy-efficiency advantages, training them remains a major challenge: traditional gradient-descent algorithms cannot be applied directly because spikes are non-differentiable. Several methods have been developed to address this issue, including direct training with surrogate gradients, ANN-to-SNN conversion, and knowledge distillation (KD). Among these, KD has often demonstrated superior accuracy; a high-performing, complex artificial neural network (ANN) serves as the teacher and a simpler SNN as the student. However, most KD-based approaches have focused on static image classification, with limited exploration of dynamic data domains such as action or gesture recognition. This study addresses that gap by proposing a KD-based training framework that extends the applicability of SNNs to dynamic visual tasks. Specifically, we focus on gesture recognition using event-based data and perform comprehensive evaluations comparing KD-trained SNNs with directly trained SNNs and the original ANN teacher model. Experimental results show that our KD-trained SNN achieves a Top-5 class accuracy of 89.73% and a Random-5 class accuracy of 85.80%, outperforming directly trained SNNs, which achieve 84.24% and 83.20%, respectively. These findings demonstrate the effectiveness of knowledge distillation in improving the accuracy and generalization of SNNs on dynamic gesture recognition tasks and in extending their applicability beyond static image domains.
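The abstract does not specify the exact distillation objective used in the paper; a common formulation for ANN-teacher/SNN-student setups is the Hinton-style KD loss, combining hard-label cross-entropy with a temperature-softened KL term against the teacher's logits. The sketch below illustrates that standard loss in plain Python; the `alpha` and `temperature` values are illustrative assumptions, not taken from this work.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label,
                      alpha=0.5, temperature=4.0):
    """Standard KD loss sketch (not necessarily the paper's exact form):
    alpha * CE(student, hard label)
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    """
    # Hard-label cross-entropy on the student's unscaled predictions.
    student_probs = softmax(student_logits)
    hard_loss = -math.log(student_probs[label] + 1e-12)

    # Soft-target KL divergence between temperature-softened distributions.
    t_soft = softmax(teacher_logits, temperature)
    s_soft = softmax(student_logits, temperature)
    soft_loss = sum(p * math.log((p + 1e-12) / (q + 1e-12))
                    for p, q in zip(t_soft, s_soft))

    # T^2 rescaling keeps gradient magnitudes comparable across temperatures.
    return alpha * hard_loss + (1 - alpha) * temperature ** 2 * soft_loss
```

In a surrogate-gradient SNN, the student logits would typically be spike counts or membrane potentials accumulated over the event stream's time steps before being fed to this loss.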
