Distillation-boosted spiking neural network for gesture recognition

dc.contributor.author: Aponso, GMMK
dc.contributor.author: Thusyanthan, J
dc.contributor.author: Wellahewa, WIUD
dc.contributor.author: Hettiarachchi, C
dc.date.accessioned: 2025-12-08T05:00:01Z
dc.date.issued: 2025
dc.description.abstract: Spiking Neural Networks (SNNs) are brain-inspired models known for their high computational efficiency, primarily due to their use of discrete spike signals that closely resemble biological neural processing. Although SNNs offer significant energy efficiency advantages, a major challenge lies in training them, as traditional gradient descent algorithms cannot be directly applied due to the non-differentiable nature of spikes. Several methods have been developed to address this issue, including direct training with surrogate gradients, ANN-to-SNN conversion, and knowledge distillation (KD). Among these, KD has often demonstrated superior accuracy: a high-performing, complex artificial neural network (ANN) serves as the teacher and a less complex SNN as the student. However, most KD-based approaches have focused on static image classification, with limited exploration of dynamic data domains such as action or gesture recognition. This study addresses this gap by proposing a KD-based training framework that extends the applicability of SNNs to dynamic visual tasks. Specifically, we focus on gesture recognition using event-based data and perform comprehensive evaluations comparing KD-trained SNNs with directly trained SNNs and the original ANN teacher model. Experimental results show that our KD-trained SNN achieves a Top-5 class accuracy of 89.73% and a Random-5 class accuracy of 85.80%, outperforming directly trained SNNs, which achieve 84.24% and 83.20%, respectively. These findings demonstrate the effectiveness of knowledge distillation in improving the accuracy and generalization of SNNs in dynamic gesture recognition tasks and extending their applicability beyond static image domains.
dc.identifier.conference: Moratuwa Engineering Research Conference 2025
dc.identifier.email: madhawa.20@cse.mrt.ac.lk
dc.identifier.email: thusyanthan.20@cse.mrt.ac.lk
dc.identifier.email: uditha.20@cse.mrt.ac.lk
dc.identifier.email: chathuranga@cse.mrt.ac.lk
dc.identifier.faculty: Engineering
dc.identifier.isbn: 979-8-3315-6724-8
dc.identifier.pgnos: pp. 740-745
dc.identifier.proceeding: Proceedings of Moratuwa Engineering Research Conference 2025
dc.identifier.uri: https://dl.lib.uom.lk/handle/123/24526
dc.language.iso: en
dc.publisher: IEEE
dc.subject: Spiking Neural Networks (SNNs)
dc.subject: Knowledge Distillation (KD)
dc.subject: Gesture Recognition
dc.subject: Event-Based Data
dc.subject: Surrogate Gradient Training
dc.title: Distillation-boosted spiking neural network for gesture recognition
dc.type: Conference-Full-text
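The abstract describes training an SNN student against an ANN teacher via knowledge distillation. As a minimal sketch of a generic distillation objective (Hinton-style soft targets; the function name, temperature `T`, and weighting `alpha` are illustrative assumptions, not the paper's actual loss):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.7):
    """Generic KD objective: blend of a soft-target KL term
    (teacher vs. student at temperature T) and hard-label cross-entropy."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)))   # KL(teacher || student)
    ce = -np.log(softmax(student_logits)[label])      # standard cross-entropy
    # T**2 rescales the soft-target gradient magnitude, as is conventional.
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

In practice the student logits would come from accumulated SNN output spikes over the event stream, with surrogate gradients handling the non-differentiable spike function during backpropagation.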

Files

Original bundle

Name: 1571154405.pdf
Size: 1.85 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed to upon submission