Institutional Repository, University of Moratuwa

Vision-EMG fusion method for real-time grasping pattern classification system


dc.contributor.author Perera, DM
dc.contributor.author Madusanka, DGK
dc.contributor.editor Adhikariwatte, W
dc.contributor.editor Rathnayake, M
dc.contributor.editor Hemachandra, K
dc.date.accessioned 2022-10-19T04:49:03Z
dc.date.available 2022-10-19T04:49:03Z
dc.date.issued 2021-07
dc.identifier.citation D. M. Perera and D. G. K. Madusanka, "Vision-EMG Fusion Method for Real-time Grasping Pattern Classification System," 2021 Moratuwa Engineering Research Conference (MERCon), 2021, pp. 585-590, doi: 10.1109/MERCon52712.2021.9525702. en_US
dc.identifier.uri http://dl.lib.uom.lk/handle/123/19130
dc.description.abstract Although recently developed Electromyography-based (EMG) prosthetic hands can classify a significant number of wrist motions, classifying 5-6 grasping patterns in real-time remains a challenging task. Combining EMG with vision has addressed this problem to a certain extent but has not achieved significant performance in real-time. In this paper, we propose a fusion method that improves the real-time prediction accuracy of the EMG system by merging a probability matrix representing the usage of the six grasping patterns for the targeted object. The YOLO object detection system retrieves the probability matrix of the identified object, which is used to correct errors in the EMG classification system. The experiments revealed that the optimized ANN model outperformed KNN, LDA, NB, and DT, achieving the highest mean True Positive Rate (mTPR) of 69.34% (21.54) in real-time across all six grasping patterns. Furthermore, the proposed feature set (Age, Gender, and Handedness of the user) increased the mTPR of the ANN by 16.05% (2.70). The proposed system takes 393.89 ms (178.23 ms) to produce a prediction, so the user does not perceive a delay between intention and execution. Furthermore, the system allows users to apply multiple grasping patterns to an object. en_US
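The abstract describes merging the EMG classifier's output with an object-specific grasp-usage probability matrix retrieved via YOLO object detection. As a rough illustration only, the following minimal Python sketch shows one such probability-fusion step (element-wise product with renormalisation, in the spirit of the "Bayesian fusion" subject keyword); the grasp labels, function name, and fusion rule are assumptions for illustration, not the paper's implementation.

# Hypothetical sketch of a vision-EMG probability fusion step.
# The exact fusion rule used in the paper is not given in the abstract.
import numpy as np

GRASPS = ["power", "precision", "lateral", "tripod", "spherical", "hook"]  # assumed labels

def fuse(emg_probs: np.ndarray, vision_prior: np.ndarray) -> np.ndarray:
    """Combine EMG classifier probabilities with the object-specific
    grasp-usage prior obtained from the detected object (element-wise
    product, renormalised). Illustrative assumption only."""
    fused = emg_probs * vision_prior
    total = fused.sum()
    # Fall back to the EMG output if the prior zeroes everything out.
    return fused / total if total > 0 else emg_probs

# Example: EMG output over six grasps and a hypothetical prior for a detected object.
emg_probs = np.array([0.30, 0.25, 0.15, 0.10, 0.10, 0.10])
vision_prior = np.array([0.05, 0.40, 0.05, 0.30, 0.10, 0.10])
print(dict(zip(GRASPS, fuse(emg_probs, vision_prior).round(3))))

In this sketch the vision-derived prior shifts the decision toward grasps commonly used with the detected object, which is one plausible way the vision branch could correct an erroneous EMG classification.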
dc.language.iso en en_US
dc.publisher IEEE en_US
dc.relation.uri https://ieeexplore.ieee.org/document/9525702 en_US
dc.subject Grasping patterns en_US
dc.subject Surface electromyography en_US
dc.subject Object detection en_US
dc.subject Bayesian fusion en_US
dc.subject Real-time classification en_US
dc.title Vision-EMG fusion method for real-time grasping pattern classification system en_US
dc.type Conference-Full-text en_US
dc.identifier.faculty Engineering en_US
dc.identifier.department Engineering Research Unit, University of Moratuwa en_US
dc.identifier.year 2021 en_US
dc.identifier.conference Moratuwa Engineering Research Conference 2021 en_US
dc.identifier.place Moratuwa, Sri Lanka en_US
dc.identifier.pgnos pp. 585-590 en_US
dc.identifier.proceeding Proceedings of Moratuwa Engineering Research Conference 2021 en_US
dc.identifier.doi 10.1109/MERCon52712.2021.9525702 en_US

