Institutional-Repository, University of Moratuwa.  

Real-time upper body motion tracking using computer vision for improved human-robot interaction and teleoperation


dc.contributor.author Nandasena, NASN
dc.contributor.author Vimukthi, WAA
dc.contributor.author Herath, HMKKMB
dc.contributor.author Wijesinghe, R
dc.contributor.author Yasakethu, SLP
dc.contributor.editor Abeysooriya, R
dc.contributor.editor Adikariwattage, V
dc.contributor.editor Hemachandra, K
dc.date.accessioned 2024-03-20T09:31:54Z
dc.date.available 2024-03-20T09:31:54Z
dc.date.issued 2023-12-09
dc.identifier.citation N. A. S. N. Nandasena, W. A. A. Vimukthi, H. M. K. K. M. B. Herath, R. Wijesinghe and S. L. P. Yasakethu, "Real-Time Upper Body Motion Tracking Using Computer Vision for Improved Human-Robot Interaction and Teleoperation," 2023 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka, 2023, pp. 201-206, doi: 10.1109/MERCon60487.2023.10355479. en_US
dc.identifier.uri http://dl.lib.uom.lk/handle/123/22344
dc.description.abstract Upper body motion tracking and mapping is crucial for robot control because it gives the machine a better understanding of how a human operator moves, allowing it to react instinctively and naturally. Most current research has focused on using wearable sensors and remote controls to enhance communication between robots and humans. However, this research addresses the issue by embracing a non-wearable, sensor-based strategy to promote more natural and spontaneous interactions between humans and robots. Moreover, a 3-DoF manipulator was also designed and developed using robotics technologies. The vision system captured a human operator's upper body movements in real-time video footage. Computer vision approaches were used to extract positional and orientation information from the upper body in this setting. The system combines the MediaPipe pose model with kinematics theories to estimate the hands' position and movement in real time. According to the experimental results, the system's overall accuracy is 94.1 (±1.2) %, and the motion tracking system's accuracy is 96.5 (±2.0) %. en_US
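The abstract describes mapping tracked upper-body poses onto a 3-DoF manipulator via kinematics theory. As a minimal illustration only (not the authors' implementation, and assuming a planar arm with hypothetical unit link lengths), forward kinematics for such a 3-DoF arm can be sketched as:

```python
import math

def forward_kinematics(theta1, theta2, theta3, l1=1.0, l2=1.0, l3=1.0):
    """End-effector (x, y) of a planar 3-DoF arm.

    theta1..theta3: joint angles in radians, each measured relative
    to the previous link; l1..l3: link lengths (assumed values).
    """
    # Each link's direction accumulates the angles of all joints before it.
    x = (l1 * math.cos(theta1)
         + l2 * math.cos(theta1 + theta2)
         + l3 * math.cos(theta1 + theta2 + theta3))
    y = (l1 * math.sin(theta1)
         + l2 * math.sin(theta1 + theta2)
         + l3 * math.sin(theta1 + theta2 + theta3))
    return x, y

# Fully extended arm along the x-axis: all joint angles zero.
print(forward_kinematics(0.0, 0.0, 0.0))  # → (3.0, 0.0)
```

In the paper's pipeline, the joint angles would instead be estimated from MediaPipe pose landmarks rather than supplied directly; the angle names and link lengths here are placeholders.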
dc.language.iso en en_US
dc.publisher IEEE en_US
dc.relation.uri https://ieeexplore.ieee.org/document/10355479 en_US
dc.subject Assistive robotics en_US
dc.subject Control systems en_US
dc.subject Computer vision en_US
dc.subject Human-robot interaction en_US
dc.subject Upper body tracking en_US
dc.title Real-time upper body motion tracking using computer vision for improved human-robot interaction and teleoperation en_US
dc.type Conference-Full-text en_US
dc.identifier.faculty Engineering en_US
dc.identifier.department Engineering Research Unit, University of Moratuwa en_US
dc.identifier.year 2023 en_US
dc.identifier.conference Moratuwa Engineering Research Conference 2023 en_US
dc.identifier.place Katubedda en_US
dc.identifier.pgnos pp. 201-206 en_US
dc.identifier.proceeding Proceedings of Moratuwa Engineering Research Conference 2023 en_US
dc.identifier.email shehann@sltc.ac.lk en_US
dc.identifier.email adithyav@sltc.ac.lk en_US
dc.identifier.email kasunkh@sltc.ac.lk en_US
dc.identifier.email lasithy@sltc.ac.lk en_US