Autonomous wheelchair robot navigation incorporating user expressions

dc.contributor.author: Pathirana, T
dc.contributor.author: Jayasekara, AGBP
dc.contributor.author: Madhushanka, BGDA
dc.contributor.editor: Gamage, JR
dc.contributor.editor: Nandasiri, GK
dc.contributor.editor: Mawathage, SA
dc.contributor.editor: Herath, RP
dc.date.accessioned: 2025-05-29T09:51:51Z
dc.date.issued: 2024
dc.description.abstract: As the aging population rises, 80% of older people will live in low- and middle-income countries by 2050. Robotic wheelchairs offer autonomous navigation using algorithms that incorporate user expressions. These systems can improve elderly care by detecting emotions and assisting with tasks to enhance living standards and reduce stress. Intelligent voice control methods have enabled human-robot interaction through hybrid navigation decision control [1]. However, current systems are unable to incorporate user emotion recognition before navigation. Some researchers have created outdoor travelling systems driven by voice-model input [2], [3]; such systems should be improved to provide user-oriented communication based on the user's emotions for comfortable navigation. Multimodal human-computer interaction for wheelchair navigation has been developed by incorporating gesture, speech, and head posture to control the wheelchair platform [4]; however, this kind of multimodal system reduced user comfort. Researchers have also introduced on-ground-projection-based shared Human-Machine Interface (HMI) approaches for autonomous wheelchairs [5], and egocentric computer vision has been applied for hands-free wheelchair navigation [6]. This line of research can be improved by using Simultaneous Localization and Mapping (SLAM), the Dynamic Window Approach (DWA), and Adaptive Monte Carlo Localization (AMCL) for navigation with reasonable accuracy, including suggesting multiple candidate paths to a single goal in order to reduce user stress. Facial expression recognition can be developed using a Convolutional Neural Network (CNN) such as the CNN3 architecture [7], [8]. With good AI assistance, wheelchair navigation can be improved to reduce user stress. Modern wheelchair platform navigation can be adapted to the user's feelings by suggesting different paths toward a common goal, and human-robot interaction can be enhanced by applying a human-like thinking pattern in modern wheelchairs.
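The abstract's core idea, suggesting several candidate paths to one goal and choosing the one that suits the user's detected emotion, can be illustrated with a minimal sketch. This is not the authors' implementation (which would sit on top of SLAM/DWA/AMCL navigation and a CNN expression classifier); `CandidatePath`, `select_path`, the emotion labels, and the scoring criteria are all hypothetical assumptions for illustration only.

```python
# Hypothetical sketch of emotion-aware path selection among
# candidate paths to a single goal. All names, labels, and
# scoring rules below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CandidatePath:
    name: str
    length_m: float         # total path length in metres
    min_clearance_m: float  # closest approach to obstacles
    turn_count: int         # number of sharp turns

def select_path(paths, emotion):
    """Pick a path for the goal: a stressed user gets the calmer
    path (wider clearance, fewer turns); otherwise the shortest."""
    if emotion in ("stressed", "fearful"):
        # Prefer comfort: maximise clearance, then minimise turns.
        return max(paths, key=lambda p: (p.min_clearance_m, -p.turn_count))
    # Prefer efficiency: minimise path length.
    return min(paths, key=lambda p: p.length_m)

paths = [
    CandidatePath("corridor", 12.0, 0.4, 5),
    CandidatePath("open-hall", 16.5, 1.2, 2),
]
print(select_path(paths, "stressed").name)  # open-hall
print(select_path(paths, "neutral").name)   # corridor
```

In a full system the emotion label would come from the CNN-based facial expression classifier and the candidate paths from the global planner, with DWA and AMCL handling local control and localisation.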
dc.identifier.conference: ERU Symposium - 2024
dc.identifier.department: Department of Electrical Engineering
dc.identifier.doi: https://doi.org/10.31705/ERU.2024.5
dc.identifier.email: pathiranaaptd.23@uom.lk
dc.identifier.email: buddhikaj@uom.lk
dc.identifier.email: bgmad@ou.ac.lk
dc.identifier.faculty: Engineering
dc.identifier.issn: 3051-4894
dc.identifier.pgnos: pp. 13-14
dc.identifier.place: Sri Lanka
dc.identifier.proceeding: Proceedings of the ERU Symposium 2024
dc.identifier.uri: https://dl.lib.uom.lk/handle/123/23576
dc.language.iso: en
dc.publisher: Engineering Research Unit
dc.subject: Wheelchair Robot
dc.subject: Mobile Robot Navigation
dc.subject: Multiple Path Planning
dc.subject: User Expression
dc.subject: Deep Learning
dc.title: Autonomous wheelchair robot navigation incorporating user expressions
dc.type: Conference-Extended-Abstract

Files

Original bundle

Name: 5.Autonomous Wheelchair Robot Navigation.pdf
Size: 313.49 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission
