Abstract:
Environmental perception is the process by which visually impaired people
sense, understand, and build awareness of the walking environment and the surrounding objects.
Walkable space comprises any area through which the visually impaired person can physically
walk. Walkable space is estimated by identifying the pixels that belong to the ground plane
in the scene; the plane parameters are computed from the 3D point coordinates of the pixels
belonging to the sidewalk in the outdoor environment.
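The plane-fitting step can be sketched as follows. This is a minimal illustration assuming the sidewalk pixels have already been back-projected to 3D points via the depth image; the function names and the distance threshold are hypothetical, not taken from the paper.

```python
# Minimal sketch of ground-plane estimation from 3D sidewalk points.
# Assumes `points` is an (N, 3) array of back-projected pixel coordinates.
import numpy as np

def fit_ground_plane(points):
    """Least-squares plane fit: returns a unit normal n and offset d
    such that n . p + d = 0 for points p on the plane."""
    centroid = points.mean(axis=0)
    # SVD of the centered points; the right singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    d = -normal @ centroid
    return normal, d

def walkable_points(points, normal, d, max_dist=0.05):
    """Label a point as walkable if it lies within max_dist
    (metres, assumed) of the estimated ground plane."""
    dist = np.abs(points @ normal + d)
    return dist < max_dist
```

In practice, wrapping fit_ground_plane in a RANSAC loop would make the fit robust to non-ground outliers such as obstacles or curbs.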
Moreover, the curb line is estimated from the output of the walkable-space calculation to
determine whether the visually impaired navigator is walking on the sidewalk or on the road.
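One plausible realization of the curb-line step is sketched below, assuming the walkable space is available as a binary image mask; the boundary-extraction-plus-line-fit approach shown here is an assumption for illustration, not necessarily the paper's exact procedure.

```python
# Hedged sketch: extract a curb line as the boundary of the walkable mask.
# Assumes `mask` is an (H, W) boolean image, True where pixels are walkable.
import numpy as np

def estimate_curb_line(mask):
    """Fit a line u = a*v + b (u: column, v: row) through the leftmost
    boundary pixel of the walkable region in each image row."""
    rows, cols = [], []
    for v in range(mask.shape[0]):
        idx = np.flatnonzero(mask[v])
        if idx.size:                 # row contains walkable pixels
            rows.append(v)
            cols.append(idx[0])      # leftmost walkable column = boundary
    if len(rows) < 2:
        return None                  # not enough boundary evidence
    # Least-squares line fit through the collected boundary pixels.
    a, b = np.polyfit(rows, cols, deg=1)
    return a, b
```

Comparing the navigator's image-plane position against this fitted line would then indicate whether they are on the sidewalk side or the road side of the curb.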
Proof-of-concept and evaluation experiments are conducted in a 3D simulated environment
using RGB, semantic, and depth images captured by a wearable RGB camera, a segmentation
camera, and a depth camera. The results are benchmarked against existing image
processing-based methods and show that the proposed method can be successfully applied to
the safe navigation of visually impaired users with minimal computational complexity in
image processing.