Lidar Based Navigational Tool for the Legally Blind

Mentor 1

Mohammad Habibur Rahman

Location

Union Wisconsin Room

Start Date

5-4-2019 1:30 PM

End Date

5-4-2019 3:30 PM

Description

The purpose of this research is to develop a navigational tool that helps the legally blind navigate without any third-party involvement. Most tools available today focus on people with low vision and rely on call centers that intervene to guide the user to a destination, a service that comes with a price tag. This research focuses on a tool, in the form of smart glasses or a rover, that scans the surroundings as the user moves and converts the scanned information into voice guidance the person can use to navigate. All of this is expected to happen in real time. So far, we have tried several approaches using an ultrasonic sensor, a Pixy camera, and rovers that generate their own paths and guide themselves. We have examined how a rover can handle traffic avoidance and recognize objects in real time, and we have studied how a rover generates and tracks a path. Our next goal is to use a lidar system in both cases, study its efficiency in mapping the surroundings, and integrate the mapping to generate audio instructions. Once this is done, we plan to study how precisely the rover responds in fast-moving traffic. We believe this technology can be cheaper than tools currently on the market while being more accurate and precise in helping the legally blind in their daily commute.
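To illustrate the scan-to-voice idea described above, here is a minimal sketch, not the authors' implementation: it assumes a 2D lidar scan arrives as (angle, range) pairs and reduces the nearest obstacle in the forward arc to a short text cue that a text-to-speech engine could read aloud. The scan format, the 45-degree forward arc, and the 2-meter warning range are all illustrative assumptions.

```python
# Illustrative sketch only: reduce a 2D lidar scan to a spoken-style cue.
# The scan format (angle in degrees, range in meters) and the thresholds
# below are assumptions for demonstration, not details from the abstract.

def cue_from_scan(scan, warn_range=2.0):
    """Return a short text instruction from (angle_deg, range_m) pairs.

    Angles: 0 = straight ahead, negative = left, positive = right.
    Only the forward arc (-45..45 degrees) is considered.
    """
    forward = [(a, r) for a, r in scan if -45 <= a <= 45]
    if not forward:
        return "path clear"
    # Nearest return in the forward arc decides the cue.
    angle, dist = min(forward, key=lambda p: p[1])
    if dist > warn_range:
        return "path clear"
    side = "left" if angle < 0 else "right" if angle > 0 else "ahead"
    return f"obstacle {dist:.1f} meters {side}"

# Obstacle 1.2 m away, 30 degrees to the right of heading:
print(cue_from_scan([(-60, 0.8), (30, 1.2), (0, 3.5)]))
# Nothing within the warning range:
print(cue_from_scan([(0, 5.0)]))
```

In a real system the text would be passed to a speech synthesizer and the scan would update continuously; this sketch only shows the mapping from range data to an instruction.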

