NavCog app from CMU, IBM maps the world for the blind

Researchers from Carnegie Mellon University have teamed up with IBM to create the first open platform for smartphone apps that help blind people navigate their surroundings.

The app, called NavCog, currently works primarily on the Carnegie Mellon campus. It uses existing smartphone sensors and Bluetooth beacons to inform blind users about their surroundings through earbuds or smartphone vibrations.

“When in an unfamiliar place, people tend to use a walking navigation system on their device to compare the map location to the surrounding views,” NavCog’s website explains. “However, visually impaired people cannot check the map or the surrounding scenery to bridge the gap between the ground truth and the rough GPS location.

“NavCog aims for an improved high-accuracy walking navigation system that uses BLE beacons together with various kinds of sensors with a new localization algorithm for both indoors and outdoors.”
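The BLE-beacon approach described above can be illustrated with a toy example. The sketch below is not NavCog's actual localization algorithm (which fuses beacon signals with other sensors), only a minimal, hypothetical demonstration of the underlying idea: converting beacon signal strength (RSSI) to an approximate distance with the standard log-distance path-loss model, then blending the known beacon positions into a position estimate. All function names and parameter values here are illustrative assumptions.

```python
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Estimate distance in meters from a BLE beacon reading using the
    log-distance path-loss model. tx_power is the calibrated RSSI at
    1 m (an assumed value); n is the environment's path-loss exponent."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def locate(beacons):
    """Estimate the user's (x, y) position as a weighted average of
    beacon positions, weighting each beacon by the inverse of its
    estimated distance. `beacons` is a list of ((x, y), rssi) tuples."""
    weighted = [((x, y), 1.0 / max(rssi_to_distance(rssi), 0.1))
                for (x, y), rssi in beacons]
    total = sum(w for _, w in weighted)
    px = sum(x * w for (x, _), w in weighted) / total
    py = sum(y * w for (_, y), w in weighted) / total
    return px, py

# Three beacons at known positions; a stronger (less negative) RSSI
# means the user is closer, so the estimate skews toward (0, 0) here.
readings = [((0.0, 0.0), -55), ((10.0, 0.0), -75), ((0.0, 10.0), -75)]
print(locate(readings))
```

A real system would smooth noisy RSSI readings over time (for example with a Kalman or particle filter) and fuse them with inertial sensor data, which is what makes indoor/outdoor accuracy like NavCog's hard in practice.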

NavCog is currently available online and will soon be downloadable from the App Store. IBM has also made the cognitive assistance tools available to developers through its Bluemix cloud platform.

These tools include a navigation app, a map editing tool, and localization algorithms to help the blind identify where they are and what direction they are facing.

Chieko Asakawa, an IBM Fellow and distinguished service professor in Carnegie Mellon’s Robotics Institute, headed the project.

“While visually impaired people like myself have become independent online, we are still challenged in the real world. To gain further independence and help improve the quality of life, ubiquitous connectivity across indoor and outdoor environments is necessary,” Asakawa said in a university press release. “I’m excited that this open platform will help accelerate the advancement of cognitive assistance research by giving developers opportunities to build various accessibility applications and test non-traditional technologies such as ultrasonic and advanced inertial sensors to assist navigation.”

NavCog is a product emerging from the research field of cognitive assistance, which utilizes smart machine technology to assist people in everyday life.

Researchers from the Carnegie Mellon University Cognitive Assistance Laboratory collaborated with IBM Research to explore how localization technologies, sensors, and computer vision can identify objects and people, helping to compensate for abilities that are missing or in decline. Although the app currently works only on Carnegie Mellon's campus, the researchers are optimistic and hope to work with others to bring the technology to other locations.

Martial Hebert, the director of the Robotics Institute, commented on NavCog, saying that “from localization information to understanding of objects, we have been creating technologies to make the real-world environment more accessible for everyone.

“With our long history of developing technologies for humans and robots that will complement humans’ missing abilities to sense the surrounding world, this open platform will help expand the horizon for global collaboration to open up the new real-world accessibility era for the blind in the near future.”

According to NavCog’s website, “NavCog aims to achieve a higher accuracy localization so that in the future, it will be able to provide a more precise description of surrounding points of interest and environmental information to the users based on their current location.”

While this new technology represents a major step forward in assisting those who are visually impaired, the research team hopes to continue pushing the boundaries.
In the future, they hope to develop versions of the app that can detect approaching individuals as well as their moods.