Two new apps will enable blind people to navigate the interiors of buildings using spoken directions from a smartphone, providing a safe method of wayfinding where GPS doesn’t work.
UC Santa Cruz professor of Computer Science and Engineering Roberto Manduchi has devoted much of his research career to creating accessible technology for the blind and visually impaired. Throughout years of working with these communities, he has learned that there is a particular need for tools to help with indoor navigation of new spaces.
“Moving about independently in a place that you don’t know is particularly difficult, because you don’t have any visual reference — it’s very easy to get lost. The idea here is to try to make this a little bit easier and safer for people,” Manduchi said.
In a new paper published in the journal ACM Transactions on Accessible Computing, Manduchi’s research group presents two smartphone apps: one for indoor wayfinding, guiding a user to a specific destination, and one for safe return, tracing a past route back to its start. The apps give audio cues and don’t require users to hold the smartphone out in front of them, which would be both inconvenient and likely to attract undue attention.
Safer, scalable technology
Smartphones provide a good platform for hosting accessible technology: they are cheaper than dedicated hardware systems, receive ongoing software support from their manufacturers, and come equipped with built-in sensors and accessibility features.
Other smartphone-based wayfinding systems require a person to walk with their phone held out, which creates several problems. A blind person navigating an unfamiliar space often has one hand occupied by a guide dog or a cane, so devoting the other to a phone is less than ideal. Holding a phone out also leaves the navigator vulnerable to theft, and people with disabilities are already victims of crime at disproportionately high rates.
While companies like Apple and Google have developed indoor wayfinding for specific locations, such as major airports and stadiums, their methods depend on sensors installed inside those buildings. The cost of adding and maintaining that extra infrastructure makes this approach much less scalable.
Using built-in sensors
Manduchi’s wayfinding app provides a route in a similar way to GPS services like Google Maps; however, GPS-based systems don’t work indoors because the satellite signal is distorted by a building’s walls. Instead, Manduchi’s system uses other sensors within a smartphone to provide spoken instructions to navigate an unfamiliar building.
The wayfinding app uses a map of the building’s interior to compute a path to the destination, then uses the phone’s built-in inertial sensors (the accelerometers and gyroscopes behind features like the step counter) to track the navigator’s progress along that path.
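To make the idea concrete, here is a minimal dead-reckoning sketch in Python. The step length, coordinate frame, and function names are illustrative assumptions, not details taken from the paper:

```python
import math

# A minimal dead-reckoning sketch (not the authors' code): each detected
# step advances an estimated (x, y) position along the current heading.
STEP_LENGTH_M = 0.7  # assumed average stride; a real system would calibrate per user

def advance(x, y, heading_rad, n_steps=1, step_length=STEP_LENGTH_M):
    """Advance the position estimate by n_steps in the heading direction."""
    dx = n_steps * step_length * math.cos(heading_rad)
    dy = n_steps * step_length * math.sin(heading_rad)
    return x + dx, y + dy

# Example: starting at the entrance, facing along the corridor (0 rad),
# ten detected steps move the estimate about 7 m down the hallway.
x, y = advance(0.0, 0.0, heading_rad=0.0, n_steps=10)
```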
The same sensors also track the orientation of the phone, and therefore of the navigator. Because the estimated location and orientation are often somewhat inaccurate, the researchers incorporated a technique called particle filtering to enforce the physical constraints of the building, so that the system never concludes the navigator has walked through a wall or done something else impossible.
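The paper’s code is not reproduced here, but the idea can be sketched as follows; the wall representation, noise parameters, and helper names are assumptions for illustration:

```python
import math
import random

# A hypothetical illustration of the particle-filtering idea: many candidate
# positions ("particles") are advanced with noisy step length and heading,
# and any particle whose motion would cross a wall is discarded, so the
# surviving cloud respects the floor plan.

def _ccw(a, b, c):
    """Twice the signed area of triangle abc; the sign gives turn direction."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p, q, a, b):
    """True if segments pq and ab strictly cross each other."""
    return (_ccw(p, q, a) * _ccw(p, q, b) < 0 and
            _ccw(a, b, p) * _ccw(a, b, q) < 0)

def step_particles(particles, heading, walls,
                   step_mean=0.7, step_std=0.1, heading_std=0.15):
    """Advance each particle one noisy step; drop any that cross a wall."""
    survivors = []
    for x, y in particles:
        h = heading + random.gauss(0.0, heading_std)  # noisy heading (rad)
        d = random.gauss(step_mean, step_std)         # noisy step length (m)
        nx, ny = x + d * math.cos(h), y + d * math.sin(h)
        if not any(segments_intersect((x, y), (nx, ny), a, b)
                   for a, b in walls):                # enforce the floor plan
            survivors.append((nx, ny))
    return survivors  # a full filter would also resample to keep the count up

def estimate(particles):
    """Position estimate: the centroid of the surviving particle cloud."""
    n = len(particles)
    return (sum(x for x, _ in particles) / n,
            sum(y for _, y in particles) / n)
```

Discarding wall-crossing particles is what keeps dead-reckoning drift from accumulating into physically impossible positions.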
The backtracking app simply inverts a route the navigator has previously taken, which is helpful when a blind person has been guided into a room and wants to leave independently. In addition to the inertial sensors, it uses the phone’s magnetometer to identify characteristic magnetic field anomalies, typically created by large appliances, which can serve as landmarks within a building.
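As a rough illustration of how such an inversion might look (the leg representation and magnetic tolerance below are assumptions, not the published implementation):

```python
# A hypothetical sketch of the inversion idea: a recorded route is a list of
# (distance_m, heading_deg) legs, and the return route reverses the legs and
# rotates each heading by 180 degrees.

def invert_route(legs):
    """legs: [(distance_m, heading_deg), ...] in the order they were walked."""
    return [(d, (h + 180.0) % 360.0) for d, h in reversed(legs)]

def near_landmark(current_uT, recorded_uT, tolerance_uT=5.0):
    """Magnetic-anomaly check: does the current magnetometer magnitude match
    a field disturbance recorded on the outbound trip? (tolerance assumed)"""
    return abs(current_uT - recorded_uT) <= tolerance_uT

outbound = [(12.0, 90.0), (5.0, 0.0), (8.0, 90.0)]  # example recorded legs
print(invert_route(outbound))
# -> [(8.0, 270.0), (5.0, 180.0), (12.0, 270.0)]
```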
Communicating directions
Both systems give directions through spoken communication and can also be paired with a smartwatch that supplements the instructions with vibrations. Overall, the researchers tried to minimize the amount of instruction given to the navigator, so that they could focus on moving safely.
The apps also rely on the navigator to judge exactly where to turn, which accounts for any error in tracking. The system announces each directional change five meters before it anticipates the turn, with directions like “at the upcoming junction, turn left,” so the navigator can locate the turn with the help of their cane or guide dog.
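That five-meter rule is simple enough to sketch; the function below is a hypothetical illustration, not the apps’ actual code:

```python
# A minimal sketch of the early-announcement rule described above; the
# function name and state handling are assumptions for illustration.
ANNOUNCE_DISTANCE_M = 5.0  # announce turns five meters in advance

def maybe_announce(distance_to_turn_m, direction, already_announced):
    """Return a spoken instruction once the navigator enters the window."""
    if not already_announced and distance_to_turn_m <= ANNOUNCE_DISTANCE_M:
        return f"At the upcoming junction, turn {direction}.", True
    return None, already_announced

msg, announced = maybe_announce(4.2, "left", already_announced=False)
# msg == "At the upcoming junction, turn left."
```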
“Sharing responsibility, in my opinion, is the right approach,” Manduchi said. “As a philosophy, you cannot rely on technology alone. That is also true when you drive a car — if it says turn right, you don’t just immediately turn right, you look for where the junction is. You need to work with the system.”
Testing their systems in the Baskin Engineering building at UC Santa Cruz, the research team found that users were able to successfully navigate its many hallways and turns. The team will continue to polish the apps, which share the same interface but are kept separate for ease of development.
Going forward, they will focus on integrating AI features that would let a navigator take a photo of their surroundings and get a scene description in particularly hard-to-navigate areas, such as an alcove or a wide-open space. They also want to make it easier to access and download building maps, perhaps by taking advantage of an open-source software ecosystem.
“I’m very grateful to the blind community in Santa Cruz, who gave me fantastic advice. [As engineers creating technology for the blind community], you have to be very, very careful and very humble, and start from the person who will use the technology, rather than from the technology itself,” Manduchi said.