Title: The User as a Sensor: Navigating Users with Visual Impairments in Indoor Spaces using Tactile Landmarks
Author Bios:
1. Navid Fallah
- http://www.linkedin.com/in/fallah
- Was a PhD student at the University of Nevada, Reno; now works at Eye-Com Corporation
- Works with a research team on vision algorithms for a gaze-tracking application
2. Ilias Apostolopoulos
- http://www.linkedin.com/pub/ilias-apostolopoulos/25/865/804, http://www.cse.unr.edu/robotics/pracsys/apostolopoulos
- PhD candidate at the University of Nevada, Reno
- Works on the localization and path-planning parts of Navatar
3. Kostas Bekris
- http://www.linkedin.com/pub/kostas-bekris/4/679/b5, http://www.cse.unr.edu/robotics/pracsys/bekris
- Assistant Professor at Rutgers University (as of July); before that he was an Assistant Professor at the University of Nevada, Reno
- Does research on physically-grounded autonomous agents and motion planning
4. Eelke Folmer
- http://www.linkedin.com/in/eelke
- Associate Professor at the University of Nevada, Reno
- Works to solve interaction design problems for disabled users
People with visual impairments often have to rely on navigation systems to get around indoor environments, but existing systems tend to be inaccurate, very expensive, or impractical to deploy. This team set out to solve these issues with their own indoor navigation system, Navatar, which runs on a smartphone and locates the user by combining the phone's sensors with the user's own confirmations of tactile landmarks such as doors and hallway intersections.
[Figure: Navatar system overview and how the components interact [1]]
[Figure: A person using the system, and an image showing how user input improves accuracy [1]]
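The "user as a sensor" idea is that a simple tap from the user (for example, "I just passed a door") can correct the drift that builds up from dead reckoning on the phone's accelerometer and compass. One way to implement this kind of correction is with a particle filter; the sketch below is only my own rough illustration of the idea, not the authors' implementation, and the landmark coordinates, noise values, and step length are made-up assumptions.

```python
# Minimal sketch (not the authors' code): correcting dead-reckoning drift
# with a particle filter when the user confirms a tactile landmark.
# All coordinates and noise values below are invented for illustration.
import math
import random

# Hypothetical map: positions (in meters) of "door" landmarks along a hallway.
DOORS = [(5.0, 0.0), (12.0, 0.0), (19.0, 0.0)]

NUM_PARTICLES = 500
STEP_LENGTH = 0.7     # assumed average step length (meters)
STEP_NOISE = 0.15     # uncertainty added to each step (meters)
HEADING_NOISE = 0.1   # uncertainty in the compass heading (radians)
CONFIRM_SIGMA = 1.0   # how tightly a confirmation pins the user to a door

# Start all particles near the assumed entrance at (0, 0).
particles = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(NUM_PARTICLES)]

def predict(particles, heading):
    """Move every particle one noisy step along the compass heading."""
    moved = []
    for x, y in particles:
        h = heading + random.gauss(0, HEADING_NOISE)
        d = STEP_LENGTH + random.gauss(0, STEP_NOISE)
        moved.append((x + d * math.cos(h), y + d * math.sin(h)))
    return moved

def confirm_landmark(particles, landmarks):
    """Reweight and resample particles after the user confirms a landmark type.

    Particles close to some landmark of that type get high weight; particles
    far from every landmark are unlikely to survive the resampling step.
    """
    weights = []
    for x, y in particles:
        nearest = min(math.hypot(x - lx, y - ly) for lx, ly in landmarks)
        weights.append(math.exp(-(nearest ** 2) / (2 * CONFIRM_SIGMA ** 2)))
    return random.choices(particles, weights=weights, k=len(particles))

# Simulate: the user walks roughly 7 steps east, then taps "I reached a door".
for _ in range(7):
    particles = predict(particles, heading=0.0)
particles = confirm_landmark(particles, DOORS)

mean_x = sum(x for x, _ in particles) / len(particles)
mean_y = sum(y for _, y in particles) / len(particles)
print(f"Estimated position after confirmation: ({mean_x:.1f}, {mean_y:.1f})")
```

Before the confirmation the particles spread out a little more with every step; the confirmation collapses them around the nearest door, which is the accuracy improvement the second figure illustrates.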
Related work not referenced in the paper:
1. An Indoor Navigation System for the Visually Impaired
- http://www.mdpi.com/1424-8220/12/6/8236/htm
- This one addresses the same problem as the paper I reviewed, but with a different solution: it uses a Wii Remote!
2. VI-Navi: a novel indoor navigation system for visually impaired people
3. Indoor-Outdoor Navigation System for Visually-Impaired Pedestrians
- http://dl.acm.org/citation.cfm?id=2023669
- This addresses the same problem, but solves it by installing infrared transmitters that create a GPS-like environment indoors.
4. Preliminary Evaluation of Position Measurement and Obstacle Display
- http://www.aist-ari.org/papers/distribution/2011/ISCW2011-pdr-poster.pdf
- This solution works both indoors and outdoors, combining several sources such as GPS and Wi-Fi, and it also detects obstacles in the person's path.
5. Supporting visually impaired navigation: a needs-finding study
6. Guiding visually impaired people in the exhibition. Mobile Guide
- http://dl.acm.org/citation.cfm?id=1979822
- This paper presents results from interviews with visually impaired people about developing a solution.
7. http://pdf.aminer.org/000/247/435/who_joins_the_platform_the_case_of_the_rfid_business.pdf
- This paper discusses a location-based tour guide implemented with RFID localization.
8. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1550783&tag=1
- This describes a navigation and location-finding system using an RFID information grid.
9. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6215752
- This describes a way to use ultrasonic sensors to help visually impaired people avoid obstacles.
10. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5929098
- This paper locates the user with a ZigBee wireless mesh network and uses a compass to determine their orientation to help them navigate.
11. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6048486
- This system uses a mobile RFID reader with an integrated ZigBee transceiver for transmitting the tags' information.
12. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6176765
- This paper also uses a network of ultrasonic sensors to help the person navigate.
Evaluation:
The authors used several methods to evaluate their solution. A 5-point Likert scale measured the usability of Navatar, which is a quantitative but subjective evaluation. Open-response questions (qualitative and subjective) gathered feedback about the experience, and the number of successfully completed paths provided a quantitative, objective measure. Since they were measuring the system's use as a whole, this was a systemic evaluation, although some of the open-response answers commented on specific parts of the system. Overall it was a well-rounded evaluation of their idea, but they could have done more extensive testing with more users. Two groups took part in the study, one of 8 participants and one of 6; one group was visually impaired and the other was sighted but blindfolded. Both groups were asked to navigate several routes to see whether they could complete them, and then provided the feedback described above. [1]
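To make those measures concrete, the sketch below computes the two quantitative summaries this kind of evaluation reports: a mean Likert rating and a path-completion rate. The ratings and trial outcomes are placeholder values I made up, not the paper's data.

```python
# Hypothetical evaluation summary; the values below are made-up placeholders,
# not the data reported in the paper.
likert_ratings = [4, 5, 3, 4, 5, 4]              # 5-point usability ratings
trials_completed = [True, True, False, True,     # did each navigation
                    True, True, True, False]     # attempt reach its goal?

mean_rating = sum(likert_ratings) / len(likert_ratings)
success_rate = sum(trials_completed) / len(trials_completed)

print(f"Mean usability rating: {mean_rating:.1f} / 5")
print(f"Successful navigations: {success_rate:.0%}")
```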
Discussion:
I really enjoyed this paper. They explained their process in a way that was easily understood, and it seemed like a very successful solution. I thought their idea to use the person as the sensor was very innovative. Most computer scientists only think about how they can use technology to solve problems and often overlook the simple solution of relying a little on the actual person. I also think this would make the person feel more in control of where they are being directed, which is something I assume they don't experience often in unknown surroundings.
Reference Information:
[1] The User as a Sensor: Navigating Users with Visual Impairments in Indoor Spaces using Tactile Landmarks: http://dl.acm.org/citation.cfm?id=2207735
[2] All papers listed were found using http://scholar.google.com/