Thursday, August 30, 2012

Paper Reading #1 - The User as a Sensor: Navigating Users with Visual Impairments in Indoor Spaces using Tactile Landmarks

Intro:
     Title: The User as a Sensor: Navigating Users with Visual Impairments in Indoor Spaces using Tactile Landmarks
      Author Bios:
            1. Navid Fallah
    • http://www.linkedin.com/in/fallah
    • Was a PhD student at University of Nevada, Reno, now works at Eye-Com Corporation
    • Works with a research team on vision algorithms for a gaze tracking application            
            2. Ilias Apostolopoulos
    •  http://www.linkedin.com/pub/ilias-apostolopoulos/25/865/804, http://www.cse.unr.edu/robotics/pracsys/apostolopoulos
    • PhD candidate at University of Nevada, Reno
    • Works on the localization and path planning parts of Navatar
            3. Kostas Bekris
    •  http://www.linkedin.com/pub/kostas-bekris/4/679/b5, http://www.cse.unr.edu/robotics/pracsys/bekris
    • Assistant Professor at Rutgers University (as of July); previously an Assistant Professor at the University of Nevada, Reno
    • Does research on physically-grounded autonomous agents and motion planning
            4. Eelke Folmer
    • http://www.linkedin.com/in/eelke
    • Associate Professor at the University of Nevada, Reno
    • Works to solve interaction design problems for disabled users
Summary:
Visually impaired people often have to rely on navigation systems to move around indoor environments, but existing systems tend to be inaccurate, expensive, or impractical. This team built an indoor navigation system, Navatar, that aims to solve these issues.

Navatar System Overview and how the components interact [1]
This paper began by describing current navigation systems and their shortcomings. GPS cannot be used since its signals cannot be received inside building structures. Accelerometers, magnetometers, and gyroscopes, which use dead-reckoning estimation to determine location based on the subject's movement, become increasingly inaccurate over time. Beacon-based techniques, such as RFID tags, can become expensive due to the number of tags that need to be installed, and their signal strength can be weak. Sensor-based approaches, such as cameras, can be expensive and can impede the mobility of the user.
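The drift problem with dead reckoning can be illustrated with a minimal sketch (the function name, step length, and error numbers below are my own illustration, not from the paper): a small constant compass bias makes the position error grow in proportion to the distance walked.

```python
import math

def dead_reckon(steps, step_length_m, heading_deg, heading_error_deg=0.0):
    """Estimate an (x, y) position after walking `steps` steps at a fixed
    compass heading. `heading_error_deg` models a constant magnetometer
    bias, a typical source of dead-reckoning drift."""
    theta = math.radians(heading_deg + heading_error_deg)
    x = steps * step_length_m * math.cos(theta)
    y = steps * step_length_m * math.sin(theta)
    return x, y

# A 5-degree compass bias over a ~49 m walk (70 steps at 0.7 m each):
true_x, true_y = dead_reckon(70, 0.7, 90)       # intended path
est_x, est_y = dead_reckon(70, 0.7, 90, 5)      # what the phone computes
drift = math.hypot(est_x - true_x, est_y - true_y)  # over 4 m off course
```

Because the error only grows, any correction signal, like the user confirmations described below, has to come from outside the sensors themselves.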


Person using the product, and image showing user input improving accuracy. [1]
Navatar's approach is to combine smartphone accelerometers and magnetometers (a dead-reckoning solution) with user input as a sensor. Navatar uses a 2D map to track where the user is using the smartphone's sensors. It gives the user directions such as "Follow the wall to your left until you reach a hallway intersection." The user then touches the screen of the phone to let Navatar know the direction is complete, and it moves on to the next instruction. This user input is used to mitigate errors in dead-reckoning techniques. The system was tested on both visually impaired and blindfolded users, and while it still needs improvement, the tests showed that this could be an effective low-cost solution.[1]
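The core idea of using confirmations at tactile landmarks can be sketched as follows. This is a deliberately simplified illustration (the function, route coordinates, and drift figure are my own assumptions; the paper's actual localization is more sophisticated than a hard reset): each user tap snaps the estimate back to the landmark's known map coordinates, so dead-reckoning error stays bounded per leg instead of accumulating over the whole route.

```python
def navigate(instructions, drift_per_leg_m=1.5):
    """Walk a list of (direction_text, landmark_xy) legs. Between
    confirmations the position estimate accumulates up to
    `drift_per_leg_m` of dead-reckoning error; each user tap resets the
    estimate to the confirmed landmark's known map coordinates."""
    pos, worst_error = None, 0.0
    for text, landmark_xy in instructions:
        print(text)                 # direction given to the user, e.g. via text-to-speech
        worst_error = max(worst_error, drift_per_leg_m)
        pos = landmark_xy           # user tap: snap estimate to the landmark
    return pos, worst_error

route = [
    ("Follow the wall to your left until you reach a hallway intersection.",
     (12.0, 5.0)),
    ("Turn right and walk until you reach the second door.",
     (12.0, 14.0)),
]
final_pos, worst_error = navigate(route)
```

The design point is that the error bound depends on landmark spacing, not total path length, which is why cheap phone sensors become usable.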


Related work not referenced in the paper:
1. An Indoor Navigation System for the Visually Impaired
  • http://www.mdpi.com/1424-8220/12/6/8236/htm
  • This one solves the same problem as my assigned paper, but with a different solution. It uses a Wii Remote!
2. VI-Navi: a novel indoor navigation system for visually impaired people
  • http://dl.acm.org/citation.cfm?id=2023669
  • Again, this has the same problem, but this one solves it through creating an environment with Infrared transmitters that is like a GPS environment.
3. Indoor-Outdoor Navigation System for Visually-Impaired Pedestrians:
Preliminary Evaluation of Position Measurement and Obstacle Display
  • http://www.aist-ari.org/papers/distribution/2011/ISCW2011-pdr-poster.pdf
  • This one has a solution for both indoors and outdoors. It uses a variety of technologies, such as GPS and Wi-Fi. It also detects obstacles in the person's path.
4. Supporting visually impaired navigation: a needs-finding study
  • http://dl.acm.org/citation.cfm?id=1979822
  • This paper shows results from interviews with visually impaired people about developing a solution
5. Guiding visually impaired people in the exhibition. Mobile Guide
  • http://pdf.aminer.org/000/247/435/who_joins_the_platform_the_case_of_the_rfid_business.pdf
  • This paper discusses a location based tour guide implemented with RFID location
6. RFID information grid for blind navigation and wayfinding
  • http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1550783&tag=1
  • This describes a navigation and location finding system using RFID information grid
7. Adaptive power control of obstacle avoidance system using via motion context for visually impaired person
  • http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6215752 
  • This describes a way to use ultrasonic sensors to help the visually impaired avoid obstacles
8. An integrated wireless indoor navigation system for visually impaired
  • http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5929098
  • This paper uses a ZigBee wireless mesh network to locate the user and a compass to determine their orientation, helping them navigate.
9. RFAIDE — An RFID based navigation and object recognition assistant for visually impaired people
  • http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6048486 
  • The system in this paper is a mobile RFID reader with an integrated ZigBee transceiver that transmits each tag's information.
10. Ultrasonic spectacles and waist-belt for visually impaired and blind person
  • ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6176765
  • This paper also uses a network of ultrasonic sensors to help the person navigate
From these articles, the trend is clearly toward sensor-based methods, such as RFID navigation, and the use of mobile phones. None of these articles seemed to focus on user input though, which I think is a novel idea from my assigned paper.

Evaluation:
This paper used several methods to evaluate the solution. A 5-point Likert scale measured the usability of Navatar, a quantitative but subjective evaluation. Open-response questions (qualitative and subjective) gathered feedback about the experience, and the number of successfully navigated paths provided a quantitative, objective measure. They were measuring the system's use as a whole, so it was a systemic evaluation, although some open-response answers commented on specific parts of the system. They had a well-rounded evaluation of their idea, but they could have done more extensive testing on more users: there was a group of 8 and a group of 6 in the test situation. One group was visually impaired; the other was sighted and blindfolded. Both groups were asked to navigate several courses to see if they could complete them, and then provided the feedback I described above. [1]

Discussion:
I really enjoyed this paper. They explained their process in a way that was easily understood, and it seemed like a very successful solution. I thought their idea to use the person as the sensor was very innovative. Most computer scientists only think about how they can use technology to solve problems and often overlook the simple solution of relying a little on the actual person. I also think this would make the person feel more in control of where they are being directed, which is something I assume they don't experience often in unknown surroundings.

Reference Information:
[1] The User as a Sensor: Navigating Users with Visual Impairments in Indoor Spaces using Tactile Landmarks: http://dl.acm.org/citation.cfm?id=2207735
[2] All papers listed were found using http://scholar.google.com/