Abstract
Obstacle avoidance is a major challenge for independent mobility of blind or visually impaired (BVI) people. Typically, BVI people can perceive obstacles only at short range (about 1 m when using a white cane), and some obstacles are hard to detect (e.g., those elevated from the ground) or should not be hit with the cane (e.g., a standing person). Recent computer-vision techniques that run on mobile and wearable devices can address these problems by detecting obstacles at a distance. However, in addition to detecting obstacles, it is also necessary to convey information about them in real time.
This contribution presents WatchOut, a sonification technique that conveys real-time information about an obstacle's main properties to a BVI person, who can then use this additional feedback to navigate the environment safely. WatchOut was designed with a user-centered approach involving four iterations of online listening tests with BVI participants to define, improve, and evaluate the sonification technique, ultimately achieving near-perfect recognition accuracy. WatchOut was also implemented and tested as a module of a mobile app that detects obstacles using state-of-the-art computer vision. Results show that the system is considered usable and can guide users to avoid more than 85% of the obstacles.
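The abstract describes conveying an obstacle's properties through sound in real time, a form of parameter-mapping sonification. As a minimal illustrative sketch (not WatchOut's actual design), one might map distance to pulse rate and elevation to pitch; all ranges, values, and the function name below are assumptions chosen for illustration:

```python
def sonify_obstacle(distance_m, elevated):
    """Hypothetical parameter-mapping sonification of one obstacle.

    Closer obstacles produce faster pulses (greater urgency); elevated
    obstacles produce a higher pitch (a common vertical mapping).
    Returns (pitch_hz, pulse_interval_s) for an audio synthesis layer.
    """
    # Clamp distance to an assumed detection range of 0.5-5 m.
    d = max(0.5, min(distance_m, 5.0))
    # Pulse interval grows linearly with distance:
    # 0.1 s at 0.5 m (very close) up to 1.0 s at 5 m (far).
    pulse_interval_s = 0.1 + 0.9 * (d - 0.5) / 4.5
    # Pitch: 440 Hz for ground-level obstacles, one octave up if elevated.
    pitch_hz = 880.0 if elevated else 440.0
    return pitch_hz, pulse_interval_s
```

A nearby elevated obstacle would thus yield rapid high-pitched pulses, while a distant ground-level one yields slow low-pitched pulses; the actual WatchOut mappings were refined through the listening tests described above.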
Iterative Design of Sonification Techniques to Support People with Visual Impairments in Obstacle Avoidance