research-article

Iterative Design of Sonification Techniques to Support People with Visual Impairments in Obstacle Avoidance

Published: 15 October 2021

Abstract

Obstacle avoidance is a major challenge during independent mobility for blind or visually impaired (BVI) people. Typically, BVI people can only perceive obstacles at a short distance (about 1 m when using a white cane), and some obstacles are hard to detect (e.g., those elevated from the ground) or should not be hit with the white cane (e.g., a standing person). A solution to these problems can be found in recent computer-vision techniques that run on mobile and wearable devices to detect obstacles at a distance. However, in addition to detecting obstacles, it is also necessary to convey information about them in real time.

This contribution presents WatchOut, a sonification technique that conveys real-time information about the main properties of an obstacle to a BVI person, who can then use this additional feedback to navigate the environment safely. WatchOut was designed with a user-centered approach involving four iterations of online listening tests with BVI participants to define, improve, and evaluate the sonification technique, eventually achieving almost perfect recognition accuracy. WatchOut was also implemented and tested as a module of a mobile app that detects obstacles using state-of-the-art computer vision technology. Results show that the system is considered usable and can guide users to avoid more than 85% of obstacles.



• Published in

  ACM Transactions on Accessible Computing, Volume 14, Issue 4
  December 2021
  171 pages
  ISSN: 1936-7228
  EISSN: 1936-7236
  DOI: 10.1145/3485142

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Publication History

          • Published: 15 October 2021
          • Accepted: 1 June 2021
          • Revised: 1 February 2021
          • Received: 1 July 2020
Published in TACCESS Volume 14, Issue 4


          Qualifiers

          • research-article
          • Refereed
