
Experimental Analysis of a Spatialised Audio Interface for People with Visual Impairments

Published: 15 October 2020

Abstract

Sound perception is a fundamental skill for many people with severe sight impairments. The research presented in this article is part of an ongoing project that aims to create a mobile guidance aid to help people with vision impairments find objects in an unknown indoor environment. This system requires an effective non-visual interface and uses bone-conduction headphones to transmit audio instructions to the user. It has been implemented and tested with spatialised audio cues, which convey the direction of a predefined target in 3D space. We present an in-depth evaluation of the audio interface through several experiments involving a large number of participants, both blindfolded sighted participants and participants with visual impairments, and analyse the pros and cons of our design choices. In addition to producing results comparable to the state of the art, we found that Fitts's Law (a predictive model for human movement) provides a suitable metric that can be used to improve and refine the quality of the audio interface in future mobile navigation aids.
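The abstract's claim that Fitts's Law can serve as a metric for an audio guidance interface can be made concrete with a short sketch. The snippet below uses the Shannon formulation of the index of difficulty; the intercept and slope values (`a`, `b`), and the idea of treating the angular distance to the target and its angular acceptance window as the Fitts distance and width, are illustrative assumptions for this sketch, not values taken from the article.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the distance to the target and
    W is the target width (here: angular error and angular window)."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance: float, width: float,
                            a: float = 0.2, b: float = 0.15) -> float:
    """Fitts's law MT = a + b * ID. The intercept a (seconds) and slope
    b (seconds per bit) are hypothetical constants; in practice they are
    fitted per interface via linear regression on measured times."""
    return a + b * index_of_difficulty(distance, width)

# Example: a spatialised audio cue for a target 60 degrees away that must
# be centred within a 4-degree window: ID = log2(60/4 + 1) = 4 bits.
print(index_of_difficulty(60, 4))      # -> 4.0
print(predicted_movement_time(60, 4))  # a + 4*b seconds
```

A lower fitted slope `b` indicates an interface that conveys target direction more efficiently, which is one way such a model can compare and refine candidate audio cue designs.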



Published in

ACM Transactions on Accessible Computing, Volume 13, Issue 4 (December 2020), 117 pages.
ISSN: 1936-7228
EISSN: 1936-7236
DOI: 10.1145/3430472

              Copyright © 2020 ACM

              Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Received: 1 January 2020
• Revised: 1 May 2020
• Accepted: 1 July 2020
• Published: 15 October 2020
