Abstract
Sound perception is a fundamental skill for many people with severe sight impairments. The research presented in this article is part of an ongoing project that aims to create a mobile guidance aid to help people with vision impairments find objects in an unknown indoor environment. This system requires an effective non-visual interface and uses bone-conduction headphones to transmit audio instructions to the user. It has been implemented and tested with spatialised audio cues, which convey the direction of a predefined target in 3D space. We present an in-depth evaluation of the audio interface through several experiments involving a large number of participants, both blindfolded sighted participants and participants with visual impairments, and analyse the pros and cons of our design choices. In addition to producing results comparable to the state of the art, we found that Fitts's Law (a predictive model for human movement) provides a suitable metric for improving and refining the quality of the audio interface in future mobile navigation aids.
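Fitts's Law, mentioned above, predicts movement time as a linear function of an index of difficulty computed from target distance and width. The sketch below uses the common Shannon formulation; the intercept `a` and slope `b` are hypothetical placeholder values that would in practice be fitted to experimental pointing data, not values taken from this study.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits:
    ID = log2(D/W + 1), where D is distance to target and W is target width."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance: float, width: float,
                            a: float = 0.2, b: float = 0.1) -> float:
    """Fitts's Law: MT = a + b * ID.
    a (seconds) and b (seconds per bit) are illustrative placeholders here;
    they must be estimated by linear regression on measured trials."""
    return a + b * index_of_difficulty(distance, width)

# Example: an auditory pointing task with a target 60 degrees away
# and an acceptance window 15 degrees wide (angular D and W).
id_bits = index_of_difficulty(60, 15)      # log2(5) ≈ 2.32 bits
mt = predicted_movement_time(60, 15)
```

For an audio-guided head-pointing task, distance and width are naturally expressed as angles; the same linear model applies, which is what makes the measured slope a useful quality metric for comparing interface designs.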
Experimental Analysis of a Spatialised Audio Interface for People with Visual Impairments