
Developing hand-worn input and haptic support for real-world target finding

  • Original Article
  • Personal and Ubiquitous Computing

Abstract

Locating places in cities is typically facilitated by handheld mobile devices, which draw the user's visual attention to the device's screen instead of the surroundings. In this research, we aim to strengthen the connection between people and their surroundings by enabling mid-air gestural interaction with real-world landmarks and delivering information through audio, so that users' visual attention stays on the scene. Recent research on gesture-based and haptic techniques for such purposes has mainly considered handheld devices, which eventually direct users' attention back to the devices. We contribute a hand-worn, mid-air gestural interaction design with directional vibrotactile guidance for finding points of interest (POIs). Through three design iterations, we address (1) sensing technologies and the placement of actuators with respect to users' instinctive postures, (2) the feasibility of finding and fetching information about landmarks without visual feedback, and (3) the benefits of such interaction in a tourist application. In a final evaluation, participants located POIs and fetched information by pointing and following directional guidance, thus realising a vision in which they found and experienced real-world landmarks while keeping their visual attention on the scene. The results show that the interaction technique performs comparably to a visual baseline, enables high mobility, and helps keep visual attention on the surroundings.
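The core mechanism the abstract describes — pointing toward a POI and receiving directional vibrotactile guidance — can be illustrated with a minimal sketch. This is not the authors' implementation: the great-circle bearing helper and the evenly spaced actuator layout (`bearing_deg`, `actuator_for_offset`, `n_actuators`) are assumptions made purely for illustration.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), degrees in [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def actuator_for_offset(pointing_deg, target_deg, n_actuators=4):
    """Map the angular offset between the hand's pointing direction and the
    target bearing to one of n actuators arranged evenly around the hand
    (actuator 0 = straight ahead, numbering clockwise)."""
    offset = (target_deg - pointing_deg) % 360
    sector = 360 / n_actuators
    # Centre each actuator's sector on its own direction, so small offsets
    # around 0 degrees all map to the "ahead" actuator.
    return int(((offset + sector / 2) % 360) // sector)
```

With four actuators, a target 90 degrees to the right of the pointing direction would pulse actuator 1, while a target nearly straight ahead (offset within ±45 degrees) would pulse actuator 0.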




Notes

  1. Haptic perception consists of both kinaesthetic and cutaneous senses [2]. In this paper, we refer to vibrations as tactile feedback and to the system that generates them as a haptic system/glove; although no force feedback is presented, we consider the force and motor activities involved in using the system an important part of the interaction.

  2. https://www.wikitude.com/app/

References

  1. Hyman IE, Boss SM, Wise BM, McKenzie KE, Caggiano JM (2010) Did you see the unicycling clown? Inattentional blindness while walking and talking on a cell phone. Appl Cogn Psychol 24(5):597–607. https://doi.org/10.1002/acp.1638

  2. Scott M (2001) Tactual perception. Australas J Philos 79(2):149–160


  3. Rateau H, Grisoni L, De Araujo B (2014) Mimetic interaction spaces: controlling distant displays in pervasive environments. In: Proceedings of the 19th international conference on Intelligent User Interfaces - IUI ‘14, p 89–94. https://doi.org/10.1145/2557500.2557545

  4. Benko H, Wilson AD (2010) Multi-point interactions with immersive omnidirectional visualizations in a dome. In: ACM International Conference on Interactive Tabletops and Surfaces - ITS ‘10, p 19–28. https://doi.org/10.1145/1936652.1936657

  5. Egenhofer M (1999) Spatial information appliances: a next generation of geographic information systems. In: 1st Brazilian workshop on geoinformatics, Campinas, Brazil, pp 1–4

  6. Simon R, Fröhlich P, Grechenig T (2008) GeoPointing: evaluating the performance of orientation-aware location-based interaction under real-world conditions. J Locat Based Serv 2(1):24–40. https://doi.org/10.1080/17489720802347986

  7. Magnusson C, Rassmus-Gröhn K, Szymczak D (2010) Scanning angles for directional pointing. In: Proceedings of the 12th international conference on Human computer interaction with mobile devices and services - MobileHCI ‘10, p 399–400. https://doi.org/10.1145/1851600.1851684

  8. Lei Z, Coulton P (2009) A mobile geo-wand enabling gesture-based POI search and user-generated directional POI photography. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology, pp 392–395. https://doi.org/10.1145/1690388.1690469

  9. Pielot M, Heuten W, Zerhusen S, Boll S (2012) Dude, where’s my car?: in-situ evaluation of a tactile car finder. In: Proceedings of the 7th Nordic Conference on Human-Computer Interaction Making Sense Through Design - NordiCHI ‘12, p 166–169. https://doi.org/10.1145/2399016.2399042

  10. Szymczak D, Rassmus-Gröhn K, Magnusson C, Hedvall P-O (2012) A real-world study of an audio-tactile tourist guide. In: Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services - MobileHCI ‘12, pp 335–344. https://doi.org/10.1145/2371574.2371627

  11. Robinson S, Eslambolchilar P, Jones M (2009) Sweep-shake: finding digital resources in physical environments. In: Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services - MobileHCI ‘09, pp 12:1–12:10. https://doi.org/10.1145/1613858.1613874

  12. Zhao Y, Chakraborty A, Hong KW, Kakaraddi S, St Amant R (2012) Pointing at responsive objects outdoors. In: Proceedings of the 2012 ACM international conference on Intelligent User Interfaces - IUI ‘12, pp 281–284. https://doi.org/10.1145/2166966.2167018

  13. Fröhlich P, Oulasvirta A, Baldauf M, Nurminen A (2011) On the move, wirelessly connected to the world. Commun ACM 54(1):132–138. https://doi.org/10.1145/1866739.1866766

  14. Tung Y-C et al (2015) User-defined game input for smart glasses in public space. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI ‘15, pp 3327–3336. https://doi.org/10.1145/2702123.2702214

  15. Braun A, McCall R (2010) Short paper: user study for mobile mixed reality devices. In: Joint Virtual Reality Conference of EGVE - EuroVR - VEC, pp 89–92. https://doi.org/10.2312/EGVE/JVRC10/089-092

  16. Wither J, DiVerdi S, Hollerer T (2007) Evaluating display types for AR selection and annotation. In: Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, pp 1–4. https://doi.org/10.1109/ISMAR.2007.4538832

  17. Schmalstieg D, Hollerer T (2016) Augmented reality: principles and practice. In: ACM SIGGRAPH 2016 courses. https://doi.org/10.1145/2897826.2927365

  18. Vogel D, Balakrishnan R (2005) Distant freehand pointing and clicking on very large, high resolution displays. In: Proceedings of the 18th annual ACM Symposium on User Interface Software and Technology, pp 33–42. https://doi.org/10.1145/1095034.1095041

  19. Walter R, Bailly G, Müller J (2013) StrikeAPose: revealing mid-air gestures on public displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ‘13, pp 841–850. https://doi.org/10.1145/2470654.2470774

  20. Clarke C, Bellino A, Esteves A, Velloso E, Gellersen H (2016) TraceMatch: a computer vision technique for user input by tracing of animated controls. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp 298–303. https://doi.org/10.1145/2971648.2971714

  21. Merrill D, Maes P (2007) Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects. In: Proceedings of the 5th International Conference on Pervasive Computing (Pervasive ‘07), vol 4480, pp 1–18

  22. Haque F, Nancel M, Vogel D (2015) Myopoint: pointing and clicking using forearm mounted electromyography and inertial motion sensors. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI ‘15, pp 3653–3656. https://doi.org/10.1145/2702123.2702133

  23. Verweij D, Esteves A, Khan V-J, Bakker S (2017) WaveTrace: motion matching input using wrist-worn motion sensors. In: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp 2180–2186. https://doi.org/10.1145/3027063.3053161

  24. Kim D et al (2012) Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In: Proceedings of the 25th annual ACM symposium on User interface software and technology - UIST ‘12, pp 167–176. https://doi.org/10.1145/2380116.2380139

  25. Khambadkar V, Folmer E (2013) GIST: a gestural interface for remote nonvisual spatial perception. In: Proceedings of the 26th annual ACM symposium on User interface software and technology - UIST ‘13, pp 301–310. https://doi.org/10.1145/2501988.2502047

  26. Liu M, Nancel M, Vogel D (2015) Gunslinger: subtle arms-down mid-air interaction. In: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology - UIST ‘15, pp 63–71. https://doi.org/10.1145/2807442.2807489

  27. Hrabia C-E, Wolf K, Wilhelm M (2013) Whole hand modeling using 8 wearable sensors. In: Proceedings of the 4th Augmented Human International Conference - AH ‘13, pp 21–28. https://doi.org/10.1145/2459236.2459241

  28. Roy K, Idiwal DP, Agrawal A, Hazra B (2015) Flex sensor based wearable gloves for robotic gripper control. In: Proceedings of the 2015 Conference on Advances in Robotics - AIR ‘15, pp 70:1–70:5. https://doi.org/10.1145/2783449.2783520

  29. Carter T, Seah SA, Long B, Drinkwater B, Subramanian S (2013) UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, pp 505–514. https://doi.org/10.1145/2501988.2502018

  30. Scheibe R, Moehring M, Froehlich B (2007) Tactile feedback at the finger tips for improved direct interaction in immersive environments. In: VR, pp 293–294. https://doi.org/10.1109/3DUI.2007.340784

  31. Gallotti P, Raposo A, Soares L (2011) V-glove: a 3D virtual touch interface. In: Proceedings of the 2011 13th Symposium on Virtual Reality - SVR, pp 242–251. https://doi.org/10.1109/SVR.2011.21

  32. Väänänen-Vainio-Mattila K, Suhonen K, Laaksonen J, Kildal J, Tahiroğlu K (2013) User experience and usage scenarios of audio-tactile interaction with virtual objects in a physical environment. In: Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces - DPPI ‘13, pp 67–76. https://doi.org/10.1145/2513506.2513514

  33. Oron-Gilad T, Downs JL, Gilson RD, Hancock P (2007) Vibrotactile guidance cues for target acquisition. IEEE Trans Syst Man Cybern Part C Appl Rev 37(5):993–1004. https://doi.org/10.1109/TSMCC.2007.900646

  34. Lehtinen V, Oulasvirta A, Salovaara A, Nurmi P (2012) Dynamic tactile guidance for visual search tasks. In: Proceedings of the 25th annual ACM symposium on User interface software and technology, pp 445–452. https://doi.org/10.1145/2380116.2380173

  35. Van Erp JBF, Van Veen HAHC, Jansen C, Dobbins T (2005) Waypoint navigation with a vibrotactile waist belt. ACM Trans Appl Percept 2(2):106–117. https://doi.org/10.1145/1060581.1060585

  36. Srikulwong M, O’Neill E (2011) A comparative study of tactile representation techniques for landmarks on a wearable device. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI ‘11, pp 2029–2038. https://doi.org/10.1145/1978942.1979236

  37. Asif A, Heuten W, Boll S (2010) Exploring distance encodings with a tactile display to convey turn by turn information in automobiles. In: Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries - NordiCHI ‘10, pp 32–41. https://doi.org/10.1145/1868914.1868923

  38. Jansen C, Oving A, Van Veen H (2004) Vibrotactile movement initiation. In: Proceedings of EuroHaptics, pp 110–117

  39. Cholewiak RW, Craig JC (1984) Vibrotactile pattern recognition and discrimination at several body sites. Percept Psychophys 35(6):503–514


  40. Ahmaniemi TT, Lantz VT (2009) Augmented reality target finding based on tactile cues. In: Proceedings of the 2009 International Conference on Multimodal Interfaces, pp 335–342. https://doi.org/10.1145/1647314.1647383

  41. Howell D (2012) Statistical methods for psychology. Cengage Learning, Boston


  42. Bangor A, Kortum P, Miller J (2009) Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud 4(3):114–123


  43. Jylhä A, Hsieh Y-T, Orso V, Andolina S, Gamberini L, Jacucci G (2015) A wearable multimodal interface for exploring urban points of interest. In: Proceedings of the 2015 ACM International Conference on Multimodal Interaction, pp 175–182. https://doi.org/10.1145/2818346.2820763

  44. Ankolekar A, Sandholm T, Yu L (2013) Play it by ear: a case for serendipitous discovery of places with musicons. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp 2959–2968. https://doi.org/10.1145/2470654.2481411

  45. Bial D, Kern D, Alt F, Schmidt A (2011) Enhancing outdoor navigation systems through vibrotactile feedback. In: CHI ‘11 Extended Abstracts on Human Factors in Computing Systems, pp 1273–1278. https://doi.org/10.1145/1979742.1979760

  46. Fitts PM (1954) The information capacity of the human motor system in controlling the amplitude of movement. J Exp Psychol 47(6):381–391

  47. Cha Y, Myung R (2013) Extended Fitts’ law for 3D pointing tasks using 3D target arrangements. Int J Ind Ergon 43(4):350–355. https://doi.org/10.1016/j.ergon.2013.05.005

  48. Hsieh Y, Orso V, Gamberini L, Jacucci G (2016) Designing a willing-to-use-in-public hand gestural interaction technique for smart glasses. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI ‘16, pp 4203–4215. https://doi.org/10.1145/2858036.2858436


Funding

The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 601139.

Author information


Corresponding author

Correspondence to Yi-Ta Hsieh.


About this article


Cite this article

Hsieh, YT., Jylhä, A., Orso, V. et al. Developing hand-worn input and haptic support for real-world target finding. Pers Ubiquit Comput 23, 117–132 (2019). https://doi.org/10.1007/s00779-018-1180-z


Keywords

Navigation