
User-centered gesture development in TV viewing environment

Published in: Multimedia Tools and Applications

Abstract

Recent advances in interaction technologies make it possible for people to use freehand gestures in such application domains as virtual reality, augmented reality, ubiquitous computing, and smart rooms. While some applications and systems have been developed to support gesture-based interaction, it is unclear what design processes these systems have adopted. Considering the diversity of freehand gestures and the lack of design guidance on gesture-based interaction, we believe that a clear and systematic design process can help to improve the quality of gesture-based interaction. In this paper, we report a study that applies a user-centered approach to the process of gesture development, including requirement gathering and functionality definition, gesture elicitation, gesture design, and usability evaluation. Our results show that several issues must be taken into consideration when designing freehand gesture interfaces. In particular, the involvement of actual users, especially in the environment in which they would use the final systems, often leads to improved user experience and user satisfaction. Finally, we highlight the implications of this work for the development of gesture-based applications in general.
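The gesture-elicitation step mentioned above is commonly analyzed by grouping the gestures that participants propose for each command (referent) and computing an agreement score (Wobbrock et al., CHI 2009). As an illustration only, not the authors' actual analysis code, a minimal sketch in Python, with hypothetical gesture labels:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (Wobbrock et al., CHI 2009):
    A(r) = sum over groups of identical proposals Pi of (|Pi| / |Pr|)^2.
    `proposals` is the list of gesture labels elicited from participants."""
    total = len(proposals)
    counts = Counter(proposals)  # size of each identical-gesture group
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical example: 10 participants propose gestures for "volume up".
proposals = ["palm-up-raise"] * 6 + ["thumb-up"] * 3 + ["circle-clockwise"]
score = agreement_score(proposals)  # 0.6^2 + 0.3^2 + 0.1^2 = 0.46
```

Higher scores indicate stronger consensus among participants; a score of 1.0 means every participant proposed the same gesture for that command.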


Figures 1–9 (images not included).



Acknowledgments

We gratefully acknowledge the financial support of the National Natural Science Foundation of China (No. 61202344); the Fundamental Research Funds for the Central Universities, Sun Yat-Sen University (No. 1209119); the Special Project on the Integration of Industry, Education and Research of Guangdong Province (No. 2012B091000062); and the Fundamental Research Funds for the Central Universities, Tongji University (Nos. 0600219052 and 0600219053). We would also like to express our appreciation to the editor and reviewers.

Author information


Correspondence to Huiyue Wu or Xiaolong (Luke) Zhang.


About this article


Cite this article

Wu, H., Wang, J. & Zhang, X. User-centered gesture development in TV viewing environment. Multimed Tools Appl 75, 733–760 (2016). https://doi.org/10.1007/s11042-014-2323-5

