
Seeking common ground while reserving differences in gesture elicitation studies

Published in Multimedia Tools and Applications

Abstract

Gesture elicitation studies have frequently been conducted in recent years for gesture design. However, most elicitation studies adopt the frequency-ratio approach to assign the top gestures derived from end-users to the corresponding target tasks, which may cause the results to get caught in local minima, i.e., the gestures discovered in an elicitation study are not the best ones. In this paper, we propose a novel approach of seeking common ground while reserving differences in gesture elicitation research. To verify this approach, we conducted a four-stage case study on the derivation of a user-defined mouse gesture vocabulary for web navigation, and we provide new empirical evidence on the proposed method, including: 1) gesture disagreement is a serious problem in elicitation studies, e.g., the chance for participants to produce the same mouse gesture for a given target task without any restriction is very low, below 0.26 on average; 2) offering a set of gesture candidates can improve consistency; and 3) benefiting from the hindsight effect, some unique but highly teachable gestures produced in the elicitation study may also have a chance to be chosen as top gestures. Finally, we discuss how these findings can inform gesture-based interaction design.
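As context for the consensus figure above (below 0.26 on average), gesture elicitation studies commonly quantify per-task consensus with an agreement score: the sum, over each distinct gesture proposed for a task, of the squared fraction of participants who proposed it. The abstract does not specify the exact metric used, so the sketch below assumes the classic Wobbrock-style agreement score; the example data are hypothetical.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one target task (referent):
    A = sum over each distinct gesture g of (count(g) / N)**2,
    where N is the number of participants. Ranges from 1/N
    (all proposals unique) to 1.0 (perfect consensus)."""
    n = len(proposals)
    counts = Counter(proposals)
    return sum((c / n) ** 2 for c in counts.values())

# Hypothetical data: 10 participants propose gestures for one task.
# 4 agree on "swipe", 3 on "circle", and 3 produce unique gestures.
props = ["swipe"] * 4 + ["circle"] * 3 + ["zigzag", "tap", "flick"]
print(agreement_score(props))  # 0.16 + 0.09 + 3 * 0.01 = 0.28
```

Low scores like this illustrate the disagreement problem the paper reports: with unrestricted elicitation, many distinct proposals fragment the count, so no single gesture dominates.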




Author information


Correspondence to Huiyue Wu.


This work was supported by the National Natural Science Foundation of China under Grants No. 61772564 and 61202344, and by funding from the China Scholarship Council (CSC).


Cite this article

Wu, H., Liu, J., Qiu, J. et al. Seeking common ground while reserving differences in gesture elicitation studies. Multimed Tools Appl 78, 14989–15010 (2019). https://doi.org/10.1007/s11042-018-6853-0


Keywords

Navigation