Abstract
Quality factors related to both the user and the system must be considered when designing a multimodal interaction system. To examine the effective factors of multimodal interaction with a smart TV, user groups, access to input modes, and tasks were defined as the user and system factors. The TV's input modes consisted of voice, arrow-key, and motion-based pointer modes and their combinations. User group 1 had experience with multimodal interaction on another device, whereas user group 2 had experience with unimodal interaction only. In addition, sequential versus simultaneous presentation of input modes and simple versus complex tasks were considered as the system factors. Two experiments were conducted, one for each level of task complexity. Nine input modes (three unimodes and six multimodes), presented both sequentially and simultaneously to both user groups, were investigated for a simple task (menu traversal) and a complex task (manipulating broadcasting content, menu traversal, and web content navigation). Subjective preference was recorded in the sequential condition using a modified Likert-type rating scale, while each participant's preferred mode was observed in the simultaneous condition. Completion time and error rate were also measured in both experiments. In the simple task, user group 1 used multimodes more than group 2; in the complex task, both user groups preferred multimodes when the modes were presented simultaneously. Considering these effective quality factors, the input modes of a smart TV should be provided simultaneously, with a voice and motion-based pointer multimode.
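The dependent measures named above (completion time and error rate per input mode) can be aggregated as in the following minimal sketch. The trial records, mode names, and field names are synthetic assumptions for illustration only, not the study's data.

```python
from statistics import mean

# Synthetic example trials (NOT the study's data): each record holds the
# input mode used, the task completion time in seconds, and whether an
# error occurred (1) or not (0).
trials = [
    {"mode": "voice", "time_s": 12.4, "error": 0},
    {"mode": "voice", "time_s": 15.1, "error": 1},
    {"mode": "voice+pointer", "time_s": 9.8, "error": 0},
    {"mode": "voice+pointer", "time_s": 10.6, "error": 0},
]

def summarize(trials):
    """Return {mode: (mean completion time in s, error rate)}."""
    by_mode = {}
    for t in trials:
        by_mode.setdefault(t["mode"], []).append(t)
    return {
        mode: (mean(t["time_s"] for t in ts),
               sum(t["error"] for t in ts) / len(ts))
        for mode, ts in by_mode.items()
    }

summary = summarize(trials)
```

With the synthetic records above, the unimode "voice" yields a higher mean completion time and error rate than the "voice+pointer" multimode, mirroring the kind of comparison the study reports.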
Acknowledgments
This research was supported by a Korea University grant.
Cite this article
Kim, S.M., Jung, E.S. & Park, J. Effective quality factors of multimodal interaction in simple and complex tasks of using a smart television. Multimed Tools Appl 76, 6447–6471 (2017). https://doi.org/10.1007/s11042-016-3333-2