ABSTRACT
Co-located collaborators often work over physical tabletops with rich geospatial information. Previous research shows that people use gestures and speech as they interact with artefacts on the table and communicate with one another. With the advent of large multi-touch surfaces, developers are now applying this knowledge to create appropriate technical innovations in digital table design. Yet they are limited by the difficulty of building a truly useful collaborative application from the ground up. In this paper, we circumvent this difficulty by: (a) building a multimodal speech and gesture engine around the DiamondTouch multi-user surface, and (b) wrapping existing, widely used, off-the-shelf single-user interactive spatial applications with a multimodal interface created from this engine. Through case studies of two quite different geospatial systems -- Google Earth and Warcraft III -- we show the new functionalities, feasibility and limitations of leveraging such single-user applications within a multi-user, multimodal tabletop. This research informs the design of future multimodal tabletop applications that can exploit single-user software conveniently available in the market. We also contribute (1) a set of technical and behavioural affordances of multimodal interaction on a tabletop, and (2) lessons learnt from the limitations of single-user applications.
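The wrapping approach described above can be pictured as a thin fusion layer: a speech recogniser and a touch-gesture recogniser each emit events, and when a spoken command and a deictic gesture from the same user fall within a short time window, the layer synthesises an ordinary single-user input event (e.g. a mouse click) for the unmodified application. The following minimal sketch illustrates that fusion step only; the class names, the phrase-to-action table, and the time-window parameter are illustrative assumptions, not the paper's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Gesture:
    user: str
    x: int  # touch coordinates on the tabletop surface
    y: int

@dataclass
class SpeechCommand:
    user: str
    phrase: str

# Hypothetical mapping from recognised phrases to synthetic input
# events for the wrapped single-user application (illustrative only).
PHRASE_ACTIONS = {
    "fly here": "left_click",
    "label this": "right_click",
}

def fuse(speech: SpeechCommand, speech_t: float,
         gesture: Gesture, gesture_t: float,
         window_ms: int = 500) -> Optional[dict]:
    """Fuse a speech command with a deictic gesture if both come from
    the same user and fall within the fusion time window; return a
    synthetic input event for the single-user application, else None."""
    if speech.user != gesture.user:
        return None  # only same-user modalities are fused
    if abs(speech_t - gesture_t) * 1000 > window_ms:
        return None  # too far apart in time to be one command
    action = PHRASE_ACTIONS.get(speech.phrase)
    if action is None:
        return None  # unrecognised phrase
    return {"event": action, "x": gesture.x, "y": gesture.y}
```

In a real wrapper the returned event would be injected into the application's input queue via the operating system's event-synthesis facilities; the sketch stops at producing the fused event.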