DOI: 10.1145/2686612.2686677

research-article

Cross-platform usability and eye-tracking measurement and analysis model

Published: 02 December 2014

ABSTRACT

Evaluating the usability of cross-platform interactive systems has become increasingly important. In this paper, we review the interpretations of current eye-movement metrics across several usability and HCI studies. We found that existing eye-tracking metrics and their associated interpretations do not consider cross-platform usability (CPU) aspects. Therefore, taking into account the characteristics of user interaction with Multiple User Interfaces (MUIs), a usability-engineering model, called the Eye Tracking Measurement and Analysis model for Cross-Platform Usability (CPU-EMA), was developed. This model decomposes eye-tracking metrics for cross-platform usability into four high-level metrics, namely cross-platform fixation, saccade, scanpath and gaze. The high-level metrics are further decomposed into low-level metrics with possible interpretations for cross-platform usability. The model also provides procedures for measuring and analysing cross-platform usability using eye-movement data.
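The two-level decomposition described above can be pictured as a simple hierarchy: each of the four high-level cross-platform metrics maps to a set of low-level metrics. The following sketch is illustrative only, not taken from the paper; the specific low-level metric names (fixation count, saccade amplitude, etc.) are common eye-tracking measures assumed here for the example, and `CPU_EMA_METRICS` and `low_level_metrics` are hypothetical names.

```python
# Illustrative sketch (not from the paper): a mapping from the model's
# four high-level cross-platform metrics to example low-level
# eye-movement metrics commonly used in usability studies.
CPU_EMA_METRICS = {
    "cross-platform fixation": ["fixation count", "fixation duration"],
    "cross-platform saccade": ["saccade count", "saccade amplitude"],
    "cross-platform scanpath": ["scanpath length", "scanpath duration"],
    "cross-platform gaze": ["gaze rate", "gaze duration"],
}

def low_level_metrics(high_level: str) -> list[str]:
    """Return the illustrative low-level metrics for a high-level metric."""
    return CPU_EMA_METRICS.get(high_level, [])

# Walk the hierarchy, printing each high-level metric with its decomposition.
for name in CPU_EMA_METRICS:
    print(name, "->", ", ".join(low_level_metrics(name)))
```

A flat dictionary like this is enough to convey the structure; the actual model additionally attaches cross-platform interpretations and measurement procedures to each low-level metric.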


Published in

  OzCHI '14: Proceedings of the 26th Australian Computer-Human Interaction Conference on Designing Futures: the Future of Design
  December 2014
  689 pages
  ISBN: 9781450306539
  DOI: 10.1145/2686612
  Conference Chair: Tuck Leong

      Copyright © 2014 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Acceptance Rates

OzCHI '14 paper acceptance rate: 85 of 176 submissions, 48%. Overall acceptance rate: 362 of 729 submissions, 50%.
