DOI: 10.1145/3616195.3616225
short-paper · Open Access

Kiroll: A Gaze-Based Instrument for Quadriplegic Musicians Based on the Context-Switching Paradigm

Published: 11 October 2023

ABSTRACT

In recent years, Accessible Digital Musical Instruments (ADMIs) that incorporate gaze-tracking technologies and are designed for motor-impaired individuals have become more prevalent. To ensure a reliable user experience and minimize delays between actions and sound production, interaction methods must be carefully studied. This paper presents Kiroll, an affordable, open-source software ADMI designed specifically for quadriplegic users. Kiroll is played by motor-impaired users through eye gaze for note selection and breath for sound control. The interface features the "infinite keyboards" context-switching interaction method, which exploits the smooth-pursuit capabilities of human eyes to provide an indefinitely scrolling layout, thereby resolving the Midas Touch issue typical of gaze-based interaction. This paper outlines Kiroll's interaction paradigm, features, implementation, and design approach.
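The abstract's core idea, using smooth pursuit on a scrolling layout to avoid the Midas Touch problem of dwell-based gaze selection, can be illustrated with a minimal sketch. This is not Kiroll's actual implementation; the function names, window size, and correlation threshold are hypothetical. The common technique is to correlate recent gaze samples with a moving item's trajectory: a high correlation indicates the eye is deliberately following that item rather than merely resting on the screen.

```python
# Illustrative sketch (not Kiroll's code): smooth-pursuit detection by
# correlating gaze positions with a scrolling target's trajectory over
# a short window. A deliberate "follow" yields high correlation; a
# static fixation does not, so idle gaze triggers nothing (no Midas Touch).

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0  # a constant signal (e.g. a fixation) never correlates
    return cov / (vx ** 0.5 * vy ** 0.5)

def is_pursuing(gaze_y, target_y, threshold=0.9):
    """True if the gaze's vertical path tracks the target's scroll path."""
    return pearson(gaze_y, target_y) >= threshold

# Target key scrolls downward at constant speed; gaze follows noisily.
target = [10.0 * t for t in range(30)]
gaze = [y + ((-1) ** t) * 2.0 for t, y in enumerate(target)]
print(is_pursuing(gaze, target))          # following the key -> True
print(is_pursuing([100.0] * 30, target))  # static fixation    -> False
```

A dwell-based interface would select whatever the eye rests on; here, selection requires active tracking of a moving element, so resting gaze is safely ignored.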


Published in
AM '23: Proceedings of the 18th International Audio Mostly Conference, August 2023, 204 pages
ISBN: 9798400708183
DOI: 10.1145/3616195
Copyright © 2023 Owner/Author. This work is licensed under a Creative Commons Attribution 4.0 International License.
Publisher: Association for Computing Machinery, New York, NY, United States

Qualifiers: short-paper · research · refereed limited

Overall acceptance rate: 177 of 275 submissions, 64%
