Kiroll: A Gaze-Based Instrument for Quadriplegic Musicians Based on the Context-Switching Paradigm

ABSTRACT
In recent years, Accessible Digital Musical Instruments (ADMIs) that incorporate gaze-tracking technologies have become increasingly common among instruments designed for motor-impaired individuals. To ensure a reliable user experience and to minimize the delay between action and sound production, interaction methods must be carefully studied. This paper presents Kiroll, an affordable, open-source software ADMI designed specifically for quadriplegic users. Kiroll is played through eye gaze for note selection and breath for sound control. The interface features the "infinite keyboards" context-switching interaction method, which exploits the smooth-pursuit capability of the human eye to provide an indefinitely scrolling layout, thereby avoiding the Midas Touch problem typical of gaze-based interaction. The paper outlines Kiroll’s interaction paradigm, features, implementation, and design approach.
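The smooth-pursuit interaction mentioned above is commonly implemented, in Pursuits-style interfaces, by correlating the gaze trace with the trajectory of each moving on-screen target over a short window: a target is selected only when the eye demonstrably follows it, so resting the gaze anywhere triggers nothing (avoiding the Midas Touch). The sketch below illustrates this general technique under that assumption; function names, the window length, and the 0.8 threshold are illustrative and are not taken from Kiroll's actual implementation.

```python
# Illustrative Pursuits-style smooth-pursuit matching (not Kiroll's code):
# gaze samples are correlated with each moving target's trajectory, and a
# target is "selected" only when the correlation exceeds a threshold.

def pearson(a, b):
    """Pearson correlation of two equal-length sample sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    if va == 0 or vb == 0:
        return 0.0  # a constant trace (e.g. a fixation) never matches
    return cov / (va * vb)

def match_pursuit(gaze_x, gaze_y, targets, threshold=0.8):
    """Return the id of the moving target whose trajectory best
    correlates with the gaze trace over the window, or None.
    `targets` maps a target id to its (x_samples, y_samples)."""
    best_id, best_score = None, threshold
    for tid, (tx, ty) in targets.items():
        # require the gaze to follow the target on BOTH axes
        score = min(pearson(gaze_x, tx), pearson(gaze_y, ty))
        if score > best_score:
            best_id, best_score = tid, score
    return best_id
```

Because selection requires sustained tracking of a moving element rather than mere dwell, a scrolling keyboard layout lets the user rest the gaze on the interface without producing unintended notes.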