
Performer-centered visual feedback for human-machine improvisation

Published: 14 November 2011

Abstract

This article describes the design and implementation of the Multimodal Interactive Musical Improvisation (Mimi) system. Unique to Mimi is its visual interface, which gives the performer instantaneous and continuous information on the state of the system; other human-machine improvisation systems, by contrast, require performers to grasp and intuit possible extemporizations in response to machine-generated music without forewarning. In Mimi, the information displayed extends into the near future and reaches back into the recent past, giving the performer awareness of the musical context and allowing them to plan their responses accordingly. This article presents the details of Mimi's system design, its visual interface, and its implementation using the formalism defined by François' Software Architecture for Immersipresence (SAI) framework. Mimi is the result of a collaborative iterative design process. We recorded the design sessions and present here findings from the transcripts that provide evidence for the impact of visual support on improvisation planning and design. The findings demonstrate that Mimi's visual interface offers musicians the opportunity to anticipate and to review decisions, making it an ideal performance and pedagogical tool for improvisation: it allows novices to create more contextually relevant improvisations and experts to be more inventive in their extemporizations.

References

  1. Allauzen, C., Crochemore, M., and Raffinot, M. 1999. Factor oracle: A new structure for pattern matching. In Proceedings of SOFSEM'99: Theory and Practice of Informatics, J. Pavelka, G. Tel, and M. Bartosek, Eds. Lecture Notes in Computer Science, Springer-Verlag, 291--306.
  2. Assayag, G. and Bloch, G. 2007. Navigating the oracle: A heuristic approach. In Proceedings of the International Computer Music Conference. 405--412.
  3. Assayag, G., Bloch, G., and Chemillier, M. 2006a. OMax-Ofon. In Proceedings of Sound and Music Computing.
  4. Assayag, G., Bloch, G., Chemillier, M., Cont, A., and Dubnov, S. 2006b. OMax Brothers: A dynamic topology of agents for improvization learning. In Proceedings of the ACM Workshop on Music and Audio Computing. 125--132.
  5. Assayag, G. and Dubnov, S. 2004. Using factor oracles for machine improvisation. Soft Comput. 8, 9, 1--7.
  6. Assayag, G., Rueda, C., Laurson, M., Agon, C., and Delerue, O. 1999. Computer assisted composition at IRCAM: PatchWork & OpenMusic. Comp. Music J. 23, 3, 59--72.
  7. Bresson, J., Agon, C., and Assayag, G. 2005. OpenMusic 5: A cross-platform release of the computer-assisted composition environment. In Proceedings of the 10th Brazilian Symposium on Computer Music.
  8. Dannenberg, R. B. 2000. A language for interactive audio applications. In Proceedings of the International Computer Music Conference.
  9. Dubnov, S. and Assayag, G. 2005. Improvisation planning and jam session design using concepts of sequence variation and flow experience. In Proceedings of the International Conference on Sound and Music Computing.
  10. François, A. R. 2004. A hybrid architectural style for distributed parallel processing of generic data streams. In Proceedings of the International Conference on Software Engineering. 367--376.
  11. François, A. R. 2009. Time and perception in music and computation. In New Computational Paradigms for Computer Music, G. Assayag and A. Gerzso, Eds. Editions Delatour France/IRCAM, 125--146.
  12. François, A. R. 2010. An architectural framework for the design, analysis and implementation of interactive systems. Computer J. doi: 10.1093/comjnl/bxq081.
  13. François, A. R. and Chew, E. 2006. An architectural framework for interactive music systems. In Proceedings of the International Conference on New Interfaces for Musical Expression. 150--155.
  14. François, A. R., Chew, E., and Thurmond, D. 2007. Visual feedback in performer-machine interaction for musical improvisation. In Proceedings of the International Conference on New Interfaces for Musical Expression. 277--280.
  15. Lévy, B. 2009. Visualizing OMax. Tech. rep., IRCAM.
  16. Lewis, G. 2000. Too many notes: Computers, complexity and culture in Voyager. Leonardo Music J. 10, 33--39.
  17. Norman, D. A. 2002. The Design of Everyday Things. Basic Books, New York, NY.
  18. Pachet, F. 2003. The Continuator: Musical interaction with style. J. New Music Res. 32, 3, 333--341.
  19. Puckette, M. S. 2004. A divide between "compositional" and "performative" aspects of Pd. In Proceedings of the First International Pd Convention.
  20. Thom, B. 2000. Bob: An interactive improvisational companion. In Proceedings of the International Conference on Autonomous Agents (AA'00).
  21. Thom, B. 2003. Interactive improvisational music companionship: A user-modeling approach. User Model. User-Adapted Interact. J. (Special Issue on User Modeling and Intelligent Agents) 13, 1-2, 133--177.
  22. Walker, W. and Belet, B. 1996. Applying ImprovisationBuilder to interactive composition with MIDI piano. In Proceedings of the International Computer Music Conference.
  23. Walker, W., Hebel, K., Martirano, S., and Scaletti, C. 1992. ImprovisationBuilder: Improvisation as conversation. In Proceedings of the International Computer Music Conference.
  24. Walker, W. F. 1997. A computer participant in musical improvisation. In Proceedings of Human Factors in Computing Systems (CHI).
  25. Weinberg, G. and Driscoll, S. 2006. Toward robotic musicianship. Comp. Music J. 30, 4, 28--45.


Published in

Computers in Entertainment, Volume 9, Issue 3: Theoretical and Practical Computer Applications in Entertainment
November 2011, 196 pages
EISSN: 1544-3574
DOI: 10.1145/2027456

        Copyright © 2011 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Qualifiers

• Research article
• Refereed
