Abstract
This article describes the design and implementation of the Multimodal Interactive Musical Improvisation (Mimi) system. Unique to Mimi is its visual interface, which provides the performer with instantaneous and continuous information on the state of the system; other human-machine improvisation systems offer no such forewarning, requiring performers to grasp and intuit possible extemporizations in response to machine-generated music as it arrives. In Mimi, the displayed information extends into the near future and reaches back into the recent past, giving the performer awareness of the musical context so that they can plan their responses accordingly. This article presents the details of Mimi's system design, its visual interface, and its implementation using the formalism defined by François' Software Architecture for Immersipresence (SAI) framework. Mimi is the result of a collaborative, iterative design process. We recorded the design sessions and present findings from the transcripts that provide evidence for the impact of visual support on improvisation planning and design. The findings demonstrate that Mimi's visual interface offers musicians the opportunity to anticipate and to review decisions, making it an ideal performance and pedagogical tool for improvisation: it allows novices to create more contextually relevant improvisations, and experts to be more inventive in their extemporizations.
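To make the "recent past and near future" display concrete, the following is a minimal sketch of how such a rolling timeline might be maintained. It is an illustration only, not Mimi's actual implementation (which is specified in the SAI formalism); the `NoteEvent` and `RollingTimeline` names, the beat-based window sizes, and the polling renderer are all assumptions introduced here.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class NoteEvent:
    """Hypothetical note record; Mimi's real data types are defined via SAI."""
    onset: float     # onset time, in beats
    pitch: int       # MIDI pitch number
    duration: float  # length, in beats


class RollingTimeline:
    """Sliding window over recently played and already-generated future notes.

    The display covers `past_beats` of history and `future_beats` of machine
    output that has been generated but not yet sounded, which is what lets
    the performer plan a response in advance.
    """

    def __init__(self, past_beats: float = 8.0, future_beats: float = 8.0):
        self.past_beats = past_beats
        self.future_beats = future_beats
        self.events = deque()  # NoteEvents, kept in onset order

    def add(self, event: NoteEvent) -> None:
        # Assumes events arrive in onset order, as they would from a live
        # input stream or an ahead-of-time machine generator.
        self.events.append(event)

    def visible(self, now: float) -> list:
        # Discard events that have scrolled past the left (oldest) edge.
        while self.events and self.events[0].onset < now - self.past_beats:
            self.events.popleft()
        # Everything up to the right (future) edge is shown to the performer.
        return [e for e in self.events if e.onset <= now + self.future_beats]
```

A renderer polled once per frame could call `visible(now)` and draw each event at a horizontal offset proportional to `event.onset - now`, with a fixed vertical "now" line separating what has already sounded from what the machine has committed to playing next.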
References
- Allauzen, C., Crochemore, M., and Raffinot, M. 1999. Factor oracle: A new structure for pattern matching. In Proceedings of SOFSEM'99: Theory and Practice of Informatics, J. Pavelka, G. Tel, and M. Bartosek, Eds. Lecture Notes in Computer Science, Springer-Verlag, 291--306.
- Assayag, G. and Bloch, G. 2007. Navigating the oracle: A heuristic approach. In Proceedings of the International Computer Music Conference. 405--412.
- Assayag, G., Bloch, G., and Chemillier, M. 2006a. OMax-Ofon. In Proceedings of the Sound and Music Computing Conference.
- Assayag, G., Bloch, G., Chemillier, M., Cont, A., and Dubnov, S. 2006b. OMax Brothers: A dynamic topology of agents for improvization learning. In Proceedings of the ACM Workshop on Music and Audio Computing. 125--132.
- Assayag, G. and Dubnov, S. 2004. Using factor oracles for machine improvisation. Soft Comp. 8, 9, 1--7.
- Assayag, G., Rueda, C., Laurson, M., Agon, C., and Delerue, O. 1999. Computer-assisted composition at IRCAM: PatchWork & OpenMusic. Comp. Music J. 23, 3, 59--72.
- Bresson, J., Agon, C., and Assayag, G. 2005. OpenMusic 5: A cross-platform release of the computer-assisted composition environment. In Proceedings of the 10th Brazilian Symposium on Computer Music.
- Dannenberg, R. B. 2000. A language for interactive audio applications. In Proceedings of the International Computer Music Conference.
- Dubnov, S. and Assayag, G. 2005. Improvisation planning and jam session design using concepts of sequence variation and flow experience. In Proceedings of the International Conference on Sound and Music Computing.
- François, A. R. 2004. A hybrid architectural style for distributed parallel processing of generic data streams. In Proceedings of the International Conference on Software Engineering. 367--376.
- François, A. R. 2009. Time and perception in music and computation. In New Computational Paradigms for Computer Music, G. Assayag and A. Gerzso, Eds. Editions Delatour France/IRCAM, 125--146.
- François, A. R. 2010. An architectural framework for the design, analysis and implementation of interactive systems. Computer J. doi:10.1093/comjnl/bxq081.
- François, A. R. and Chew, E. 2006. An architectural framework for interactive music systems. In Proceedings of the International Conference on New Interfaces for Musical Expression. 150--155.
- François, A. R., Chew, E., and Thurmond, D. 2007. Visual feedback in performer-machine interaction for musical improvisation. In Proceedings of the International Conference on New Interfaces for Musical Expression. 277--280.
- Lévy, B. 2009. Visualizing OMax. Tech. rep., IRCAM.
- Lewis, G. 2000. Too many notes: Computers, complexity and culture in Voyager. Leonardo Music J. 10, 33--39.
- Norman, D. A. 2002. The Design of Everyday Things. Basic Books, New York, NY.
- Pachet, F. 2003. The Continuator: Musical interaction with style. J. New Music Res. 32, 3, 333--341.
- Puckette, M. S. 2004. A divide between “compositional” and “performative” aspects of Pd. In Proceedings of the First International Pd Convention.
- Thom, B. 2000. BoB: An interactive improvisational companion. In Proceedings of the International Conference on Autonomous Agents (AA'00).
- Thom, B. 2003. Interactive improvisational music companionship: A user-modeling approach. User Model. User-Adapted Interact. 13, 1--2 (Special Issue on User Modeling and Intelligent Agents), 133--177.
- Walker, W. and Belet, B. 1996. Applying ImprovisationBuilder to interactive composition with MIDI piano. In Proceedings of the International Computer Music Conference.
- Walker, W., Hebel, K., Martirano, S., and Scaletti, C. 1992. ImprovisationBuilder: Improvisation as conversation. In Proceedings of the International Computer Music Conference.
- Walker, W. F. 1997. A computer participant in musical improvisation. In Proceedings of Human Factors in Computing Systems (CHI).
- Weinberg, G. and Driscoll, S. 2006. Toward robotic musicianship. Comp. Music J. 30, 4, 28--45.