How to integrate interactions into video editing software?

  • Céline Jost, Paris 8 University
  • Brigitte Le Pévédic, Southern Brittany University

Abstract


This paper explores an idea, inspired by music scores, for enriching existing mulsemedia editing software so that it can integrate actions coming from the viewer while a mulsemedia is being edited. We reflect on whether this can be achieved with a single timeline and propose a complete change of perspective: instead of trying to insert the interaction into the timeline, we propose to cut the media into several parts and insert them into the interaction.
Keywords: interaction, interactive mulsemedia, media editing software, scenagram

Published
2022-06-22
How to Cite

JOST, Céline; LE PÉVÉDIC, Brigitte. How to integrate interactions into video editing software?. In: WORKSHOP ON MULTISENSORY EXPERIENCES (SENSORYX), 2., 2022, Aveiro. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2022. DOI: https://doi.org/10.5753/sensoryx.2022.20004.