Toward the design and evaluation of continuous sound in tangible interfaces: The Spinotron

https://doi.org/10.1016/j.ijhcs.2009.07.002

Abstract

This paper reports on an approach to the design of continuous sonic feedback in tangible interfaces, and on quantitative evaluation methods intended to guide such design tasks. The issues it addresses may be of central relevance to areas of the emerging discipline of sonic interaction design that have begun to address the unique problems of designing sound for highly interactive contexts. Three experiments were conducted to assess two key aspects of the sound design developed for an abstract object created for these experiments, which we refer to as the Spinotron. First, sound source identification was compared across three cases: passive listening to temporally static sounds; passive listening to dynamically evolving sounds; and listening to sounds generated through active manipulation of the artifact. The results show that control over the sound production process influences which materials listeners identify as the source of the sounds. Second, in a learning experiment, users' performance with the Spinotron device was compared between a group of participants provided only with passive proprioceptive information and another group also presented with synthetic sound produced by the artifact. The results indicated that the sound, when present, aided users in learning to control the device, whereas without the sound no learning was observed. Together, these results hold promise for creating a foundation for the design of continuous sound intended to accompany users' control actions, for establishing experimental/quantitative evaluation methods, and for gathering basic knowledge about the sensory-motor activity engaged in tangible sonic interactions.

Introduction

The use of sounds in human–computer interfaces, whether within the graphical user interface of a desktop computer, or the browser of a mobile phone, is widespread in our everyday lives. Moreover, new technologies for sensing and embedded computation, and related economies of scale, have made it possible for designers to consider sonic augmentations of a much wider array of everyday objects that incorporate electronic sensing and computational capabilities, either for aesthetic purposes (such as enhancing the roar of a car engine) or for functional reasons (such as improving the usability of a browser on a music player with a small visual display).

To date, the use of such sounds has been primarily limited to signals indicating discrete changes of state in the system involved, or discrete actions of a user; for example, the sound of crumpling paper is played when a computer file is deleted. Where continuous auditory feedback is concerned, however, far less is known to guide the designer, who at the same time faces a more complex design task, as the sound is no longer produced in a static or isolated way, but is rather coupled to human action in real time. The most refined approaches to such design come from music, and musical instrument design has been at the forefront of interaction design with sound for many years. To produce a good tone, a violinist bows a string and (particularly during training) continuously adjusts his or her bowing action, as required, by listening to the sound that is produced. This sonic feedback informs the player about the state of the violin, but also guides the player's control of bow speed, pressure, angle, and so forth. Feedback of this type can therefore be regarded as part of a continuous and dynamical loop: a user continuously controls an instrument (here, a violin); this manipulation produces sounds that vary in a coherent way with the actions of the user; in turn, the sounds affect how the user performs. Transposing this complex aspect of traditional musical instruments to the design of digitally augmented devices raises a number of technical and aesthetic questions that have been explored in research communities surrounding new digital instruments for musical expression.1

However, the design of musical artifacts is guided by different aims than those relevant to product design. The effective application of auditory displays to HCI contexts that are not primarily concerned with musical performance is arguably limited by a lack of suitable design methodologies and evaluation methods. Comparatively few examples exist within HCI, and even fewer have contributed knowledge toward the design and evaluation of continuous auditory feedback in functional contexts.

The goal of the design and evaluation activities reported upon in this paper is to develop further knowledge toward a basis for the design of continuous auditory feedback in sonically augmented interfaces. Our studies are based on an interface that we refer to as the Spinotron. It is a tangible, one degree-of-freedom controller endowed with both sensing and synthesized sonic feedback, whose design was based on the metaphor of a rotating virtual ratcheted wheel, driven in a manner analogous to a child's spinning-top toy.

From the standpoint of perception, the level of dynamical interactivity embodied by such artifacts is very different from the situation of passive listening in which most perceptual studies are carried out. In experiments carried out in such a setting, participants do not listen to sequences of static sounds selected by an experimenter, but instead dynamically explore the sounds of an interactive object. This context may be thought to be more closely allied with enactive views of perception (e.g. Bruner, 1966) than with experimental auditory psychology. A second goal of the study reported on in this article was to investigate whether listeners who must manipulate a system to generate sounds, rather than just listening to them passively, perceive the cause of those sounds differently. More precisely, the questions that were experimentally addressed in this study are:

  • How does manipulation modulate the perception of the cause of the sounds?

  • Does the sound guide users in learning how to control the Spinotron so as to drive the speed of the ratcheted wheel?

This paper is divided into four sections. In Section 1, we report on related work in sonic interaction. Section 2 details the design of the Spinotron. Sections 3 and 4 report on three perceptual studies. The experiments in Section 3 (Perception of the sounds and selection of sound parameters) assess whether the speed of the wheel can be used to convey information to users, and allow selection of the model parameters that generate sounds most coherent with the metaphor of a ratcheted wheel. The experiment reported on in Section 4 (The manipulation of the Spinotron, Experiment 3) first studies how the sounds are interpreted, both as the Spinotron is being manipulated and as users listen to the sounds passively. Second, it investigates the influence of sonic feedback on how users learn to control the speed of the virtual ratcheted wheel, using two different control modes.

Section snippets

Sound in HCI: from iconic to dynamic

Human computer interaction has evolved tremendously in the past 30 years. Many methods for interaction have been developed, making use of a large variety of devices and techniques, and sound is no exception.

Historically, HCI has focused on sounds in the form of short, abstract, static signals, typically warning or feedback sounds. The use of these sounds is now relatively common in applications such as hospital or automotive equipment, or high-performance aircraft (Patterson et al., 1986; Edworthy et al., 1991).

Interface design: the Spinotron

Like the Ballancer, the Spinotron (see Fig. 1) was conceived as an abstracted artifact capable of intuitively generating digitally synthesized sound through interaction, via a metaphor based on a virtual physical mechanism (see Fig. 2). Specifically, it affords a simple one-dimensional mode of input, based on manually pumping its central shaft vertically. The artifact was designed with the aim of supporting the experiments of Section 3 (Perception of the sounds and selection of sound parameters).
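The virtual mechanism can be pictured as a flywheel that is accelerated by downward pump strokes and freewheels otherwise, coasting down under friction like a spinning top. The sketch below is illustrative only, not the authors' implementation: the function name and the gain and friction values are assumptions made for the example.

```python
def step_wheel(omega, pump_velocity, dt, gain=0.5, friction=0.8):
    """Advance the virtual wheel's angular velocity by one time step.

    Downward pump strokes (pump_velocity > 0) accelerate the wheel; the
    ratchet lets the wheel freewheel on the upstroke, so upward motion adds
    no energy. Friction slowly decelerates the wheel at all times.
    """
    drive = gain * max(pump_velocity, 0.0)  # ratchet: only downstrokes drive
    omega += (drive - friction * omega) * dt
    return max(omega, 0.0)                  # the wheel never spins backward

# A few pump cycles: the wheel spins up on downstrokes and coasts otherwise.
omega = 0.0
for v in [1.0, 1.0, -1.0, 1.0, 1.0, -1.0, 0.0, 0.0]:
    omega = step_wheel(omega, v, dt=0.1)
```

Under this kind of model, steady rhythmic pumping settles the wheel near an equilibrium speed where drive and friction balance, which is what makes "drive the wheel at a target speed" a meaningful control task.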

Perception of the sounds and selection of sound parameters

The metaphor of the ratcheted wheel is communicated to users pumping the Spinotron through the ratchet sound. The latter consists of a series of impacts, the rhythm of which is driven by the speed of the wheel: the faster the wheel turns, the greater the density of impacts. Furthermore, the parameters of the model used to synthesize the sound of the ratchet may be selected to convey different impressions of materials of the wheel and the pawl. The experiments of this section study whether
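The mapping from wheel speed to impact density described above can be sketched as follows. This is a hypothetical illustration, not code from the paper: the tooth count and function are assumptions, chosen to show that impact rate scales with angular speed.

```python
def impact_times(omega_trajectory, dt, n_teeth=24):
    """Return the times (s) at which ratchet impacts occur.

    omega_trajectory samples the wheel's angular speed in revolutions per
    second; with n_teeth teeth on the wheel, the pawl strikes once per tooth,
    so the impact rate is omega * n_teeth: the faster the wheel turns, the
    denser the impacts.
    """
    times, angle = [], 0.0                 # angle measured in teeth traversed
    for i, omega in enumerate(omega_trajectory):
        angle += omega * n_teeth * dt      # teeth passed during this step
        while angle >= 1.0:                # emit one impact per tooth crossed
            times.append(i * dt)
            angle -= 1.0
    return times

# Constant 2 rev/s for 1 s with 24 teeth yields roughly 48 impacts.
ticks = impact_times([2.0] * 100, dt=0.01)
```

Each returned time would then trigger one synthesized impact, whose resonator parameters (e.g. damping and stiffness) carry the material impression of the wheel and pawl.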

The manipulation of the Spinotron (Experiment 3)

With the parameter setting selected in the previous section, it is now possible to study how users learn to manipulate the Spinotron. The goal of the experiment reported on in this section is twofold. First, it aims at comparing a group of participants interacting with the Spinotron without auditory feedback, and a group manipulating the Spinotron with feedback from the ratchet sounds selected in the previous experiment. In the latter group, the participants receive information about the

Discussion and conclusion

This paper has reported on an approach to the design of continuous sonic feedback in tangible artifacts, and on quantitative evaluation methods intended to guide such design. An interactive artifact was designed: the Spinotron. It is a physical object that affords pumping, driving the real-time synthesis of the sound of a ratcheted wheel. A set of experiments was conducted to assess two key aspects of the sonic interactions involved: how manipulation modulates the perception of the sounds; how sound guides

Acknowledgments

This work was funded by the European Project CLOSED: Closing the Loop of Sound Evaluation and Design, FP6-NEST-PATH no. 29085. The authors would like to thank Hans Hansen and Bruno Giordano for assistance with the statistics.

References (27)

  • J. Bruner

    Toward a Theory of Instruction

    (1966)
  • C. Cadoz

    Musique, geste technologie

  • J. Edworthy et al.

    Improving auditory warning design: relationship between warning sound parameters and perceived urgency

    Human Factors

    (1991)
  • P.M. Fitts

    The information capacity of the human motor system in controlling the amplitude of movement

    Journal of Experimental Psychology

    (1954)
  • W.W. Gaver

Auditory icons: using sound in computer interfaces

Human–Computer Interaction

    (1986)
  • W.W. Gaver

The sonic finder: an interface that uses auditory icons

Human–Computer Interaction

    (1989)
  • W.W. Gaver

    Using and creating auditory icons

  • B.L. Giordano et al.

    Material identification of real impact sounds: effect of size variation in steel, glass, wood and plexiglass plates

    Journal of the Acoustical Society of America

    (2006)
  • Houix, O., Lemaitre, G., Misdariis, N., Susini, P., Franinovic, K., Hug, D., Otten, J., Scott, J.,...
  • Hunt, K., Crossley, F., 1975. Coefficient of restitution interpreted as damping in vibroimpact. ASME Journal of Applied...
  • Monache, S.D., Devallez, D., Drioli, C., Fontana, F., Papetti, S., Polotti, P., Rocchesso, D., 2008. Sound synthesis...
  • Müller-Tomfelde, C., Münche, T., 2001. Modeling and sonifying pen strokes on surfaces. In: Proceedings of the COST G-6...
  • Müller-Tomfelde, C., Steiner, S., 2001. Audio-enhanced collaboration at an interactive electronic whiteboard. In:...