Abstract
We present the methods used to capture a library of human movements for use in computer-animated displays of human movement. The library is an attempt to systematically capture and represent the wide range of personal properties, such as identity, gender, and emotion, that are available in a person's movements. The movements of 30 nonprofessional actors (15 of them female) were captured while they performed walking, knocking, lifting, and throwing actions, as well as their combination, in angry, happy, neutral, and sad affective styles. From the raw motion capture data, a library of 4,080 movements was obtained using techniques based on Character Studio (plug-ins for 3D Studio MAX; Autodesk, Inc.), MATLAB (The MathWorks, Inc.), or a combination of the two. For the knocking, lifting, and throwing actions, 10 repetitions of the simple action unit were obtained for each affect; for the other actions, two longer movement recordings were obtained for each affect. We discuss the potential use of the library for computational and behavioral analyses of movement variability, for human character animation, and for the study of how gender, emotion, and identity are encoded in and decoded from human movement.
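The stated library size follows directly from the recording counts given above. The following minimal sketch (in Python; the per-actor, per-affect breakdown is our reading of the abstract, and all variable names are ours) checks that arithmetic:

    # Consistency check for the reported library size (4,080 movements),
    # based on the counts stated in the abstract.
    # Assumed breakdown: knocking, lifting, and throwing each have 10
    # repetitions per affect; walking and the combined action sequence
    # each have 2 recordings per affect.

    actors = 30                      # 15 female, 15 male nonprofessional actors
    affects = 4                      # angry, happy, neutral, sad
    recordings_per_affect = {
        "knocking": 10,
        "lifting": 10,
        "throwing": 10,
        "walking": 2,
        "combined_sequence": 2,
    }

    per_actor_per_affect = sum(recordings_per_affect.values())   # 34
    total_movements = actors * affects * per_actor_per_affect

    print(total_movements)           # 4080, matching the library size reported above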
Additional information
This work was supported by Engineering and Physical Sciences Research Council Grant GR/M30326 to F.E.P. and a British Academy Postdoctoral Fellowship to H.M.P.
Cite this article
Ma, Y., Paterson, H.M. & Pollick, F.E. A motion capture library for the study of identity, gender, and emotion perception from biological motion. Behavior Research Methods 38, 134–141 (2006). https://doi.org/10.3758/BF03192758