EDITORIAL article

Front. Comput. Sci., 03 August 2022
Sec. Human-Media Interaction
Volume 4 - 2022 | https://doi.org/10.3389/fcomp.2022.998416

Editorial: Recognizing the state of emotion, cognition and action from physiological and behavioral signals

  • 1Department of Electrical Engineering and Telecommunications, University of New South Wales, Sydney, NSW, Australia
  • 2Department of Computer Science, University College London, London, United Kingdom
  • 3The Data Science Institute, University of Technology Sydney, Sydney, NSW, Australia
  • 4STARS Team, INRIA, Université Côte d'Azur, Nice, France
  • 5STARS Team, INRIA, Sophia Antipolis, France

Seamlessly blending humans and technology for intelligent interaction is becoming increasingly common. One key aspect is enabling machines to understand users' states of emotion, cognition, and action. This Research Topic is a collection of ten papers in which physiological and behavioral signals are exploited to recognize user states. The collection introduces multiple techniques, systems, and applications, spanning healthcare (e.g., dementia, disorientation in aged people, alpha-wave asymmetry), workload, sleep monitoring, and self-care assistive technology, as well as decision-making tasks (e.g., relevance of text read, relational communication, emotion classification). We highlight the main findings of these research studies below.

A multidisciplinary research team from the UCL Interaction Centre (Holloway et al.) proposes a new cost-effective approach that uses Inertial Measurement Unit (IMU) sensors to predict dementia. The results demonstrate state-of-the-art performance in classifying data from different dementia groups, including typical Alzheimer's disease and posterior cortical atrophy. This approach paves the way for a simple clinical test enabling dementia screening in real-world settings.

Researchers at the University of Bremen (Steinert et al.) conduct a study on predicting activation ratings of people with dementia; such ratings have been shown to be a possible cue of cognitive functioning. Using an existing dataset that includes verbal and non-verbal cues of people with dementia, the team demonstrates the positive contribution of behavioral cues to the prediction and discusses unique challenges of the task.

Teipel et al. study gait and accelerometry features associated with disorientation events, investigating the orientation ability of older and younger cognitively normal participants navigating on a treadmill. Although the associations of the currently studied features are not strong enough for accurate real-time prediction of disorientation in a single individual, the work paves the way for a future system that monitors orientation, gait, accelerometric, and physiological data in a controlled environment.

To better understand and apply the theory of alpha asymmetry, Sabu et al. review the role of affective stimuli in event-related frontal alpha asymmetry. They confirm that strongly engaging, salient, and/or personally relevant stimuli are important for inducing an approach-avoidance effect. The selection of stimuli also accounts for part of the diversity in alpha asymmetry research; notably, multimodal stimuli and stimuli that employ tasks induce approach-avoidance effects more strongly than images.

A collaborative team (Meteier et al.) from Switzerland investigates the use of physiological data to assess mental workload in the context of automated driving. The team confirms that respiratory indicators and heart rate variability are effective measures of mental workload and highlights a possible relationship between task performance and mental workload prediction.
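
To make the kind of measure discussed here concrete, the sketch below computes two standard heart rate variability indices (SDNN and RMSSD) from a series of RR intervals. It is a minimal illustration of such features in general, not the feature set or pipeline used by Meteier et al.

```python
import numpy as np

def hrv_features(rr_ms):
    """Two standard heart rate variability indices computed from
    RR intervals in milliseconds (illustrative feature set only;
    not the exact measures used by Meteier et al.)."""
    diffs = np.diff(rr_ms)
    return {
        "sdnn": float(np.std(rr_ms, ddof=1)),          # overall variability
        "rmssd": float(np.sqrt(np.mean(diffs ** 2))),  # beat-to-beat variability
    }

# Toy example: RR intervals around 800 ms (roughly 75 bpm).
rr = np.array([812.0, 798.0, 805.0, 790.0, 820.0, 801.0, 795.0])
print(hrv_features(rr))
```

Reduced variability under higher workload is a commonly reported pattern, which is why indices like these are candidate workload markers.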

Liang investigates the relationship between brain hemodynamics and stress during the first sleep cycle. Chemical biomarkers and novel wearables for near-infrared spectroscopy are coupled with machine learning in a new research paradigm. The study sheds light on the possible role of the left rostral and dorsolateral prefrontal cortex in stress responses.

Barz et al. conduct a study on estimating paragraph relevance from eye movements. They confirm that eye gaze can be used to estimate the perceived relevance of short news articles, although there is no clear evidence that the approach generalizes to multi-paragraph documents in which users scroll down to read all text passages. Gaze-based relevance detection could thus become part of future adaptive user interfaces that leverage multiple sensors for behavioral signal processing and analysis.
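
As a rough illustration of how gaze can be tied to text regions, the sketch below aggregates fixation dwell time and fixation counts per paragraph, two simple cues that often correlate with perceived relevance. The feature set and the paragraph-mapping function are hypothetical, not those of Barz et al.

```python
from collections import defaultdict

def paragraph_gaze_features(fixations, paragraph_of):
    """Aggregate simple gaze statistics per paragraph as relevance cues.
    Hypothetical feature set; Barz et al. may use different measures.
    fixations: iterable of (y_position, duration_ms) pairs
    paragraph_of: maps a y position on the page to a paragraph index."""
    stats = defaultdict(lambda: {"dwell_ms": 0.0, "n_fix": 0})
    for y, duration in fixations:
        p = paragraph_of(y)
        stats[p]["dwell_ms"] += duration
        stats[p]["n_fix"] += 1
    return dict(stats)

# Toy example: three fixations on a two-paragraph page split at y = 300.
feats = paragraph_gaze_features(
    [(120, 250), (150, 300), (420, 180)],
    paragraph_of=lambda y: 0 if y < 300 else 1,
)
print(feats)  # paragraph 0 accumulates more dwell time -> likely more relevant
```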

Vortmann et al. compare early, middle, and late fusion in a classification task that infers internal (e.g., thoughts, memories) or external (e.g., sensory input) attentional states. The dataset used in this study is multimodal, comprising EEG and eye-tracking data. The results indicate that middle and late fusion are better suited to the task than early fusion approaches.
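
The contrast between these fusion strategies can be sketched in a few lines: early fusion concatenates the raw feature vectors of both modalities before training a single classifier, whereas late fusion trains one classifier per modality and combines their predicted probabilities. The snippet below shows this on synthetic data (middle fusion, which merges intermediate learned representations, is omitted for brevity); it is a schematic illustration, not the models used by Vortmann et al.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_eeg = rng.normal(size=(100, 8))    # synthetic EEG features
X_gaze = rng.normal(size=(100, 4))   # synthetic eye-tracking features
y = rng.integers(0, 2, size=100)     # internal vs. external attention labels

# Early fusion: concatenate modalities, train one classifier.
early_clf = LogisticRegression().fit(np.hstack([X_eeg, X_gaze]), y)
early_pred = early_clf.predict(np.hstack([X_eeg, X_gaze]))

# Late fusion: one classifier per modality, average predicted probabilities.
eeg_clf = LogisticRegression().fit(X_eeg, y)
gaze_clf = LogisticRegression().fit(X_gaze, y)
late_proba = (eeg_clf.predict_proba(X_eeg) + gaze_clf.predict_proba(X_gaze)) / 2
late_pred = late_proba.argmax(axis=1)
```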

Burgoon et al. apply the Brunswikian lens model of relational communication, which measures linguistic, vocalic, and facial cues, to establish how people are perceived on relational attributes (dominance, affection, composure, involvement, similarity, trust) and to quantify their perceived credibility while participants interact in the game of Resistance. They find that the behavior elicited during the activity correlates with relational messages in a supportive manner, for example the association of affection with longer sentences and less hedging.

The research conducted by Menétrey et al. from the University of Geneva and the University of New South Wales aims to identify the key components contributing to accurate emotion prediction. They highlight that emotion recognition requires the integration of various components (appraisal, motivation, expression, physiology, and feeling). In this study, they extract the mean and variance of the physiological data and show that emotional features are encoded within the other components.
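
The mean-and-variance feature extraction mentioned above is simple enough to write out; the sketch below computes both statistics per physiological channel. Channel names and values are illustrative, not data from Menétrey et al.

```python
import numpy as np

def summary_features(signals):
    """Mean and variance per physiological channel, i.e., the simple
    summary statistics described above (channel names are illustrative)."""
    feats = {}
    for name, values in signals.items():
        x = np.asarray(values, dtype=float)
        feats[f"{name}_mean"] = float(x.mean())
        feats[f"{name}_var"] = float(x.var(ddof=1))
    return feats

print(summary_features({
    "heart_rate": [72, 75, 71, 78],                # beats per minute
    "skin_conductance": [0.41, 0.44, 0.39, 0.47],  # microsiemens
}))
```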

We hope readers enjoy this topic collection. These studies demonstrate a growing interest in empowering machines to understand user states, and a multidisciplinary approach to improving collaboration between humans and machines.

Author contributions

All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Keywords: computational psychophysiology, affective computing, human computer interaction, human factors, assistive technology, signal processing, pattern recognition, machine learning

Citation: Chen S, Cho Y, Yu K, Ferrari LM and Bremond F (2022) Editorial: Recognizing the state of emotion, cognition and action from physiological and behavioral signals. Front. Comput. Sci. 4:998416. doi: 10.3389/fcomp.2022.998416

Received: 19 July 2022; Accepted: 20 July 2022;
Published: 03 August 2022.

Edited and reviewed by: Anton Nijholt, University of Twente, Netherlands

Copyright © 2022 Chen, Cho, Yu, Ferrari and Bremond. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Siyuan Chen, siyuan.chen@unsw.edu.au
