Efficient communication is key to effective group work, group-based learning, and successful task completion. Face-to-face communication is recognized as the most valuable form of communication, whereas e-mail and text messaging are considered the least valuable (Pentland 2012).
Moreover, how people interact in a group is considered even more critical than what is being said. A range of tools and techniques can be used to describe the characteristics of interactions and to uncover the patterns of efficient ones. One such study, focusing on small-group interactions in a pedagogical setting, was recently carried out at Haaga-Helia University of Applied Sciences (Gjerstad 2019; Aunimo et al. 2021). It took a multimodal research approach, which included registering and assessing the emotions of the participants.
iMotions – a window into emotions
The iMotions software, powered by the AFFDEX algorithm, is based on the Facial Action Coding System, which in turn builds on discrete emotion theory (Ekman 1992). It detects seven prototypical emotions: anger, contempt, disgust, fear, joy, sadness and surprise. This anatomy-based system measures 46 observable action units (e.g., brow furrow) and can translate them into these emotions as well as other facial expressions (e.g., smile).
The software also allows researchers to detect and measure engagement and valence, i.e., the pleasantness or unpleasantness of emotional stimuli (Facial Expression Analysis 2017). It is, however, not able to measure non-prototypical emotions (e.g., confusion), and it does not integrate contextual information; for instance, it cannot recognize the sarcasm behind a smile.
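The general idea of a FACS-based system can be sketched in a few lines of code: observed action units are mapped to candidate emotions, and a valence estimate is derived from the resulting scores. The sketch below is purely illustrative; the action-unit codes, weights, and the valence formula are simplified assumptions, not the actual AFFDEX model or the iMotions API.

```python
# Illustrative sketch only: a toy mapping from facial action units (AUs)
# to prototypical emotions, in the spirit of FACS-based systems such as
# AFFDEX. The AU subsets and the scoring/valence formulas are simplified
# assumptions for illustration, not the real iMotions/AFFDEX model.

# A few well-known FACS action unit codes (hypothetical subset)
AU_NAMES = {
    4: "brow furrow",
    6: "cheek raise",
    9: "nose wrinkle",
    12: "lip corner pull",   # part of a smile
    15: "lip corner depress",
}

# Toy emotion definitions: which AUs suggest each emotion
EMOTION_AUS = {
    "joy": {6, 12},
    "anger": {4},
    "disgust": {9},
    "sadness": {15},
}

def score_emotions(observed_aus: dict) -> dict:
    """Score each emotion as the mean intensity of its associated AUs
    over all AUs that define it (0.0 if none were observed)."""
    scores = {}
    for emotion, aus in EMOTION_AUS.items():
        intensities = [observed_aus[au] for au in aus if au in observed_aus]
        scores[emotion] = sum(intensities) / len(aus) if intensities else 0.0
    return scores

def valence(scores: dict) -> float:
    """Crude valence estimate: positive emotion score minus the
    strongest negative emotion score."""
    positive = scores.get("joy", 0.0)
    negative = max(scores.get(e, 0.0) for e in ("anger", "disgust", "sadness"))
    return positive - negative

# One video frame: strong cheek raise + lip corner pull (a smile)
frame = {6: 0.8, 12: 0.9}
scores = score_emotions(frame)
```

For the example frame, "joy" scores highest and valence is positive, mirroring how a smile is treated as a pleasant stimulus; a real system additionally calibrates such scores against large annotated datasets.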
Furthermore, iMotions relies on the assumption that the produced emotion is both expressed and recognized, which may not always be the case. Joy is the emotion recognized with the highest certainty, while surprise and fear are the most difficult to distinguish from one another (Stöckli et al. 2018).
Registering emotions during interactions
iMotions is a powerful tool for observing study participants: it registers and measures emotions elicited by different stimuli (e.g., pictures, videos, speech). This can also be done during interactions and conversations.
The state-of-the-art equipment at Haaga-Helia’s Lab offers a wide range of possibilities for exploring the emotions of individuals in different research settings. The primary detection is based on eye tracking, galvanic skin response (a.k.a. electrodermal activity), and facial expressions. However, iMotions is designed to incorporate recordings of other signals as well (e.g., electroencephalogram, electrocardiogram, and electromyogram).
The quality of the signal recording is of paramount importance for the subsequent data analysis. Researchers must address this already at the study design stage. Recording can be particularly challenging for eye tracking and facial expressions when study participants move or have their faces obstructed (e.g., by holding their hands in front of the face).
Nevertheless, with careful planning, proper participant instructions, and adequate preparation and testing, experimental conditions can be achieved that yield highly valuable information for a wide range of purposes, including interaction studies.
This publication is part of the Future Experience in Sales and Services project, funded by the Ministry of Education and Culture.
Eevastiina Gjerstad’s current Teams and Safety project, funded by the Finnish Work Environment Fund and executed by Haaga-Helia, Humak UAS and Metropolia UAS, uses the iMotions software in its research and development activities.
References:
- Aunimo, L., Gjerstad, E. & Raulinaitis, V. 2021. How to foster dialogicality in group interaction? ICERI conference full paper.
- Ekman, P. 1992. An Argument for Basic Emotions. Cognition and Emotion 6 (3–4), 169–200.
- Facial Expression Analysis. The Complete Pocket Guide. 2017. iMotions. Copenhagen, Denmark.
- Gjerstad, E. 2019. Mitä lisäarvoa (tunne)tekoäly tuo vuorovaikutus-tutkimukseen? [What added value does (emotional) AI bring to interaction research?] Haaga-Helia eSignals 21.12.2019.
- Pentland, A. 2012. The New Science of Building Great Teams. Harvard Business Review, April 2012, 1–11.
- Stöckli, S., Schulte-Mecklenbeck, M., Borer, S. & Samson, A. C. 2018. Facial expression analysis with AFFDEX and FACET: A validation study. Behavior Research Methods 50 (4), 1446–1460.