Bodily sensation maps: exploring a new direction for detecting emotions from user self-reported data
Authors: García-Magariño I., Chittaro L., Plaza I.
Published in: International Journal of Human-Computer Studies, 113, May 2018, pp. 32–47.
Abstract: The ability to detect emotions is essential in fields such as user experience (UX), affective computing, and psychology. This paper explores the possibility of detecting emotions through user-generated bodily sensation maps (BSMs). The theoretical basis that inspires this work is the proposal by Nummenmaa et al. (2014) of BSMs for 14 emotions. To make it easy for users to create a BSM of how they feel, and convenient for researchers to acquire and classify users’ BSMs, we created a mobile app, called EmoPaint. The app includes an interface for BSM creation and an automatic classifier that matches the created BSM with the BSMs for the 14 emotions. We conducted a user study aimed at evaluating both components of EmoPaint. First, the study shows that the app is easy to use and classifies BSMs consistently with the considered theoretical approach. Second, it shows that using EmoPaint increases the accuracy of users’ emotion classification compared with an adaptation of the well-known method of using the Affect Grid with the Circumplex Model, focused on the same set of 14 emotions of Nummenmaa et al. Overall, these results indicate that the novel approach of using BSMs in the context of automatic emotion detection is promising, and they encourage further development and study of BSM-based methods.
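The abstract does not specify how the classifier matches a user-created BSM against the 14 reference maps. The following is a minimal illustrative sketch, not the paper's actual algorithm: it assumes each BSM is represented as a grid of activation/deactivation values, uses Pearson correlation as the similarity metric (an assumption), and adopts the 14 emotion labels from Nummenmaa et al. (2014). The function name `classify_bsm` and the data layout are hypothetical.

```python
import numpy as np

# The 14 emotions mapped by Nummenmaa et al. (2014): six basic emotions,
# a neutral state, and seven nonbasic emotions.
EMOTIONS = [
    "anger", "fear", "disgust", "happiness", "sadness", "surprise", "neutral",
    "anxiety", "love", "depression", "contempt", "pride", "shame", "envy",
]

def classify_bsm(user_map: np.ndarray, reference_maps: dict) -> str:
    """Return the emotion whose reference BSM is most similar to the user's BSM.

    user_map: 2D array of the user-painted body map, where positive cells mark
        increased sensation and negative cells mark decreased sensation.
    reference_maps: mapping from emotion label to a reference map of the same shape.
    Similarity here is Pearson correlation between flattened maps (an assumed
    metric for illustration only).
    """
    user_vec = user_map.ravel().astype(float)
    best_emotion, best_score = None, -np.inf
    for emotion, ref_map in reference_maps.items():
        ref_vec = ref_map.ravel().astype(float)
        # np.corrcoef returns a 2x2 correlation matrix; the off-diagonal entry
        # is the correlation between the two vectors.
        score = np.corrcoef(user_vec, ref_vec)[0, 1]
        if score > best_score:
            best_emotion, best_score = emotion, score
    return best_emotion
```

A real system would also need to handle maps with no painted cells and could weight body regions differently, but the nearest-reference idea above conveys the basic matching step described in the abstract.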