In situ affect detection in mobile devices: a multimodal approach for advertisement using social network

Published in ACM SIGAPP Applied Computing Review, 2013

Affect detection has long been advocated for deployment in natural environments. However, due to constraints such as the difficulty of obtaining correct labels and the lack of usable sensors outside the laboratory, most research in multimodal affect detection has been conducted in laboratory settings. In this paper, we investigate affect detection in a natural environment using the sensors available in smartphones. We classify a person's affective state from facial expression and energy expenditure, continuously recording accelerometer data to estimate energy and camera images to capture facial expression, and we measure the performance of the resulting system. We have deployed our system in a natural environment and have paid special attention to the annotation of the training data in order to validate the 'ground truth'. We have found an important relationship between the valence and arousal dimensions that improves the accuracy of affect detection from facial images and energy, which supports Russell's two-dimensional (valence-arousal) theory of emotion. We present initial findings in multimodal affect detection and, building on this multimodal technique, propose a system that can deliver affect-sensitive advertisement in social networks.
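
The abstract's fusion idea can be illustrated with a minimal sketch: estimate movement energy from accelerometer magnitude as a proxy for arousal, take valence from a facial-expression classifier, and place the result in Russell's valence-arousal space. This is not the paper's implementation; the function names (`movement_energy`, `affect_quadrant`), thresholds, and normalization below are illustrative assumptions.

```python
import math

GRAVITY = 9.81  # m/s^2; subtracted to isolate movement-related acceleration


def movement_energy(samples):
    """Mean squared deviation of acceleration magnitude from gravity.

    `samples` is a list of (ax, ay, az) tuples in m/s^2 over a short
    window (e.g. a few seconds). Higher values indicate more vigorous
    movement, used here as a crude proxy for energy expenditure.
    """
    if not samples:
        return 0.0
    deviations = [
        (math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY) ** 2
        for ax, ay, az in samples
    ]
    return sum(deviations) / len(deviations)


def affect_quadrant(valence, arousal, v_mid=0.0, a_mid=0.5):
    """Map (valence, arousal) to a quadrant of Russell's circumplex.

    `valence` in [-1, 1] (e.g. from a facial-expression classifier),
    `arousal` in [0, 1] (e.g. normalized movement energy). The midpoint
    thresholds are illustrative, not values from the paper.
    """
    if valence >= v_mid:
        return "excited/happy" if arousal >= a_mid else "content/relaxed"
    return "angry/stressed" if arousal >= a_mid else "sad/bored"


if __name__ == "__main__":
    # Synthetic window: phone nearly at rest, slight jitter around 1 g.
    window = [(0.1, 0.2, 9.8), (0.0, 0.1, 9.9), (0.2, 0.0, 9.7)]
    arousal = min(movement_energy(window) / 5.0, 1.0)  # ad-hoc scaling
    print(affect_quadrant(valence=0.4, arousal=arousal))
```

A deployed system would replace the hard-coded valence with a per-frame facial-expression score and feed both features to a trained classifier rather than fixed thresholds; the sketch only shows how the two modalities jointly index the valence-arousal plane.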