Multimodal emotion recognition
This project aims to develop novel techniques for automatically recognizing emotions by combining signals from physiology, behaviour, facial expressions, and language.
The design of collaborative learning activities has rarely been informed by the underlying affective dynamics that learners go through. Although emotions are generally recognized as having an impact on learning, particularly during collaborative activities, researchers have found them hard to study and model. Recent advances in biomedical engineering, neuroscience and data mining have increased researchers' attention to this issue. We are at a point where significant accuracy in recognizing basic emotional states is feasible through a number of approaches. The identification of affective and mental states provides a magnifying glass into the processes involved in collaborative learning activities.
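As an illustration of what "combining" modalities can mean in practice, the sketch below shows one common strategy, late fusion: a separate classifier per modality (physiology, facial expression, language) outputs a probability distribution over emotion labels, and these distributions are averaged before picking a final label. The function, labels, and weighting scheme here are hypothetical examples, not the project's actual method.

```python
import numpy as np

# Example emotion labels; a real system would use a validated label set.
EMOTIONS = ["neutral", "happy", "frustrated", "bored"]

def fuse_predictions(modality_probs, weights=None):
    """Weighted-average (late) fusion of per-modality emotion probabilities.

    modality_probs: dict mapping modality name -> list of class
        probabilities, one per label in EMOTIONS.
    weights: optional dict of per-modality weights; defaults to uniform.
    Returns the fused arg-max label and the fused distribution.
    """
    names = list(modality_probs)
    if weights is None:
        weights = {m: 1.0 / len(names) for m in names}
    fused = np.zeros(len(EMOTIONS))
    for m in names:
        fused += weights[m] * np.asarray(modality_probs[m], dtype=float)
    fused /= fused.sum()  # renormalise so probabilities sum to 1
    return EMOTIONS[int(np.argmax(fused))], fused

# Toy per-modality outputs (made-up numbers for illustration only).
label, probs = fuse_predictions({
    "physiology":        [0.2, 0.1, 0.6, 0.1],
    "facial_expression": [0.1, 0.2, 0.5, 0.2],
    "language":          [0.3, 0.1, 0.4, 0.2],
})
# Here every modality leans toward "frustrated", so the fused label agrees.
```

Late fusion is only one option; feature-level (early) fusion or joint models are equally plausible designs for this kind of project.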
For more information: Calvo, R. A., & D’Mello, S. (2010). Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications. IEEE Transactions on Affective Computing, 1(1), 18-37. Published by the IEEE Computer Society.
Want to find out more?
The opportunity ID for this research opportunity is: 1244