
Staff members of the Institute of Digital Games were awarded a Runner-up prize from the highly prestigious IEEE Transactions on Affective Computing for their journal paper "From the Lab to the Wild: Affect Modeling Via Privileged Information". Konstantinos Makantasis, Kosmas Pinitas, Antonios Liapis and Georgios N. Yannakakis conducted research on learning under privileged information (LUPI) for the task of modelling human emotions. The rationale of LUPI is that a computer model can be trained on much richer data when such data is available, but will suffer when that "privileged" data is not available. For example, in a laboratory setting you may have advanced biofeedback sensors measuring heart rate, skin conductance, speech audio etc., while in a real setting (in the wild) you may only have a webcam video feed due to budget and privacy concerns. With LUPI, however, a computer model trained on all the data available in the lab can itself be used as a "teacher" to a computer model that only receives the limited data (acting as a "student").
We used two datasets from Affective Computing to test our hypothesis: the non-privileged data is webcam footage of people collaborating to perform a task (RECOLA) or discussing advertisements they just saw (SEWA), while the full data also includes acoustic features (extracted by complex audio processing software) and biofeedback signals (heart rate and skin conductance for RECOLA, facial action units for SEWA). The datasets were already labeled by experts for arousal (the intensity of emotions) and for valence (whether emotions are positive or negative). We used deep learning architectures on the raw pixel data (webcam feeds) for the "student" model, while for the "teacher" model we fused the webcam feeds with the additional privileged information via late fusion. The "teacher" models, especially those that had access only to privileged information (for RECOLA) and those that combined all information (for SEWA), performed very well. However, the "student" models that used only webcam data as input (but were trained with the help of the teacher models) performed almost as well as the teacher models while accessing much less information. This can be very beneficial for detecting emotions without needing to record audio of someone interacting with a colleague (due to privacy concerns) and without needing specialized hardware (e.g. sensors attached to the skin) or software (e.g. complex audio processing libraries).
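The teacher-student idea described above can be sketched with toy linear models. This is a minimal illustration of the mechanism only: the synthetic data, ridge-regression "heads" and the blending weight `alpha` are all hypothetical stand-ins, not the deep architectures or datasets used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two kinds of input (hypothetical, not RECOLA/SEWA):
# x_cam  ~ webcam features, available both in the lab and in the wild
# x_priv ~ privileged features (audio / biofeedback), lab-only
n, d_cam, d_priv = 200, 8, 4
x_cam = rng.normal(size=(n, d_cam))
x_priv = rng.normal(size=(n, d_priv))
# Ground-truth affect label (e.g. arousal) depends on BOTH modalities
y = x_cam @ rng.normal(size=d_cam) + 2.0 * x_priv @ rng.normal(size=d_priv)

def fit_ridge(X, t, lam=1e-3):
    """Closed-form ridge regression: one modality-specific 'head'."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ t)

# --- Teacher (lab setting): late fusion of per-modality head outputs ---
w_t_cam = fit_ridge(x_cam, y)
w_t_priv = fit_ridge(x_priv, y)
heads = np.column_stack([x_cam @ w_t_cam, x_priv @ w_t_priv])
teacher_pred = heads @ fit_ridge(heads, y)  # learned fusion of the two heads

# --- Student (in the wild): webcam only, trained on a blend of the
# ground-truth labels and the teacher's predictions ---
alpha = 0.5  # hypothetical weight between labels and teacher targets
w_student = fit_ridge(x_cam, alpha * y + (1 - alpha) * teacher_pred)

# --- Baseline: webcam only, no teacher guidance ---
w_base = fit_ridge(x_cam, y)

mse = lambda p: float(np.mean((p - y) ** 2))
print("teacher MSE :", round(mse(teacher_pred), 3))
print("student MSE :", round(mse(x_cam @ w_student), 3))
print("baseline MSE:", round(mse(x_cam @ w_base), 3))
```

The key point is structural: the teacher sees all modalities and fuses them late (at the level of per-modality outputs), while the student receives only the webcam features yet is supervised partly by the teacher's predictions, so no privileged sensors are needed at deployment time.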
The paper was considered for the 2024 Best Paper award of the IEEE Transactions on Affective Computing and received the Runner-up prize. IEEE Transactions on Affective Computing is a top-tier journal for modeling emotions with computer models, with an Impact Factor of 9.8. The nomination highlights the international prestige of the Institute of Digital Games, which is also ranked 5th in technical games research worldwide, above Google and NYU.