Papers
Research Paper·Mar 2, 2026
Nano-EmoX: Unifying Multimodal Emotional Intelligence from Perception to Empathy
The development of affective multimodal language models (MLMs) has long been constrained by a gap between low-level perception and high-level interaction, leading to fragmented affective capabilities ...
7.0 viability
Research Paper·Mar 16, 2026
Anchoring Emotions in Text: Robust Multimodal Fusion for Mimicry Intensity Estimation
Estimating Emotional Mimicry Intensity (EMI) in naturalistic environments is a critical yet challenging task in affective computing. The primary difficulty lies in effectively modeling the complex, no...
7.0 viability
Research Paper·Mar 16, 2026
Conflict-Aware Multimodal Fusion for Ambivalence and Hesitancy Recognition
Ambivalence and hesitancy (A/H) are subtle affective states where a person shows conflicting signals through different channels -- saying one thing while their face or voice tells another story. Recog...
7.0 viability