CONF
Ali_WACV_2025/IDIAP
Loose Social-Interaction Recognition in Real-world Therapy Scenarios
Ali, Abid
Dai, Rui
Marisetty, Ashish
Astruc, Guillaume
Thonnat, Monique
Odobez, Jean-Marc
Thümmler, Suzanne
Bremond, Francois
EXTERNAL
https://publications.idiap.ch/attachments/papers/2024/Ali_WACV_2025.pdf
PUBLIC
IEEE/CVF Winter Conference on Applications of Computer Vision
2025
The computer vision community has explored dyadic interactions for atomic actions such as pushing or carrying an object. However, with advances in deep-learning models, there is a need to explore more complex dyadic situations such as loose interactions. These are interactions where two people perform certain atomic activities to complete a global action irrespective of temporal synchronisation and physical engagement, such as cooking together. Analysing such dyadic interactions has several useful applications in the medical domain for social-skills development and mental-health diagnosis. To this end, we propose a novel dual-path architecture to capture the loose interaction between two individuals. Our model learns global abstract features from each stream via a CNN backbone and fuses them using a new Global-Layer-Attention module based on a cross-attention strategy. We evaluate our model on real-world autism-diagnosis data: our Loose-Interaction dataset and the publicly available Autism dataset for loose interactions. Our network establishes baseline results on the Loose-Interaction dataset and achieves state-of-the-art results on the Autism dataset. Moreover, we study different social interactions by experimenting on the publicly available NTU-RGB+D dataset (interactive classes from both NTU-60 and NTU-120). We find that different interactions require different network designs. We also evaluate a slightly different version of our method (details in Section 3.6) that incorporates temporal information to address tight interactions, achieving state-of-the-art results.
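The abstract describes fusing the two streams' global features with a cross-attention strategy. As a rough illustration of that idea (not the paper's exact Global-Layer-Attention module; shapes, feature dimensions, and the symmetric two-way fusion are assumptions for this sketch), one stream's features can serve as queries attending over the other stream's features:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, context_feats):
    """Generic scaled dot-product cross-attention.

    query_feats:   (n_q, d) features from one stream (queries)
    context_feats: (n_k, d) features from the other stream (keys/values)
    Returns (n_q, d): each query row re-expressed as a weighted
    mixture of the other stream's features.
    """
    d = query_feats.shape[-1]
    scores = query_feats @ context_feats.T / np.sqrt(d)
    attn = softmax(scores, axis=-1)  # rows sum to 1
    return attn @ context_feats

# Hypothetical example: 4 layer-level feature vectors of dim 8 per stream.
rng = np.random.default_rng(0)
stream_a = rng.standard_normal((4, 8))
stream_b = rng.standard_normal((4, 8))

fused_a = cross_attention(stream_a, stream_b)  # A attends to B
fused_b = cross_attention(stream_b, stream_a)  # B attends to A
fused = np.concatenate([fused_a, fused_b], axis=-1)
print(fused.shape)  # (4, 16)
```

In an actual model the queries, keys, and values would pass through learned projections and the fused features would feed a classification head; this sketch only shows the cross-stream attention mechanism itself.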