Harnessing Haptic Technology for Real-Time Emotion Detection

Lital Levy, Yuval Blum, Asmare Ambaw, Roi Yozevitch, Eldad Holdengreber

Research output: Contribution to journal › Article › peer-review

Abstract

This letter introduces a novel multi-modal environmental translator for real-time emotion recognition. The system integrates facial expression recognition (FER) and speech emotion recognition (SER) to analyze visual and vocal cues while conveying emotional feedback through vibrotactile signals. Emotions are mapped to distinct vibration frequencies, ranging from 0.4 Hz for neutral to 35 Hz for anger, enabling users to intuitively identify seven core emotions through tactile sensation. A user study with ten participants demonstrated an average adaptation time of less than 7 minutes, indicating that the system quickly familiarizes users with the vibration signals. Overall, this solution provides a robust approach to enhancing real-time emotion recognition through haptic feedback, making it suitable for everyday social interactions.
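The sketch below illustrates the kind of emotion-to-frequency mapping and multi-modal fusion the abstract describes. Only the two endpoint frequencies (0.4 Hz for neutral, 35 Hz for anger) come from the letter; the intermediate frequencies, the assumed seven-class emotion set, and the weighted-average fusion rule are illustrative assumptions, not the authors' published design.

```python
# Minimal sketch of an emotion-to-vibration-frequency mapping with
# late fusion of FER and SER predictions. Endpoint frequencies are
# taken from the abstract; everything else is assumed for illustration.

from dataclasses import dataclass

# Hypothetical frequency table for seven core emotions (Hz).
EMOTION_TO_FREQ_HZ = {
    "neutral":   0.4,   # from the abstract
    "sadness":   5.0,   # assumed
    "fear":     10.0,   # assumed
    "disgust":  15.0,   # assumed
    "surprise": 20.0,   # assumed
    "happiness": 25.0,  # assumed
    "anger":    35.0,   # from the abstract
}


@dataclass
class ModalityPrediction:
    """Per-emotion probabilities from one modality (FER or SER)."""
    probs: dict[str, float]


def fuse_and_map(fer: ModalityPrediction, ser: ModalityPrediction,
                 fer_weight: float = 0.5) -> tuple[str, float]:
    """Fuse the two modalities by a weighted average (an assumed rule),
    then look up the vibration frequency for the winning emotion."""
    fused = {
        emotion: fer_weight * fer.probs.get(emotion, 0.0)
        + (1.0 - fer_weight) * ser.probs.get(emotion, 0.0)
        for emotion in EMOTION_TO_FREQ_HZ
    }
    emotion = max(fused, key=fused.get)
    return emotion, EMOTION_TO_FREQ_HZ[emotion]


if __name__ == "__main__":
    fer = ModalityPrediction({"anger": 0.7, "neutral": 0.3})
    ser = ModalityPrediction({"anger": 0.6, "sadness": 0.4})
    emotion, freq = fuse_and_map(fer, ser)
    print(f"Detected {emotion}; drive vibrotactile actuator at {freq} Hz")
```

In practice the selected frequency would drive a vibrotactile actuator; the wide separation between neutral (0.4 Hz) and anger (35 Hz) suggests the mapping is designed so that adjacent emotions remain tactually distinguishable.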

Original language: English
Article number: 5500804
Journal: IEEE Sensors Letters
Volume: 9
Issue number: 3
DOIs
State: Published - 2025

Keywords

  • Sensor systems
  • emotion recognition
  • facial expression recognition (FER)
  • haptic feedback
  • machine learning (ML)
  • sensory substitution
  • speech emotion recognition (SER)
