Real-time Semantic Full-Body Haptic Feedback Converted from Sound for Virtual Reality Gameplay

Published in CHI '25: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, 2025

We present a multisensory virtual reality (VR) system that provides concurrent visual, auditory, and haptic feedback, featuring semantic classification of events from sound, sound-to-haptic conversion, and full-body haptic effects. This concept is applied to enhance the user experience of VR gameplay. The system uses a Long Short-Term Memory (LSTM) model to classify game sounds and detect key events such as gunfire, explosions, and hits. These events are rendered as full-body haptic patterns through a haptic suit, providing users with realistic and immersive haptic experiences. The system operates with low latency, ensuring seamless synchrony between sound and haptic feedback. User studies demonstrate significant improvements in user experience compared to traditional sound-to-haptic methods, underscoring the importance of accurate sound classification and well-designed haptic effects.
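To make the pipeline concrete, the sketch below shows one way such a sound-event classifier could be structured in PyTorch: an LSTM consumes a short window of mel-spectrogram frames and emits an event label that a downstream component could map to a haptic pattern. The class names, window length, feature size, and confidence threshold are illustrative assumptions, not the implementation described in the paper.

```python
# Minimal sketch (not the authors' implementation): an LSTM that classifies a short
# window of game audio into event classes that a caller could map to haptic patterns.
import torch
import torch.nn as nn

EVENT_CLASSES = ["gunfire", "explosion", "hit", "other"]  # hypothetical label set


class SoundEventLSTM(nn.Module):
    def __init__(self, n_mels=64, hidden_size=128, num_classes=len(EVENT_CLASSES)):
        super().__init__()
        # One mel-spectrogram frame per time step; the LSTM summarizes the window.
        self.lstm = nn.LSTM(input_size=n_mels, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, mel_frames):
        # mel_frames: (batch, time, n_mels)
        _, (h_n, _) = self.lstm(mel_frames)
        return self.classifier(h_n[-1])  # logits over event classes


def classify_window(model, mel_frames, threshold=0.8):
    """Return the detected event label for one audio window, or None if uncertain."""
    with torch.no_grad():
        probs = torch.softmax(model(mel_frames), dim=-1)
        conf, idx = probs.max(dim=-1)
    if conf.item() < threshold:
        return None  # low confidence: trigger no haptic effect
    return EVENT_CLASSES[idx.item()]


if __name__ == "__main__":
    model = SoundEventLSTM()
    dummy_window = torch.randn(1, 50, 64)  # e.g., ~0.5 s of audio as 50 mel frames
    print(classify_window(model, dummy_window))
```

In a low-latency setting like the one the abstract describes, such a classifier would run on short, overlapping audio windows so that a detected event can trigger the corresponding full-body haptic pattern almost immediately.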

Gyeore Yun, Seungmoon Choi
Download Paper