Generating Real-Time, Selective, and Multimodal Haptic Effects from Sound for Gaming Experience Enhancement
Published in CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023
We propose an algorithm that generates a vibration, an impact, or a combined vibration+impact haptic effect by processing a sound signal in real time. Our algorithm is selective in that it matches the most appropriate type of haptic effect to the sound using a machine-learning classifier (random forest) trained on expert-labeled datasets. Our algorithm is tailored to enhance user experiences in video game play, and we present two examples for the RPG (role-playing game) and FPS (first-person shooter) genres. We demonstrate the effectiveness of our algorithm through a user study comparing it against state-of-the-art (SOTA) methods for the same cross-modal conversion. Our system elicits better multisensory user experiences than the SOTA algorithms for both game genres.
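To make the selective step concrete, the sketch below maps simple per-frame sound features to one of the haptic effect types. Note the paper's method uses a random forest trained on expert-labeled data; the hand-set thresholds, feature names, and function below are illustrative stand-ins, not the authors' implementation.

```python
# Hypothetical sketch of selective effect matching. The actual paper uses a
# random forest classifier on expert-labeled sound data; these hand-tuned
# thresholds are a simplified stand-in for illustration only.

def select_effect(energy: float, onset_sharpness: float) -> str:
    """Pick a haptic effect type for one sound frame.

    energy: overall loudness of the frame (assumed normalized to 0..1)
    onset_sharpness: abruptness of the frame's attack (assumed 0..1)
    """
    if energy < 0.1:
        return "none"              # too quiet: render no haptic effect
    if onset_sharpness > 0.7 and energy > 0.5:
        return "vibration+impact"  # loud, abrupt event (e.g., a gunshot)
    if onset_sharpness > 0.7:
        return "impact"            # abrupt but softer event
    return "vibration"             # sustained sound (e.g., an engine hum)

print(select_effect(0.8, 0.9))  # loud, abrupt frame -> "vibration+impact"
```

In the real system this decision would run per audio frame, keeping latency low enough for real-time game play.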
Gyeore Yun, Minjae Mun, Jungeun Lee, Dong-Geun Kim, Hong Z Tan, Seungmoon Choi