Sound-Haptic Perception

Sound and haptic effects, when used together, can greatly enhance usability and user experience. Examples include applications that pair music with haptic feedback and games that generate vibration effects from sound and deliver them through the player's controller. Designing such interfaces well requires research on how people perceive simultaneous auditory and haptic stimuli.

I have conducted three studies in this area as a graduate student:

  1. Sound-to-Motion Conversion for Game Viewing

    Detecting gunfire sounds in FPS gameplay videos and generating corresponding recoil motion effects to complement the audio cues.

  2. Sound-to-Tactile Conversion for Game Playing

    Providing multimodal haptic effects (impact and vibration) by detecting, in real time from the sound signal, the moments at which the effects should be rendered (a minimal sketch of this kind of sound-driven triggering follows the list).

  3. Semantic Haptic Rendering for Game Playing (To be submitted)

    Employing a deep learning model to classify the semantic class of each sound event and providing users with tailored full-body vibration patterns through a haptic suit (a sketch of this classification-to-pattern pipeline also follows the list).
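As a rough illustration of the sound-driven triggering behind the first two studies, the sketch below flags sudden rises in short-time audio energy and treats them as moments to fire a haptic or recoil-motion effect. The energy-based detector, frame size, and threshold are illustrative assumptions for this sketch, not the detection methods used in the papers.

```python
# A minimal sketch of sound-driven effect triggering, assuming a simple
# short-time energy onset detector (not the detectors from the papers).
import numpy as np

def detect_impact_moments(audio, sr, frame_ms=10.0, jump_db=12.0):
    """Return times (seconds) where frame energy rises sharply, used here
    as a stand-in for impact-like sound events such as gunfire."""
    frame_len = int(sr * frame_ms / 1000.0)
    n_frames = len(audio) // frame_len
    frames = audio[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy_db = 10.0 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    # Trigger when a frame exceeds the previous frame's energy by jump_db.
    onsets = np.where(np.diff(energy_db) > jump_db)[0] + 1
    return onsets * frame_ms / 1000.0

if __name__ == "__main__":
    sr = 44100
    audio = 0.01 * np.random.randn(sr)           # one second of background noise
    audio[int(0.5 * sr):int(0.52 * sr)] += 0.8   # a short burst imitating a gunshot
    for t in detect_impact_moments(audio, sr):
        print(f"trigger haptic / recoil effect at {t:.3f} s")
```

In a full pipeline, each detected moment would be forwarded, with an authored effect, to the controller actuators or the motion rendering stage.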
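For the third study, the overall pipeline can be pictured as a sound-event classifier whose predicted semantic class selects a body-location vibration pattern for the suit. The class list, network architecture, and pattern table below are illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch of semantic sound-event classification mapped to full-body
# vibration patterns. Class names, the network, and the actuator/pattern table
# are assumptions made for illustration, not the model or mapping from the paper.
import torch
import torch.nn as nn

SOUND_CLASSES = ["gunfire", "explosion", "footsteps", "ambient"]   # assumed labels

# Assumed mapping from semantic class to a coarse haptic-suit pattern:
# which actuator groups to drive, at what relative intensity, for how long.
VIBRATION_PATTERNS = {
    "gunfire":   {"actuators": ["chest"],                 "intensity": 1.0, "duration_ms": 80},
    "explosion": {"actuators": ["chest", "back", "arms"], "intensity": 0.9, "duration_ms": 400},
    "footsteps": {"actuators": ["back"],                  "intensity": 0.3, "duration_ms": 120},
    "ambient":   {"actuators": [],                        "intensity": 0.0, "duration_ms": 0},
}

class SoundEventClassifier(nn.Module):
    """Small CNN over log-mel spectrogram patches of shape [batch, 1, mels, frames]."""
    def __init__(self, n_classes=len(SOUND_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    model = SoundEventClassifier().eval()        # untrained; shows shapes and flow only
    spectrogram = torch.randn(1, 1, 64, 100)     # stand-in log-mel input
    with torch.no_grad():
        label = SOUND_CLASSES[model(spectrogram).argmax(dim=1).item()]
    print("predicted class:", label)
    print("vibration pattern:", VIBRATION_PATTERNS[label])
```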

For more details, please see the papers.