Exploring various ways to control smartphone haptic feedback using web technologies. This project lowers the barrier to experimenting with haptics by pairing the Web Vibration API with a range of sensor inputs, all running directly in the browser.
The foundational page demonstrating basic Web Vibration API usage through direct button interactions, custom patterns, and presets.
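As a minimal sketch of that foundational usage: `navigator.vibrate` accepts either a single duration in milliseconds or an alternating `[vibrate, pause, vibrate, ...]` pattern, and returns `false` when vibration is unsupported. The helper names below are illustrative, not taken from the project.

```javascript
// Build an alternating vibrate/pause pattern of `count` pulses.
function makePulsePattern(count, onMs, offMs) {
  const pattern = [];
  for (let i = 0; i < count; i++) {
    pattern.push(onMs);
    if (i < count - 1) pattern.push(offMs); // no trailing pause needed
  }
  return pattern;
}

// Guarded wrapper so the snippet is safe outside a browser
// or on devices without vibration hardware.
function vibrate(pattern) {
  if (typeof navigator !== 'undefined' && typeof navigator.vibrate === 'function') {
    return navigator.vibrate(pattern);
  }
  return false;
}

// Example preset: three 100 ms pulses separated by 50 ms gaps.
vibrate(makePulsePattern(3, 100, 50));
```

Note that most browsers require a user interaction (such as a button tap) before `navigator.vibrate` takes effect, which is why the demo is driven by button presses.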
Translates motion sensor readings (accelerometer and gyroscope, read via the DeviceMotion API) into haptic feedback, allowing users to "feel" their device's movement.
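One plausible mapping from motion to vibration, sketched below with illustrative threshold and scaling values (the project's own tuning may differ): a device at rest reads a magnitude near 9.8 m/s² from gravity alone, so a threshold gates out resting noise, and stronger motion produces longer pulses.

```javascript
// Map acceleration magnitude (m/s^2) to a vibration duration in ms.
// Readings below `threshold` (resting device) produce no vibration;
// the output is clamped to `maxMs`.
function motionToDuration(magnitude, threshold = 12, maxMs = 200) {
  if (magnitude < threshold) return 0;
  return Math.min(Math.round((magnitude - threshold) * 10), maxMs);
}

// Browser-only wiring: guarded so the snippet is inert elsewhere.
if (typeof window !== 'undefined' && 'DeviceMotionEvent' in window) {
  window.addEventListener('devicemotion', (event) => {
    const a = event.accelerationIncludingGravity;
    if (!a) return;
    const magnitude = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    const ms = motionToDuration(magnitude);
    if (ms > 0 && navigator.vibrate) navigator.vibrate(ms);
  });
}
```

On iOS Safari, `DeviceMotionEvent.requestPermission()` must additionally be called from a user gesture before motion events fire.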
Converts real-time microphone input levels into haptic feedback, making the device vibrate in response to sound.
Processes visual input from the smartphone camera to generate haptic feedback based on average brightness or motion detection.
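The brightness path can be sketched as follows, with illustrative names and constants: frames are drawn to a small canvas, the average luma of the RGBA pixels is computed (Rec. 601 weights), and brighter frames map to longer pulses.

```javascript
// Average perceived brightness (0-255) of an RGBA pixel buffer,
// weighting channels by the Rec. 601 luma coefficients.
function averageBrightness(rgba) {
  let sum = 0;
  for (let i = 0; i < rgba.length; i += 4) {
    sum += 0.299 * rgba[i] + 0.587 * rgba[i + 1] + 0.114 * rgba[i + 2];
  }
  return sum / (rgba.length / 4);
}

// Browser-only loop: sample camera frames and vibrate by brightness.
// Call from a user gesture (camera permission prompt).
async function startCameraHaptics() {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' },
  });
  const video = document.createElement('video');
  video.srcObject = stream;
  await video.play();
  const canvas = document.createElement('canvas');
  canvas.width = 64;  // a small frame is enough for an average
  canvas.height = 48;
  const ctx = canvas.getContext('2d');
  setInterval(() => {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
    const ms = Math.round((averageBrightness(data) / 255) * 150);
    if (ms > 10 && navigator.vibrate) navigator.vibrate(ms);
  }, 200);
}
```

Motion detection, mentioned above as an alternative trigger, would instead compare successive frames (e.g. summed per-pixel differences) rather than a single frame's brightness.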
Enables remote control of haptic feedback between devices using PeerJS (WebRTC), allowing a Host to trigger vibrations on multiple Clients.
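A sketch of the Host-to-Client control channel, assuming PeerJS is loaded as a global `Peer`. The `{ type, pattern }` message shape is an illustrative protocol, not necessarily the one the project uses.

```javascript
// Build the message a Host sends to trigger a vibration pattern.
function makeVibrateMessage(pattern) {
  return { type: 'vibrate', pattern };
}

// Clients validate incoming messages before vibrating; returns the
// vibrate callback's result, or false for anything malformed.
function handleMessage(msg, vibrateFn) {
  if (msg && msg.type === 'vibrate' && Array.isArray(msg.pattern)) {
    return vibrateFn(msg.pattern);
  }
  return false;
}

// Browser-only wiring, inert when PeerJS is not loaded.
if (typeof Peer !== 'undefined') {
  // Client side: connect to the Host and vibrate on command.
  const peer = new Peer();
  const conn = peer.connect('host-id'); // 'host-id' is a placeholder
  conn.on('data', (msg) => handleMessage(msg, (p) => navigator.vibrate(p)));

  // Host side would instead track connections and broadcast, e.g.:
  // peer.on('connection', (c) => c.send(makeVibrateMessage([100, 50, 100])));
}
```

Validating messages on the Client side keeps a misbehaving peer from passing arbitrary values into `navigator.vibrate`.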