Press the Enable face & hand tracking button below, approve the camera permission request, and let the JavaScript download code as needed. Once the polygon mask appears, pinch pieces of the mask between your thumb and index finger and drag them away. A pinch is successfully detected when the hand changes color.
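The pinch gesture itself can be detected quite simply from hand landmarks. Here's a minimal sketch, assuming MediaPipe-style normalized landmarks; the landmark indices (4 for thumb tip, 8 for index fingertip) match MediaPipe's hand model, but the distance threshold is an illustrative guess, not this project's exact value:

```javascript
// Detect a pinch as thumb tip and index fingertip being close together.
// `landmarks` is an array of {x, y} points normalized to 0..1.
function isPinching(landmarks, threshold = 0.05) {
  const thumb = landmarks[4]; // thumb tip in MediaPipe's 21-point hand model
  const index = landmarks[8]; // index fingertip
  return Math.hypot(thumb.x - index.x, thumb.y - index.y) < threshold;
}

// An open hand (fingertips spread apart) should not register as a pinch.
const openHand = Array.from({ length: 21 }, (_, i) => ({ x: i * 0.05, y: 0.5 }));
console.log(isPinching(openHand));
```

When a pinch is detected, the page can simply attach the nearest polygon to the pinch point until the fingers separate again.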
Each polygon plays its own notes, determined by the x-y coordinates of its centroid... So each mask makes different sounds, given the random nature of each polygon, your distance from the camera, where you're looking, and so on. Breaking a piece off the mask stops that polygon from moving around with your face.
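One way to picture the centroid-to-note mapping: average the polygon's vertices, then read a pitch off the x axis and an octave off the y axis. This is a sketch of one plausible scheme, not the actual mapping in the patch; the pentatonic scale and octave range are my own illustrative choices:

```javascript
// Average a polygon's vertices to get its centroid (coords normalized 0..1).
function centroid(points) {
  const sum = points.reduce((a, p) => ({ x: a.x + p.x, y: a.y + p.y }), { x: 0, y: 0 });
  return { x: sum.x / points.length, y: sum.y / points.length };
}

// Map a centroid to a MIDI note: x picks a scale degree, y picks an octave.
function centroidToMidiNote(points) {
  const c = centroid(points);
  const scale = [0, 2, 4, 7, 9]; // major pentatonic degrees (illustrative)
  const degree = scale[Math.floor(c.x * scale.length) % scale.length];
  const octave = 3 + Math.floor(c.y * 3); // three octaves spanned top to bottom
  return 12 * octave + degree;
}
```

Since the face landmarks jitter and each mask is triangulated differently, no two runs land on the same set of centroids, which is where the "every mask sounds different" quality comes from.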
This project runs three AI models at once: face detection, palm detection, and finger landmark detection, all from MediaPipe by Google.
The topmost mask is made via Delaunay triangulation, using this library: Delaunator. It accepts a set of points and draws polygons between them... which works wonderfully with nearly all hand/face detection AIs, since their landmarks can be used directly as points!
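Delaunator returns its triangulation as a flat array of point indices, three per triangle, so turning landmarks into drawable polygons is just a matter of regrouping. The helper below is a sketch of that step; the example point set stands in for real MediaPipe face landmarks:

```javascript
// Group a flat Delaunator-style index array into triangles of vertex objects.
function trianglesToPolygons(points, triangleIndices) {
  const polys = [];
  for (let i = 0; i < triangleIndices.length; i += 3) {
    polys.push([
      points[triangleIndices[i]],
      points[triangleIndices[i + 1]],
      points[triangleIndices[i + 2]],
    ]);
  }
  return polys;
}

// With the real library this would be driven roughly like:
//   const delaunay = Delaunator.from(landmarks.map(p => [p.x, p.y]));
//   const polys = trianglesToPolygons(landmarks, delaunay.triangles);
const points = [{ x: 0, y: 0 }, { x: 1, y: 0 }, { x: 0, y: 1 }, { x: 1, y: 1 }];
const polys = trianglesToPolygons(points, [0, 1, 2, 1, 3, 2]);
```

Re-running the triangulation every frame on the latest landmarks is what keeps the mask glued to the moving face.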
Sound is made by piping data over to Cycling '74 Max via WebSocket, through its Node for Max node.script object. The x and y coordinates of each detached polygon affect the final output, along with the percentage of the static mask left exposed. Download my Max patch zip here. The zip contains the patch, the node script, and the WebSocket Node.js package. Run the Max patch locally as-is, then open the website in any browser on the same machine and you'll get sound.
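The browser-to-Max message could be as simple as a small JSON payload per polygon. This is a sketch of one plausible shape; the field names and the port number are illustrative, not the patch's actual protocol:

```javascript
// Serialize one polygon's state for the wire: its id, centroid, and how much
// of the static mask is still exposed (all names are illustrative).
function makePolyMessage(id, cx, cy, exposedFraction) {
  return JSON.stringify({ poly: id, x: cx, y: cy, exposure: exposedFraction });
}

// In the browser it would be sent roughly like:
//   const ws = new WebSocket("ws://localhost:8080");
//   ws.onopen = () => ws.send(makePolyMessage(3, 0.42, 0.7, 0.85));
// while the Node for Max script parses each incoming message with JSON.parse
// and forwards the values out of node.script into the patch.
const msg = makePolyMessage(3, 0.5, 0.25, 0.9);
```

Everything stays on localhost, which is why the patch and the browser just need to be running on the same machine.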
There is no story. Polygons just have been on the front of my brain for a while, and things fell into place like this.