This project presents a prototype that uses Kinect to track the user's movement in space and lets them interact with invisible virtual objects through audio feedback in the physical environment. The user manipulates virtual sound objects in both egocentric and allocentric spatial reference frames. The concept is to let the user produce sound in real time through body movement, so that a mime artist or a dancer can improvise movement with corresponding sound effects.
Experimental Lab
Sound Cube
Interact with the virtual world with audio feedback in the physical environment
Kinect · Unity3D · Tangible
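The core interaction described above — a tracked body joint entering an invisible virtual object and triggering a sound — can be sketched as a simple containment test run once per Kinect frame. This is a minimal illustration, not the project's actual Unity3D implementation; all names (`SoundCube`, `update`, the `play` callback) are hypothetical, and Kinect skeleton data is assumed to arrive as (x, y, z) joint positions in metres.

```python
from dataclasses import dataclass

@dataclass
class SoundCube:
    """An invisible axis-aligned cube in the room, paired with a sound."""
    center: tuple   # (x, y, z) position in sensor space, metres
    size: float     # edge length in metres
    sound: str      # identifier of the audio clip to trigger

    def contains(self, point):
        # Point is inside if every coordinate is within half an edge
        # of the cube's centre.
        half = self.size / 2.0
        return all(abs(p - c) <= half for p, c in zip(point, self.center))

def update(cubes, hand_pos, active, play):
    """Call once per tracking frame; fires `play(sound)` on cube entry."""
    for cube in cubes:
        inside = cube.contains(hand_pos)
        was_inside = active.get(cube.sound, False)
        if inside and not was_inside:  # edge-trigger: fire on entry only
            play(cube.sound)
        active[cube.sound] = inside

# Example: one invisible cube at arm height in front of the sensor.
cubes = [SoundCube(center=(0.0, 1.2, 1.5), size=0.4, sound="chime")]
active = {}
played = []
update(cubes, (0.0, 1.25, 1.5), active, played.append)   # hand enters cube
update(cubes, (0.05, 1.2, 1.45), active, played.append)  # hand stays inside
print(played)  # → ['chime']  (one trigger despite two frames inside)
```

Edge-triggering on entry, rather than firing every frame the hand is inside, is what makes the feedback feel like striking a tangible object instead of a continuous drone.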