I participated in the Intel Perceptual Computing Challenge with the project Mercury, a digital sculpting application: voxel-based 3D mesh editing with bare hands.
The prototype is built on the Glow engine and reuses the voxel engine that was originally written for the Iron Cube game. On top of that, I added smooth mesh generation based on Surface Nets (a relative of Marching Cubes).
The Intel Perceptual Computing SDK and the Creative* Interactive Gesture Camera Developer Kit are used for the hand-based interface. I received the camera just one day before the deadline, so I had only one day to integrate it into the engine. Surprisingly, it took just a few hours: the SDK ships with a utility library that makes the camera's features easy to use.
The SDK recognizes several commonly used gestures, such as the victory (V) sign and the thumbs-up, and reports the exact 3D position of every finger. It also reports the hand's state (open or closed) and its position.
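That stream of data (discrete gestures plus continuous hand state and position) maps naturally onto a small dispatcher in the application. The sketch below shows one plausible way to route it to sculpting actions; the `HandState` fields, gesture labels, and action names are illustrative placeholders, not the SDK's actual identifiers or Mercury's real bindings.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HandState:
    openness: str                    # "open" or "closed", as reported per frame
    position: Tuple[float, float, float]  # 3D hand position from the camera
    gesture: Optional[str] = None    # discrete event, e.g. "victory", or None

def sculpt_action(hand: HandState) -> str:
    """Map one reported hand state to a sculpting mode (hypothetical bindings)."""
    if hand.gesture == "victory":
        return "undo"             # discrete gesture triggers a one-shot command
    if hand.openness == "closed":
        return "grab_and_rotate"  # a closed fist manipulates the whole model
    return "add_material"         # an open hand sculpts at hand.position
```

Discrete gestures work well as one-shot commands, while the continuous hand position and open/closed state drive the actual sculpting strokes.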
Mercury currently supports only a restricted set of gestures and manipulations, but I hope to eventually make it feel as natural as real sculpting. The camera provides enough precision for that.