For a current project I am trying to animate and track a face mesh in real time using Kinect data. All of the demos I have seen use live Kinect data to generate the face mesh dynamically, so there is no opportunity to texture the mesh ahead of time.
For example, this Unity asset has face-tracking demos, but the face mesh is dynamically generated:
https://www.assetstore.unity3d.com/en/#!/content/18708
I am targeting the Unity platform, but I could also use openFrameworks, Unreal Engine, or even WebGL if that makes more sense.
So my goal is to take a previously textured face mesh and then somehow drive this mesh with the face-tracking data coming from the Kinect.
Does that make any sense?
For an example of something similar to what I am trying to do, check out this video:
https://www.youtube.com/watch?v=fHyOguL6nOM
If you skip to 4:45, you can see facial expressions being animated by the Kinect!
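To make the goal concrete, here is a rough, untested sketch of what I imagine on the Unity side. It assumes the Microsoft Kinect v2 Unity plugin (the `Windows.Kinect` and `Microsoft.Kinect.Face` namespaces) and a face mesh that was authored with blend shapes; the class name, the blend shape indices, and the mapping from Kinect animation units to blend shapes are all placeholders, and the `Create(...)` factory calls may differ between plugin versions:

```csharp
using UnityEngine;
using Windows.Kinect;          // Kinect v2 Unity plugin
using Microsoft.Kinect.Face;   // Kinect v2 face add-on

// Placeholder component: drives blend shapes on a pre-textured face mesh
// using the Kinect v2 HD face animation units (AUs).
public class KinectFaceDriver : MonoBehaviour
{
    public SkinnedMeshRenderer faceMesh;  // the textured mesh, authored with blend shapes
    public int jawOpenShape = 0;          // placeholder blend shape indices
    public int smileLeftShape = 1;

    KinectSensor sensor;
    BodyFrameReader bodyReader;
    Body[] bodies;
    HighDefinitionFaceFrameSource faceSource;
    HighDefinitionFaceFrameReader faceReader;
    FaceAlignment faceAlignment;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        sensor.Open();
        bodyReader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];

        faceSource = HighDefinitionFaceFrameSource.Create(sensor);
        faceReader = faceSource.OpenReader();
        faceAlignment = FaceAlignment.Create();
    }

    void Update()
    {
        // HD face tracking only reports a face once it has a body tracking id.
        using (var bodyFrame = bodyReader.AcquireLatestFrame())
        {
            if (bodyFrame != null)
            {
                bodyFrame.GetAndRefreshBodyData(bodies);
                foreach (var body in bodies)
                {
                    if (body != null && body.IsTracked)
                    {
                        faceSource.TrackingId = body.TrackingId;
                        break;
                    }
                }
            }
        }

        using (var faceFrame = faceReader.AcquireLatestFrame())
        {
            if (faceFrame == null || !faceFrame.IsFaceTracked) return;
            faceFrame.GetAndRefreshFaceAlignmentResult(faceAlignment);
        }

        // Most AUs are normalized roughly 0..1; Unity blend shape weights run 0..100.
        var aus = faceAlignment.AnimationUnits;
        faceMesh.SetBlendShapeWeight(jawOpenShape,
            aus[FaceShapeAnimations.JawOpen] * 100f);
        faceMesh.SetBlendShapeWeight(smileLeftShape,
            aus[FaceShapeAnimations.LipCornerPullerLeft] * 100f);
    }

    void OnDestroy()
    {
        if (faceReader != null) faceReader.Dispose();
        if (bodyReader != null) bodyReader.Dispose();
        if (sensor != null && sensor.IsOpen) sensor.Close();
    }
}
```

The idea is that each of the Kinect's HD face animation units (jaw open, lip corner puller, and so on) would drive a matching blend shape on my own textured mesh instead of the dynamically generated one. Is something like this a reasonable approach, or is there a better way to bind the Kinect data to an existing mesh?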