Animate a manually created face mesh in realtime

  • Question

  • For a current project I am trying to animate and track a face mesh in real time using Kinect data. All of the demos I have seen use live Kinect data to generate a face mesh dynamically, so there is no way to texture this mesh ahead of time.

    For example, this Unity asset has face-tracking demos, but its face mesh is dynamically generated.


    I am targeting the Unity platform, but I could also use openFrameworks, the Unreal engine, or even WebGL if that makes sense.

    So my goal is to take a previously textured face mesh and then somehow drive it with face-tracking data from the Kinect.
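    One common approach, sketched below as platform-neutral Python rather than engine-specific code: Kinect-style face tracking exposes per-frame animation units (AUs) as normalized scalars, and those values can be remapped onto the blendshape weights of a pre-authored, pre-textured mesh. The AU names, blendshape names, and the mapping between them are illustrative assumptions, not the actual Kinect SDK identifiers.

    ```python
    # Sketch: remap Kinect-style animation units (normalized 0..1 scalars)
    # onto blendshape weights (0..100, the convention Unity uses) of a
    # pre-made mesh. AU and blendshape names here are assumed for illustration.

    AU_TO_BLENDSHAPE = {
        "JawOpen": "mouth_open",
        "LipPucker": "mouth_pucker",
        "LeftEyeClosed": "blink_L",
        "RightEyeClosed": "blink_R",
    }

    def aus_to_blendshape_weights(aus):
        """Convert one frame of animation-unit values into blendshape weights."""
        weights = {}
        for au_name, value in aus.items():
            shape = AU_TO_BLENDSHAPE.get(au_name)
            if shape is None:
                continue  # AU with no matching blendshape on this mesh
            # Clamp to [0, 1], then scale to a 0..100 weight range.
            weights[shape] = max(0.0, min(1.0, value)) * 100.0
        return weights

    frame = {"JawOpen": 0.5, "LeftEyeClosed": 1.3, "Brow": 0.4}
    print(aus_to_blendshape_weights(frame))
    # {'mouth_open': 50.0, 'blink_L': 100.0}
    ```

    In Unity the same per-frame loop would end in `SkinnedMeshRenderer.SetBlendShapeWeight`, so the textured mesh only needs blendshapes authored to roughly match the tracker's animation units.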

    Does that make any sense?

    For an example of something similar to what I am trying to do, check out this video


    If you scroll to 4:45, you can see facial expressions being animated by the Kinect!

    • Edited by eco_bach Tuesday, February 16, 2016 5:24 PM
    Tuesday, February 16, 2016 3:47 PM