Is there a way to take a .xef file recorded within Kinect Studio and import it for playback as a textured mesh within Unity?

  • General discussion

  • I'm interested in using the recorded data from the Kinect Studio software within Unity and Blender. It would be great to be able to play back point cloud and texture data from a recording within Unity and Blender, in addition to being able to generate mocap .bvh files for the facial tracking and the skeletal tracking. Is there some way to accomplish this, and if so could anyone provide me some direction for where to rummage? Thanks!
    Wednesday, December 10, 2014 3:33 AM

All replies

  • Using Kinect Studio in a production environment is not supported. If this is something you need to deploy, then you will have to come up with an alternative for playing back recorded data.

    As for KStudio playback, any application that is Kinect v2 enabled will get the data. Playback of a KStudio recording behaves the same as running a live sensor, so you would just write your application as if a real sensor were attached.

    Kinect does not have any support for .bvh recording, so if you intend to do recordings you would have to come up with your own implementation or use a third-party application; there are third-party applications that already enable that functionality.
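    If you do roll your own, the BVH format itself is plain text and straightforward to emit. A minimal sketch, assuming you have already captured per-frame root-joint positions and rotations from your own body-tracking code (the joint hierarchy and channel layout here are illustrative, not anything the Kinect SDK produces):

    ```python
    # Sketch of writing a minimal .bvh mocap file by hand.
    # BVH = a text HIERARCHY section (joint tree + channels) followed by a
    # MOTION section (one line of channel values per frame).

    def write_bvh(path, frames, frame_time=1.0 / 30.0):
        """frames: list of (x, y, z, zrot, xrot, yrot) tuples for one root joint."""
        header = """HIERARCHY
    ROOT Hips
    {
        OFFSET 0.0 0.0 0.0
        CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 10.0 0.0
        }
    }
    MOTION
    """
        with open(path, "w") as f:
            f.write(header)
            f.write("Frames: %d\n" % len(frames))
            f.write("Frame Time: %f\n" % frame_time)
            for fr in frames:
                f.write(" ".join("%.4f" % v for v in fr) + "\n")

    # Example: two frames of the hip joint drifting along X.
    write_bvh("demo.bvh", [(0, 0, 0, 0, 0, 0), (0.1, 0, 0, 0, 0, 0)])
    ```

    A real exporter would mirror the full Kinect joint hierarchy and convert joint orientations to the Euler angles BVH expects, but the file layout stays this simple.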

    There is a Unity plug-in for Kinect v2, see documentation for link: http://msdn.microsoft.com/en-us/library/dn782041.aspx

    For Blender, I am not sure one has been created yet. You could create a plug-in based on Kinect Common Bridge as a quick way to start building one out. Be sure you are using the correct branch for v2: https://github.com/MSOpenTech/KinectCommonBridge/tree/2.0


    Carmine Sirignano - MSFT

    Wednesday, December 10, 2014 6:39 PM