Working with Joint Orientations?

  • Question

  • I am trying to get orientation tracking to work.

    I have seen the other posts regarding this, and the overall theme is that nobody really has it working properly yet. 

    I tried focusing on the forearm; the values are very erratic, even when using the quaternion-to-Euler function provided in the FaceBasics sample. Sometimes the angle does a complete flip into the opposite direction, and things like that.

    Am I doing something wrong, or is everybody seeing this? Would it be possible to provide an example of this in action, something like the BodyBasics sample?
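    For reference, here is a minimal sketch of the kind of conversion involved, and of one common cause of the "sudden flip" symptom: a quaternion q and its negation -q represent the same rotation, so normalizing the sign of w before converting keeps the Euler angles continuous. The `Quat` struct, the `QuatToEuler` name, and the axis convention are all illustrative assumptions, not the SDK's actual API:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

const double kPi = 3.14159265358979323846;

// Hypothetical stand-in for the SDK's orientation quaternion (x, y, z, w).
struct Quat { double x, y, z, w; };

// Quaternion -> Euler angles in degrees, similar in spirit to the helper in
// the FaceBasics sample. The axis convention here is an assumption.
void QuatToEuler(Quat q, double& pitch, double& yaw, double& roll) {
    // q and -q encode the same rotation; forcing w >= 0 removes one common
    // cause of angles suddenly flipping to the opposite direction.
    if (q.w < 0) { q.x = -q.x; q.y = -q.y; q.z = -q.z; q.w = -q.w; }

    const double rad2deg = 180.0 / kPi;
    pitch = std::atan2(2.0 * (q.w * q.x + q.y * q.z),
                       1.0 - 2.0 * (q.x * q.x + q.y * q.y)) * rad2deg;
    double s = 2.0 * (q.w * q.y - q.z * q.x);
    s = std::max(-1.0, std::min(1.0, s));  // clamp: avoids NaN from asin near the poles
    yaw  = std::asin(s) * rad2deg;
    roll = std::atan2(2.0 * (q.w * q.z + q.x * q.y),
                      1.0 - 2.0 * (q.y * q.y + q.z * q.z)) * rad2deg;
}
```

    Even with the sign normalized, Euler angles are still undefined at singularities (e.g. yaw near ±90°), so some jumpiness near those poses is expected.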


    Tuesday, September 30, 2014 5:27 PM

All replies

  • The best example of this joint orientation algorithm I've seen is in the "BoxMan" demo, or the sample called "Evolution". Some of us have the source code from the Kinect Hackathons; however, I'm afraid you'll have to wait until MS releases it to the general public. The K4W team did say they were going to release this sample source as part of the final drop, so just hang tight...

    Sr. Enterprise Architect | Trainer | Consultant | MCT | MCSD | MCPD | SharePoint TS | MS Virtual TS |Windows 8 App Store Developer | Linux Gentoo Geek | Raspberry Pi Owner | Micro .Net Developer | Kinect For Windows Device Developer |blog:

    Tuesday, September 30, 2014 5:43 PM
  • We will release KE as a sample, but it is not ready to be released to the general public yet. Keep in mind, Boxman only does a simple quaternion-to-matrix conversion and applies that rotation to a simple object. When trying to use the values with a skinned mesh or avateering-type models in 3D game engines, the model has to be created in a way that matches, so it can take the rotation directly.
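    The simple step described above can be sketched roughly as follows. This assumes a unit quaternion and row-major matrices; `Quat` and `ToMatrix` are made-up names for illustration, not the SDK's or Boxman's actual API:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Hypothetical quaternion type mirroring the SDK's (x, y, z, w) layout.
struct Quat { double x, y, z, w; };
using Mat3 = std::array<std::array<double, 3>, 3>;

// Standard unit-quaternion -> 3x3 rotation matrix (row-major). The result
// can be applied directly to a simple object's vertices or transform.
Mat3 ToMatrix(const Quat& q) {
    const double xx = q.x * q.x, yy = q.y * q.y, zz = q.z * q.z;
    const double xy = q.x * q.y, xz = q.x * q.z, yz = q.y * q.z;
    const double wx = q.w * q.x, wy = q.w * q.y, wz = q.w * q.z;
    Mat3 m{};
    m[0] = {1 - 2 * (yy + zz), 2 * (xy - wz),     2 * (xz + wy)};
    m[1] = {2 * (xy + wz),     1 - 2 * (xx + zz), 2 * (yz - wx)};
    m[2] = {2 * (xz - wy),     2 * (yz + wx),     1 - 2 * (xx + yy)};
    return m;
}
```

    Applying such a matrix to a rigid box is straightforward; the extra difficulty with skinned meshes is that each bone's bind pose must already line up with the axes the tracked rotation assumes.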

    There are several threads that go into this more deeply, but this thread discusses what the values from the SDK represent:

    Carmine Sirignano - MSFT

    Monday, October 6, 2014 5:16 PM