How to interpret JointOrientation data

  • Question

  • Hi,

    I just started to experiment with visualizing the skeletons.
    Positions are easy but I'm having a little trouble with the orientations and am not sure if I'm interpreting the data incorrectly or have a problem in my math/drawing code.


    How should we interpret the Orientation from:

    typedef struct _JointOrientation {
        JointType JointType;
        Vector4 Orientation;
    } JointOrientation;

    Is it an Euler angle in radians or a quaternion? (w does seem to have data in it)

    Or axis/angle perhaps?


    I'm assuming from the description that it is in local space (relative to the parent joint).
    And what is the hierarchy definition? I'm assuming the following:

    child                       parent
    JointType_SpineBase         -
    JointType_SpineMid          JointType_SpineBase
    JointType_SpineShoulder     JointType_SpineMid
    JointType_Neck              JointType_SpineShoulder
    JointType_Head              JointType_Head
    JointType_ShoulderLeft      JointType_SpineShoulder
    JointType_ElbowLeft         JointType_ShoulderLeft
    JointType_WristLeft         JointType_ElbowLeft
    JointType_HandLeft          JointType_WristLeft
    JointType_HandTipLeft       JointType_HandLeft
    JointType_ThumbLeft         JointType_HandLeft
    JointType_HipLeft           JointType_SpineBase
    JointType_KneeLeft          JointType_HipLeft
    JointType_AnkleLeft         JointType_KneeLeft
    JointType_FootLeft          JointType_AnkleLeft




    Brekel



    Sunday, December 1, 2013 4:47 PM
    Moderator

Answers

  • leftArm.transform.rotation = ElbowLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));

    It seems if I remove these AngleAxis factors it becomes quite jerky and unstable again, so it seems like they might be doing more than just rotating the orientation?

    It's doing nothing other than rotation.

    Because the Kinect quaternions seem to assume that any bone points to the sky before rotating,
    I turned the bones of my model to the sky before applying the quaternions.

    I think you observed your model being jerky and unstable because the quaternions rotated wrongly oriented objects.
    Suppose a quaternion rotates objects around the y-axis, and consider rotating two objects by it: one whose long axis lies along the y-axis, and one whose long axis is perpendicular to it.

    We feel the first is not so jerky (it can be understood as a twist), but the second is jerky, isn't it?

    • Marked as answer by _Ray Price_ Monday, August 18, 2014 2:38 PM
    Saturday, August 9, 2014 2:27 AM

All replies

  • The BodyBasics sample illustrates the skeleton connections, see DrawBody.  Your diagram looks right except that Head's parent should be Neck.

    Monday, December 2, 2013 5:15 PM
    Yep, that's where I took the hierarchy information from, and thanks for spotting a bug :)

    Unfortunately BodyBasics doesn't show how to interpret the rotations, though, and the docs are still a bit vague about the values.

    But in the meantime I'll fiddle a bit more to figure it out.


    Brekel

    Monday, December 2, 2013 5:21 PM
    Moderator
  • Also interested in knowing this.

    - Alvaro

    Monday, December 2, 2013 5:40 PM
  • Joint orientation is exposed as a quaternion.

    The basis for each joint is defined by:

    • Binormal (X) – perpendicular to the bone and normal
    • Bone direction (Y) – always matches the skeleton
    • Normal (Z) – joint roll, perpendicular to the bone

    So if you want to extract the normal, transform vector (0,0,1) using the quaternion.
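
    For illustration, here is a minimal C# sketch of extracting all three axes by rotating the unit vectors, written against the Unity Kinect plugin's Windows.Kinect types (the class name is mine, and no handedness conversion is applied, so the results stay in raw Kinect camera-space values):

    using UnityEngine;
    using Kinect = Windows.Kinect;

    static class JointBasis
    {
        // Rotate the camera-space unit axes by the joint quaternion to get
        // the joint's basis vectors described above (raw Kinect values).
        public static void Extract(Kinect.Vector4 o,
            out Vector3 binormal, out Vector3 boneDir, out Vector3 normal)
        {
            var q = new Quaternion(o.X, o.Y, o.Z, o.W);
            binormal = q * Vector3.right;   // X: perpendicular to bone and normal
            boneDir  = q * Vector3.up;      // Y: along the bone
            normal   = q * Vector3.forward; // Z: carries the joint roll
        }
    }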


    Kevin.K

    Monday, December 2, 2013 8:04 PM
  • Thanks for the clarity Kevin.


    Brekel

    Monday, December 2, 2013 8:08 PM
    Moderator
    Any more clarity on this? How do I determine the degrees of certain joint angles?

    For instance, the angle of the spine?

    Tuesday, February 11, 2014 11:27 AM
    So I am trying to map a Unity model to the Kinect skeleton, but I am having trouble with the orientation of the joints. Any suggestions on what's the best way?

    I tried concatenating the quaternions to get the absolute orientation but it still doesn't work!!

    Thanks for your help!!

    • Proposed as answer by Jose.Canton Monday, August 18, 2014 2:23 PM
    • Unproposed as answer by Jose.Canton Monday, August 18, 2014 2:23 PM
    Friday, May 23, 2014 10:38 PM
    OK, many people have asked about the joint orientation given by the SDK, and I think nobody has clearly explained how the orientation information works so far.

    I (and I am sure not only I) would be really interested in a document explaining the use of orientation data in the Kinect v2 SDK. I tried to make sense of the data myself using all the information I could find in this forum (including the oft-cited Kinect v1 document http://msdn.microsoft.com/en-us/library/hh973073.aspx) and still am not sure whether what I think I have found out is actually true.

    The following points were my conclusion so far (please confirm or correct me if I am wrong):

    1. Unlike stated in many other posts, the quaternions DO NOT have to be chained in order to get the absolute orientation of a joint in Kinect coordinates.
    2. Each joint has its own coordinate system defined. As Kevin.K posted above, Y is in the direction of the bone (so for the elbow, for example, Y lies on the line going from the shoulder to the elbow and points away from the shoulder). As for the "normal", it is not clear to me how I know in what direction Z points except for being normal to Y. For example, when putting the right hand flat on a table with the palm facing down, I think that Z is pointing to the left (where the thumb is) in the plane of the table and is normal to Y (where Y is pointing away from the elbow in the forearm direction). Is this correct, and how are we supposed to know this? What did I miss here?
    3. Now: I think the quaternion stored in a joint gives the direct rotation from the Kinect coordinate system (if you stand in front and look at the Kinect: X is pointing right, Y is pointing up, Z is pointing towards you) to the coordinate system of the joint in its current orientation.

    No quaternion chaining is therefore needed, right?

    Thank you for any quick feedback, and maybe someone who fully understands the design will write a short note about "How to interpret JointOrientation data", as the thread's name actually suggests... that would be awesome!

    Steven




    • Edited by Hocoma Saturday, May 24, 2014 2:55 PM
    Saturday, May 24, 2014 2:51 PM
    I've posted the following several times, so please excuse me, but a lot of threads are asking the same question.

    This is a skeleton taken from UE4, and the following algorithm works well for avateering in UE4.

    UE4 coordinate system: X right, Y forward, Z up.

    Calculating bone rotations for avateering example:

    Definitions:
    Kquat = Kinect bone quaternion.

    Calculations for bone 1:

    Check along which axis the avatar bone lies.
    In this case it is along negative Y, therefore:
    RotAxisY = rotation of (0,-1,0,0) by Kquat
    RotAxisZ = rotation of (0,0,1,0) by Kquat
    RotAxisY = ConvertFromKinectCoordSystem(RotAxisY)
    RotAxisZ = ConvertFromKinectCoordSystem(RotAxisZ)
    Avatar bone 1 world rotation = MakeRotationFromYZ(RotAxisY, RotAxisZ)

    Calculations for bone 2:

    Check along which axis the avatar bone lies.
    In this case it is along positive X, therefore:
    RotAxisX = rotation of (1,0,0,0) by Kquat
    RotAxisZ = rotation of (0,0,1,0) by Kquat
    RotAxisX = ConvertFromKinectCoordSystem(RotAxisX)
    RotAxisZ = ConvertFromKinectCoordSystem(RotAxisZ)
    Avatar bone 2 world rotation = MakeRotationFromXZ(RotAxisX, RotAxisZ)
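
    In Unity terms, the MakeRotationFromYZ step can be sketched with Quaternion.LookRotation, which builds a rotation from a forward (Z) and up (Y) direction. This is a hedged C# equivalent of the "bone 1" case above, not the actual UE4 code, and it assumes kQuat has already been converted to the target coordinate system:

    using UnityEngine;

    static class BoneRotation
    {
        // Sketch of the "bone 1" case: the avatar bone lies along negative Y
        // in its bind pose. kQuat is the Kinect bone quaternion, already
        // converted to the target coordinate system.
        public static Quaternion FromKinect(Quaternion kQuat)
        {
            Vector3 rotAxisY = kQuat * new Vector3(0, -1, 0); // bone axis
            Vector3 rotAxisZ = kQuat * new Vector3(0, 0, 1);  // roll reference
            // LookRotation returns a rotation whose Z axis is the first
            // argument and whose Y axis is (an orthogonalized copy of) the second.
            return Quaternion.LookRotation(rotAxisZ, rotAxisY);
        }
    }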

     
    Saturday, May 24, 2014 5:38 PM
    I also noticed the rotation quaternions are actually in global space, not in local/hierarchical space as was mentioned in some posts from the beginning of the beta.
    This actually makes them easier to use in conjunction with the positions, which are also in global space.

    Using the Configurable Math Library (CML) in C++ I do the following:

    - get joint data from SDK:
    pBody->GetJoints(JointType_Count, m_body_joints);
    pBody->GetJointOrientations(JointType_Count, m_body_joints_orientation);
    


    - obtain global position vector:
    m_body_joints[pos_jointId].Position


    - obtain global rotation quaternions:
    m_body_joints_orientation[rot_jointId].Orientation


    - Note that the position and rotation are not stored in the same joint index; I'm using this mapping:

    pos_jointId: SpineBase      rot_jointId: SpineMid
    pos_jointId: SpineMid       rot_jointId: SpineShoulder
    pos_jointId: SpineShoulder  rot_jointId: Neck
    pos_jointId: Neck           rot_jointId: Head
    pos_jointId: ShoulderLeft   rot_jointId: ElbowLeft
    pos_jointId: ElbowLeft      rot_jointId: WristLeft
    pos_jointId: WristLeft      rot_jointId: HandLeft
    pos_jointId: HandLeft       rot_jointId: HandTipLeft
    pos_jointId: ThumbLeft      rot_jointId: ThumbLeft
    pos_jointId: ShoulderRight  rot_jointId: ElbowRight
    pos_jointId: ElbowRight     rot_jointId: WristRight
    pos_jointId: WristRight     rot_jointId: HandRight
    pos_jointId: HandRight      rot_jointId: HandTipRight
    pos_jointId: ThumbRight     rot_jointId: ThumbRight
    pos_jointId: HipLeft        rot_jointId: KneeLeft
    pos_jointId: KneeLeft       rot_jointId: AnkleLeft
    pos_jointId: AnkleLeft      rot_jointId: FootLeft
    pos_jointId: HipRight       rot_jointId: KneeRight
    pos_jointId: KneeRight      rot_jointId: AnkleRight
    pos_jointId: AnkleRight     rot_jointId: FootRight



    - I then construct a position matrix:

    cml::matrix44f_c posMat;

    cml::matrix_translation(posMat, joint_gbl_pos);



    - And a rotation matrix:

    cml::matrix44f_c rotMat;

    cml::matrix_rotation_quaternion(rotMat, joint_gbl_rot);



    - And a global transform matrix:
    cml::matrix44f_c joint_gbl_transform = posMat * rotMat;



    - These can then be used to calculate local transforms, or for drawing in OpenGL directly like this:
    glPushMatrix();
    	glMultMatrixf(joint_gbl_transform.data());
    	drawJoint();
    glPopMatrix();
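
    For the local-transform step mentioned above, the usual relation is local = inverse(parentGlobal) * childGlobal. A minimal sketch of the rotation part, in Unity C# rather than CML since that is where most of this thread ends up (the class and method names are mine):

    using UnityEngine;

    static class LocalTransforms
    {
        // Recover a joint's local rotation from two global rotations:
        // local = inverse(parent) * child.
        public static Quaternion LocalRotation(Quaternion parentGlobal, Quaternion childGlobal)
        {
            return Quaternion.Inverse(parentGlobal) * childGlobal;
        }
    }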



    I am seeing quite severe noise in some roll channels (especially the elbow/forearm), so I'm not sure if this is to be expected or the result of my method.
    @lion03: how stable are the forearms in your application?
    Saturday, May 24, 2014 6:21 PM
    Moderator
  • @Brekel

    I conducted some tests comparing the Kinect joint orientation estimation with a VICON system. I noticed a lot of noise for roll (arm rotation) too, especially for the elbow (when the elbow is flexed) and even more for the wrist. Also, I think that the thumb has a big influence on the orientation estimation of the wrist: when only the thumb moves, the wrist's orientation estimation actually changes.


    • Edited by Hocoma Saturday, May 24, 2014 7:47 PM
    Saturday, May 24, 2014 7:09 PM
  • Thanks for confirming we're having the exact same experience!

    I've been experimenting with smoothing filters on the rotation data only which helps, especially with selective settings per joint.

    Most of the noise seems to be in the axis that rolls/twists the bone along its own length axis.
    It may be possible to stabilize the rolls a bit more using the shoulder/wrist positions, but I haven't experimented with that.

    Brekel

    Saturday, May 24, 2014 7:36 PM
    Moderator
    That is exactly the problem: arm rotation cannot generally be calculated from the shoulder/elbow/wrist positions. Picture a straight arm (elbow extended): the rotation can now change a lot, but the joint positions stay the same.

    About the filters you are using: what kind of filters are those? Are you simply averaging over time (which would be an issue regarding latency)? Or are you using something more sophisticated, like a Kalman filter? Also, I think the "noise" we were talking about is not just noise but often rotations of 180 degrees. If true, this fact could also be taken into consideration somehow. I think there are too many big "jumps". Do you have experience with a filtering method that lets small changes pass but not big jumps?

    Saturday, May 24, 2014 7:56 PM
    Not a big fan of Kalman filters myself since they're hard to tune (and implement).
    Butterworth, double-quaternion and One Euro filters are much easier and can give similarly good results.
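
    For a flavor of the simplest possible approach, here is a minimal slerp-based exponential smoother for a quaternion stream. It is just an illustrative sketch, much cruder than the Butterworth/One Euro filters mentioned above, but it shows the double-cover handling that matters for the 180-degree jumps discussed earlier:

    using UnityEngine;

    public class JointRotationSmoother
    {
        // Exponential smoothing of a rotation stream via slerp.
        // smoothing in [0,1): higher = smoother but laggier.
        private Quaternion state;
        private bool initialized;

        public Quaternion Filter(Quaternion sample, float smoothing = 0.8f)
        {
            if (!initialized) { state = sample; initialized = true; return state; }
            // q and -q encode the same rotation; flip the sample if needed
            // so the slerp takes the short way around.
            if (Quaternion.Dot(state, sample) < 0f)
                sample = new Quaternion(-sample.x, -sample.y, -sample.z, -sample.w);
            state = Quaternion.Slerp(state, sample, 1f - smoothing);
            return state;
        }
    }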

    I haven't started experimenting with and implementing the following ideas yet, btw:

    Well, the elbow roll orientation actually doesn't change that much with an extended arm, as it's generated by rolling of the shoulder joint.
    With a bent arm a lot of it can be inferred from the elbow position and maybe the shoulder orientation.
    The wrist indeed can rotate a lot; maybe heavily filtering its rotation can be acceptable. Or maybe heavily filtered positions of the fingers can give an indication for stabilizing the roll component of the elbow.

    Noticing the jumps were big initially led me to believe my implementation was hitting some kind of mathematical discontinuity.
    I was thinking about using the normal/tangent of the shoulder to stabilize the elbow so it stays in the correct quadrant.


    Ok, I'll stop ranting now :)

    Brekel


    Saturday, May 24, 2014 8:25 PM
    Moderator
  • I had issues with the forearm.

    Sunday, May 25, 2014 12:04 AM
  • Seems like the 'box man' from the original Xbox One presentations shows the same kind of bone roll issues.

    (starting at 1:40)

    https://www.youtube.com/watch?v=bdviGrPaQDQ


    Brekel

    Sunday, May 25, 2014 12:53 PM
    Moderator
    OK, so I just wanted to confirm our suspicion: the bone orientations are global.

    Check this video; it shows just the conversion from Kinect coords to Unreal coords:

    http://youtu.be/1X_HC_j60r8 

    The conversion was done in the following manner:

    UnrealQuat.x = KinectBoneQuat.x 

    UnrealQuat.y = -KinectBoneQuat.z

    UnrealQuat.z = KinectBoneQuat.y 

    UnrealQuat.w = KinectBoneQuat.w

    Rotate UnrealQuat around Z axis by 180 degrees.
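
    A C# stand-in for that recipe (a sketch, not the actual UE4 code; the steps above don't say whether the final 180-degree Z rotation is a pre- or post-multiply, so a world-space pre-multiply is assumed here):

    using UnityEngine;

    static class KinectToUnrealQuat
    {
        // Component swap per the recipe above, then a 180-degree rotation
        // about Z (assumed to be applied in world space).
        public static Quaternion Convert(float kx, float ky, float kz, float kw)
        {
            var q = new Quaternion(kx, -kz, ky, kw);
            return Quaternion.AngleAxis(180f, Vector3.forward) * q;
        }
    }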

    Sunday, May 25, 2014 2:53 PM
  • Thanks for sharing!

    My results are very similar except the elbows seem to be less stable for me than for you.

    I have my sensor further away so it can see my full body; I'll check tomorrow whether moving it closer like yours makes a difference.


    Brekel

    Sunday, May 25, 2014 3:06 PM
    Moderator
  • Brekel can you please email me at lion032 at gmail dot com
    Thursday, May 29, 2014 7:11 AM
    This seems to be the primary thread addressing the absence of a consistent method for finding the joint rotation, which remains an obstacle to fully skinning an avatar. Can anyone at Microsoft provide a definitive description of this? The new SDK just arrived today, so we'll see if it provides any improvement in consistency that might allow us to infer conclusions, but documentation would still be nice.

    • Edited by Mediascape Wednesday, June 25, 2014 6:19 PM
    Wednesday, June 25, 2014 6:16 PM
  • The release SDK arrived without any new documentation on this, so the question remains:

    How are we to interpret the Quaternions given in each Kinect Body's JointOrientations array?  The documentation states that the JointOrientation structure "Orients a joint relative to the parent joint in the skeleton."  


    However, in practice this does not seem to be true. For example, we have a program that reads in the joint orientations of a person standing facing the Kinect and converts them from quaternions into Euler angles (X, Y, Z). But as we examined the joint orientations for the three spine joints (SpineBase, SpineMid, SpineShoulder), we noticed that each of them had a Y rotation of ~180 degrees. If these orientations were truly relative to the parent joint and we applied them to a 3D avatar from the SpineBase all the way to the SpineShoulder, we would be applying a 180-degree rotation (about the Y axis) to each subsequent child bone. This would cause the avatar to have shoulders that were 540 degrees rotated from the hips.

    Since this is obviously not the case (the head should have very little rotation from the hips when the user is standing straight and facing directly perpendicular to the Kinect), this leads us to believe that the joint orientations are actually global instead of relative to the corresponding parent joint. 
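
    A quick sketch of that check against the Unity Kinect plugin's types (the class name is mine): read the three spine orientations and print their yaw individually rather than chained:

    using UnityEngine;
    using Kinect = Windows.Kinect;

    static class SpineYawCheck
    {
        // If orientations were parent-relative, chaining the three spine
        // quaternions would accumulate ~540 degrees of yaw; read individually
        // they each sit near 180, which fits global values.
        public static void Print(Kinect.Body body)
        {
            foreach (var jt in new[] { Kinect.JointType.SpineBase,
                                       Kinect.JointType.SpineMid,
                                       Kinect.JointType.SpineShoulder })
            {
                Kinect.Vector4 o = body.JointOrientations[jt].Orientation;
                var q = new Quaternion(o.X, o.Y, o.Z, o.W);
                Debug.Log(jt + " yaw: " + q.eulerAngles.y);
            }
        }
    }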

    Is this analysis correct? Has anyone provided definitive documentation, or any representative sample code on this?

    Monday, July 28, 2014 9:31 PM
    We did some experimentation with JointOrientation and that matches our experience exactly: the joint rotation quaternions are global. This was with the 1406 SDK.

    Monday, July 28, 2014 9:33 PM
  • I agree with Paul. The Kinect V2 quaternions are global or absolute.

    The quaternion stored at a child joint describes, in CameraSpace, the orientation of the bone that runs from its parent joint to it. If you rotate (0,1,0) by the quaternion of a child joint, you will get the normalized vector of childJoint.Position - parentJoint.Position.
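
    A quick way to check that claim numerically (a sketch against the Unity Kinect plugin's types, keeping raw Kinect values on both sides so no handedness conversion is needed for the comparison):

    using UnityEngine;
    using Kinect = Windows.Kinect;

    static class BoneDirectionCheck
    {
        // Compare (child quaternion) * (0,1,0) with the measured
        // parent->child direction, here ElbowLeft -> WristLeft.
        public static float AngleError(Kinect.Body body)
        {
            Kinect.Vector4 o = body.JointOrientations[Kinect.JointType.WristLeft].Orientation;
            var q = new Quaternion(o.X, o.Y, o.Z, o.W);

            Kinect.CameraSpacePoint p = body.Joints[Kinect.JointType.ElbowLeft].Position;
            Kinect.CameraSpacePoint c = body.Joints[Kinect.JointType.WristLeft].Position;
            var bone = new Vector3(c.X - p.X, c.Y - p.Y, c.Z - p.Z).normalized;

            // Should be near zero degrees if the claim holds.
            return Vector3.Angle(q * Vector3.up, bone);
        }
    }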

    I believe my thread would help (though I'm still waiting for an answer...):
    http://social.msdn.microsoft.com/Forums/en-US/f2e6a544-705c-43ed-a0e1-731ad907b776/meaning-of-rotation-data-of-k4w-v2?forum=k4wv2devpreview


    • Edited by wingcloud Monday, July 28, 2014 11:25 PM
    Monday, July 28, 2014 11:06 PM
  • Hi,

    You seem to have a better understanding of this than me, so I was hoping I could ask you a question.  If the quaternions representing rotation are global, then I thought Quaternion.Inverse(parentQuat) * childQuat would give the local rotation, but when I pull this for the bend in the elbow (using elbow and wrist rotations in that formula) the resulting local rotation still changes as the shoulder rotates even though the bend and rotation in the arm and forearm remain the same.

    Do you know what would explain this?

    Thanks

    Thursday, August 7, 2014 5:39 AM
  • Hi Ray,

    Your quaternion seems to be right. How did you test it?

    I'm afraid you might have made a mistake in using the quaternions.

    Thursday, August 7, 2014 1:29 PM
  • So, I pulled the LOCAL rotation for the forearm and assigned it to the elbow as follows...

    forearm.transform.localRotation = Quaternion.Inverse(elbowOrientation) * wristOrientation;

    Note that for testing this was the only joint whose rotation I was assigning.

    I then stood in front of the camera with my elbow bent 90 degrees (strongman pose) and I saw rotation in the model.  Good up to this point.  I then rotated my shoulder slowly 90 degrees to the front so my elbow was now facing the camera, however keeping the 90 degree bend in my elbow.

    I would expect the LOCAL rotation of the elbow to change very little in this test, but in fact it went through almost a 90-degree rotation in another direction, which kind of makes me wonder whether, even after applying the parent inverse, the rotation is still somehow not in local space.

    Thanks

    Ray

    Thursday, August 7, 2014 2:53 PM
  • Since the quaternions are global, you don't need to involve parentQuat. Just use childQuat and I think you should get the behavior you're looking for.
    Thursday, August 7, 2014 2:55 PM
  • As a side note, the elbow may not be the best joint to start testing with as it can be a bit jittery.

    Especially when the thumb cannot be seen and therefore the roll rotations are difficult to calculate.

    The hips/spine would probably be the best candidates to start testing with.


    Brekel

    Thursday, August 7, 2014 3:01 PM
    Moderator
  • Is that true?  Don't I need the rotations I assign to my joints to be local (relative to the parent), as I want them to be relative to the rotations of the entire model in world space?  That's why I assign to the localRotation property.
    Thursday, August 7, 2014 3:35 PM
  • Hi Ray,

    You're using Unity, aren't you?

    Here is a part of my code for Unity.
    I don't know Unity well, but it seems to work fine without using local rotations.

    Quaternion comp = Quaternion.FromToRotation(new Vector3(floorPlane.X, floorPlane.Y, floorPlane.Z), Vector3.up);
    Quaternion SpineBase = VToQ(joints[Kinect.JointType.SpineBase].Orientation, comp);
    Quaternion SpineMid = VToQ(joints[Kinect.JointType.SpineMid].Orientation, comp);
    Quaternion SpineShoulder = VToQ(joints[Kinect.JointType.SpineShoulder].Orientation, comp);
    Quaternion ShoulderLeft = VToQ(joints[Kinect.JointType.ShoulderLeft].Orientation, comp);
    Quaternion ShoulderRight = VToQ(joints[Kinect.JointType.ShoulderRight].Orientation, comp);
    Quaternion ElbowLeft = VToQ(joints[Kinect.JointType.ElbowLeft].Orientation, comp);
    Quaternion WristLeft = VToQ(joints[Kinect.JointType.WristLeft].Orientation, comp);
    Quaternion HandLeft = VToQ(joints[Kinect.JointType.HandLeft].Orientation, comp);
    Quaternion ElbowRight = VToQ(joints[Kinect.JointType.ElbowRight].Orientation, comp);
    Quaternion WristRight = VToQ(joints[Kinect.JointType.WristRight].Orientation, comp);
    Quaternion HandRight = VToQ(joints[Kinect.JointType.HandRight].Orientation, comp);
    Quaternion KneeLeft = VToQ(joints[Kinect.JointType.KneeLeft].Orientation, comp);
    Quaternion AnkleLeft = VToQ(joints[Kinect.JointType.AnkleLeft].Orientation, comp);
    Quaternion KneeRight = VToQ(joints[Kinect.JointType.KneeRight].Orientation, comp);
    Quaternion AnkleRight = VToQ(joints[Kinect.JointType.AnkleRight].Orientation, comp);
                    
    Quaternion q = transform.rotation;
    transform.rotation = Quaternion.identity;
    
    Spine1.transform.rotation = SpineMid * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    RightArm.transform.rotation = ElbowRight * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    RightForeArm.transform.rotation = WristRight * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    RightHand.transform.rotation = HandRight * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    LeftArm.transform.rotation = ElbowLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    LeftForeArm.transform.rotation = WristLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    LeftHand.transform.rotation = HandLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    
    RightUpLeg.transform.rotation = KneeRight * Quaternion.AngleAxis(180, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    RightLeg.transform.rotation = AnkleRight * Quaternion.AngleAxis(180, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    LeftUpLeg.transform.rotation = KneeLeft * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    LeftLeg.transform.rotation = AnkleLeft * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));
    
    transform.rotation = q;


    private Quaternion VToQ(Windows.Kinect.Vector4 kinectQ, Quaternion comp)
    {
            return Quaternion.Inverse(comp) * (new Quaternion(-kinectQ.X, -kinectQ.Y, kinectQ.Z, kinectQ.W));
    }

    I use VToQ to adjust Kinect quaternions to the Unity left-handed coordinate system.

    The factors like "Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1))" can be changed
    with respect to the base pose of your Unity model.

    The comp is a compensation term for the Kinect sensor's tilt angle (derived from the floor plane).

    Thursday, August 7, 2014 10:52 PM
  • Wow, thanks for this!  The alignment is off for my model, but I can already see it's much more stable than what I've been trying.  Could you explain a little more about what you're doing with...

    leftArm.transform.rotation = ElbowLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));

    It seems if I remove these AngleAxis factors it becomes quite jerky and unstable again, so it seems like they might be doing more than just rotating the orientation?

    Thanks
    Ray
    Friday, August 8, 2014 3:45 PM
  • leftArm.transform.rotation = ElbowLeft * Quaternion.AngleAxis(90, new Vector3(0, 1, 0)) * Quaternion.AngleAxis(-90, new Vector3(0, 0, 1));

    It seems if I remove these AngleAxis factors it becomes quite jerky and unstable again, so it seems like they might be doing more than just rotating the orientation?

    It's doing nothing other than rotation.

    Because the Kinect quaternions seem to assume that any bone points to the sky before rotating,
    I turned the bones of my model to the sky before applying the quaternions.

    I think you observed your model being jerky and unstable because the quaternions rotated wrongly oriented objects.
    Suppose a quaternion rotates objects around the y-axis, and consider rotating two objects by it: one whose long axis lies along the y-axis, and one whose long axis is perpendicular to it.

    We feel the first is not so jerky (it can be understood as a twist), but the second is jerky, isn't it?

    • Marked as answer by _Ray Price_ Monday, August 18, 2014 2:38 PM
    Saturday, August 9, 2014 2:27 AM
  • Omg, and THERE is the crucial missing piece of information...

    '[Because the Kinect quaternions seem to assume that any bone points to the sky before rotating]'

    Where did you find that out from?  I think that's what has been driving me nuts all this time.  Thank you so much for your explanation.  Let me see if I can apply it to my model more successfully now. :)

    EDIT: OK, I just tried this again now that you've explained what's going on, and it works PERFECTLY.  Now I just need to work it into local space so my model can move around freely and apply some smoothing.  Thank you so much!
    • Edited by _Ray Price_ Saturday, August 9, 2014 4:28 AM
    Saturday, August 9, 2014 3:38 AM
  • Omg, and THERE is the crucial missing piece of information...

    '[Because the Kinect quaternions seem to assume that any bone points to the sky before rotating]'

    Where did you find that out from?  I think that's what has been driving me nuts all this time.  Thank you so much for your explanation.  Let me see if I can apply it to my model more successfully now. :)

    From mathematics! :) I analysed the quaternions and positions of the joints myself.
    I tried to explain it in this thread.
    http://social.msdn.microsoft.com/Forums/en-US/f2e6a544-705c-43ed-a0e1-731ad907b776/meaning-of-rotation-data-of-k4w-v2?forum=k4wv2devpreview

    It was like a riddle and I enjoyed it, but I think MS should have disclosed such information.

    Saturday, August 9, 2014 4:19 AM
    @wingcloud can you elaborate on what you mean by "any bone points to the sky before rotating"? Am I right in understanding that this would make the bind pose what the Kinect joint-orientation data is relative to? How would such a pose look?

    How can I make a "bone point to the sky"? What do you mean by bone, exactly? (We only have joint orientations, right?) Is the sky in Y? Around what axis do I have to rotate it to the sky?

    Hope those questions make sense...

    Thursday, August 20, 2015 3:22 AM
  • Hi Ray

    I am following wingcloud's method in Unity 3D and it works well, but when I try to extract every joint's quaternion, convert it to Euler angles, and apply it to my BVH file, I cannot see any pattern in the arm and forearm.

    Have you tried applying local quaternions to your model?

    Tuesday, August 23, 2016 4:40 PM
    What is the floor plane vector? Is it the normal of the plane (which in most applications is just the up vector, isn't it?), or is it the forward vector, like (0,0,1)?
    Tuesday, December 13, 2016 1:50 AM