Kinect v2 SDK Joint Orientation


  • Hi,

    I am trying to use the Joint Orientations provided by Kinect. The Vector4 member of the JointOrientation structure is the orientation quaternion, is that correct?

    A quaternion should be an ordered tuple {cos(t/2), sin(t/2)*axis_x, sin(t/2)*axis_y, sin(t/2)*axis_z} (where t is the rotation angle). How do these four numbers correspond to {x,y,z,w} in the Vector4 (i.e., JointOrientation.Orientation)?

    In other words, what do each of the numbers {x,y,z,w} in the Vector4 mean?

    Further, what is the reference orientation, i.e., the orientation corresponding to the identity quaternion? Is it pointing upward, along the Y axis?


    September 9, 2014 10:02


  • There are some threads that touch on some of your questions.

    Joint Orientation is a per-joint yaw, pitch, and roll. Each quaternion is the absolute orientation of the parent bone. The basis of each joint is defined by:

    - Bone direction (Y, green) - always matches the skeleton.
    - Normal (Z, blue) - joint roll, perpendicular to the bone.
    - Binormal (X, orange) - perpendicular to the bone and the normal.
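As an illustration of the basis described above, here is a minimal sketch (in Python, for clarity) of how such a bone-direction/normal/binormal frame could be built from two joint positions. The `reference` helper vector and the function name are assumptions for the example; the SDK's actual roll convention is exactly what this thread is trying to pin down.

```python
import numpy as np

def joint_basis(parent_pos, joint_pos, reference=np.array([0.0, 0.0, 1.0])):
    """Sketch of a bone-direction (Y) / binormal (X) / normal (Z) basis.

    reference: an arbitrary helper vector used to fix the roll (an
    assumption here); it must not be parallel to the bone direction.
    """
    # Bone direction (Y): from the parent joint toward this joint, normalized.
    y = joint_pos - parent_pos
    y = y / np.linalg.norm(y)
    # Binormal (X): perpendicular to the bone and the reference vector.
    x = np.cross(reference, y)
    x = x / np.linalg.norm(x)
    # Normal (Z): perpendicular to bone and binormal; this axis carries the roll.
    z = np.cross(x, y)
    return x, y, z
```

The three returned vectors are orthonormal by construction, so they form a valid rotation frame for the bone.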


    September 9, 2014 15:31
  • Those posts don't answer what {x,y,z,w} exactly are. That needs to be known in order to rotate a vector or compute the conjugate.
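For what it's worth, here is what rotating a vector and taking the conjugate look like *if* one assumes the common convention that the Vector4 stores the vector part in {x, y, z} and the scalar cos(t/2) in w. That mapping is exactly the open question in this thread, so treat it as an assumption, not an answer (Python sketch; function names are made up for the example):

```python
import numpy as np

def quat_from_axis_angle(axis, angle):
    # Assumed layout: {x, y, z} = sin(angle/2) * axis, w = cos(angle/2).
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    s = np.sin(angle / 2.0)
    return np.array([*(s * axis), np.cos(angle / 2.0)])  # [x, y, z, w]

def quat_conjugate(q):
    # Conjugate: negate the vector part, keep the scalar part.
    x, y, z, w = q
    return np.array([-x, -y, -z, w])

def quat_mul(a, b):
    # Hamilton product for [x, y, z, w]-ordered quaternions.
    ax, ay, az, aw = a
    bx, by, bz, bw = b
    return np.array([
        aw * bx + bw * ax + ay * bz - az * by,
        aw * by + bw * ay + az * bx - ax * bz,
        aw * bz + bw * az + ax * by - ay * bx,
        aw * bw - ax * bx - ay * by - az * bz,
    ])

def rotate(q, v):
    # v' = q * (v, 0) * conj(q) for a unit quaternion q.
    vq = np.array([v[0], v[1], v[2], 0.0])
    return quat_mul(quat_mul(q, vq), quat_conjugate(q))[:3]
```

If the SDK used a different component order (e.g. scalar first), the same code would give wrong results, which is why nailing down the mapping matters.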

    • Edited by aqurai September 10, 2014 11:50
    • Proposed as answer by Qin0126 March 14, 2016 2:47
    September 10, 2014 11:50
  • In one of the links, this guy gives some elaboration.

    So, the orientation quaternion is the orientation of the parent bone, though for end joints this quaternion is zero, so there is no "roll" angle for the end bones. The provided orientation has 3 degrees of freedom, but given the 3D positions of all joints, it can only add 1 degree of freedom: the rotation around the bone axis. Of course, the 3D position of the next child joint also defines this "roll" angle around the parent bone's axis, so the joint orientation gives no extra information except for the end bones. And since the end joints have no orientation data, this orientation data appears to be 100% redundant given the position of each joint, which makes sense for orientations that are computed from position estimates. Well, perhaps one orientation is not redundant: the spine base, which may be the orientation of the entire body.

    • Edited by sault, Sunday, August 17, 2014 11:13 PM

    Sunday, August 17, 2014 11:09 PM

    The crazy part is that even when standing still, the vectors will change. (See the comments following this guy.)

    If you move your arm in a fluid, constant direction, they jump around pretty wildly. I don't see that in Kinect Studio, but when you use the sample code and turn on the painting of the vectors, they jump around or have noise jitter.


    September 10, 2014 14:26
  • I'm also wondering this. I know Ken, Ray and others have been trying to figure out how to correctly interpret the Joint Orientation coming from Kinect, but have not been able to (at least, conclusively). In all the "answers" the same response is referenced, but the fact of the matter is that it's not clear how to interpret them. Can somebody on MS post an example about this?

    I guess the complete question should be: What is the best way to extract and process joint orientation information from Kinect? Using the joint positions? Using the joint orientation? Some combinations of both?

    September 10, 2014 17:24
  • Tjjos:

    I agree, but I'm bewilderingly out of my league with this stuff. Hamiltonian Quaternion math and visualizing what's happening is not triggering in my brain.  I figured it will take smarter monsters than me to get this one.

    Unity has a working sample. Here are some more goodies:

    Searching the unity3d store for Kinect MSDK v2 is hit-or-miss and painful. I found it after some trial and error.

    Start with this Russian guy's site:

    The sample has an avatar moving with the Kinect. You can see the code inside there as a sample.

    The code is working here. I have not made sense of it yet, as I've moved on to other things for the moment. However, the C# samples are contained therein, and you can follow the logic for the Unity3d platform (v 4.x).

    Looking in KinectInterop there's a struct wrapper for jointdata.

    jointData.direction = jointData.kinectPos - bodyFrame.bodyData[i].joint[jParent].kinectPos;

    In AvatarControllers of the sample you'll find:

    	// Converts kinect joint rotation to avatar joint rotation, depending on joint initial rotation and offset rotation
    	Quaternion Kinect2AvatarRot(Quaternion jointRotation, int boneIndex)
    	{
    		Quaternion newRotation = jointRotation * initialRotations[boneIndex];
    		if (offsetNode != null)
    		{
    			Vector3 totalRotation = newRotation.eulerAngles + offsetNode.transform.rotation.eulerAngles;
    			newRotation = Quaternion.Euler(totalRotation);
    		}
    		return newRotation;
    	}

    So yeah, everybody knows that... but where did the Quaternion jointRotation data come from? The Kinect stuff is crazy, right?

    Is this the mythical Vector4/Quaternion somehow manipulated and transformed? Answer: NO.

    		// Get Kinect joint orientation
    		Quaternion jointRotation = kinectManager.GetJointOrientation(userId, iJoint, flip);
    		// returns the joint rotation of the specified user, relative to the Kinect-sensor
    		public Quaternion GetJointOrientation(Int64 userId, int joint, bool flip)
    		... this basically just returns either normalRotation or mirroredRotation
    But how does that get set, you ask? (Me too.) During the setters:
    	// calculates joint orientations for the given body
    	private void SetJointOrientations(ref KinectInterop.BodyData bodyData)
    	Vector3 baseDir = KinectInterop.JointBaseDir[nextJoint];
    	Vector3 jointDir = nextJointData.direction;
    	jointDir.z = -jointDir.z;
    	jointData.normalRotation = Quaternion.FromToRotation(baseDir, jointDir);
    	... or 
    	baseDir = Vector3.right;
    	jointDir = bodyData.hipsDirection;
    	jointDir.z = -jointDir.z;
    	jointData.normalRotation *= Quaternion.FromToRotation(baseDir, jointDir);

    So the Unity example doesn't use the Kinect Body Joint Orientation Vector4. Alas, the answer to the original question still eludes us as to how to use the Vector4 Joint Orientation in a fashion that folks are relatively familiar with.

    Bottom Line to answer your last question:

    In the meantime, the Unity3d sample indicates using directional vectors to build Unity Quaternions (or whatever platform you're working in) instead of fooling around with the Kinect Body Joint Orientation Vector4.
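The directional-vector trick the sample relies on is essentially Unity's Quaternion.FromToRotation: build a quaternion that rotates a rest-pose bone direction onto the currently tracked bone direction. Here is a hedged sketch of that idea in Python (not the engine's actual implementation; the function name and tolerance handling are assumptions):

```python
import numpy as np

def from_to_rotation(a, b):
    """Quaternion [x, y, z, w] that rotates direction a onto direction b,
    analogous in spirit to Unity's Quaternion.FromToRotation."""
    a = np.asarray(a, dtype=float)
    a = a / np.linalg.norm(a)
    b = np.asarray(b, dtype=float)
    b = b / np.linalg.norm(b)
    d = np.dot(a, b)
    if d > 1.0 - 1e-9:
        # Already aligned: identity rotation.
        return np.array([0.0, 0.0, 0.0, 1.0])
    if d < -1.0 + 1e-9:
        # Opposite directions: rotate 180 degrees about any perpendicular axis.
        axis = np.cross(a, np.array([1.0, 0.0, 0.0]))
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(a, np.array([0.0, 1.0, 0.0]))
        axis = axis / np.linalg.norm(axis)
        return np.array([*axis, 0.0])
    # General case: axis = a x b, and [a x b, 1 + a.b] normalizes to the
    # half-angle quaternion [sin(t/2)*axis, cos(t/2)].
    axis = np.cross(a, b)
    q = np.array([*axis, 1.0 + d])
    return q / np.linalg.norm(q)
```

Applied per bone (rest-pose direction vs. tracked direction, as in the `jointData.direction` computation above), this yields usable joint rotations without ever touching the SDK's Vector4.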

    NOTE: it looks like the newer MS v2 Unity3d asset has issues: "This project is taken temporarily down at the Unity asset store, due to request from a partner. Hope it will be back soon. Sorry for the inconvenience."



    September 11, 2014 16:41
  • Wow, excellent find and analysis Ken. Thanks very much for the information. This will definitely help. I already asked Rumen for the asset but unfortunately he is on vacation (I got a mail from his auto-responder). Would you be so kind and share the asset? 
    September 11, 2014 22:08
  • Ken,

    May I ask about any progress or a solution for this issue?


    September 14, 2014 9:16
  • Converts rotation quaternion to Euler angles... I tried looking you up but can't find a proper contact.

    I just found this in the new 1409 release for Face API sample:

            /// <summary>
            /// Converts rotation quaternion to Euler angles 
            /// And then maps them to a specified range of values to control the refresh rate
            /// </summary>
            /// <param name="rotQuaternion">face rotation quaternion</param>
            /// <param name="pitch">rotation about the X-axis</param>
            /// <param name="yaw">rotation about the Y-axis</param>
            /// <param name="roll">rotation about the Z-axis</param>
        private static void ExtractFaceRotationInDegrees(Vector4 rotQuaternion, out int pitch, out int yaw, out int roll)
        {
            double x = rotQuaternion.X;
            double y = rotQuaternion.Y;
            double z = rotQuaternion.Z;
            double w = rotQuaternion.W;

            // convert face rotation quaternion to Euler angles in degrees
            double yawD, pitchD, rollD;
            pitchD = Math.Atan2(2 * ((y * z) + (w * x)), (w * w) - (x * x) - (y * y) + (z * z)) / Math.PI * 180.0;
            yawD = Math.Asin(2 * ((w * y) - (x * z))) / Math.PI * 180.0;
            rollD = Math.Atan2(2 * ((x * y) + (w * z)), (w * w) + (x * x) - (y * y) - (z * z)) / Math.PI * 180.0;

            // clamp the values to a multiple of the specified increment to control the refresh rate
            double increment = FaceRotationIncrementInDegrees;
            pitch = (int)(Math.Floor((pitchD + ((increment / 2.0) * (pitchD > 0 ? 1.0 : -1.0))) / increment) * increment);
            yaw = (int)(Math.Floor((yawD + ((increment / 2.0) * (yawD > 0 ? 1.0 : -1.0))) / increment) * increment);
            roll = (int)(Math.Floor((rollD + ((increment / 2.0) * (rollD > 0 ? 1.0 : -1.0))) / increment) * increment);
        }
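The conversion math in that sample is easy to sanity-check on its own. Here is a direct transcription of the same formulas into Python (the function name is made up; the refresh-rate clamping is omitted since it only quantizes the output):

```python
import math

def quaternion_to_euler_degrees(x, y, z, w):
    """Same math as the sample's ExtractFaceRotationInDegrees, minus the
    clamping: pitch about X, yaw about Y, roll about Z, in degrees."""
    pitch = math.atan2(2 * ((y * z) + (w * x)),
                       (w * w) - (x * x) - (y * y) + (z * z)) / math.pi * 180.0
    yaw = math.asin(2 * ((w * y) - (x * z))) / math.pi * 180.0
    roll = math.atan2(2 * ((x * y) + (w * z)),
                      (w * w) + (x * x) - (y * y) - (z * z)) / math.pi * 180.0
    return pitch, yaw, roll
```

For example, the identity quaternion (0, 0, 0, 1) comes out as zero pitch, yaw, and roll, and a pure rotation about Y maps entirely to the yaw angle, which matches the parameter comments in the sample.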


    • Proposed as answer by Morphée October 13, 2014 22:42
    September 23, 2014 2:20
  • Start with this Russian guy's site:

    This is of no importance, but the 'Russian guys' are actually a Bulgarian guy (living in Austria). :)

    And yes, my project doesn't use the quaternions provided by the K2-SDK, but calculates the orientations of the bones internally. The best research I've found so far on the topic of 'K2-SDK orientations' is: (Brekel's)

    and (wingcloud's)

    Hope this info helps.

    • Edited by rfilkov October 1, 2014 11:58
    • Proposed as answer by Ken MacPherson October 22, 2014 14:39
    October 1, 2014 11:57
  • Hi rfilkov,

    Those two links don't work. I'm very curious about this because I have the same question: I want to find the orientation of each joint. What can I do with the quaternion I get?

    Also, can I know how you get the orientation without using the quaternion? Thanks!

    January 25, 2015 4:56
  • The links are OK, but they point to the K2-developer forums, which are probably still not publicly available.

    According to Tomohiro (wingcloud's thread), who gets all the credit for this disclosure, you need to orient each joint toward the sky if you want to use the SDK-provided quaternions.

    The orientation can be calculated from the change of each bone between its initial pose and the currently tracked one. This is what I do, and what the SDK does, I suppose.

    • Edited by rfilkov January 26, 2015 23:21
    January 25, 2015 20:23