Kinect v2 SDK Joint Orientation
Question

Hi,
I am trying to use the Joint Orientations provided by Kinect. The Vector4 member of the JointOrientation structure is the orientation quaternion, is that correct?
A quaternion should be an ordered tuple {cos(t/2), sin(t/2)*axis_x, sin(t/2)*axis_y, sin(t/2)*axis_z} (where t = angle). How do these four numbers correspond to {x,y,z,w} in the Vector4 (i.e., JointOrientation.Orientation)?
In other words, what does each of the numbers {x,y,z,w} in the Vector4 mean?
Further, what is the reference orientation, i.e., the orientation corresponding to the identity quaternion? Is it pointing upwards, along the Y axis?
Thanks.
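For reference, the convention the SDK appears to follow is the DirectX one: the vector part in X, Y, Z and the scalar in W, i.e. {x, y, z, w} = {sin(t/2)*axis_x, sin(t/2)*axis_y, sin(t/2)*axis_z, cos(t/2)}. Here is a minimal round-trip sketch of that convention (plain Python for illustration, not SDK code; the function names are my own):

```python
import math

def axis_angle_to_quat(axis, angle):
    """Pack axis-angle into (x, y, z, w), with the scalar part last."""
    s = math.sin(angle / 2.0)
    return (axis[0] * s, axis[1] * s, axis[2] * s, math.cos(angle / 2.0))

def quat_to_axis_angle(q):
    """Recover (axis, angle) from an (x, y, z, w) unit quaternion."""
    x, y, z, w = q
    angle = 2.0 * math.acos(max(-1.0, min(1.0, w)))
    s = math.sqrt(max(0.0, 1.0 - w * w))
    if s < 1e-9:                        # no rotation: axis is arbitrary
        return (1.0, 0.0, 0.0), 0.0
    return (x / s, y / s, z / s), angle
```

So a Vector4 of {0, 0.7071, 0, 0.7071} would mean a 90-degree rotation about the Y axis.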
All replies

There are some threads that touch on some of your questions.
http://social.msdn.microsoft.com/Forums/en-US/749e47fc-8e7b-4286-b095-72c7676ecaaf/unity-joint-orientations-to-bone-mapping-not-quite-right?forum=kinectv2sdk
Joint Orientation is a per-joint yaw, pitch, and roll. Each quaternion is the absolute orientation of the parent bone. The basis of each joint is defined by:
- Bone direction (Y, green) - always matches the skeleton.
- Normal (Z, blue) - joint roll, perpendicular to the bone.
- Binormal (X, orange) - perpendicular to the bone and the normal.
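If that's right, rotating the unit axes by a joint's quaternion should reproduce those three vectors. A quick sketch of the idea (Python for illustration; quat_rotate is the standard q*v*q^-1 expansion, and the basis mapping follows the list above):

```python
def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    # t = 2 * cross(q.xyz, v)
    tx = 2.0 * (y * v[2] - z * v[1])
    ty = 2.0 * (z * v[0] - x * v[2])
    tz = 2.0 * (x * v[1] - y * v[0])
    # v' = v + w*t + cross(q.xyz, t)
    return (v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx))

def joint_basis(q):
    """Bone direction (Y), normal (Z) and binormal (X) for joint orientation q."""
    return {
        "bone_direction": quat_rotate(q, (0.0, 1.0, 0.0)),
        "normal":         quat_rotate(q, (0.0, 0.0, 1.0)),
        "binormal":       quat_rotate(q, (1.0, 0.0, 0.0)),
    }
```

With the identity quaternion, the bone direction comes out as the Y axis, which matches the "reference orientation points up" reading of the question.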
Ken


In one of the links, this guy gives some elaboration.
So, the orientation quaternion is the orientation of the parent bone, though for end joints this quaternion is zero, so there is no "roll" angle for the end bones. The provided orientation has 3 degrees of freedom, but given the 3D positions of all joints, it can only add 1 degree of freedom of new information: the rotation around the bone axis. Of course, the position of the next child joint also defines this "roll" angle around the parent bone axis, so the joint orientation gives no extra information except for the end bones; and since the end joints carry no orientation data, it seems evident that the orientation data is 100% redundant given the position of each joint. This makes sense for orientations that are computed from position estimates. Well, perhaps one orientation is not redundant: the spine base, which may encode the orientation of the entire body.
The crazy part is that even standing still the vectors will change (see the comments following this guy).
If you move your arm in a fluid, constant direction, they jump around pretty wildly. I don't see that in Kinect Studio, but when you use the sample code and turn on the painting of the vectors, they jump around or show noise jitter.
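For the jitter, a common workaround (not from the SDK, just a generic sketch in Python) is a quaternion low-pass filter: each frame, slerp from the previous smoothed orientation a small fraction toward the new noisy sample:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (x, y, z, w)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                  # take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:               # nearly identical quaternions
        return q0
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def smooth(prev, sample, alpha=0.25):
    """Low-pass filter: move only a fraction alpha toward the noisy sample."""
    return slerp(prev, sample, alpha)
```

Small alpha means more smoothing but more lag; the right value depends on how jumpy the tracking is.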
Ken

I'm also wondering this. I know Ken, Ray and others have been trying to figure out how to correctly interpret the joint orientations coming from Kinect, but have not been able to (at least, not conclusively). In all the "answers" the same response is referenced, but the fact of the matter is that it's not clear how to interpret them. Can somebody from MS post an example about this?
I guess the complete question should be: what is the best way to extract and process joint orientation information from Kinect? Using the joint positions? Using the joint orientations? Some combination of both?

Tjjos:
I agree, but I'm bewilderingly out of my league with this stuff. Hamiltonian quaternion math, and visualizing what's happening, just isn't clicking in my brain. I figure it will take smarter monsters than me to crack this one.
Unity has a working sample. Here are some more goodies:
http://unity3d.am/2014/04/29/kinect-v2-plugin-for-unity3d/
Searching the Unity3d store for Kinect MS-SDK v2 is hit-or-miss painful. I found it after some trial and error.
Start with this Russian guy's site:
http://rfilkov.com/2014/08/01/kinect-v2-with-ms-sdk/
The sample has an avatar moving with the Kinect. You can see the code inside there as a sample.
The code is working here. I have not made sense of it yet, as I've moved on to other things for the moment. However, the C# samples are contained therein; you can follow the logic for the Unity3d platform (v 4.x).
Looking in KinectInterop, there's a struct wrapper for joint data:
jointData.direction = jointData.kinectPos - bodyFrame.bodyData[i].joint[jParent].kinectPos;
In AvatarControllers of the sample you'll find:
// Converts kinect joint rotation to avatar joint rotation, depending on joint initial rotation and offset rotation
Quaternion Kinect2AvatarRot(Quaternion jointRotation, int boneIndex)
{
    Quaternion newRotation = jointRotation * initialRotations[boneIndex];

    if (offsetNode != null)
    {
        Vector3 totalRotation = newRotation.eulerAngles + offsetNode.transform.rotation.eulerAngles;
        newRotation = Quaternion.Euler(totalRotation);
    }

    return newRotation;
}
So yeah, everybody knows that... but where did the Quaternion jointRotation data come from? The Kinect stuff is crazy, right?
Is this the mythical Vector4/Quaternion somehow manipulated and transformed? Answer: NO.
// Get Kinect joint orientation
Quaternion jointRotation = kinectManager.GetJointOrientation(userId, iJoint, flip);

// returns the joint rotation of the specified user, relative to the Kinect sensor
public Quaternion GetJointOrientation(Int64 userId, int joint, bool flip) ...

This basically just returns either normalRotation or mirroredRotation. But how does that get set, you ask? (Me too.) During the setters:

// calculates joint orientations for the given body
private void SetJointOrientations(ref KinectInterop.BodyData bodyData)
{
    ...
    Vector3 baseDir = KinectInterop.JointBaseDir[nextJoint];
    Vector3 jointDir = nextJointData.direction;
    jointDir.z = -jointDir.z;
    jointData.normalRotation = Quaternion.FromToRotation(baseDir, jointDir);
    ...

or

    baseDir = Vector3.right;
    jointDir = bodyData.hipsDirection;
    jointDir.z = -jointDir.z;
    jointData.normalRotation *= Quaternion.FromToRotation(baseDir, jointDir);
So the Unity example doesn't use the Kinect Body Joint Orientation Vector4. Alas, the answer to the original question still eludes us: how to use the Vector4 joint orientation in a fashion that folks are relatively familiar with.
Bottom line, to answer your last question:
In the meantime, the Unity3d sample indicates using directional vectors to build Unity quaternions (or whatever platform you're working in) instead of fooling around with the Kinect Body Joint Orientation Vector4.
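For anyone outside Unity, the FromToRotation trick is easy to reproduce: build the quaternion that rotates a bone's rest direction onto its tracked direction. A hedged sketch in Python (assumes unit-length inputs; mirrors what Unity's Quaternion.FromToRotation does, as far as I can tell):

```python
import math

def from_to_rotation(a, b):
    """Quaternion (x, y, z, w) that rotates unit vector a onto unit vector b."""
    dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    if dot > 1.0 - 1e-9:                 # already aligned
        return (0.0, 0.0, 0.0, 1.0)
    if dot < -1.0 + 1e-9:                # opposite: 180 deg about any axis perpendicular to a
        axis = (0.0, -a[2], a[1]) if abs(a[0]) < 0.9 else (-a[1], a[0], 0.0)
        n = math.sqrt(axis[0] ** 2 + axis[1] ** 2 + axis[2] ** 2)
        return (axis[0] / n, axis[1] / n, axis[2] / n, 0.0)
    # rotation axis = a x b; (1 + dot) gives the half-angle scalar part before normalizing
    cx = a[1] * b[2] - a[2] * b[1]
    cy = a[2] * b[0] - a[0] * b[2]
    cz = a[0] * b[1] - a[1] * b[0]
    q = (cx, cy, cz, 1.0 + dot)
    n = math.sqrt(q[0] ** 2 + q[1] ** 2 + q[2] ** 2 + q[3] ** 2)
    return (q[0] / n, q[1] / n, q[2] / n, q[3] / n)
```

Feed it the rest-pose bone direction (e.g. unit Y) and the tracked parent-to-child direction, and you get a usable bone rotation without ever touching the SDK's Vector4.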
NOTE: it looks like the newer MS v2 Unity3d asset has issues: "This project is taken temporarily down at the Unity asset store, due to request from a partner. Hope it will be back soon. Sorry for the inconvenience."
Ken

Wow, excellent find and analysis, Ken. Thanks very much for the information. This will definitely help. I already asked Rumen for the asset, but unfortunately he is on vacation (I got a mail from his autoresponder). Would you be so kind as to share the asset?


I tried looking you up but can't find a proper contact. I just found this in the new 1409 release, in the Face API sample:
/// <summary>
/// Converts rotation quaternion to Euler angles
/// And then maps them to a specified range of values to control the refresh rate
/// </summary>
/// <param name="rotQuaternion">face rotation quaternion</param>
/// <param name="pitch">rotation about the X-axis</param>
/// <param name="yaw">rotation about the Y-axis</param>
/// <param name="roll">rotation about the Z-axis</param>
private static void ExtractFaceRotationInDegrees(Vector4 rotQuaternion, out int pitch, out int yaw, out int roll)
{
    double x = rotQuaternion.X;
    double y = rotQuaternion.Y;
    double z = rotQuaternion.Z;
    double w = rotQuaternion.W;

    // convert face rotation quaternion to Euler angles in degrees
    double yawD, pitchD, rollD;
    pitchD = Math.Atan2(2 * ((y * z) + (w * x)), (w * w) - (x * x) - (y * y) + (z * z)) / Math.PI * 180.0;
    yawD = Math.Asin(2 * ((w * y) - (x * z))) / Math.PI * 180.0;
    rollD = Math.Atan2(2 * ((x * y) + (w * z)), (w * w) + (x * x) - (y * y) - (z * z)) / Math.PI * 180.0;

    // clamp the values to a multiple of the specified increment to control the refresh rate
    double increment = FaceRotationIncrementInDegrees;
    pitch = (int)(Math.Floor((pitchD + ((increment / 2.0) * (pitchD > 0 ? 1.0 : -1.0))) / increment) * increment);
    yaw = (int)(Math.Floor((yawD + ((increment / 2.0) * (yawD > 0 ? 1.0 : -1.0))) / increment) * increment);
    roll = (int)(Math.Floor((rollD + ((increment / 2.0) * (rollD > 0 ? 1.0 : -1.0))) / increment) * increment);
}
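Here's the same Euler extraction ported to Python (my own port, minus the refresh-rate clamping), in case anyone wants to sanity-check the formulas:

```python
import math

def face_rotation_degrees(x, y, z, w):
    """Pitch/yaw/roll in degrees from an (x, y, z, w) rotation quaternion."""
    pitch = math.degrees(math.atan2(2 * (y * z + w * x),
                                    w * w - x * x - y * y + z * z))
    # clamp the asin argument against floating-point overshoot
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - x * z)))))
    roll = math.degrees(math.atan2(2 * (x * y + w * z),
                                   w * w + x * x - y * y - z * z))
    return pitch, yaw, roll
```

The identity quaternion gives (0, 0, 0), and a pure rotation about Y shows up entirely in the yaw component, as expected.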
Ken

Start with this Russian guy's site:
http://rfilkov.com/2014/08/01/kinect-v2-with-ms-sdk/
This is of no importance, but the 'Russian guy' is actually a Bulgarian guy (living in Austria). :)
And yes, my project doesn't use the quaternions provided by the K2 SDK, but calculates the orientations of the bones internally. The best research I've found so far on the topic of 'K2 SDK orientations' is:
and
http://social.msdn.microsoft.com/Forums/en-US/f2e6a544-705c-43ed-a0e1-731ad907b776/meaning-of-rotation-data-of-k4w-v2 (wingcloud's)
Hope this info helps.

Hi rfilkov,
Those two links don't work. I'm very curious about this because I have the same question: I want to find the orientation of each joint. What can I do with the quaternion I get?
Also, can I ask how you get the orientation without using the quaternion? Thanks!

The links are OK, but they point to the K2 developer forums, which are probably still not publicly available.
As to Tomohiro's (wingcloud's) thread, and he gets all the credit for this disclosure: you need to orient each joint to the sky, if you want to use the SDK-provided quaternions.
The orientation can be calculated from the change of each bone between its initial pose and the currently tracked one. This is what I do, and what the SDK does too, I suppose.