How to display joint coordinate systems?

  • Question

Hi all Kinect users!

    I'm trying to achieve a quite simple result and cannot figure it out. The goal is to display a skeleton and, for every bone, to draw the coordinate system attached to it.

    I found this interesting post in the forum:

    Kinect JointOrientation and BoneRotation Matrix questions

    (as a new user I cannot put a link in this message)

    where Ashok j's answer specifies that the bone axis is +Y, but this is not what I'm displaying.

    Here is the piece of code used to retrieve the matrices and the bone positions:

    #include <windows.h>
    #include <NuiApi.h>   // Kinect for Windows SDK 1.x C++ API
    #include <cstdio>

    NUI_SKELETON_FRAME skelFrame;
    HRESULT hr = NuiSkeletonGetNextFrame(200, &skelFrame);
    if (FAILED(hr))
        return;   // no new skeleton frame within 200 ms

    NUI_IMAGE_RESOLUTION imgRes = NUI_IMAGE_RESOLUTION_640x480;   // not used below

    // smooth the whole frame once, before any orientations are computed from it
    NUI_TRANSFORM_SMOOTH_PARAMETERS defaultParams = {0.5f, 0.5f, 0.5f, 0.05f, 0.04f};
    NuiTransformSmooth(&skelFrame, &defaultParams);

    for (int skelId = 0; skelId < NUI_SKELETON_COUNT; ++skelId)
    {
        const NUI_SKELETON_DATA & skeleton = skelFrame.SkeletonData[skelId];

        if (skeleton.eTrackingState != NUI_SKELETON_TRACKED)
            continue;

        // retrieve one hierarchical/absolute orientation per joint (indexed by end joint)
        NUI_SKELETON_BONE_ORIENTATION boneOrientations[NUI_SKELETON_POSITION_COUNT];
        NuiSkeletonCalculateBoneOrientations(&skeleton, boneOrientations);

        // dump the absolute rotation matrices, one 3x3 block per joint
        // (note: "w" mode means a second tracked skeleton overwrites the first)
        FILE* fic = fopen("c:\\temp\\bones2.txt", "w");
        if (fic != NULL)
        {
            for (int u = 0; u < NUI_SKELETON_POSITION_COUNT; ++u)
            {
                const Matrix4 & m = boneOrientations[u].absoluteRotation.rotationMatrix;
                fprintf(fic, "%5.8f\t%5.8f\t%5.8f\n", m.M11, m.M12, m.M13);
                fprintf(fic, "%5.8f\t%5.8f\t%5.8f\n", m.M21, m.M22, m.M23);
                fprintf(fic, "%5.8f\t%5.8f\t%5.8f\n", m.M31, m.M32, m.M33);
                fprintf(fic, "\n");
            }
            fclose(fic);
        }

        // dump the camera-space position of each bone's end joint
        fic = fopen("c:\\temp\\vertices2.txt", "w");
        if (fic != NULL)
        {
            for (int u = 0; u < NUI_SKELETON_POSITION_COUNT; ++u)
            {
                const Vector4 & p = skeleton.SkeletonPositions[boneOrientations[u].endJoint];
                fprintf(fic, "%5.8f\t%5.8f\t%5.8f\n\n", p.x, p.y, p.z);
            }
            fclose(fic);
        }
    }
    
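    For clarity, here is a rough sketch (same includes as above) of how I plan to turn those matrices into drawable axis segments. DrawLine3D is just a hypothetical placeholder for my actual rendering call, the 0.1 m axis length is arbitrary, and it assumes each row of the absolute rotation matrix is one axis of the joint coordinate system expressed in camera space:

    // Hypothetical stand-in for the real rendering call of the application.
    static void DrawLine3D(float x0, float y0, float z0, float x1, float y1, float z1)
    {
        printf("segment (%f, %f, %f) -> (%f, %f, %f)\n", x0, y0, z0, x1, y1, z1);
    }

    // Draw the three axes of every joint: origin at the joint position,
    // row 1 = X (red), row 2 = Y along the bone (green), row 3 = Z (blue).
    void DrawJointAxes(const NUI_SKELETON_DATA & skeleton,
                       const NUI_SKELETON_BONE_ORIENTATION * boneOrientations)
    {
        const float axisLen = 0.1f;   // 10 cm per axis, arbitrary

        for (int u = 0; u < NUI_SKELETON_POSITION_COUNT; ++u)
        {
            const Matrix4 & m = boneOrientations[u].absoluteRotation.rotationMatrix;
            const Vector4 & p = skeleton.SkeletonPositions[boneOrientations[u].endJoint];

            DrawLine3D(p.x, p.y, p.z, p.x + axisLen * m.M11, p.y + axisLen * m.M12, p.z + axisLen * m.M13);   // X
            DrawLine3D(p.x, p.y, p.z, p.x + axisLen * m.M21, p.y + axisLen * m.M22, p.z + axisLen * m.M23);   // Y
            DrawLine3D(p.x, p.y, p.z, p.x + axisLen * m.M31, p.y + axisLen * m.M32, p.z + axisLen * m.M33);   // Z
        }
    }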

    Did someone manage to display these correctly?

    Kindest regards

    Access75010

    Friday, June 14, 2013 3:09 PM

Answers

  • Determining that type of orientation would require additional analysis of the hand region. We don't have any kind of hand or finger detection in the Kinect for Windows SDK. The bone rotations only determine the direction from the previous joint, in order to generate the skeletal structure.

    • Marked as answer by Access75010 Thursday, June 20, 2013 7:42 AM
    Wednesday, June 19, 2013 6:02 PM

All replies

  • For reference:

    http://social.msdn.microsoft.com/Forums/en-US/kinectsdk/thread/4e90b413-b302-4c29-a8da-cb08ebc53f9c
    "Each row of the matrix represents a vector of the bone coordinate system (in it’s parent bone coordinate system for the hierarchical joints, or in the camera coordinate system for the absolute orientations). The +Y vector always lies along the bone direction."

    What results are you getting, and what would you expect them to be? The second row of the matrix (M21 - M24) is the Y vector. In your code, the direction it points is expressed in the camera coordinate system (absoluteRotation). It might be easier to work with the hierarchical orientations and then apply the parent transformations to get into camera coordinates afterwards.
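
    As a rough sanity check (just a sketch, not SDK code), you could compare that second row against the normalized direction from startJoint to endJoint; a dot product close to 1.0 means the Y axis really does lie along the bone in camera space:

    #include <cstdio>
    #include <cmath>   // Kinect types come from NuiApi.h as in your snippet

    // Sketch: dot the absolute Y axis (second matrix row) with the bone direction.
    void CheckBoneYAxis(const NUI_SKELETON_DATA & skeleton,
                        const NUI_SKELETON_BONE_ORIENTATION & bone)
    {
        const Vector4 & a = skeleton.SkeletonPositions[bone.startJoint];
        const Vector4 & b = skeleton.SkeletonPositions[bone.endJoint];

        float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
        float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (len < 1e-6f)
            return;   // root bone: start and end joints coincide

        const Matrix4 & m = bone.absoluteRotation.rotationMatrix;
        float dot = (m.M21 * dx + m.M22 * dy + m.M23 * dz) / len;
        printf("joint %d: Y-axis . bone-direction = %f\n", (int)bone.endJoint, dot);
    }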

    Monday, June 17, 2013 8:21 PM
  • Hi Carmine, and thanks for your answer!

    Basically, I prefer using the camera coordinate system (cs), as my final application will do so. I rewrote part of the code (using the start joint position instead of the endJoint position) and obtained the result I expected, that is, as you pointed out, the Y axis oriented along the bone direction (cf. below, where the "hips center" cs is displayed in RGB for X, Y, Z)... argh, as a new user I cannot display images. Anyway, the result is correct.

    The thing is that some cs are oriented the way they should be (for example, the hips center cs is oriented such that the Z axis points to the front, whatever the pose of the skeleton) and some are not. For example, the forearm cs are not physically meaningful: the Y axis is indeed oriented along the bone direction, but the X and Z axes are not precisely defined (for example, a torsion of the forearm does not produce any difference in the X and Z axis coordinates; cf. the Avateering demo). Anyway, I did not find any documentation about these axes.

    Do you know where I could find any info about this specific point, and whether the X and Z axes can be correctly tracked (hope this question makes sense :) )?

    Thank you

    Regards

    Tuesday, June 18, 2013 7:54 AM
  • I am not exactly sure what you mean. Are you making assumptions about projection and left/right-handed coordinates? Kinect uses a right-handed coordinate system. You might also be dealing with some joints that are not actually tracked (JointTrackingState).
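
    For example (a sketch reusing the skeleton and boneOrientations variables from your snippet), you can skip any joint whose position is only inferred before trusting its orientation:

    // Only use orientations whose end joint is actually tracked.
    for (int u = 0; u < NUI_SKELETON_POSITION_COUNT; ++u)
    {
        NUI_SKELETON_POSITION_INDEX j = boneOrientations[u].endJoint;
        if (skeleton.eSkeletonPositionTrackingState[j] != NUI_SKELETON_POSITION_TRACKED)
            continue;   // NOT_TRACKED or INFERRED joints give less reliable orientations

        // ... draw / log the coordinate system for joint j here ...
    }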

    Tuesday, June 18, 2013 4:42 PM
  • Let's assume I take a T-pose, palms towards the ground. The Kinect skeleton, whether it is tracked or inferred, will provide an orientation of the left forearm (for example), with the Y axis oriented from the elbow to the wrist, and some X and Z axes.

    If I modify my pose and orient my palms towards the sky (meaning that I'm doing a kind of twist of my forearm), this should not impact the Y axis, as neither my elbow nor my wrist has moved, but does it impact the X and Z axes?

    When tested on the XNA Avateering demo, this forearm twist does not produce any change on the avatar, which suggests that the X and Z axes are not correctly tracked (if they are tracked at all).
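
    To make the test concrete, I plan to log the X axis of the left wrist orientation every frame (a sketch reusing the variables from my first post) and watch whether the values change while I twist my forearm:

    // If the palm-down / palm-up twist never changes these numbers,
    // the X and Z axes are not tracking the forearm torsion.
    const Matrix4 & wrist =
        boneOrientations[NUI_SKELETON_POSITION_WRIST_LEFT].absoluteRotation.rotationMatrix;
    printf("left wrist X axis: %f  %f  %f\n", wrist.M11, wrist.M12, wrist.M13);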

    Does my question make sense? And do you know if I can find any info about this?

    Thanks for reading anyway

    Laurent

    Wednesday, June 19, 2013 7:56 AM
  • Determining that type of orientation would require additional analysis of the hand region. We don't have any kind of hand or finger detection in the Kinect for Windows SDK. The bone rotations only determine the direction from the previous joint, in order to generate the skeletal structure.

    • Marked as answer by Access75010 Thursday, June 20, 2013 7:42 AM
    Wednesday, June 19, 2013 6:02 PM
  • Thank you for this answer.

    Actually, this limb-orientation tracking is part of the project I'm currently working on. I was hoping the Kinect SDK could do that part for me :)

    Regards

    Laurent

    Thursday, June 20, 2013 7:45 AM