Applying rotation from Kinect v2 to Blender

  • Question

  • Hello,

    I'm doing real-time motion capture with Kinect v2 and Blender. I managed to send rotation data from my Kinect application to Blender, but my 3D model in Blender gets totally wrecked. I get the rotation data like this:

    Vector4 vec;
    vec = body.JointOrientations[joint.Key].Orientation;
    Quaternion qOrientation = new Quaternion(vec.W, vec.X, vec.Y, vec.Z);

    It seems I must do something with this rotation data before applying it to the bones.
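
    The only preprocessing I can think of so far is a basic sanity check, roughly like the sketch below (it uses the same Quaternion type as above; whether Kinect really reports an all-zero orientation for some joints is an assumption I'd still have to verify):

    // Normalize the raw orientation and guard against an all-zero value,
    // since normalizing zeros would push NaNs into Blender.
    Vector4 vec = body.JointOrientations[joint.Key].Orientation;
    float len = (float)Math.Sqrt(vec.X * vec.X + vec.Y * vec.Y + vec.Z * vec.Z + vec.W * vec.W);
    Quaternion qOrientation = (len < 1e-6f)
        ? new Quaternion(1f, 0f, 0f, 0f)                                  // identity (w first) as a fallback
        : new Quaternion(vec.W / len, vec.X / len, vec.Y / len, vec.Z / len);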

    Any idea?

    Thanks!



    • Edited by stabak Saturday, July 4, 2015 3:12 PM
    Saturday, July 4, 2015 1:45 PM

All replies

  • Not sure how Blender handles quaternions, but they usually are in x,y,z,w order instead of w first.


    Brekel

    Saturday, July 4, 2015 9:36 PM
    Moderator
  • Actually it is w,x,y,z. (http://blender.stackexchange.com/questions/3114/what-is-the-parameter-order-of-a-quaternion-constructor)
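
    Just to make the ordering concrete, this is roughly how I pack the components before sending (a small sketch; the Blender side then feeds them straight into mathutils.Quaternion):

    // Kinect's Vector4 exposes X, Y, Z, W, while Blender's
    // mathutils.Quaternion((w, x, y, z)) wants W first:
    Vector4 vec = body.JointOrientations[joint.Key].Orientation;
    float[] wxyz = { vec.W, vec.X, vec.Y, vec.Z };
    // An x,y,z,w-style library would expect { vec.X, vec.Y, vec.Z, vec.W } instead.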

    I also added:

    JointType parentJoint = KinectHelpers.GetParentJoint(joint.Key);
    Vector4 vecParent = body.JointOrientations[parentJoint].Orientation;
    Quaternion qOrientationParent = new Quaternion(vecParent.W, vecParent.X, vecParent.Y, vecParent.Z);
    Quaternion qSend = (qOrientationParent.Conj / qOrientationParent.Norm) * qOrientation;

    And now I'm sending qSend.W, qSend.X, qSend.Y, qSend.Z.
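
    For reference, this is what I understand that line to be computing, written against System.Numerics types just to make the math explicit (note that constructor takes x, y, z, w, and the sketch assumes the SDK hands back unit quaternions, in which case the inverse is simply the conjugate):

    using System.Numerics;

    // Relative (parent-local) rotation of a joint.
    static Quaternion RelativeRotation(Vector4 parent, Vector4 child)
    {
        var qParent = new Quaternion(parent.X, parent.Y, parent.Z, parent.W);
        var qChild = new Quaternion(child.X, child.Y, child.Z, child.W);

        // For a unit quaternion the inverse equals the conjugate; Inverse() also
        // covers the non-unit case (conjugate divided by the squared norm).
        return Quaternion.Inverse(qParent) * qChild;   // child expressed in the parent's frame
    }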

    I hope that's what was meant here: https://social.msdn.microsoft.com/Forums/en-US/3f9e03b4-2670-41b5-9a91-2b72c77fe843/using-kinect-v2-jointorientations-along-with-threejs-skinnedmesh?forum=kinectv2sdk

    However, it still doesn't work. I see some bugs in the Blender Python code; I'll try to fix them.

    Sunday, July 5, 2015 7:57 AM
  • OK, sorry, as mentioned I'm a Blender noob :)

    I don't want to spam here, but another way could be to connect to the live stream of my apps: http://brekel.com

    It will give you added quality on top of the standard SDK, and it should be as easy as picking up a TCP or UDP packet and getting the position/rotation vectors out of it.
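
    On the receiving end it is roughly this (a bare-bones UDP sketch; the port number is a placeholder and the actual packet layout is described in the apps' documentation):

    using System.Net;
    using System.Net.Sockets;

    // Minimal UDP listener; 9999 is a placeholder port.
    using (var udp = new UdpClient(9999))
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] packet = udp.Receive(ref remote);   // blocks until a packet arrives
            // Parse the position/rotation values out of 'packet' per the documented layout.
        }
    }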

    If you're interested, just drop me a message and I'll set you up: http://brekel.com/contact


    Brekel

    Sunday, July 5, 2015 10:13 AM
    Moderator
  • Sorry, but it needs to be done this way: Kinect -> Blender.

    Sunday, July 5, 2015 10:20 AM
  • I managed to make it work using NI mate. I'm still puzzled by this.

    I did:

    public OscMessage BuildJointMessage(Body body, KeyValuePair<JointType, Joint> joint)
    {
        // OSC address for this joint, e.g. "/SpineBase"
        var address = String.Format("/{0}", joint.Key);
        var position = joint.Value.Position;

        // Absolute (camera-space) orientation of this joint
        Vector4 vec = body.JointOrientations[joint.Key].Orientation;
        Quaternion qOrientation = new Quaternion(vec.W, vec.X, vec.Y, vec.Z);

        // Orientation of the parent joint
        JointType parentJoint = KinectHelpers.GetParentJoint(joint.Key);
        Vector4 vecParentOrientation = body.JointOrientations[parentJoint].Orientation;
        Quaternion qOrientationParent = new Quaternion(vecParentOrientation.W, vecParentOrientation.X, vecParentOrientation.Y, vecParentOrientation.Z);

        // Rotation of this joint relative to its parent (parent inverse * joint)
        Quaternion qSend = ((1f / qOrientationParent.Norm) * qOrientationParent.Conj) * qOrientation;

        // Send the position plus the relative rotation in w, x, y, z order
        return new OscMessage(address, position.X, position.Y, position.Z, qSend.W, qSend.X, qSend.Y, qSend.Z);
    }

    And I get rotations in Blender, but they don't follow my actual movements.

    I noticed that by moving my hips forwards and backwards, my knees move together and apart.

    I use the NI mate add-on in Blender.
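
    One thing I still want to rule out is the difference between the Kinect camera frame and Blender's Z-up frame. Below is a sketch of re-expressing an orientation in a different basis (again System.Numerics; the actual change-of-basis rotation, here guessed as -90 degrees about X, is something I'd have to experiment with):

    using System;
    using System.Numerics;

    // Re-express a rotation q, given in the Kinect camera frame, in another frame:
    // q' = c * q * c^-1, where c rotates the Kinect frame into the target frame.
    static Quaternion ChangeBasis(Quaternion q, Quaternion c)
    {
        return c * q * Quaternion.Inverse(c);
    }

    // Guessed basis change from Y-up (Kinect) to Z-up (Blender); the axis/sign may well need adjusting.
    Quaternion kinectToBlender = Quaternion.CreateFromAxisAngle(Vector3.UnitX, (float)(-Math.PI / 2.0));

    // Example: a joint rotation of +90 degrees about the Kinect Y axis, re-expressed for Blender.
    Quaternion qKinect = Quaternion.CreateFromAxisAngle(Vector3.UnitY, (float)(Math.PI / 2.0));
    Quaternion qBlender = ChangeBasis(qKinect, kinectToBlender);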

    Thursday, August 13, 2015 1:53 PM