Grip Recognition + skeleton hand tracking => stabilization becomes difficult

  • Question

  • Good afternoon everyone,

    I'm currently facing something quite strange.

    Technologies: Kinect 1 & C++ (Qt bundle + a 3D human anatomical model)

    I am using my right hand to control the cursor position in the application and to grab objects (mainly to press buttons, and to grab organs on the model so they can be dragged and dropped into a bin).

    The problem I'm facing is that when I grab, the pointer jumps ~50 pixels toward the bottom right of the screen. Since the organs are quite small, I often end up clicking just off the desired organ, grabbing the lungs instead of the trachea, for example.

    I thought it came from my implementation, but testing the skeleton tracking example in the Kinect example browser leads to the same results. When I make the grabbing movement, the skeleton's right arm even shrinks (by around 10-20% in length).

    I have found that if the Kinect is placed on the ground and tilted up at a 25-27° angle, this works better than having it right in front of me (even though, according to the documentation, the Kinect SHOULD be right in front of the player...).

    Also, for hand movement detection, we haven't found anything usable other than skeleton tracking, because of the 3D part of the project (provided by external developers); and using the skeleton to detect hand movements seems reasonable to me.

    Do you have any idea how to fix this? Maybe smoothing parameters, so the skeleton doesn't get distorted when the player performs a grip?


    Wednesday, September 10, 2014 12:30 PM

All replies

  • Depending on the orientation of the user's hand, arm, and shoulder, you can be casting a shadow that occludes joints. When that happens, those joints become inferred, so you should check the tracking state of each joint to take that into account.
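
    For instance, a minimal sketch of that check against the NUI skeleton stream (updateCursor is a placeholder for your own cursor code, and the frame/skeleton indexing is assumed from your existing loop):

        // Only trust the right-hand joint when it is fully tracked;
        // fall back to the wrist when the hand is merely inferred.
        const NUI_SKELETON_DATA& skel = skeletonFrame.SkeletonData[i];
        if (skel.eTrackingState == NUI_SKELETON_TRACKED)
        {
            NUI_SKELETON_POSITION_TRACKING_STATE handState =
                skel.eSkeletonPositionTrackingState[NUI_SKELETON_POSITION_HAND_RIGHT];

            if (handState == NUI_SKELETON_POSITION_TRACKED)
            {
                updateCursor(skel.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT]);
            }
            else if (handState == NUI_SKELETON_POSITION_INFERRED)
            {
                // Inferred joints jump around during a grip; either ignore
                // them or substitute a steadier joint such as the wrist.
                updateCursor(skel.SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT]);
            }
            // NUI_SKELETON_POSITION_NOT_TRACKED: keep the previous cursor.
        }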

    Are you using the interaction APIs?

    Joint smoothing parameters will improve joint stability somewhat, within limits. Placement of the sensor is key: ensure it is on a level surface parallel to the floor. Changing the elevation angle slightly downward, so the sensor views you at a slight angle, will stabilize the tracking a bit better.

    See Joint Smoothing whitepaper http://msdn.microsoft.com/en-us/library/jj131429.aspx

    http://msdn.microsoft.com/en-us/library/jj131024.aspx
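
    As a rough illustration (not gospel values), applying the smoothing filter is a one-call change per skeleton frame; the parameters below are deliberately heavier than the defaults and would need tuning for your cursor use case:

        // Heavier-than-default smoothing; see the whitepaper above for tuning.
        // More smoothing / a larger jitter radius = steadier cursor, more latency.
        const NUI_TRANSFORM_SMOOTH_PARAMETERS smoothParams =
        {
            0.7f,   // fSmoothing          [0..1], default 0.5
            0.3f,   // fCorrection         [0..1], default 0.5
            0.4f,   // fPrediction         frames predicted ahead, default 0.5
            0.10f,  // fJitterRadius       in meters, default 0.05
            0.10f   // fMaxDeviationRadius in meters, default 0.04
        };

        NUI_SKELETON_FRAME skeletonFrame = {0};
        if (SUCCEEDED(pSensor->NuiSkeletonGetNextFrame(0, &skeletonFrame)))
        {
            // Filter the raw joints in place before reading any positions.
            pSensor->NuiTransformSmooth(&skeletonFrame, &smoothParams);
            // ... use skeletonFrame.SkeletonData as usual ...
        }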


    Carmine Sirignano - MSFT

    Thursday, September 11, 2014 12:01 AM
  • Hello,

    Thanks for the answer, much appreciated.

    I am using the NUI library to fetch the skeleton joints and the grip/grip-release events, then I calculate a cursor position from the right hand joint position, so I suppose I am not using the interaction APIs (sorry if I sound inexperienced with this technology; I've only had this project for a week and wasn't there when the specification/modeling work was done).
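
    Conceptually the mapping is just a scale from skeleton space to pixels, something like this simplified sketch (the box dimensions and screen size here are placeholders, not our real values):

        // Map the right-hand joint (skeleton space, meters) to screen pixels
        // by stretching a small physical "interaction box" over the screen.
        POINT handToCursor(const NUI_SKELETON_DATA& skel)
        {
            const Vector4 hand = skel.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];

            // Placeholder box: x in [-0.4, +0.4] m, y in [0.0, 0.6] m.
            const float left = -0.4f, right = 0.4f, bottom = 0.0f, top = 0.6f;
            const int screenW = 1920, screenH = 1080;

            float nx = (hand.x - left) / (right - left);  // 0..1 across the box
            float ny = (top - hand.y)  / (top - bottom);  // 0..1, y inverted

            POINT p;
            p.x = (LONG)(max(0.0f, min(1.0f, nx)) * screenW);
            p.y = (LONG)(max(0.0f, min(1.0f, ny)) * screenH);
            return p;
        }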

    In the software, the tracking state was taken into consideration: only tracked joints were used. I tried using both tracked and inferred positions just for testing purposes, and that didn't change much (neither better nor worse).

    The really strange thing is that arm movement. When I grip in the green-skeleton example (from the Kinect browser), if my arm points directly at the Kinect (like this: __ k), the arm size shows no problem; but when my arm is at an angle to the Kinect (like this: /   k), the arm can shrink by ~20% in length during the grip, which moves my final hand pointer quite a lot on the screen...
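
    One mitigation I am considering is simply freezing the cursor for a few frames when a grip starts, so the arm "shrink" cannot drag the pointer off target; a sketch (gripJustStarted would come from our existing grip/grip-release events):

        // Hold the cursor briefly while the hand closes; tune the frame count.
        POINT stabilizedCursor(POINT raw, bool gripJustStarted)
        {
            static POINT lastCursor = {0, 0};
            static int   freezeFrames = 0;

            if (gripJustStarted)
                freezeFrames = 10;        // ~1/3 s at 30 fps

            if (freezeFrames > 0)
            {
                --freezeFrames;
                return lastCursor;        // ignore the grip-induced jump
            }

            lastCursor = raw;
            return raw;
        }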

    Should I just tell my superiors that users need to stand at a certain position/angle to get a better experience, or is it really possible to reduce this "noise"? Have you ever faced this sort of problem?


    Thursday, September 11, 2014 7:54 AM