Kinect Tracking Person Holding Objects (Connected Components)

  • Question

  • I've been using the Kinect for a research project, and I've run into an issue where the skeletal tracking algorithm performs poorly when the test subject is holding an object. For instance, when the subject being tracked is holding a ball, the IR depth stream in the skeletal viewer demo colors the ball as part of the human body, and this appears to cause the skeletal viewer to treat the ball as a body part. This distorts the skeleton disproportionately - even when the ball is raised above the head and the only part of the body it obscures is one hand, the tracked arm joints appear to twist downwards.

    Is there any way we can use the Kinect SDK 1.5 to recognize these connected components as external to the human body, and then either remove them from the point cloud or filter them out before they are processed by the skeletal algorithm? It would greatly help to be able to recognize and filter out objects of uniform dimensions - plates, boards, spheres, and the like. (A small diagnostic sketch is included after this question.)

    Thanks in advance for all of your help.

    Tuesday, May 29, 2012 8:37 AM
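
    For reference, here is a minimal diagnostic sketch of the behaviour described above. It is plain C++ and not tied to any particular SDK call; the only assumption is the documented packing of the SDK 1.x depth stream when the player index is enabled (low 3 bits = player index, remaining bits = depth in millimetres). Counting the pixels labeled as belonging to a player makes the problem visible: when a held ball is absorbed into the body mask, the count jumps by roughly the ball's pixel area the moment the object touches the hand.

        // Plain C++ sketch: count depth pixels the runtime attributes to a tracked player.
        // Assumption: 16-bit packed depth pixels as documented for the SDK 1.x depth
        // stream with player index enabled:
        //   bits 0-2  : player index (0 = no player, 1-6 = tracked players)
        //   bits 3-15 : depth in millimetres
        #include <cstddef>
        #include <cstdint>
        #include <cstdio>
        #include <vector>

        static const uint16_t kPlayerIndexMask  = 0x0007; // low 3 bits
        static const int      kPlayerIndexShift = 3;      // depth occupies the rest

        inline int PlayerIndexOf(uint16_t packed) { return packed & kPlayerIndexMask; }
        inline int DepthMmOf(uint16_t packed)     { return packed >> kPlayerIndexShift; }

        // Number of pixels attributed to any player in one frame.
        std::size_t CountPlayerPixels(const uint16_t* frame, std::size_t pixelCount)
        {
            std::size_t n = 0;
            for (std::size_t i = 0; i < pixelCount; ++i)
                if (PlayerIndexOf(frame[i]) != 0)
                    ++n;
            return n;
        }

        int main()
        {
            // Stand-in buffer; in a real app this would be filled from the
            // sensor's 640x480 depth + player index stream each frame.
            std::vector<uint16_t> frame(640 * 480, 0);
            std::printf("player-labeled pixels: %zu\n",
                        CountPlayerPixels(frame.data(), frame.size()));
            return 0;
        }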

Answers

  • Sorry, there is no way for you to modify the body mask that our skeletal tracking system determines. We appreciate you raising the issue.

    I'm not aware of a great workaround right now... there is no way to change the input depth data fed into the skeletal tracking system.

    Thanks, Rob Relyea
    Kinect for Windows team


    PS. [Please let's keep this discussion focused on the problem at hand, rather than suggesting that people add it to a list of future requests. We read most of the posts on the forum; it doesn't need to be collected in one thread.]

    • Edited by Rob Relyea [MSFT] Wednesday, May 30, 2012 4:58 PM adding ps
    • Proposed as answer by The Thinker Wednesday, May 30, 2012 7:48 PM
    • Marked as answer by evaninja Wednesday, May 30, 2012 7:52 PM
    Wednesday, May 30, 2012 4:57 PM

All replies

  • I have had this problem too. Try waving your hand very close in front of the Kinect camera, or step out of the Kinect's view, to reset the tracking. It should start tracking you correctly again.

    It has happened to me when debugging on a machine with 2 GB of RAM and a 3.0 GHz single-core CPU; the Kinect sometimes detected the wall as a skeleton.


    "Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Sherlock Holmes. "Speak softly and carry a big stick." - Theodore Roosevelt. "Fear leads to anger, anger leads to hate, hate leads to suffering." - Yoda. Blog - http://jefferycarlsonblog.blogspot.com/


    • Edited by The Thinker Tuesday, May 29, 2012 12:36 PM
    • Proposed as answer by The Thinker Wednesday, May 30, 2012 8:27 PM
    • Unproposed as answer by The Thinker Wednesday, May 30, 2012 8:30 PM
    Tuesday, May 29, 2012 12:31 PM
  • Thanks for the reply, The Thinker - that makes sense as a debugging approach, although I haven't gotten it to work out precisely yet. The issue here is that I'd really like to filter out all objects touching the subject, rather than correct for one-off errors where the Kinect mistakes a common object for a skeleton (he/she will be holding at least one item throughout the entire experiment). The color of the depth stream indicates that the object is recognized as part of the skeleton the moment it touches the subject's hand, even after resetting the Kinect by waving a hand in front of the sensor. I've thought of two potential solutions to this problem, and I'm curious to see if anybody else has been trying these approaches, or found something that works better:

    1.) Post-processing the point cloud returned in the depth stream: intuitively, one would think that if the object could be recognized in the depth stream and then removed, the skeletal algorithm could be run on the modified point cloud to return a more accurate representation of the skeleton. Does anybody know if the skeletal algorithm can be run on modified depth stream data via post-processing, or can it only be used at runtime in the SDK? (A rough sketch of this idea follows this list.)

    2.) Forcing the IR rays to be fully absorbed by the object. This would require coating the object with lamp black, which can get expensive and isn't an option I'm too fond of :)
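
    A rough sketch of idea 1.), as a filter applied to the application's own copy of the frame. The hand position in depth-image coordinates and the radius are assumed to be supplied by the caller (for example, from the skeleton's hand joint mapped into depth space), and the pixel packing is the same documented layout as in the sketch above. As the accepted answer notes, the skeletal tracker still reads the raw stream, so a filtered copy like this only helps your own processing or rendering.

        // Rough sketch (assumed helper, not an SDK call): clear player-labeled
        // pixels inside a sphere around the hand, where the held object sits.
        // handX/handY/handDepthMm are assumed to come from the application, e.g.
        // by mapping the skeleton's hand joint into depth-image coordinates.
        #include <cstdint>
        #include <cmath>
        #include <vector>

        static const uint16_t kPlayerIndexMask  = 0x0007; // low 3 bits = player index
        static const int      kPlayerIndexShift = 3;      // remaining bits = depth (mm)

        void RemoveObjectNearHand(std::vector<uint16_t>& frame,
                                  int width, int height,
                                  int handX, int handY, int handDepthMm,
                                  int radiusMm)
        {
            // Very crude pixel-to-millimetre conversion (~3 mm per pixel at ~2 m);
            // a real implementation would use the SDK's coordinate mapping instead.
            const double mmPerPixel = 3.0;

            for (int y = 0; y < height; ++y) {
                for (int x = 0; x < width; ++x) {
                    uint16_t& px = frame[y * width + x];
                    if ((px & kPlayerIndexMask) == 0)
                        continue; // not part of the body mask, leave it alone

                    int depthMm = px >> kPlayerIndexShift;
                    double dx = (x - handX) * mmPerPixel;
                    double dy = (y - handY) * mmPerPixel;
                    double dz = static_cast<double>(depthMm - handDepthMm);
                    if (std::sqrt(dx * dx + dy * dy + dz * dz) < radiusMm)
                        px = 0; // carve the held object (and the hand) out of the copy
                }
            }
        }

    Calling this on a copy of each depth frame, with the radius tuned roughly to the size of the held object, removes the object (and the hand itself) from the point cloud used for your own analysis; it does not change what the built-in skeletal tracker sees.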

    Wednesday, May 30, 2012 12:27 AM
  • I was talking about the correct subject; I was speaking from my own experience of the Kinect acting up on a machine with less memory and a slower processor.

    P.S. It wasn't off-topic enough to mark as a bad post, and I wasn't making up the part about moving out of view to stop or reset the tracking. That method works every time for me. I show up colored blue as the active player, the coloring starts bleeding into the wall for some reason, and it happens frequently; when I move out of view it stops. The depth view can sometimes become distorted when using the Kinect SDK.

    Correction to the above for the OP: the trick is meant for debugging, and for when the software acts up on slower Windows 7 machines, to help correct skeleton tracking - but you can do other things with it too.


    • Edited by The Thinker Wednesday, May 30, 2012 9:22 PM
    Wednesday, May 30, 2012 8:21 PM
  • Who marked my only post above as abusive? (I deleted the part that wasn't relevant at all, per Rob's request to stay on track.) I've done that before and it was on subject, so why mark it abusive?


    Wednesday, May 30, 2012 9:24 PM