Skeleton Tracking with 2 Kinects

  • Question

  • Hi,

    I want to use two Kinects to perform skeleton tracking on the same person (the Kinects will be separated by 90 degrees or more from each other). The goal of my project is to detect people's gestures even when the person is not facing the Kinect, and since detection is not very accurate from a side view, I wanted to "merge" the results provided by the 2 Kinects into a "virtual body" for gesture detection. My idea was to project each Kinect's points into a common plane or space and calculate the angles between a set of those projected points; it would be a sort of stereo vision, I think.

    I read in previous questions on the forum that it is not possible to perform skeleton tracking with two Kinects (this was in June '11); has this been fixed by now? Also, some people thought the problem arose because both Kinects were trying to detect the same person, so is there a way to tell the Kinects that each skeleton belongs to a different person (even though this is not true in reality)?

    Sorry for my English and thanks in advance!

    Wednesday, February 22, 2012 9:00 AM

All replies

  • What you will need to do is to have two processes running, each connected to a different Kinect, and use inter-process communication to get both skeleton streams to the same place.
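
    A minimal sketch of that two-process idea, assuming the Kinect for Windows SDK 1.0 C# API and plain UDP for the inter-process link (the sensor-index/port command-line arguments and the text wire format are just illustrative choices here, not anything the SDK prescribes):

        // Sender process for ONE Kinect (Kinect for Windows SDK 1.0).
        // Run two copies, each with a different sensor index and UDP port;
        // a third process listens on both ports and merges the streams.
        using System;
        using System.Globalization;
        using System.Net.Sockets;
        using System.Text;
        using Microsoft.Kinect;

        class SkeletonSender
        {
            static UdpClient udp = new UdpClient();
            static int port;

            static void Main(string[] args)
            {
                int sensorIndex = int.Parse(args[0]);   // 0 or 1
                port = int.Parse(args[1]);              // e.g. 9000 or 9001

                KinectSensor sensor = KinectSensor.KinectSensors[sensorIndex];
                sensor.SkeletonStream.Enable();
                sensor.SkeletonFrameReady += OnSkeletonFrameReady;
                sensor.Start();

                Console.WriteLine("Streaming sensor {0} to UDP port {1}, press Enter to stop.",
                    sensorIndex, port);
                Console.ReadLine();
                sensor.Stop();
            }

            static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
            {
                using (SkeletonFrame frame = e.OpenSkeletonFrame())
                {
                    if (frame == null) return;

                    Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
                    frame.CopySkeletonDataTo(skeletons);

                    foreach (Skeleton s in skeletons)
                    {
                        if (s.TrackingState != SkeletonTrackingState.Tracked) continue;

                        // One text line per tracked skeleton:
                        // trackingId;jointType,x,y,z;jointType,x,y,z;...
                        var sb = new StringBuilder();
                        sb.Append(s.TrackingId);
                        foreach (Joint j in s.Joints)
                        {
                            sb.AppendFormat(CultureInfo.InvariantCulture, ";{0},{1},{2},{3}",
                                j.JointType, j.Position.X, j.Position.Y, j.Position.Z);
                        }
                        byte[] payload = Encoding.UTF8.GetBytes(sb.ToString());
                        udp.Send(payload, payload.Length, "127.0.0.1", port);
                    }
                }
            }
        }

    The receiving process would bind a UdpClient to each port, parse the lines, and combine the most recent skeleton received from each sensor.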
    Wednesday, February 22, 2012 6:06 PM
    Thanks for your answer, tcqwerty. That's a difficult way of solving the problem, but it might be the only one. I was thinking of giving OpenNI a try with the Kinect because it seems to work fine in the videos, but I am not sure the result will be better than with the Kinect SDK...
    Thursday, February 23, 2012 9:12 AM
  • Most people on those forums prefer C++/C#, though in the Google group I joined that uses OpenNI it is mostly C++. Try the Kinect SDK, as its ease of use is good. Actually, it wouldn't be that hard to do two Kinects with skeleton data. One idea is to save the skeleton data from both Kinects to a file, e.g. how many people are being actively and passively tracked plus the skeleton joint data; I've seen that done in many Kinect projects (a rough sketch of that logging step is below).
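
    A small sketch of the log-to-file idea, assuming you already have the Skeleton[] array from CopySkeletonDataTo in your frame handler (the file name and CSV layout are arbitrary choices for illustration):

        using System;
        using System.IO;
        using System.Linq;
        using Microsoft.Kinect;

        static class SkeletonLogger
        {
            // Append one CSV row per skeleton frame: timestamp, actively tracked
            // count (full joint data), passively tracked count (position only).
            public static void LogFrame(Skeleton[] skeletons, string path)
            {
                int active  = skeletons.Count(s => s.TrackingState == SkeletonTrackingState.Tracked);
                int passive = skeletons.Count(s => s.TrackingState == SkeletonTrackingState.PositionOnly);

                File.AppendAllText(path, string.Format("{0:o},{1},{2}{3}",
                    DateTime.UtcNow, active, passive, Environment.NewLine));
            }
        }

    You would call this from each sensor's SkeletonFrameReady handler, writing to a different file per Kinect, and add joint positions to the row if you want the full data.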

    "Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Sherlock Holmes. "Speak softly and carry a big stick." - Theodore Roosevelt. "Fear leads to anger, anger leads to hate, hate leads to suffering." - Yoda

    Thursday, February 23, 2012 4:31 PM
  • As was said, the most difficult thing will be merging the data from the two processes.

    Suppose you have figured out a way to handle the threads; all you then have to do is calculate the average joint positions of each skeleton, relative to the Skeleton.Position (important: you need to rotate one of them by the angle between the Kinects, ±90°). You can't have a common point of origin, so you can only track gestures relative to the skeleton itself. A rough sketch of that merge step is below.
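
    A minimal sketch of the merge step, assuming the second Kinect is rotated by a known angle (e.g. 90°) about the vertical axis relative to the first; the choice of Y as the "up" axis and the simple averaging are assumptions on my part:

        using System;
        using Microsoft.Kinect;

        static class SkeletonMerge
        {
            // Express a joint relative to the skeleton's centre (Skeleton.Position),
            // so the two sensors need a common orientation but no common origin.
            public static SkeletonPoint Relative(SkeletonPoint joint, SkeletonPoint center)
            {
                return new SkeletonPoint
                {
                    X = joint.X - center.X,
                    Y = joint.Y - center.Y,
                    Z = joint.Z - center.Z
                };
            }

            // Rotate a (relative) point about the vertical Y axis by 'degrees'.
            public static SkeletonPoint RotateAboutY(SkeletonPoint p, double degrees)
            {
                double a = degrees * Math.PI / 180.0;
                return new SkeletonPoint
                {
                    X = (float)(p.X * Math.Cos(a) + p.Z * Math.Sin(a)),
                    Y = p.Y,
                    Z = (float)(-p.X * Math.Sin(a) + p.Z * Math.Cos(a))
                };
            }

            // Average a joint seen by Kinect A with the same joint from Kinect B,
            // after bringing B into A's orientation.
            public static SkeletonPoint Merge(SkeletonPoint fromA, SkeletonPoint fromB, double angleBtoA)
            {
                SkeletonPoint rotated = RotateAboutY(fromB, angleBtoA);
                return new SkeletonPoint
                {
                    X = (fromA.X + rotated.X) / 2f,
                    Y = (fromA.Y + rotated.Y) / 2f,
                    Z = (fromA.Z + rotated.Z) / 2f
                };
            }
        }

    For each joint you would call Relative(joint.Position, skeleton.Position) on both skeletons and then Merge the pair; the result is a body-relative "virtual" joint you can use to compute gesture angles.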


    Life is unsure - always eat the dessert first!

    Thursday, February 23, 2012 11:44 PM
  • Hi~!

    Did you get it working?

    Saturday, April 21, 2012 10:38 AM