Custom Gesture Segmentation Kinect

  • Question

  • I need to split a gesture into 3 phases: preparatory, execution, and final. For every phase, I want to calculate the angle between 2 joints and test whether the angle is within some range. Any idea how to do this without using support vector machines or hidden Markov models?

    My idea is to have a 'start recording' button and a 'stop recording' button. At every frame, we insert the joint angle value into a list. When the recording is done, we iterate through that list and check whether there are values within a certain threshold of the optimum joint angle. If we find the exact optimum angle value, the score for that phase is the maximum; if not, we add just half of the maximum score, or develop some other scoring logic (a rough sketch of this follows below).

    I'd like to know what your approach would be. Kind regards.
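
    A minimal sketch of this record-then-score idea, assuming joint positions are already being read per frame from the Kinect body data; the optimum angle, tolerance, and threshold values are placeholders, not part of the original post.

        // Record-then-score sketch: compute a joint angle per frame,
        // collect the angles per phase, then score each phase against
        // a placeholder optimum angle and threshold.
        using System;
        using System.Collections.Generic;
        using System.Linq;

        public struct Vector3
        {
            public float X, Y, Z;
            public Vector3(float x, float y, float z) { X = x; Y = y; Z = z; }
        }

        public static class GestureScorer
        {
            // Angle (in degrees) at joint B, formed by the segments B->A and B->C.
            public static double JointAngle(Vector3 a, Vector3 b, Vector3 c)
            {
                var ba = new Vector3(a.X - b.X, a.Y - b.Y, a.Z - b.Z);
                var bc = new Vector3(c.X - b.X, c.Y - b.Y, c.Z - b.Z);
                double dot = ba.X * bc.X + ba.Y * bc.Y + ba.Z * bc.Z;
                double lenBa = Math.Sqrt(ba.X * ba.X + ba.Y * ba.Y + ba.Z * ba.Z);
                double lenBc = Math.Sqrt(bc.X * bc.X + bc.Y * bc.Y + bc.Z * bc.Z);
                return Math.Acos(dot / (lenBa * lenBc)) * 180.0 / Math.PI;
            }

            // Score one phase: full marks if any recorded angle hits the optimum
            // (within a small tolerance), half marks if it only falls inside the
            // wider threshold band, zero otherwise.
            public static double ScorePhase(IList<double> recordedAngles,
                                            double optimumAngle,
                                            double exactTolerance = 2.0,
                                            double threshold = 15.0,
                                            double maxScore = 100.0)
            {
                double bestDeviation = recordedAngles
                    .Select(a => Math.Abs(a - optimumAngle))
                    .DefaultIfEmpty(double.MaxValue)
                    .Min();

                if (bestDeviation <= exactTolerance) return maxScore;
                if (bestDeviation <= threshold) return maxScore / 2.0;
                return 0.0;
            }
        }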

    Monday, July 6, 2015 12:52 PM

All replies

  • Probably the way you suggest would work. I tend to use tested tools to have the work done for me. For example...

    Assuming the body joints, use the Visual Gesture Builder (VGB), with both discrete gestures and continuous (progress) gestures. Create a discrete gesture for each phase boundary: Prep, Execution Start, Execution End, Final Start, and Final End. Then use a continuous gesture to track motion progress as a percentage. Once you visually see the angle you're looking for within the progressing gesture, use that continuous gesture event in your code to determine the actual angle at that point, and then add your threshold logic.
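
    A sketch of wiring up the VGB results described above, assuming a Kinect v2 sensor and a trained .gbd database; the gesture names, database path, and class name are placeholders, and the threshold logic is left to you.

        // Reads discrete (Detected/Confidence) and continuous (Progress)
        // gesture results from a VGB database for one tracked body.
        using System;
        using Microsoft.Kinect;
        using Microsoft.Kinect.VisualGestureBuilder;

        public class PhaseGestureDetector
        {
            private readonly VisualGestureBuilderFrameSource vgbSource;
            private readonly VisualGestureBuilderFrameReader vgbReader;

            public PhaseGestureDetector(KinectSensor sensor, ulong trackingId)
            {
                vgbSource = new VisualGestureBuilderFrameSource(sensor, trackingId);

                // Load the trained database containing the discrete phase
                // gestures and the continuous (progress) gesture.
                using (var db = new VisualGestureBuilderDatabase(@"Database\ArmGesture.gbd"))
                {
                    vgbSource.AddGestures(db.AvailableGestures);
                }

                vgbReader = vgbSource.OpenReader();
                vgbReader.FrameArrived += OnVgbFrameArrived;
            }

            private void OnVgbFrameArrived(object sender, VisualGestureBuilderFrameArrivedEventArgs e)
            {
                using (var frame = e.FrameReference.AcquireFrame())
                {
                    if (frame == null) return;

                    // Discrete results: one per phase gesture (e.g. "PrepPhase").
                    var discrete = frame.DiscreteGestureResults;
                    if (discrete != null)
                    {
                        foreach (var pair in discrete)
                        {
                            if (pair.Value.Detected)
                            {
                                Console.WriteLine("{0} detected, confidence {1:P0}",
                                    pair.Key.Name, pair.Value.Confidence);
                            }
                        }
                    }

                    // Continuous result: overall motion progress from 0.0 to 1.0.
                    var continuous = frame.ContinuousGestureResults;
                    if (continuous != null)
                    {
                        foreach (var pair in continuous)
                        {
                            float progress = pair.Value.Progress;
                            // At the progress value where the target pose occurs,
                            // read the current joint angle and apply your thresholds.
                        }
                    }
                }
            }
        }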


    Sr. Enterprise Architect | Trainer | Consultant | MCT | MCSD | MCPD | SharePoint TS | MS Virtual TS |Windows 8 App Store Developer | Linux Gentoo Geek | Raspberry Pi Owner | Micro .Net Developer | Kinect For Windows Device Developer |blog: http://dgoins.wordpress.com

    Monday, July 6, 2015 7:38 PM
  • Thanks for your reply, Dwight. If I'm using the VGB, I want to get a score of gesture accuracy, not just a 'Gesture Recognized Successfully' or 'Gesture Recognition Failed' message. I want to be able to tell what percentage of the gesture the user was able to reproduce. Is the VGB able to deliver that? Kind regards.
    Monday, July 13, 2015 8:04 AM
  • There are Detected and Confidence properties on the discrete gesture result that you can use to get the AdaBoost decision confidence as a percentage. You should be able to use this.
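
    A small hypothetical helper showing how those properties could be turned into the percentage asked about above (the class and method names are placeholders). Note that Confidence reflects the classifier's certainty that the gesture occurred, not a direct kinematic measure of how accurately the pose was reproduced.

        using Microsoft.Kinect.VisualGestureBuilder;

        public static class GestureScore
        {
            // Converts a DiscreteGestureResult (as read in the earlier snippet)
            // into a 0-100 percentage; returns 0 when the gesture was not detected.
            public static float ToPercent(DiscreteGestureResult result)
            {
                return result != null && result.Detected ? result.Confidence * 100f : 0f;
            }
        }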

    Sr. Enterprise Architect | Trainer | Consultant | MCT | MCSD | MCPD | SharePoint TS | MS Virtual TS |Windows 8 App Store Developer | Linux Gentoo Geek | Raspberry Pi Owner | Micro .Net Developer | Kinect For Windows Device Developer |blog: http://dgoins.wordpress.com


    Tuesday, July 14, 2015 4:43 PM