Running Control Basics code

  • Question

  • Dear all,

    I am trying to run the Controls Basics sample code from SDK 2.0. When I run it, the sensor is detected properly and I can move my mouse cursor, but the hand gesture for selecting the icons is never detected. Can you let me know what the problem is?

    My machine has 2 GB of RAM, and I have no problem with the Depth, Color, and IR samples; I have also tried some blog examples from Channel 9 and MSDN. I held my hand in front of the Kinect sensor for 10 minutes hoping it would be detected.

    Please give me a solution.


    using App1.Common;
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using System.Runtime.InteropServices.WindowsRuntime;
    using Windows.ApplicationModel;
    using Windows.ApplicationModel.Activation;
    using Windows.Foundation;
    using Windows.Foundation.Collections;
    using Windows.UI.Xaml;
    using Windows.UI.Xaml.Controls;
    using Windows.UI.Xaml.Controls.Primitives;
    using Windows.UI.Xaml.Data;
    using Windows.UI.Xaml.Input;
    using Windows.UI.Xaml.Media;
    using Windows.UI.Xaml.Navigation;
    using Microsoft.Kinect;
    using Microsoft.Kinect.Xaml.Controls;

    // The Grid App template is documented at
    namespace App1
    {
        /// <summary>
        /// Provides application-specific behavior to supplement the default Application class.
        /// </summary>
        sealed partial class App : Application
        {
            /// <summary>
            /// Initializes the singleton Application object.  This is the first line of authored code
            /// executed, and as such is the logical equivalent of main() or WinMain().
            /// </summary>
            public App()
            {
                this.InitializeComponent();
                this.Suspending += OnSuspending;
            }

            /// <summary>
            /// Invoked when the application is launched normally by the end user.  Other entry points
            /// will be used such as when the application is launched to open a specific file.
            /// </summary>
            /// <param name="e">Details about the launch request and process.</param>
            protected override async void OnLaunched(LaunchActivatedEventArgs e)
            {
    #if DEBUG
                // Show graphics profiling information while debugging.
                if (System.Diagnostics.Debugger.IsAttached)
                {
                    // Display the current frame rate counters.
                    this.DebugSettings.EnableFrameRateCounter = true;
                }
    #endif
                Frame rootFrame = Window.Current.Content as Frame;

                // Do not repeat app initialization when the Window already has content,
                // just ensure that the window is active.
                if (rootFrame == null)
                {
                    // Create a Frame to act as the navigation context and navigate to the first page.
                    rootFrame = new Frame();

                    // Associate the frame with a SuspensionManager key.
                    SuspensionManager.RegisterFrame(rootFrame, "AppFrame");

                    // Set the default language.
                    rootFrame.Language = Windows.Globalization.ApplicationLanguages.Languages[0];
                    rootFrame.NavigationFailed += OnNavigationFailed;

                    if (e.PreviousExecutionState == ApplicationExecutionState.Terminated)
                    {
                        // Restore the saved session state only when appropriate.
                        try
                        {
                            await SuspensionManager.RestoreAsync();
                        }
                        catch (SuspensionManagerException)
                        {
                            // Something went wrong restoring state.
                            // Assume there is no state and continue.
                        }
                    }

                    // Wrap the frame in a KinectRegion so Kinect controls receive hand-pointer input.
                    KinectRegion kinectRegion = new KinectRegion();
                    KinectUserViewer kinectUserViewer = new KinectUserViewer()
                    {
                        HorizontalAlignment = HorizontalAlignment.Center,
                        VerticalAlignment = VerticalAlignment.Top,
                        Height = 100,
                        Width = 121,
                    };

                    // Place the frame in the current Window. Note that the KinectRegion and
                    // KinectUserViewer must actually be added to the grid's children, or the
                    // hand pointer will never appear.
                    Grid grid = new Grid();
                    grid.Children.Add(kinectRegion);
                    grid.Children.Add(kinectUserViewer);
                    kinectRegion.Content = rootFrame;
                    Window.Current.Content = grid;
                }

                if (rootFrame.Content == null)
                {
                    // When the navigation stack isn't restored, navigate to the first page,
                    // configuring the new page by passing required information as a navigation
                    // parameter.
                    rootFrame.Navigate(typeof(GroupedItemsPage), e.Arguments);
                }

                // Ensure the current window is active.
                Window.Current.Activate();
            }

            /// <summary>
            /// Invoked when navigation to a certain page fails.
            /// </summary>
            /// <param name="sender">The Frame which failed navigation.</param>
            /// <param name="e">Details about the navigation failure.</param>
            void OnNavigationFailed(object sender, NavigationFailedEventArgs e)
            {
                throw new Exception("Failed to load Page " + e.SourcePageType.FullName);
            }

            /// <summary>
            /// Invoked when application execution is being suspended.  Application state is saved
            /// without knowing whether the application will be terminated or resumed with the contents
            /// of memory still intact.
            /// </summary>
            /// <param name="sender">The source of the suspend request.</param>
            /// <param name="e">Details about the suspend request.</param>
            private async void OnSuspending(object sender, SuspendingEventArgs e)
            {
                var deferral = e.SuspendingOperation.GetDeferral();
                await SuspensionManager.SaveAsync();
                deferral.Complete();
            }
        }
    }


    Thursday, March 19, 2015 12:04 PM

All replies

  • Is there any update?


    Friday, March 20, 2015 4:28 AM
  • Engagement only takes a moment; I think there is no point in holding your hand up for ten minutes.

    Does the BodyBasics sample work on your system?

    Friday, March 20, 2015 8:39 AM
  • Yes, it does.

    The Kinect sensor responds faster for samples like:

    1. Audio Basics
    2. Color Basics
    3. Depth Basics
    4. Face Basics
    5. IR Basics

    It takes longer to capture frames in:

    1. Body Basics
    2. Controls Basics
    3. the Kinect Fusion sample code

    Let me know how I can resolve this issue.


    Friday, March 20, 2015 9:41 AM
  • Maybe you have a performance issue: you are 2 GB of RAM under the requirements, and the recommended processor for full functionality is an i7 while you have an i3 (although many people are running different apps on other processors, depending on what they are doing).

    If you can test on a more powerful machine, you can check whether this is a performance problem or whether you are doing something else wrong.

    Friday, March 20, 2015 12:59 PM
  • In addition to jmmroldan's suggestion about hardware, what gesture are you using to interact with the button? Have you reviewed the Human Interface Guidelines? Additionally, the video from the MVA course shows the correct engagement model and gestures to use.

    Does the Controls Basics sample work? What is your sensor setup?

    Carmine Sirignano - MSFT

    Monday, March 23, 2015 5:20 PM
  • I have upgraded to 4 GB of RAM and a new processor, but I am still facing the same problem. It displays the icons, but it cannot detect the hand pointer.

    It won't allow me to select anything with the hand pointer. The rest of the code works as shown in the videos; using the mouse pointer I am able to control the values.


    Friday, April 10, 2015 5:24 AM
  • As Carmine said, could you tell us what your sensor setup is? Things like the height from the floor, the direction it is pointing, the distance you are from the sensor when using it...

    Also, did you run the "Kinect Configuration Verifier" application? Maybe you have an issue with your USB 3.0 controller.
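    If you want to rule out connection problems from code first, a quick check is to subscribe to the sensor's availability event (a minimal sketch; the `Debug.WriteLine` output is just for illustration). If `IsAvailable` never turns true, the issue is the USB 3.0 controller, power, or drivers, not the Controls Basics code:

    ```csharp
    using Microsoft.Kinect;

    public class SensorCheck
    {
        private KinectSensor sensor;

        public void Start()
        {
            this.sensor = KinectSensor.GetDefault();

            // Open() returns immediately; actual availability is reported
            // asynchronously through this event.
            this.sensor.IsAvailableChanged += (s, args) =>
                System.Diagnostics.Debug.WriteLine(
                    args.IsAvailable ? "Kinect available" : "Kinect not available");

            this.sensor.Open();
        }
    }
    ```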

    Friday, April 10, 2015 7:50 AM
  • I have kept the sensor pointing toward me, at a distance of roughly 0.25-0.75 meters. It is just sitting on a table, about 0.75 meters above the ground.

    It is mentioned in the forum that even if the verifier reports an error about USB 3.0 compatibility with SDK v2, you can still use the sensor.

    I am not having problems with the other programs (skeleton, depth, sound, color, face), which use the same USB 3.0 port but have lower transfer rates.

    The desktop I am using is, specification-wise, well above the Kinect's basic requirements. It doesn't make sense to purchase an i7 processor for $1,124.


    Friday, April 10, 2015 8:45 AM
  • A couple of comments:

    - The recommended setting for interaction tracking is about 1.8 meters from the floor, pointing slightly downward so the sensor can see the floor (see this post, although I have used it at other heights). Besides, I would say you are too close to the sensor: I think depth sensing starts at about 0.5 meters, so you are right at the border of the supported range. Try moving a little farther away and see if that helps.

    - I think the USB 3.0 error can be ignored under some circumstances, but many people could not use the sensor without a proper USB 3.0 controller. I haven't paid particular attention to this because I never hit the problem, so I'm not sure when you can ignore the error and when you cannot.
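    One way to confirm whether you are inside the supported range is to log how far the sensor thinks the tracked body is (a rough sketch using the SDK 2.0 body stream; the choice of `SpineMid` as reference joint and the class name are mine). If no body ever reports as tracked, you are outside the tracking range:

    ```csharp
    using Microsoft.Kinect;

    // Rough sketch: log the distance (in meters) of each tracked body.
    public class DistanceLogger
    {
        private readonly KinectSensor sensor;
        private readonly BodyFrameReader reader;
        private Body[] bodies;

        public DistanceLogger()
        {
            this.sensor = KinectSensor.GetDefault();
            this.reader = this.sensor.BodyFrameSource.OpenReader();
            this.reader.FrameArrived += this.OnFrameArrived;
            this.sensor.Open();
        }

        private void OnFrameArrived(object sender, BodyFrameArrivedEventArgs e)
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null)
                {
                    return;
                }

                if (this.bodies == null)
                {
                    this.bodies = new Body[frame.BodyCount];
                }

                frame.GetAndRefreshBodyData(this.bodies);

                foreach (Body body in this.bodies)
                {
                    if (body.IsTracked)
                    {
                        // Z is the distance from the sensor along its view axis.
                        float z = body.Joints[JointType.SpineMid].Position.Z;
                        System.Diagnostics.Debug.WriteLine("Body at {0:F2} m", z);
                    }
                }
            }
        }
    }
    ```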

    • Edited by jmmroldan Friday, April 10, 2015 12:29 PM
    Friday, April 10, 2015 12:28 PM