Issue when Kinect Hand Cursor Leaves the Application Screen

  • Question

  • Hi Everyone,

    I have an application built on v2. I have implemented the hand cursor, and when the cursor leaves the application, or when the user stops using their hands, the sensor takes time to detect the hand again, and sometimes the cursor does not come back at all.

    We updated the VGA driver to the latest version and the issue persists. We followed the standard approach of assigning the Kinect sensor to the KinectRegion, as we did in the V1 SDK 1.8.

    Our Kinect instance is kept in a static variable so we can reuse it in the other windows we created. Below is my MainPage constructor:

               

        this.InitializeComponent();

        Generics.Generics.FullScreen(this);

        // only one sensor is currently supported
        KinectGeneric.kinectSensor = KinectSensor.GetDefault();

        // open the sensor
        KinectGeneric.kinectSensor.Open();

        // set IsAvailableChanged event notifier
        KinectGeneric.kinectSensor.IsAvailableChanged += Sensor_IsAvailableChanged;

        // set the status text
        this.Status.Content = KinectGeneric.kinectSensor.IsAvailable ? Properties.Resources.RunningStatusText
                                                                     : Properties.Resources.NoSensorStatusText;

        // open the reader for the body frames
        this.bodyFrameReader = KinectGeneric.kinectSensor.BodyFrameSource.OpenReader();

        // set the FrameArrived event notifier
        this.bodyFrameReader.FrameArrived += bodyFrameReader_FrameArrived;

        // initialize the BodyViewer object for displaying tracked bodies in the UI
        this.kinectBodyView = new KinectBodyView(KinectGeneric.kinectSensor);

        // initialize the gesture detection objects for our gestures
        // KinectGeneric.gestureDetectorList = new List<GestureDetector>();

        // set the KinectRegion and hand it the shared sensor
        KinectRegion.SetKinectRegion(this, kinectRegion);
        kinectRegion.KinectSensor = KinectGeneric.kinectSensor;

        App app = ((App)Application.Current);
        app.KinectRegion = kinectRegion;
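
    For context, KinectGeneric is just a static holder so that every window and page talks to the same sensor. A simplified sketch of it, together with the Sensor_IsAvailableChanged handler wired up above, looks roughly like this (the real classes contain more than this):

        using System.Collections.Generic;
        using Microsoft.Kinect;    // KinectSensor, IsAvailableChangedEventArgs

        // Simplified sketch of our static holder (the real class has more members).
        public static class KinectGeneric
        {
            // the one sensor instance, shared by every window/page
            public static KinectSensor kinectSensor;

            // gesture detectors, created once and reused (currently commented out above)
            public static List<GestureDetector> gestureDetectorList;
        }

        // Simplified sketch of the availability handler wired up in the constructor above.
        public partial class MainPage
        {
            private void Sensor_IsAvailableChanged(object sender, IsAvailableChangedEventArgs e)
            {
                // update the status label when the sensor drops out or comes back
                this.Status.Content = KinectGeneric.kinectSensor.IsAvailable
                    ? Properties.Resources.RunningStatusText
                    : Properties.Resources.NoSensorStatusText;
            }
        }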


    We have a Viewbox that covers the grid and the KinectRegion, as shown below:


        <Viewbox Margin="0,0,-0.2,0" VerticalAlignment="Top">
            <Grid x:Name="LayoutRoot" Height="1367">
                <k:KinectRegion x:Name="kinectRegion" Margin="0,138,-8.4,10">


    We use the Viewbox to scale our UI automatically for different screen sizes. We also have a navigation Frame inside the grid above, as depicted below:

        <DockPanel Margin="551,-57,668,177">
            <Frame x:Name="_home" NavigationUIVisibility="Hidden" HorizontalAlignment="Stretch" Source="Default.xaml" Width="1460" Margin="0,12,0,0" />
        </DockPanel>
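
    Navigation happens against that Frame, roughly like the sketch below (the page name and helper here are just placeholders, not our real code):

        // Sketch of how pages are swapped inside the single KinectRegion.
        // "SomePage.xaml" is a placeholder, not one of our real pages.
        private void NavigateTo(string pagePath)
        {
            // the Frame lives inside the KinectRegion, so the hand cursor
            // keeps working on whatever page the Frame is currently showing
            this._home.Navigate(new Uri(pagePath, UriKind.Relative));
        }

        // e.g. NavigateTo("SomePage.xaml");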
    

    On all the other pages that get navigated to, we don't have a KinectRegion, because they inherit the one hosting the Frame; otherwise we end up with two hand cursors.
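
    A child page never creates its own region; if it needs the Kinect at all it reuses the shared pieces, roughly like this sketch (simplified, using the Default page hosted in the Frame as an example):

        // Sketch: a page hosted in the Frame reuses the shared sensor and the single
        // KinectRegion from MainPage instead of creating its own (which would give
        // a second hand cursor).
        public partial class Default : Page
        {
            private BodyFrameReader bodyFrameReader;

            public Default()
            {
                this.InitializeComponent();

                // the one region from MainPage is reachable through the App object
                // if a page ever needs it:
                // KinectRegion sharedRegion = ((App)Application.Current).KinectRegion;

                // readers can still be opened per page from the shared static sensor
                this.bodyFrameReader = KinectGeneric.kinectSensor.BodyFrameSource.OpenReader();
                this.bodyFrameReader.FrameArrived += this.bodyFrameReader_FrameArrived;
            }

            private void bodyFrameReader_FrameArrived(object sender, BodyFrameArrivedEventArgs e)
            {
                // per-page body handling goes here
            }
        }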

    If there is anything I have missed, please point it out for me.

    Thanks


    Vuyiswa Maseko

    Monday, February 23, 2015 6:13 PM

All replies

  • You said "we don't have a KinectRegion because...". How does this differ from how multi-page navigation is implemented in Controls-Basics? Are you creating new windows? If a new Window/CoreWindow is being created, you must create a KinectRegion for that window; otherwise it does not know which window to send events to.
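
    If you do open a separate Window, it needs its own KinectRegion wired to the sensor, using the same calls you already have in MainPage. A rough sketch ("SecondWindow" is just a placeholder name):

        // Sketch: each new Window gets its own KinectRegion, hooked to the shared sensor,
        // with the same calls already used in MainPage.
        public partial class SecondWindow : Window
        {
            public SecondWindow()
            {
                this.InitializeComponent();

                // kinectRegion is a k:KinectRegion declared in this window's XAML
                KinectRegion.SetKinectRegion(this, this.kinectRegion);
                this.kinectRegion.KinectSensor = KinectGeneric.kinectSensor;
            }
        }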

    Carmine Sirignano - MSFT

    Tuesday, February 24, 2015 6:28 PM
  • Hi Carmine 

    Thanks for the reply 

    You said "we don't have a KinectRegion because...". How does this differ from how multi-page navigation is implemented in Controls-Basics? Are you creating new windows?

    Yes, you are right, it does not differ. We used this navigation approach in our V1 application and it worked nicely. I have pages, not windows, and I just navigate to those pages; because they are hosted in the Frame, when the hand is active it can move between the parent window and the child pages.


    We have lent the client the sensor, so I cannot actually reproduce the problem until they finish testing and we can attend to the issue; they just called us and explained it. When I was there I was able to use it without trouble, so I think they just need to put the sensor in the right place.

    Carmine, if I can just squeeze in one other question:

    In V1, from SDK 1.7 and 1.8, we had the push-to-press interaction in the Toolkit, which showed a nice purple-looking hand. Why was that not added to the 2.0 SDK?

    Thanks 


    Vuyiswa Maseko

    Tuesday, February 24, 2015 6:55 PM