Depth error of 10%

  • Question

  • Hi,

    I am comparing the distance measured by an Xbox Kinect camera against a real measurement (with a tape).

    To do so, I point the Kinect camera at a wall (distance: 3 meters from the Kinect lens). The Kinect is parallel to the wall.

    A depth pixel, more or less at the centre of the image, returns a depth value of 3.3 meters. The same experiment with other Kinects gives 2.98 meters and 3.08 meters.

    Those errors seem a bit too high, and the variability is worrying. Am I missing something? Could it be that my cameras are damaged?
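    For anyone trying to reproduce this check, here is a minimal sketch (a hypothetical helper, not Kinect SDK code: it assumes the depth frame has already been read into a NumPy array in millimetres, with zeros marking pixels that returned no reading):

    ```python
    import numpy as np

    def center_depth_error(depth_mm, true_m, window=5):
        """Compare the median depth in a small centre window against a
        tape-measured ground truth.

        depth_mm: 2D array of depth values in millimetres (0 = no reading).
        true_m:   tape-measured distance in metres.
        Returns (measured_m, percent_error).
        """
        h, w = depth_mm.shape
        # Sample a small patch around the image centre rather than a
        # single pixel, to smooth out per-pixel sensor noise.
        patch = depth_mm[h // 2 - window : h // 2 + window,
                         w // 2 - window : w // 2 + window]
        valid = patch[patch > 0]
        measured_m = np.median(valid) / 1000.0
        return measured_m, 100.0 * (measured_m - true_m) / true_m
    ```

    Taking the median over a window rather than one pixel avoids mistaking a single noisy reading for a systematic offset.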

    Frank

    Friday, February 7, 2014 2:17 PM

All replies

  • The only commercially supported sensor is the Kinect for Windows device. It has a different design and has gone through different testing and configuration that may provide more stability. Are the devices new out of the box, or have they been used "in the field"?


    Carmine Sirignano - MSFT

    Friday, February 7, 2014 7:37 PM
  • The devices are not new. They have been used and moved around, but nothing dramatic. Still, they might have fallen on the floor once or twice.

    Among the three Kinect cameras tested, one was actually a Kinect for Windows camera: the one that returned 3.08 meters.

    My understanding is that the depth returned by the camera is the depth from the zero plane. Where is this zero plane relative to the lens (or, let's say, to the front side of the Kinect)? If it is 8 cm behind, then our Kinect for Windows camera is spot on, but 8 cm seems a bit much...

    Saturday, February 8, 2014 5:43 PM
  • OK, no answer, so here is a message for the world:

    Don't assume the depth measurement of your Kinect cameras is accurate: errors of up to 10% are quite common. In our case, 2 cameras out of 10 had an error of 10%; that is, an object at 3 metres was reported to be at 3.3 metres by the camera.

    Friday, February 14, 2014 3:02 PM
  • If you are consistently off by the same amount, it would seem the basis of your measurement is an assumed centre point. Where is that?

    I honestly can't remember if it was the center of the depth image or in between the color/depth images (the center of the device). If you just move the point, do your values change?


    Carmine Sirignano - MSFT

    Saturday, February 15, 2014 12:52 AM
  • Hi Frank,

    In my experience the real-world accuracy of the 3D point cloud varies substantially from sensor to sensor, so your 10% deviation is not surprising. We were able to correct for this using a semi-automated calibration routine: collect 1000 data points with a robot and vision software, then compute a 4x4 projective transformation matrix to correct the spatial errors. In our application the errors went from up to approximately 100 mm to less than 10 mm in any direction.
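    The flavour of this correction can be sketched as a least-squares fit of a 4x4 homogeneous transform (a simplification: this constrains the fit to an affine transform rather than a full projective one, and the function names and simulated data are illustrative, not the actual routine):

    ```python
    import numpy as np

    def fit_correction(measured, true):
        """Least-squares fit of a 4x4 homogeneous matrix M mapping
        measured 3D points onto ground-truth points.

        measured, true: (N, 3) arrays of corresponding points.
        """
        n = measured.shape[0]
        # Homogeneous coordinates: append a column of ones.
        m_h = np.hstack([measured, np.ones((n, 1))])
        # Solve m_h @ top ~= true for the top three rows of M.
        top, *_ = np.linalg.lstsq(m_h, true, rcond=None)
        M = np.eye(4)
        M[:3, :] = top.T
        return M

    def apply_correction(M, points):
        """Apply the fitted 4x4 transform to (N, 3) points."""
        n = points.shape[0]
        p_h = np.hstack([points, np.ones((n, 1))])
        out = p_h @ M.T
        # Divide by the homogeneous coordinate (1 for an affine fit).
        return out[:, :3] / out[:, 3:4]
    ```

    Fixing the last row of M to [0, 0, 0, 1] keeps the problem linear; a full projective fit would instead use a DLT-style formulation on the homogeneous constraints.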

    Carmine, the error was not consistent with the centre point being off. Indeed, we had already carried out a full machine-vision camera calibration using the IR camera, but the point cloud could not be adjusted to match it without the 4x4 matrix calculation.

    After this we were able to align the point cloud with the IR image and external images from other cameras.

    Roland

    Monday, February 17, 2014 3:48 AM