Volume integration in Fusion using extrinsically calibrated camera pose

  • Question

  • Hi,

    I am trying to integrate a volume using multiple sensors (based on the Multiple Static Cameras example in SDK v1.8).

    I am explicitly supplying the camera pose using extrinsic calibration. 
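
    Roughly, here is how I build and supply the pose (a minimal sketch; the helper name and variables are my own placeholders, but Matrix4 in Microsoft.Kinect.Fusion is row-major with the translation in M41..M43):

        using Microsoft.Kinect.Fusion;

        static class PoseHelper
        {
            // Build a Fusion Matrix4 from an extrinsic rotation R (3x3) and a
            // translation t in metres. Matrix4 keeps the translation in the
            // fourth row (M41..M43). Fusion expects the world-to-camera
            // transform, so invert first if calibration gives camera-to-world.
            public static Matrix4 FromExtrinsics(float[,] r, float[] t)
            {
                Matrix4 m = Matrix4.Identity;
                m.M11 = r[0, 0]; m.M12 = r[0, 1]; m.M13 = r[0, 2];
                m.M21 = r[1, 0]; m.M22 = r[1, 1]; m.M23 = r[1, 2];
                m.M31 = r[2, 0]; m.M32 = r[2, 1]; m.M33 = r[2, 2];
                m.M41 = t[0];    m.M42 = t[1];    m.M43 = t[2];
                return m;
            }
        }

    I then pass the resulting matrix as the worldToCameraTransform argument when processing each frame, i.e. something like volume.ProcessFrame(depthFloatFrame, maxAlignIterationCount, maxIntegrationWeight, pose), which is where tracking can fail.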

    As long as the camera pose I supply is in front of the volume, it integrates fine. When the camera goes behind the volume, tracking is lost in some cases, usually when the X, Y, or Z angles are at a boundary of 0, 90, or 180 degrees.

    Assuming coordinates like this:

    ^ Y    / Z
    |     /
    |    /
    |   /
    +---------------> X


    As long as the camera pose is on the -Z side, integration is fine.

    But when the camera lies in the +Z quadrants, the integration is suddenly lost.
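
    For example, a pose on the +Z side facing back toward the volume differs from a front pose by roughly a 180-degree rotation about Y. A hypothetical pose like this (using the helper above, with made-up numbers) is the kind that fails for me:

        // Roughly 180 degrees about Y, 2 m out along +Z (values made up).
        float[,] r180Y =
        {
            { -1f, 0f,  0f },
            {  0f, 1f,  0f },
            {  0f, 0f, -1f },
        };
        float[] t = { 0f, 0f, 2f };
        Matrix4 behindPose = PoseHelper.FromExtrinsics(r180Y, t);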

    In the WPF samples, the function CameraTransformFailed(Matrix4 initial, Matrix4 final, float maxTrans, float maxRotDegrees) usually gets called when tracking is lost.
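
    As far as I can tell, that function rejects the new pose when it has moved or rotated too much relative to the previous one. My understanding of the kind of check it performs is roughly this (a sketch, not the sample's exact code):

        using System;
        using Microsoft.Kinect.Fusion;

        static class TransformCheck
        {
            // Flag the new pose as a failure if it translated more than
            // maxTrans metres or rotated more than maxRotDegrees relative
            // to the previous pose (illustrative sketch only).
            public static bool CameraTransformFailed(
                Matrix4 a, Matrix4 b, float maxTrans, float maxRotDegrees)
            {
                // Translation delta (the fourth row holds the translation).
                float dx = b.M41 - a.M41;
                float dy = b.M42 - a.M42;
                float dz = b.M43 - a.M43;
                if (Math.Sqrt(dx * dx + dy * dy + dz * dz) > maxTrans)
                {
                    return true;
                }

                // Relative rotation angle from trace(Rb * Ra^T):
                // cos(theta) = (trace - 1) / 2.
                double trace =
                    a.M11 * b.M11 + a.M12 * b.M12 + a.M13 * b.M13 +
                    a.M21 * b.M21 + a.M22 * b.M22 + a.M23 * b.M23 +
                    a.M31 * b.M31 + a.M32 * b.M32 + a.M33 * b.M33;
                double cos = Math.Max(-1.0, Math.Min(1.0, (trace - 1.0) / 2.0));
                return Math.Acos(cos) * 180.0 / Math.PI > maxRotDegrees;
            }
        }

    If the sample instead compares per-axis angles extracted from the matrices, values near the 0/90/180-degree boundaries could wrap and register as a large rotation even for a small physical motion, which would match what I am seeing.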

    Any suggestions on a universal integration technique in Fusion, irrespective of the actual camera location?

    Thanks!

    Nagaraj


    Thursday, August 13, 2015 10:59 AM