Compute point cloud from disparity map

  • Question

  • Hi everybody,

    I have read many questions on this forum related to my problem, and I have looked at the examples provided in the SDK.

    However, I am quite stuck: my computation of the point cloud seems to be incorrect, as you can see in the image here.

    I am using the fixed constants depthHFOV and depthVFOV that you can find in the SDK API. The cloud I generate looks a bit distorted; maybe there is a problem with the angle of the camera. What if the camera is tilted? Am I missing something in the procedure for computing the world coordinates?

    Thanks in advance for any help :)

    int imageWidthHalf = 320;
    int imageHeightHalf = 240;
    // fixed parameters
    float depthHFOV = 58.5f;
    float depthVFOV = 45.6f;
    float degreesToRadians = M_PI / 180.0f;
    float depthH = std::tan( (depthHFOV * 0.5f) * degreesToRadians );
    float depthV = std::tan( (depthVFOV * 0.5f) * degreesToRadians );
    // x, y are the pixel coordinates, d is the depth at that pixel
    worldX = d * depthH * (y / (float)imageWidthHalf);
    worldY = d * depthV * (x / (float)imageHeightHalf);
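    For comparison, here is a minimal sketch of the mapping I believe should hold, assuming a simple pinhole-style model where pixel offsets are measured from the image centre and scaled by the half-FOV tangent. The names `backProject` and `correctPitch` are my own, not from the SDK, and the tilt correction assumes a pure pitch rotation:

    ```cpp
    #include <cmath>

    // Camera-space point (metres), +Z pointing away from the camera.
    struct Point3 { float x, y, z; };

    // Back-project one depth pixel (px, py) with depth d to camera space,
    // using the horizontal/vertical field of view of the depth camera.
    Point3 backProject(int px, int py, float d,
                       int width, int height,
                       float hfovDeg, float vfovDeg)
    {
        const float kPi = 3.14159265f;
        const float degToRad = kPi / 180.0f;
        const float halfW = width  * 0.5f;
        const float halfH = height * 0.5f;
        const float tanH = std::tan(hfovDeg * 0.5f * degToRad);
        const float tanV = std::tan(vfovDeg * 0.5f * degToRad);

        Point3 p;
        // Normalize the pixel offset from the image centre to [-1, 1],
        // then scale by depth and the half-FOV tangent.
        p.x = d * tanH * ((px - halfW) / halfW);
        p.y = d * tanV * ((halfH - py) / halfH); // flip so +Y points up
        p.z = d;
        return p;
    }

    // If the camera is pitched by pitchDeg about its X axis, rotating each
    // camera-space point by the same angle compensates for the tilt
    // (assumption: pitch only, no roll or yaw).
    Point3 correctPitch(const Point3& p, float pitchDeg)
    {
        const float a = pitchDeg * 3.14159265f / 180.0f;
        const float c = std::cos(a), s = std::sin(a);
        return { p.x, c * p.y - s * p.z, s * p.y + c * p.z };
    }
    ```

    With this convention the centre pixel maps to (0, 0, d), and points left of centre get negative X, which is what I would expect from an undistorted cloud.
    
    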

    Monday, February 4, 2013 4:59 PM