Color and IR simultaneously

  • Question

  • Hello, has anyone tried to get both color and infrared frames simultaneously using v1.6?

    Thank you 

     
    Monday, November 19, 2012 11:03 PM

Answers

  • IR is a setting on the color stream, so you would have to have multiple sensors to get both
    • Marked as answer by chrych Sunday, November 25, 2012 3:28 AM
    Thursday, November 22, 2012 3:32 AM
  • oh yeah, you can switch all you like. Take a look at the Kinect Explorer Toolkit sample, it lets you switch through all the modes.
    • Marked as answer by chrych Sunday, November 25, 2012 3:28 AM
    Thursday, November 22, 2012 10:56 PM

All replies

  • IR is a setting on the color stream, so you would have to have multiple sensors to get both
    • Marked as answer by chrych Sunday, November 25, 2012 3:28 AM
    Thursday, November 22, 2012 3:32 AM
  • I see, thanks for the reply. Do you know whether it is possible to switch formats after I have started the Kinect? For example, to start with

    sensor.ColorStream.Enable();                   
    sensor.Start();

    and then somehow later change it to 

    sensor.ColorStream.Enable(ColorImageFormat.InfraredResolution640x480Fps30);

    ?

    Thursday, November 22, 2012 10:27 PM
  • oh yeah, you can switch all you like. Take a look at the Kinect Explorer Toolkit sample, it lets you switch through all the modes.
    • Marked as answer by chrych Sunday, November 25, 2012 3:28 AM
    Thursday, November 22, 2012 10:56 PM
  • Perhaps there is an oversight in the SDK docs or in the return values from the APIs. Would you please review the following?

    I live in C land, not C#. Going by the current online SDK docs, I see nothing documented that prevents me from doing the following. Everything returns a nice HR = S_OK. However, when I start accessing the locked bit data, I get crashes deep in Windows (somewhere without debug symbols).

    To clarify: in SDK v1.6, is it supported to have 3 simultaneous streams (Depth, Color, and IR) open and streaming at once? Here is sample code that throws no API errors (instead, Windows crashes the app as I describe above).

    x->pNuiSensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON |
                                 NUI_INITIALIZE_FLAG_USES_DEPTH |
                                 NUI_INITIALIZE_FLAG_USES_COLOR);
    // HR is S_OK
    x->pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH,          ..., &(x->hDepthStreamHandle));
    x->pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_COLOR,          ..., &(x->hColorStreamHandle));
    x->pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_COLOR_INFRARED, ..., &(x->hIRStreamHandle));
    // all the above HRs are S_OK
    
    x->pNuiSensor->NuiImageStreamGetNextFrame(x->hDepthStreamHandle, 0, &depthImageFrame);
    x->pNuiSensor->NuiImageStreamGetNextFrame(x->hColorStreamHandle, 0, &colorImageFrame);
    x->pNuiSensor->NuiImageStreamGetNextFrame(x->hIRStreamHandle, 0, &irImageFrame);
    // all the above HRs are S_OK, and the frames even have distinct but temporally close timestamps, e.g.
    // frameTimestamps depth=58818849 ir=58818882 color=58818850
    
    colorImageFrame.pFrameTexture->LockRect(0, &LockedRect, NULL, 0);
    // HR is S_OK
    
    for (/* iterate over LockedRect.pBits, partly in parallel */) ... // crash!
    // Iterating over LockedRect.pBits is where my app is crashing.
    // Somewhat hard to trace because I'm doing some work in parallel. The app worked for months
    // until I added the new IR support calls today.

    If it is not possible to have all 3 streams open, then I request some clarification and updates:

    • Update the API to return an error code when a programmer tries to open simultaneous color and IR streams.
    • Update the programming guide pages on the color and IR streams to clearly indicate that only one or the other can function at a time.
    • Rework the index for the programming guide. Currently it is Programming Guide / NUI / Data streams, and it then lists all 4 (audio, color, depth, IR) as if they are separate streams.
    • Update the C++ reference for NuiInitialize, NuiImageStreamOpen, etc.

    Do you understand my inquiry and confusion?


    --Dale

    Wednesday, January 16, 2013 7:14 PM
  • Yes, your confusion is understandable. As you've discovered, there are some subtle behaviors of INuiSensor::NuiImageStreamOpen that are not obvious (and not well documented).

    In particular, if you have already opened any "color" stream (that is, its image type is any value that starts with NUI_IMAGE_TYPE_COLOR) and you subsequently call NuiImageStreamOpen again for a "color" stream, the effect is to reset the already-opened stream to operate with the new settings. Essentially, the existing stream is repurposed. In fact, if you examine the values of hColorStreamHandle and hIRStreamHandle in your code, you'll find that they actually have the same handle value.
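
    For example, a quick diagnostic sketch along these lines (reusing the variable names from your snippet) should confirm it:

    // After the second Open() call on a color-family image type, both
    // variables hold the same underlying stream handle.
    if (x->hColorStreamHandle == x->hIRStreamHandle)
    {
        OutputDebugStringA("color and IR handles refer to the same stream\n");
    }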

    (The same applies to the "depth" stream types: there is only one "depth" stream, regardless of how many times NuiImageStreamOpen is called.)

    So, your call to NuiImageStreamOpen(NUI_IMAGE_TYPE_COLOR_INFRARED ...) has just modified the settings associated with hColorStreamHandle. If you attempt to process a frame from this stream under the assumption that the original settings (resolution, bits-per-pixel, etc.) are still in effect, then a crash is not unexpected. For example, even at the same resolution, a color image frame contains twice as many bytes (4 bytes per pixel) as an infrared frame (2 bytes per pixel). So if your loop assumes it is operating on the larger-sized color frame, but has actually received a smaller-sized infrared frame, it is likely walking off the end of the frame, resulting in a crash.
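
    A more defensive pattern is to size the walk from the locked rect itself rather than from an assumed format. A sketch, using the variables from your snippet:

    NUI_LOCKED_RECT LockedRect;
    HRESULT hr = colorImageFrame.pFrameTexture->LockRect(0, &LockedRect, NULL, 0);
    if (SUCCEEDED(hr) && LockedRect.Pitch != 0)
    {
        // LockedRect.size is the byte count of this frame's buffer and
        // LockedRect.Pitch is the byte width of one row: 640 * 4 for a
        // 640x480 BGRA color frame, but only 640 * 2 for a 640x480 infrared frame.
        const BYTE *pBuffer = LockedRect.pBits;
        for (INT i = 0; i < LockedRect.size; ++i)
        {
            // ... process pBuffer[i]; never index past LockedRect.size ...
        }
        colorImageFrame.pFrameTexture->UnlockRect(0);
    }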

    There is no NuiImageStreamClose API. If there were, we could have made NuiImageStreamOpen fail when you tried to reuse a stream without closing it first. But the behavior of NuiImageStreamOpen is already defined to allow this reuse; we would be unable to change it without breaking existing application code that depends on this "feature."

    The best we can do is try to better document the subtleties of this API.


    John | Kinect for Windows development team

    Thursday, January 17, 2013 12:06 AM
  • 100% get it now. Thank you for clarifying.

    I suspect there is a hardware design limitation that prevents presenting color, raw IR, and depth simultaneously. A quick spreadsheet tells me that those 3 streams together would need about 50.6 MB/s (of the roughly 60 MB/s possible with USB 2) just for uncompressed raw data, not counting metadata, signaling, the audio stream, etc. So perhaps the design of the API was guided by this.
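
    For what it's worth, the back-of-the-envelope arithmetic looks roughly like this (a sketch only; the bytes-per-pixel values are my assumptions about the raw sensor data, so the total is only in the same ballpark as my spreadsheet figure):

    #include <cstdio>

    // Rough raw-payload estimate for one image stream, in MB/s (1 MB = 1024 * 1024 bytes).
    static double StreamMBps(int width, int height, int bytesPerPixel, int fps)
    {
        return (double)width * height * bytesPerPixel * fps / (1024.0 * 1024.0);
    }

    int main()
    {
        // Assumptions: 640x480 @ 30 fps for all three streams; 2 bytes/pixel for
        // depth and IR (16-bit samples), and 2 bytes/pixel for color as it leaves
        // the camera (the SDK hands the app 4 bytes/pixel BGRA after conversion).
        double depth = StreamMBps(640, 480, 2, 30);
        double color = StreamMBps(640, 480, 2, 30);
        double ir    = StreamMBps(640, 480, 2, 30);
        printf("depth=%.1f color=%.1f ir=%.1f total=%.1f MB/s\n",
               depth, color, ir, depth + color + ir);
        // The total lands in the low 50s of MB/s before metadata, audio, and
        // protocol overhead, uncomfortably close to what USB 2 can sustain.
        return 0;
    }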

    In a future API or hardware revision, would you please consider letting the developer choose how to use the bandwidth? Let them enable whichever streams they want, and either return an error when the enabled streams exceed the available bandwidth, or document the bandwidth needs of each stream type and the maximum aggregate possible. Bigger pipes like USB 3 might lessen the bandwidth issues, but I still ask that the let-the-developer-choose approach be considered.

    Thank you again for the clarification above; with it I can now re-approach my code and move forward. :-)


    --Dale

    Thursday, January 17, 2013 12:15 PM
  • You nailed it. We're limited to either color and depth, or IR and depth. There's also a hardware design constraint that prevents running color and IR (without depth) simultaneously.
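
    In native terms, a minimal sketch of the two supported combinations looks like this (the handle and event names here are just placeholders):

    // One-time setup: depth plus the single color-family camera stream.
    pNuiSensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH | NUI_INITIALIZE_FLAG_USES_COLOR);
    HANDLE hDepthFrameEvent  = CreateEventW(NULL, TRUE, FALSE, NULL);
    HANDLE hCameraFrameEvent = CreateEventW(NULL, TRUE, FALSE, NULL);
    HANDLE hDepthStream  = NULL;
    HANDLE hCameraStream = NULL;   // the one color-family stream

    pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH, NUI_IMAGE_RESOLUTION_640x480,
                                   0, 2, hDepthFrameEvent, &hDepthStream);

    // Combination 1: depth + color
    pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_COLOR, NUI_IMAGE_RESOLUTION_640x480,
                                   0, 2, hCameraFrameEvent, &hCameraStream);

    // Combination 2: depth + IR (this call repurposes the same color-family stream,
    // so color and IR frames are never delivered at the same time)
    pNuiSensor->NuiImageStreamOpen(NUI_IMAGE_TYPE_COLOR_INFRARED, NUI_IMAGE_RESOLUTION_640x480,
                                   0, 2, hCameraFrameEvent, &hCameraStream);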


    John | Kinect for Windows development team

    Saturday, January 19, 2013 1:45 AM
  • Has this been fixed in Kinect 2 SDK?

    (Or is there still a hardware constraint that Color and IR streams cannot be active simultaneously?)

    Monday, September 15, 2014 10:22 PM
  • The Kinect v1 continues to be bandwidth limited and has the limitations described above.

    The Kinect v2 sensor uses USB3, has the bandwidth, and the v2 SDK supports simultaneous Color and IR streams.

    The Kinect v2 SDK only works with the Kinect v2 sensor.



    --Dale

    Monday, September 15, 2014 11:56 PM