Why is H264 DXVA on one stream but not on another H264 stream?

  • Question

  • Two sources, two H264 streams.  According to MFTrace, one decodes using DXVA (DirectX Video Acceleration, I believe); the other does not.  The one without DXVA uses about 30-100 times more CPU time (a long-running test shows, say, 100 seconds of CPU versus 1 second).  Why?  What criteria are used to determine whether an H264 stream gets DXVA?  The H264 stream that does get DXVA is High profile, level 4.1 (perhaps 4.0).  The one that does not is Baseline, level 3.1 or so.  Resolution for both is 640x480 at 30 FPS.

    I am reading/feeding the data myself.  The decoder used is the Win7/MS H264 one.

    Is this a video driver thing, as in Nvidia?




    Saturday, July 28, 2012 9:41 AM

Answers

  • I was not going to post the question here, since it's the type I would not expect answered (here); no better reason than that it would only add noise.  I saw the SDK guy reply (amazing!) so I gave it a shot.

    The profile difference only affects added capabilities, not the low end.  H264 is perfectly fine at High profile, level 4.1 (the one that works) at 640x480, 30 FPS, and a reasonably low bitrate.  The company behind that stream is Panasonic; they are all into DLNA and broadcast, and going lower is probably beneath them.  Kidding aside, it's been a long while since I looked at this.  While the title asks why, what I really want to know is what's inside: what are the criteria, and so on.

    And, as it turned out, this just added noise.  I'll mark mine as answered, if that's possible, and close this out.

    Wednesday, August 8, 2012 7:02 PM

All replies

  • A DXVA-enabled device (decoder) is a limited resource, and what you may be seeing is that only one is available on your system. You can use the DXVA Checker tool to see how many devices you can create. On failure to create a device, the decoder falls back to software decoding, and software-only operation is, yes, CPU-consuming.
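    If you'd rather check programmatically than with DXVA Checker, the DXVA2 API lets you enumerate the decoder device GUIDs the driver exposes. A rough sketch, assuming you already have a Direct3D 9 device (the function name `ListDxvaDecoders` is illustrative):

    ```cpp
    // Sketch: enumerate DXVA2 decoder device GUIDs, roughly what DXVA Checker
    // reports. DXVA2CreateVideoService and GetDecoderDeviceGuids are the
    // documented entry points; link against dxva2.lib and ole32.lib.
    #include <windows.h>
    #include <d3d9.h>
    #include <dxva2api.h>
    #include <cstdio>

    void ListDxvaDecoders(IDirect3DDevice9 *pD3DDevice)
    {
        IDirectXVideoDecoderService *pService = NULL;
        if (SUCCEEDED(DXVA2CreateVideoService(pD3DDevice,
                IID_PPV_ARGS(&pService))))
        {
            UINT count = 0;
            GUID *pGuids = NULL;
            if (SUCCEEDED(pService->GetDecoderDeviceGuids(&count, &pGuids)))
            {
                for (UINT i = 0; i < count; i++)
                {
                    // For H.264 you'd look for DXVA2_ModeH264_VLD_NoFGT here.
                    WCHAR name[40];
                    StringFromGUID2(pGuids[i], name, 40);
                    wprintf(L"Decoder GUID %u: %s\n", i, name);
                }
                CoTaskMemFree(pGuids);
            }
            pService->Release();
        }
    }
    ```

    If the profile GUID you need never shows up, the fallback is the driver's doing, not the decoder's.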

    http://alax.info/blog/tag/directshow

    Tuesday, July 31, 2012 6:50 AM
  • Let me be clear.  One at a time: one failed DXVA; one succeeded at DXVA.

    I've tried DXVA Checker before, and again today.  It's not useful to me.
    Tuesday, July 31, 2012 11:00 AM
  • If both videos have the same resolution, framerate, and roughly the same bitrate, they should have roughly the same H264 level. A gap of 3.1 to 4.0 is fairly massive, and 4.0/4.1 for 640x480@30fps is hard to reach unless the bitrate is ridiculously high for that resolution/framerate.

    If you are using the MS software decoder, the H264 parameters shouldn't have any effect on DXVA, as you'd be passing YUV-based uncompressed frames to the sink. What YUV format are you using, and have you checked whether DXVA supports that format?

    Wednesday, August 8, 2012 7:17 AM