How can I deliver ARGB32 video format instead of YUY2 to a custom MFT using MediaCapture?

Question
-
Using a C# front end with a C++/Cx component (a custom MFT), I currently have YUY2 video frames coming into my custom MFT and displayed via a MediaCapture object. I actually need the ARGB32 format to do post-processing on. I tried to set the IMFMediaType input and output subtype to MFVideoFormat_ARGB32 from within the MFT code, but it doesn't work, most likely because the Asus WinRT tablet capture device I'm using only supports capturing YUY2 video data.

I need this conversion to be very fast. Can somebody point me in the right direction for doing the color space conversion from YUY2 to ARGB32? I am investigating doing the conversion on the GPU through a pixel shader; meanwhile, I am hoping that a built-in DSP can do the color conversion for me, but from what I can tell the color conversion DSPs are supported on the desktop only. Do I need to use an IMFMediaSource with an IMFSourceReader instead of using MediaCapture with a custom MFT?
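For reference, the per-pixel math the conversion involves looks roughly like the sketch below: a minimal CPU implementation assuming BT.601 limited-range coefficients and the little-endian B,G,R,A byte order that the MF RGB32 subtypes use in memory. A GPU shader or a built-in converter would do the same arithmetic; a CPU version like this is mainly useful for sanity-checking results.

```cpp
#include <cstdint>
#include <algorithm>

static uint8_t Clamp8(int v) {
    return static_cast<uint8_t>(std::min(255, std::max(0, v)));
}

// YUY2 packs two pixels into 4 bytes: Y0 U Y1 V (U and V shared by the pair).
// Output is 4 bytes per pixel in B, G, R, A order. BT.601 limited-range
// integer coefficients (an assumption; the camera could use other matrices).
void Yuy2ToArgb32(const uint8_t* src, uint8_t* dst, int width, int height) {
    for (int row = 0; row < height; ++row) {
        const uint8_t* s = src + row * width * 2;  // 2 bytes per pixel in
        uint8_t* d = dst + row * width * 4;        // 4 bytes per pixel out
        for (int x = 0; x < width; x += 2) {
            int y0 = s[0] - 16, u = s[1] - 128, y1 = s[2] - 16, v = s[3] - 128;
            // Chroma contributions are shared by both pixels of the pair.
            int preR = 409 * v + 128;
            int preG = -100 * u - 208 * v + 128;
            int preB = 516 * u + 128;
            for (int i = 0; i < 2; ++i) {
                int c = 298 * (i ? y1 : y0);
                d[0] = Clamp8((c + preB) >> 8);  // B
                d[1] = Clamp8((c + preG) >> 8);  // G
                d[2] = Clamp8((c + preR) >> 8);  // R
                d[3] = 255;                      // A (opaque)
                d += 4;
            }
            s += 4;
        }
    }
}
```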
- Edited by Jason Hermann Thursday, November 15, 2012 4:44 AM
Thursday, November 15, 2012 12:56 AM
Answers
-
Hello,
You theoretically can use a pixel shader to do the conversion. However, Media Foundation has a built-in color space converter that is highly optimized.

You should be able to force the input of your MFT to only accept ARGB32, and the color space converter (CSC) should automatically be added to the underlying topology. AFAIK you can't force the CSC to be loaded into the Media Engine and must rely on the automatic topology resolver.

The issue might reside in the way the MediaCapture element is handling the topology resolution. I would recommend that you try adding your MFT to a MediaElement and see if it connects correctly. If it connects, then we know that the MediaCapture element is not adding the CSC. If it does not connect correctly, then it is possible that your MFT is not advertising the MediaType properly.
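"Forcing the input" typically means offering ARGB32 as the MFT's only available input type, so the topology resolver has no choice but to insert the converter upstream. A minimal sketch of that, assuming a hypothetical `CMyTransform` class and omitting the rest of the IMFTransform plumbing and error handling:

```cpp
#include <mfapi.h>
#include <mfidl.h>
#include <mferror.h>

// Offer MFVideoFormat_ARGB32 as the only input type this MFT will accept.
// The automatic topology resolver should then add the color space converter
// ahead of the MFT. (Sketch only; CMyTransform is a hypothetical name, and
// SetInputType must also reject any subtype other than ARGB32.)
HRESULT CMyTransform::GetInputAvailableType(
    DWORD dwInputStreamID, DWORD dwTypeIndex, IMFMediaType** ppType)
{
    if (dwTypeIndex > 0)
        return MF_E_NO_MORE_TYPES;      // exactly one type is advertised

    IMFMediaType* pType = nullptr;
    HRESULT hr = MFCreateMediaType(&pType);
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_ARGB32);

    if (SUCCEEDED(hr)) {
        *ppType = pType;                // caller takes the reference
    } else if (pType) {
        pType->Release();
    }
    return hr;
}
```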
I hope this helps,
James
Windows Media SDK Technologies - Microsoft Developer Services - http://blogs.msdn.com/mediasdkstuff/
- Proposed as answer by James Dailey - MSFT (Microsoft employee, Moderator) Thursday, November 15, 2012 1:56 AM
- Marked as answer by Jason Hermann Thursday, November 15, 2012 8:49 AM
Thursday, November 15, 2012 1:56 AM (Moderator)
All replies
-
James,
I got the ARGB32 color space converter working after trying what you recommended. The MediaElement didn't connect my MFT at first (I got sound but no video), but after debugging the code more thoroughly I figured out how the MFT was configuring its input and output media types, and I tweaked the MFT code to support ARGB32 media types properly.

However, going from MFVideoFormat_YUY2 to MFVideoFormat_ARGB32 caused a noticeable degradation in performance, even when setting the video encoding properties to the lowest video resolution of 640x480; the video got a bit choppier, while the YUY2 video stream seemed much smoother and acceptable to me. Is this because one format is 16bpp and the other 32bpp? Also, do you know whether the built-in CSC uses hardware acceleration by default, or is there a setting I need to check as far as the CSC processing goes?
Thanks,
Jason
Thursday, November 15, 2012 8:48 AM