Windows Dev Center

render video to texture problem


  • Actually, I want to render a video to a texture, so that the texture can be mapped onto a 3D mesh.

    My decoding procedure is as follows; it seems to differ from the one you gave me. In the Allocating Uncompressed Buffers step, it creates a texture by calling CreateTexture2D and returns an ID3D11Texture2D interface. Can this interface be used in CreateShaderResourceView? If not, how should the ID3D11Texture2D be used?

    For each surface in the texture array, call CreateVideoDecoderOutputView to create a video decoder output view; this returns an ID3D11VideoDecoderOutputView interface. How can it be converted to an ID3D11ShaderResourceView that can be used in PSSetShaderResources?

    I'm quite confused; thank you for your help.

    7 March 2012 01:12


All messages

  • Yes, the Media Foundation decode procedure can generate an array of textures, held in an ID3D11Texture2D interface. Does each surface of the texture array store a video frame? How can I get a texture from the texture array to create an ID3D11ShaderResourceView that can be used in PSSetShaderResources? Then I can map the texture onto the mesh.

    Thank you

    8 March 2012 07:32
  • Hello zjubaoli,

    I'm a bit confused; the document you reference talks about how to write a custom DXVA-enabled decoder. Are you writing a decoder MFT? Based on your questions, I'm guessing that you are being handed a surface that contains the video data. When using DXVA, video is "streamed" onto a D3D surface as a texture. If I understand you correctly, you want to lock the texture and get at the raw bits that represent the texture map so that you can pass the texture as a resource to your shader. Unfortunately, looking at the D3D11_SHADER_RESOURCE_VIEW_DESC structure, it is not immediately clear to me whether you need to pass a pointer to the raw bits or simply a pointer to the texture object. Let me talk with my D3D expert tomorrow and I will try to see if we can get you going in the right direction. If you don't hear from me by early next week, please give me a bump and I will make sure that we help you if we can.
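    For what it's worth, the D3D11_SHADER_RESOURCE_VIEW_DESC question can be sketched as follows: CreateShaderResourceView takes the texture interface itself, not a pointer to raw bits. This is only a sketch, assuming the texture was created with the D3D11_BIND_SHADER_RESOURCE flag; `device` and `videoTex` are placeholders for objects created elsewhere.

    ```cpp
    // Hedged sketch: creating a shader resource view from an existing
    // ID3D11Texture2D. CreateShaderResourceView receives the resource
    // interface itself, not the raw pixel bits.
    D3D11_TEXTURE2D_DESC texDesc;
    videoTex->GetDesc(&texDesc);

    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format = texDesc.Format;   // e.g. DXGI_FORMAT_B8G8R8A8_UNORM
    srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    srvDesc.Texture2D.MostDetailedMip = 0;
    srvDesc.Texture2D.MipLevels = texDesc.MipLevels;

    ID3D11ShaderResourceView* srv = nullptr;
    HRESULT hr = device->CreateShaderResourceView(videoTex, &srvDesc, &srv);
    // On success, `srv` can be bound with PSSetShaderResources.
    ```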

    Thanks much,


    Windows Media SDK Technologies - Microsoft Developer Services -

    9 March 2012 02:54
  • Yes, you understood correctly. I'm using Media Foundation to decode the video, and I want to render the video frames to a D3D texture. In the decode procedure I get an ID3D11Texture2D interface that may be used to store the video data. I would like a sample, like the DirectShow one, that renders video to a texture.

    Thank you for your help.

    Looking forward to your reply.

    9 March 2012 05:20
  • Hello zjubaoli,

    I spoke with our DX expert today and he honestly didn't remember how to pass a texture to the PSSetShaderResources method. However, he did say that the technique was detailed in one of our DX for Metro style apps samples. I have included a reference to the sample below. Hopefully you will be able to reverse engineer the sample and understand how to do this in your code. If you have any specific questions once you review the sample, please let me know and I will do what I can to help.
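    In the meantime, the binding step itself can be sketched like this (a sketch only, assuming `context`, `srv`, and `sampler` were created elsewhere with the usual D3D11 calls):

    ```cpp
    // Sketch of handing a texture view to the pixel shader stage.
    ID3D11ShaderResourceView* views[1] = { srv };
    context->PSSetShaderResources(0, 1, views);   // slot t0 in HLSL

    ID3D11SamplerState* samplers[1] = { sampler };
    context->PSSetSamplers(0, 1, samplers);       // slot s0 in HLSL

    // The pixel shader then samples the video frame, e.g.:
    //   Texture2D videoTex : register(t0);
    //   SamplerState samp  : register(s0);
    //   float4 PS(float2 uv : TEXCOORD0) : SV_Target
    //   { return videoTex.Sample(samp, uv); }
    ```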

    Lesson5.Components. This tutorial sample takes the concepts from the previous four lessons and demonstrates how to separate them into discrete code objects for reuse.

    I hope this helps,


    Windows Media SDK Technologies - Microsoft Developer Services -

    10 March 2012 00:30
  • Thank you for your quick reply; I now know how to pass a texture to PSSetShaderResources. Actually, I found that when I use Media Foundation to decode the video, I should create a 2D texture array by calling ID3D11Device::CreateTexture2D with bind flags that include D3D11_BIND_DECODER, but texture arrays created with the D3D11_BIND_DECODER flag cannot be used in calls to ID3D11Device::CreateShaderResourceView. This restriction is documented on MSDN.

    So, if I can't use CreateShaderResourceView, I can't generate a texture view to pass to PSSetShaderResources. How can I do it?
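    One commonly used workaround (a sketch under assumptions, not the official answer from this thread) is to copy the decoded array slice into a second texture that was created with D3D11_BIND_SHADER_RESOURCE, and create the view on that copy. `device`, `context`, `decoderTexArray`, and `frameIndex` are placeholders.

    ```cpp
    // Since D3D11_BIND_DECODER textures cannot back an SRV, copy the
    // decoded frame into an SRV-bindable texture of the same format.
    D3D11_TEXTURE2D_DESC desc;
    decoderTexArray->GetDesc(&desc);
    desc.ArraySize = 1;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags = 0;

    ID3D11Texture2D* sharedTex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &sharedTex);

    // Copy array slice `frameIndex` (the surface the decoder wrote)
    // into subresource 0 of the new texture.
    UINT srcSub = D3D11CalcSubresource(0, frameIndex, desc.MipLevels);
    context->CopySubresourceRegion(sharedTex, 0, 0, 0, 0,
                                   decoderTexArray, srcSub, nullptr);

    // sharedTex can now be passed to CreateShaderResourceView.
    ```

    Note that DXVA decoder output is usually NV12, so a real implementation may also need a color-space conversion (for example with the D3D11 video processor APIs) before the frame can be sampled as RGB.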
    12 March 2012 02:44
  • Hello zjubaoli,

    I honestly don't know the answer to this. Please give me a couple of days and I'll try to track someone down that might know the answer. I'll let you know what I find out.



    Windows Media SDK Technologies - Microsoft Developer Services -

    15 March 2012 00:10
  • Hi James

     Thank you very much. I'm looking forward to your reply.

    16 March 2012 00:52
  • Hello zjubaoli,

    I am still having problems tracking down an expert to help us with this one. I've got a few conversations started and I hope that I will hear something back next week. I will let you know what I find.



    Windows Media SDK Technologies - Microsoft Developer Services -

    16 March 2012 23:14
  • There seems to be some massive confusion and miscommunication in this thread.

    I think Zjuaboli simply wants to "render a video to texture, and [use] the texture ... [on] a 3D mesh".

    The latter part is just basic Direct3D stuff. Rendering a video into a texture seems to be the main problem.

    I think the source of the confusion is the document you referenced. As James said, that page is "about how to write a custom DXVA enabled decoder". Even though the document talks about Media Foundation and Direct3D 11, it really has nothing directly to do with what you (Zjuaboli) want to do. You are not trying to write a Media Foundation Transform; none of that code is applicable to what you are trying to do.

    In order to render video to a texture, you should be using the IMFMediaEngine interface.

    The following sample code shows you how to use IMFMediaEngine to get an ID3D11Texture2D.
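    The sample code itself did not survive in this copy of the thread, but the setup it described can be sketched roughly as follows (a sketch, assuming `d3dDevice` is your ID3D11Device and `notifyCallback` is your IMFMediaEngineNotify implementation; error handling omitted):

    ```cpp
    // Create the Media Engine with a DXGI device manager so it can
    // deliver decoded frames as D3D11 textures.
    UINT token = 0;
    IMFDXGIDeviceManager* dxgiMgr = nullptr;
    MFCreateDXGIDeviceManager(&token, &dxgiMgr);
    dxgiMgr->ResetDevice(d3dDevice, token);

    IMFAttributes* attr = nullptr;
    MFCreateAttributes(&attr, 3);
    attr->SetUnknown(MF_MEDIA_ENGINE_DXGI_MANAGER, dxgiMgr);
    attr->SetUINT32(MF_MEDIA_ENGINE_VIDEO_OUTPUT_FORMAT,
                    DXGI_FORMAT_B8G8R8A8_UNORM);
    attr->SetUnknown(MF_MEDIA_ENGINE_CALLBACK, notifyCallback);

    IMFMediaEngineClassFactory* factory = nullptr;
    CoCreateInstance(CLSID_MFMediaEngineClassFactory, nullptr,
                     CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&factory));

    IMFMediaEngine* engine = nullptr;
    factory->CreateInstance(MF_MEDIA_ENGINE_REAL_TIME_MODE,
                            attr, &engine);
    ```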

    17 March 2012 04:45
  • Thank you for helping me. It shows me one way to render a video to a texture.

    Is it the only way to render a video to a texture?

    I don't want to use IMFMediaEngine, because I want to decode the video myself. So I think I can use DXVA to decode the video to a buffer or texture, and then use the texture on a 3D mesh. Is that the right way? How can I do it?

    Thank you for your help

    21 March 2012 08:50
  • Hello zjubaoli,

    I would recommend that you use the Media Engine and write an MFT that is DXVA2 aware. This MFT will automatically plug into the Media Engine if it is registered properly when your application starts. You can then use the Media Engine in frame server mode to grab the DX surface and texture and add it to your mesh.
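    The frame-server step mentioned above can be sketched like this (a sketch only, assuming `engine` is an IMFMediaEngine created with a DXGI device manager, and `frameTex` is your own ID3D11Texture2D created with D3D11_BIND_SHADER_RESOURCE and the dimensions `width` x `height`):

    ```cpp
    // Per rendered frame: ask the engine whether a new video frame is
    // ready, and if so blit it into our own texture.
    LONGLONG pts = 0;
    if (engine->OnVideoStreamTick(&pts) == S_OK) {
        MFVideoNormalizedRect src = { 0.0f, 0.0f, 1.0f, 1.0f };
        RECT dst = { 0, 0, (LONG)width, (LONG)height };
        MFARGB border = { 0, 0, 0, 255 };
        engine->TransferVideoFrame(frameTex, &src, &dst, &border);
        // frameTex can now be viewed with CreateShaderResourceView and
        // mapped onto the mesh like any other texture.
    }
    ```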

    Unfortunately we don't have any samples that do exactly what you are trying to do but this list might help you:

    Media capture using webcam sample (shows how to write an MFT)

    Media extensions sample (how to define a custom scheme handler)

    Media engine native C++ playback sample (previously referenced sample)

    I hope this helps,


    Windows Media SDK Technologies - Microsoft Developer Services -

    23 March 2012 01:01
  • In the next Win8 version, could you provide a sample of what I described?

    Thanks very much for your help

    23 March 2012 07:15
  • hello Zjubaoli,

    Have you solved the problem? I've got the same question.



    one work one gain!

    17 September 2012 05:23
  • hello Zjubaoli and Jackic,

    Have you solved the problem? I've got the same question.



    20 June 2013 20:29
  • I'm definitely interested in how to do this too.

    Peace Love Happiness PLH Only One Earth OOE

    30 June 2013 20:38
  • Hi!
    I have the same issue as the thread creator, but for Windows Phone 8.
    A lot of features are not implemented there (such as the Source Resolver, ByteStreamHandler, etc.), so how can I use the Media Engine to decode H264 frames and get a texture from them?

    If I can't use the Media Engine, how can I use the DXVA VideoDecoder for that?
    12 August 2013 08:38