locked
How to render raw video frames using WinRT APIs?

    Question

  • Hi,

    I am developing a video VoIP application using WinRT in C++.

    I am receiving video packets from the network in real time and decoding them to get raw video frames. Now I need to render them on the screen frame by frame. Which WinRT API should I use to render them to the screen?

    I read on the net that one way to do this is by using the Enhanced Video Renderer, but most of its APIs are listed as "desktop apps only" on MSDN.

    Please help me.

    Thanks in advance


    • Edited by abwin8 Thursday, October 11, 2012 7:24 PM
    Thursday, October 11, 2012 12:47 PM

Answers

  • Just create a project from the Direct3D App template in Visual Studio and take a look at the code.

    Take a look at the Media engine native C++ video playback sample too.

    ComPtr<ID3D11Texture2D> spTextureDst;
    m_spDX11SwapChain->GetBuffer(0, IID_PPV_ARGS(&spTextureDst));
    m_spMediaEngine->TransferVideoFrame(spTextureDst.Get(), nullptr, &m_rcTarget, &m_bkgColor);
    m_spDX11SwapChain->Present(1, 0);

    The code above is the key piece, in which we get the video frame from the IMFMediaEngine object.

    Another way, shown below, is for the case where you already have the bitmap data of the video frame.

    ID3D11Texture2D *pTexture;
    BYTE *bitmapData = new BYTE[bitmapDataSize];
    // Copy your buffer with the frame data into bitmapData, e.g. using memcpy.

    D3D11_SUBRESOURCE_DATA frameData;
    frameData.pSysMem = bitmapData;
    frameData.SysMemPitch = pitch; // row pitch: frameWidth multiplied by the pixel size in bytes
    frameData.SysMemSlicePitch = 0;

    D3D11_TEXTURE2D_DESC texDesc;
    texDesc.Width = frameWidth;
    texDesc.Height = frameHeight;
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.SampleDesc.Quality = 0;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
    texDesc.MiscFlags = 0;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.CPUAccessFlags = 0;

    d3dDevice->CreateTexture2D(&texDesc, &frameData, &pTexture);
    // That's it: now you have a texture with your frame,
    // and you can use it in your rendering code; see the samples I proposed above.





    • Edited by hellobody Thursday, October 11, 2012 9:26 PM some additions
    • Marked as answer by Jesse Jiang Wednesday, October 24, 2012 7:27 AM
    Thursday, October 11, 2012 8:32 PM

All replies

  • Hi! I think you misunderstood the TransferVideoFrame method. What do you mean by having the frame in hand? The IMFMediaEngine::TransferVideoFrame method can receive a pointer to an ID3D11Texture2D object and fill it with the current video frame. Then you can do anything you want with that texture; for example, you can set it as a shader resource. To my mind, MediaEngine is used first of all to get the video data, and after that you can use it any way you want.

    But if you already have the frame's bitmap data, I think you do not need to use MediaEngine at all. You can just create an ID3D11Texture2D object using the ID3D11Device::CreateTexture2D method and pass the frame you have into it as the second parameter (D3D11_SUBRESOURCE_DATA).

    Thursday, October 11, 2012 7:22 PM
  • Hey, you completely changed your question! My reply does not make any sense now :(
    Thursday, October 11, 2012 7:42 PM
  • Hi Hellobody,

    Thanks for your reply. I had misunderstood the use of the TransferVideoFrame API. I have now changed my question to reflect the real issue I am facing!

    From your comments it looks like ID3D11Device::CreateTexture2D itself is an API that can "render" a frame onto the screen?? Is that the case?

    Or is there some other API that I need to call to do the rendering, after calling CreateTexture2D?

    Thanks.

    Thursday, October 11, 2012 7:42 PM
  • No. I was assuming you are familiar with Direct3D technology. Once you have a texture, you can render it.

    "ID3D11Device::CreateTexture2D itself is an API that can "render" a frame onto the screen??" That is a wrong thought.

    Using the ID3D11Device::CreateTexture2D method you can create a texture and fill it with the data you have.

    Thursday, October 11, 2012 7:56 PM
  • Okay, I get it now. But what is the render API that I should use once I create the texture and fill it with my data?

    I have been looking on MSDN for this API for quite a while now and the search is going in loops. Your help is much appreciated.

    Thursday, October 11, 2012 8:00 PM
  • Just create a project from the Direct3D App template in Visual Studio and take a look at the code.

    Take a look at the Media engine native C++ video playback sample too.

    ComPtr<ID3D11Texture2D> spTextureDst;
    m_spDX11SwapChain->GetBuffer(0, IID_PPV_ARGS(&spTextureDst));
    m_spMediaEngine->TransferVideoFrame(spTextureDst.Get(), nullptr, &m_rcTarget, &m_bkgColor);
    m_spDX11SwapChain->Present(1, 0);

    The code above is the key piece, in which we get the video frame from the IMFMediaEngine object.

    Another way, shown below, is for the case where you already have the bitmap data of the video frame.

    ID3D11Texture2D *pTexture;
    BYTE *bitmapData = new BYTE[bitmapDataSize];
    // Copy your buffer with the frame data into bitmapData, e.g. using memcpy.

    D3D11_SUBRESOURCE_DATA frameData;
    frameData.pSysMem = bitmapData;
    frameData.SysMemPitch = pitch; // row pitch: frameWidth multiplied by the pixel size in bytes
    frameData.SysMemSlicePitch = 0;

    D3D11_TEXTURE2D_DESC texDesc;
    texDesc.Width = frameWidth;
    texDesc.Height = frameHeight;
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.SampleDesc.Quality = 0;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
    texDesc.MiscFlags = 0;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.CPUAccessFlags = 0;

    d3dDevice->CreateTexture2D(&texDesc, &frameData, &pTexture);
    // That's it: now you have a texture with your frame,
    // and you can use it in your rendering code; see the samples I proposed above.





    • Edited by hellobody Thursday, October 11, 2012 9:26 PM some additions
    • Marked as answer by Jesse Jiang Wednesday, October 24, 2012 7:27 AM
    Thursday, October 11, 2012 8:32 PM
  • "But what is the render API that I should use once I create the texture and fill it with my data?"

    In the case I suggested, that is the Direct3D API itself.

    Thursday, October 11, 2012 8:54 PM
  • Do we have to call CreateTexture2D again for each frame?
    Tuesday, November 20, 2012 7:41 AM
  • Hello, can you render each frame with CreateTexture2D?

    What about performance?

    Did you debug it on the Surface?

    Tuesday, November 20, 2012 7:52 AM