DirectX & Media Foundation [UWP|WIN32] IMFMediaBuffer/IMF2DBuffer to ID3D11Texture2D

  • Question

  • Hi.

      I decided to integrate a webcam (IMFSourceReaderCallback) into my engine, working from the MFCaptureD3D sample in the Windows samples, but I'm confused about two things.

    1. In my engine, everything is synchronized through CRXMaterial::OnTick. For example, if you take IMFMediaEngine and ask it to play a video (mp4/avi), you get the ID3D11Texture2D1 from the ID3D11ShaderResourceView1, pass it to IMFMediaEngine::TransferVideoFrame, and your video frame is copied into your texture.

    Now, with IMFSourceReaderCallback, you must draw the frame inside IMFSourceReaderCallback::OnReadSample. To respect the OnTick philosophy, I had an idea: buffer the IMFMediaBuffer inside IMFSourceReaderCallback::OnReadSample, then take it back inside CRXMaterial::OnTick and transfer it into the texture resource.

    2. The problem comes from the samples, which use IDirect3DSurface9 and the YUV color space. In short: how do I transfer an IMF2DBuffer to an ID3D11Texture2D?
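    A row-by-row copy is one way to approach question 2, assuming the buffer has already been converted to RGB32 and the target texture is CPU-writable (e.g. DXGI_FORMAT_B8G8R8A8_UNORM with D3D11_USAGE_DYNAMIC). This is only a sketch: `p2DBuffer`, `pContext`, `pTexture`, `width` and `height` are placeholder names, and error handling is trimmed.

    ```cpp
    // Copy a CPU-side IMF2DBuffer into a D3D11 texture scanline by scanline.
    BYTE* pScan0 = NULL;
    LONG srcStride = 0;
    HRESULT hr = p2DBuffer->Lock2D(&pScan0, &srcStride);
    if (SUCCEEDED(hr))
    {
        D3D11_MAPPED_SUBRESOURCE mapped = {};
        hr = pContext->Map(pTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);
        if (SUCCEEDED(hr))
        {
            BYTE* pDst = (BYTE*)mapped.pData;
            const BYTE* pSrc = pScan0;
            const UINT rowBytes = width * 4;  // 4 bytes per pixel for RGB32
            for (UINT y = 0; y < height; ++y)
            {
                // Strides may differ, so copy one row at a time.
                memcpy(pDst, pSrc, rowBytes);
                pDst += mapped.RowPitch;
                pSrc += srcStride;            // srcStride can be negative (bottom-up image)
            }
            pContext->Unmap(pTexture, 0);
        }
        p2DBuffer->Unlock2D();
    }
    ```

    The key point is that the source stride and the mapped RowPitch are independent values, so a single memcpy of the whole surface is only safe when they happen to match.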

    OnReadSample

    //***************************************************************************
    //* Class name    : CRXWebCam
    //* Output        : HRESULT
    //* Function name : OnReadSample
    //* Description   : 
    //* Input         : HRESULT hrStatus
    //*                 DWORD dwStreamIndex
    //*                 DWORD dwStreamFlags
    //*                 LONGLONG llTimestamp
    //*                 IMFSample* pSample
    //***************************************************************************
    HRESULT CRXWebCam::OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample* pSample)
    {
        HRESULT hr = S_OK;
        
        IMFMediaBuffer* pBuffer = NULL;
    
        if (FAILED(hrStatus))
            hr = hrStatus;
    
        if (SUCCEEDED(hr))
        {
            if (pSample)
            {
                hr = pSample->GetBufferByIndex(0, &pBuffer);

                // Buffer the frame for OnTick.
                // Release the previous frame first, or it leaks.
                if (SUCCEEDED(hr))
                {
                    if (m_pBuffer) m_pBuffer->Release();
                    m_pBuffer = pBuffer;
                    m_pBuffer->AddRef();
                }

                // Draw the frame.
            }
        }
    
        if (SUCCEEDED(hr))
            hr = m_pReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);
        
        if (pBuffer) pBuffer->Release();
        pBuffer = NULL;
        
        return hr;
    }

    OnTick

    //***************************************************************************
    // Class name     : CRXMaterial
    // Function name  : OnTick
    // Description    : Per-frame material update
    //***************************************************************************
    void CRXMaterial::OnTick()
    {
    	/*IDXGIFactory4* spFactory;
    	IDXGIAdapter1* spAdapter;
    	IDXGIOutput* spOutput;
    
    	ThrowIfFailed(CreateDXGIFactory2(0, __uuidof(IDXGIFactory4), (void**)&spFactory));
    	ThrowIfFailed(spFactory->EnumAdapters1(0, &spAdapter));
    	ThrowIfFailed(spAdapter->EnumOutputs(0, &spOutput));*/
    	
    	if (m_pMediaEngine)
    	{
    		if (/*SUCCEEDED(spOutput->WaitForVBlank()) && this && */!m_pMediaEngine->m_spMediaEngine->IsPaused())
    		{
    			ID3D11Texture2D1* Texture;
    			m_pTextureView->GetResource((ID3D11Resource**)&Texture);
    			
    			DWORD SizeX;
    			DWORD SizeY;
    			m_pMediaEngine->m_spMediaEngine->GetNativeVideoSize(&SizeX, &SizeY);
    
    			RECT Rect = { 0,0, (LONG)SizeX, (LONG)SizeY };
    			MFVideoNormalizedRect NormRect = { 0.0f, 0.0f, 1.0f, 1.0f };
    			MFARGB BackColor = { 0, 0, 0, 255 };
    
    			m_pMediaEngine->TransferFrame(Texture, NormRect, Rect, BackColor);
    			Texture->Release();
    		}
    	}
    
    	if (m_pWebCamEngine)   //HERE
    	{			
    		if (m_pWebCamEngine->m_pBuffer)
    		{
    			ID3D11Texture2D1* Texture;
    			m_pTextureView->GetResource((ID3D11Resource**)&Texture);
    
    			IMF2DBuffer* m_p2DBuffer = NULL;
    			//m_pWebCamEngine->m_pBuffer->QueryInterface(IID_PPV_ARGS(&m_p2DBuffer));
    			m_pWebCamEngine->m_pBuffer->QueryInterface(IID_IMF2DBuffer, (void**)&m_p2DBuffer);
    
    			BYTE* ppbScanLine0;
    			LONG plStride;
    
    			m_p2DBuffer->Lock2D(&ppbScanLine0, &plStride);
    
    			//YUV to RGB???
    
    			/*for (DWORD y = 0; y < m_Height; y++)
    			{
    				RGBQUAD* pDestPel = (RGBQUAD*)mapped.pData;
    				WORD* pSrcPel = (WORD*)ppbScanLine0;
    
    				for (DWORD x = 0; x < 640; x += 2)
    				{
    					// Byte order is U0 Y0 V0 Y1
    
    					int y0 = (int)LOBYTE(pSrcPel[x]);
    					int u0 = (int)HIBYTE(pSrcPel[x]);
    					int y1 = (int)LOBYTE(pSrcPel[x + 1]);
    					int v0 = (int)HIBYTE(pSrcPel[x + 1]);
    
    					pDestPel[x] = ConvertYCrCbToRGB(y0, v0, u0);
    					pDestPel[x + 1] = ConvertYCrCbToRGB(y1, v0, u0);
    				}
    
    				ppbScanLine0 += plStride;
    				//mapped.pData += mapped.RowPitch;
    			}*/
    
    			//m_pDirect3D->GetD3DDeviceContext()->Unmap(Texture, NULL);
    			
    			m_p2DBuffer->Unlock2D();
    			m_p2DBuffer->Release();
    
    			Texture->Release();
    
    			if (m_pWebCamEngine->m_pBuffer) m_pWebCamEngine->m_pBuffer->Release();
    			m_pWebCamEngine->m_pBuffer = NULL;
    		}
    	}
    
    	/*spFactory->Release();
    	spAdapter->Release();
    	spOutput->Release();*/
    }
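    The commented-out loop above assumes a YUY2 source and a ConvertYCrCbToRGB helper like the one in the MFCaptureD3D sample. A minimal, self-contained sketch of such a helper (one common integer BT.601 formulation; the struct name `RGBQUADx` is a stand-in for the Win32 RGBQUAD):

    ```cpp
    #include <algorithm>
    #include <cstdint>

    // Mirrors the Win32 RGBQUAD layout (blue, green, red, reserved/alpha).
    struct RGBQUADx { uint8_t b, g, r, a; };

    static uint8_t Clamp8(int v)
    {
        return (uint8_t)std::min(255, std::max(0, v));
    }

    // BT.601 video-range YCbCr -> RGB using fixed-point integer math.
    static RGBQUADx ConvertYCrCbToRGB(int y, int cr, int cb)
    {
        const int c = y - 16, d = cb - 128, e = cr - 128;
        RGBQUADx q;
        q.r = Clamp8((298 * c + 409 * e + 128) >> 8);
        q.g = Clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
        q.b = Clamp8((298 * c + 516 * d + 128) >> 8);
        q.a = 255;
        return q;
    }
    ```

    As a sanity check, video-range black (16, 128, 128) maps to (0, 0, 0) and video-range white (235, 128, 128) maps to (255, 255, 255).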


    Monday, December 9, 2019 5:13 PM

All replies

  • You shouldn't have to cache the image.  The OnReadSample callback merely tells you that a new sample is available.  You can read it any time after that.  So, you could just set a flag saying "a sample is available", and then call ReadSample during your OnTick call.

    So, what's your question?  To complete your OnTick, you need to know the format of the samples you're getting from the reader, and the format of the texture surface.  We can't tell you that.  If you know the format of the texture, you ought to be able to call IMFSourceReader::SetCurrentMediaType and have the reader deliver exactly what you need.
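    For illustration, requesting RGB32 from the reader might look like the sketch below. This is not from the thread; `m_pReader` is the poster's IMFSourceReader, and error handling is trimmed.

    ```cpp
    // Ask the source reader to deliver RGB32 so no manual YUV conversion is
    // needed; the reader inserts a converter when the camera's native format
    // differs.
    IMFMediaType* pType = NULL;
    HRESULT hr = MFCreateMediaType(&pType);
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    if (SUCCEEDED(hr))
        hr = pType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
    if (SUCCEEDED(hr))
        hr = m_pReader->SetCurrentMediaType(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pType);
    if (pType) pType->Release();
    ```

    Depending on the camera, the reader may also need MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING set on its attributes before conversion to RGB32 succeeds.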


    Tim Roberts | Driver MVP Emeritus | Providenza & Boekelheide, Inc.

    Monday, December 9, 2019 10:03 PM
  • Thanks for reply.

    1. I tried calling ReadSample inside CRXMaterial::OnTick, but unfortunately pSample was always NULL, so I couldn't read my sample.

    m_pReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, &pSample);

    2. It's now working, with a few bugs:

    • I get the full image in the Debug build but horizontal interlacing in the Release build. (I must check the image format.)
    • It only works with one object at a time. (The buffer seems to be released after the first object renders. I also tried moving the release into OnReadSample, but I got instability and crashes.)
    • The texture doesn't stretch or resize. (The texture behind the shader resource view is full HD and the webcam frame is smaller, so I can't downsize the shader-view texture with CopyResource, which requires matching sizes and formats.)
    • I still need to get the attributes from IMFSourceReader (height, width, scanline and stride) to perform a proper blit for each different shader-view texture conversion.
    • Given that pSample is always NULL, plus Map/Unmap optimization and multiple-object display: would it be more efficient to convert into the ID3D11Texture2D1 inside OnReadSample() and resize inside OnTick()? It depends on whether I can use ReadSample inside OnTick.
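    The one-object-at-a-time instability described above is the classic symptom of two threads sharing one raw buffer pointer: OnReadSample runs on a Media Foundation worker thread while OnTick runs on the render thread. A minimal, self-contained sketch of a locked single-slot handover (the class and member names are illustrative, not from the thread; T is any COM-style type with AddRef/Release, such as IMFMediaBuffer):

    ```cpp
    #include <mutex>

    // Single-slot frame handover between the MF callback thread (producer)
    // and the render thread (consumer).
    template <typename T>
    class FrameSlot
    {
    public:
        // Producer side (OnReadSample): replace the pending frame.
        void Store(T* p)
        {
            std::lock_guard<std::mutex> lock(m_lock);
            if (p) p->AddRef();
            if (m_pending) m_pending->Release();  // drop the stale frame
            m_pending = p;
        }

        // Consumer side (OnTick): take ownership of the pending frame, or
        // get nullptr if no new frame has arrived. The caller releases.
        T* Take()
        {
            std::lock_guard<std::mutex> lock(m_lock);
            T* p = m_pending;
            m_pending = nullptr;                  // each Take() empties the slot
            return p;
        }

    private:
        std::mutex m_lock;
        T* m_pending = nullptr;
    };
    ```

    With this shape, each consumer that calls Take() owns its own reference, so one material releasing the frame can no longer invalidate the pointer another material is still reading.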

    Release Version: (screenshot)

    Debug Version: (screenshot)


    3. I haven't worked with IMFMediaType yet, but if I can call IMFSourceReader::SetCurrentMediaType with the right format, will I be able to cut the YUV-to-RGB conversion code out of my blit procedure? I hope so.

    ACTUAL CODE

    CRXWebCam.h (trying to buffer the topmost IMFSample -- it never worked)

    HRESULT CRXWebCam::OnReadSample(HRESULT hrStatus, DWORD dwStreamIndex, DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample* pSample)
    {
        HRESULT hr = S_OK;
        
        if (FAILED(hrStatus))
            hr = hrStatus;
    
        if (SUCCEEDED(hr))
        {
            if (pSample)
            {
                // Release the previous frame before storing the new one, or it leaks.
                if (m_pBuffer) m_pBuffer->Release();
                hr = pSample->GetBufferByIndex(0, &m_pBuffer);
                // Draw the frame.
            }
        }
    
        if (SUCCEEDED(hr))
            hr = m_pReader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);
      
        return hr;
    }

    CRXMaterial.h

    if (m_pWebCamEngine) //HERE
    {
        if (m_pWebCamEngine->m_pBuffer)
        {
            ID3D11Texture2D1* Texture;
            m_pTextureView->GetResource((ID3D11Resource**)&Texture);

            IMF2DBuffer2* m_p2DBuffer = NULL;
            //m_pWebCamEngine->m_pBuffer->QueryInterface(IID_PPV_ARGS(&m_p2DBuffer));
            m_pWebCamEngine->m_pBuffer->QueryInterface(IID_IMF2DBuffer, (void**)&m_p2DBuffer);

            BYTE* ppbScanLine0;
            LONG plStride;
            m_p2DBuffer->Lock2D(&ppbScanLine0, &plStride);

            D3D11_TEXTURE2D_DESC1 description;
            Texture->GetDesc1(&description);
            //m_pWebCamEngine->m_pReader->SetCurrentMediaType(0, 0, pMediaType)
            //description.Width = 640;
            //description.Height = 480;
            description.BindFlags = 0;
            description.CPUAccessFlags = D3D11_CPU_ACCESS_READ | D3D11_CPU_ACCESS_WRITE;
            description.Usage = D3D11_USAGE_STAGING;

            ID3D11Texture2D1* texTemp = NULL;
            m_pDirect3D->GetD3DDevice()->CreateTexture2D1(&description, NULL, &texTemp);

            D3D11_MAPPED_SUBRESOURCE mapped = D3D11_MAPPED_SUBRESOURCE();
            m_pDirect3D->GetD3DDeviceContext()->Map(texTemp, 0, D3D11_MAP_READ_WRITE, 0, &mapped);

            //YUV to RGB???
            for (DWORD y = 0; y < 480; y++)
            {
                RGBQUAD* pDestPel = (RGBQUAD*)mapped.pData;
                WORD* pSrcPel = (WORD*)ppbScanLine0;

                for (DWORD x = 0; x < 640; x += 2)
                {
                    // Byte order is U0 Y0 V0 Y1
                    int y0 = (int)LOBYTE(pSrcPel[x + (y * 640)]);
                    int u0 = (int)HIBYTE(pSrcPel[x + (y * 640)]);
                    int y1 = (int)LOBYTE(pSrcPel[x + (y * 640) + 1]);
                    int v0 = (int)HIBYTE(pSrcPel[x + (y * 640) + 1]);

                    pDestPel[x + (y * description.Width)] = ConvertYCrCbToRGB(y0, v0, u0);
                    pDestPel[x + (y * description.Width) + 1] = ConvertYCrCbToRGB(y1, v0, u0);
                }

                pSrcPel += plStride;
                const int pitch = mapped.RowPitch;
                pDestPel += pitch;
            }

            m_pDirect3D->GetD3DDeviceContext()->Unmap(texTemp, 0);
            m_pDirect3D->GetD3DDeviceContext()->CopyResource(Texture, texTemp);

            m_p2DBuffer->Unlock2D();
            m_p2DBuffer->Release();
            Texture->Release();
            texTemp->Release();

            if (m_pWebCamEngine->m_pBuffer) m_pWebCamEngine->m_pBuffer->Release();
            m_pWebCamEngine->m_pBuffer = NULL;
        }
    }

    I know the code is catastrophic for the moment... I'm trying to keep the example simple enough to understand. I have read something about flushing the source reader, but it doesn't work for me; I get a lot of freezes.

    Thanks again.

    I must learn more about IMFMediaType.



    Wednesday, December 11, 2019 4:23 PM
    Well, after reading this...

    OnRead and AsyncMode

    ...I have an update.


    Asynchronous Mode

    In asynchronous mode:

    • All of the [out] parameters must be NULL. Otherwise, the method returns E_INVALIDARG.
    • The method returns immediately.
    • When the operation completes, the application's IMFSourceReaderCallback::OnReadSample method is called.
    • If an error occurs, the method can fail either synchronously or asynchronously. Check the return value of ReadSample, and also check the hrStatus parameter of IMFSourceReaderCallback::OnReadSample.

    Synchronous Mode

    • The pdwStreamFlags and ppSample parameters cannot be NULL. Otherwise, the method returns E_POINTER.
    • The pdwActualStreamIndex and pllTimestamp parameters can be NULL.
    • The method blocks until the next sample is available.

    I understand the error now...

    OnReadSample is an asynchronous callback, while my CRXMaterial::OnTick acts more like a synchronous function.

    If I want to go back to synchronous mode, I need to:

    • Stop deriving my CRXWebCam from IMFSourceReaderCallback
    • Remove the overridden functions like AddRef, OnReadSample, and so on
    • Remove pAttributes->SetUnknown(MF_SOURCE_READER_ASYNC_CALLBACK, this);
    • Use the reader the old-fashioned, synchronous way.
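    In synchronous mode, the per-frame call could then look like the sketch below (illustrative, error handling trimmed; this only works if the reader was created without MF_SOURCE_READER_ASYNC_CALLBACK):

    ```cpp
    // Synchronous read: blocks until the next frame is available, so the
    // [out] parameters must be supplied (pdwStreamFlags and ppSample are
    // required; index and timestamp are optional).
    DWORD streamFlags = 0;
    LONGLONG timestamp = 0;
    IMFSample* pSample = NULL;
    HRESULT hr = m_pReader->ReadSample(
        (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
        0,              // control flags
        NULL,           // actual stream index (optional)
        &streamFlags,   // required in synchronous mode
        &timestamp,     // optional
        &pSample);      // required in synchronous mode
    if (SUCCEEDED(hr) && pSample)
    {
        // ... copy the sample's buffer into the texture ...
        pSample->Release();
    }
    ```

    The blocking behavior is the trade-off: OnTick will stall until the camera delivers a frame, which may or may not be acceptable for the engine's frame loop.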


    • Proposed as answer by Jeffrey Shao Friday, January 3, 2020 6:57 AM
    Thursday, December 12, 2019 6:21 AM