locked
How to allocate uncompressed buffers with ID3D11Device::CreateTexture2D?

    Question

  • Hello everyone,

    I'm writing a video decoder MFT based on D3D11 and DXVA, referring to this MS document: http://msdn.microsoft.com/en-us/library/windows/desktop/hh162912(v=vs.85).aspx. When I try to create a 2D texture array to store video buffers by calling ID3D11Device::CreateTexture2D, the call always fails with E_INVALIDARG. Does anyone know how to create an ID3D11Texture2D interface correctly?

    My code is as follows:

    ....

    ComPtr<ID3D11Device> pD3DDevice;
    ComPtr<ID3D11Texture2D> pInTex;
    D3D11_TEXTURE2D_DESC textureDesc = {0};
    textureDesc.Width = m_imageWidthInPixels;
    textureDesc.Height = m_imageHeightInPixels;
    textureDesc.Format = DXGI_FORMAT_420_OPAQUE;
    textureDesc.Usage = D3D11_USAGE_DYNAMIC;
    textureDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    textureDesc.MiscFlags = 0;
    textureDesc.MipLevels = 1;
    textureDesc.ArraySize = 1;
    textureDesc.SampleDesc.Count = 1;
    textureDesc.SampleDesc.Quality = 0;
    textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_DECODER;

    hr = pD3DDevice->CreateTexture2D(&textureDesc, nullptr, &pInTex);
    if (FAILED(hr))
    {
        return -1;
    }


    ....

    Could you give me a hand? Thanks.

    Jackic


    one work one gain!


    • Edited by Jackic, Friday, September 07, 2012 9:46 AM
    Friday, September 07, 2012 9:07 AM

Answers

  • 1: A DXGI_FORMAT_420_OPAQUE (420O) texture can't be created with D3D11_CPU_ACCESS_WRITE. Because 420O is an opaque format, the application does not know the actual format (bit layout), so the texture can be accessed only by the GPU; no CPU access (read or write) is allowed.

    2: Following from 1, a 420O texture must be bound to at least one GPU pipeline stage; that is, BindFlags can't be 0.

    3: Following from 1 and 2, a 420O texture must be created with D3D11_USAGE_DEFAULT.

    4: This depends on your hardware, but mostly 420O can't be bound to the 3D pipeline, so binding SHADER_RESOURCE (a 3D-pipeline input) is not allowed with 420O on many hardware/driver combinations. You can check this with the ID3D11Device::CheckFormatSupport API.

    5: Also depending on hardware support, creating a texture array in 420O requires the DECODER binding. For example, at the 9_x feature levels, the DECODER binding must be specified for a texture array.
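    Taken together, points 1, 2, 3, and 5 amount to a few checks you can make on the texture description before calling CreateTexture2D. The sketch below is a portable illustration only: `TexDesc` and `Valid420ODesc` are hypothetical stand-ins (not the real D3D11 API), and the constants mirror the corresponding values in d3d11.h; point 4 requires a live device and ID3D11Device::CheckFormatSupport, so it is left out.

```cpp
#include <cassert>
#include <cstdint>

// Constants below mirror d3d11.h values (assumption: check your SDK headers).
constexpr uint32_t USAGE_DEFAULT        = 0;       // D3D11_USAGE_DEFAULT
constexpr uint32_t USAGE_DYNAMIC        = 2;       // D3D11_USAGE_DYNAMIC
constexpr uint32_t BIND_SHADER_RESOURCE = 0x8;     // D3D11_BIND_SHADER_RESOURCE
constexpr uint32_t BIND_DECODER         = 0x200;   // D3D11_BIND_DECODER
constexpr uint32_t CPU_ACCESS_WRITE     = 0x10000; // D3D11_CPU_ACCESS_WRITE
constexpr uint32_t CPU_ACCESS_READ      = 0x20000; // D3D11_CPU_ACCESS_READ

// Hypothetical stand-in for the relevant D3D11_TEXTURE2D_DESC fields.
struct TexDesc {
    uint32_t Usage;
    uint32_t BindFlags;
    uint32_t CPUAccessFlags;
};

// Checks rules 1-3 and 5 for a DXGI_FORMAT_420_OPAQUE texture.
bool Valid420ODesc(const TexDesc& desc, bool isTextureArray) {
    // Rule 1: the opaque format is GPU-only, so no CPU access flags.
    if (desc.CPUAccessFlags & (CPU_ACCESS_WRITE | CPU_ACCESS_READ)) return false;
    // Rule 2: must be bound to at least one pipeline stage.
    if (desc.BindFlags == 0) return false;
    // Rule 3: must use default usage.
    if (desc.Usage != USAGE_DEFAULT) return false;
    // Rule 5: texture arrays need the DECODER binding.
    if (isTextureArray && !(desc.BindFlags & BIND_DECODER)) return false;
    return true;
}
```

    In real code the equivalent fix is to fill a D3D11_TEXTURE2D_DESC with Usage = D3D11_USAGE_DEFAULT, CPUAccessFlags = 0, and BindFlags that include D3D11_BIND_DECODER.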

    • Marked as answer by Jesse Jiang (Moderator) Monday, September 24, 2012 9:01 AM
    Wednesday, September 19, 2012 8:54 PM

All replies

  • Hi Jackic,

    Hopefully you've got more code initializing the ID3D11Device than what you're showing... a lot more. Either way, there are some good examples of using CreateTexture2D. After looking at the list of online samples, the one titled "Direct2D effects photo adjustment app sample" has got to have a CreateTexture2D call in it, but I have not actually used it, so I cannot be sure.

    I am sure that the 3D samples "Direct3D tutorial" and "Xaml DirectX 3D shooting game" both demonstrate the use of CreateTexture2D.

    Good luck


    Jim Tomasko

    Sunday, September 09, 2012 5:42 AM
  • Thanks Jim Tomasko,

    I referred to the "Direct3D tutorial" sample. If I set those parameters the same as in the example, the texture is created OK, but since I want to build a video decoder, I changed several parameters, such as:

    textureDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    textureDesc.Usage = D3D11_USAGE_DEFAULT;
    textureDesc.CPUAccessFlags = 0;
    textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    changed to:

    //data format is YUV420

    textureDesc.Format = DXGI_FORMAT_420_OPAQUE;
    textureDesc.Usage = D3D11_USAGE_DYNAMIC;
    textureDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;

    //The binding flags (BindFlags) should include the D3D11_BIND_DECODER flag,

    textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_DECODER; 

    After those changes, creation fails, so I don't know how to set these parameters for a video decoder.

    thanks

    Jackic


    one work one gain!

    Monday, September 10, 2012 2:02 AM
  • Thanks to Hideyuki Nagase,

    I've created it successfully.


    one work one gain!

    Thursday, September 20, 2012 4:48 AM
  • Dear Hideyuki Nagase,

    I am facing similar problems. I have a byte array containing a video frame, and my program crashes when trying to create a texture. Can you help me, please? My code is as follows:

        D3D11_TEXTURE2D_DESC texDesc = {};
        texDesc.Width = 352;
        texDesc.Height = 288;
        texDesc.MipLevels = 1;

        byte *bitmapData;
        int datasize = read_whole_file_into_array(&bitmapData , "1.i420");

        D3D11_SUBRESOURCE_DATA frameData;
        frameData.pSysMem = bitmapData;
        frameData.SysMemSlicePitch = 0;
        //frameData.SysMemPitch = texDesc.Width; //Unsure about it

        texDesc.ArraySize = 1;
        texDesc.Format = DXGI_FORMAT_420_OPAQUE;
        texDesc.SampleDesc.Count = 1;
        texDesc.SampleDesc.Quality = 0;
        texDesc.BindFlags =  D3D11_BIND_DECODER;
        texDesc.MiscFlags = 0;
        texDesc.Usage = D3D11_USAGE_DEFAULT;
        texDesc.CPUAccessFlags = 0;
            
        m_d3dDevice->CreateTexture2D (&texDesc, &frameData, &m_background);

        BasicLoader^ loader = ref new BasicLoader(m_d3dDevice.Get(), m_wicFactory.Get());

    Please help. Thanks in advance.

    Tuesday, June 03, 2014 6:19 AM
  • Firstly, you shouldn't be creating a new texture for each frame of video, as that would be exceptionally slow. It is an OK way to get debugging started, but you should look at using a DYNAMIC texture with Map/Unmap, or a STAGING texture followed by CopyResource to the display texture.
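    With the DYNAMIC/Map route, the key detail is that Map returns the driver's own RowPitch, which may be larger than the frame width, so each row has to be copied separately. Below is a minimal, portable sketch of just that copy loop; `CopyPlaneToMapped` is a hypothetical helper, and in real code `dst` and `dstRowPitch` would come from the D3D11_MAPPED_SUBRESOURCE filled in by ID3D11DeviceContext::Map, with Unmap called afterwards.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <cstring>

// Copy a tightly packed 8-bit plane (width bytes per row) into a mapped
// subresource whose rows are dstRowPitch bytes apart. dstRowPitch may
// exceed width because drivers align rows.
void CopyPlaneToMapped(uint8_t* dst, std::size_t dstRowPitch,
                       const uint8_t* src, std::size_t width, std::size_t height)
{
    for (std::size_t y = 0; y < height; ++y) {
        std::memcpy(dst + y * dstRowPitch, src + y * width, width);
    }
}
```

    Copying the whole frame with a single memcpy is only correct when RowPitch happens to equal the frame width, which you should never assume.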

    Secondly, if you are getting 'crashes' with Direct3D 11, you should start by enabling the DEBUG device and looking for diagnostic messages:

    http://blogs.msdn.com/b/chuckw/archive/2012/11/30/direct3d-sdk-debug-layer-tricks.aspx

    You have to provide a valid SysMemPitch in the initData structures for 2D and 3D textures. DXGI_FORMAT_420_OPAQUE is 8-bit 4:2:0 YUV data. Direct3D maps this to a row pitch that must be even and is 1 byte per pixel (i.e. the width). The full image size is rowpitch * height * 1.5.
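    As a concrete check of that arithmetic (a sketch; the helper names are made up), for the 352x288 frame in the post above:

```cpp
#include <cassert>
#include <cstddef>

// For DXGI_FORMAT_420_OPAQUE (8-bit 4:2:0 YUV), row pitch is one byte per
// pixel of width, and the whole image is rowpitch * height * 3/2 bytes
// (a full-resolution Y plane plus quarter-resolution U and V planes).
constexpr std::size_t RowPitch420(std::size_t width) { return width; }
constexpr std::size_t ImageSize420(std::size_t width, std::size_t height)
{
    return RowPitch420(width) * height * 3 / 2;
}
```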

    Remember that there are limits on how video-format textures can be rendered in Direct3D, so you can't really use the texture you are trying to create directly.

    Tuesday, June 03, 2014 7:22 PM