How to use multiple textures in a shader?


  • Hi, all

       I'm doing something like blending two textures to produce a final texture. If the two textures are the same size, we only need one texture coordinate in the vertex input and output. If they are different sizes, should I put two texture coordinates in the vertex input and output? I tried it but can't get the correct texture coordinates in the pixel shader. The main code looks like this:

     Shader code:

    Texture2D tx[2];
    SamplerState samLinear;

    struct VS_INPUT
    {
        float4 Pos : POSITION;
        float2 Tex : TEXCOORD0;
        float2 Tex1 : TEXCOORD1;
    };

    struct PS_INPUT
    {
        float4 Pos : SV_POSITION;
        float2 Tex : TEXCOORD0;
        float2 Tex1 : TEXCOORD1;
    };

    PS_INPUT VS( VS_INPUT input )
    {
        PS_INPUT output = (PS_INPUT)0;
        output.Pos = input.Pos;
        output.Tex = input.Tex;
        output.Tex1 = input.Tex1;
        return output;
    }

    float4 PS( PS_INPUT input ) : SV_Target
    {
        float4 color;
        // tx[0] is R8G8_UNORM: its two channels come back in .xy
        color.xy = tx[0].Sample(samLinear, input.Tex).xy;
        // tx[1] is R8_UNORM: its single channel comes back in .x
        color.z = tx[1].Sample(samLinear, input.Tex1).x;
        color.w = 1.0f;
        return color;
    }

    App code:

    struct MultiVertex
    {
        XMFLOAT3 Pos;
        XMFLOAT2 Tex;
        XMFLOAT2 Tex1;
    };

    MultiVertex vertices[] =
    {
        { XMFLOAT3( -1.0f, -1.0f, 1.0f ), XMFLOAT2( 0.0f, 1.0f ), XMFLOAT2( 0.0f, 1.0f ) },    //left bottom
        { XMFLOAT3(  1.0f, -1.0f, 1.0f ), XMFLOAT2( 1.0f, 1.0f ), XMFLOAT2( 1.0f, 1.0f ) },    //right bottom
        { XMFLOAT3(  1.0f,  1.0f, 1.0f ), XMFLOAT2( 1.0f, 0.0f ), XMFLOAT2( 1.0f, 0.0f ) },    //right top
        { XMFLOAT3( -1.0f,  1.0f, 1.0f ), XMFLOAT2( 0.0f, 0.0f ), XMFLOAT2( 0.0f, 0.0f ) },    //left top
    };

        D3D11_BUFFER_DESC bd;
        ZeroMemory( &bd, sizeof(bd) );
        bd.Usage = D3D11_USAGE_DEFAULT;
        bd.ByteWidth = sizeof(MultiVertex) * ARRAYSIZE(vertices);
        bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        bd.CPUAccessFlags = 0;

        D3D11_SUBRESOURCE_DATA InitData;
        ZeroMemory( &InitData, sizeof(InitData) );
        InitData.pSysMem = vertices;
        hr = m_pDevice->CreateBuffer(&bd, &InitData, &m_pVertexBuffer);

    D3D11_INPUT_ELEMENT_DESC layout[] =
    {
        { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0,  0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        // AlignedByteOffset is 20, not 24: XMFLOAT3 (12 bytes) + XMFLOAT2 (8 bytes).
        // An offset of 24 reads past Tex1 into the next vertex's position,
        // which is where the strange negative Tex1 values come from.
        { "TEXCOORD", 1, DXGI_FORMAT_R32G32_FLOAT,    0, 20, D3D11_INPUT_PER_VERTEX_DATA, 0 },
    };
    UINT numElements = ARRAYSIZE( layout );

    // pVSBlob and m_pVertexLayout are placeholder names for the compiled
    // vertex shader bytecode and the input layout object.
    hr = m_pDevice->CreateInputLayout( layout, numElements,
                                       pVSBlob->GetBufferPointer(),
                                       pVSBlob->GetBufferSize(),
                                       &m_pVertexLayout );

    m_pDeviceContext->PSSetShaderResources(0, 2, &m_ShaderViewArray[0]);

    m_pDeviceContext->PSSetSamplers(0, 1, &m_pSamplerLinear);

    To simplify the question, the two textures are created with the same size; one's format is DXGI_FORMAT_R8G8_UNORM, the other's is DXGI_FORMAT_R8_UNORM. In the pixel shader, the texture coordinates Input.Tex and Input.Tex1 should be the same, right? I debugged the shader and found that Input.Tex1 has strange values; for some positions it's negative. What's wrong with my code?

    Wednesday, November 7, 2012 7:01 AM

All replies

  • Hi,

    If the two sets of texture coordinates are the same in the input vertices (they are in your app), you only need one in your shader.
    In a pixel shader you can sample different textures with the same texture coordinates even if the texture sizes are not the same.

    Wednesday, November 7, 2012 12:20 PM
  • The second field of D3D11_INPUT_ELEMENT_DESC is SemanticIndex; the indices should be different if there are two TEXCOORD elements.

    It's OK to use the same texture coordinates when the texture sizes are not the same, but I still want to know how to use two sets of texture coordinates. Can anybody tell me?

    • Edited by moonincloud Thursday, November 8, 2012 1:50 AM
    Wednesday, November 7, 2012 12:51 PM
  • My bad, I thought it was the input slot. Sorry.
    Wednesday, November 7, 2012 1:14 PM