Debugging Pixel Shaders for WPF effects

  • Question

  • I am writing a pixel shader in HLSL for a WPF ShaderEffect, applied to an image with a pixel format of Gray16. I wanted to debug the shader to see what values are being passed in for the texture. I followed all the instructions about setting up PIX, but I can't seem to capture the trace. I am running VS 2008 on Windows Vista Ultimate with an NVIDIA 8600 GPU. Does anyone have a suggestion on what to try?
    Thursday, April 9, 2009 1:45 PM

All replies

  • I don't know the answer to this, but this is something I too would like to know.
    Thursday, April 9, 2009 9:40 PM
  • I have used PixelShader with HLSL too.
    How about writing out a log file with the values that are passed?

    Friday, April 10, 2009 1:30 AM
  • It looks like there is no way to intercept how the values are passed. I assume they are copied into the GPU memory by the ShaderEffect class when it wants to render the control. The output values are also written into the frame buffer at the final rendering stage and there is no way to intercept that either.
    Friday, April 10, 2009 1:43 AM
  • I wanted to debug the shader to see what values are being passed in for the texture. I followed all the instructions about setting up PIX but I can't seem to capture the trace. I am running VS 2008, on Windows Vista Ultimate with NVidia 8600 GPU. Does anyone have a suggestion on what to try?

    Are you running PIX for Windows as Administrator?
    Are you attaching to the exe of your application directly and running your shader application by clicking 'Run Experiment'?
    Did you choose 'Single-frame capture of Direct3D whenever F12 is pressed'?

    Are you shutting down the application process? The logs should load at that time. Do you see any logged information?

    Walt | http://wpfwonderland.wordpress.com
    Friday, April 10, 2009 4:14 AM
  • PIX has worked for me in the past, but it will be filled with lots of noise, since your output will include all of WPF's rendering as well, including the shaders we use internally, etc.  PIX only simulates "debugging" the shader - since shaders are executed on the video hardware itself (in parallel), there isn't any support for logging or interrupting in the middle of a shader's execution.  The simplest thing to do, since the shader programs are relatively small, is to verify your shader is working as you intend by modifying it to output each intermediate calculation, one at a time (see the sketch at the end of this reply).  It's a bit of manual work since you have to recompile the shader each time, but you can relatively quickly track down an issue by verifying that the shader is outputting the values you expect at each important point in its execution, either by checking visually or by using an image editing program to check the actual pixel value on-screen.

    As far as the first post goes, however, I'm guessing your problem is that WPF doesn't support different pixel formats as inputs to shaders.  Internally, everything we render is converted to 32-bit pBGRA (pre-multiplied alpha), so every input sampler to the ShaderEffects will use that format.
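
    For example, here is a rough sketch of the "output the intermediate value" approach; the sampler name and the luminance computation are just placeholders, not anything specific to your effect:

        // Minimal debugging sketch: return the intermediate quantity as the pixel color
        // so it can be inspected on-screen or read back from a screenshot.
        sampler2D inputSampler : register(s0);

        float4 main(float2 uv : TEXCOORD) : COLOR
        {
            float4 color = tex2D(inputSampler, uv);

            // Example intermediate value - replace with whatever you want to inspect.
            float luminance = dot(color.rgb, float3(0.299, 0.587, 0.114));

            // Write it to every channel instead of the real result.
            return float4(luminance, luminance, luminance, 1.0);
        }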
    Friday, April 10, 2009 8:11 PM
  • Thanks to everyone for their responses. I never did get PIX to work for me. Eventually I concluded that I was losing a lot of precision going from the 16-bit Gray16 format to a float inside the shader. There is no premultiplied alpha because there is no alpha component. Although if some conversion is happening, there is no documentation on how a format is converted to pBGRA, as mentioned by Brendan.

    My image only had values up to 4096, not the full 16-bit range. So I changed the format to BGR565, which gave me three components, each normalized from 0 to 1. I then read them as a float3 and recombined them into a single float with the dynamic range that I needed (roughly along the lines of the sketch at the end of this post). The output was then written in a standard ARGB format. This seems to work, for now anyway. I would like to try again with the Gray16 format when ps 3.0 is supported in a future version.
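
    In case it helps anyone, a rough sketch of the recombination - this assumes the value is packed high-to-low as R: 5 bits, G: 6 bits, B: 5 bits (the packing and the constants are my own choice, not anything WPF dictates), and that the 565-to-888 expansion survives the round trip:

        sampler2D inputSampler : register(s0);

        float4 main(float2 uv : TEXCOORD) : COLOR
        {
            float4 c = tex2D(inputSampler, uv);      // converted PBGRA32 sample, channels in [0,1]

            // Undo the per-channel normalization back to the original 5/6/5 bit fields.
            float r = floor(c.r * 31.0 + 0.5);
            float g = floor(c.g * 63.0 + 0.5);
            float b = floor(c.b * 31.0 + 0.5);

            // Recombine into one value (only 0..4095 is used in my data), then
            // renormalize it for display.
            float value = r * 2048.0 + g * 32.0 + b;
            float intensity = value / 4095.0;

            return float4(intensity, intensity, intensity, 1.0);
        }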
    Saturday, April 11, 2009 4:31 PM
  • I think you misunderstood what I'm saying.  When your image is decoded in WPF, no matter what pixel format you created it with, we'll convert it internally to PBGRA32 for use with our rendering stack.  Of course, when alpha is 1 this is equivalent to BGRA.  The same thing is happening both to your BGR565 texture and your Gray16 texture (though I'm speculating the latter conversion is lossy where the former is not), and would happen to a 2-bit texture as well even though it will cost significantly more memory to store once it's decoded.  Using the BGR565 texture is a good workaround, though you could use the full 24-bit RGB as well and pay no more cost, since we're converting it to 8-bits per channel internally anyway.  If we add ps 3.0 support in the future it won't change this scenario at all, except for allowing a more complicated shader. What you really want is for WPF to broaden its support of different pixel formats in the rendering stack, including respecting the original pixel format of an image when it's used as input to ShaderEffects.
    Monday, April 13, 2009 8:04 PM
  • Thank you for the clarification. I missed the fact that you automatically convert the data to PBGRA32 no matter what. Is there a way to know how the data might be converted? If the input is Gray16, how is it converted to PBGRA? If we then use an HLSL-based pixel shader, I assume we would have to use a construct like

    float4 input = tex2D(inputSampler, uv);
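    // input now holds the converted PBGRA32 texel; each channel is normalized to [0, 1]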

    This would retrieve each PBGRA32 sample (see the sketch at the end of this post). I would have no problem with this as long as I know how things are converted.

    Your assessment at the end about supporting different pixel formats is a good idea.
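
    For what it's worth, here is the kind of construct I have in mind - purely hypothetical, since it assumes (and this thread doesn't confirm it) that the Gray16-to-PBGRA32 conversion writes the same 8-bit value into R, G and B with alpha = 1:

        sampler2D inputSampler : register(s0);

        float4 main(float2 uv : TEXCOORD) : COLOR
        {
            float4 c = tex2D(inputSampler, uv);   // converted PBGRA32 sample

            // Under the assumption above, any single channel carries the gray value,
            // but quantized to 8 bits - which would explain the precision loss I'm seeing.
            float gray = c.r;

            return float4(gray, gray, gray, 1.0);
        }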
    Monday, April 13, 2009 11:04 PM
  • You would probably have to play around with it a bit to know for sure how it's converted; I'm not entirely sure.  We call into WIC (Windows Imaging Component) to do the actual format conversion.
    Tuesday, April 14, 2009 12:48 AM
  • Thank you. It has been quite difficult to figure out exactly how the conversion happens, since I have no debugging visibility into the process. I was hoping to use PIX and the shader debugger to see how things are being converted. I would welcome some suggestions. I did manage to determine that the conversion was creating a lot of precision loss, so something else has to be done. The conversion does not matter to me as long as I can preserve the precision.

    I am trying the DirectX route now to see if there is a way I can do the same thing with DirectX interop technology.

    Tuesday, April 14, 2009 2:19 AM