Using MEStreamFormatChanged to change Pixel Aspect Ratio during playback?

  • Question

  • hi there,

     

    i'm currently facing a scenario where the pixel aspect ratio of a stream changes during playback. My custom media source creates a media type, is started, and then realizes that the PAR is different from what was previously set. So i send an MEStreamFormatChanged event (with the original IMFMediaType, but with a new MF_MT_PIXEL_ASPECT_RATIO value set) on the media stream's event queue, which returns zero/success.
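
    Roughly what i do when the change is detected (a simplified sketch; error handling is trimmed and the helper name is just a placeholder):

```cpp
// Sketch: clone the current media type, override only the pixel aspect ratio,
// and queue MEStreamFormatChanged with the new type as the event value.
#include <mfapi.h>
#include <mfidl.h>
#include <atlbase.h>

HRESULT QueuePixelAspectRatioChange(IMFMediaEventQueue *pQueue,       // the stream's event queue
                                    IMFMediaType       *pCurrentType,
                                    UINT32 parNumerator, UINT32 parDenominator)
{
    CComPtr<IMFMediaType> spNewType;
    HRESULT hr = MFCreateMediaType(&spNewType);
    if (FAILED(hr)) return hr;

    // Copy all attributes of the original type ...
    hr = pCurrentType->CopyAllItems(spNewType);
    if (FAILED(hr)) return hr;

    // ... and override only the pixel aspect ratio.
    hr = MFSetAttributeRatio(spNewType, MF_MT_PIXEL_ASPECT_RATIO,
                             parNumerator, parDenominator);
    if (FAILED(hr)) return hr;

    // MEStreamFormatChanged carries the new media type as its value.
    return pQueue->QueueEventParamUnk(MEStreamFormatChanged, GUID_NULL, S_OK, spNewType);
}
```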

    Debugging with mftrace i can see that the new InputType is set on the decoder MFT, the stream sink and so on with success (no errors).

    However, the rendered video does not have the new aspect ratio applied. 

    I tried sending the MEStreamFormatChanged event before the first frame is rendered, or between the first and the second frame. I cannot guarantee having the correct aspect ratio until the first RequestSample is called.

    Do i have to set the aspect ratio at runtime in another way, or is MEStreamFormatChanged the way to go (because maybe even the frame size might change)?

    any ideas/hints are appreciated ;)

    best regards

    joachim



    Monday, May 2, 2011 1:54 PM

All replies

  • It sounds like you know what you are talking about, but just to make sure ...

    Are you sure you are interested in the Pixel Aspect Ratio, and not the display aspect ratio? 4:3 or 16:9 is the display aspect ratio and is set by MF_MT_FRAME_SIZE rather than MF_MT_PIXEL_ASPECT_RATIO.

    Good luck :)


    Michael Strein
    Thursday, June 9, 2011 3:50 AM
  • hello michael,

     

    thanks for your reply. well, i receive streams with a resolution of e.g. 1440x1080 and have to adapt the pixel aspect ratio to show them as 1920x1080, i.e. i receive data with non-square pixels, so i guess MF_MT_PIXEL_ASPECT_RATIO should be the way to go.

    it works when set upfront on the media type before the stream starts; however, i cannot get it to work during playback using MEStreamFormatChanged.
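
    for reference, the ratio works out to 1920/1440 = 4/3, so the upfront setup that works for me is essentially this (simplified sketch; the variable name is just a placeholder):

```cpp
// Sketch: the coded frame stays 1440x1080; only the pixel aspect ratio
// tells the renderer to stretch it horizontally to 1920x1080 (PAR 4:3).
MFSetAttributeSize (pMediaType, MF_MT_FRAME_SIZE,         1440, 1080);
MFSetAttributeRatio(pMediaType, MF_MT_PIXEL_ASPECT_RATIO, 4,    3);
```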

    any ideas? ;)

    regards

    j.

    Thursday, June 9, 2011 6:10 AM
  • Hey Jo,

    I don't think pixel aspect ratio is what you are looking for here. In most cases, your pixel data is square. It has nothing to do with the resolution or aspect ratio of the video but with the actual shape of the pixels. Non-square pixel data is usually generated by hardware capture devices where the physical nature of the device has oddly shaped pixels. For example, some playback screens or cameras have the red-green-blue components of the pixel arranged in 'L'-shapes, long rectangles, or interlocking triangles. I used to have a special magnifying glass unit to inspect pixels on plasma displays; most were interlocking triangles of RGB glow tubes, making them square pixels.

    Therefore you only need to declare non-square pixel shapes to MF so they get rendered properly. Upscaling and downscaling the video size, or changing the aspect ratio it is played back at, is all done internally by the pipeline through the video driver. Are you trying to use a mixer MFT to mix a 1440x1080 stream onto a 1920x1080 stream? It should automatically stretch the video based on the source and destination rectangle dimensions.
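
    If you want to control that stretch explicitly, the EVR exposes the source and destination rectangles through IMFVideoDisplayControl. Something along these lines (just a sketch, assuming the standard EVR display-control service and your own window handle):

```cpp
// Sketch: stretch the full source frame onto a 1920x1080 destination rectangle.
#include <mfapi.h>
#include <mfidl.h>
#include <evr.h>
#include <atlbase.h>

HRESULT StretchToDestination(IMFMediaSink *pVideoSink, HWND hwndVideo)
{
    CComPtr<IMFVideoDisplayControl> spDisplay;
    HRESULT hr = MFGetService(pVideoSink, MR_VIDEO_RENDER_SERVICE,
                              IID_PPV_ARGS(&spDisplay));
    if (FAILED(hr)) return hr;

    // Normalized source rectangle (0..1) = the whole decoded frame.
    MFVideoNormalizedRect nrcSource = { 0.0f, 0.0f, 1.0f, 1.0f };
    RECT rcDest = { 0, 0, 1920, 1080 };        // destination in window coordinates

    spDisplay->SetVideoWindow(hwndVideo);
    return spDisplay->SetVideoPosition(&nrcSource, &rcDest);
}
```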

    Saturday, June 11, 2011 1:30 PM
  • hi there,

     

    thanks for your reply. well, i receive data via RTP from Axis network cameras that send a 1440x1080 stream which should be rendered as 1920x1080 (the old system treated this as PAR). i don't use any special mixers, just simple playback.

    am i missing some link?

    Tuesday, June 14, 2011 6:59 AM
  • Have you tried to just render the 1440x1080 video on a 1920x1080 surface? If so, does it have letterboxing black bars or does it stretch the video to fill the rendering surface?

    Secondly, if you need to notify that a stream's pixel aspect ratio has changed, you do it in-band through ProcessOutput(). ProcessOutput should return MF_E_TRANSFORM_STREAM_CHANGE, the output flags should contain MFT_OUTPUT_DATA_BUFFER_FORMAT_CHANGE, and there should be no sample present in the output buffer.
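
    A rough sketch of how that looks inside an MFT's ProcessOutput (heavily trimmed; the m_formatChanged flag is just a placeholder for however you detect the change):

```cpp
// Sketch: signal an in-band format change instead of producing a sample.
HRESULT MyTransform::ProcessOutput(DWORD dwFlags, DWORD cOutputBufferCount,
                                   MFT_OUTPUT_DATA_BUFFER *pOutputSamples,
                                   DWORD *pdwStatus)
{
    if (m_formatChanged)                         // e.g. a new PAR detected on the input
    {
        m_formatChanged = false;

        // No sample this time; tell the pipeline to renegotiate the media type.
        pOutputSamples[0].pSample  = NULL;
        pOutputSamples[0].dwStatus = MFT_OUTPUT_DATA_BUFFER_FORMAT_CHANGE;
        return MF_E_TRANSFORM_STREAM_CHANGE;
    }

    // ... normal sample processing goes here ...
    return S_OK;
}
```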

    Wednesday, June 15, 2011 12:22 AM
  • hi nobby,

    i don't want to care about the actual surface size (it's rendered in WPF space on an HWND) (and yes, i get the letterbox). rendering the 1440x1080 video without the PAR set gives me an incorrectly stretched video.

    notifying in-band may be somewhat difficult as i do not use any custom MFTs, just a custom media source that recognizes the change on the media type.

    when i set the PAR initially, everything works fine; however, when i detect the change in the stream, create the new media type, and send it using MEStreamFormatChanged, nothing happens. i can see that the event arrives within the MF core components (using mftrace), but it does not have any effect.

    Wednesday, June 15, 2011 6:23 AM
    So you get letterboxing and also an oddly stretched display? I've used Axis cameras in the past and I don't recall them having non-square pixel data before.

    What experiments have you tried to verify that the video format has non-square pixels?

    Have you tried running the data through the Microsoft H.264 decoder, saving the uncompressed image data to disk, and then looking at it? Try using a colour space transform from YUY2 to RGB, then blit the data to a GDI bitmap. Save the bitmap to a file as a bmp or jpeg, then view it in Microsoft Paint. Let me know if it looks the same as the video playback.
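
    For the save-to-file step, something along these lines usually does the job (just a sketch, assuming you already have a top-down RGB32 buffer and its stride):

```cpp
// Sketch: dump one uncompressed RGB32 frame to a .bmp so it can be inspected in Paint.
#include <windows.h>
#include <cstdio>

bool SaveFrameAsBmp(const char *path, const BYTE *pixels,
                    LONG width, LONG height, LONG stride)
{
    BITMAPINFOHEADER bih = {};
    bih.biSize        = sizeof(bih);
    bih.biWidth       = width;
    bih.biHeight      = -height;            // negative height = top-down rows
    bih.biPlanes      = 1;
    bih.biBitCount    = 32;
    bih.biCompression = BI_RGB;
    bih.biSizeImage   = width * 4 * height;

    BITMAPFILEHEADER bfh = {};
    bfh.bfType    = 0x4D42;                 // 'BM'
    bfh.bfOffBits = sizeof(bfh) + sizeof(bih);
    bfh.bfSize    = bfh.bfOffBits + bih.biSizeImage;

    FILE *f = fopen(path, "wb");
    if (!f) return false;

    fwrite(&bfh, sizeof(bfh), 1, f);
    fwrite(&bih, sizeof(bih), 1, f);
    for (LONG y = 0; y < height; ++y)       // copy row by row to drop any stride padding
        fwrite(pixels + y * stride, width * 4, 1, f);

    fclose(f);
    return true;
}
```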

    Wednesday, June 15, 2011 12:32 PM
    well, we have a legacy rendering application (using VfW and GDI rendering), and pixel aspect ratio is used over there. e.g. a camera sends 1080i which should be displayed as 1920x1080 (i.e. having non-square pixels). it's the same with 2-CIF or similar formats. also, it's not tied to Axis network cameras, as the same applies to other (maybe custom) network video sources.

    setting the PAR for such sources upfront works as expected, and the video is rendered correctly. however, when the stream changes (maybe even the resolution; this does not directly apply to Axis cameras but to custom network recording sources), i cannot propagate stream changes related to the PAR.

    Wednesday, June 15, 2011 1:48 PM
    I'm a bit confused. Why does a 1920x1080 image have non-square pixels? Are you saying that the pixels are non-square because the resolution is 1920x1080, or are you saying that your 1920x1080 video has non-square pixels? The reason I'm trying to clear this up is that I work with a ridiculous number of different video capture devices and network-based media sources, and I've never used one yet that has non-square pixel data. I've used devices with varying aperture aspect ratios but not varying pixel aspect ratios.

    I'd just like to make sure you're actually going down the right path before I stop rambling on about this, haha. Let's also say for a minute that your stream media types are changing. You're saying that MFTrace shows the downstream components of the pipeline are actually receiving the event but nothing is happening. Have you confirmed whether all the downstream pipeline nodes support stream changes once the topology is active? Not all media sinks and transforms can accept stream changes while they are in use.
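
    For the transforms you can at least check the MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE attribute, roughly like this (sketch only; an absent or zero attribute means the MFT expects to be drained and renegotiated instead):

```cpp
// Sketch: does a given MFT advertise support for dynamic format changes?
#include <mftransform.h>
#include <atlbase.h>

bool SupportsDynamicFormatChange(IMFTransform *pMft)
{
    CComPtr<IMFAttributes> spAttrs;
    if (FAILED(pMft->GetAttributes(&spAttrs)) || !spAttrs)
        return false;

    UINT32 value = 0;
    spAttrs->GetUINT32(MFT_SUPPORT_DYNAMIC_FORMAT_CHANGE, &value);
    return value != 0;
}
```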

    Wednesday, June 15, 2011 11:22 PM