ASF Streaming using MFCreateASFStreamingMediaSink

  • Question

  • Per http://msdn.microsoft.com/en-us/library/dd388087(VS.85).aspx it is possible to stream ASF media over the network.

A media sink requires an output byte stream. For the streaming media sink, the byte stream must be provided externally, which implies that the actual streaming protocol is not implemented by the object. That being said, what is the role of the streaming sink, and why can't a file sink suffice?
    I wonder, is there any built-in byte stream for ASF network streaming? If so, what protocol does it use: WMSP, RTSP, or MMS?

    How can I stream the ASF payload over the network? Should I implement my own WMSP envelope for HTTP streaming, or is there anything built into MF already?

Any help will be appreciated.
    Nadav Rubinstein, http://www.sophin.com
    Monday, January 11, 2010 10:28 AM

All replies

• Streaming has a separate set of requirements from archiving to a file. Streaming supports rate changes and seeking, and is more concerned with getting packets out on time than with optimal packetization, among other things.

    MF does not provide a network output bytestream.  The streaming sink is not intended to be used with any of the streaming protocols, but rather just a raw progressive download of the content.  I do not believe it is feasible to implement one of the streaming protocols just with the information available to a bytestream.

    MF does not provide any of the server-side components for network streaming.  If you want these, you will have to implement them yourself.  There are other APIs in Windows that may help -- for example, for implementing HTTP progressive download, the HTTP Server API can do most of the heavy lifting for you.
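    To make that concrete, a minimal, error-handling-free sketch of serving one ASF request with the HTTP Server API might look like the following. This is only an illustration of the API shape, not a working streamer; the port and URL are hypothetical placeholders, and a real server would follow the `HttpSendHttpResponse` call with repeated `HttpSendResponseEntityBody` calls to push the ASF bytes.

    ```cpp
    // Sketch: answer a single HTTP request with the HTTP Server API
    // (httpapi.h). Hypothetical URL/port; no error handling shown.
    #include <windows.h>
    #include <http.h>
    #pragma comment(lib, "httpapi.lib")

    int main()
    {
        HTTPAPI_VERSION ver = HTTPAPI_VERSION_1;
        if (HttpInitialize(ver, HTTP_INITIALIZE_SERVER, NULL) != NO_ERROR)
            return 1;

        HANDLE hQueue = NULL;
        HttpCreateHttpHandle(&hQueue, 0);
        HttpAddUrl(hQueue, L"http://+:8003/wmav.asf/", NULL);

        // Receive one request, then answer with HTTP 200 and an ASF
        // content type suitable for progressive download.
        BYTE buf[4096];
        PHTTP_REQUEST req = (PHTTP_REQUEST)buf;
        ULONG bytes = 0;
        if (HttpReceiveHttpRequest(hQueue, HTTP_NULL_ID, 0, req,
                                   sizeof(buf), &bytes, NULL) == NO_ERROR)
        {
            HTTP_RESPONSE resp = {};
            resp.StatusCode   = 200;
            resp.pReason      = "OK";
            resp.ReasonLength = 2;
            resp.Headers.KnownHeaders[HttpHeaderContentType].pRawValue = "video/x-ms-asf";
            resp.Headers.KnownHeaders[HttpHeaderContentType].RawValueLength = 14;
            // A real streamer would send the body afterwards with
            // HttpSendResponseEntityBody + HTTP_SEND_RESPONSE_FLAG_MORE_DATA.
            HttpSendHttpResponse(hQueue, req->RequestId, 0, &resp,
                                 NULL, &bytes, NULL, 0, NULL, NULL);
        }

        HttpRemoveUrl(hQueue, L"http://+:8003/wmav.asf/");
        CloseHandle(hQueue);
        HttpTerminate(HTTP_INITIALIZE_SERVER, NULL);
        return 0;
    }
    ```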

    Tuesday, January 12, 2010 11:53 PM
  • I have implemented a WMSP streaming application. To verify proper streaming, I read an existing ASF file and mux + push the read samples to the network (using MS-WMSP).

    A. When streaming either an audio-only or video-only ~file~, all works fine using graphedt.exe and WMP.
    B. When streaming a ~file~ with audio AND video, it plays well using the WMReader filter in graphedt.exe but jitters when played with WMP.
    C. The Silverlight MediaElement was not able to play single-track or multi-track feeds; it seemed as if all the packets were dropped upon reception by MediaElement. The same behavior happened for me while sending invalid payload timestamps to WMP.

    My current implementation muxes multiple-payload packets (per the ASF specification), pads them with a framing header and data packet header, and sends them over the network. Suspecting there was some problem with my muxing code, I tried IMFASFMultiplexer as a replacement, padding the resulting packets with a framing header and data packet header before sending them to the network. The output seemed to parse well with the WMSP plug-in of Network Monitor; however, the actual payload seemed to be missing video samples, resulting in artifacts while decoding the stream during WMP playback. With that in mind, I reverted to my original muxing implementation, which seemed to give better results.

    Having said that, the next suspect is the leaky bucket algorithm, which, in my opinion, is required only for UDP connections and is not really needed for WMSP, where TCP is used. I don't know how the WMSP source filter / media source is implemented; I would like to believe it uses the pull concept, that is, the streamer pushes the data to the client as fast as it can while the consumer pulls the data from the socket at whatever rate it finds adequate. However, UDP streaming support for ASF files (mmsu) and the leaky bucket algorithm make me believe otherwise, namely that the consumer (source filter / media source) receives the data from the socket at the rate it is transmitted by the server, and if the reception rate is higher than the playback rate, samples are dropped (by the consumer), hence the jittering.
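    As an aside, the sender-side half of that leaky-bucket pacing reduces to plain arithmetic and can be sketched without any platform API. This is only a sketch of the idea; the packet sizes and bit rate below are made-up values.

    ```cpp
    #include <cstdint>
    #include <vector>

    // For each packet, compute the earliest time (ms since the first
    // send) at which it may be put on the wire so that the average
    // outgoing rate never exceeds maxBitrateBps. A server that sleeps
    // until each send time pushes data at the stream bit rate.
    std::vector<uint32_t> PacketSendTimes(const std::vector<uint32_t>& packetSizes,
                                          uint32_t maxBitrateBps)
    {
        std::vector<uint32_t> sendTimes;
        uint64_t bytesSent = 0;
        for (uint32_t size : packetSizes)
        {
            // Time by which all previously queued bytes have drained.
            sendTimes.push_back(static_cast<uint32_t>(bytesSent * 8 * 1000 / maxBitrateBps));
            bytesSent += size;
        }
        return sendTimes;
    }
    ```

    For example, three 4000-byte packets at 320 kbit/s come out at 0, 100, and 200 ms.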

    I have also implemented a dumping mechanism that writes to an ASF file the exact data sent over the network, that is, the ASF $Header followed by the $Data packets. Surprisingly, the resulting ASF file plays in WMP without any problem.

    Implementing the leaky bucket / push logic, where the streamer times the data throughput, didn't show any improvement. Here is how I set the timing variables:

    a. ASF_PAYLOAD_PARSING_INFORMATION::wPacketDuration = ((packet size excluding headers) * 8 * 1000) / ASF_FILE_PROPERTIES_OBJECT::dwMaxBitRate;

    b. ASF_PAYLOAD_PARSING_INFORMATION::dwSendTime = ((sum of all sent packet bytes) * 8 * 1000) / ASF_FILE_PROPERTIES_OBJECT::dwMaxBitRate;

    c. ASF_MULTIPLE_PAYLOAD_DATA::ReplicatedData::dwPresentationTime = (qwSampleStreamTime / 10000i64) + ASF_FILE_PROPERTIES_OBJECT::qwPreroll;
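    To sanity-check items a-c, the arithmetic can be isolated in a small helper. The field names mirror the ASF structures quoted above; the parameter values in the example are hypothetical, with sample time in 100-ns units and durations, send time, and preroll in milliseconds.

    ```cpp
    #include <cstdint>

    struct PacketTiming
    {
        uint16_t wPacketDuration;     // ms, per packet
        uint32_t dwSendTime;          // ms since the start of the Data Object
        uint32_t dwPresentationTime;  // ms, includes preroll
    };

    // packetPayloadBytes: packet size excluding headers
    // totalBytesSent:     sum of all previously sent packet bytes
    // sampleTime100ns:    sample stream time in 100-ns units
    // prerollMs:          ASF_FILE_PROPERTIES_OBJECT::qwPreroll (ms)
    // maxBitRateBps:      ASF_FILE_PROPERTIES_OBJECT::dwMaxBitRate (bits/s)
    PacketTiming ComputeTiming(uint32_t packetPayloadBytes,
                               uint64_t totalBytesSent,
                               int64_t  sampleTime100ns,
                               uint64_t prerollMs,
                               uint32_t maxBitRateBps)
    {
        PacketTiming t;
        t.wPacketDuration    = static_cast<uint16_t>(
            static_cast<uint64_t>(packetPayloadBytes) * 8 * 1000 / maxBitRateBps);
        t.dwSendTime         = static_cast<uint32_t>(totalBytesSent * 8 * 1000 / maxBitRateBps);
        t.dwPresentationTime = static_cast<uint32_t>(sampleTime100ns / 10000 + prerollMs);
        return t;
    }
    ```

    Note that both bitrate-derived values assume dwMaxBitRate matches the actual data rate, which is not true for VBR content.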

    Checking with a custom DirectShow filter, I have verified that the MediaSample timestamps coming out of WMReader are correct.

    Having all that said:

    1. Have I calculated the packet timings correctly?

    2. How does the source filter / media source communicate? Does it use the pull model or the push model?

    3. Can you think of anything else that might cause jittering during WMP playback?


    Nadav Rubinstein, http://www.sophin.com
    Wednesday, January 13, 2010 9:07 AM
  • 1) Do you not have the accurate timestamps/durations available?  For VBR content, calculations based upon the bitrate can be grossly off from the actual timestamps of the samples.  This could potentially cause jitter at the renderer.

    2) I am not terribly familiar with DShow, but my understanding is that DShow largely uses the push model.  Still, unless the client is being overwhelmed with data (hundreds of extra samples?) I would not expect any data to be dropped.

    3) Usually jittering is a timestamp or data starvation issue.  I do not know what else might cause it.
    Thursday, January 21, 2010 8:20 PM
  • To test my streaming application, I read an existing ASF file using the SyncReader object and re-mux it at runtime using the existing MediaSample timestamps as a reference. Using the native timestamps provided with the media samples produces the same result: video jitters while audio plays smoothly.

    Bearing in mind that multiple MediaSamples/payloads might share the same data packet, I was using the data bit rate to ~accurately~ calculate the send time for a given multiple-payload packet; unfortunately, this didn't help...

    The funny thing is that it plays well using WMReader in graphedt.exe and the Media Foundation Protected_Playback sample.


    Moreover, I dump the ASF header + packets sent over the network to an ASF file exactly reflecting what is sent on the wire. This file plays well when loaded directly in WMP, and plays well when streamed using Windows Media Streamer...

    When WMP is used to consume the feed from my app, it seems audio starts a few seconds before video does (although the timestamps are in sync). I am really not sure what might be wrong here... maybe I should offset only the audio timestamps, but I am not sure.


    Nadav Rubinstein, http://www.sophin.com
    Sunday, January 24, 2010 2:38 PM
  • Have you tried using WMReader + graphedt + EVR as the renderer?  This would most accurately represent what is going on in WMP.  If it jitters then, that would at least isolate the problem to dshow + EVR.

    The EVR fires some diagnostic events that might give some clues as to what causes the jitter. Open the Event Viewer and turn on 'Show Analytic and Debug Logs'. Navigate to Applications and Services->Microsoft->Windows->Media Foundation->Performance. Right-click the log and select 'Enable Log', then run your streaming playback. Event tasks that would look interesting are 'video frame glitch', which reports late frames/data starvation, and 'Render sample render', which correlates a sample time to a QPC time at which the EVR expects to present the frame.
    Thursday, January 28, 2010 2:11 AM
  • Hi Matt,

    Thanks for your response. Unfortunately (so to speak), the EVR in graphedt.exe performs well; video plays without any problem/jittering. This makes me think the problem is with the media source / source filter consuming the data from the network.

    Checking the event log as you advised presents the following two suspect events:
    A. Source Resolution Begin EndCreateObjectFromURL Tag=SoRe Object=0x3bc9b00 URL=NULL Object Created=0x0 hr=0x0
    B. Source Resolution End EndCreateObjectFromURL Tag=SoRe Object=0x3bc9b00 URL=http://127.0.0.1:8003/wmav.asf Object Created=0x2 hr=0xc00d2ee8

    0xc00d2ee8 == "The server is not a compatible streaming media server", though this doesn't cause WMP to stop, and streaming does happen...

    One other thing to note: when using Silverlight to consume the video, packets are sent at the adequate rate, but SL doesn't render any of the samples; it seems as if all samples are dropped...
    One more thing: while playing in WMP, audio starts ~3 sec before video does. 3 sec is the buffering/preroll duration provided in the ASF header, and I still can't think of anything I do that would cause this; the ASF header I am using is generated using MF objects...
    Nadav Rubinstein, http://www.sophin.com
    Sunday, January 31, 2010 4:28 PM