What are the options available to reduce latency in the output stream?

    Question

  • Hi,

    I am facing latency between the input stream and the output stream.

    I am using a TCP data stream as the input event source for SI. Because the TCP data stream has a somewhat complex structure, I am using a UDSO as a "second-stage" deserializer.

    How can I reduce the latency in the output stream?

    Also, kindly suggest a tool for debugging this situation.

    Thanks in advance.



    • Edited by DevCode13 Thursday, August 22, 2013 9:13 AM
    Thursday, August 22, 2013 9:12 AM

Answers

  • It's a start.

    First, you need to understand how AdvanceTimeSettings.IncreasingStartTime works. It won't enqueue a CTI until it gets a start time that is ahead of the last one. If, for example, you have the following start times:

    01:00:00
    01:00:00
    01:00:00
    01:00:01

    You won't get a CTI until the 01:00:01 item is actually enqueued, so that will give those events a full second of latency. Depending on the frequency of inbound events, this can increase latency pretty significantly. What is your inbound/outbound event rate, by the way? If you use a null sink, do you see the latency disappear?
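    By "null sink" I just mean an observer that throws everything away; if the latency disappears with one of these attached, the slow part is on your output side rather than in the query. A minimal sketch - with myApp and myQuery standing in for your own Application and query objects, and a string payload assumed - would be something like this:

    // Rough sketch only - myApp and myQuery stand in for your own
    // Application object and IQStreamable<string> query.
    using System;
    using Microsoft.ComplexEventProcessing.Linq;

    // A sink that simply discards everything it receives.
    public sealed class NullSink : IObserver<string>
    {
        public void OnNext(string value) { }
        public void OnError(Exception error) { }
        public void OnCompleted() { }
    }

    // ... then, inside your startup code, bind it in place of your real sink:
    var nullSink = myApp.DefineObserver(() => new NullSink());
    using (myQuery.Bind(nullSink).Run("NullSinkTest"))
    {
        Console.ReadLine();   // keep the process alive while you measure
    }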
    Second, since you are measuring latency as the time it takes to get to the output adapter's source, you need to take into account any latency from that as well. One thing that you said - that the latency increases as the application runs - leads me to think that your output queue is getting backed up. This can happen if your output adapter/sink takes too long to process individual messages. StreamInsight will queue events up and feed them to your sink as fast as it will process them, but if your processing for each event takes 100 ms and you have 40 events/second (which is pretty slow), you'll take 4 seconds to process 1 second of events. That will get your output queue backed up pretty quickly.

    You can check this by looking at the "StreamInsight Server : # Events in output queues" performance counter. It tells you the total number of events that have been "released" to output by the engine but are waiting on the sink/output adapter to actually process them.

    One strategy to handle this is to batch your output events and write/send whenever you receive a CTI - events are released to the output adapter/sink when there is a CTI anyway; they don't "trickle" in but come in "spurts" immediately followed by a CTI - and then do whatever processing you need to do on a separate thread and/or asynchronously. It is very important to keep the actual dequeue/OnNext operation as small and fast as possible, especially if you have large numbers of events.
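    To make that concrete, here's a rough sketch of the kind of sink I mean - OnNext does nothing except buffer payloads, and the batch is handed off to a background task when the CTI arrives (the string payload and the WriteBatch method are just placeholders for your own payload type and output logic):

    // Rough sketch only.
    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Microsoft.ComplexEventProcessing;

    public sealed class BatchingSink : IObserver<PointEvent<string>>
    {
        private List<string> batch = new List<string>();

        public void OnNext(PointEvent<string> e)
        {
            if (e.EventKind == EventKind.Insert)
            {
                // Keep OnNext as cheap as possible: just buffer the payload.
                batch.Add(e.Payload);
            }
            else
            {
                // CTI: everything released so far is complete, so hand the
                // batch off to a background thread and start a new one.
                var completed = batch;
                batch = new List<string>();
                Task.Run(() => WriteBatch(completed));
            }
        }

        public void OnError(Exception error) { /* log it */ }

        public void OnCompleted() { /* flush anything still in the batch */ }

        // Placeholder for the slow part (database insert, network send, ...).
        private static void WriteBatch(List<string> completed)
        {
        }
    }

    Bind it as an event-shaped sink (an observer of PointEvent<T> rather than of the bare payload) so that it sees the CTIs as well as the inserts - the CTI is what triggers the flush. The details of the batch handling are up to your output; the only rule is that OnNext itself stays trivial.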


    DevBiker (aka J Sawyer)
    Microsoft MVP - Sql Server (StreamInsight)


    Ruminations of J.net


    If I answered your question, please mark as answer.
    If my post was helpful, please mark as helpful.

    Saturday, August 31, 2013 5:18 PM

All replies

  • First, what kind of latency are you seeing? And what kind of latency are you looking for? And how are you measuring the latency? Finally, how are you handling your CTIs?


    DevBiker (aka J Sawyer)
    Microsoft MVP - Sql Server (StreamInsight)


    Ruminations of J.net


    If I answered your question, please mark as answer.
    If my post was helpful, please mark as helpful.

    Saturday, August 24, 2013 8:20 PM
  • Hi,

    Thanks for your response.

    Below are the answers to your questions:

    what kind of latency are you seeing?

    I am pushing data from TCP to SI. Measured from the time the data is pushed to SI, the output from SI arrives late at my application, and this latency increases over time.

    what kind of latency are you looking for?

    A few seconds would be no problem.

    And how are you measuring the latency?

    My data carries a timestamp, and I push it to SI at that same time; when I receive the output in my application (a Windows application), I compare against that timestamp to get the latency.

    how are you handling your CTIs?

    .ToPointStreamable(i => PointEvent.CreateInsert<string>(i.Time, i.Data), AdvanceTimeSettings.IncreasingStartTime);
    Hope this helps.

    Friday, August 30, 2013 11:19 AM