No output events

  • Question

  • Hello

    I've set up a small test using one of our event hubs, which captures application telemetry data from our software. Input events seem to be recorded correctly; however, output events are always 0 and no data is being recorded.

    For the output we have tried:

    SQL Azure database and Blob Storage

    We have tested the query and the results look fine.

    Can anybody help? All connections are showing as ok.

    Kind regards

    David


    David Dasher

    Tuesday, December 23, 2014 9:08 AM

Answers

    The fix for the missing output has been rolled out; could you please verify? You will have to stop and restart your streaming job to pick up the fix.

    The fix for the "missing closing parenthesis" is still in progress; we will roll it out as soon as it is ready.

    Thanks


    Zafar Abbas

    • Proposed as answer by Zafar Abbas Saturday, January 24, 2015 2:47 AM
    • Marked as answer by Janet Yeilding Friday, February 6, 2015 5:14 PM
    Saturday, January 24, 2015 2:47 AM

All replies

  • Hi David,

    Could you provide more data:

    1. Your job name

    2. Do you observe any entries in the operations log in the Azure portal for your job?

    3. Did you already attempt to stop/restart your job and verify that it continues to run into errors?

    Thanks


    Zafar Abbas

    Friday, January 16, 2015 12:42 AM
  • I had similar problems getting output from my jobs when events arrived very late. Although I have an Adjust policy for out-of-order events, it was just throwing them out.

    I did not have this problem a few weeks ago with the same setup. Something must have changed.

    Regards

    Friday, January 16, 2015 2:40 AM
  • Gökhan,

    Please share your job name so we can investigate this issue. Also, do you know exactly when this issue appeared on the service? That would help us correlate the issue with changes deployed on the service.

    Zafar


    Zafar Abbas

    Sunday, January 18, 2015 12:14 AM
  • The behavior began between December 20 and January 10. My job name is sporveienalarms. I can get output for events coming in with the current datetime, but not for old datetimes. Another issue is that the output used to be a single JSON object; now it is an array of objects, and a buggy one, actually, as it is missing the last closing parenthesis "]".
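    While waiting for a service-side fix, a truncated array like the one described (missing its final "]") can be patched defensively before parsing. This is my own workaround sketch, not an official fix, and it only handles the specific truncation described in this thread:

```python
import json

def parse_possibly_truncated_array(text):
    """Parse a JSON array, tolerating a missing closing ']' (and a
    trailing comma), as seen in the truncated job output described above.
    Raises ValueError if the text is damaged in some other way."""
    try:
        return json.loads(text)
    except ValueError:
        # Append the missing ']' after stripping any dangling comma.
        repaired = text.rstrip().rstrip(",") + "]"
        return json.loads(repaired)
```

    For example, `parse_possibly_truncated_array('[{"a": 1}, {"a": 2}')` recovers the two objects, while well-formed output passes through the first `json.loads` untouched.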

    Regards

    Monday, January 19, 2015 7:56 AM
  • Hi,

    We have investigated and discovered an issue with the Service. We are working on a fix and will roll it out as soon as it is ready. We will keep you posted.

    Thanks!


    Zafar Abbas

    Wednesday, January 21, 2015 2:03 AM
  • The fix for the missing output has been rolled out; could you please verify? You will have to stop and restart your streaming job to pick up the fix.

    The fix for the "missing closing parenthesis" is still in progress; we will roll it out as soon as it is ready.

    Thanks


    Zafar Abbas

    • Proposed as answer by Zafar Abbas Saturday, January 24, 2015 2:47 AM
    • Marked as answer by Janet Yeilding Friday, February 6, 2015 5:14 PM
    Saturday, January 24, 2015 2:47 AM
  • ok thx
    Tuesday, January 27, 2015 10:52 AM
  • Now it is working. And I believe the problem with the missing parenthesis is solved as well.
    Wednesday, January 28, 2015 9:23 AM
  • Great to hear!

    Please let us know any issues or further feedback.

    Thanks!


    Zafar Abbas

    Thursday, January 29, 2015 1:42 AM
  • Hello Zafar

    I've recreated the Stream Job from scratch and still get no output events. The job name is cplonlineeventhub.

    Kind regards

    David


    David Dasher

    Sunday, February 1, 2015 10:50 PM
  • Hi David,

    Do you see any entries in your operations log?

    Zafar


    Zafar Abbas

    Monday, February 2, 2015 8:42 AM
  • Hi Zafar

    I see the issue:

    A receiver with epoch '44223' already exists. A new receiver with epoch null cannot be created.

    Do I have the option to change this when setting up through the portal?

    Thanks

    David

    Monday, February 2, 2015 2:16 PM
  • Do you have other applications reading from the same Event Hub that you are using in this stream analytics job? If you already have a reader connected using epoch, ASA jobs will not be able to connect to the same EventHub.
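    The epoch rule behind the error above can be sketched as a toy model. This is my own simplified illustration of the documented behavior (an epoch receiver locks out non-epoch receivers, a higher epoch displaces a lower one, and up to 5 non-epoch receivers may coexist per consumer group), not the Event Hubs service or any Azure SDK:

```python
class PartitionReceivers:
    """Toy model of Event Hubs receiver admission for one partition
    within one consumer group. Illustrative only."""

    MAX_NON_EPOCH = 5  # documented limit on concurrent non-epoch receivers

    def __init__(self):
        self.epoch = None          # epoch of the current epoch receiver, if any
        self.non_epoch_count = 0   # concurrent non-epoch receivers

    def connect(self, epoch=None):
        """Return True if a new receiver may attach."""
        if epoch is None:
            if self.epoch is not None:
                # Mirrors: "A receiver with epoch 'N' already exists.
                # A new receiver with epoch null cannot be created."
                return False
            if self.non_epoch_count >= self.MAX_NON_EPOCH:
                return False
            self.non_epoch_count += 1
            return True
        if self.epoch is not None and epoch <= self.epoch:
            return False           # lower or equal epoch is rejected
        self.epoch = epoch         # a higher epoch displaces the old receiver
        return True
```

    In this model, `connect(epoch=44223)` followed by a plain `connect()` reproduces the exact conflict in this thread: once the epoch reader holds the partition, the non-epoch connection is refused.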

    Zafar


    Zafar Abbas

    Monday, February 2, 2015 10:17 PM
  • Hi

    Yes, we pull the event hub data into our big data platform on a regular basis. Is there any way around this? We were hoping to pull real-time stats from Stream Analytics to show errors and system activity.

    David

    Tuesday, February 3, 2015 9:03 AM
  • Because of current limitations on creating receivers on an Event Hub using epoch (more details at [1]), Stream Analytics is currently unable to receive from an Event Hub that already has a receiver created using epoch. There are a couple of ways to proceed:

    1. If you have applications other than Stream Analytics that are reading from the Event Hub using epoch, change them to stop using epoch. That way you will be able to create up to 5 receivers per Event Hub consumer group.

    2. We are planning to add support for multiple Event Hub consumer groups to Stream Analytics in the near future. With that support you should be able to create a new consumer group on your Event Hub and then specify it in the Stream Analytics job. This way you will not conflict with your existing application.

    [1] http://blogs.msdn.com/b/gyan/archive/2014/09/02/event-hubs-receiver-epoch.aspx
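    Option 2 boils down to creating a dedicated consumer group so the job does not share $Default with other readers. A minimal sketch of building the Event Hubs management URL for that operation; the host pattern and api-version reflect the Service Bus REST API of that era and should be treated as assumptions to verify against current documentation:

```python
def consumer_group_url(namespace, event_hub, group, api_version="2014-01"):
    """Build the REST management URL for creating or inspecting a
    consumer group on an Event Hub (illustrative, verify against docs)."""
    return ("https://{ns}.servicebus.windows.net/{hub}/consumergroups/{cg}"
            "?api-version={v}").format(ns=namespace, hub=event_hub,
                                       cg=group, v=api_version)

# An authenticated PUT to this URL (e.g. with a SAS token in the
# Authorization header) creates the group; the Stream Analytics input
# would then reference the new group name instead of $Default.
```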


    Zafar Abbas

    Tuesday, February 3, 2015 5:00 PM
  • Hi David,

    Another way to resolve your issue is to have your app (the one reading from the Event Hub) use a non-default consumer group. Your app's use of the default consumer group is why it runs into a conflict with Stream Analytics.

    Let us know if this helps.

    Thanks


    Zafar Abbas

    Friday, February 6, 2015 7:26 PM
  • Hi Zafar

  • I am now getting the same issue with another stream job that I newly created. It is called sporveienstreamanalyticsjob. Can you check what the problem might be? The issue on my side is that I am sending old events but getting no output for them; they are as old as 6 months. Newer events, like 1-2 days old, get through and generate output.

    We do not actually have events that old; they are just for test purposes.

    Regards

    Thursday, February 12, 2015 7:47 AM
  • Please note that support for #2 as described by Zafar has been implemented via the REST APIs. Associating a Stream Analytics job with its own consumer group enables your job to have its own view over the input stream, independent of other event readers. A consumer group is specified via the optional consumerGroupName property in the Create Input request, and it will be surfaced in the portal in a future update.
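    A sketch of what such a Create Input request body might look like with the consumerGroupName property set. Only consumerGroupName is confirmed by this thread; the surrounding field names follow the Stream Analytics management API as I understand it and should be checked against the current REST reference:

```python
def event_hub_input_body(namespace, hub, policy, key, consumer_group):
    """Assemble an illustrative Stream Analytics Create Input payload
    for an Event Hub source, using a dedicated consumer group."""
    return {
        "properties": {
            "type": "Stream",
            "datasource": {
                "type": "Microsoft.ServiceBus/EventHub",
                "properties": {
                    "serviceBusNamespace": namespace,
                    "eventHubName": hub,
                    "sharedAccessPolicyName": policy,
                    "sharedAccessPolicyKey": key,
                    # Dedicated group: the job gets its own view of the
                    # stream, independent of other event readers.
                    "consumerGroupName": consumer_group,
                },
            },
            "serialization": {
                "type": "Json",
                "properties": {"encoding": "UTF8"},
            },
        }
    }
```

    The body would be sent as JSON in a PUT against the job's inputs endpoint; pointing consumerGroupName at a group other than $Default avoids the epoch-receiver conflict discussed earlier in the thread.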

    Thanks for the feedback here.


    Saturday, February 21, 2015 12:36 AM