Issues with sending data in JSON format from Event Hub as input

  • Question

  • Hello,

I am able to send messages/data to my event hub. The problem happens when I want to use the data from the event hub as the input to my Stream Analytics job. I am simulating the inputs with JavaScript.

      const { EventHubClient } = require("@azure/event-hubs");

      const client = EventHubClient.createFromConnectionString(connectionString, eventHubsName);
      let id = 0;

      // Send a simulated sensor reading every 2 seconds
      setInterval(function() {
        var temperature = 25 + (Math.random() * 15);
        var humidity = 70 + (Math.random() * 10);
        var pressure = 90 + (Math.random() * 5);
        var eventData = {id: id, temperature: temperature, humidity: humidity, pressure: pressure};
        id++;
        client.send(eventData);
      }, 2000);

    Here I am just sending eventData directly. However, if I try something like

    var json = JSON.stringify(eventData);

    and send that instead of eventData, I get an error saying "Error data is required and it must be of type object". Just sending eventData gives no errors, and the messages are counted in the event hub. The problem is that when I try to use it as the input to Stream Analytics, I get the error:

    "Error code: BadArgument

    Error message: There was an error while reading sample input. Please check if the input source is configured correctly and data is in correct format."

    On one of the Azure GitHub pages I see a note saying:

    "Note: When working with Azure Stream Analytics, the body of the event being sent should be a JSON object as well. For example: body: { "message": "Hello World" }"

    I don't fully understand this, so my question is: how would I send this data if I followed this rule?

    var data = {id: 1, temperature: 35.444, humidity: 78.555, pressure: 32.455}

    If this is not the correct way to send JSON data into Stream Analytics, then what is?
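
    Going by that note, my best guess is that I should wrap the payload in a body property before sending, something like this (assuming the same client as above, and that client.send() accepts an event object with a body field, as the v2 @azure/event-hubs SDK does):

      var data = {id: 1, temperature: 35.444, humidity: 78.555, pressure: 32.455};
      // Wrap the payload in a `body` property so Stream Analytics receives a JSON object
      client.send({ body: data });

    Is that the intended approach?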

    Thanks in advance for any help



    Wednesday, June 19, 2019 10:49 PM

All replies

  • Hello,

    There are a couple of possible reasons for receiving this error message:

    Normally, this issue is transient. Could you please retry and see if the error persists?

    OR

    Typically, it is recommended to have a unique consumer group for each Azure Stream Analytics job.

    You should configure each Stream Analytics event hub input to have its own consumer group. When a job contains a self-join or has multiple inputs, some inputs might be read by more than one reader downstream. This situation impacts the number of readers in a single consumer group. To avoid exceeding the Event Hubs limit of five readers per consumer group per partition, it's a best practice to designate a consumer group for each Stream Analytics job. 
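
    For example, you could create a dedicated consumer group for the job with the Azure CLI (the resource names below are placeholders) and then select it in the Stream Analytics input settings:

      az eventhubs eventhub consumer-group create \
        --resource-group <your-resource-group> \
        --namespace-name <your-namespace> \
        --eventhub-name <your-event-hub> \
        --name <asa-job-consumer-group>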

    For more details, refer to “Stream data from Event Hubs”.

    Hope this helps.

    Thursday, June 20, 2019 8:56 AM
    Moderator
  • Hello,

    Just checking in to see if the above answer helped. If this answers your query, do click “Mark as Answer” and Up-Vote for the same. And, if you have any further query, do let us know.

    Tuesday, June 25, 2019 11:03 AM
    Moderator