Cosmos DB issue as an Azure Stream Analytics output

  • Question

  • Hello everyone,

    I have a problem with my Cosmos DB as an output of an Azure Stream Analytics job.

    When I launch my ASA job (which takes an IoT Hub as an input and my Cosmos DB as an output), I get the warning below telling me that there is a problem with my Cosmos DB (tutodb). I do not receive data in the target container of my DB, while I do receive data on my IoT Hub. And when I test the connection between my ASA job and both my IoT Hub AND my Cosmos DB, Azure tells me that the connection is established successfully.

    Have you guys ever had such an issue, or do you know where it could come from?

    Thank you in advance.

    The error: Source 'tutodb' had 1 occurrences of kind 'OutputDataConversionError.RequiredColumnMissing' between processing times '2019-08-05T07:46:14:1203171Z' and '2019-08-05T07:46:14:1203171Z'.

    Wednesday, August 7, 2019 8:15 AM

All replies

  • Hello Elies1, and thank you for your inquiry. I found a related issue on Stack Overflow that was caused by case sensitivity in column names. Does the following apply to you?

    If you're specifying fields in your query (i.e. SELECT Name, ModelNumber ...) rather than just using SELECT * ..., the field names are converted to lowercase by default when using compatibility level 1.0, which throws off Cosmos DB. In the portal, if you open your Stream Analytics job and go to 'Compatibility level' under the 'Configure' section and select v1.1 or higher, that should fix the issue. You can read more about compatibility levels in the Stream Analytics documentation here: https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-compatibility-level
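
    For illustration, a minimal sketch of that failure mode (the names below are hypothetical, not taken from your job): under compatibility level 1.0, this query's output columns arrive as 'name' and 'modelnumber', so a Cosmos DB container whose partition key path is /ModelNumber sees its required column as missing.

    SELECT Name, ModelNumber   -- emitted lowercased as 'name', 'modelnumber' under 1.0
    INTO outputdb              -- hypothetical Cosmos DB output alias
    FROM inputhub              -- hypothetical input alias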

    Wednesday, August 7, 2019 8:50 PM
    Moderator
  • Hello,

    Here is the query I used to stream my data into my Cosmos DB:

    SELECT avg(temperature) AS avgtemp
    INTO outputdb
    FROM inputhub
    GROUP BY deviceid, TumblingWindow(second, 30)

    When I choose a Blob storage as an output, it works perfectly, so I do not understand.

    I tried with compatibility levels 1.0, 1.1, and even 1.2, and I always get the same issue.

    Do you have any other ideas?

    Thursday, August 8, 2019 3:05 PM
  • Cosmos DB requires a partition key for all writes. In your SELECT statement, I do not see a partition key; I only see your metric. Without the partition key, Cosmos DB does not know whether these are inserts or updates.
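
    To make that concrete, here is a hedged sketch (field names and values are illustrative) of the document the current query emits versus what a container partitioned on a device column would need:

    -- Emitted by a query that projects only the aggregate (sketch):
    --   { "avgtemp": 22.5 }
    -- What a container partitioned on /deviceid would need (sketch):
    --   { "deviceid": "sensor-01", "avgtemp": 22.5 }
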
    Thursday, August 8, 2019 11:12 PM
    Moderator
  • The partition key of the container I target in my Cosmos DB is /deviceid.

    I don't really understand where to put it in my ASA job query.

    (Note that I already made an ASA job in the past, on another Cosmos DB and another subscription, and I did not need to mention any partition key in my query.)

    Friday, August 9, 2019 8:14 AM
  • SELECT deviceid, avg(temperature) AS avgtemp
    INTO outputdb
    FROM inputhub
    GROUP BY deviceid, TumblingWindow(second, 30)

    I believe it would look like the above. If this doesn't work, let me know, and I will spin up an instance and try it myself.
    Friday, August 9, 2019 9:17 PM
    Moderator
  • The query you sent me makes it work, thank you!

    Monday, August 12, 2019 9:29 AM
  • Awesome!
    Thursday, August 15, 2019 1:10 AM
    Moderator