CSV Append Blob

  • Question

  • So my current situation is as follows:

I'm gathering data from a bunch of sensors, and it is being appended to a single CSV file that is recreated hourly in a new folder, i.e. {data}/{time}/file.csv

Now this is all set up in ASA, but when the job is running it only checks the blob once, not continuously. So when I start the ASA job there is a huge spike on the monitor because all the existing entries are being read, but after that it doesn't do anything anymore. The SU% stays at a constant 30%, though.

So my question is as follows: is it possible to have ASA continuously stream all the data that is being added to a single CSV blob of type append?

    Thursday, March 22, 2018 1:31 PM

All replies

  • Hi,

    ASA does not support that scenario. As noted here:

    "Stream Analytics does not support adding content to an existing blob file. Stream Analytics will view each file only once, and any changes that occur in the file after the job has read the data are not processed. Best practice is to upload all the data for a blob file at once and then add additional newer events to a different, new blob file."
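The recommendation above boils down to: never append, always upload each batch as a brand-new blob under a fresh name. A minimal sketch of that pattern (the path layout mirrors the {data}/{time}/file.csv scheme from the question; the function names and field names are my own, not from any Azure SDK, and the actual upload would be done once per hour with a client library such as azure-storage-blob):

```python
import csv
import io
from datetime import datetime, timezone

def blob_path(ts: datetime) -> str:
    # One new blob per hour, e.g. data/2018-03-22/13/file.csv,
    # so each file is written exactly once and never appended to.
    return ts.strftime("data/%Y-%m-%d/%H/file.csv")

def rows_to_csv(rows, fieldnames):
    # Serialize a full batch of sensor readings into one CSV payload
    # that can be uploaded as a single, complete blob.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical hourly flush: build the payload, then upload it to a
# blob named blob_path(now) in one call instead of appending.
batch = [{"sensor": "s1", "value": 21.5}, {"sensor": "s2", "value": 19.0}]
payload = rows_to_csv(batch, ["sensor", "value"])
path = blob_path(datetime(2018, 3, 22, 13, 0, tzinfo=timezone.utc))
```

Because every hour gets a fresh blob name, ASA picks up each new file as it appears, instead of ignoring appended content in a file it has already read.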

    Thanks,
    Kati Iceva


    This posting is provided "AS IS" with no warranties, and confers no rights.

    Thursday, March 22, 2018 2:33 PM
  • "Best practice is to upload all the data for a blob file at once and then add additional newer events to a different, new blob file."

I used to do this using JSON: every single event fired by a sensor got its own JSON file. The problem with that was that Azure Stream Analytics had a hard time keeping track of all those separate files. So how should I approach this problem so that I can analyze large volumes of data, save it, and do it all in real time?
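A middle ground between one-file-per-event and one ever-growing appended file is micro-batching: buffer events in memory and flush them as one complete file per batch, so the job sees a handful of new blobs instead of thousands of tiny ones. A sketch of that idea (the class and the flush threshold are illustrative assumptions, not an Azure API; `sink` stands in for whatever uploads the finished batch):

```python
class MicroBatcher:
    """Accumulate events and hand off each full batch as one unit,
    so downstream sees a few complete files instead of one per event."""

    def __init__(self, max_events, sink):
        self.max_events = max_events  # flush threshold per batch
        self.sink = sink              # called with (batch_index, events)
        self.buffer = []
        self.batches_flushed = 0

    def add(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.max_events:
            self.flush()

    def flush(self):
        # Emit the current buffer as one batch; a real sink would
        # serialize it and upload it as a single new blob.
        if self.buffer:
            self.sink(self.batches_flushed, self.buffer)
            self.batches_flushed += 1
            self.buffer = []

# Usage: 250 events end up in 3 files (100 + 100 + 50) rather than 250.
out = []
batcher = MicroBatcher(100, lambda i, events: out.append((i, len(events))))
for n in range(250):
    batcher.add({"sensor": "s1", "value": n})
batcher.flush()  # flush the final partial batch
```

In practice you would also flush on a timer (e.g. every minute) so low-traffic periods still produce files promptly; the trade-off is file count versus end-to-end latency.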

    Wednesday, March 28, 2018 12:31 PM