How does the job work in Azure Stream Analytics (ASA)?

  • Question

  • Hi

    I am testing the ...

    https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-quick-create-portal

    ...

    and once I run the job (starting from the day before), I don't see the new file created in BLOB\Container2.

    The interesting thing is that once I forgot to turn off the job overnight, and the file did show up.

    Anyway, my questions are:

    -- How does the tutorial in the link above work? Has anyone tested it?

    -- How does the ASA job work, and why should we run the job starting from yesterday?

    thank you


    Sincerely Nik -- Please kindly mark the post(s) that answered your question and/or vote for the post(s). http://sqldataside.blogspot.ca/ (SQL Tabular + PowerShell)

    Tuesday, May 8, 2018 1:17 PM

Answers

  • Hi

    In the tutorial linked above, MS says:

    "Under Start job, select Custom, for Start time field. Select one day prior to when you uploaded the file to blob storage because the time at which the file was uploaded is earlier than the current time. When you're done, select Start."

    Up to this point, this is not correct. After days of testing, the answer is that ASA does NOT use "the time at which the file was uploaded"; it uses the value of the [time] field that exists inside the file, NOT the upload time.

    I think MS should fix that, or make it clear what the approach is.
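    What this means in practice can be sketched in a few lines of Python. This is a hypothetical simplification of ASA's behavior, not real ASA code: assuming the query timestamps events by a time field inside the payload, the job's start time is compared against that field, not against the blob upload time.

```python
from datetime import datetime, timezone

def events_processed(events, job_start):
    """Return the events an ASA-style job would pick up, assuming the
    query timestamps each event by the 'time' field in its payload
    (not by the blob upload time). Simplified model for illustration."""
    return [e for e in events if e["time"] >= job_start]

# Sample data: the file is uploaded on 2018-05-08, but the 'time'
# values embedded in it are from January.
events = [
    {"id": 1, "time": datetime(2018, 1, 15, tzinfo=timezone.utc)},
    {"id": 2, "time": datetime(2018, 5, 8, 9, 0, tzinfo=timezone.utc)},
]

# A job start of "one day before upload" (May 7) still skips event 1,
# because its embedded time field is older than the start time.
job_start = datetime(2018, 5, 7, tzinfo=timezone.utc)
picked = events_processed(events, job_start)
```

    So setting the start time one day before the upload only helps if the time values inside the file actually fall after that start time.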

    thank you MS 


    Sincerely Nik

    Friday, May 11, 2018 4:03 AM

All replies

  • Hi,

    Thanks for your comment.

    Sometimes the job takes a few minutes to start, since we are provisioning the infrastructure to run the job 24/7.

    For the second question, "why should we run the job starting from yesterday": the reason is that Azure Stream Analytics focuses on analyzing streams of data, so we want to give a starting point for the stream. For IoT Hub and Event Hub inputs, the job can go back in the queue or start from now; for blob storage, ASA uses the blob creation time, and it will read only blobs that were created after the starting time of the job.
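    The blob rule described above can be sketched as follows. This is a hypothetical simplification in Python, for illustration only; the real service also considers things like path and date patterns.

```python
from datetime import datetime, timezone

def blobs_read(blobs, job_start):
    """Sketch of the blob-input rule: an ASA-style job reads only blobs
    whose creation time is at or after the job's start time.
    Simplified model, not the actual ASA implementation."""
    return [b for b in blobs if b["created"] >= job_start]

blobs = [
    {"name": "old.json", "created": datetime(2018, 5, 6, tzinfo=timezone.utc)},
    {"name": "new.json", "created": datetime(2018, 5, 8, 10, 0, tzinfo=timezone.utc)},
]

# Job start set to May 7: only new.json qualifies, because old.json
# was created before the job's start time.
job_start = datetime(2018, 5, 7, tzinfo=timezone.utc)
selected = blobs_read(blobs, job_start)
```

    This is why starting the job earlier than the blob's creation time matters for blob inputs.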

    I hope it answers your question. We'll try to make this more explicit in the doc.

    Thanks,

    JS

    Tuesday, May 8, 2018 2:54 PM
  • Hi

    thanks for the reply

    Yes, my job is running, and I know it takes a few minutes to start; it is running according to the link.

    But it's not writing to the second container. I uploaded the file this morning (2018-05-08) and set the job to start a day before (2018-05-07), and still the job is not reading the file.

    Yes, I did check the file before running the job through the Query Editor, and it did run and provide what is required according to the link. Anyway, can you please check and try the link from Microsoft?

    You will see minor issues, like MyBlobInput must be BlobInput; I have fixed those as well, but still nothing.

    Please check the link and see if you can run it.

    thanks


    Sincerely Nik

    Tuesday, May 8, 2018 3:32 PM
  • Can anyone help, please?

    Sincerely Nik

    Thursday, May 10, 2018 11:55 AM
  • I have tested the link from MS maybe 10 times now, and I have deleted all the objects and recreated them again and again; I am not getting any output to Container2.

    Why, I don't know.

    It's making me nuts.


    Sincerely Nik

    Friday, May 11, 2018 1:53 AM
  • When I test the query in the ASA job, it works and provides an output,

    and I have tested it with the blob inputs and even with uploading a file.

    But when I run the ASA job, it doesn't provide any data. And yes, I have set the job to start a day before, and in some cases an hour before I created any objects.

    why why why


    Sincerely Nik

    Friday, May 11, 2018 1:58 AM
  • While the job was running I uploaded a file; still nothing.

    I can see in the graphs that the OUTPUT EVENT count is zero.


    Sincerely Nik

    Friday, May 11, 2018 2:41 AM