WebJob set to run continuously keeps aborting

    Question

  • I have an Azure WebJob that is set up to run continuously on my website, but it randomly aborts; per the logs, "Status changed to Aborted" happens at random times.

    I understand my website could become idle and time out, but I have a couple of web services underneath it that are in pretty much constant use, so I don't think it's timing out (although it could be).

    In any case, I'm not going to switch my website from Shared to Standard just to be able to enable the "Always On" feature for this particular purpose; that seems ridiculous. Any other ideas on how I can keep my website from recycling and my WebJob from aborting all the time and not restarting? This pretty much makes continuous WebJobs useless on non-Standard websites.

    P.S. One "hack" I'm trying, without success so far, is making a call to the REST API at "mysite.scm.azurewebsites.net/jobs/myjob/start" to make sure my job restarts whenever someone loads the web page (which is pretty much all the time). I keep getting 403 Access Denied on that call, which is weird because a GET to "mysite.scm.azurewebsites.net/jobs/myjob" works fine with the same credentials. Any help getting this hack working, or another workaround, is welcome :)
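    For reference, the call I'm attempting looks roughly like the sketch below. The site name, job name, and credentials are placeholders, and the path is the one from the SCM (Kudu) URL mentioned above; this is a minimal illustration, not the exact code I'm running.

```python
import base64
import urllib.request


def kudu_request(site, path, user, password, method="GET"):
    # Every call to the *.scm.azurewebsites.net (Kudu) endpoint needs the
    # site's deployment credentials sent via HTTP basic auth; without the
    # Authorization header it answers 403.
    url = f"https://{site}.scm.azurewebsites.net/{path.lstrip('/')}"
    req = urllib.request.Request(url, method=method)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req


# Placeholder site/job/credentials -- substitute your own.
req = kudu_request("mysite", "jobs/myjob/start",
                   "deploy-user", "deploy-password", method="POST")
# urllib.request.urlopen(req)  # would fire the actual POST
```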

    thanks,

    ~ L

    Thursday, March 27, 2014 12:07 AM

Answers

  • To use a continuous WebJob you need a Standard site with "Always On" enabled; the reason you can still create a continuous WebJob on Free/Shared sites is for experimentation and trying out the feature.

    The reason you get a 403 is probably that you aren't providing the basic-auth credentials (your site's deployment credentials) with your request. Doing this will keep your WebJob up as long as you have traffic; once the traffic stops, the WebJob may go down again.

    BTW: you don't need to hit the start URL; a request to the root of the SCM site will also work.
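    A minimal sketch of such a keep-alive request (site name and credentials are placeholders; the point is just an authenticated GET against the SCM root):

```python
import base64
import urllib.request


def scm_root_ping(site, user, password):
    # An authenticated GET to the SCM root is enough to keep the Kudu
    # process -- and with it the continuous WebJob -- warm; there is no
    # need to hit the job's start URL.
    req = urllib.request.Request(f"https://{site}.scm.azurewebsites.net/")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req


# urllib.request.urlopen(scm_root_ping("mysite", "deploy-user", "deploy-password"))
```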

    • Proposed as answer by AmitApple Thursday, March 27, 2014 1:14 AM
    • Marked as answer by LCyberFoxx Thursday, March 27, 2014 1:34 AM
    Thursday, March 27, 2014 1:13 AM

All replies

  • Thanks Amit,

    My complaint here is simple: don't advertise WebJobs as a feature of non-Standard sites if it essentially does not work on non-Standard sites. I spent countless hours creating my WebJob with the expectation that it would work as documented. Yes, if it exceeded the CPU quota (which is not my case) that would be a different story, but a "continuous" job aborting just because I'm not on Standard is a bit ridiculous.

    Anyhow, thanks for the tip on just hitting the SCM root; I have that going now, and we'll see if the job stops aborting. BTW, the 403 is strange: I don't get it when I hit the SCM root or when I make GET calls to the WebJobs endpoints, only on the POST call. And yes, I'm using the site's deployment credentials; otherwise even the GET calls would fail. Regardless, hitting the root should hopefully do it.

    Thanks,
    ~ L

    Thursday, March 27, 2014 1:37 AM
  • I understand your concern; we will improve our documentation in this respect.

    Thanks,

    Amit

    Thursday, March 27, 2014 5:24 PM
  • Just to report back: my "hack" is working beautifully. Hitting the SCM page from my website keeps the WebJob alive, since I have enough users hitting the page on a daily basis. It's a bit of a security risk, since the management login info now lives on my website, but oh well. Thanks for the tip.
    Thursday, March 27, 2014 10:33 PM
  • Thanks for this thread; I learnt a lot, because the information available wasn't exactly obvious or clear.

    I have basically the same requirements and issues as the OP. I'm not really interested in paying for a Basic or Standard tier website, as it's well above my needs, which are adequately covered by the Shared tier with the exception of "Always On".

    Inspired by LCyberFoxx's solution, I've been investigating the use of an external service that periodically calls my website to ensure it stays alive. I've been using Pingdom, as they have a free tier that lets you availability-track one website: https://www.pingdom.com/free/

    Every minute Pingdom calls my Azure site; the site isn't going down, and the jobs aren't aborting!

    You might consider this a suitable alternative to putting the code within your own application, especially for applications that aren't guaranteed to be used that frequently.
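    If you'd rather not depend on an external service, the same idea can be sketched as a small self-hosted pinger. This is only an illustration: the interval is a placeholder, and `do_request` is injected (e.g. `lambda: urllib.request.urlopen("https://mysite.azurewebsites.net")`) so the loop itself can be exercised without a network.

```python
import time


def keep_alive(do_request, interval_seconds=60, max_pings=None):
    # Poor man's Pingdom: call do_request() every interval_seconds so a
    # Shared-tier site never sits idle long enough to be recycled.
    # max_pings=None means "run forever"; a number bounds the loop.
    count = 0
    while max_pings is None or count < max_pings:
        try:
            do_request()
        except OSError:
            pass  # a transient failure is fine; the next ping retries
        count += 1
        if max_pings is None or count < max_pings:
            time.sleep(interval_seconds)
    return count
```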

    Tuesday, June 3, 2014 11:30 PM