NotImplementedException in WebJob

  • Question

  • Please help me understand where the problem might be; it's quite urgent for me.

    I had to process about 1,300 messages from a storage queue. At a certain point (8:22:58 PM) I started receiving this exception very frequently, although some messages were still being processed correctly.
    A couple of hours later (10:23:15 PM), when about 300 messages were left, every single message picked up ended with the same exception, and this morning I found all the remaining messages in the poison queue.

    I copied all the messages back to the original queue and tried again, with the same result.
    Restarting the application, killing the job from Kudu, etc. didn't help.
    I then redeployed my console application (with no changes in code or references) as a WebJob in the same App Service from VS 2015 Update 2, started the WebJob (it's on-demand, triggered manually the first time), and everything ran fine until the queue was empty.

    This App Service has Always On enabled, contains only two WebJobs (console applications), and was created directly from VS when I published the first one.
    The second WebJob was published into the same app.

    The error is in WebJob #1: it gets a URL from the message, scans a web page, and writes some data to a second queue (the function shape is sketched below).

    Please help, thank you.
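
    For context, the trigger function has roughly this shape (a simplified sketch only: the queue names, the Url property and the crawler call here are placeholders, not my real code):

        Imports System.IO
        Imports Microsoft.Azure.WebJobs

        ' Rough shape of the message POCO; the real class has more fields (Url is assumed here)
        Public Class myPocoObj
            Public Property Url As String
        End Class

        Public Class Functions
            ' Runs once per message on the input queue (BatchSize is set to 1)
            Public Sub ProcessMyMessage(<QueueTrigger("input-queue")> pocoObj As myPocoObj,
                                        <Queue("output-queue")> output As ICollector(Of String),
                                        logger As TextWriter)
                ' Get the URL from the message, scan the page, write the result to the second queue
                Dim scanned As String = "scanned: " & pocoObj.Url   ' stand-in for the real crawler call
                output.Add(scanned)
                logger.WriteLine("Processed " & pocoObj.Url)
            End Sub
        End Class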


    Full stack:

    Function had errors. See Azure WebJobs SDK dashboard for details. Instance ID is 'e24ef222-c9f4-4e79-9e56-664296ee56f6'
    Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.ProcessMyMessage ---> System.NotImplementedException: The method or operation is not implemented.
    at Crawler.Jobs.Functions.ProcessMyMessage(myPocoObj pocoObj, TextWriter logger)
    at lambda_method(Closure , Functions , Object[] )
    at Microsoft.Azure.WebJobs.Host.Executors.VoidMethodInvoker`1.InvokeAsync(TReflected instance, Object[] arguments)
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`1.<InvokeAsync>d__0.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithWatchersAsync>d__31.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__2c.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
    at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__13.MoveNext()
    --- End of inner exception stack trace ---
    at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__13.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
    at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<TryExecuteAsync>d__1.MoveNext()


    Marco

    Friday, June 17, 2016 8:25 AM

All replies

  • Hmmm, never seen that one before. How long have you had the job running before this started happening - was it running fine for a long time, then you changed something? What version of the SDK are you using? Any specific steps to repro this?

    Mathew Charles [MSFT]

    Friday, June 17, 2016 5:11 PM
  • Thank you for your attention.

    It had been running since 6:45 PM, only a couple of hours, with every queue message processed in 3-10 seconds. I've set the job host BatchSize to 1 (host setup sketched after the code below) because I need to process only one message at a time.

    I don't know how to repro it. I will try again, but I'd much appreciate any suggestions; I'm new to all this Azure stuff.

    Microsoft.Azure.WebJobs is v1.1.2, WindowsAzure.Storage is 7.0.0, and Microsoft.Data...series is 5.6.4 (I see 5.7.0 is available in NuGet).

    Code inside my trigger sub is something like this:

        ' Buffer for log output written at the end of the run
        Dim msgs As New Text.StringBuilder
        Try
            ' Resolve a browser for this queue message and scan the page
            Dim br As MyBrowser = BrowserFactory.Instance.GetBrowser(pocoObj)
            Dim ok As Boolean = br.GetObjectDetails

            If Not ok Then Throw New ApplicationException("GetObjectDetails Failed")
            msgs.AppendLine("OK RESULT: " & ......)

        Catch ex As Exception
            AddMessage(msgs, ex)
            Throw   ' rethrow as-is; "Throw ex" would reset the stack trace
        Finally
            Try
                ' Flush the collected log lines to the WebJobs console log
                Console.Out.WriteLine(msgs.ToString)
            Catch ex As Exception
                'ignored
            End Try
        End Try
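
    For completeness, the host setup in Main is roughly like this (simplified; the storage connection strings come from the app settings):

        Imports Microsoft.Azure.WebJobs

        Module Program
            Sub Main()
                ' Process one queue message at a time
                Dim config As New JobHostConfiguration()
                config.Queues.BatchSize = 1
                ' Note: after MaxDequeueCount failed attempts (default 5) a message goes to the poison queue

                ' On-demand WebJob: run until the process is stopped
                Dim host As New JobHost(config)
                host.RunAndBlock()
            End Sub
        End Module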


    Marco

    Friday, June 17, 2016 5:32 PM
  • First, I recommend you add some more exception handling/logging to your own code to see whether the NotImplementedException is actually coming from your code (see the sketch below).

    Another potential cause is that you've upgraded to Azure Storage SDK version 7.0. The WebJobs SDK targets an older version (4.3), but for the 1.1.2 release we removed the version cap to allow people to upgrade to higher versions if those versions work for them. I'm pretty confident that Storage SDK version 6.2 works with our SDK (since we ran some full test passes on it), but I do know version 7.0 included some breaking changes.

    For more info on versioning, see #700 and #698. If you determine that the exception is not coming from your code, you could try downgrading to 6.2 or earlier.
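
    For example, something along these lines (just a sketch; DoWork stands in for your MyBrowser/GetObjectDetails call) would surface the full exception chain in the WebJobs log, so you can see exactly where the NotImplementedException originates:

        Module Diagnostics
            Sub ProcessWithDiagnostics()
                Try
                    DoWork()
                Catch ex As Exception
                    ' ex.ToString() includes the exception type, message, stack trace
                    ' and any inner exceptions, which pinpoints the throwing call
                    Console.Error.WriteLine(ex.ToString())
                    Throw   ' rethrow without resetting the original stack trace
                End Try
            End Sub

            Sub DoWork()
                ' Placeholder for the real logic
                Throw New NotImplementedException()
            End Sub
        End Module

    If the Storage SDK does turn out to be the culprit, downgrading from the NuGet Package Manager Console (for example, Install-Package WindowsAzure.Storage -Version 6.2.0) should be enough.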


    Mathew Charles [MSFT]


    Friday, June 17, 2016 5:42 PM
  • I sure will try over the weekend and report back.

    Thank you so far.


    Marco

    Friday, June 17, 2016 5:47 PM
  • Charles, you were right about SDK 6.2 instead of 7.0; now it's working smoothly.

    Thank you very much.


    Marco

    Wednesday, June 22, 2016 3:08 PM
  • Ouch! I spoke too soon...

    Today I got exactly the same exception. I killed the WebJob from Kudu, double-checked the try/catch inside my code, and published from VS with no changes (same code, same references); now it's working perfectly again and it consumed every message from the queue (only 300; the process completed in a few minutes).

    Since I downgraded the Storage SDK to 6.2 a week ago, I had not seen this exception again until today. The WebJob is on-demand: I start it manually, it's often waiting on an empty queue, and once or twice a day it receives some messages to process (300-1,000). The job was restarted on Wednesday the 22nd, when I published some changes to the business logic code; on Wednesday it processed 2,500 messages and yesterday 1,000, without problems.

    What else can I check?

    The App Service plan is B1 with very low CPU usage and memory steady at 60-65%; the web app has Always On enabled. Here are some process properties from Kudu before I killed the process:

    handle count: 516
    module count: 76
    thread count: 12
    start time: 2016-06-22T10:14:19.5927424Z
    total cpu time: 00:06:28.0937500
    user cpu time: 00:06:09.6875000
    privileged cpu time: 00:00:18.4062500
    working set: 10,908 KB
    peak working set: 38,448 KB
    private memory: 27,968 KB
    virtual memory: 206,408 KB
    peak virtual memory: 214,748 KB
    paged system memory: 331 KB
    non-paged system memory: 37 KB
    paged memory: 27,968 KB
    peak paged memory: 37,164 KB


    Marco

    Friday, June 24, 2016 7:08 AM