Fault Tolerant Background Job

  • Question

  • User1985972524 posted

    I have a background worker service up and running in my ASP.NET Core 2.2 app. I have run into a problem I want to solve, which is how to make that background service fault tolerant.

    I have a shared table on which two different jobs select and update data. Basically we are syncing Office Calendar and Mail, so the access token has to be stored in the database. If either of the services fails to access the API, it refreshes access using the refresh token and then updates the tokens in the database as well. Now there can be a situation where one service is still working with the old token in that row while the other has already updated it in the database, and both services end up refreshing and updating the tokens over and over again. What would be a workaround to avoid such scenarios?

    Thanks

    Monday, April 13, 2020 3:51 PM

All replies

  • User-474980206 posted

    Use proper locking around token use. Tokens should be checked out for update and checked back in when valid. The checkout should have a timeout in case the process that did the checkout fails.

    logic

    new = false
    loop
       token = getToken(new) 
       if useToken() exit loop
       new = true
    end loop
    func getToken(new)
       loop
           if new 
              check out token
              if token is already checked out
                 new = false
                 continue loop
              end if
              token = create new database token 
              release checkout
              return token
           else
              token = read current database token
              if token is checked out
                 delay 
                 continue loop
              end if
           if check out is expired
               new = true
           end if
        end if
       end loop
    end func

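    A minimal C# sketch of that check-out/check-in idea, assuming a SQL Server table (here called ApiTokens, with AccessToken, RefreshToken and CheckedOutUntilUtc columns; all of these names are made up for illustration). The single conditional UPDATE is what makes the checkout atomic and gives it a timeout:

    using System;
    using System.Data.SqlClient;
    using System.Threading.Tasks;

    public class TokenStore
    {
        private readonly string _connectionString;
        private static readonly TimeSpan CheckoutLease = TimeSpan.FromMinutes(2);

        public TokenStore(string connectionString) => _connectionString = connectionString;

        // Try to atomically claim the token row. Only one worker can win;
        // an expired checkout (crashed worker) can be taken over.
        public async Task<bool> TryCheckOutAsync(int tokenId)
        {
            const string sql = @"
    UPDATE ApiTokens
    SET CheckedOutUntilUtc = DATEADD(SECOND, @lease, SYSUTCDATETIME())
    WHERE Id = @id
      AND (CheckedOutUntilUtc IS NULL OR CheckedOutUntilUtc < SYSUTCDATETIME());";

            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@id", tokenId);
                cmd.Parameters.AddWithValue("@lease", (int)CheckoutLease.TotalSeconds);
                await conn.OpenAsync();
                // 1 row affected = we won the checkout; 0 = someone else holds it.
                return await cmd.ExecuteNonQueryAsync() == 1;
            }
        }

        // Store the refreshed tokens and release the checkout in one statement.
        public async Task CheckInAsync(int tokenId, string accessToken, string refreshToken)
        {
            const string sql = @"
    UPDATE ApiTokens
    SET AccessToken = @access, RefreshToken = @refresh, CheckedOutUntilUtc = NULL
    WHERE Id = @id;";

            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@id", tokenId);
                cmd.Parameters.AddWithValue("@access", accessToken);
                cmd.Parameters.AddWithValue("@refresh", refreshToken);
                await conn.OpenAsync();
                await cmd.ExecuteNonQueryAsync();
            }
        }
    }

    A job that fails an API call first tries TryCheckOutAsync. If it loses, it just re-reads the row after a short delay, because the winner will have checked a fresh token back in; if the winner crashed, the expired lease lets the other job take over. That is what prevents the two services from refreshing the same token over and over.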

    Monday, April 13, 2020 5:56 PM
  • User1985972524 posted

    Hello, thanks for your reply. There is another thing I would like to know. Right now there is only one job that both gets the data from the Office API and pushes it to the database. I have seen queue workers in Laravel where we have the option to queue the data into Redis or the database, and then another worker processes those entries from the database or Redis. Do we have such a feature/framework/library for ASP.NET Core?

    Moreover, we are close to MVP deployment and we upload the .dll files over and over again. This definitely interrupts the job in the middle. For example, say we received 100 emails from the Outlook API, processed 50 of them, and then the dotnet service on Linux was restarted. We still only have the previous deltaLink/sync token, because the new sync token is stored in the database only after all 100 emails have been processed. When the service is restarted, the system is going to process the previously imported entries again, so we will end up with redundant data in the database. How can we avoid such situations?

    Tuesday, April 14, 2020 1:20 PM
  • User302204515 posted

    Hi hasnihaider,

    Do we have such a feature/framework/library for ASP.NET Core?

    I think what you mean is something like a message queue with message push and pop operations. This can be separated into three different scenarios:

    1. Message queue within the same process -> Yes, there is a built-in queue type for this (a minimal sketch of a queued background worker follows this list); see the docs here: https://docs.microsoft.com/en-us/dotnet/api/system.collections.generic.queue-1?view=netframework-4.8
    2. Message queue across processes but within the same machine -> Yes, there is something called named pipes that different processes can use to communicate with each other just like a message queue; see the article here: https://www.codeguru.com/csharp/csharp/cs_misc/sampleprograms/article.php/c7259/InterProcess-Communication-in-NET-Using-Named-Pipes-Part-1.htm
    3. Message queue across different processes and different machines -> No, ASP.NET Core does not have built-in support for it; you need to utilize a third-party message queue product such as RabbitMQ, Kafka, or Redis. All of these message queue products have official .NET client libraries and I believe you can find them on NuGet easily.
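    For scenario 1 in an ASP.NET Core app, a common shape is an in-process queue that a hosted BackgroundService drains. This is only a sketch (the EmailQueue and EmailProcessingWorker names are made up); it uses System.Threading.Channels, a thread-safe NuGet package that works well with async code, instead of a plain Queue<T>:

    using System.Threading;
    using System.Threading.Channels;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Hosting;

    // Producer side: whatever pulls data from the Office API writes ids here.
    public class EmailQueue
    {
        private readonly Channel<string> _channel = Channel.CreateUnbounded<string>();

        public ValueTask EnqueueAsync(string messageId) => _channel.Writer.WriteAsync(messageId);

        public ValueTask<string> DequeueAsync(CancellationToken ct) => _channel.Reader.ReadAsync(ct);
    }

    // Consumer side: a hosted worker that processes queued items one by one.
    public class EmailProcessingWorker : BackgroundService
    {
        private readonly EmailQueue _queue;

        public EmailProcessingWorker(EmailQueue queue) => _queue = queue;

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                var messageId = await _queue.DequeueAsync(stoppingToken);
                // TODO: load the message from the API and write it to the database.
            }
        }
    }

    Register both in Startup.ConfigureServices with services.AddSingleton<EmailQueue>() and services.AddHostedService<EmailProcessingWorker>(). Note this only helps within one process; for durability across restarts or machines you are back to scenario 3 (RabbitMQ, Kafka, Redis, etc.).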

    For the second question, the best way is to persist your processing status somewhere, and whenever the service/system restarts, read that status back and resume your job from the break point.
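    As an illustration (the type and property names below are just examples), the worker can persist a small checkpoint after each item and read it back on startup, so a redeploy in the middle of 100 emails resumes at email 51 instead of email 1:

    using System;

    // A tiny progress record, stored as one row per job in the database.
    public class SyncCheckpoint
    {
        public string JobName { get; set; }                 // e.g. "outlook-mail-sync"
        public string PendingDeltaLink { get; set; }        // the page currently being processed
        public string LastProcessedMessageId { get; set; }  // advanced after every saved email
        public DateTime UpdatedAtUtc { get; set; }
    }

    // Worker outline (persistence helpers omitted):
    //   var cp = LoadCheckpoint("outlook-mail-sync");           // null on first run
    //   foreach (var msg in FetchPage(cp?.PendingDeltaLink))
    //   {
    //       if (AlreadyProcessed(cp, msg.Id)) continue;         // skip work done before the restart
    //       SaveEmail(msg);
    //       cp.LastProcessedMessageId = msg.Id;
    //       SaveCheckpoint(cp);                                 // ideally in the same transaction as SaveEmail
    //   }
    //   // only now store the new deltaLink / sync token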

    Thursday, April 16, 2020 9:47 AM
  • User-474980206 posted

    If you are processing emails, you should be logging the message id so you don't reprocess the same message.
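    For example (the table name and helper below are made up), keep a ProcessedMessages table with a unique key on the message id and let the database reject duplicates, so replaying a page after a restart is harmless:

    using System.Data.SqlClient;
    using System.Threading.Tasks;

    public static class MessageDeduplication
    {
        // Returns true if this message id has not been imported yet.
        // Relies on a PRIMARY KEY or unique index on ProcessedMessages.MessageId.
        public static async Task<bool> TryMarkProcessedAsync(string connectionString, string messageId)
        {
            const string sql =
                "INSERT INTO ProcessedMessages (MessageId, ProcessedAtUtc) VALUES (@id, SYSUTCDATETIME());";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@id", messageId);
                await conn.OpenAsync();
                try
                {
                    await cmd.ExecuteNonQueryAsync();
                    return true;  // first time we've seen it: go ahead and import
                }
                catch (SqlException ex) when (ex.Number == 2627 || ex.Number == 2601)
                {
                    return false; // duplicate key: already processed before the restart
                }
            }
        }
    }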

    As Microsoft is pushing cloud computing, it should be no surprise that on-prem computing lacks managed message queues. If you picked AWS or Azure, you could use an event hub and serverless functions for this architecture.

    Azure also has the cool Durable Functions.

    Thursday, April 16, 2020 10:24 PM