We are starting to load test a new design of our app that uses Azure Queue and Table storage. In a recent test we saw an effect that in some ways acts like throttling, but the errors we received were not what we expected. We ran a 1-hour load test simulating about 200 requests per second, and we see a pattern where, after about 15-20 minutes, we start receiving HTTP 403 errors when calling EndAddMessage() on a queue; the errors last just a couple of minutes and then clear up. The same pattern repeats after another 15-20 minutes. This is similar to what I have seen with SQL Azure, but I am receiving HTTP 403 errors where I expected HTTP 503 (which is, I believe, what is documented for throttling).
The workload here accepts 200 requests per second from outside Azure, hits Table storage for a lookup using a highly partitioned schema, and then Queue storage for an async add. As I mentioned, the error occurs when calling EndAddMessage() on the queue from the callback passed to BeginAddMessage(). The app uses a single storage account, so the total load on the account is under the 500 requests per second that I understand a single partition to be capable of, and I am using multiple partitions (tables and queues).
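For context, here is a rough sketch of the call pattern (not my exact code; class and queue names are made up, but the Begin/End shape matches the v1.x managed storage client):

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class QueueAddSketch
{
    private readonly CloudQueue _queue;

    public QueueAddSketch(CloudStorageAccount account, string queueName)
    {
        _queue = account.CreateCloudQueueClient().GetQueueReference(queueName);
    }

    // Fired ~200 times/sec under load.
    public void AddAsync(string payload)
    {
        _queue.BeginAddMessage(new CloudQueueMessage(payload), AddMessageComplete, null);
    }

    private void AddMessageComplete(IAsyncResult ar)
    {
        try
        {
            _queue.EndAddMessage(ar); // the StorageClientException (403) surfaces here
        }
        catch (StorageClientException ex)
        {
            // log and count the failure; see the stack trace below
        }
    }
}
```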
I also wouldn't expect this load to trigger throttling, as partitions are documented as being capable of 500 rps (http://goo.gl/iBiuJ) and queues of up to 2,000 rps (http://goo.gl/N4GlI). So maybe my throttling thoughts are way off base.
So what other explanations might there be for this pattern and for 403 errors? My code is pretty simple and I am using the managed storage client so it seems very unlikely there are actually any authorization header errors.
Below is what the errors look like. Any thoughts?
Microsoft.WindowsAzure.StorageClient.StorageClientException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
at Microsoft.WindowsAzure.StorageClient.EventHelper.ProcessWebResponse(WebRequest req, IAsyncResult asyncResult, EventHandler`1 handler, Object sender)
--- End of inner exception stack trace ---
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
at Digimarc.Data.ResolverDal.RequestQueueRepository.AddMessageComplete(IAsyncResult ar) in
Not sure if this is related, but a while back, when we were load testing one of our applications, we ran into a similar issue. We were trying to do bulk data transfer from/to Azure. Since the Internet connectivity from India sucks and Windows Azure at that time didn't offer RDP functionality, we decided to host our application in an Amazon VM instance. The application worked fine for some time and then we started getting 403 errors. The specific error message was "Request Too Old". Upon further inspection, we found that we had somehow managed to slow down the clock in the Amazon VM. Over a period of time the clock drifted by more than 20 minutes, and that's when we started to see the errors.
Here's the question I posted on MSDN forums at that time: http://social.msdn.microsoft.com/Forums/en-US/windowsazuredata/thread/6370a519-b016-42fc-87da-1a8b6d6a97e4
Could this be the issue?
Hope this helps.
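If you want to rule out clock skew quickly, one option is to compare your machine's UTC clock against the Date header the storage service returns. A minimal sketch (the account URL is a placeholder; even an error response carries a Date header):

```csharp
using System;
using System.Net;

class ClockSkewCheck
{
    static void Main()
    {
        var req = (HttpWebRequest)WebRequest.Create(
            "http://myaccount.queue.core.windows.net/"); // hypothetical account name
        try
        {
            using (var resp = (HttpWebResponse)req.GetResponse()) { Report(resp); }
        }
        catch (WebException ex) // a 4xx response still includes the Date header
        {
            var resp = ex.Response as HttpWebResponse;
            if (resp != null) Report(resp);
        }
    }

    static void Report(HttpWebResponse resp)
    {
        DateTime serverUtc = DateTime.Parse(resp.Headers["Date"]).ToUniversalTime();
        TimeSpan skew = DateTime.UtcNow - serverUtc;
        // Shared Key requests are rejected when the request time is more than
        // ~15 minutes off from the service's clock.
        Console.WriteLine("Clock skew vs storage service: {0}", skew);
    }
}
```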
Thanks for the thought Gaurav, but I don't think that is our problem here. The load test ran for just one hour and cycled through this problem twice, then recovered, within that hour.
I also didn't mention that if we ease back on the load from 200 requests per second to 100, the problem occurs only once and lasts for only a few seconds. Then if we go down to 75 it goes away entirely.
Can you look at the inner exception when this happens? In our case, we found the "Request Too Old" message in the inner exception. It should be either in InnerException or in ExtendedErrorInformation (http://msdn.microsoft.com/en-us/library/windowsazure/microsoft.windowsazure.storageclient.storageclientexception_members.aspx).
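In the callback, that inspection might look something like this (a sketch against the v1.x client; the storage error code, e.g. AuthenticationFailed vs. a throttling code, is usually the most telling part):

```csharp
try
{
    queue.EndAddMessage(ar);
}
catch (StorageClientException ex)
{
    Console.WriteLine("Status:  {0}", ex.StatusCode);        // e.g. Forbidden (403)
    if (ex.ExtendedErrorInformation != null)
    {
        Console.WriteLine("Code:    {0}", ex.ExtendedErrorInformation.ErrorCode);
        Console.WriteLine("Message: {0}", ex.ExtendedErrorInformation.ErrorMessage);
    }
    if (ex.InnerException != null)
        Console.WriteLine("Inner:   {0}", ex.InnerException.Message);
}
```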