I was doing a simple POC to save some data in Azure tables. If I save a lot of records from the client side using just `DataContext.SaveChanges()`, it always works, but it is slow. If I do the same thing with `SaveChangesOptions.Batch` it is much faster, but when there are more than about 40 changes I get an error with code 400 (Bad Request).
What can the batch size be? I read somewhere that the batch size should be 100, so if we are pushing more than 100 entities to Azure tables we need to split the changes into batches of 100. But I am getting the error when there are more than 40 changes. Is this expected?
I am using SDK 1.3 and the data services client to save to Azure tables.
There are a couple of additional constraints when performing group operations (batching) on a table. Besides the maximum number of operations (100), all entities in the batch must have the same PartitionKey, you may perform only a single operation on any given entity within the batch, and the total payload of the batch must not exceed 4 MB. Please verify that you are not breaking any of these constraints.
You can check out the full list of constraints here:
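The grouping itself is easy to do on the client before calling the SDK's batch save. The SDK in question is .NET, but the logic is language-agnostic; below is an illustrative sketch in Python (function and field names are hypothetical, not part of any SDK) that groups entities by PartitionKey and splits each group into chunks of at most 100:

```python
from itertools import islice

def batches(entities, max_batch_size=100):
    """Group entities by PartitionKey, then yield chunks of <= max_batch_size.

    Each yielded chunk satisfies two of the entity-group-transaction rules:
    a single PartitionKey per batch, and at most 100 operations per batch.
    (The 4 MB payload limit would still need a separate size check.)
    """
    by_partition = {}
    for entity in entities:
        by_partition.setdefault(entity["PartitionKey"], []).append(entity)
    for partition_entities in by_partition.values():
        it = iter(partition_entities)
        while chunk := list(islice(it, max_batch_size)):
            yield chunk

# Example: 150 entities in partition "A" and 100 in partition "B"
entities = [{"PartitionKey": "A", "RowKey": str(i)} for i in range(150)] + \
           [{"PartitionKey": "B", "RowKey": str(i)} for i in range(100)]
groups = list(batches(entities))
# -> 3 batches: 100 + 50 from partition "A", 100 from partition "B"
```

In the .NET data services client, each chunk would then be saved with its own `SaveChanges(SaveChangesOptions.Batch)` call.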