Best practices for reading Azure Storage rows

  • Question

  • Background - we will be using .NET 4.0, Azure SDK 1.7, Azure Table Storage

    Problem: What is the most efficient way (i.e., fastest processing time) to read N entities, where N is large (thousands to millions) and each entity is very small (<200 bytes), from a set of Azure tables, given that the PartitionKey and RowKey of every entity are known up front, i.e., [(P1,R1),(P2,R2),...,(PN,RN)]?

    So far, all our tests reading data from Azure Storage have underperformed. Some benchmarks: 800 rows are read in about 2 seconds. Profiling the run, we see high CPU utilization for the first 500 ms, followed by a significant drop-off; the read then continues for another 1.5 seconds at relatively low CPU usage.
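    For what it's worth, the profile you describe (short CPU burst, then a long low-CPU tail) is typical of serialized HTTP round trips. With the SDK 1.7-era storage client a common mitigation is to raise the outbound connection limit, disable Expect-100 and Nagle, and fan the point queries out across threads (no async/await on .NET 4.0). Below is a minimal sketch under those assumptions; the table name "people", the PersonEntity type, and the key list are placeholders, not anything from your code:

    ```csharp
    // Sketch only: assumes Azure SDK 1.7 (Microsoft.WindowsAzure.StorageClient).
    // "people", PersonEntity, and the key list are hypothetical placeholders.
    using System;
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Linq;
    using System.Net;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public class PersonEntity : TableServiceEntity { }

    public static class ParallelPointReads
    {
        public static IList<PersonEntity> ReadAll(
            CloudStorageAccount account, IList<Tuple<string, string>> keys)
        {
            // Point queries are many small HTTP requests; the defaults throttle them.
            ServicePointManager.DefaultConnectionLimit = 64;  // default is only 2 per host
            ServicePointManager.Expect100Continue = false;    // saves a round trip per request
            ServicePointManager.UseNagleAlgorithm = false;    // avoid Nagle delay on small payloads

            CloudTableClient client = account.CreateCloudTableClient();
            var results = new PersonEntity[keys.Count];

            // .NET 4.0: fan out with Parallel.ForEach over index ranges.
            Parallel.ForEach(Partitioner.Create(0, keys.Count), range =>
            {
                // One context per worker: TableServiceContext is not thread-safe.
                TableServiceContext ctx = client.GetDataServiceContext();
                for (int i = range.Item1; i < range.Item2; i++)
                {
                    string pk = keys[i].Item1, rk = keys[i].Item2;
                    // PartitionKey + RowKey filter => a server-side point query.
                    results[i] = ctx.CreateQuery<PersonEntity>("people")
                                    .Where(e => e.PartitionKey == pk && e.RowKey == rk)
                                    .FirstOrDefault();
                }
            });
            return results;
        }
    }
    ```

    With enough concurrency, wall-clock time is then bounded roughly by per-request latency divided by the degree of parallelism rather than by the sum of serial round trips, which should shrink the low-CPU tail you observed.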

    Would appreciate any feedback.

    Wednesday, December 12, 2012 6:56 PM