PropertyValueTooLarge Exception

  • Question

  • I am getting this exception when trying to add something to my Azure table. Obviously my entity, which is a serialized object, is too big to fit in a table. I am planning to store it in a blob instead of the table. Is that the right approach? Thank you for your help.

    Regards,

    Dinesh Agarwal
    Thursday, April 7, 2011 6:38 PM

All replies

  • The limits for properties in Azure Table are documented as follows:

    -- An entity may have up to 255 properties, including the 3 system properties described in the following section. Therefore, the user may include up to 252 custom properties, in addition to the 3 system properties. The combined size of all data in an entity's properties may not exceed 1 MB.

    A blob looks like it is the way to go. However, you lose querying ability, so you might want to put an entity in the table that you can search on, but store the large property in a blob, with a link to it in the table entity.
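
    As a minimal sketch of that split, assuming the Microsoft.WindowsAzure.StorageClient library of that era and hypothetical names (PolygonEntity, a "polygons" container, a "Polygon" table, none of which come from the original post), the large serialized payload goes to a blob and the table entity keeps only queryable fields plus a link:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // Hypothetical entity: queryable fields stay in the table; the payload lives in a blob.
    public class PolygonEntity : TableServiceEntity
    {
        public string BlobUri { get; set; } // link to the large serialized object
    }

    public static class PolygonStore
    {
        public static void Save(CloudStorageAccount account, string rowKey, byte[] serializedPolygon)
        {
            // 1. Upload the large serialized object to blob storage,
            //    which sidesteps the 1 MB entity limit quoted above.
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("polygons");
            container.CreateIfNotExist();
            CloudBlob blob = container.GetBlobReference(rowKey + ".bin");
            blob.UploadByteArray(serializedPolygon);

            // 2. Store only the pointer (plus any searchable fields) in the table.
            CloudTableClient tableClient = account.CreateCloudTableClient();
            tableClient.CreateTableIfNotExist("Polygon");
            TableServiceContext context = tableClient.GetDataServiceContext();
            context.AddObject("Polygon", new PolygonEntity
            {
                PartitionKey = "polygons",
                RowKey = rowKey,
                BlobUri = blob.Uri.ToString()
            });
            context.SaveChangesWithRetries();
        }
    }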

    • Proposed as answer by dropoutcoder Thursday, April 7, 2011 8:16 PM
    • Marked as answer by Wenchao Zeng Thursday, April 14, 2011 3:59 AM
    Thursday, April 7, 2011 7:37 PM
    Answerer
  • Thank you, Neil, for the prompt reply. I just realized that I get this exception when I try to delete the entity object from the table. Is that a known bug, expected behavior, or something else?

    public void EmptyTable()
    {
        try
        {
            int cntr = 0;
            var result = from g in this.Polygon select g;
            foreach (PolygonEntity polygonEntities in result)
            {
                this.DeleteObject(polygonEntities);
                cntr++;
                if (cntr >= 100) // I have tried without this too by saving changes everytime
                {
                    this.SaveChanges();
                }
            }
            this.SaveChanges();
        }
        catch (DataServiceQueryException) { }
    }


    Dinesh Agarwal
    Thursday, April 7, 2011 8:23 PM
  • I would run this with Fiddler since that shows the raw request and response which is often helpful in identifying storage problems.

    You should also use automated retries (and batch queries if possible) using something like:

    context.SaveChangesWithRetries(SaveChangesOptions.Batch);

    The above is taken from the Azure Storage Team Blog post on how to use storage effectively.

    You should also use a new context for every distinct set of unrelated storage operations. When a context is reused, it can carry a relic of a previously failed storage operation, so an error trapped during one operation may in fact be the context retrying an earlier, unrelated failure.
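
    Combining both suggestions, here is a minimal sketch of the delete loop, assuming the Microsoft.WindowsAzure.StorageClient library and the PolygonEntity type sketched above (both assumptions, not from the original post). One caveat to flag: a batched save covers at most 100 operations, and every entity in a batch must share a partition key.

    using System.Data.Services.Client;
    using System.Linq;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public static class TableCleanup
    {
        // Deletes every entity in the "Polygon" table, flushing in batches of 100.
        // A fresh context is created here rather than reused, so a relic of an
        // earlier failed operation cannot surface as an error in this one.
        public static void EmptyTable(CloudStorageAccount account)
        {
            TableServiceContext context = account.CreateCloudTableClient().GetDataServiceContext();

            // Materialize the query first so we are not enumerating a live
            // query while saving changes against the same context.
            var entities = context.CreateQuery<PolygonEntity>("Polygon").ToList();

            int pending = 0;
            foreach (PolygonEntity entity in entities)
            {
                context.DeleteObject(entity);
                pending++;
                if (pending == 100) // batch limit: 100 operations, single partition
                {
                    context.SaveChangesWithRetries(SaveChangesOptions.Batch);
                    pending = 0; // reset the counter after each flush
                }
            }
            if (pending > 0)
            {
                context.SaveChangesWithRetries(SaveChangesOptions.Batch);
            }
        }
    }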

    • Marked as answer by Wenchao Zeng Thursday, April 14, 2011 3:59 AM
    Thursday, April 7, 2011 9:09 PM
    Answerer