How to calculate the size of some rows from a table in a SQL Azure database?

  • Question

  • Hi,

     I have a requirement to calculate the size of some rows from a table in SQL Azure.

    Can anyone please help me out?

    • Moved by Brian Aurich (Microsoft employee), Wednesday, March 30, 2011 5:05 PM: Not a labs-specific question, moving to SQL Azure forum (From: SQL Azure Labs Support)
    Monday, March 28, 2011 10:59 AM

Answers

  • Hi Everyone,

     

    First of all, thanks a lot for the replies.

     

    I found a solution to the problem.

     

    Suppose I have a table Emp(ID, Name, Age, Desc).

    Solution:

        SELECT SUM(DATALENGTH(ID)) + SUM(DATALENGTH(Name))
             + SUM(DATALENGTH(Age)) + SUM(DATALENGTH([Desc])) AS Size
        FROM Emp
        WHERE <condition>
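    As a note for anyone reusing this: DATALENGTH returns the number of bytes actually stored in each value and does not count per-row or index overhead, so the total is a data-only estimate. A minimal sketch of how it might be run, where the Age > 30 filter is just a stand-in for whatever condition selects the rows of interest:

        -- Data bytes for the matching rows, plus how many rows were measured.
        -- The Age > 30 filter is an illustrative placeholder.
        SELECT
            COUNT(*) AS RowsMeasured,
            SUM(DATALENGTH(ID)) + SUM(DATALENGTH(Name))
          + SUM(DATALENGTH(Age)) + SUM(DATALENGTH([Desc])) AS DataBytes
        FROM Emp
        WHERE Age > 30;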


    Monday, April 4, 2011 5:33 AM

All replies

  • Would it help if you knew the size of the table? Once you have it, and assuming each row in the table is the same size, you can do simple arithmetic to calculate the size of those rows.

    If that sounds good, here's the T-SQL code:

    -- Reserved pages per object; each page is 8 KB, so * 8.0 gives the size in KB.
    SELECT
        sys.objects.name,
        SUM(reserved_page_count) * 8.0 AS size_in_kb
    FROM sys.dm_db_partition_stats
    JOIN sys.objects
        ON sys.dm_db_partition_stats.object_id = sys.objects.object_id
    GROUP BY sys.objects.name

    courtesy: http://msdn.microsoft.com/en-us/library/ff394114.aspx 
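
    For what it's worth, a minimal sketch of that "simple arithmetic" step, assuming the uniform-row-size assumption holds; 'MyTable' and @RowsOfInterest are placeholders rather than anything from the original question:

        -- Rough estimate: (KB reserved for the table's data / total row count) * N rows.
        DECLARE @RowsOfInterest bigint = 1000;  -- how many rows you care about

        SELECT (SUM(ps.reserved_page_count) * 8.0 / NULLIF(SUM(ps.row_count), 0))
                   * @RowsOfInterest AS estimated_kb
        FROM sys.dm_db_partition_stats AS ps
        JOIN sys.objects AS o ON ps.object_id = o.object_id
        WHERE o.name = 'MyTable'
          AND ps.index_id IN (0, 1);  -- heap or clustered index only, i.e. the table data itself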

     


    - Paras Doshi, (SQL) Azure and cloud computing paradigm enthusiast! I blog at http://www.parasdoshi.com and URL of my unified profile is http://www.flavors.me/paras
    Wednesday, March 30, 2011 9:54 PM
  • Hi Paras,

     

    Thanks for the reply.

    But I cannot assume each row is the same size, because one column in one row might have 10 characters (say) while the same column in another row has 1,000 characters.

    And with 10 million records in the table, assuming a uniform row size would give me an incorrect result.

    Thursday, March 31, 2011 7:08 AM
  • Right. I once needed to know the size of tables in SQL Azure, but only found this after searching for more than an hour or so, so I just thought of sharing it with you. Maybe it can give you a starting point for writing your own T-SQL code.
    - Paras Doshi, (SQL) Azure and cloud computing paradigm enthusiast! I blog at http://www.parasdoshi.com and URL of my unified profile is http://www.flavors.me/paras
    Thursday, March 31, 2011 7:33 AM
  • Hi Jitendra,

    If my understanding is correct, simple data types are either fixed size (e.g. int, bit) or variable size (varchar, nvarchar, etc.). What you can do is take all your columns and find out the maximum number of bytes each data type can occupy; for example, an int will always occupy 4 bytes. You can then look at the variable-size columns and figure out how many bytes are actually occupied by the data. For example, say you have a column called Country of type Varchar(50): it can occupy at most 50 bytes, but if the row in question contains the value "India", you're using only 5 of those 50 bytes. Using this approach I think you can figure out the row size. This obviously gets trickier when you get into complex data types.
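
    To make that concrete, here is a minimal T-SQL sketch of the same idea, using a hypothetical Customers table with a fixed-size Id (int) column and a variable-size Country varchar(50) column; DATALENGTH reports the bytes actually stored in each value:

        -- Id always contributes 4 bytes; Country contributes only the bytes actually stored.
        SELECT
            Id,
            DATALENGTH(Id)      AS IdBytes,       -- always 4 for an int
            DATALENGTH(Country) AS CountryBytes,  -- 5 for 'India', even though the column allows 50
            DATALENGTH(Id) + DATALENGTH(Country) AS ApproxRowDataBytes
        FROM Customers;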

    Hope this helps.

    Thanks

    Gaurav Mantri

    Cerebrata Software

    http://www.cerebrata.com

     

    Thursday, March 31, 2011 2:00 PM
  • Hi Jitendra,

    I don't have a perfect answer for you, but one thing you could do is load each row into a DataTable and then calculate the size of each row programmatically. Here is a code sample that should get you close to what you are looking for. Again, it's not extremely precise, but hopefully it will help. The idea is to use the ItemArray property of a DataRow, which returns the row's values as an array of objects in their native format. The row is then copied into a [Serializable] class, which is serialized as a binary object into a memory stream. Finally, the length of that memory stream gives the number of bytes needed to store the data. The reason it is not very precise is that the bytes returned include the serialization footprint of the class itself. But if you have large rows, it should at least let you determine which rows are larger than others and give you a good idea of their size.

    using System;
    using System.Data;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    class Program
    {
      static void Main(string[] args)
      {
        // Build a sample table with a single row.
        DataTable table = new DataTable("test");
        table.Columns.Add(new DataColumn("id", typeof(int)));
        table.Columns.Add(new DataColumn("name", typeof(string)));
        table.Columns.Add(new DataColumn("createdOn", typeof(DateTime)));

        table.Rows.Add(new object[] { 100, "This is a test for length", DateTime.Now });

        // Copy the row's values (in their native types) into a serializable wrapper.
        ARow row = new ARow();
        row.ItemArray = table.Rows[0].ItemArray;

        // Serialize the wrapper to a memory stream; the stream length approximates
        // the bytes needed to store the row (plus the serialization overhead).
        BinaryFormatter bf = new BinaryFormatter();
        MemoryStream ms = new MemoryStream();
        bf.Serialize(ms, row);
        ms.Flush();

        long size = ms.Length;
        Console.WriteLine("Approximate row size in bytes: {0}", size);
      }

      [Serializable]
      public class ARow
      {
        public object[] ItemArray { get; set; }
      }
    }

    Herve Roggero, Blue Syntax MVP SQL Azure Co-Author: Pro SQL Azure
    Thursday, March 31, 2011 3:26 PM
  • Thanks for sharing the solution.
    - Paras Doshi, (SQL) Azure and cloud computing paradigm enthusiast! I blog at http://www.parasdoshi.com and URL of my unified profile is http://www.flavors.me/paras
    Tuesday, April 5, 2011 7:33 AM