Passing a large number of rows as a user-defined table-valued parameter to an SP in SQL Azure takes a long time

  • Question

  • We need to pass around 80K rows to an SP in SQL Azure. Previously we were able to do so without any glitches, but now, once we call the SP from C#, it sometimes takes 10-15 minutes to start executing in the DB, and many times the SP does not get executed at all. One thing I have noticed is that once we make the call from C#, some operation starts in the DB, and if I try to alter the SP, that operation blocks the ALTER. Information about the blocking session ID is not available in sp_who2 or in sys.dm_exec_requests. Any help to resolve this issue is highly appreciated.
    Saturday, June 3, 2017 2:35 PM

Answers

  • 2 GB seems quite a lot. Have you specified the MaxLength for the string columns in the DataTable? I know your TVP column length is 50, but it is important to specify the same length for the DataTable columns to avoid excessive OBJECTSTORE_LBSS memory usage.


    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    • Marked as answer by Rajiv1234 Sunday, June 4, 2017 7:51 AM
    Sunday, June 4, 2017 3:24 AM
  • You can use the snippet below to size every string column in the DataTable to its longest value (assumes using System.Data and using System.Linq):

        for (int i = 0; i < dtinput.Columns.Count; i++)
        {
            if (dtinput.Columns[i].DataType != typeof(string))
                continue;   // only string columns have a meaningful MaxLength

            string columnName = dtinput.Columns[i].ColumnName;

            int maxVal = dtinput
                .AsEnumerable()
                .Max(r => r.Field<string>(columnName) != null
                              ? r.Field<string>(columnName).Length
                              : 1);

            dtinput.Columns[i].MaxLength = maxVal;
        }

    • Marked as answer by Rajiv1234 Sunday, June 4, 2017 7:51 AM
    Sunday, June 4, 2017 7:21 AM

All replies

  • What type of C# object are you passing as the table-valued parameter value? In the case of a DataTable, it is important to specify the maximum length for string columns.  See http://www.dbdelta.com/sql-server-tvp-performance-gotchas/.
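    For example, a minimal sketch of what that looks like when building a DataTable for a TVP (the column names and the length of 50 are illustrative; use the lengths defined in your actual table type):

        using System.Data;

        DataTable dt = new DataTable();
        dt.Columns.Add(new DataColumn("Id", typeof(int)));

        // Give each string column an explicit MaxLength that matches the
        // TVP column definition, e.g. nvarchar(50) -> MaxLength = 50.
        DataColumn nameCol = new DataColumn("Name", typeof(string));
        nameCol.MaxLength = 50;
        dt.Columns.Add(nameCol);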


    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    • Proposed as answer by Naomi N Sunday, June 4, 2017 4:01 AM
    Saturday, June 3, 2017 3:32 PM
  • Very interesting, Dan!

    Another thing that Rajiv may want to look into is whether the table type has a primary key or some other index. If it does, SQL Server needs to sort the data to be able to insert it into the TVP. If the data is sent from the client in an order that does not conform to the index, there is not much you can do about it. However, if the data is already sorted according to the index on the TVP, you can state this on the client side to avoid the sort.

    I discuss the mechanisms to do this in my article about TVPs:
    http://www.sommarskog.se/arrays-in-sql-2008.html
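    A rough sketch of what this can look like on the client side, assuming a table type with an Id int primary key and a single nvarchar(50) column (an assumed shape, not Rajiv's actual schema; the sort flags only help if the rows really are streamed in Id order):

        using System.Collections.Generic;
        using System.Data;
        using System.Data.SqlClient;
        using Microsoft.SqlServer.Server;

        static IEnumerable<SqlDataRecord> GetRows(IEnumerable<KeyValuePair<int, string>> source)
        {
            SqlMetaData[] meta =
            {
                // Id is declared as a unique, ascending sort key (sort ordinal 0),
                // which tells SQL Server the stream already arrives ordered by Id.
                new SqlMetaData("Id", SqlDbType.Int, false, true, SortOrder.Ascending, 0),
                new SqlMetaData("Name", SqlDbType.NVarChar, 50)
            };

            foreach (var kv in source)   // source must already be sorted by Id ascending
            {
                var rec = new SqlDataRecord(meta);
                rec.SetInt32(0, kv.Key);
                rec.SetString(1, kv.Value);
                yield return rec;
            }
        }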

    Saturday, June 3, 2017 3:48 PM
  • I am using a DataTable to pass the table-valued parameter, which has 20 nvarchar(50) columns (nullable) and one Id (int not null) column.

    It was working perfectly fine a few weeks back, and suddenly we have noticed it breaking.

    Saturday, June 3, 2017 4:49 PM
  • It sounds like there is every reason to review your application in light of the observations that Dan and I have offered.

    Saturday, June 3, 2017 5:41 PM
  • Are there any indexes on the table type, as Erland mentioned? Take a look at the execution plan to see if it has changed and is now suboptimal.

    Are you using Azure SQL Database, or is this SQL Server on an Azure VM? Run this query while your query is executing to rule out high memory usage by this memory clerk:

    SELECT SUM(pages_kb) AS pages_kb
    FROM sys.dm_os_memory_clerks
    WHERE type = N'OBJECTSTORE_LBSS';


    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    Saturday, June 3, 2017 6:01 PM
  • Passing 80k rows to an SP via a parameter is a bad idea and design.

    I would suggest you change your C# code to populate the 80k rows into a table instead, and then have the SP body use that table. You may add indexes to the table to improve performance.


    ~manoj | http://SQLwithManoj.com

    Saturday, June 3, 2017 7:16 PM
  • Passing 80k rows to an SP via a parameter is a bad idea and design.

    I would suggest you change your C# code to populate the 80k rows into a table instead, and then have the SP body use that table. You may add indexes to the table to improve performance.

    And how would you insert the data into that table?

    True, there is an alternative to a TVP, and that is the SqlBulkCopy class, but I think a table-valued parameter is a feasible solution if used correctly.
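    As a rough sketch of that alternative (the table name and connection string are illustrative; the destination would be a permanent staging table that the SP then reads):

        using System.Data;
        using System.Data.SqlClient;

        // Load the same DataTable into a staging table instead of passing it as a TVP.
        using (var conn = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(conn)
        {
            DestinationTableName = "dbo.StagingRows",   // illustrative name
            BatchSize = 10000
        })
        {
            conn.Open();
            bulk.WriteToServer(dtinput);
        }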

    Saturday, June 3, 2017 8:46 PM
  • Hi Dan

    We are using Azure SQL Database. There is no index on the table type.

    I have run the above query and the value keeps climbing; after ~10 minutes it was 2066392 (roughly 2 GB).

    Sunday, June 4, 2017 2:43 AM
  • Hi Manoj,

    I am not sure whether it's a bad design or not, but it was working fine a few weeks ago.

    Thanks

    Sunday, June 4, 2017 2:54 AM
  • 2 GB seems quite a lot. Have you specified the MaxLength for the string columns in the DataTable? I know your TVP column length is 50, but it is important to specify the same length for the DataTable columns to avoid excessive OBJECTSTORE_LBSS memory usage.


    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    • Marked as answer by Rajiv1234 Sunday, June 4, 2017 7:51 AM
    Sunday, June 4, 2017 3:24 AM
  • Hi Dan,

    Can you please show how to augment the following code:

        SqlParameter benefitPar = new SqlParameter("@benefit", SqlDbType.Structured);
        SqlParameter templateLinksPar = new SqlParameter("@benefitTemplates", SqlDbType.Structured);
        SqlParameter departmentsPar = new SqlParameter("@departments", SqlDbType.Structured);
        SqlParameter categoriesPar = new SqlParameter("@categories", SqlDbType.Structured);
        SqlParameter itemsPar = new SqlParameter("@items", SqlDbType.Structured);
        SqlParameter itemGroupsPar = new SqlParameter("@itemGroups", SqlDbType.Structured);

        DataTable dtBenefit = new DataTable();
        dtBenefit.Columns.AddRange(new DataColumn[]
        {
            new DataColumn("BenefitId", typeof(int)),
            new DataColumn("Description", typeof(string)),
            new DataColumn("QuantityType", typeof(byte)),
            new DataColumn("Quantity", typeof(Int16)),
            new DataColumn("TimeFrame", typeof(byte)),
            new DataColumn("Frequency", typeof(Int16)),
            new DataColumn("AnnualRefreshMonth", typeof(byte)),
            new DataColumn("AnnualRefreshDay", typeof(byte)),
            new DataColumn("group_no", typeof(int)),
            new DataColumn("ATId", typeof(int))
        });

    I don't see a way to specify the length of a string column when defining the DataTable. How should I do that?

    Thanks.

    How should I use the above to set the MaxLength for my columns?

    UPDATE. Never mind, I used Columns[1], i.e. the 0-based index, to get at the column. I'm currently working through our application and adjusting the code to use the proper MaxLength where needed.
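    For reference, it ended up looking something like this (50 here is just an assumed example; use the length from the actual table type):

        // Cap the string column to the corresponding TVP column length.
        dtBenefit.Columns["Description"].MaxLength = 50;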

    Do I need to set anything else besides MaxLength?


    For every expert, there is an equal and opposite expert. - Becker's Law


    • Edited by Naomi N Sunday, June 4, 2017 4:21 AM
    Sunday, June 4, 2017 4:08 AM
  • You can use the snippet below to size every string column in the DataTable to its longest value (assumes using System.Data and using System.Linq):

        for (int i = 0; i < dtinput.Columns.Count; i++)
        {
            if (dtinput.Columns[i].DataType != typeof(string))
                continue;   // only string columns have a meaningful MaxLength

            string columnName = dtinput.Columns[i].ColumnName;

            int maxVal = dtinput
                .AsEnumerable()
                .Max(r => r.Field<string>(columnName) != null
                              ? r.Field<string>(columnName).Length
                              : 1);

            dtinput.Columns[i].MaxLength = maxVal;
        }

    • Marked as answer by Rajiv1234 Sunday, June 4, 2017 7:51 AM
    Sunday, June 4, 2017 7:21 AM
  • You rock, Dan.

    Setting MaxLength fixed the issue.

    I am now able to pass a TVP with 80k records in 9-15 seconds.

    I have also tried SqlBulkCopy, which took 30-45 seconds. Do you think I should move away from sending the data as a TVP, or am I OK with the TVP?

    Again, thanks Dan for your timely help.

     

    Sunday, June 4, 2017 7:24 AM
  • If the TVP works now after applying Dan's advice, I see no reason to change if SqlBulkCopy does not give you better performance.

    (Note to self: need to augment my article with a link to Dan's blog post.)

    Sunday, June 4, 2017 9:19 AM
  • No Rajiv, your design is correct; please ignore my answer.

    And thanks @Erland, you are correct.

    In another answer on a similar old thread, @Dan provided more data points on using TVPs rather than other methods: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/c3018510-3725-4df6-8790-442e7e266687/tvp-performance-with-high-volume?forum=transactsql

    Learned something today!


    ~manoj | SQLwithManoj.com

    Sunday, June 4, 2017 9:31 AM
  • I have also tried SqlBulkCopy, which took 30-45 seconds. Do you think I should move away from sending the data as a TVP, or am I OK with the TVP?

    I second Erland's opinion that there is no need to switch techniques at this point.

    If you are doing no more than a mass insert of the passed data into a permanent table, SqlBulkCopy would theoretically have a slight performance edge, all things being equal, because the intermediate tempdb storage for the TVP data would be eliminated.

    However, you cannot level the playing field with Azure SQL Database. TVP data are always mass-inserted into tempdb. You don't have the option to load data into tempdb with SqlBulkCopy in the Azure SQL Database world, so you would instead need to use a permanent table in a user database. I suspect the reason the TVP outperformed SqlBulkCopy in your test is related to tempdb optimizations rather than a difference in the method used.

    For slightly more complex scenarios, say data used in a MERGE query to update a permanent table, the downside with a TVP is that it has no statistics, so the query plan might be suboptimal. Constraints and indexes on the TVP table type can help improve plan quality, but there is no statistics histogram as there is with indexes on regular user tables and temp tables.

    Performance testing is the only way to determine the technique that will perform best. We can make educated guesses based on knowledge and experience but the only way to know for sure is to race your horses.


    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    Sunday, June 4, 2017 1:17 PM
  • Hi Dan,

    If my types are varchar in the TVP, do I need to do anything so they are treated as varchar and not as nvarchar when I define my DataTable in the C# code?

    UPDATE. Looks like I don't need anything special.


    For every expert, there is an equal and opposite expert. - Becker's Law


    • Edited by Naomi N Thursday, June 22, 2017 9:13 PM
    Thursday, June 22, 2017 7:37 PM
  • If my types are varchar in the TVP, do I need to do anything so they are treated as varchar and not as nvarchar when I define my DataTable in the C# code?

    Just keep in mind that the only reason to pass a DataTable for your TVP is that you already have a DataTable in your application anyway. If you are getting the data from elsewhere, say a file, you should use a List<SqlDataRecord>.

    Thursday, June 22, 2017 10:09 PM
  • I'm using Entity Framework in our web application. I think saving a row together with its related rows is easier done using a stored procedure than using EF with individual row updates. For example, today I added another simple SP like this:

    -- ==========================================================================
    -- Create date:      06/22/2017 (NN)
    -- Modify date:      
    -- Description:      Saves (Adds/Updates) Modifier Group and related links
    -- Sample call: 
    --declare @p3 int
    --set @p3=9
    --declare @p6 dbo.siriusType_LinkedItems
    --insert into @p6 values(908,N'',1)
    --insert into @p6 values(807,N'',2)
    
    --exec sp_executesql N'execute dbo.SiriusSP_SaveModifierGroup
    --                @GroupId = @GroupId OUTPUT,
    --                @GroupName = @GroupName,
    --                @Hidden = @Hidden,
    --	            @assignedItems = @assignedItems; ',N'@GroupId int output,@GroupName varchar(100),@Hidden bit,@assignedItems [siriusType_LinkedItems] READONLY',@GroupId=@p3 output,@GroupName='Second Group',@Hidden=0,@assignedItems=@p6
    --select @p3
    -- =========================================================================
    CREATE PROCEDURE [dbo].[SiriusSP_SaveModifierGroup]
    	(@GroupId INT OUTPUT,
    	 @GroupName varchar(100),
    	 @Hidden bit,
    	 @assignedItems siriusType_LinkedItems readonly)
    AS
    	
    begin
          set nocount on ;
    	  
    	  declare @output table (GroupId int);	  
          
    	  begin try
    	  begin transaction
    
    	     ;merge dbo.ModGrps  as trgt
    		 using (select @GroupId as GroupId, @GroupName as GroupName, @Hidden as [Hidden]) as srce 
    		 on trgt.GroupId = srce.GroupId
    
    		 when matched and (srce.GroupName !=trgt.GroupName or srce.[Hidden] !=trgt.[Hidden])  then update
    		 set [GroupName] = srce.[GroupName],
    		     [Hidden] = srce.[Hidden] 
            when not matched then insert
    		(	[GroupName], [Hidden])
    		values (srce.[GroupName], srce.[Hidden])
    		   output inserted.GroupId into @Output (GroupId);
    
           select @GroupId = GroupId from @output;		
    	   	
    	   merge dbo.ModGrpLnk as trgt
    	   using (select 
    	          @GroupId as ModGrpId,
    			  ItemId,
    			  Macros,
    	          SortOrder from @assignedItems) as srce
    	   on trgt.ModGrpId = srce.ModGrpId and trgt.ItemId = srce.ItemId
    	   when matched and not (trgt.SortOrder = srce.SortOrder and trgt.Macros = srce.Macros)
    	   then update 
    	   set SortOrder = srce.SortOrder,
    	       Macros = srce.Macros
    		when not matched then insert
    			(ModGrpId,
    			  ItemId,
    			  Macros,
    	          SortOrder)
    		 values
    		 (srce.ModGrpId,
    		 srce.ItemId,
    		 srce.Macros,
    		 srce.SortOrder)
    		 when not matched by source and trgt.ModGrpId = @GroupId
    		 then delete;
    	  commit transaction
    	  end try
    
    	  begin catch		
    
    		-- if the error renders the transaction as uncommittable or we have open transactions, we may want to rollback
    		if @@trancount > 0
    		begin
    			rollback transaction
    		end
    		
    		;throw -- send error back
    		
    	end catch
       
      end

    and in the repository I'm using code like this:

     SqlParameter groupIdPar = new SqlParameter("@GroupId", SqlDbType.Int);
                groupIdPar.Direction = ParameterDirection.InputOutput;
                groupIdPar.Value = (modifierGroup.GroupId==0)?-10: modifierGroup.GroupId;
                
                SqlParameter groupNamePar = new SqlParameter("@GroupName", SqlDbType.VarChar, 100);
                groupNamePar.Value = modifierGroup.GroupName.Trim();
    
                SqlParameter hiddenPar = new SqlParameter("@Hidden", SqlDbType.Bit);
                hiddenPar.Value = modifierGroup.Hidden;
    
                SqlParameter assignedItemsPar = new SqlParameter("@assignedItems", SqlDbType.Structured);
    
                DataTable dtItems = new DataTable();
                dtItems.Columns.AddRange(new DataColumn[]
                    {                          
                              new DataColumn("ItemId", typeof(int)),
                              new DataColumn("Macros", typeof(string)),
                              new DataColumn("SortOrder", typeof(int))
                    });
    
                int i = 0;
                foreach (ModGrpLnk link in assignedItems)
                {
                    i++;
                    dtItems.Rows.Add(new object[]
                    {                    
                        link.ItemId,
                        link.Macros,
                        i
                    });
                }
    
                assignedItemsPar.Value = dtItems;
                assignedItemsPar.TypeName = "siriusType_LinkedItems";
    
                _siriusContext.CoreContext.ExecuteStoreCommand(@"execute dbo.SiriusSP_SaveModifierGroup
                    @GroupId = @GroupId OUTPUT,
                    @GroupName = @GroupName,
                    @Hidden = @Hidden,
    	            @assignedItems = @assignedItems; ",
                      groupIdPar, groupNamePar,  hiddenPar,  assignedItemsPar);
    
                int Id = (int)groupIdPar.Value;
    
                modifierGroup.GroupId = Id;
                SaveChanges();

    I created that pattern for saving an individual row with its related links some time ago, and now I'm just re-using it whenever we need to save a row together with related info. Normally, in our case, it's just a few related rows.


    For every expert, there is an equal and opposite expert. - Becker's Law


    • Edited by Naomi N Friday, June 23, 2017 1:53 AM
    Friday, June 23, 2017 1:51 AM
  • So you should be using a List<SqlDataRecord> instead of a DataTable, which brings extra overhead you have no use for. That, or a custom class that implements IEnumerable<SqlDataRecord>.

    For examples, see this article on my web site:
    http://www.sommarskog.se/arrays-in-sql-2008.html
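    As a rough sketch of what that could look like for the @assignedItems parameter above (the Macros type and length are assumptions; match them to the actual siriusType_LinkedItems definition, and note that an empty SqlDataRecord enumeration is not accepted as a parameter value, so skip the parameter when there are no rows):

        using System.Collections.Generic;
        using System.Data;
        using Microsoft.SqlServer.Server;

        static IEnumerable<SqlDataRecord> ToLinkedItems(IEnumerable<ModGrpLnk> assignedItems)
        {
            SqlMetaData[] meta =
            {
                new SqlMetaData("ItemId", SqlDbType.Int),
                new SqlMetaData("Macros", SqlDbType.NVarChar, 50),   // assumed type/length
                new SqlMetaData("SortOrder", SqlDbType.Int)
            };

            int i = 0;
            foreach (ModGrpLnk link in assignedItems)
            {
                var rec = new SqlDataRecord(meta);
                rec.SetInt32(0, link.ItemId);
                if (link.Macros == null) rec.SetDBNull(1); else rec.SetString(1, link.Macros);
                rec.SetInt32(2, ++i);
                yield return rec;
            }
        }

        // Usage, replacing the DataTable:
        // assignedItemsPar.Value = ToLinkedItems(assignedItems);
        // assignedItemsPar.TypeName = "siriusType_LinkedItems";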

    Friday, June 23, 2017 11:58 AM
  • Thanks, Erland. I'll study the code and update ours appropriately. I'm always for re-factoring and simplifying.

    For every expert, there is an equal and opposite expert. - Becker's Law


    Friday, June 23, 2017 12:14 PM
  • Hi Erland,

    I finished reading your article, but I didn't find an example that shows how to switch from IEnumerable<T> to IEnumerable<SqlDataRecord>.

    In my case I pass my model class. I'm going to read the articles you listed in the "Further reading on EF" section now.


    For every expert, there is an equal and opposite expert. - Becker's Law


    Friday, June 23, 2017 4:39 PM
  • I finished reading your article, but I didn't find an example that shows how to switch from IEnumerable<T> to IEnumerable<SqlDataRecord>.

    In my case I pass my model class. I'm going to read the articles you listed in the "Further reading on EF" section now.

    Not sure what you mean here. If you already have an IEnumerable of something else, you probably need to copy data over, but I am not the best C# programmer in town, despite having published an article on C#.

    Friday, June 23, 2017 7:54 PM