Execute SQL Server stored proc for a million records

  • Question

  • Hi, I need to execute a stored proc (residing on SQL Server 2008) for over a million records. The input parameters vary for each record. When I do this in C# using SqlCommand.ExecuteReader, the operation is very slow: it takes almost an hour. Could somebody please guide me on how I can make this efficient? I need to bring it down to around 10 minutes. Please help. Thanks.
    Tuesday, September 10, 2013 9:45 AM

All replies

  • I would use OPENXML for that.

    To give you a more concrete answer, it would be good to know exactly what you are doing.
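
    As a rough sketch of the idea (the table, procedure, and column names here are made up, and you would adapt the XML shape to your data): send the whole batch as one XML string and shred it server-side, instead of one proc call per record.

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical T-SQL: run once to create a proc that shreds an XML batch
// server-side. Table/proc/column names are illustrative only.
const string createProc = @"
CREATE PROCEDURE dbo.ProcessBatch @xml NVARCHAR(MAX)
AS
BEGIN
    DECLARE @h INT;
    EXEC sp_xml_preparedocument @h OUTPUT, @xml;

    -- One set-based insert instead of a million single-row calls
    INSERT INTO dbo.Staging (Id, Amount)
    SELECT id, amount
    FROM OPENXML(@h, '/rows/row', 1)   -- 1 = attribute-centric mapping
    WITH (id INT, amount MONEY);

    EXEC sp_xml_removedocument @h;
END";

// Client side (inside a method; connectionString is a placeholder):
// build one XML string for the whole batch and send it in a single call.
using (var cn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.ProcessBatch", cn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@xml",
        "<rows><row id=\"1\" amount=\"9.99\"/><row id=\"2\" amount=\"5.00\"/></rows>");
    cn.Open();
    cmd.ExecuteNonQuery();
}
```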


    Josip Habjan      http://habjan.blogspot.com

    Tuesday, September 10, 2013 9:58 AM
  • The transfer between a VS application and SQL Server is slow.  When I run a query in SQL Server Management Studio it takes only a couple of minutes to get 400,000 records; performing the same query in a VS application using a SqlDataAdapter takes over an hour.  I think the only way of speeding up the transfer is to have SQL Server itself load the data.  You could write a stored procedure in SQL that performs the upload from a file.  In your VS application, save the data to a file and then execute that stored procedure.
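
    If you go the file route, the server-side import can be a single BULK INSERT, something like this (a sketch: the file path, table name, and connection string are placeholders, and the path must be visible to the SQL Server machine):

```csharp
using System.Data.SqlClient;

// Sketch: load a client-generated CSV in one server-side operation
// instead of shipping rows one by one over the connection.
using (var cn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(@"
    BULK INSERT dbo.Staging
    FROM '\\server\share\data.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')", cn))
{
    cn.Open();
    cmd.CommandTimeout = 0;   // large loads can exceed the 30-second default
    cmd.ExecuteNonQuery();
}
```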

    jdweng

    Tuesday, September 10, 2013 10:33 AM
  • Usually stored procedures are very fast.

    From what you say, I assume that you are iterating through the records and running your code for each one.

    Maybe you can iterate inside the stored procedure instead. That would be faster.
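
    The pattern, roughly (names here are invented; the real loop body depends on your procedure): load the inputs into a table first, then loop server-side so there is only one round trip from the client.

```csharp
// Hypothetical T-SQL, run once to create the wrapper proc.
// dbo.Inputs holds the per-record parameters; dbo.YourProc is the
// existing per-record procedure.
const string batchProc = @"
CREATE PROCEDURE dbo.ProcessAll
AS
BEGIN
    DECLARE @id INT = (SELECT MIN(Id) FROM dbo.Inputs);
    WHILE @id IS NOT NULL
    BEGIN
        EXEC dbo.YourProc @id;   -- the existing per-record proc
        SET @id = (SELECT MIN(Id) FROM dbo.Inputs WHERE Id > @id);
    END
END";
```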

    Here is an example:

    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/fd5991f1-a24d-49ea-885d-d39ee637d172/iteration-within-stored-procedure

     

    Noam B.

     


    Do not Forget to Vote as Answer/Helpful, please. It encourages us to help you...

    Tuesday, September 10, 2013 11:53 AM
  • You can do this looping server-side using a T-SQL cursor or a CLR stored procedure. 

    Even better, you can rewrite the stored procedure to be a set-based operation instead of running once for each row in your query.
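
    For instance, if the procedure's body boils down to an UPDATE keyed on one input row, the per-row calls can often collapse into a single joined statement (a sketch; the table and column names are invented):

```csharp
// Instead of: EXEC dbo.YourProc @id, @amount  -- once per record,
// stage the inputs and touch all million rows in one statement.
const string setBased = @"
UPDATE t
SET    t.Amount = s.Amount
FROM   dbo.Target  AS t
JOIN   dbo.Staging AS s ON s.Id = t.Id;";
```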

    David


    David http://blogs.msdn.com/b/dbrowne/

    Tuesday, September 10, 2013 2:32 PM
  • Hi Rosh,

    Could you please tell us why you need to read over a million records from the DB? That is a lot of work for a single SqlCommand instance. Please describe your problem in detail, and we will be able to provide quicker and better responses.

    Here is an article about improving SQL Server performance. See http://msdn.microsoft.com/en-us/library/ff647793.aspx#scalenetchapt14_topic7 for more information.

    Hope this is useful to you.

    Best Regards,


    <THE CONTENT IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, WHETHER EXPRESS OR IMPLIED>
    Thanks
    MSDN Community Support

    Please remember to "Mark as Answer" the responses that resolved your issue. It is a common way to recognize those who have helped you, and makes it easier for other visitors to find the resolution later.

    Thursday, September 12, 2013 5:26 AM
  • Hi, I need to execute a stored proc (residing on SQL server 2008) for over a million records. The input parameters keep varying for each of the records. When I do this in C# using SqlCommand-ExecuteReader, the operation is very slow. It takes almost an hour for me. Could somebody please guide me on how I can make this efficient? I need to bring it down to around 10 mins. Please help.. Thanks.....

    Do you mean that there are already a million records present in a database table, and that your .NET code loops over each record, calling a stored procedure for it? 

    Maybe you can try to move your processing logic completely to the SQL side. You can put the input parameters for the stored procedure into another table using SqlBulkCopy (http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx), with key values matching the main table. Then you can call another stored procedure which uses your main table together with this parameter table, performs the operation you need, and gives your result back.
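
    A minimal SqlBulkCopy sketch (the destination table name and connection string are placeholders; the DataTable's columns must line up with the table's columns):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: stream a DataTable of input parameters into a staging table.
// SqlBulkCopy is far faster than row-by-row INSERTs or proc calls.
using (var cn = new SqlConnection(connectionString))
{
    cn.Open();
    using (var bulk = new SqlBulkCopy(cn))
    {
        bulk.DestinationTableName = "dbo.Staging";
        bulk.BatchSize = 10000;          // commit in chunks
        bulk.BulkCopyTimeout = 0;        // no timeout for large loads
        bulk.WriteToServer(inputTable);  // inputTable: your populated DataTable
    }
}
// Then one proc call does the real work, e.g. EXEC dbo.ProcessStaging.
```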

    regards

    joon


    Thursday, September 12, 2013 7:31 AM
  • Thank you all for the responses which were really helpful.

    I did try to implement the suggestions. I wrote all my data to a DataTable and passed it to the stored proc, which would then iterate through the records and process them. But I could not see much of an improvement in performance. The data was really huge and the application was hanging at times, so I broke the data down and sent smaller chunks of the DataTable (500 records at a time) to the stored proc. No improvement in performance this time either.
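
    For what it's worth, passing a DataTable to a proc on SQL Server 2008 means a table-valued parameter, and the parameter has to be marked Structured. Roughly like this (the table type and proc names here are invented):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: one round trip carrying the whole batch as a TVP.
// Assumes a user-defined table type and a proc that accepts it, e.g.:
//   CREATE TYPE dbo.InputRow AS TABLE (Id INT, Amount MONEY);
//   CREATE PROCEDURE dbo.ProcessBatch @rows dbo.InputRow READONLY AS ...
using (var cn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.ProcessBatch", cn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@rows", inputTable); // a DataTable
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName  = "dbo.InputRow";
    cn.Open();
    cmd.ExecuteNonQuery();
}
```

    Note that if the proc then cursors over the TVP row by row, the round trips are gone but the per-row work remains; the big wins come when the proc processes the whole table set-based.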

    What my application actually does is this: it gets over a million records from a mainframe DB and writes them to a data file. This file is then read and the contents are passed to a stored proc in SQL Server 2008. All of this was happening in C++ and everything was pretty fast (10 mins), but the moment I do the same in C# it takes more than an hour. How do I resolve this? I need to execute the SP for every record in the DataTable.


    • Edited by Rosh.2013 Friday, September 20, 2013 5:46 AM Incorrect Data
    Friday, September 20, 2013 5:43 AM
  • I would try the C++ code again to make sure the slow speed is something in the database.  I have never used SQL Server Standard, but I have plenty of experience with SQL Express.  I have a project which queries a SQL Express database for 500,000 rows of data.  Running the query with SQL Server Management Studio on the server, I can get the data out in about 1 minute.  Using a VS application on the local PC where the database is located takes around 45 minutes.  On a remote PC it takes 1 hour and 15 minutes.

    Then something happened to the PC or database.  SQL Server Management Studio still worked quickly, but with the PC application, on both the local PC and the remote PC, the time went up to 24 hours.


    jdweng

    Friday, September 20, 2013 8:45 AM
  • >This file is then read and the contents are passed onto a Stored proc in SQL Server 2008.

    OK.  Can you post some code? 

    It's not clear why you are running ExecuteReader() if you are calling a stored procedure for every record in a file.  It's also important how you handle transactions in this scenario.

    And can you see high CPU utilization on the client or the database server?  Note that on a multi-core server, high CPU utilization looks like moderate CPU utilization: on a 4-core box, a single-threaded process that is CPU-bound will show 25% CPU utilization.

    In general you should be able to get similar performance in C# to C++ for this kind of program.

    David


    David http://blogs.msdn.com/b/dbrowne/


    Friday, September 20, 2013 1:32 PM