API: Send datatable as parameter to stored procedure

  • Question

  • I have an idea after watching a video tutorial on sending a DataTable as a parameter to a stored proc and need feedback. I have an API and I want to store update data sent to my API for a set period of time. Every 5 minutes, for example, I want to call my database and perform the update, passing the DataTable as a param to the stored proc. Basically one giant update every 5 minutes as opposed to making literally thousands of calls to my database over a 5 minute period. Is this possible? Can I use timers in APIs? I've never stored data in an API; is that even possible?
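
    Conceptually I am picturing something like this (untested, and the column, proc and table type names are made up just to show the shape of the idea):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Threading;

    // Hypothetical in-memory buffer for incoming updates, flushed every 5 minutes
    public static class UpdateBuffer
    {
        private static readonly object _lock = new object();
        private static readonly DataTable _pending = BuildTable();
        private static Timer _timer;

        private static DataTable BuildTable()
        {
            var table = new DataTable();
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Value", typeof(string));
            return table;
        }

        // called by the API for each incoming update instead of hitting the database
        public static void Queue(int id, string value)
        {
            lock (_lock) { _pending.Rows.Add(id, value); }
        }

        // start the 5 minute flush timer once at application startup
        public static void Start(string connectionString)
        {
            _timer = new Timer(_ => Flush(connectionString), null,
                               TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5));
        }

        private static void Flush(string connectionString)
        {
            DataTable batch;
            lock (_lock)
            {
                if (_pending.Rows.Count == 0) return;
                batch = _pending.Copy();
                _pending.Clear();
            }

            using (var cn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.usp_ApplyUpdates", cn)) // made-up proc name
            {
                cmd.CommandType = CommandType.StoredProcedure;
                SqlParameter p = cmd.Parameters.AddWithValue("@Updates", batch);
                p.SqlDbType = SqlDbType.Structured;   // pass the DataTable as a table-valued parameter
                p.TypeName = "dbo.UpdateTableType";   // made-up table type name
                cn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
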
    Monday, January 27, 2020 9:16 PM

All replies

  • Hello,

    Seems that if this is SQL-Server you might consider looking at MERGE.


    Please remember to mark the replies as answers if they help and unmark them if they provide no help; this will help others who are looking for solutions to the same or similar problem. Contact me via my Twitter (Karen Payne) or Facebook (Karen Payne) via my MSDN profile, but I will not answer coding questions on either.

    NuGet BaseConnectionLibrary for database connections.

    StackOverFlow
    profile for Karen Payne on Stack Exchange

    Monday, January 27, 2020 9:22 PM
    Moderator
  • Pondering on the idea some more. Thinking it might be useful to use C# to write the DataTable temporarily to a local XML file and then, every X minutes, send the DataTable's update data to the database. So it would be a combined effort of C# and SQL.
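
    Rough sketch of what I mean (untested; the file path is a placeholder and the "send" step would be the stored proc / MERGE call):

    using System;
    using System.Data;
    using System.Threading;

    // Buffers a DataTable to a local XML file between flushes
    public static class XmlBuffer
    {
        private const string BufferPath = @"C:\Temp\pending-updates.xml"; // placeholder path

        // persist the in-memory table so nothing is lost between flushes
        public static void Save(DataTable pending)
        {
            pending.WriteXml(BufferPath, XmlWriteMode.WriteSchema);
        }

        // reload the buffered rows right before sending them to SQL Server
        public static DataTable Load()
        {
            var table = new DataTable();
            table.ReadXml(BufferPath);
            return table;
        }

        // every X minutes hand the buffered table to the SQL side
        public static Timer StartFlushTimer(TimeSpan interval, Action<DataTable> sendToDatabase)
        {
            return new Timer(_ => sendToDatabase(Load()), null, interval, interval);
        }
    }
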
    Monday, January 27, 2020 9:41 PM
  • Looking at it now. Is MERGE available in SQL 2008? Is this something I can do via C# back-end code? I have a SQL 2008 server; will this work?
    • Edited by old_School Monday, January 27, 2020 9:52 PM
    Monday, January 27, 2020 9:43 PM
  • I have done a bit of experimenting with this, and I have found that for "giant updates" the fastest way is to send the data from C# into SQL Server using a SqlBulkCopy (which streams the data directly into the server instead of using multiple inserts). Write this data into a temporary table, and then use the MERGE statement to update from the temporary table into the final table that needs to be updated. Then, delete the temporary table.

    MERGE appeared for the first time in SQL Server 2008, so you should be able to use it.

    Monday, January 27, 2020 10:41 PM
    Moderator

  • See Alberto's reply: yes, it will work with SQL Server 2008. Here is a very simple example. Note that you cannot simply copy and paste it, as there are parts not included; this is part of a larger code sample I wrote.

    using BaseLibrary;
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    
    namespace BulkCopyierLibrary
    {
        /// <summary>
        /// Responsible for bulk and merge operations
        /// </summary>
        public class PersonExporter : BaseSqlServerConnections
        {
            public ErrorInformation Exception = new ErrorInformation();
            public List<Person> List;
            public int BatchSize { get; set; }
    
            /// <summary>
            /// Set BatchSize to a default of 100
            /// </summary>
            /// <param name="pList">List of person</param>
            public PersonExporter(List<Person> pList)
            {
                List = pList;
                BatchSize = 100;
    
            }
            /// <summary>
            /// Set BatchSize to the value supplied by the caller
            /// </summary>
            /// <param name="pList">List of person</param>
            /// <param name="pBatchSize">A value to indicate when to write data to the backend database</param>
            public PersonExporter(List<Person> pList, int pBatchSize)
            {
                List = pList;
                BatchSize = pBatchSize;
    
            }
            /// <summary>
            /// Simple bulk copy with no column mappings as in this case the fields in
            /// Person class names and data types are a match to the table in the database.
            /// 
            /// If an exception is thrown, the caller can check Exception property of this
            /// class for the exception message.
            /// </summary>
            /// <param name="pReset">Passing true will empty the current table's data while not passing a value will not empty the tables content</param>
            /// <returns>Success of the operation</returns>
            public bool Execute(bool pReset = false)
            {
                DefaultCatalog = "BulkCopyDatabaseCodeSample";
                bool success = true;
                using (var cn = new SqlConnection { ConnectionString = ConnectionString })
                {
                    cn.Open();
    
                    if (pReset)
                    {
                        using (SqlCommand cmd = new SqlCommand() { Connection = cn })
                        {
                            // allows us to start fresh :-)
                            cmd.CommandText = "TRUNCATE TABLE Person";
                            cmd.ExecuteNonQuery();
                        }
                    }
    
                    SqlTransaction transaction = cn.BeginTransaction();
    
                    using (var sbc = new SqlBulkCopy(cn, SqlBulkCopyOptions.Default, transaction))
                    {
    
                        // you should tinker with this in your project
                        sbc.BatchSize = BatchSize;
                        sbc.DestinationTableName = "Person";
    
                        try
                        {
                            sbc.WriteToServer(List.AsDataTable());
                        }
                        catch (Exception ex)
                        {
    
                            transaction.Rollback();
                            cn.Close();
    
                            Exception.HasError = true;
                            Exception.Message = ex.Message;
                            success = false;
                        }
                    }
    
                    // commit only when the bulk copy succeeded; on failure the transaction was already rolled back
                    if (success)
                    {
                        transaction.Commit();
                    }
                }
    
                return success;
            }
            public void UpdateData(DataTable pDataTable)
            {
                using (var cn = new SqlConnection() { ConnectionString = ConnectionString })
                {
                    using (var cmd = new SqlCommand("", cn))
                    {
                        try
                        {
                            cn.Open();
    
                            // Create a temp table on the database which is used and then removed after the merge
                            cmd.CommandText = @"CREATE TABLE #TmpPersonTable(
                                    [Id] [INT],
                                    [FirstName] [TEXT] NULL,
                                    [LastName] [TEXT] NULL,
                                    [Gender] [INT] NULL,[BirthDay] [DATETIME2](7) NULL)";
    
                            cmd.ExecuteNonQuery();
    
                            //Bulk operation into temp table
                            using (var bulkcopy = new SqlBulkCopy(cn))
                            {
                                // you should experiment with this in your project
                                bulkcopy.BulkCopyTimeout = 660;
                                bulkcopy.DestinationTableName = "#TmpPersonTable";
                                bulkcopy.WriteToServer(pDataTable);
                                bulkcopy.Close();
                            }
    
                            // you should tinker with this in your project
                            cmd.CommandTimeout = 300;
    
                            cmd.CommandText = @"
                                MERGE INTO dbo.Person AS P
                                USING dbo.#TmpPersonTable AS S
                                ON P.Id = S.Id
                                WHEN MATCHED THEN
                                    UPDATE SET P.FirstName = S.FirstName ,
                                               P.LastName = S.LastName ,
                                               P.Gender = S.Gender ,
                                               P.BirthDay = S.BirthDay
                                WHEN NOT MATCHED THEN
                                    INSERT ( FirstName ,
                                             LastName ,
                                             Gender ,
                                             BirthDay
                                           )
                                    VALUES ( S.FirstName ,
                                             S.LastName ,
                                             S.Gender ,
                                             S.BirthDay
                                           )
                                WHEN NOT MATCHED BY SOURCE THEN
                                    DELETE;
                                DROP TABLE #TmpPersonTable";
    
                            cmd.ExecuteNonQuery();
    
                        }
                        catch (Exception ex)
                        {
                            Exception.HasError = true;
                            Exception.Message = ex.Message;
                        }
                        finally
                        {
                            cn.Close();
                        }
                    }
                }
            }
        }
    
    }
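
    For context, calling it looks roughly like this (assumes a populated List<Person> plus the pieces of the larger sample that are not shown here; GetPeopleFromApi is hypothetical):

    List<Person> people = GetPeopleFromApi();   // however your API accumulates rows
    var exporter = new PersonExporter(people, 500);

    // initial bulk load straight into the Person table
    if (!exporter.Execute(pReset: true))
    {
        Console.WriteLine(exporter.Exception.Message);
    }

    // later: push accumulated changes through the temp table + MERGE path
    exporter.UpdateData(people.AsDataTable());  // AsDataTable comes from the sample's BaseLibrary
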
    


    Please remember to mark the replies as answers if they help and unmark them if they provide no help; this will help others who are looking for solutions to the same or similar problem. Contact me via my Twitter (Karen Payne) or Facebook (Karen Payne) via my MSDN profile, but I will not answer coding questions on either.

    NuGet BaseConnectionLibrary for database connections.

    StackOverFlow
    profile for Karen Payne on Stack Exchange

    Monday, January 27, 2020 11:12 PM
    Moderator
  • Myself, I would send the XML data to an MS SQL Server Service Broker queue and have Service Broker process the XML from the queue, which can be done by C# hosted by Service Broker.
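
    For example, the API side could enqueue the XML with something like this (sketch only; the service, contract and message type names are assumptions, and the Service Broker objects have to be created in the database first):

    using System.Data;
    using System.Data.SqlClient;

    public static class ServiceBrokerSender
    {
        // sends one XML payload to a Service Broker queue for asynchronous processing
        public static void EnqueueUpdateXml(string connectionString, string updateXml)
        {
            const string sql = @"
                DECLARE @handle UNIQUEIDENTIFIER;
                BEGIN DIALOG CONVERSATION @handle
                    FROM SERVICE [//UpdateSenderService]   -- assumed service names
                    TO SERVICE '//UpdateProcessorService'
                    ON CONTRACT [//UpdateContract]
                    WITH ENCRYPTION = OFF;
                SEND ON CONVERSATION @handle
                    MESSAGE TYPE [//UpdateMessage] (@xml);";

            using (var cn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, cn))
            {
                cmd.Parameters.Add("@xml", SqlDbType.Xml).Value = updateXml;
                cn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
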
    • Edited by DA924x Tuesday, January 28, 2020 7:30 AM
    Monday, January 27, 2020 11:40 PM