Big DataSet causes OutOfMemoryException

  • Question

  • Hi all.

I want to send a backup command to a SQL Server instance on another machine, where we have no access to that machine's hard drive, and then pass the backup file back to the C# application.

First I create the backup, then I copy it into a #temp table with an OPENROWSET command. Finally I fill a DataSet from that temp table in order to create a FileStream.

But because the backup file is 1.5 GB, DataAdapter.Fill(dataset) throws an OutOfMemoryException.

    What should I do?

Abridged code:

// Back up the database to a drive on the server
com.CommandText = string.Format("backup database {0} to disk='{1}\\{2}' with format", DatabaseName, DirectoryServer, filename);

// Load the backup file into a temp table as a single blob
tempTablecom.CommandText = "insert into #temp(backupfile) select * from openrowset(bulk N'E:\\DBBackUp.bak', single_blob) as bck";

string query = "select backupfile from #temp";
SqlDataAdapter da = new SqlDataAdapter(query, con);
DataSet ds = new DataSet();
da.Fill(ds); // ERROR occurs here

    Each problem has a solution.
    Monday, October 18, 2010 8:32 AM


All replies

  • Why do you want to copy the data into a DataSet?  This is always going to cause an out-of-memory exception if you fill a huge amount of data into a DataSet.

    If you just want to copy the data out to a local file, use the bcp utility:
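    The bcp command itself appears to have been lost from this post. A minimal sketch, assuming the backup blob is staged in a global temp table ##temp (bcp opens its own connection, so a session-local #temp table would not be visible to it), `ServerName` is a placeholder, and a one-column format file is used so the image data is written raw, with no length prefix:

    ```shell
    # bcp.fmt -- hypothetical format file: a single SQLIMAGE column,
    # zero prefix length and no terminator, so the blob is copied byte-for-byte:
    #   9.0
    #   1
    #   1  SQLIMAGE  0  0  ""  1  backupfile  ""

    # Export the blob column to a local file using a trusted connection (-T)
    bcp "select backupfile from tempdb..##temp" queryout C:\DBBackUp.bak -f bcp.fmt -S ServerName -T
    ```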



    Monday, October 18, 2010 4:20 PM
I want to copy a 1.5 GB database backup file to the local machine, and SQL Server is on another machine.

    Each problem has a solution.
    Tuesday, October 19, 2010 4:24 AM
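  • Then avoid filling a DataSet at all: read the blob with a SqlDataReader in SequentialAccess mode and write it to the local disk in fixed-size chunks, so only one buffer is in memory at a time. A minimal sketch, assuming a placeholder connection string and that #temp was populated earlier on this same connection (temp tables are session-scoped):

    ```csharp
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BackupDownloader
    {
        static void Main()
        {
            // Hypothetical connection string -- adjust server and credentials.
            const string connStr = "Data Source=ServerName;Initial Catalog=tempdb;Integrated Security=True";

            using (var con = new SqlConnection(connStr))
            {
                con.Open();
                var com = new SqlCommand("select backupfile from #temp", con);

                // SequentialAccess streams the column instead of buffering the whole row.
                using (var reader = com.ExecuteReader(CommandBehavior.SequentialAccess))
                using (var file = new FileStream(@"C:\DBBackUp.bak", FileMode.Create, FileAccess.Write))
                {
                    var buffer = new byte[8192];
                    while (reader.Read())
                    {
                        long offset = 0;
                        long bytesRead;
                        // GetBytes copies up to buffer.Length bytes; 0 means end of the blob.
                        while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                        {
                            file.Write(buffer, 0, (int)bytesRead);
                            offset += bytesRead;
                        }
                    }
                }
            }
        }
    }
    ```

    With an 8 KB buffer the memory footprint stays constant regardless of the backup's size, which is exactly what DataAdapter.Fill cannot give you.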