Export large amounts of data from SQL Azure to local storage

    Question

  • Hi

    I currently have a large database (~500 GB) and I need to export old data (about 1.6 billion rows) so I can reduce the live database down to just the current data.

    We have a full version of SQL Server 2017, and I am trying to export a specific table to a local database.

    I tried using the Export Wizard in SQL Server Management Studio, and it was going well until a TCP error occurred and the export failed.

    My problem is that I do not want to start from the beginning again, so what is the best way to download the data? I don't mind doing it in batches if that would be more robust; something like the sketch below is what I have in mind.
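
    For example (assuming the table has an ascending numeric key; the server, database, table, and credential names here are just placeholders), a batched bcp export along these lines:

        rem Export one ID range per file, so a dropped connection only
        rem costs the current batch rather than the whole 1.6 billion rows.
        bcp "SELECT * FROM dbo.OldData WHERE Id >= 1 AND Id < 10000000" queryout batch_001.dat -n -S tcp:yourserver.database.windows.net -d YourDb -U user -P password

        rem Repeat with the next range (10000000 to 20000000, and so on),
        rem then load each file into the local database in its own batches:
        bcp dbo.OldData in batch_001.dat -n -S localhost -d LocalArchive -T -b 50000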

    Thanks


    john

    Tuesday, January 8, 2019 12:42 PM

All replies

  • Please check this article:

    https://sqlbackupandftp.com/blog/how-to-backup-azure-sql-database-to-local-machine

    Regards

    PT

    Tuesday, January 8, 2019 12:54 PM
  • Hello,

    Try SQL Server Integration Services with an OLE DB connection, and set the Connect Timeout and General Timeout properties to zero (no timeout) to avoid intermittent drops.
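
    As a rough sketch, the connection manager properties serialize into a connection string something like this (server, database, and credentials are placeholders; both timeout properties are on the All page of the OLE DB connection manager editor):

        Provider=SQLNCLI11;Data Source=tcp:yourserver.database.windows.net;Initial Catalog=YourDb;User ID=user;Password=password;Connect Timeout=0;General Timeout=0;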


    Hope this helps.


    Regards,

    Alberto Morillo
    SQLCoffee.com

    Tuesday, January 8, 2019 2:34 PM
  • Here's what I would do.

    Using SSMS, create a BACPAC for that one table, and save the BACPAC to Azure storage. 

    Then, copy that BACPAC file locally.

    Import that BACPAC as a new database (a BACPAC can only be brought back as a whole database), and then move the rows into your local database as desired.
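
    As a sketch of the same flow with SqlPackage (all names and paths below are placeholders; /p:TableData restricts the exported data to the one table, though the whole database schema still comes along):

        SqlPackage.exe /Action:Export /TargetFile:"C:\temp\OldData.bacpac" /SourceConnectionString:"Server=tcp:yourserver.database.windows.net;Database=YourDb;User ID=user;Password=password;" /p:TableData=dbo.OldData

        SqlPackage.exe /Action:Import /SourceFile:"C:\temp\OldData.bacpac" /TargetConnectionString:"Server=localhost;Database=OldDataStaging;Integrated Security=True;"

    Run from a local machine, this also lets you write the .bacpac straight to a local file instead of going through Azure storage.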

    HTH


    Tuesday, January 8, 2019 4:11 PM
  • Hi,

    Thanks. The problem with that is the amount of additional storage I would then need to keep on Azure, as this process may happen quite frequently.

    It's something I had read about, but at the moment I'm looking at other options.

    Thank you.


    john

    Friday, January 11, 2019 2:17 PM
  • Hi,

    Thanks. I will install that component and see how it goes.


    john

    Friday, January 11, 2019 2:32 PM