I'm creating a bacpac file from my sql azure database. I want to import this file into a local SQL Server.
If I rename the file to .zip and explore it I can see the schema and JSON data inside. OK.
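(If you would rather script that peek than keep renaming the file, a bacpac is an ordinary zip archive, so a few lines of Python will list the contents; this is just a convenience sketch, nothing DAC-specific:)

# peek.py - list what's inside a .bacpac without renaming it
import sys
import zipfile

with zipfile.ZipFile(sys.argv[1]) as z:   # pass the .bacpac path as the argument
    for name in z.namelist():
        print(name)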
When I try to import the .bacpac into SQL Server Express 2008 R2 SP1 via
DacImportExportCli.exe -s xxx\SQLEXPRESS -d newdb -f C:\Backup\SQLAzureBackup\xxx.bacpac -i -e
It creates the database, tables, stored procedures, etc., but there is no data!
I have SP1, the latest 'feature pack' files (like the SharedManagementObjects), and I've tried every version of DacImportExportCli.exe.
What am I missing here?
I've got the same issue on a full version of SQL Server with SSMS SP1 and the latest MSIs applied. I'm running Windows Server 2003 R2 SP2. I tried both the -E option and the -U/-P options, just to rule that out.
Are there any JSON libraries that need to be installed for .NET to use, or is that built in?
Thanks in advance for your help,
Do you have any output from the command-line tool? Also, I assume you installed all the necessary SQL Server Denali CTP3 binaries on the machine where you run the command, right?
You do need the .NET 4.0 Full version installed as well, but there is no need for JSON libraries.
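(If you want to double-check that the full .NET 4.0 framework is present, a generic registry query such as the following should report Install = 0x1; this is a standard Windows check, not anything specific to the DAC tools:)

reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full" /v Install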
Qingsong Yao [MSFT]
I had the same issue. Here is the output from the command line. It says it imported, but there's no data!
Import started: 9/15/2011 8:31:26 PM
Connecting to mattbookpro\sql2008r2...
[8:31:56 PM] PrepareSystemTables: Pending Preparing DAC metadata in the SQL Server instance 'MATTBOOKPRO\SQL2008R2'
[8:31:57 PM] PrepareSystemTables: Success Preparing DAC metadata in the SQL Server instance 'MATTBOOKPRO\SQL2008R2'
[8:31:57 PM] CreateDatabase: Pending Creating database 'AppsandOR1'
[8:31:58 PM] CreateDatabase: Success Creating database 'AppsandOR1'
[8:31:58 PM] ScriptDACObjects: Pending Preparing deployment script
[8:32:59 PM] ScriptDACObjects: Success Preparing deployment script
[8:32:59 PM] CreateDatabaseObjects: Pending Creating schema objects in database 'AppsandOR1'
[8:33:00 PM] CreateDatabaseObjects: Success Creating schema objects in database 'AppsandOR1'
[8:33:00 PM] PrepareImportTables: Pending Preparing Import data from tables of database 'AppsandOR1'
[8:33:00 PM] PrepareImportTables: Success Preparing Import data from tables of database 'AppsandOR1'
[8:33:00 PM] FinalizeImportTables: Pending Finalizing Import data from tables of database 'AppsandOR1'
[8:33:00 PM] FinalizeImportTables: Success Finalizing Import data from tables of database 'AppsandOR1'
[8:33:01 PM] RegisterDAC: Pending Registering the DAC in DAC metadata.
[8:33:01 PM] RegisterDAC: Success Registering the DAC in DAC metadata.
Import Complete. Total time: 00:01:34.2233391
Same for me...
Data downloaded from Azure using the DAC server-side tools.
Imported to local SQL Express (64-bit) with the five command tools installed.
Import started: 20/09/2011 1:13:19 PM
Connecting to SONATAII\SQLEXPRESS...
Connection Open.
[1:13:56 PM] PrepareSystemTables: Pending Preparing DAC metadata in the SQL Server instance 'SONATAII\SQLEXPRESS'
[1:13:57 PM] PrepareSystemTables: Success Preparing DAC metadata in the SQL Server instance 'SONATAII\SQLEXPRESS'
[1:13:58 PM] CreateDatabase: Pending Creating database 'ZinfoniaLive'
[1:14:00 PM] CreateDatabase: Success Creating database 'ZinfoniaLive'
[1:14:00 PM] ScriptDACObjects: Pending Preparing deployment script
[1:15:24 PM] ScriptDACObjects: Success Preparing deployment script
[1:15:25 PM] CreateDatabaseObjects: Pending Creating schema objects in database 'ZinfoniaLive'
[1:15:27 PM] CreateDatabaseObjects: Success Creating schema objects in database 'ZinfoniaLive'
[1:15:27 PM] PrepareImportTables: Pending Preparing Import data from tables of database 'ZinfoniaLive'
[1:15:27 PM] PrepareImportTables: Success Preparing Import data from tables of database 'ZinfoniaLive'
[1:15:27 PM] FinalizeImportTables: Pending Finalizing Import data from tables of database 'ZinfoniaLive'
[1:15:27 PM] FinalizeImportTables: Success Finalizing Import data from tables of database 'ZinfoniaLive'
[1:15:27 PM] RegisterDAC: Pending Registering the DAC in DAC metadata.
[1:15:27 PM] RegisterDAC: Success Registering the DAC in DAC metadata.
Import Complete. Total time: 00:02:08.5891529
Could you please change the extension on the bacpac to .zip and explore the 'Data' subfolder inside? We create a single directory per table. Could you please take a look at the names of these folders and ensure that they match your table names in the format <Schema_Name>.<Table_Name>? Please let us know if any of your tables are not correctly matched.
We have been trying to repro this issue locally and have not been successful so far. Would you be able to provide a sample bacpac (by truncating data + any schema that you don't want to share) that can repro this issue? If so, can you please provide a SkyDrive link or contact me at my firstname.lastname @ microsoft.com
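If it is easier to script that check, here is a rough Python sketch (unofficial, purely for diagnosis) that prints the distinct folder names found under 'Data' so you can compare them against your tables:

# list_tables.py - print the per-table data folders inside a bacpac
import sys
import zipfile

folders = set()
with zipfile.ZipFile(sys.argv[1]) as z:      # pass the .bacpac path
    for name in z.namelist():
        # data entries look like Data/<Schema_Name>.<Table_Name>/<file>
        if name.startswith('Data/') and '/' in name[len('Data/'):]:
            folders.add(name.split('/')[1])

for folder in sorted(folders):
    print(folder)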
Just out of curiosity, how many of the people that have this problem have any of the following scenarios:
My default database collation is Latin1_General_CI_AI
My Locale is Australia
OS is Win 7 64 bit
Thanks Peter for providing the sample bacpac.
It turns out the CLI tool is incompatible with the service, in that data from a bacpac exported by the service is not imported by the tool. Please note that the service itself, however, is expected to round-trip the data correctly.
We are looking at our options for refreshing the MSIs required by the CLI tool to fix this issue.
Meanwhile, if you do need to use the CLI tool to import on-premise, there is a rather inconvenient workaround. Please go to the 'Data' folder inside the bacpac and rename the .JSON files to remove the five-digit suffix (i.e. rename TableData-00000.JSON to TableData.JSON). Once you rename the files, please zip it back up. The new bacpac should work with the existing CLI tool.
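If you would rather script that rename than edit the archive by hand, here is a rough Python sketch (a hypothetical helper, not an official tool). It assumes one data file per table, and note that the DAC reader can be picky about how the archive is rebuilt (see the WinZip/WinRAR discussion below), so treat it as a starting point:

# fix_bacpac.py - rewrite a bacpac so table data files lose the five-digit suffix,
# e.g. Data/dbo.MyTable/TableData-00000.JSON -> Data/dbo.MyTable/TableData.JSON
import re
import sys
import zipfile

src = sys.argv[1]                                   # assumes the path ends in .bacpac
dst = src.replace('.bacpac', '-fixed.bacpac')
suffix = re.compile(r'(TableData)-\d{5}(\.JSON)$', re.IGNORECASE)

with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, 'w', zipfile.ZIP_DEFLATED) as zout:
    for info in zin.infolist():
        # drop the numeric suffix on table data files; copy everything else as-is
        # (if a table ever has more than one numbered part, this rename would collide)
        zout.writestr(suffix.sub(r'\1\2', info.filename), zin.read(info.filename))

print('wrote', dst)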
I'm trying to do what you recommend above...
1.) Do I also need to change the data in the _rels file?
2.) When I re-zip the data, I get a DacException: The stream cannot be read to construct the DacType. Are there any caveats to the re-zip process that I'm missing? What program do you use to re-zip?
OK, I got it...
1.) I did not need to change any data in the .rels file inside the _rels folder.
2.) I had been using WinRAR and Windows 7's built-in compression, but when I switched to WinZip and simply renamed the files within the existing ZIP file, everything worked.
Hope that helps if anybody else runs into the same issues.
I made a simple exe to fix up the bacpac for the client importer. Simply run it like this:
bacpacfixup.exe -f filename
The new file will be called filename-fixed.bacpac
That is great... it works a treat (a slight glitch using a UNC path to the file, but when I copied it locally it all worked beautifully).
Thanks for sharing!
Just wanted you to know that the service will be updated this week and will no longer produce BACPAC files which are not fully compatible with the CTP3 version of the DAC framework. After the update, any new BACPAC files created by the service will not need to be tweaked before importing using the client side tools.
The service was updated. All files created by the service will now be compatible with the SQL Server "Denali" CTP3 toolset.
Please note that in the future, the service will be updated to emit a slightly modified artifact for performance purposes that will no longer be compatible with the CTP3 toolset. A new version of the client side DAC Framework and its associated components will be made available at that point in time in order to maintain compatibility between the service and the client side tools.
Additionally, the service and the newer set of "Denali" tools will continue to accept files previously generated either by the older version of the service or by an older set of the client side tools.
Updates will be provided as the release date solidifies for the service and the client side tools.
There's no need to download anything new. The actual service itself was updated; the EXE tool you are using from CodePlex simply posts requests to the service running in Azure. The tools don't really do anything, so they do not need to change at all; they continue making the same calls to the service, but the service itself produces a different file format.
So given the above, you should not need to change anything in your local environment to get this to work.
Everything worked for a week, but this week, rather than having one JSON data file for each table, the bacpac contains multiple TableData files with -0000, -0001, -0002 suffixes, so no data is being imported again, and the tool created by cdouglas3578 will not work because there is more than one file per table.
Installing the November Azure SDK makes no difference.
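For anyone who wants to confirm whether their bacpac has the new multi-part layout, a quick sketch in the same spirit as the earlier ones (unofficial; treats the bacpac as a plain zip) counts the data files per table:

# count_parts.py - report how many TableData files each table folder contains
import collections
import sys
import zipfile

parts = collections.Counter()
with zipfile.ZipFile(sys.argv[1]) as z:          # pass the .bacpac path
    for name in z.namelist():
        # entries look like Data/<Schema>.<Table>/TableData-00000.JSON
        if name.startswith('Data/') and name.upper().endswith('.JSON'):
            parts[name.split('/')[1]] += 1

for table, count in sorted(parts.items()):
    print(table, count)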
I am having the exact same problem as BTM (the bacpac now contains multiple JSON files per table instead of one, as previously).
OMG, just when it looked like a great tool, it gets changed. Any update from MSFT?
Indeed there is an update:
The service produces a newer file version which is compatible with the new version of the client side tools.
I downloaded the new client but when it runs I get an error:
An error has occurred:
Method not found: 'Void Microsoft.SqlServer.Management.Dac.DacStore.Import(System.String, Microsoft.SqlServer.Management.Dac.DatabaseDeploymentProperties, Boolean)'.
What else do I need to install?
You must install both the EXE and the new set of binaries, which have all been refreshed. It's pretty painless; you don't have to uninstall or delete anything, just grab the x64 or x86 files from the provided links.
If you had previously installed the MSIs, running the newer MSI will "upgrade" your existing install.
OK, so you also have to download the binaries for the new version as well. Because the link on the page is not working, the steps to get back up and running are:
Download the new build of the CLI Client here:
Then from this page http://www.microsoft.com/download/en/details.aspx?id=28147 download the new version of the following files for your processor:
It was the "Required Assemblies Links" option on the home page but that seems to be OK now.
Try using the -B option on the command line:
DacImportExportCli.exe -s xxx\SQLEXPRESS -d newdb -f C:\Backup\SQLAzureBackup\xxx.bacpac -i -B -e
This works for me.