Wednesday, January 16, 2013 4:31 PM
I'm trying to pull some data out of SQL Server, convert it into a DataTable, and then push it back into a different SQL Server database using SqlBulkCopy.
Varbinary fields are causing me headaches, however: when they are extracted with the Invoke-Sqlcmd cmdlet they come back as byte arrays, which in itself is fine, but they default to a size of 1024 bytes, with the remaining bytes padded out with zeros. When I then try to run the bulk insert, it fails because of the incorrect field length.
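For what it's worth, the 1024-byte figure matches the default of Invoke-Sqlcmd's -MaxBinaryLength parameter, which caps the number of bytes returned for binary columns. Raising it avoids truncating long values, though it doesn't by itself address the padding described above. A hedged sketch (server, database, and query names are placeholders, not from the thread):

```powershell
# -MaxBinaryLength raises the cap on bytes returned for binary/varbinary
# columns (the default is 1024). All names below are illustrative only.
Invoke-Sqlcmd -ServerInstance "SourceServer" -Database "SourceDb" `
    -Query "SELECT Id, Payload FROM dbo.MyTable" `
    -MaxBinaryLength 8000
```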
I want to keep my bulk-insert function as dynamic as possible; that is, I don't want to set up the DataTable explicitly, I want to create it dynamically from the source query. I also can't just take the first 'n' bytes, because it is perfectly viable for a value to be, say, 12 bytes long with the last 3 bytes zero. In a varbinary(64) column that would mean an extra 52 zero bytes were present, which would be incorrect.
Is this possible? Or do I have to change the approach?
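One possible approach (a sketch under my own assumptions, not something confirmed in this thread): bypass Invoke-Sqlcmd entirely and fill the DataTable with an ADO.NET SqlDataAdapter, which infers the column schema from the source result set and keeps each varbinary value at its true length, with no zero padding. The table can then be handed straight to SqlBulkCopy. Connection strings and table names below are placeholders:

```powershell
# Placeholder connection strings and table name -- adjust for your environment.
$srcConn = "Server=SourceServer;Database=SourceDb;Integrated Security=True"
$dstConn = "Server=DestServer;Database=DestDb;Integrated Security=True"

# SqlDataAdapter.Fill builds the DataTable schema from the query itself,
# so varbinary columns come back as byte[] of the exact stored length.
$table   = New-Object System.Data.DataTable
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter(
    "SELECT * FROM dbo.MyTable", $srcConn)
[void]$adapter.Fill($table)

# Push the populated DataTable into the destination database.
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($dstConn)
$bulk.DestinationTableName = "dbo.MyTable"
$bulk.WriteToServer($table)
$bulk.Close()
```

Because the DataTable is built from whatever query you pass in, this keeps the function fully dynamic: no columns are declared by hand.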
Monday, January 21, 2013 2:18 AM
I had this posted in two forums.
- Marked As Answer by Mike Lewis (mzz3lh) Monday, January 21, 2013 3:05 PM