bcp_bind fails when used with size over 8000 bytes for VARBINARY(MAX) column

    Question

    Hi,

    I am using the Bulk Copy Functions (http://msdn.microsoft.com/en-us/library/ms130922.aspx) via SQL Server Native Client v11 for fast data insertion into a SQL Server 2012 table. Everything works fine except inserting large (over 8K) data chunks into a BLOB column (inserting data under 8K works without any issues). I get the following extended error information:
    HY000 "Invalid field size for datatype"

    Example C++ code:

    std::vector<char> vData(9000, 0);
    BYTE* pData = reinterpret_cast<BYTE*>(vData.data());
    DBINT nLength = static_cast<DBINT>(vData.size());
    auto nRet = bcp_bind(hHandle, pData, 0, nLength, nullptr, 0, SQLVARBINARY, 2 /*Column number is correct*/);
    assert(nRet != FAIL); // Fails

    Table creation SQL:

    CREATE TABLE [Table1] (C1 INT IDENTITY (1,1) NOT NULL, C2 VARBINARY(MAX) NULL);

    I’ve tried different data type values (SQLBIGBINARY, SQLBIGVARBINARY, SQLBINARY, …) – same error.

    I’ve tried setting "pData" to "nullptr" to signal that the data will be provided via calls to "bcp_moretext" (sketched below) - same error.
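
    As I read the bcp_moretext documentation, the deferred-data flow should look roughly like this; the SQLIMAGE type token, the zero length at bind time, and the 4096-byte chunk size are my own assumptions, not values the docs mandate for VARBINARY(MAX):

    // Bind column 2 with a null data pointer and zero length; the actual
    // bytes are streamed per row with bcp_moretext.
    if (bcp_bind(hHandle, nullptr, 0, 0, nullptr, 0, SQLIMAGE, 2) == FAIL)
        return;

    // Per row: announce the total length, send the row, then stream chunks.
    if (bcp_collen(hHandle, static_cast<DBINT>(vData.size()), 2) == FAIL)
        return;
    if (bcp_sendrow(hHandle) == FAIL)
        return;
    const size_t kChunk = 4096; // arbitrary chunk size for illustration
    for (size_t off = 0; off < vData.size(); off += kChunk)
    {
        size_t cb = vData.size() - off;
        if (cb > kChunk)
            cb = kChunk;
        if (bcp_moretext(hHandle, static_cast<DBINT>(cb),
                         reinterpret_cast<BYTE*>(vData.data() + off)) == FAIL)
            return;
    }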

    Transferring the same data via ODBC SQLBindParameter with type SQL_VARBINARY (SQL_C_BINARY) works without any problems.
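
    For comparison, the ODBC binding that does work looks roughly like this ("hStmt" is a prepared statement handle and the INSERT text is only illustrative):

    std::vector<char> vData(9000, 0);
    SQLLEN cbData = static_cast<SQLLEN>(vData.size());
    // Bind the whole buffer as one binary parameter; no 8K limit here.
    SQLBindParameter(hStmt, 1, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_VARBINARY,
                     vData.size(), 0, vData.data(),
                     static_cast<SQLLEN>(vData.size()), &cbData);
    SQLExecDirect(hStmt, (SQLCHAR*)"INSERT INTO [Table1] (C2) VALUES (?)", SQL_NTS);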

    Is there any chance of getting the bulk copy functionality to work?

    Thank you!

    Roman

    Monday, August 25, 2014 8:44 AM

Answers