Azure Data Factory Copy Pipeline failing if more than 6 columns

  • Question

  • I'm trying to copy data from a view in either on-premises PostgreSQL or MSSQL to a table in Azure PostgreSQL. I can't seem to get this to copy successfully when I map more than 6 columns from source to sink.

    I suspected that one of the source columns was the issue, so I varied which columns were mapped, but any combination succeeds as long as 6 or fewer columns are copied in total. I then tried different source and sink tables, which produced the same results.
    If I copy to an intermediary CSV file, I can import/export unlimited columns successfully. The copy also works as expected when going directly from MSSQL to MSSQL.

    The error output shown in the ADF console is consistently: "Operation on target Copy_7dp failed: Type=Npgsql.PostgresException,Message=08P01: invalid message format,Source=Npgsql,'"

    Friday, October 18, 2019 10:53 AM

All replies

  • Hi Matt,

    As Jay answered on your Stack Overflow thread, for now you could use a CSV file as an intermediary as a workaround: transfer the data from the on-premises database to CSV files in Azure Blob Storage, then transfer the data from there into the destination PostgreSQL database.

    I would recommend providing feedback at the feedback forum. All the feedback you share is closely monitored by the Data Factory product team and implemented in future releases.

    Also, regarding service general availability, I would suggest keeping an eye on Azure updates, which provide information about important Azure product updates, the roadmap, and announcements.

    Hope this helps.

    Friday, October 18, 2019 11:58 AM
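    The CSV-intermediary workaround described above can be sketched as follows. This is a minimal illustration of the staging step only: serializing the source rows to CSV text that ADF (or a tool such as psql's \copy) can then load into the destination Azure PostgreSQL table. Uploading the file to Azure Blob Storage and the final load are assumed to be handled by the copy pipeline, and all names here are hypothetical.

```python
import csv
import io

def rows_to_csv(rows, columns):
    """Serialize rows (a list of dicts) to CSV text with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

# Example with more than 6 columns, the case where the direct
# PostgreSQL copy fails with "08P01: invalid message format".
columns = ["id", "name", "city", "state", "zip", "phone", "email"]
rows = [{c: f"{c}_1" for c in columns}]
csv_text = rows_to_csv(rows, columns)
```

    Staging through CSV sidesteps the Npgsql binary copy path entirely, which is why the column count stops mattering once the intermediary file is in place.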
  • Thanks ChiragMishra,

    I'll proceed with the intermediary CSV step for now.

    I will also provide this as feedback to the ADF feedback forum.


    • Edited by TestMan897 Friday, October 18, 2019 12:38 PM
    Friday, October 18, 2019 12:38 PM