How to handle records of large size (980016677 bytes, almost 1 GB) from BizTalk to SQL

  • Question

  • hi all,

    We have a requirement where bulk records come from one SQL Server database. BizTalk has to do the mapping (one-to-one, with a loop) and send the bulk records to another SQL Server database using a composite schema.

    We are able to receive and send successfully, but the problem appeared when we received 1 GB of data from the source SQL Server database. We can receive the data, but sending it from BizTalk to the destination database takes ages and the instance stays active forever.

    We have tried increasing the send timeout as well, but to no avail. After 3 retries (taking some hours) the instance gets suspended.

    Can anyone please suggest how to handle this?

    Thanks and Regards,

    Veena

    Thursday, August 27, 2015 11:13 AM

Answers

  • In short: don't use BizTalk for that type of data transfer. Instead, use SSIS. SSIS is part of the same "platform" and is much better at handling bulk data.
    Thursday, August 27, 2015 11:19 AM
  • hi Marcus,

    We know that SSIS is meant for huge data volumes, but we only rarely get huge data from the SQL database. Is there any way to handle this using pipelines? Kindly suggest.

    Thanks and Regards,
    Veena

    Even if you do some magic in a pipeline, you will still have 1 GB of data flowing through the product. BizTalk will persist it to the MessageBox no matter what. That is a bottleneck.

    You might want to split the data into smaller chunks and not push a single 1 GB message through, but rather many smaller messages. That could give you better performance. The BizTalk engine will most likely throttle if you do it that way, but that might be OK depending on your overall solution. Basically, either use envelopes and split the data that way, or don't do "select * from table" but rather "select top 500 * from table" or something like that.
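    The batching idea can be sketched in plain Python (no BizTalk APIs, just the shape of the loop; all function and variable names here are illustrative): instead of one huge message, the source is drained in fixed-size batches and each batch becomes its own message.

    ```python
    # Minimal sketch: drain a source in fixed-size batches and send each
    # batch as its own message, instead of pushing one 1 GB message.

    def drain_in_batches(fetch_batch, send, batch_size=500):
        """Repeatedly fetch up to batch_size records and hand each batch
        to send() until the source is exhausted; returns the record count."""
        total = 0
        while True:
            batch = fetch_batch(batch_size)
            if not batch:
                break
            send(batch)
            total += len(batch)
        return total

    # Usage with a fake in-memory source of 1200 records:
    source = list(range(1200))

    def fetch_batch(n):
        taken, source[:n] = source[:n], []  # take and remove the first n
        return taken

    batches = []
    total = drain_in_batches(fetch_batch, batches.append)
    # 1200 records arrive as three messages of 500, 500 and 200 records
    ```

    The same shape applies whether the chunking is done with envelope debatching or with a restricted polling statement on the SQL side.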

    Friday, August 28, 2015 11:36 AM

All replies

  • In short: don't use BizTalk for that type of data transfer. Instead, use SSIS. SSIS is part of the same "platform" and is much better at handling bulk data.
    Thursday, August 27, 2015 11:19 AM
  • Marcus Rangell is correct. SSIS is an integral part of the BizTalk stack and is no less of a BizTalk solution.

    SSIS is also much better at handling sets of data.

    Thursday, August 27, 2015 11:22 AM
    Moderator
  • Hi Veena,

    BizTalk is not the right solution for you. You can use SSIS to do the mapping and send the bulk data to the destination SQL Server.

    In your case, not only will the processing of messages be slow, but it will also affect multiple BizTalk performance counters.

    Thanks

    Abhishek

    Thursday, August 27, 2015 12:48 PM
  • Hi Veena,

    I would suggest going with Marcus Rangell's advice. It's better to use SSIS instead of BizTalk Server.


    Thanks. If my reply is helpful, please mark it as answer or vote as helpful.

    Thursday, August 27, 2015 3:18 PM
    Moderator
  • hi Marcus,

    We know that SSIS is meant for huge data volumes, but we only rarely get huge data from the SQL database. Is there any way to handle this using pipelines? Kindly suggest.

    Thanks and Regards,
    Veena

    Friday, August 28, 2015 10:36 AM
  • hi Marcus,

    We know that SSIS is meant for huge data volumes, but we only rarely get huge data from the SQL database. Is there any way to handle this using pipelines? Kindly suggest.

    Thanks and Regards,
    Veena

    Even if you do some magic in a pipeline, you will still have 1 GB of data flowing through the product. BizTalk will persist it to the MessageBox no matter what. That is a bottleneck.

    You might want to split the data into smaller chunks and not push a single 1 GB message through, but rather many smaller messages. That could give you better performance. The BizTalk engine will most likely throttle if you do it that way, but that might be OK depending on your overall solution. Basically, either use envelopes and split the data that way, or don't do "select * from table" but rather "select top 500 * from table" or something like that.

    Friday, August 28, 2015 11:36 AM
  • If you don't have to treat the SQL output as a single batch, you can drain the records in smaller batches, say 1,000 at a time.

    Here are some techniques you can use:

    http://social.technet.microsoft.com/wiki/contents/articles/24803.biztalk-server-sql-patterns-for-polling-and-batch-retreive.aspx
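    The core of those polling patterns can be sketched in plain Python (a stand-in for the SQL side, where you would typically claim rows atomically with an UPDATE of the top n pending rows; table and column names here are made up): each poll claims a batch of unprocessed rows and marks them in-flight, so the next poll skips them.

    ```python
    # Sketch of a poll-and-mark drain: each poll claims up to batch_size
    # unprocessed rows (status 0) and flags them in-flight (status 1),
    # so repeated polls drain the table without re-reading the same rows.

    def poll_batch(rows, batch_size=1000):
        claimed = []
        for row in rows:
            if len(claimed) == batch_size:
                break
            if row["status"] == 0:
                row["status"] = 1  # mark in-flight so the next poll skips it
                claimed.append(row)
        return claimed

    # Usage: 2500 pending rows drain in three polls of 1000, 1000 and 500
    table = [{"id": i, "status": 0} for i in range(2500)]
    polls = []
    while True:
        batch = poll_batch(table)
        if not batch:
            break
        polls.append(len(batch))
    # polls is now [1000, 1000, 500]
    ```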

    Friday, August 28, 2015 12:16 PM
    Moderator
  • hi Marcus,

    Thank you for your response. We tried debatching by making the source schema an envelope. The problem here is that the schema is generated using Typed Polling, and inside the TypedPolling we have 6 nodes containing records, like:

    TypedPolling
          TypedPolling0
          TypedPolling1
          ......
          TypedPolling5

    The source SQL stored procedure was created by a third party. We will try to ask them to add some restriction on how many records the stored procedure fetches during polling.

    Please let me know if you have any suggestions on this.

    Thanks and Regards,
    Veena

    Monday, August 31, 2015 7:13 AM