Inserting records into a SQL table using BizTalk 2006 R2

  • Question

  • I have a requirement where BizTalk will receive a huge XML file containing at least 40,000 records. I am supposed to pass the XML as an input parameter to a stored procedure in a SQL database.

    Any suggestions?


    Thanks and Regards, Jay

    Wednesday, June 20, 2012 9:41 AM

Answers

  • You might want to do some prototyping with different processing options:

    1. Use the XML Disassembler to disassemble the records and process them individually.
    2. Pass the entire document into an orchestration and call an external data access assembly to fill a typed dataset, then let the Enterprise Library APIs process the dataset by calling the stored procedure (see the sketch below).

    In our production environment, it's not unusual for our typed datasets to contain 20,000+ related records. All of our data access (insert, read, update and delete) is accomplished using our own data access layer rather than the SQL Adapter.
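
    A minimal sketch of option 2, assuming the Enterprise Library Data Access Application Block. All names here (the "OrdersDb" connection string, the usp_InsertOrder procedure, and the column names) are hypothetical, and a plain DataSet stands in for the typed dataset for brevity:

        using System.Data;
        using System.IO;
        using Microsoft.Practices.EnterpriseLibrary.Data;

        public class OrderBatchWriter
        {
            // Called from the orchestration with the full message as an XML string.
            public void Write(string messageXml)
            {
                // "OrdersDb" is a hypothetical connection string name from config.
                Database db = DatabaseFactory.CreateDatabase("OrdersDb");

                // ReadXml leaves rows in the Added state, so UpdateDataSet
                // routes every row to the insert command.
                DataSet orders = new DataSet();
                using (StringReader reader = new StringReader(messageXml))
                {
                    orders.ReadXml(reader);
                }

                // The source columns are mapped to the discovered parameters
                // of the stored procedure in order.
                var insertCommand = db.GetStoredProcCommandWithSourceColumns(
                    "usp_InsertOrder", "OrderId", "CustomerId", "Amount");

                // "Order" is the record element name from the XML; transactional
                // behavior rolls the whole batch back on failure.
                db.UpdateDataSet(orders, "Order",
                    insertCommand, null, null, UpdateBehavior.Transactional);
            }
        }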


    David Downing... If this answers your question, please Mark as the Answer. If this post is helpful, please vote as helpful.

    • Marked as answer by Jay Nam Thursday, June 21, 2012 8:56 AM
    Wednesday, June 20, 2012 2:10 PM

All replies

  • Thanks David. I decided to go with the second option, but I am getting a timeout error for XML files containing more than 20,000 records. I guess I have to handle the timeout in my external assembly.
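
    A minimal sketch of that fix, reusing the hypothetical names from the earlier sketch; the ADO.NET default CommandTimeout is 30 seconds, which a 20,000-row batch can easily exceed:

        // Raise the timeout on the command before passing it to UpdateDataSet.
        var insertCommand = db.GetStoredProcCommandWithSourceColumns(
            "usp_InsertOrder", "OrderId", "CustomerId", "Amount");
        insertCommand.CommandTimeout = 300; // seconds; illustrative value, tune as needed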

    Thanks and Regards, Jay

    Thursday, June 21, 2012 9:00 AM
  • Within our processing, we typically break the work into smaller chunks to minimize timeout impacts. We do this for read operations (reading by page size, up to a maximum number of pages, and building up an output message) as well as for write operations, where we limit the size of the constructed dataset and use multiple writes to complete the entire batch (see the sketch below).
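
    A minimal sketch of the write side of that approach, again with the hypothetical names from the earlier sketches (the batch size is illustrative):

        using System.Data;
        using System.Data.Common;
        using Microsoft.Practices.EnterpriseLibrary.Data;

        public class ChunkedOrderWriter
        {
            public void WriteInChunks(Database db, DataSet orders, DbCommand insertCommand)
            {
                const int batchSize = 2000; // illustrative; size against your timeout budget
                DataTable source = orders.Tables["Order"]; // "Order" is hypothetical

                for (int offset = 0; offset < source.Rows.Count; offset += batchSize)
                {
                    // Copy one batch into its own dataset; ImportRow preserves the
                    // Added row state, so the insert command still applies.
                    DataTable batch = source.Clone(); // same schema, no rows
                    for (int i = offset; i < offset + batchSize && i < source.Rows.Count; i++)
                    {
                        batch.ImportRow(source.Rows[i]);
                    }

                    DataSet chunk = new DataSet();
                    chunk.Tables.Add(batch);

                    // Each call is a separate, shorter round trip to SQL Server.
                    db.UpdateDataSet(chunk, batch.TableName,
                        insertCommand, null, null, UpdateBehavior.Transactional);
                }
            }
        }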

    We also include a global orchestration exception handler that writes to a flat Error Log database table, capturing the inbound message content, the receive transport URI, and some additional tracing information. A scheduled job executes every hour, reads the entries in the Error Log table, and resubmits the failed messages to their original transport location for reprocessing.
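
    A sketch of the logging half, assuming a hypothetical usp_InsertErrorLog stored procedure; Enterprise Library discovers the procedure's parameters and binds the supplied values to them in order:

        using Microsoft.Practices.EnterpriseLibrary.Data;

        public class ErrorLogger
        {
            // Called from the orchestration's global exception handler.
            public void LogFailure(string messageBody, string receiveUri, string trace)
            {
                Database db = DatabaseFactory.CreateDatabase("OrdersDb"); // hypothetical name

                db.ExecuteNonQuery(db.GetStoredProcCommand(
                    "usp_InsertErrorLog", messageBody, receiveUri, trace));
            }
        }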

    HTH


    David Downing... If this answers your question, please Mark as the Answer. If this post is helpful, please vote as helpful.

    Thursday, June 21, 2012 2:36 PM