Dynamics 365 copy - auto table creation at SQL destination / incremental copy

    Question

  • I need to bulk copy table data from Dynamics 365 cloud to SQL Server. Through the Copy Wizard I am able to copy an individual table only after creating the same table definition in the destination.

    Is there any way to create the table structure in the destination automatically during the bulk copy process?

    Another question: how do I set up an incremental copy from Dynamics 365 tables? Every entity has a modified date. I have seen the SQL-to-SQL example that creates a watermark lookup table in the source, but how can this be achieved with cloud Dynamics 365, where SQL access is restricted?

    https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-portal


    • Edited by everfor Wednesday, July 18, 2018 2:11 PM
    Wednesday, July 18, 2018 10:19 AM

All replies

  • I would appreciate it if anybody could answer my query.
    Thursday, July 19, 2018 1:41 PM
  • For your first question, about auto-creating the table: unfortunately, this is not supported. Today, only one scenario supports auto table creation: copying from Azure SQL / SQL Server to Azure SQL Data Warehouse.

    For incremental copy, since you already have a modified date column, you don't need a watermark lookup table. You just need to specify your query correctly against that date column. If you are using the Copy Wizard, choose "Run regularly on schedule" in the first step; the query page will then show you a query template.

    Something like this:

    <fetch>
      <entity name="<<tableName>>">
        <all-attributes />
        <filter type="and">
          <condition attribute="<<columnName>>" operator="between">
            <value>@{formatDateTime(pipeline().parameters.windowStart, 'yyyy-MM-dd HH:mm')}</value>
            <value>@{formatDateTime(pipeline().parameters.windowEnd, 'yyyy-MM-dd HH:mm')}</value>
          </condition>
        </filter>
      </entity>
    </fetch>
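    As an illustration (not ADF itself), here is a minimal Python sketch of how that template expands once the placeholders are replaced and the pipeline resolves formatDateTime for the window boundaries. The entity name "account" and attribute "modifiedon" are example names, not taken from this thread:

    ```python
    from datetime import datetime

    # FetchXML template with placeholders for the entity, the date attribute,
    # and the resolved window boundaries.
    TEMPLATE = """<fetch>
      <entity name="{table}">
        <all-attributes />
        <filter type="and">
          <condition attribute="{column}" operator="between">
            <value>{start}</value>
            <value>{end}</value>
          </condition>
        </filter>
      </entity>
    </fetch>"""

    def render_fetch(table, column, window_start, window_end):
        # Python strftime equivalent of the .NET pattern 'yyyy-MM-dd HH:mm'
        fmt = "%Y-%m-%d %H:%M"
        return TEMPLATE.format(
            table=table,
            column=column,
            start=window_start.strftime(fmt),
            end=window_end.strftime(fmt),
        )

    # Example: one daily window.
    query = render_fetch("account", "modifiedon",
                         datetime(2018, 7, 18, 0, 0),
                         datetime(2018, 7, 19, 0, 0))
    print(query)
    ```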
    

    Thursday, July 19, 2018 3:15 PM
  • Thanks for the quick reply. I need to automate the incremental load to run every day, and the statement below doesn't look at the last copied timestamp field. Any advice on this?

    <value>@{formatDateTime(pipeline().parameters.windowStart, 'yyyy-MM-dd HH:mm')}</value>
    <value>@{formatDateTime(pipeline().parameters.windowEnd, 'yyyy-MM-dd HH:mm')}</value>



    Thursday, July 19, 2018 3:58 PM
  • There are a <<tableName>> and a <<columnName>> placeholder in the template. You need to replace them with your entity and entity field.
    Friday, July 20, 2018 1:59 AM
  • Not sure what's wrong. Please advise.

    Dynamics operation failed with error code: -2147220935, error message: The date-time format for @{formatDateTime(pipeline().parameters.windowStart, 'yyyy-MM-dd HH:mm')} is invalid, or value is outside the supported range... Activity ID:b9dd44ea-bf2d-49d3-83dd-0086d1277102

    And it works with a manual date, i.e. value="2018-08-11T12:26:57Z"/>


    Friday, July 20, 2018 9:06 AM
  • In the first step, did you select "Run regularly on schedule"?
    If you selected "Run once now", the Copy Data tool won't parse the expressions for you, because you don't have pipeline().parameters.

    And if 'yyyy-MM-dd HH:mm' is not the right format for your data, you can edit the format string yourself.
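    For reference, the manual value that did work earlier in this thread, "2018-08-11T12:26:57Z", is ISO 8601; in ADF's .NET-style format strings that would be 'yyyy-MM-ddTHH:mm:ssZ' (assuming Dynamics expects this ISO form). A small Python sketch of the equivalent formatting:

    ```python
    from datetime import datetime

    # Python strftime equivalent of the .NET pattern 'yyyy-MM-ddTHH:mm:ssZ'
    # (the ISO 8601 shape of the manual value that Dynamics accepted).
    dt = datetime(2018, 8, 11, 12, 26, 57)
    iso_value = dt.strftime("%Y-%m-%dT%H:%M:%SZ")
    print(iso_value)  # 2018-08-11T12:26:57Z
    ```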

    Saturday, July 21, 2018 8:03 AM
  • The failure above is not caused by a wrong date format; the pipeline parameters didn't get resolved at all. In other words, the backend sent the real FetchXML query to Dynamics with the literal date value "@{formatDateTime(pipeline().parameters.windowStart, 'yyyy-MM-dd HH:mm')}", which certainly causes a failure.

    Could you please check whether a pipeline schedule is configured and whether "windowStart" and "windowEnd" are well defined in the pipeline parameters? Example:

    windowStart: @trigger().scheduledTime

    windowEnd: @{addHours(trigger().scheduledTime, 24)}
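    For context, a sketch of how such parameters might be wired up in a schedule trigger definition (trigger and pipeline names here are hypothetical, not from this thread):

    ```json
    {
      "name": "DailyTrigger",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "startTime": "2018-08-07T00:00:00Z"
          }
        },
        "pipelines": [
          {
            "pipelineReference": {
              "referenceName": "CopyDynamicsToSql",
              "type": "PipelineReference"
            },
            "parameters": {
              "windowStart": "@trigger().scheduledTime",
              "windowEnd": "@{addHours(trigger().scheduledTime, 24)}"
            }
          }
        ]
      }
    }
    ```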


    Tuesday, August 7, 2018 6:51 AM