Data Factory v2: datetime issue


  • Hi!

    I'm fairly new to Azure Data Factory and I'm trying to run a Copy activity in Data Factory v2, but it fails every time with this error message:

    Activity Copy3 failed: ErrorCode=FailedDbOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'Conversion failed when converting date and/or time from character string.',Source=,''Type=System.Data.SqlClient.SqlException,Message=Conversion failed when converting date and/or time from character string.,Source=.Net SqlClient Data Provider,SqlErrorNumber=241,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=241,State=1,Message=Conversion failed when converting date and/or time from character string.,},],'

    Source: a .csv file located in a folder within Data Lake Store, with these characteristics:

    - Column delimiter: pipe (|)

    - Quote character: "

    Sink: an Azure SQL dataset

    I've tried many solutions, like cast(NameOfTheColumn as datetime2(0)), cast(NameOfTheColumn as datetime2(7)), and so on...

    Am I skipping something?

    Thank you in advance!


    An extract of the dataset saved in Data Lake Store:

    "AAAAAAAA"|2003-11-20 00:00:00.0000000|"15:56:10"|1608|"AA"|"A "|"AA AAAAAA"|2003-11-20 00:00:00.0000000|"15:56:10"|1608|"aaaaaaaa"|\N|0.00000|\N
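    A minimal sketch (plain Python, outside ADF) of how that record splits under the settings above, pipe delimiter and double-quote character, and which fields could plausibly convert to a SQL Server datetime. Note that the \N markers are null placeholders, not empty strings, and "15:56:10" is a time-of-day string:

    ```python
    # Split the sample record the way the dataset settings describe
    # (pipe delimiter, double-quote character) and check which fields
    # would survive a datetime conversion. Assumed parsing rules only;
    # this is not what ADF runs internally.
    from datetime import datetime

    record = r'"AAAAAAAA"|2003-11-20 00:00:00.0000000|"15:56:10"|1608|"AA"|"A "|"AA AAAAAA"|2003-11-20 00:00:00.0000000|"15:56:10"|1608|"aaaaaaaa"|\N|0.00000|\N'

    def parse_field(raw):
        """Strip the quote character and map \\N to None (NULL)."""
        raw = raw.strip('"')
        return None if raw == r'\N' else raw

    def looks_like_datetime(value):
        """True if the value matches the file's datetime shape.

        datetime2 allows up to 7 fractional digits but Python's %f only
        accepts 6, so the fraction is trimmed before parsing.
        """
        head, _, frac = value.partition('.')
        trimmed = head + ('.' + frac[:6] if frac else '')
        fmt = '%Y-%m-%d %H:%M:%S' + ('.%f' if frac else '')
        try:
            datetime.strptime(trimmed, fmt)
            return True
        except ValueError:
            return False

    fields = [parse_field(f) for f in record.split('|')]
    for i, value in enumerate(fields):
        if value is None:
            print(f'column {i}: NULL marker (\\N)')
        elif looks_like_datetime(value):
            print(f'column {i}: datetime')
        else:
            print(f'column {i}: not a datetime ({value!r})')
    ```

    Columns 2 and 8 here are quoted time-of-day strings; if the target table (or the table type used by the upsert procedure) maps any of these non-datetime fields to a datetime column, conversion fails exactly as in the error above.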

    Here's the pipeline JSON code:

        {
            "name": "TestPipe",
            "properties": {
                "activities": [
                    {
                        "name": "Copy3",
                        "type": "Copy",
                        "policy": {
                            "timeout": "7.00:00:00",
                            "retry": 0,
                            "retryIntervalInSeconds": 30,
                            "secureOutput": false
                        },
                        "typeProperties": {
                            "source": {
                                "type": "AzureDataLakeStoreSource",
                                "recursive": true
                            },
                            "sink": {
                                "type": "SqlSink",
                                "writeBatchSize": 10000,
                                "sqlWriterStoredProcedureName": "[myschema].[Upsert_tablename_]",
                                "sqlWriterTableType": "[myschema].[tablename_Type]"
                            },
                            "enableStaging": false,
                            "cloudDataMovementUnits": 0
                        },
                        "userProperties": [
                            {
                                "name": "Source",
                                "value": "@{concat( 'myfolder/',substring(utcnow(),0,4),'/',substring(utcnow(),5,2),'/',substring(utcnow(),8,2) )}/tablename_.csv"
                            },
                            {
                                "name": "Destination",
                                "value": "tablename"
                            }
                        ],
                        "inputs": [
                            {
                                "referenceName": "tablename_DL",
                                "type": "DatasetReference"
                            }
                        ],
                        "outputs": [
                            {
                                "referenceName": "tablename_ASQL",
                                "type": "DatasetReference"
                            }
                        ]
                    }
                ]
            }
        }

    Friday, May 25, 2018 1:54 PM

All replies

  • Hi,

    Could you check if your source data has empty strings? 

    If that is the case, you may want to treat empty as null.

    You could specify this property in your input dataset's format settings.

    "format": {
                    "type": "TextFormat",
                    "treatEmptyAsNull": true
    }
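    For context, a sketch of where the format block sits in a v2 dataset definition. The dataset name matches the pipeline above, but the linked service name and folder path are placeholders; `nullValue` is shown explicitly because the sample data uses \N as its null marker, and it's worth confirming it matches the file:

    ```json
    {
        "name": "tablename_DL",
        "properties": {
            "type": "AzureDataLakeStoreFile",
            "linkedServiceName": { "referenceName": "MyDataLakeStore", "type": "LinkedServiceReference" },
            "typeProperties": {
                "folderPath": "myfolder/...",
                "format": {
                    "type": "TextFormat",
                    "columnDelimiter": "|",
                    "quoteChar": "\"",
                    "treatEmptyAsNull": true,
                    "nullValue": "\\N"
                }
            }
        }
    }
    ```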

    Sunday, May 27, 2018 3:12 AM