Format the filename using SliceStart time in Azure Data Factory

  • Question

  • Hello,

    We are copying data from SQL to Data Lake Store. When we use the script below for filename generation, it gives us the error shown further down. The same kind of expression worked in a SQL select query, but it is not working for the filename.

    The scenario is to pull the previous hour's data into Data Lake, naming the file after the shifted slice start, i.e. SliceStart minus 2 hours.

    Please help if anybody has faced a similar issue.
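    For reference, the naming we are after can be sanity-checked outside ADF. This is a minimal Python sketch (the function name and the sample timestamp are illustrative only, not part of the pipeline) of the SliceStart-minus-2-hours filename:

```python
from datetime import datetime, timedelta

def file_name_for_slice(slice_start: datetime) -> str:
    # Mirror Time.AddHours(SliceStart, -2): shift the slice start back two hours
    event_time = slice_start - timedelta(hours=2)
    # Hyphen-separated timestamp; the .NET pattern yyyy-MM-dd-HH-mm-ss maps to this strftime
    return event_time.strftime("SampleData-%Y-%m-%d-%H-%M-%S.csv")

print(file_name_for_slice(datetime(2017, 2, 20, 21, 0, 0)))
# prints SampleData-2017-02-20-19-00-00.csv
```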

    Code:

    Output Dataset:

    Examples tried:

    Example 1:

    "typeProperties": {

                 "fileName": "$$Text.Format('SampleData-\\'{0:yyyy-MM-dd HH:mm:ss}\\'.csv', Time.AddHours(eventdatetime,-2))",

                "folderPath": "datalake/Data/",

                "format": {

                    "type": "TextFormat",

                    "rowDelimiter": "\n",

                    "columnDelimiter": ","

                },

       "partitionedBy": [

                    {

                        "name": "eventdate",

                        "value": {

                            "type": "DateTime",

                            "date": "SliceStart",

                            "format": "yyyy-MM-dd"

                        }

                    },

                    {

                        "name": "eventdatetime",

                        "value": {

                            "type": "DateTime",

                            "date": "SliceStart",

                            "format": "yyyy-MM-dd-HH-mm-ss"

                        }

                    }

                ]

            },

            "availability": {

                "frequency": "Hour",

                "interval": 1

            }

    Example 2:

    {
        "name": "DataLakeTable_Time1",
        "properties": {
            "published": false,
            "type": "AzureDataLakeStore",
            "linkedServiceName": "AzureDataLakeStoreLinkedService",
            "typeProperties": {
                "fileName": "XXXXX-{eventdatetime}.csv",
                "folderPath": "datalake/etldate={eventdate}/",
                "format": {
                    "type": "TextFormat",
                    "rowDelimiter": "\n",
                    "columnDelimiter": ","
                },
                "partitionedBy": [
                    {
                        "name": "eventdate",
                        "value": {
                            "type": "DateTime",
                            "date": "SliceStart",
                            "format": "yyyy-MM-dd"
                        }
                    },
                    {
                        "name": "eventdatetime",
                        "value": {
                            "type": "DateTime",
                            "date": "SliceStart",
                            "format": "$$Text.Format(\\'{0:yyyy-MM-dd HH:mm:ss}\\', Time.AddHours(SliceStart,-2))"
                        }
                    }
                ]
            },
            "availability": {
                "frequency": "Hour",
                "interval": 1
            }
        }
    }

    In both scenarios we get the error below.

    Error:

    Copy activity encountered a user error at Sink side: ErrorCode=UserErrorFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The request to 'Unknown' failed and the status code is 'BadRequest' ,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (400) Bad Request.,Source=System,

    Thanks,

    Nagarjuna



    Monday, February 20, 2017 9:14 PM

All replies

  • Hi Nagarjuna,

    Could you share the run ID that encountered this failure? Then we can look into what's wrong there. By the way, could you also share the script that works with the SQL query?
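    In the meantime, one thing that may be worth trying (a sketch only, not verified against your pipeline): as far as I know, partition variables such as eventdatetime cannot be referenced inside $$Text.Format, where only system variables like SliceStart and SliceEnd are available; characters such as ':' or quotes in a file name can also be rejected by the store. A variant along these lines avoids both issues:

```json
"typeProperties": {
    "fileName": "$$Text.Format('SampleData-{0:yyyy-MM-dd-HH-mm-ss}.csv', Time.AddHours(SliceStart, -2))",
    "folderPath": "datalake/Data/"
}
```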


    panchao

    Saturday, May 27, 2017 6:33 PM