Pending Validation - No Runs for this Data Slice

  • Question

  • Hello,

    I have created an Azure Data Factory (ADF) pipeline with a custom activity. I got the tutorial version working, and now I am trying to get my own custom activity to work. My problem is that the input and output data slices are stuck in the 'Pending Validation' state. See the image below.

    Context: I am trying to read zipped .db files from blob storage, unzip them, and write their data to Azure SQL using a custom activity.

    Here is the definition of my input dataset: 

    {
      "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Table.json",
      "name": "ETLBlobInput",
      "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "StorageSas",
        "typeProperties": {
          "folderPath": "configdb-storage/"
        },
        "availability": {
          "frequency": "Hour",
          "interval": 1
        },
        "external": true,
        "policy": { }
      }
    }
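
    I have left the dataset "policy" empty. For reference, my understanding is that an external dataset's policy can also carry "externalData" validation settings; a minimal sketch with illustrative retry values (not what I actually have deployed) would look like:

      "external": true,
      "policy": {
        "externalData": {
          "retryInterval": "00:01:00",
          "retryTimeout": "00:10:00",
          "maximumRetry": 3
        }
      }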

    Here is my output dataset:

    {
        "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Table.json",
        "name": "ETLSqlOutput",
        "properties": {
            "type": "AzureSqlTable",
            "linkedServiceName": "SqlDbLinkedService",
            "structure": [],
            "typeProperties": {
                "tableName": "TestSqlDb"
            },
            "availability": {
                "frequency": "Hour",
                "interval": 1
            }
        }
    }
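
    In case it is relevant: I left "structure" empty above. My understanding is that, if populated, it describes the columns of the target table; a sketch with made-up column names (not my real schema) would be:

            "structure": [
                { "name": "Id", "type": "Int32" },
                { "name": "Payload", "type": "String" }
            ],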

    And here is my pipeline:

    {
      "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-09-01/Microsoft.DataFactory.Pipeline.json",
      "name": "ETLPipeline",
      "properties": {
        "description": "ETL Blob Zip file to SQL",
        "activities": [
          {
            "name": "DotNetActivityTemplate",
            "type": "DotNetActivity",
            "inputs": [
              {
                "name": "ETLBlobInput"
              }
            ],
            "outputs": [
              {
                "name": "ETLSqlOutput"
              }
            ],
            "typeProperties": {
              "assemblyName": "ETLCustomActivity.dll",
              "entryPoint": "ETLCustomActivity.ETLCustomActivity",
              "packageLinkedService": "AzureStorageAccount",
              "packageFile": "customactivitycontainer/ETLCustomActivity.zip",
              "extendedProperties": {
                "SliceStart": "$$Text.Format('{0:yyyyMMddHH-mm}', Time.AddMinutes(SliceStart, 0))"
              }
            },
            "linkedServiceName": "AzureBatchLinkedService",
            "policy": {
              "concurrency": 1,
              "executionPriorityOrder": "OldestFirst",
              "retry": 3,
              "timeout": "01:00:00"
            },
            "scheduler": {
              "frequency": "Hour",
              "interval": 1
            }
          }
        ],
        "start": "2016-11-01T00:00:00Z",
        "end": "2017-11-01T00:00:00Z"
      }
    }


    Does anyone have any ideas why my slices are stuck? Thank you, 

    Danielle



    • Edited by Danielle F Friday, November 11, 2016 12:08 AM
    Friday, November 11, 2016 12:06 AM

All replies

  • On further investigation, I first noticed that my SAS URI had a ';' at the end, which I deleted. That didn't fix the problem.

    Then I noticed that when I put the SAS URI into a browser, it returns an 'authentication failed' message. After I appended the two query string parameters "restype=container&comp=list", the browser returned a list of blobs.

    So I tried adding those two query string parameters to my Data Factory linked service, but that didn't fix the problem either.
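
    For reference, this is roughly the shape of my "StorageSas" linked service (the sasUri value below is a placeholder, not my real token):

    {
      "name": "StorageSas",
      "properties": {
        "type": "AzureStorageSas",
        "typeProperties": {
          "sasUri": "https://<account>.blob.core.windows.net/<container>?<sas-token>"
        }
      }
    }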

    Friday, November 11, 2016 5:42 PM
  • Hi Danielle,

    Based on your information, I checked our backend logs, and it seems "ETLPipeline" is running.

    My guess is that there are too many runs and the UI may not be listing the one that is running. Could you try the "see more" button? I think that will show it :)

    Thanks,

    Charles

    • Proposed as answer by Charles Gu Monday, November 14, 2016 8:27 AM
    Monday, November 14, 2016 8:27 AM