Sending parameters to U-SQL script through Data Factory

    Question

  • I am having some issues passing a parameter to a U-SQL script.

    The parameter seems to be put into the script correctly, but it fails because there seems to be an extra character that causes a syntax error.

    ADFbc5f3628-6585-4984-a579-bf87b14acf24 is the id of the job.

    The parameter is pretty simple. It just sends the window start time of my Data Factory activity.

    "parameters": {
                            "runtime": "$$Text.Format('{0:yyyy-MM-ddTHH:mm:ss}',WindowStart)"
                        }

    I looked at the script in the Job Management area of the Data Lake Analytics blade. The values look fine and it should work, but then I see that extra character. If I copy the script out and paste it into my VS 2013 solution, I see the error. If I remove the extra character, it runs fine.

    Is this a bug? Is there something I am missing? It seems like the parameter needs some kind of Trim function.

    Wednesday, April 6, 2016 2:28 PM

Answers

All replies

  • Hi wilnelmpls,

    This might happen because Azure Data Factory currently doesn't support U-SQL scripts encoded as UTF-8 with a BOM (byte order mark); the BOM bytes end up in the submitted script as the extra character you are seeing.

    One workaround you can try is to open your script in Notepad++ (or any other editor with an encoding-conversion feature), convert the encoding from UTF-8 with BOM to plain UTF-8, save it, and upload it to your storage, overwriting the existing script.
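    The same conversion can be scripted if you have many files. A minimal sketch in Python (the file name is just an example; this simply removes the three BOM bytes if present):

    ```python
    import codecs

    def strip_bom(path: str) -> None:
        """Remove a leading UTF-8 BOM from the file at `path`, in place.

        Safe to run on files that have no BOM (it does nothing).
        """
        with open(path, "rb") as f:
            data = f.read()
        if data.startswith(codecs.BOM_UTF8):
            # Rewrite the file without the 3-byte BOM prefix.
            with open(path, "wb") as f:
                f.write(data[len(codecs.BOM_UTF8):])

    # Example: strip_bom("myscript.usql") before uploading to storage.
    ```
    
    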

    The fix should be deployed in 2-3 weeks. Please let me know if the issue still occurs after you try the workaround.

    Thanks,

    Jack

    Wednesday, April 6, 2016 8:03 PM
  • Actually, there is another solution people can use right now. I created procedures in the Data Lake Analytics catalog and passed the parameter to a procedure instead. It worked like a charm.

    Michael Rys has a great post regarding this.

    https://social.msdn.microsoft.com/Forums/en-US/1cbae39c-7a1c-45ac-a66f-adeb3ac66403/parameterizing-usql-scripts?forum=AzureDataLake

    It is probably best to use procedures anyway. They are easier to keep under version control than a loose script submitted as a job from Visual Studio. Also, since a standalone script can't be stored in the Data Lake catalog itself, you would otherwise need a separate blob to hold it; with procedures the code lives in the catalog, so no blob is needed and the code is more centralized.
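    A hypothetical sketch of the procedure approach (the database, procedure, column, and file names are illustrative, not the poster's actual code): the procedure takes the window start as a string parameter, and the U-SQL script submitted by Data Factory shrinks to a one-line invocation, with @runtime injected by the activity's parameters block.

    ```sql
    // One-time setup, run against the Data Lake Analytics catalog.
    CREATE DATABASE IF NOT EXISTS MyDb;
    USE DATABASE MyDb;

    DROP PROCEDURE IF EXISTS dbo.ProcessWindow;
    CREATE PROCEDURE dbo.ProcessWindow(@runtime string)
    AS
    BEGIN
        @rows =
            EXTRACT eventTime string,
                    value int
            FROM "/input/events.csv"
            USING Extractors.Csv();

        // Keep only the rows for the window passed in by Data Factory.
        @filtered =
            SELECT eventTime, value
            FROM @rows
            WHERE eventTime == @runtime;

        OUTPUT @filtered
        TO "/output/result.csv"
        USING Outputters.Csv();
    END;

    // The script Data Factory submits is then just:
    // MyDb.dbo.ProcessWindow(@runtime);
    ```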

    Nelson

    • Marked as answer by wilnelmpls Wednesday, April 6, 2016 8:34 PM
    Wednesday, April 6, 2016 8:34 PM