Options for processing files with Logic Apps + Functions. What is correct?

  • Question

  • Let's say I have a task to process files. The files are in the range of 1MB to 10MB. I assume that it is not possible to send these files directly to an action (if it is, please elaborate on how that could be accomplished). Instead, I send the file to BlobStorage using the existing Microsoft action "AzureBlob - Create file". But now I wish to process that file in an Azure Function. I think there are two options.

    1. Create an Azure Function that watches for new blobs. This Function does not appear in the Logic App's design.

    2. Create an Azure Function that receives a blob name. This Function is the action following the "AzureBlob - Create file" action, and would use the Blob Storage API to retrieve the blob.

    What I like about #2 is that the Function is an entity in the Logic App, which makes the sequence more transparent. What I am not sure about, however, is whether the "AzureBlob - Create file" action is synchronous, so that when it passes control to the Azure Function, the blob is guaranteed to have been written.

    Are there other considerations that would weigh on the above choice? Are there other options?

    • Edited by intrasight Thursday, June 30, 2016 4:34 AM formatting
    Thursday, June 30, 2016 4:33 AM


All replies

  • Hi,

    Your scenario #1 also seems fine. With an Azure Function using a BlobTrigger, the function fires as soon as the blob is created and you can process it there. It is also decoupled from your Logic App.
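    A minimal sketch of what option #1 could look like, assuming the Python programming model; the binding path, the entry-point name, and the `summarize` step are illustrative placeholders, not anything established in this thread:

    ```python
    import io

    # In a deployed Function, a function.json binding such as
    #   { "type": "blobTrigger", "path": "files/{name}",
    #     "connection": "AzureWebJobsStorage" }
    # would make the runtime call main() with each new blob's stream.
    # "files/{name}" and summarize() are hypothetical placeholders.

    def summarize(data: bytes) -> dict:
        # Hypothetical processing step: just report the payload size.
        return {"size_bytes": len(data)}

    def main(blob_stream) -> None:
        """Entry point the Functions runtime would invoke per new blob."""
        data = blob_stream.read()
        result = summarize(data)
        print(f"processed blob: {result}")

    # Local smoke test with no Azure dependency:
    main(io.BytesIO(b"hello world"))
    ```

    The trade-off the original question raises still holds: this function runs whenever a matching blob appears, so the Logic App designer shows no trace of it.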

    Best Regards, Carlos Sardo

    Thursday, June 30, 2016 7:57 AM
"Seems fine" is not very definitive. I'm hoping for more of a discussion here. What are the pros and cons? And I still need an answer as to whether the "AzureBlob - Create file" action is synchronous, and thus whether #2 is even an option.
    Thursday, June 30, 2016 3:52 PM
  • Hi,

    The Azure Blob - Create file action is indeed synchronous: the action returns 200 (OK) only after the file has been successfully written to blob storage. That said, you can also use another Azure Blob action - Get file content using path - https://azure.microsoft.com/en-us/documentation/articles/connectors-create-api-azureblobstorage/ - to fetch the blob content and pass it to the next actions in the Logic App. I will need to look into the maximum response length supported by Logic Apps and the blob connector.
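    For option #2, the Logic App would hand the function only the blob's path, and the function pulls the content itself. A hedged sketch of that shape - the azure-storage-blob SDK call is indicated in comments only, and every name here is a placeholder:

    ```python
    def split_blob_path(path: str) -> tuple:
        """Split 'container/dir/file.txt' into (container, blob name)."""
        container, _, name = path.lstrip("/").partition("/")
        return container, name

    def fetch_blob(path: str, conn_str: str) -> bytes:
        """Fetch a blob's content given the path a Logic App action passed in."""
        container, name = split_blob_path(path)
        # With the azure-storage-blob package this would be roughly:
        #   from azure.storage.blob import BlobClient
        #   client = BlobClient.from_connection_string(conn_str, container, name)
        #   return client.download_blob().readall()
        raise NotImplementedError("requires a real storage account")
    ```

    The appeal of this shape is that only a short path string crosses the Logic Apps payload boundary; the multi-megabyte content moves directly between storage and the function.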

    Thanks, Vinay.

    Sunday, July 3, 2016 11:12 PM
  • What I was implying with my option #2 was to have in the output the name of the created blob.

    I did observe, after first posting this question, that the "Output Link" of the "Create file" action does contain the full contents of the file in the "body" param. That doesn't really make much sense - why include the blob in the output JSON if the purpose of the action is to save it to Azure Storage? I would have thought it would output the path to the saved storage blob. It seems very inefficient. Can I change that behaviour by editing the JSON manifest?

    Sunday, July 3, 2016 11:50 PM
  • A couple of questions:

    1. What is the trigger for your workflow?
    2. What is your file source? Based on your description it seems like you are using Azure Blob as a temporary storage.

    Monday, July 4, 2016 4:28 AM
  • The trigger is an SFTP file drop. Yes, I am using Azure Blob as a temporary store.
    Monday, July 4, 2016 12:12 PM
You can actually use the 'When a file is created or modified' trigger, which returns the file content in the trigger body; you can pass that (@triggerBody()) straight to your function.
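In the workflow definition, that would look roughly like the following Function action; the action name, function resource ID, and placeholders in angle brackets are illustrative, not from this thread:

    ```json
    "ProcessFile": {
        "type": "Function",
        "inputs": {
            "function": {
                "id": "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Web/sites/<app>/functions/ProcessFile"
            },
            "body": "@triggerBody()"
        },
        "runAfter": {}
    }
    ```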
    Tuesday, July 5, 2016 7:15 AM
  • That is good to know.

    So what would be considered "best practice" here for files in the range of 1MB-10MB? What about bigger files, like 10MB-100MB?

    Wednesday, July 6, 2016 12:02 AM
Logic Apps supports payload sizes up to 25 MB, so that will work for 1-10 MB files.

    Increasing this size limit is in our backlog.

    Wednesday, July 6, 2016 7:11 PM
  • Hi Vinay,

    So would you say that processing the output of the 'When a file is created or modified' trigger is a better approach vs. sending results to a storage blob? I'm still hoping to hear the pros and cons.

     - Chris

    Wednesday, July 6, 2016 7:26 PM
Yes, we are avoiding copying the data across to blob storage. So the trigger approach is definitely better than sending results to storage.
    Wednesday, July 6, 2016 10:24 PM
I found that one of my files is ~30MB. Is the current limit still 25MB? If so, where does "increasing the size" stand in the current backlog?



    Tuesday, December 13, 2016 3:52 PM
I decided to do some payload-size-limit testing. This is for sending a payload from Logic Apps to an Azure Function.

    I tried a 26MB file and it failed. Nothing is logged on the Azure Function side.


        {
            "statusCode": 500,
            "headers": {
                "Pragma": "no-cache",
                "Cache-Control": "no-cache",
                "Date": "Tue, 13 Dec 2016 16:36:18 GMT",
                "Server": "Microsoft-IIS/8.0",
                "X-AspNet-Version": "4.0.30319",
                "X-Powered-By": "ASP.NET",
                "Content-Length": "36",
                "Content-Type": "application/json; charset=utf-8",
                "Expires": "-1"
            },
            "body": {
                "Message": "An error has occurred."
            }
        }
    Then I tried a 14MB file and it also failed with the same message.

    I tried a 9MB file and it succeeded.

    So is the limit 10MB and not 25MB?

    Also, this page


    Says the payload limit is 50MB but that it depends on the connector. Assuming it is the Azure Functions connector that has the lower 10MB limit, is there some efficient means in Logic Apps to split a file that is too large into a set of smaller files? Processing in a loop in Logic Apps would, I assume, not make sense due to the compute cost.
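    One way to sketch the splitting idea is to do the chunking in a small pre-processing function rather than looping in the workflow. This assumes a 10MB per-call ceiling, which is what the tests above suggest but which is not documented; the names are placeholders:

    ```python
    CHUNK_LIMIT = 10 * 1024 * 1024  # assumed per-call payload ceiling, in bytes

    def split_payload(data: bytes, limit: int = CHUNK_LIMIT) -> list:
        """Cut data into consecutive pieces no larger than limit."""
        return [data[i:i + limit] for i in range(0, len(data), limit)]

    # Each piece could then be written as its own blob (e.g. "name.part000")
    # and handed to the function connector one at a time, keeping every
    # call under the observed limit; reassembly becomes the function's job.
    ```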

    • Edited by intrasight Tuesday, December 13, 2016 5:51 PM
    Tuesday, December 13, 2016 4:39 PM