HttpTrigger with multipart formdata

  • Question

  • I want to use a Node.js Azure Function to receive a PDF file and save it to blob storage. I have an httpTrigger I want to use to upload the file, and an output binding to save it to blob storage. How do I access the file in req, convert it to a blob, and assign it to context.bindings.outputBlob? I'm having trouble accessing the file in req.body.

    Thanks,

    Donnie

    submitFile() {
        // wrap the selected file in a FormData object
        let formData = new FormData();
        formData.append('file', this.file);

        // note: when posting FormData, the browser normally generates the
        // multipart boundary itself; setting Content-Type by hand can drop
        // the boundary parameter the server needs to parse the body
        axios.post(url, formData, {
            headers: {
                'Content-Type': 'multipart/form-data'
            }
        })
        .then(function () {
            console.log('SUCCESS!!');
        })
        .catch(function () {
            console.log('FAILURE!!');
        });
    }


    Donnie Kerr


    • Edited by DonnieKerr Wednesday, February 6, 2019 4:59 PM
    Wednesday, February 6, 2019 4:58 PM

Answers

  • Hi, I tried this today and was able to receive a multipart form. Hope this helps:

    Multipart form data processing via HttpTrigger using Node.js on an Azure Function:


    var multipart = require("parse-multipart");

    module.exports = function (context, request) {
        context.log('JavaScript HTTP trigger function processed a request.');
        // wrap the raw request body in a Buffer
        var bodyBuffer = Buffer.from(request.body);
        // extract the multipart boundary from the content-type header
        var boundary = multipart.getBoundary(request.headers['content-type']);
        // parse the body into its parts
        var parts = multipart.Parse(bodyBuffer, boundary);
        // return the first part's metadata as the response
        context.res = { body: { name: parts[0].filename, type: parts[0].type, data: parts[0].data.length } };
        context.done();
    };
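    For reference, a minimal function.json for this kind of trigger might look like the sketch below. The binding names are assumptions, and "dataType": "binary" can be set on the trigger to help make sure the body arrives as a Buffer rather than a string:

    {
      "bindings": [
        {
          "authLevel": "function",
          "type": "httpTrigger",
          "direction": "in",
          "name": "request",
          "dataType": "binary",
          "methods": [ "post" ]
        },
        {
          "type": "http",
          "direction": "out",
          "name": "res"
        }
      ]
    }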


    • Marked as answer by DonnieKerr Thursday, February 7, 2019 5:34 PM
    • Edited by Anass Kartit Friday, February 8, 2019 9:42 AM sample code
    Thursday, February 7, 2019 5:25 PM

All replies

  • This is what I get when logging req.body ...


    Donnie Kerr

    Wednesday, February 6, 2019 5:48 PM
  • A piece of advice on best practices: use a blob-triggered function instead of an HttpTrigger function. I would upload the file to blob storage, which would trigger a function that sends a queue message to a second function.
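    As a rough sketch of that flow (the container, queue, and binding names are made up here):

    function.json (sketch):

    {
      "bindings": [
        { "type": "blobTrigger", "direction": "in", "name": "uploadedPdf",
          "path": "uploads/{name}", "connection": "MyStorageConnectionAppSetting" },
        { "type": "queue", "direction": "out", "name": "outputQueueItem",
          "queueName": "pdf-to-process", "connection": "MyStorageConnectionAppSetting" }
      ]
    }

    index.js:

    // runs whenever a new blob lands in the "uploads" container
    module.exports = async function (context, uploadedPdf) {
        context.log(`New blob: ${context.bindingData.name}, ${uploadedPdf.length} bytes`);
        // hand off to the second function via a queue message
        context.bindings.outputQueueItem = { blobName: context.bindingData.name };
    };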

    If you still want to pursue this path, have a look at this:

    https://github.com/Azure/azure-functions-host/issues/2009

    • Proposed as answer by Anass Kartit Wednesday, February 6, 2019 6:45 PM
    • Edited by Anass Kartit Wednesday, February 6, 2019 6:47 PM
    • Unproposed as answer by Anass Kartit Thursday, February 7, 2019 8:08 PM
    Wednesday, February 6, 2019 6:45 PM
  • Why would I want my user to have to leave my app, go to Azure, find the right blob storage, then upload a PDF there? 

    My use case: the user clicks Browse in the app to select a PDF to upload to blob storage. When the file is saved to blob storage, a blob trigger fires off another downstream function.

    Unless I'm misunderstanding your best practice suggestion here?

    Thanks,

    Donnie 


    Donnie Kerr

    Wednesday, February 6, 2019 6:50 PM
  • To clarify what Anass mentioned: blob storage has all the infrastructure needed to handle uploads (of any size), and your user can still upload to your blob storage without ever leaving your app.

    Here is a great tutorial which covers how this can be done.

    Note that HTTP Triggered functions have some limits.

    Also, the github issue that Anass shared shows how you can still achieve multipart uploads with functions too.

    Thursday, February 7, 2019 12:48 AM
  • The user will not leave your app; it is just a change of trigger type and approach.

    The app will first upload the PDF to blob storage; then you can get the blob's URL and pass it to another function that reads from the blob. This is more secure and scalable.

    Have a look at this example for NodeJS.

    https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-nodejs
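    As a rough sketch of the upload step (using the current @azure/storage-blob package; the container name, file name, and environment variable here are assumptions):

    const { BlobServiceClient } = require("@azure/storage-blob");

    async function uploadPdf(filePath) {
        // connection string from the storage account's access keys
        const serviceClient = BlobServiceClient.fromConnectionString(
            process.env.AZURE_STORAGE_CONNECTION_STRING
        );
        const containerClient = serviceClient.getContainerClient("documents");
        const blockBlobClient = containerClient.getBlockBlobClient("mydoc.pdf");

        // upload the local file with the right content type
        await blockBlobClient.uploadFile(filePath, {
            blobHTTPHeaders: { blobContentType: "application/pdf" }
        });
        return blockBlobClient.url; // pass this URL to the downstream function
    }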

    Also, as Pramod suggested, HttpTrigger has limits and issues when processing files in memory.

    Thursday, February 7, 2019 9:47 AM
  • This is great!  I didn't realize blob storage had its own REST API.

    My question is: in the tutorial, I don't see any client-side JavaScript that calls the Blob Storage REST API. I don't see how the JavaScript uploads the image.


    Donnie Kerr

    Thursday, February 7, 2019 10:56 AM
  • I may still pursue this path, but I'm not seeing an actual solution on the issues link you provided.  I posted a comment there to see if there is a solution, because v2 doesn't work.

    I'm also looking at the Blob service REST API. I still don't see an example in JavaScript that takes a file object and uploads it to blob storage. I need to see whether this REST API supports multipart.


    Donnie Kerr

    Thursday, February 7, 2019 11:24 AM
  • Have a look at this for multipart upload in Node.js:

    https://stackoverflow.com/questions/40713479/how-to-parse-multi-part-form-data-in-an-azure-function-app-with-http-trigger-n


    Thursday, February 7, 2019 2:09 PM
  • Thank you.

    Unfortunately, the request object used in an Azure Function is not the same as the request object in a Node.js server application. It is wrapped and doesn't support things like req.on, etc. So all these Node examples and their parsing packages won't work in the function, because they depend on a true request object.

    The example in the link you provided won't work in a function:

    https://stackoverflow.com/questions/21745432/image-upload-to-server-in-node-js-without-using-express 

    I need a multipart parsing solution that will work with the function's req/context objects.
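    To illustrate the mismatch, here is a hedged sketch of the two models side by side:

    // classic Node.js HTTP server: the request is a readable stream,
    // which is what packages like busboy and formidable consume
    const http = require("http");
    http.createServer((req, res) => {
        const chunks = [];
        req.on("data", (chunk) => chunks.push(chunk)); // stream events exist here
        req.on("end", () => res.end(`got ${Buffer.concat(chunks).length} bytes`));
    }).listen(3000);

    // Azure Function: the runtime has already buffered the body, so there are
    // no stream events to listen to -- the raw bytes sit on request.body
    module.exports = async function (context, request) {
        const raw = Buffer.from(request.body); // buffer-based parsers work on this
        context.log(`received ${raw.length} bytes`);
    };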

    Thanks,

    Donnie


    Donnie Kerr

    Thursday, February 7, 2019 3:46 PM
  • I found parse-multipart too!  And got it working just when you posted this!  Awesome!

    Well done on the documentation! It explains this very well. Wish I had it 3 days ago! lol

    Thank you!

    Donnie


    Donnie Kerr

    Thursday, February 7, 2019 5:36 PM
  • Anass,

    How do I save the uploaded file to the storage account, either as a blob or a file?

    I tried to bind the file's data to outputBlob and the filename to the path, but nothing shows up in blob storage, and I don't get any errors.

    tried to assign them like this ...

    let part0 = parts[0];
    let filename = part0.filename;
    context.bindings.outputBlob = part0.data;

    and to pass them to the output binding like this ...

    {
        "name": "outputBlob",
        "type": "blob",
        "path": "sandy-documents/{filename}",
        "connection": "MyStorageConnectionAppSetting",
        "direction": "out"
    }


    Donnie Kerr


    • Edited by DonnieKerr Sunday, February 10, 2019 12:38 AM
    Sunday, February 10, 2019 12:34 AM
  • If anyone is looking for how to upload a buffer produced by parse-multipart or multipart-formdata, here is how you can do it:

    const { AnonymousCredential, BlockBlobClient } = require("@azure/storage-blob");

    async function uploadBuffer(buffer) { // buffer is what you get from the multipart parser
        const blobClient = new BlockBlobClient(
            // sasToken is a Shared Access Signature from the portal
            `http://127.0.0.1:10000/devstoreaccount1/files/name.jpg?sasToken`,
            // or use SharedKeyCredential with your storage credentials,
            // so you don't need a sasToken
            new AnonymousCredential()
        );

        const uploadResult = await blobClient.upload(buffer, buffer.length, {
            blobHTTPHeaders: {
                blobContentType: 'set correct mime type here'
            }
        });
        return uploadResult;
    }
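    For completeness, a hedged end-to-end sketch that combines this with parse-multipart inside the function (the container and app-setting names are assumptions; constructing the client in code also sidesteps the output-binding path problem discussed above):

    const multipart = require("parse-multipart");
    const { BlobServiceClient } = require("@azure/storage-blob");

    module.exports = async function (context, request) {
        // parse the multipart body, as in the accepted answer
        const bodyBuffer = Buffer.from(request.body);
        const boundary = multipart.getBoundary(request.headers["content-type"]);
        const parts = multipart.Parse(bodyBuffer, boundary);
        const file = parts[0];

        // upload the parsed part under its original filename
        const serviceClient = BlobServiceClient.fromConnectionString(
            process.env.MyStorageConnectionAppSetting // assumed app setting name
        );
        const blockBlobClient = serviceClient
            .getContainerClient("sandy-documents")
            .getBlockBlobClient(file.filename);
        await blockBlobClient.upload(file.data, file.data.length, {
            blobHTTPHeaders: { blobContentType: file.type }
        });

        context.res = { body: { uploaded: file.filename } };
    };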


    Saturday, October 12, 2019 2:27 AM