Azure Function can't find file on D:\

  • Question

  • I have written some Azure Functions that download a file via FTP into a directory on the D: drive.  Specifically, the path is D:\home\site\wwwroot\data\mca, where \data\mca was created from the console (and is also visible in the file share part of the storage portal).

    The function successfully instantiates WinSCP to do the transfer, and the file does indeed download.  Once the download finishes, WinSCP closes out and any file locks are released.

    Next, the code checks whether the file exists and gets its length and last-write time.  All methods return: the file exists, it is over 2 megabytes, and it has a last-write date set.  The last-write timestamp matches the timestamp the file has on the FTP site.

    Here is the problem.  When attempting to open the file as a FileStream and read one byte, an exception occurs: the "can't find file" exception.  I cannot read the file.
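For reference, the failing read can be boiled down to a small helper like the one below.  This is my own minimal sketch (the method name and structure are mine, not from the actual function code); it shows the open-then-read-one-byte pattern described above and the exception type involved:

```csharp
using System;
using System.IO;

// Hypothetical helper: attempt to open a file and read its first byte.
// Returns false when the open fails with FileNotFoundException, even though
// File.Exists may have just returned true for the same path.
public static bool TryReadFirstByte(string path)
{
    try
    {
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            return stream.ReadByte() >= 0; // ReadByte returns -1 for an empty file
        }
    }
    catch (FileNotFoundException)
    {
        // This is the surprising case: metadata checks succeed but the open fails.
        return false;
    }
}
```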

    The catch is: if I leave the files in the Azure Function's D:\ location for a few days, the function can then read them.

    When the file is actually readable, I can successfully read it, create an image from the data, and even run a Java program via System.Diagnostics.Process using a version of Java from the Azure runtime environment; Java runs and more image files are made.  I am also able to read the file and transfer the data to my website's blob storage.

    Usually the function cannot read the file directly.  But from within the console, I can run the jar file and it works against the file.

    I am at a loss as to why I can't read this file.

    Edit: If I completely kill the function host and restart it after I transfer the FTP files, the rendering function will then work.  But newly transferred files will not.

    Edit: If I use d:\local\temp, then check for and create d:\local\temp\mca and FTP files into that mca folder, the functions can read the data and process it as expected.  I just can't do the same when using d:\home\site\wwwroot\data\mca (where \data\mca was created by me).

    Friday, March 16, 2018 2:35 AM

Answers

  • I don't think it's as simple as you state it. For example, take the following function:

    using System;
    using System.IO;
    
    public static void Run(TimerInfo myTimer, TraceWriter log)
    {
        log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
        var filePath = Path.Combine(Environment.ExpandEnvironmentVariables("%HOME%"), @"site\wwwroot", "testfile.txt");
        
        File.WriteAllText(filePath, $"Writing to file at {DateTime.Now}.");
        var read = File.ReadAllText(filePath);
        log.Info($"Read from file: {read}");
        
    }
    


    It produces these logs:

    2018-03-19T18:57:05.004 [Info] Function started (Id=a87d18c3-b719-4cd5-84b5-d5461bf3a5a5)
    2018-03-19T18:57:05.004 [Info] C# Timer trigger function executed at: 3/19/2018 6:57:05 PM
    2018-03-19T18:57:05.082 [Info] Read from file: Writing to file at 3/19/2018 6:57:05 PM.
    2018-03-19T18:57:05.082 [Info] Function completed (Success, Id=a87d18c3-b719-4cd5-84b5-d5461bf3a5a5, Duration=90ms)
    2018-03-19T18:57:10.012 [Info] Function started (Id=fcfdb4f0-766f-4b18-ad39-31dbaad99088)
    2018-03-19T18:57:10.012 [Info] C# Timer trigger function executed at: 3/19/2018 6:57:10 PM
    2018-03-19T18:57:10.075 [Info] Read from file: Writing to file at 3/19/2018 6:57:10 PM.
    2018-03-19T18:57:10.075 [Info] Function completed (Success, Id=fcfdb4f0-766f-4b18-ad39-31dbaad99088, Duration=77ms)

    This demonstrates that a function can read and write from the home drive just fine.

    I think the problem is associated with the FTP client, possibly with how it interacts with Azure Files.

    Monday, March 19, 2018 6:58 PM

All replies

  • Is this a consumption function app? In that case, files under d:\home\ are stored in Azure Files. Perhaps there is something about your access pattern that does not work right with Azure Files. You mentioned that you were able to get things working from temp, which makes sense, as that file system is local to the VM. Has this unblocked you? Is there some specific reason you need this to work using wwwroot? If it's important for the file to end up in wwwroot, have you tried an approach where you copy the file from temp to wwwroot once the download is complete?
    • Edited by Paul Batum Friday, March 16, 2018 11:09 PM
    Friday, March 16, 2018 11:07 PM
  • It is a consumption plan.  d:\home seems to support writes but not reads, if the file was put there by the function itself.  Since it is backed by Azure Files (storage), I can see delayed syncing being an issue.

    I had not tried moving the file from temp to home to see whether the file becomes readable from home.

    I'm under the assumption that d:\local could go away as soon as the function exits, or when the function host restarts and/or scales out to another process/host.

    Since I can read using d:\local, I'll need to change my strategy.  I'll need to FTP a file, process it, and get the data out to blob storage in one go, then check how much time I have left and repeat.  A second function will then pull from blob storage and finish the image processing.  I could make these functions more efficient if the intermediate state of the downloaded files were slightly more persistent.
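The download-process-upload loop with a time budget can be sketched as below.  This is my own outline of the strategy, not the poster's code; the delegate names (downloadNext, process, uploadToBlob) are placeholders so the sketch stays independent of any FTP or storage SDK:

```csharp
using System;

// Hypothetical orchestration: repeat download -> process -> upload while
// enough of the function's time budget remains. Returns how many files
// were handled in this invocation.
public static int ProcessWithTimeBudget(
    Func<string> downloadNext,            // returns a local file path, or null when done
    Func<string, byte[]> process,         // turns the downloaded file into output data
    Action<string, byte[]> uploadToBlob,  // persists output under a blob name
    TimeSpan budget)
{
    var started = DateTime.UtcNow;
    int handled = 0;
    while (DateTime.UtcNow - started < budget)
    {
        var localPath = downloadNext();
        if (localPath == null) break;      // nothing left to fetch
        var output = process(localPath);
        uploadToBlob(localPath + ".out", output);
        handled++;
    }
    return handled;
}
```

A real version would set the budget comfortably under the consumption plan's function timeout so a run never gets cut off mid-file.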


    Saturday, March 17, 2018 2:23 AM
  • After re-configuring to use d:\local\temp, my question is:

    Why can code running in an Azure Function write to the d:\home\ drive but then not read from it?  Is this by design?

    Sunday, March 18, 2018 6:40 PM
  • It has to be the way WinSCP interacts with d:\home.  I'll do another test, since I can get the code to work in the d:\local arena.  As previously suggested, I'll copy the file from local "temp" to home to see if I can read/write it.  So far only 11 cents' worth of IO :)

    Thanks for your help and suggestions,

    Dave

    Monday, March 19, 2018 9:47 PM
  • Wanted to follow up.  When working with WinSCP, I have found that receiving files into the d:\home directory area will write the files, but any read immediately after, or even within a few days, causes a "file not found" error.

    I've confirmed that if the file is transferred to the d:\local directory area and then moved into the d:\home area, full file access is granted.
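For anyone hitting the same thing, the working sequence (receive into the VM-local temp area, then move under home) can be sketched like this.  The helper name and parameters are illustrative, not the actual code; a caller would pass the function app's %HOME% directory as the root:

```csharp
using System;
using System.IO;

// Hypothetical helper: take a file that was downloaded to VM-local temp
// storage and move it under the given root (e.g. the %HOME% directory),
// creating the target directory if needed.
public static string MoveUnderRoot(string localTempFile, string root, string relativeDir)
{
    var targetDir = Path.Combine(root, relativeDir);
    Directory.CreateDirectory(targetDir); // no-op if it already exists
    var target = Path.Combine(targetDir, Path.GetFileName(localTempFile));
    File.Copy(localTempFile, target, overwrite: true);
    File.Delete(localTempFile); // done with the VM-local copy
    return target;
}
```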

    At least this works for me, so hopefully anyone being vexed by this will find it.  Thanks again for the help!  This makes the functions work as intended now - very happy.

    • Proposed as answer by DVishal Thursday, January 31, 2019 6:25 PM
    Friday, March 23, 2018 5:06 PM
  • Great! Glad to hear you got this working. 
    Friday, March 23, 2018 9:52 PM