How to load an R file from Azure Storage?

  • Question

  • I saved a file called MyModelFile.R to Azure Storage. Now I would like to load it from inside an "Execute R Script" module like this:

            load(file = "<path to my Azure storage>/MyModelFile.R")

    How do I specify a path to Azure storage?

    Friday, November 18, 2016 2:40 PM

Answers

  • The SAS URL route is arguably more secure than accessing the blob through the storage name/key pair: you are not exposing the key to the entire storage account, and you can expire the SAS URL by time or simply revoke it for whatever reason. Plus, under the covers, the storage name/key pair route goes through an HTTP REST call anyway.
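    For reference, a minimal sketch of what that could look like inside "Execute R Script" (every <...> value is a placeholder for your own storage name, container, and generated SAS token):

        # Download the saved file over its SAS URL, then load() the objects it
        # contains. All <...> values are placeholders, not real endpoints.
        sas_url <- "https://<storagename>.blob.core.windows.net/<containername>/MyModelFile.R?<SAS token>"
        download.file(sas_url, destfile = "MyModelFile.R", mode = "wb")  # binary mode
        load("MyModelFile.R")  # restores whatever was stored with save()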
    • Marked as answer by Hai Ning Wednesday, November 30, 2016 4:38 PM
    Friday, November 18, 2016 9:45 PM

All replies

  • You either need to make the storage container public, or generate a SAS URL for that R file in the blob.

    The path should be something like this:

    https://<storagename>.blob.core.windows.net/<containername>/<path>/MyModelFile.R?<SAS token>

    Even then, I am not sure whether load() will work, as R code running in the sandbox doesn't have network access for the most part. But give it a try.
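    A quick way to test it from inside the module (a sketch with placeholder values; load() accepts a connection, so the URL can be passed via url() directly):

        # Try load() against the SAS URL and report if the sandbox blocks it.
        sas_url <- "https://<storagename>.blob.core.windows.net/<containername>/<path>/MyModelFile.R?<SAS token>"
        tryCatch(
          load(url(sas_url)),
          error = function(e) message("Blocked by the sandbox? ", conditionMessage(e))
        )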

    Friday, November 18, 2016 3:24 PM
  • I had not considered accessing the file via HTTP, but this is an interesting thought; I will give it a try. However, this is not a technique I can take into production: our team wants to access the file directly from Azure Storage, not via HTTP. I had considered using the "Import Data" module, but that will not work in this scenario.

    What I did successfully was to zip up the file, manually upload it into Azure Machine Learning, and then feed that zipped file to the "Execute R Script" module, which let me access every file in the zip from my R code. The problem is that I have to manually upload this zipped file every time I have a new version of it. If there were a module like "Import Data" that pulled files from Storage, that would definitely work. It could be called "Import R files from Storage".
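    For reference, a sketch of how the zipped file can be read once the bundle is attached, assuming the usual convention that ML Studio unpacks the Script Bundle zip under src/ in the working directory:

        # Files from the zip attached to the Script Bundle input port of
        # "Execute R Script" are unpacked under src/ in the working directory.
        load("src/MyModelFile.R")  # restores the objects saved with save()
        print(ls())                # show what the load brought in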
    • Edited by CarlosBarichello Friday, November 18, 2016 6:28 PM I had accidentally written S3 where I meant to write Azure storage.
    Friday, November 18, 2016 3:39 PM
  • Another thought for a module that could solve the problem: today we can manually upload a zip file into Azure ML and feed it to the "Execute R Script" module. Perhaps we could have a similar module that fetched the zipped file from Azure Storage instead of us having to upload it manually.
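    Until such a module exists, a hedged sketch of doing the fetch in R itself (the zip name and all <...> values are hypothetical, and this only works if the sandbox permits outbound network access):

        # Fetch the zip over a SAS URL, unpack it, then load the model file.
        zip_url <- "https://<storagename>.blob.core.windows.net/<containername>/model-files.zip?<SAS token>"
        download.file(zip_url, destfile = "model-files.zip", mode = "wb")
        unzip("model-files.zip", exdir = "model-files")
        load("model-files/MyModelFile.R")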
    Friday, November 18, 2016 3:41 PM
  • Will definitely try the SAS URL. But in keeping with the spirit of how things are done in ML Studio today, I think it would be nice to have a module similar to "Import Data" that let us read arbitrary files from Storage. Consistency in a tool is good.
    Monday, November 21, 2016 3:43 PM