Working with Azure Blobs Non-Locally

  • Question

  • Hi all,

    Is there a way to work with files stored in Azure blobs from Jupyter notebooks without downloading them locally? For example, we have some predictor-variable files that are gigabytes in size, and it does not seem efficient to download them from blob storage just to use them. Is there a way to treat the Azure storage account or the blobs as a directory?

    To give an example, here is some code I copied from a tutorial to access data from a blob; however, it downloads the file locally.

    from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK

    blob_account_name = "myaccountname"   # fill in your blob account name
    blob_account_key = "myaccountkey"     # fill in your blob account key
    mycontainer = "container"             # fill in the container name
    myblobname = "blobname"               # fill in the blob name
    filename = "filename"                 # fill in the output file name
    ref_file = "reference_China_20190215"

    blob_service = BlockBlobService(account_name=blob_account_name,
                                    account_key=blob_account_key)
    blob_service.get_blob_to_path(mycontainer, myblobname, filename)

    I understand that I can delete the file locally after using it, but that still seems inefficient when the files are large. Is there a way to access files within blobs without downloading them locally? Basically, I would like to reference the files by name without downloading them; instead of using ~/path_to_local_directory/filename it would be something like path_to_blob_in_azure/filename.
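    One thing I have tried since posting: the same BlockBlobService object also has a get_blob_to_stream method, which writes the blob's contents into an in-memory file object instead of a file on disk, so nothing is ever written locally. This is only a sketch assuming the legacy azure-storage SDK from the snippet above; the helper name read_blob_to_buffer and the container/blob names are my own placeholders.

    ```python
    import io

    def read_blob_to_buffer(blob_service, container, blob_name):
        """Return the blob's contents as an in-memory BytesIO buffer,
        avoiding any local file. `blob_service` is assumed to be a
        BlockBlobService instance (legacy azure-storage SDK)."""
        buf = io.BytesIO()
        # Stream the blob's bytes into the buffer instead of a file path
        blob_service.get_blob_to_stream(container, blob_name, buf)
        buf.seek(0)  # rewind so the buffer can be read like an open file
        return buf
    ```

    The returned buffer can then be passed to anything that accepts a file-like object, e.g. pandas.read_csv(buf), so the data still lives only in memory. Of course, for files that are gigabytes in size this trades disk for RAM rather than avoiding the transfer itself.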


    • Edited by carbonaiwri Sunday, February 17, 2019 6:12 PM
    Sunday, February 17, 2019 6:10 PM

All replies

  • Hi,

    Sorry to hear you are running into this issue. Could you please share the document you are referring to, or the source of the sample code?

    Thanks a lot.



    Wednesday, February 20, 2019 6:34 AM