Unable to Mount CSV File from Azure Blob Storage into Azure Databricks Using R

All replies

  • Hello sadanandm2, and thank you for your question. Do you get an error message when you try to read? Could you please share your (redacted) code so I can try to reproduce the problem?
    Monday, October 21, 2019 5:21 PM
    Owner
  • I mounted the storage using the Python code below, but I cannot find how to do the same in [R]. If you could provide a link or any other pointer, that would be great.

    %python

    dbutils.fs.mount(
        source = "wasbs://test1@networkcsvstorage.blob.core.windows.net",
        mount_point = "/mnt/blobmount",
        extra_configs = {"fs.azure.sas.test1.networkcsvstorage.blob.core.windows.net": dbutils.secrets.get(scope = "*******", key = "*********")})

    df = spark.read.csv("/mnt/blobmount/20191015.csv", header=True)
    df.show()

    Tuesday, October 22, 2019 10:52 AM
  • Please find the code and the resulting error below:

    library(AzureStor)
    library(SparkR)

    blob_endp <- blob_endpoint(
        "https://xxxxxstorage.blob.core.windows.net/",
        key="zj1/xs6xxxxxxxxxxxxxx3Y4kJaCxxxxxxxxxxxxxxxxxxx6OdBrtA==")
    cont <- blob_container(blob_endp, "csvfile")
    storage_download(cont, "/cases.csv", "~/cases.csv",overwrite=TRUE)
    spData <- read.df("/cases.csv", source = "csv", header="true", inferSchema = "true",na.strings = "NA")

    Error:

    Error : Error in load : analysis error - Path does not exist: dbfs:/cases.csv;
    Tuesday, October 22, 2019 2:12 PM
  • Hi Sadanandm,

    Steps to mount Azure Blob Storage to DBFS:

    Step 1: Install and configure the Azure Databricks CLI.

    Step 2: Create a secret scope.

    Step 3: Mount the Azure Blob Storage container.

    Step 4: Access files in your container as if they were local files.
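    The first two steps can be sketched from a local shell as follows. This is a hedged outline: the scope name blob-scope and key name storage-key are placeholders, and the commands use the Databricks CLI syntax of the time. Steps 3 and 4 then happen in a notebook, as in the Python cell earlier in the thread.

```shell
# Step 1: install and configure the Azure Databricks CLI
pip install databricks-cli
databricks configure --token   # prompts for the workspace URL and a personal access token

# Step 2: create a secret scope and store the storage account key in it
databricks secrets create-scope --scope blob-scope
databricks secrets put --scope blob-scope --key storage-key   # opens an editor for the secret value
```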

    Hope this helps.      

    ----------------------------------------------------------------------------------------

    Do click on "Mark as Answer" and Upvote on the post that helps you, this can be beneficial to other community members.

    Wednesday, October 23, 2019 6:36 AM
    Owner
  • I had already mounted the storage blob using Python. I need the process to mount the blob using [R].

    I am using an [R] NOTEBOOK.

    If there is any other way to mount using [R], please let me know.

    Wednesday, October 23, 2019 9:07 AM
  • I have been informed that R is not capable of doing the actual mounting. The workaround is to mount using another language, as you have done. Afterwards, in R, you can do

    read.table("/mnt/blobmount/20191015.csv")


    I did not need to use AzureStor library.
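    One caveat worth noting (an assumption based on how DBFS is exposed, not something verified in this thread): base R functions such as read.table read through the local /dbfs FUSE mount rather than the Spark path, so the path may need a /dbfs prefix.

```r
# Base R reads go through the local FUSE mount, hence the /dbfs prefix
# (assumes the /mnt/blobmount mount point created earlier in Python):
local_df <- read.csv("/dbfs/mnt/blobmount/20191015.csv")
head(local_df)
```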
    Wednesday, October 23, 2019 7:18 PM
    Owner
  • Hi Sadanandm,

    Note: File system utilities are not available in R notebooks; however, you can use a language magic command to invoke those dbutils methods in R and SQL notebooks. For example, to list the Azure Databricks Datasets DBFS folder in an R or SQL notebook, run the command:

    %python
    
    dbutils.fs.ls("/databricks-datasets")

    The two most commonly used libraries that provide an R interface to Spark are SparkR and sparklyr. Databricks notebooks and jobs support both packages, although you cannot use functions from both SparkR and sparklyr with the same object.

    Mount the container using Python, then read the file in the R notebook with the SparkR library.
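    As a sketch of the R side (assuming the /mnt/blobmount mount point created by a %python cell like the one earlier in the thread):

```r
library(SparkR)

# SparkR takes the Spark-style path on the mounted container:
df <- read.df("/mnt/blobmount/20191015.csv", source = "csv",
              header = "true", inferSchema = "true")
head(df)
```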

    Hope this helps.      


    Thursday, October 24, 2019 5:22 AM
    Owner
  • Hi sandanadm,

    Just checking in to see if the above answer helped. If it answers your query, please click “Mark as Answer” and up-vote it. If you have any further questions, do let us know.

    Friday, October 25, 2019 11:34 AM
    Owner
  • Hi Sandanadm,

    Following up to see if the above suggestion was helpful. If you have any further questions, do let us know.

    Wednesday, October 30, 2019 8:53 AM
    Owner