Unable to mount the Azure storage container to Databricks with SAS key (even the Microsoft sample is not working)

  • Question

  • Hi,

    I am using this sample to mount the Azure storage container to Databricks:

    val storageAccount = "dropoff"
    val container = "qadropoff"
    val sasKey = "?<mysaskey>" // SAS token taken from Azure Storage Explorer (value redacted)
    val mountPoint = s"/mnt/qa"
    
    try {
      dbutils.fs.unmount(s"$mountPoint") // Use this to unmount as needed
    } catch {
      case ioe: java.rmi.RemoteException => println(s"$mountPoint already unmounted")
    }

    val sourceString = s"wasbs://$container@$storageAccount.blob.core.windows.net/"
    val confKey = s"fs.azure.sas.$container.$storageAccount.blob.core.windows.net"

    dbutils.fs.mount(
      source = sourceString,
      mountPoint = mountPoint,
      extraConfigs = Map(confKey -> sasKey)
    )

    It throws this error:

    /mnt/qa already unmounted

    java.lang.NullPointerException
      at shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem.getAncestor(NativeAzureFileSystem.java:2628)
      at shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem.mkdirs(NativeAzureFileSystem.java:2660)
      at shaded.databricks.org.apache.hadoop.fs.azure.NativeAzureFileSystem.mkdirs(NativeAzureFileSystem.java:2646)
      at com.databricks.s3a.S3AExeceptionUtils$.convertAWSExceptionToJavaIOException(DatabricksStreamUtils.scala:119)
      at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withUserContextRecorded(DatabricksFileSystemV2.scala:933)
      at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.mkdirs(DatabricksFileSystemV2.scala:763)
      at com.databricks.backend.daemon.data.client.DatabricksFileSystem.mkdirs(DatabricksFileSystem.scala:194)
      at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1881)
      at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:398)
      at com.databricks.dbutils_v1.impl.DbfsUtilsImpl.mount(DbfsUtilsImpl.scala:85)
      ... (remaining Databricks usage-logging, notebook REPL, and driver wrapper frames elided)
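
    A NullPointerException out of NativeAzureFileSystem.getAncestor during a mount is frequently caused by a SAS token that does not grant the Read and List permissions the mount needs in order to enumerate the container. One quick check that can be run anywhere (plain Scala, no Databricks cluster required) is to inspect the sp (signed permissions) field of the token; the helper name below is made up for illustration, and the parsing assumes the standard SAS query-string format:

```scala
// Sanity check: mounting a WASB container needs a SAS that grants at least
// Read (r) and List (l). The "sp" field of a SAS query string encodes the
// granted permissions, e.g. sp=rwdlacup.
// NOTE: sasHasMountPermissions is a hypothetical helper, not a Databricks API.
def sasHasMountPermissions(sasToken: String): Boolean = {
  val params = sasToken.stripPrefix("?").split("&").flatMap { kv =>
    kv.split("=", 2) match {
      case Array(k, v) => Some(k -> v)
      case _           => None
    }
  }.toMap
  params.get("sp").exists(p => p.contains("r") && p.contains("l"))
}

println(sasHasMountPermissions("?sv=2019-02-02&ss=bfqt&sp=rwdlacup&sig=xxx")) // true
println(sasHasMountPermissions("?sv=2019-02-02&sp=rw&sig=xxx"))               // false
```

    If the check fails, regenerating the SAS at the container level with Read and List enabled is the first thing to try.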

    Thursday, March 26, 2020 12:38 AM

All replies

  • Hi Rajaniesh,

    I’m able to mount the Azure storage container to Azure Databricks without any issue.

    val storageAccount = "cheprasas"
    val container = "carona"
    val sasKey = "?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-03-26T11:32:02Z&st=2020-03-26T03:32:02Z&spr=https&sig=QsXXXXXXXXXXXXXXXXXXXXXXXX5PHQo5HL9JM%3D"
    val mountPoint = s"/mnt/qa"
    
    try {
      dbutils.fs.unmount(s"$mountPoint") // Use this to unmount as needed
    } catch {
      case ioe: java.rmi.RemoteException => println(s"$mountPoint already unmounted")
    }
     
    val sourceString = s"wasbs://$container@$storageAccount.blob.core.windows.net/"
    val confKey = s"fs.azure.sas.$container.$storageAccount.blob.core.windows.net"
     
      dbutils.fs.mount(
        source = sourceString,
        mountPoint = mountPoint,
        extraConfigs = Map(confKey -> sasKey)
      )

    I used the same code you are running, and I was able to mount the container and see the files that exist in the Azure storage container.

    Hope this helps. Do let us know if you have any further queries.
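
    Two other things worth ruling out before opening a support ticket are the token's validity window (an expired or not-yet-valid SAS fails in similarly opaque ways) and whether the token was generated at the container level. The expiry check can be done offline; this is a sketch in plain Scala, assuming an unencoded ISO-8601 se= field as in the token above, with an illustrative helper name:

```scala
import java.time.Instant

// Sketch: pull the "se" (signed expiry) field out of a SAS token and compare
// it with a given instant. Assumes "se" is an unencoded ISO-8601 timestamp;
// some tools percent-encode the colons, which would need decoding first.
def sasIsExpired(sasToken: String, now: Instant): Boolean = {
  val expiry = sasToken.stripPrefix("?").split("&").collectFirst {
    case kv if kv.startsWith("se=") => Instant.parse(kv.drop(3))
  }
  expiry.forall(e => !e.isAfter(now)) // a missing "se" field is treated as expired
}

val token = "?sv=2019-02-02&se=2020-03-26T11:32:02Z&sp=rl&sig=xxx"
println(sasIsExpired(token, Instant.parse("2020-03-26T03:00:00Z"))) // false (still valid)
println(sasIsExpired(token, Instant.parse("2020-03-27T00:00:00Z"))) // true (expired)
```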


    Thursday, March 26, 2020 3:50 AM
  • Hi Cheekatlapradeep,

    Then I do not know why Azure Databricks is throwing this error; I have posted it above. If this code is correct, it should work everywhere.

    Are we missing something in the blob configuration? I have already set the permissions for my Azure AD account on the blob, but I am not sure whether I am missing some steps there. It would be great if you could look into this issue.

    Regards

    Rajaniesh


    Thursday, March 26, 2020 11:47 AM
  • Hi Rajaniesh,

    Thanks for your reply. For a deeper investigation and immediate assistance on this issue: if you have a support plan, you may file a support ticket; otherwise, please send an email to AzCommunity@Microsoft.com with the details below, so that we can create a one-time free support ticket for you to work closely on this matter.

    Subject of the email: <ATTN: PRADEEP - Unable to mount the Azure storage container to Databricks with SAS Key (Even Microsoft sample is not working)>

    Thread URL: Unable to mount the Azure storage container to Databricks with SAS Key (Even Microsoft sample is not working)

    Subscription ID: 

    Please let me know here once you have done the same.

    Friday, March 27, 2020 4:36 AM
  • Pradeep,

    I have created a support ticket for this case and the ID for the ticket is 120032824000618

    Regards

    Rajaniesh

    Saturday, March 28, 2020 4:55 PM
  • Hi Rajaniesh,

    Thanks for the update. Once the issue is sorted out with support, please do share the resolution here, as it might be beneficial to other community members reading this thread.

    Monday, March 30, 2020 3:59 AM