How to connect an ADLS account to an Azure VM using authentication methods

    Question

  • I followed this blog post https://blogs.msdn.microsoft.com/arsen/2016/08/05/accessing-azure-data-lake-store-using-webhdfs-with-oauth2-from-spark-2-0-that-is-running-locally/ to connect ADLS storage to my Azure VM.

    Here is my core-site.xml:

    <configuration>
    	<property>
    		<name>dfs.webhdfs.oauth2.enabled</name>
    		<value>true</value>
    	</property>
    	<property>
    		<name>dfs.webhdfs.oauth2.access.token.provider</name>
    		<value>org.apache.hadoop.hdfs.web.oauth2.ConfRefreshTokenBasedAccessTokenProvider</value>
    	</property>
    	<property>
    		<name>dfs.webhdfs.oauth2.refresh.url</name>
    		<value>https://login.windows.net/tenant-id-here/oauth2/token</value>
    	</property>
    	<property>
    		<name>dfs.webhdfs.oauth2.client.id</name>
    		<value>Client id</value>
    	</property>
    	<property>
    		<name>dfs.webhdfs.oauth2.refresh.token.expires.ms.since.epoch</name>
    		<value>0</value>
    	</property>
    	<property>
    		<name>dfs.webhdfs.oauth2.refresh.token</name>
    		<value>Refresh token</value>
    	</property>
    </configuration>
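
    For context, my application writes through the Hadoop FileSystem API; with the configuration above, creating a file over swebhdfs would look roughly like the sketch below (the account name and path are placeholders, not my real values):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class SwebhdfsWriteSketch {
        public static void main(String[] args) throws Exception {
            // Loads core-site.xml from the classpath, including the OAuth2 settings above.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(
                    URI.create("swebhdfs://youraccount.azuredatalakestore.net:443/"), conf);
            // Placeholder path, mirroring the library layout my application uses.
            Path dir = new Path("/clusters/myapp/library/sample");
            fs.mkdirs(dir);
            try (FSDataOutputStream out = fs.create(new Path(dir, "data"))) {
                out.writeBytes("sample payload");
            }
        }
    }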

    I have installed my application on an Azure VM, and I get the following error when I upload a file in my application.

    2017-01-27 12:54:25.963 GMT+0000 WARN  [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to write data file partID: 0 at:  library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90
    java.io.IOException: Mkdirs failed to create file:/clusters/myapp/library/51dc056c0a634beba243120501fe70d6/545ca95c2a894f948b1f5184b013a53e/5c68d893090f471d81f3cdfc810bc4f7/b6d5ceb64bfd4d65ba4ea24d24f99e90/data (exists=false, cwd=file:/home/palmtree/work/software/myapp-2.5-SNAPSHOT/myapp)
    	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:450)
    	at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:435)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
    	at parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:150)
    	at parquet.hadoop.ParquetWriter.<init>(ParquetWriter.java:176)
    	at parquet.avro.AvroParquetWriter.<init>(AvroParquetWriter.java:93)
    	at com.myapp.hadoop.common.PaxParquetWriterImpl.doWriteRow(PaxParquetWriterImpl.java:52)
    	at com.myapp.hadoop.common.PaxParquetWriterImpl.access$000(PaxParquetWriterImpl.java:19)
    	at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:43)
    	at com.myapp.hadoop.common.PaxParquetWriterImpl$1.run(PaxParquetWriterImpl.java:40)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:422)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    	at com.myapp.hadoop.common.PaxParquetWriterImpl.writpeRow(PaxParquetWriterImpl.java:40)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at com.myapp.hadoop.core.DistributionManager$$anon$10.invoke(DistributionManager.scala:313)
    	at com.sun.proxy.$Proxy56.writeRow(Unknown Source)
    	at com.myapp.library.stacks.DataFileWriter.write(DataFileWriter.java:49)
    	at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:747)
    	at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
    	at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
    	at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
    	at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
    	at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
    	at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
    	at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    2017-01-27 12:54:25.966 GMT+0000 WARN  [admin-1fd467a4c41f43fe9f30ab446a5c93ac-84-b6792518109848bead029c9144603d04-libraryService.importDataFiles] LibraryImpl - Failed to import acquisition da73b76755c34c74a1643a324e41e156
    com.myapp.iface.service.RequestFailedException
    	at com.myapp.library.LibraryImpl.pullImportData(LibraryImpl.java:754)
    	at com.myapp.library.LibraryImpl.importDataFile(LibraryImpl.java:631)
    	at com.myapp.frontend.server.LibraryAPI.importDataFile(LibraryAPI.java:269)
    	at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFile(LibraryWebSocketDelegate.java:189)
    	at com.myapp.frontend.server.LibraryWebSocketDelegate.importDataFiles(LibraryWebSocketDelegate.java:204)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at com.myapp.frontend.util.PXWebSocketProtocolHandler$PXMethodHandler.call(PXWebSocketProtocolHandler.java:144)
    	at com.myapp.frontend.util.PXWebSocketEndpoint.performMethodCall(PXWebSocketEndpoint.java:284)
    	at com.myapp.frontend.util.PXWebSocketEndpoint.access$200(PXWebSocketEndpoint.java:47)
    	at com.myapp.frontend.util.PXWebSocketEndpoint$1.run(PXWebSocketEndpoint.java:169)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    

    Kindly help me solve this.

    Friday, January 27, 2017 12:58 PM

All replies

  • Firstly, we really do not recommend that you use the swebhdfs path.  As called out in Arsen’s blog, the adl client is much more performant.  Here are directions for configuring the adl filesystem: 

    https://hadoop.apache.org/docs/current/hadoop-azure-datalake/index.html

     

    For your specific error, it looks like the mkdir is being invoked on the local file system, as shown by the "file:" prefix in the path reported by the Mkdirs error.
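
    A quick way to confirm which filesystem a path resolves to is a small check like this (a sketch; it assumes your application's core-site.xml is on the classpath, and the path is shortened from your log):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WhichFileSystem {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(); // loads core-site.xml from the classpath
            Path target = new Path("/clusters/myapp/library");
            FileSystem fs = target.getFileSystem(conf);
            // A file:/// URI with LocalFileSystem/ChecksumFileSystem here means the path is
            // resolving to local disk rather than to the ADLS account.
            System.out.println(fs.getUri() + " -> " + fs.getClass().getName());
        }
    }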

     

    To solve the error, follow the steps mentioned in Arsen’s blog. After configuration, run an hdfs command against the swebhdfs path, for example:

    bin/hadoop fs -ls swebhdfs://avdatalake2.azuredatalakestore.net:443/

     

    One more thing: since that blog was posted, Azure Data Lake has added full support for the Java SDK. Here is an article that describes how to use the Java SDK to perform basic file operations:

     

    https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-java-sdk
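
    As a rough illustration (the article has the full walkthrough), basic file operations with the Java SDK look along these lines; the tenant, client id, key, account name, and paths below are placeholders:

    import java.io.PrintStream;
    import com.microsoft.azure.datalake.store.ADLStoreClient;
    import com.microsoft.azure.datalake.store.IfExists;
    import com.microsoft.azure.datalake.store.oauth2.AccessTokenProvider;
    import com.microsoft.azure.datalake.store.oauth2.ClientCredsTokenProvider;

    public class AdlsSdkSketch {
        public static void main(String[] args) throws Exception {
            // Placeholders: fill in your own tenant id, client id, client key, and account name.
            String authEndpoint = "https://login.windows.net/<tenant-id>/oauth2/token";
            AccessTokenProvider tokenProvider =
                    new ClientCredsTokenProvider(authEndpoint, "<client-id>", "<client-key>");
            ADLStoreClient client =
                    ADLStoreClient.createClient("youraccount.azuredatalakestore.net", tokenProvider);

            // Create a folder and write a small file through the SDK.
            client.createDirectory("/clusters/demo");
            try (PrintStream out = new PrintStream(
                    client.createFile("/clusters/demo/hello.txt", IfExists.OVERWRITE))) {
                out.println("hello from the ADLS Java SDK");
            }
        }
    }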

     

    -- Cathy

    Tuesday, January 31, 2017 8:55 PM
  • I tried the following to connect my application on the Azure VM to ADLS:

    1. Added the azure-data-lake-store-sdk JAR to lib.

    2. Followed the Service-to-service authentication documentation to create an application in Azure Active Directory.


    3. Updated core-site.xml with the values from the documentation above (a small verification sketch follows the JAR list at the end of this post).


    <configuration>
      <property>
        <name>dfs.adls.home.hostname</name>
        <value>dev.azuredatalakestore.net</value>
      </property>
      <property>
        <name>dfs.adls.home.mountpoint</name>
        <value>/clusters</value>
      </property>

      <property>
        <name>fs.adl.impl</name>
        <value>org.apache.hadoop.fs.adl.AdlFileSystem</value>
      </property>

      <property>
        <name>fs.AbstractFileSystem.adl.impl</name>
        <value>org.apache.hadoop.fs.adl.Adl</value>
      </property>

      <property>
        <name>dfs.adls.oauth2.refresh.url</name>
        <value>https://login.windows.net/[tenantId]/oauth2/token</value>
      </property>

      <property>
        <name>dfs.adls.oauth2.client.id</name>
        <value>[CLIENT ID]</value>
      </property>

      <property>
        <name>dfs.adls.oauth2.credential</name>
        <value>[CLIENT KEY]</value>
      </property>

      <property>
        <name>dfs.adls.oauth2.access.token.provider.type</name>
        <value>ClientCredential</value>
      </property>

      <property>
      <name>fs.azure.io.copyblob.retry.max.retries</name>
        <value>60</value>
      </property>

      <property>
        <name>fs.azure.io.read.tolerate.concurrent.append</name>
        <value>true</value>
      </property>

      <property>
        <name>fs.defaultFS</name>
        <value>adl://dev.azuredatalakestore.net</value>
        <final>true</final>
      </property>

      <property>
        <name>fs.trash.interval</name>
        <value>360</value>
      </property>

    </configuration>


    4. I get the following error when I start my application server on the VM:

    2017-02-02 07:40:27.527 GMT+0000 INFO  [main] DistributionManager - Looking for class loader for distroName=adl kerberized=false
    2017-02-02 07:40:28.428 GMT+0000 ERROR [main] SimpleHdfsFileSystem - Failed to initialize HDFS file storage on null as hdfs root /myapp
    org.apache.hadoop.security.AccessControlException: Unauthorized
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
        at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
        at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
        at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
        at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
        at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
        at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
        at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
        at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
        at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
        at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
        at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
        at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
        at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
        at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
        at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
    2017-02-02 07:40:28.462 GMT+0000 WARN  [main] server - HQ222113: On ManagementService stop, there are 1 unexpected registered MBeans: [core.acceptor.dc9ff2aa-e91a-11e6-9a51-09b76b4431e6]
    2017-02-02 07:40:28.479 GMT+0000 INFO  [main] server - HQ221002: HornetQ Server version 2.5.0.SNAPSHOT (Wild Hornet, 124) [7039110c-dd57-11e6-b90d-2bc6685808f5] stopped
    2017-02-02 07:40:28.480 GMT+0000 ERROR [main] FrontendServer - Fatal error trying to start server
    java.lang.RuntimeException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
        at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:44)
        at com.myapp.jetty.FrontendServer.main(FrontendServer.java:124)
    Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.streaming.files.UploadFileServiceImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:609)
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:469)
        at com.myapp.container.PxBeanContext.startup(PxBeanContext.java:42)
        ... 1 more
    Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.myapp.library.LibraryStorageImpl#0' defined in class path resource [system-config.xml]: Invocation of init method failed; nested exception is java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1455)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:519)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:456)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:294)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:225)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:291)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:274)
        at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1106)
        at com.myapp.container.PxBeanContext.getBean(PxBeanContext.java:156)
        at com.myapp.library.streaming.files.UploadFileServiceImpl.initialize(UploadFileServiceImpl.java:49)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
        ... 11 more
    Caused by: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Unauthorized
        at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:45)
        at com.myapp.hadoop.hdp2.HadoopDistributionImpl.initializeHdfs(HadoopDistributionImpl.java:63)
        at com.myapp.hadoop.hdp2.UnsecureHadoopDistributionImpl.connectToFileSystem(UnsecureHadoopDistributionImpl.java:22)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.myapp.hadoop.core.DistributionManager$$anon$1.invoke(DistributionManager.scala:135)
        at com.sun.proxy.$Proxy22.connectToFileSystem(Unknown Source)
        at com.myapp.library.LibraryStorageImpl.parseSimpleAuthFileSystem(LibraryStorageImpl.scala:126)
        at com.myapp.library.LibraryStorageImpl.initializeStorageWithPrefix(LibraryStorageImpl.scala:64)
        at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:39)
        at com.myapp.library.LibraryStorageImpl.initialize(LibraryStorageImpl.scala:33)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1581)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1522)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
        ... 28 more
    Caused by: org.apache.hadoop.security.AccessControlException: Unauthorized
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:347)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$200(WebHdfsFileSystem.java:98)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.runWithRetry(WebHdfsFileSystem.java:623)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.access$100(WebHdfsFileSystem.java:472)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner$1.run(WebHdfsFileSystem.java:502)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$AbstractRunner.run(WebHdfsFileSystem.java:498)
        at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.mkdirs(WebHdfsFileSystem.java:919)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
        at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:98)
        at com.myapp.hadoop.common.HdfsFileSystem$1.run(HdfsFileSystem.java:91)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at com.myapp.hadoop.common.HdfsFileSystem.__initialize(HdfsFileSystem.java:91)
        at com.myapp.hadoop.common.SimpleHdfsFileSystem.initialize(SimpleHdfsFileSystem.java:40)
        ... 47 more

    I am using the following JARs in my project:

    azure-data-lake-store-sdk-2.1.4.jar
    commons-cli-1.2.jar
    commons-configuration-1.6.jar
    hadoop-auth-2.7.1.jar
    hadoop-azure-datalake-3.0.0-alpha1.jar
    hadoop-common-2.5-SNAPSHOT.jar
    hadoop-common-2.7.1.jar
    hadoop-hdfs-2.7.3.jar
    hadoop-hdp2-2.5-SNAPSHOT.jar
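
    To check whether the adl filesystem is actually picked up with this configuration and these JARs, here is the small verification sketch mentioned above (account name and mount point taken from core-site.xml; the listing path is otherwise a placeholder):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AdlConfigCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(); // picks up core-site.xml from the classpath
            FileSystem fs = FileSystem.get(URI.create("adl://dev.azuredatalakestore.net/"), conf);
            // If the adl configuration is honored, this prints AdlFileSystem,
            // not WebHdfsFileSystem as seen in the stack trace above.
            System.out.println(fs.getClass().getName());
            for (FileStatus status : fs.listStatus(new Path("/clusters"))) {
                System.out.println(status.getPath());
            }
        }
    }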


    Thursday, February 2, 2017 8:43 AM
  • I have also granted the Azure AD application permissions on the ADLS account root directory.

    Root directory -> /clusters/myapp

    I am using rootDirectory=/clusters/sample and user=hdfs to access the Data Lake Store.
    Thursday, February 2, 2017 9:02 AM
  • 1. My intention is to connect my application on the Azure VM to the Data Lake Store without needing an HDInsight cluster. Is that possible? If so, what steps should I follow, and what configuration needs to be present in core-site.xml?

    2. File preview fails with an AccessControlException error in ADLS:

    1. Log in to the HDInsight cluster associated with the Data Lake Store using the ssh command: ssh [user]@[cluster2]-ssh.azurehdinsight.net
    2. Copy a file to the cluster using the wget command: wget http://www.sample-videos.com/csv/Sample-Spreadsheet-10-rows.csv
    3. Create a new folder in your Data Lake Store account.
    4. Upload the file using the put command:
    hdfs dfs -put Sample-Spreadsheet-10-rows.csv adl://dev2.azuredatalakestore.net/new
    5. View the file in the Azure Portal.

    Actual result: the file is uploaded and shows in the Azure Portal, but the file preview is broken and I see the error below:

    AccessControlException
    OPEN failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.). [4f97235c-0852-44c8-a8d4-cbe190ffdb34]

    How do I solve this issue?

    Thursday, February 2, 2017 11:55 AM
  • Hi Karan,

    I just responded to your post on Stack Overflow:

    http://stackoverflow.com/questions/41936868/how-to-connect-adls-account-with-azure-vm

    Let's continue the conversation there.

    Wednesday, February 15, 2017 7:40 PM