How to delete/rename files/folders in Azure Data Lake and Blob Storage using Spark Scala?


  • I am using a Databricks Scala notebook to process files from Azure Data Lake and write the results back to Data Lake and Blob Storage. I see some unwanted log files stored alongside the data files, so I need a Scala-based way to rename/delete files and folders in Azure Data Lake and Blob Storage that can be executed from within the notebook.
    Tuesday, July 10, 2018 10:50 AM

All replies

  • For Azure Data Lake, you can try to rename or delete a file by calling these REST endpoints from Spark Scala:

    1. Rename a file
    2. Delete a file

    Please let me know if that helps.
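
    To make the two endpoints above concrete, here is a minimal sketch of building the ADLS Gen1 (WebHDFS-compatible) REST URLs for the RENAME and DELETE operations. The account name and paths are placeholders, and in a real notebook you would send these as authenticated HTTP PUT/DELETE requests with a Bearer token.

```scala
// Sketch: URL builders for the ADLS Gen1 WebHDFS-style REST API.
// "myaccount" and the paths below are hypothetical placeholders.
object AdlsRestUrls {
  // Base endpoint for an ADLS Gen1 account.
  def base(account: String): String =
    s"https://$account.azuredatalakestore.net/webhdfs/v1"

  // URL for the RENAME operation: moves `src` to `dest` (issued as HTTP PUT).
  def renameUrl(account: String, src: String, dest: String): String =
    s"${base(account)}$src?op=RENAME&destination=$dest"

  // URL for the DELETE operation; recursive=true also removes folders
  // (issued as HTTP DELETE).
  def deleteUrl(account: String, path: String, recursive: Boolean): String =
    s"${base(account)}$path?op=DELETE&recursive=$recursive"
}

object Demo extends App {
  println(AdlsRestUrls.renameUrl("myaccount", "/data/old.csv", "/data/new.csv"))
  println(AdlsRestUrls.deleteUrl("myaccount", "/data/_logs", recursive = true))
  // From a notebook, open these URLs with java.net.HttpURLConnection (or any
  // HTTP client) and set the Authorization: Bearer <token> header.
}
```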

    Tuesday, July 10, 2018 8:49 PM
  • I have a process using Azure Databricks that writes out to the Data Lake in Parquet, and I use the following to drop the top-level folder created by the Parquet write:

    dbutils.fs.rm("adl://<FOLDER_PATH>", true)
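
    Since the original question mentions unwanted log files sitting next to the data, a variation on the above is to list the output folder and remove only Spark's metadata/marker files rather than the whole folder. The predicate below is plain Scala; the commented `dbutils` lines sketch how it might be combined with `dbutils.fs.ls`/`dbutils.fs.rm` in a Databricks notebook (the folder path is a placeholder).

```scala
// Sketch: selectively removing Spark's metadata files from an output folder.
object CleanupHelpers {
  // True for marker/metadata files Spark (and Databricks DBIO) commonly
  // writes alongside parquet output: _SUCCESS, _started_*/_committed_*
  // transaction markers, and .crc checksum files.
  def isMetadataFile(name: String): Boolean =
    name == "_SUCCESS" ||
      name.startsWith("_started")   ||
      name.startsWith("_committed") ||
      name.endsWith(".crc")

  // In a Databricks notebook (hypothetical folder path):
  //   dbutils.fs.ls("adl://<FOLDER_PATH>")
  //     .filter(f => CleanupHelpers.isMetadataFile(f.name))
  //     .foreach(f => dbutils.fs.rm(f.path))
}
```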

    Friday, November 23, 2018 10:34 AM