How to get .csv files in a data lake into an Azure SQL database using PowerShell?

    Question

  • I have a PowerShell script that walks through folders and loads the .csv files it finds into a SQL table (a simplified sketch of the script is below). The problem is that the script expects to be run from the directory containing the files; that is, you have to "cd" into the right directory and then execute it. Is there any way to simulate this against a data lake? If not, what can I do to run this script and have it go through the data lake? I've read up on all sorts of things: Azure Automation, Azure Cloud Shell, etc., but I've hit dead ends on all of them.

    I'm looking for just a general strategy to explore. Perhaps I need to discard the script and switch to Data Factory, or something like that? 
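
    For reference, here is a simplified sketch of what the script does today. The table name, connection string, and column names are placeholders rather than the real values, and the insert method (Invoke-Sqlcmd from the SqlServer module) is just one way the loading could be done:

        # Assumes it is run from the folder that holds the .csv files.
        Import-Module SqlServer

        # Placeholder connection string for the Azure SQL database.
        $connectionString = "Server=tcp:myserver.database.windows.net;Database=mydb;User ID=me;Password=<password>"

        Get-ChildItem -Path . -Filter *.csv | ForEach-Object {
            # Read each CSV and insert its rows into the target table.
            Import-Csv -Path $_.FullName | ForEach-Object {
                $query = "INSERT INTO dbo.MyTable (Col1, Col2) VALUES ('$($_.Col1)', '$($_.Col2)')"
                Invoke-Sqlcmd -ConnectionString $connectionString -Query $query
            }
        }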

    Monday, June 25, 2018 8:27 PM

All replies