batch processing


  • I am working for an energy provider company. Currently, we generate about 1 GB of data per day in the form of flat files. We have decided to use Azure Data Lake Store to store our data, on which we want to run batch processing on a daily basis. My question is: what is the best way to transfer the flat files into Azure Data Lake Store? And once the data is pushed into Azure, is it a good idea to process it with HDInsight Spark, e.g. the DataFrame API or Spark SQL, and finally visualize it with Azure?
    Thursday, May 3, 2018 8:36 AM

All replies