Read a fixed-length file

  • Question

  • I have a fixed-length data file that I need to load into a SQL database.

    Input:  IAA21

    Output: Col1  Col2  Col3
            I     AA    21

    Can someone please help here? In case we need to use an Azure Function to do this, could someone share an example as well?

    I suppose I can use a data flow, substring the columns, and then add the delimiter. Please let me know if that is correct or if there is a better way to do it.
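    For reference, the substring approach described above can be sketched in Python. The column widths (1, 2, 2) are an assumption inferred from the sample row "IAA21":

    ```python
    # Split a fixed-width record into fields by character position.
    # Widths (1, 2, 2) are assumed from the sample row "IAA21".
    WIDTHS = [1, 2, 2]

    def split_fixed(line, widths=WIDTHS):
        """Return the substrings of `line` at the given fixed widths."""
        fields, pos = [], 0
        for w in widths:
            fields.append(line[pos:pos + w])
            pos += w
        return fields

    print(",".join(split_fixed("IAA21")))  # I,AA,21
    ```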



    Thanks, Sweta



    • Edited by sahusweta Monday, June 17, 2019 9:32 AM
    Thursday, June 13, 2019 3:24 PM

All replies

  • Hi Sweta,

    You have rightly figured out that a delimiter has to be added. You can use an Azure Function or a Custom activity for this.

    I would recommend using an Azure Function to convert the fixed-length file to a CSV file, which is easy to read. There are many resources out there that do this, e.g. https://github.com/metcalfalex/fixedwidth
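    As a minimal sketch of that conversion step (the column widths and header names below are assumptions based on the sample in the question, not part of any library):

    ```python
    import csv
    import io

    # Assumed layout from the sample record "IAA21"; adjust for real files.
    WIDTHS = [1, 2, 2]
    HEADER = ["Col1", "Col2", "Col3"]

    def fixed_to_csv(fixed_text, widths=WIDTHS, header=HEADER):
        """Convert fixed-width text (one record per line) to CSV text."""
        out = io.StringIO()
        writer = csv.writer(out)
        writer.writerow(header)
        for line in fixed_text.splitlines():
            pos, row = 0, []
            for w in widths:
                row.append(line[pos:pos + w].strip())
                pos += w
            writer.writerow(row)
        return out.getvalue()

    print(fixed_to_csv("IAA21"))
    ```

    The same logic would sit inside the Azure Function body, reading the source blob and writing the CSV result back to blob storage.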

    Once you have chosen a column and row delimiter, please try the following:

    • Create an input dataset to connect to the source blob. In the "Connection" tab of the dataset, click "Detect text format". It will parse the file and detect the column and row delimiters used in the file.

      If you want to rename the columns, you can do so in the "Schema" tab by clicking "Import schema".

    • Create a sink dataset pointing to the sink blob, and in the "Connection" tab set the file name in the file path in the form "<file name>.csv".


      Also, make sure the "Column Delimiter" is set to Comma (for CSV files).
    • Run the pipeline.

    Hope this helps. Let us know if you need any more assistance.


    Monday, June 17, 2019 10:12 AM