Switching csv data source post deploy

  • Question

  • Hello, I have just started tackling my first AAS project. I've read that smaller datasets should be used when developing the model with SSDT, and once you've deployed your model to AAS you can then switch your data source to point to your production (larger) one.

    In my case I envision developing my model with a 500MB csv and after deploying I'd like to switch the model to point to my production csv (10GB).

    I've searched and can't find any references as to how this can be achieved.

    Could someone point me in the right direction?



    Friday, September 13, 2019 7:39 PM

All replies

  • Hi Mike,

    To change the external data source used by a current connection

    1. In SQL Server Data Tools (SSDT), click the Model menu, and then click Existing Connections.

    2. Select the data source connection you want to modify, and then click Edit.

    3. In the Edit Connection dialog box, click Browse to locate another database of the same type but with a different name or location.

      As soon as you change the database file, a message appears indicating that you need to save and refresh the tables in order to see the new data.

    4. Click Save, and then click Close.

    5. In SQL Server Data Tools, click the Model menu, point to Process, and then click Process All.

      The tables are re-processed using the new data source, but with the original data selections.


      If the new data source contains any additional tables that were not present in the original data source, you must re-open the changed connection and add the tables.
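
    Incidentally, if you would rather repoint the already-deployed model on the server instead of changing it in SSDT and redeploying, you can run a TMSL createOrReplace against the AAS database from an XMLA query window in SSMS. This is only a rough sketch: the data source name AzureBlobs/SourceData, the database name MyModel, and the storage account prodstore are placeholders for your own, and the account key itself still has to be supplied separately as a credential.

    ```json
    {
      "createOrReplace": {
        "object": {
          "database": "MyModel",
          "dataSource": "AzureBlobs/SourceData"
        },
        "dataSource": {
          "type": "structured",
          "name": "AzureBlobs/SourceData",
          "connectionDetails": {
            "protocol": "azure-blobs",
            "address": {
              "account": "prodstore",
              "domain": "blob.core.windows.net"
            }
          },
          "credential": {
            "AuthenticationKind": "Key",
            "kind": "AzureBlobs",
            "path": "https://prodstore.blob.core.windows.net",
            "PrivacySetting": "Organizational"
          }
        }
      }
    }
    ```

    After the data source is repointed, the tables still need to be reprocessed on the server before the new data shows up.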

    Hope this helps.

    Monday, September 16, 2019 11:37 AM
  • Hi Chirag,

    I'm using SSDT in VS 2017 so the menu options are different. I changed the data source connection from one Blob to a different Blob (along with updating the access key).

    When I ran Process All, I got the following error message:

    Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [Expression.Error] The key didn't match any rows in the table..'

    I also tried adding both Data Sources (Blob 1 with the smaller csv and Blob 2 with the larger csv). I then manually updated the Power Query code so that the table was pulling from Blob 2.

    When I ran Process All, the table started to load, but it attempted to load the entire 10GB file to my workstation. This failed because I don't have enough RAM in my computer.

    Right now it seems like I have to run Process All before I can deploy to AAS. If I only have the smaller data set processed on my local machine, I can only push that small data set to AAS.

    I'm still confused as to how I can deploy the larger dataset without Process All.
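
    From what I've read, one possible route (untested on my side) is to set the tabular project's Processing Option to Do Not Process in the SSDT deployment properties, so that deploying sends only metadata and my workstation never has to hold the 10GB file. Processing would then be triggered on the AAS server itself, for example with a TMSL refresh command from an XMLA window in SSMS (the database name MyModel is a placeholder):

    ```json
    {
      "refresh": {
        "type": "full",
        "objects": [
          {
            "database": "MyModel"
          }
        ]
      }
    }
    ```

    That way the server would read the csv from Blob storage directly, using its own memory rather than my workstation's.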


    Monday, September 16, 2019 5:09 PM
  • Hi there,

    Sorry for the delayed response. Was your issue resolved? Do you need further assistance?

    Thursday, September 26, 2019 10:30 AM