Exporting double/float to real in SQL Data Warehouse

  • Question

  • Hello,

    I have a simple table in my Data Warehouse:

    CREATE TABLE [dbo].[fighters]
    (
        [fighter] [nvarchar](1) NOT NULL,
        [avg_rounds] [real] NOT NULL
    )
    WITH
    (
        DISTRIBUTION = HASH ( [fighter] ),
        CLUSTERED COLUMNSTORE INDEX
    )
    GO

    I then have a .csv with the following schema:

    fighter - string

    no_of_rounds - string

    Now I want to create a data flow that aggregates the number of rounds for each fighter and averages them. Easy peasy:

    toFloat(avg(toInteger(no_of_rounds)))
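
    To make the intended types explicit, here is roughly the same aggregation sketched in pandas (the file name fighters.csv is just a placeholder for my .csv, and float32 is the closest match to the 4-byte SQL real column):

    import pandas as pd

    # Load the .csv described above (both columns arrive as strings).
    df = pd.read_csv("fighters.csv", dtype=str)

    # Average the number of rounds per fighter, then narrow the result to
    # float32, the 4-byte width that the [real] column stores.
    avg_rounds = (
        df.assign(no_of_rounds=df["no_of_rounds"].astype(int))
          .groupby("fighter")["no_of_rounds"]
          .mean()
          .astype("float32")
          .rename("avg_rounds")
          .reset_index()
    )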

    I then have a sink with a sink; I disabled auto-mapping and did the following:

    It looks like the types match perfectly; also note that it finds the columns from the table, so it should be set up correctly.

    Now I want to export it to my table in the Data Warehouse through a sink. However, it seems I get a mismatch:

    {
        "message": "Job '06c394c3-4415-4bed-87ab-309e4ebfc1e5 failed due to reason: DF-SINK-005 at Sink 'SQL': Column(s) [avg_rounds] do not exist or their datatypes do not match the store datatypes",
        "failureType": "UserError",
        "target": "dataflow"
    }

    I don't believe it is because the column does not exist; it is definitely called avg_rounds in the SQL table, and it is also called that in the aggregation.

    So that leads me to the conclusion that it fails because the table column is a real while the aggregation produces a float/double.
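
    One way to double-check the "store datatypes" side of that error is to query the table metadata directly. A minimal sketch with pyodbc (the connection string below is a placeholder, not my real one):

    import pyodbc

    # Placeholder connection string - fill in the server, database and credentials.
    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<your-server>.database.windows.net;"
        "DATABASE=<your-dw>;UID=<user>;PWD=<password>"
    )

    conn = pyodbc.connect(conn_str)
    cursor = conn.cursor()
    # List the data type the warehouse actually stores for each column.
    cursor.execute(
        "SELECT COLUMN_NAME, DATA_TYPE "
        "FROM INFORMATION_SCHEMA.COLUMNS "
        "WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'fighters'"
    )
    for name, data_type in cursor.fetchall():
        print(name, data_type)
    conn.close()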


    • Edited by cenh Thursday, November 14, 2019 9:59 AM
    Tuesday, November 12, 2019 1:37 PM

All replies

  • I tested this on my side, and I think you should use

    toDecimal(avg(no_of_rounds))

    instead of

    avg(toFloat(no_of_rounds))
    and it should work for you.


    Thanks Himanshu

    • Proposed as answer by HimanshuSinha-msft Wednesday, November 13, 2019 9:20 PM
    • Unproposed as answer by cenh Thursday, November 14, 2019 8:44 AM
    Wednesday, November 13, 2019 9:20 PM
  • Sadly, that did not fix it. I am able to use a sink to a data lake, but not to the DW, which is what I want.

    I updated the original post with more information.

    • Edited by cenh Thursday, November 14, 2019 9:59 AM
    Thursday, November 14, 2019 8:44 AM
  • Sorry to hear that you don't have a resolution on this yet.
    Can you please elaborate on what you mean by "I then have a sink with a sink"?

    This is what I understand: in your case the source is SQL DW and the sink is the Data Lake. The solution which I posted worked with the DW as the sink (which does not seem to be the case here). Please confirm whether my understanding is correct.


    Thanks Himanshu

    Thursday, November 14, 2019 7:35 PM
  • Sorry if it was not clear, but the source is the Data Lake and the SQL DW is the sink.

    I fixed this issue by coding it in Databricks instead; I am not sure why it could not be solved with a data flow.
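
    For anyone hitting the same wall, a minimal PySpark sketch of the kind of Databricks job that does this (the storage paths, JDBC URL and temp dir are placeholders, and spark is the session a Databricks notebook provides):

    from pyspark.sql import functions as F

    # Read the raw .csv from the data lake (both columns arrive as strings).
    raw = (spark.read
           .option("header", "true")
           .csv("abfss://<container>@<account>.dfs.core.windows.net/fighters.csv"))

    # Average the rounds per fighter and cast to float (4 bytes) so it lines
    # up with the [real] column in the warehouse table.
    agg = (raw.groupBy("fighter")
              .agg(F.avg(F.col("no_of_rounds").cast("int")).alias("avg_rounds"))
              .withColumn("avg_rounds", F.col("avg_rounds").cast("float")))

    # Write into the existing dbo.fighters table through the SQL DW connector,
    # staging the rows in a temp folder on the storage account.
    (agg.write
        .format("com.databricks.spark.sqldw")
        .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<dw>;user=<user>;password=<password>")
        .option("forwardSparkAzureStorageCredentials", "true")
        .option("dbTable", "dbo.fighters")
        .option("tempDir", "abfss://<container>@<account>.dfs.core.windows.net/tempdir")
        .mode("append")
        .save())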

    Friday, November 15, 2019 8:51 AM
  • Can you post the data flow script that failed? You can get it by clicking the Script button. This scenario is tested to work fine.
    Friday, November 15, 2019 6:51 PM
  • Make sure you do NOT check the validate schema checkbox in the sink for what you are doing.
    Friday, November 15, 2019 6:52 PM