Stream Analytics job - data in blob storage - but does not export to SQL Server table

  • Question

  • Hi,

    I want to export Application Insights data from blob storage to a SQL Server table. I followed the link

    https://azure.microsoft.com/en-us/documentation/articles/app-insights-code-sample-export-sql-stream-analytics/#set-query

    It includes a query for extracting page views from blob storage.

    I set up the Stream Analytics job with the input pointing to the blob storage and the output pointing to the SQL Server table.

    I used the same query provided in the tutorial:

        SELECT flat.ArrayValue.name as pageName
          , flat.ArrayValue.count as viewCount
          , flat.ArrayValue.url as url
          , flat.ArrayValue.urlData.port as urlDataPort
          , flat.ArrayValue.urlData.protocol as urlDataprotocol
          , flat.ArrayValue.urlData.host as urlDataHost
          , flat.ArrayValue.urlData.base as urlDataBase
          , flat.ArrayValue.urlData.hashTag as urlDataHashTag
          , A.context.data.eventTime as eventTime
          , A.context.data.isSynthetic as isSynthetic
          , A.context.device.id as deviceId
          , A.context.device.type as deviceType
          , A.context.device.os as os
          , A.context.device.osVersion as osVersion
          , A.context.device.locale as locale
          , A.context.device.userAgent as userAgent
          , A.context.device.browser as browser
          , A.context.device.browserVersion as browserVersion
          , A.context.device.screenResolution.value as screenResolution
          , A.context.session.id as sessionId
          , A.context.session.isFirst as sessionIsFirst
          , A.context.location.clientip as clientIp
          , A.context.location.continent as continent
          , A.context.location.country as country
          , A.context.location.province as province
          , A.context.location.city as city
        INTO
          AIOuput
        FROM AIinput A
        CROSS APPLY GetElements(A.[view]) as flat


    From the Azure portal, I tested the connection for both the input and the output, and both succeeded. I then did a test run for the input within the Azure management portal and saw that the extraction was successful. I can also see data using the "Sample Data" option. I added the IP address to the SQL Server firewall to allow access.

    However, the Stream Analytics job started and is in running mode, but I never see any data exported to the SQL Server table. Is there a way to debug further? Since the job is not failing, I am assuming that, for some reason, it is either ignoring some error or stuck in a loop.

    Please let me know.

    Thanks




    Wednesday, August 12, 2015 10:03 PM

Answers

  • Hi,

    As in the previous case, I'm guessing that you're seeing this error during the output creation and validation step. Is that correct?

    If that's the case, you can ignore that warning for now. We'll fix that warning in the near future. Can you check whether the job runs successfully and you get data into the SQL Azure table?

    Thanks


    This posting is provided "AS IS" with no warranties, and confers no rights

    Friday, January 15, 2016 6:31 PM

All replies

  • Are you using a SQL Azure database? If so, you need to add a clustered index before data can be inserted.
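
    For example, assuming a hypothetical output table named dbo.PageViewsExport with an eventTime column, a plain clustered index could be added like this:

        -- hypothetical table and column names; adjust to match your output table
        CREATE CLUSTERED INDEX IX_PageViewsExport_eventTime
            ON dbo.PageViewsExport (eventTime);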

    Wednesday, August 12, 2015 11:44 PM
  • Yes, it is a SQL Azure database, and yes, it has a clustered index on it. The weird thing is that there are no errors thrown (or at least the Stream Analytics job says it is in running mode). Is there a way to further diagnose / look for errors?



    Thursday, August 13, 2015 12:04 AM
  • You can look at the operation logs, which will show you the status of processing and sending events.
    Thursday, August 13, 2015 1:02 AM
  • If the operation logs do not contain any errors, please check that the blob input path is correct, that the blobs have data, and that the date/time encoded in the path is later than the job start time.
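
    For reference, Application Insights continuous export typically writes blobs into folders shaped roughly like the following (illustrative only; the application name, instrumentation key, and container come from your own export configuration):

        myappname_00112233445566778899aabbccddeeff/PageViews/2015-08-13/22/...

    so the date and hour folders are what need to be later than the job start time.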

    Thanks


    This posting is provided "AS IS" with no warranties, and confers no rights

    Thursday, August 13, 2015 4:39 PM
  • I ran into this same issue today. I tried to define my columns in SQL with specific data types according to the data types I was expecting. It never worked. Then I just gave up, set each column to nvarchar(max) to see what happened, and I started seeing data. Kind of a brute-force way of doing it, but it worked. I haven't gotten to the reporting part yet, so I'm not sure how that will work out. I find it hard to believe this is how it is supposed to be done.
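
    For what it's worth, the brute-force table looked roughly like this (a sketch with hypothetical table and column names, trimmed to a few columns; the identity column is only there to give the table a clustered index key, since nvarchar(max) columns can't be index keys):

        -- every data column as nvarchar(max) just to get rows flowing; not a long-term schema
        CREATE TABLE dbo.PageViewsExport (
            RowId     int IDENTITY(1,1) NOT NULL,
            pageName  nvarchar(max) NULL,
            viewCount nvarchar(max) NULL,
            url       nvarchar(max) NULL,
            eventTime nvarchar(max) NULL
        );
        -- Azure SQL needs a clustered index before Stream Analytics can insert rows
        CREATE CLUSTERED INDEX IX_PageViewsExport_RowId ON dbo.PageViewsExport (RowId);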

    The problem, as you note, is that there is no feedback from Azure to tell you why data failed to be written to SQL, or even that it failed at all.

    Friday, August 28, 2015 8:45 PM
  • And right after I wrote this, I discovered the "Operation Logs" in the dashboard of my processor, which shows me the errors.

    Though it seems like some of the errors aren't as helpful as I might hope them to be. I see things like "failed to convert from System.Int32 to System.Double". Double? Plus it does not tell me which column the error is for. :(
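
    If the failure really is a type mismatch between a query field and the destination column (for example an integer value going into a float column), one thing worth trying is casting explicitly in the Stream Analytics query. A minimal sketch, reusing the aliases from the query above and assuming the destination column is float:

        SELECT flat.ArrayValue.name as pageName
             , CAST(flat.ArrayValue.count as float) as viewCount
        INTO AIOuput
        FROM AIinput A
        CROSS APPLY GetElements(A.[view]) as flat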

    Friday, August 28, 2015 8:58 PM
  • Sorry about the inconvenience; we are working on a bug fix to improve the error messages, which should be in production soon.
    Wednesday, September 9, 2015 12:09 AM
  • Hey,

    I'm not sure if this is the right place to ask, but I'm having a similar issue with Stream Analytics.

    I'm trying to define a new output, which is an Azure SQL table.

    I've created the table with a clustered columnstore index, using the following statement:

        CREATE CLUSTERED COLUMNSTORE INDEX cl_indexName ON Table1;

    When I add the table as an output, I get "SQL Database table [dbo].[Table1] does not have a clustered index."

    Since the only index on this table is the clustered one, I'm not really sure how to handle this issue.
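
    If the validation error actually blocks you and you don't specifically need columnstore, one possible workaround (a sketch only; SomeKeyColumn is a placeholder, pick a column that makes sense for your data) is to switch to an ordinary rowstore clustered index:

        -- replace the columnstore index with a rowstore clustered index
        DROP INDEX cl_indexName ON Table1;
        CREATE CLUSTERED INDEX cl_rowstore ON Table1 (SomeKeyColumn);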

    Monday, October 19, 2015 11:44 AM
  • Check the path prefix pattern setting, and check whether the instrumentation key of the Application Insights resource has dashes in it; if it does, those dashes need to be removed. Let us know if that works for you.
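
    For example, with a hypothetical instrumentation key of 00112233-4455-6677-8899-aabbccddeeff and an app named myappname, the path prefix pattern would use the key with the dashes stripped, along these lines:

        myappname_00112233445566778899aabbccddeeff/PageViews/{date}/{time}
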
    Thursday, November 5, 2015 7:41 AM
  • didn't help me.
    Monday, December 21, 2015 12:28 PM
  • I'm having the same issue here; I can't insert data into an Azure SQL table with a clustered columnstore index.

    Any help?

    Monday, December 21, 2015 12:37 PM
  • Hi,

    I'm guessing that you're seeing this error during the output creation and validation step. Is that correct?

    If that's the case, you can ignore that warning for now. We'll fix that warning in the near future. Can you check whether the job runs successfully and you get data into Table1?

    Thanks


    This posting is provided "AS IS" with no warranties, and confers no rights

    Friday, January 15, 2016 6:30 PM