Different types of BLOB as Input

  • Question

  • Hello All,

    My requirement is to connect to a BLOB storage, parse the files, and ingest them into SQL. The problem I am facing is that the blob storage could contain multiple JSON types. How do I parse the different JSON documents?

    Tuesday, August 14, 2018 8:03 PM

All replies

  • This doc describes in detail how to connect a blob store as a streaming input to your ASA job.

    https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-inputs#stream-data-from-blob-storage

    Let me know if you have any specific questions about this scenario.

    Tuesday, August 14, 2018 10:16 PM
  • The issue I am facing is that the container holds JSON, XML, and CSV files, and I want to process only the JSON files. If I specify JSON as the serializer in the Input window, I get the error below:

    Could not deserialize the input event(s) from resource 'https://testblobaccount/payload/749e69bf-0879-4c2f-a03e-0039235afc1c.xml' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format

    Even though my query contains the filter below, the job still reads the XML files, and the Stream Analytics job stopped with a warning.

    SELECT *
    FROM GetPayload
    WHERE (BlobName NOT LIKE '%Response%') AND (UPPER(BlobName) NOT LIKE '%XML')

    Or to put it another way: how do I skip the XML files in the blob container so that ASA does not process them?


    • Edited by GuruSKNew Tuesday, August 21, 2018 3:56 PM
    Tuesday, August 21, 2018 3:54 PM
  • ASA reads from the configured input and loads all files; a file is ingested only if it deserializes successfully. Filtering in the query cannot help here, because deserialization happens before the query runs. The only workaround is to place the different file types in different containers or under different path prefixes.
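
    To illustrate the prefix approach: if the JSON files are uploaded under a dedicated virtual folder (for example json/), the blob input's path pattern can be scoped to that prefix so the XML and CSV files are never picked up. Below is a minimal sketch of such an input definition in ARM-template style. The account and container names are taken from the error message earlier in this thread; the json/ prefix, the path pattern, and the key placeholder are assumptions for illustration, not your actual configuration.

    ```json
    {
      "name": "GetPayload",
      "properties": {
        "type": "Stream",
        "datasource": {
          "type": "Microsoft.Storage/Blob",
          "properties": {
            "storageAccounts": [
              {
                "accountName": "testblobaccount",
                "accountKey": "<storage-account-key>"
              }
            ],
            "container": "payload",
            "pathPattern": "json/{date}/{time}"
          }
        },
        "serialization": {
          "type": "Json",
          "properties": {
            "encoding": "UTF8"
          }
        }
      }
    }
    ```

    The same scoping can be done in the portal by setting the input's "Path pattern" field; anything outside the json/ prefix is then invisible to the job.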
    Tuesday, August 21, 2018 7:19 PM