Data Factory Lookup size limitation

  • Question

  • I have one question for you. We are currently creating a Data Factory pipeline which reads data from an Azure SQL database and pushes it to Azure Blob Storage.

    We are creating a blob for each row in SQL. If there are 100K rows, we have to create 100K blobs, and these blobs are curated and used for indexing in Azure Search.

    We are using a Lookup activity to read the records from SQL and passing them to a ForEach activity. However, the Lookup activity has a limitation of 5000 rows, so we are able to create only 5000 blobs.
    Is there any workaround to get all the records, get the exact count of blobs, and read the data from a cache instead of querying the database each time?
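
    For context, here is roughly what the pipeline does per row (a minimal sketch only; the connection strings, container name, and query below are placeholders, not our real values):

    ```python
    # Sketch of the per-row blob creation this pipeline performs.
    # Connection strings, container name, and query are placeholders.
    import json
    import pyodbc
    from azure.storage.blob import BlobServiceClient

    sql_conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};Server=<server>;Database=<db>;UID=<user>;PWD=<password>"
    )
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = blob_service.get_container_client("curated-docs")

    cursor = sql_conn.cursor()
    cursor.execute("SELECT Id, Title, Body FROM dbo.Documents")  # placeholder query
    columns = [col[0] for col in cursor.description]

    for row in cursor:
        doc = dict(zip(columns, row))
        # One blob per SQL row; these blobs are later curated and indexed by Azure Search.
        container.upload_blob(name=f"{doc['Id']}.json",
                              data=json.dumps(doc, default=str),
                              overwrite=True)
    ```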
    Wednesday, April 24, 2019 9:10 PM

All replies

  • Hi Nayanjyoti,

    If there are more entries, the API returns a continuation token which the client needs to send in a subsequent request.

    The number of paths returned with each invocation is limited. If the number of paths to be returned exceeds this limit, a continuation token is returned in the response header x-ms-continuation. When a continuation token is returned in the response, it must be specified in a subsequent invocation of the list operation to continue listing the paths.

    Note that if the listing operation crosses a partition boundary, then the service will return a continuation token for retrieving the remainder of the results. For this reason, it is possible that the service will return fewer results than specified by maxresults, or than the default of 5000.  If the parameter is set to a value less than or equal to zero, the server returns status code 400 (Bad Request).
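
    As a rough illustration of that pattern (a minimal sketch; list_paths_page below is a hypothetical helper standing in for a single list request, returning one page of results plus the value of the x-ms-continuation response header, or None on the last page):

    ```python
    # Page through a list operation by re-sending the continuation token
    # until the service stops returning one.
    def list_all_paths(list_paths_page):
        all_paths = []
        token = None
        while True:
            paths, token = list_paths_page(continuation=token)
            all_paths.extend(paths)
            if not token:  # no x-ms-continuation header returned: last page
                return all_paths
    ```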

    You may refer to this SO thread, which addresses a similar issue.

    Hope this helps.

    Monday, April 29, 2019 9:55 AM
  • Hi Nayanjyoti,

    Just checking in to see if the above answer helped. If this answers your query, do click “Mark as Answer” and up-vote the same. And, if you have any further queries, do let us know.

    Friday, May 3, 2019 8:25 AM
  • I could not find any example of how to do this.

    I already read a file with a Lookup activity and put its result into arrays of 1000 rows to send to a web API, roughly as in the sketch below.
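
    (A minimal sketch of that batching step; the endpoint URL and batch size below are just placeholders for what I use:)

    ```python
    # Split the Lookup output into batches of 1000 rows and post each batch
    # to the web API. The URL is a placeholder.
    import requests

    def post_in_batches(rows, batch_size=1000, url="https://example.com/api/ingest"):
        for start in range(0, len(rows), batch_size):
            batch = rows[start:start + batch_size]
            resp = requests.post(url, json=batch)
            resp.raise_for_status()  # fail fast if the API rejects a batch
    ```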

    How can I read more than 5000 rows?

    Tuesday, April 21, 2020 8:18 PM
  • Hello,

    The Lookup activity can return at most 5,000 rows (and at most 4 MB of output) in a single run. The documented workaround is to retrieve the data in pages, for example with a Lookup inside an Until loop (or an outer pipeline iterating over an inner one) so that no single Lookup exceeds the limit; see the sketch below.

    Reference: Lookup activity in ADF
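
    A minimal sketch of that pagination idea, expressed as the equivalent query loop (table, column, and connection details are placeholders; in ADF the loop would be an Until activity around the Lookup, with the offset kept in a pipeline variable):

    ```python
    # Fetch the source rows in pages of at most 5000 so that no single
    # lookup exceeds the limit. OFFSET/FETCH needs an ORDER BY on a stable key.
    import pyodbc

    PAGE_SIZE = 5000

    def fetch_all_rows(conn_str, table="dbo.Documents", order_by="Id"):
        rows = []
        with pyodbc.connect(conn_str) as conn:
            cursor = conn.cursor()
            offset = 0
            while True:
                cursor.execute(
                    f"SELECT * FROM {table} ORDER BY {order_by} "
                    "OFFSET ? ROWS FETCH NEXT ? ROWS ONLY",
                    offset, PAGE_SIZE,
                )
                page = cursor.fetchall()
                rows.extend(page)
                if len(page) < PAGE_SIZE:  # last (possibly partial) page
                    return rows
                offset += PAGE_SIZE
    ```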

    Hope this helps. Do let us know if you have any further queries.

    ----------------------------------------------------------------------------------------

    Do click on "Mark as Answer" and upvote the post that helps you; this can be beneficial to other community members.

    Wednesday, April 22, 2020 6:43 AM