The FILE receive adapter cannot monitor receive location <network path here>

  • Question

  • Tech Details: BizTalk 2013 R2, using the included FILE adapter.

    I have an application with a custom pipeline that uses a Receive Location to pick up a file from a network share via the FILE adapter, then sends it to an archive location on another network share and also to an internal SFTP server for a trading partner to pick up.  We are having an issue where, at random times during the day (not consistently the same time), the Receive Locations are disabled and the only error we see is:

    "The FILE receive adapter cannot monitor receive location <networkpath> ".

    There are around 400 Receive Locations.  We initially thought there must be a limit to the number of Receive/Send port combinations per Host Instance, so we created multiple receive hosts and split the Receive Locations into groups of no more than 50.  All was working well, but without any changes to the network, servers, or applications, we started seeing these Receive Locations come down.  They do not consistently come down for an entire Host Instance, only partially: a particular host instance may have 15 running and the rest down, another may have all running and none down, and so on.

    There is no message box piece as we're strictly using pipelines to move files.

    While researching this issue on MSDN, we found that there could be a limit on the number of connections between the BizTalk application server and the network file share, with a suggestion to create a registry DWORD to increase the maximum on both servers.  We did that, setting the maximum to 2048 based on a calculation from that article (a sketch of that kind of registry change follows this post).

    After having made the registry changes, the issue happened again!

    We need help: has anyone encountered this issue, or something similar, before?  What can we do to resolve it?
    Thursday, August 4, 2016 3:34 PM
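
    For reference: the article isn't linked above, so the exact value name is an assumption here. The settings most commonly cited for SMB connection limits against a file share are MaxCmds (client side, on the BizTalk server) and MaxMpxCt (server side, on the file server). A minimal Python sketch of that kind of DWORD change, with the value name treated as a placeholder:

    # Hypothetical sketch only -- substitute the key and value name the
    # referenced article actually specifies. Requires admin rights, and the
    # change does not take effect until the server is rebooted.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Services\lanmanworkstation\parameters"
    VALUE_NAME = "MaxCmds"   # assumed value name (MaxMpxCt on the file server side)
    VALUE_DATA = 2048        # the maximum the poster calculated

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, VALUE_DATA)

    print("Set HKLM\\" + KEY_PATH + "\\" + VALUE_NAME + " = " + str(VALUE_DATA))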

Answers

  • Here's the thing: this would not be any limitation imposed by BizTalk Server or the File Adapter specifically.  Most likely, this is some limit based on many factors within Windows.

    Either way, 400 is a lot.  Can you consider a different pattern, such as a script to sweep the 400+ shares to a single location (a sketch follows this reply), or at least combining similar messages into fewer folders?

    • Proposed as answer by Angie Xu Sunday, August 14, 2016 10:15 AM
    • Marked as answer by Angie Xu Monday, August 15, 2016 2:08 AM
    Thursday, August 4, 2016 3:40 PM
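
    A minimal sketch of the sweep idea suggested above, with all paths as hypothetical placeholders: move files from the many shares into one central folder so BizTalk only needs a handful of FILE receive locations, and run it from Task Scheduler on a short interval.

    # Hypothetical "sweep" script -- every path below is a placeholder.
    import shutil
    import uuid
    from pathlib import Path

    SOURCE_SHARES = [
        r"\\fileserver01\partnerA\out",
        r"\\fileserver01\partnerB\out",
        # ... remaining shares ...
    ]
    PICKUP_FOLDER = Path(r"\\btsserver01\central_pickup")

    def sweep():
        for share in SOURCE_SHARES:
            for src in Path(share).glob("*"):
                if not src.is_file():
                    continue
                # Prefix with the source folder name and a GUID so files swept
                # from different shares cannot collide in the central folder.
                dest = PICKUP_FOLDER / f"{src.parent.name}_{uuid.uuid4().hex}_{src.name}"
                try:
                    shutil.move(str(src), str(dest))
                except PermissionError:
                    pass  # file is probably still being written; retry next pass

    if __name__ == "__main__":
        sweep()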

All replies

  • Hi

    Can you also try increasing the Retry Count from the default of 5 on the File Adapter, as described here?

    If all of these Receive Locations are pointing to the same server, it could be an issue with multiple persistent connections in use by the File Adapter (but the registry changes should mitigate that).

    Can you also capture a network trace using Netmon/Wireshark during the time when the Receive Locations start getting disabled, and see if something is broken at the network layer? (A sketch for time-stamping when locations go down follows this reply.)


    Thanks Arindam


    Thursday, August 4, 2016 3:49 PM
    Moderator
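
    Not from the thread itself, but one way to know exactly where to look in the trace is to time-stamp the moment each Receive Location goes down. A rough sketch that polls the documented MSBTS_ReceiveLocation WMI class, assuming the third-party "wmi" Python package (pip install wmi) is installed on the BizTalk server:

    # Rough sketch: log when a receive location flips to disabled so the
    # Netmon/Wireshark capture can be correlated with the failure time.
    import time
    import wmi

    def watch_receive_locations(poll_seconds=30):
        bts = wmi.WMI(namespace=r"root\MicrosoftBizTalkServer")
        previously_disabled = set()
        while True:
            disabled = {
                rl.Name
                for rl in bts.query(
                    "SELECT Name, IsDisabled FROM MSBTS_ReceiveLocation "
                    "WHERE IsDisabled = TRUE"
                )
            }
            for name in sorted(disabled - previously_disabled):
                print(time.strftime("%Y-%m-%d %H:%M:%S"), "DISABLED:", name)
            previously_disabled = disabled
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        watch_receive_locations()
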
  • Hi,

    As John mentioned, 400 Receive Locations is a lot. Can you use some kind of endpoint resolver to consume the files and direct them to the required subscriber? A few examples are below.

    https://mikearnett.wordpress.com/2012/10/16/739/

    http://social.technet.microsoft.com/wiki/contents/articles/34924.biztalk-understanding-bussiness-rules-engine-via-the-dynamic-endpoint-resolver-pattern.aspx


    Regards Pushpendra K Singh

    Wednesday, August 10, 2016 7:54 PM