ADF V2 Custom Console throws exception: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.

    Question

  • I'm having a problem with Azure Data Factory V2. I designed a custom Azure Batch service (actually just a console application) to process our data and generate reports. During this process I need to analyse many huge tables, and I use streams to read and write the data and results.

    The thing is that when I process data on the staging environment, which has fewer tables and less data, it works fine.

    Also, if I run the console on my local computer or on an Azure virtual machine (connected through Remote Desktop), it completes fine in around 6 hours. But the same process run through ADF V2 took 48 hours and failed with this error:

    Unhandled Exception: System.AggregateException: One or more errors occurred. ---> Microsoft.WindowsAzure.Storage.StorageException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host

       at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
       at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       --- End of inner exception stack trace ---
       at System.Net.ConnectStream.Read(Byte[] buffer, Int32 offset, Int32 size)
       at Microsoft.WindowsAzure.Storage.Core.ByteCountingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
       at Microsoft.Data.OData.MessageStreamWrapper.MessageStreamWrappingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
       at System.IO.StreamReader.ReadBuffer(Char[] userBuffer, Int32 userOffset, Int32 desiredChars, Boolean& readToUserBuffer)
       at System.IO.StreamReader.Read(Char[] buffer, Int32 index, Int32 count)
       at Microsoft.Data.OData.Json.JsonReader.ReadInput()
       at Microsoft.Data.OData.Json.JsonReader.ParseStringPrimitiveValue(Boolean& hasLeadingBackslash)
       at Microsoft.Data.OData.Json.JsonReader.ParseValue(String& rawValue)
       at Microsoft.Data.OData.Json.JsonReader.Read()
       at Microsoft.Data.OData.Json.BufferingJsonReader.ReadInternal()
       at Microsoft.Data.OData.Json.BufferingJsonReader.ReadNextAndCheckForInStreamError()
       at Microsoft.Data.OData.Json.BufferingJsonReader.ReadInternal()
       at Microsoft.Data.OData.Json.BufferingJsonReader.Read()
       at Microsoft.Data.OData.JsonLight.ODataJsonLightDeserializer.ParseProperty(DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, Func`2 readPropertyAnnotationValue, String& parsedPropertyName)
       at Microsoft.Data.OData.JsonLight.ODataJsonLightDeserializer.ProcessProperty(DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, Func`2 readPropertyAnnotationValue, Action`2 handleProperty)
       at Microsoft.Data.OData.JsonLight.ODataJsonLightEntryAndFeedDeserializer.ReadEntryContent(IODataJsonLightReaderEntryState entryState)
       at Microsoft.Data.OData.JsonLight.ODataJsonLightReader.ReadEntryStart(DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, SelectedPropertiesNode selectedProperties)
       at Microsoft.Data.OData.JsonLight.ODataJsonLightReader.ReadAtEntryEndImplementationSynchronously()
       at Microsoft.Data.OData.JsonLight.ODataJsonLightReader.ReadAtEntryEndImplementation()
       at Microsoft.Data.OData.ODataReaderCore.ReadImplementation()
       at Microsoft.Data.OData.ODataReaderCore.ReadSynchronously()
       at Microsoft.Data.OData.ODataReaderCore.InterceptException[T](Func`1 action)
       at Microsoft.Data.OData.ODataReaderCore.Read()
       at Microsoft.WindowsAzure.Storage.Table.Protocol.TableOperationHttpResponseParsers.TableQueryPostProcessGeneric[TElement,TQueryType](Stream responseStream, Func`6 resolver, HttpWebResponse resp, TableRequestOptions options, OperationContext ctx, String accountName)
       at Microsoft.WindowsAzure.Storage.Table.TableQuery`1.<>c__DisplayClass10`2.<QueryImpl>b__f(RESTCommand`1 cmd, HttpWebResponse resp, OperationContext ctx)
       at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ProcessEndOfRequest[T](ExecutionState`1 executionState)
       at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
       --- End of inner exception stack trace ---
       at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
       at Microsoft.WindowsAzure.Storage.Table.TableQuery`1.ExecuteQuerySegmentedInternal(TableContinuationToken token, CloudTableClient client, CloudTable table, TableRequestOptions requestOptions, OperationContext operationContext)
       at Microsoft.WindowsAzure.Storage.Table.TableQuery`1.<>c__DisplayClass7.<ExecuteInternal>b__6(IContinuationToken continuationToken)
       at Microsoft.WindowsAzure.Storage.Core.Util.CommonUtility.<LazyEnumerable>d__0`1.MoveNext()
       at System.Linq.Enumerable.WhereEnumerableIterator`1.MoveNext()
       at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
       at System.Linq.OrderedEnumerable`1.<GetEnumerator>d__1.MoveNext()
       at System.Linq.Lookup`2.Create[TSource](IEnumerable`1 source, Func`2 keySelector, Func`2 elementSelector, IEqualityComparer`1 comparer)
       at System.Linq.GroupedEnumerable`3.GetEnumerator()
       at System.Linq.Enumerable.ToDictionary[TSource,TKey,TElement](IEnumerable`1 source, Func`2 keySelector, Func`2 elementSelector, IEqualityComparer`1 comparer)

    Besides, I checked stdout.txt and the analysis process had not even started; it seems to have died while initializing data from the big tables.
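For context, the stack trace above shows the table query being consumed lazily through LINQ (`ToDictionary` over a `TableQuery`), so one dropped connection during the long enumeration aborts the entire read. A minimal sketch of reading a large table segment by segment with an explicit retry policy and timeouts, so a transient reset only costs one segment, might look like this (the entity type and method names here are hypothetical, not from the original code):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage.RetryPolicies;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical entity type; replace with your own table schema.
public class ReportRow : TableEntity { }

public static class BigTableReader
{
    public static List<ReportRow> ReadAll(CloudTable table)
    {
        var options = new TableRequestOptions
        {
            // Retry transient faults such as connection resets:
            // back off exponentially, up to 5 attempts per request.
            RetryPolicy = new ExponentialRetry(TimeSpan.FromSeconds(4), 5),
            ServerTimeout = TimeSpan.FromMinutes(1),
            MaximumExecutionTime = TimeSpan.FromMinutes(10)
        };

        var query = new TableQuery<ReportRow>();
        var results = new List<ReportRow>();
        TableContinuationToken token = null;
        do
        {
            // Each segment is a separate HTTP request, so a transient
            // failure only re-fetches one segment, not the whole table.
            TableQuerySegment<ReportRow> segment =
                table.ExecuteQuerySegmented(query, token, options, null);
            results.AddRange(segment.Results);
            token = segment.ContinuationToken;
        } while (token != null);
        return results;
    }
}
```

This uses the same `Microsoft.WindowsAzure.Storage` client library that appears in the stack trace; the key difference from the failing path is that the continuation token makes the pagination explicit instead of hiding it inside a single lazy enumerator.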

    The pipeline run IDs are 43eebc2a-0eba-467e-bc7e-67070d9c828c (less code and a shorter run, but the same error),

     f65b505c-8c98-4eea-b72b-47f6fc31b815 (all data),

    and 057ccc9e-a4a8-423b-8c03-a54ff8fa5aed (less data; a successful run in just 6 minutes).

    In addition, the Azure Batch service runs on a Batch VM whose OS image is MicrosoftWindowsServer WindowsServer 2012-R2-Datacenter-smalldisk (latest), in case that helps.


    Would anyone be so kind as to help me out? I have been troubled by this error for nearly two weeks... I would REALLY appreciate any help!


    • Edited by Rachel Xuan Friday, August 31, 2018 9:48 AM
    Friday, August 31, 2018 9:24 AM

All replies

  • Hi Rachel, 

    Could you explain more about how you are accessing the big tables and where they are stored?

    Friday, August 31, 2018 9:25 PM
    Moderator
  • Hi Rachel, 

    Could you explain more about how you are accessing the big tables and where they are stored?

    Thanks a lot for your kind reply!

    I just initialize data from the big tables in my code; they are stored in our Azure subscription as Azure Table storage.

    I am not quite sure whether this clarifies the situation; thanks again!


    Monday, September 3, 2018 2:26 AM