Uploading a file in parallel chunks to the cloud fails with error - An existing connection was forcibly closed by the remote host

  • Question

  • I am writing a Windows app which uploads large files to the cloud in parallel. If I transfer each file as a single transfer, everything is fine, but if I divide a file into multiple parts and upload the parts in parallel, many of the parts fail during transfer with the error "Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host."

    The code that launches the transfers in parallel is this -

    ParallelOptions options = new ParallelOptions { MaxDegreeOfParallelism = 5000 };

    Parallel.ForEach(parts, options, part =>
    {
        try
        {
            part.Transfer();
        }
        catch
        {
            // note: the empty catch hides the failure details
        }
    });

    The Transfer method creates an HttpWebRequest for the destination, gets the request stream, and writes the part to the cloud.
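
    Roughly, Transfer does this (a simplified sketch; partUri, partStream, the PUT verb, and the buffer size are placeholders, not my exact code):

    // uses System.Net and System.IO
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(partUri);
    request.Method = "PUT";
    request.KeepAlive = false;
    request.AllowWriteStreamBuffering = false;

    using (Stream requestStream = request.GetRequestStream())
    {
        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = partStream.Read(buffer, 0, buffer.Length)) > 0)
            requestStream.Write(buffer, 0, read); // the error above surfaces on this Write
    }
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        // check response.StatusCode
    }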

    I have tried setting ServicePointManager.DefaultConnectionLimit to a large number, setting KeepAlive to false and AllowWriteStreamBuffering to false on the requests, and adding this to app.config:

    <connectionManagement>
         <add address="*" maxconnection="65535" />
    </connectionManagement>

    Nothing worked for me.
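
    For reference, the code equivalent of that app.config entry (applied once at startup) is:

    ServicePointManager.DefaultConnectionLimit = 65535; // System.Net

    KeepAlive and AllowWriteStreamBuffering are the per-request settings shown in the Transfer sketch above.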

    Please help!!!

    Wednesday, August 21, 2013 5:30 AM


All replies

  • A TCP connection is identified by the following four items:

    1) Source IP address

    2) Source port number

    3) Destination IP address

    4) Destination port number

    Only one connection can have all four items the same. When you are transferring between two computers, the source and destination addresses are always the same, so each parallel connection must use a different port number.

    Normally a server has one listening port that all clients connect to. Then, during the connection process, another port number is used for the data transfer, and the first connection is closed so that a parallel process can use the first port number to connect.

    Another method is simply to have the server listen on multiple port numbers, with each parallel connection using a different one.
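
    To see the port behavior concretely, here is a small sketch (sample.com stands in for the real host): parallel connections to the same destination port are distinguished by their local ephemeral source ports:

    using System;
    using System.Net.Sockets;
    using System.Threading.Tasks;

    class PortDemo
    {
        static void Main()
        {
            // Three parallel connections to the same destination; each is
            // assigned a different local (ephemeral) source port, so the
            // four-tuples differ and the connections can coexist.
            Parallel.For(0, 3, i =>
            {
                using (TcpClient client = new TcpClient("sample.com", 443))
                {
                    Console.WriteLine(client.Client.LocalEndPoint);
                }
            });
        }
    }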


    jdweng

    Wednesday, August 21, 2013 6:45 AM
  • Hi jdweng,

    Thanks for the reply.

    The server is a public cloud and I have a fixed address and port like https://sample.com:443/...

    I cannot change anything there. What I think is that there is some limit on how many HttpWebRequests a process or the system can have open. I tried 10 MB files - not chunked - 12 files uploaded in parallel - no errors - but if I increase the number of parallel uploads the program starts giving the same error. So my earlier point that only chunked transfer creates the problem is false; it is the number of HttpWebRequests, though I am not sure what that number is.
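
    A quick way to probe that threshold (a sketch; parts and Transfer are from my code above, and the parallelism values are arbitrary):

    // Count how many parts fail at increasing degrees of parallelism.
    foreach (int dop in new[] { 8, 12, 16, 24, 32 })
    {
        int failures = 0;
        Parallel.ForEach(parts, new ParallelOptions { MaxDegreeOfParallelism = dop },
            part =>
            {
                try { part.Transfer(); }
                catch { Interlocked.Increment(ref failures); } // Interlocked is in System.Threading
            });
        Console.WriteLine("DOP {0}: {1} failures", dop, failures);
    }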


    Wednesday, August 21, 2013 7:07 AM
  • There is either a limit at the server on the number of connections allowed, or a firewall is blocking multiple connections. Servers add such limitations because hackers attempt to bring down servers by flooding them with connections.


    jdweng

    Wednesday, August 21, 2013 7:12 AM
  • The firewall is not the cause, as I have tried disabling the firewall and my antivirus software. That didn't help.

    A limitation on the server side could be causing the behavior, but the point is that there is no error when creating the HttpWebRequest or getting the stream from the request; the transfer fails while writing to the stream, say after 30-40% of the transfer is done. Here is the log, in case it says something useful:

    Unable to write data to the transport connection: An established connection was aborted by the software in your host machine.
       at System.Net.Sockets.NetworkStream.Write(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.Security._SslStream.StartWriting(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
       at System.Net.Security._SslStream.ProcessWrite(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
       at System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.PooledStream.Write(Byte[] buffer, Int32 offset, Int32 size)
       at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
       at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size) 
    Wednesday, August 21, 2013 7:24 AM
  • You didn't say previously that it was closing in the middle of the transfer. Sometimes there are limits on the size of file transfers. Sometimes there isn't enough free space on the server. It could be that you are transferring in ASCII mode instead of binary mode; a binary character will sometimes force a connection to close. There are virus protection programs that inspect files to check their contents; often files with .exe or .mdb (database) extensions are blocked. Sometimes zipping the files before transferring them will get past the blocking. Sometimes just changing the extension of a file name will get through the blocking.

    jdweng

    Wednesday, August 21, 2013 7:32 AM
  • How can I check the mode - is it ASCII or binary?

    For my test case the file size is 5-10 MB.

    I am creating these files with a tool called Dummy. Can that be the reason?

    The extension is .txt for these files.

    Wednesday, August 21, 2013 7:42 AM
  • Try a few different files and see if there is a pattern to the failures. Usually transferring is done using FTP, which has the ASCII and Binary modes. You can build your own FTP class to do the transferring. If there are binary characters you can use UUENCODE/UUDECODE classes, which convert binary characters to ASCII and back again. Using UUENCODE increases the number of bytes, but it is still an efficient way of eliminating the problem.
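
    For illustration - the .NET BCL has no built-in UUENCODE class, so Base64 stands in below for the binary-to-text step, and FtpWebRequest exposes the transfer mode directly (the URL and file name are placeholders):

    using System;
    using System.IO;
    using System.Net;

    // FTP transfer mode is a per-request setting:
    FtpWebRequest ftp = (FtpWebRequest)WebRequest.Create("ftp://sample.com/part.txt");
    ftp.Method = WebRequestMethods.Ftp.UploadFile;
    ftp.UseBinary = true; // false selects ASCII mode

    // Binary-to-text encoding, with Base64 standing in for uuencode:
    byte[] raw = File.ReadAllBytes("part.txt");
    string ascii = Convert.ToBase64String(raw);      // grows the payload by roughly a third
    byte[] roundTrip = Convert.FromBase64String(ascii);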

    jdweng

    Wednesday, August 21, 2013 7:47 AM