Thursday, October 02, 2008 10:42 AM
The following code is directly from the MSDN website to upload a file using FTP.
I am using this to upload SQL Server Backup files to a local FTP Server.
The backups vary in size from 1 KB to 150,000 KB and there are about 10 of them.
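For reference, the MSDN sample has roughly this shape (the contoso URI, credentials and file names are MSDN's own placeholders):

using System;
using System.IO;
using System.Net;
using System.Text;

class FtpUpload
{
    static void Main()
    {
        // Get the object used to communicate with the server.
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
            "ftp://www.contoso.com/test.htm");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("anonymous", "janeDoe@contoso.com");

        // Copy the contents of the file to the request stream.
        StreamReader sourceStream = new StreamReader("testfile.txt");
        byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
        sourceStream.Close();
        request.ContentLength = fileContents.Length;

        Stream requestStream = request.GetRequestStream();
        requestStream.Write(fileContents, 0, fileContents.Length);
        requestStream.Close();

        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
        response.Close();
    }
}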
The problem is that after about 250,000 KB of files the sub fails at the line
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
No single file is causing the problem, because if the order of the files is changed the OOM occurs in a different place. The issue appears to be that the byte array is not being disposed of or GC'd.
I added the line
fileContents = null;
but this does not make any difference.
Please can anybody advise?
- Edited by Funklet Thursday, October 02, 2008 12:58 PM
Thursday, October 02, 2008 11:25 AM Moderator
There's definitely room for improvement in the code. For example, I see no reason to read the file as text using a StreamReader and then convert it back to binary using Encoding.UTF8; that's just a waste of memory. Instead, open the file directly as binary using a FileStream.
Also, to deal with files of any size, I'd recommend transferring the file in chunks instead of reading it all into memory at once. So I'd write something like this:
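A sketch of the idea, assuming request is the FtpWebRequest from the MSDN sample above (the 2 KB buffer size is arbitrary):

// Replace the StreamReader / Encoding.UTF8 section with a binary,
// chunked copy: only one small buffer is ever held in memory, so
// file size no longer matters. ContentLength is not required.
using (FileStream sourceStream = File.OpenRead("testfile.txt"))
using (Stream requestStream = request.GetRequestStream())
{
    byte[] buffer = new byte[2048];
    int bytesRead;
    while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        requestStream.Write(buffer, 0, bytesRead);
    }
}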
Mattias, C# MVP
- Marked As Answer by Funklet Thursday, October 02, 2008 12:36 PM
Thursday, October 02, 2008 12:41 PM
Many thanks for your swift response on this one.
I have now seen quite a lot of posts regarding OOM exceptions with byte arrays, and it seems the only answer is to avoid allocating large byte arrays in the first place.
For those landing here with the same problem, here is the complete function with the suggestion from Mattias included.
Note that the path must already exist on the server when uploading the file, otherwise you will get a WebException.
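In sketch form, with the server address, credentials and folder names replaced by placeholders:

using System;
using System.IO;
using System.Net;

public static class BackupUploader
{
    // Uploads one local file to the FTP server in chunks, so no
    // large byte array is ever allocated.
    // NOTE: the target folder in ftpUri must already exist on the
    // server, otherwise the upload fails with a WebException.
    public static void UploadFile(string localPath, string ftpUri)
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpUri);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.UseBinary = true; // backup files are binary, not text
        request.Credentials = new NetworkCredential("ftpUser", "ftpPassword"); // placeholder

        using (FileStream sourceStream = File.OpenRead(localPath))
        using (Stream requestStream = request.GetRequestStream())
        {
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                requestStream.Write(buffer, 0, bytesRead);
            }
        }

        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine("{0}: {1}", Path.GetFileName(localPath),
                response.StatusDescription);
        }
    }

    // Example caller: upload every backup in a local folder
    // (placeholder paths).
    public static void UploadAllBackups()
    {
        foreach (string file in Directory.GetFiles(@"C:\Backups", "*.bak"))
        {
            UploadFile(file, "ftp://localhost/Backups/" + Path.GetFileName(file));
        }
    }
}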