how to read 1GB txt from URL

  • Question

  • I am trying to use this code, but it does nothing:

    WebClient client = new WebClient();
    string username = u;
    string password = p;
    string url = "URL";
    client.Credentials = new System.Net.NetworkCredential(username, password);
    string credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes(username + ":" + password));
    client.Headers[HttpRequestHeader.Authorization] = "Basic " + credentials;
    var result = client.DownloadString(url);
    address = result;

    I need it to read a 1 GB txt file from a URL protected by a username and password.
    Wednesday, June 19, 2019 9:50 PM

All replies

  • Hello,

    How do you know it's not doing anything? Have you monitored the traffic and responses with Fiddler or Wireshark?

    Please remember to mark the replies as answers if they help and unmark them if they provide no help; this will help others who are looking for solutions to the same or a similar problem. Contact via my Twitter (Karen Payne) or Facebook (Karen Payne) via my MSDN profile, but I will not answer coding questions on either.

    NuGet BaseConnectionLibrary for database connections.

    profile for Karen Payne on Stack Exchange

    Wednesday, June 19, 2019 10:10 PM
  • For HTTP Basic authorization, the Credentials property should do this.

    req.Credentials = new NetworkCredential("Test", "whateverhere");

    The header might not be working.
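    One caveat worth hedging on: with WebClient, the Credentials property typically sends the password only after the server replies with a 401 challenge, so if the server expects the Authorization header up front, building it by hand (as the question's code does) is the usual workaround. A minimal sketch of just that header construction, using a hypothetical helper name:

    ```csharp
    using System;
    using System.Text;

    public static class BasicAuth
    {
        // Base64-encode "user:password", the same way the question's code does.
        public static string Header(string user, string pass) =>
            "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + pass));
    }
    ```

    It would then be assigned the same way as in the question, e.g. `client.Headers[HttpRequestHeader.Authorization] = BasicAuth.Header(username, password);`.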

    Also, I suggest you use the DownloadFile method of WebClient instead.


    • Edited by HisKingdom Thursday, June 20, 2019 3:46 AM
    Thursday, June 20, 2019 3:42 AM
  • Hi Ahron321,

    Thank you for posting here.

    Based on your description, you want to read a txt file with a username and password from a URL.

    You could try the following code.

    WebClient client = new WebClient();
    String username = "USERNAME";
    String password = "PASSWORD";
    string url = "http://hello.txt";
    string path = "test.txt";
    client.Credentials = new NetworkCredential(username, password);
    client.DownloadFile(url, path);
    string result = File.ReadAllText(path);
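    One caveat with the code above: File.ReadAllText pulls the entire 1 GB file into a single string. If the goal is just to scan the text, File.ReadLines enumerates lazily instead, so only one line is in memory at a time. A minimal sketch (the sample file here is a hypothetical stand-in for the downloaded txt):

    ```csharp
    using System;
    using System.IO;

    // Hypothetical small sample file standing in for the downloaded 1 GB txt.
    string path = Path.Combine(Path.GetTempPath(), "hello-sample.txt");
    File.WriteAllLines(path, new[] { "first line", "second line" });

    // File.ReadLines enumerates lazily: only the current line is held in
    // memory, so even a very large file can be scanned this way.
    long lines = 0, chars = 0;
    foreach (string line in File.ReadLines(path))
    {
        lines++;
        chars += line.Length;
    }
    Console.WriteLine($"{lines} lines, {chars} characters (excluding newlines)");
    ```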

    Best Regards,


    MSDN Community Support
    Please remember to click "Mark as Answer" on the responses that resolved your issue, and to click "Unmark as Answer" if not. This can be beneficial to other community members reading this thread. If you have any compliments or complaints to MSDN Support, feel free to contact

    Thursday, June 20, 2019 5:53 AM
  • And what do I have in my code?

    Thursday, June 20, 2019 9:06 AM
  • I don't want it to download the file; I want it to read directly from the URL, but this code doesn't work for me.
    Thursday, June 20, 2019 9:07 AM
  • Because it won't show the result in a MessageBox or anything.
    Thursday, June 20, 2019 9:07 AM
  • You're not going to read a 1 GB string into memory like that; it doesn't even make sense. In order to read a 1 GB string, the runtime would need to find a contiguous block of memory that big, so you'll probably get OOM more often than success even in a 64-bit process. .NET strings are limited to about 2 billion characters (Length is an int), and the underlying CLR limits objects to 2 GB by default. Strings are also immutable, so every time you do anything with that string you'll get a new copy. You'll run out of memory really fast.

    var result = client.DownloadString(url);
    //Up to 2GB + a couple of characters
    result += " Test";
    //Up to 3GB + a couple of characters (ignoring GC running)
    result = result.Substring(100);

    You need to stream the data from the server into a file or something, and then you can memory-map it or stream it to actually process it. Do anything else and get used to OOM problems.
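    The streaming approach described above can be sketched with HttpClient; the URL, credentials, and file path are placeholders, and HttpCompletionOption.ResponseHeadersRead is what keeps the body from being buffered in memory:

    ```csharp
    using System;
    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class LargeDownload
    {
        // Stream the response body straight to disk; the 1 GB payload is
        // never held in memory as a single string.
        public static async Task DownloadToFileAsync(string url, string user, string pass, string path)
        {
            using var handler = new HttpClientHandler { Credentials = new NetworkCredential(user, pass) };
            using var client = new HttpClient(handler);

            // ResponseHeadersRead: return as soon as the headers arrive and
            // expose the body as a stream instead of buffering it all first.
            using var response = await client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead);
            response.EnsureSuccessStatusCode();

            using var body = await response.Content.ReadAsStreamAsync();
            using var file = File.Create(path);
            await body.CopyToAsync(file);
        }

        // Then process the file line by line; only one line is in memory at a time.
        public static long CountLines(string path)
        {
            long count = 0;
            using var reader = new StreamReader(path);
            while (reader.ReadLine() != null)
                count++;
            return count;
        }
    }
    ```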

    Michael Taylor

    Thursday, June 20, 2019 2:11 PM
  • Most of your response is spot-on, Michael, but it's not really a problem to allocate 1 GB in a 64-bit process. I just did a quick experiment, and I was able to allocate 54 GB before the process crashed ("not enough room in the page file to complete the operation"). Presumably, if I increased the size of the page file, I could get even more.

    Tim Roberts | Driver MVP Emeritus | Providenza & Boekelheide, Inc.

    Friday, June 21, 2019 11:21 PM