C# HttpClient: how to avoid the "CSRF verification failed. Request aborted" error

  • Question

  • Hello:

    I want to use HttpClient to post some data to a web site.

    First, I did this by hand: I launched the Chrome browser, visited the web site, clicked on the web form, filled it in, and clicked the "submit" button; this always works.

    Second, I set up an HTTP proxy using Fiddler version 5.0, so all the HTTPS traffic went through the proxy.

    I could see the HTTPS traffic when posting the web form, and I recorded all the information; then I tried to create an HTTP client to post exactly the same data.

    Here is my code:

    public static async Task<string> Place_Order1(string encoded_body1)
    {
        HttpClientHandler handler = new HttpClientHandler
        {
            AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip
        };
        HttpClient client = new HttpClient(handler);
        ServicePointManager.Expect100Continue = false;
        try
        {
            string place_order_url1 = "https://dot.com/betslip/";
            // Set the body together with its Content-Type; "Content-Type" is a
            // content header and cannot be added to DefaultRequestHeaders.
            StringContent http_content = new StringContent(encoded_body1,
                Encoding.UTF8, "application/x-www-form-urlencoded");
            client.DefaultRequestHeaders.Add("Accept", "*/*");
            client.DefaultRequestHeaders.Add("Accept-Encoding", "gzip, deflate, br");
            client.DefaultRequestHeaders.Add("Connection", "keep-alive");
            client.DefaultRequestHeaders.AcceptLanguage.TryParseAdd("en-gb;q=0.8");
            client.DefaultRequestHeaders.Add("authority", "dot.com");
            client.DefaultRequestHeaders.Add("origin", "https://dot.com");
            client.DefaultRequestHeaders.Add("Referer", "https://dot.com");
            client.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3835.0 Safari/537.36");
            client.DefaultRequestHeaders.TryAddWithoutValidation("Cookie", DotCom_Cookies);
            HttpResponseMessage reply1 = await client.PostAsync(place_order_url1, http_content);
            return await reply1.Content.ReadAsStringAsync();
        }
        catch (HttpRequestException ex)
        {
            Console.WriteLine("[Place_Order1] Exception: {0}", ex.Message);
            return null;
        }
    }
    

    The DotCom_Cookies string looks like the following:

    csrftoken=gWmnlHZSxxwJvLSXPUFygqVAszlNNf1DikwgIxIIF9TY5vAHwQHYtJHeONotUp; _gid=GA1.2.148678906.1565117481; _fbp=fb.1.1565117480950.1449659034; _gat_UA-41965734-1=1; sessionid=uqn8oyxqtyyws2ciizfc4x8rp5sxw42x; _ga=GA1.2.2133559866.1565117481; __cfduid=d4f5da4642b9bea9fcda2fc38bb33cca01565117479

    The post data payload is something like the following:

    csrfmiddlewaretoken=XYZ&side=0&price=1.10&market=123&runner=456&type=1&price_formatted=1.10&amount=1.00

    I can see in the Fiddler proxy that all the data my HTTP client sends is exactly the same as what appears in the proxy for the browser, but when I run my program and post the data, I get the following reply:

    <!DOCTYPE html>

    <html lang="en">

    <head>

    <meta http-equiv="content-type" content="text/html; charset=utf-8">

    <meta name="robots" content="NONE,NOARCHIVE">

    <title>403 Forbidden</title>

    <style type="text/css">

    html * { padding:0; margin:0; }

    body * { padding:10px 20px; }

    body * * { padding:0; }

    body { font:small sans-serif; background:#eee; color:#000; }

    body>div { border-bottom:1px solid #ddd; }

    h1 { font-weight:normal; margin-bottom:.4em; }

    h1 span { font-size:60%; color:#666; font-weight:normal; }

    #info { background:#f6f6f6; }

    #info ul { margin: 0.5em 4em; }

    #info p, #summary p { padding-top:10px; }

    #summary { background: #ffc; }

    #explanation { background:#eee; border-bottom: 0px none; }

    </style>

    </head>

    <body>

    <div id="summary">

    <h1>Forbidden <span>(403)</span></h1>

    <p>CSRF verification failed. Request aborted.</p>

    </div>

    <div id="explanation">

    <p><small>More information is available with DEBUG=True.</small></p>

    </div>

    </body>

    </html>

    I checked again and again to make sure the CSRF token values were the same in the Fiddler proxy and in my HTTP client. It seems the web site detected that I am not a human posting the form data, but why?

    I have included the __cfduid cookie in the http client headers.

    In this example, it is like this:

    csrftoken=gWmnlHZSxxwJvLSXPUFygqVAszlNNf1DikwgIxIIF9TY5vAHwQHYtJHeONotUp

    But why do I get this kind of error?

    A minor issue: I can't add those cookies as name/value pairs; if I do, I can't even use the HTTP client to GET the form page. But if I put all the cookies in one string and add them together, as my code does, it works; at least I can use the HttpClient to get the form in order to post the data.

    By the way, when I used the web browser to fill in and post the form, it seems the browser used an XHR request; but I think HttpClient should be able to do the same as an XHR request, right?

    If not, then how can I post the same data to the web server using .NET (C#)?

    Finally, I am using Visual Studio 2019 (Version 16.2.0) on Windows 10 (Version 1903).

    Best Regards,

    Tuesday, August 6, 2019 7:24 PM

Answers

  • In terms of CSRF, the cookie and form token are generated per-request, not per-page. If you refresh a page over and over again (ignoring browser caching) a new token should be generated. This is partially what makes CSRF effective. When CSRF is enabled the server compares the cookie with the generated token in the form. If they don't match then an attack has occurred. Since CSRF assumes that same-site origin options are set, a hacker cannot read the cookie to get the token. The token is completely random and generally involves time or something that changes on each request. This makes it next to impossible to calculate.

    Each time you send a request to the remote server you need to send the "current" cookie and value from the form that was sent during your last request. Otherwise CSRF will block the call.
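    In HttpClient terms, that means re-fetching the page with the same client (so its cookie container updates) immediately before each POST. A hedged sketch, assuming the client was created with a CookieContainer-backed handler and that the URL and field names are the placeholders used in this thread:

    ```csharp
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Text.RegularExpressions;
    using System.Threading.Tasks;

    class FreshTokenPerPost
    {
        // Re-fetch the page right before every POST so the csrftoken cookie and
        // the csrfmiddlewaretoken form field are the current, matching pair.
        public static async Task<HttpResponseMessage> PostWithFreshTokenAsync(
            HttpClient client, string url, IDictionary<string, string> orderFields)
        {
            // The GET rotates the cookie and the form token; the handler's
            // CookieContainer behind 'client' picks up the new cookie.
            string html = await client.GetStringAsync(url);
            Match m = Regex.Match(html,
                "name=[\"']csrfmiddlewaretoken[\"']\\s+value=[\"']([^\"']+)[\"']");

            var fields = new Dictionary<string, string>(orderFields)
            {
                ["csrfmiddlewaretoken"] = m.Groups[1].Value  // pair of the current cookie
            };
            return await client.PostAsync(url, new FormUrlEncodedContent(fields));
        }
    }
    ```

    The key point is that the GET and the POST share one HttpClient, so the rotated cookie and the freshly parsed token always travel together.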


    Michael Taylor http://www.michaeltaylorp3.net

    • Marked as answer by zydjohn Sunday, September 8, 2019 5:03 PM
    Thursday, August 8, 2019 8:21 PM
    Moderator

All replies

  • Hi zydjohn,

    Thank you for posting here.

    A CSRF token is used to prevent CSRF attacks.

    If you want to use HttpClient to send the request, you should follow the steps below:
    1. Use HttpClient to send a GET request to the server and read the response in C#.
    2. Get the cookie from the response.
    3. Set that cookie in the cookie container used for the POST request.
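    Those steps can be sketched roughly like this; the URL and form-field names are placeholders from this thread, and parsing the form token out of the HTML is left as a stub:

    ```csharp
    using System.Collections.Generic;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    class CsrfGetThenPost
    {
        public static async Task<string> RunAsync()
        {
            // Steps 1 + 2: a shared CookieContainer captures the Set-Cookie
            // headers from the GET automatically.
            var cookies = new CookieContainer();
            var handler = new HttpClientHandler
            {
                CookieContainer = cookies,
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };
            using (var client = new HttpClient(handler))
            {
                string url = "https://dot.com/betslip/";         // placeholder URL
                string html = await client.GetStringAsync(url);  // server sets csrftoken/sessionid

                // Step 3: the POST goes through the same handler, so the cookies
                // captured above are sent back automatically; the form token still
                // has to be parsed out of 'html' and added to the body.
                var form = new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["csrfmiddlewaretoken"] = "token parsed from html",  // placeholder
                    ["amount"] = "1.00"
                });
                HttpResponseMessage reply = await client.PostAsync(url, form);
                return await reply.Content.ReadAsStringAsync();
            }
        }
    }
    ```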

    Best Regards,

    Jack


    MSDN Community Support
    Please remember to click "Mark as Answer" the responses that resolved your issue, and to click "Unmark as Answer" if not. This can be beneficial to other community members reading this thread. If you have any compliments or complaints to MSDN Support, feel free to contact MSDNFSF@microsoft.com.

    Wednesday, August 7, 2019 8:13 AM
    Moderator
  • Hello:

    Thanks for your reply, but I did what you said here. I checked the data in my code and compared it with the information I see in the proxy; I can't see any difference. But when I use the HTTP client to post the form, I always get the "CSRF verification failed" error, whereas when I submit the form via the web browser, it always works.

    Any more ideas why I got the error?

    Wednesday, August 7, 2019 1:33 PM
  • Jack's answer is correct. CSRF tries to prevent exactly what you're trying to do. The cookies and form data contained in the response to the initial GET must be included in subsequent calls; otherwise it looks like an attack. In your posted code I don't see you passing any cookies to the HttpClient. Please post the updated code where you get the original response from the server, set up the HttpClient to use the new cookie and any form data provided, and then send the request back to the server.

    Michael Taylor http://www.michaeltaylorp3.net

    Wednesday, August 7, 2019 1:53 PM
    Moderator
  • Hello:

    I added one line to my code:

    client.DefaultRequestHeaders.Add("X-Requested-With", "XMLHttpRequest");

    Then I tried again with my code; in debug mode, I can see the following result:

    {StatusCode: 403, ReasonPhrase: 'Forbidden', Version: 1.1, Content: System.Net.Http.HttpConnection+HttpConnectionResponseContent, Headers:
    {
      Date: Wed, 07 Aug 2019 13:49:51 GMT
      Transfer-Encoding: chunked
      Connection: keep-alive
      X-Frame-Options: SAMEORIGIN
      Vary: Cookie
      Set-Cookie: sessionid=zwr3m5otkpcu39i28y9x6d9c52itu829; expires=Wed, 21 Aug 2019 13:49:51 GMT; HttpOnly; Max-Age=1209600; Path=/; SameSite=Lax
      Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
      Server: cloudflare
      CF-RAY: 5029b75c68d6cc56-ZRH
      Content-Type: text/html
      Content-Encoding: br
    }}

    Then I used the REST client Insomnia version 6.6.0 to do another similar test, and I got the following result:

    * Preparing request to https://dot.com/betslip/
    * Using libcurl/7.57.0-DEV OpenSSL/1.0.2o zlib/1.2.11 libssh2/1.7.0_DEV
    * Current time is 2019-08-07T14:08:22.525Z
    * Disable timeout
    * Enable automatic URL encoding
    * Enable SSL validation
    * Enable cookie sending with jar of 14 cookies
    * Hostname in DNS cache was stale, zapped
    *   Trying 104.28.27.236...
    * TCP_NODELAY set
    * Connected to dot.com (104.28.27.236) port 443 (#3)
    * ALPN, offering http/1.1
    * Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:@STRENGTH
    * successfully set certificate verify locations:
    *   CAfile: C:\Users\John\AppData\Local\Temp\insomnia_6.6.0\2017-09-20.pem
    *   CApath: none
    * TLSv1.2 (OUT), TLS header, Certificate Status (22):
    * TLSv1.2 (OUT), TLS handshake, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Server hello (2):
    * TLSv1.2 (IN), TLS handshake, Certificate (11):
    * TLSv1.2 (IN), TLS handshake, Server key exchange (12):
    * TLSv1.2 (IN), TLS handshake, Server finished (14):
    * TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
    * TLSv1.2 (OUT), TLS change cipher, Client hello (1):
    * TLSv1.2 (OUT), TLS handshake, Finished (20):
    * TLSv1.2 (IN), TLS change cipher, Client hello (1):
    * TLSv1.2 (IN), TLS handshake, Finished (20):
    * SSL connection using TLSv1.2 / ECDHE-ECDSA-AES128-GCM-SHA256
    * ALPN, server accepted to use http/1.1
    * Server certificate:
    *  subject: OU=Domain Control Validated; OU=PositiveSSL Multi-Domain; CN=sni155362.cloudflaressl.com
    *  start date: Aug  7 00:00:00 2019 GMT
    *  expire date: Feb 13 23:59:59 2020 GMT
    *  subjectAltName: host "dot.com" matched cert's "dot.com"
    *  issuer: C=GB; ST=Greater Manchester; L=Salford; O=COMODO CA Limited; CN=COMODO ECC Domain Validation Secure Server CA 2
    *  SSL certificate verify ok.

    > POST /market/betslip/ HTTP/1.1
    > Host: dot.com
    > Accept: */*
    > Body: text
    > Content-Type: application/x-www-form-urlencoded
    > Cookie: csrftoken=zyTuTmqehUnZDYsydHdnGnW4LpzXIRgYR4e1T9i7e6Zj7LKw5fiBM7NMURIfBdMk; _gid=GA1.2.613971234.1565185681; _gat_UA-41965734-1=1; _fbp=fb.1.1565185680651.226638747; sessionid=zwr3m5otkpcu39i28y9x6d9c52itu829; _ga=GA1.2.72152516.1565185681; __cfduid=d50cc0168d3b25cc191041a7b6cf3f3a11565185677; _hjIncludedInSample=1
    > Accept-Encoding: gzip, deflate
    > Accept-Language: en-gb; q=0.8
    > authority: dot.com
    > origin: https://dot.com
    > Referer: https://dot.com/markets/live/?sort=close_date
    > User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3835.0 Safari/537.36
    > X-Requested-With: XMLHttpRequest
    > Content-Length: 174

    | csrfmiddlewaretoken=TDmabynrGtFEhntOkHBY8bGSlTZa96W5b9HHblfkDFhYLaLMcfGceVxAul8s2ssr&side=0&price=1.01&market=123&runner=456&type=2&price_formatted=1.01&amount=1.00

    * upload completely sent off: 174 out of 174 bytes

    < HTTP/1.1 403 Forbidden
    < Date: Wed, 07 Aug 2019 14:08:21 GMT
    < Content-Type: text/html; charset=utf-8
    < Transfer-Encoding: chunked
    < Connection: keep-alive
    < X-Frame-Options: SAMEORIGIN
    < Vary: Cookie

    * cookie size: name/val 9 + 32 bytes
    * cookie size: name/val 7 + 29 bytes
    * cookie size: name/val 8 + 0 bytes
    * cookie size: name/val 7 + 7 bytes
    * cookie size: name/val 4 + 1 bytes
    * cookie size: name/val 8 + 3 bytes

    < Set-Cookie: sessionid=ra32kx24jmzzngxdlely2ezhylsa9di4; expires=Wed, 21 Aug 2019 14:08:21 GMT; HttpOnly; Max-Age=1209600; Path=/; SameSite=Lax
    < Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
    < Server: cloudflare
    < CF-RAY: 5029d2740aa93e5a-ZRH
    < Content-Encoding: gzip


    * Received 26 B chunk
    * Received 5 B chunk
    * Connection #3 to host dot.com left intact
    * Saved 1 cookie

    This time, I didn't get the "CSRF verification failed" error, but it still did not work. I really think there is some issue with trying to use HttpClient to simulate an XHR request.

    I want to know: what is the critical difference between an HttpClient request and an XHR request for AJAX?

    Wednesday, August 7, 2019 2:26 PM
  • I don't understand; both requests you posted generated a 403, which is a failure. The header you added doesn't change anything.

    You need to pass the appropriate form data (where session information is stored) and cookies so the remote server can validate your identity and verify the session data. Without that you'll get errors. 403 tends to indicate that validation was successful but you didn't have permission to do something but it depends on the remote server.


    Michael Taylor http://www.michaeltaylorp3.net

    Wednesday, August 7, 2019 2:56 PM
    Moderator
  • Hello:

    The cookie is like this: (the exact value):

    Cookie: csrftoken=zyTuTmqehUnZDYsydHdnGnW4LpzXIRgYR4e1T9i7e6Zj7LKw5fiBM7NMURIfBdMk; _gid=GA1.2.613971234.1565185681; _gat_UA-41965734-1=1; _fbp=fb.1.1565185680651.226638747; sessionid=zwr3m5otkpcu39i28y9x6d9c52itu829; _ga=GA1.2.72152516.1565185681; __cfduid=d50cc0168d3b25cc191041a7b6cf3f3a11565185677; _hjIncludedInSample=1

    I changed the header, and then I got different errors: the first error was "CSRF verification failed";

    the second was "403 Forbidden".

    The first error was misleading: when I compare the data I posted via the HTTP client with the data the web browser sent (I saw them both in the Fiddler proxy), the CSRF token values were the same. Submitting the form via the web browser always works, but via the HTTP client it always fails. I don't believe the remote server can detect whether the data came from a program or from a human clicking the button in the web form. But some people on the internet doubt that HttpClient can really submit the same form data as an XHR request made by JavaScript.

    Any more suggestions?

    Wednesday, August 7, 2019 6:43 PM
  • Again, nowhere in the code you posted are you actually attaching a cookie to the request. If the cookie is required then your request will fail.

    How to add a cookie to HttpClient

    Note that it seems a little odd to me that CSRF would be sent in the cookie for the web request. Generally it is part of the form data and encoded. In your posted code you're just taking the body as a parameter and passing it on so the form data that is required isn't there. You would technically need to add the form data required by the server (session ID, token, etc.) to the body that you send to the remote server. I suspect this is really where things are going wrong.
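    For reference, a cookie can also be attached to a single request rather than via DefaultRequestHeaders; note that HttpClientHandler.UseCookies must be set to false, or a manually written Cookie header is ignored. A sketch with placeholder values:

    ```csharp
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class ManualCookieSketch
    {
        public static async Task<string> SendAsync(string cookieHeader, string body)
        {
            // UseCookies = false tells the handler not to manage cookies itself,
            // which is what allows the hand-written Cookie header to go out.
            var handler = new HttpClientHandler { UseCookies = false };
            using (var client = new HttpClient(handler))
            {
                var request = new HttpRequestMessage(HttpMethod.Post,
                    "https://dot.com/betslip/");                       // placeholder URL
                request.Headers.TryAddWithoutValidation("Cookie", cookieHeader);
                request.Content = new StringContent(body, Encoding.UTF8,
                    "application/x-www-form-urlencoded");
                HttpResponseMessage response = await client.SendAsync(request);
                return await response.Content.ReadAsStringAsync();
            }
        }
    }
    ```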


    Michael Taylor http://www.michaeltaylorp3.net

    Wednesday, August 7, 2019 8:05 PM
    Moderator
  • Hello:

    I tried to add the cookies via a CookieContainer, but when I did so, I couldn't even use an HTTP GET to download the HTML containing the form data. I found that if I put all the cookie name/value pairs in one long string, as shown in my code, it can download the HTML with the form data.

    The form data to post to the server is exactly like this:

    csrfmiddlewaretoken=TDmabynrGtFEhntOkHBY8bGSlTZa96W5b9HHblfkDFhYLaLMcfGceVxAul8s2ssr&side=0&price=1.01&market=123&runner=456&type=2&price_formatted=1.01&amount=1.00

    The only part that changes is the csrfmiddlewaretoken for each request; I have to extract it from a hidden input field in the HTML DOM. The other form data stay almost the same all the time. The session ID and CSRF token are in the cookies in this case; the only session-related value in the form data is the csrfmiddlewaretoken.
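    Extracting that hidden field can be done with a small regex. A sketch, assuming the attribute order name=...value=..., which is how Django renders the tag (a real HTML parser would be more robust):

    ```csharp
    using System;
    using System.Text.RegularExpressions;

    class TokenExtractor
    {
        // Pulls the value attribute of the csrfmiddlewaretoken hidden input, i.e.
        // <input type="hidden" name="csrfmiddlewaretoken" value="...">
        public static string ExtractCsrfToken(string html)
        {
            Match m = Regex.Match(html,
                "name=[\"']csrfmiddlewaretoken[\"']\\s+value=[\"']([^\"']+)[\"']");
            return m.Success ? m.Groups[1].Value : null;
        }

        static void Main()
        {
            string html =
                "<input type=\"hidden\" name=\"csrfmiddlewaretoken\" value=\"abc123\">";
            Console.WriteLine(ExtractCsrfToken(html)); // prints abc123
        }
    }
    ```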

    But I will try to use a CookieContainer to save the cookies again. However, would it be better to use XMLHttpRequest instead of HttpClient for this job? I didn't find good examples of using XMLHttpRequest from C#; C# prefers HttpClient.

    Thursday, August 8, 2019 6:50 AM
  • It makes sense to me that the CSRF is in the form data. It does have to change for each request (hence how it works) and updating a cookie on each request would be odd to say the least. So I don't think cookies (which is where this discussion started) is actually the issue.

    My gut instinct is that your request body isn't correct and is therefore failing the call. Can you post an example of what your body variable contains? It would also be useful to see how you're building it up. Note that it has to be in form-urlencoded format; otherwise it won't work.
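    FormUrlEncodedContent is one way to guarantee the form-urlencoded format: it percent-encodes every value and sets the Content-Type header itself. A sketch using the field names from this thread; the client, URL, and token are passed in as assumptions:

    ```csharp
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    class FormBodySketch
    {
        public static Task<HttpResponseMessage> PostOrderAsync(
            HttpClient client, string url, string csrfToken)
        {
            // Each pair is encoded and joined as key=value&key=value automatically,
            // and Content-Type becomes application/x-www-form-urlencoded.
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["csrfmiddlewaretoken"] = csrfToken,
                ["side"] = "0",
                ["price"] = "1.01",
                ["market"] = "123",
                ["runner"] = "456",
                ["type"] = "2",
                ["price_formatted"] = "1.01",
                ["amount"] = "1.00"
            });
            return client.PostAsync(url, form);
        }
    }
    ```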

    As for XmlHttpRequest, that has nothing to do with this. XmlHttpRequest is the client-side HTTP component that sends requests back to the server when you're making calls from JavaScript. The only reason it adds a header is so that the server can detect a client-side call. In most apps this is irrelevant, but in the early days of ASP.NET when AJAX was new we could detect a "callback" by looking for this header, vs. a "postback" which was a standard POST. A standard POST is how HttpClient works and is standard for all client-server interactions nowadays. A site would have to explicitly care about the XHR header to notice the difference, and I cannot imagine why anyone would do that. Furthermore, it would break any page that is using a form, because forms POST back as well. So the XHR header shouldn't matter here.


    Michael Taylor http://www.michaeltaylorp3.net

    Thursday, August 8, 2019 1:44 PM
    Moderator
  • Hello:

    Thanks for your reply. I spent hours searching around and found some issues. The web site often has many pages (10 to 20), and each page contains a different CSRF form token; yet the site seems to use the same cookie for the duration of the login session, so the cookie does not change at all. But the CSRF token is different on each page, and my program used the CSRF token from the first page. My program works as long as the web site has only one page, which happens only occasionally; most of the time the web site has many pages, so my program always fails. I will do more tests to find a solution.

    Thanks for your advice!

    Thursday, August 8, 2019 7:44 PM
  • In terms of CSRF, the cookie and form token are generated per-request, not per-page. If you refresh a page over and over again (ignoring browser caching) a new token should be generated. This is partially what makes CSRF effective. When CSRF is enabled the server compares the cookie with the generated token in the form. If they don't match then an attack has occurred. Since CSRF assumes that same-site origin options are set, a hacker cannot read the cookie to get the token. The token is completely random and generally involves time or something that changes on each request. This makes it next to impossible to calculate.

    Each time you send a request to the remote server you need to send the "current" cookie and value from the form that was sent during your last request. Otherwise CSRF will block the call.


    Michael Taylor http://www.michaeltaylorp3.net

    • Marked as answer by zydjohn Sunday, September 8, 2019 5:03 PM
    Thursday, August 8, 2019 8:21 PM
    Moderator