Remote Logging and Web Farms

  • Question

  • User-1258788955 posted

    Hi All,

    In a web farm where a given website is served by multiple identical, load-balanced IIS7 servers, can these servers log the website's W3C log information to a single log file on a UNC share, or will the servers be fighting with each other over file locks?

    Tuesday, June 3, 2008 5:35 AM

Answers

  • User797879709 posted

    The Hosting Guidance document might be able to give you more information:
    http://learn.iis.net/page.aspx/31/shared-hosting-roadmap/

    There is a logging section in the configuration.  You can set that directory to just about anything, including a UNC share.  If you set the ACLs on that share to allow each web server's worker process identity to write, then the servers can write to their own log file locations.  All of the servers in the web farm won't be able to write to the same file, but this way you can consolidate your logs onto a single log server.
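
    A hedged sketch of setting that with appcmd - the \\logserver\logs path and the web01 subfolder are hypothetical, and each server would point at its own subfolder:

        %windir%\system32\inetsrv\appcmd set config -section:system.applicationHost/sites /siteDefaults.logFile.directory:"\\logserver\logs\web01"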

    Tuesday, June 3, 2008 11:54 AM

All replies

  • User-1258788955 posted

    Thanks rlucero, that was a very useful link. For example, the FAQ there states:

    Can multiple front ends write to the same log file on a share?

    No, because there is no serialization service that would serialize file access between different writers.  The utility Log Parser supports merging log files and potentially supports inserting them into SQL.

    Note: Writing to separate logfiles can also help you determine what requests are going to a particular server in the farm. This can be helpful if you are seeing sporadic errors and need to track down what server they occurred on.

    So I guess that is a definitive answer!
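
    As an aside, the merge that the FAQ alludes to can be done in a single Log Parser command. A sketch, assuming hypothetical \\web1 and \\web2 log shares:

        logparser "SELECT * INTO merged.log FROM \\web1\logs\W3SVC1\*.log, \\web2\logs\W3SVC1\*.log ORDER BY date, time" -i:IISW3C -o:W3C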

    Incidentally, I thought that when writing log files to UNC shares it was HTTP.sys doing the writing, and that it would do so as the computer account rather than the worker process identity?

    Tuesday, June 3, 2008 12:13 PM
  • User-1258788955 posted

    Hang on though - if the servers in the web farm are using shared configuration (i.e. applicationHost.config on a UNC share), then I can't configure a log file location on a UNC share, because all of the servers will read the same setting and try to log to the same file, won't they?

    What would actually happen if the servers tried to log to the same file? Would they fail outright due to locking, or just be delayed waiting for locks to release? I guess there's no point trying if you've already said it won't work :-(

    Unfortunately I need a single consolidated log file incorporating data from all the servers at once, in near-realtime. I guess I'm going to have to specify a local disk log file location and use a service to poll-and-rewrite-with-consolidation. Yuk.
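
    One possible escape hatch for the shared-configuration clash, sketched as an assumption rather than a tested fix: IIS expands environment variables in the log file directory, so even a single shared applicationHost.config can yield a per-server path. The \\logserver\logs share is hypothetical:

        <system.applicationHost>
          <sites>
            <siteDefaults>
              <!-- %COMPUTERNAME% expands differently on each farm member,
                   so every server writes to its own subfolder of the share -->
              <logFile directory="\\logserver\logs\%COMPUTERNAME%" />
            </siteDefaults>
          </sites>
        </system.applicationHost>

    This would still leave the logs split per-server, so it addresses the shared-configuration clash rather than the consolidation requirement.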

    Tuesday, June 3, 2008 12:22 PM
  • User797879709 posted

    Yeah, if Log Parser can merge the logs together, that might be a decent workaround.

    What you'll have to do is set up a nightly job to export your logs to a UNC share, and then when you need to analyze a site's information you can use Log Parser to do that.  Scripting this shouldn't be too difficult.
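
    A minimal sketch of such a nightly job, assuming the hypothetical \\logserver\logs share; robocopy's /XO switch skips files that haven't changed since the last run:

        rem exportlogs.cmd - copy new or changed IIS logs to a per-server folder on the share
        robocopy C:\inetpub\logs\LogFiles \\logserver\logs\%COMPUTERNAME% /E /XO

        rem Register the script to run nightly at 1:00 AM
        schtasks /create /tn "ExportIISLogs" /tr "C:\scripts\exportlogs.cmd" /sc daily /st 01:00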

    Tuesday, June 3, 2008 2:30 PM
  • User-1341446551 posted

    Something that might not necessarily be related to the present scenario, but I was wondering: what would happen if the UNC path became unreachable at some point due to a network problem?

    What would be the fate of IIS at that point?

    Tuesday, June 3, 2008 4:24 PM
  • User797879709 posted

    For saving the log file or for accessing the config?  There are two very different responses for these scenarios.

    Tuesday, June 3, 2008 4:43 PM
  • User-1341446551 posted

    For saving the log file.

    Thanks for bringing up config access, though. If that were down, it would affect IIS badly, I assume...

    Monday, June 9, 2008 5:53 PM
  • User1073881637 posted

    You can't have multiple servers log to a single file.  On each server, HTTP.sys keeps its log file open for writing, so two servers can't share one file.  The suggestion of using Log Parser to take entries from all the files and put them into a single file or a database is the closest thing you'll get.

    Monday, June 9, 2008 11:31 PM
  • User-2064283741 posted

    I would log everything to a database. You can simply script Log Parser to load the logs into it.

    Once the data is in a database, sorting by which server in the farm, which website, etc. becomes so much easier.

    It is a lot more useful than flat text logs, even with the Log Parser tool, especially when the logs are spread over multiple servers and multiple sites in a farm.
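
    A sketch of that load step, assuming a hypothetical SQL Server instance sqlbox and database WebLogs; -createTable:ON has Log Parser create the table from the log fields on the first run:

        logparser "SELECT * INTO IISLogs FROM \\web1\logs\W3SVC1\*.log" -i:IISW3C -o:SQL -server:sqlbox -database:WebLogs -createTable:ON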

    Tuesday, June 10, 2008 4:19 AM
  • User-1258788955 posted

    Thanks for all of these suggestions. 

    The problem with logging to a database is that it has to be done from a module (either native or managed) in the pipeline, and such a module has no access to the total bytes sent to the client - a crucial piece of information that I want to use to track bandwidth use.

    Also I can't parse my log files overnight to extract this information as I need it available in near-realtime.

    I'm really surprised I can't get this from a module, but it seems I can't, so I'm having to create a Windows service to poll the live log files and parse them for live bytes-sent information. Unless anyone can help me with a better idea?
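
    A minimal sketch of that polling service in Python, assuming W3C logs with the sc-bytes field enabled (it is not in IIS7's default W3C field selection, so it has to be turned on in the logging configuration). All paths and names here are hypothetical, and a production version would also need to handle partially written lines and daily log rollover:

        import time

        LOG_PATH = r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex080610.log"  # hypothetical log file
        POLL_SECONDS = 10

        def poll_bytes_sent(path):
            """Tail a W3C log file, yielding sc-bytes for each new request line."""
            offset = 0        # byte position read so far
            sc_index = None   # column of sc-bytes, taken from the #Fields header
            while True:
                with open(path, "r") as f:
                    f.seek(offset)
                    while True:
                        line = f.readline()
                        if not line:
                            break
                        if line.startswith("#Fields:"):
                            # e.g. "#Fields: date time s-ip cs-method ... sc-bytes ..."
                            fields = line.split()[1:]
                            sc_index = fields.index("sc-bytes") if "sc-bytes" in fields else None
                        elif not line.startswith("#") and sc_index is not None:
                            parts = line.split()
                            if len(parts) > sc_index and parts[sc_index].isdigit():
                                yield int(parts[sc_index])
                    offset = f.tell()
                time.sleep(POLL_SECONDS)

        total = 0
        for sent in poll_bytes_sent(LOG_PATH):
            total += sent
            print("total bytes sent:", total)

    Note that HTTP.sys buffers log writes before flushing them to disk, so "near-realtime" here still lags by the flush interval.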

    Tuesday, June 10, 2008 4:24 AM
  • User-2064283741 posted

    One of the places I worked for was an online banking firm. We parsed the logs overnight into a big database, and had scripts running against the current day's flat-file log via Log Parser.

    The Log Parser scripts ran every 10 minutes to keep the info up to date, combining all the different servers and sites. It depends what time frequency you want for this; I found 10 minutes or so fine for most needs for the 'live' log data. You may need more frequency, but be aware of the hit this will give.

    So there was a two-tier system: 'today's' logs, handled via Log Parser and live checking, and the legacy dump of all the previous days' logs into a database. I find this the best way of managing IIS logs: the historical data you need in a nice database, and the 'live' stuff for today's data in Log Parser.

    Looking after loads of data effectively requires a database; I wouldn't recommend Log Parser for 'loads' of data or for 'live' checking. It all depends on the amount of data you are dealing with.
    Tuesday, June 10, 2008 5:21 AM
  • User1073881637 posted

    One additional thing about Log Parser: it can do a 'checkpoint' when running, so it doesn't have to rescan the logs each time.  You have to configure your query to use the checkpoint option, but it'll help speed up the process of querying logs.
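
    A sketch of that option in practice; -iCheckPoint stores the last-read position in a checkpoint file, so each run only parses entries added since the previous one (file names hypothetical):

        logparser "SELECT * INTO IISLogs FROM \\web1\logs\W3SVC1\*.log" -i:IISW3C -iCheckPoint:web1.lpc -o:SQL -server:sqlbox -database:WebLogs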

    Tuesday, June 10, 2008 5:53 AM