Using SEO Toolkit on multi-server farm

  • Question

  • User-1015242273 posted

    Is it possible to configure either IIS or the SEO Toolkit for use in a multi-server, load-balanced farm environment? As far as I can see, the toolkit generates its files on a single web server instance; in my scenario I would need those files replicated across all the servers in the farm. This could, of course, be done with a file copy process, but that seems a bit 20th century. :)

    Thursday, October 7, 2010 9:08 AM

All replies

  • User-47214744 posted

    I'm not sure I follow; could you provide more context on what you are trying to achieve and describe the scenario?

    The SEO Toolkit operates as a web crawler (a client), just "walking" all the links it finds and storing the results on the client machine. Is the intent to crawl each individual server in the farm? Or are you trying to share the results so they can be viewed from several machines? Or are you trying to distribute the crawling over multiple machines to accelerate the process?

    Thursday, October 7, 2010 12:59 PM
  • User-1015242273 posted

    We have a load-balanced server farm of four servers, for the standard failover reasons. Each server is a mirror of the others; they all run IIS, with MOSS as the content management system.

    We can run the SEO Toolkit on one server, and that will create robots.txt and sitemap.xml files on that specific server. However, a request for either file from the outside world could go to any of the four servers. We could put a rule in the reverse proxy that sends requests for those specific files to one particular server, but if that server happens to be down for any reason the request will fail.

    So what we need is to have robots.txt and sitemap.xml on every server in the farm, mirroring each other. The only ways I can see of doing this are to run the SEO Toolkit on each server in the farm, or to copy the files between servers.

    Friday, October 8, 2010 6:23 AM
  • User-47214744 posted

    I would recommend just copying both files (robots.txt and sitemap.xml) to all servers. You should only need to generate them once, since the crawling process probably takes some time, and since the servers are just "mirrors", the same files should work just fine.
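
    For example, a scheduled copy of just those two files from the server you crawl on to the other farm members could be as simple as something like the following (the server names and paths here are only placeholders for your environment):

        rem Copy the two generated files from this server to each of the other farm members.
        rem WEB02-WEB04 and the admin-share paths below are placeholders; adjust to your setup.
        robocopy C:\inetpub\wwwroot \\WEB02\c$\inetpub\wwwroot robots.txt sitemap.xml
        robocopy C:\inetpub\wwwroot \\WEB03\c$\inetpub\wwwroot robots.txt sitemap.xml
        robocopy C:\inetpub\wwwroot \\WEB04\c$\inetpub\wwwroot robots.txt sitemap.xml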

    Are there any challenges you foresee with that? You could use a tool like Web Deploy to synchronize the content of the entire web site if needed.
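
    In case it helps, a Web Deploy sync of the site content from one farm member to another looks roughly like this (just a sketch; it assumes Web Deploy is installed on both machines, and the path and computer name are placeholders):

        rem Push the site content from this server to WEB02; repeat per server or script it in a loop.
        msdeploy.exe -verb:sync -source:contentPath="C:\inetpub\wwwroot" -dest:contentPath="C:\inetpub\wwwroot",computerName=WEB02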

    Friday, October 8, 2010 2:56 PM