Add/remove rules to robots.txt programmatically

  • Question

  • User238627002 posted

    Hello,

    I would like to know if it is possible to add or remove robots.txt rules programmatically from ASP.NET. Is there any API for this? I don't need anything related to site analysis or sitemaps, just simple allow/disallow rules in robots.txt.

    I can see that opening IIS Manager, clicking 'SEO' and adding a new rule to exclude a site is quite easy, but I need a way to do that programmatically. And I would like to avoid opening the robots.txt file and parsing it myself.
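
    Just to be clear, the kind of rules I mean are the plain allow/disallow entries in robots.txt, for example (the paths here are only placeholders):

        User-agent: *
        Disallow: /private/
        Allow: /public/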

    Thanks!

    Tuesday, September 11, 2012 8:10 AM

All replies

  • User-1499466209 posted

    Hi,

    have a look at the StreamWriter class: http://msdn.microsoft.com/en-us/library/system.io.streamwriter.aspx
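
    For example, a minimal sketch along these lines could append a disallow rule to the site's robots.txt. The helper name, the "~/robots.txt" path and the rule text are only assumptions for illustration:

        using System.IO;
        using System.Web;

        public static class RobotsTxtHelper
        {
            // Appends a simple disallow rule to the site's robots.txt.
            // "~/robots.txt" is assumed to be the file in the site root.
            public static void AddDisallowRule(string userAgent, string path)
            {
                string robotsPath = HttpContext.Current.Server.MapPath("~/robots.txt");

                // true = append, so any existing rules are kept.
                using (StreamWriter writer = new StreamWriter(robotsPath, true))
                {
                    writer.WriteLine();
                    writer.WriteLine("User-agent: " + userAgent);
                    writer.WriteLine("Disallow: " + path);
                }
            }
        }

    You could then call something like RobotsTxtHelper.AddDisallowRule("*", "/private/") from your page code.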

     

    Tuesday, September 11, 2012 8:15 AM
  • User-1499466209 posted

    Hi, did it fit your needs?

    Friday, September 14, 2012 3:59 AM
  • User238627002 posted

    Hi,

    Thank you for your reply. Sorry for the late answer, I had e-mail subscriptions enabled, but the notification went to my spam folder, so I thought no one had replied :/

    Using the StreamWriter class would mean editing the file directly, which is precisely what I would like to avoid.

    Since the action can be performed through the IIS user interface, I had assumed there would be some API to perform it programmatically, but after going through the documentation and forums I'm starting to think that there is none.

    Tuesday, October 2, 2012 10:05 AM