IIS // Robots.txt // libwww-perl

  • Question

  • User-1826049516 posted

    Hey,

    I'm trying to block libwww-perl, as recommended by an SEO check. I followed this article http://forums.asp.net/t/1970637.aspx?I+want+to+block+libwww+perl but it doesn't seem to work for me:

    User-agent: libwww-perl
    Disallow: /
    
    User-agent: *
    Disallow: /_archive/
    Disallow: /App_Code/
    Disallow: /webservices/
    Sitemap: http://www.domainname.com/sitemap.xml
    Host: www.domainname.com

    What am I doing wrong? Is the user agent name case-sensitive (libwww-perl vs. Libwww-perl)? Do the user-agent groups need to go in a specific order?
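
    For what it's worth, the user-agent token in robots.txt is matched case-insensitively by the major parsers, and group order should not matter. One way to sanity-check the syntax above is Python's `urllib.robotparser` (a sketch using a trimmed copy of the rules from the question):

```python
from urllib.robotparser import RobotFileParser

# The relevant groups from the robots.txt in the question
rules = """\
User-agent: libwww-perl
Disallow: /

User-agent: *
Disallow: /_archive/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())
rp.modified()  # mark the rules as loaded so can_fetch() consults them

# libwww-perl group applies: everything disallowed
print(rp.can_fetch("libwww-perl", "/index.html"))
# wildcard group applies to other agents: only /_archive/ is disallowed
print(rp.can_fetch("Mozilla/5.0", "/index.html"))
print(rp.can_fetch("Mozilla/5.0", "/_archive/page.html"))
```

    If this prints False, True, False, the file itself is fine, which suggests the problem is not the robots.txt syntax.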

    Thanks

    Thursday, February 19, 2015 10:11 AM

Answers

  • User1550517537 posted

    Hi ldoodle,

    I think your issue is more related to IIS settings than to robots.txt, so you could also post in the IIS forum.

    Maybe you will get suggestions there.
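
    On the IIS side, one option is a Request Filtering rule that denies any request whose User-Agent header contains the offending string. A minimal web.config sketch (an assumption-laden example, not a tested configuration; requires IIS 7.5 or later with Request Filtering installed):

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <filteringRules>
          <!-- Deny requests whose User-Agent header contains "libwww-perl" -->
          <filteringRule name="BlockLibwwwPerl" scanUrl="false" scanQueryString="false">
            <scanHeaders>
              <add requestHeader="User-Agent" />
            </scanHeaders>
            <denyStrings>
              <add string="libwww-perl" />
            </denyStrings>
          </filteringRule>
        </filteringRules>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

    Note that a client can trivially spoof its User-Agent header, so this only stops tools that identify themselves honestly.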

    Smith

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Friday, February 20, 2015 10:42 AM

All replies

  • User-1826049516 posted

    Doesn't seem possible because even this doesn't seem to block it:

    Sitemap: http://www.domainname.com/sitemap.xml
    Host: www.domainname.com
    
    User-agent: *
    Disallow: /
    
    User-agent: bingbot
    User-agent: googlebot
    User-agent: msnbot
    Disallow: /_archive
    Disallow: /App_Code
    Disallow: /scripts
    Disallow: /styles
    Disallow: /webservices


    So I'm blocking every user agent except bingbot, googlebot and msnbot, and the SEO checker still says libwww-perl is not blocked.

    That is assuming robots.txt is the correct way of blocking it, of course!
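
    It probably isn't: robots.txt is purely advisory, so it only affects crawlers that choose to honour it, and scripts built on libwww-perl generally never read it at all. To actually refuse those requests, the server has to act on the User-Agent header, for example with the IIS URL Rewrite module. A sketch (assumes the URL Rewrite module is installed; the rule name is made up, and a spoofed User-Agent will still get through):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Drop any request whose User-Agent header contains "libwww-perl" -->
        <rule name="BlockLibwwwPerl" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="libwww-perl" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```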


    Friday, February 20, 2015 7:37 AM