• Resolved grishart

    (@grishart)


    Hi All

    I am using Google Webmaster Tools to check my site's performance, and I am having a problem with "restricted by robots.txt": a few pages keep being reported as restricted by robots.txt.

    This is the robots.txt

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /wp-admin
    Disallow: /feed/
    Disallow: /trackback/
    Disallow: /date/
    Disallow: /comments/
    Allow: /wp-content/uploads
    
    Sitemap: https://www.emotionalfirstaid.co.uk/sitemap.xml

    Is there something I should do to get the whole site searchable?

    Thanks for your time.
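    For anyone seeing the same report, a quick way to check which of the rules above blocks a given URL is Python's standard urllib.robotparser. This is only a rough sketch: it assumes the rules shown are the live robots.txt, the sample URLs are made up, and the standard-library matcher only approximates Google's own rule matching.

    from urllib.robotparser import RobotFileParser

    # The rules from the robots.txt posted above.
    rules = [
        "User-agent: *",
        "Disallow: /cgi-bin",
        "Disallow: /wp-admin",
        "Disallow: /feed/",
        "Disallow: /trackback/",
        "Disallow: /date/",
        "Disallow: /comments/",
        "Allow: /wp-content/uploads",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Googlebot falls under the "*" group because there is no Googlebot-specific group.
    for url in [
        "https://www.emotionalfirstaid.co.uk/feed/",
        "https://www.emotionalfirstaid.co.uk/date/2010/05/",
        "https://www.emotionalfirstaid.co.uk/wp-content/uploads/photo.jpg",
        "https://www.emotionalfirstaid.co.uk/a-sample-post/",
    ]:
        status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked by robots.txt"
        print(url, "->", status)

    With these rules the /feed/ and /date/ URLs come back blocked while the uploads file and the ordinary post URL come back allowed, which lines up with pages being flagged as restricted in Webmaster Tools.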

  • Why do you disallow all those folders? It isn't necessary, and it does nothing for performance.

    Thread Starter grishart

    (@grishart)

    Which folders are not necessary to disallow?

    You can delete them all. If you want, you can keep the disallow for wp-admin. Disallow only blocks crawling; it does nothing for performance.

    Thread Starter grishart

    (@grishart)

    Thanks for your help. I have reduced it to the following, and hopefully this will allow the whole site to be crawled.

    User-agent: *
    Disallow: /cgi-bin
    Disallow: /wp-admin
    Allow: /wp-content/uploads
    
    Sitemap: https://www.emotionalfirstaid.co.uk/sitemap.xml
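    If it helps to confirm, running the same urllib.robotparser check against this reduced file now reports the previously blocked paths as crawlable; the feed URL below is just an example.

    from urllib.robotparser import RobotFileParser

    reduced = RobotFileParser()
    reduced.parse([
        "User-agent: *",
        "Disallow: /cgi-bin",
        "Disallow: /wp-admin",
        "Allow: /wp-content/uploads",
    ])

    # /feed/ no longer matches any Disallow rule, so it reports as crawlable.
    print(reduced.can_fetch("Googlebot", "https://www.emotionalfirstaid.co.uk/feed/"))  # True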
  • The topic ‘problems with restricted by robots.txt’ is closed to new replies.