• deanljbirch

    (@deanljbirch)


    Within Google Search Console, I’m being told that the robots.txt file is preventing Google’s crawlers from crawling the site.

    I’ve checked the file and it shows:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    
    Disallow: /wp-content/sabai/
    Allow: /wp-content/sabai/File/thumbnails/
    Disallow: /wp-content/plugins/sabai/
    Disallow: /wp-content/plugins/sabai-directory/
    Disallow: /wp-content/plugins/sabai-googlemaps/
    Sitemap: https://enviroforum.net/sabai-sitemap-index.xml

    Can anyone explain why this is happening?

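    For what it’s worth, the rules above don’t appear to block the site root at all, only /wp-admin/ and the sabai paths. A quick way to sanity-check what the live file actually blocks is Python’s standard urllib.robotparser; this is just a minimal sketch, the example URLs are illustrations, and Python’s matching is only a rough approximation of Googlebot’s longest-match behaviour:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt for the site mentioned in the post.
    robots = RobotFileParser("https://enviroforum.net/robots.txt")
    robots.read()

    # Test a couple of representative URLs against the Googlebot user agent.
    for url in (
        "https://enviroforum.net/",                                  # expected: allowed
        "https://enviroforum.net/wp-content/plugins/sabai/foo.css",  # expected: blocked
    ):
        verdict = "allowed" if robots.can_fetch("Googlebot", url) else "blocked"
        print(f"{verdict}: {url}")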

Viewing 1 reply (of 1 total)
  • Moderator Steven Stern (sterndata)

    (@sterndata)

    Volunteer Forum Moderator

    Why are you disallowing wp-content?

    Anyhow, if Google first comes across the site while you’ve set WP to “discourage search engines”, it may take a week or two before Google circles back and sees that robots.txt has been reset.
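
    If you want to confirm the setting really has been reset, the quickest check is whether the homepage still carries a noindex signal. A minimal sketch, assuming WordPress exposes “discourage search engines” as a robots meta tag or an X-Robots-Tag header (its usual behaviour):

    import re
    import urllib.request

    # Fetch the homepage and look for noindex signals in the response header and the HTML.
    req = urllib.request.Request("https://enviroforum.net/",
                                 headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        x_robots = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace")

    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    print("X-Robots-Tag header:", x_robots or "(none)")
    print("robots meta tag:", meta.group(0) if meta else "(none)")
    # If either one contains "noindex", the site is still asking not to be indexed.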

  • The topic ‘Google Search Console and robots.txt file’ is closed to new replies.