Google Search Console and robots.txt file
Within Google Search Console, I'm being told that the robots.txt file is preventing Google's crawlers from crawling the site.
I've checked the file and it shows:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/sabai/
Allow: /wp-content/sabai/File/thumbnails/
Disallow: /wp-content/plugins/sabai/
Disallow: /wp-content/plugins/sabai-directory/
Disallow: /wp-content/plugins/sabai-googlemaps/
Sitemap: https://enviroforum.net/sabai-sitemap-index.xml
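As a sanity check, these rules can be tested locally with Python's standard-library urllib.robotparser. This is a minimal sketch, assuming the live file at https://enviroforum.net/robots.txt contains exactly the directives above, each on its own line; the "Googlebot" user-agent string and the test URLs are illustrative.

    import urllib.robotparser

    # The directives quoted above, one per line, as robots.txt requires.
    rules = [
        "User-agent: *",
        "Disallow: /wp-admin/",
        "Allow: /wp-admin/admin-ajax.php",
        "Disallow: /wp-content/sabai/",
        "Allow: /wp-content/sabai/File/thumbnails/",
        "Disallow: /wp-content/plugins/sabai/",
        "Disallow: /wp-content/plugins/sabai-directory/",
        "Disallow: /wp-content/plugins/sabai-googlemaps/",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.parse(rules)

    # The homepage matches no Disallow rule, so this prints True.
    print(parser.can_fetch("Googlebot", "https://enviroforum.net/"))

    # A path under a Disallow rule prints False.
    print(parser.can_fetch("Googlebot", "https://enviroforum.net/wp-content/plugins/sabai/script.js"))

If the homepage check prints True, the rules as written don't block site-wide crawling, which would point to the live file being served differently from what's pasted here.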
Can anyone explain why this is happening?