[Plugin: Google XML Sitemaps] robots.txt
I’ve added this plugin, but it seems to be blocking crawlers without my asking it to. I want search engines to be able to crawl my site, so why has the plugin disallowed them all?
I recently noticed all my Google AdSense ads had vanished and my visitor numbers have plummeted. I contacted Google, who told me my robots.txt file was blocking them from crawling my site. It had:
User-agent: *
Disallow: /

Why has the plugin done this, and how do I stop it? Why did it act this way without notifying me or asking permission? Why are there no settings to disable it?
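For comparison, what I expected the plugin to write (if it wrote anything at all) was something permissive along these lines (the sitemap URL below is just a placeholder for my own domain):

# Allow all crawlers to access everything (an empty Disallow blocks nothing)
User-agent: *
Disallow:

# Point crawlers at the sitemap the plugin generates
Sitemap: http://www.example.com/sitemap.xml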