robots.txt set to disallow, can't change
I’m working with WordPress Multisite and have verified in the privacy settings that the primary site is set to allow search engines to crawl it. Unfortunately, the generated robots.txt file still shows Disallow for all of the sites. Any ideas why this is happening and how to fix it?
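In case it helps narrow this down: as far as I can tell, the virtual file is produced by WordPress’s do_robots() and passed through the 'robots_txt' filter along with each site’s blog_public option, so a small must-use plugin along the lines of the sketch below should confirm which sites WordPress still considers private. The file location and the error_log() call are just my guesses, not the actual fix.

```php
<?php
/**
 * Minimal sketch, assuming it is dropped into wp-content/mu-plugins/
 * so it runs on every site in the network. The 'robots_txt' filter is
 * applied to the virtual robots.txt output inside do_robots(); the
 * second argument is the site's blog_public option.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	// WordPress emits "Disallow: /" when blog_public is '0', which
	// matches the symptom described above.
	if ( '0' === (string) $public ) {
		// Debugging only; where this lands depends on the server's
		// error_log configuration (an assumption here).
		error_log( sprintf(
			'robots.txt check: site %d still has blog_public = 0',
			get_current_blog_id()
		) );
	}
	return $output;
}, 10, 2 );
```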