Repeated user agent in robots.txt
Slim SEO doesn't seem to check what is already in robots.txt before appending its rules, so everything ends up repeated. You need to strip the extra User-agent: * line yourself:
User-agent: *
Disallow: /?s=
Disallow: /page/*/?s=
Disallow: /search/
Sitemap: https://example.com/sitemap.xml
User-agent: *
And there is no function to disable Slim SEO's robots.txt output.
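The fix the plugin would need is simple: drop any later occurrence of an identical User-agent line that is already present. A minimal sketch of that deduplication logic, written in Python for illustration (the function name and the agent parameter are my own, not part of Slim SEO):

```python
def dedupe_user_agent(robots: str, agent: str = "User-agent: *") -> str:
    """Keep the first occurrence of the given User-agent line and
    drop any later duplicates; all other lines pass through."""
    seen = False
    out = []
    for line in robots.splitlines():
        if line.strip() == agent:
            if seen:
                continue  # skip the repeated User-agent line
            seen = True
        out.append(line)
    return "\n".join(out)


robots = (
    "User-agent: *\n"
    "Disallow: /?s=\n"
    "Disallow: /page/*/?s=\n"
    "Disallow: /search/\n"
    "Sitemap: https://example.com/sitemap.xml\n"
    "User-agent: *"
)
print(dedupe_user_agent(robots))
```

Run against the file shown above, this leaves a single User-agent: * group at the top and removes the trailing duplicate. In WordPress itself the same cleanup could be hooked into the robots_txt filter, but that is a workaround, not a substitute for a real disable option in the plugin.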
Viewing 15 replies - 1 through 15 (of 15 total)
The topic ‘Repeated user agent in robots.txt’ is closed to new replies.