Submitted URL blocked by robots.txt
Hi
I have an issue with Search Console telling me: "Submitted URL blocked by robots.txt". I am using All In One SEO Pack 3.4.2.
My robots.txt looks like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.techthatworks.net/sitemap.xml
Right now there are 148 URLs blocked by robots.txt, and I don't see why I get that error in Google Search Console. All the URLs reported as blocked start with
https://www.techthatworks.net/tag/
or
https://www.techthatworks.net/category/
Can you tell me why Google sees these URLs as blocked by my robots.txt generated by All In One SEO Pack?
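One way to sanity-check the rules quoted above is to feed them to Python's standard-library `urllib.robotparser` and ask whether the reported paths are crawlable. This is just a sketch: the `/tag/example/` URL is a made-up sample path under the prefix from the post, and Google's own parser may behave differently from Python's (and Search Console may also be working from an older cached copy of the file).

```python
from urllib.robotparser import RobotFileParser

# The rules exactly as quoted in the post (the Sitemap line is
# omitted; RobotFileParser ignores it for crawl decisions anyway).
rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /tag/ and /category/ paths match no Disallow rule, so they
# should be crawlable; /wp-admin/ paths (other than admin-ajax.php)
# should be blocked.
print(rp.can_fetch("Googlebot", "https://www.techthatworks.net/tag/example/"))        # True
print(rp.can_fetch("Googlebot", "https://www.techthatworks.net/wp-admin/options.php"))  # False
```

If this prints `True` for the tag/category URLs, the rules shown do not block them, which suggests checking whether the robots.txt Google actually fetched matches the one above (Search Console's robots.txt tester shows the cached version).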
The page I need help with: [log in to see the link]
The topic 'Submitted URL blocked by robots.txt' is closed to new replies.