Pages blocked by X-Robots-Tag: noindex HTTP header
Hello,
We ran a Semrush Site Audit yesterday and it found 180 pages blocked by the X-Robots-Tag: noindex HTTP header. Double-checking in Google Search Console confirmed the pages were not indexed (Excluded by ‘noindex’ tag), and both Request Indexing and Test Live URL failed in Google Search Console as well.
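(For reference, the header reported above can be verified outside Semrush with a quick curl check; the URL below is a placeholder for one of the affected pages:)

```shell
# Fetch only the response headers (-I) quietly (-s) and
# search them case-insensitively for X-Robots-Tag.
# Replace the URL with one of the affected pages.
curl -sI https://example.com/sample-page/ | grep -i "x-robots-tag"
# If the server is sending the header, this prints a line such as:
#   x-robots-tag: noindex
```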
Is there a way to resolve this with SEOPress so that all pages are indexed (except for those that have the Advanced meta robots setting “Do not display this page in search engine results / XML – HTML sitemaps (noindex)” checked)?
Thank you,
Adam
The page I need help with: [log in to see the link]
Viewing 3 replies - 1 through 3 (of 3 total)
- The topic ‘Pages blocked by X-Robots-Tag: noindex HTTP header’ is closed to new replies.