  • [Resolved] awright

    (@adamwrethinkfirst)


    Hello,

    We ran a Semrush Site Audit yesterday and it found 180 pages blocked by an X-Robots-Tag: noindex HTTP header. Double-checking in Google Search Console confirmed the pages were not indexed (Excluded by ‘noindex’ tag), and both Request Indexing and Test Live URL failed there as well.
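
    A quick way to confirm the header outside of Semrush is to request an affected URL and inspect the response headers. A minimal Python sketch, with https://example.com/ as a placeholder for one of the flagged pages:

        import urllib.request

        # Placeholder URL: substitute one of the pages Semrush flagged.
        url = "https://example.com/"

        # HEAD request: we only need the response headers, not the body.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            # get_all() returns every occurrence of the header, or None if absent.
            tags = resp.headers.get_all("X-Robots-Tag")
            print(url, "->", tags if tags else "no X-Robots-Tag header")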

    Is there a way to resolve this with SEOPress so that all pages are indexed (except those with the advanced meta robots setting “Do not display this page in search engine results / XML – HTML sitemaps (noindex)” checked)?

    Thank you,

    Adam


  • Plugin Author Benjamin Denis

    (@rainbowgeek)

    Hi,

    We can confirm this issue on your homepage.

    However, it doesn’t come from SEOPress: we don’t set this header at all, except for XML sitemaps.

    It is probably being added by your server configuration (or possibly by another WP plugin).

    You should contact your webhost about that.

    Thanks
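
    One way to narrow this down: request both a normal WordPress page and a static file from the same host. Static files are usually served directly by the web server, bypassing PHP and any WP plugin, so if a static file also carries X-Robots-Tag, the server configuration is the likely source; if only WordPress pages carry it, a plugin is more likely. A sketch with placeholder URLs:

        import urllib.request

        def x_robots_tag(url):
            # Fetch only the headers and return any X-Robots-Tag values.
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req) as resp:
                return resp.headers.get_all("X-Robots-Tag")

        # Placeholder URLs: a WordPress page and a static asset on the same host
        # (dashicons.min.css ships with WordPress core on standard installs).
        for url in ("https://example.com/",
                    "https://example.com/wp-includes/css/dashicons.min.css"):
            print(url, "->", x_robots_tag(url) or "no X-Robots-Tag header")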

    Thread Starter awright

    (@adamwrethinkfirst)

    Hi Benjamin,

    Thank you for the fast response and clarification.

    Just to clarify: does SEOPress have any settings for the robots.txt file?

    Thank you,

    Adam

    Plugin Author Benjamin Denis

    (@rainbowgeek)

    Hi,

    Only the PRO version allows you to quickly edit your robots.txt file from your WordPress admin:

    https://www.seopress.org/features/htaccess-robots-txt/

    Thx
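
    Worth noting: robots.txt controls crawling, not indexing, so editing it will not add or remove a noindex directive; that comes from the meta robots tag or the X-Robots-Tag header discussed above. To check what a live robots.txt actually allows, a small Python sketch with placeholder URLs:

        import urllib.robotparser

        # Placeholder URLs: substitute your own site.
        rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
        rp.read()  # fetch and parse the live robots.txt

        # True means Googlebot may crawl the page; this says nothing about
        # whether the page is indexable (that is noindex territory).
        print(rp.can_fetch("Googlebot", "https://example.com/sample-page/"))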

  • The topic ‘Pages blocked by X-Robots-Tag: noindex HTTP header’ is closed to new replies.