    Thread Starter eeegamer (@eeegamer)

    I am not using a static robots.txt, at least not that I know of. How can I find out?
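
    One way to check, assuming file access to the server: WordPress only serves its dynamic robots.txt when no physical robots.txt file exists in the site root, so a simple file check answers the question. A minimal sketch in Python, with the web root path as an assumption:

        # Minimal sketch: check whether a physical robots.txt exists in the
        # WordPress root. The path is an assumption; adjust to your install.
        from pathlib import Path

        docroot = Path("/var/www/html")  # hypothetical web root
        robots = docroot / "robots.txt"

        if robots.exists():
            print("Static robots.txt found; it overrides the dynamic output.")
        else:
            print("No static file; robots.txt is generated dynamically.")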

    RavanH

    Hi eeegamer,

    The field “Additional robots.txt rules” only gives some extra (manual) control over the dynamic robots.txt output. It does not control the output of the dynamic sitemaps.

    To exclude all tag URLs from the sitemap, go back to Settings > Reading and uncheck the “Tags” box under “Include taxonomies” in the XML Sitemap section.
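
    For reference, that field simply appends raw directives to the dynamic robots.txt. If the goal were also to keep crawlers out of tag archives (a crawling matter, separate from sitemap inclusion), rules like these could go there, assuming the default /tag/ permalink base:

        User-agent: *
        Disallow: /tag/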

    Thread Starter eeegamer (@eeegamer)

    Hi RavanH,

    Thanks for your quick response. The box you mentioned was unchecked all along, so I do not know what the problem could be. I have now toggled it to see if that helps.

    Thread Starter eeegamer (@eeegamer)

    “Categories” is unchecked too, but the sitemap still shows all my categories.

    RavanH

    Did you remove any static sitemap files from your site root? Did you make sure no other sitemap-generating plugins are active? It does not sound like the sitemap is coming from my plugin…

    Can you share a link?
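
    A quick way to spot leftover static sitemap files, again assuming file access and a hypothetical web root path:

        # Minimal sketch: list static sitemap files that would shadow the
        # plugin's dynamic sitemap. The docroot path is an assumption.
        from pathlib import Path

        docroot = Path("/var/www/html")  # hypothetical web root
        leftovers = sorted(docroot.glob("sitemap*.xml*"))
        if leftovers:
            for f in leftovers:
                print("Static sitemap file found:", f)
        else:
            print("No static sitemap files in the web root.")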

    The topic ‘Is robots.txt rule working?’ is closed to new replies.