• Resolved ddelicious

    (@ddelicious)


    In the crawl optimization settings, I enabled “Prevent crawling of internal site search URLs.” The description says the setting will add a disallow rule to my robots.txt. But after enabling the setting, clearing the cache, and waiting a long time, no changes have appeared in robots.txt.
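    For reference, this is roughly the rule I expected the setting to add. The exact patterns below are my guess, based on WordPress’s default ?s= search parameter, so Yoast’s actual output may differ:

    ```
    # Expected addition: block internal site search result URLs
    User-agent: *
    Disallow: /?s=
    Disallow: /search/
    ```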

    In the Yoast robots.txt editor, I am able to view and edit my robots file, but it is still the default WordPress robots.txt file, not the Yoast-style one. So it seems that WordPress file editing is enabled, but Yoast still isn’t editing my robots.txt. Any suggested fix here? Thanks.
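    What the editor shows me is essentially WordPress’s stock output, something like the following (example.com stands in for my domain; the sitemap line may vary by setup):

    ```
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/wp-sitemap.xml
    ```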


  • Noticed the same problem here. I use the Pro version.

    Plugin Support Maybellyne

    (@maybellyne)

    Hello @ddelicious,

    Thanks for using the Yoast SEO plugin. The directives are added to the virtual robots.txt file, not written to a physical file on disk. I’m sorry about the confusion; it’s on our roadmap to consolidate what you see in the Yoast SEO robots.txt editor with the WordPress robots.txt file.

    Thread Starter ddelicious

    (@ddelicious)

    Ok, thanks for the info. When I check Google’s robots.txt report, it shows the WordPress file, not the Yoast virtual file. Is there something I need to do to make Google detect the virtual file?
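    To double-check what crawlers receive, the file can be fetched over HTTP, since that is exactly what Googlebot does. A minimal Python sketch, with example.com standing in for the real domain:

    ```python
    # Fetch robots.txt the same way a crawler would, then test whether an
    # internal search URL is blocked. Generic sketch, not a Yoast tool;
    # example.com is a placeholder for the real site.
    from urllib import request, robotparser

    ROBOTS_URL = "https://example.com/robots.txt"

    # 1. Print the file as served over HTTP (this is what Googlebot reads).
    with request.urlopen(ROBOTS_URL) as resp:
        print(resp.read().decode("utf-8", errors="replace"))

    # 2. Ask the standard-library parser whether a search URL is disallowed.
    parser = robotparser.RobotFileParser(ROBOTS_URL)
    parser.read()
    print("Search URL allowed:", parser.can_fetch("*", "https://example.com/?s=test"))
    ```

    Whatever prints here is the file Google evaluates, regardless of which plugin generated it.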

    Plugin Support Maybellyne

    (@maybellyne)

    I’m sorry, I left out another tip. You can copy the contents of the virtual robots.txt file into the Yoast SEO-generated version.
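    After copying, the physical robots.txt would combine both sets of rules. A rough sketch of the result, assuming the default WordPress rules plus the crawl-optimization directives (exact patterns and the sitemap URL will vary by site):

    ```
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    # Copied over from the virtual file (added by the crawl setting):
    Disallow: /?s=
    Disallow: /search/

    Sitemap: https://example.com/sitemap_index.xml
    ```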

  • The topic ‘Yoast isn’t editing robots.txt after crawl settings change’ is closed to new replies.