• Resolved arnold30

    (@arnold30)


    Hi

    I have been using The SEO Framework for a while now, including the sitemap.

    A week ago the sitemap stopped working; I get this: https://imgur.com/Tf1ArOA
    Google Search Console also only gets 404 errors when crawling my website.

    I have spoken to my hosting company and theme developers, and everyone is a bit puzzled.
    There is one robots.txt file I can find in my public_html folder, but I'm not sure if I should delete it, as it might be from The SEO Framework; I don’t want to cause more problems.

    So I'm not sure where the robots file came from or how to fix this.
    Any help is greatly appreciated.

    The page I need help with: [log in to see the link]

Viewing 2 replies - 1 through 2 (of 2 total)
  • Plugin Author Sybre Waaijer

    (@cybr)

    Hi Arnold,

    In 3.1 I added new notifications to the plugin, so it should be clearer which functionalities might conflict with the installation.

    You can safely delete the robots.txt file in your root directory. This will reinstate the functionality of WordPress managing it as a virtual file, and it allows The SEO Framework (or any other plugin) to alter it further when necessary. The red “Note:” should go away, too.
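    For reference, once the physical file is deleted, WordPress serves the virtual robots.txt on the fly. It typically looks something like the sketch below; the exact contents depend on your settings and active plugins, and `example.com` stands in for your own site URL:

    ```
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml
    ```

    You can check it by visiting yoursite.com/robots.txt directly in a browser.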

    Google’s Search Console should tell you where it detected the 404 pages, and in their detailed overview, they should tell you where they found a link to the 404 page. This should then be easily resolvable.

    I hope this helps you underway! Cheers 🙂

    Thread Starter arnold30

    (@arnold30)

    Hi,
    Thank you very much.
    I have deleted the robots file, and the red note has gone away.
    I will see if Google can crawl the pages now 🙂
    If not, I will give an update.
    Thanks again, cheers 🙂

  • The topic ‘Sitemap setting detecting a robots file’ is closed to new replies.