• Resolved jelliott2014

    (@jelliott2014)


    I am attempting to create a robots.txt file that prevents crawling of several directories of my site, as shown in the link attached. I have tried with and without a “/” at the end of each directory. Google Search Console indicates errors in both cases.

    According to Yoast’s “The ultimate guide to robots.txt”, this should be acceptable to Google. What am I missing??

    The page I need help with: [log in to see the link]

Viewing 5 replies - 1 through 5 (of 5 total)
  • Plugin Support Maybellyne

    (@maybellyne)

    Hello @jelliott2014,

    Thanks for reaching out about your robots.txt file. Have you considered adding a separate Disallow directive for each folder instead of one directive that covers them all? Also, be mindful that directives are case-sensitive. That means /photo is not the same as /Photo.
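    You can see the case-sensitivity point for yourself with Python’s standard-library urllib.robotparser (a quick sketch; the /Photo/ rule and example.com URLs below are hypothetical, not taken from your site):

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical rule set: only the capitalized /Photo/ path is disallowed
    rp = RobotFileParser()
    rp.parse("""\
    User-agent: *
    Disallow: /Photo/
    """.splitlines())

    # The capitalized path matches the rule; the lowercase one does not
    print(rp.can_fetch("*", "https://example.com/Photo/img.jpg"))  # False: blocked
    print(rp.can_fetch("*", "https://example.com/photo/img.jpg"))  # True: allowed
    ```

    Google’s crawler matches paths the same case-sensitive way, so a rule for /Photo/ will not stop /photo/ from being crawled.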

    Thread Starter jelliott2014

    (@jelliott2014)

    Tried that also. No change.

    Plugin Support Maybellyne

    (@maybellyne)

    I am unsure what you mean by no change as this is what I see in the file: screenshot.

    Thread Starter jelliott2014

    (@jelliott2014)

    I had changed robots.txt back to the original text. Have now reverted again, as you suggested. Sorry.

    Plugin Support devnihil

    (@devnihil)

    Thanks for the clarification. Regarding your issue, can you please try using the following and let us know whether this resolves the error in Google Search Console?

    User-agent: *
    Disallow: /pdf/
    Disallow: /wp-content/uploads/
    Disallow: /subjects/
    Disallow: /ngg_tag/
    Disallow: /Archives/
    Allow: /
    Sitemap: https://www.saanichsommeliers.ca/sitemap_index.xml

    If it doesn’t, can you please provide us with the exact errors that Google Search Console is returning for the robots.txt file?
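    As a quick local sanity check before re-testing in Search Console, the rules above can be run through Python’s standard-library urllib.robotparser (a sketch; the menu.pdf and /about/ URLs are made-up examples):

    ```python
    from urllib.robotparser import RobotFileParser

    # The proposed robots.txt rules from the reply above
    rules = """\
    User-agent: *
    Disallow: /pdf/
    Disallow: /wp-content/uploads/
    Disallow: /subjects/
    Disallow: /ngg_tag/
    Disallow: /Archives/
    Allow: /
    Sitemap: https://www.saanichsommeliers.ca/sitemap_index.xml
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # A path under a disallowed directory should be blocked
    print(rp.can_fetch("*", "https://www.saanichsommeliers.ca/pdf/menu.pdf"))  # False
    # Everything else falls through to Allow: /
    print(rp.can_fetch("*", "https://www.saanichsommeliers.ca/about/"))        # True
    ```

    One caveat: Python’s parser uses first-match semantics while Google uses longest-match, but for a simple rule set like this the results agree.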

  • The topic ‘robots.txt errors’ is closed to new replies.