• Resolved ai4k Web Design

    (@soylaostia)


I’ve noticed a lot of 404s from the Google spider trying to index plugins, so I’ve disallowed it in robots.txt:

    Disallow: /wp-content/plugins/
    Disallow: /plugin/
    Disallow: /plugins/

    Can you check it, please?

    Thanks in advance.

Viewing 2 replies - 1 through 2 (of 2 total)
  • It happens to everyone; it is not related to All in One.
    Googlebot crawls an HTML page, and later on you may uninstall plugins. However, the HTML page referred to those plugins in its code (paths to CSS files, JS files, or WOFF files), so Googlebot will still think they are there until it recrawls the HTML page. That’s why you get so many 404 errors.
    At some point you will see fewer and fewer of them, because Googlebot will re-crawl the HTML pages.
    Keep in mind that Googlebot crawls HTML pages separately from CSS files, images, and JS files.

    • This reply was modified 4 years, 6 months ago by bloup.
    Plugin Support Steve M

    (@wpsmort)

    You shouldn’t block Google from the /wp-content/plugins/ directory. Googlebot needs to be able to access any plugin files (JS or CSS) required to render your website in Chrome. If it can’t render your site because you’ve blocked access to those files, this can affect whether your content gets indexed.
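
    Following that advice, a minimal robots.txt sketch might look like the following. This is only an illustration: the explicit Allow line (supported by Googlebot, though not part of the original robots.txt standard) and the remaining Disallow paths are assumptions based on the rules quoted above, not a recommendation from the plugin authors.

    ```
    User-agent: *
    # Keep plugin CSS/JS fetchable so Googlebot can render pages
    Allow: /wp-content/plugins/
    # These paths were in the original post; only keep them if they
    # really should not be crawled on your site
    Disallow: /plugin/
    Disallow: /plugins/
    ```

    You can verify the effect of rules like these with the robots.txt tester in Google Search Console before deploying them.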

  • The topic ‘Robots problem?’ is closed to new replies.