• I limited the number of languages on my blog to improve performance for the languages most relevant to my readers. This left tons of URLs that are no longer available and therefore show up as crawl errors in Google's Webmaster Tools. The crawl errors fall into two categories:
    – a) network not reachable (unreachable)
    – b) error 404 (not found)

    An example of error a) is https://blog_url/ar/2009/02/20/the-temples-of-angkor/
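
    For what it's worth, the two categories can be reproduced outside Webmaster Tools by checking what the server itself returns for a removed URL. A minimal sketch in Python (standard library only; blog_url is just the placeholder hostname from the example above):

```python
# Check what a removed URL actually returns, to see whether Google's
# "unreachable" vs. "404" report matches the server's behavior.
# "blog_url" is the placeholder domain from the example above --
# substitute the real hostname before running.
import urllib.request
import urllib.error

url = "https://blog_url/ar/2009/02/20/the-temples-of-angkor/"

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(url, "->", resp.status)           # e.g. 200 if it still resolves
except urllib.error.HTTPError as e:
    print(url, "->", e.code)                    # category b): 404 Not Found
except urllib.error.URLError as e:
    print(url, "-> unreachable:", e.reason)     # category a): network error
```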

    I removed the complete directory containing the post from the index via Google's removal tool. My intention was to remove everything in a certain language from Google's index.

    But after a few days those posts are still being reported in Google's Webmaster Tools as errors (a or b), even though the directories they live in have been removed by Google. The removal tool has confirmed that the directories were removed from the index.

    Is it just a matter of time before Google's Webmaster Tools removes those posts from the error list, or will they stay there forever?

    Any help is appreciated.

  • Google is sometimes slow to catch up. You might send them a sitemap, either one generated manually with https://www.xml-sitemaps.com/ or by installing the Google Sitemap plugin.

    When they finally crawl your sitemap, they will update the listings in their index to match it.
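
    If you'd rather build the sitemap by hand than use the online generator or the plugin, the format is simple XML. A short sketch in Python; the URLs below are hypothetical placeholders, and you'd list only the pages that should stay in the index:

```python
# Sketch of a hand-rolled sitemap, as an alternative to the online
# generator or the plugin. The URLs are hypothetical examples on the
# placeholder domain; list only pages that should remain indexed.
from xml.sax.saxutils import escape

urls = [
    "https://blog_url/2009/02/20/the-temples-of-angkor/",
    "https://blog_url/2009/03/01/another-post/",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

    Once the file is in place, it can be submitted in Webmaster Tools or referenced from a Sitemap: line in robots.txt.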

    Thread Starter BJJ125 (@bjj125)

    I have a sitemap installed, but Google only indexes it partially (25 of 122 URLs indexed). My feeling is that because of the crawl errors (due to Google catching up slowly?) it is not indexing everything. I didn't have this problem in the past.

    Does anybody have an idea whether there is a relationship between the number of crawl errors and the indexing?

    Thx in advance
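
    A rough way to investigate is to fetch the sitemap and check the status of every entry: if many of the 122 URLs return errors, partial indexing would follow naturally. A sketch, assuming the sitemap lives at the conventional /sitemap.xml path on the placeholder domain used earlier:

```python
# Fetch the sitemap and report the HTTP status of every URL in it, to
# see how many of the entries actually resolve. Assumes the sitemap is
# at the conventional /sitemap.xml path on the placeholder domain.
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP = "https://blog_url/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP, timeout=10) as resp:
    tree = ET.parse(resp)

ok, broken = 0, 0
for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(url, timeout=10) as r:
            status = r.status
    except urllib.error.HTTPError as e:
        status = e.code
    except urllib.error.URLError as e:
        status = f"unreachable ({e.reason})"
    if status == 200:
        ok += 1
    else:
        broken += 1
        print(url, "->", status)

print(f"{ok} OK, {broken} with errors")
```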

  • The topic ‘sticking crawler errors – how to get rid of them?’ is closed to new replies.