• kletskater

    (@kletskater)


    How can I prevent search bots from looking for non-existing pages and blog posts?

Viewing 5 replies - 1 through 5 (of 5 total)
  • petercralen

    (@petercralen)

    Hi,

    You have to Disallow these pages in robots.txt

    Thread Starter kletskater

    (@kletskater)

    If I disallow everything, they won’t be able to index the existing pages either. What I want is for the search engines to refresh their database, because they still list pages and posts I removed two years ago.
    So in case my question was unclear: how can I prevent search bots from looking for old, non-existing posts/pages while keeping my site indexable?

    MarkRH

    (@markrh)

    You disallow the specific pages, not the whole site:

    User-agent: *
    Disallow: /this-page-doesnt-exist.shtml
    Disallow: /not-here.php
    Disallow: /2012/05/10/this-post-no-longer-exists/
    Thread Starter kletskater

    (@kletskater)

    Thanks!
    Do I have to specify the bots, like MSN and Google, or does this disallow cover all search bots?

    petercralen

    (@petercralen)

    You don’t have to specify bots.

    User-agent: * means all bots, so it’s fine as it is.
    If you search on Google you will find some helpful articles about this.
    You can also disallow a whole path (useful if you have many URLs with the same structure). For example:
    Disallow: /2012/ blocks every URL under that path, like example.com/2012/anything.
    Note that robots.txt rules are prefix matches, so
    Disallow: /2012 without the trailing slash blocks example.com/2012 itself and anything else whose path starts with /2012.

    So be careful not to disallow the whole site, because your site will disappear from Google.
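    If you want to sanity-check a rule before publishing it, Python’s standard-library urllib.robotparser applies the same prefix matching that well-behaved crawlers use. The example.com URLs and the /2012/ rule below are just illustrations based on this thread, not anything from your actual site:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt mirroring the advice above.
    rules = """\
    User-agent: *
    Disallow: /2012/
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    # Everything under /2012/ is blocked for all bots...
    print(rp.can_fetch("*", "https://example.com/2012/05/10/old-post/"))
    # ...while the rest of the site stays crawlable.
    print(rp.can_fetch("*", "https://example.com/2015/new-post/"))
    ```

    This is handy for catching an accidental Disallow: / before it takes your whole site out of the index.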

    By the way: after you do this, you can remove those links from the Google index in Search Console (formerly Google Webmaster Tools). There is a tool called “Remove URLs” under Google Index. Before you use it, make sure the page is noindexed and/or no longer exists.

  • The topic ‘search bots looking for non existing pages blogs’ is closed to new replies.