  • My answer: different webmasters have different needs. Some will disallow some pages and not others. I don’t think the example you found is just as good as the example on the robots.txt page.

    It’s a question of what you want search engines to have access to. My two cents.

    Thread Starter Martin999 (@martin999)

    The seven Disallow rules above don’t represent one specific robots.txt containing exactly and only those rules, so it’s not a single robots.txt example.

    Most webmasters’ robots.txt files use the official recommendation (12 Disallow rules) PLUS at least some of the seven rules I mentioned above.
    Some webmasters add all seven, some only 4 or 5 …

    Just to make that clear.
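
    For illustration only, here is a minimal sketch of what such a combined robots.txt might look like. The paths are placeholders: neither the official 12 Disallow rules nor the extra seven are reproduced in this part of the thread, so substitute the real rules for your own site.

        User-agent: *
        # Baseline: the officially recommended Disallow rules would go here
        # (placeholder paths, not the actual 12 rules from the recommendation)
        Disallow: /wp-admin/
        Disallow: /wp-includes/
        # Site-specific additions: only the extra rules your site actually needs
        # (placeholder paths, not the actual seven discussed above)
        Disallow: /trackback/
        Disallow: /feed/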

  • The topic ‘Optimizing the robots.txt’ is closed to new replies.