• Bootje

    (@bootje)


    Hi

    I have a few private pages (visible only after logging in) that I don’t want Google to find. Does anyone know how to do this?

    I know you can hide folders with robots.txt, but how do you do something like this with a page? Since the pages aren’t linked anywhere on the public site, Google shouldn’t crawl them (I guess), but I want to be sure that it doesn’t happen.

  • alism

    (@alism)

    If they’re only visible to people logged in… well, Googlebot wouldn’t have a login, so it wouldn’t be able to access and index those pages, right?

    Wouldn’t hurt to stick an entry in your robots.txt though.

    Thread Starter Bootje

    (@bootje)

    What kind of entry for a page?

    alism

    (@alism)

    Something like:

    User-agent: *
    Disallow: /yourcategory/yourpage/

    (change according to what directory/page structure you’re using)
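
    If you have several private pages, each one needs its own Disallow line. A rough sketch – the /members/… paths below are just placeholders, so swap in your own slugs:

    User-agent: *
    Disallow: /members/account/
    Disallow: /members/downloads/
    Disallow: /some-private-page/

    Just keep in mind that robots.txt is only a request not to crawl, not an access control, so the login requirement is still what actually keeps the content out of reach.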

    Thread Starter Bootje

    (@bootje)

    Thanks. Even though they’re behind a login, I want to be sure that they are not found by search engines.

  • The topic ‘Google crawl and private pages’ is closed to new replies.