• Greetings, I have one simple question. My website was being overloaded by too many requests from bots/crawlers. I changed the “Crawler’s Page Views” option so that it BLOCKS crawlers for 2 hours when their page views exceed the limit (1 per minute). Since then, I get around 20-25 block notifications per day.

    Is there any possibility that real people viewing my website get blocked?

    Thank you in advance.

  • Thread Starter daniantonio8 (@daniantonio8)

    anyone?

    Hi daniantonio8,
    Actually, by blocking crawlers when they exceed that rate (1 per minute), you have almost completely restricted their access to your website; that is why you are seeing this increased number of blocked visits. Crawlers usually request pages faster than that, so I suggest increasing the rate to something like “120 per minute” and using the “throttle” option, which rate-limits rather than blocks crawlers (i.e. their site access is temporarily restricted only until they reduce their request frequency to below the limit you have set). See the sketch below for the difference between the two policies.
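
    Purely for illustration (the plugin itself is written in PHP, and the class, numbers, and policy names below are assumptions based on this thread, not the plugin’s actual code), here is a minimal Python sketch of how a “block” policy differs from a “throttle” policy on a per-minute request limit:

        import time
        from collections import deque

        LIMIT = 120           # allowed requests per minute (the suggested setting)
        BLOCK_SECONDS = 7200  # "block for 2 hours"

        class RateLimiter:
            def __init__(self, policy):
                self.policy = policy     # "block" or "throttle" (hypothetical names)
                self.hits = deque()      # timestamps of recent requests
                self.blocked_until = 0.0

            def allow(self, now=None):
                now = time.time() if now is None else now
                # A hard block stays in force for the full penalty window.
                if self.policy == "block" and now < self.blocked_until:
                    return False
                # Count only requests from the last 60 seconds.
                while self.hits and now - self.hits[0] > 60:
                    self.hits.popleft()
                if len(self.hits) >= LIMIT:
                    if self.policy == "block":
                        self.blocked_until = now + BLOCK_SECONDS
                    # Under "throttle" we just refuse this request; access
                    # resumes as soon as the rate drops back under the limit.
                    return False
                self.hits.append(now)
                return True

        bot = RateLimiter("throttle")
        t0 = time.time()
        for i in range(121):              # 121 requests in ~12 seconds
            ok = bot.allow(now=t0 + i * 0.1)
        print(ok)                         # False: the 121st request is refused
        print(bot.allow(now=t0 + 120))    # True: served again once the rate drops

    With “block” in place of “throttle” above, that last call would still return False, because the crawler would stay locked out for the full 2 hours rather than just until it slows down.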

    Let me know if you have any further questions,
    Thanks.

  • The topic ‘Rate Limiting Values/Crawler’s Page Views’ is closed to new replies.