Blocking Robots
Hi,
My hosting provider (SiteGround) informed me that my CPU load was extremely high this month. After further investigation, they told me it's mainly due to my site being aggressively crawled by robots/crawlers.
They advised me to use robots.txt to block all robots using:
User-agent: *
Disallow: /

And then allow access to “whitelisted” robots, like:
User-agent: Googlebot
Allow: /

Is that a wise move? Would I not potentially be disabling some useful bots as well? Is there a good “whitelist” of robots that it’s recommended to allow through? Is there perhaps another way to solve this, such as blocking only known “bad” robots? Maybe there’s a plugin that does that?
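For reference, here is a minimal sketch of what I understand the “block only the bad robots” approach could look like in an Apache .htaccess file. The bot names below are just placeholder examples, not a vetted list; I would substitute whatever user-agents actually show up in my access logs:

<IfModule mod_rewrite.c>
RewriteEngine On
# Placeholder user-agent names; replace with the crawlers seen in the access logs
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|MJ12bot|SemrushBot) [NC]
# Return 403 Forbidden to matching requests
RewriteRule .* - [F,L]
</IfModule>

As I understand it, this refuses matching requests at the server level, which (unlike robots.txt) doesn’t depend on the crawler choosing to obey the rules.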
My website is: https://nest-expressed.com/
Any help is much appreciated!
Thanks,
Daniel