• jetxpert

    (@jetxpert)

    Good Day!

    Essentially, there are five methods for blocking and/or redirecting bad bots.

    They are:

    (1) Via .htaccess file
    (2) Via plugin (e.g., SG Optimizer, Wordfence, Blackhole for Bad Bots)
    (3) Via CDN (e.g., Cloudflare, KeyCDN)
    (4) Via robots.txt file
    (5) Via host server (built-in code)

    In order to protect server resources and achieve the highest level of protection, which method do you recommend?

    At a glance, the robots.txt file method seems the most logical way to go, but adding tens if not hundreds of bad bots to the robots.txt file doesn’t seem efficient.
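    For illustration, a robots.txt blocklist has to name every crawler individually, and compliance is entirely voluntary. A minimal sketch, using placeholder bot names rather than a vetted list:

        # robots.txt is advisory only: compliant crawlers obey it,
        # bad bots simply ignore it. Each bot needs its own entry.
        User-agent: BadBot        # placeholder name
        Disallow: /

        User-agent: EvilScraper   # placeholder name
        Disallow: /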

    Currently, we are blocking known bad bots via Cloudflare’s Firewall Rules (no other method being used), and it seems to be working pretty well.
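    For reference, a minimal sketch of such a rule in Cloudflare's expression language, paired with the Block action in the dashboard; the user-agent substrings below are placeholders, not a vetted blocklist:

        (http.user_agent contains "BadBot") or (http.user_agent contains "EvilScraper")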

    Thoughts on this appreciated.

    Thank you!

  • Plugin Author Hristo Pandjarov

    (@hristo-sg)

    SiteGround Representative

    In .htaccess files you can block IPs and User-Agents. Neither is reliable.
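    A minimal sketch of both checks in Apache 2.4 syntax (the bot names and IP address are placeholders; 203.0.113.7 is a documentation-range address):

        # Reject requests whose User-Agent matches placeholder bad-bot names
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
        RewriteRule .* - [F,L]
        </IfModule>

        # Reject a single client IP
        <RequireAll>
            Require all granted
            Require not ip 203.0.113.7
        </RequireAll>

    Both checks are trivially defeated: a bot can change its User-Agent string or rotate IPs, which is why neither is reliable.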

    Bad bots don’t honour robots.txt files.

    The best way is through a service running on your servers, and we provide such a service. You can keep doing it on Cloudflare since you already have that working well. If such a check reaches the plugin, it means the server has already spent resources processing the request before it can be filtered or passed through.

  • The topic ‘Best Method for Blocking Bad Bots’ is closed to new replies.