• jetxpert

    (@jetxpert)


    Good Day!

    Essentially, there are five (5) methods for blocking and/or redirecting bad bots.

    They are:

    (1) Via .htaccess file (see the sketch after this list)
    (2) Via plugin (e.g., SG Optimizer, Wordfence, Blackhole for Bad Bots)
    (3) Via CDN (e.g., Cloudflare, KeyCDN)
    (4) Via robots.txt file
    (5) Via host server (built-in code)
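
    For reference, here is a minimal sketch of method (1), assuming Apache with mod_rewrite enabled; the bot names are placeholders, not a vetted blocklist:

        # .htaccess: return 403 Forbidden when the User-Agent matches a bad bot
        # (BadBotOne / BadBotTwo are placeholder names; substitute your own list)
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (BadBotOne|BadBotTwo) [NC]
        RewriteRule .* - [F,L]
        </IfModule>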

    To protect server resources and achieve the highest level of protection, which method do you recommend (besides your plugin)?

    At a glance, the robots.txt file method seems the most logical way to go, but adding tens if not hundreds of bad bots to the robots.txt file doesn’t seem efficient.
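
    For comparison, method (4) would look roughly like this, with one group per bot (placeholder names); keep in mind that robots.txt is only a request, so bots that ignore it are not actually blocked:

        # robots.txt (placeholder bot names; each bad bot gets its own group)
        User-agent: BadBotOne
        Disallow: /

        User-agent: BadBotTwo
        Disallow: /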

    Currently, we are blocking known bad bots via Cloudflare’s Firewall Rules (no other method in use), and it seems to be working pretty well.
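
    For illustration, a Cloudflare Firewall Rule of this kind might use an expression like the following, with the action set to Block (the bot names are placeholders, not our actual rule):

        (http.user_agent contains "BadBotOne") or (http.user_agent contains "BadBotTwo")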

    Thoughts on this appreciated.

    Thank you!

  • Plugin Author Jeff Starr

    (@specialk)

    For performance, it’s usually optimal to implement measures at the server level (e.g., Apache/.htaccess, Nginx config, etc.). But every setup is different, as are the needs and goals of each site. So in some cases it may make sense to use a plugin, in others a CDN, and so forth. There is no one-size-fits-all solution when it comes to security (or most other things).

    I hope this helps. Feel free to post again with any other questions. Glad to help anytime.

  • The topic ‘Best Method for Blocking Bad Bots’ is closed to new replies.