Hello @rifatspir and thanks for reaching out to us!
I generally set my Rate Limiting Rules to these values to start with:
If anyone’s requests exceed 240 per minute
If a crawler’s page views exceed 120 per minute
If a crawler’s pages not found (404s) exceed 60 per minute
If a human’s page views exceed 120 per minute
If a human’s pages not found (404s) exceed 60 per minute
How long is an IP address blocked when it breaks a rule: 30 minutes
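If it helps to picture what those numbers mean in practice, here is a toy Python sketch of a per-IP, per-minute threshold with a 30-minute penalty. It is purely illustrative and not how Wordfence implements its rules internally:

import time
from collections import defaultdict, deque

# Toy illustration only: a sketch of what "240 requests per minute per IP,
# then a 30-minute block" means, not Wordfence's actual implementation.
WINDOW_SECONDS = 60
MAX_REQUESTS = 240          # "If anyone's requests exceed 240 per minute"
BLOCK_SECONDS = 30 * 60     # "How long is an IP address blocked": 30 minutes

hits = defaultdict(deque)   # ip -> timestamps of recent requests
blocked_until = {}          # ip -> time at which the penalty expires

def allow(ip, now=None):
    now = time.time() if now is None else now
    if blocked_until.get(ip, 0) > now:
        return False                      # still serving the 30-minute penalty
    window = hits[ip]
    window.append(now)
    while window and window[0] < now - WINDOW_SECONDS:
        window.popleft()                  # drop hits older than one minute
    if len(window) > MAX_REQUESTS:
        blocked_until[ip] = now + BLOCK_SECONDS
        return False                      # rule tripped: throttle/block this IP
    return True

The same idea applies to the crawler and human rules above, just with 120 or 60 as the threshold.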
I also always set each rule to Throttle instead of Block. Throttling is generally better than blocking because if a legitimate search engine crawler mistakenly trips a rule, it is only slowed down rather than shut out, so your site isn't penalized for it. Make sure to set your Rate Limiting Rules realistically, and set the value for how long an IP is blocked to 30 minutes or so.
https://www.wordfence.com/help/firewall/rate-limiting/ is an amazing resource for rate limiting rules.
You could use your robots.txt file to rate limit Facebook's crawler bot. The guides linked below explain the approach, and a short example follows them:
https://blog-en.openalfa.com/how-to-limit-the-crawl-rate-of-bots-in-a-website
https://developers.facebook.com/docs/sharing/webmasters/crawler/
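For reference, a robots.txt throttle looks roughly like this. Keep in mind that Crawl-delay is a non-standard directive, so support varies from crawler to crawler, and you should double-check the user agent string against Facebook's crawler documentation linked above:

User-agent: facebookexternalhit
Crawl-delay: 10

A crawler that honors the directive treats the delay as the minimum number of seconds between requests, so a value of 10 works out to roughly six page fetches per minute.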
Let me know if this helps!
Thanks!