Hi there Alfred,
I’ve just looked into the incident reported as a bad query, and I concluded it’s triggering one of the plugin’s detection mechanisms: every link that leads to a 404 (page not found) and has appended parameters gets reported.
So in this case, the URL on your website /en/mijn-fotos-in-een-galerij?page=6 (with the parameter page=6) doesn’t exist but does have a parameter in it, so the plugin treats it as a bad query.
We have a large database of known bad queries used for exploits, but to be able to discover new ones, we can’t let the plugin track only the known queries; it has to report new ones too.
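To illustrate the rule described above, here is a minimal sketch in Python; the function and the sample known-bad set are hypothetical, not the plugin's actual code or database:

```python
from typing import Optional
from urllib.parse import urlparse, parse_qs

# Illustrative stand-in for the real known-exploit database.
KNOWN_BAD_VALUES = {"../../etc/passwd", "<script>"}

def classify_request(path_with_query: str, is_404: bool) -> Optional[str]:
    """Any URL that returns a 404 and carries query parameters
    gets reported as a bad query, even if it's not yet known."""
    params = parse_qs(urlparse(path_with_query).query)
    if is_404 and params:
        # Report new queries too, so fresh attack patterns can be discovered.
        confirmed = any(v in KNOWN_BAD_VALUES
                        for values in params.values() for v in values)
        return "bad_query (confirmed)" if confirmed else "bad_query (new)"
    return None
```

With this sketch, the URL from your site would come out as a new (unconfirmed) bad query, since the page 404s and page=6 is a parameter:

```python
classify_request("/en/mijn-fotos-in-een-galerij?page=6", is_404=True)
# → "bad_query (new)"
```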
When this situation happens, the best thing to do is make sure the reported page/link that doesn’t exist gets fixed, and then send a request to unlist the IP that created that “incident”.
We are planning to introduce a new setting that will let users choose a “sensitivity” level and possibly skip a bad query if it isn’t already confirmed in the public database.
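A rough sketch of how that proposed setting could behave; the names and levels here are my own illustration, not the final design:

```python
def should_report(query_value: str, sensitivity: str, known_bad: set) -> bool:
    """At high sensitivity, report every suspicious query, including
    new ones; at low sensitivity, skip queries not yet confirmed
    in the public database."""
    if sensitivity == "high":
        return True
    return query_value in known_bad
```

Under this sketch, an unconfirmed query like page=6 would be skipped at low sensitivity but still reported at high sensitivity.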
Thanks again for your feedback and for helping us make the plugin better.
Regards
P.S. On the subject of your previous topic about crawler bots: when I checked the last query you reported there, I noticed that the same site reported Google bots again. So just to make sure, on your plugin settings page set the “Web Crawlers” button to the green state; green means it’s ON, red means it’s OFF. I also noticed the button could use a status text next to it showing and explaining the current state, and I’ll make sure to add that in the next release.