• Resolved AlfredG

    (@alfredg)


    Why is this a bad query? I see them a lot and they only address subpages.

    /en/mijn-fotos-in-een-galerij?page=6

  • Plugin Author Iridium Intelligence

    (@iridiumintel)

    Hi there Alfred,

    I’ve just looked into that incident reported as a bad query, and I concluded it simply triggered one of the plugin’s detection mechanisms: every link that leads to a 404 (page not found) and has any parameters appended will get reported.

    So in this case, the URL on your website /en/mijn-fotos-in-een-galerij?page=6 does not exist and has the parameter page=6 in it, so it is treated as a bad query by the plugin.
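
    To picture that mechanism, here is a minimal sketch of how such a check could look in a WordPress plugin (a simplified illustration, not the exact code from the plugin; the template_redirect hook and the core WordPress functions are real, but iridium_report_bad_query() is a hypothetical stand-in for the internal reporting):

        <?php
        // Simplified sketch (not the plugin's actual code): flag any request
        // that ends in a 404 and carries appended query parameters.
        add_action( 'template_redirect', function () {
            // is_404() is a core WordPress conditional; it is reliable once
            // the main query has run.
            if ( ! is_404() ) {
                return;
            }

            // Only report when the request actually has parameters, e.g. ?page=6.
            if ( empty( $_GET ) ) {
                return;
            }

            $url = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : '';
            $ip  = isset( $_SERVER['REMOTE_ADDR'] ) ? $_SERVER['REMOTE_ADDR'] : '';

            // Hypothetical helper standing in for the plugin's reporting.
            iridium_report_bad_query( $url, $ip );
        } );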

    We have a large database of known bad queries used for exploits, but to be able to get new findings we cannot let the plugin track only known ones; it must allow reporting of new ones too.

    When this situation happens, the best thing would be to make sure the reported page/link that doesn’t exist gets fixed, and to send a request for unlisting the IP that created that “incident”.

    We are planning to introduce a new setting that will allow the user to set a level of “sensitivity” and possibly skip a bad query if it is not already confirmed in the public database.

    Thanks again for your interactions and for helping us achieve a better quality plugin.

    Regards

    P.S. On the subject of your previous topic about crawler bots: when I checked this last query you reported in this topic, I noticed that the same site reported Google bots again. So just to make sure, on your plugin settings page set the “Web Crawlers” button to the green state; green means it is ON, red means it is OFF. I just noticed the button could use some status text next to it showing and explaining the current state, and I will make sure to add that in the next release.

    Thread Starter AlfredG

    (@alfredg)

    That could be the right indicator: why crawl a link that doesn’t exist? On the P.S.: after updating I had to switch the plugin off and on to be able to set “Web Crawlers” to the green state. Now it works.

    Plugin Author Iridium Intelligence

    (@iridiumintel)

    Indeed, crawling should start from the visible links on the entry/main page of the site and go deeper and deeper by following the links on crawled pages, and by using the sitemap and robots files if they exist.
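
    Just to illustrate that link-following idea, a toy crawler could look roughly like this in PHP (a simplified sketch of the general technique, not how any particular bot actually works; example.com is a placeholder):

        <?php
        // Toy sketch of link-following crawling (illustration only):
        // fetch a page, collect its links, and queue them for crawling.
        $queue   = array( 'https://example.com/' );    // entry/main page
        $visited = array();

        while ( $url = array_shift( $queue ) ) {
            if ( isset( $visited[ $url ] ) ) {
                continue;
            }
            $visited[ $url ] = true;

            $html = @file_get_contents( $url );
            if ( false === $html ) {
                continue;    // e.g. a 404 page: nothing useful to follow
            }

            $dom = new DOMDocument();
            @$dom->loadHTML( $html );    // suppress markup warnings

            // Follow every link on the crawled page, going deeper.
            foreach ( $dom->getElementsByTagName( 'a' ) as $a ) {
                $href = $a->getAttribute( 'href' );
                // Stay on the same site; a real bot would also honor robots.txt.
                if ( 0 === strpos( $href, 'https://example.com/' ) ) {
                    $queue[] = $href;
                }
            }
        }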

    So there is a possibility that you have that broken link somewhere on one of your pages, or that the fault comes from the crawler bot’s side: they may use old historical/cached data, or have “aggressive” scanning mechanisms that manually target URLs by increasing the values of parameters used in page queries.

    This scenario definitely showed us that we need a way to handle broken links from the plugin’s side, and we will work on that solution in the next versions.

    Thanks again for your helpful interactions.

    P.S. In versions 1.0.1 and 1.0.2 we implemented an additional function that automatically updated the plugin’s settings table as we introduced new settings fields. Since the number of installs is still low, and we want our plugin to have as few “extra” functions as possible, we assumed everybody had already updated and removed that function in version 1.0.3. So it is possible that you had a problem with a missing field in the settings table, which gets fixed by deactivating and reactivating the plugin. We’ll work on reintroducing that “upgrader function” in the next release.
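
    The idea of that “upgrader function” is roughly the following sketch (simplified for illustration, not the exact code from the plugin; the function name, option key and fields here are made up): merge newly introduced defaults into the saved settings so older installs don’t end up with missing fields.

        <?php
        // Simplified sketch (hypothetical names): fill in any settings
        // fields that were added after the user's install was created.
        function iridium_upgrade_settings() {
            $defaults = array(
                'web_crawlers'        => 'on',       // made-up example fields
                'bad_query_whitelist' => array(),
            );

            $saved = get_option( 'iridium_settings', array() );

            // wp_parse_args() keeps saved values and fills missing defaults.
            $merged = wp_parse_args( $saved, $defaults );

            if ( $merged !== $saved ) {
                update_option( 'iridium_settings', $merged );
            }
        }

        // Running this on activation is why deactivating and reactivating
        // fixes the missing field; running it on upgrade would avoid that step.
        register_activation_hook( __FILE__, 'iridium_upgrade_settings' );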

    Plugin Author Iridium Intelligence

    (@iridiumintel)

    Hi there Alfred,

    Just to notify you that we have just released a new version (v1.0.5) with a new option to manually whitelist detected bad queries.

    On the main page, under the Bad Queries report, you will notice an icon button to add a detected query/link to the list (or remove it from it), excluding it from future detection.
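
    Under the hood the idea is simply to check that list before reporting, roughly like this (a simplified sketch, not the exact plugin code; the helper name and option key are made up for illustration):

        <?php
        // Simplified sketch (hypothetical names): skip reporting when the
        // query was manually whitelisted by the user.
        function iridium_is_whitelisted( $url ) {
            $whitelist = get_option( 'iridium_bad_query_whitelist', array() );
            return in_array( $url, $whitelist, true );
        }

        // Inside the detection routine, before reporting:
        // if ( iridium_is_whitelisted( $url ) ) { return; }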

    Hope this is the help you needed in scenarios like the one you reported here.

    Regards

  • The topic ‘Why is this a bad query?’ is closed to new replies.