• Hi!

    My site is https://eileenanddogs.com and I’m running WP 4.7.1

    I got a “Soft 404s” notice from Google. I have seen questions similar to mine in this forum but my message from Google was slightly different. Here is the text:

    “Googlebot identified a significant increase in the number of URLs on https://eileenanddogs.com/ that should return a 404 (not found) error, but currently don’t. This can cause a bad experience for your users, who might have been looking for a specific page, but end up elsewhere on your website. This misconfiguration can also prevent Google from showing the correct page in search results.”

    I have taken a look: the soft 404s started on January 5, 2017, and are accumulating at a rate of about 20 per day. There were 171 through yesterday.

    Here is an example:
    https://eileenanddogs.com/?s=%E5%A4%A9%E9%BE%99Texas+Hold%E2%80%99em+poker+Q82019309.com

    Some of the characters are Chinese. The ones I checked are affiliated with gambling sites.
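    For the curious, the percent-encoded search query in URLs like the one above can be decoded with a few lines of Python (a quick sketch for inspecting the spam terms, not anything WordPress-specific):

```python
from urllib.parse import urlparse, parse_qs

# The example spam search URL quoted above.
url = "https://eileenanddogs.com/?s=%E5%A4%A9%E9%BE%99Texas+Hold%E2%80%99em+poker+Q82019309.com"

# parse_qs decodes the percent-escapes and turns '+' into spaces,
# revealing the search term that was injected into the ?s= parameter.
query = parse_qs(urlparse(url).query)
print(query["s"][0])
```

    Decoding shows a mix of Chinese characters and a gambling-related domain, which matches the pattern of automated spam hitting the site's search feature.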

    I use the free version of Wordfence and it hasn’t detected any problems.

    I recently installed some optimization plugins, one of which was W3 Total Cache. I then found out it has had some vulnerabilities.

    I have changed my site passwords and disabled W3 Total Cache. I won’t know until tomorrow whether these actions stemmed the tide.

    • Does this sound like a problem with the plugin?
    • Is there something else I need to do to secure my site?
    • And is there a way to get rid of the URLs to nonexistent pages?

    Thank you for your help!

Viewing 2 replies - 1 through 2 (of 2 total)
  • ‘https://eileenanddogs.com/?s=’ is a search link — the ?s= parameter is a WordPress search query.

    Your site may not be infected, which is why Wordfence does not detect any issues, but something is trying to generate fake searches on your site.

    Try setting up ‘Enable Rate Limiting and Advanced Blocking’ under the Wordfence options.

    Thread Starter eileenanddogs

    (@eileenanddogs)

    OK, thank you. I have checked the box to Enable Rate Limiting and Advanced Blocking. What should my Rate Limiting Rules be? Here are the choices:

    Immediately block fake Google crawlers:
    How should we treat Google’s crawlers? Treat Google like any other crawler?
    If anyone’s requests exceed: ?
    If a crawler’s page views exceed: ?
    If a crawler’s pages not found (404s) exceed: ?
    If a human’s page views exceed: ?
    If a human’s pages not found (404s) exceed: ?
    If 404s for known vulnerable URLs exceed: ?
    How long is an IP address blocked when it breaks a rule:
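
    All of these rules follow the same basic idea: count each visitor’s requests (or 404s) in a time window, and block the IP for a while once a threshold is exceeded. A minimal sketch of that mechanism, purely for illustration — this is not Wordfence’s actual code, and the class and parameter names are made up:

```python
import time
from collections import defaultdict


class RateLimiter:
    """Block an IP once its request count in a time window exceeds a limit."""

    def __init__(self, limit, window_seconds, block_seconds):
        self.limit = limit                  # max requests allowed per window
        self.window = window_seconds        # length of the counting window
        self.block = block_seconds          # how long an offending IP stays blocked
        self.hits = defaultdict(list)       # ip -> timestamps of recent requests
        self.blocked_until = {}             # ip -> time when the block expires

    def allow(self, ip, now=None):
        """Return True if this request is allowed, False if the IP is blocked."""
        now = time.time() if now is None else now
        if self.blocked_until.get(ip, 0) > now:
            return False                    # still serving an earlier block
        # Keep only requests inside the current window, then record this one.
        window_start = now - self.window
        recent = [t for t in self.hits[ip] if t > window_start]
        recent.append(now)
        self.hits[ip] = recent
        if len(recent) > self.limit:
            self.blocked_until[ip] = now + self.block
            return False
        return True
```

    In Wordfence’s terms, “If anyone’s requests exceed” corresponds to the limit and window, and “How long is an IP address blocked when it breaks a rule” corresponds to the block duration. Conservative starting values (e.g. blocking after a few hundred requests per minute, with a 5-minute to 1-hour block) are common, since legitimate human visitors rarely come close to those rates.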

    Thank you for your help, luckychingi

  • The topic ‘Soft 404s accumulating: Site Hack or Plugin?’ is closed to new replies.