This was my last issue; it may be yours too, but it may not be.
__________________________________
System administration has identified your account as using an unusually high amount of resources on the server that houses it. This is impacting other users, and we may be forced to suspend your site, or may have already suspended it, in order to stabilize the server.
We noticed that your site is being heavily ‘crawled’ by search engines. Search engines tend to mimic the effect of hundreds of visitors going through every portion of your site, often all at once.
You may wish to implement a robots.txt file in order to reduce this effect. This file contains instructions for well-behaved ‘robots’ on how to crawl your site. You can find more information about this here:
https://www.robotstxt.org/.
The basic format is as follows; this example blocks robots from a few example directories and sets a 30-second delay between requests:
User-agent: *
Crawl-delay: 30
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/
Disallow: /private/
Crawl-delay is an unofficial extension to the robots.txt standard, but one that most popular search engines honor. One notable exception, however, is Google’s crawlers, which ignore the directive and instead require you to set the crawl rate in Google Webmaster Tools. We have a step-by-step guide on doing so at this URL:
https://www.inmotionhosting.com/support/website/google-tools/setting-a-crawl-delay-in-google-webmaster-tools
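As a rough sketch, and assuming the extra traffic is coming from crawlers that honor Crawl-delay (Bing’s crawler, for instance, does), you can split the file into per-crawler groups so the delay only applies where it will actually be respected:

# Crawlers that honor Crawl-delay get a throttled group
User-agent: Bingbot
Crawl-delay: 30
Disallow: /private/

# Googlebot ignores Crawl-delay; its group lists only disallowed paths,
# and the crawl rate itself is set in Google Webmaster Tools
User-agent: Googlebot
Disallow: /private/

Crawlers that do not match a named group fall back to the User-agent: * group, so you can keep the generic rules from the earlier example alongside these.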
Crawl delays and disallowed directories are particularly useful for parts of your site, such as forums or ‘tag clouds’, that are helpful to human visitors but generate large numbers of pages that robots aggressively crawl over and over.
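For example, if your forum software exposes its search results under /forum/search/ and your blog’s tag pages under /tag/ (placeholder paths; check the URLs your own software actually generates), the corresponding lines would look like this:

User-agent: *
Crawl-delay: 30
Disallow: /forum/search/
Disallow: /tag/

In practice you would add the Disallow lines to the single User-agent: * group in your existing robots.txt rather than creating a second file.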
You can also use your access logs to see how search engines are hitting your site. Let us know if you need help finding your logs in our control panel and we’ll be glad to assist.
If your site is currently suspended, please contact us to have the suspension lifted so that you can implement the recommendations above. As always, feel free to reach out with any further questions.