  • I’ll post what I just got from my host. I’ve had the occasional email from them about system resources.

    I have 1800 products.

    This was my last issue; it may be yours too, but it may not be.

    __________________________________

    System administration has identified your account as using a high level of resources on the server housing your account. This is impacting other users, and we may be forced to suspend or have already suspended your site in order to stabilize the server.

    We noticed that your site is being heavily ‘crawled’ by search engines. Search engines tend to mimic the effect of hundreds of visitors going through every portion of your site, often all at once.

    You may wish to implement a robots.txt file in order to reduce this effect. This file contains instructions for well-behaved ‘robots’ on how to crawl your site. You can find more information about this here:
    https://www.robotstxt.org/.

    The basic format would be as follows; this example blocks robots from a few sample directories and sets a 30-second delay between requests:

    User-agent: *
    Crawl-delay: 30
    Disallow: /cgi-bin/
    Disallow: /images/
    Disallow: /tmp/
    Disallow: /private/

    Crawl-delay is an unofficial extension to the robots.txt standard, but one that most popular search engines honor. One notable exception, however, is Google’s crawlers, which instead require you to set this delay in Google Webmaster Tools. We have a step-by-step guide on doing so at this URL:
    https://www.inmotionhosting.com/support/website/google-tools/setting-a-crawl-delay-in-google-webmaster-tools

    The delay and the disallowed directories are particularly useful for parts of your site, like forums or ‘tag clouds’, that are useful to human visitors but troublesome in terms of how aggressively and repeatedly robots pass through them.

    You can also use your access logs to see how search engines are hitting your site. Let us know if you need help finding your logs in our control panel and we’ll be glad to help.

    If your site is currently suspended, please contact us to lift the suspension in order to implement the above recommendation. As always, feel free to contact us with any further questions.
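    To see for yourself whether crawlers are what’s hitting you, this is roughly the kind of check the host means by looking at your access logs (the log path below is just an example; your control panel will show where yours actually live):

    # top user agents in an Apache combined-format access log
    awk -F'"' '{print $6}' ~/access-logs/example.com | sort | uniq -c | sort -rn | head -20

    # quick count of hits from the big crawlers
    grep -ciE 'googlebot|bingbot|yandex|baiduspider' ~/access-logs/example.com

    If the top few lines are almost all bots, the robots.txt above (plus the Google setting) should take a lot of the load off.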

    Thread Starter Don (@d_random)

    gmet, THANK YOU!!

    It sounds like I may be experiencing the same problem. Here is a message from my host; what do you think?

    _____________________________________________________
    Currently it appears that your server is using all of its RAM, as well as a large portion of its swap, which is what is causing it to crash.

    [root@server ~]# free -m
                 total       used       free     shared    buffers     cached
    Mem:           489        184        304          0          3         32
    -/+ buffers/cache:        148        340
    Swap:         1023        147        876

    The top processes using this RAM are standard httpd requests, which most likely means the memory is being used by people viewing your website on the server.

      PID USER     PR  NI  VIRT   RES   SHR S %CPU %MEM    TIME+  COMMAND
     2539 apache   20   0  353m  7316  4248 S  0.0  1.5  0:02.99  /usr/sbin/httpd

    It appears that you may be getting more traffic than can be handled with your 512 MB of RAM, and I would recommend adding more to your server.

    I’m an artist, and a novice at this stuff; trial and error at best.

    I think I’m seeing the spikes that may be causing my problem. I’m glad my host gave some indication of how to fix it.

    Definitely make that Google setting change (link in my previous post)

    <looks like it’s a 90-day setting, then Google resets it>

    In Google Webmaster Tools
    Crawl > Crawl Stats

    Worth a try, at least before they get you to buy more RAM.
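    If you want to keep an eye on how close you really are to that 512 MB before paying for more, here are a couple of quick checks over SSH (this assumes Apache runs as the ‘apache’ user like in your host’s output; on Debian/Ubuntu it is usually ‘www-data’):

    # overall memory in MB; the -/+ buffers/cache row is the real headroom
    free -m

    # rough total resident memory used by all Apache workers
    ps -u apache -o rss= | awk '{s += $1} END {printf "%d MB\n", s/1024}'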

    good luck

    Reminds me to look back into CloudFlare; there’s a free version.

    It loads a lot of your website from their cloud, and there are also some good security benefits that help protect against getting hacked.

    You set your nameservers to point to CloudFlare’s servers.
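    If you do try it, you can confirm the nameserver switch from any terminal once it has propagated (example.com is a placeholder for your own domain):

    dig NS example.com +short
    # should list a pair of *.ns.cloudflare.com hosts when the change is live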

  • The topic ‘How much memory needed for 2K products?’ is closed to new replies.