  • FYI, the solution to this issue for me was to reset my permalinks.
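
    For anyone who prefers to do that from code: re-saving Settings → Permalinks just regenerates the rewrite rules, which is what maps the robots.txt request onto WordPress’s virtual handler. A minimal sketch (drop it in wp-content/mu-plugins/, load the site once, then delete it, since flushing on every request is expensive):

    <?php
    // Roughly the code equivalent of re-saving Settings → Permalinks.
    add_action( 'init', function () {
        flush_rewrite_rules( false ); // soft flush: regenerate rewrite rules without rewriting .htaccess
    } );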

    I don’t know if this is of any use to anyone, but I also ran into this problem on a domain-mapped multisite install.

    The problem in my case was that the request for robots.txt was being intercepted by Nginx instead of being passed through to WordPress.

    Adding this fixed the problem:

    location = /robots.txt {
    	allow		all;
    	log_not_found	off;
    	access_log	off;
    	# Serve a physical robots.txt if one exists; otherwise hand the request
    	# to WordPress (index.php) so it can generate the virtual robots.txt.
    	try_files $uri $uri/ /index.php?$args;
    }

    I can’t imagine Apache would be too different.

    Check whether a robots.txt file is physically present in the root directory and is readable.

    Most SEO plugins, including wordpress-seo, use a common robots.txt file across the network in a multisite install.
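
    For context on why that happens: when there is no physical robots.txt, WordPress answers the request itself via do_robots() and passes the output through the 'robots_txt' filter, which is the hook SEO plugins commonly use, so every site on the network sees the same generated file. A minimal sketch of that filter (the /private/ rule is purely an illustrative example):

    <?php
    // Append a rule to the virtual robots.txt that WordPress generates in do_robots().
    add_filter( 'robots_txt', function ( $output, $is_public ) {
        if ( $is_public ) { // only when the site isn't set to discourage search engines
            $output .= "Disallow: /private/\n"; // hypothetical example rule
        }
        return $output;
    }, 10, 2 );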

    That was my first thought too, Rahul, but in my case there was no other physical or virtual robots.txt.

    Hmm. I checked and found a robots.txt in my root dir.

    Maybe your document root is not writable by the PHP process owner.

    A small suggestion: try creating an empty robots.txt and making it 0777. If your SEO plugin (if you’re using one) then updates it, the original problem was most likely a permissions issue (there’s a quick check sketched below).

    We should avoid sending requests to PHP as much as possible.
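
    If you want to confirm the permissions situation from PHP’s point of view, here is a minimal sketch: save it as a throwaway file (say check-robots.php — the name is just an example) in the WordPress root, open it once in a browser, then delete it.

    <?php
    // Check, as the PHP process owner, whether the document root and robots.txt
    // are readable/writable. ABSPATH is WordPress's install root.
    require __DIR__ . '/wp-load.php';

    $file = ABSPATH . 'robots.txt';

    var_dump(
        is_writable( ABSPATH ),                       // can PHP create robots.txt here?
        file_exists( $file ) && is_readable( $file ), // is an existing file readable?
        file_exists( $file ) && is_writable( $file )  // could a plugin update it?
    );

    Once you’ve confirmed it is a permissions problem, 0644 owned by the PHP user is usually enough; 0777 is best treated as a temporary test setting.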

  • The topic ‘Network sites 404 Error on robots.txt’ is closed to new replies.