• Hi all,

    Any attempt to access the robots.txt file for one of the websites I run — idolchatteryd.com — comes back with a 500 Internal Server Error. That happens when I type idolchatteryd.com/robots.txt into the URL bar, and (more disturbingly) it happens when I try to “fetch as Google” using Google Webmaster, which tells me the file is unreachable. For this reason, Google has — *gasp* — deindexed the site.

    I really would like the site to be indexed, obviously, so what the heck do I do?

    My Googling has told me that the .htaccess might be the problem, but I don’t really know what to do with that information, or what I’d be looking for that could be problematic. So…help, please?
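    For what it’s worth, one common way an .htaccess file produces a blanket 500 error is a directive for an Apache module that isn’t actually loaded on the server. This is only a hypothetical illustration (the `Header` directive stands in for whatever directive might be in your file):

    ```apache
    # Hypothetical example: if mod_headers is not loaded, this single
    # line makes Apache return 500 for every request, including
    # /robots.txt, with "Invalid command 'Header'" in the error log.
    Header set X-Example "value"

    # Safer form: wrap module-specific directives so Apache skips them
    # when the module is absent instead of erroring out.
    <IfModule mod_headers.c>
        Header set X-Example "value"
    </IfModule>
    ```

    Checking the server’s error log right after requesting /robots.txt is usually the quickest way to see which line of .htaccess (if any) is to blame.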

    Thanks!

Viewing 3 replies - 1 through 3 (of 3 total)
  • Have you checked to see if the file is really there?

    And your links above are incorrect.

    Thread Starter sephcot (@sephcot)

    Now that you mention it, the file isn’t there in FTP. Sorry, I’m pretty new to this. But the issue remains: Google gets that internal server error when it looks for a robots.txt to figure out which bots are allowed to crawl the site. Any suggestions?

    And my bad: idolchatteryd.com and idolchatteryd.com/robots.txt

    An internal server error means something is misconfigured on the server. If it were just a missing file, you would get a 404 error instead.

    You don’t need a robots.txt file to be indexed by Google or anyone else. Bots look for it and follow its instructions; if they don’t find one, they simply go about their work. The trouble is that when they look here, they get a 500 error instead.
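    Once the server error is sorted out, a minimal robots.txt that allows all crawlers is just (an empty Disallow means nothing is blocked):

    ```
    User-agent: *
    Disallow:
    ```

    Uploading that to the site root is optional, but it stops crawlers from hitting an error when they check for the file.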

    Time to ask GoDaddy for some assistance. Good luck.

  • The topic ‘Robots.txt internal server error’ is closed to new replies.