robots.txt is invalid – pagespeed.web.dev
-
I get an error from Google that the robots.txt is not valid. How do I fix this?
The page I need help with: [log in to see the link]
-
Hi,
Your robots.txt file appears to have some issues. Here’s the corrected version:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://ixtract.de/sitemap_index.xml
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
In your original robots.txt file, you had two sets of User-agent directives. According to the robots.txt specification, directives for the same user-agent should be grouped together. I’ve combined the directives into one block for the wildcard user-agent “*”, ensuring clarity and correct interpretation by search engine crawlers.
Additionally, I’ve removed the “User-agent: *” line that appears after the Sitemap directive. This line is unnecessary because the wildcard user-agent “*” already applies to all user-agents.
Lastly, I’ve corrected the Allow directive for /wp-admin/admin-ajax.php to be on a separate line, as it should be a separate directive from the Disallow directive for /wp-admin/.
With these adjustments, your robots.txt file should now be correctly formatted and functional. Make sure to test it using Google’s robots.txt Tester tool in Google Search Console to verify its correctness and effectiveness.
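If you want a quick local sanity check before re-running PageSpeed, here is a minimal Python sketch using only the standard library's urllib.robotparser. The rules and URLs are the ones from this thread; the "Googlebot" user-agent string is just an example. Note that Python's parser applies rules in file order rather than Google's longest-match rule, so the Allow override for admin-ajax.php is best verified in Google's own tester:

from urllib.robotparser import RobotFileParser

# The corrected rules from above, fed to Python's built-in parser.
rules = """User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://ixtract.de/sitemap_index.xml
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json""".splitlines()

parser = RobotFileParser()
parser.parse(rules)
parser.modified()  # record that rules are loaded so can_fetch() does not assume everything is blocked

# Expected: the admin area and the WP-Optimize JSON file are blocked, normal pages are crawlable.
print(parser.can_fetch("Googlebot", "https://ixtract.de/wp-admin/"))  # False
print(parser.can_fetch("Googlebot", "https://ixtract.de/wp-content/uploads/wpo-plugins-tables-list.json"))  # False
print(parser.can_fetch("Googlebot", "https://ixtract.de/some-page/"))  # True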
I hope this helps.
Thank you
Thanks for your help. Actually, the robots.txt is the automatically generated file from Rank Math, so I haven’t edited it so far …
But I will add your lines and see if it works out.
It worked out, but after a while Rank Math mixed up the robots.txt again. Is there a way to lock the robots.txt? This is what it looks like now:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://ixtract.de/sitemap_index.xml
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
User-agent: *
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
Hello @hirschferkel,
Thank you for contacting Rank Math support.
You should remove the following part from the robots.txt file, as it isn’t needed:
User-agent: *
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json

Here is how to edit your robots.txt file: https://rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/
Please follow this guide to fix the issue in the PageSpeed test: https://rankmath.com/kb/fix-common-robots-txt-issues/#robots-txt-is-not-valid-in-pagespeed-insights
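For reference, removing that block from the file posted earlier should leave something like this (a sketch based on the lines quoted in this thread):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://ixtract.de/sitemap_index.xml
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json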
Hope that helps and please do not hesitate to let us know if you need our assistance with anything else.
I already changed the robots.txt, but I just wonder why it keeps getting created the wrong way again? I don’t use a CDN.
Even though I changed the robots.txt in Rank Math, the wrong robots.txt is still being served instead of the Rank Math one. Any idea how to fix this? https://ixtract.de/robots.txt
Hello @hirschferkel,
You should check your website’s root directory for a static robots.txt file.
Rank Math can only edit your robots.txt virtually; if there is a static file in your root directory, Rank Math cannot override it, because Rank Math’s robots.txt is generated on the fly rather than written to disk.
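If you have shell or FTP access, you can check for a physical file directly in the WordPress installation folder. A minimal Python sketch, assuming you run it on the server; the path below is only a placeholder, not something from this thread:

import os

webroot = "/var/www/html"  # hypothetical path - replace with your actual WordPress root directory
static_file = os.path.join(webroot, "robots.txt")

# If this prints True, a physical robots.txt exists on disk and it will be served
# instead of the virtual file that Rank Math generates.
print(os.path.exists(static_file))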
Here’s a link for more information:
https://rankmath.com/kb/cant-edit-robots-txt/
Looking forward to helping you.
There is no static robots.txt file. It seems the “virtual” file still gets overwritten by other plugins, such as WP-Optimize. Dealing with a virtual robots.txt seems to be the root cause of the problem.
Now I know why I ran into this trouble. I have fixed the issue in the meantime.
Hello @hirschferkel,
WP-Optimize adds the Disallow rule to the robots.txt file, but it shouldn’t duplicate the user-agent block or add the rule twice. There could be some other plugin conflicting with the robots.txt generated by our plugin. However, since you were able to edit it and fix the issue, there is nothing to worry about.
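As a rough way to keep an eye on this, you can fetch the file that is actually being served and scan it for duplicate "User-agent: *" groups. A minimal Python sketch using only the standard library; the URL is the one posted in this thread:

from urllib.request import urlopen

# Fetch the robots.txt that is actually being served to crawlers.
with urlopen("https://ixtract.de/robots.txt") as response:
    body = response.read().decode("utf-8", errors="replace")

# Count how many "User-agent: *" lines start a group.
wildcard_groups = [
    line for line in body.splitlines()
    if line.strip().lower().replace(" ", "") == "user-agent:*"
]

print(body)
if len(wildcard_groups) > 1:
    print("Warning: more than one 'User-agent: *' group - some plugin is still appending rules.")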
Please do not hesitate to let us know if you need our assistance with anything else.
- The topic ‘robots.txt is invalid – pagespeed.web.dev’ is closed to new replies.