Reviving this because I was having the same issue and trying to find the answer here. Everything was fine on the host's side.
My site was not indexing. GSC was reporting that robots.txt was unreachable, and it also couldn't fetch my sitemap.
I also had Really Simple SSL installed. Go to that plugin > Settings > Hardening > uncheck "Disable the Built-In File Editors".
Go back to Rank Math > ensure your sitemap is available. Robots.txt should now be writable.
Purge your site cache (I use W3 Total Cache).
I was then able to request indexing of my URLs and re-submit the sitemap to Google.
(In my case, GSC had a quirk: if you originally submitted the sitemap while this robots.txt issue was happening, it would say "can't fetch" instead of pending. Just TEST the sitemap URL [the sitemap shouldn't be indexable, so test it instead of requesting indexing]. The sitemap should be crawlable.)
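If you want to double-check before resubmitting, here is a minimal Python sketch that just confirms robots.txt and the sitemap return HTTP 200 and that robots.txt doesn't block the sitemap for Googlebot. The domain and the sitemap path (Rank Math's usual /sitemap_index.xml) are assumptions, so adjust them to match your own install.

```python
# Minimal reachability check for robots.txt and the sitemap.
# SITE and SITEMAP below are placeholders -- change them to your own URLs.
from urllib import request, robotparser

SITE = "https://example.com"           # hypothetical domain, change to yours
SITEMAP = SITE + "/sitemap_index.xml"  # Rank Math's default index name; verify in the plugin
ROBOTS = SITE + "/robots.txt"

def status(url):
    """Return the HTTP status code for a plain GET of the URL."""
    with request.urlopen(url, timeout=10) as resp:
        return resp.status

print("robots.txt ->", status(ROBOTS))   # expect 200
print("sitemap    ->", status(SITEMAP))  # expect 200

# Parse robots.txt and confirm Googlebot is allowed to crawl the sitemap URL.
rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS)
rp.read()
print("Googlebot may fetch sitemap:", rp.can_fetch("Googlebot", SITEMAP))
```

If either request fails or the last line prints False, the robots.txt issue above probably isn't fully cleared yet, so fix that before resubmitting in GSC.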