I’m looking for clarification on how The Events Calendar automatically applies <meta name="robots" content="noindex"> to empty event views, as described in your documentation.
Issue
Your SEO & Performance Issues article states:
“Our plugin adds a noindex meta tag to your page header if there’s no events listed on that page, to block this duplicate content.”
However, in my testing:
- Local WP install (fresh site, only TEC installed): empty event views do not show noindex in the HTML.
- TEC demo site (example): noindex is also missing on empty views.
This seems to contradict the expected behavior described in the documentation.
Questions
1. Under what conditions does TEC automatically apply noindex to empty event views?
2. Are there known cases where TEC’s noindex function is confirmed to be working as described?
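For anyone trying to reproduce this, here is a quick programmatic check for the tag (a sketch using only the Python standard library; the helper names are mine, not part of TEC):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page carries a robots meta tag containing noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.robots_directives)
```

Note that noindex can also be delivered via an X-Robots-Tag HTTP header, so it is worth checking the response headers of the empty view as well as its HTML.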
I appreciate any guidance on this! It’s a question that has been asked many times but never seems to get resolved, and I want to better understand how to implement TEC’s recommended SEO best practices effectively.
Thanks in advance!
---

Hello everyone,
Site configuration:
WordPress: 6.6.2
PHP/MySQL version: 8.2 /
Rank Math SEO 1.0.239
Site: https://www.pepiniere-courtin.fr
I installed Rank Math SEO 4 days ago.
I notice several things that intrigue me:
1) Contrary to the documentation, there is no robots.txt file at the root of the site, nor anywhere else on the site (in the subfolders).
2) I saw that you can view the robots.txt file by going to the Rank Math SEO dashboard but you can't modify it.
I can also see the content by going to:
https://www.pepiniere-courtin.fr/robots.txt
Here is the content of the robots.txt file:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.pepiniere-courtin.fr/sitemap_index.xml
3) I notice that the line Disallow: /wp-includes/ is missing in the robots.txt.
My understanding was that a WordPress robots.txt file should include the line Disallow: /wp-includes/.
4) Why is this line not added?
5) How can I add this line to robots.txt when this file is not on the site?
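For what it’s worth, robots.txt is allow-by-default: any path not matched by a Disallow rule may be crawled, so the absence of a Disallow: /wp-includes/ line does not expose anything by itself (and, as far as I know, recent WordPress versions no longer include that line in their default virtual robots.txt). This can be verified against the rules quoted above with the Python standard library:

```python
from urllib.robotparser import RobotFileParser

# The rules quoted above, as served by the virtual robots.txt.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# /wp-admin/ is matched by a Disallow rule, so it is blocked...
assert not parser.can_fetch("*", "https://www.pepiniere-courtin.fr/wp-admin/settings.php")
# ...while /wp-includes/ matches no rule at all and is therefore crawlable.
assert parser.can_fetch("*", "https://www.pepiniere-courtin.fr/wp-includes/js/jquery.js")
```

If you still want to add the line, WordPress exposes a robots_txt filter that plugins and themes can hook to append rules to the virtual file; Rank Math also ships its own robots.txt editor, though where it appears in the UI may depend on the version.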
Thank you in advance.
Kind regards,
---

I migrated my website hosting from GoDaddy cPanel hosting (WordPress website) to Hostinger’s WordPress hosting two days ago.
The transition was not easy. I had to disable WP Rocket and start using LiteSpeed Cache. There were a few errors in the beginning, but I figured my way out somehow. The problem began two hours ago when I checked PageSpeed Insights. It showed no error in the Desktop report, but the Mobile report said the robots.txt file has errors: it showed the complete HTML document of my website instead of the robots.txt content. Then I tried to reach my robots.txt file by going to mydomain/robots.txt, but it kept redirecting me to my homepage.
I checked the file manager and the robots.txt file was present. I deleted it and tried to recreate it with Yoast SEO. As soon as I clicked “Create robots.txt file”, it took me to my page source. I went back and it had created a file, which I saved.
But still the file was inaccessible.
Finally, I deleted the robots.txt file. But now, going to mydomain/robots.txt shows my file, but with XML code at the beginning.
It’s some sort of ghost file that Google Search Console can read and report errors against.
I have tried disabling LiteSpeed Cache, Quic.cloud CDN, Yoast SEO. I’ve tried clearing cache from CDN dashboard as well as from LiteSpeed Cache plugin.
Still, it’s showing a ghost file with the wrong code. I’ve searched the file manager for other robots.txt files but couldn’t find any.
Mydomain/robots.txt reads:
<?xml encoding="UTF-8"><?xml encoding="UTF-8"><?xml encoding="UTF-8"><p># START YOAST BLOCK
# ---------------------------
User-agent: *
Disallow:
Sitemap: https://mydomain/sitemap_index.xml
# ---------------------------
# END YOAST BLOCK</p>
I’m not a developer, but I can implement code changes if suggested. Please help.
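For anyone debugging the same symptom: the repeated <?xml encoding="UTF-8"> prolog and the <p> wrapper suggest the virtual robots.txt response is being passed through an HTML/XML processing step somewhere (a plugin or a server-side optimizer), because a genuine robots.txt is plain text with only comments and directives. A small heuristic check that flags such a wrapped response (my own helper, not part of any plugin):

```python
def looks_like_valid_robots(body: str) -> bool:
    """Heuristic check: a real robots.txt is plain text whose meaningful lines
    are comments or known directives -- never XML prologs or HTML tags."""
    stripped = body.lstrip()
    # The "ghost file" symptom: an XML prolog or HTML markup wrapping the rules.
    if stripped.startswith("<?xml") or stripped.startswith("<"):
        return False
    known = ("user-agent:", "disallow:", "allow:", "sitemap:", "crawl-delay:", "#")
    for line in stripped.splitlines():
        line = line.strip().lower()
        if line and not line.startswith(known):
            return False
    return True
```

Because the wrapped copy may be cached at several layers, it is worth rechecking the URL after purging both the LiteSpeed Cache and the CDN cache on every change.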
---

Example paginated path: news-views/page/26
I would like to stop Google from indexing news pages other than page one, but I can’t see a way of overriding the robots.txt output. Is it possible?
I’d like to add something like this:
Disallow: /news-views/page/*$
Allow: /news-views/page/1$
or possibly replace the robots.txt file completely, as it is very basic and doesn’t need to be controlled by The SEO Framework plugin.
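For reference, the `*` and `$` in rules like those proposed above follow Google’s extended matching semantics, which can be sketched as a regex translation (a hypothetical helper of my own, not part of The SEO Framework):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Match a robots.txt path pattern against a URL path, supporting the
    '*' (any character run) and trailing '$' (end anchor) extensions."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'; robots rules are
    # otherwise plain prefix matches, hence the leading anchor only.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None
```

Two caveats worth considering: on a typical WordPress setup, page one is served at /news-views/ rather than /news-views/page/1, so the Allow line may never match anything; and Disallow only blocks crawling, not indexing, so a noindex robots meta tag on paginated archives is usually the more direct way to keep them out of Google’s index.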
---

I’m encountering an issue where the AIOSEO plugin is unable to create the robots.txt file. The plugin displays the following message:
“It looks like you are missing the proper rewrite rules for the robots.txt file. It appears that your server is running on nginx, so the fix will most likely require adding the correct rewrite rules to our nginx configuration. Check our documentation for more information.”
I’ve reviewed the documentation here:
https://aioseo.com/docs/nginx-rewrite-rules-for-robots-txt/
My WordPress site is hosted on an Azure App Service, where we don’t have the ability to modify the NGINX configuration as recommended. Is there an alternative workaround that would allow AIOSEO to manage or generate the robots.txt without needing to change NGINX rewrite rules on Azure App Service?
Any guidance would be appreciated. Thank you!
---

I have enjoyed this plugin with no problems. But recently I added an AI-scraper robots.txt plugin, and after some troubleshooting I realized that if I activate your plugin’s “Add sitemap URL to the virtual robots.txt file” option, it blocks the function of the AI-scraper blocker plugin.
So I unchecked it and everything seems fine.
But now my robots.txt links to wp-sitemap.xml instead of sitemap.xml, and wp-sitemap.xml, which appears to be generated by your plugin, is blank. Is there anything I can do to fix this without having to stop using the AI-scraper blocker?
---

I would like to report a bug related to the sitemap URLs declared in the robots.txt file when using The SEO Framework in combination with Polylang.
On my test site, the sitemap section in the robots.txt file looks like this:
Sitemap: https://app.local/en/sitemap.xml
Sitemap: https://app.local/en/fr/sitemap.xml
The first URL is correct, but the second one includes two language codes (/en/fr/), while it should only show /fr/. The proper URL should be:
Sitemap: https://app.local/fr/sitemap.xml
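A quick way to scan a generated robots.txt for this doubled-prefix symptom (the language list and helper are my own illustration, not part of either plugin):

```python
from urllib.parse import urlparse

LANG_CODES = {"en", "fr"}  # the site's Polylang languages (my assumption)

def has_doubled_language_prefix(url: str) -> bool:
    """Flag sitemap URLs whose path begins with two language codes in a row,
    e.g. /en/fr/sitemap.xml -- the symptom described above."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return (len(segments) >= 2
            and segments[0] in LANG_CODES
            and segments[1] in LANG_CODES)
```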
Could you please investigate this issue?
Thank you!
---

I am trying, with the free version of Yoast SEO, to access Tools > File editor in order to correct the robots.txt of my website.
The problem is that this option does not appear. I only see the options shown in the attached screenshots: Import and Export, Bulk editor, and Optimize SEO data.
How can I access it?