Viewing 6 replies - 1 through 6 (of 6 total)
  • Plugin Author Sybre Waaijer

    (@cybr)

    Disable the “Detect browser language” setting in Polylang – you can find it at “WP Admin → Languages → Settings → Module”.

    To learn more, see: https://kb.theseoframework.com/kb/translation-plugin-compatibility/#polylang

    Thread Starter rik1234

    (@rik1234)

    PS: we use Polylang (not because it is free, but because WPML slowed down our site massively). We have read this article (https://kb.theseoframework.com/kb/translation-plugin-compatibility/#same-site-sitemaps), and we have already disabled the “Detect browser language” function, so that is not the problem.

    Thread Starter rik1234

    (@rik1234)

    Haha, we posted simultaneously, I see. So, just to reiterate: this setting is already disabled (and has been for a long time).

    Thread Starter rik1234

    (@rik1234)

    Hi @cybr,
    Part of the problem is solved: Google has now accepted the Dutch sitemap too. It just took some time.
    But after doing some more digging and asking the Google Search Console community for help, they pointed out a possibly related issue: our robots.txt file only mentions the English sitemap:
    https://giveforgood.world/robots.txt
    This file is created by The SEO Framework, right?
    Currently, we have separate sitemaps for the EN and NL parts of our site, and the robots.txt only mentions the EN one.
    To get our site crawled correctly, would it not be better if:
    A) the robots.txt mentioned both sitemaps (this seems to be possible: https://stackoverflow.com/questions/2594179/multiple-sitemap-entries-in-robots-txt; see the sketch at the end of this post), OR
    B) the sitemaps were merged into one sitemap that is then referenced in robots.txt (from what I’ve read, a multilanguage setup can use either separate sitemaps or one overarching sitemap that indicates the languages)?
    Google now has both our sitemaps, but getting this robots.txt file right might be important for other search engines, I think.
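    For illustration, option A would mean a robots.txt roughly like this (the NL sitemap location is my guess, based on our Dutch pages living under /nl/):

        User-agent: *
        Disallow:

        Sitemap: https://giveforgood.world/sitemap.xml
        Sitemap: https://giveforgood.world/nl/sitemap.xml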

    • This reply was modified 3 years, 4 months ago by rik1234.
    Plugin Author Sybre Waaijer

    (@cybr)

    Hi Rik,

    I’m glad time solved the issue. :) I scratched my head after your reply two days ago and couldn’t find a proper way to guide you further.

    Yes, the robots.txt output is augmented by TSF. To be technically correct: WordPress outputs that file virtually, and TSF modifies it.

    The alternative sitemap links aren’t added to the robots.txt output because we cannot confidently “predict” where they are located: we’d have to “switch” the site to another language and generate a URL, which is an expensive operation.
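    If you want the Dutch sitemap listed anyway, you can append that line yourself via WordPress’s robots_txt filter (the standard way to modify the virtual file). A minimal sketch, assuming your NL sitemap lives at /nl/sitemap.xml (adjust to your actual setup):

        add_filter( 'robots_txt', function ( $output, $public ) {
            // $public comes from the 'blog_public' option; '0' is falsy in PHP,
            // so the line is only appended when search engines aren't discouraged.
            if ( $public ) {
                // Hypothetical NL sitemap location; adjust to your Polylang setup.
                $output .= "\nSitemap: https://giveforgood.world/nl/sitemap.xml\n";
            }
            return $output;
        }, 10, 2 );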

    Most search engines use that link as a “hint”. It’s always better to submit your sitemap.

    1. Google: https://developers.google.com/search/docs/advanced/sitemaps/build-sitemap#addsitemap
    2. Bing: https://www.bing.com/webmasters/help/Sitemaps-3b5cf6ed
    3. Yandex: https://yandex.com/support/webmaster/indexing-options/sitemap.html

    Most Western search engines (DuckDuckGo, Ecosia, Qwant, Yahoo!, etc.) rely on Bing’s index. So, you only need to give Bing your sitemap locations to support them.

    We won’t merge the alternative language sitemaps. To reiterate, we’d have to perform expensive operations: for the robots.txt output, the cost is somewhat trivial, but for sitemaps, it would bring servers to their knees. See https://github.com/sybrew/the-seo-framework/issues/69.

    And, to reiterate our documentation pages, the sitemap is redundant. Both Polylang and WPML interlink translated pages via hidden hreflang links, which is sufficient. It is your job to provide internal links to the pages you want humans to discover; doing so helps search engines index and rank them. WordPress does this automatically via categories and tags for posts; you have to provide the rest via navigational menus, footers, and content. I think you’ve already done an excellent job.
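    For context, that hidden interlinking is a set of alternate links in the page head, roughly like this (illustrative URLs; I’m assuming your Dutch pages live under /nl/):

        <link rel="alternate" hreflang="en" href="https://giveforgood.world/some-page/" />
        <link rel="alternate" hreflang="nl" href="https://giveforgood.world/nl/some-page/" />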

    The sitemap helps search engines discover the latest changes of your website quickly: https://tsf.fyi/kb/sitemap. That’s all it does, but SEOs have glamourized that simple page of links too much (70% of our support is about sitemaps). It’s useless once your pages are already discovered.

    I do not think there’s much more to add to this topic. Feel free to add your voice to our GitHub issues; these forums aren’t great for discussing ideas, as comments get lost over time.

    • This reply was modified 3 years, 4 months ago by Sybre Waaijer. Reason: typo, clarity
    Thread Starter rik1234

    (@rik1234)

    OK @cybr, thanks for the extensive answer. I really appreciate it; your support is outstanding!
    Good to know we don’t have to worry about this too much. Just to be safe, I might also place a robots.txt file manually in the root folder, as advised here: https://github.com/sybrew/the-seo-framework/issues/59. Then it is also complete. :)
    Best, and thanks again!

  • The topic ‘sitemap not accepted by Google’ is closed to new replies.