• Resolved Sn00z389

    (@webmakers2011)


    Hi,

    I am using Rank Math on all of our websites. However, I have a question about robots.txt:

    Is it better to add these rules to robots.txt to stop wasting crawl budget:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /*add-to-cart=*
    Disallow: /*add_to_wishlist=*
    Disallow: /*filter_*
    Disallow: /*orderby_*
    Disallow: /*min_price*
    Disallow: /*stock_status*
    Disallow: /*?s=*
    Disallow: /*?woodmart_reviews_sorting*
    Allow: /wp-admin/admin-ajax.php

    Will these help improve rankings, or will they have no impact? Have you tested this? I would love to hear your thoughts on these parameters.

    Could they make rankings worse?

    Thanks,
    Best Regards

  • Plugin Support Rank Math Support

    (@rankmathsupport)

    Hello @webmakers2011,

    Thank you so much for getting in touch.

    Using the provided rules in your robots.txt can help save crawl budget by preventing search engines from accessing unnecessary or low-value URLs, especially dynamic parameters such as filters, cart actions, and sorting queries.

    However, these rules won't directly improve rankings; they improve crawl efficiency, which indirectly benefits SEO. Make sure important pages aren't accidentally blocked, and regularly monitor your site's indexability in Google Search Console.

    These rules are generally safe and beneficial if applied thoughtfully.
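
    If you want to double-check that none of your important URLs are caught by these patterns, note that Python's built-in urllib.robotparser does not understand Google's * wildcards, so here is a minimal sketch of a Google-style matcher (the rules are copied from your file; the sample URLs are hypothetical, for illustration only):

    import re

    # Google-style robots.txt matching: '*' matches any run of characters,
    # '$' anchors the end; rules compare against the URL path plus query
    # string, starting from the beginning.
    def pattern_to_regex(pattern):
        regex = "^"
        for ch in pattern:
            regex += ".*" if ch == "*" else ("$" if ch == "$" else re.escape(ch))
        return re.compile(regex)

    DISALLOW = [
        "/wp-admin/", "/*add-to-cart=*", "/*add_to_wishlist=*",
        "/*filter_*", "/*orderby_*", "/*min_price*", "/*stock_status*",
        "/*?s=*", "/*?woodmart_reviews_sorting*",
    ]
    ALLOW = ["/wp-admin/admin-ajax.php"]

    def is_blocked(path):
        # Google applies the longest (most specific) matching rule;
        # on a tie, Allow wins over Disallow.
        best, blocked = -1, False
        for rule in DISALLOW:
            if pattern_to_regex(rule).match(path) and len(rule) > best:
                best, blocked = len(rule), True
        for rule in ALLOW:
            if pattern_to_regex(rule).match(path) and len(rule) >= best:
                best, blocked = len(rule), False
        return blocked

    # Hypothetical URLs, for illustration only:
    for path in ["/shop/?add-to-cart=42", "/wp-admin/admin-ajax.php", "/blog/a-post/"]:
        print(path, "->", "blocked" if is_blocked(path) else "allowed")

    Running your key landing pages through a check like this is a quick sanity test before deploying the rules.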

    Hope that helps.

    Thread Starter Sn00z389

    (@webmakers2011)

    Thanks for the response!

    A while ago we added just this to robots.txt:

    Disallow: /*?*

    But after a while we received a "Circumventing systems" notice from Google Ads, and it took a month to restore the accounts for those websites. They have since been restored…
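
    In hindsight, that notice makes sense: /*?* blocks every URL with a query string, which presumably includes the parameterized landing pages AdsBot needs to check. A tiny sketch of that single pattern in Google's wildcard semantics (the gclid URL is hypothetical):

    import re

    # "/*?*" in Google's wildcard semantics: any path, a literal "?", then anything.
    pattern = re.compile(r"^/.*\?.*")
    print(bool(pattern.match("/landing-page/?gclid=abc123")))   # True - blocked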

    Now, even if we add these rules:

    User-agent: *
    Disallow: /*add-to-cart=*
    Disallow: /*add_to_wishlist=*
    Disallow: /*filter_*
    Disallow: /*orderby_*
    Disallow: /*min_price*
    Disallow: /*stock_status*
    Disallow: /*?s=*
    Disallow: /*?woodmart_reviews_sorting*
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    I will make sure to also add this to the file, since it lets AdsBot crawl everything:

    User-agent: Adsbot-Google
    Disallow:

    Plugin Support Rank Math Support

    (@rankmathsupport)

    Hello @webmakers2011,

    Adding a separate group for Google AdsBot is a good idea. It makes sure that the rules you added above don't prevent AdsBot from crawling the site.
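
    To see why the dedicated group matters: a crawler obeys only the most specific group that names it, so once AdsBot has its own group it ignores the * rules entirely. Here is a minimal sketch using Python's urllib.robotparser (which understands plain prefix rules like /wp-admin/, though not Google's * wildcards); example.com is just a placeholder:

    import urllib.robotparser

    # A generic group plus a dedicated AdsBot group, as in your file.
    lines = [
        "User-agent: *",
        "Disallow: /wp-admin/",
        "",
        "User-agent: Adsbot-Google",
        "Disallow:",  # an empty Disallow allows everything
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(lines)

    url = "https://example.com/wp-admin/"
    print(rp.can_fetch("Googlebot", url))      # False - caught by the * group
    print(rp.can_fetch("Adsbot-Google", url))  # True - its own group allows all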

    Please do not hesitate to let us know if you need our assistance with anything else.
