• Resolved zo

    (@zotezo)


    Hi,
    How can we set up our robots.txt file so that Google's crawler works in a faster and more efficient way?
    Thanks

  • Plugin Support Michael Tiña

    (@mikes41720)

    We’re not sure what you want to achieve with your robots.txt file. Are there certain areas you would like to restrict Google’s crawlers from accessing? The default robots.txt file should be just fine.

    However, you can refer to this article for more information if you would like to tweak and add more directives to your file — https://yoast.com/ultimate-guide-robots-txt/
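
    For reference, the default robots.txt that WordPress generates (before any customization) looks roughly like this; recent WordPress versions and SEO plugins may also append a Sitemap: line pointing at your sitemap:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php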

    Thread Starter zo

    (@zotezo)

    Recently we noticed that AMP page validation is running very slowly.
    URLs like https://www.zotezo.com/medicine-composition/azithromycin-250mg-ofloxacin-200mg/?filter_medicine-manufacturer=alkem-laboratories-ltd
    are getting thousands of hits in our server log; the Bing and MSN bots in particular are crawling them.
    Is this normal behaviour or bad behaviour?
    We have slightly changed our robots.txt in this way:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /?s=
    Disallow: /search/
    Disallow: *filter*
    Disallow: */?filter*
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://www.zotezo.com/sitemap_index.xml
    Is there anything we need to add for AMP to get faster indexing and validation?

    Thanks

    Plugin Support marcanor

    (@marcanor)

    Hi @zotezo ,

    While we are not familiar with your site, those robots.txt rules should block compliant robots from crawling those URLs. However, you should test them carefully to make sure you don't block any content that you do want crawled.
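
    If you want a quick sanity check before deploying, here is a rough, unofficial Python sketch that translates the wildcard rules above into regexes and reports which sample URLs they block. Note that Python's built-in urllib.robotparser does not understand Google-style "*" wildcards, which is why the translation is done by hand, and rule length is used as a rough stand-in for Google's longest-match conflict resolution:

    import re

    # Patterns copied from the robots.txt posted above.
    ALLOW = ["/wp-admin/admin-ajax.php"]
    DISALLOW = ["/wp-admin/", "/?s=", "/search/", "*filter*", "*/?filter*"]

    def to_regex(pattern):
        # "*" matches any run of characters; "$" anchors the end of the URL.
        body = re.escape(pattern).replace(r"\*", ".*")
        if body.endswith(r"\$"):
            body = body[:-2] + "$"
        # Patterns starting with "/" must match from the start of the path.
        return re.compile(("^" + body) if pattern.startswith("/") else body)

    def is_blocked(path):
        # Approximates Google's conflict resolution: the longest matching
        # rule wins, and ties go to Allow.
        allow = max((p for p in ALLOW if to_regex(p).search(path)),
                    key=len, default="")
        disallow = max((p for p in DISALLOW if to_regex(p).search(path)),
                       key=len, default="")
        return len(disallow) > len(allow)

    for path in [
        "/medicine-composition/azithromycin-250mg-ofloxacin-200mg/?filter_medicine-manufacturer=alkem-laboratories-ltd",
        "/wp-admin/admin-ajax.php",
        "/medicine-composition/azithromycin-250mg-ofloxacin-200mg/",
    ]:
        print("blocked" if is_blocked(path) else "allowed", path)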

    You might also want to look into using the X-Robots-Tag HTTP header to properly “noindex” those URLs.
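
    As a minimal sketch (assuming Apache 2.4 with mod_headers enabled; nginx would use add_header instead), you could send that header for the filter URLs from your .htaccess file. Keep in mind that a crawler only sees this header when it is allowed to fetch the URL, so a URL that is already disallowed in robots.txt cannot be “noindexed” this way:

    # Hypothetical .htaccess rule: send "X-Robots-Tag: noindex" for any
    # request whose query string contains "filter".
    <If "%{QUERY_STRING} =~ /filter/">
        Header set X-Robots-Tag "noindex, follow"
    </If>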

    As for adding anything for faster AMP validation: there is nothing we are aware of that you can add to your robots.txt file to help with that.

    Plugin Support Md Mazedul Islam Khan

    (@mazedulislamkhan)

    This thread has been marked as resolved due to a lack of activity. You’re always welcome to re-open this topic.

    Thanks for understanding!

  • The topic ‘Ideal Robots.txt to run google crawl more fast’ is closed to new replies.