• Resolved terryscott2

    (@terryscott2)


    Hi, my server is being hit so heavily by search engine bots that it is returning 503 errors.
    I have created a robots.txt file; does it look OK?
    Thanks for any help, I am starting to panic.

    Crawl-delay: 10

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-login.php
    Disallow: /wp-json/

    # Ban bots that don’t benefit us.
    # --------------------------------

    User-agent: Nuclei
    User-agent: WikiDo
    User-agent: Riddler
    User-agent: PetalBot
    User-agent: Zoominfobot
    User-agent: Go-http-client
    User-agent: Node/simplecrawler
    User-agent: CazoodleBot
    User-agent: dotbot/1.0
    User-agent: Gigabot
    User-agent: Barkrowler
    User-agent: BLEXBot
    User-agent: magpie-crawler
    User-agent: SemrushBot
    User-agent: MJ12bot
    User-agent: AhrefsBot
    User-agent: YandexBot
    User-agent: PetalBot
    User-agent: DotBot
    User-agent: BLEXBot
    User-agent: DataForSeoBot
    User-agent: ZoominfoBot
    Disallow: /
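    One note on the file above: Crawl-delay is a per-group directive, so it only takes effect when it appears inside a User-agent group. Placed before any User-agent line, as it is here, most parsers will simply ignore it (Googlebot ignores Crawl-delay entirely, and support among other crawlers varies). A minimal sketch of the intended grouping, keeping the same 10-second delay:

    User-agent: *
    Crawl-delay: 10
    Disallow: /wp-admin/
    Disallow: /wp-login.php
    Disallow: /wp-json/

    Several bots in the ban list (PetalBot, BLEXBot, DotBot, ZoominfoBot) are also listed twice, some with spelling variants; the duplicates are harmless but redundant.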

  • Plugin Support Maybellyne

    (@maybellyne)

    Hello Terryscott,

    Thanks for reaching out regarding your robots.txt file. What you currently have does not follow our recommendations. We believe you should rely on robots.txt as little as possible; blocking URLs there is only really necessary when you face complex technical challenges. Our recommendation is:

    User-agent: *
    Disallow:
    
    Sitemap: https://www.example.com/sitemap_index.xml

    The wp-admin blocks are unnecessary, and WordPress is increasingly moving away from admin-ajax. There is also not much value in hiding sitemaps anymore. You can read more in our suggested best practices and release post.
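    For context on the admin-ajax point: when no physical robots.txt file exists, WordPress serves a virtual one that blocks /wp-admin/ but explicitly re-allows the AJAX endpoint. A rough sketch of that default output (the exact contents vary by version and site):

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    It is also worth remembering that robots.txt is advisory: bots that ignore it, as many in the ban list above do, can only be stopped with server-level blocking or rate limiting.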

    This thread was marked resolved due to a lack of activity, but you’re always welcome to re-open the topic. Please read this post before opening a new request.

  • The topic ‘robots.txt help’ is closed to new replies.