What I don’t understand is that this robots.txt has been in place for more than a month, and my other website has pretty much the same settings as below, but it doesn’t give me this kind of trouble.
Translation: I’ve left my door open for more than a month, and pretty much all my houses have the same setup… and I never had a single incident before. Why are thieves getting into this one house now??
How can I specifically ask Googlebot not to look at those files it doesn’t need to?
How did you end up with all those lines for different crawlers in your robots.txt file to begin with, since those are not generated by WordPress?
Your robots.txt file includes a directive at the top that should prevent Google (and all good, responsible bots) from crawling the entire wp-includes folder:
User-agent: *
Allow: /wp-admin/admin-ajax.php
...
Disallow: /wp-includes/
...
Disallow: /?blackhole
But then you went ahead and explicitly overrode this, allowing Googlebot, Bingbot, Yandex, and a whole lot of other bots to crawl your entire site via individual Allow: / directives for each bot.
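If you want to see that precedence for yourself: a crawler only obeys the most specific User-agent group that matches it, so once a Googlebot-specific group with Allow: / exists, the wildcard Disallow is ignored for Googlebot entirely. Here’s a minimal sketch using Python’s standard urllib.robotparser; the rules below are a simplified stand-in for your file, and the .js path is just an example, not something taken from your site.

from urllib.robotparser import RobotFileParser

# Simplified stand-in for your robots.txt, with the Googlebot override present.
rules_with_override = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/

User-agent: Googlebot
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules_with_override)

# Googlebot matches its own group, so the wildcard Disallow never applies to it.
print(parser.can_fetch("Googlebot", "/wp-includes/js/example.js"))      # True
# A bot with no dedicated group falls back to the "*" rules and is blocked.
print(parser.can_fetch("SomeRandomBot", "/wp-includes/js/example.js"))  # False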
So, to answer your question:
1) The “error” message you posted from GSC is Google saying it was unable to access and crawl that URL on your website. But this is a URL that Google shouldn’t index anyway, and WordPress and your webserver are doing a great job of preventing Google from crawling it. So this is not really a problem or “error” to worry about.
2) If you don’t want to see such “error” messages in your GSC, then explicitly tell Google not to crawl those paths at all. How? Specifically for Google, just remove the following lines from your robots.txt file:
User-agent: Googlebot
Allow: /
But why stop at Google? I’d remove all of the explicit full-site overrides:
User-agent: Googlebot
Allow: /
User-agent: Mediapartners-Google
Allow: /
User-agent: AdsBot-Google
Allow: /
User-agent: AdsBot-Google-Mobile
Allow: /
User-agent: Bingbot
Allow: /
User-agent: Msnbot
Allow: /
User-agent: Applebot
Allow: /
User-agent: Yandex
Allow: /
User-agent: Slurp
Allow: /
User-agent: DuckDuckBot
Allow: /
User-agent: Qwantify
Allow: /
User-agent: googleusercontent
Allow: /
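Once those groups are gone, every crawler (Googlebot included) falls back to the wildcard rules, so /wp-includes/ is off-limits again. If you want to sanity-check the trimmed file before uploading it, here’s a quick sketch along the same lines; the “robots.txt” file name is just a placeholder for wherever your local copy lives, and the expected outputs assume only the wildcard group shown earlier remains.

from urllib.robotparser import RobotFileParser

# "robots.txt" is a local copy of the trimmed file; adjust the path as needed.
with open("robots.txt") as f:
    parser = RobotFileParser()
    parser.parse(f.read().splitlines())

# With no Googlebot-specific group left, Googlebot obeys the "*" rules:
print(parser.can_fetch("Googlebot", "/wp-includes/js/example.js"))  # expect False (blocked)
print(parser.can_fetch("Googlebot", "/wp-admin/admin-ajax.php"))    # expect True (allowed)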