Fixing website robots.txt to save Google crawl budget
If anyone can help me here, I'd appreciate it: I need help with my robots.txt file. I'm getting errors for the following URLs:
/macintosh-operating-systems/?ignorenitro=ca984379fa491d10fcc06dc17feb44ef
/business-services/?ignorenitro=7685f082a32a72a3a8998f5fcd53f002
/?nitroWebhook=cache_ready&token=30fd12d576332db5165d2424875d49c5
/cdn-cgi/scripts/5c5dd728/cloudflare-static/email-decode.min.js
/wp-content/cache/asset-cleanup/js/body-87d87f5217871b92a9038866ee579010d12361d2.js
/wp-includes/js/dist/vendor/wp-polyfill.min.js
/computer-repairs-birmingham/page/2/
/local-pc-shops/?product_orderby=popularity
/local-pc-shops/?product_orderby=default
/local-pc-shops/?product_order=desc
/local-pc-shops/?product_view=list
/local-pc-shops/?product_orderby=date
/local-pc-shops/?product_count=24
/local-pc-shops/?product_count=36
/local-pc-shops/?product_orderby=price
/local-pc-shops/?product_view=grid
/local-pc-shops/?product_orderby=name
/local-pc-shops/?product_orderby=rating
/local-pc-shops/?product_count=12
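From what I've read, robots.txt rules along these lines should stop Googlebot crawling those parameterised URLs (just a sketch on my part, not tested; the parameter names are copied from the list above, and Googlebot supports the * wildcard):

User-agent: *
Disallow: /*?ignorenitro=
Disallow: /*?nitroWebhook=
Disallow: /*?product_orderby=
Disallow: /*?product_order=
Disallow: /*?product_view=
Disallow: /*?product_count=

Would rules like that be safe to add, or could they block something the site needs crawled?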
Also, Google is indexing /page/2/, /page/3/, /page/4/, and so on, generated by the Avada theme's blog pagination at the bottom corner of this page and others:
https://www.pcrepairguru.co.uk/computer-repairs-birmingham/
Is there a way to tell search engines not to index those pages with /page/2/, /page/3/, /page/4/, /page/5/ at the end of the URL?
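From what I've found so far, robots.txt can only block crawling, not indexing, and the pages have to stay crawlable for Google to see a noindex tag at all. So something like this wp_robots filter in the child theme's functions.php is what I'm considering (just a sketch, untested, and it assumes WordPress 5.7 or later, where the wp_robots filter exists):

add_filter( 'wp_robots', function ( $robots ) {
	// On paginated pages (/page/2/, /page/3/, ...) ask search
	// engines not to index the page but still follow its links.
	if ( is_paged() ) {
		$robots['noindex'] = true;
		$robots['follow']  = true;
	}
	return $robots;
} );

Would that be the right approach, or does Avada or an SEO plugin already have a setting for this?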
Viewing 3 replies - 1 through 3 (of 3 total)
The topic ‘Fixing website robots.txt to save Google crawl budget’ is closed to new replies.