Clean-param directive in robots.txt
-
Hi,
Yandex sent me an email about a “GET parameters” error on my sites.
Here are examples of pages and their duplicate pages with insignificant GET parameters:
add_to_wishlist:
– https://yagbakimi.com.tr/bakim-urunleri/elf-moto-chain-paste/?wccm=add-to-list&pid=5566&nonce=75cbcb7720
– https://yagbakimi.com.tr/bakim-urunleri/elf-moto-chain-paste/?wccm=add-to-list&pid=5566&nonce=75cbcb7720&add_to_wishlist=5608
These pages are duplicates. How can I write the “Clean-param” directive in robots.txt? I want the robot to ignore the insignificant GET parameters and combine the signals from the duplicate pages onto the main page.
Can you write a sample robots.txt file for me? I looked at this page but couldn’t write it correctly.
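Based on the URLs above, a minimal sketch of what such a robots.txt might look like, assuming the parameters `wccm`, `pid`, `nonce`, and `add_to_wishlist` are the insignificant ones (Clean-param is a Yandex-specific directive; the optional second field restricts it to a path prefix):

```
User-agent: Yandex
# List the GET parameters to ignore, separated by "&",
# optionally followed by a path prefix the rule applies to.
Clean-param: wccm&pid&nonce&add_to_wishlist /bakim-urunleri/
```

Leaving out the path prefix would apply the rule to the whole site; other search engines ignore the Clean-param line.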
Thanks…
- The topic ‘Clean-param directive in robots.txt’ is closed to new replies.