Forum Replies Created

  • Thread Starter sam_h1

    (@sam_h1)

    So say, for example, I had all my syndicated posts in the ‘news’ category. Could I create a robots.txt file that tells crawlers to exclude the /news/ directory?

    Would that solve the duplicate content issue without losing the value of having the feeds syndicated on my site?
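
    Something like this is what I'm picturing, assuming the syndicated posts and the category archive both live under /news/ in my permalink structure:

        User-agent: *
        Disallow: /news/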
