So say, for example, I had all my syndicated posts in the ‘news’ category. Could I then create a robots.txt file that tells crawlers to exclude the /news/ directory?
Would that solve the duplicate content issue without losing any of the value of having the feeds syndicated on my site?
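Something like this, I'm guessing (assuming the syndicated posts all live at URLs under /news/, and using User-agent: * to cover every crawler):

    # keep all crawlers out of the syndicated posts
    User-agent: *
    Disallow: /news/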