I'll even go one better (I thought of these things while I was outside smoking):
1. Google's own blog:
https://googleblog.blogspot.com/
If you look at it, they have single-post pages and archive pages with multiple posts on them.
Their robots.txt (there's a quick sketch at the end of this post showing how a crawler reads these rules):
https://googleblog.blogspot.com/robots.txt
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Sitemap: https://googleblog.blogspot.com/feeds/posts/default?orderby=updated
2. Googlebot et al. are 'smarter' than anyone gives them credit for. The people who put those algorithms together understand how blogs work.
3. Unless you plan on putting the exact same thing on an archive page (i.e., displaying a single post AND its comments there), you are not duplicating content (again, see #2; they're smarter than that).
4. You (generally speaking) are limiting your users' experience on your website.
Let's suppose I search for widgets, and a site comes up that is blocking this, blocking that, in order to "not get penalized for duplicate content" (ignoring the fact that it won't happen). I click the link off Google and land on a single-post page. That's great: I got my widget info. I leave.
Now let's say you search for widgets, and my site comes up. I'm not an SEO freak; I just let WP and the searchbots do their dance. You click the link, and you may or may not be taken to a single-post page. If you aren't, well, guess what: I have other posts on washers and nuts and bolts there. Maybe you want some of those before you leave?
In other words, you lost traffic because the other great content you have wasn't there for them to see. Me, I got the traffic.
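For anyone curious how a well-behaved crawler actually reads the robots.txt from point 1, here is a minimal sketch using Python's standard urllib.robotparser. The example URLs are made-up, illustrative paths on that blog (not taken from anything above); the point is that only /search is off limits to general crawlers, while Mediapartners-Google (the AdSense bot) is allowed everywhere:

from urllib import robotparser

# The rules from https://googleblog.blogspot.com/robots.txt, pasted inline
# so the example runs without a network fetch.
ROBOTS_TXT = """\
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Sitemap: https://googleblog.blogspot.com/feeds/posts/default?orderby=updated
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Illustrative URLs (hypothetical paths, used only for this demo):
post_url = "https://googleblog.blogspot.com/2008/01/some-single-post.html"
search_url = "https://googleblog.blogspot.com/search?q=widgets"

print(rp.can_fetch("Googlebot", post_url))               # True  - single posts are crawlable
print(rp.can_fetch("Googlebot", search_url))             # False - /search is disallowed for *
print(rp.can_fetch("Mediapartners-Google", search_url))  # True  - empty Disallow allows everything

In other words, the only thing Google blocks on its own blog is anything under /search; the post pages themselves are all fair game.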