Like I mentioned here, it’s Google being dumb:
It may just be because Google’s a freakin’ dink. I read through their webmaster documentation, and it LOOKS like they’re giving weighted preference to Allow: over Disallow:. So while both forms are technically correct, they won’t always re-scan a bare Disallow: (i.e. disallow nothing).
I’m playing around with their webmaster tools and seeing different results with ‘fake’ robots.txt files depending on whether I set them to disallow nothing or to allow everything (both variants shown below).
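For reference, these are the two forms I mean. The User-agent: * line is my assumption; scope it to whatever crawler you care about. Note that Allow: isn’t in the original robots.txt spec, but Google does support it:

    # Variant 1: disallow nothing (empty Disallow value)
    User-agent: *
    Disallow:

    # Variant 2: explicitly allow everything (Google-supported extension)
    User-agent: *
    Allow: /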
The even longer version is that once Google’s cached you with Disallow: / (which is ‘don’t allow anything!’), it DOES NOT cleanly flip back when you re-set to an empty Disallow: (i.e. crawl everything). Sometimes.
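Which is extra annoying because the whole difference between the two states is one trailing slash (same User-agent: * assumption as above):

    # Blocks the entire site:
    User-agent: *
    Disallow: /

    # Blocks nothing (only the trailing / differs):
    User-agent: *
    Disallow: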
I would manually make a robots.txt and force-set it to allow everything.
Once that’s been re-cached by Google, kill the robots.txt and see if it correctly picks up the auto-generated one.
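If I were doing it, the force-allow file would use the explicit Allow: form, since that’s the one Google seems to weight (again, User-agent: * is my assumption):

    # Temporary force-allow robots.txt
    User-agent: *
    Allow: /

You can watch their webmaster tools’ robots.txt analysis to confirm this version has been re-cached before killing it.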