Thanks for doing this.
However, on a closer look, it seems that the phast.php bundler files are being crawled as HTML, which significantly lowers the Googlebot/2.1 Smartphone crawl rate for HTML pages. Oddly, the desktop crawl rate is fine.
It seems that these JSON text/plain responses being interpreted as text/html is what's causing the drop in crawl rate.
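In case it helps, here's a quick way to check what Content-Type phast.php is actually returning; the URL below is only a placeholder, you'd substitute a real bundler request copied from the page source:

import urllib.request

# Placeholder URL: replace with an actual phast.php bundler request from your page source.
url = "https://example.com/wp-content/plugins/phastpress/phast.php?service=bundler"

with urllib.request.urlopen(url) as resp:
    # If this prints text/html rather than text/plain or application/json,
    # that would be consistent with Googlebot treating the bundler responses as HTML pages.
    print(resp.headers.get("Content-Type"))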
When phast.php was inadvertently blocked from Googlebot, the overall number of pages/files crawled was significantly lower, but Googlebot Smartphone's crawl rate was orders of magnitude higher.
Whether this is a bug in the Google crawler or in the Phast bundler, the resulting crawl rates are unacceptable. With the Phast bundler blocked, Googlebot can crawl the whole site in a few days; with phast.php allowed in robots.txt, it might take weeks or longer.
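For reference, the "blocking" I'm describing was just a standard robots.txt Disallow rule; the exact pattern below is only an example, since it depends on where the plugin exposes phast.php:

User-agent: Googlebot
Disallow: /*phast.php

Googlebot honours the * wildcard in Disallow patterns, so a rule like this catches the bundler requests regardless of their query strings.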