Not working despite disabled page cache on ‘blackhole’ URLs?
Hi Jeff, I’m unable to get Blackhole for Bad Bots to work on my site, even though I’ve disabled page caching for any URL that includes the ?blackhole query parameter.
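(In case it matters, here’s how I’m sanity-checking that exclusion — just a rough curl check I put together, nothing from the plugin itself — comparing the cache headers on the excluded URL against a normal page; the ?blackhole one should always come back no-cache:)

# compare cache-control on the excluded URL vs. the homepage
curl -sI 'https://www.childcareaware.org/?blackhole=191d1f5616' | grep -i '^cache-control:'
curl -sI 'https://www.childcareaware.org/' | grep -i '^cache-control:'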
You can see the problem here:
https://www.childcareaware.org/?blackhole=191d1f5616

For whatever it’s worth, here’s the robots.txt file:
https://www.childcareaware.org/robots.txt
And here’s what I see in the page response headers for that URL, using the following curl command:
curl -I https://www.childcareaware.org/?blackhole=191d1f5616
HTTP/2 200
cache-control: no-cache, must-revalidate, max-age=0
content-type: text/html; charset=UTF-8
link: <https://www.childcareaware.org/wp-json/>; rel="https://api.w.org/"
link: <https://www.childcareaware.org/wp-json/wp/v2/pages/6>; rel="alternate"; type="application/json"
link: <https://www.childcareaware.org/>; rel=shortlink
server: nginx
strict-transport-security: max-age=300
x-pantheon-styx-hostname: styx-fe1-b-6565f7757d-v8hwj
x-styx-req-id: f5d07a34-8cb2-11eb-9b04-a2bb4a80cf12
date: Wed, 24 Mar 2021 15:09:43 GMT
x-served-by: cache-mdw17381-MDW, cache-ewr18151-EWR
x-cache: MISS, MISS
x-cache-hits: 0, 0
x-timer: S1616598582.092331,VS0,VE1055
vary: Accept-Encoding, Cookie, Cookie
age: 0
accept-ranges: bytes
via: 1.1 varnish, 1.1 varnish
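(If it helps, here’s a rough way I can check whether Varnish ever serves that URL from cache — just repeating the same request and watching the cache headers; an x-cache: HIT or a non-zero age on a later request would mean the response was cached despite the no-cache header:)

# hit the blackhole URL a few times and watch the cache headers
for i in 1 2 3; do
  curl -sI 'https://www.childcareaware.org/?blackhole=191d1f5616' | grep -iE '^(x-cache|age|cache-control):'
  echo ---
done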