But the robots.txt file is not at the root… even after revealing hidden files.
WordPress’ robots.txt file is a virtual file that’s dynamically generated when you access the URL example.com/robots.txt. That’s why you don’t see the file in your file system.
But you can create a static robots.txt file at the root yourself… and this will override WordPress’ dynamically generated one.
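For example, a minimal static robots.txt that asks Google’s image crawler to skip your uploads directory (assuming the default wp-content/uploads location) might look like this:

```
# Ask Google's image crawler not to index anything under uploads
User-agent: Googlebot-Image
Disallow: /wp-content/uploads/
```

Once a physical robots.txt file exists at the web root, the standard WordPress rewrite rules let the webserver serve it directly, so the virtual version is no longer used.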
I tried to use SEO plugins and a virtual robots plugin to disallow Google Images from indexing my images… but upon checking with different (live) validators, the images are still set to ALLOW.
I also tried putting up my own robots.txt file, but it seems to have no effect either.
Note that the robots.txt file is only a directive, a hint that tells search engines’ crawlers what you want (and don’t want) them to index. It has no “power” (so to speak), and it doesn’t prevent or block access to anything.
So it’s entirely up to the search engines to follow the directives in your robots.txt file (or not): you cannot enforce anything.
That said, Google largely respects robots.txt directives, and I’m not sure how you’re arriving at the conclusion that your custom robots.txt file (plugin-generated or manually created) isn’t working.
Note that just as it takes time for Google to index your pages before they appear in Google search results, it’ll also take time for Google to remove your images from its search results.
So …
1) If you can share your domain name, I can take a look at your robots.txt file to see if the technical implementation is good and advise you accordingly.
2) If the technical implementation is OK, then the best you can do is wait for Google to de-index your images. (There’s a tool in Google Search Console to request the removal of URLs, but if it’s a lot of images, that might not be an effective solution.)
3) If you want to actually BLOCK Google (and anyone else) from hotlinking your images, you’ll need to configure your webserver to do this. Most WordPress security plugins have such a hotlink-prevention feature built in, and this ServerGuy blog post has manual configurations for various webservers (note that the article assumes you want to block all requests but allow Google, Bing & Yahoo… so you’ll need to adapt the code to your own needs).
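As a rough sketch of point 3, assuming Apache with mod_rewrite enabled and example.com standing in for your own domain, a hotlink block in .htaccess could look like this:

```apache
RewriteEngine On
# Allow requests with no referrer (direct visits, some browsers/proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else gets a 403 Forbidden for common image types
RewriteRule \.(jpe?g|png|gif|webp)$ - [F,NC,L]
```

Note that this blocks everyone, including search engines; if you want to allow specific crawlers through, you’d add further RewriteCond exceptions for their referrers, as in the ServerGuy article.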
Good luck!