> How can I prevent Google from trying to access items on the https://22j.927.myftpupload.com site?
Simple: don’t load resources from there.
Think of Google’s crawler, in this sense, as a normal user sitting behind a computer or phone and opening your site in a browser: styles, images, JavaScript, and other resources can and will only be loaded from the locations referenced in your website. The computer cannot (yet) read your intentions and desires.

And if your site is set up to load all resources from an external URL, as seen below, that’s exactly what Chrome, Firefox, Safari, Opera, and Google’s crawlers will do: they’ll load the resources from those URLs.
![](https://i.imgur.com/ciOI7Xv.png)
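To make that concrete, markup along these lines (the paths below are hypothetical; only the domain is taken from your screenshot) is what produces that behaviour. The browser — and Googlebot with it — simply requests whatever URL is written in the page:

```html
<!-- Hypothetical markup: the crawler requests exactly these URLs, -->
<!-- regardless of which domain served the page itself. -->
<link rel="stylesheet" href="https://22j.927.myftpupload.com/wp-content/themes/some-theme/style.css">
<script src="https://22j.927.myftpupload.com/wp-content/plugins/some-plugin/script.js"></script>
<img src="https://22j.927.myftpupload.com/wp-content/uploads/example.jpg" alt="Example">
```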
> I have spoken to GoDaddy at length and they tell me I cannot change the robots.txt on the SFTP side. I can only change the robots.txt on my main URL.
That may be true, as you may not have direct access to that FTP site to upload a custom robots.txt file (for the record: Google accepts and follows robots.txt files for FTP sites).
But it’s also true that you can set robots.txt rules ONLY for resources on the host where the robots.txt lives.

You cannot, for instance, set up rules for an external domain, as you seem to want to. Just imagine what could possibly go wrong if I could set up rules on my website example.com to prevent Google from accessing your website example.org!
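For completeness: blocking that host would require a robots.txt served from the host itself. A minimal sketch of what would have to live at https://22j.927.myftpupload.com/robots.txt — which, per GoDaddy, you cannot place there — looks like this:

```
# Hypothetical robots.txt — only effective if served from
# https://22j.927.myftpupload.com/robots.txt, not from your main domain.
User-agent: *
Disallow: /
```

A robots.txt on your main URL has no effect on that host, no matter what rules you write in it.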
In any case, I believe you’re only trying to treat the symptoms and not the real issue that needs addressing. The real issue, in my opinion, is the fact that you are loading ALL your site’s resources: themes, plugins, styles, scripts, even media files in posts… from this external domain instead of loading them from your own domain.
Do you mind sharing why you’re choosing to do this, against conventional practice?
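In the meantime, if you do decide to fix the underlying problem and have command-line access to your site, one common approach is WP-CLI’s `search-replace` command to rewrite the stored URLs. A rough sketch, assuming your real domain is example.com (take a database backup and run the dry run first):

```
# Sketch only — assumes WP-CLI is available and example.com is your real domain.
# Preview the changes without writing anything:
wp search-replace 'https://22j.927.myftpupload.com' 'https://example.com' --dry-run

# Apply the replacement once the preview looks right:
wp search-replace 'https://22j.927.myftpupload.com' 'https://example.com'
```

You would also want to confirm that WordPress’s `siteurl` and `home` settings point at your own domain.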