I have my settings set to allow posts and pages to be seen.
Here are my links:
https://my24-7restoration.com/
https://my24-7restoration.com/post-sitemap.xml
https://my24-7restoration.com/page-sitemap.xml
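For what it's worth, a quick way to sanity-check these sitemaps is to fetch them and count their entries; a minimal sketch (Python, standard library only), using the URLs above:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAPS = [
    "https://my24-7restoration.com/post-sitemap.xml",
    "https://my24-7restoration.com/page-sitemap.xml",
]

# Sitemaps use the sitemaps.org XML namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for url in SITEMAPS:
    # Fetch the sitemap and report its HTTP status.
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(url, "->", resp.status)
        body = resp.read()
    # Count the <url><loc> entries the sitemap actually exposes.
    root = ET.fromstring(body)
    print("  entries:", len(root.findall("sm:url/sm:loc", NS)))
```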
This seems to be affecting the SERPs, for obvious reasons.
Any help is greatly appreciated.
https://impactprofits.com/wp-login.php?redirect_to=https%3A%2F%2Fimpactprofits.com%2Fhow-many-pairs-of-shoes-do-you-own%2F
Connection error
Referrer: https://impactprofits.com/how-many-pairs-of-shoes-do-you-own/
I have no idea what this means or how to fix it. Would appreciate any help! Thanks!
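For anyone trying to diagnose this, the fetch can be reproduced outside of WordPress; a minimal sketch (Python, standard library only), using the URL from the report above:

```python
import urllib.error
import urllib.request

URL = ("https://impactprofits.com/wp-login.php"
       "?redirect_to=https%3A%2F%2Fimpactprofits.com"
       "%2Fhow-many-pairs-of-shoes-do-you-own%2F")

try:
    # urlopen follows redirects by default, so the final status and
    # URL show where the login redirect actually ends up.
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print("final status:", resp.status)
        print("final URL:", resp.geturl())
except urllib.error.HTTPError as e:
    print("HTTP error:", e.code, e.reason)
except urllib.error.URLError as e:
    # A "Connection error" like the one reported usually surfaces here
    # (DNS failure, TLS problem, timeout, or a dropped connection).
    print("connection-level failure:", e.reason)
```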
Upon checking Search Console today, I noticed the problem has not been resolved.
– I currently have 452 ‘Submitted URL has crawl issues’ errors, all of which are image attachments.
– I also have 66 ‘Submitted URL not found (404)’, again, all of which are image attachments.
– The total number of errors is static, although some have moved from the crawl-issue type of error to the 404 type, which I assume is a good thing? (A quick way to check the live statuses is sketched below.)
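One way to see where each reported URL stands on the live server (independent of Google's report) is to check its HTTP status directly; a minimal sketch (Python, standard library), with a placeholder list since the real URLs come from the Search Console export:

```python
import urllib.error
import urllib.request

# Placeholder list: paste the attachment URLs exported from the
# Search Console coverage report here.
URLS = [
    "https://example.com/some-image-attachment/",
]

for url in URLS:
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(url, "->", resp.status)
    except urllib.error.HTTPError as e:
        # 404s, and the 410s the purge plugin serves, both land here.
        print(url, "->", e.code)
```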
My questions: should I keep the purge plugin running? Is it actually working? Should I be worried about having 500+ errors which don’t seem to be going away?
Many thanks in advance.
I cannot make any changes to this in the config. How can I fix this, or will you address this issue in the next release?
Thanks
I submitted my sitemap a few weeks ago, but the page has still not been indexed by Google. The Google Search Console Index Status shows Excluded (Discovered – currently not indexed).
Please Help!
I’ve been trying to work this issue out over the past few days, but I can’t seem to find a way to make it work.
I’ve discovered that my website has 25 pages with crawl problems, and when I use the URL Inspection tool, it says that crawling was blocked by robots.txt and that the pages can’t be indexed.
I’ve tried running the old GSC robots.txt tester on my URL property (not the domain property), and it says there aren’t any problems with my robots.txt.
Furthermore, when I view my robots.txt file directly at /robots.txt, it seems to be fine.
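To rule out a difference between what I see in the browser and what a crawler sees, the live robots.txt can also be tested with Python’s built-in robot parser; a minimal sketch with placeholder URLs (substitute the real domain and one of the blocked pages):

```python
from urllib.robotparser import RobotFileParser

# Placeholder values: use your real domain and one of the URLs
# that Search Console reports as blocked.
ROBOTS_URL = "https://example.com/robots.txt"
TEST_URL = "https://example.com/some-blocked-page/"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# Googlebot is the user agent that matters for Search Console.
print("Googlebot allowed:", parser.can_fetch("Googlebot", TEST_URL))
print("any crawler allowed:", parser.can_fetch("*", TEST_URL))
```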
I’ve also tried uploading a new sitemap, but this hasn’t done the trick either.
I’ve tried running this test: https://search.google.com/test/mobile-friendly, which sometimes says that my site is mobile friendly and sometimes says it’s not. Sometimes it shows 20+ pages with indexing problems and sometimes it only shows 10. The test also reports anywhere from 0 to 10+ JavaScript console issues, depending on when you run it.
So to sum it up:
I’ve suddenly started getting crawl errors in GSC saying that my pages can’t be crawled because robots.txt is blocking the crawl. I can’t find anything that should make this error appear, and I’m really running out of ideas.
I really hope you can help me!
We were affected by the Yoast bug that triggered hundreds of crawl errors for media attachments on our site. We installed the purge plugin but are still seeing errors 9 months later – despite Yoast saying it would only take 6 months. Please advise how we can resolve this ASAP.
Thanks for your help!
I have set up redirects, but I am confused about how these errors come about and what I can do to prevent them.
It seems like every day 5 or more of these show up.
I am not sure if it has something to do with Yoast or possibly with WooCommerce; I cannot recall seeing these types of errors before I got Yoast Premium.
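One thing that may help narrow it down is checking whether each redirect resolves in a single 301 hop rather than a chain; a minimal sketch (Python, standard library) with a placeholder URL standing in for one from the crawl error report:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None tells urllib not to follow the redirect;
    # we only want to inspect the first hop.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

# Placeholder: one of the redirected URLs from the error report.
URL = "https://example.com/old-product-page/"

opener = urllib.request.build_opener(NoRedirect)
try:
    resp = opener.open(URL, timeout=10)
    print(URL, "->", resp.status, "(no redirect)")
except urllib.error.HTTPError as e:
    # With redirects disabled, a 301/302 surfaces as an HTTPError,
    # and the Location header shows the first redirect target.
    print(URL, "->", e.code, "Location:", e.headers.get("Location"))
```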
Thanks for any suggestions on how to get rid of this recurring problem. Tom
– Applied the plugin 2 months ago
– Sitting on about 279 crawl errors in Search Console, which have been static for several weeks now. I’m assuming that’s what is supposed to happen, and now we wait up to 6 months for Google to remove them.
While searching the comments on Yoast’s original “announcement” about this issue, I read the following from two different users (on the same post):
“Am I correct in thinking that Google Search Console will keep reporting errors for the 410s as long as the purge plugin and attachment-sitemap.xml are in place because the sitemap will keep on pointing to non-existing pages?
Furthermore – when the result of the google query “site:https://mydomain.com/ inurl:attachment_id” is nearing zero, is it then the right time to remove the purge plugin and related attachment-sitemap.xml? Yes, I have noted the comments about “six months”, but isn’t the result of the query above the best measure, and when it’s zero’ed, then we’re good?”
(to which Yoast replied, “yes & yes — you’ve understood it perfectly”)
The second comment was in response to someone asking about the same search in reference to the attachment URL(s), but they had never actually installed the plugin:
“IF
‘Your search – site:https://mydomain.com/ inurl:attachment_id – did not match any documents.’
…then you’re likely all good, as the problematic slim content pages have not been indexed by Google.
Just make sure that you have 1) a recent / latest version of the Yoast standard plugin that has the discussed bug corrected and 2) the setting ‘redirect attachment URLs’ set to ‘yes’, and you won’t need the additional purge plugin.”
NOW, based on the above, do I understand correctly that as long as the plugin is activated, those 410 pages will SIT there? And for as long as they sit there, Google will report them as errors (until Google removes them)?
Also, when I run the search “site:https://mydomain.com/ inurl:attachment_id,” I too get the message “did not match any documents,” which is another reason I am trying to determine whether I can take the next steps.
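For what it’s worth, the server-side half of this can be verified independently of the Google index; a minimal sketch (Python, standard library, placeholder domain) that reads the attachment sitemap and confirms its entries now return 410 Gone:

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder domain: substitute your own.
SITEMAP = "https://mydomain.com/attachment-sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP, timeout=10) as resp:
    root = ET.fromstring(resp.read())

for loc in root.findall("sm:url/sm:loc", NS):
    url = loc.text
    req = urllib.request.Request(url, method="HEAD")
    try:
        urllib.request.urlopen(req, timeout=10)
        print(url, "-> still 200; the purge plugin is not serving 410 here")
    except urllib.error.HTTPError as e:
        # 410 Gone is what the purge plugin is supposed to return.
        print(url, "->", e.code)
```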
All that said, I have 3 questions:
1. What would happen if I remove the plugin at this point? What would happen to those 279 errors? Is there ANY harm I can do by deactivating it now?
2. IF the plugin has done what it is supposed to do (no additional 410 errors) wouldn’t it make sense to remove it now or do I need to wait until Google removes the 410s at some point in time?
3. IF I can deactivate it, can I then just mark all of the errors in Search Console as “fixed” and then remove the attachment sitemap too?
I’m just trying to get some clarification here! I understand the waiting-6-months thing, I do! However, if the process has already run its course on my end and now we’re just sitting and waiting for Google to run its course too, is it okay to remove things sooner rather than later?
I am an admitted novice and I just want to do the right thing. Any direction with regard to any of the above would be most graciously and humbly welcome.
Thank you most kindly!!