• Resolved knelsonkris

    (@knelsonkris)


    The location pod has a tiny bit of content on it; the content is also in a field named description. Each page in my location pod is shown using a template. I just made a robots.txt file (User-agent: * Disallow:) and added it to the root of the site. None of the pages in the location pod show in Screaming Frog. They are included in the sitemap, and Google has found them but didn’t index them – “Discovered – currently not indexed.” I read about that status, and one cited cause is robots.txt blocking, which is why I made the robots.txt file and uploaded it. The other cited cause is: “The page was found by Google, but not crawled yet. Typically, Google wanted to crawl the URL but this was expected to overload the site; therefore Google rescheduled the crawl. This is why the last crawl date is empty on the report.” Is there something within Pods that is causing this – does the page being output by a template make it “not a real page”? I am a bit lost on this one. Can you assist?

    The page I need help with: [log in to see the link]

Viewing 3 replies - 1 through 3 (of 3 total)
  • Plugin Support Paul Clark

    (@pdclark)

    (Removed duplicate reply. See below.)

    Plugin Support Paul Clark

    (@pdclark)

    Scanning the page with SEMRush and reviewing the HTML manually, everything looks normal aside from a few resource 404s, a Google Maps API key error, and some other JavaScript errors. These shouldn’t affect the ability of the page to be scanned by search engines, although SEMRush says script errors can affect the site’s ability to be indexed:

    Any script that has stopped running on your website may jeopardize your rankings, since search engines will not be able to properly render and index your webpages. 

    SEMRush

    The robots meta tag in the header looks mostly normal, but according to one tool the -1 values are not standard; they should either be omitted or set to positive values.

    <!-- current -->
    <meta name='robots' content='index, follow, max-image-preview:large, max-snippet:-1, max-video-preview:-1'/>

    <!-- option 1: -1 directives omitted -->
    <meta name="robots" content="index, follow, max-image-preview:large" />

    <!-- option 2: positive values -->
    <meta name="robots" content="index, follow, max-image-preview:large, max-snippet:50, max-video-preview:10" />
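For what it’s worth, none of those directives should block indexing either way. As a quick sanity-check sketch, this parses the content attribute and flags only the directives Google documents as actually preventing indexing (noindex and none); the helper names are made up for illustration:

```python
def parse_robots_meta(content):
    """Split a robots meta content attribute into normalized directives."""
    return [d.strip().lower() for d in content.split(",")]

def blocks_indexing(directives):
    """Only noindex or none prevent indexing; max-snippet / max-video-preview
    values (even -1, which Google treats as 'no limit') do not."""
    return any(d in ("noindex", "none") for d in directives)

directives = parse_robots_meta(
    "index, follow, max-image-preview:large, max-snippet:-1, max-video-preview:-1"
)
print(blocks_indexing(directives))  # False: nothing here prevents indexing
```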

    HTML of the page looks normal.

    robots.txt looks normal, although SEMRush reports format errors and suggests running it through Google’s robots.txt validator. That may also be something, although it looks normal to me after reading through Google’s robots.txt developer guidelines.

    User-agent: *
    Disallow:
    
    sitemap: https://sr22coverage.com/sitemap_index.xml
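The file can also be checked programmatically. As a sketch, Python’s standard urllib.robotparser confirms that an empty Disallow under User-agent: * permits crawling (the location URL below is a made-up example):

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules as the live robots.txt quoted above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://sr22coverage.com/sitemap_index.xml",
])

# An empty Disallow means "allow everything", so any agent may fetch
# any path, including the location pages.
print(rp.can_fetch("Googlebot", "https://sr22coverage.com/location/example/"))  # True
```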

    Locations are appearing in sitemaps, although across 7 XML files with 1,000 URLs each:

    https://sr22coverage.com/location-sitemap.xml
    https://sr22coverage.com/location-sitemap2.xml (etc. to 7, with 1000 pages per XML)
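For reference, each sitemap file can be spot-checked by counting its <loc> entries. A minimal sketch using the standard sitemap namespace, with a hypothetical two-entry fragment standing in for the real 1,000-entry files:

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical fragment; the live files each hold up to 1,000 <url> entries.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://sr22coverage.com/location/example-1/</loc></url>
  <url><loc>https://sr22coverage.com/location/example-2/</loc></url>
</urlset>"""

root = ET.fromstring(sample)
locs = [el.text for el in root.findall("sm:url/sm:loc", NS)]
print(len(locs))  # 2
```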

    Searching site:https://sr22coverage.com inurl:location returned 5 results once the “show filtered results” link at the bottom was clicked. Since some of the location URLs are indexed, and they are all powered by the same template, it seems 7,000 URLs may simply take a while to process. Then again, this query is a week old, so I would expect more of them to be indexed by now.

    There does not appear to be anything specific to Pods that would cause the linked page to not be indexed. The content also appears to load even with JavaScript disabled.

    With all that checking out, I’d start with the robots.txt validator, then check Google Search Console for any other warnings or errors. If that doesn’t pan out, try resolving the 404s and JavaScript errors reported in the Web Inspector JavaScript Console to see if that makes a difference.

    Plugin Author Jory Hogeveen

    (@keraweb)

    Hi @knelsonkris

    We haven’t heard back from you, so I’m closing this topic.
    Feel free to reopen it if you still need help!

    Cheers, Jory

  • The topic ‘Pod not showing up on Screaming Frog, not being indexed’ is closed to new replies.