  • Plugin Author Jeffrey L. Smith

    (@seo-design-solutions)

    Hi Enes:

    1) I see your website is currently indexed: https://www.google.com/search?q=site%3A10sebep.com

    This suggests that the site is indexed and pages are getting crawled; in fact, in the past 7 days you have had 21 pages indexed.

    There are some potential issues with category pages and some admin pages, which you may want to omit under Meta Robot Tags Editor >>> Default Values (see the sketch after this list):

    Such as…

    Prevent indexing of…
    Administration back-end pages
    Author archives
    Blog search pages
    Category archives
    Comment feeds
    Comment subpages
    Date-based archives
    Subpages of the homepage
    Tag archives
    User login/registration pages
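
    For reference, each of those checkboxes makes the plugin emit a robots meta tag on the matching archive pages. You do not need to write any code yourself; the following is just a minimal sketch of the effect, assuming standard WordPress hooks and conditional tags (this is not SEO Ultimate's actual source):

    // Sketch: emit noindex,follow on archive-type pages, similar in effect
    // to the Meta Robot Tags Editor >>> Default Values checkboxes above.
    add_action( 'wp_head', function () {
        if ( is_category() || is_tag() || is_date() || is_author() || is_search() ) {
            // noindex keeps thin archive pages out of the index; follow still
            // lets spiders pass through their links to your actual posts.
            echo '<meta name="robots" content="noindex,follow" />' . "\n";
        }
    } );

    The point of these defaults is to keep duplicate-content archive pages out of the index while spiders still follow their links through to your posts.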

    Also use the Canonicalizer module and check both of these boxes (illustrated in the sketch below):

    [x] Generate <link rel="canonical" /> meta tags
    [x] Send rel="canonical" HTTP headers

    Canonical URL Scheme
    [x] Use http:// or https:// depending on how the visitor accessed the page
    Make all canonical URLs begin with http://
    Make all canonical URLs begin with https://
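
    To illustrate, the two Canonicalizer checkboxes produce output along these lines. This is a sketch only; get_permalink() and the hooks are standard WordPress, but the actual module handles more page types than shown here:

    // Sketch: "Generate <link rel=canonical> meta tags" adds a tag to the head.
    add_action( 'wp_head', function () {
        if ( is_singular() ) {
            echo '<link rel="canonical" href="' . esc_url( get_permalink() ) . '" />' . "\n";
        }
    } );

    // Sketch: "Send rel=canonical HTTP headers" sends a Link header instead,
    // before any page output has started.
    add_action( 'template_redirect', function () {
        if ( is_singular() && ! headers_sent() ) {
            header( 'Link: <' . esc_url_raw( get_permalink() ) . '>; rel="canonical"' );
        }
    } );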

    Automated 301 Redirects
    [x] Redirect requests for nonexistent pagination
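
    The pagination redirect behaves roughly like this sketch (a simplification: redirecting to the homepage here stands in for however the plugin actually chooses the target URL):

    // Sketch: 301-redirect requests for pagination pages that do not exist,
    // e.g. /page/99/ on a blog that only has 4 pages of posts.
    add_action( 'template_redirect', function () {
        global $wp_query;
        $page = (int) get_query_var( 'paged' );
        if ( $page > 1 && $page > (int) $wp_query->max_num_pages ) {
            wp_safe_redirect( home_url( '/' ), 301 ); // send spiders to a real page
            exit;
        }
    } );

    A 301 tells spiders the nonexistent page has moved permanently, so they stop requesting it instead of accumulating crawl errors.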

    2) If your posts are not being crawled, that is typically a matter of authority, citations, and importance; how quickly a page is spidered and indexed is usually determined by PageRank (higher-PR sites and pages get crawled faster). Other factors are website architecture and internal linking.

    Your website still has PR0 and is considered new, or does not yet have sufficient PR to expedite spidering (but this will improve as you post more and earn more citations and traffic)…

    However, in this case, let’s make sure the following options are not checked.

    3) In the Meta Robot Tags Editor >>> make sure you did not check [x] Don’t cache or archive this site, under Sitewide Settings >>> Spider Instructions.

    a) Under the Meta Robot Tags Editor, make sure that on your pages and posts you have not checked [x] noindex or [x] nofollow.

    b) On the page or post itself, under SEO Settings >>> Miscellaneous, make sure you have not checked:
    [x] Noindex: Tell search engines not to index this webpage.
    [x] Nofollow: Tell search engines not to spider links on this webpage.
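
    When either box is checked, the plugin writes a robots meta tag into that page’s head. Here is a sketch of the mechanism, assuming the checkbox state is stored as post meta (the key names below are made up for illustration, not SEO Ultimate's actual keys):

    // Sketch: per-post noindex/nofollow stored as post meta, emitted in the head.
    add_action( 'wp_head', function () {
        if ( ! is_singular() ) {
            return;
        }
        $directives = array();
        if ( get_post_meta( get_the_ID(), '_example_noindex', true ) ) {
            $directives[] = 'noindex';
        }
        if ( get_post_meta( get_the_ID(), '_example_nofollow', true ) ) {
            $directives[] = 'nofollow';
        }
        if ( $directives ) {
            echo '<meta name="robots" content="' . esc_attr( implode( ',', $directives ) ) . '" />' . "\n";
        }
    } );

    So the quick check is to view a post’s HTML source: if you see <meta name="robots" content="noindex…">, one of these boxes is the culprit.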

    4) Under File Editor (be careful when playing with this module), you can also opt to:
    [x] Enable this custom robots.txt file and disable the default file
    – or –
    [x] Let other plugins add rules to my custom robots.txt file

    This should take care of your robots.txt issue. If it does not, the problem could be the host; but since I can see the site, spiders should be able to as well, and nothing is being blocked except the /go/ folder, /wp-admin/, and /wp-includes/.
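
    For background, WordPress serves robots.txt virtually, and the "Let other plugins add rules" option corresponds to plugins appending rules through WordPress’s standard robots_txt filter, roughly like this sketch (the /go/ rule mirrors the Link Mask Generator output shown below):

    // Sketch: how a plugin appends rules to WordPress's virtual robots.txt.
    add_filter( 'robots_txt', function ( $output, $is_public ) {
        if ( $is_public ) { // only add rules when the blog is visible to search engines
            $output .= "\nUser-agent: *\nDisallow: /go/\n";
        }
        return $output;
    }, 10, 2 );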

    Right now, the code SEO Ultimate inserts is:

    # Added by SEO Ultimate's Link Mask Generator module
    User-agent: *
    Disallow: /go/
    # End Link Mask Generator output

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/

    That is, unless you add additional rules using the Meta Robot Tags Editor.

    Keep in mind, indexation and page or site authority are based on each site’s individual permissions and on citations from other websites.

    Since you are being indexed, the plugin is installed, and new pages are being added (21 in the last 7 days), this suggests the plugin is working properly and is not inserting code that blocks spiders.

    Thread Starter eneskaraboga

    (@eneskaraboga)

    Thank you for your response and tips, mate!

  • The topic ‘After installation Google stopped indexing and problem with robots.txt’ is closed to new replies.