Forum Replies Created

Viewing 15 replies - 1 through 15 (of 33 total)
  • Thread Starter WebmasterT

    (@webmastert)

    Hi Maybellyne,

    The noteworthy list (Google):

    1. Search bar for the site
    2. Pages / Categories listed under search bar
    3. Latest from Collider display (FilmBook is in Google News like Collider)
    4. Images for Collider
    5. Collider website information on the Top Left

    The noteworthy list (Bing):

    1. The display of various Tags / Categories, along with Meta Description for each, under Title / Meta Description for Collider
    2. Search bar for the site
    3. News from Collider.com
    4. Twitter Account under News from Collider.com
    5. Images from Collider
    6. Collider Extras – YouTube
    7. Collider website information on the Top Left

    My sitemap is in Bing Webmaster Console.

    I just discovered that an old application to Bing PubHub was denied in 2018. I submitted a new application today.

    Thank you.

    • This reply was modified 1 year, 9 months ago by WebmasterT.

    Thread Starter WebmasterT

    (@webmastert)

    And my host’s screenshots of the error:

    Thread Starter WebmasterT

    (@webmastert)

    Hi Maybellyne,

    I gave your message to my web host and this was their reply:

    I just wanted to give you a quick update on this ticket. After several hours, the SEO optimization finally failed on my end. Unfortunately, I got a slightly different error from yours. Mine was a generic 500 POST request generated when the optimization failed. I went ahead and attached some screenshots of the exact error from Yoast, which was simply "Error message Error parsing the response to JSON". However, using the Health Check plugin to disable iThemes may have resulted in me seeing something other than 403. But the 500 status I was getting was from https://film-book.com/wp-json/yoast/v1/link-indexing/terms. The exact timestamp from DevTools was "Sun, 05 Feb 2023 01:04:23 GMT".
    
    I did cross-reference the server-side PHP logs and the website transfer logs; both showed 0 relevant server-side log entries. Even searching the transfer logs for the exact timestamp only showed 2 successful 200 requests from this URL. Here are the DevTools request headers and the transfer logs:
    
    Request URL: https://film-book.com/wp-json/yoast/v1/link-indexing/terms
    Request Method: POST
    Status Code: 500
    Remote Address: 172.67.129.226:443
    Referrer Policy: strict-origin-when-cross-origin
    cf-ray: 7947bb53fc8c6312-ORD
    date: Sun, 05 Feb 2023 01:04:23 GMT
    server: cloudflare
    vary: Accept-Encoding
    :authority: film-book.com
    :method: POST
    :path: /wp-json/yoast/v1/link-indexing/terms
    :scheme: https
    accept: */*
    accept-encoding: gzip, deflate, br
    accept-language: en-US,en;q=0.9
    content-length: 0
    cookie: wordpress_test_cookie=WP%20Cookie%20check; wordpress_logged_in_ba73c337561b427366fdf8d0ae57f3e0=nexcess_support_63dd2ba318389%7C1675717165%7CNMgib4CiPA6d9c9fAFK4NFIYVdAUWHmQrlig1sImcWI%7C982c1da2eae56c08fd26b89707d0e05eed08ff04226f78e3b455b537cb246d66; wp-health-check-disable-plugins=fec7df89c9d2a8a9133ae10559fa7be9; wp-settings-time-154=1675557534
    origin: https://film-book.com
    referer: https://film-book.com/wp-admin/admin.php?page=wpseo_tools
    sec-ch-ua: "Not_A Brand";v="99", "Google Chrome";v="109", "Chromium";v="109"
    sec-ch-ua-mobile: ?0
    sec-ch-ua-platform: "macOS"
    sec-fetch-dest: empty
    sec-fetch-mode: cors
    sec-fetch-site: same-origin
    user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36
    x-wp-nonce: cc431a2572

    [0128][[email protected] logs]$ grep '05/Feb/2023:01:04:23' transfer-2023-02-05.log | grep '50.28.76.132'
    50.28.76.132 - - [05/Feb/2023:01:04:23 +0000] "POST /wp-json/yoast/v1/link-indexing/terms HTTP/1.1" 200 6872 "https://film-book.com/wp-admin/admin.php?page=wpseo_tools" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"
    50.28.76.132 - - [05/Feb/2023:01:04:23 +0000] "POST /wp-json/yoast/v1/link-indexing/terms HTTP/1.1" 200 6892 "https://film-book.com/wp-admin/admin.php?page=wpseo_tools" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36"

    Now, I do see that the site is also behind Cloudflare, so the 500 (and the earlier 403) may be generated from something on the CDN side. But from what I can see, the SEO optimization errors appear to be triggered by something either WAF-related or application-related. Additionally, I noticed that your site currently has debug mode enabled. I cross-referenced that same timestamp in the debug logs and see a large number of WordPress database errors logged for your Easy Social Share Buttons plugin. I'm seeing database queries erroring out from the acd1ffcc_wrdpnew.wp_essb_subscribe_conversions table, so these sporadic errors could be related to database query issues. Here are the log entries around this timestamp:

    [0123][[email protected] wp-content]$ grep '05-Feb-2023 01:04:2$*' debug.log
    [05-Feb-2023 01:04:21 UTC] WordPress database error Table 'acd1ffcc_wrdpnew.wp_essb_subscribe_conversions' doesn't exist for query SHOW FULL COLUMNS FROM
    wp_essb_subscribe_conversions made by require('wp-blog-header.php'), require_once('wp-includes/template-loader.php'), do_action('template_redirect'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Ajax::do_ajax, do_action('essb_ajax_subscribe_conversion_loaded'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Subscribe_Conversions_Pro::db_log_loaded, ESSB_Subscribe_Conversions_Pro::db_log
    [05-Feb-2023 01:04:24 UTC] WordPress database error Table 'acd1ffcc_wrdpnew.wp_essb_subscribe_conversions' doesn't exist for query SHOW FULL COLUMNS FROM wp_essb_subscribe_conversions made by require('wp-blog-header.php'), require_once('wp-includes/template-loader.php'), do_action('template_redirect'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Ajax::do_ajax, do_action('essb_ajax_subscribe_conversion_loaded'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Subscribe_Conversions_Pro::db_log_loaded, ESSB_Subscribe_Conversions_Pro::db_log
    [05-Feb-2023 01:04:24 UTC] Failed to insert log entry: WordPress database error: Processing the value for the following field failed: url. The supplied value may be too long or contains invalid data.
    [05-Feb-2023 01:04:24 UTC] Failed to insert log entry: WordPress database error: Processing the value for the following field failed: url. The supplied value may be too long or contains invalid data.
    [05-Feb-2023 01:04:24 UTC] WordPress database error Table 'acd1ffcc_wrdpnew.wp_essb_subscribe_conversions' doesn't exist for query SHOW FULL COLUMNS FROM wp_essb_subscribe_conversions made by require('wp-blog-header.php'), require_once('wp-includes/template-loader.php'), do_action('template_redirect'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Ajax::do_ajax, do_action('essb_ajax_subscribe_conversion_loaded'), WP_Hook->do_action, WP_Hook->apply_filters, ESSB_Subscribe_Conversions_Pro::db_log_loaded, ESSB_Subscribe_Conversions_Pro::db_log
    [05-Feb-2023 01:04:24 UTC] Failed to insert log entry: WordPress database error: Processing the value for the following field failed: url. The supplied value may be too long or contains invalid data.
    [05-Feb-2023 01:04:24 UTC] Failed to insert log entry: WordPress database error: Processing the value for the following field failed: url. The supplied value may be too long or contains invalid data.

    What is the next step?

    Thank you.

    Thread Starter WebmasterT

    (@webmastert)

    The first time I ran it, this was the message; the Request URL was different.

    Thank you.

    Thread Starter WebmasterT

    (@webmastert)

    Hi Support,

    I followed all of the steps and received this error:

    Oops, something has gone wrong and we couldn’t complete the optimization of your SEO data. Please click the button again to re-start the process.

    Below are the technical details for the error. See this page for a more detailed explanation.

    Error details

    Request URL
    https://film-book.com/wp-json/yoast/v1/indexing/prepare

    Request method
    POST

    Status code
    403

    Error message
    Cookie check failed

    What do I do now? What is the next step?
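    For context, a 403 with "Cookie check failed" is WordPress rejecting a REST request whose login cookie and X-WP-Nonce header no longer match, which can happen when a cache, CDN, or security plugin interferes with cookies or serves a stale admin page. A minimal sketch of the idea (the function and field names are illustrative, not WordPress's actual internals):

```python
# Minimal sketch of the cookie/nonce gate behind a 403 "Cookie check failed".
# REST requests made from wp-admin authenticate with the login cookie plus an
# X-WP-Nonce header; names below are illustrative, not WordPress's own.
def rest_cookie_check(request):
    has_login_cookie = "wordpress_logged_in" in request.get("cookies", "")
    nonce_ok = request.get("headers", {}).get("X-WP-Nonce") == request.get("valid_nonce")
    if not (has_login_cookie and nonce_ok):
        return 403, "Cookie check failed"
    return 200, "OK"

# A stale nonce (e.g. a cached admin page carrying an old one) trips the check:
print(rest_cookie_check({
    "cookies": "wordpress_logged_in_abc=1",
    "headers": {"X-WP-Nonce": "stale"},
    "valid_nonce": "fresh",
}))  # (403, 'Cookie check failed')
```

    In that situation, reloading the admin page (to get a fresh nonce) and temporarily bypassing any page cache for logged-in users are the usual first things to try.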

    Thread Starter WebmasterT

    (@webmastert)

    SEO data optimization

    I followed all four steps. The optimization got about halfway through before the error occurred again (https://prnt.sc/VsQ6-9FLaFA-):

    • Oops, something has gone wrong and we couldn’t complete the optimization of your SEO data. Please click the button again to re-start the process. Below are the technical details for the error. See this page for a more detailed explanation.

      Error details
      Error message
      Error parsing the response to JSON.
      Response

    What is the next step?

    Thread Starter WebmasterT

    (@webmastert)

    1.) I resubmitted the sitemap and this was the result (https://prnt.sc/EOl1GCb4xO__):
    
    Sitemap could not be read
    General HTTP error
    1 instance
    We encountered an error while trying to access your sitemap. Please ensure that your sitemap is present at the specified address and is not blocked to Google. See our help center for more debugging help.
    
    How do I submit a sitemap (https://film-book.com/sitemap_index.xml) that can be read by Google Search Console?
    
    2-3.) This robots.txt has been implemented:
    
    User-Agent: *
    Disallow: /wp_admin/
    Allow: /wp-admin/admin-ajax.php
    
    Sitemap: https://film-book.com/sitemap_index.xml
    
    Does this allow Google to index the site?
    Does this allow Google to crawl, fetch, and index my entire site?
    
    4.) I looked at the browser's console as you instructed. Here is the result (https://prnt.sc/rmiJh6PBBEh1):
    
    JQMIGRATE: Migrate is installed, version 3.3.2
    admin-bar-v2.js?ver=11.7.1-202305:3 Missing data from PHP (wpNotesArgs).
    (anonymous) @ admin-bar-v2.js?ver=11.7.1-202305:3
    tie.js?ver=1675185723:23 Page Template Attr: select[name='page_template']
    ?v=2.0:16 WebSocket connection to 'wss://public-api.wordpress.com/pinghub/wpcom/me/newest-note-data' failed:
    v @ ?v=2.0:16
    
    What do I do? What is the next step? 
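    As a side note, the rules in a robots.txt like the one above can be checked locally with Python's standard robotparser. One thing worth noticing: the posted file disallows /wp_admin/ (with an underscore), which is not the same path as WordPress's /wp-admin/ (with a hyphen), so neither rule blocks the front page or normal posts:

```python
from urllib import robotparser

# Parse the exact rules posted above and ask what Googlebot may fetch.
rp = robotparser.RobotFileParser()
rp.parse("""\
User-Agent: *
Disallow: /wp_admin/
Allow: /wp-admin/admin-ajax.php
""".splitlines())

print(rp.can_fetch("Googlebot", "https://film-book.com/"))            # True: front page not blocked
print(rp.can_fetch("Googlebot", "https://film-book.com/wp-admin/"))   # True: only /wp_admin/ (underscore) is disallowed
print(rp.can_fetch("Googlebot", "https://film-book.com/wp_admin/x"))  # False
```

    So this file does not block Google from crawling or indexing the site; whether Google can actually read the sitemap is a separate question (server, WAF, or CDN responses to Googlebot).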
    Thread Starter WebmasterT

    (@webmastert)

    4. Update 2 – I tried running it again and this was listed under Response both times (attached). What do I do?

    Thread Starter WebmasterT

    (@webmastert)

    4. Update – I ran the SEO data optimization and received this error:

    Error details

    Error message
    Error parsing the response to JSON.

    What should I do?
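    For what it's worth, "Error parsing the response to JSON" usually means the REST endpoint returned something other than pure JSON, e.g. a PHP notice, a WAF/CDN error page, or debug output prepended to the body. A minimal illustration (the stray prefix and the response value are made up):

```python
import json

clean = '{"objects": 50}'                 # a pure-JSON REST response (value is made up)
dirty = "<br />Warning: ...\n" + clean    # hypothetical stray PHP notice prepended

json.loads(clean)                         # parses fine
try:
    json.loads(dirty)
except json.JSONDecodeError:
    print("Error parsing the response to JSON.")
```

    That is why the host's note about database errors and debug mode is relevant: anything echoed before the JSON body makes the whole response unparseable to the plugin.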

    Thread Starter WebmasterT

    (@webmastert)

    Hi Maybellyne,

    Thanks for the quick reply.

    1.) I did as you suggested. I placed the sitemap URL (https://film-book.com/sitemap_index.xml) in the URL Inspection Tool and this was the result:

    URL is not on Google
    This page is not indexed. Pages that aren’t indexed can’t be served on Google. See the details below to learn why it wasn’t indexed.

    Page indexing
    Page is not indexed: Excluded by ‘noindex’ tag

    Discovery
    Sitemaps
    N/A
    Referring page
    https://film-book.com/tag/jason-tobin/
    https://film-book.com/upload-amazon-studios-renews-the-sci-fi-comedy-tv-series-for-a-second-season/
    https://film-book.com/tv-review-animal-kingdom-season-2-episode-9-custody-tnt/
    https://film-book.com/ready-player-one-2018-movie-trailer-4-final-promo-steven-spielbergs-film/
    Crawl
    Last crawl
    Dec 21, 2022, 5:21:02 AM
    Crawled as
    Googlebot smartphone
    Crawl allowed?
    Yes
    Page fetch
    Successful
    Indexing allowed?
    info
    No: ‘noindex’ detected in ‘X-Robots-Tag’ http header
    Indexing
    User-declared canonical
    None
    Google-selected canonical
    ?
    Inspected URL

    I followed the guide that you linked to. I clicked “Request Indexing” for the sitemap (https://film-book.com/sitemap_index.xml). The result was:

    Indexing request rejected
    During live testing, indexing issues were detected with the URL

    What do I do now?

    2.) I do want Google crawling my home page, fetching my home page, and all of the other pages and posts on the site.

    3.) I want FilmBook (https://film-book.com) added to Google. It was previously. How do we accomplish this…again? How did the site get un-added?

    4.) Okay. Will do.
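    Since the inspection result above says the 'noindex' was detected in the 'X-Robots-Tag' HTTP header, the directive is being sent as a response header (by the server, CDN, or a plugin) rather than in the page HTML, and it can be confirmed from the command line with `curl -I https://film-book.com/`. A small sketch of the same check (the sample headers are made up):

```python
# Sketch: detect an indexing-blocking X-Robots-Tag in response headers, the
# same condition the URL Inspection Tool reported above.
def indexing_blocked(headers):
    value = headers.get("X-Robots-Tag", "")
    return "noindex" in value.lower()

print(indexing_blocked({"X-Robots-Tag": "noindex, nofollow"}))  # True
print(indexing_blocked({"Content-Type": "text/html"}))          # False
```

    If the header is present site-wide, requesting indexing will keep getting rejected until whatever is injecting it (server config, CDN rule, or plugin setting) is removed.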

    Thread Starter WebmasterT

    (@webmastert)

    Hi Support,
    
    1.) I went through the steps as you suggested, altering the robots.txt file, etc.
    
    I added the sitemap to Google Search Console and this error showed up:
    
    https://capture.dropbox.com/gHuOO2uGc1SPph8x
    
    "Sitemap could not be read
    General HTTP error
    1 instance
    We encountered an error while trying to access your sitemap. Please ensure that your sitemap is present at the specified address and is not blocked to Google. See our help center for more debugging help."
    
    What did I do wrong?
    
    2.) Also, I ran a search of film-book.com in the URL Inspection Tool and found this:
    
    https://capture.dropbox.com/4G6l9mUhaCxymljH
    
    "Indexed, though blocked by robots.txt
    Crawl allowed? No: blocked by robots.txt
    Page fetch: Failed: Blocked by robots.txt"
    
    I don't want Google crawling or page fetching blocked. I was told by my web host that we had slowed down the Googlebot crawl rate, not stopped it altogether. Google seems to be saying otherwise. Am I wrong?
    
    3.) I clicked Live Test in Google Search Console, and received this result:
    
    https://capture.dropbox.com/sCqSlXU9GzYpWQTm
    
    "URL is not available to Google
    This page cannot be indexed. Pages that aren't indexed can't be served on Google. See the details below to learn why it can't be indexed Learn more"
    
    "error
    Page availability
    Page cannot be indexed: Blocked by robots.txt
    URL will be indexed only if certain conditions are met
    Crawl
    Time
    Jan 29, 2023, 9:54:04 AM
    Crawled as
    Googlebot smartphone
    Crawl allowed?
    error
    No: blocked by robots.txt
    Page fetch
    error
    Failed: Blocked by robots.txt
    Indexing allowed?
    N/A"
    
    You said that "It might take some days for Google to fully crawl your site and index URLs in the search result." but if Google's bots can't crawl the pages, how are these errors going to change?
    
    4.) I have had Yoast SEO installed for years now. Do I need to go through the plugin's "SEO data optimization" under "First-time configuration"?
    
    Thank you.
    Thread Starter WebmasterT

    (@webmastert)

    Were you able to look at my screenshots?


    Thread Starter WebmasterT

    (@webmastert)

    Angelo:

    “If you want to organize the data separately, then you may want to take a look at post type podcasting. With this you can create your own post type then tell your content producers to create podcast episodes in that post type you setup.”

    We already created a specific post type for our podcasts long ago:

    https://film-book.com/category/podcast/

    “PowerPress then allows you to add your own custom feed slug name to the post type. IF you make a post type called “food-series” and use the slug “food-audio-edition”, then the feed would look like example.com/food-series/feed/food-audio-edition/.”

    We did this years ago, the moment we started using the PowerPress Podcasting plugin by Blubrry:

    Example – for the podcast FilmBookCast, we created the slug “filmBookcast” for it in PowerPress. The URL within PowerPress that we created for this podcast is https://film-book.com/feed/filmbookcast/.

    https://www.dropbox.com/s/snxmvjykmnk2iwy/Screenshot%202019-01-24%2007.11.17.png?dl=0

    https://www.dropbox.com/s/mzrepbhjzssy9ac/Screenshot%202019-01-24%2007.14.23.png?dl=0

    None of that is a problem.

    This is the problem:

    All of these website links (and others) are showing data from one of our podcasts and not the regular data from their associated website category or tag:

    https://film-book.com/category/movie-news/feed/ (the problem started on or after December 11, 2018)
    https://film-book.com/category/tv-show-reviews/feed/ (the problem started on or after December 5, 2018)
    https://film-book.com/tag/game-of-thrones/feed/ (the problem started on or after November 20, 2018)

    How do we stop the podcast data from showing up in our regular website category and tag links?

    How do we get the regular website information to show again?
