Virtual robots.txt blocks site from Google
Hi!
Somehow a virtual robots.txt is blocking the site https://www.domda.se from Google search.
The site is set to be visible to search engines in the Privacy settings (of course).
I have tried:
Turning the privacy setting/search engine allowance off and on again.
Turning all the plugins off.
Adding a completely permissive robots.txt to the root directory (essentially the two-line allow-all file shown at the end of this post).
Using a robots plugin to write what should be in robots.txt.

Now I wonder if someone can help me. The code that generates the virtual robots.txt, in wp-includes/functions.php, says:
function do_robots() {
    header( 'Content-Type: text/plain; charset=utf-8' );

    do_action( 'do_robotstxt' );

    $output = "User-agent: *\n";
    $public = get_option( 'blog_public' );
    if ( '0' == $public ) {
        $output .= "Disallow: /\n";
    } else {
        $site_url = parse_url( site_url() );
        $path = ( !empty( $site_url['path'] ) ) ? $site_url['path'] : '';
        $output .= "Disallow: $path/wp-admin/\n";
        $output .= "Disallow: $path/wp-includes/\n";
    }

    echo apply_filters( 'robots_txt', $output, $public );
}
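If I read that right, whatever gets echoed passes through the robots_txt filter at the end, so in principle something like the snippet below (just a rough sketch on my part, not something I have actually put on the site) should force an allow-everything file no matter what blog_public contains. It would go in the theme's functions.php or a small plugin:

// Rough sketch: force the virtual robots.txt to allow everything,
// using the 'robots_txt' filter that do_robots() applies above.
// Priority 99 so it runs after core and any other plugin's filter.
add_filter( 'robots_txt', function ( $output, $public ) {
    // For debugging: log what the option actually is when robots.txt is served.
    // error_log( 'blog_public = ' . var_export( get_option( 'blog_public' ), true ) );
    return "User-agent: *\nDisallow:\n";
}, 99, 2 );

But that feels like papering over the real problem: judging from the if ( '0' == $public ) branch, something must be making blog_public read as 0 when the page is served, and I would rather find out what.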
I hope someone can help.
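P.S. The allow-everything robots.txt I mentioned putting in the root directory was essentially the standard two-line allow-all file:

User-agent: *
Disallow:

As far as I understand, a real robots.txt file in the document root should normally be served by the web server before WordPress is even asked, so I am confused that the virtual one still wins.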