• The title is pretty self-explanatory, but I’m trying to figure out why WordPress allows this, or whether it’s perhaps a setting in .htaccess. Here’s the example:

    I have a page, let’s say a “Contact” page, with the slug “contact-us”.

    I go to https://example.com/contact-us/ — everything works normally

    I go to https://example.com/contact-us/contact-us/contact-us/contact-us/contact-us/contact-us (I could do this a million times) and it does NOT trigger a 404 error!

    Why is this? If I enter a random string of numbers, it instantly triggers 404.php, but just repeating the page-slug does not trigger a 404.

    I would like repeated page slugs not to be allowed at all, and a 404 to be triggered if the slug is repeated even once.

    Has anyone else experienced this issue? Is it possibly a canonical link issue or a “nofollow”? I just don’t understand why WordPress is not treating this as a 404.

    Thanks in advance for any replies, suggestions, and help.

  • When you do that and press Enter, once the page loads it should go back to just having the slug once…

    So if you type in https://example.com/contact-us/contact-us/contact-us/contact-us/contact-us/contact-us and press Enter, the address bar should read just https://example.com/contact-us/ after the page loads. If it still shows the page slug multiple times after the page has loaded, then there is a problem with how you have things configured.

    This is a failsafe of sorts, implemented in the permalink code, so that if someone accidentally links to a page with the slug repeated in the URL, the right page still loads.
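    If you’d rather have those repeated-slug URLs return a 404 regardless of the failsafe, something along these lines in a small must-use plugin or the active theme’s functions.php should work. This is only a rough sketch using standard WordPress hooks and functions; the guard conditions are assumptions, and you would want to test it against paginated pages and any custom endpoints before relying on it.

    ```php
    <?php
    /**
     * Rough sketch: force a 404 whenever the requested path for a page
     * does not match that page's canonical permalink path.
     */
    add_action( 'template_redirect', function () {
        // Only act on single pages; skip the front page so '/' keeps working.
        if ( ! is_page() || is_front_page() ) {
            return;
        }

        $requested = trim( (string) parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH ), '/' );
        $canonical = trim( (string) parse_url( get_permalink( get_queried_object_id() ), PHP_URL_PATH ), '/' );

        // A request like /contact-us/contact-us/ will not match /contact-us/.
        if ( $requested !== $canonical ) {
            global $wp_query;
            $wp_query->set_404();
            status_header( 404 );
            nocache_headers();
        }
    } );
    ```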

    Do you have a custom 404 error template/page?

    Thread Starter Ben Kaminski (@tzhben)

    Yes, I have a custom 404 page that functions normally under other circumstances that would trigger a 404.

    This problem is not built into WordPress, as I have a colleague who is using “BlueHost” and gets a 404 when he tries my test above.

    Thanks for the responses, guys, but none have helped so far.

    The major issue here is that bots/scrapers/crawlers can run out of memory when crawling the site if malformed URLs like this resolve instead of returning a 404.

    And before the moderators chime in: YES, I have disabled ALL plugins, with no relief. So please don’t ask whether I’ve tried that, because of course I have.

    Correct me if I am wrong, but you would like to have https://example.com/contact-us/contact-us/contact-us/contact-us/contact-us/contact-us go to a 404… right?

    If so: I did some testing last night with various sites I manage, hosted on different servers with different configurations. What I said about the failsafe is true, but it is not foolproof, because some servers are not configured in a way that works with it. In some cases the result is simply inconsistent, which is what happened with you and your colleague.

    As for the bots/crawlers: that is why I always suggest using sitemaps. With a sitemap in place, the bots/crawlers will not pick up the sample string you provided; even if it is posted somewhere else, they will not index it. Hopefully that makes sense and is more helpful than what I was suggesting yesterday.
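    If you want to point crawlers at a sitemap without touching files on the server, one option is a small filter like the one below. It only applies when no physical robots.txt exists (WordPress serves its virtual one in that case), and the /sitemap.xml path is just an assumption; use whatever URL your sitemap plugin actually generates.

    ```php
    <?php
    // Rough example: append a Sitemap line to WordPress's virtual robots.txt.
    // The /sitemap.xml path is an assumption; adjust it to your sitemap's real URL.
    add_filter( 'robots_txt', function ( $output, $public ) {
        if ( $public ) {
            $output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
        }
        return $output;
    }, 10, 2 );
    ```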

    Thread Starter Ben Kaminski (@tzhben)

    @binarywc, yes, that makes much more sense. We’re trying to determine whether it’s an issue with the new server we just migrated to, or whether this problem is inherent in WordPress.

    Since we’re still developing the environment, we would like to figure this issue out before we get to sitemap submission and robots.txt.

    I very much appreciate your input and thanks for putting some thought into it.

    If we can pin this down to a lack of sitemaps/robots.txt causing the “out of memory” error we’re seeing on our end, then by all means we’ll put those in place immediately.

    Once again, thanks… going to keep looking for a solution.

    I am sorry, I totally overlooked the out of memory comment earlier.

    I have been working with WordPress for ten years and can almost guarantee that is a separate problem from the permalinks issue. There is likely a plugin that is causing the memory issues.

    The sitemaps and robots.txt will help with the permalinks side, but you need a separate fix for the out-of-memory issues. I would suggest checking every error log you can find on your server and seeing what it says, then fixing the memory issues from there.
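    If logging isn’t already on, a few lines in wp-config.php (above the “That’s all, stop editing!” line) will usually surface whatever is behind an out-of-memory symptom. The 256M/512M figures below are only example values, and the server’s own PHP memory_limit still has to allow them.

    ```php
    // Debug logging: write notices and errors to wp-content/debug.log
    // without printing them on the rendered pages.
    define( 'WP_DEBUG', true );
    define( 'WP_DEBUG_LOG', true );
    define( 'WP_DEBUG_DISPLAY', false );

    // If the log confirms PHP is exhausting memory, these raise WordPress's
    // front-end and admin memory ceilings (example values, not recommendations).
    define( 'WP_MEMORY_LIMIT', '256M' );
    define( 'WP_MAX_MEMORY_LIMIT', '512M' );
    ```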

    Also, I was having some odd memory issues just last week on a site I help manage. After hours of trying various fixes, we ran a backup and clicked the “Re-install” option on the Updates page in the wp-admin area, and the memory issues went away.

    Hope that helps.

    Thread Starter Ben Kaminski (@tzhben)

    Davood, I have found a way to solve my own problem. Today I noticed that I was running two WordPress builds on the same server, but one returned a 404 while the other just went back to the page.

    The answer is actually really simple: it was a permalink issue. To get the behavior I was looking for, I had to select “Post Name” in my permalink settings.

    I previously had a custom string of options for a different display style using “Custom Structure”, which I’m assuming is where the “failsafe” came in.

    I’m willing to give up the extra permalink styling options in favor of the 404.
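    For anyone else landing here: “Post Name” on the Settings → Permalinks screen corresponds to the /%postname%/ structure stored in the permalink_structure option. The admin screen is the normal way to change it; programmatically it would look roughly like this, shown only to illustrate what the setting stores.

    ```php
    <?php
    // Illustration of what the "Post Name" setting stores; normally you would
    // just pick it on the Settings -> Permalinks screen instead of running this.
    update_option( 'permalink_structure', '/%postname%/' );
    flush_rewrite_rules(); // rebuild the rewrite rules so the change takes effect
    ```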

    Thanks Davood for thinking this through with me.

  • The topic ‘Why does WordPress allow multiple page-slugs in url without triggering 404?’ is closed to new replies.