• Resolved Josh Robbs

    (@jwrobbs)


    I am working on a site where the owner wants some very customized URL rewrites. I know how to do the work, but I don’t know the performance cost. All I can find on Google are “how to” articles.

    How fast or slow are rewrites?
    How many are too many?
    How complex is too complex?
    Are there any best practices to follow to avoid creating needlessly slow rewrites?
    Is there anything I should avoid?

    Thanks!

    Josh

  • Moderator bcworkz

    (@bcworkz)

    Rewrites use preg_match() against an array of rules, stopping at the first match. If the matching rule is one of the first ones tried, how many come after it is of little consequence. It’s not clear why you’d need a large number of rules, though. You’re matching a URL structure, not specific values, so the variability should come from URL parameters, not from an ever-growing list of rules to test.

    With various URL parameters, the work needed to find a match is passed on to MySQL, which is much better at that sort of thing. For example, take this URL:
    example.com/foo/one/two/three/four/

    Maybe the rewrite rule only needs to match the “foo”. The remainder can be all sorts of values that would be used to query for the right content. The query vars could be hierarchical or totally arbitrary. These can lead to millions of different resources, all from one rewrite rule.
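
    Here’s a minimal sketch of that approach. The “foo” prefix, the query var name foo_path, and the assumption that a page with slug “foo” handles the display are all placeholders, not anything from your actual setup:

        // One rule: match /foo/ and hand everything after it to a custom query var.
        add_action( 'init', function () {
            add_rewrite_rule(
                '^foo/(.+?)/?$',
                'index.php?pagename=foo&foo_path=$matches[1]',
                'top'
            );
        } );

        // Tell WordPress to keep the custom query var.
        add_filter( 'query_vars', function ( $vars ) {
            $vars[] = 'foo_path';
            return $vars;
        } );

    Whatever handles the request can then split get_query_var( 'foo_path' ) however it likes and run a single query for the right content. Just remember to flush the rules once (visit Settings > Permalinks) after adding the rule.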

    Thread Starter Josh Robbs

    (@jwrobbs)

    It wouldn’t be simply a regex match. I’d be using conditionals too.

    They want the site to be able to process URLs like:
    domain.com/tax1
    domain.com/tax2
    domain.com/cpt1
    domain.com/cpt1/tax1
    domain.com/tax1/tax2
    domain.com/cpt1/tax1/tax2

    Tax1 and Tax2 are both custom taxonomies.

    I’m assuming I’ll have to whip up a series of conditional statements based on the slugs before I get to add_rewrite_rule(). And that means more database calls (is_taxonomy() lookups and whatnot). I’m afraid that doing that would slow everything down.

    Please, correct me if I’m wrong.

    I could add triggers if I need to.
    Turn `domain.com/tax1` into `domain.com/1/tax1` so I’d know which rule to use. That could help shrink the process.
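
    For what it’s worth, a rough sketch of that trigger idea. The “1/” and “2/” prefixes are literal strings in the regex, so no runtime conditionals or extra database lookups would be needed to pick a rule. The query vars tax1 and tax2 are assumptions here; they’d be whatever the taxonomies actually register:

        add_action( 'init', function () {
            // domain.com/1/{term}  ->  taxonomy tax1
            add_rewrite_rule( '^1/([^/]+)/?$', 'index.php?tax1=$matches[1]', 'top' );

            // domain.com/2/{term}  ->  taxonomy tax2
            add_rewrite_rule( '^2/([^/]+)/?$', 'index.php?tax2=$matches[1]', 'top' );

            // domain.com/1/{term}/2/{term}  ->  both taxonomies at once
            add_rewrite_rule(
                '^1/([^/]+)/2/([^/]+)/?$',
                'index.php?tax1=$matches[1]&tax2=$matches[2]',
                'top'
            );
        } );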

    Moderator bcworkz

    (@bcworkz)

    Of course extra rewrite rules will slow things down; the question is by how much. It’s one extra preg_match() for every rule tried until there’s a match, and up to a point that extra time is negligible. Where’s that point? Hard to say; there are many external factors affecting perceived speed. You’d need to do some micro-timer testing on your installation to get a quantifiable answer.
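
    If it helps, here’s a rough way to get that number. It just replays the matching loop WordPress runs in WP::parse_request() against the stored rules, with a made-up request path:

        $rules   = get_option( 'rewrite_rules' );   // the stored regex => query map
        $request = 'cpt1/some-term/another-term';   // hypothetical path to test

        $start = microtime( true );
        $tried = 0;
        foreach ( (array) $rules as $regex => $query ) {
            $tried++;
            if ( preg_match( "#^$regex#", $request ) ) {
                break;   // WordPress stops at the first matching rule
            }
        }
        $ms = ( microtime( true ) - $start ) * 1000;
        error_log( sprintf( '%d rules tried, %.3f ms', $tried, $ms ) );

    That gives you both the number of rules tried and the time spent on them, which is the quantifiable answer I mean.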

    Adding some taxonomies and post types is fine in general, but if you find you need a large number of types or taxonomies, it’s a sign your data schema may not be well thought out. As I don’t know your application, I can’t judge. Instead of a large number of taxonomies or post types, a hierarchical structure for each may work better. For example, instead of 10 taxonomies, have one with 10 top-level terms, each with appropriate child terms under it.
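
    As a sketch of that last idea (the names here are invented, not a recommendation for your schema):

        add_action( 'init', function () {
            // One hierarchical taxonomy instead of ten flat ones: the top-level
            // terms stand in for what would otherwise be separate taxonomies.
            register_taxonomy( 'catalog_section', 'post', array(
                'hierarchical' => true,              // parent/child terms, like categories
                'rewrite'      => array(
                    'slug'         => 'section',
                    'hierarchical' => true,          // /section/parent/child/ URLs
                ),
            ) );
        } );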

    Thread Starter Josh Robbs

    (@jwrobbs)

    Thanks! That’s what I’m hoping to do. I’m trying to understand the technical implications of the SEO guy’s wishlist.

  • The topic ‘add_rewrite_rule speed/performance’ is closed to new replies.