[thelist] Re: Search engines and dynamic sites using query strings

Judah McAuley judah at alphashop.com
Wed Jan 31 17:01:48 CST 2001

> > This approach seems like it would solve the problem of always getting a 404
> > error in ASP before being able to split the URL on the slashes, since the
> > custom 404 page does the work. One question, though: one script (the 404
> > page) basically drives the entire evolt site? Doesn't that have an adverse
> > impact on server performance?
>Why would it? The 404 script would most likely always be in memory (and I'm
>sure that CF has some nifty semi-caching features for scripts held in memory).
>So, aside from it being two "requests" on the server for every one true
>request from the client, it's not really that big of a deal (and, considering
>that one request is on a page that's in memory, the increase is trivial).

The biggest performance hit that I've seen is that CF (in our case) has to 
parse a relatively large file of includes to figure out where the code to 
be rendered lives.  It's a fairly large switch statement if you have a 
complex site (or set of sites).  That said, the performance hit is small 
compared to non-optimized DB queries and the like, and the method has the 
advantage of bringing all of the site's structure into one place (a la the 
Fusebox coding model).  The biggest thing I have against the method is 
that it feels morally wrong: having every request be a bad request just 
seems backwards.  However, it does work, and it is very effective.
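For anyone who hasn't seen the technique, here's a rough sketch of what that
central "404 as router" switch boils down to.  This is Python rather than CF
or ASP, and all the names (route_request, ROUTES, the handlers) are made up
for illustration, not taken from the evolt code:

```python
# Hypothetical sketch of routing friendly URLs through a custom 404 page.
# The web server fails to find a real file, invokes the 404 handler, and
# the handler splits the requested path on slashes to pick an include --
# the "fairly large switch statement" described above.

def show_article(args):
    # Stand-in for including/rendering the article template.
    return "article page for %s" % args

def show_home(args):
    return "home page"

# One central map of first path segment -> handler, analogous to the
# Fusebox-style switch over includes.
ROUTES = {
    "articles": show_article,
    "": show_home,
}

def route_request(url):
    """Split a 'friendly' URL on slashes and dispatch to a handler,
    the way the custom 404 script would."""
    segments = [s for s in url.strip("/").split("/") if s]
    key = segments[0] if segments else ""
    handler = ROUTES.get(key)
    if handler is None:
        return "real 404: no matching route"
    return handler(segments[1:])
```

So a request for /articles/123 never matches a real file; the 404 handler
catches it, splits it, and hands ['123'] to the article code, while a path
with no matching first segment falls through to a genuine 404.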
