[thelist] Performance testing: Reasonable TTLB benchmark

Philippe Jadin philippe.jadin at 123piano.com
Mon May 28 02:49:52 CDT 2001


With this kind of problem it's much easier if you can see the real thing, to
say the least...

I think the right way to speed something up is to find the bottlenecks that
can appear on a web site. But that's a very general answer that won't help
much, I'm afraid. There are thousands of possible culprits. Here are just my
first ideas, in no particular order:

- hardware speed
- connection speed
- OS speed (and the number of sites on the same box / traffic)
- scripting language speed (and the efficiency of its use)
- database speed (again: server, network, language, use of all of this...)
- speed of services external to the site (such as news feeds, trackers,
banners...)
- the way the HTML is built (is the whole page in one big table?)

> For the record, the site (without naming names or giving too much away):

Send a URL and the bottlenecks may show themselves ;-)

> - is largely content-driven
> - provides a service that is not dissimilar to a job-
>   searching/job-matching service (but not actually that)
> - provides news and classified-type services to certain niche
>   user groups on topics that interest them

At the very least, you could try to find out how many database queries each
page makes and whether they are well "refined" (SELECT xxx, yyy, zzz FROM...
instead of SELECT * FROM...).
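To illustrate what I mean by "refined", here is a minimal sketch. The "jobs"
table, its columns, and the SQLite connection are all invented for the
example (the real site uses Oracle), but the idea is the same:

    import sqlite3   # stand-in for whatever database the site really uses

    conn = sqlite3.connect("example.db")   # hypothetical database
    cur = conn.cursor()

    # Unrefined: pulls every column of every row over the wire,
    # then throws most of it away in application code
    cur.execute("SELECT * FROM jobs")
    open_jobs = [row for row in cur.fetchall()
                 if row[3] == "open"]   # assuming column 3 is the status

    # Refined: asks the database for only the columns and rows it needs
    cur.execute("SELECT id, title, posted_on FROM jobs WHERE status = 'open'")
    open_jobs = cur.fetchall()

The second form sends less data across the network and lets the database use
its indexes, which is usually where the time goes.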

> - does not, as far as I am aware, use SSL encryption anywhere
>   (which would slow it down a bit)

Let's hope they don't have SSL on the homepage ;-) It seems quite irrelevant
here, IMHO.

> - runs on Sun-clone Solaris hardware, running BEA
>   WebLogic off Oracle

Do some research on the hardware (which seems fine), on BEA, and on Oracle
(no, I'm not so sure Oracle is always the fastest way of doing things, for
example on lower-end hardware...).

> - contains nothing (getting even vaguer here, I know)
>   which is particularly complicated or more whizz-bang
>   than the majority of content-based sites out there (really
>   no magic 'special features' - just hierarchical/searchable
>   blocks of text)

That doesn't tell you much: where does the information come from and how is
it processed? Even if it's "simply" a matter of opening a text file from
another site and putting it in a text area, if the other site is very slow it
will slow down the whole site... If everything comes from an internal DB, you
should benchmark the DB first. Then have a look at the code and see if they
do "weird" things, such as sorting query results in the scripting language
instead of using the SQL "ORDER BY" clause, etc... This will give you an idea
of how they are doing things, and whether they are doing them well...
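As an example of that kind of "weird" thing, a quick sketch (again the "news"
table and the SQLite connection are made up, and I'm using Python only
because it is short; the pattern is the same in any language):

    import sqlite3   # stand-in for the real database

    conn = sqlite3.connect("example.db")   # hypothetical database
    cur = conn.cursor()

    # The "weird" way: fetch every row, then sort in the scripting language
    cur.execute("SELECT id, title, posted_on FROM news")
    items = sorted(cur.fetchall(), key=lambda row: row[2], reverse=True)

    # The sane way: let the database sort (it can use an index for that)
    cur.execute("SELECT id, title, posted_on FROM news "
                "ORDER BY posted_on DESC")
    items = cur.fetchall()

If you see a lot of the first pattern, or queries inside loops, you have
found at least one of the bottlenecks.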

> non-cacheable pages @ 15/sec with 500-600 millisecond TTLB is quite
> doable, on crappy old pure VBScript ASP.

Couldn't agree more. There must be a stupid bottleneck somewhere. A normal
dynamic site should serve 20-100 pages per second on a mid-range server.

> back up my belief that the company in question are a load of shonks for
> even suggesting such a thing, we will both be very grateful.

It's nearly impossible to compare sites directly, but given that most sites
respond at the speed of light (I mean more than 1 page/sec ;-), the best way
I can see to convince them is to find the same kind of site (*you* have to do
it, since you have the original model in front of you), run some benchmarks
against it, and compare the results with your friend's site. (A good tool for
this is Apache Bench, called ab.)
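For example, something like "ab -n 500 -c 10 http://the.other.site/somepage"
will fire 500 requests, 10 at a time, and report requests per second and time
per request. If you would rather collect the raw time-to-last-byte numbers
yourself, here is a minimal single-threaded sketch (the URL and the number of
runs are placeholders, and it measures sequential requests only, not
concurrent load):

    import time
    import urllib.request

    URL = "http://www.example.com/somepage"   # page you want to measure
    RUNS = 50

    timings = []
    for _ in range(RUNS):
        start = time.time()
        with urllib.request.urlopen(URL) as response:
            response.read()                    # wait for the last byte
        timings.append(time.time() - start)

    timings.sort()
    print("median TTLB: %.0f ms" % (timings[len(timings) // 2] * 1000))
    print("worst  TTLB: %.0f ms" % (timings[-1] * 1000))

Run the same thing against both sites from the same machine at roughly the
same time, so your own connection speed cancels out of the comparison.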

It's always nice to have real benchmark numbers. IMHO, that's the only thing
that can be used in this case.

Sorry if this sounds too vague or obvious, but that's all I can say given
your question :-p

Philippe




