[thelist] responsiveness from "the coffee shop"

Christopher Marsh Christopher.Marsh at akqa.com
Wed Oct 27 09:32:03 CDT 2010


Bob

Apologies for the top-post, but I'm making a general comment rather than addressing a specific point. The "coffee shop" is a distraction; it can be generalised to "a slow connection", and that can be quantified. If it really is a specific coffee shop, then you can do some speed tests there. Performance of your web application over such a connection can be described as a non-functional requirement and agreed between you and the client. Automated tests can then be created over a connection throttled to that speed, and you can demonstrate that you have completed the task to specification.

This process is not merely about contracts: if you've correctly identified the requirement, then your application will behave reliably in the wild and your client will be happy with the result of their expenditure. Or that's the idea, anyway...
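
As a rough illustration of such a throttled test (not your setup: the URL and budget below are placeholders, and the actual throttling would be applied at the network or proxy level before this runs), the check can be as small as timing a full page fetch against the agreed budget:

# Minimal sketch: fetch the page once and compare the elapsed time against
# the load-time budget agreed with the client for the throttled connection.
# URL and BUDGET_SECONDS are placeholders, not values from this thread.
import time
import urllib.request

URL = "http://www.example.com/"   # hypothetical page under test
BUDGET_SECONDS = 10.0             # agreed non-functional requirement

start = time.time()
with urllib.request.urlopen(URL, timeout=120) as response:
    body = response.read()        # pull down the full HTML document
elapsed = time.time() - start

print("Fetched %d bytes in %.2f seconds" % (len(body), elapsed))
assert elapsed <= BUDGET_SECONDS, "page exceeded the agreed load-time budget"

Run over a connection rate-limited to the agreed speed, a pass/fail check like this is repeatable in a way that a one-off report from a coffee shop is not.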

"Coffee shop" is simply too vague a term, and anecdotal evidence of slow speeds (if not repeatable) is simply too vague and unreliable to do anything other than burn your time trying to fix something without really knowing what it is you're fixing.

HTH and good luck.

Cheers, Chris

-----Original Message-----
From: thelist-bounces at lists.evolt.org [mailto:thelist-bounces at lists.evolt.org] On Behalf Of Bob Meetin
Sent: 25 October 2010 22:02
To: thelist at lists.evolt.org
Subject: Re: [thelist] responsiveness from "the coffee shop"

I'm soaking up many of the comments and slowly working down a task list, not simply to address coffee-shop performance but optimization in general, and more specifically to set reasonable expectations and run some diagnostics when working in unqualified environments.

Some background fodder:  The site in question commonly loads the home page in about 2 1/2 seconds at home, where I have average 1.5 Mbps broadband. The client reported pages loading in 3 1/2 minutes at the coffee shop that one day. I didn't notice any issues that day, and the logging I set up showed nothing unusual, just a very low load average.
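
One of the diagnostics I have in mind for the next time I'm on a questionable connection is to split a single request into DNS, connect, and transfer phases to see where the time actually goes. A rough sketch only; the hostname is a placeholder, not the client's site:

# Rough field diagnostic: time the DNS lookup, TCP connect, and full
# transfer separately for one request.  HOST is a placeholder.
import socket
import time
import http.client

HOST = "www.example.com"   # hypothetical host

t0 = time.time()
addr = socket.gethostbyname(HOST)             # DNS lookup
t1 = time.time()
conn = http.client.HTTPConnection(addr, 80, timeout=60)
conn.connect()                                # TCP connect
t2 = time.time()
conn.request("GET", "/", headers={"Host": HOST})
body = conn.getresponse().read()              # wait for and download the response
t3 = time.time()
conn.close()

print("dns %.2fs  connect %.2fs  transfer %.2fs  (%d bytes)"
      % (t1 - t0, t2 - t1, t3 - t2, len(body)))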

Just to be sure, the s/a staff suggested I set up some varying crawl delays in robots.txt, so for each website hosted on that VPS server I'll set up a different delay. Reading along, it appears that Google doesn't honor the Crawl-delay directive but has its own rate setting in Google Webmaster Tools. Looking at one of my sites through Webmaster Tools, it appears to be configured for 0.016 requests per second, i.e. 62.5 seconds between requests. That already seems like a long gap to me, as the general crawl-delay recommendations I've seen were typically around 5 seconds. Feedback?
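
For what it's worth, the kind of per-site robots.txt I have in mind is nothing fancier than the following (the value is illustrative only, and Googlebot would ignore the Crawl-delay line anyway, so its rate stays in Webmaster Tools):

# illustrative only - each site on the VPS would get its own value
User-agent: *
Crawl-delay: 5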

CDN - I'd heard of the infamous Content Delivery Network but have never used one before. Does anyone on the list have any general feedback on whether there's a noticeable improvement when one is employed with a CMS?

I'll be testing a number of caching/compressing extensions during the week - we shall see what this brings.

-Bob
-- 


