[thelist] Optimising code by UA vs. Proxy caches - week 2 (was server-side browser-sniffing a bad idea?)

George Dillon <> Evolt! evolt at georgedillon.com
Mon Dec 10 13:41:59 CST 2001


The discussion last week only increased my concern about this proxy caching
thing...

As I understand it, using server-side detection to deliver
browser-specific optimised code according to the User-Agent header is
unreliable: if a request comes from behind a proxy, the 'visitor' may get
the page from the proxy's cache, and that may not be the version intended
for their particular browser.  (So the proxy might cache an NN-optimised
page and then blindly deliver it to the 90% of 'visitors' using MSIE.)


Am I out of my mind in contemplating this semi-workaround?:

a. Do the server-side detection anyway and deliver the browser-optimised
page, but include a script which defines a JavaScript variable stating
which browser the page expects to be viewed in, e.g. var apage4="IE";

b. Do a client-side check (a browser-sniff using the same algorithm as
the server-side one, albeit in JavaScript).

c. Compare the client-side and server-side browser-sniff results, and if
they differ, immediately reload the page with a time-based query-string
added.  (A rough sketch of all three steps follows below.)
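Something like this, perhaps (the names apage4 and nocache are just
placeholders, and the sniff is deliberately over-simplified; the real one
should mirror whatever the server-side sniffer does):

    <script type="text/javascript">
    // (a) Written into the page by the server: which browser this
    //     particular copy was optimised for.
    var apage4 = "IE";

    // (b) Client-side sniff; should use the same algorithm as the
    //     server.  This one-liner is just a stand-in.
    var sniffed = (navigator.userAgent.indexOf("MSIE") != -1) ? "IE" : "NN";

    // (c) A mismatch means we probably got someone else's cached copy,
    //     so re-request with a time-based query-string.  The nocache
    //     check guards against an endless reload loop if the server
    //     keeps returning the 'wrong' version.
    if (sniffed != apage4 && location.search.indexOf("nocache=") == -1) {
        location.replace(location.pathname + "?nocache=" + new Date().getTime());
    }
    </script>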


Obviously this will fail if JavaScript is off, but so would any other
method I can think of (short of disabling caching completely or not doing
server-side browser optimisation) ...


Questions remaining...

1) Does simply adding the date/time to the query-string (e.g. requesting
page.asp?nocache=1008014519 instead of page.asp) force the request to go
all the way back to the origin server, i.e. will it succeed in bypassing
the proxy's cache?

and

2) If/when it does, will the newly-requested page get delivered all the
way back to the visitor as intended, or might the proxy somehow mangle
this as well?

TIA

George Dillon







