[thelist] Netscape... why?

aardvark roselli at earthlink.net
Sat Jun 8 01:41:01 CDT 2002


> From: Erik Mattheis <gozz at gozz.com>
> > > > >  ~ document.write different <link> tags based on
> > > > >  navigator.userAgent, use standard css for a <link> higher on
> > > > >  the page, and just let NN 4.x with JS disabled barf on it --
> >
> >it also means you're taking an all-or-nothing approach,
>
> No ... the <link> to a stylesheet with the non-funky stuff in the page
> takes care of all the non-funky stuff.

my point is, no JS, no CSS -- all or nothing...  so if i come in IE
or NN, i get the same base CSS from a <link>... but then you use JS
to write in another <link>... in NN4 without JS it's moot, but in IE
without JS, now i can't see it...
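
for anyone following along, the pattern we're arguing about looks
roughly like this (the filenames and UA strings are just made-up
examples, not anyone's actual code):

  <link rel="stylesheet" type="text/css" href="base.css">
  <script type="text/javascript">
  // sniff the UA string and write in the "funky" stylesheet
  if (navigator.userAgent.indexOf('MSIE') != -1) {
    document.write('<link rel="stylesheet" type="text/css" href="funky-ie.css">');
  } else {
    document.write('<link rel="stylesheet" type="text/css" href="funky-other.css">');
  }
  // with JS off, *nobody* gets the funky sheet -- IE included
  </script>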

> >which can be
> >completely avoided with either server-side browser detection (which
> >is only a step above client-side, IMO),
>
> Many steps below IMO!!!: a page intended for a certain browser can be
> served by a proxy server to a browser for which it was not intended.
> BAD!

this is a good point -- which, actually, is why i have *no* browser
detection in any of my scripts...  unfortunately, a couple of the
DHTML menus on my sites do it, but then again, that's what those
menus do...

> >  or, essentially capability
> >testing by using @import...
>
> See previous post ... can't _always_ rely on that working.

i haven't had any problems...  not yet at least...

to me, it's like using an if document.images to see if the browser
can do rollovers, vs. trying to keep track of all browsers that can
do rollovers and feeding them into a detection script...
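
in code, it's the same one-line test everybody already uses for
rollovers (the image name is just an example):

  // capability test: if the images collection exists, rollovers work
  if (document.images) {
    var over = new Image();   // preload the "over" state
    over.src = 'nav_over.gif';
  }
  // no UA sniffing, no list of browsers to keep up to date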

> >i consider creating a valid page and then writing HTML into it via JS
> >a hack (moreso if you're writing in invalid HTML)...
> >
> >that's mostly a personal preference, but it also eases maintenance
> >and compatibility...
>
> Well, I'm of the opinion that it's still more important to make HTML
> that a browser thinks as "valid" than the W3C thinks is "valid" ... it
> is curious that WWW can mean World Wide Web and Wish it Would Work and
> Wild Wild West.

luckily, i've been able to pull off both... valid pages that render
as intended, with some planned variance across browsers... but again,
i have the advantage of designing them to have variance before i even
code... not all of us have that luxury...

> Also to me: something that depends on a browser flaw - not
> understanding @import for example is more accurately a "hack" .

see above, it's not a flaw, it's just not a supported method... it's
also valid code... and it's used as intended...  the only thing that
may give your argument credence is that little lack of consistency
when you use a <link> and @import on the same page, but there's
nothing in the specs suggesting that it's bad...

document.images vs. browser detect...

if we espouse capability testing in JS, why not with CSS?
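
the css equivalent, roughly (the filenames are made up):

  <link rel="stylesheet" type="text/css" href="basic.css">
  <style type="text/css">
    @import url(advanced.css);
    /* NN4 doesn't understand @import, so it only ever sees basic.css */
  </style>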

> Yes, it is harder to maintain, but so are almost all high-performance
> things ... web-related and otherwise.

no, not high-performance -- over-engineered...

when you say high performance, i think of a site that hauls ass under
a stressful load and really does a lot of work for its users... not a
site that has DHTML and CSS widgets...

i'd call that high-design, if i use 'high' at all to describe it...

my jaguar is expensive, beautiful, and fast... but it's in the shop
every month... my civic isn't bad, and i changed the oil once... they
both get me to work...

a stretch, but my point should be clear...

> >my response to that response is that turning off JS doesn't penalize
> >me at all... it penalizes some sites that don't let me use their nav
> >(but i just go elsewhere),
>
> The site penalizes itself!

exactly!

> >  it prevents the hassle of banner windows,
>
> OK, point taken ... FYI these cookies alleviate the majority of the
> headache: http://technoerotica.net/mylog/optouts.html

i have a hosts file that eats banner ads, so even when i do turn JS
on, any new windows just 404...  i'll hafta look at the cookie
thingy, though...
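
(for the curious, the hosts file trick is just entries like these --
the hostnames are made-up examples, not my actual list:)

  # point known ad servers at the local machine so their requests go nowhere
  127.0.0.1   ads.example.com
  127.0.0.1   banners.example.net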

> >it speeds up my system,
>
> Damn!

beaver!

> >  and for IE/win, it prevents many security
> >issues...
>
> Microsoft penalizes everyone!

we're a penal colony, then!

[...]
> I'm not suggesting requiring JS to access the core features of a site
> ... just that I think it's acceptable to require JS for "value-added"
> features, such as a presentation exactly as you intend.

i understand that... sometimes, though, it's easy to lose track of
how a core feature can be impacted when the developer relies on
things (in the UI, for instance) that won't be there without JS/CSS
support...

[...]
> I wasn't saying the pot is calling the kettle black, rather that it's
> completely acceptable to offer "extras" to a subset of your site's
> visitors.

yes, and i've always agreed with that... note that my article proposes
something that doesn't hurt those who can't support it...  otherwise,
i'd never have written it...

> Agreed ... both in regard to NN 4.x and for things you can do without
> JS or the risk of a proxy server screwing the site up. But again,
> there are some things that if you want to do them, you have no choice
> than to rely on JS. Really! Trust me!

oh, i know... like i said, that screen stats article is one of
them... i considered it a necessary evil to gather data since that
data was going to be used to customize *for* the user...
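
(for reference, the gathering part is nothing fancy -- something like
this, though it's not the article's actual code and the beacon URL is
made up... drop it at the end of the page so document.body exists:)

  <script type="text/javascript">
  var w = window.innerWidth ? window.innerWidth : document.body.clientWidth;
  var h = window.innerHeight ? window.innerHeight : document.body.clientHeight;
  var beacon = new Image();  // ship the numbers home as a 1x1 image request
  beacon.src = '/cgi-bin/stats.cgi?sw=' + screen.width + '&sh=' + screen.height +
               '&ww=' + w + '&wh=' + h;
  </script>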

and then people like isaac come in surfing full-screen across two
monitors and screw up my numbers...
