> Whatever an all-too-clever "SEO" can implement, sooner or later, a spyder can be written to detect. If a browser can interpret it, a spyder can be written to do likewise. Google's bread and butter depends upon their being at least as clever as the SEOs. The SEOs that I have faith in are the ones that work within, and endorse, Google's guidelines.

It doesn't really seem that anyone fully understands Google's guidelines; questions about them pop up all over the place. Plenty of SEO forums are full of professionals who are confused and concerned about how to deliver results to their clients, so it's hard to believe these guidelines are common knowledge, let alone widely endorsed. Further, designing for one search engine seems about as sensible as designing for one browser version.

A spider could be written to read CSS and work out how it's applied, but how would that affect crawl speed and the time it takes to spider a site? I don't know the answer, but software that does more typically takes longer to finish.

Let me reiterate that I didn't do this; I just noticed it while doing some work with a colleague who works in SEO, and I've simply put forward the results I found.
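As a rough illustration of the kind of CSS check such a spider might run: the sketch below just scans a stylesheet for declarations commonly used to hide text (the pattern list and function name are my own invention, not anything Google has documented, and a real crawler would also have to resolve which page elements each selector actually matches, which is the expensive part).

```python
import re

# Hypothetical patterns a spider might treat as hidden-text tricks.
SUSPICIOUS_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"text-indent\s*:\s*-\d{3,}px",  # e.g. text-indent: -9999px
]

def flag_hidden_text_rules(css: str) -> list[str]:
    """Return the CSS declarations that look like hidden-text tricks."""
    hits = []
    for pattern in SUSPICIOUS_PATTERNS:
        hits.extend(re.findall(pattern, css, flags=re.IGNORECASE))
    return hits

sample = """
.keywords { display: none; }
h1 { text-indent: -9999px; }
p  { color: #333; }
"""
print(flag_hidden_text_rules(sample))
# → ['display: none', 'text-indent: -9999px']
```

Even this toy version hints at the speed concern above: regex scanning is cheap, but deciding whether a matched rule really hides content that a visitor would otherwise see means building something much closer to a browser's rendering engine.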