[thelist] The old Google indexing issue

Chris Hardy lists at semioticpixels.com
Fri Apr 1 12:36:35 CST 2005


As I recall, part of the "google likes standards" argument is that content
tends to be more machine readable on a standards-compliant website. For
example, search engine bots often spider only a portion of a page, so with a
table-based layout you risk feeding mostly markup instead of content to a
search engine bot.
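
Just to illustrate the markup-to-content ratio (this is made-up markup, not
from any real site), a table-based layout often wraps a single sentence in
something like:

  <table cellpadding="0" cellspacing="0" border="0">
    <tr>
      <td><img src="spacer.gif" width="1" height="20" alt=""></td>
      <td class="bodytext"><font face="Arial">Welcome to our site.</font></td>
    </tr>
  </table>

whereas the standards-based equivalent is mostly content:

  <p>Welcome to our site.</p>

If a bot only reads the first chunk of a page, the second version gets far
more actual content in front of it.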

A couple of years ago there was some discussion of the pros and cons of using
a doctype; I think Peter-Paul Koch wrote about it on his website. It seems
unlikely to me that a doctype would influence a search engine spider, since
doctypes are primarily aimed at GUI browsers. However, since everything we
think we know about proprietary search engine bots is speculative, that's
just my opinion.
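
For reference, the doctype in question is just the declaration at the top of
the page, e.g. the common HTML 4.01 Transitional one:

  <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">

GUI browsers use it to switch between quirks and standards rendering modes;
whether a bot pays any attention to it at all is anyone's guess.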

What is the problem that your SEO 'expert' claims is caused by the incorrect
DOCTYPE? Maybe he's just pointing out that the site doesn't validate?

- chris 
http://www.semioticpixels.com
