Your client's off-base. Doing this for each and every page on the site is a giant waste of time that will not bear fruit. Really! They're not going to get more "hits" or score higher in a Google search on a page-by-page basis, because it isn't the raw content of each page that Google and the other robots key on when indexing; it's the "keywords" and "description" meta tags in the page head, plus the links to other pages within and outside the site, and what's put into them.

THAT is where your client's focus should go: compiling as complete a description, and as exhaustive a list of keywords, as they can possibly think of that a searcher might use to find them, and using those to build one "standard" HTML head block embedded in each and every page on the site. That way, a hit on any one of the site's pages ultimately produces the same desired effect: improved placement in a Google search.

The other practical reality is that Google is not going to put all 1200 pages in its index and then hand all 1200 of them back to a searcher as "hits". (Can you imagine? What a mess! It would take scrolling through a hundred pages of one site's entire contents before reaching the next candidate.) Google would never do this; it consolidates results by SITE, not by PAGE.

So there's a more practical question to put to your client as well: why would they even want that? Would they really want a visitor landing directly on some sub-sub-sub-sub-page of their website, seeing what's on that page with no context or reference as to why the page is there, how it fits into the rest of the site, or what the whole site is about? I would highly doubt it! My suggestion above represents a far more effective strategy for your client (and would make a persuasive argument to them), i.e.
to direct all inquiries and redirects (e.g., from Google) through their home page, and from there either hand the visitor the link they were searching for automatically or simply offer an on-site/intra-site search capability outright. (But then, that's what they have to be dealing with currently anyway!)

Just my thoughts.

Scott Rutledge

-----Original Message-----
From: thelist-bounces at lists.evolt.org [mailto:thelist-bounces at lists.evolt.org] On Behalf Of Rodrigo Fonseca
Sent: Saturday, December 20, 2003 8:31 AM
To: thelist at lists.evolt.org
Subject: Re: [thelist] Cheating Google?

Raditha Dissanayake wrote:
> Looks like your client is very keen to get on the google blacklist.

Is there really such a thing? A blacklist? I'd never heard of it before. I don't think that will convince them. :( They say it works... Anyway, Brian's response makes sense to me: if they link only to Google, other robots will probably not index the site. I'm trying to get armed with facts to show them that it's a bad approach; I've been trying to find something on Google itself, but no luck so far...

Thank you both very much!

Regards,
Rodrigo Fonseca.

--
* * Please support the community that supports you. * *
http://evolt.org/help_support_evolt/

For unsubscribe and other options, including the Tip Harvester and archives of thelist go to: http://lists.evolt.org

Workers of the Web, evolt !