[thelist] google dropped my http: page

Joshua Olson joshua at waetech.com
Thu Sep 2 15:37:53 CDT 2004


> -----Original Message-----
> From: Aaron Wormus
> Sent: Friday, September 03, 2004 1:07 AM
>
> Joshua Olson wrote:
>
> > You did read the link I posted, right?  You just haven't been
> > caught yet.
> > :-)
>
> I work with SEO, and have never heard of anyone being penalized for
> having both http://mysite.com and http://www.mysite.com. If you can find
> any examples of this specifically, please let me know, as it would be a
> substantial change in how Google has worked.

Aaron,

This is from Google (http://www.google.com/webmasters/guidelines.html)

Quote 1:

"
Quality Guidelines - Specific recommendations:

- Avoid hidden text or hidden links.
- Don't employ cloaking or sneaky redirects.
- Don't send automated queries to Google.
- Don't load pages with irrelevant words.
- Don't create multiple pages, subdomains, or domains with substantially
duplicate content.
- Avoid "doorway" pages created just for search engines, or other "cookie
cutter" approaches such as affiliate programs with little or no original
content.
"

Quote 2:

"Google prefers developing scalable and automated solutions to problems, so
we attempt to minimize hand-to-hand spam fighting. The spam reports we
receive are used to create scalable algorithms that recognize and block
future spam attempts."

Take these two quotes together and you can see which direction Google is
going with their algorithms.  One of my clients got stuck with this, in a
slightly different form than what you are looking at.  They bought 8 domains
and wanted the site on all 8 domains.  The domains were very legit... .com,
.net, with and without dashes, etc.  One day the referrals from Google
nearly stopped dead on all domains.  That was very soon after the infamous
"Florida Update".  I corrected the issue by having all non-primary domains
redirect to the main site, resubmitted the site to Google, and within a
couple of days the hits returned.
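For anyone wanting to do the same, here is a minimal sketch of the kind of
redirect I mean, assuming Apache with mod_rewrite enabled and using
placeholder domain names (put this in the .htaccess served for the
non-primary domains):

  # Permanently (301) redirect any host other than the primary one,
  # including the bare domain without www, to the canonical site so
  # search engines only ever index one copy of the content.
  RewriteEngine On
  RewriteCond %{HTTP_HOST} !^www\.primary-example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.primary-example.com/$1 [R=301,L]

The R=301 flag is the important part: a permanent redirect tells Google the
other hosts are intentional aliases rather than separate sites.  The same
rule also covers Rich's www vs. non-www situation, since the bare domain
fails the RewriteCond and gets redirected too.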

Again, the scenario is a bit different because it was different domains, not
just different subdomains, as it is in Rich's case.  But, as you can clearly
see from Google's guidance, duplicating a site on different subdomains is on
par with duplicating a site on an entirely different domain.  From quote 2
you can see their desire to automate the process--automated processes may
not be able to make exceptions for this case.

Rich, contacting Google regarding this instance may help them improve the
algorithm and forgive/reinstate sites that duplicate content as you have
done.

<><><><><><><><><><>
Joshua Olson
Web Application Engineer
WAE Tech Inc.
http://www.waetech.com/service_areas/
706.210.0168



