[thelist] robots.txt and subdomains

John Dobson j.dobson at find-a-part.com
Mon Dec 8 05:55:04 CST 2003


Hello all
We have several subdomains set up that we do not want indexed by the search engines.
 
For example
http://abc.foo.com
http://def.foo.com
http://ghi.foo.com
These collectively point at a single subfolder on the site:
/root/foobar/specialform.asp
 
so that
http://abc.foo.com/specialform.asp
http://def.foo.com/specialform.asp
etc. all resolve to that page.
 
I want to make sure that all the files in the foobar folder are not indexed.
 
Do I need to put one robots.txt in the root folder for the main site and another in the foobar folder?  I know that you are only meant to have one robots.txt per site, but the foobar folder/specialform and the main site are navigationally isolated from each other.
 
I want to get this right because, equally, I do not want to stop our main site http://www.foo.com from being indexed, especially as there are files with the same page name in both locations.
 
My feeling is to place a robots.txt in the foobar folder that disallows all robots from everything, like...
 
User-agent: *
Disallow: /
 
And to have one in the root folder for the main site that allows everything, such as...
 
User-agent: *
Disallow:
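In case it helps anyone checking the same thing, here is a quick local sanity check of those two rule sets using Python's standard urllib.robotparser (just a sketch; the URLs are the example hostnames above, and which file a crawler actually sees still depends on which robots.txt each hostname serves at its root):

```python
from urllib.robotparser import RobotFileParser

# Rules intended for the subdomains (the robots.txt in the foobar folder):
# block every robot from everything.
blocked = RobotFileParser()
blocked.parse("User-agent: *\nDisallow: /".splitlines())

# Rules intended for the main site (the robots.txt in the site root):
# an empty Disallow allows everything.
allowed = RobotFileParser()
allowed.parse("User-agent: *\nDisallow:".splitlines())

print(blocked.can_fetch("*", "http://abc.foo.com/specialform.asp"))  # False
print(allowed.can_fetch("*", "http://www.foo.com/specialform.asp"))  # True
```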
 
Is this right?  Can anyone help with this?
 
TIA
 
John Dobson
