On Tue, 15 Mar 2005 17:36:16 -0800 (PST), BMP <microme_2000 at yahoo.com> wrote:

> Hi
>
> I need someone to advise me about preventing robot spiders from using
> excess bandwidth on my website. I know some HTML and JS, but not Perl
> and the like. I have a robots.txt file, but I am not sure whether the
> spiders crawling my site are friendly, so I am hesitant to exclude them
> until I can learn more about them. The biggest abuser seems to be
> "Mozilla Gecko". Any help would be greatly appreciated.

See Dan Cody's articles:

http://five2one.org/articles/Stopping_spambots_I.html
http://five2one.org/articles/Stopping_spambots_II.html

-- 
Matt Warden
Miami University
Oxford, OH, USA
http://mattwarden.com

This email proudly and graciously contributes to entropy.
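
P.S. Once you identify a crawler's User-agent string in your access logs, a robots.txt entry along these lines will ask it to stay away ("BadBot" below is just a placeholder name, not a real crawler). Keep in mind robots.txt is purely advisory: well-behaved crawlers honor it, but the abusive ones the articles above deal with typically ignore it.

```
# Block one specific crawler entirely.
# "BadBot" is a placeholder; substitute the User-agent string
# you actually see in your server logs.
User-agent: BadBot
Disallow: /

# Ask everyone else to slow down. Note that Crawl-delay is not
# part of the original robots.txt standard and only some
# crawlers honor it.
User-agent: *
Crawl-delay: 10
```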