[thelist] Server log retrieval & storage
CDitty
mail at redhotsweeps.com
Wed Aug 2 19:16:26 CDT 2000
I'll go you one better. I had a need for this about a year ago. Someone
helped me write it, and now I'll pass it on to the list. I have it run
daily, but there's no reason you couldn't run it weekly or monthly.
Enjoy.
#!/usr/bin/perl
# A Perl script to rename/archive and move the access logs
# Settings
$basepath = "FULL PATH TO STATS DIRECTORY HERE";
$savepath = "FULL PATH TO STATS ARCHIVE DIRECTORY HERE";
$filename = "access_log";
@date = localtime(time); #puts the current time/date into an array
$day = $date[3]; #day of month from localtime is already 1-based (1..31), no adjustment needed
$month = $date[4] + 1; #same with the month
#$yr = "20" . $date[5]; #old approach: only returns a 2 digit year, change 19 to 20 after y2k
$yr = 1900 + $date[5]; #localtime returns years since 1900, so this is correct before and after y2k
$today = sprintf("%02d%02d%04d", $month, $day, $yr); #today's date as MMDDYYYY, zero-padded so names match the example below
$newname = $filename . $today . ".gz"; # create name for new file
system("/usr/contrib/bin/gzip $basepath/$filename"); #gzip the access_log; now we have access_log.gz
#chdir($basepath); #would change to the directory the access_log is in; not needed since we use full paths below
rename("$basepath/$filename.gz", "$basepath/$newname"); #renames access_log.gz to access_log08111999.gz or whatever
#if you are putting the gzipped log in another directory
system('mv', "$basepath/$newname", $savepath); #move to the archive directory
print qq~
$filename file compressed and moved to archive directory.
Download at - PUT YOUR WEB URL HERE/$newname
~;
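As an aside (this is a hedged sketch, not part of the script above): the hand-rolled day/month/year arithmetic can be replaced by POSIX strftime, which zero-pads for you. The $filename here is the same name used above; everything else is illustrative.

```perl
#!/usr/bin/perl
# Sketch only: build the same MMDDYYYY-stamped archive name with strftime
# instead of adjusting the localtime fields by hand.
use strict;
use warnings;
use POSIX qw(strftime);

my $filename = "access_log";                  # same log name as the script above
my $today    = strftime("%m%d%Y", localtime); # e.g. "08021999", already zero-padded
my $newname  = $filename . $today . ".gz";
print "$newname\n";
```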
This is the line I have in my cronfile.
55 23 * * * /usr/home/redhot/usr/local/etc/httpd/htdocs/sweeps1/cgi-local/log_archiver.pl | /usr/bin/mail -s "Stats Archived - Red Hot Sweeps" webmaster at redhotsweeps.com
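For anyone new to cron: the five leading fields of the entry above are minute, hour, day-of-month, month, and day-of-week, so "55 23 * * *" fires daily at 23:55. A tiny shell illustration (the entry string is copied from above; the rest is just for demonstration):

```shell
# Crontab field order: minute hour day-of-month month day-of-week.
# "55 23 * * *" runs daily at 23:55; "55 23 * * 0" would make it weekly (Sundays).
entry="55 23 * * *"
set -f            # stop the bare "*" fields from globbing filenames
set -- $entry
echo "minute=$1 hour=$2"
```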
This emails the output to me and I just download it at will.
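For the multi-domain case in the quoted message below, the same gzip/rename/move steps can be looped over one access_log per virtual host. This is a hedged sketch only: the paths, the @logs list, and the archive_name helper are made-up placeholders, not from my script.

```perl
#!/usr/bin/perl
# Sketch: archive one access_log per virtual domain. All paths here are
# hypothetical; adjust to your own vhost layout.
use strict;
use warnings;
use POSIX qw(strftime);

my $savepath = "/var/log/archive";            # assumed archive directory
my @logs = (                                  # hypothetical per-vhost logs
    "/var/log/httpd/domain1/access_log",
    "/var/log/httpd/domain2/access_log",
);
my $today = strftime("%m%d%Y", localtime);

# build an "access_log08021999.gz"-style name from a full log path
sub archive_name {
    my ($log, $stamp) = @_;
    my ($base) = $log =~ m{([^/]+)$};
    return "$base$stamp.gz";
}

foreach my $log (@logs) {
    next unless -f $log;                      # a vhost may have written nothing
    system("gzip", $log) == 0 or next;        # leaves "$log.gz" behind
    rename("$log.gz", "$savepath/" . archive_name($log, $today))
        or warn "could not archive $log: $!\n";
}
```

Each domain's compressed log could then be attached to its own mail, per step 3 in the quoted request.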
CDitty
At 06:35 PM 08/02/2000 , you wrote:
>Greetings all,
>
>I have a server with a root domain and several virtual
>domains. Each domain creates its own log files. Here's
>the question(s):
>
>With the right (?Perl?) script could I have all log files
>emailed to me at a specific time, then have the log files
>deleted?
>
>Simply put: probably a cron job -
>
>1. Script wakes up at 00:01 - and
>2. compresses the various log files, then
>3. creates an email for each domain with the appropriate
> log files attached, then
>4. sends these emails to a specified address.
>5. Deletes the log and compressed files
>6. then sleeps for another (month?) period.
>
>This is a xnix (freeBSD) server with sendmail. Anyone know
>of such a program/script? Want to write it for $$ from me?
>
>Does anyone have a better idea? My goal is to get the log
>files to my local computer and free the server disk space.
>Heck, this method could be used to send backups of any files
>on the server for local storage.
>
>Thanks in advance,
>-Hugh
>hblair at bigfoot.com
>
>---------------------------------------
>For unsubscribe and other options, including
>the Tip Harvester and archive of TheList go to:
>http://lists.evolt.org Workers of the Web, evolt !