[thelist] page archival tool/script

John DeStefano john.destefano at gmail.com
Mon Apr 16 13:33:34 CDT 2007

I have a page that is essentially just a collection of graphs and
images pulled from remote servers.  I would like to create a
"snapshot" of this page once per hour and archive the resulting files
so that they can be navigated and viewed in a browser.
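For reference, the setup I have in mind looks roughly like the sketch
below (URL and paths are placeholders, not my real ones).  httrack's
-n/--near option is supposed to fetch non-HTML files such as images
even when they live on another server:

```shell
#!/bin/sh
# Hypothetical hourly snapshot wrapper; URL and archive path are examples.
STAMP=$(date +%Y%m%d-%H)

httrack "http://example.com/dashboard.html" \
    -O "/var/archive/$STAMP" \
    --near

# and a crontab entry to run it at the top of every hour:
# 0 * * * * /usr/local/bin/snapshot.sh
```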

I found a neat tool called httrack[1], which does this to some degree.
However, when I automate this particular task with a bash script and a
cron job, the images and HTML are archived, but the archived HTML
points to the "live" remote graphs rather than the local copies it
archived.  So instead of seeing the captured, archived images, you
always see current data.
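One workaround I have been considering is to post-process each
archived HTML file myself, rewriting absolute image URLs to point at
the local copies.  A minimal sketch (assuming the images are saved
flat, keyed by their basename; function and variable names are mine,
not from any tool):

```python
import os
import re
from urllib.parse import urlparse

def localize_image_links(html, downloaded):
    """Rewrite absolute <img src="..."> URLs to bare local filenames,
    but only for images we actually managed to archive."""
    def repl(match):
        url = match.group(2)
        name = os.path.basename(urlparse(url).path)
        if name in downloaded:
            # point the archived page at the archived copy
            return match.group(1) + name + match.group(3)
        return match.group(0)  # leave un-archived images untouched
    return re.sub(r'(<img[^>]*\bsrc=")([^"]+)(")', repl, html)
```

Whether this is robust enough for real-world markup (single quotes,
srcset, CSS backgrounds) is another question, which is partly why I am
hoping httrack can do it natively.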

Has anyone had better luck with this utility?  Or could you suggest an
alternative method?


[1] http://www.httrack.com
