Our project is considering supporting a mirror site that will be off the network (essentially a stand-alone mirror for a local LAN in a place without internet connectivity), so I am in the (unfortunate) position of having to decide how to do this, or whether it can be done at all. The current plan is to set up a PC running Linux with a 120GB drive and a DVD reader at the remote site, ship periodic updates to our dataset that can be used to "patch" the local distribution, and then run some updating procedures to make the new database live.

I can come up with a well-defined plan for carrying out the updates, but I'm wary of the lack of feedback from the actual updating procedures (what if a filesystem fills up, or a command fails for whatever reason?). I also don't have a lot of time to build a customized "rsync on a floppy" system myself, so I'm hoping that somebody on the list has suggestions or knows of tools that could be useful.

BTW, I think that given the nature of our dataset, file patching a la rsync is not strictly necessary, since a fresh copy of all the files that have changed will probably fit on a DVD. The problem I'm mostly worried about is keeping enough metadata on both ends to reliably figure out the updating strategy.

Thanks,

-- Alberto

****************************************************************************
Alberto Accomazzi                       http://cfa-www.harvard.edu/~alberto
NASA Astrophysics Data System                    http://adswww.harvard.edu
Harvard-Smithsonian Center for Astrophysics      aaccomazzi@cfa.harvard.edu
60 Garden Street, MS 83, Cambridge, MA 02138 USA
****************************************************************************
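P.S. To make the metadata question concrete, here is a rough sketch (in Python; the manifest file name, layout, and function names are placeholders I made up, not anything we have running) of the kind of bookkeeping I was imagining: the master ships a manifest of path/size/checksum entries on each DVD, and the mirror diffs that against its own tree and refuses to copy anything if the disk would fill up.

#!/usr/bin/env python3
# Sketch only: MANIFEST.json and the directory layout are hypothetical.
import hashlib
import json
import os
import shutil
import sys

def file_digest(path, chunk=1 << 20):
    """SHA-256 of a file, read in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def build_manifest(root):
    """Map relative path -> size and digest for every file under root.
    Run on the master side; the result gets burned onto the DVD."""
    manifest = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            rel = os.path.relpath(full, root)
            manifest[rel] = {"size": os.path.getsize(full),
                             "sha256": file_digest(full)}
    return manifest

def check_update(dvd_root, live_root):
    """Compare the DVD's shipped manifest against the live tree and
    sanity-check free space before anything is copied."""
    with open(os.path.join(dvd_root, "MANIFEST.json")) as f:
        shipped = json.load(f)
    local = build_manifest(live_root)
    changed = [p for p, meta in shipped.items()
               if local.get(p, {}).get("sha256") != meta["sha256"]]
    needed = sum(shipped[p]["size"] for p in changed)
    free = shutil.disk_usage(live_root).free
    if needed > free:
        sys.exit(f"refusing to update: need {needed} bytes, only {free} free")
    return changed

The idea would be that build_manifest() runs on the master before each DVD is cut, and check_update() runs on the mirror as a dry run before anything touches the live tree, which at least catches the full-filesystem case up front. I'd still need something on top of this to record what actually happened on the remote end, which is the feedback problem I mentioned above.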